WorldWideScience

Sample records for previous version comput

  1. Matched cohort study of external cephalic version in women with previous cesarean delivery.

    Science.gov (United States)

    Keepanasseril, Anish; Anand, Keerthana; Soundara Raghavan, Subrahmanian

    2017-07-01

    To evaluate the efficacy and safety of external cephalic version (ECV) among women with previous cesarean delivery. A retrospective study was conducted using data for women with previous cesarean delivery and breech presentation who underwent ECV at or after 36 weeks of pregnancy during 2011-2016. For every case, two multiparous women without previous cesarean delivery who underwent ECV and were matched for age and pregnancy duration were included. Characteristics and outcomes were compared between groups. ECV was successful for 32 (84.2%) of 38 women with previous cesarean delivery and 62 (81.6%) in the control group (P=0.728). Multivariate regression analysis confirmed that previous cesarean was not associated with ECV success (odds ratio 1.89, 95% confidence interval 0.19-18.47; P=0.244). Successful vaginal delivery after successful ECV was reported for 19 (59.4%) women in the previous cesarean delivery group and 52 (83.9%) in the control group (P<0.001). No ECV-associated complications occurred in women with previous cesarean delivery. To avoid a repeat cesarean delivery, ECV can be offered to women with breech presentation and previous cesarean delivery who are otherwise eligible for a trial of labor. © 2017 International Federation of Gynecology and Obstetrics.

  2. Validation of the Online version of the Previous Day Food Questionnaire for schoolchildren

    Directory of Open Access Journals (Sweden)

    Raquel ENGEL

    Objective: To evaluate the validity of the web-based version of the Previous Day Food Questionnaire Online for schoolchildren from the 2nd to 5th grades of elementary school. Methods: Participants were 312 schoolchildren aged 7 to 12 years of a public school from the city of Florianópolis, Santa Catarina, Brazil. Validity was assessed by sensitivity and specificity, as well as by agreement rates (match, omission, and intrusion rates) of food items reported by children on the Previous Day Food Questionnaire Online, using direct observation of foods/beverages eaten during school meals (mid-morning snack or afternoon snack) on the previous day as the reference. Multivariate multinomial logistic regression analysis was used to evaluate the influence of participants’ characteristics on omission and intrusion rates. Results: The results showed adequate sensitivity (67.7%) and specificity (95.2%). There were low omission and intrusion rates of 22.8% and 29.5%, respectively, when all food items were analyzed. Pizza/hamburger showed the highest omission rate, whereas milk and milk products showed the highest intrusion rate. Participants who attended school in the afternoon shift had a higher probability of intrusion than their peers who attended school in the morning. Conclusion: The Previous Day Food Questionnaire Online possessed satisfactory validity for the assessment of food intake at the group level in schoolchildren from the 2nd to 5th grades of public school.

  3. The efficacy and safety of external cephalic version after a previous caesarean delivery.

    Science.gov (United States)

    Weill, Yishay; Pollack, Raphael N

    2017-06-01

    External cephalic version (ECV) in the presence of a uterine scar is still considered a relative contraindication despite encouraging studies of the efficacy and safety of this procedure. We present our experience with this patient population, which is the largest cohort published to date. To evaluate the efficacy and safety of ECV in the setting of a prior caesarean delivery. A total of 158 patients with an unscarred uterus and a fetus presenting as breech underwent ECV. Similarly, 158 patients with a fetus presenting as breech who had undergone a prior caesarean delivery also underwent ECV. Outcomes were compared. ECV was successfully performed in 136/158 (86.1%) patients in the control group. Of these patients, 6/136 (4.4%) delivered by caesarean delivery. In the study group, 117/158 (74.1%) patients had a successful ECV. Of these patients, 12/117 (10.3%) delivered by caesarean delivery. There were no significant complications in either group. ECV may be successfully performed in patients with a previous caesarean delivery. It is associated with a high success rate and is not associated with an increase in complications. © 2016 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.

  4. External cephalic version among women with a previous cesarean delivery: report on 36 cases and review of the literature.

    Science.gov (United States)

    Abenhaim, Haim A; Varin, Jocelyne; Boucher, Marc

    2009-01-01

    Whether or not women with a previous cesarean section should be considered for an external cephalic version remains unclear. In our study, we sought to examine the relationship between a history of previous cesarean section and outcomes of external cephalic version for pregnancies at 36 completed weeks of gestation or more. Data on obstetrical history and on external cephalic version outcomes were obtained from the C.H.U. Sainte-Justine External Cephalic Version Database. Baseline clinical characteristics were compared among women with and without a history of previous cesarean section. We used logistic regression analysis to evaluate the effect of previous cesarean section on the success of external cephalic version while adjusting for parity, maternal body mass index, gestational age, estimated fetal weight, and amniotic fluid index. Over a 15-year period, 1425 external cephalic versions were attempted, of which 36 (2.5%) were performed on women with a previous cesarean section. Although women with a history of previous cesarean section were more likely to be older and para >2 (38.93% vs. 15.0%), there were no differences in gestational age, estimated fetal weight, or amniotic fluid index. Women with a prior cesarean section had a success rate similar to that of women without [50.0% vs. 51.6%, adjusted OR: 1.31 (0.48-3.59)]. Women with a previous cesarean section who undergo an external cephalic version have success rates similar to those of women without. Concern about procedural success in women with a previous cesarean section is unwarranted and should not deter attempting an external cephalic version.

  6. Computer Game Lugram - Version for Blind Children

    Directory of Open Access Journals (Sweden)

    V. Delić

    2011-06-01

    Computer games have undoubtedly become an integral part of educational activities of children. However, since computer games typically abound with audio and visual effects, most of them are completely useless for children with disabilities. Specifically, computer games dealing with the basics of geometry can contribute to mathematics education, but they require significant modifications in order to be suitable for visually impaired children. The paper presents the results of research and adaptation of the educational computer game Lugram to the needs of completely blind children, as well as the testing of the prototype, whose results encourage further research and development in the same direction.

  7. Trace contaminant control simulation computer program, version 8.1

    Science.gov (United States)

    Perry, J. L.

    1994-01-01

    The Trace Contaminant Control Simulation computer program is a tool for assessing the performance of various process technologies for removing trace chemical contamination from a spacecraft cabin atmosphere. Included in the simulation are chemical and physical adsorption by activated charcoal, chemical adsorption by lithium hydroxide, absorption by humidity condensate, and low- and high-temperature catalytic oxidation. Means are provided for simulating regenerable as well as nonregenerable systems. The program provides an overall mass balance of chemical contaminants in a spacecraft cabin given specified generation rates. Removal rates are based on device flow rates specified by the user and calculated removal efficiencies based on cabin concentration and removal technology experimental data. Versions 1.0 through 8.0 are documented in NASA TM-108409. TM-108409 also contains a source file listing for version 8.0. Changes to version 8.0 are documented in this technical memorandum and a source file listing for the modified version, version 8.1, is provided. Detailed descriptions for the computer program subprograms are extracted from TM-108409 and modified as necessary to reflect version 8.1. Version 8.1 supersedes version 8.0. Information on a separate user's guide is available from the author.
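    The overall mass balance the program computes can be illustrated with a minimal sketch of a single well-mixed cabin: dC/dt = G/V - (Q·η/V)·C, where G is the contaminant generation rate, Q the device flow rate, η the removal efficiency, and V the cabin free volume. The function name, integrator, and parameter values below are illustrative assumptions, not the TCCS implementation:

```python
# Illustrative single-cabin contaminant mass balance (not the TCCS code):
# dC/dt = G/V - (Q * eta / V) * C
# G   : generation rate (mass/time)
# Q   : removal-device flow rate (volume/time)
# eta : removal efficiency of the device (0..1)
# V   : cabin free volume

def cabin_concentration(G, Q, eta, V, C0=0.0, t_end=3600.0, dt=1.0):
    """Integrate the cabin mass balance with forward Euler."""
    C = C0
    t = 0.0
    while t < t_end:
        C += dt * (G / V - (Q * eta / V) * C)
        t += dt
    return C
```

The steady-state concentration is C_ss = G/(Q·η); integrating long enough relative to the time constant V/(Q·η) approaches that value.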

  8. Space shuttle general purpose computers (GPCs) (current and future versions)

    Science.gov (United States)

    1988-01-01

    Current and future versions of general purpose computers (GPCs) for space shuttle orbiters are represented in this frame. The two boxes on the left (AP101B) represent the current GPC configuration, with the input-output processor at far left and the central processing unit (CPU) at its side. The upgraded version combines both elements in a single unit (far right, AP101S).

  9. The Importance of Business Model Factors for Cloud Computing Adoption: Role of Previous Experiences

    Directory of Open Access Journals (Sweden)

    Bogataj Habjan Kristina

    2017-08-01

    Background and Purpose: Bringing several opportunities for more effective and efficient IT governance and service exploitation, cloud computing is expected to impact the European and global economies significantly. Market data show that despite many advantages and promised benefits the adoption of cloud computing is not as fast and widespread as foreseen. This situation shows the need for further exploration of the potentials of cloud computing and its implementation on the market. The purpose of this research was to identify individual business model factors with the highest impact on cloud computing adoption. In addition, the aim was to identify the differences in opinion regarding the importance of business model factors on cloud computing adoption according to companies’ previous experiences with cloud computing services.

  10. Computational Ecology and Software (http://www.iaees.org/publications/journals/ces/online-version.asp)

    Directory of Open Access Journals (Sweden)

    ces@iaees.org

    Computational Ecology and Software, ISSN 2220-721X. URL: http://www.iaees.org/publications/journals/ces/online-version.asp RSS: http://www.iaees.org/publications/journals/ces/rss.xml E-mail: ces@iaees.org Editor-in-Chief: WenJun Zhang. Aims and Scope: COMPUTATIONAL ECOLOGY AND SOFTWARE (ISSN 2220-721X) is an open access, peer-reviewed online journal that considers scientific articles in all areas of computational ecology. It is the transactions of the International Society of Computational Ecology. The journal is concerned with ecological research and with the construction and application of theories and methods of the computational sciences, including computational mathematics, computational statistics, and computer science. It features the simulation, approximation, prediction, recognition, and classification of ecological issues. Intensive computation is one of the major emphases of the journal. The journal welcomes research articles, short communications, review articles, perspectives, and book reviews. The journal also supports the activities of the International Society of Computational Ecology. The topics covered by CES include, but are not limited to:
    •Computation-intensive methods: numerical and optimization methods, differential and difference equation modeling and simulation, prediction, recognition, classification, statistical computation (Bayesian computing, randomization, bootstrapping, Monte Carlo techniques, stochastic processes, etc.), agent-based modeling, individual-based modeling, artificial neural networks, knowledge-based systems, machine learning, genetic algorithms, data exploration, network analysis and computation, databases, ecological modeling and computation using Geographical Information Systems, satellite imagery, and other computation-intensive theories and methods.
    •Artificial ecosystems, artificial life, complexity of ecosystems, and virtual reality.
    •The development, evaluation and validation of software and

  11. Amnioinfusion for women with a singleton breech presentation and a previous failed external cephalic version: a randomized controlled trial.

    Science.gov (United States)

    Diguisto, Caroline; Winer, Norbert; Descriaud, Celine; Tavernier, Elsa; Weymuller, Victoire; Giraudeau, Bruno; Perrotin, Franck

    2018-04-01

    Our trial aimed to assess the effectiveness of amnioinfusion for a second attempt at external cephalic version (ECV). This open randomized controlled trial was planned with a sequential design. Women at ≥36 weeks of gestation with a singleton fetus in breech presentation and a first unsuccessful ECV were recruited in two level-3 maternity units. They were randomly allocated to transabdominal amnioinfusion with a 500-mL saline solution under ultrasound surveillance, or to no amnioinfusion, before the second ECV attempt. Trained senior obstetricians performed all procedures. The primary outcome was the cephalic presentation rate at delivery. Analyses were conducted according to intention to treat (NCT00465712). Recruitment difficulties led to stopping the trial after 57 months; 119 women had been randomized: 59 allocated to amnioinfusion + ECV and 60 to ECV only. Data were analyzed without applying the sequential feature of the design. The rate of cephalic presentation at delivery did not differ significantly according to whether the second version attempt was preceded by amnioinfusion (20% versus 12%, p = .20). Premature rupture of the membranes occurred in 15% of the women in the amnioinfusion group. Amnioinfusion before a second attempt at external version does not significantly increase the rate of cephalic presentation at delivery.

  12. Emphysema and bronchiectasis in COPD patients with previous pulmonary tuberculosis: computed tomography features and clinical implications

    Directory of Open Access Journals (Sweden)

    Jin J

    2018-01-01

    Jianmin Jin,1 Shuling Li,2 Wenling Yu,2 Xiaofang Liu,1 Yongchang Sun1,3 (1Department of Respiratory and Critical Care Medicine, Beijing Tongren Hospital, Capital Medical University, Beijing; 2Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing; 3Department of Respiratory and Critical Care Medicine, Peking University Third Hospital, Beijing, China) Background: Pulmonary tuberculosis (PTB) is a risk factor for COPD, but the clinical characteristics and the chest imaging features (emphysema and bronchiectasis) of COPD with previous PTB have not been studied well. Methods: The presence, distribution, and severity of emphysema and bronchiectasis in COPD patients with and without previous PTB were evaluated by high-resolution computed tomography (HRCT) and compared. Demographic data, respiratory symptoms, lung function, and sputum culture of Pseudomonas aeruginosa were also compared between patients with and without previous PTB. Results: A total of 231 COPD patients (82.2% ex- or current smokers, 67.5% male) were consecutively enrolled. Patients with previous PTB (45.0%) had more severe (p=0.045) and longer history (p=0.008) of dyspnea, more exacerbations in the previous year (p=0.011), and more positive cultures of P. aeruginosa (p=0.001), compared with those without PTB. Patients with previous PTB showed a higher prevalence of bronchiectasis (p<0.001), which was more significant in lungs with tuberculosis (TB) lesions, and a higher percentage of more severe bronchiectasis (Bhalla score ≥2, p=0.031), compared with those without previous PTB. The overall prevalence of emphysema was not different between patients with and without previous PTB, but in those with previous PTB, a higher number of subjects with middle (p=0.001) and lower (p=0.019) lobe emphysema, higher severity score (p=0.028), higher prevalence of panlobular emphysema (p=0.013), and more extensive centrilobular emphysema (p=0.039) were observed. Notably, in patients with

  13. Value of computed tomography pelvimetry in patients with a previous cesarean section

    International Nuclear Information System (INIS)

    Yamani, Tarik Y.; Rouzi, Abdulrahim A.

    1998-01-01

    A case-control study was conducted at the Department of Obstetrics and Gynaecology, King Abdulaziz University Hospital, Jeddah, Saudi Arabia, to determine the value of computed tomography (CT) pelvimetry in patients with a previous cesarean section. Between January 1993 and December 1995, 219 pregnant women with one previous cesarean had antenatal CT pelvimetry for assessment of the pelvis. One hundred and nineteen women did not have CT pelvimetry and served as controls. Fifty-one women (51%) in the CT pelvimetry group were delivered by cesarean section. Twenty-three women (23%) underwent elective cesarean section for contracted pelvis based upon the findings of CT pelvimetry, and 28 women (28%) underwent emergency cesarean section after trial of labor. In the group who did not have CT pelvimetry, 26 women (21.8%) underwent emergency cesarean section. This was a statistically significant difference (P=0.02). There were no statistically significant differences in birthweight or Apgar scores in either group. There was no perinatal or maternal mortality in this study. Computed tomography pelvimetry increased the rate of cesarean delivery without any benefit in the immediate delivery outcomes. Therefore, the practice of documenting the adequacy of the pelvis by CT pelvimetry before vaginal birth after cesarean should be abandoned. (author)

  14. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. In the end, the diagnostic significance is scored using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology.

  15. Low-dose computed tomography image restoration using previous normal-dose scan

    International Nuclear Information System (INIS)

    Ma, Jianhua; Huang, Jing; Feng, Qianjin; Zhang, Hua; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2011-01-01

    Purpose: In current computed tomography (CT) examinations, the associated x-ray radiation dose is of a significant concern to patients and operators. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) or kVp parameter (or delivering less x-ray energy to the body) as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise and the noise would propagate into the CT image if no adequate noise control is applied during image reconstruction. Since a normal-dose high diagnostic CT image scanned previously may be available in some clinical applications, such as CT perfusion imaging and CT angiography (CTA), this paper presents an innovative way to utilize the normal-dose scan as a priori information to induce signal restoration of the current low-dose CT image series. Methods: Unlike conventional local operations on neighboring image voxels, nonlocal means (NLM) algorithm utilizes the redundancy of information across the whole image. This paper adapts the NLM to utilize the redundancy of information in the previous normal-dose scan and further exploits ways to optimize the nonlocal weights for low-dose image restoration in the NLM framework. The resulting algorithm is called the previous normal-dose scan induced nonlocal means (ndiNLM). Because of the optimized nature of nonlocal weights calculation, the ndiNLM algorithm does not depend heavily on image registration between the current low-dose and the previous normal-dose CT scans. Furthermore, the smoothing parameter involved in the ndiNLM algorithm can be adaptively estimated based on the image noise relationship between the current low-dose and the previous normal-dose scanning protocols. Results: Qualitative and quantitative evaluations were carried out on a physical phantom as well as clinical abdominal and brain perfusion CT scans in terms of accuracy and resolution properties. 
The gain by the use
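
The core idea of the ndiNLM approach, measuring patch similarity on the previous normal-dose scan and using the resulting weights to average the current low-dose image, can be sketched as follows. This is a simplified 2D, single-scale illustration; the function and parameter names are assumptions, not the authors' implementation:

```python
import numpy as np

def ndi_nlm(low_dose, prior, patch=3, search=7, h=0.1):
    """Previous normal-dose scan induced nonlocal means (illustrative sketch).

    Patch distances are computed on the normal-dose prior image; the
    resulting exponential weights average voxels of the low-dose image.
    """
    pad = search // 2          # half-width of the search window
    pp = patch // 2            # half-width of the comparison patch
    lp = np.pad(low_dose.astype(float), pad + pp, mode="reflect")
    pr = np.pad(prior.astype(float), pad + pp, mode="reflect")
    out = np.zeros(low_dose.shape, dtype=float)
    H, W = low_dose.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pad + pp, j + pad + pp
            ref = pr[ci - pp:ci + pp + 1, cj - pp:cj + pp + 1]
            acc, wsum = 0.0, 0.0
            for di in range(-pad, pad + 1):
                for dj in range(-pad, pad + 1):
                    ni, nj = ci + di, cj + dj
                    cand = pr[ni - pp:ni + pp + 1, nj - pp:nj + pp + 1]
                    d2 = np.mean((ref - cand) ** 2)   # patch distance on prior
                    w = np.exp(-d2 / (h * h))          # similarity weight
                    acc += w * lp[ni, nj]              # average low-dose voxels
                    wsum += w
            out[i, j] = acc / wsum
    return out
```

Because the weights come from the prior image, exact registration between the two scans matters less than in a naive subtraction scheme, which is the property the abstract highlights.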

  16. Computer Security: Mac security – nothing for old versions

    CERN Multimedia

    Stefan Lueders, Computer Security Team

    2016-01-01

    A fundamental pillar of computer security is the regular maintenance of your code, operating system and application software – or, in computer lingo: patching, patching, patching.   Only software which is up-to-date should be free from any known vulnerabilities and thus provide you with a basic level of computer security. Neglecting regular updates is putting your computer at risk – and consequently your account, your password, your data, your photos, your videos and your money. Therefore, prompt and automatic patching is paramount. But the Microsofts, Googles and Apples of this world do not always help… Software vendors handle their update policy in different ways. While Android is a disaster – not because of Google, but due to the slow adaptation of many smartphone vendors (see “Android’s Armageddon”) – Microsoft provides updates for their Windows 7, Windows 8 and Windows 10 operating systems through their…

  17. Fuel rod computations. The COMETHE code in its CEA version

    International Nuclear Information System (INIS)

    Lenepveu, Dominique.

    1976-01-01

    The COMETHE code (COde d'evolution MEcanique et THermique) is intended for computing the irradiation behavior of water reactor fuel pins. It is concerned with steadily operated cylindrical pins containing fuel pellet stacks (UO2 or PuO2). The pin consists of five axial zones: two expansion chambers, two blankets, and a central core that may be divided into several stacks separated by plugs. For computation, the pin is divided into slices (maximum 15), in turn divided into rings (maximum 50). For each slice the code provides: the radial temperature distribution, heat transfer coefficients, thermal flux at the pin surface, changes in geometry according to temperature conditions, and specific burn-up. The physical models account for heat transfer, fission gas release, fuel expansion, and creep of the can. Results computed with COMETHE are compared with those from ELP and EPEL irradiation experiments [fr

  18. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  19. A PC [personal computer]-based version of KENO V.a

    International Nuclear Information System (INIS)

    Nigg, D.A.; Atkinson, C.A.; Briggs, J.B.; Taylor, J.T.

    1990-01-01

    The use of personal computers (PCs) and engineering workstations for complex scientific computations has expanded rapidly in the last few years. This trend is expected to continue in the future with the introduction of increasingly sophisticated microprocessors and microcomputer systems. For a number of reasons, including security, economy, user convenience, and productivity, an integrated system of neutronics and radiation transport software suitable for operation in an IBM PC-class environment has been under development at the Idaho National Engineering Laboratory (INEL) for the past 3 yr. Nuclear cross-section data and resonance parameters are preprocessed from the Evaluated Nuclear Data File Version 5 (ENDF/B-V) and supplied in a form suitable for use in a PC-based spectrum calculation and multigroup cross-section generation module. This module produces application-specific data libraries that can then be used in various neutron transport and diffusion theory code modules. This paper discusses several details of the Monte Carlo criticality module, which is based on the well-known, highly sophisticated KENO V.a package developed at Oak Ridge National Laboratory and previously released in mainframe form by the Radiation Shielding Information Center (RSIC). The conversion process and a variety of benchmarking results are described

  20. A Computer Adaptive Testing Version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT

    Science.gov (United States)

    Butler, Stephen F.; Black, Ryan A.; McCaffrey, Stacey A.; Ainscough, Jessica; Doucette, Ann M.

    2017-01-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV®), the Addiction Severity CAT. This goal was accomplished in four steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large non-clinical (n = 4,419) and substance abuse treatment sample (n = 845). Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent/discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT’s time of administration was found to be significantly less than the average time of administration for the ASI-MV composite scores. This study represents the initial validation of an IRT-based Addiction Severity CAT, and further exploration of the Addiction Severity CAT is needed. PMID:28230387

  1. A computer adaptive testing version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT.

    Science.gov (United States)

    Butler, Stephen F; Black, Ryan A; McCaffrey, Stacey A; Ainscough, Jessica; Doucette, Ann M

    2017-05-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV), the Addiction Severity CAT. This goal was accomplished in 4 steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large nonclinical (n = 4,419) and substance abuse treatment (n = 845) sample. Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted, and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent and discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT's time of completion was found to be significantly less than the average time of completion for the ASI-MV composite scores. This study represents the initial validation of an Addiction Severity CAT based on item response theory, and further exploration of the Addiction Severity CAT is needed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. DNAStat, version 2.1--a computer program for processing genetic profile databases and biostatistical calculations.

    Science.gov (United States)

    Berent, Jarosław

    2010-01-01

    This paper presents the new DNAStat version 2.1 for processing genetic profile databases and performing biostatistical calculations. The popularization of DNA studies in the judicial system has made appropriate computer programs a necessity. Such programs must, above all, address two critical problems: broadly understood data processing and storage, and biostatistical calculations. Moreover, in cases of terrorist attacks and mass natural disasters, the ability to identify victims by searching for related individuals is very important. DNAStat version 2.1 is an adequate program for such purposes. DNAStat version 1.0 was launched in 2005. In 2006, the program was updated to versions 1.1 and 1.2, which differed only slightly from the original. DNAStat version 2.0, launched in 2007, added group calculation options with potential application to the personal identification of victims of mass disasters and terrorism. The latest version, 2.1, adds a choice of interface language (Polish or English), which should broaden the use of the program in other countries.
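DNAStat's own routines are not reproduced here, but the kind of biostatistical calculation such a program performs can be sketched. Below is a minimal illustration of the standard product-rule random match probability for an STR profile; the loci, alleles, and frequencies are hypothetical.

```python
# Illustrative sketch (not DNAStat's actual code): random match probability
# for a DNA profile using the product rule over independent STR loci.
def locus_match_probability(allele_freqs, genotype):
    """Probability that a random individual shares the genotype at one locus.

    allele_freqs: dict mapping allele -> population frequency
    genotype: tuple of two alleles
    """
    a, b = genotype
    p, q = allele_freqs[a], allele_freqs[b]
    return p * p if a == b else 2 * p * q  # homozygote p^2, heterozygote 2pq

def profile_match_probability(loci):
    """Product rule: multiply per-locus probabilities (assumes independent loci)."""
    rmp = 1.0
    for freqs, genotype in loci:
        rmp *= locus_match_probability(freqs, genotype)
    return rmp

# Hypothetical two-locus profile:
profile = [
    ({"12": 0.2, "13": 0.3}, ("12", "13")),  # heterozygote: 2 * 0.2 * 0.3 = 0.12
    ({"9": 0.1, "10": 0.4}, ("9", "9")),     # homozygote: 0.1 ** 2 = 0.01
]
print(profile_match_probability(profile))  # ≈ 0.0012
```

Real casework software additionally applies population-substructure corrections and kinship likelihood ratios for the relative-searching scenario mentioned above.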

  3. High-Throughput Computational Assessment of Previously Synthesized Semiconductors for Photovoltaic and Photoelectrochemical Devices

    DEFF Research Database (Denmark)

    Kuhar, Korina; Pandey, Mohnish; Thygesen, Kristian Sommer

    2018-01-01

    Using computational screening we identify materials with potential use as light absorbers in photovoltaic or photoelectrochemical devices. The screening focuses on compounds of up to three different chemical elements which are abundant and nontoxic. A prescreening is carried out based on informat...

  4. The NASA Ames PAH IR Spectroscopic Database: Computational Version 3.00 with Updated Content and the Introduction of Multiple Scaling Factors

    Science.gov (United States)

    Bauschlicher, Charles W., Jr.; Ricca, A.; Boersma, C.; Allamandola, L. J.

    2018-02-01

    Version 3.00 of the library of computed spectra in the NASA Ames PAH IR Spectroscopic Database (PAHdb) is described. Version 3.00 introduces the use of multiple scale factors, instead of the single scaling factor used previously, to align the theoretical harmonic frequencies with the experimental fundamentals. The use of multiple scale factors permits the use of a variety of basis sets; this allows new PAH species to be included in the database, such as those containing oxygen, and yields an improved treatment of strained species and those containing nitrogen. In addition, the computed spectra of 2439 new PAH species have been added. The impact of these changes on the analysis of an astronomical spectrum through database-fitting is considered and compared with a fit using Version 2.00 of the library of computed spectra. Finally, astronomical constraints are defined for the PAH spectral libraries in PAHdb.
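The multiple-scale-factor scheme can be sketched as follows: harmonic frequencies falling in different vibrational-mode regions are multiplied by different empirical factors before comparison with experimental fundamentals, rather than applying one global factor. The region boundaries and factor values below are hypothetical placeholders, not the ones adopted in PAHdb.

```python
# Hedged sketch of region-dependent frequency scaling. The (upper bound in
# cm^-1, factor) pairs are invented for illustration only.
def scale_frequencies(freqs_cm1,
                      factors=((1000.0, 0.975), (2500.0, 0.965), (float("inf"), 0.958))):
    scaled = []
    for f in freqs_cm1:
        for upper, factor in factors:
            if f < upper:              # first region whose upper bound exceeds f
                scaled.append(f * factor)
                break
    return scaled

print(scale_frequencies([500.0, 3000.0]))  # ≈ [487.5, 2874.0]
```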

  5. An improved computational version of the LTSN method to solve transport problems in a slab

    International Nuclear Information System (INIS)

    Cardona, Augusto V.; Oliveira, Jose Vanderlei P. de; Vilhena, Marco Tullio de; Segatto, Cynthia F.

    2008-01-01

    In this work, we present an improved computational version of the LTSN method to solve transport problems in a slab. The key feature relies on the reordering of the set of SN equations. This procedure reduces by a factor of two the task of evaluating the eigenvalues of the matrix associated with the SN approximations. We present numerical simulations and comparisons with those of the classical LTSN approach. (author)

  6. DEVELOPMENT OF QUARRY SOLUTION VERSION 1.0 FOR QUICK COMPUTATION OF DRILLING AND BLASTING PARAMETERS

    OpenAIRE

    B. ADEBAYO; A. W. BELLO

    2014-01-01

    Computation of drilling cost, quantity of explosives, and blasting cost are routine procedures in quarries, and all these parameters are estimated manually in most of the quarries in Nigeria. This paper deals with the development of the application package QUARRY SOLUTION Version 1.0 for quarries using Visual Basic 6.0. To achieve this, data on drilling and blasting activities were obtained from the quarry. Also, empirical formulae developed by different researchers were used for computat...

  7. The HARWELL version of the computer code E-DEP-1

    International Nuclear Information System (INIS)

    Matthews, M.D.

    1983-03-01

    This document describes the modified HARWELL version of the computer program EDEP-1 which has been in use on the IBM Central Computer for some years. The program can be used to calculate heavy ion ranges and/or profiles of energy deposited into nuclear processes for a wide variety of ion/target combinations. The initial setting up of this program on the IBM Central Computer has been described in an earlier report. A second report was later issued to bring the first report up to date following changes to this code required to suit the needs of workers at HARWELL. This later report described in particular the provision of new electronic stopping powers and an alternative method for calculating the energy straggle of beam ions with depth in a target. This new report describes further extensions to the electronic stopping powers available in the HARWELL version of this program and, for the first time, gives details of alternative nuclear stopping powers now available. This new document is intended as a reference manual for the use of the HARWELL version of EDEP-1. In this respect this document should be the final report on the status of this program. (author)

  8. BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.

    1981-06-01

    This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given

  9. BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.

    1981-06-01

    This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given.

  10. U.S. Army weapon systems human-computer interface style guide. Version 2

    Energy Technology Data Exchange (ETDEWEB)

    Avery, L.W.; O'Mara, P.A.; Shepard, A.P.; Donohoo, D.T.

    1997-12-31

    A stated goal of the US Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of HCI design guidance documents. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real time and near-real time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA), now termed the Joint Technical Architecture-Army (JTA-A). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an Army weapon systems unique HCI style guide, which resulted in the US Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide Version 1. Based on feedback from the user community, DISC4 further tasked PNNL to revise Version 1 and publish Version 2. The intent was to update some of the research and incorporate some enhancements. This document provides that revision. The purpose of this document is to provide HCI design guidance for the RT/NRT Army system domain across the weapon systems subdomains of ground, aviation, missile, and soldier systems. Each subdomain should customize and extend this guidance by developing their domain-specific style guides, which will be used to guide the development of future systems within their subdomains.

  11. Multi keno-VAX a modified version of the reactor computer code Multi keno-2

    Energy Technology Data Exchange (ETDEWEB)

    Imam, M. [National Center for Nuclear Safety and Radiation Control, Atomic Energy Authority, Cairo (Egypt)]

    1995-10-01

    The reactor computer code Multi keno-2 was developed in Japan from the original Monte Carlo code Keno-IV. Application of this code to some real problems revealed fatal errors related to its restart option, which is essential for solving time-consuming problems on minicomputers such as the VAX-6320. These errors were corrected and other modifications were carried out in the code; because of these modifications, a new input data description was written. The result is a new VAX/VMS version of the program, also adaptable to mini-mainframes. This new program, called Multi keno-VAX, has been accepted into the NEA-IAEA data bank and added to its international computer codes library. 1 fig.

  12. Multi keno-VAX a modified version of the reactor computer code Multi keno-2

    International Nuclear Information System (INIS)

    Imam, M.

    1995-01-01

    The reactor computer code Multi keno-2 was developed in Japan from the original Monte Carlo code Keno-IV. Application of this code to some real problems revealed fatal errors related to its restart option, which is essential for solving time-consuming problems on minicomputers such as the VAX-6320. These errors were corrected and other modifications were carried out in the code; because of these modifications, a new input data description was written. The result is a new VAX/VMS version of the program, also adaptable to mini-mainframes. This new program, called Multi keno-VAX, has been accepted into the NEA-IAEA data bank and added to its international computer codes library. 1 fig.

  13. Dynamic Computation of Change Operations in Version Management of Business Process Models

    Science.gov (United States)

    Küster, Jochen Malte; Gerth, Christian; Engels, Gregor

    Version management of business process models requires that changes can be resolved by applying change operations. In order to give a user maximal freedom concerning the application order of change operations, position parameters of change operations must be computed dynamically during change resolution. In such an approach, change operations with computed position parameters must be applicable on the model and dependencies and conflicts of change operations must be taken into account because otherwise invalid models can be constructed. In this paper, we study the concept of partially specified change operations where parameters are computed dynamically. We provide a formalization for partially specified change operations using graph transformation and provide a concept for their applicability. Based on this, we study potential dependencies and conflicts of change operations and show how these can be taken into account within change resolution. Using our approach, a user can resolve changes of business process models without being unnecessarily restricted to a certain order.
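The idea of partially specified change operations whose position parameters are computed at resolution time can be illustrated with a toy sequential model. The classes and dependency handling below are a hypothetical sketch, not the paper's graph-transformation formalization.

```python
# Toy illustration of partially specified change operations: the position of
# an insert is computed when the operation is resolved, and operations that
# depend on each other are ordered automatically.
class InsertAfter:
    def __init__(self, node, anchor):
        self.node, self.anchor = node, anchor

    def applicable(self, seq):
        # Dynamic position parameter: the anchor must already be in the model.
        return self.anchor in seq and self.node not in seq

    def apply(self, seq):
        seq.insert(seq.index(self.anchor) + 1, self.node)

def resolve(seq, operations):
    """Apply operations in any order that respects their dependencies."""
    pending = list(operations)
    while pending:
        op = next((op for op in pending if op.applicable(seq)), None)
        if op is None:
            # No applicable operation left: a conflict or unmet dependency.
            raise ValueError("conflicting or unsatisfiable change operations")
        op.apply(seq)
        pending.remove(op)
    return seq

# 'B' depends on 'A' being inserted first; resolve() orders them correctly
# even though the user supplied them in the opposite order.
process = ["start", "end"]
resolve(process, [InsertAfter("B", "A"), InsertAfter("A", "start")])
print(process)  # ['start', 'A', 'B', 'end']
```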

  14. DEVELOPMENT OF QUARRY SOLUTION VERSION 1.0 FOR QUICK COMPUTATION OF DRILLING AND BLASTING PARAMETERS

    Directory of Open Access Journals (Sweden)

    B. ADEBAYO

    2014-10-01

    Computation of drilling cost, quantity of explosives, and blasting cost are routine procedures in quarries, and all of these parameters are estimated manually in most quarries in Nigeria. This paper deals with the development of the application package QUARRY SOLUTION Version 1.0 for quarries using Visual Basic 6.0. Data on drilling and blasting activities were obtained from the quarry, and empirical formulae developed by different researchers were used to compute the required parameters, viz: practical burden, spacing, hole length, cost of drilling consumables, drilling cost, powder factor, quantity of column charge, total quantity of explosives, volume of blast, and blasting cost. The output of QUARRY SOLUTION Version 1.0 for drilling length, drilling cost, total quantity of explosives, volume of blast, and blasting cost was compared with the manually computed results for the same routine drilling and blasting parameters, and the two were found to follow the same trend. The package computed that 611 blast-holes require 3326.71 kg of high explosives (166 cartons) and 20147.2 kg of low explosives (806 bags), at a total cost of N5,133,999.50 ($32,087.49). Moreover, the output showed that these routine parameters could be computed within a short time frame using QUARRY SOLUTION, improving productivity and efficiency. This application package is recommended for use in open pits and quarries whenever all the necessary inputs are supplied.
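The kind of routine blast-design arithmetic such a package automates can be sketched with common textbook relations (a burden-to-hole-diameter rule of thumb and a staggered-pattern spacing ratio); the actual empirical formulae used in QUARRY SOLUTION are not reproduced here, and the input values below are invented.

```python
# Hedged sketch of routine blast-design calculations using generic rules of
# thumb, not QUARRY SOLUTION's formulae.
def blast_design(hole_diameter_m, bench_height_m, charge_per_hole_kg, n_holes):
    burden = 30 * hole_diameter_m        # rule of thumb: B ≈ 25-40 x hole diameter
    spacing = 1.15 * burden              # staggered-pattern spacing ratio
    volume_per_hole = burden * spacing * bench_height_m  # m^3 of rock per hole
    total_volume = volume_per_hole * n_holes
    total_charge = charge_per_hole_kg * n_holes
    powder_factor = total_charge / total_volume          # kg of explosive per m^3
    return {"burden_m": burden, "spacing_m": spacing,
            "total_volume_m3": total_volume,
            "powder_factor_kg_per_m3": powder_factor}

design = blast_design(hole_diameter_m=0.1, bench_height_m=10,
                      charge_per_hole_kg=50, n_holes=100)
print(design["burden_m"], design["spacing_m"])  # ≈ 3.0 and ≈ 3.45 metres
```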

  15. ClustalXeed: a GUI-based grid computation version for high performance and terabyte size multiple sequence alignment

    Directory of Open Access Journals (Sweden)

    Kim Taeho

    2010-09-01

    Background: There is an increasing demand to assemble and align large-scale biological sequence data sets. The commonly used multiple sequence alignment programs are still limited in their ability to handle very large numbers of sequences because they lack a scalable high-performance computing (HPC) environment with greatly extended data storage capacity. Results: We designed ClustalXeed, a software system for multiple sequence alignment with incremental improvements over previous versions of the ClustalX and ClustalW-MPI software. The primary advantage of ClustalXeed over other multiple sequence alignment software is its ability to align a large family of protein or nucleic acid sequences. To solve the conventional memory-dependency problem, ClustalXeed uses both physical random access memory (RAM) and a distributed file-allocation system for distance matrix construction and pair-alignment computation. The computation efficiency of the disk-storage system was markedly improved by implementing an efficient load-balancing algorithm, called the "idle node-seeking task algorithm" (INSTA). The new editing option and the graphical user interface (GUI) provide ready access to a parallel-computing environment for users who seek fast and easy alignment of large DNA and protein sequence sets. Conclusions: ClustalXeed can now compute large biological sequence data sets that were not tractable in any other parallel or single MSA program. The main developments are: (1) the ability to tackle larger sequence alignment problems than was possible with previous systems, through markedly improved storage-handling capabilities; (2) an efficient task load-balancing algorithm, INSTA, which improves overall processing times for multiple sequence alignment of input sequences of non-uniform length; and (3) support for both single-PC and distributed cluster systems.
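The general load-balancing idea behind scheduling pair-alignment tasks of non-uniform cost can be sketched with a greedy longest-task-first assignment to the least-loaded node; this is a generic illustration, not the actual INSTA algorithm.

```python
import heapq

# Illustrative load balancer (not INSTA itself): assign alignment tasks of
# non-uniform cost, longest first, to whichever node currently has the least
# accumulated work, to even out finishing times across nodes.
def assign_tasks(task_costs, n_nodes):
    heap = [(0.0, node) for node in range(n_nodes)]   # (accumulated load, node id)
    heapq.heapify(heap)
    schedule = {node: [] for node in range(n_nodes)}
    for task, cost in sorted(task_costs.items(), key=lambda kv: -kv[1]):
        load, node = heapq.heappop(heap)              # least-loaded node
        schedule[node].append(task)
        heapq.heappush(heap, (load + cost, node))
    return schedule

# Four tasks of cost 4, 3, 3, 2 split across two nodes end up 6 vs 6.
print(assign_tasks({"a": 4, "b": 3, "c": 3, "d": 2}, 2))
```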

  16. A new Fortran 90 program to compute regular and irregular associated Legendre functions (new version announcement)

    Science.gov (United States)

    Schneider, Barry I.; Segura, Javier; Gil, Amparo; Guan, Xiaoxu; Bartschat, Klaus

    2018-04-01

    This is a revised and updated version of a modern Fortran 90 code to compute the regular P_l^m(x) and irregular Q_l^m(x) associated Legendre functions for all x ∈ (-1, +1) (on the cut) and |x| > 1, and integer degree (l) and order (m). The necessity to revise the code comes as a consequence of comments from Prof. James Bremer of the UC Davis Mathematics Department, who discovered errors in the code for large integer degree and order in the normalized regular Legendre functions on the cut.
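The core of any such code is the standard three-term recurrence in degree. A bare-bones sketch for the regular functions on the cut, with the Condon-Shortley phase, is shown below; the published Fortran 90 code additionally handles |x| > 1, the irregular functions Q_l^m, normalization, and scaling to avoid overflow at the large degrees and orders where the reported errors arose.

```python
from math import sqrt

# Regular associated Legendre functions P_l^m(x) for -1 < x < 1 via the
# standard recurrences (Condon-Shortley phase included).
def assoc_legendre_p(l, m, x):
    # Seed: P_m^m(x) = (-1)^m (2m-1)!! (1 - x^2)^(m/2)
    pmm = 1.0
    s = sqrt((1.0 - x) * (1.0 + x))
    for i in range(1, m + 1):
        pmm *= -(2 * i - 1) * s
    if l == m:
        return pmm
    # P_{m+1}^m(x) = x (2m+1) P_m^m(x)
    pmmp1 = x * (2 * m + 1) * pmm
    if l == m + 1:
        return pmmp1
    # Upward recurrence in degree:
    # (l - m) P_l^m = x (2l - 1) P_{l-1}^m - (l + m - 1) P_{l-2}^m
    for ll in range(m + 2, l + 1):
        pmm, pmmp1 = pmmp1, (x * (2 * ll - 1) * pmmp1 - (ll + m - 1) * pmm) / (ll - m)
    return pmmp1

print(assoc_legendre_p(2, 0, 0.5))  # P_2(0.5) = (3*0.25 - 1)/2 = -0.125
```

Upward recurrence in degree is numerically stable for P_l^m on the cut; the irregular Q_l^m generally require downward recurrence or other techniques, which is part of what the published code provides.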

  17. A computationally efficient description of heterogeneous freezing: A simplified version of the Soccer ball model

    Science.gov (United States)

    Niedermeier, Dennis; Ervens, Barbara; Clauss, Tina; Voigtländer, Jens; Wex, Heike; Hartmann, Susan; Stratmann, Frank

    2014-01-01

    In a recent study, the Soccer ball model (SBM) was introduced for modeling and/or parameterizing heterogeneous ice nucleation processes. The model applies classical nucleation theory. It allows for a consistent description of both apparently singular and stochastic ice nucleation behavior, by distributing contact angles over the nucleation sites of a particle population assuming a Gaussian probability density function. The original SBM utilizes the Monte Carlo technique, which hampers its usage in atmospheric models, as fairly time-consuming calculations must be performed to obtain statistically significant results. Thus, we have developed a simplified and computationally more efficient version of the SBM. We successfully used the new SBM to parameterize experimental nucleation data of, e.g., bacterial ice nucleation. Both SBMs give identical results; however, the new model is computationally less expensive as confirmed by cloud parcel simulations. Therefore, it is a suitable tool for describing heterogeneous ice nucleation processes in atmospheric models.
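The structural idea of the simplified SBM can be sketched as replacing Monte Carlo sampling of contact angles with a direct quadrature over the Gaussian contact-angle distribution: a particle stays unfrozen only if all of its nucleation sites survive. The nucleation-rate expression j(theta) below is a placeholder, not the classical-nucleation-theory rate used in the paper.

```python
import math

# Deterministic (non-Monte-Carlo) evaluation of a frozen fraction in the
# spirit of the simplified SBM. All parameter values and the rate model
# j(theta) are illustrative placeholders.
def frozen_fraction(t, n_sites, mu, sigma, j0=1.0, n_quad=2000):
    lo = max(mu - 5 * sigma, 0.0)
    hi = min(mu + 5 * sigma, math.pi)
    dtheta = (hi - lo) / n_quad
    survival, norm = 0.0, 0.0
    for i in range(n_quad):
        theta = lo + (i + 0.5) * dtheta
        p = math.exp(-0.5 * ((theta - mu) / sigma) ** 2)  # Gaussian weight
        j = j0 * math.exp(-theta)         # placeholder: larger angle -> slower rate
        survival += p * math.exp(-j * t)  # probability one site stays unfrozen
        norm += p
    # A particle freezes unless all n_sites of its sites survive to time t.
    return 1.0 - (survival / norm) ** n_sites
```

Because the integral is evaluated directly, no sampling noise remains, which is what makes this form cheap enough for cloud parcel simulations.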

  18. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    OpenAIRE

    Mohammad Mohammadi; Masoud Barzgaran

    2010-01-01

    Computer technology has provided language testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET, and PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g., mode of test delivery, familiarity with computers), the question may be whether the two modes of computer- and paper-based te...

  19. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the se...

  20. Independent validation testing of the FLAME computer code, Version 1.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-07-01

    Independent testing of the FLAME computer code, Version 1.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Validation tests, (i.e., tests which compare field data to the computer generated solutions) were used to determine the operational status of the FLAME computer code and were done on a qualitative basis through graphical comparisons of the experimental and numerical data. These tests were specifically designed to check: (1) correctness of the FORTRAN coding, (2) computational accuracy, and (3) suitability to simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: (1) independent applications, and (2) graduated difficulty of test cases. Three tests ranging in complexity from simple one-dimensional steady-state flow field problems under near-saturated conditions to two-dimensional transient flow problems with very dry initial conditions

  1. US Army Weapon Systems Human-Computer Interface (WSHCI) style guide, Version 1

    Energy Technology Data Exchange (ETDEWEB)

    Avery, L.W.; O'Mara, P.A.; Shepard, A.P.

    1996-09-30

    A stated goal of the U.S. Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of style guides. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real time and near-real time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an Army weapon systems unique HCI style guide. This document, the U.S. Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide, represents the first version of that style guide. The purpose of this document is to provide HCI design guidance for RT/NRT Army systems across the weapon systems domains of ground, aviation, missile, and soldier systems. Each domain should customize and extend this guidance by developing their domain-specific style guides, which will be used to guide the development of future systems within their domains.

  2. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system; subsequent versions were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of an a posteriori V&V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: reason(s) why a posteriori verification is to be performed; scope and objectives for the level of verification selected; development products to be used for the review; availability and use of user experience; and actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. The development products that were used

  3. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2009-03-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.
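The "decay and ingrowth during transport" step rests on the Bateman equations. A two-member illustration is sketched below; RSAC-7's own solver handles full decay chains and the subsequent pathway dose calculations, which are not shown.

```python
import math

# Two-member Bateman solution: parent decays with constant lam_parent while
# the child (initially absent) grows in and decays with lam_child.
# Requires lam_child != lam_parent.
def decay_ingrowth(n_parent0, lam_parent, lam_child, t):
    """Return (parent atoms, ingrown child atoms) after time t."""
    n_parent = n_parent0 * math.exp(-lam_parent * t)
    n_child = (n_parent0 * lam_parent / (lam_child - lam_parent)
               * (math.exp(-lam_parent * t) - math.exp(-lam_child * t)))
    return n_parent, n_child
```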

  4. Radiological Safety Analysis Computer (RSAC) Program Version 7.2 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2010-10-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.2 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.

  5. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users Manual

    International Nuclear Information System (INIS)

    Schrader, Bradley J.

    2009-01-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods

  6. A randomised clinical trial of intrapartum fetal monitoring with computer analysis and alerts versus previously available monitoring

    Directory of Open Access Journals (Sweden)

    Santos Cristina

    2010-10-01

    Background: Intrapartum fetal hypoxia remains an important cause of death and permanent handicap, and in a significant proportion of cases there is evidence of suboptimal care related to fetal surveillance. Cardiotocographic (CTG) monitoring remains the basis of intrapartum surveillance, but its interpretation by healthcare professionals lacks reproducibility, and the technology has not been shown to improve clinically important outcomes. The addition of fetal electrocardiogram analysis has increased the potential to avoid adverse outcomes, but CTG interpretation remains its main weakness. A program for computerised analysis of intrapartum fetal signals, incorporating real-time alerts for healthcare professionals, has recently been developed. There is a need to determine whether this technology can result in better perinatal outcomes. Methods/design: This is a multicentre randomised clinical trial. Inclusion criteria are: women aged ≥ 16 years, able to provide written informed consent, singleton pregnancies ≥ 36 weeks, cephalic presentation, no known major fetal malformations, in labour but excluding active second stage, planned for continuous CTG monitoring, and no known contra-indication for vaginal delivery. Eligible women will be randomised using a computer-generated randomisation sequence to one of two arms: continuous computer analysis of fetal monitoring signals with real-time alerts (intervention arm) or continuous CTG monitoring as previously performed (control arm). Electrocardiographic monitoring and fetal scalp blood sampling will be available in both arms. The primary outcome measure is the incidence of fetal metabolic acidosis (umbilical artery pH < 7.05 and BDecf > 12 mmol/L). Secondary outcome measures include caesarean section and instrumental vaginal delivery rates, use of fetal blood sampling, and 5-minute Apgar score. Discussion: This study will provide evidence of the impact of intrapartum monitoring with computer analysis and real

  7. Air Space Proportion in Pterosaur Limb Bones Using Computed Tomography and Its Implications for Previous Estimates of Pneumaticity

    Science.gov (United States)

    Martin, Elizabeth G.; Palmer, Colin

    2014-01-01

    Air Space Proportion (ASP) is a measure of how much air is present within a bone, which allows for a quantifiable comparison of pneumaticity between specimens and species. Measured from zero to one, higher ASP means more air and less bone. Conventionally, it is estimated from measurements of the internal and external bone diameter, or by analyzing cross-sections. To date, the only pterosaur ASP study has been carried out by visual inspection of sectioned bones within matrix. Here, computed tomography (CT) scans are used to calculate ASP in a small sample of pterosaur wing bones (mainly phalanges) and to assess how the values change throughout the bone. These results show higher ASPs than previous pterosaur pneumaticity studies, and more significantly, higher ASP values in the heads of wing bones than the shaft. This suggests that pneumaticity has been underestimated previously in pterosaurs, birds, and other archosaurs when shaft cross-sections are used to estimate ASP. Furthermore, ASP in pterosaurs is higher than those found in birds and most sauropod dinosaurs, giving them among the highest ASP values of animals studied so far, supporting the view that pterosaurs were some of the most pneumatized animals to have lived. The high degree of pneumaticity found in pterosaurs is proposed to be a response to the wing bone bending stiffness requirements of flight rather than a means to reduce mass, as is often suggested. Mass reduction may be a secondary result of pneumaticity that subsequently aids flight. PMID:24817312
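The two ways of estimating ASP described above reduce to simple arithmetic: from internal and external diameters of an idealized circular cross-section, or from counts of air versus bone voxels in a CT slice. A minimal sketch:

```python
# ASP (Air Space Proportion): fraction of a bone's cross-section occupied by
# air, from 0 (solid) to 1 (all air).
def asp_from_diameters(inner_d, outer_d):
    """Conventional estimate for an idealized circular cross-section."""
    return (inner_d / outer_d) ** 2

def asp_from_ct_voxels(air_voxels, bone_voxels):
    """CT-based estimate: air voxels over all voxels inside the cortex."""
    return air_voxels / (air_voxels + bone_voxels)

print(asp_from_diameters(9.0, 10.0))  # ≈ 0.81 for a 0.5 mm cortex on a 10 mm bone
```

The paper's observation that ASP is higher in the heads of wing bones than in the shaft is precisely why a single shaft cross-section, fed into formulas like these, underestimates whole-bone pneumaticity.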

  8. COMODI: an ontology to characterise differences in versions of computational models in biology.

    Science.gov (United States)

    Scharm, Martin; Waltemath, Dagmar; Mendes, Pedro; Wolkenhauer, Olaf

    2016-07-11

    Open model repositories provide ready-to-reuse computational models of biological systems. Models within those repositories evolve over time, leading to different model versions. Taken together, the underlying changes reflect a model's provenance and thus can give valuable insights into the studied biology. Currently, however, changes cannot be semantically interpreted. To improve this situation, we developed an ontology of terms describing changes in models. The ontology can be used by scientists and within software to characterise model updates at the level of single changes. When studying or reusing a model, these annotations help with determining the relevance of a change in a given context. We manually studied changes in selected models from BioModels and the Physiome Model Repository. Using the BiVeS tool for difference detection, we then performed an automatic analysis of changes in all models published in these repositories. The resulting set of concepts led us to define candidate terms for the ontology. In a final step, we aggregated and classified these terms and built the first version of the ontology. We present COMODI, an ontology needed because COmputational MOdels DIffer. It empowers users and software to describe changes in a model on the semantic level. COMODI also enables software to implement user-specific filter options for the display of model changes. Finally, COMODI is a step towards predicting how a change in a model influences the simulation results. COMODI, coupled with our algorithm for difference detection, ensures the transparency of a model's evolution, and it enhances the traceability of updates and error corrections. COMODI is encoded in OWL. It is openly available at http://comodi.sems.uni-rostock.de/ .
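As a rough sketch of how such semantic annotations might look, the snippet below records one model change as RDF-style triples. The term and property names (Update, affects) and the namespace usage are illustrative placeholders, not necessarily the identifiers defined in the released ontology:

```python
# Minimal sketch of annotating a model change with COMODI-style terms.
# The concrete term names below are assumptions for illustration only.
COMODI = "http://comodi.sems.uni-rostock.de/comodi#"

def annotate_change(change_id, target, change_type):
    """Return RDF-like (subject, predicate, object) triples describing
    a single change between two versions of a computational model."""
    subject = f"urn:change:{change_id}"
    return [
        (subject, "rdf:type", COMODI + change_type),  # kind of change
        (subject, COMODI + "affects", target),        # element touched
    ]

triples = annotate_change(1, "model:kinetic_parameter_k1", "Update")
```

Software such as a difference-detection tool could emit triples like these for every change, and user-specific filters would then be simple queries over the change types.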

  9. Computer-aided detection system performance on current and previous digital mammograms in patients with contralateral metachronous breast cancer

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min

    2012-01-01

    Background: The computer-aided detection (CAD) system is widely used for screening mammography. The performance of the CAD system for contralateral breast cancer has not been reported for women with a history of breast cancer. Purpose: To retrospectively evaluate the performance of a CAD system on current and previous mammograms in patients with contralateral metachronous breast cancer. Material and Methods: During a 3-year period, 4945 postoperative patients had follow-up examinations, from whom we selected 55 women with contralateral breast cancers. Among them, 38 had visible malignant signs on the current mammograms. We analyzed the sensitivity and false-positive marks of the system on the current and previous mammograms according to lesion type and breast density. Results: The total visible lesion components on the current mammograms included 27 masses and 14 calcifications in 38 patients. The case-based sensitivity for all lesion types was 63.2% (24/38) with false-positive marks of 0.71 per patient. The lesion-based sensitivity for masses and calcifications was 59.3% (16/27) and 71.4% (10/14), respectively. The lesion-based sensitivity for masses in fatty and dense breasts was 68.8% (11/16) and 45.5% (5/11), respectively. The lesion-based sensitivity for calcifications in fatty and dense breasts was 100.0% (3/3) and 63.6% (7/11), respectively. The total visible lesion components on the previous mammograms included 13 masses and three calcifications in 16 patients, and the sensitivity for all lesion types was 31.3% (5/16) with false-positive marks of 0.81 per patient. On these mammograms, the sensitivity for masses and calcifications was 30.8% (4/13) and 33.3% (1/3), respectively. The sensitivity in fatty and dense breasts was 28.6% (2/7) and 33.3% (3/9), respectively. Conclusion: In the women with a history of breast cancer, the sensitivity of the CAD system in visible contralateral breast cancer was lower than in most previous reports using the same CAD
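The reported performance figures are straightforward ratios, and the snippet below reproduces the case-based and lesion-based sensitivities for the current mammograms from the counts given in the abstract:

```python
def sensitivity(true_positives, total_lesions):
    """Fraction of lesions (or cases) correctly marked by the CAD system."""
    return true_positives / total_lesions

def fp_marks_per_patient(total_false_marks, n_patients):
    """Average number of false-positive CAD marks per patient."""
    return total_false_marks / n_patients

# counts reported for the current mammograms in this study
case_sens = sensitivity(24, 38)   # case-based, all lesion types
mass_sens = sensitivity(16, 27)   # lesion-based, masses
calc_sens = sensitivity(10, 14)   # lesion-based, calcifications
```

Multiplying by 100 and rounding recovers the 63.2%, 59.3%, and 71.4% figures quoted in the abstract.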

  10. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    Directory of Open Access Journals (Sweden)

    Mohammad Mohammadi

    2010-11-01

    Full Text Available Computer technology has provided language testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET, and PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g. mode of test delivery, familiarity with computers, etc.), the question arises whether the two modes of computer- and paper-based tests comparably measure the same construct, and hence, whether the scores obtained from the two modes can be used interchangeably. Accordingly, the present study aimed to investigate the comparability of the paper- and computer-based versions of a writing test. The data for this study were collected by administering the writing section of a Cambridge Preliminary English Test (PET) to eighty Iranian intermediate EFL learners through the two modes of computer- and paper-based testing. In addition, a computer familiarity questionnaire was used to divide participants into two groups with high and low computer familiarity. The results of the independent samples t-test revealed that there was no statistically significant difference between the learners' computer- and paper-based writing scores. The results of the paired samples t-test showed no statistically significant difference between high- and low-computer-familiar groups on computer-based writing. The researchers concluded that the two modes comparably measured the same construct.
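The mode comparison rests on a t statistic over paired scores from the same test-takers, which can be computed directly from the score differences; the scores below are invented for illustration and are not the study's data:

```python
import math
from statistics import mean, stdev

def paired_t(x, y):
    """Paired-samples t statistic for two score lists from the same
    test-takers (e.g. computer-based vs. paper-based writing scores).

    Returns (t, degrees of freedom)."""
    d = [a - b for a, b in zip(x, y)]       # per-learner differences
    n = len(d)
    return mean(d) / (stdev(d) / math.sqrt(n)), n - 1

# illustrative scores for five learners (hypothetical data)
computer = [14, 12, 15, 11, 13]
paper    = [13, 12, 14, 12, 13]
t, df = paired_t(computer, paper)
```

A small t relative to the t distribution with n-1 degrees of freedom, as reported here, means the mean difference between modes is not statistically distinguishable from zero.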

  11. Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0

    Science.gov (United States)

    Knox, J. C.

    1996-01-01

    The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade-study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been used successfully in studies such as the evaluation of carbon dioxide removal on the space station. CASE/A provides a graphical and command-driven modeling interface. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connecting the components via flow streams and defining their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set and the simulation run on the model constructed. After the simulation completes, graphical plots or text files can be obtained for evaluating the simulation results over time. Additionally, users can control the simulation and extract information at various times during the run (e.g., adjust equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A Version 5.0 runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.

  12. USER'S GUIDE TO THE PERSONAL COMPUTER VERSION OF THE BIOGENIC EMISSIONS INVENTORY SYSTEM (PC-BEIS2)

    Science.gov (United States)

    The document is a user's guide for an updated Personal Computer version of the Biogenic Emissions Inventory System (PC-BEIS2), allowing users to estimate hourly emissions of biogenic volatile organic compounds (BVOCs) and soil nitrogen oxide emissions for any county in the contig...

  13. Monteray Mark-I: Computer program (PC-version) for shielding calculation with Monte Carlo method

    International Nuclear Information System (INIS)

    Pudjijanto, M.S.; Akhmad, Y.R.

    1998-01-01

    A computer program for gamma-ray shielding calculations using the Monte Carlo method has been developed. The program is written in the WATFOR77 language. MONTERAY MARK-I was originally developed by James Wood; it was modified by the authors so that the modified version is easily executed. Applying the Monte Carlo method, the program follows gamma-photon transport in an infinite planar shield of various thicknesses. Each photon is tracked until it escapes from the shield or its energy falls below the cut-off energy. Pair production is treated as a pure absorption process, i.e. the annihilation photons generated in the process are neglected in the calculation. The output data calculated by the program are the total albedo, build-up factor, and photon spectra. The calculated build-up factors for slab lead and water media with a 6 MeV parallel-beam gamma source are in agreement with published data. Hence the program is adequate as a shielding design tool for studying gamma radiation transport in various media
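The transport scheme described, tracking each photon until it escapes or is absorbed, can be sketched as a minimal 1-D Monte Carlo random walk. The fixed scatter probability and isotropic re-emission below are simplifying assumptions for illustration, not MONTERAY's actual cross-section physics:

```python
import math
import random

def slab_transport(n_photons=20000, thickness_mfp=2.0,
                   scatter_prob=0.5, seed=1):
    """Minimal 1-D Monte Carlo sketch of gamma transport through a slab.

    Distances are in mean free paths; each collision is either an
    isotropic scatter or an absorption. Returns the transmitted and
    reflected fractions, a toy analogue of the albedo the report lists.
    """
    rng = random.Random(seed)
    transmitted = reflected = 0
    for _ in range(n_photons):
        x, mu = 0.0, 1.0                 # depth and direction cosine
        while True:
            # sample an exponentially distributed free path
            x += mu * -math.log(1.0 - rng.random())
            if x >= thickness_mfp:
                transmitted += 1
                break
            if x < 0.0:
                reflected += 1
                break
            if rng.random() > scatter_prob:   # absorbed, history ends
                break
            mu = 2.0 * rng.random() - 1.0     # isotropic re-emission
    return transmitted / n_photons, reflected / n_photons

frac_t, frac_r = slab_transport()
```

The transmitted fraction exceeds the uncollided estimate exp(-2) because scattered photons also leak through, which is exactly the effect the build-up factor quantifies.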

  14. THEAP-I: A computer program for thermal hydraulic analysis of a thermally interacting channel bundle of complex geometry. The micro computer version user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Megaritou, A; Bartzis, J G

    1987-09-01

    In the present report the microcomputer version of the code is described. More emphasis is given to the new features of the code (i.e. the input data structure). A set of instructions for running on an IBM-AT2 computer with Microsoft FORTRAN V.4.0 is also included, together with a sample problem referring to the Greek Research Reactor.

  15. SACRD: a data base for fast reactor safety computer codes, contents and glossary of Version 1 of the system

    International Nuclear Information System (INIS)

    Greene, N.M.; Forsberg, V.M.; Raiford, G.B.; Arwood, J.W.; Flanagan, G.F.

    1979-01-01

    SACRD is a data base of material properties and other handbook data needed in computer codes used for fast reactor safety studies. This document lists the contents of Version 1 and also serves as a glossary of terminology used in the data base. Data are available in the thermodynamics, heat transfer, fluid mechanics, structural mechanics, aerosol transport, meteorology, neutronics and dosimetry areas. Tabular, graphical and parameterized data are provided in many cases

  16. Certification of version 1.2 of the PORFLO-3 code for the WHC scientific and engineering computational center

    International Nuclear Information System (INIS)

    Kline, N.W.

    1994-01-01

    Version 1.2 of the PORFLO-3 Code has migrated from the Hanford Cray computer to workstations in the WHC Scientific and Engineering Computational Center. The workstation-based configuration and acceptance testing are inherited from the CRAY-based configuration. The purpose of this report is to document differences in the new configuration as compared to the parent Cray configuration, and summarize some of the acceptance test results which have shown that the migrated code is functioning correctly in the new environment

  17. CASKETSS-2: a computer code system for thermal and structural analysis of nuclear fuel shipping casks (version 2)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1991-08-01

    A computer program, CASKETSS-2, has been developed for the thermal and structural analysis of nuclear fuel shipping casks. CASKETSS-2 stands for a modular CASK Evaluation code system for Thermal and Structural Safety (Version 2). The main features of CASKETSS-2 are as follows: (1) thermal and structural analysis programs for one-, two-, and three-dimensional geometries are contained in the code system; (2) the structural analysis part provides both simplified computer programs and a detailed one; (3) an input data generator is provided in the code system; (4) a graphics program is provided in the code system. The paper briefly illustrates the calculation methods, input data and sample calculations. (author)

  18. COSY INFINITY Version 9

    International Nuclear Information System (INIS)

    Makino, Kyoko; Berz, Martin

    2006-01-01

    In this paper, we review the features in the newly released version of COSY INFINITY, which currently has a base of more than 1000 registered users, focusing on topics that are new and on some features that became available after the initial releases of the previous versions 8 and 8.1. The recent main enhancements of the code are devoted to the reliability and efficiency of the computation, to verified integration, and to rigorous global optimization. Various data types are available in COSY INFINITY to support these goals, and the paper also reviews the features and usage of those data types

  19. Test-retest reliability and comparability of paper and computer questionnaires for the Finnish version of the Tampa Scale of Kinesiophobia.

    Science.gov (United States)

    Koho, P; Aho, S; Kautiainen, H; Pohjolainen, T; Hurri, H

    2014-12-01

    To estimate the internal consistency, test-retest reliability and comparability of paper and computer versions of the Finnish version of the Tampa Scale of Kinesiophobia (TSK-FIN) among patients with chronic pain. In addition, patients' personal experiences of completing both versions of the TSK-FIN and preferences between these two methods of data collection were studied. Test-retest reliability study. Paper and computer versions of the TSK-FIN were completed twice on two consecutive days. The sample comprised 94 consecutive patients with chronic musculoskeletal pain participating in a pain management or individual rehabilitation programme. The group rehabilitation design consisted of physical and functional exercises, evaluation of the social situation, psychological assessment of pain-related stress factors, and personal pain management training in order to regain overall function and mitigate the inconvenience of pain and fear-avoidance behaviour. The mean TSK-FIN score was 37.1 [standard deviation (SD) 8.1] for the computer version and 35.3 (SD 7.9) for the paper version. The mean difference between the two versions was 1.9 (95% confidence interval 0.8 to 2.9). Test-retest reliability was 0.89 for the paper version and 0.88 for the computer version. Internal consistency was considered to be good for both versions. The intraclass correlation coefficient for comparability was 0.77 (95% confidence interval 0.66 to 0.85), indicating substantial reliability between the two methods. Both versions of the TSK-FIN demonstrated substantial intertest reliability, good test-retest reliability, good internal consistency and acceptable limits of agreement, suggesting their suitability for clinical use. However, subjects tended to score higher when using the computer version. As such, in an ideal situation, data should be collected in a similar manner throughout the course of rehabilitation or clinical research. Copyright © 2014 Chartered Society of Physiotherapy. 
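The intraclass correlation coefficient reported for comparability can be computed from a subjects-by-versions score matrix. Below is a minimal sketch of the two-way random-effects, absolute-agreement, single-measure form ICC(2,1), using invented scores rather than the study's data:

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measure. `scores` is an (n_subjects x k_methods) array; here the
    two 'methods' would be the paper and computer TSK-FIN versions."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)          # per-subject means
    col_means = x.mean(axis=0)          # per-method means
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)
    mse = ((x - row_means[:, None] - col_means[None, :] + grand) ** 2
           ).sum() / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# hypothetical paper/computer score pairs for four respondents
icc = icc_2_1([[35, 37], [42, 41], [28, 30], [50, 48]])
```

The absolute-agreement form is the relevant one here because the computer version tended to score systematically higher, and that shift should count against agreement.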

  20. GROGi-F. Modified version of GROGi 2 nuclear evaporation computer code including fission decay channel

    International Nuclear Information System (INIS)

    Delagrange, H.

    1977-01-01

    This report is the user manual for the GROGI-F code, a modified version of the GROGI-2 code. It calculates the cross sections for heavy-ion-induced fission. Fission probabilities are calculated via the Bohr-Wheeler formalism

  1. Next Generation Computer Resources: Reference Model for Project Support Environments (Version 2.0)

    National Research Council Canada - National Science Library

    Brown, Alan

    1993-01-01

    The objective of the Next Generation Computer Resources (NGCR) program is to restructure the Navy's approach to acquisition of standard computing resources to take better advantage of commercial advances and investments...

  2. Trusted Network Interpretation of the Trusted Computer System Evaluation Criteria. Version 1.

    Science.gov (United States)

    1987-07-01

    for Secure Computer Systems, MTR-3153, The MITRE Corporation, Bedford, MA, June 1975. See, for example, M. D. Abrams and H. J. Podell, Tutorial: Computer and Network Security, IEEE Computer Society Press, 1987. Addendum to the

  3. Calculations of reactor-accident consequences, Version 2. CRAC2: computer code user's guide

    International Nuclear Information System (INIS)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    1983-02-01

    The CRAC2 computer code is a revision of CRAC, the Calculation of Reactor Accident Consequences computer code developed for the Reactor Safety Study. CRAC2 incorporates significant modeling improvements in the areas of weather-sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems

  4. MINIVER: Miniature version of real/ideal gas aero-heating and ablation computer program

    Science.gov (United States)

    Hendler, D. R.

    1976-01-01

    The computer code provides heat-transfer multiplication factors, special flow-field simulation techniques, different heat-transfer methods, different transition criteria, crossflow simulation, and a more efficient thin-skin thickness optimization procedure.

  5. Montage Version 3.0

    Science.gov (United States)

    Jacob, Joseph; Katz, Daniel; Prince, Thomas; Berriman, Graham; Good, John; Laity, Anastasia

    2006-01-01

    The final version (3.0) of the Montage software has been released. To recapitulate from previous NASA Tech Briefs articles about Montage: This software generates custom, science-grade mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. This software can be executed on single-processor computers, multi-processor computers, and such networks of geographically dispersed computers as the National Science Foundation's TeraGrid or NASA's Information Power Grid. The primary advantage of running Montage in a grid environment is that computations can be done on a remote supercomputer for efficiency. Multiple computers at different sites can be used for different parts of a computation, a significant advantage in cases of computations for large mosaics that demand more processor time than is available at any one site. Version 3.0 incorporates several improvements over prior versions. The most significant improvement is that this version is accessible to scientists located anywhere, through operational Web services that provide access to data from several large astronomical surveys and construct mosaics on either local workstations or remote computational grids as needed.

  6. FLICA-4 (version 1) a computer code for three dimensional thermal analysis of nuclear reactor cores

    International Nuclear Information System (INIS)

    Raymond, P.; Allaire, G.; Boudsocq, G.

    1995-01-01

    FLICA-4 is a thermal-hydraulic computer code developed at the French Atomic Energy Commission (CEA) for three-dimensional steady-state or transient two-phase flow, for design and safety thermal analysis of nuclear reactor cores. The two-phase flow model of FLICA-4 is based on four balance equations for the fluid: three balance equations for the mixture and a mass balance equation for the less concentrated phase, which permits the calculation of non-equilibrium flows such as subcooled boiling and superheated steam. A drift-velocity model takes into account the velocity disequilibrium between phases. The thermal behaviour of fuel elements can be computed by a one-dimensional heat conduction equation in plane, cylindrical or spherical geometries, coupled to the fluid flow calculation. Convection and diffusion of solution products, which are transported either by the liquid or by the gas, can be evaluated by solving specific mass conservation equations. A one-dimensional two-phase flow model can also be used to compute 1-D flow in pipes, guide tubes, BWR assemblies or RBMK channels. The FLICA-4 computer code uses fast-running steam-water property functions. Phasic and saturation physical properties are computed using bi-cubic spline functions, with polynomial coefficients tabulated from 0.1 to 22 MPa and 0 to 800 degrees C. Specific modules can be used to generate the spline coefficients for any other fluid's properties
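The tabulated-property approach (bi-cubic splines fitted over a pressure-temperature grid, then evaluated cheaply at run time) can be sketched with a standard spline routine. The "property" below is a smooth stand-in function, not real steam-table data:

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# coarse (pressure, temperature) grid spanning the range quoted above
p = np.linspace(0.1, 22.0, 12)     # pressure, MPa
t = np.linspace(0.0, 800.0, 17)    # temperature, deg C

# toy stand-in for a fluid property varying smoothly in p and t
prop = np.sqrt(p)[:, None] * (1.0 + t / 800.0)[None, :]

# bicubic by default (kx=ky=3); evaluation is then a fast local lookup
spline = RectBivariateSpline(p, t, prop)
value = spline(10.0, 400.0)[0, 0]
```

Fitting once and evaluating the spline thereafter replaces repeated calls to expensive property correlations, which is what makes the tabulated functions "fast running".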

  7. High Performance Computing - Power Application Programming Interface Specification Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Levenhagen, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Olivier, Stephen Lecler [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ward, H. Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  8. CARES (Computer Analysis for Rapid Evaluation of Structures) Version 1.0, seismic module

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulas, A.J.; Miller, C.A.; Costantino, C.J.

    1990-07-01

    During FY's 1988 and 1989, Brookhaven National Laboratory (BNL) developed the CARES system (Computer Analysis for Rapid Evaluation of Structures) for the US Nuclear Regulatory Commission (NRC). CARES is a PC software system which has been designed to perform structural response computations similar to those encountered in licensing reviews of nuclear power plant structures. The documentation of the Seismic Module of CARES consists of three volumes. This report is Volume 2 of the three volume documentation of the Seismic Module of CARES and represents the User's Manual. 14 refs

  9. CARES (Computer Analysis for Rapid Evaluation of Structures) Version 1.0, seismic module

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulas, A.J.; Miller, C.A.; Costantino, C.J.

    1990-07-01

    During FY's 1988 and 1989, Brookhaven National Laboratory (BNL) developed the CARES system (Computer Analysis for Rapid Evaluation of Structures) for the US Nuclear Regulatory Commission (NRC). CARES is a PC software system which has been designed to perform structural response computations similar to those encountered in licensing reviews of nuclear power plant structures. The documentation of the Seismic Module of CARES consists of three volumes. This report represents Volume 3 of the three-volume documentation of the Seismic Module of CARES. It presents three sample problems typically encountered in Soil-Structure Interaction analyses. 14 refs., 36 figs., 2 tabs

  10. CARES (Computer Analysis for Rapid Evaluation of Structures) Version 1.0, seismic module

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulas, A.J.; Miller, C.A.; Costantino, C.J.

    1990-07-01

    During FY's 1988 and 1989, Brookhaven National Laboratory (BNL) developed the CARES system (Computer Analysis for Rapid Evaluation of Structures) for the US Nuclear Regulatory Commission (NRC). CARES is a PC software system which has been designed to perform structural response computations similar to those encountered in licensing reviews of nuclear power plant structures. The documentation of the Seismic Module of CARES consists of three volumes. This report represents Volume 1 of the three-volume documentation of the Seismic Module of CARES. It concentrates on the theoretical basis of the system and presents modeling assumptions and limitations as well as solution schemes and algorithms of CARES. 31 refs., 6 figs

  11. Bifurcation approach to the predator-prey population models (Version of the computer book)

    International Nuclear Information System (INIS)

    Bazykin, A.D.; Zudin, S.L.

    1993-09-01

    A hierarchically organized family of predator-prey systems is studied. The classification is founded on two interacting principles: a biological one and a mathematical one. The different combinations of biological factors included correspond to different bifurcations (up to codimension 3). Both theoretical and computational methods are used in the analysis, especially concerning non-local bifurcations. (author). 6 refs, figs
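A minimal member of such a predator-prey family (logistic prey growth with a linear predation term) can be integrated numerically; the model form and all parameter values below are illustrative only, not taken from the book:

```python
def predator_prey_step(state, dt, r=1.0, K=10.0, a=0.3, b=0.5, d=0.4):
    """One RK4 step of a logistic-prey predator-prey model:
        dx/dt = r*x*(1 - x/K) - a*x*y
        dy/dt = b*a*x*y - d*y
    Parameters are illustrative; varying them is what moves the system
    between the bifurcation regimes such a classification maps out."""
    def f(s):
        x, y = s
        return (r * x * (1 - x / K) - a * x * y,
                b * a * x * y - d * y)
    def add(s, k, h):
        return (s[0] + h * k[0], s[1] + h * k[1])
    k1 = f(state)
    k2 = f(add(state, k1, dt / 2))
    k3 = f(add(state, k2, dt / 2))
    k4 = f(add(state, k3, dt))
    return (state[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            state[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

s = (5.0, 2.0)                  # initial prey and predator densities
for _ in range(2000):           # integrate to t = 20
    s = predator_prey_step(s, 0.01)
```

With these parameters the trajectory stays bounded and spirals toward an interior equilibrium; adding further biological factors (saturation, predator competition, and so on) is what produces the richer bifurcation structure the book classifies.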

  12. Green computing: power optimisation of VFI-based real-time multiprocessor dataflow applications (extended version)

    NARCIS (Netherlands)

    Ahmad, W.; Holzenspies, P.K.F.; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2015-01-01

    Execution time is no longer the only performance metric for computer systems. In fact, a trend is emerging to trade raw performance for energy savings. Techniques like Dynamic Power Management (DPM, switching to low power state) and Dynamic Voltage and Frequency Scaling (DVFS, throttling processor

  13. The SX Solver: A Computer Program for Analyzing Solvent-Extraction Equilibria: Version 3.0

    International Nuclear Information System (INIS)

    Lumetta, Gregg J.

    2001-01-01

    A new computer program, the SX Solver, has been developed to analyze solvent-extraction equilibria. The program operates out of Microsoft Excel and uses the built-in Solver function to minimize the sum of the squares of the residuals between measured and calculated distribution coefficients. The extraction of nitric acid by tributyl phosphate has been modeled to illustrate the program's use
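The fitting idea, adjusting an equilibrium constant until calculated distribution coefficients best match the measured ones in the least-squares sense, can be sketched without Excel. The model form D = K·[TBP]² and all numbers below are invented for illustration:

```python
# Hypothetical data: free extractant concentration (M) and "measured"
# distribution coefficients, roughly following D = K * [TBP]**2.
tbp = [0.2, 0.4, 0.6, 0.8]
d_meas = [0.082, 0.33, 0.72, 1.30]

def ssr(k):
    """Sum of squared residuals between measured and calculated D."""
    return sum((dm - k * c ** 2) ** 2 for c, dm in zip(tbp, d_meas))

# crude 1-D minimisation by scanning candidate K values on a grid
# (Excel's Solver uses a proper gradient search instead)
best_k = min((round(k * 0.001, 3) for k in range(1, 5001)), key=ssr)
```

The scan recovers K close to 2, the value used to generate the toy data, and any residual-minimising routine would find the same optimum since ssr is quadratic in K.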

  14. FLICA-4 (version 1). A computer code for three dimensional thermal analysis of nuclear reactor cores

    International Nuclear Information System (INIS)

    Raymond, P.; Allaire, G.; Boudsocq, G.; Caruge, D.; Gramont, T. de; Toumi, I.

    1995-01-01

    FLICA-4 is a thermal-hydraulic computer code, developed at the French Atomic Energy Commission (CEA) for three-dimensional steady-state or transient two-phase flow, and aimed at design and safety thermal analysis of nuclear reactor cores. It is available for various UNIX workstations and CRAY computers under UNICOS. It is based on four balance equations, which include three balance equations for the mixture and a mass balance equation for the less concentrated phase, allowing the calculation of non-equilibrium flows such as subcooled boiling and superheated steam. A drift-velocity model takes into account the velocity imbalance between phases. The equations are solved using a finite-volume numerical scheme. Typical running times, specific features (coupling with other codes) and auxiliary programs are presented. 1 tab., 9 refs

  15. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hendrickson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  16. High Performance Computing - Power Application Programming Interface Specification Version 1.4

    Energy Technology Data Exchange (ETDEWEB)

    Laros III, James H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); DeBonis, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Levenhagen, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Olivier, Stephen Lecler [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  17. V.S.O.P. (99/09) computer code system for reactor physics and fuel cycle simulation. Version 2009

    International Nuclear Information System (INIS)

    Ruetten, H.J.; Haas, K.A.; Brockmann, H.; Ohlig, U.; Pohl, C.; Scherer, W.

    2010-07-01

    V.S.O.P. (99/09) represents the further development of V.S.O.P. (99/05). Compared to its precursor, the code system has again been improved in many details. The main motivation for this new code version was to update the basic nuclear libraries used by the code system; thus, all cross-section libraries involved in the code are now based on ENDF/B-VII. V.S.O.P. is a computer code system for the comprehensive numerical simulation of the physics of thermal reactors. It comprises the set-up of the reactor and of the fuel element, processing of cross sections, neutron spectrum evaluation, neutron diffusion calculation in two or three dimensions, fuel burnup, fuel shuffling, reactor control, thermal hydraulics and fuel cycle costs. The thermal hydraulics part (steady-state and time-dependent) is restricted to gas-cooled reactors and to two spatial dimensions. The code can simulate reactor operation from the initial core towards the equilibrium core. This latest code version was developed and tested under the WINDOWS-XP operating system. (orig.)

  18. V.S.O.P. (99/09) computer code system for reactor physics and fuel cycle simulation. Version 2009

    Energy Technology Data Exchange (ETDEWEB)

    Ruetten, H.J.; Haas, K.A.; Brockmann, H.; Ohlig, U.; Pohl, C.; Scherer, W.

    2010-07-15

    V.S.O.P. (99/09) represents the further development of V.S.O.P. (99/05). Compared with its precursor, the code system has again been improved in many details. The main motivation for this new code version was to update the basic nuclear libraries used by the code system; all cross-section libraries involved in the code are now based on ENDF/B-VII. V.S.O.P. is a computer code system for the comprehensive numerical simulation of the physics of thermal reactors. It comprises the setup of the reactor and of the fuel element, processing of cross sections, neutron spectrum evaluation, neutron diffusion calculation in two or three dimensions, fuel burnup, fuel shuffling, reactor control, thermal hydraulics and fuel cycle costs. The thermal hydraulics part (steady state and time-dependent) is restricted to gas-cooled reactors and to two spatial dimensions. The code can simulate reactor operation from the initial core to the equilibrium core. This latest code version was developed and tested under the Windows XP operating system. (orig.)

  19. White paper: A vision for a computing initiative for MFE. Revised version

    International Nuclear Information System (INIS)

    Cohen, R.H.; Crotinger, J.A.; Baldwin, D.E.

    1996-01-01

    The scientific base of magnetic fusion research comprises three capabilities: experimental research, theoretical understanding and computational modeling, with modeling providing the necessary link between the other two. The US now faces a budget climate that will preclude the construction of major new MFE facilities and limit MFE experimental operations. The situation is rather analogous to the one experienced by the DOE Defense Programs (DP), in which continued viability of the nuclear stockpile must be ensured despite the prohibition of underground experimental tests. DP is meeting this challenge, in part, by launching the Accelerated Strategic Computing Initiative (ASCI) to bring advanced algorithms and new hardware to bear on the problems of science-based stockpile stewardship (SBSS). ASCI has as its goal the establishment of a "virtual testing" capability, and it is expected to drive scientific software and hardware development through the next decade. The authors argue that a similar effort is warranted for the MFE program, that is, an initiative aimed at developing a comprehensive simulation capability for MFE, with the goal of enabling "virtual experiments." It would play a role for MFE analogous to that played by present-day and future (ASCI) codes for nuclear weapons design and by LASNEX for ICF, and provide a powerful augmentation to constrained experimental programs. Developing a comprehensive simulation capability could provide an organizing theme for a restructured science-based MFE program. The code would become a central vehicle for integrating the accumulating science base. In the context the authors propose, the relationship would ultimately be reversed: computer simulation would become a primary vehicle for exploration, with experiments providing the necessary confirmatory evidence (or guidance for code improvements).

  20. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-IV. User's manual

    International Nuclear Information System (INIS)

    2001-01-01

    As a continuation of its efforts to provide methodologies and tools to Member States for carrying out comparative assessments and analysing priority environmental issues related to the development of the electric power sector, the IAEA has completed a new version of the Wien Automatic System Planning (WASP) package, WASP-IV, for carrying out power generation expansion planning taking into consideration fuel availability and environmental constraints. This manual constitutes a part of this work and aims to provide users with a guide to the effective use of the new version of the model, WASP-IV. WASP was originally developed in 1972 by the Tennessee Valley Authority and the Oak Ridge National Laboratory in the USA to meet the IAEA's need to analyse the economic competitiveness of nuclear power in comparison with other generation expansion alternatives for supplying the future electricity requirements of a country or region. Previous versions of the model were used by Member States in many national and regional studies to analyse electric power system expansion planning and the role of nuclear energy in particular. Experience gained from its application allowed development of WASP into a very comprehensive planning tool for electric power system expansion analysis. New, improved versions were developed, which took into consideration the needs expressed by the users of the programme in order to address important emerging issues faced by electric system planners. In 1979, WASP-III was released and soon after became an indispensable tool in many Member States for generation expansion planning. The WASP-III version was continually upgraded, and development of version WASP-III Plus commenced in 1992. By 1995, WASP-III Plus was completed; it followed closely the methodology of WASP-III but incorporated new features. In order to meet the needs of electricity planners and following the recommendations of the Helsinki symposium, development of a new version of WASP was

  1. PC-BEIS: a personal computer version of the biogenic emissions inventory system

    International Nuclear Information System (INIS)

    Pierce, T.E.; Waldruff, P.S.

    1991-01-01

    The US Environmental Protection Agency's Biogenic Emissions Inventory System (BEIS) has been adapted for use on IBM-compatible personal computers (PCs). PC-BEIS estimates hourly emissions of isoprene, α-pinene, other monoterpenes, and unidentified hydrocarbons for any county in the contiguous United States. To run the program, users must provide hourly data on ambient temperature, relative humidity, wind speed, cloud cover, and a code that identifies the particular county. This paper provides an overview of the method used to calculate biogenic emissions, shows an example application, and gives information on how to obtain a copy of the program
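The temperature and light adjustment underlying such biogenic emission estimates can be sketched with the widely published Guenther-style correction factors (the standard 1993 constants are shown, but the actual PC-BEIS implementation may differ in detail):

```python
import math

# Guenther et al. (1993)-style isoprene emission corrections. Constants are the
# commonly published standard values; this is a sketch, not the BEIS code itself.
R = 8.314                       # J mol-1 K-1
ALPHA, CL1 = 0.0027, 1.066      # light-response constants
CT1, CT2 = 95000.0, 230000.0    # J mol-1
TS, TM = 303.0, 314.0           # standard and optimum temperatures, K

def isoprene_correction(T, Q):
    """Combined light (Q, PAR in umol m-2 s-1) and temperature (T in K)
    correction applied to a standard-condition emission factor."""
    c_light = ALPHA * CL1 * Q / math.sqrt(1.0 + ALPHA**2 * Q**2)
    c_temp = math.exp(CT1 * (T - TS) / (R * TS * T)) / \
             (1.0 + math.exp(CT2 * (T - TM) / (R * TS * T)))
    return c_light * c_temp
```

At the standard conditions (303 K, 1000 umol m-2 s-1) the combined correction is close to 1, so the hourly emission reduces to the standard emission factor; cooler or darker conditions scale it down.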

  2. Computer-aided structure elucidation Pt. 3. Extended version of assigner system

    Energy Technology Data Exchange (ETDEWEB)

    Szalontai, G; Csapo, Z; Recsey, Zs [Nehezvegyipari Kutato Intezet, Veszprem (Hungary)

    1982-01-01

    Computer-aided interpretation of ¹³C-NMR, ¹H-NMR and IR spectra of organic molecules (M.W. ≤ 500) has been performed using an artificial intelligence approach. A procedure for joint ¹³C-NMR/¹H-NMR/IR spectrum interpretation is outlined. Possible ways of finding acceptable larger fragments on the basis of ¹H-NMR and ¹³C-NMR data, and of ¹³C-NMR data alone, are also described. Detailed examples are given to demonstrate the capability of the system.

  3. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 1: Chapters 1-11

    International Nuclear Information System (INIS)

    1995-01-01

    determination of the optimal expansion of combined thermal and hydro power systems, taking into account the optimal operation of the hydro reservoirs throughout the year. Microcomputer (PC) versions of WASP-III and MAED have also been developed as stand-alone programs and as part of an integrated package for energy and electricity planning called ENPEP (Energy and Power Evaluation Program). A PC version of the VALORAGUA model was also completed in 1992. With all these developments, the catalogue of planning methodologies offered by the IAEA to its Member States has been upgraded to facilitate the work of electricity planners; WASP in particular is currently accepted as a powerful tool for electric system expansion planning. Nevertheless, experienced users of the program have indicated the need for further enhancements to the WASP model in order to cope with the problems constantly faced by planners owing to the increasing complexity of this type of analysis. With several Member States, the IAEA has completed a new version of the WASP program, called WASP-III Plus since it follows quite closely the methodology of the WASP-III model. The major enhancements in WASP-III Plus with respect to the WASP-III version are: an increase in the number of thermal fuel types (from 5 to 10); verification of which configurations generated by CONGEN have already been simulated in previous iterations with MERSIM; direct calculation of the combined loading order of FIXSYS and VARSYS plants; consideration, in the simulation of system operation, of physical constraints imposed on some fuel types (i.e., fuel availability for electricity generation); extended output of the resimulation of the optimal solution; generation of a file that can be used for graphical representation of the results of the resimulation of the optimal solution and of the cash flows of the investment costs; and calculation of cash flows that allows inclusion of the capital costs of plants firmly committed or under construction

  4. User's guide for the Computer Aided Tracing System "CATS" (version I)

    OpenAIRE

    谷 啓二; 納 俊樹; 木原 和久

    1984-01-01

    As an application of computer graphics, we developed the software program Computer Aided Tracing System "CATS", which inputs various figures and tables into a computer using a tablet digitizer, edits and displays the figures on a graphic display (GD), and outputs clean copies to a colour hard-copy unit or an NLP (laser printer). All data entry in CATS uses an interactive Japanese-language format, so users can edit figures on the GD without having to deal with the complex GD-specific software. This report is a user's guide for CATS.

  5. China National Lung Cancer Screening Guideline with Low-dose Computed Tomography (2018 version)

    Directory of Open Access Journals (Sweden)

    Qinghua ZHOU

    2018-02-01

    Background and objective Lung cancer is the leading cause of cancer-related death in China. The results from a randomized controlled trial using annual low-dose computed tomography (LDCT) in specific high-risk groups demonstrated a 20% reduction in lung cancer mortality. The aim of this study is to establish the China National lung cancer screening guideline for clinical practice. Methods The China lung cancer early detection and treatment expert group (CLCEDTEG) established the China National Lung Cancer Screening Guideline with multidisciplinary representation including 4 thoracic surgeons, 4 thoracic radiologists, 2 medical oncologists, 2 pulmonologists, 2 pathologists, and 2 epidemiologists. Members have engaged in interdisciplinary collaborations regarding lung cancer screening and clinical care of patients at risk for lung cancer. The expert group reviewed the literature, including screening trials in the United States, Europe, and China, and discussed local best clinical practices in China. A consensus-based guideline, the China National Lung Cancer Screening Guideline (CNLCSG), was recommended by CLCEDTEG, appointed by the National Health and Family Planning Commission, based on results of the National Lung Screening Trial, a systematic review of evidence related to LDCT screening, and the protocol of a lung cancer screening program conducted in rural China. Results Annual lung cancer screening with LDCT is recommended for high-risk individuals aged 50-74 years who have at least a 20 pack-year smoking history and who currently smoke or have quit within the past five years. Individualized decision making should be conducted before LDCT screening. LDCT screening also represents an opportunity to educate patients as to the health risks of smoking; thus, education should be integrated into the screening process in order to assist smoking cessation. Conclusion A lung cancer screening guideline is recommended for the high-risk population in China.
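The high-risk definition above is mechanical enough to express directly. A sketch (function name hypothetical; this encodes only the criteria quoted in the abstract and is not clinical advice):

```python
def ldct_screening_eligible(age, pack_years, currently_smokes, years_since_quit=None):
    """CNLCSG high-risk criteria as stated in the abstract: age 50-74,
    >= 20 pack-year history, and current smoker or quit within five years."""
    if not (50 <= age <= 74):
        return False
    if pack_years < 20:
        return False
    return currently_smokes or (years_since_quit is not None and years_since_quit <= 5)
```

For example, a 60-year-old with a 25 pack-year history who quit three years ago meets the criteria; the same person having quit ten years ago does not.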

  6. [China National Lung Cancer Screening Guideline with Low-dose Computed Tomography (2018 version)].

    Science.gov (United States)

    Zhou, Qinghua; Fan, Yaguang; Wang, Ying; Qiao, Youlin; Wang, Guiqi; Huang, Yunchao; Wang, Xinyun; Wu, Ning; Zhang, Guozheng; Zheng, Xiangpeng; Bu, Hong; Li, Yin; Wei, Sen; Chen, Liang'an; Hu, Chengping; Shi, Yuankai; Sun, Yan

    2018-02-20

    Lung cancer is the leading cause of cancer-related death in China. The results from a randomized controlled trial using annual low-dose computed tomography (LDCT) in specific high-risk groups demonstrated a 20% reduction in lung cancer mortality. The aim of this study is to establish the China National lung cancer screening guideline for clinical practice. The China lung cancer early detection and treatment expert group (CLCEDTEG) established the China National Lung Cancer Screening Guideline with multidisciplinary representation including 4 thoracic surgeons, 4 thoracic radiologists, 2 medical oncologists, 2 pulmonologists, 2 pathologists, and 2 epidemiologists. Members have engaged in interdisciplinary collaborations regarding lung cancer screening and clinical care of patients at risk for lung cancer. The expert group reviewed the literature, including screening trials in the United States, Europe, and China, and discussed local best clinical practices in China. A consensus-based guideline, the China National Lung Cancer Screening Guideline (CNLCSG), was recommended by CLCEDTEG, appointed by the National Health and Family Planning Commission, based on results of the National Lung Screening Trial, a systematic review of evidence related to LDCT screening, and the protocol of a lung cancer screening program conducted in rural China. Annual lung cancer screening with LDCT is recommended for high-risk individuals aged 50-74 years who have at least a 20 pack-year smoking history and who currently smoke or have quit within the past five years. Individualized decision making should be conducted before LDCT screening. LDCT screening also represents an opportunity to educate patients as to the health risks of smoking; thus, education should be integrated into the screening process in order to assist smoking cessation. A lung cancer screening guideline is recommended for the high-risk population in China. Additional research, including LDCT combined with biomarkers, is

  7. Wien Automatic System Package (WASP). A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 2: Appendices

    International Nuclear Information System (INIS)

    1995-01-01

    With several Member States, the IAEA has completed a new version of the WASP program, called WASP-III Plus since it follows quite closely the methodology of the WASP-III model. The major enhancements in WASP-III Plus with respect to the WASP-III version are: an increase in the number of thermal fuel types (from 5 to 10); verification of which configurations generated by CONGEN have already been simulated in previous iterations with MERSIM; direct calculation of the combined loading order of FIXSYS and VARSYS plants; consideration, in the simulation of system operation, of physical constraints imposed on some fuel types (i.e., fuel availability for electricity generation); extended output of the resimulation of the optimal solution; generation of a file that can be used for graphical representation of the results of the resimulation of the optimal solution and of the cash flows of the investment costs; calculation of cash flows that allows inclusion of the capital costs of plants firmly committed or under construction (FIXSYS plants); and user control of the distribution of capital cost expenditures during the construction period (if required to be different from the general 'S' curve distribution used as default). This second volume of the document supporting the use of the WASP-III Plus computer code consists of five appendices giving additional information about the WASP-III Plus program. Appendix A is mainly addressed to the WASP-III Plus system analyst and supplies information that could help in implementing the program on the user's computer facilities. This appendix also covers some aspects of WASP-III Plus that could not be treated in detail in Chapters 1 to 11. Appendix B identifies all error and warning messages that may appear in the WASP printouts and advises the user how to overcome the problem. Appendix C presents the flow charts of the programs along with a brief description of the objectives and structure of each module. Appendix D describes the

  8. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 1: Chapters 1-11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    determination of the optimal expansion of combined thermal and hydro power systems, taking into account the optimal operation of the hydro reservoirs throughout the year. Microcomputer (PC) versions of WASP-III and MAED have also been developed as stand-alone programs and as part of an integrated package for energy and electricity planning called ENPEP (Energy and Power Evaluation Program). A PC version of the VALORAGUA model was also completed in 1992. With all these developments, the catalogue of planning methodologies offered by the IAEA to its Member States has been upgraded to facilitate the work of electricity planners; WASP in particular is currently accepted as a powerful tool for electric system expansion planning. Nevertheless, experienced users of the program have indicated the need for further enhancements to the WASP model in order to cope with the problems constantly faced by planners owing to the increasing complexity of this type of analysis. With several Member States, the IAEA has completed a new version of the WASP program, called WASP-III Plus since it follows quite closely the methodology of the WASP-III model. The major enhancements in WASP-III Plus with respect to the WASP-III version are: an increase in the number of thermal fuel types (from 5 to 10); verification of which configurations generated by CONGEN have already been simulated in previous iterations with MERSIM; direct calculation of the combined loading order of FIXSYS and VARSYS plants; consideration, in the simulation of system operation, of physical constraints imposed on some fuel types (i.e., fuel availability for electricity generation); extended output of the resimulation of the optimal solution; generation of a file that can be used for graphical representation of the results of the resimulation of the optimal solution and of the cash flows of the investment costs; and calculation of cash flows that allows inclusion of the capital costs of plants firmly committed or under construction

  9. Implementation of an electronic medical record system in previously computer-naïve primary care centres: a pilot study from Cyprus.

    Science.gov (United States)

    Samoutis, George; Soteriades, Elpidoforos S; Kounalakis, Dimitris K; Zachariadou, Theodora; Philalithis, Anastasios; Lionis, Christos

    2007-01-01

    The computer-based electronic medical record (EMR) is an essential new technology in health care, contributing to high-quality patient care and efficient patient management. The majority of southern European countries, however, have not yet implemented universal EMR systems, and many efforts are still ongoing. We describe the development of an EMR system and its pilot implementation and evaluation in two previously computer-naïve public primary care centres in Cyprus. One urban and one rural primary care centre, along with their personnel (physicians and nurses), were selected to participate. Both qualitative and quantitative evaluation tools were used during the implementation phase. Qualitative data analysis was based on the framework approach, whereas quantitative assessment was based on a nine-item questionnaire and EMR usage parameters. Two public primary care centres participated, and a total of ten health professionals served as EMR system evaluators. Physicians and nurses rated the EMR relatively highly, while patients were the most enthusiastic supporters of the new information system. Major implementation impediments were the physicians' perceptions that EMR usage negatively affected their workflow, physicians' legal concerns, lack of incentives, system breakdowns, software design problems, transition difficulties and lack of familiarity with electronic equipment. The importance of combining qualitative and quantitative evaluation tools is highlighted. More efforts are needed for the universal adoption and routine use of EMR in the primary care system of Cyprus, as several barriers to adoption exist; however, none is insurmountable. Computerised systems could improve efficiency and quality of care in Cyprus, benefiting the entire population.

  10. User manual for version 4.3 of the Tripoli-4 Monte-Carlo method particle transport computer code

    International Nuclear Information System (INIS)

    Both, J.P.; Mazzolo, A.; Peneliau, Y.; Petit, O.; Roesslinger, B.

    2003-01-01

    This manual relates to version 4.3 of the TRIPOLI-4 code. TRIPOLI-4 is a computer code simulating the transport of neutrons, photons, electrons and positrons. It can be used for radiation shielding calculations (long-distance propagation with flux attenuation in non-multiplying media) and neutronics calculations (fissile media, on a criticality or subcriticality basis). This makes it possible to calculate k_eff (for criticality), fluxes, currents, reaction rates and multigroup cross-sections. TRIPOLI-4 is a three-dimensional code that uses the Monte Carlo method. It allows a point-wise description of cross-sections in energy as well as multigroup homogenized cross-sections, and features two modes of geometrical representation: surface-based and combinatorial. The code uses cross-section libraries in ENDF/B format (such as JEF2-2, ENDF/B-VI and JENDL) for the point-wise description, and cross-sections in APOTRIM format (from the APOLLO2 code) or a format specific to TRIPOLI-4 for the multigroup description. (authors)
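The Monte Carlo treatment of flux attenuation in a non-multiplying medium can be illustrated with a toy shielding estimate, sampling free paths from an exponential distribution (a sketch of the sampling idea only, not TRIPOLI-4's physics or variance-reduction machinery):

```python
import math
import random

def slab_transmission(sigma_t, thickness, n_particles=200_000, seed=1):
    """Monte Carlo estimate of uncollided transmission through a slab:
    sample each particle's distance to first collision from an exponential
    with mean free path 1/sigma_t, and count those that cross the slab."""
    rng = random.Random(seed)
    transmitted = sum(
        1
        for _ in range(n_particles)
        if -math.log(rng.random()) / sigma_t > thickness
    )
    return transmitted / n_particles
```

For a slab 4 mean-free-path-halves thick (sigma_t = 0.5 cm^-1, 4 cm), the estimate should approach the analytic attenuation exp(-sigma_t * thickness) = exp(-2).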

  11. On the interpretability and computational reliability of frequency-domain Granger causality [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Luca Faes

    2017-09-01

    This Correspondence article is a comment that directly relates to the paper “A study of problems encountered in Granger causality analysis from a neuroscience perspective” (Stokes and Purdon, 2017). We agree that interpretation issues of Granger causality (GC) in neuroscience exist, partially due to the historically unfortunate use of the name “causality”, as described in previous literature. On the other hand, we think that Stokes and Purdon use a formulation of GC which is outdated (albeit still used) and do not fully account for the potential of the different frequency-domain versions of GC; in doing so, their paper dismisses GC measures based on a suboptimal use of them. Furthermore, since data from simulated systems are used, the pitfalls that are found with the used formulation are intended to be general, and not limited to neuroscience. It would be a pity if this paper, even if written in good faith, became a wildcard against all possible applications of GC, regardless of the large body of recently published work which aims to address faults in methodology and interpretation. In order to provide a balanced view, we replicate the simulations of Stokes and Purdon using an updated GC implementation and exploiting the combination of spectral and causal information, showing that in this way the pitfalls are mitigated or directly solved.
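The quantity under dispute can be sketched in its basic time-domain form: the log ratio of residual variances between a restricted autoregressive model and a full one that adds the other series' lags (a minimal illustration via least squares; frequency-domain GC decomposes the same VAR spectrally):

```python
import numpy as np

def granger_causality(x, y, p=2):
    """Time-domain Granger causality x -> y with AR order p:
    log(var of restricted residuals / var of full-model residuals)."""
    n = len(y)
    target = y[p:]
    lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])

    def resid_var(design):
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        return (target - design @ beta).var()

    return np.log(resid_var(lags_y) / resid_var(np.hstack([lags_y, lags_x])))

# Simulated system: y is driven by lagged x, but not vice versa.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.zeros(5000)
for t in range(1, 5000):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
```

With this ground truth, GC from x to y should be large while GC from y to x should be near zero, which is the kind of asymmetry the disputed estimators are meant to recover.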

  12. Three-dimensional biplanar radiography as a new means of accessing femoral version: a comparative study of EOS three-dimensional radiography versus computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Pomerantz, M.L. [University of California San Diego School of Medicine, Orthopaedic Surgery Department, San Diego, CA (United States); Glaser, Diana [Aurora Spine, Carlsbad, CA (United States); Doan, Josh [Orthopedic Biomechanics Research Center, San Diego, CA (United States); Kumar, Sita [University of California, San Diego, CA (United States); Edmonds, Eric W. [University of California San Diego School of Medicine, Orthopaedic Surgery Department, San Diego, CA (United States); Rady Children' s Hospital San Diego, Division of Orthopedic Surgery, San Diego, CA (United States)

    2014-10-17

    To validate femoral version measurements made from biplanar radiography (BR) three-dimensional (3D) reconstructions (EOS imaging, France), made in differing rotational positions, against the gold standard of computed tomography (CT). Two cadaveric femurs were scanned with CT and BR in five different femoral versions, creating ten total phantoms. The native version was modified by rotating through a mid-diaphyseal hinge twice into increasing anteversion and twice into increasing retroversion. For each biplanar scan, the phantom itself was rotated -10°, -5°, 0°, +5° and +10°. Three-dimensional CT reconstructions were designated the true value for femoral version. Two independent observers measured the femoral version on CT axial slices and BR 3D reconstructions twice. The mean error (upper bound of the 95% confidence interval), inter- and intraobserver reliability, and the error compared to the true version were determined for both imaging techniques. Interobserver intraclass correlation for CT axial images ranged from 0.981 to 0.991, and the intraobserver intraclass correlation ranged from 0.994 to 0.996. For the BR 3D reconstructions these values ranged from 0.983 to 0.998 and 0.982 to 0.998, respectively. For the CT measurements the upper bound of error from the true value was 5.4°-7.5°, whereas for BR 3D reconstructions it was 4.0°-10.1°. There was no statistical difference in the mean error from the true values for any of the measurements done with axial CT or BR 3D reconstructions. BR 3D reconstructions accurately and reliably provide clinical data on femoral version compared to CT, even with rotation of the patient of up to 10° from neutral. (orig.)
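The interobserver reliability reported above is an intraclass correlation. ICC(2,1) in the Shrout-Fleiss formulation can be computed directly from two-way ANOVA mean squares (a sketch of the standard formula, not the study's statistics software):

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random-effects, single-measure ICC(2,1) for an
    (n subjects x k raters) matrix, via the usual ANOVA mean squares."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)   # between subjects
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # between raters
    sse = ((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Perfect agreement yields an ICC of 1.0, while a constant offset between raters (a systematic bias) lowers ICC(2,1) even though the ratings remain perfectly correlated.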

  13. ASYS2: a new version of computer algebra package ASYS for analysis and simplification of polynomial systems

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Khutornoj, N.V.

    1993-01-01

    In this paper a new version of the package ASYS for the analysis of nonlinear algebraic equations, based on the Groebner basis technique, is described. Compared with the first version of the package, ASYS1, the current one has a number of new facilities that make it considerably more efficient. Some examples and results of comparisons between ASYS2, ASYS1 and two other REDUCE packages, GROEBNER and CALI, included in REDUCE 3.5, are given. 16 refs., 4 tabs
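The underlying Groebner-basis computation is available today in mainstream computer algebra systems. As an illustration using SymPy rather than REDUCE, a lexicographic basis triangularizes a small polynomial system, which is the core operation packages like ASYS build on:

```python
from sympy import symbols, groebner

x, y = symbols('x y')

# Intersect the unit circle with the line x = y. With lex order (x > y),
# the Groebner basis eliminates x, leaving a univariate polynomial in y.
gb = groebner([x**2 + y**2 - 1, x - y], x, y, order='lex')
```

Reducing any member of the original system modulo the basis gives remainder zero, and the last basis element is free of x, so the system can be solved back-substitution style.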

  14. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Vigil,Benny Manuel [Los Alamos National Laboratory; Ballance, Robert [SNL; Haskell, Karen [SNL

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  15. Computer code SICHTA-85/MOD 1 for thermohydraulic and mechanical modelling of WWER fuel channel behaviour during LOCA and comparison with original version of the SICHTA code

    International Nuclear Information System (INIS)

    Bujan, A.; Adamik, V.; Misak, J.

    1986-01-01

    A brief description is presented of the extension of the SICHTA-83 computer code, which analyses the thermal history of the fuel channel during large LOCAs, to model the mechanical behaviour of fuel element cladding. The new version of the code treats heat transfer in the fuel-cladding gap in more detail, because it also accounts for the mechanical (plastic) deformation of the cladding and the fuel-cladding interaction (magnitude of the contact pressure). It further accounts for the change in pressure of the gas filling of the fuel element, considers a mechanical criterion for cladding failure, and evaluates the degree of blockage of the coolant flow cross-section in the fuel channel. A model LOCA computation for the WWER-440 compares the new SICHTA-85/MOD 1 code with the results of the original 1983 version of SICHTA. (author)

  16. Planned development and evaluation protocol of two versions of a web-based computer-tailored nutrition education intervention aimed at adults, including cognitive and environmental feedback.

    Science.gov (United States)

    Springvloet, Linda; Lechner, Lilian; Oenema, Anke

    2014-01-17

    Despite decades of nutrition education, the prevalence of unhealthy dietary patterns is still high and inequalities in intake between high and low socioeconomic groups still exist. Therefore, it is important to innovate and improve existing nutrition education interventions. This paper describes the development, design and evaluation protocol of a web-based computer-tailored nutrition education intervention for adults targeting fruit, vegetable, high-energy snack and fat intake. This intervention innovates existing computer-tailored interventions by not only targeting motivational factors, but also volitional and self-regulation processes and environmental-level factors. The intervention development was guided by the Intervention Mapping protocol, ensuring a theory-informed and evidence-based intervention. Two versions of the intervention were developed: a basic version targeting knowledge, awareness, attitude, self-efficacy and volitional and self-regulation processes, and a plus version additionally addressing the home environment arrangement and the availability and price of healthy food products in supermarkets. Both versions consist of four modules: one for each dietary behavior, i.e. fruit, vegetables, high-energy snacks and fat. Based on the self-regulation phases, each module is divided into three sessions. In the first session, feedback on dietary behavior is provided to increase awareness, feedback on attitude and self-efficacy is provided and goals and action plans are stated. In the second session goal achievement is evaluated, reasons for failure are explored, coping plans are stated and goals can be adapted. In the third session, participants can again evaluate their behavioral change and tips for maintenance are provided. Both versions will be evaluated in a three-group randomized controlled trial with measurements at baseline, 1-month, 4-months and 9-months post-intervention, using online questionnaires. 
Both versions will be compared with a generic

  17. Global Historical Climatology Network (GHCN), Version 1 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  18. QRev—Software for computation and quality assurance of acoustic Doppler current profiler moving-boat streamflow measurements—Technical manual for version 2.8

    Science.gov (United States)

    Mueller, David S.

    2016-06-21

    The software program QRev applies common and consistent computational algorithms, combined with automated filtering and quality assessment of the data, to improve the quality and efficiency of streamflow measurements, and helps ensure that U.S. Geological Survey streamflow measurements are consistent, accurate, and independent of the manufacturer of the instrument used to make the measurement. Software from different manufacturers uses different algorithms for various aspects of the data processing and discharge computation. The algorithms used by QRev to filter data, interpolate data, and compute discharge are documented and compared with the algorithms used in the manufacturers’ software. QRev applies consistent algorithms and creates a data structure that is independent of the data source. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field-notes software. This report is the technical manual for version 2.8 of QRev.

  19. Comparing a Video and Text Version of a Web-Based Computer-Tailored Intervention for Obesity Prevention: A Randomized Controlled Trial.

    Science.gov (United States)

    Walthouwer, Michel Jean Louis; Oenema, Anke; Lechner, Lilian; de Vries, Hein

    2015-10-19

    Web-based computer-tailored interventions often suffer from small effect sizes and high drop-out rates, particularly among people with a low level of education. Using videos as a delivery format can possibly improve the effects and attractiveness of these interventions. The main aim of this study was to examine the effects of a video and text version of a Web-based computer-tailored obesity prevention intervention on dietary intake, physical activity, and body mass index (BMI) among Dutch adults. A second study aim was to examine differences in appreciation between the video and text version. The final study aim was to examine possible differences in intervention effects and appreciation per educational level. A three-armed randomized controlled trial was conducted with a baseline and 6-month follow-up measurement. The intervention consisted of six sessions, lasting about 15 minutes each. In the video version, the core tailored information was provided by means of videos. In the text version, the same tailored information was provided in text format. Outcome variables were self-reported and included BMI, physical activity, energy intake, and appreciation of the intervention. Multiple imputation was used to replace missing values. The effect analyses were carried out with multiple linear regression analyses and adjusted for confounders. The process evaluation data were analyzed with independent-samples t tests. The baseline questionnaire was completed by 1419 participants and the 6-month follow-up measurement by 1015 participants (71.53%). No significant interaction effects of educational level were found on any of the outcome variables. Compared to the control condition, the video version resulted in a lower BMI (B=-0.25, P=.049) and a lower average daily energy intake from energy-dense food products (B=-175.58, P…). The video version of the Web-based computer-tailored obesity prevention intervention was the most effective intervention and the most appreciated.
Future research needs to examine if the

  20. A Computer-Interpretable Version of the AACE, AME, ETA Medical Guidelines for Clinical Practice for the Diagnosis and Management of Thyroid Nodules

    DEFF Research Database (Denmark)

    Peleg, Mor; Fox, John; Patkar, Vivek

    2014-01-01

    … GuideLine Interchange Format, version 3, known as GLIF3, which emphasizes the organization of a care algorithm into a flowchart. The flowchart specified the sequence of tasks required to evaluate a patient with a thyroid nodule. PROforma, a second guideline-modeling language, was then employed to work with data that are not necessarily obtained in a rigid flowchart sequence. Tallis, a user-friendly web-based "enactment tool", was then used as the "execution engine" (computer program). This tool records and displays tasks that are done and prompts users to perform the next indicated steps. The development …

  1. QRev—Software for computation and quality assurance of acoustic Doppler current profiler moving-boat streamflow measurements—User’s manual for version 2.8

    Science.gov (United States)

    Mueller, David S.

    2016-05-12

    The software program QRev computes the discharge from moving-boat acoustic Doppler current profiler measurements using data collected with any of the Teledyne RD Instruments or SonTek bottom-tracking acoustic Doppler current profilers. The computation of discharge is independent of the manufacturer of the acoustic Doppler current profiler because QRev applies consistent algorithms independent of the data source. In addition, QRev automates filtering and quality checking of the collected data and provides feedback to the user on potential quality issues with the measurement. Various statistics and characteristics of the measurement, in addition to a simple uncertainty assessment, are provided to assist the user in properly rating the measurement. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field-notes software. The user interacts with QRev through a tablet-friendly graphical user interface. This report is the user's manual for version 2.8 of QRev.
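
    The core of a moving-boat discharge computation of this kind can be sketched in a few lines: each depth cell's contribution is the vertical component of the cross product of water and boat velocity, times cell height and ensemble duration. This is an illustrative toy under assumed uniform cells, not QRev's actual code, and it omits the extrapolation of the unmeasured top, bottom, and edge zones.

```python
def ensemble_discharge(u_water, v_water, u_boat, v_boat, cell_heights, dt):
    """Measured-portion discharge contributed by one ensemble (m^3).

    u_water, v_water : east/north water velocity per depth cell (m/s)
    u_boat, v_boat   : east/north boat velocity (m/s)
    cell_heights     : height of each depth cell (m)
    dt               : ensemble duration (s)
    """
    q = 0.0
    for uw, vw, h in zip(u_water, v_water, cell_heights):
        cross_z = uw * v_boat - vw * u_boat   # z-component of v_water x v_boat
        q += cross_z * h * dt
    return q

def total_discharge(ensembles):
    """Sum the measured-portion discharge over all ensembles of a transect."""
    return sum(ensemble_discharge(*e) for e in ensembles)

# Boat crossing west at 1 m/s through water flowing north at 0.5 m/s,
# two 0.5 m depth cells, ten 1-second ensembles:
transect = [([0.0, 0.0], [0.5, 0.5], -1.0, 0.0, [0.5, 0.5], 1.0)] * 10
q = total_discharge(transect)   # 0.5 m/s x 1 m depth x 1 m/s x 10 s = 5.0 m^3
```

    The sign of each ensemble's contribution encodes the traverse direction, which is why the example uses a westward boat velocity to obtain a positive discharge.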

  2. GeoT User’s Guide, A Computer Program for Multicomponent Geothermometry and Geochemical Speciation, Version 2.1

    Energy Technology Data Exchange (ETDEWEB)

    Spycher, Nicolas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Peiffer, Loic [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Finsterle, Stefan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sonnenthal, Eric [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-06-06

    GeoT implements the multicomponent geothermometry method developed by Reed and Spycher (1984, Geochim. Cosmochim. Acta 46, 513–528) into a stand-alone computer program, to ease the application of this method and to improve the prediction of geothermal reservoir temperatures using full and integrated chemical analyses of geothermal fluids. Reservoir temperatures are estimated from statistical analyses of mineral saturation indices computed as a function of temperature. The reconstruction of the deep geothermal fluid compositions and the geothermometry computations are all implemented in the same computer program, allowing unknown or poorly constrained input parameters to be estimated by numerical optimization using existing parameter estimation software, such as iTOUGH2, PEST, or UCODE. This integrated geothermometry approach presents advantages over classical geothermometers for fluids that have not fully equilibrated with reservoir minerals and/or that have been subject to processes such as dilution and gas loss.
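
    The central idea, choosing the temperature at which the saturation indices of several minerals jointly approach zero, can be sketched as follows. This is a toy illustration of the principle, not GeoT itself; the log K and log IAP functions below are invented, linearized stand-ins for real thermodynamic data.

```python
import math

def si_spread(temp_c, minerals):
    """RMS of the mineral saturation indices SI = log IAP - log K at temp_c."""
    vals = [m["log_iap"](temp_c) - m["log_k"](temp_c) for m in minerals]
    return math.sqrt(sum(v * v for v in vals) / len(vals))

def estimate_reservoir_temp(minerals, t_grid):
    """Temperature at which the saturation indices cluster closest to zero."""
    return min(t_grid, key=lambda t: si_spread(t, minerals))

# Invented, linearized log K(T) curves chosen so that all three minerals
# are exactly saturated (SI = 0) at 150 degrees C:
minerals = [
    {"log_iap": lambda t: -8.5, "log_k": lambda t: -8.5 + 0.010 * (t - 150.0)},
    {"log_iap": lambda t: 3.2,  "log_k": lambda t: 3.2 - 0.020 * (t - 150.0)},
    {"log_iap": lambda t: -1.0, "log_k": lambda t: -1.0 + 0.015 * (t - 150.0)},
]
t_est = estimate_reservoir_temp(minerals, range(50, 251, 5))   # -> 150
```

    A real application replaces the grid search with the numerical optimization the abstract mentions, and the synthetic curves with speciation results for the reconstructed deep fluid.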

  3. The sagittal stem alignment and the stem version clearly influence the impingement-free range of motion in total hip arthroplasty: a computer model-based analysis.

    Science.gov (United States)

    Müller, Michael; Duda, Georg; Perka, Carsten; Tohtz, Stephan

    2016-03-01

    The component alignment in total hip arthroplasty influences the impingement-free range of motion (ROM). While substantiated data are available for cup positioning, little is known about stem alignment. In particular, stem rotation and the sagittal alignment influence the position of the cone in relation to the edge of the socket and thus the impingement-free functioning. Hence, the question arises: what influence do these parameters have on the impingement-free ROM? With the help of a computer model, the influence of the sagittal stem alignment and rotation on the impingement-free ROM was investigated. The computer model was based on the CT dataset of a patient with a non-cemented THA. In the model the stem version was set at 10°/0°/-10° and the sagittal alignment at 5°/0°/-5°, which resulted in nine alternative stem positions. For each position, the maximum impingement-free ROM was investigated. Both stem version and sagittal stem alignment have a relevant influence on the impingement-free ROM. In particular, flexion and extension as well as internal and external rotation capability present evident differences. Across the intervals of 10° in sagittal stem alignment and 20° in stem version, differences of about 80° in flexion and 50° in extension capability were found. Likewise, differences of up to 72° in internal and up to 36° in external rotation were observed. The sagittal stem alignment and the stem torsion have a relevant influence on the impingement-free ROM. To clarify the causes of an impingement or accompanying problems, both parameters should be examined and, if possible, a combined assessment of these factors should be made.

  4. Program EPICSHOW. A computer code to allow interactive viewing of the EPIC data libraries (Version 98-1). Summary documentation

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; McLaughlin, P.K.

    1999-01-01

    EPICSHOW (Electron Photon Interactive Code - Show Data) is an interactive graphics code that allows users to view and interact with neutron, photon, electron and light charged particle data. Besides on-screen graphics, the code provides hard copy in the form of tabulated listings and PostScript output files. The code has been implemented on UNIX, IBM-PC, Power MAC, and even laptop computers. It should be relatively easy to use on almost any computer. All of the data included in this system are based on the Lawrence Livermore National Laboratory databases, and the neutron and photon data are used in the TART97 Monte Carlo transport code system. (author)

  5. Implantation and use of a version of the GAMALTA computer code in the 3.500 M Lecroy system

    International Nuclear Information System (INIS)

    Auler, L.T.

    1984-05-01

    The GAMALTA computer code was implemented in the 3.500 M Le Croy system, creating an optional analysis function that is loaded into RAM from a diskette. The way to construct functions that become part of the system menu is explained, and a procedure for using the GAMALTA code is given. (M.C.K.) [pt

  6. Physics study of microbeam radiation therapy with PSI-version of Monte Carlo code GEANT as a new computational tool

    CERN Document Server

    Stepanek, J; Laissue, J A; Lyubimova, N; Di Michiel, F; Slatkin, D N

    2000-01-01

    Microbeam radiation therapy (MRT) is a currently experimental method of radiotherapy which is mediated by an array of parallel microbeams of synchrotron-wiggler-generated X-rays. Suitably selected, nominally supralethal doses of X-rays delivered to parallel microslices of tumor-bearing tissues in rats can be either palliative or curative while causing little or no serious damage to contiguous normal tissues. Although the pathogenesis of MRT-mediated tumor regression is not understood, as in all radiotherapy such understanding will be based ultimately on our understanding of the relationships among the following three factors: (1) microdosimetry, (2) damage to normal tissues, and (3) therapeutic efficacy. Although physical microdosimetry is feasible, published information on MRT microdosimetry to date is computational. This report describes Monte Carlo-based computational MRT microdosimetry using photon and/or electron scattering and photoionization cross-section data in the 1 eV through 100 GeV range distrib...

  7. The implementation of the CDC version of RELAP5/MOD1/019 on an IBM compatible computer system (AMDAHL 470/V8)

    International Nuclear Information System (INIS)

    Kolar, W.; Brewka, W.

    1984-01-01

    RELAP5/MOD1 is an advanced one-dimensional best-estimate system code, which is used for safety analysis studies of nuclear pressurized water reactor systems and related integral and separate effect test facilities. The program predicts the system response for large-break and small-break LOCA and special transients. To a large extent RELAP5/MOD1 is written in Fortran; only a small part of the program is coded in CDC assembler. RELAP5/MOD1 was developed on the CDC CYBER 176 at INEL. The code development team made use of CDC system programs such as the CDC UPDATE facility and incorporated special-purpose software packages in the program. The report describes the problems encountered when implementing the CDC version of RELAP5/MOD1 on an IBM-compatible computer system (AMDAHL 470/V8).

  8. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3). Volume 3. Part 2.

    Science.gov (United States)

    1983-09-01

    … PX, REFH, REFV, RHOX, RHOY, RHOZ, SA, SALP /AMPZIJ/ … CALLING ROUTINE: FLDDRV … NAME: PLAINT (GTD). PURPOSE: To determine if a ray traveling from a given source location … to determine if a source ray reflection from plate MP occurs. If a ray traveling from the source image location in the reflected ray direction passes through …

  9. MELCOR computer code manuals: Primer and user's guides, Version 1.8.3 September 1994. Volume 1

    International Nuclear Information System (INIS)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users' Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package

  10. User's guide for the Augmented Computer Exercise for Inspection Training (ACE-IT), Version 2.0 software

    Energy Technology Data Exchange (ETDEWEB)

    Dobranich, P.R. [Sandia National Labs., Albuquerque, NM (United States)]; Horak, K.E.; Evanko, D.A. [Excel Tactical Staffing, Albuquerque, NM (United States)] [and others]

    1998-04-01

    The on-site inspection provisions in many current and proposed arms control agreements require extensive preparation and training on the part of both the Inspection Teams (inspectors) and Inspected Parties (hosts). Traditional training techniques include lectures, table-top inspections, and practice inspections. The Augmented Computer Exercise for Inspection Training (ACE-IT), an interactive computer training tool, increases the utility of table-top inspections. ACE-IT is used for training both inspectors and hosts to conduct a hypothetical challenge inspection under the Chemical Weapons Convention (CWC). The training covers the entire sequence of events in the challenge inspection regime, from initial notification of an inspection through post-inspection activities. The primary emphasis of the training tool is on conducting the inspection itself, and in particular, implementing the concept of managed access. (Managed access is a technique used to assure the inspectors that the facility is in compliance with the CWC, while at the same time protecting sensitive information unrelated to the CWC.) Information for all of the activities is located in the electronic "Exercise Manual." In addition, interactive menus are used to negotiate access to each room and to alternate information during the simulated inspection. ACE-IT also demonstrates how various inspection provisions impact compliance determination and the protection of sensitive information.

  11. NCDC International Best Track Archive for Climate Stewardship (IBTrACS) Project, Version 2 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Version 2 of the dataset has been superseded by a newer version. Users should not use version 2 except in rare cases (e.g., when reproducing previous studies that...

  12. NCDC International Best Track Archive for Climate Stewardship (IBTrACS) Project, Version 1 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Version 1 of the dataset has been superseded by a newer version. Users should not use version 1 except in rare cases (e.g., when reproducing previous studies that...

  13. Is it time to scrap Scadding and adopt computed tomography for initial evaluation of sarcoidosis? [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Andrew Levy

    2018-05-01

    In this review, we argue for the use of high-resolution computed tomography (HRCT over chest X-ray in the initial evaluation of patients with sarcoidosis. Chest X-ray, which has long been used to classify disease severity and offer prognostication in sarcoidosis, has clear limitations compared with HRCT, including wider interobserver variability, a looser association with lung function, and poorer sensitivity to detect important lung manifestations of sarcoidosis. In addition, HRCT offers a diagnostic advantage, as it better depicts targets for biopsy, such as mediastinal/hilar lymphadenopathy and focal parenchymal disease. Newer data suggest that specific HRCT findings may be associated with important prognostic outcomes, such as increased mortality. As we elaborate in this update, we strongly recommend the use of HRCT in the initial evaluation of the patient with sarcoidosis.

  14. A multi-scale computational model of the effects of TMS on motor cortex [version 3; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Hyeon Seo

    2017-05-01

    The detailed biophysical mechanisms through which transcranial magnetic stimulation (TMS) activates cortical circuits are still not fully understood. Here we present a multi-scale computational model to describe and explain the activation of different pyramidal cell types in motor cortex due to TMS. Our model determines precise electric fields based on an individual head model derived from magnetic resonance imaging and calculates how these electric fields activate morphologically detailed models of different neuron types. We predict neural activation patterns for different coil orientations consistent with experimental findings. Beyond this, our model allows us to calculate activation thresholds for individual neurons and precise initiation sites of individual action potentials on the neurons’ complex morphologies. Specifically, our model predicts that cortical layer 3 pyramidal neurons are generally easier to stimulate than layer 5 pyramidal neurons, thereby explaining the lower stimulation thresholds observed for I-waves compared to D-waves. It also shows differences in the regions of activated cortical layer 5 and layer 3 pyramidal cells depending on coil orientation. Finally, it predicts that under standard stimulation conditions, action potentials are mostly generated at the axon initial segment of cortical pyramidal cells, with a much less important activation site being the part of a layer 5 pyramidal cell axon where it crosses the boundary between grey matter and white matter. In conclusion, our computational model offers a detailed account of the mechanisms through which TMS activates different cortical pyramidal cell types, paving the way for more targeted application of TMS based on individual brain morphology in clinical and basic research settings.

  15. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  16. WSPEEDI (worldwide version of SPEEDI): A computer code system for the prediction of radiological impacts on Japanese due to a nuclear accident in foreign countries

    Energy Technology Data Exchange (ETDEWEB)

    Chino, Masamichi; Yamazawa, Hiromi; Nagai, Haruyasu; Moriuchi, Shigeru [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ishikawa, Hirohiko

    1995-09-01

    A computer code system has been developed for near real-time dose assessment during radiological emergencies. The system WSPEEDI, the worldwide version of SPEEDI (System for Prediction of Environmental Emergency Dose Information) aims at predicting the radiological impact on Japanese due to a nuclear accident in foreign countries. WSPEEDI consists of a mass-consistent wind model WSYNOP for large-scale wind fields and a particle random walk model GEARN for atmospheric dispersion and dry and wet deposition of radioactivity. The models are integrated into a computer code system together with a system control software, worldwide geographic database, meteorological data processor and graphic software. The performance of the models has been evaluated using the Chernobyl case with reliable source terms, well-established meteorological data and a comprehensive monitoring database. Furthermore, the response of the system has been examined by near real-time simulations of the European Tracer Experiment (ETEX), carried out over about 2,000 km area in Europe. (author).
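
    The particle random-walk technique used by dispersion models such as GEARN can be illustrated with a minimal advection-plus-diffusion step. This is a generic sketch under assumed, illustrative parameters, not WSPEEDI code; a real model adds wind fields varying in space and time, vertical mixing, and the dry and wet deposition the abstract describes.

```python
import random

def step(particles, u, v, k_h, dt, rng):
    """Advect and diffuse particles for one time step.

    particles : list of (x, y) positions in meters
    u, v      : mean wind components (m/s)
    k_h       : horizontal eddy diffusivity (m^2/s)
    dt        : time step (s)
    """
    sigma = (2.0 * k_h * dt) ** 0.5          # random-walk step scale
    return [(x + u * dt + rng.gauss(0.0, sigma),
             y + v * dt + rng.gauss(0.0, sigma))
            for x, y in particles]

rng = random.Random(42)
cloud = [(0.0, 0.0)] * 1000                  # puff released at the origin
for _ in range(10):                          # ten 60-second steps
    cloud = step(cloud, u=5.0, v=0.0, k_h=100.0, dt=60.0, rng=rng)

# After 600 s the cloud center has been advected about u*t = 3000 m east,
# while the random-walk term has spread the particles around that center.
mean_x = sum(x for x, _ in cloud) / len(cloud)
```

    Concentration fields are then obtained by counting particles per grid cell, which is essentially how Lagrangian models of this class report dose-relevant air concentration and deposition.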

  17. WSPEEDI (worldwide version of SPEEDI): A computer code system for the prediction of radiological impacts on Japanese due to a nuclear accident in foreign countries

    International Nuclear Information System (INIS)

    Chino, Masamichi; Yamazawa, Hiromi; Nagai, Haruyasu; Moriuchi, Shigeru; Ishikawa, Hirohiko.

    1995-09-01

    A computer code system has been developed for near real-time dose assessment during radiological emergencies. The system WSPEEDI, the worldwide version of SPEEDI (System for Prediction of Environmental Emergency Dose Information) aims at predicting the radiological impact on Japanese due to a nuclear accident in foreign countries. WSPEEDI consists of a mass-consistent wind model WSYNOP for large-scale wind fields and a particle random walk model GEARN for atmospheric dispersion and dry and wet deposition of radioactivity. The models are integrated into a computer code system together with a system control software, worldwide geographic database, meteorological data processor and graphic software. The performance of the models has been evaluated using the Chernobyl case with reliable source terms, well-established meteorological data and a comprehensive monitoring database. Furthermore, the response of the system has been examined by near real-time simulations of the European Tracer Experiment (ETEX), carried out over about 2,000 km area in Europe. (author)

  18. Matching Behavior as a Tradeoff Between Reward Maximization and Demands on Neural Computation [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Jan Kubanek

    2015-10-01

    When faced with a choice, humans and animals commonly distribute their behavior in proportion to the frequency of payoff of each option. Such behavior is referred to as matching and has been captured by the matching law. However, matching is not a general law of economic choice. Matching in its strict sense seems to be specifically observed in tasks whose properties make matching an optimal or a near-optimal strategy. We engaged monkeys in a foraging task in which matching was not the optimal strategy. Over-matching the proportions of the mean offered reward magnitudes would yield more reward than matching, yet, surprisingly, the animals almost exactly matched them. To gain insight into this phenomenon, we modeled the animals' decision-making using a mechanistic model. The model accounted for the animals' macroscopic and microscopic choice behavior. When the model's three parameters were not constrained to mimic the monkeys' behavior, the model over-matched the reward proportions and in doing so, harvested substantially more reward than the monkeys. This optimized model revealed a marked bottleneck in the monkeys' choice function that compares the value of the two options. The model featured a very steep value comparison function relative to that of the monkeys. The steepness of the value comparison function had a profound effect on the earned reward and on the level of matching. We implemented this value comparison function through responses of simulated biological neurons. We found that due to the presence of neural noise, steepening the value comparison requires an exponential increase in the number of value-coding neurons. Matching may be a compromise between harvesting satisfactory reward and the high demands placed by neural noise on optimal neural computation.
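
    The abstract's central quantity, the steepness of the value-comparison function, can be illustrated with a toy sigmoidal chooser (not the paper's mechanistic model): a shallow comparison reproduces matching-like graded choice, while a steep one drives near-exclusive, reward-maximizing choice. The reward magnitudes and steepness values are invented.

```python
import math

def p_choose_first(r1, r2, steepness):
    """Probability of choosing option 1 from a sigmoidal value comparison."""
    return 1.0 / (1.0 + math.exp(-steepness * (r1 - r2)))

r1, r2 = 2.0, 1.0
matching_target = r1 / (r1 + r2)   # strict matching predicts choosing 1 with p = 2/3

# A shallow comparison lands exactly on the matching prediction ...
shallow = p_choose_first(r1, r2, steepness=math.log(2.0))   # = 2/3
# ... while a steep comparison approaches exclusive choice of the richer option.
steep = p_choose_first(r1, r2, steepness=10.0)
```

    In this caricature, moving from `shallow` to `steep` is the "over-matching" direction the optimized model took; the paper's point is that implementing a steep comparison with noisy neurons is exponentially expensive.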

  19. Description of input and examples for PHREEQC version 3: a computer program for speciation, batch-reaction, one-dimensional transport, and inverse geochemical calculations

    Science.gov (United States)

    Parkhurst, David L.; Appelo, C.A.J.

    2013-01-01

    PHREEQC version 3 is a computer program written in the C and C++ programming languages that is designed to perform a wide variety of aqueous geochemical calculations. PHREEQC implements several types of aqueous models: two ion-association aqueous models (the Lawrence Livermore National Laboratory model and WATEQ4F), a Pitzer specific-ion-interaction aqueous model, and the SIT (Specific ion Interaction Theory) aqueous model. Using any of these aqueous models, PHREEQC has capabilities for (1) speciation and saturation-index calculations; (2) batch-reaction and one-dimensional (1D) transport calculations with reversible and irreversible reactions, which include aqueous, mineral, gas, solid-solution, surface-complexation, and ion-exchange equilibria, and specified mole transfers of reactants, kinetically controlled reactions, mixing of solutions, and pressure and temperature changes; and (3) inverse modeling, which finds sets of mineral and gas mole transfers that account for differences in composition between waters within specified compositional uncertainty limits. Many new modeling features were added to PHREEQC version 3 relative to version 2. The Pitzer aqueous model (pitzer.dat database, with keyword PITZER) can be used for high-salinity waters that are beyond the range of application for the Debye-Hückel theory. The Peng-Robinson equation of state has been implemented for calculating the solubility of gases at high pressure. Specific volumes of aqueous species are calculated as a function of the dielectric properties of water and the ionic strength of the solution, which allows calculation of pressure effects on chemical reactions and the density of a solution. The specific conductance and the density of a solution are calculated and printed in the output file. In addition to Runge-Kutta integration, a stiff ordinary differential equation solver (CVODE) has been included for kinetic calculations with multiple rates that occur at widely different time scales.
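
    The saturation-index calculation named in capability (1) can be sketched as follows: SI = log10(IAP/K) for a mineral dissolution reaction. The ion activities below are invented, and the calcite log K is the commonly tabulated value near 25 °C, so treat the numbers as an illustration of the formula rather than as PHREEQC output (a real speciation code first computes the activities from a full water analysis).

```python
import math

def saturation_index(activities, stoichiometry, log_k):
    """SI = log10(IAP) - log10(K) for a mineral dissolution reaction."""
    log_iap = sum(n * math.log10(activities[species])
                  for species, n in stoichiometry.items())
    return log_iap - log_k

# Calcite: CaCO3 = Ca+2 + CO3-2, with log K of about -8.48 near 25 C.
# The ion activities below are invented for illustration.
activities = {"Ca+2": 1.0e-3, "CO3-2": 2.0e-6}
si_calcite = saturation_index(activities, {"Ca+2": 1, "CO3-2": 1}, log_k=-8.48)
# SI < 0: undersaturated (mineral can dissolve); SI > 0: oversaturated.
```

    With these activities the IAP is about 10^-8.7, slightly below K, so the sketch reports mild undersaturation with respect to calcite.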

  20. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Plutonium Metals, Oxides, and Solutions on the High Performance Computing Platform Moonlight

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, Bryan Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gough, Sean T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-12-05

    This report documents a validation of the MCNP6 Version 1.0 computer code on the high performance computing platform Moonlight, for operations at Los Alamos National Laboratory (LANL) that involve plutonium metals, oxides, and solutions. The validation is conducted using the ENDF/B-VII.1 continuous-energy cross section library at room temperature. The results are for use by nuclear criticality safety personnel in performing analysis and evaluation of various facility activities involving plutonium materials.
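
    A validation of this kind summarizes how well the code reproduces critical benchmark experiments, typically as a bias (mean deviation of calculated k_eff from the benchmark value) and its scatter. The numbers below are invented, and the full statistical treatment used in real criticality-safety validations (tolerance limits, trending) is omitted.

```python
# Invented benchmark results: calculated k_eff from the code vs. the
# benchmark-model expected k_eff (all critical experiments, k = 1.0).
calculated = [0.9982, 1.0011, 0.9975, 0.9998, 0.9963]
expected = [1.0000] * 5

deviations = [c - e for c, e in zip(calculated, expected)]
n = len(deviations)
bias = sum(deviations) / n                        # mean deviation
var = sum((d - bias) ** 2 for d in deviations) / (n - 1)
bias_sd = var ** 0.5                              # sample standard deviation

# A negative bias means the code tends to under-predict k_eff; conservative
# practice in safety analysis is to take no credit for a positive bias.
```
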

  1. Relap4/SAS/Mod5 - A version of Relap4/Mod 5 adapted to IPEN/CNEN - SP computer center

    International Nuclear Information System (INIS)

    Sabundjian, G.

    1988-04-01

    In order to improve the safety of nuclear reactor power plants, several computer codes have been developed in the area of thermal-hydraulic accident analysis. Among the publicly available codes, RELAP4, developed by Aerojet Nuclear Company, has been the most popular one. RELAP4 has produced satisfactory results when compared to most of the available experimental data. The purposes of the present work are: optimization of RELAP4 output and messages, by writing this information in temporary records; and display of RELAP4 results in graphical form through the printer. The sample problem consists of a simplified model of a 150 MW(e) PWR whose primary circuit is simulated by 6 volumes, 8 junctions and 1 heat slab. This new version of RELAP4 (named RELAP4/SAS/MOD5) has produced results which show that the above-mentioned purposes have been reached. Obviously, the graphical output by RELAP4/SAS/MOD5 favors the interpretation of results by the user. (author) [pt

  2. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

Full Text Available Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veres needle in the umbilical region is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system or genital organs, cesarean section, or abdominal war injuries were the most common reasons for previous laparotomy. We experienced no complications during these operations or during entry into the abdominal cavity, while in 7 patients we converted to laparotomy following diagnostic laparoscopy. In all patients, the Veres needle and trocar were inserted in the umbilical region, i.e., the closed laparoscopy technique. Adhesions in the umbilical region were found in no patient, and no abdominal organs were injured.

  3. CASKS (Computer Analysis of Storage casKS): A microcomputer based analysis system for storage cask design review. User's manual to Version 1b (including program reference)

    International Nuclear Information System (INIS)

    Chen, T.F.; Gerhard, M.A.; Trummer, D.J.; Johnson, G.L.; Mok, G.C.

    1995-02-01

    CASKS (Computer Analysis of Storage casKS) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent-fuel storage casks. The bulk of the complete program and this user's manual are based upon the SCANS (Shipping Cask ANalysis System) program previously developed at LLNL. A number of enhancements and improvements were added to the original SCANS program to meet requirements unique to storage casks. CASKS is an easy-to-use system that calculates global response of storage casks to impact loads, pressure loads and thermal conditions. This provides reviewers with a tool for an independent check on analyses submitted by licensees. CASKS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests

  4. Versions of the Waste Reduction Model (WARM)

    Science.gov (United States)

    This page provides a brief chronology of changes made to EPA’s Waste Reduction Model (WARM), organized by WARM version number. The page includes brief summaries of changes and updates since the previous version.

  5. Computer Programs for Uncertainty Analysis of Solubility Calculations: Windows Version and Other Updates of the SENVAR and UNCCON. Program Description and Handling Instructions

    International Nuclear Information System (INIS)

    Ekberg, Christian; Oedegaard Jensen, Arvid

    2004-04-01

Uncertainty and sensitivity analysis is becoming more and more important for testing the reliability of computer predictions. Solubility estimations play an important role for, e.g., underground repositories for nuclear waste and other hazardous materials, as well as for dissolution problems in general and industrial chemistry applications. The calculated solubility of a solid phase depends on several input data, e.g. the stability constants for the complexes formed in the solution, the enthalpies of reaction for the formation of these complexes, and the content of other elements in the water used for the dissolution. These input data are determined with more or less accuracy, and thus the results of the calculations are uncertain. To investigate the effects of these uncertainties, several computer programs were developed in the 1990s, e.g. SENVAR, MINVAR and UNCCON. Of these, SENVAR and UNCCON now exist as Windows programs based on a newer speciation code. In this report we explain how the codes work and also give some test cases as handling instructions. The results are naturally similar to the previous ones, but the advantages are easier handling and more stable solubility calculations. With these improvements the programs presented here will be more publicly accessible
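The propagation of input-data uncertainty into a solubility estimate, as these codes do, can be illustrated with a simple Monte Carlo sketch. The solubility model and all numbers below are invented for illustration; they are not taken from SENVAR or UNCCON.

```python
import math
import random

def solubility(log_k, ligand_factor):
    """Toy solubility model: S = 10**log_k scaled by a water-composition factor."""
    return 10 ** log_k * ligand_factor

random.seed(1)
log_k_mean, log_k_sd = -8.0, 0.3          # hypothetical uncertain stability constant (log10)
samples = [solubility(random.gauss(log_k_mean, log_k_sd), 1.0)
           for _ in range(10000)]
mean_s = sum(samples) / len(samples)
sd_s = math.sqrt(sum((s - mean_s) ** 2 for s in samples) / (len(samples) - 1))
print(f"mean solubility = {mean_s:.3e} mol/L, sd = {sd_s:.3e} mol/L")
```

Sampling the uncertain constant rather than using its point estimate turns a single solubility number into a distribution, which is the essence of the uncertainty analysis the report describes.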

  6. COSY INFINITY version 8

    International Nuclear Information System (INIS)

    Makino, Kyoko; Berz, Martin

    1999-01-01

    The latest version of the particle optics code COSY INFINITY is presented. Using Differential Algebraic (DA) methods, the code allows the computation of aberrations of arbitrary field arrangements to in principle unlimited order. Besides providing a general overview of the code, several recent techniques developed for specific applications are highlighted. These include new features for the direct utilization of detailed measured fields as well as rigorous treatment of remainder bounds

  7. EASI graphics - Version II

    International Nuclear Information System (INIS)

    Allensworth, J.A.

    1984-04-01

    EASI (Estimate of Adversary Sequence Interruption) is an analytical technique for measuring the effectiveness of physical protection systems. EASI Graphics is a computer graphics extension of EASI which provides a capability for performing sensitivity and trade-off analyses of the parameters of a physical protection system. This document reports on the implementation of the Version II of EASI Graphics and illustrates its application with some examples. 5 references, 15 figures, 6 tables

  8. NOAA Climate Data Record (CDR) of Ocean Near Surface Atmospheric Properties, Version 1 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  9. NOAA Climate Data Record (CDR) of Ocean Heat Fluxes, Version 1.0 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  10. Global Historical Climatology Network - Daily (GHCN-Daily), Version 2 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  11. NOAA Climate Data Record (CDR) of Sea Surface Temperature - WHOI, Version 1.0 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  12. CT-guided Irreversible Electroporation in an Acute Porcine Liver Model: Effect of Previous Transarterial Iodized Oil Tissue Marking on Technical Parameters, 3D Computed Tomographic Rendering of the Electroporation Zone, and Histopathology

    International Nuclear Information System (INIS)

    Sommer, C. M.; Fritz, S.; Vollherbst, D.; Zelzer, S.; Wachter, M. F.; Bellemann, N.; Gockner, T.; Mokry, T.; Schmitz, A.; Aulmann, S.; Stampfl, U.; Pereira, P.; Kauczor, H. U.; Werner, J.; Radeleff, B. A.

    2015-01-01

Purpose: To evaluate the effect of previous transarterial iodized oil tissue marking (ITM) on technical parameters, three-dimensional (3D) computed tomographic (CT) rendering of the electroporation zone, and histopathology after CT-guided irreversible electroporation (IRE) in an acute porcine liver model as a potential strategy to improve IRE performance. Methods: After Ethics Committee approval was obtained, in five landrace pigs, two IREs of the right and left liver (RL and LL) were performed under CT guidance with identical electroporation parameters. Before IRE, transarterial marking of the LL was performed with iodized oil. Nonenhanced and contrast-enhanced CT examinations followed. One hour after IRE, animals were killed and livers collected. Mean resulting voltage and amperage during IRE were assessed. For 3D CT rendering of the electroporation zone, parameters for size and shape were analyzed. Quantitative data were compared by the Mann–Whitney test. Histopathological differences were assessed. Results: Mean resulting voltage and amperage were 2,545.3 ± 66.0 V and 26.1 ± 1.8 A for RL, and 2,537.3 ± 69.0 V and 27.7 ± 1.8 A for LL, without significant differences. Short axis, volume, and sphericity index were 16.5 ± 4.4 mm, 8.6 ± 3.2 cm³, and 1.7 ± 0.3 for RL, and 18.2 ± 3.4 mm, 9.8 ± 3.8 cm³, and 1.7 ± 0.3 for LL, without significant differences. For RL and LL, the electroporation zone consisted of severely widened hepatic sinusoids containing erythrocytes and showed homogeneous apoptosis. For LL, iodized oil could be detected in the center and at the rim of the electroporation zone. Conclusion: There is no adverse effect of previous ITM on technical parameters, 3D CT rendering of the electroporation zone, and histopathology after CT-guided IRE of the liver
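The group comparison reported above relies on the Mann–Whitney test. A minimal, self-contained sketch of that test, using invented short-axis values rather than the study's data, looks like this:

```python
import math

def mann_whitney_u(a, b):
    """Mann-Whitney U via pairwise comparison (ties count 0.5),
    with a large-sample normal approximation for the z statistic."""
    u = sum(1.0 if x > y else 0.5 if x == y else 0.0 for x in a for y in b)
    n1, n2 = len(a), len(b)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return u, (u - mu) / sigma

# Hypothetical short-axis measurements (mm) for the two liver lobes
rl = [14.1, 20.3, 12.8, 18.9, 16.5]
ll = [19.2, 15.6, 21.0, 16.8, 18.4]
u, z = mann_whitney_u(rl, ll)
print(f"U = {u}, z = {z:.2f}")   # |z| < 1.96: no significant difference
```

With samples this small a real analysis would use exact tables (or `scipy.stats.mannwhitneyu`) rather than the normal approximation; the sketch only shows the structure of the statistic.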

  13. CT-guided Irreversible Electroporation in an Acute Porcine Liver Model: Effect of Previous Transarterial Iodized Oil Tissue Marking on Technical Parameters, 3D Computed Tomographic Rendering of the Electroporation Zone, and Histopathology

    Energy Technology Data Exchange (ETDEWEB)

    Sommer, C. M., E-mail: christof.sommer@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Fritz, S., E-mail: stefan.fritz@med.uni-heidelberg.de [University Hospital Heidelberg, Department of General Visceral and Transplantation Surgery (Germany); Vollherbst, D., E-mail: dominikvollherbst@web.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Zelzer, S., E-mail: s.zelzer@dkfz-heidelberg.de [German Cancer Research Center (dkfz), Medical and Biological Informatics (Germany); Wachter, M. F., E-mail: fredericwachter@googlemail.com; Bellemann, N., E-mail: nadine.bellemann@med.uni-heidelberg.de; Gockner, T., E-mail: theresa.gockner@med.uni-heidelberg.de; Mokry, T., E-mail: theresa.mokry@med.uni-heidelberg.de; Schmitz, A., E-mail: anne.schmitz@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Aulmann, S., E-mail: sebastian.aulmann@mail.com [University Hospital Heidelberg, Department of General Pathology (Germany); Stampfl, U., E-mail: ulrike.stampfl@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Pereira, P., E-mail: philippe.pereira@slk-kliniken.de [SLK Kliniken Heilbronn GmbH, Clinic for Radiology, Minimally-invasive Therapies and Nuclear Medicine (Germany); Kauczor, H. U., E-mail: hu.kauczor@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Werner, J., E-mail: jens.werner@med.uni-heidelberg.de [University Hospital Heidelberg, Department of General Visceral and Transplantation Surgery (Germany); Radeleff, B. A., E-mail: boris.radeleff@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany)

    2015-02-15

Purpose: To evaluate the effect of previous transarterial iodized oil tissue marking (ITM) on technical parameters, three-dimensional (3D) computed tomographic (CT) rendering of the electroporation zone, and histopathology after CT-guided irreversible electroporation (IRE) in an acute porcine liver model as a potential strategy to improve IRE performance. Methods: After Ethics Committee approval was obtained, in five landrace pigs, two IREs of the right and left liver (RL and LL) were performed under CT guidance with identical electroporation parameters. Before IRE, transarterial marking of the LL was performed with iodized oil. Nonenhanced and contrast-enhanced CT examinations followed. One hour after IRE, animals were killed and livers collected. Mean resulting voltage and amperage during IRE were assessed. For 3D CT rendering of the electroporation zone, parameters for size and shape were analyzed. Quantitative data were compared by the Mann–Whitney test. Histopathological differences were assessed. Results: Mean resulting voltage and amperage were 2,545.3 ± 66.0 V and 26.1 ± 1.8 A for RL, and 2,537.3 ± 69.0 V and 27.7 ± 1.8 A for LL, without significant differences. Short axis, volume, and sphericity index were 16.5 ± 4.4 mm, 8.6 ± 3.2 cm³, and 1.7 ± 0.3 for RL, and 18.2 ± 3.4 mm, 9.8 ± 3.8 cm³, and 1.7 ± 0.3 for LL, without significant differences. For RL and LL, the electroporation zone consisted of severely widened hepatic sinusoids containing erythrocytes and showed homogeneous apoptosis. For LL, iodized oil could be detected in the center and at the rim of the electroporation zone. Conclusion: There is no adverse effect of previous ITM on technical parameters, 3D CT rendering of the electroporation zone, and histopathology after CT-guided IRE of the liver.

  14. Numerical Recipes in C++: The Art of Scientific Computing (2nd edn). Numerical Recipes Example Book (C++) (2nd edn). Numerical Recipes Multi-Language Code CD ROM with LINUX or UNIX Single-Screen License Revised Version

    International Nuclear Information System (INIS)

    Borcherds, P

    2003-01-01

    The two Numerical Recipes books are marvellous. The principal book, The Art of Scientific Computing, contains program listings for almost every conceivable requirement, and it also contains a well written discussion of the algorithms and the numerical methods involved. The Example Book provides a complete driving program, with helpful notes, for nearly all the routines in the principal book. The first edition of Numerical Recipes: The Art of Scientific Computing was published in 1986 in two versions, one with programs in Fortran, the other with programs in Pascal. There were subsequent versions with programs in BASIC and in C. The second, enlarged edition was published in 1992, again in two versions, one with programs in Fortran (NR(F)), the other with programs in C (NR(C)). In 1996 the authors produced Numerical Recipes in Fortran 90: The Art of Parallel Scientific Computing as a supplement, called Volume 2, with the original (Fortran) version referred to as Volume 1. Numerical Recipes in C++ (NR(C++)) is another version of the 1992 edition. The numerical recipes are also available on a CD ROM: if you want to use any of the recipes, I would strongly advise you to buy the CD ROM. The CD ROM contains the programs in all the languages. When the first edition was published I bought it, and have also bought copies of the other editions as they have appeared. Anyone involved in scientific computing ought to have a copy of at least one version of Numerical Recipes, and there also ought to be copies in every library. If you already have NR(F), should you buy the NR(C++) and, if not, which version should you buy? In the preface to Volume 2 of NR(F), the authors say 'C and C++ programmers have not been far from our minds as we have written this volume, and we think that you will find that time spent in absorbing its principal lessons will be amply repaid in the future as C and C++ eventually develop standard parallel extensions'. In the preface and introduction to NR

  15. Program X4TOC4 (Version 86-1). Translation of experimental data from the EXFOR format to a computation format

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1986-09-01

Experimental nuclear reaction data are compiled worldwide in the EXFOR format. The computer program X4TOC4 described in the present document translates data from the rather flexible EXFOR format to the more rigid ''computation format'', which is suitable as input to further computer processing of the data, including graphical plotting. The program is available cost-free from the IAEA Nuclear Data Section, upon request. (author)
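The translation step can be pictured as reading column-oriented records and emitting plain (energy, cross-section) pairs. The record layout below is invented purely for illustration; it is not the real EXFOR or computation-format layout.

```python
# Hypothetical two-column records: energy (eV) and cross section (barns).
RAW = """\
 1.0000E+00 2.5000E-01
 2.0000E+00 1.2500E-01
 5.0000E+00 5.0000E-02
"""

def parse(text):
    """Convert column-oriented records into a list of (energy, xs) tuples."""
    points = []
    for line in text.splitlines():
        if not line.strip():
            continue                      # skip blank records
        energy, xs = (float(tok) for tok in line.split())
        points.append((energy, xs))
    return points

print(parse(RAW))
```

The real translator must additionally handle EXFOR's flexible sub-entry structure, units, and metadata, which is exactly why a dedicated program such as X4TOC4 exists.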

  16. User's Manual for LEWICE Version 3.2

    Science.gov (United States)

    Wright, William

    2008-01-01

    A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 3.2 of this software, which is called LEWICE. This version differs from release 2.0 due to the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air applications, the addition of automated Navier-Stokes analysis, an empirical model for supercooled large droplets (SLD) and a pneumatic boot option. An extensive effort was also undertaken to compare the results against the database of electrothermal results which have been generated in the NASA Glenn Icing Research Tunnel (IRT) as was performed for the validation effort for version 2.0. This report will primarily describe the features of the software related to the use of the program. Appendix A has been included to list some of the inner workings of the software or the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this software.

  17. Computational analysis of perturbations in the post-fusion Dengue virus envelope protein highlights known epitopes and conserved residues in the Zika virus [version 2; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Sandeep Chakraborty

    2016-09-01

little emphasis in existing literature, are found to have significant electrostatic perturbation. Thus, a combination of different computational methods enables the rapid and rational detection of critical residues as epitopes in the search for an elusive therapy or vaccine that neutralizes multiple members of the Flaviviridae family. These secondary structures are conserved in the related Dengue virus (DENV), and possibly rationalize isolation techniques such as particle adsorption on magnetic beads coated with anionic polymers and anionic antiviral agents (viprolaxikine) for DENV. These amphipathic α-helices could enable the design of molecules for inhibiting α-helix-mediated protein-protein interactions. Comparison of these secondary structures in proteins from related families illuminates subtle changes that might render the proteins ineffective to previously successful drugs and vaccines, changes which are difficult to identify by a simple sequence or structural alignment. Finally, conflicting results about residues involved in neutralizing a DENV-E protein by the potent antibody 5J7 (PDB ID: 3J6U) are reported.

  18. Global Precipitation Climatology Project (GPCP) - Monthly, Version 2.2 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Version 2.2 of the dataset has been superseded by a newer version. Users should not use version 2.2 except in rare cases (e.g., when reproducing previous studies...

  19. Program PLOTC4 (Version 86-1). Plot evaluated data from the ENDF/B format and/or experimental data which is in a computation format

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1986-09-01

Experimental and evaluated nuclear reaction data are compiled worldwide in the EXFOR and ENDF formats, respectively. The computer program PLOTC4 described in the present document plots data from both formats; EXFOR data must first be converted to a ''computation format''. The program is available cost-free from the IAEA Nuclear Data Section, upon request. (author)

  20. Backfilling the Grid with Containerized BOINC in the ATLAS computing

    CERN Document Server

    Wu, Wenjing; The ATLAS collaboration

    2018-01-01

Virtualization is a commonly used solution for exploiting opportunistic computing resources in the HEP field, as it provides the unified software and OS layer that HEP computing tasks require on top of heterogeneous opportunistic resources. However, virtualization always carries a performance penalty, and for short jobs, which are typical of volunteer computing tasks, its overhead becomes a large fraction of the wall time and leads to low CPU efficiency. With the wide usage of containers in HEP computing, we explore the possibility of adopting container technology in the ATLAS BOINC project; hence we implemented a Native version in BOINC, which uses the Singularity container or direct usage of the target OS to replace VirtualBox. In this paper, we will discuss 1) the implementation and workflow of the Native version in ATLAS BOINC; 2) the performance measurement of the Native version compared to the previous Virtualization version; 3)...
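Why a fixed virtualization overhead hurts short jobs most can be seen directly from the definition of CPU efficiency. The numbers below are illustrative, not measurements from the paper:

```python
def cpu_efficiency(cpu_seconds, overhead_seconds):
    """Fraction of wall time spent on useful work, assuming
    wall time = CPU time + a fixed per-job startup overhead."""
    return cpu_seconds / (cpu_seconds + overhead_seconds)

# Hypothetical fixed 120 s VM startup overhead:
for job_seconds in (600, 7200):           # a 10-minute vs a 2-hour job
    print(job_seconds, round(cpu_efficiency(job_seconds, 120), 3))
```

The same fixed overhead costs the short job roughly 17% of its wall time but the long job under 2%, which is the motivation for replacing the VM with a lighter container layer.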

  1. MEASUREMENT AND PRECISION, EXPERIMENTAL VERSION.

    Science.gov (United States)

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    THIS DOCUMENT IS AN EXPERIMENTAL VERSION OF A PROGRAMED TEXT ON MEASUREMENT AND PRECISION. PART I CONTAINS 24 FRAMES DEALING WITH PRECISION AND SIGNIFICANT FIGURES ENCOUNTERED IN VARIOUS MATHEMATICAL COMPUTATIONS AND MEASUREMENTS. PART II BEGINS WITH A BRIEF SECTION ON EXPERIMENTAL DATA, COVERING SUCH POINTS AS (1) ESTABLISHING THE ZERO POINT, (2)…
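Rounding to a fixed number of significant figures, the central operation in such precision exercises, can be sketched as:

```python
import math

def round_sig(x, sig):
    """Round x to `sig` significant figures."""
    if x == 0:
        return 0.0
    # Position of the leading digit decides how many decimal places to keep.
    return round(x, sig - 1 - int(math.floor(math.log10(abs(x)))))

print(round_sig(0.0123456, 3))  # 0.0123
print(round_sig(98765, 2))      # 99000
```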

  2. PLOT3D (version-5): a computer code for drawing three dimensional graphs, maps and histograms, in single or multiple colours, for mono or stereoscopic viewing

    International Nuclear Information System (INIS)

    Jayaswal, Balhans

    1987-01-01

The PLOT3D series of graphics codes (versions 1 to 5) has been developed for drawing three-dimensional graphs, maps, histograms, and simple layout diagrams on monochrome or colour raster graphics terminals and plotters. Of these, PLOT3D Version-5 is an advanced code, equipped with several features that make it especially suitable for drawing 3D maps, multicolour 3D and contour graphs, and 3D layout diagrams in axonometric or perspective projection. Prior to drawing, graphics parameters that define orientation, magnification, smoothing, shading, colour map, etc. of the figure can be selected interactively by means of simple commands on the user terminal, or by reading those commands from an input data file. This code requires linking with any one of three supporting libraries: PLOT 10 TCS, PLOT 10 IGL, and CALCOMP; the figure can be plotted in a single colour, or displayed in single or multiple colours, depending upon the type of library support and output device. Furthermore, this code can also be used to plot left- and right-eye view projections of a 3D figure for composing a stereoscopic image with the aid of a viewer. 14 figures. (author)

  3. Normative data for a computer-assisted version of the auditory three-consonant Brown-Peterson paradigm in the elderly French-Quebec population.

    Science.gov (United States)

    Callahan, Brandy L; Belleville, Sylvie; Ferland, Guylaine; Potvin, Olivier; Tremblay, Marie-Pier; Hudon, Carol; Macoir, Joël

    2014-01-01

    The Brown-Peterson task is used to assess verbal short-term memory as well as divided attention. In its auditory three-consonant version, trigrams are presented to participants who must recall the items in correct order after variable delays, during which an interference task is performed. The present study aimed to establish normative data for this test in the elderly French-Quebec population based on cross-sectional data from a retrospective, multi-center convenience sample. A total of 595 elderly native French-speakers from the province of Quebec performed the Memoria version of the auditory three-consonant Brown-Peterson test. For both series and item-by-item scoring methods, age, education, and, in most cases, recall after a 0-second interval were found to be significantly associated with recall performance after 10-second, 20-second, and 30-second interference intervals. Based on regression model results, equations to calculate Z scores are presented for the 10-second, 20-second and 30-second intervals and for each scoring method to allow estimation of expected performance based on participants' individual characteristics. As an important ceiling effect was observed at the 0-second interval, norms for this interference interval are presented in percentiles.
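The regression-based Z scores described above follow the usual normative pattern z = (observed - predicted) / RMSE, where the prediction comes from a demographic regression. The coefficients below are invented for illustration and are not the published Quebec norms:

```python
def regression_z(observed, age, education, coefs, rmse):
    """Z score relative to a demographically adjusted normative regression.
    coefs = (intercept, slope for age, slope for years of education);
    all values here are hypothetical."""
    intercept, b_age, b_edu = coefs
    predicted = intercept + b_age * age + b_edu * education
    return (observed - predicted) / rmse

# Hypothetical case: recall score 7/12 for a 75-year-old with 10 years of schooling
z = regression_z(7, age=75, education=10, coefs=(12.0, -0.08, 0.20), rmse=2.0)
print(round(z, 2))  # -0.5: half a standard deviation below the expected score
```

This is why the norms must report both the regression equations and the residual standard deviations: without the RMSE, an observed-minus-predicted difference cannot be standardized.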

  4. Program PLOTC4. (Version 87-1). Plot evaluated data from the ENDF/B format and/or experimental data which is in a computation format

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1987-06-01

    Experimental and evaluated nuclear reaction data are world-wide compiled in EXFOR format (see document IAEA-NDS-1) and ENDF format (see document IAEA-NDS-10), respectively. The computer program PLOTC4 described in the present document plots data from both formats; EXFOR data must be converted first to a ''computation format'' (see document IAEA-NDS-80). The program is available upon request costfree from the IAEA Nuclear Data Section. (author)

  5. Comparison of RECIST version 1.0 and 1.1 in assessment of tumor response by computed tomography in advanced gastric cancer.

    Science.gov (United States)

    Jang, Gil-Su; Kim, Min-Jeong; Ha, Hong-Il; Kim, Jung Han; Kim, Hyeong Su; Ju, Sung Bae; Zang, Dae Young

    2013-12-01

    Response Evaluation Criteria in Solid Tumors (RECIST) guideline version 1.0 (RECIST 1.0) was proposed as a new guideline for evaluating tumor response and has been widely accepted as a standardized measure. With a number of issues being raised on RECIST 1.0, however, a revised RECIST guideline version 1.1 (RECIST 1.1) was proposed by the RECIST Working Group in 2009. This study was conducted to compare CT tumor response based on RECIST 1.1 vs. RECIST 1.0 in patients with advanced gastric cancer (AGC). We reviewed 61 AGC patients with measurable diseases by RECIST 1.0 who were enrolled in other clinical trials between 2008 and 2010. These patients were retrospectively re-analyzed to determine the concordance between the two response criteria using the κ statistic. The number and sum of tumor diameters of the target lesions by RECIST 1.1 were significantly lower than those by RECIST 1.0 (P<0.0001). However, there was excellent agreement in tumor response between RECIST 1.1 and RECIST 1.0 (κ=0.844). The overall response rates (ORRs) according to RECIST 1.0 and RECIST 1.1 were 32.7% (20/61) and 34.5% (20/58), respectively. One patient with partial response (PR) based on RECIST 1.0 was reclassified as stable disease (SD) by RECIST 1.1. Of two patients with SD by RECIST 1.0, one was downgraded to progressive disease and the other was upgraded to PR by RECIST 1.1. RECIST 1.1 provided almost perfect agreement with RECIST 1.0 in the CT assessment of tumor response of AGC.
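Agreement between the two criteria was quantified with the unweighted κ statistic, which can be computed directly from paired response labels. The per-patient labels below are hypothetical, not the trial's data:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Unweighted Cohen's kappa for two raters over the same cases."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    # Chance agreement from the marginal label frequencies.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical per-patient best responses under the two criteria
recist_10 = ["PR", "PR", "SD", "SD", "PD", "CR", "SD", "PR"]
recist_11 = ["PR", "SD", "SD", "PD", "PD", "CR", "SD", "PR"]
print(round(cohens_kappa(recist_10, recist_11), 3))
```

Values above about 0.8, like the study's κ = 0.844, are conventionally read as almost perfect agreement; the toy data above land lower because two of eight labels were changed.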

  6. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action [version 1; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Aleksandra Pawlik

    2017-07-01

    Full Text Available Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community.

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  8. Beam dynamics simulations using a parallel version of PARMILA

    International Nuclear Information System (INIS)

    Ryne, R.D.

    1996-01-01

The computer code PARMILA has been the primary tool for the design of proton and ion linacs in the United States for nearly three decades. Previously it was sufficient to perform simulations with of order 10,000 particles, but recently the need to perform high-resolution halo studies for next-generation, high-intensity linacs has made it necessary to perform simulations with of order 100 million particles. With the advent of massively parallel computers, such simulations are now within reach. Parallel computers already make it possible, for example, to perform beam dynamics calculations with tens of millions of particles, requiring over 10 GByte of core memory, in just a few hours. Also, parallel computers are becoming easier to use thanks to the availability of mature, Fortran-like languages such as Connection Machine Fortran and High Performance Fortran. We will describe our experience developing a parallel version of PARMILA and the performance of the new code
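The core idea of a parallel particle push, with each worker advancing an independent slice of the bunch, can be sketched in a few lines. This is a toy drift step, not PARMILA's physics, and the decomposition into four workers is purely illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def drift(chunk, dt):
    """Advance (position, velocity) pairs by one free-drift step."""
    return [(x + v * dt, v) for x, v in chunk]

# Toy bunch: 1000 particles, all with velocity 0.5 (arbitrary units).
particles = [(float(i), 0.5) for i in range(1000)]
chunks = [particles[i::4] for i in range(4)]       # 4 independent slices
with ThreadPoolExecutor(max_workers=4) as pool:
    advanced = [p for part in pool.map(drift, chunks, [0.1] * 4) for p in part]
print(len(advanced), advanced[0])
```

Single-particle drift is trivially parallel; the hard part of a real parallel PARMILA is the space-charge calculation, which couples all particles and forces communication between workers.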

  10. Exploring individual cognitions, self-regulation skills, and environmental-level factors as mediating variables of two versions of a Web-based computer-tailored nutrition education intervention aimed at adults: A randomized controlled trial.

    Science.gov (United States)

    Springvloet, Linda; Lechner, Lilian; Candel, Math J J M; de Vries, Hein; Oenema, Anke

    2016-03-01

    This study explored whether the determinants that were targeted in two versions of a Web-based computer-tailored nutrition education intervention mediated the effects on fruit, high-energy snack, and saturated fat intake among adults who did not comply with dietary guidelines. An RCT was conducted with a basic group (a tailored intervention targeting individual cognitions and self-regulation), a plus group (additionally targeting environmental-level factors), and a control group (generic nutrition information). Participants were recruited from the general Dutch adult population and randomly assigned to one of the study groups. Online self-reported questionnaires assessed dietary intake and potential mediating variables (behavior-specific cognitions, action and coping planning, environmental-level factors) at baseline and one (T1) and four (T2) months post-intervention (i.e. four and seven months after baseline). The joint-significance test was used to establish mediating variables at different time points (T1-mediating variables - T2-intake; T1-mediating variables - T1-intake; T2-mediating variables - T2-intake). Educational differences were examined by testing interaction terms. The effect of the plus version on fruit intake was mediated (T2-T2) by intention and fruit availability at home, and for high-educated participants also by attitude. Among low/moderate-educated participants, high-energy snack availability at home mediated (T1-T1) the effect of the basic version on high-energy snack intake. Subjective norm mediated (T1-T1) the effect of the basic version on fat intake among high-educated participants. Only some of the targeted determinants mediated the effects of both intervention versions on fruit, high-energy snack, and saturated fat intake. A possible reason for not finding a more pronounced pattern of mediating variables is that the educational content was tailored to individual characteristics and that participants only received feedback for relevant and not for all

  11. New version: GRASP2K relativistic atomic structure package

    Science.gov (United States)

    Jönsson, P.; Gaigalas, G.; Bieroń, J.; Fischer, C. Froese; Grant, I. P.

    2013-09-01

    , Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 730252 No. of bytes in distributed program, including test data, etc.: 14808872 Distribution format: tar.gz Programming language: Fortran. Computer: Intel Xeon, 2.66 GHz. Operating system: Suse, Ubuntu, and Debian Linux 64-bit. RAM: 500 MB or more Classification: 2.1. Catalogue identifier of previous version: ADZL_v1_0 Journal reference of previous version: Comput. Phys. Comm. 177 (2007) 597 Does the new version supersede the previous version?: Yes Nature of problem: Prediction of atomic properties — atomic energy levels, oscillator strengths, radiative decay rates, hyperfine structure parameters, Landé gJ-factors, and specific mass shift parameters — using a multiconfiguration Dirac-Hartree-Fock approach. Solution method: The computational method is the same as in the previous GRASP2K [1] version except that for v3 codes the njgraf library module [2] for recoupling has been replaced by librang [3,4]. Reasons for new version: New angular libraries with improved performance are available. Also methodology for transforming from jj- to LSJ-coupling has been developed. Summary of revisions: New angular libraries where the coefficients of fractional parentage have been extended to j=9/2, making calculations feasible for the lanthanides and actinides. Inclusion of a new program jj2lsj, which reports the percentage composition of the wave function in LSJ. Transition programs have been modified to produce a file of transition data with one record for each transition in the same format as Atsp2K [C. Froese Fischer, G. Tachiev, G. Gaigalas and M.R. Godefroid, Comput. Phys. Commun. 176 (2007) 559], which identifies each atomic state by the total energy and a label for the CSF with the largest expansion coefficient in LSJ intermediate coupling. Updated to 64-bit architecture. A

  12. NOAA Climate Data Record (CDR) of AVHRR Daily and Monthly Aerosol Optical Thickness over Global Oceans, Version 1.0 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Version 1 of the dataset has been superseded by a newer version. Users should not use version 1 except in rare cases (e.g., when reproducing previous studies that...

  13. NOAA Climate Data Record (CDR) of AVHRR Daily and Monthly Aerosol Optical Thickness over Global Oceans, Version 2.0 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Version 2 of the dataset has been superseded by a newer version. Users should not use version 2 except in rare cases (e.g., when reproducing previous studies that...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and events that are harder to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been busy with routine operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  15. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year in which the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations, and Data Operations. These groups work closely with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months, activities focused on data operations, on testing and reinforcing shift and operational procedures for data production and transfer, on MC production, and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created to support the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, which collects user experience and feedback during analysis activities and develops tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at Fermilab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  17. TJ-II Library Manual (Version 2)

    International Nuclear Information System (INIS)

    Tribaldos, V.; Milligen, B. Ph. van; Lopez-Fraguas, A.

    2001-01-01

    This is a user manual for the TJ2 Numerical Library, which has been developed for numerical computations of different TJ-II configurations. It is a new version of the earlier manual, CIEMAT report 806. (Author)

  18. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (IBM PC VERSION)

    Science.gov (United States)

    Donnell, B.

    1994-01-01

    COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to the global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the

  19. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (UNIX VERSION)

    Science.gov (United States)

    Donnell, B.

    1994-01-01

    COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to the global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the

  20. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (MACINTOSH VERSION)

    Science.gov (United States)

    Riley, G.

    1994-01-01

    COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to the global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the

  1. TASAC, a computer program for thermal analysis of severe accident conditions. Version 3/01, Dec 1991. Model description and user's guide

    International Nuclear Information System (INIS)

    Stempniewicz, M.; Marks, P.; Salwa, K.

    1992-06-01

    TASAC (Thermal Analysis of Severe Accident Conditions) is a computer code, developed at the Institute of Atomic Energy and written in FORTRAN 77, for the analysis of PWR rod bundle behaviour during severe accident conditions. The code can model an early stage of core degradation, including heat transfer inside the rods, convective and radiative heat exchange, cladding interactions with coolant and fuel, hydrogen generation, and melting, relocation, and refreezing of fuel rod materials with dissolution of UO2 and ZrO2 in the liquid phase. The code was applied to the simulation of International Standard Problem 28, performed on the PHEBUS test facility. This report contains a description of the program's physical models, a detailed description of the input data requirements, and the results of code verification. The main directions for future TASAC code development are formulated. (author). 20 refs, 39 figs, 4 tabs

  2. TASAC, a computer program for thermal analysis of severe accident conditions. Version 3/01, Dec 1991. Model description and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Stempniewicz, M; Marks, P; Salwa, K

    1992-06-01

    TASAC (Thermal Analysis of Severe Accident Conditions) is a computer code, developed at the Institute of Atomic Energy and written in FORTRAN 77, for the analysis of PWR rod bundle behaviour during severe accident conditions. The code can model an early stage of core degradation, including heat transfer inside the rods, convective and radiative heat exchange, cladding interactions with coolant and fuel, hydrogen generation, and melting, relocation, and refreezing of fuel rod materials with dissolution of UO2 and ZrO2 in the liquid phase. The code was applied to the simulation of International Standard Problem 28, performed on the PHEBUS test facility. This report contains a description of the program's physical models, a detailed description of the input data requirements, and the results of code verification. The main directions for future TASAC code development are formulated. (author). 20 refs, 39 figs, 4 tabs.

  3. Computational and experimental fluid mechanics. Draft version of annex to final report for period January 1st 1993 to December 31st 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-01

    The general purpose of the program has been the development of efficient algorithms, their implementation in codes of Computational Fluid Mechanics (CFD), and the experimental verification of these codes. Flows of both fundamental and applied nature have been investigated, including flows in industrial process equipment, around aerodynamic structures and ships, and flows over bed forms of importance for sediment transport. The experimental work has included the development of improved techniques, with emphasis on optical methods. The objectives were realized through a coordinated experimental and theoretical/computational research program, organized in six specific projects: 1. CFD methods and algorithms. 2. Special element simulation of ultrafiltration. 3. Turbulent swirling flows. 4. Near-wall models of turbulence and development of experimental techniques. 5. Flow over bed forms. 6. Flow past a ship hull. (au)

  4. EQ3NR, a computer program for geochemical aqueous speciation-solubility calculations: Theoretical manual, user's guide, and related documentation (Version 7.0); Part 3

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T.J.

    1992-09-14

    EQ3NR is an aqueous solution speciation-solubility modeling code. It is part of the EQ3/6 software package for geochemical modeling. It computes the thermodynamic state of an aqueous solution by determining the distribution of chemical species, including simple ions, ion pairs, and complexes, using standard state thermodynamic data and various equations which describe the thermodynamic activity coefficients of these species. The input to the code describes the aqueous solution in terms of analytical data, including total (analytical) concentrations of dissolved components and such other parameters as the pH, pHCl, Eh, pe, and oxygen fugacity. The input may also include a desired electrical balancing adjustment and various constraints which impose equilibrium with special pure minerals, solid solution end-member components (of specified mole fractions), and gases (of specified fugacities). The code evaluates the degree of disequilibrium in terms of the saturation index (SI = log Q/K) and the thermodynamic affinity (A = -2.303 RT log Q/K) for various reactions, such as mineral dissolution or oxidation-reduction in the aqueous solution itself. Individual values of Eh, pe, oxygen fugacity, and Ah (redox affinity) are computed for aqueous redox couples. Equilibrium fugacities are computed for gas species. The code is highly flexible in dealing with various parameters as either model inputs or outputs. The user can specify modification or substitution of equilibrium constants at run time by using options on the input file.

  5. EQ6, a computer program for reaction path modeling of aqueous geochemical systems: Theoretical manual, user's guide, and related documentation (Version 7.0); Part 4

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T.J.; Daveler, S.A.

    1992-10-09

    EQ6 is a FORTRAN computer program in the EQ3/6 software package (Wolery, 1979). It calculates reaction paths (chemical evolution) in reacting water-rock and water-rock-waste systems. Speciation in aqueous solution is an integral part of these calculations. EQ6 computes models of titration processes (including fluid mixing), irreversible reaction in closed systems, irreversible reaction in some simple kinds of open systems, and heating or cooling processes, as well as solving "single-point" thermodynamic equilibrium problems. A reaction path calculation normally involves a sequence of thermodynamic equilibrium calculations. Chemical evolution is driven by a set of irreversible reactions (i.e., reactions out of equilibrium) and/or changes in temperature and/or pressure. These irreversible reactions usually represent the dissolution or precipitation of minerals or other solids. The code computes the appearance and disappearance of phases in solubility equilibrium with the water. It finds the identities of these phases automatically. The user may specify which potential phases are allowed to form and which are not. There is an option to fix the fugacities of specified gas species, simulating contact with a large external reservoir. Rate laws for irreversible reactions may be either relative rates or actual rates. If any actual rates are used, the calculation has a time frame. Several forms for actual rate laws are programmed into the code. EQ6 is presently able to model both mineral dissolution and growth kinetics.

  6. EQ3NR, a computer program for geochemical aqueous speciation-solubility calculations: Theoretical manual, user's guide, and related documentation (Version 7.0)

    International Nuclear Information System (INIS)

    Wolery, T.J.

    1992-01-01

    EQ3NR is an aqueous solution speciation-solubility modeling code. It is part of the EQ3/6 software package for geochemical modeling. It computes the thermodynamic state of an aqueous solution by determining the distribution of chemical species, including simple ions, ion pairs, and complexes, using standard state thermodynamic data and various equations which describe the thermodynamic activity coefficients of these species. The input to the code describes the aqueous solution in terms of analytical data, including total (analytical) concentrations of dissolved components and such other parameters as the pH, pHCl, Eh, pe, and oxygen fugacity. The input may also include a desired electrical balancing adjustment and various constraints which impose equilibrium with special pure minerals, solid solution end-member components (of specified mole fractions), and gases (of specified fugacities). The code evaluates the degree of disequilibrium in terms of the saturation index (SI = log Q/K) and the thermodynamic affinity (A = -2.303 RT log Q/K) for various reactions, such as mineral dissolution or oxidation-reduction in the aqueous solution itself. Individual values of Eh, pe, oxygen fugacity, and Ah (redox affinity) are computed for aqueous redox couples. Equilibrium fugacities are computed for gas species. The code is highly flexible in dealing with various parameters as either model inputs or outputs. The user can specify modification or substitution of equilibrium constants at run time by using options on the input file.
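
The two disequilibrium measures defined above, SI = log Q/K and A = -2.303 RT log Q/K, can be sketched in a few lines. The Q and K values below are illustrative placeholders, not EQ3NR data:

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

def saturation_index(Q, K):
    """SI = log10(Q/K): negative when undersaturated, zero at equilibrium."""
    return math.log10(Q / K)

def affinity(Q, K, T=298.15):
    """Thermodynamic affinity A = -2.303*R*T*log10(Q/K), in J/mol."""
    return -2.303 * R * T * math.log10(Q / K)

# Illustrative mineral one order of magnitude undersaturated:
# SI = -1, so the affinity is positive (dissolution is favored).
si = saturation_index(1e-9, 1e-8)
A = affinity(1e-9, 1e-8)
```

At equilibrium (Q = K) both quantities vanish, which matches the sign convention stated in the abstract.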

  7. EQ6, a computer program for reaction path modeling of aqueous geochemical systems: Theoretical manual, user's guide, and related documentation (Version 7.0)

    International Nuclear Information System (INIS)

    Wolery, T.J.; Daveler, S.A.

    1992-01-01

    EQ6 is a FORTRAN computer program in the EQ3/6 software package (Wolery, 1979). It calculates reaction paths (chemical evolution) in reacting water-rock and water-rock-waste systems. Speciation in aqueous solution is an integral part of these calculations. EQ6 computes models of titration processes (including fluid mixing), irreversible reaction in closed systems, irreversible reaction in some simple kinds of open systems, and heating or cooling processes, as well as solving "single-point" thermodynamic equilibrium problems. A reaction path calculation normally involves a sequence of thermodynamic equilibrium calculations. Chemical evolution is driven by a set of irreversible reactions (i.e., reactions out of equilibrium) and/or changes in temperature and/or pressure. These irreversible reactions usually represent the dissolution or precipitation of minerals or other solids. The code computes the appearance and disappearance of phases in solubility equilibrium with the water. It finds the identities of these phases automatically. The user may specify which potential phases are allowed to form and which are not. There is an option to fix the fugacities of specified gas species, simulating contact with a large external reservoir. Rate laws for irreversible reactions may be either relative rates or actual rates. If any actual rates are used, the calculation has a time frame. Several forms for actual rate laws are programmed into the code. EQ6 is presently able to model both mineral dissolution and growth kinetics.
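
As a toy illustration of the kind of reaction-path step described above, the sketch below integrates the common rate law rate = k(1 - Q/K) for irreversible mineral dissolution with forward Euler. The values of k, K, the time step, and the starting activity product are made-up numbers, and EQ6's actual rate-law options are considerably richer:

```python
# Toy reaction-path step: irreversible dissolution drives the activity
# product Q toward the equilibrium constant K via dQ/dt = k*(1 - Q/K).
# All numbers are illustrative assumptions, not EQ6 defaults.
def dissolve(Q0, K, k=1e-3, dt=1.0, steps=5000):
    Q = Q0
    for _ in range(steps):
        Q += k * (1.0 - Q / K) * dt  # dissolution raises Q toward K
    return Q

Q_final = dissolve(Q0=1e-4, K=1e-2)  # converges to K along the path
```

Because the calculation "has a time frame" when actual rates are used, each Euler step here plays the role of one increment of reaction progress in time.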

  8. United States Climate Reference Network (USCRN) Processed Data (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  9. Integrated Global Radiosonde Archive (IGRA) - Monthly Means (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  10. EMERGENCIES: NEW VERSION

    CERN Multimedia

    Medical Service

    2002-01-01

    The table of emergency numbers that appeared in Bulletin 10/2002 is out of date. The updated version provided by the Medical Service appears on the following page. Please disregard the previous version. URGENT NEED OF A DOCTOR GENEVA PATIENT NOT FIT TO BE MOVED: Call your family doctor Or SOS MEDECINS (24H/24H) 748 49 50 Or ASSOC. OF GENEVA DOCTORS (7H-23H) 322 20 20 PATIENT CAN BE MOVED: HOPITAL CANTONAL 24 Micheli du Crest 372 33 11 382 33 11 CHILDREN'S HOSPITAL 6 rue Willy Donzé 382 68 18 382 45 55 MATERNITY 24 Micheli du Crest 382 68 16 382 33 11 OPHTALMOLOGY 22 Alcide Jentzer 382 84 00 HOPITAL DE LA TOUR Meyrin 719 61 11 CENTRE MEDICAL DE MEYRIN Champs Fréchets 719 74 00 EMERGENCIES: FIRE BRIGADE 118 FIRE BRIGADE CERN 767 44 44 URGENT NEED OF AN AMBULANCE (GENEVA AND VAUD): 144 POLICE 117 ANTI-POISON CENTRE 24H/24H 01 251 51 510 EUROPEAN EMERGENCY CALL: 112 FRANCE PATIENT NOT FIT TO BE MOVED: call your family doctor PATIENT CAN BE MOVED: ST. JULIE...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently devoted to activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team has successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

  12. ELIPGRID-PC: Upgraded version

    International Nuclear Information System (INIS)

    Davidson, J.R.

    1995-12-01

    Evaluating the need for and the effectiveness of remedial cleanup at waste sites often includes finding average contaminant concentrations and identifying pockets of contamination called hot spots. The standard tool for calculating the probability of detecting such hot spots has been the ELIPGRID code of Singer and Wickman. The ELIPGRID-PC program has recently made this algorithm available for an IBM® personal computer (PC) or compatible. A new version of ELIPGRID-PC, incorporating Monte Carlo test results and simple graphics, is described herein. Various examples of how to use the program for both single and multiple hot spot cases are given. The code for an American National Standards Institute (ANSI) C version of the ELIPGRID algorithm is provided, and limitations and further work are noted. This version of ELIPGRID-PC reliably meets the goal of moving Singer's ELIPGRID algorithm to the PC.
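
A hot-spot detection probability of the kind ELIPGRID computes analytically can also be estimated by Monte Carlo, which is the flavor of cross-check the new version incorporates. The sketch below treats only the simplest case, a circular hot spot sampled by a square grid; ELIPGRID itself handles elliptical targets, and none of the numbers here come from the code:

```python
import random

# Monte Carlo estimate of the probability that a square sampling grid
# with node spacing `spacing` detects a circular hot spot of radius
# `radius` (detection = some grid node falls inside the hot spot).
def hit_probability(radius, spacing, trials=100_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # By symmetry, place the hot-spot centre uniformly in one grid cell.
        cx = rng.uniform(0.0, spacing)
        cy = rng.uniform(0.0, spacing)
        # Detected if any of the cell's four sampling nodes lies inside it.
        if any((cx - gx) ** 2 + (cy - gy) ** 2 <= radius ** 2
               for gx in (0.0, spacing) for gy in (0.0, spacing)):
            hits += 1
    return hits / trials

p = hit_probability(radius=0.5, spacing=1.0)  # analytic value is pi/4
```

For radius = spacing/2 the exact answer is pi/4 (the four non-overlapping quarter-circles at the cell corners), which makes a convenient check on the estimator.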

  13. GENII Version 2 Users’ Guide

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.

    2004-03-08

    The GENII Version 2 computer code was developed for the Environmental Protection Agency (EPA) at Pacific Northwest National Laboratory (PNNL) to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) and the radiological risk estimating procedures of Federal Guidance Report 13 into updated versions of existing environmental pathway analysis models. The resulting environmental dosimetry computer codes are compiled in the GENII Environmental Dosimetry System. The GENII system was developed to provide a state-of-the-art, technically peer-reviewed, documented set of programs for calculating radiation dose and risk from radionuclides released to the environment. The codes were designed with the flexibility to accommodate input parameters for a wide variety of generic sites. Operation of a new version of the codes, GENII Version 2, is described in this report. Two versions of the GENII Version 2 code system are available, a full-featured version and a version specifically designed for demonstrating compliance with the dose limits specified in 40 CFR 61.93(a), the National Emission Standards for Hazardous Air Pollutants (NESHAPS) for radionuclides. The only differences lie in the limitation of the capabilities of the user to change specific parameters in the NESHAPS version. This report describes the data entry, accomplished via interactive, menu-driven user interfaces. Default exposure and consumption parameters are provided for both the average (population) and maximum individual; however, these may be modified by the user. Source term information may be entered as radionuclide release quantities for transport scenarios, or as basic radionuclide concentrations in environmental media (air, water, soil). For input of basic or derived concentrations, decay of parent radionuclides and ingrowth of radioactive decay products prior to the start of the exposure scenario may be considered. A single code run can
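
The pre-scenario decay and ingrowth correction mentioned above amounts to solving the Bateman equations. A minimal two-member sketch, with purely illustrative half-lives and inventory rather than GENII library data:

```python
import math

# Two-member Bateman chain: parent decay and daughter ingrowth before
# the start of an exposure scenario.  Half-lives and the initial
# inventory are assumptions made for illustration only.
def bateman_pair(n1_0, half_life_1, half_life_2, t):
    """Atoms of (parent, daughter) at time t; daughter initially absent."""
    l1 = math.log(2) / half_life_1
    l2 = math.log(2) / half_life_2
    n1 = n1_0 * math.exp(-l1 * t)
    n2 = n1_0 * l1 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))
    return n1, n2

# 1e6 parent atoms, half-lives 30 y (parent) and 5 y (daughter), 10 y delay:
parent, daughter = bateman_pair(1e6, 30.0, 5.0, 10.0)
```

The closed-form solution assumes the two decay constants differ; a chain with nearly equal half-lives would need the degenerate-case formula instead.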

  14. Application of modified version of SPPS-1 - HEXAB-2DB computer code package for operational analyses of fuel behaviour in WWER-440 reactors at Kozloduy NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kharalampieva, Ts; Stoyanova, I; Antonov, A; Simeonov, T [Kombinat Atomna Energetika, Kozloduj (Bulgaria); Petkov, P [Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. za Yadrena Izsledvaniya i Yadrena Energetika

    1994-12-31

    The modified version of the SPPS-1 code, called SPPS-1-HEXAB-2DB, was applied to operational analysis and to the prediction of power peaking factors and reactor core critical parameters for WWER-440 units. The results of calculations performed with the SPPS-1-HEXAB-2DB code, the corresponding parameters obtained from experiments at the Kozloduy NPP WWER-440 units, and the results for fuel rod power distribution are presented. The method of simulating the operation of a reactor core with 349 assemblies (Unit 4) and with 313 fuel assemblies plus 36 dummy fuel assemblies (Unit 1) is outlined. The modified code calculates not only fuel burnup and the distributions of Pm-149 and Sm-149 concentrations but also the space distribution of I-135 and Xe-135 concentrations. This makes it possible to simulate reactor operation in the periods immediately after start-up or shutdown and to predict the critical reactor core parameters during transients. The results obtained show that the SPPS-1-HEXAB-2DB code adequately describes the reactor core status. The new SPPS-1 code algorithm for estimating the assembly-wise distribution of power peaking factors in the reactor core is also described. The new code provides an option for checking the correctness of reactor core symmetry. Experience with the modified SPPS-1-HEXAB-2DB code system confirms its improved capability for operational analysis, prediction of safe operation, and estimation of fuel behaviour for the Kozloduy NPP WWER-440 units. 14 tabs., 4 figs., 5 refs.
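    The I-135/Xe-135 behaviour after shutdown that such codes track follows the standard iodine-xenon balance equations. A minimal sketch of the decay-only case (flux = 0), using textbook decay constants and a simple Euler step, is given below; this illustrates the physics, not the SPPS-1-HEXAB-2DB algorithm itself:

```python
# Illustrative I-135/Xe-135 balance after reactor shutdown (flux = 0).
# Textbook decay constants; not the code's actual method or data.
LAM_I = 2.87e-5   # I-135 decay constant in 1/s (half-life ~ 6.7 h)
LAM_X = 2.09e-5   # Xe-135 decay constant in 1/s (half-life ~ 9.2 h)

def xenon_after_shutdown(i0, x0, hours, dt=1.0):
    """Return (I-135, Xe-135) concentrations `hours` after shutdown,
    integrated with a forward Euler step of dt seconds."""
    i, x = i0, x0
    for _ in range(int(hours * 3600.0 / dt)):
        i_new = i - LAM_I * i * dt                 # iodine only decays
        x_new = x + (LAM_I * i - LAM_X * x) * dt   # xenon fed by iodine decay
        i, x = i_new, x_new
    return i, x

# With more iodine than xenon at shutdown (arbitrary units), xenon first
# builds up (the "xenon peak") before decaying away:
i8, x8 = xenon_after_shutdown(1.0, 0.3, 8.0)
```

    The post-shutdown xenon build-up is exactly why tracking I-135 alongside Xe-135 is needed to predict critical core parameters during transients.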

  15. Genetic similarity of polyploids - A new version of the computer program POPDIST (ver. 1.2.0) considers intraspecific genetic differentiation

    DEFF Research Database (Denmark)

    Tomiuk, Jürgen; Guldbrandtsen, Bernt; Loeschcke, Volker

    2009-01-01

    For evolutionary studies of polyploid species, estimates of the genetic identity between species with different degrees of ploidy are particularly needed, because gene counting in samples of polyploid individuals often cannot be done; e.g., in triploids the phenotype AB can be genotypically either ABB or AAB. We recently suggested a genetic distance measure that is based on phenotype counting and made available the computer program POPDIST. The program provides maximum-likelihood estimates of the genetic identities and distances between polyploid populations, but this approach...
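    The genotype ambiguity the abstract mentions can be made concrete by enumerating the genotypes consistent with an observed phenotype; a small sketch (not POPDIST's actual likelihood machinery, just the counting problem it addresses) is:

```python
from itertools import combinations_with_replacement

def genotypes_for_phenotype(alleles, ploidy):
    """All unordered genotypes of the given ploidy whose allele set
    matches the observed phenotype (every observed allele present)."""
    observed = set(alleles)
    return [g for g in combinations_with_replacement(sorted(observed), ploidy)
            if set(g) == observed]

# A triploid showing phenotype {A, B} can be AAB or ABB:
print(genotypes_for_phenotype("AB", 3))  # → [('A', 'A', 'B'), ('A', 'B', 'B')]
```

    The ambiguity grows with ploidy (a tetraploid AB phenotype already admits three genotypes), which is why phenotype counting rather than gene counting underlies the distance measure.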

  16. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB of RAW per event. The central collisions are more complex and...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  19. Using the genome aggregation database, computational pathogenicity prediction tools, and patch clamp heterologous expression studies to demote previously published long QT syndrome type 1 mutations from pathogenic to benign.

    Science.gov (United States)

    Clemens, Daniel J; Lentino, Anne R; Kapplinger, Jamie D; Ye, Dan; Zhou, Wei; Tester, David J; Ackerman, Michael J

    2018-04-01

    Mutations in the KCNQ1-encoded Kv7.1 potassium channel cause long QT syndrome (LQTS) type 1 (LQT1). It has been suggested that ∼10%-20% of rare LQTS case-derived variants in the literature may have been published erroneously as LQT1-causative mutations and may be "false positives." The purpose of this study was to determine which previously published KCNQ1 case variants are likely false positives. A list of all published, case-derived KCNQ1 missense variants (MVs) was compiled. The occurrence of each MV within the Genome Aggregation Database (gnomAD) was assessed. Eight in silico tools were used to predict each variant's pathogenicity. Case-derived variants that were either (1) too frequently found in gnomAD or (2) absent in gnomAD but predicted to be pathogenic by ≤2 tools were considered potential false positives. Three of these variants were characterized functionally using the whole-cell patch clamp technique. Overall, there were 244 KCNQ1 case-derived MVs. Of these, 29 (12%) were seen in ≥10 individuals in gnomAD and are demotable. However, 157 of 244 MVs (64%) were absent in gnomAD. Of these, 7 (4%) were predicted to be pathogenic by ≤2 tools, 3 of which we characterized functionally. There was no significant difference in current density between heterozygous KCNQ1-F127L, -P477L, or -L619M variant-containing channels and KCNQ1-WT. This study offers preliminary evidence for the demotion of 32 (13%) previously published LQT1 MVs. Of these, 29 were demoted because of their frequent sighting in gnomAD. Additionally, in silico analysis and in vitro functional studies have facilitated the demotion of 3 ultra-rare MVs (F127L, P477L, L619M). Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
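    The two demotion criteria described in the abstract amount to a simple triage filter. A minimal sketch follows, with thresholds taken from the abstract (≥10 gnomAD individuals; ≤2 of 8 tools); the function name, interface and return strings are illustrative assumptions, not the study's code:

```python
def triage_variant(present_in_gnomad, gnomad_count, n_tools_pathogenic):
    """Flag a case-derived missense variant as a potential false positive if
    (1) it is seen in >= 10 gnomAD individuals, or (2) it is absent from
    gnomAD yet called pathogenic by <= 2 of the 8 in silico tools."""
    if present_in_gnomad and gnomad_count >= 10:
        return "potential false positive (too common in gnomAD)"
    if not present_in_gnomad and n_tools_pathogenic <= 2:
        return "potential false positive (weak in silico support)"
    return "retain as reported"

# E.g. a case variant seen in 29 gnomAD individuals would be flagged:
print(triage_variant(True, 29, 5))
```

    Variants flagged by the second branch are the ones the study then took to functional (patch clamp) characterization before demotion.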

  20. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    We develop some parts of frame theory in Banach spaces from the point of view of Computable Analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.
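    For context, the classical frame condition in a Hilbert space H, of which the paper's computable Xd and Banach frames are Banach-space analogues, reads as follows (standard definition, not quoted from the paper):

```latex
% A sequence (f_n) in a Hilbert space H is a frame if there exist
% constants A, B > 0 such that, for all x in H,
A\,\lVert x\rVert^{2}
\;\le\; \sum_{n} \lvert \langle x, f_{n} \rangle \rvert^{2}
\;\le\; B\,\lVert x\rVert^{2}.
```

    The computable versions studied in the paper ask, roughly, that such bounds and the associated reconstruction operators be realizable by algorithms in the sense of Computable Analysis.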

  1. Highlights from the previous volumes

    Science.gov (United States)

    Vergini, Eduardo G.; Pan, Y.; Vardi, R., et al.; Akkermans, Eric, et al.; et al.

    2014-01-01

    Semiclassical propagation up to the Heisenberg time; Superconductivity and magnetic order in the half-Heusler compound ErPdBi; An experimental evidence-based computational paradigm for new logic-gates in neuronal activity; Universality in the symmetric exclusion process and diffusive systems.

  2. Automated Data Handling And Instrument Control Using Low-Cost Desktop Computers And An IEEE 488 Compatible Version Of The ODETA V.

    Science.gov (United States)

    van Leunen, J. A. J.; Dreessen, J.

    1984-05-01

    The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In the past, some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automating the data handling is that it does away with the time-consuming correction of rough measuring results. The automation was accomplished with rather inexpensive desktop computers, which were nevertheless powerful enough to allow us to automate the measurement as well. After automating the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Further, it controls and reads the MTF measuring instrument. Focusing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically, but they can also be typed in by hand. Thanks to the automation we are able to implement proper archiving of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two. This was due to

  3. Effectiveness of speech language therapy either alone or with add-on computer-based language therapy software (Malayalam version) for early post stroke aphasia: A feasibility study.

    Science.gov (United States)

    Kesav, Praveen; Vrinda, S L; Sukumaran, Sajith; Sarma, P S; Sylaja, P N

    2017-09-15

    This study aimed to assess the feasibility of professionally delivered conventional speech language therapy (SLT) either alone (Group A/less intensive) or assisted by novel computer-based local language software (Group B/more intensive) for rehabilitation in early post-stroke aphasia. The setting was the Comprehensive Stroke Care Center of a tertiary health care institute in South India; the study design was a prospective open randomised controlled trial with blinded endpoint evaluation. The study recruited 24 right-handed patients with first-ever acute ischemic stroke in the middle cerebral artery territory, above 15 years of age and within 90 days of stroke onset, with a baseline Western Aphasia Battery (WAB) Aphasia Quotient (AQ) score indicating aphasia. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, ''ASCI Software Quality Engineering: Goals, Principles, and Guidelines''. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  5. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  12. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.
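    The validation strategy described above (compare each module's outputs against a previously verified version) is essentially a regression comparison. A minimal sketch follows; the function, key names and tolerance are illustrative assumptions, not RADMODL's actual procedure:

```python
def compare_module_outputs(new, reference, rel_tol=1e-6):
    """Return the keys of `reference` whose values in `new` are missing
    or differ from the previously verified outputs by more than rel_tol
    (relative). Empty list means the new module reproduces the reference."""
    mismatches = []
    for key, ref_val in reference.items():
        new_val = new.get(key)
        if new_val is None or abs(new_val - ref_val) > rel_tol * max(abs(ref_val), 1e-30):
            mismatches.append(key)
    return mismatches

# Hypothetical dose outputs from a verified run vs. a new module build:
reference = {"dose_total_sv": 1.0e-3, "dose_inhalation_sv": 2.5e-4}
new_run = {"dose_total_sv": 1.0e-3, "dose_inhalation_sv": 2.5e-4}
print(compare_module_outputs(new_run, reference))  # → []
```

    Running such a comparison per module, then for the linked system as a whole, mirrors the two-level verify-then-validate structure the report describes.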

  13. Verification and validation of RADMODL Version 1.0

    International Nuclear Information System (INIS)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  14. Planetary Mission Entry Vehicles Quick Reference Guide. Version 3.0

    Science.gov (United States)

    Davies, Carol; Arcadi, Marla

    2006-01-01

    This is Version 3.0 of the planetary mission entry vehicle document. Three new missions, Re-entry F, Hayabusa, and ARD, have been added to the previously published edition (Version 2.1). In addition, the Huygens mission has been significantly updated and some Apollo data corrected. Due to the changing nature of planetary vehicles during the design, manufacture and mission phases, and to the variables involved in measurement and computation, please be aware that the data provided herein cannot be guaranteed. Contact Carol Davies at cdavies@mail.arc.nasa.gov to correct or update the current data, or to suggest other missions.

  15. ASPEN Version 3.0

    Science.gov (United States)

    Rabideau, Gregg; Chien, Steve; Knight, Russell; Schaffer, Steven; Tran, Daniel; Cichy, Benjamin; Sherwood, Robert

    2006-01-01

    The Automated Scheduling and Planning Environment (ASPEN) computer program has been updated to version 3.0. ASPEN is a modular, reconfigurable, application software framework for solving batch problems that involve reasoning about time, activities, states, and resources. Applications of ASPEN can include planning spacecraft missions, scheduling of personnel, and managing supply chains, inventories, and production lines. ASPEN 3.0 can be customized for a wide range of applications and for a variety of computing environments that include various central processing units and random access memories.

  16. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  17. A computer model simulating human glucose absorption and metabolism in health and metabolic disease states [version 1; referees: 2 approved, 1 approved with reservations

    Directory of Open Access Journals (Sweden)

    Richard J. Naftalin

    2016-04-01

    A computer model designed to simulate integrated glucose-dependent changes in splanchnic blood flow with small intestinal glucose absorption, hormonal and incretin circulation, and hepatic and systemic metabolism in health and metabolic diseases, e.g. non-alcoholic fatty liver disease (NAFLD), non-alcoholic steatohepatitis (NASH) and type 2 diabetes mellitus (T2DM), demonstrates how, when glucagon-like peptide-1 (GLP-1) is synchronously released into the splanchnic blood during intestinal glucose absorption, it stimulates superior mesenteric arterial (SMA) blood flow and, by increasing passive intestinal glucose absorption, harmonizes absorption with its distribution and metabolism. GLP-1 also synergises insulin-dependent net hepatic glucose uptake (NHGU). When GLP-1 secretion is deficient, post-prandial SMA blood flow is not increased and, as NHGU is also reduced, hyperglycaemia follows. Portal venous glucose concentration is also raised, thereby retarding the passive component of intestinal glucose absorption. Increased pre-hepatic sinusoidal resistance combined with portal hypertension leading to opening of intrahepatic portosystemic collateral vessels are NASH-related mechanical defects that alter the balance between splanchnic and systemic distributions of glucose, hormones and incretins. The model reveals the latent contribution of portosystemic shunting in the development of metabolic disease. This diverts splanchnic blood content away from the hepatic sinuses to the systemic circulation, particularly during the glucose absorptive phase of digestion, resulting in inappropriate increases in insulin-dependent systemic glucose metabolism. This hastens the onset of hypoglycaemia and thence hyperglucagonaemia. The model reveals that low rates of GLP-1 secretion, frequently associated with T2DM and NASH, may also be caused by splanchnic hypoglycaemia, rather than by intrinsic loss of incretin secretory capacity. These findings may have therapeutic

  18. Distribution of lithostratigraphic units within the central block of Yucca Mountain, Nevada: A three-dimensional computer-based model, Version YMP.R2.0

    International Nuclear Information System (INIS)

    Buesch, D.C.; Nelson, J.E.; Dickerson, R.P.; Drake, R.M. II; San Juan, C.A.; Spengler, R.W.; Geslin, J.K.; Moyer, T.C.

    1996-01-01

    Yucca Mountain, Nevada is underlain by 14.0 to 11.6 Ma volcanic rocks tilted eastward 3° to 20° and cut by faults that were primarily active between 12.7 and 11.6 Ma. A three-dimensional computer-based model of the central block of the mountain consists of seven structural subblocks composed of six formations and the interstratified-bedded tuffaceous deposits. Rocks from the 12.7 Ma Tiva Canyon Tuff, which forms most of the exposed rocks on the mountain, to the 13.1 Ma Prow Pass Tuff are modeled with 13 surfaces. Modeled units represent single formations such as the Pah Canyon Tuff, grouped units such as the combination of the Yucca Mountain Tuff with the superjacent bedded tuff, and divisions of the Topopah Spring Tuff such as the crystal-poor vitrophyre interval. The model is based on data from 75 boreholes, from which a structure contour map at the base of the Tiva Canyon Tuff and isochore maps for each unit are constructed to serve as primary input. Modeling consists of an iterative cycle that begins with the primary structure-contour map, from which isochore values of the subjacent model unit are subtracted to produce the structure contour map on the base of the unit. This new structure contour map forms the input for another cycle of isochore subtraction to produce the next structure contour map. In this method of solids modeling, the model units are represented by surfaces (structure contour maps), and all surfaces are stored in the model. Surfaces can be converted to form volumes of model units with additional effort. This lithostratigraphic and structural model can be used for (1) storing data from, and planning future, site characterization activities, (2) providing preliminary geometry of units for design of the Exploratory Studies Facility and potential repository, and (3) performance assessment evaluations
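    The iterative cycle described above (subtract each unit's isochore, i.e. thickness, from the current structure contour to obtain the base of the next unit down) can be sketched as follows; the grids, unit names and elevations are illustrative assumptions, not project data:

```python
def build_surfaces(top_contour, isochores):
    """Iterative isochore subtraction: start from the gridded structure
    contour on the base of the top unit and subtract each subjacent
    unit's thickness grid in turn, storing every resulting surface.
    Grids are flat lists of elevations at matching sample points."""
    surfaces = {"base_of_top_unit": list(top_contour)}
    current = list(top_contour)
    for unit, thickness in isochores:
        current = [z - t for z, t in zip(current, thickness)]
        surfaces["base_of_" + unit] = current
    return surfaces

# Two hypothetical boreholes, elevations in metres:
s = build_surfaces([1200.0, 1180.0],
                   [("unit_A", [50.0, 55.0]), ("unit_B", [30.0, 25.0])])
print(s["base_of_unit_B"])  # → [1120.0, 1100.0]
```

    Because each cycle feeds the next, every modeled unit ends up represented as a surface, which matches the report's description of storing all surfaces in the model and deriving volumes later if needed.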

  19. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  20. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives, as well as network links, are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  2. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  3. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and a run of the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been running at a lower level as the Run 1 samples are completed and the smaller samples for upgrades and preparations ramp up. Much of the computing activity is focused on preparations for Run 2 and on improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently, transferring on average close to 520 TB per week with peaks close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months.   Tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  6. The FORM version of MINCER

    International Nuclear Information System (INIS)

    Larin, S.A.; Academy of Sciences of the USSR, Moscow; Tkachov, F.V.; McGill Univ., Montreal, PQ; Academy of Sciences of the USSR, Moscow; Vermaseren, J.A.M.

    1991-01-01

    The program MINCER for massless three-loop Feynman diagrams of the propagator type has been reprogrammed in the language of FORM. The new version is thoroughly optimized and can be run from a utility like the UNIX make, which allows one to conveniently process large numbers of diagrams. It has been used for some calculations that were previously not practical. (author). 22 refs.; 14 figs

  7. WIMSD4 Version 101 and cataloged procedure

    International Nuclear Information System (INIS)

    Roth, M.J.; Taubman, C.J.; Lawrence, J.H.

    1982-06-01

    The changes made to WIMSD4 to produce Version 101 on the Harwell IBM 3033 and the Winfrith ICL 2976 computers are summarised. A detailed description of the amended catalogued procedure for executing WIMSD4 on the Harwell Computer is given. (author)

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation is now deployed at CERN, adding to the GlideInWMS factory based in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  9. Toward a microrealistic version of quantum mechanics. II

    International Nuclear Information System (INIS)

    Maxwell, N.

    1976-01-01

    Possible objections to the propensity microrealistic version of quantum mechanics proposed previously are answered. This version of quantum mechanics is compared with the statistical, particle, microrealistic viewpoint, and a crucial experiment is proposed designed to distinguish between these two microrealistic versions of quantum mechanics

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  11. Previously unknown species of Aspergillus.

    Science.gov (United States)

    Gautier, M; Normand, A-C; Ranque, S

    2016-08-01

    The use of multi-locus DNA sequence analysis has led to the description of previously unknown 'cryptic' Aspergillus species, whereas classical morphology-based identification of Aspergillus remains limited to the section or species-complex level. The current literature highlights two main features concerning these 'cryptic' Aspergillus species. First, the prevalence of such species in clinical samples is relatively high compared with emergent filamentous fungal taxa such as Mucorales, Scedosporium or Fusarium. Second, it is clearly important to identify these species in the clinical laboratory because of the high frequency of antifungal drug-resistant isolates of such Aspergillus species. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) has recently been shown to enable the identification of filamentous fungi with an accuracy similar to that of DNA sequence-based methods. As MALDI-TOF MS is well suited to the routine clinical laboratory workflow, it facilitates the identification of these 'cryptic' Aspergillus species at the routine mycology bench. The rapid establishment of enhanced filamentous fungi identification facilities will lead to a better understanding of the epidemiology and clinical importance of these emerging Aspergillus species. Based on routine MALDI-TOF MS-based identification results, we provide original insights into the key interpretation issues of a positive Aspergillus culture from a clinical sample. Which ubiquitous species that are frequently isolated from air samples are rarely involved in human invasive disease? Can both the species and the type of biological sample indicate Aspergillus carriage, colonization or infection in a patient? Highly accurate routine filamentous fungi identification is central to enhance the understanding of these previously unknown Aspergillus species, with a vital impact on further improved patient care. Copyright © 2016 European Society of Clinical Microbiology and

  12. The Gaia Framework: Version Support In Web Based Open Hypermedia

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kejser, Thomas

    2004-01-01

    The GAIA framework prototype, described herein, explores the possibilities and problems that arise when combining versioning and open hypermedia paradigms. It will be argued that it - by adding versioning as a separate service in the hypermedia architecture – is possible to build consistent...... versioning field and GAIA is compared with previous attempts at defining hypermedia versioning frameworks. GAIA is capable of multi-level versioning and versioning of structures and supports freezing mechanisms for both documents and hyperstructure. The experiences from GAIA provide an input to new reference...
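
A minimal sketch of the freezing mechanism described above, assuming a simple linear history (GAIA itself supports multi-level versioning of both documents and hyperstructure; the class and method names here are invented for illustration):

```python
# Hedged sketch of version freezing: a frozen version becomes immutable,
# and an edit against it spawns a new version instead of mutating it.
class Versioned:
    def __init__(self, content: str):
        self.history = [content]   # version 1 at index 0
        self.frozen = set()        # frozen version numbers

    def freeze(self, version: int) -> None:
        self.frozen.add(version)

    def edit(self, version: int, content: str) -> int:
        """Edit a version in place, or branch a new one if it is frozen."""
        if version in self.frozen:
            self.history.append(content)      # new version appended
            return len(self.history)
        self.history[version - 1] = content   # mutate the unfrozen version
        return version

doc = Versioned("draft")
doc.freeze(1)
v = doc.edit(1, "revised")   # frozen, so this creates version 2
print(v, doc.history)
```

The same discipline applied to link structures, not just documents, is what the abstract calls versioning of hyperstructure.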

  13. The Gaia Framework: Version Support In Web Based Open Hypermedia

    DEFF Research Database (Denmark)

    Kejser, Thomas; Grønbæk, Kaj

    2003-01-01

    The GAIA framework prototype, described herein, explores the possibilities and problems that arise when combining versioning and open hypermedia paradigms. It will be argued that it - by adding versioning as a separate service in the hypermedia architecture - is possible to build consistent...... versioning field and GAIA is compared with previous attempts at defining hypermedia versioning frameworks. GAIA is capable of multi-level versioning and versioning of structures and supports freezing mechanisms for both documents and hyperstructure. The experiences from GAIA provide an input to new reference...

  14. User manual for version 4.3 of the Tripoli-4 Monte-Carlo method particle transport computer code; Notice d'utilisation du code Tripoli-4, version 4.3: code de transport de particules par la methode de Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Both, J.P.; Mazzolo, A.; Peneliau, Y.; Petit, O.; Roesslinger, B

    2003-07-01

    This manual relates to version 4.3 of the TRIPOLI-4 code. TRIPOLI-4 is a computer code simulating the transport of neutrons, photons, electrons and positrons. It can be used for radiation shielding calculations (long-distance propagation with flux attenuation in non-multiplying media) and neutronic calculations (fissile media, critical or subcritical configurations). This makes it possible to calculate k{sub eff} (for criticality), fluxes, currents, reaction rates and multi-group cross-sections. TRIPOLI-4 is a three-dimensional code that uses the Monte Carlo method. It supports both a point-wise (continuous-energy) description of cross-sections and multi-group homogenized cross-sections, and features two modes of geometrical representation: surface-based and combinatorial. The code uses cross-section libraries in ENDF/B format (such as JEF2-2, ENDF/B-VI and JENDL) for the point-wise description, and cross-sections in APOTRIM format (from the APOLLO2 code) or a format specific to TRIPOLI-4 for the multi-group description. (authors)
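
As a toy illustration of the Monte Carlo shielding principle that TRIPOLI-4 applies at full scale, the sketch below estimates uncollided transmission through a slab by sampling free paths; the cross-section and thickness values are made up, and this is in no way representative of the real code.

```python
import math
import random

# Toy Monte Carlo: fraction of particles whose sampled free path exceeds
# the slab thickness (uncollided transmission). Sigma_t and the thickness
# are hypothetical numbers chosen only for the example.
def transmission(sigma_t: float, thickness: float, n: int, seed: int = 42) -> float:
    rng = random.Random(seed)
    passed = sum(
        1 for _ in range(n)
        if -math.log(1.0 - rng.random()) / sigma_t > thickness  # sampled free path
    )
    return passed / n

est = transmission(sigma_t=0.5, thickness=2.0, n=100_000)
print(round(est, 3), round(math.exp(-0.5 * 2.0), 3))  # estimate vs exact e^{-sigma_t * x}
```

For a purely absorbing slab the estimate converges to the analytic attenuation exp(-Σt·x), which is the flux-attenuation regime the abstract mentions.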

  15. Implementing version support for complex objects

    OpenAIRE

    Blanken, Henk

    1991-01-01

    New applications in the area of office information systems, computer aided design and manufacturing make new demands upon database management systems. Among others highly structured objects and their history have to be represented and manipulated. The paper discusses some general problems concerning the access and storage of complex objects with their versions and the solutions developed within the AIM/II project. Queries related to versions are distinguished in ASOF queries (asking informati...

  16. [External cephalic version].

    Science.gov (United States)

    Navarro-Santana, B; Duarez-Coronado, M; Plaza-Arranz, J

    2016-08-01

    To analyze the rate of successful external cephalic versions in our center and the caesarean sections that would be avoided with their use. From January 2012 to March 2016, a total of 52 external cephalic versions were carried out at our center. We collected data on maternal age, gestational age at the time of the external cephalic version, maternal body mass index (BMI), fetal position and lie, fetal weight, parity, location of the placenta, amniotic fluid index (AFI), tocolysis, analgesia, newborn weight at birth, minor adverse effects (dizziness, hypotension and maternal pain) and major adverse effects (tachycardia, bradycardia, decelerations and emergency cesarean section). 45% of the versions were unsuccessful and 55% were successful. Among successful versions, 84% ended in vaginal delivery (4% instrumental) and 15% in caesarean section. With respect to the variables studied, significant differences were found only in birth weight, suggesting that birth weight is related to the outcome of external cephalic version. The absence of other significant differences is probably due to the number of patients studied. For women with breech presentation, we recommend external cephalic version before expectant management or a cesarean section. External cephalic version increases the proportion of fetuses in cephalic presentation and decreases the rate of caesarean sections.

  17. User Manual for the NASA Glenn Ice Accretion Code LEWICE: Version 2.0

    Science.gov (United States)

    Wright, William B.

    1999-01-01

    A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report presents a description of the code inputs and outputs from version 2.0 of this code, which is called LEWICE. This version differs from previous releases due to its robustness and its ability to reproduce results accurately for different spacing and time step criteria across computing platforms. It also differs in the extensive effort undertaken to compare the results against the database of ice shapes which have been generated in the NASA Glenn Icing Research Tunnel (IRT). This report only describes the features of the code related to the use of the program; it does not describe the inner workings of the code or the physical models used. That information is available in the form of several unpublished documents which will be collectively referred to as a Programmers Manual for LEWICE in this report. These reports are intended as an update/replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.

  18. Versioning Complex Data

    Energy Technology Data Exchange (ETDEWEB)

    Macduff, Matt C.; Lee, Benno; Beus, Sherman J.

    2014-06-29

    Using the history of ARM data files, we designed and demonstrated a data versioning paradigm that is feasible. Assigning versions to sets of files that are modified with some special assumptions and domain specific rules was effective in the case of ARM data, which has more than 5000 datastreams and 500TB of data.
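
The paradigm described above, assigning a version to a *set* of files that advances only when the set is modified, might be sketched as follows; the class name, the content-hash rule, and the version numbering are illustrative assumptions, not the ARM implementation.

```python
import hashlib
from dataclasses import dataclass, field

# Hypothetical sketch: version a set of files by hashing their names and
# contents together; any modification to the set yields a new version.
@dataclass
class DatastreamVersioner:
    versions: list = field(default_factory=list)  # history of (version, digest)

    def _digest(self, files: dict) -> str:
        h = hashlib.sha256()
        for name in sorted(files):          # order-independent over file names
            h.update(name.encode())
            h.update(files[name])           # raw file bytes
        return h.hexdigest()

    def observe(self, files: dict) -> int:
        """Return the version for this file set, adding one if it changed."""
        d = self._digest(files)
        if self.versions and self.versions[-1][1] == d:
            return self.versions[-1][0]
        v = len(self.versions) + 1
        self.versions.append((v, d))
        return v

ds = DatastreamVersioner()
v1 = ds.observe({"a.nc": b"raw"})
v2 = ds.observe({"a.nc": b"raw"})          # unchanged -> same version
v3 = ds.observe({"a.nc": b"reprocessed"})  # modified -> new version
print(v1, v2, v3)  # 1 1 2
```

At ARM scale the domain-specific rules mentioned in the abstract would replace the naive "any byte changed" trigger, but the bookkeeping is the same.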

  19. User Manual for the NASA Glenn Ice Accretion Code LEWICE. Version 2.2.2

    Science.gov (United States)

    Wright, William B.

    2002-01-01

    A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 2.2.2 of this code, which is called LEWICE. This version differs from release 2.0 due to the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air applications. An extensive effort was also undertaken to compare the results against the database of electrothermal results which have been generated in the NASA Glenn Icing Research Tunnel (IRT) as was performed for the validation effort for version 2.0. This report will primarily describe the features of the software related to the use of the program. Appendix A of this report has been included to list some of the inner workings of the software or the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.

  20. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (DEC VAX VMS VERSION)

    Science.gov (United States)

    Donnell, B.

    1994-01-01

    COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler.
Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the

  1. [Computer-based quality-of-life monitoring in head and neck cancer patients: a validation model using the EORTC-QLQ C30 and EORTC- H&N35 Portuguese PC-software version].

    Science.gov (United States)

    Silveira, Augusta; Gonçalves, Joaquim; Sequeira, Teresa; Ribeiro, Cláudia; Lopes, Carlos; Monteiro, Eurico; Pimentel, Francisco Luís

    2011-12-01

    Quality of Life is a distinct and important emerging health focus, guiding practice and research. The routine Quality of Life evaluation in clinical, economic, and epidemiological studies and in medical practice promises a better Quality of Life and improved health resources optimization. The use of information technology and a Knowledge Management System related to Quality of Life assessment is essential to routine clinical evaluation and can define a clinical research methodology that is more efficient and better organized. In this paper, a Validation Model using the Quality of Life informatics platform is presented. Portuguese PC-software using European Organization for Research and Treatment of Cancer questionnaires (EORTC-QLQ C30 and EORTC-H&N35), is compared with the original paper-pen approach in the Quality of Life monitoring of head and neck cancer patients. The Quality of Life informatics platform was designed specifically for this study with a simple and intuitive interface that ensures confidentiality while providing Quality of Life evaluation for all cancer patients. For the Validation Model, the sample selection was random. Fifty-four head and neck cancer patients completed 216 questionnaires (108 using the informatics platform and 108 using the original paper-pen approach) with a one-hour interval in between. Patient preferences and computer experience were registered. Quality of Life informatics platform showed high usability as a user-friendly tool. This informatics platform allows data collection by auto-reply, database construction, and statistical data analysis and also facilitates the automatic listing of the questionnaires. When comparing the approaches (Wilcoxon test by item, percentile distribution and Cronbach's alpha), most of the responses were similar. Most of the patients (53.6%) reported a preference for the software version. 
The Quality of Life informatics platform has proved to be a powerful and effective tool, allowing a real time
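
The agreement statistics mentioned above (per-item comparisons and Cronbach's alpha) can be illustrated with a small sketch; the questionnaire scores below are simulated, not the study's data.

```python
import numpy as np

# Illustrative internal-consistency check: Cronbach's alpha computed on
# made-up questionnaire scores (rows = patients, columns = items).
def cronbach_alpha(scores: np.ndarray) -> float:
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total score
    return k / (k - 1) * (1.0 - item_var / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(54, 1))                  # 54 patients, shared trait
items = latent + 0.5 * rng.normal(size=(54, 6))    # 6 correlated items
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```

In the study itself this statistic would be computed twice, once for the software responses and once for the paper-pen responses, and compared alongside the per-item Wilcoxon tests.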

  2. Meeting the requirements of specialists and generalists in Version 3 of the Read Codes: Two illustrative "Case Reports"

    Directory of Open Access Journals (Sweden)

    Fiona Sinclair

    1997-11-01

    The Read Codes have been recognised as the standard for General Practice computing since 1988 and the original 4-byte set continues to be extensively used to record primary health care data. Read Version 3 (the Read Thesaurus) is an expanded clinical vocabulary with an enhanced file structure designed to meet the detailed requirements of specialist practitioners and to address some of the limitations of previous versions. A recent phase of integration of the still widely-used 4-byte set has highlighted the need to ensure that the new Thesaurus continues to support generalist requirements.

  3. New developments in program STANSOL version 3

    International Nuclear Information System (INIS)

    Gray, W.H.

    1981-10-01

    STANSOL is a computer program that applies a solution for the mechanical displacement, stress, and strain in rotationally-transversely isotropic, homogeneous, axisymmetric solenoids. Careful application of the solution permits examination of the complex mechanical behavior of multilayered, nonhomogeneous solenoids in which the loads may vary arbitrarily from layer to layer. Loads applied to the solenoid model by program STANSOL may consist of differential temperature, winding preload, internal and/or external surface pressure, and electromagnetic Lorentz body forces. STANSOL version 3, the latest update to the original version of the computer program, also permits structural analysis of solenoid magnets in which frictionless interlayer gaps may open or close. This paper presents the new theory coded into version 3 of the STANSOL program, as well as the new input data format and graphical output display of the resulting analysis

  4. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  5. PC 386-based version of DORT

    International Nuclear Information System (INIS)

    Tanker, E.

    1992-01-01

    Problems encountered during the adaptation of DORT to a personal computer using a Fortran77 compiler are described, and the modifications made to solve them are explained. Three test cases were run with the modified version, and the results are compared with those obtained on an IBM 3090/200. Numerical differences were observed at most in the last three decimal digits of the computations. The running times on the PC were found to be satisfactory for these test cases

  6. NOAA Climate Data Record (CDR) of GPS RO-Calibrated AMSU Channel 9 (Temperatures in the Lower Stratosphere,TLS), Version 1.0 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  7. NOAA Climate Data Record (CDR) of GPS RO-Calibrated AMSU Channel 9 (Temperatures in the Lower Stratosphere,TLS), Version 1.1 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  8. A one-dimensional material transfer model for HECTR version 1.5

    International Nuclear Information System (INIS)

    Geller, A.S.; Wong, C.C.

    1991-08-01

    HECTR (Hydrogen Event Containment Transient Response) is a lumped-parameter computer code developed for calculating the pressure-temperature response to combustion in a nuclear power plant containment building. The code uses a control-volume approach and subscale models to simulate the mass, momentum, and energy transfer occurring in the containment during a loss-of-coolant accident (LOCA). This document describes one-dimensional subscale models for mass and momentum transfer, and the modifications to the code required to implement them. Two problems were analyzed: the first corresponding to a standard problem studied with previous HECTR versions, the second to experiments. The performance of the revised code relative to previous HECTR versions is discussed, as is the ability of the code to model the experiments. 8 refs., 5 figs., 3 tabs
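
A lumped-parameter, control-volume calculation of the kind HECTR performs can be illustrated with a minimal two-volume mass-transfer sketch; the flow coefficient, volumes, and time step are hypothetical and unrelated to the actual subscale models.

```python
# Minimal lumped-parameter sketch: one-dimensional mass transfer between
# two control volumes, driven by the density difference between them.
# All coefficients are invented for illustration.
def simulate(m1=2.0, m2=1.0, vol1=1.0, vol2=1.0, k=0.5, dt=0.01, steps=2000):
    for _ in range(steps):
        flow = k * (m1 / vol1 - m2 / vol2)  # kg/s from volume 1 to volume 2
        m1 -= flow * dt                     # explicit Euler update
        m2 += flow * dt
    return m1, m2

m1, m2 = simulate()
print(round(m1, 3), round(m2, 3))  # densities equilibrate; total mass is conserved
```

The update conserves total mass by construction and relaxes both volumes to equal density, which is the basic sanity check any such subscale model must pass.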

  9. Enigma Version 12

    Science.gov (United States)

    Shores, David; Goza, Sharon P.; McKeegan, Cheyenne; Easley, Rick; Way, Janet; Everett, Shonn; Guerra, Mark; Kraesig, Ray; Leu, William

    2013-01-01

    Enigma Version 12 software combines model building, animation, and engineering visualization into one concise software package. Enigma employs a versatile user interface to allow average users access to even the most complex pieces of the application. Using Enigma eliminates the need to buy and learn several software packages to create an engineering visualization. Models can be created and/or modified within Enigma down to the polygon level. Textures and materials can be applied for additional realism. Within Enigma, these models can be combined to create systems of models that have a hierarchical relationship to one another, such as a robotic arm. These systems can then be animated within the program or controlled by an external application programming interface (API). In addition, Enigma provides the ability to use plug-ins. Plug-ins allow the user to create custom code for a specific application and access the Enigma model and system data, but still use the Enigma drawing functionality. CAD files can be imported into Enigma and combined to create systems of computer graphics models that can be manipulated with constraints. An API is available so that an engineer can write a simulation and drive the computer graphics models with no knowledge of computer graphics. An animation editor allows an engineer to set up sequences of animations generated by simulations or by conceptual trajectories in order to record these to high-quality media for presentation.

  10. Version pressure feedback mechanisms for speculative versioning caches

    Science.gov (United States)

    Eichenberger, Alexandre E.; Gara, Alan; O'Brien, Kathryn M.; Ohmacht, Martin; Zhuang, Xiaotong

    2013-03-12

    Mechanisms are provided for controlling version pressure on a speculative versioning cache. Raw version pressure data is collected based on one or more threads accessing cache lines of the speculative versioning cache. One or more statistical measures of version pressure are generated based on the collected raw version pressure data. A determination is made as to whether one or more modifications to an operation of a data processing system are to be performed based on the one or more statistical measures of version pressure, the one or more modifications affecting version pressure exerted on the speculative versioning cache. An operation of the data processing system is modified based on the one or more determined modifications, in response to a determination that one or more modifications to the operation of the data processing system are to be performed, to affect the version pressure exerted on the speculative versioning cache.
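    The feedback loop described above (collect raw pressure samples, reduce them to statistical measures, then decide whether to modify system operation) can be sketched as follows. This is a hypothetical illustration, not the patented mechanism; the class name, the threshold, and the throttling decision are all invented for the example.

```python
import statistics

class VersionPressureMonitor:
    """Aggregates raw version-pressure samples (e.g., speculative versions
    held per cache set) and decides whether to throttle speculation."""

    def __init__(self, throttle_threshold):
        self.samples = []                    # raw version pressure data
        self.threshold = throttle_threshold

    def record(self, versions_in_set):
        self.samples.append(versions_in_set)

    def measures(self):
        # Statistical measures derived from the raw pressure data.
        return {"mean": statistics.mean(self.samples),
                "max": max(self.samples)}

    def should_throttle(self):
        # Decide on a modification (e.g., delay spawning new speculative
        # threads) when mean pressure exceeds the threshold.
        return self.measures()["mean"] > self.threshold

mon = VersionPressureMonitor(throttle_threshold=2.0)
for s in (1, 2, 4, 3):
    mon.record(s)
decision = mon.should_throttle()   # mean pressure is 2.5, above 2.0
```

    In a real system the decision would feed back into thread scheduling; here it is just a boolean, which keeps the structure of the loop visible.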

  11. FORM version 4.0

    Science.gov (United States)

    Kuipers, J.; Ueda, T.; Vermaseren, J. A. M.; Vollinga, J.

    2013-05-01

    We present version 4.0 of the symbolic manipulation system FORM. The most important new features are manipulation of rational polynomials and the factorization of expressions. Many other new functions and commands are also added; some of them are very general, while others are designed for building specific high-level packages, such as one for Gröbner bases. Also new is the checkpoint facility, which allows for periodic backups during long calculations. Finally, FORM 4.0 has become available as open source under the GNU General Public License version 3. Program summary: Program title: FORM. Catalogue identifier: AEOT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 151 599. No. of bytes in distributed program, including test data, etc.: 1 078 748. Distribution format: tar.gz. Programming language: The FORM language. FORM itself is programmed in a mixture of C and C++. Computer: All. Operating system: UNIX, LINUX, Mac OS, Windows. Classification: 5. Nature of problem: FORM defines a symbolic manipulation language in which the emphasis lies on fast processing of very large formulas. It has been used successfully for many calculations in Quantum Field Theory and mathematics. In the speed and size of formulas that can be handled, it typically outperforms other systems by an order of magnitude. Special in this version: Version 4.0 contains many new features, the most important being factorization and rational arithmetic. The program has also become open source under the GPL. Solution method: See "Nature of problem" above. Additional comments: The code in CPC is for reference; you are encouraged to download the most recent sources from www.nikhef.nl/form/formcvs.php because of frequent bug fixes.

  12. ENDF-6 formats manual. Version of Oct. 1991

    International Nuclear Information System (INIS)

    Rose, P.F.; Dunford, C.L.

    1992-01-01

    ENDF-6 is the international computer file format for evaluated nuclear data. In contrast to the earlier versions (ENDF-4 and ENDF-5) the new version ENDF-6 has been designed not only for neutron reaction data but also for photo-nuclear and charged-particle nuclear reaction data. This document gives a detailed description of the formats and procedures adopted for ENDF-6. The present version includes update pages dated Oct. 1991. (author). Refs, figs, and tabs

  13. JaxoDraw: A graphical user interface for drawing Feynman diagrams. Version 2.0 release notes

    Science.gov (United States)

    Binosi, D.; Collins, J.; Kaufhold, C.; Theussl, L.

    2009-09-01

    A new version of the Feynman graph plotting tool JaxoDraw is presented. Version 2.0 is a fundamental re-write of most of the JaxoDraw core, and some functionalities, in particular importing graphs, are not backward-compatible with the 1.x branch. The most prominent new features include: drawing of Bézier curves for all particle modes, on-the-fly update of edited objects, multiple undo/redo functionality, the addition of a plugin infrastructure, and generally improved memory performance. A new LaTeX style file is presented that has been written specifically on top of the original axodraw.sty to meet the needs of this new version. New version program summary: Program title: JaxoDraw. Catalogue identifier: ADUA_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADUA_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GPL. No. of lines in distributed program, including test data, etc.: 103 544. No. of bytes in distributed program, including test data, etc.: 3 745 814. Distribution format: tar.gz. Programming language: Java. Computer: Any Java-enabled platform. Operating system: Any Java-enabled platform, tested on Linux, Windows XP, Mac OS X. Classification: 14. Catalogue identifier of previous version: ADUA_v1_0. Journal reference of previous version: Comput. Phys. Comm. 161 (2004) 76. Does the new version supersede the previous version?: Yes. Nature of problem: Existing methods for drawing Feynman diagrams usually require some hard-coding in one or another programming or scripting language. It is not very convenient, and often time-consuming, to generate relatively simple diagrams. Solution method: A program is provided that allows for the interactive drawing of Feynman diagrams with a graphical user interface. The program is easy to learn and use, produces high-quality output in several formats, and runs on any operating system where a Java Runtime Environment is available. Reasons for new version: A

  14. Simion 3D Version 6.0 User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dahl, D.A.

    1995-11-01

    The original SIMION was an electrostatic lens analysis and design program developed by D.C. McGilvery at Latrobe University, Bundoora Victoria, Australia, 1977. SIMION for the PC, developed at the Idaho National Engineering Laboratory, shares little more than its name with the original McGilvery version. INEL's fifth major SIMION release, version 6.0, represents a quantum improvement over previous versions. This C-based program can model complex problems using an ion optics workbench that can hold up to 200 2D and/or 3D electrostatic/magnetic potential arrays. Arrays can have up to 10,000,000 points. SIMION 3D's 32-bit virtual Graphics User Interface provides a highly interactive advanced user environment. All potential arrays are visualized as 3D objects that the user can cut away to inspect ion trajectories and potential energy surfaces. User programs have been greatly extended in versatility and power. A new geometry file option supports the definition of highly complex array geometry. Extensive algorithm modifications have dramatically improved this version's computational speed and accuracy.

  15. Determining Optimal Decision Version

    Directory of Open Access Journals (Sweden)

    Olga Ioana Amariei

    2014-06-01

    Full Text Available. In this paper we start from the calculation of the product cost, applying the hour-machine cost method (THM) to each of the three cutting machines, namely: the plasma cutting machine, the combined (plasma and water-jet) cutting machine, and the water-jet cutting machine. After calculating the cost, and taking into account the manufacturing precision of each machine as well as the quality of the processed surface, the optimal decision version for manufacturing the product must be determined. To determine it, we first calculate the optimal version under each criterion separately, and then overall, using multi-attribute decision methods.
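    As a toy illustration of the weighted-sum flavour of multi-attribute decision making described above, the sketch below ranks three machines on cost, precision, and surface quality. All scores and weights are invented for the example; they are not the paper's data.

```python
# Hypothetical weighted-sum multi-attribute ranking of three cutting
# machines. Attribute scores are normalized so that higher is better
# (so "cost" here means cost-efficiency); all numbers are made up.
machines = {
    "plasma":       {"cost": 0.9, "precision": 0.5, "surface": 0.4},
    "plasma+water": {"cost": 0.6, "precision": 0.9, "surface": 0.7},
    "water jet":    {"cost": 0.4, "precision": 0.9, "surface": 0.9},
}
weights = {"cost": 0.5, "precision": 0.3, "surface": 0.2}  # sum to 1

def score(attrs):
    # Overall score: weighted sum over the per-criterion scores.
    return sum(weights[k] * attrs[k] for k in weights)

best = max(machines, key=lambda m: score(machines[m]))
```

    Per-criterion optima can differ from the overall optimum, which is exactly why the overall weighted aggregation is needed.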

  16. Version 2 of RSXMULTI

    International Nuclear Information System (INIS)

    Heinicke, P.; Berg, D.; Constanta-Fanourakis, P.; Quigg, E.K.

    1985-01-01

    MULTI is a general-purpose, high-speed interface to data acquisition and data investigation for high energy physics that runs on PDP-11 and VAX architectures. This paper describes the latest version of MULTI, which runs under RSX-11M version 4.1 and supports a modular approach to the separate tasks that interface to it, allowing the same system to be used in single-CPU test-beam experiments as well as in large-scale experiments with multiple interconnected CPUs. MULTI uses CAMAC (IEEE-583) for control and monitoring of an experiment, and is written in FORTRAN-77 and assembler. The design of this version, which simplified the interface between tasks and eliminated the need for a hard-to-maintain homegrown I/O system, is also discussed.

  17. MATLAB Software Versions and Licenses for the Peregrine System

    Science.gov (United States)

    High-Performance Computing | NREL. Learn about the MATLAB software versions and licenses for the Peregrine system. The MATLAB version on Peregrine is R2017b. Licenses: MATLAB is proprietary software. As such, users have access to a limited number

  18. Versioning of printed products

    Science.gov (United States)

    Tuijn, Chris

    2005-01-01

    During the definition of a printed product in an MIS system, a lot of attention is paid to the production process. The MIS systems typically gather all process-related parameters at such a level of detail that they can determine what the exact cost will be to make a specific product. This information can then be used to make a quote for the customer. Considerably less attention is paid to the content of the products since this does not have an immediate impact on the production costs (assuming that the number of inks or plates is known in advance). The content management is typically carried out either by the prepress systems themselves or by dedicated workflow servers uniting all people that contribute to the manufacturing of a printed product. Special care must be taken when considering versioned products. With versioned products we here mean distinct products that have a number of pages or page layers in common. Typical examples are comic books that have to be printed in different languages. In this case, the color plates can be shared over the different versions and the black plate will be different. Other examples are nation-wide magazines or newspapers that have an area with regional pages or advertising leaflets in different languages or currencies. When considering versioned products, the content will become an important cost factor. First of all, the content management (and associated proofing and approval cycles) becomes much more complex and, therefore, the risk that mistakes will be made increases considerably. Secondly, the real production costs are very much content-dependent because the content will determine whether plates can be shared across different versions or not and how many press runs will be needed. In this paper, we will present a way to manage different versions of a printed product. First, we will introduce a data model for version management. Next, we will show how the content of the different versions can be supplied by the customer

  19. Stratified B-trees and versioning dictionaries

    OpenAIRE

    Twigg, Andy; Byde, Andrew; Milos, Grzegorz; Moreton, Tim; Wilkes, John; Wilkie, Tom

    2011-01-01

    A classic versioned data structure in storage and computer science is the copy-on-write (CoW) B-tree -- it underlies many of today's file systems and databases, including WAFL, ZFS, Btrfs and more. Unfortunately, it doesn't inherit the B-tree's optimality properties; it has poor space utilization, cannot offer fast updates, and relies on random IO to scale. Yet, nothing better has been developed since. We describe the `stratified B-tree', which beats all known semi-external memory versioned B...
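    To see why the copy-on-write approach trades space for versioning, here is a minimal path-copying sketch on a plain binary search tree (deliberately much simpler than a B-tree): every update copies only the root-to-leaf path, so each old root remains a complete, immutable version, at the cost of the duplicated path nodes.

```python
# Minimal copy-on-write (path-copying) persistence on a binary search
# tree. Each insert returns a NEW root; earlier roots still address
# their full, unchanged version. Unchanged subtrees are shared.
class Node:
    __slots__ = ("key", "left", "right")
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def insert(root, key):
    """Return the root of a new version containing `key`."""
    if root is None:
        return Node(key)
    if key < root.key:
        # Copy this node; the right subtree is shared, not copied.
        return Node(root.key, insert(root.left, key), root.right)
    return Node(root.key, root.left, insert(root.right, key))

def contains(root, key):
    while root is not None:
        if key == root.key:
            return True
        root = root.left if key < root.key else root.right
    return False

v1 = insert(None, 10)
v2 = insert(v1, 5)    # v2 sees 5; v1 is an unchanged older version
```

    The per-update path copies are the source of the poor space utilization the abstract mentions for CoW B-trees.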

  20. User's Guide for TOUGH2-MP - A Massively Parallel Version of the TOUGH2 Code

    International Nuclear Information System (INIS)

    Earth Sciences Division; Zhang, Keni; Zhang, Keni; Wu, Yu-Shu; Pruess, Karsten

    2008-01-01

    TOUGH2-MP is a massively parallel (MP) version of the TOUGH2 code, designed for computationally efficient parallel simulation of isothermal and nonisothermal flows of multicomponent, multiphase fluids in one, two, and three-dimensional porous and fractured media. In recent years, computational requirements have become increasingly intensive in large or highly nonlinear problems for applications in areas such as radioactive waste disposal, CO2 geological sequestration, environmental assessment and remediation, reservoir engineering, and groundwater hydrology. The primary objective of developing the parallel-simulation capability is to significantly improve the computational performance of the TOUGH2 family of codes. The particular goal for the parallel simulator is to achieve orders-of-magnitude improvement in computational time for models with ever-increasing complexity. TOUGH2-MP is designed to perform parallel simulation on multi-CPU computational platforms. An earlier version of TOUGH2-MP (V1.0) was based on the TOUGH2 Version 1.4 with EOS3, EOS9, and T2R3D modules, a software previously qualified for applications in the Yucca Mountain project, and was designed for execution on CRAY T3E and IBM SP supercomputers. The current version of TOUGH2-MP (V2.0) includes all fluid property modules of the standard version TOUGH2 V2.0. It provides computationally efficient capabilities using supercomputers, Linux clusters, or multi-core PCs, and also offers many user-friendly features. The parallel simulator inherits all process capabilities from V2.0 together with additional capabilities for handling fractured media from V1.4. This report provides a quick starting guide on how to set up and run the TOUGH2-MP program for users with a basic knowledge of running the (standard) version TOUGH2 code. 
The report also gives a brief technical description of the code, including a discussion of parallel methodology, code structure, as well as mathematical and numerical methods used

  1. ARROW (Version 2) Commercial Software Validation and Configuration Control

    International Nuclear Information System (INIS)

    HEARD, F.J.

    2000-01-01

    ARROW (Version 2), a compressible flow piping network modeling and analysis computer program from Applied Flow Technology, was installed for use at the U.S. Department of Energy Hanford Site near Richland, Washington

  2. ARROW (Version 2) Commercial Software Validation and Configuration Control

    Energy Technology Data Exchange (ETDEWEB)

    HEARD, F.J.

    2000-02-10

    ARROW (Version 2), a compressible flow piping network modeling and analysis computer program from Applied Flow Technology, was installed for use at the U.S. Department of Energy Hanford Site near Richland, Washington.

  3. A multidimensional version of the Kolmogorov-Smirnov test

    International Nuclear Information System (INIS)

    Fasano, G.; Franceschini, A.

    1987-01-01

    A generalization of the classical Kolmogorov-Smirnov test, suitable for analysing random samples defined in two or three dimensions, is discussed. This test provides some improvements with respect to an earlier version proposed by a previous author. In particular: (i) it is faster, by a factor equal to the sample size n, and can therefore be used to analyse quite sizeable samples; (ii) it fully takes into account the dependence of the test statistic on the degree of correlation of the data points and on the sample size; (iii) it allows for a generalization to the three-dimensional case which is still viable as regards computing time. Supported by a large number of Monte Carlo simulations, it is shown that this test is sufficiently distribution-free for any practical purpose. (author)
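    The quadrant-counting idea behind such a two-dimensional test can be sketched as follows. This is a naive O(n²) illustration, not the authors' optimized algorithm, and significance levels still require the correlation- and size-dependent calibration discussed in the paper.

```python
# Sketch of a two-sample 2D Kolmogorov-Smirnov-style statistic:
# at each data point, compare the fraction of each sample falling
# strictly inside the four quadrants around it, and keep the
# largest discrepancy found.
def ks2d_statistic(a, b):
    d = 0.0
    for (cx, cy) in a + b:
        for qx, qy in ((1, 1), (1, -1), (-1, 1), (-1, -1)):
            fa = sum(1 for x, y in a
                     if qx * (x - cx) > 0 and qy * (y - cy) > 0) / len(a)
            fb = sum(1 for x, y in b
                     if qx * (x - cx) > 0 and qy * (y - cy) > 0) / len(b)
            d = max(d, abs(fa - fb))
    return d

same = [(0.1, 0.2), (0.4, 0.9), (0.7, 0.3), (0.2, 0.6)]
shifted = [(x + 5, y) for x, y in same]   # clearly different distribution
```

    Identical samples give a statistic of zero, while well-separated samples drive it toward one; the strict inequalities exclude the centre point itself, so with n points per sample the statistic is capped at (n-1)/n.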

  4. New version of PLNoise: a package for exact numerical simulation of power-law noises

    Science.gov (United States)

    Milotti, Edoardo

    2007-08-01

    installed on the target machine. No. of lines in distributed program, including test data, etc.: 2975. No. of bytes in distributed program, including test data, etc.: 194 588. Distribution format: tar.gz. Catalogue identifier of previous version: ADXV_v1_0. Journal reference of previous version: Comput. Phys. Comm. 175 (2006) 212. Does the new version supersede the previous version?: Yes. Nature of problem: Exact generation of different types of colored noise. Solution method: Random superposition of relaxation processes [E. Milotti, Phys. Rev. E 72 (2005) 056701], possibly followed by an integration step to produce noise with spectral index >2. Reasons for the new version: Extension to 1/f noises with spectral index 2<α⩽4: the new version generates noises both with spectral index 0<α⩽2 and with 2<α⩽4. Summary of revisions: Although the overall structure remains the same, one routine has been added and several changes have been made throughout the code to include the new integration step. Unusual features: The algorithm is theoretically guaranteed to be exact, and unlike all other existing generators it can generate samples with uneven spacing. Additional comments: The program requires an initialization step; for some parameter sets this may become rather heavy. Running time: Running time varies widely with different input parameters; however, in a test run like the one in Section 3 of the long write-up, the generation routine took on average about 75 μs per sample.
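    The solution method quoted above, random superposition of relaxation processes, can be illustrated with a rough sketch. This is not the PLNoise code: the AR(1) discretization, the log-uniform rate distribution, and all parameter values are choices made for the example, and the result is only an approximate power-law spectrum between the smallest and largest relaxation rates.

```python
import math
import random

def colored_noise(n, dt=1.0, rate_range=(1e-4, 1.0), n_proc=64, seed=1):
    """Sum many exponentially relaxing (AR(1)) processes whose relaxation
    rates are spread log-uniformly over several decades; the superposition
    approximates a 1/f-type spectrum between the extreme rates."""
    rng = random.Random(seed)
    lo, hi = rate_range
    rates = [lo * (hi / lo) ** rng.random() for _ in range(n_proc)]
    states = [0.0] * n_proc
    out = []
    for _ in range(n):
        for i, lam in enumerate(rates):
            a = math.exp(-lam * dt)            # per-step decay factor
            # Unit-variance stationary AR(1) update for this process.
            states[i] = a * states[i] + math.sqrt(1 - a * a) * rng.gauss(0, 1)
        out.append(sum(states) / math.sqrt(n_proc))
    return out

x = colored_noise(1000)
```

    Integrating such a sequence once would raise the spectral index by 2, which is the mechanism the abstract describes for reaching 2<α⩽4.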

  5. An Improved Version of TOPAZ 3D

    International Nuclear Information System (INIS)

    Krasnykh, Anatoly

    2003-01-01

    An improved version of the TOPAZ 3D gun code is presented as a powerful tool for beam optics simulation. In contrast to the previous version of TOPAZ 3D, the geometry of the device under test is introduced into TOPAZ 3D directly from a CAD program, such as Solid Edge or AutoCAD. In order to have this new feature, an interface was developed, using the GiD software package as a meshing code. The article describes this method with two models to illustrate the results

  6. Program package for multicanonical simulations of U(1) lattice gauge theory-Second version

    Science.gov (United States)

    Bazavov, Alexei; Berg, Bernd A.

    2013-03-01

    A new version STMCMUCA_V1_1 of our program package is available. It eliminates compatibility problems of our Fortran 77 code, originally developed for the g77 compiler, with Fortran 90 and 95 compilers. New version program summaryProgram title: STMC_U1MUCA_v1_1 Catalogue identifier: AEET_v1_1 Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html Programming language: Fortran 77 compatible with Fortran 90 and 95 Computers: Any capable of compiling and executing Fortran code Operating systems: Any capable of compiling and executing Fortran code RAM: 10 MB and up depending on lattice size used No. of lines in distributed program, including test data, etc.: 15059 No. of bytes in distributed program, including test data, etc.: 215733 Keywords: Markov chain Monte Carlo, multicanonical, Wang-Landau recursion, Fortran, lattice gauge theory, U(1) gauge group, phase transitions of continuous systems Classification: 11.5 Catalogue identifier of previous version: AEET_v1_0 Journal Reference of previous version: Computer Physics Communications 180 (2009) 2339-2347 Does the new version supersede the previous version?: Yes Nature of problem: Efficient Markov chain Monte Carlo simulation of U(1) lattice gauge theory (or other continuous systems) close to its phase transition. Measurements and analysis of the action per plaquette, the specific heat, Polyakov loops and their structure factors. Solution method: Multicanonical simulations with an initial Wang-Landau recursion to determine suitable weight factors. Reweighting to physical values using logarithmic coding and calculating jackknife error bars. Reasons for the new version: The previous version was developed for the g77 compiler Fortran 77 version. Compiler errors were encountered with Fortran 90 and Fortran 95 compilers (specified below). Summary of revisions: epsilon=one/10**10 is replaced by epsilon/10.0D10 in the parameter statements of the subroutines u1_bmha.f, u1_mucabmha.f, u1wl

  7. Version control with Git

    CERN Document Server

    Loeliger, Jon

    2012-01-01

    Get up to speed on Git for tracking, branching, merging, and managing code revisions. Through a series of step-by-step tutorials, this practical guide takes you quickly from Git fundamentals to advanced techniques, and provides friendly yet rigorous advice for navigating the many functions of this open source version control system. This thoroughly revised edition also includes tips for manipulating trees, extended coverage of the reflog and stash, and a complete introduction to the GitHub repository. Git lets you manage code development in a virtually endless variety of ways, once you understand how to harness the system's flexibility. This book shows you how. Learn how to use Git for several real-world development scenarios; gain insight into Git's common use cases, initial tasks, and basic functions; use the system for both centralized and distributed version control; learn how to manage merges, conflicts, patches, and diffs; apply advanced techniques such as rebasing, hooks, and ways to handle submodu...

  8. Nuclear criticality safety handbook. Version 2

    International Nuclear Information System (INIS)

    1999-03-01

    The Nuclear Criticality Safety Handbook, Version 2 essentially incorporates the Supplement Report to the Nuclear Criticality Safety Handbook, released in 1995, into the first version of the Nuclear Criticality Safety Handbook, published in 1988. The following two points are new: (1) exemplifying safety margins related to modelled dissolution and extraction processes, and (2) describing evaluation methods and alarm systems for criticality accidents. The chapter that treats modelling of the fuel system is revised on the basis of previous studies: e.g., the fuel grain size below which the system can be regarded as homogeneous, the non-uniformity effect of fuel solution, and burnup credit. This revision resolves the inconsistencies found in the first version between the evaluation of errors in the JACS code system and the criticality condition data that were calculated based on that evaluation. (author)

  9. PROSA version 4.0 manual

    International Nuclear Information System (INIS)

    Bicking, U.; Golly, W.; Peter, N.; Seifert, R.

    1991-05-01

    This report is a comprehensive manual for the computer program PROSA, illustrating the handling and functioning of PROSA. The manual for PROSA 4.0 (FORTRAN 77) describes the PC version of PROSA, including its program modules. The PROSA program package is a statistical tool for deciding, on the basis of statistical assumptions, whether a loss of material might have occurred in a given sequence of material balance periods. The evaluation of the material balance data is based on statistical test procedures. In the present PROSA Version 4.0, the three tests CUMUF test, Page's test, and GEMUF test are applied to a sequence of material balances. PROSA Version 4.0 supports truly sequential evaluation: PROSA is able to evaluate a series of MUF values not only after a campaign has finished, but also sequentially in real time during the campaign. PROSA Version 4.0 is a menu-guided computer program. Data input can be performed either from diskette or by keyboard. The primary output is an indication of whether or not an alarm is raised; this information can be displayed either numerically or graphically, and a convenient graphical output utility is therefore attached to PROSA 4.0. The program modules are compiled and linked with the Ryan-McFarland compiler. The PROSA graphical utility uses the PLOT88 library of Plotworks, Inc. (orig./HP) [de
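    As an illustration of the kind of sequential test PROSA applies, here is a minimal sketch of a one-sided Page (CUSUM) test on a sequence of standardized MUF values. The reference value k and alarm threshold h are illustrative defaults, not PROSA's calibrated parameters, and PROSA combines this with the CUMUF and GEMUF tests.

```python
def page_test(muf_sequence, k=0.5, h=4.0):
    """One-sided Page (CUSUM) test: the statistic accumulates excesses
    of the standardized MUF values over the reference value k, resets
    at zero, and alarms once it crosses the threshold h."""
    s, alarms = 0.0, []
    for z in muf_sequence:
        s = max(0.0, s + z - k)   # accumulate excess over k, floor at 0
        alarms.append(s > h)      # True means: alarm at this period
    return alarms

in_control = [0.1, -0.3, 0.2, 0.0, -0.1, 0.3]   # no systematic loss
loss = [1.5] * 8                                # persistent positive MUF
```

    On the in-control sequence the statistic keeps resetting to zero, while a persistent positive shift accumulates until the alarm fires, which is exactly the behaviour wanted for sequential loss detection.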

  10. Previous Experience a Model of Practice UNAE

    OpenAIRE

    Ormary Barberi Ruiz; María Dolores Pesántez Palacios

    2017-01-01

    The statements presented in this article represent a preliminary version of the proposed model of pre-professional practices (PPP) of the National University of Education (UNAE) of Ecuador. An urgent institutional necessity is revealed in the descriptive analyses conducted from technical-administrative support (reports, interviews, testimonials) and the pedagogical foundations of UNAE (curricular directionality, transverse axes in practice, career plan, approach and diagnostic examination as subj...

  11. TRASYS - THERMAL RADIATION ANALYZER SYSTEM (DEC VAX VERSION WITH NASADIG)

    Science.gov (United States)

    Anderson, G. E.

    1994-01-01

    . The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. 
The standard distribution medium for MSC-21959 (CRAY version

  12. TRASYS - THERMAL RADIATION ANALYZER SYSTEM (DEC VAX VERSION WITHOUT NASADIG)

    Science.gov (United States)

    Vogt, R. A.

    1994-01-01

    . The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. 
The standard distribution medium for MSC-21959 (CRAY version

  13. TRASYS - THERMAL RADIATION ANALYZER SYSTEM (CRAY VERSION WITH NASADIG)

    Science.gov (United States)

    Anderson, G. E.

    1994-01-01

    . The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. 
The standard distribution medium for MSC-21959 (CRAY version

  14. The Unified Extensional Versioning Model

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred; Christensen, H. B.

    1999-01-01

    Versioning of components in a system is a well-researched field where various adequate techniques have already been established. In this paper, we look at how versioning can be extended to cover also the structural aspects of a system. There exist two basic techniques for versioning - intentional...

  15. VizieR Online Data Catalog: SKY2000 Master Catalog, Version 5 (Myers+ 2006)

    Science.gov (United States)

    Myers, J. R.; Sande, C. B.; Miller, A. C.; Warren, W. H., Jr.; Tracewell, D. A.

    2015-02-01

    The SKYMAP Star Catalog System consists of a Master Catalog stellar database and a collection of utility software designed to create and maintain the database and to generate derivative mission star catalogs (run catalogs). It contains an extensive compilation of information on almost 300,000 stars brighter than 8.0 mag. The original SKYMAP Master Catalog was generated in the early 1970s. Incremental updates and corrections were made over the following years, but the first complete revision of the source data occurred with Version 4.0. This revision also produced a unique, consolidated source of astrometric information which can be used by the astronomical community. The derived quantities were removed, and wideband photometric data in the R (red) and I (infrared) systems were added. Version 4 of the SKY2000 Master Catalog was completed in April 2002; it marks the global replacement of the variability identifier and variability data fields. More details can be found in the description file sky2kv4.pdf. The SKY2000 Version 5 Revision 4 Master Catalog differs from Revision 3 in that MK and HD spectral types have been added from the Catalogue of Stellar Spectral Classifications (B. A. Skiff of Lowell Observatory, 2005), which has been assigned source code 50 in this process. 9622 entries now have MK types from this source, while 3976 entries have HD types from this source. SKY2000 V5 R4 also differs globally from preceding MC versions in that the Galactic coordinate computations performed by UPDATE have been increased in accuracy, so that differences from the same quantities from other sources are now typically in the last decimal places carried in the MC. This version supersedes the previous versions 1(V/95), 2(V/102), 3(V/105) and 4(V/109). (6 data files).

  16. PVWatts Version 5 Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dobos, A. P.

    2014-09-01

    The NREL PVWatts calculator is a web application developed by the National Renewable Energy Laboratory (NREL) that estimates the electricity production of a grid-connected photovoltaic system based on a few simple inputs. PVWatts combines a number of sub-models to predict overall system performance, and includes several built-in parameters that are hidden from the user. This technical reference describes the sub-models, documents assumptions and hidden parameters, and explains the sequence of calculations that yield the final system performance estimate. This reference is applicable to the significantly revised version of PVWatts released by NREL in 2014.

  17. User's guide for ABCI version 9.4 (azimuthal beam cavity interaction) and introducing the ABCI windows application package

    International Nuclear Information System (INIS)

    Chin, Yong Ho

    2005-12-01

    ABCI is a computer program which solves the Maxwell equations directly in the time domain when a bunched beam goes through an axi-symmetric structure on or off axis. An arbitrary charge distribution can be defined by the user (default = Gaussian). This document is meant to be a comprehensive user's guide describing all features of ABCI version 9.4, including all additions since the release of the guide for version 8.8. All appendixes from the previous two user's guides that contain important topics are also quoted. The main advantages of ABCI lie in its high speed of execution, its minimal use of computer memory, its implementation of the Napoly integration method, and its many elaborate options for Fourier transformations. In version 9.4, wake potentials can even be calculated for a counter-rotating beam of opposite charge, instead of only the usual ones for a beam trailing the driving beam. The Windows application version of ABCI is now available as a package which includes stand-alone ABCI executable modules, sample input files, the source code, manuals, and the Windows version of TopDrawer, TopDrawW. This package can be downloaded from the ABCI home page: http://abci.kek.jp/abci.htm. Simply by drag-and-dropping an input file onto the icon of the ABCI application, all the calculation results are produced. Neither compilation of the source code nor installation of the program on Windows is necessary. Together with TopDrawer for Windows, all work (computation of wake fields, generation of figures, and so on) can be done simply and easily on Windows alone. How to use ABCI on Windows and how to install the program on other computer systems are explained at the end of this manual. (author)

  18. Quantum walk computation

    International Nuclear Information System (INIS)

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing that quantum walks are “universal for quantum computation” relate to algorithms designed to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer.
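The kind of process being discussed can be made concrete with a small classical simulation of a discrete-time, Hadamard-coin quantum walk on a line. This is a generic textbook construction, not code from the paper; the function name and step count are illustrative only.

```python
import numpy as np

def hadamard_walk(steps):
    """Discrete-time quantum walk on a line with a Hadamard coin.

    State is indexed [position, coin]; positions -steps..steps map to 0..2*steps.
    """
    n = 2 * steps + 1
    psi = np.zeros((n, 2), dtype=complex)
    psi[steps, 0] = 1.0                      # walker at the origin, coin |0>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(steps):
        psi = psi @ H.T                      # coin flip at every position
        shifted = np.zeros_like(psi)
        shifted[1:, 0] = psi[:-1, 0]         # coin |0> component steps right
        shifted[:-1, 1] = psi[1:, 1]         # coin |1> component steps left
        psi = shifted
    return (np.abs(psi) ** 2).sum(axis=1)    # position probability distribution

probs = hadamard_walk(50)
```

The quantum walk spreads ballistically (standard deviation proportional to the number of steps), whereas the classical random walk spreads diffusively (proportional to the square root of the number of steps) -- the property that underlies many of the algorithmic speedups mentioned above.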

  19. Intruder dose pathway analysis for the onsite disposal of radioactive wastes: The ONSITE/MAXI1 computer program

    International Nuclear Information System (INIS)

    Kennedy, W.E. Jr.; Peloquin, R.A.; Napier, B.A.; Neuder, S.M.

    1987-02-01

    This document summarizes initial efforts to develop human-intrusion scenarios and a modified version of the MAXI computer program for potential use by the NRC in reviewing applications for onsite radioactive waste disposal. Supplement 1 of NUREG/CR-3620 (1986) summarized modifications and improvements to the ONSITE/MAXI1 software package. The modified version of the ONSITE/MAXI1 computer program described here operates on a personal computer and permits the user to optionally select radiation dose conversion factors published by the International Commission on Radiological Protection (ICRP) in Publication No. 30 (ICRP 1979-1982) in place of those published in Publication No. 2 (ICRP 1959), as implemented in previous versions of the program. The pathway-to-human models used in the computer program have not been changed from those described previously. Computer listings of the ONSITE/MAXI1 computer program and supporting databases are included in the appendices of this document

  20. Preoperative screening: value of previous tests.

    Science.gov (United States)

    Macpherson, D S; Snow, R; Lofgren, R P

    1990-12-15

    To determine the frequency of tests done in the year before elective surgery that might substitute for preoperative screening tests and to determine the frequency of test results that change from a normal value to a value likely to alter perioperative management. Retrospective cohort analysis of computerized laboratory data (complete blood count, sodium, potassium, and creatinine levels, prothrombin time, and partial thromboplastin time). Urban tertiary care Veterans Affairs Hospital. Consecutive sample of 1109 patients who had elective surgery in 1988. At admission, 7549 preoperative tests were done, 47% of which duplicated tests performed in the previous year. Of 3096 previous results that were normal as defined by the hospital reference range and obtained closest to, but before, admission (median interval, 2 months), 13 (0.4%; 95% CI, 0.2% to 0.7%) repeat values were outside a range considered acceptable for surgery. Most of the abnormalities were predictable from the patient's history, and most were not noted in the medical record. Of 461 previous tests that were abnormal, 78 (17%; CI, 13% to 20%) repeat values at admission were outside a range considered acceptable for surgery (P < 0.001 for the comparison of the frequency of clinically important abnormalities between patients with normal and with abnormal previous results). Physicians evaluating patients preoperatively could safely substitute the previous test results analyzed in this study for preoperative screening tests if the previous tests are normal and no obvious indication for retesting is present.
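The reported proportions and intervals can be sanity-checked with a normal-approximation (Wald) confidence interval. The study likely used an exact method, so the endpoints may differ slightly at the margins; the function name below is illustrative, not from the paper.

```python
import math

def prop_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation (Wald) 95% CI for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), p + half

# Normal-previous-result group: 13 of 3096 clinically important changes.
p1, lo1, hi1 = prop_ci(13, 3096)
# Abnormal-previous-result group: 78 of 461.
p2, lo2, hi2 = prop_ci(78, 461)
```

The point estimates reproduce the abstract's 0.4% and 17%, and the approximate intervals land close to the reported 0.2%-0.7% and 13%-20%.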

  1. PAN AIR: A computer program for predicting subsonic or supersonic linear potential flows about arbitrary configurations using a higher order panel method. Volume 2: User's manual (version 3.0)

    Science.gov (United States)

    Sidwell, Kenneth W.; Baruah, Pranab K.; Bussoletti, John E.; Medan, Richard T.; Conner, R. S.; Purdon, David J.

    1990-01-01

    A comprehensive description of user problem definition for the PAN AIR (Panel Aerodynamics) system is given. PAN AIR solves the 3-D linear integral equations of subsonic and supersonic flow. Influence coefficient methods are used which employ source and doublet panels as boundary surfaces. Both analysis and design boundary conditions can be used. This User's Manual describes the information needed to use the PAN AIR system. The structure and organization of PAN AIR are described, including the job control and module execution control languages for execution of the program system. The engineering input data are described, including the mathematical and physical modeling requirements. This manual strictly applies only to PAN AIR version 3.0. The major revisions include: (1) inputs and guidelines for the new FDP module (which calculates streamlines and offbody points); (2) nine new class 1 and class 2 boundary conditions to cover commonly used modeling practices, in particular the vorticity matching Kutta condition; (3) use of the CRAY Solid-state Storage Device (SSD); and (4) incorporation of errata and typo corrections, together with additional explanations and guidelines.

  2. Automatic electromagnetic valve for previous vacuum

    International Nuclear Information System (INIS)

    Granados, C. E.; Martin, F.

    1959-01-01

    A valve which maintains an installation's vacuum when the electric current fails is described. It also admits air into the backing vacuum pump, to prevent the oil from rising into the vacuum tubes. (Author)

  3. Condiment: general synthesis of different versions

    International Nuclear Information System (INIS)

    Mangin, J.P.

    1990-01-01

    CONDIMENT is a code for the computation of ion migration and diffusion in areas close to radwaste storage facilities. This type of application was found to require a mesh pattern and boundary conditions different from the usual ones, which justified the writing of a new code. A first version (version 2) only covers the migration of a single, non-radioactive ion. The discretization, the selection of an implicit scheme, and the various boundary conditions are described. Physical quantities such as the diffusion coefficient, porosity, retardation factor and permeability vary in space but not in time. A first extension takes radioactivity and filiation (decay chains) into consideration. Discretization with respect to time is modified, and a check is performed against the original analytical solutions. In a second extension, consideration is given to non-linear adsorption, which makes it necessary to use the Newton-Raphson method. One can thus model Freundlich isotherms, in spite of the singular point at the origin. Diffusion, apparent porosity and permeability values can be changed as the computation proceeds. The last extension is the introduction of two ions with the formation of a precipitate. The formulation is derived from that used for non-linear adsorption, the precipitate playing a part similar to that of the adsorbed concentration. Agreement with the original analytical solutions is verified. The case of migration with several interacting ions is approached from the theoretical standpoint. We describe the discretization, which is similar to that in the first version, but involves many additional variables. Numerical stability is shown to be unconditional [fr]
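The implicit scheme the abstract alludes to can be sketched as a backward-Euler step for 1-D diffusion with fixed-concentration (Dirichlet) boundaries. This is a minimal generic illustration, not CONDIMENT's actual discretization, and every parameter value below is made up.

```python
import numpy as np

def implicit_diffusion_step(c, D, dx, dt, c_left, c_right):
    """One backward-Euler step of 1-D diffusion, (I - r*L) c_new = c_old + BCs.

    r = D*dt/dx^2; the tridiagonal system makes the scheme unconditionally
    stable, which is why implicit schemes suit long-time migration runs.
    """
    n = len(c)
    r = D * dt / dx**2
    A = np.zeros((n, n))
    b = c.copy()
    for i in range(n):
        A[i, i] = 1 + 2 * r
        if i > 0:
            A[i, i - 1] = -r
        if i < n - 1:
            A[i, i + 1] = -r
    b[0] += r * c_left          # fixed concentration at the left boundary
    b[-1] += r * c_right       # fixed concentration at the right boundary
    return np.linalg.solve(A, b)

# Toy run: a unit concentration held at the left edge diffusing into clean medium.
c = np.zeros(49)
for _ in range(200):
    c = implicit_diffusion_step(c, D=1e-9, dx=0.02, dt=3600.0,
                                c_left=1.0, c_right=0.0)
```

A production code would use a tridiagonal (Thomas) solver rather than a dense solve, and would fold in retardation, decay, and the Newton-Raphson treatment of non-linear adsorption described above.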

  4. StreamStats, version 4

    Science.gov (United States)

    Ries, Kernell G.; Newson, Jeremy K.; Smith, Martyn J.; Guthrie, John D.; Steeves, Peter A.; Haluska, Tana L.; Kolb, Katharine R.; Thompson, Ryan F.; Santoro, Richard D.; Vraga, Hans W.

    2017-10-30

    Introduction: StreamStats version 4, available at https://streamstats.usgs.gov, is a map-based web application that provides an assortment of analytical tools that are useful for water-resources planning and management and for engineering purposes. Developed by the U.S. Geological Survey (USGS), the primary purpose of StreamStats is to provide estimates of streamflow statistics for user-selected ungaged sites on streams and for USGS streamgages, which are locations where streamflow data are collected. Streamflow statistics, such as the 1-percent flood, the mean flow, and the 7-day 10-year low flow, are used by engineers, land managers, biologists, and many others to help guide decisions in their everyday work. For example, estimates of the 1-percent flood (which is exceeded, on average, once in 100 years and has a 1-percent chance of exceedance in any year) are used to create flood-plain maps that form the basis for setting insurance rates and land-use zoning. This and other streamflow statistics also are used for dam, bridge, and culvert design; water-supply planning and management; permitting of water withdrawals and wastewater and industrial discharges; hydropower facility design and regulation; and setting of minimum allowed streamflows to protect freshwater ecosystems. Streamflow statistics can be computed from available data at USGS streamgages depending on the type of data collected at the stations. Most often, however, streamflow statistics are needed at ungaged sites, where no streamflow data are available to determine the statistics.
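The "1-percent chance of exceedance in any year" definition implies a simple cumulative-risk calculation over a planning horizon, assuming independent years; the function below is a generic sketch, not part of StreamStats.

```python
def prob_at_least_one_exceedance(annual_p, years):
    """Probability that a flood with annual exceedance probability `annual_p`
    occurs at least once in `years` years, assuming independent years."""
    return 1 - (1 - annual_p) ** years

# A "100-year" (1-percent) flood over a typical 30-year mortgage horizon:
risk_30yr = prob_at_least_one_exceedance(0.01, 30)
```

This is why the 1-percent flood is far from a once-in-a-lifetime event: over 30 years the chance of at least one occurrence is roughly 26%.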

  5. MCNP(trademark) Version 5

    International Nuclear Information System (INIS)

    Cox, Lawrence J.; Barrett, Richard F.; Booth, Thomas Edward; Briesmeister, Judith F.; Brown, Forrest B.; Bull, Jeffrey S.; Giesler, Gregg Carl; Goorley, John T.; Mosteller, Russell D.; Forster, R. Arthur; Post, Susan E.; Prael, Richard E.; Selcow, Elizabeth Carol; Sood, Avneet

    2002-01-01

    The Monte Carlo transport workhorse, MCNP, is undergoing a massive renovation at Los Alamos National Laboratory (LANL) in support of the Eolus Project of the Advanced Simulation and Computing (ASCI) Program. MCNP Version 5 (V5) (expected to be released to RSICC in Spring, 2002) will consist of a major restructuring from FORTRAN-77 (with extensions) to ANSI-standard FORTRAN-90 with support for all of the features available in the present release (MCNP-4C2/4C3). To most users, the look-and-feel of MCNP will not change much except for the improvements (better graphics, easier installation, improved online documentation). For example, even with the major format change, full support for incremental patching will still be provided. In addition to the language and style updates, MCNP V5 will have various new user features. These include improved photon physics, neutral particle radiography, enhancements and additions to variance reduction methods, new source options, and improved parallelism support (PVM, MPI, OpenMP).

  6. APGEN Version 5.0

    Science.gov (United States)

    Maldague, Pierre; Page, Dennis; Chase, Adam

    2005-01-01

    Activity Plan Generator (APGEN), now at version 5.0, is a computer program that assists in generating an integrated plan of activities for a spacecraft mission that does not oversubscribe spacecraft and ground resources. APGEN generates an interactive display, through which the user can easily create or modify the plan. The display summarizes the plan by means of a time line, whereon each activity is represented by a bar stretched between its beginning and ending times. Activities can be added, deleted, and modified via simple mouse and keyboard actions. The use of resources can be viewed on resource graphs. Resource and activity constraints can be checked. Types of activities, resources, and constraints are defined by simple text files, which the user can modify. In one of two modes of operation, APGEN acts as a planning expert assistant, displaying the plan and identifying problems in the plan. The user is in charge of creating and modifying the plan. In the other mode, APGEN automatically creates a plan that does not oversubscribe resources. The user can then manually modify the plan. APGEN is designed to interact with other software that generates sequences of timed commands for implementing details of planned activities.

  7. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION)

    Science.gov (United States)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches them against the rule network, field by field. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. 
The PC version and the Macintosh

  8. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (MACINTOSH VERSION)

    Science.gov (United States)

    Culbert, C.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches them against the rule network, field by field. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. 
The PC version and the Macintosh

  9. RASCAL Version 2.0 workbook

    International Nuclear Information System (INIS)

    Athey, G.F.; McKenna, T.J.

    1993-05-01

    The Radiological Assessment System for Consequence Analysis, Version 2.0 (RASCAL 2.0), has been developed for use by NRC personnel who respond to radiological emergencies. This workbook is intended to complement the RASCAL 2.0 User's Guide (NUREG/CR-5247, Vol. 1). The workbook contains exercises designed to familiarize the user with the computer-based tools of RASCAL through hands-on problem solving. The workbook is composed of four major sections. The first part is a RASCAL familiarization exercise to acquaint the user with the operation of the forms, menus, on-line help, and documentation. The latter three parts contain exercises in using the three tools of RASCAL Version 2.0: DECAY, FM-DOSE, and ST-DOSE. Each section of exercises is followed by a discussion of how the tools could be used to solve the problem

  10. HECTR Version 1.5 user's manual

    International Nuclear Information System (INIS)

    Dingman, S.E.; Camp, A.L.; Wong, C.C.; King, D.B.; Gasser, R.D.

    1986-04-01

    This report describes the use and features of HECTR Version 1.5. HECTR is a relatively fast-running, lumped-volume containment analysis computer program that is most useful for performing parametric studies. The main purpose of HECTR is to analyze nuclear reactor accidents involving the transport and combustion of hydrogen, but HECTR can also function as an experiment analysis tool and can solve a limited set of other types of containment problems. New models added to HECTR Version 1.5 include fan coolers, containment leakage, continuous burning, and the capability to treat carbon monoxide and carbon dioxide. Models for the ice condenser, sumps, and Mark III suppression pool were upgraded. HECTR is designed for flexibility and provides for user control of many important parameters, particularly those related to hydrogen combustion. Built-in correlations and default values of key parameters are also provided

  11. Overview of MPLNET Version 3 Cloud Detection

    Science.gov (United States)

    Lewis, Jasper R.; Campbell, James; Welton, Ellsworth J.; Stewart, Sebastian A.; Haftings, Phillip

    2016-01-01

    The National Aeronautics and Space Administration Micro Pulse Lidar Network, version 3, cloud detection algorithm is described and differences relative to the previous version are highlighted. Clouds are identified from normalized level 1 signal profiles using two complementary methods. The first method considers vertical signal derivatives for detecting low-level clouds. The second method, which detects high-level clouds like cirrus, is based on signal uncertainties necessitated by the relatively low signal-to-noise ratio exhibited in the upper troposphere by eye-safe network instruments, especially during daytime. Furthermore, a multitemporal averaging scheme is used to improve cloud detection under conditions of a weak signal-to-noise ratio. Diurnal and seasonal cycles of cloud occurrence frequency based on one year of measurements at the Goddard Space Flight Center (Greenbelt, Maryland) site are compared for the new and previous versions. The largest differences, and perceived improvement, in detection occur for high clouds (above 5 km above mean sea level), which increase in occurrence by over 5%. There is also an increase in the detection of multilayered cloud profiles from 9% to 19%. Macrophysical properties and estimates of cloud optical depth are presented for a transparent cirrus dataset. However, the limit to which the cirrus cloud optical depth can be reliably estimated occurs between 0.5 and 0.8. A comparison using collocated CALIPSO measurements at the Goddard Space Flight Center and Singapore Micro Pulse Lidar Network (MPLNET) sites indicates improvements in cloud occurrence frequencies and layer heights.
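The first (derivative-based) method can be caricatured on a synthetic profile: a cloud layer produces a sharp positive vertical gradient in the otherwise decaying signal. The sketch below invents the profile shape and threshold and is not the MPLNET algorithm itself.

```python
import numpy as np

def cloud_base_from_derivative(signal, heights, threshold):
    """Return the height of the first strong positive vertical gradient in a
    normalized lidar profile, as a candidate cloud base; None if no cloud."""
    grad = np.gradient(signal, heights)      # d(signal)/dz
    idx = np.where(grad > threshold)[0]
    return heights[idx[0]] if idx.size else None

z = np.linspace(0.1, 10.0, 200)              # height grid, km
profile = np.exp(-z / 8.0)                   # smooth clear-sky decay (synthetic)
profile[(z > 2.0) & (z < 2.5)] += 5.0        # synthetic cloud layer at ~2 km
base = cloud_base_from_derivative(profile, z, threshold=10.0)
```

A derivative criterion like this works well for optically thick low clouds but fails for tenuous cirrus, which is why the abstract pairs it with an uncertainty-based method for the upper troposphere.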

  12. QDENSITY—A Mathematica quantum computer simulation

    Science.gov (United States)

    Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank

    2009-03-01

    This Mathematica 6.0 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc.
    New version program summary
    Program title: QDENSITY 2.0
    Catalogue identifier: ADXH_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXH_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 26 055
    No. of bytes in distributed program, including test data, etc.: 227 540
    Distribution format: tar.gz
    Programming language: Mathematica 6.0
    Operating system: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4
    Catalogue identifier of previous version: ADXH_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 914
    Classification: 4.15
    Does the new version supersede the previous version?: Offers an alternative, more up to date, implementation
    Nature of problem: Analysis and design of quantum circuits, quantum algorithms and quantum clusters.
    Solution method: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples (Teleportation, Shor's Algorithm and Grover's search) are explained in detail. A tutorial, Tutorial.nb, is also enclosed.
    Reasons for new version: The package has been updated to make it fully compatible with Mathematica 6.0
    Summary of revisions: The package has been updated to make it fully compatible with Mathematica 6.0
    Running time: Most examples
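The density-matrix emphasis of the package reduces, for a single gate, to the conjugation rule ρ → UρU†. A minimal NumPy sketch of that rule follows (QDENSITY itself is a Mathematica package; this is only an illustration of the formalism, not its code):

```python
import numpy as np

# Pure state |0><0| as a density matrix.
ket0 = np.array([[1], [0]], dtype=complex)
rho = ket0 @ ket0.conj().T

# Apply a Hadamard gate: rho -> H rho H^dagger.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
rho = H @ rho @ H.conj().T

# Probability of measuring |0> is the (0,0) diagonal element.
p0 = np.real(rho[0, 0])
```

Unlike state vectors, density matrices also represent mixed states, which is what makes them the natural tool for simulating noisy circuits and partial measurements.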

  13. ERRATUM - French version only

    CERN Multimedia

    The following text replaces the French version of the box published on page 2 of Bulletin 28/2003: On 1 July 1953, the representatives of the twelve founding Member States of CERN signed the Organization's Convention. Today, CERN has twenty European Member States: Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland, Portugal, the Slovak Republic, Spain, Sweden, Switzerland, and the United Kingdom. India, Israel, Japan, the Russian Federation, Turkey, the United States, the European Commission and UNESCO have observer status.

  14. NOAA Climate Data Record of Microwave Sounding Unit (MSU) Mean Atmospheric Layer Temperature, Version 1.2 (Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  15. GENXICC2.1: An improved version of GENXICC for hadronic production of doubly heavy baryons

    Science.gov (United States)

    Wang, Xian-You; Wu, Xing-Gang

    2013-03-01

    We present an improved version of GENXICC, a generator for the hadronic production of the doubly heavy baryons Ξcc, Ξbc and Ξbb introduced by C.H. Chang, J.X. Wang and X.G. Wu [Comput. Phys. Commun. 177 (2007) 467; Comput. Phys. Commun. 181 (2010) 1144]. In comparison with the previous GENXICC versions, we update the program to generate unweighted baryon events more effectively under various simulation environments; their distributions are now generated according to a probability proportional to the integrand. One Les Houches Event (LHE) common block has been added to produce a standard LHE data file that contains useful information on the doubly heavy baryon and its accompanying partons. Such LHE data can be conveniently imported into PYTHIA for further hadronization and decay simulation; in particular, the color-flow problem can be solved with PYTHIA8.0.
    NEW VERSION PROGRAM SUMMARY
    Title of program: GENXICC2.1
    Program obtained from: CPC Program Library
    Reference to original program: GENXICC
    Reference in CPC: Comput. Phys. Commun. 177, 467 (2007); Comput. Phys. Commun. 181, 1144 (2010)
    Does the new version supersede the old program: No
    Computer: Any LINUX-based PC with FORTRAN 77 or FORTRAN 90 and the GNU C compiler
    Operating systems: LINUX
    Programming language used: FORTRAN 77/90
    Memory required to execute with typical data: About 2.0 MB
    No. of bytes in distributed program: About 2 MB, including PYTHIA6.4
    Distribution format: .tar.gz
    Nature of physical problem: Hadronic production of the doubly heavy baryons Ξcc, Ξbc and Ξbb.
    Method of solution: The upgraded version with a proper interface to PYTHIA can generate full production and decay events, either weighted or unweighted, conveniently and effectively. In particular, the unweighted events are generated by using an improved hit-and-miss approach.
    Reasons for new version: Responding to the feedback from users of CMS and LHCb groups at the Large Hadron Collider, and based on
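The "hit-and-miss approach" named in the summary is a variant of standard acceptance-rejection unweighting: each weighted event is kept with probability proportional to its weight, so the accepted sample follows the weight distribution with unit weights. The sketch below shows the plain generic technique, not GENXICC's improved version.

```python
import random

def unweight(events, weights, rng=random.Random(1)):
    """Hit-and-miss unweighting: keep event i with probability w_i / w_max.

    The surviving events are distributed according to the weights and can
    all be treated as having weight 1.
    """
    w_max = max(weights)
    return [e for e, w in zip(events, weights) if rng.random() < w / w_max]

# Toy sample: 500 events of weight 1.0 followed by 500 of weight 0.2.
events = list(range(1000))
weights = [1.0 if e < 500 else 0.2 for e in events]
kept = unweight(events, weights)
```

The efficiency of the method is the mean weight divided by the maximum weight, which is why generators work to flatten the weight distribution before unweighting.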

  16. 77 FR 70176 - Previous Participation Certification

    Science.gov (United States)

    2012-11-23

    ... participants' previous participation in government programs and ensure that the past record is acceptable prior to granting approval to participate... information is designed to be 100 percent automated and digital submission of all data and certifications is...

  17. On the Tengiz petroleum deposit previous study

    International Nuclear Information System (INIS)

    Nysangaliev, A.N.; Kuspangaliev, T.K.

    1997-01-01

    A previous study of the Tengiz petroleum deposit is described. Some considerations about the structure of the productive formation and the specific characteristic properties of the petroleum-bearing collectors are presented. Recommendations are given on their detailed study and on using experience from the exploration and development of petroleum deposits that are analogous in the most important geological and industrial parameters. (author)

  18. Subsequent pregnancy outcome after previous foetal death

    NARCIS (Netherlands)

    Nijkamp, J. W.; Korteweg, F. J.; Holm, J. P.; Timmer, A.; Erwich, J. J. H. M.; van Pampus, M. G.

    Objective: A history of foetal death is a risk factor for complications and foetal death in subsequent pregnancies as most previous risk factors remain present and an underlying cause of death may recur. The purpose of this study was to evaluate subsequent pregnancy outcome after foetal death and to

  19. Previous Experience a Model of Practice UNAE

    Directory of Open Access Journals (Sweden)

    Ormary Barberi Ruiz

    2017-02-01

    Full Text Available The statements presented in this article represent a preliminary version of the proposed model of pre-professional practices (PPP) of the National University of Education (UNAE) of Ecuador. An urgent institutional need is revealed by the descriptive analyses conducted from technical-administrative sources (reports, interviews, testimonials), from the pedagogical foundations of UNAE (curricular directionality, transverse axes in practice, career plan, diagnostic approach and examination) as the subject matter of pre-professional practice, and from the demands of the socio-educational contexts in which the practices have emerged and must be reshaped. Relating these elements made it possible to model the processes of the pre-professional practices for developing the professional skills of future teachers through four components: contextual-projective, implementation (tutoring), accompaniment (teaching pairs), and monitoring (meetings at the beginning, during, and at the end of the practice). The initial training of teachers is inherent to teaching (academic and professional training), research, and links with the community; these are fundamental pillars of Ecuadorian higher education.

  20. Assessment of radionuclide databases in CAP88 mainframe version 1.0 and Windows-based version 3.0.

    Science.gov (United States)

    LaBone, Elizabeth D; Farfán, Eduardo B; Lee, Patricia L; Jannik, G Timothy; Donnelly, Elizabeth H; Foley, Trevor Q

    2009-09-01

    In this study the radionuclide databases for two versions of the Clean Air Act Assessment Package-1988 (CAP88) computer model were assessed in detail. CAP88 estimates radiation dose and the risk of health effects to human populations from radionuclide emissions to air. This program is used by several U.S. Department of Energy (DOE) facilities to comply with National Emission Standards for Hazardous Air Pollutants regulations. CAP88 Mainframe, referred to as version 1.0 on the U.S. Environmental Protection Agency Web site (http://www.epa.gov/radiation/assessment/CAP88/), was the very first CAP88 version, released in 1988. Some DOE facilities, including the Savannah River Site, still employ this version (1.0), while others use the more user-friendly personal computer Windows-based version 3.0 released in December 2007. Version 1.0 uses the program RADRISK, based on International Commission on Radiological Protection Publication 30, as its radionuclide database. Version 3.0 uses half-life, dose, and risk factor values based on Federal Guidance Report 13. Differences in these values could produce different results for the same input exposure data (same scenario), depending on which version of CAP88 is used. Consequently, the differences between the two versions are being assessed in detail at Savannah River National Laboratory. The version 1.0 and 3.0 database files contain 496 and 838 radionuclides, respectively, and though one would expect the newer version to include all 496 radionuclides, 35 radionuclides listed in version 1.0 are not included in version 3.0. Most of these have either extremely short or extremely long half-lives or are no longer produced; however, some of the short-lived radionuclides might produce progeny of great interest at DOE sites. In addition, 122 radionuclides were found to have different half-lives in the two versions, with 21 differing by over 3 percent and 12 by over 10 percent.
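    The cross-check described above (nuclides present in one database but not the other, and half-lives differing by more than a few percent) reduces to set and percent-difference arithmetic. A sketch with hypothetical half-life values, not taken from the actual CAP88 databases:

```python
# Hypothetical half-life values in seconds; NOT the actual CAP88 data.
v1 = {"H-3": 3.89e8, "Co-60": 1.663e8, "Cs-137": 9.0e8, "Na-22": 8.21e7}
v3 = {"H-3": 3.888e8, "Co-60": 1.663e8, "Cs-137": 9.52e8, "I-129": 4.95e14}

# Nuclides listed in version 1.0 but missing from version 3.0.
missing = sorted(set(v1) - set(v3))

def pct_diff(a, b):
    """Percent difference of a relative to b."""
    return abs(a - b) / b * 100.0

# Nuclides whose half-lives differ by more than 3 percent between versions.
flagged = sorted(n for n in set(v1) & set(v3) if pct_diff(v1[n], v3[n]) > 3.0)
```

    The same comparison with a 10 percent threshold would yield the second, smaller list mentioned in the abstract.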

  1. Python pocket reference, version 2.4

    CERN Document Server

    Lutz, Mark

    2005-01-01

    Python is optimized for quality, productivity, portability, and integration. Hundreds of thousands of Python developers around the world rely on Python for general-purpose tasks, Internet scripting, systems programming, user interfaces, and product customization. Available on all major computing platforms, including commercial versions of Unix, Linux, Windows, and Mac OS X, Python is portable, powerful and remarkably easy to use. With its convenient, quick-reference format, Python Pocket Reference, 3rd Edition is the perfect on-the-job reference. More importantly, it's now been refreshed

  2. UQTk version 2.0 user manual

    Energy Technology Data Exchange (ETDEWEB)

    Debusschere, Bert J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2013-10-01

    The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 2.0 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.
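    Non-intrusive propagation, as mentioned above, treats the computational model as a black box: sample the uncertain inputs, run the model on each sample, and summarize the output distribution. A minimal Monte Carlo sketch of that idea (UQTk itself also offers polynomial chaos surrogates, which this does not show; the toy model and distributions are invented):

```python
import random
import statistics

def propagate(model, input_dists, n=20000, seed=0):
    """Non-intrusive propagation: sample inputs, run the model as a black box."""
    rng = random.Random(seed)
    outs = [model(*(draw(rng) for draw in input_dists)) for _ in range(n)]
    return statistics.mean(outs), statistics.stdev(outs)

# Toy model y = a + b with independent Gaussian inputs:
# the exact answer is mean 1 + 2 = 3 and stdev sqrt(0.3**2 + 0.4**2) = 0.5.
mean, sd = propagate(lambda a, b: a + b,
                     [lambda r: r.gauss(1.0, 0.3), lambda r: r.gauss(2.0, 0.4)])
```

    For a nonlinear model the output distribution is generally non-Gaussian, which is where surrogate and sensitivity tools earn their keep.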

  3. Subsequent childbirth after a previous traumatic birth.

    Science.gov (United States)

    Beck, Cheryl Tatano; Watson, Sue

    2010-01-01

    Nine percent of new mothers in the United States who participated in the Listening to Mothers II Postpartum Survey screened positive for meeting the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for posttraumatic stress disorder after childbirth. Women who have had a traumatic birth experience report fewer subsequent children and a longer length of time before their second baby. Childbirth-related posttraumatic stress disorder impacts couples' physical relationship, communication, conflict, emotions, and bonding with their children. The purpose of this study was to describe the meaning of women's experiences of a subsequent childbirth after a previous traumatic birth. Phenomenology was the research design used. An international sample of 35 women participated in this Internet study. Women were asked, "Please describe in as much detail as you can remember your subsequent pregnancy, labor, and delivery following your previous traumatic birth." Colaizzi's phenomenological data analysis approach was used to analyze the stories of the 35 women. Data analysis yielded four themes: (a) riding the turbulent wave of panic during pregnancy; (b) strategizing: attempts to reclaim their body and complete the journey to motherhood; (c) bringing reverence to the birthing process and empowering women; and (d) still elusive: the longed-for healing birth experience. Subsequent childbirth after a previous birth trauma has the potential to either heal or retraumatize women. During pregnancy, women need permission and encouragement to grieve their prior traumatic births to help remove the burden of their invisible pain.

  4. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Uranium Metal, Oxide, and Solution Systems on the High Performance Computing Platform Moonlight

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, Bryan Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); MacQuigg, Michael Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wysong, Andrew Russell [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-04-21

    In this document, the code MCNP is validated with ENDF/B-VII.1 cross section data under the purview of ANSI/ANS-8.24-2007, for use with uranium systems. MCNP is a computer code based on Monte Carlo transport methods. While MCNP has wide-ranging capability in nuclear transport simulation, this validation is limited to the functionality related to neutron transport and the calculation of criticality parameters such as keff.

  5. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Uranium Metal, Oxide, and Solution Systems on the High Performance Computing Platform Moonlight

    International Nuclear Information System (INIS)

    Chapman, Bryan Scott; MacQuigg, Michael Robert; Wysong, Andrew Russell

    2016-01-01

    In this document, the code MCNP is validated with ENDF/B-VII.1 cross section data under the purview of ANSI/ANS-8.24-2007, for use with uranium systems. MCNP is a computer code based on Monte Carlo transport methods. While MCNP has wide-ranging capability in nuclear transport simulation, this validation is limited to the functionality related to neutron transport and the calculation of criticality parameters such as keff.

  6. SPARK Version 1.1 user manual

    International Nuclear Information System (INIS)

    Weissenburger, D.W.

    1988-01-01

    This manual describes the input required to use Version 1.1 of the SPARK computer code. SPARK 1.1 is a library of FORTRAN main programs and subprograms designed to calculate eddy currents on conducting surfaces where current flow is assumed zero in the direction normal to the surface. Surfaces are modeled with triangular and/or quadrilateral elements. Lorentz forces produced by the interaction of eddy currents with background magnetic fields can be output at element nodes in a form compatible with most structural analysis codes. In addition, magnetic fields due to eddy currents can be determined at points off the surface. Version 1.1 features eddy current streamline plotting with optional hidden-surface-removal graphics and topological enhancements that allow essentially any orientable surface to be modeled. SPARK also has extensive symmetry specification options. In order to make the manual as self-contained as possible, six appendices are included that present summaries of the symmetry options, topological options, coil options and code algorithms, with input and output examples. An edition of SPARK 1.1 is available on the Cray computers at the National Magnetic Fusion Energy Computer Center at Livermore, California. Another more generic edition is operational on the VAX computers at the Princeton Plasma Physics Laboratory and is available on magnetic tape by request. The generic edition requires either the GKS or PLOT10 graphics package and the IMSL or NAG mathematical package. Requests from outside the United States will be subject to applicable federal regulations regarding dissemination of computer programs. 22 refs
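    The Lorentz forces SPARK reports arise from the interaction of eddy currents with background magnetic fields, i.e. F = J × B integrated over each element. A minimal sketch of that node-level computation, assuming uniform J and B over a small element (the numerical values are illustrative only):

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def lorentz_force(current_density, volume, b_field):
    """F = J x B integrated over a small element where J and B are uniform."""
    jxb = cross(current_density, b_field)
    return tuple(volume * c for c in jxb)

# A current density along +x in a background field along +z is pushed along -y.
force = lorentz_force((1.0e6, 0.0, 0.0), 0.5, (0.0, 0.0, 2.0))
```

    A structural code then takes such nodal forces as load input, which is why SPARK writes them in a format compatible with structural analysis codes.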

  7. SPARK Version 1. 1 user manual

    Energy Technology Data Exchange (ETDEWEB)

    Weissenburger, D.W.

    1988-01-01

    This manual describes the input required to use Version 1.1 of the SPARK computer code. SPARK 1.1 is a library of FORTRAN main programs and subprograms designed to calculate eddy currents on conducting surfaces where current flow is assumed zero in the direction normal to the surface. Surfaces are modeled with triangular and/or quadrilateral elements. Lorentz forces produced by the interaction of eddy currents with background magnetic fields can be output at element nodes in a form compatible with most structural analysis codes. In addition, magnetic fields due to eddy currents can be determined at points off the surface. Version 1.1 features eddy current streamline plotting with optional hidden-surface-removal graphics and topological enhancements that allow essentially any orientable surface to be modeled. SPARK also has extensive symmetry specification options. In order to make the manual as self-contained as possible, six appendices are included that present summaries of the symmetry options, topological options, coil options and code algorithms, with input and output examples. An edition of SPARK 1.1 is available on the Cray computers at the National Magnetic Fusion Energy Computer Center at Livermore, California. Another more generic edition is operational on the VAX computers at the Princeton Plasma Physics Laboratory and is available on magnetic tape by request. The generic edition requires either the GKS or PLOT10 graphics package and the IMSL or NAG mathematical package. Requests from outside the United States will be subject to applicable federal regulations regarding dissemination of computer programs. 22 refs.

  8. Computer code conversion using HISTORIAN

    International Nuclear Information System (INIS)

    Matsumoto, Kiyoshi; Kumakura, Toshimasa.

    1990-09-01

    When a computer program written for a computer A is converted for a computer B, the A-version source program is, in general, rewritten into a B version. However, this way of converting a program raises the following problems. 1) The original statements that were rewritten for the B version are lost. 2) If the original A-version statements were kept in the B-version source as comment lines, the B-version source program would become quite large. 3) When update directives for the program are mailed from the organization that developed it, or when modifications are needed, it is difficult to locate the part to be updated or modified in the B-version source program. To solve these problems, a conversion method using the general-purpose software management aid system HISTORIAN has been introduced. This conversion method turns a large computer code into a program that remains easy to update, modify or improve after the conversion. This report describes the planning and procedures of the conversion method, with the conversion of the MELPROG-PWR/MOD1 code from the CRAY version to the JAERI FACOM version as an example. This report should provide useful information for those who develop or introduce large programs. (author)
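    The HISTORIAN-style approach keeps a single master source in which machine-specific statements live under named directives, so the original A-version lines are never lost and vendor update directives still apply. A toy illustration of the idea; the *IF/*END directive syntax is invented for this sketch, not HISTORIAN's actual syntax:

```python
def extract_version(master_lines, version):
    """Keep common lines; keep '*IF <name>' blocks only for the chosen version.

    The '*IF'/'*END' markers are illustrative, not HISTORIAN's real directives.
    """
    out, keep = [], True
    for line in master_lines:
        if line.startswith("*IF "):
            keep = (line.split()[1] == version)
        elif line.strip() == "*END":
            keep = True
        elif keep:
            out.append(line)
    return out

# One master file holds both machine variants side by side.
master = [
    "      PROGRAM DEMO",
    "*IF CRAY",
    "      CALL CRAYIO",
    "*END",
    "*IF FACOM",
    "      CALL FACOMIO",
    "*END",
    "      END",
]
facom = extract_version(master, "FACOM")
```

    Selecting "FACOM" emits only the FACOM-specific call, while the CRAY lines remain intact in the master file for future updates.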

  9. Automated evaluation of matrix elements between contracted wavefunctions: A Mathematica version of the FRODO program

    Science.gov (United States)

    Angeli, C.; Cimiraglia, R.

    2013-02-01

    A symbolic program performing the Formal Reduction of Density Operators (FRODO), formerly developed in the MuPAD computer algebra system with the purpose of evaluating the matrix elements of the electronic Hamiltonian between internally contracted functions in a complete active space (CAS) scheme, has been rewritten in Mathematica.
    New version program summary
    Program title: FRODO
    Catalogue identifier: ADVY_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVY_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 3878
    No. of bytes in distributed program, including test data, etc.: 170729
    Distribution format: tar.gz
    Programming language: Mathematica
    Computer: Any computer on which the Mathematica computer algebra system can be installed
    Operating system: Linux
    Classification: 5
    Catalogue identifier of previous version: ADVY_v1_0
    Journal reference of previous version: Comput. Phys. Commun. 171 (2005) 63
    Does the new version supersede the previous version?: No
    Nature of problem: In order to improve on the CAS-SCF wavefunction one can resort to multireference perturbation theory or configuration interaction based on internally contracted functions (ICFs), which are obtained by applying the excitation operators to the reference CAS-SCF wavefunction. The previous formulation of such matrix elements, in the MuPAD computer algebra system, has been rewritten using Mathematica.
    Solution method: The method adopted consists in successively eliminating all occurrences of inactive orbital indices (core and virtual) from the products of excitation operators which appear in the definition of the ICFs and in the electronic Hamiltonian expressed in the second quantization formalism.
    Reasons for new version: Some years ago we published in this journal a couple of papers [1, 2

  10. Procedure guideline for thyroid scintigraphy (version 3)

    International Nuclear Information System (INIS)

    Dietlein, M.; Schicha, H.; Eschner, W.; Deutsche Gesellschaft fuer Medizinische Physik; Koeln Univ.; Leisner, B.; Allgemeines Krankenhaus St. Georg, Hamburg; Reiners, C.; Wuerzburg Univ.

    2007-01-01

    Version 3 of the procedure guideline for thyroid scintigraphy is an update of the procedure guideline previously published in 2003. Interpretation of the scintigraphy requires knowledge of the patient's history, palpation of the neck, the laboratory parameters and the sonography. Interpretation of the technetium-99m uptake requires knowledge of the TSH level. As a consequence of the improved alimentary iodine supply, the 99mTc uptake has decreased; 100 000 counts per scintigraphy should be acquired. For this, an imaging time of 10 minutes is generally needed using a high-resolution collimator for thyroid imaging. (orig.)

  11. FBR metallic materials test manual (English version)

    International Nuclear Information System (INIS)

    Odaka, Susumu; Kato, Shoichi; Yoshida, Eiichi

    2003-06-01

    For the development of the fast breeder reactor, this manual describes the methods for in-air and in-sodium material tests and the method of organizing the data. The previous manual has been revised in accordance with the revision of the Japanese Industrial Standards (JIS) and the conversion to international units. Test methods from domestic committees, such as the VAMAS (Versailles Project on Advanced Materials and Standards) workshop, were also referred to. The material test technologies accumulated by this group to date were also incorporated. This English version was prepared in order to make the FBR metallic materials test manual available to more engineers. (author)

  12. High-Performance Java Codes for Computational Fluid Dynamics

    Science.gov (United States)

    Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The computational science community is reluctant to write large-scale, computationally-intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.

  13. Version control of pathway models using XML patches.

    Science.gov (United States)

    Saffrey, Peter; Orton, Richard

    2009-03-17

    Computational modelling has become an important tool in understanding biological systems such as signalling pathways. With an increase in the size and complexity of models comes a need for techniques to manage model versions and their relationships to one another. Model version control for pathway models shares some of the features of software version control but has a number of differences that warrant a specific solution. We present a model version control method, along with a prototype implementation, based on XML patches. We show its application to the EGF/RAS/RAF pathway. Our method allows quick and convenient storage of a wide range of model variations and enables a thorough explanation of these variations. Trying to produce these results without such methods results in slow and cumbersome development that is prone to frustration and human error.
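    The core operation in patch-based versioning is computing a small difference document between two model files rather than storing each variant whole. A toy sketch of that idea using Python's xml.etree; the `<parameter>` schema and the patch tuple format are invented for illustration and are not the paper's actual XML patch format:

```python
import xml.etree.ElementTree as ET

def parameter_patch(old_xml, new_xml):
    """Diff two model documents by parameter id, returning patch operations.

    A toy stand-in for XML-patch-based model versioning.
    """
    def params(doc):
        root = ET.fromstring(doc)
        return {p.get("id"): p.get("value") for p in root.iter("parameter")}

    old, new = params(old_xml), params(new_xml)
    patch = []
    for pid in sorted(set(old) | set(new)):
        if pid not in new:
            patch.append(("remove", pid, old[pid]))
        elif pid not in old:
            patch.append(("add", pid, new[pid]))
        elif old[pid] != new[pid]:
            patch.append(("replace", pid, new[pid]))
    return patch

v1 = '<model><parameter id="k1" value="0.1"/><parameter id="k2" value="2.0"/></model>'
v2 = '<model><parameter id="k1" value="0.15"/><parameter id="k3" value="5.0"/></model>'
ops = parameter_patch(v1, v2)
```

    Storing only such patches keeps a large family of model variants compact, and replaying them explains exactly how one variant differs from another.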

  14. The comparison of CAP88-PC version 2.0 versus CAP88-PC version 1.0

    International Nuclear Information System (INIS)

    Yakubovich, B.A.; Klee, K.O.; Palmer, C.R.; Spotts, P.B.

    1997-12-01

    40 CFR Part 61 (Subpart H of the NESHAP) requires DOE facilities to use approved sampling procedures, computer models, or other approved procedures when calculating Effective Dose Equivalent (EDE) values to members of the public. Currently version 1.0 of the approved computer model CAP88-PC is used to calculate EDE values. The DOE has upgraded the CAP88-PC software to version 2.0. This version provides simplified data entry, better printing characteristics, the use of a mouse, and other features. The DOE has developed and released version 2.0 for testing and comment. This new software is a WINDOWS based application that offers a new graphical user interface with new utilities for preparing and managing population and weather data, and several new decay chains. The program also allows the user to view results before printing. This document describes a test that confirmed CAP88-PC version 2.0 generates results comparable to the original version of the CAP88-PC program

  15. FORIG: a computer code for calculating radionuclide generation and depletion in fusion and fission reactors. User's manual

    International Nuclear Information System (INIS)

    Blink, J.A.

    1985-03-01

    In this manual we describe the use of the FORIG computer code to solve isotope-generation and depletion problems in fusion and fission reactors. FORIG runs on a Cray-1 computer and accepts more extensive activation cross sections than ORIGEN2, from which it was adapted. This report is an updated and combined version of the previous ORIGEN2 and FORIG manuals. 7 refs., 15 figs., 13 tabs
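    Isotope generation and depletion codes such as FORIG and ORIGEN2 solve coupled first-order decay and transmutation equations. For a bare two-member chain (a parent decaying into a daughter, with no production terms) the Bateman solution is closed-form, which makes a handy sanity check; the sketch below is that textbook case, far simpler than the full problem FORIG handles:

```python
import math

def bateman_two(n1_0, lam1, lam2, t):
    """Analytic two-member Bateman chain: parent (lam1) -> daughter (lam2).

    n1_0 is the initial parent population; the daughter starts at zero.
    """
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Parent with half-life 10 s feeding a daughter with half-life 2 s.
lam1 = math.log(2) / 10.0
lam2 = math.log(2) / 2.0
n1, n2 = bateman_two(1.0e6, lam1, lam2, 5.0)
```

    Production codes integrate hundreds of such coupled chains numerically, with activation cross sections adding flux-dependent source terms to each equation.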

  16. Books average previous decade of economic misery.

    Science.gov (United States)

    Bentley, R Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a 'literary misery index' derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade.
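    The study's two ingredients, an economic misery index (inflation plus unemployment) smoothed by a trailing moving average, and a correlation against a second series, are straightforward arithmetic. The yearly figures below are made up for illustration; only the computations mirror the paper, which uses an 11-year window:

```python
def moving_average(xs, window):
    """Trailing average over the previous `window` values."""
    return [sum(xs[i - window:i]) / window for i in range(window, len(xs) + 1)]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical yearly rates (percent), not the paper's data.
inflation    = [2.0, 3.0, 4.0, 5.0, 3.0, 2.0]
unemployment = [4.0, 5.0, 6.0, 7.0, 6.0, 5.0]
misery = [i + u for i, u in zip(inflation, unemployment)]   # [6, 8, 10, 12, 9, 7]
smoothed = moving_average(misery, 3)
```

    The paper's peak-at-11-years finding corresponds to sweeping the window length and recording where the correlation with the literary index is highest.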

  17. Books Average Previous Decade of Economic Misery

    Science.gov (United States)

    Bentley, R. Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a ‘literary misery index’ derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade. PMID:24416159

  18. Retrieval of articles in personal computer

    International Nuclear Information System (INIS)

    Choi, Byung Gil; Park, Seog Hee; Kim, Sung Hoon; Shinn, Kyung Sub

    1994-01-01

    Although many useful articles appear in the journals published in Korea, they are not always cited by researchers, mainly due to the absence of an efficient search system. The authors developed a program with 6 predefined filtering forms to retrieve published articles rapidly and accurately. The program was coded using the database management system CA-Clipper Version 5.2 (Computer Associates International, Inc.) over a preliminary period of 1 year. We used a 486 DX II (8 Mbyte RAM, VGA, 200 Mbyte hard disk), an ink-jet printer (Hewlett Packard Company), and MS-DOS Version 5.0 (Microsoft Co.). We entered a total of 1986 articles published in the Journal of the Korean Radiological Society from 1981 to 1993. The search time was 10 to 15 seconds per query. The user interfaces were flexible and the search methods simple, yet more complicated filtering could also be performed. Although the previous version had some bugs, this upgraded version resolved those problems and is well suited to searching articles. The program should be valuable for radiologists searching for articles published not only in the Journal of the Korean Radiological Society, but also in the Journal of the Korean Society of Medicine Ultrasound and the Korean Journal of Nuclear Medicine

  19. Special issue of Higher-Order and Symbolic Computation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Sabry, Amr

    This issue of HOSC is dedicated to the general topic of continuations. It grew out of the third ACM SIGPLAN Workshop on Continuations (CW'01), which took place in London, UK on January 16, 2001 [3]. The notion of continuation is ubiquitous in many different areas of computer science, including... and streamline Filinski's earlier work in the previous special issue of HOSC (then LISP and Symbolic Computation) that grew out of the first ACM SIGPLAN Workshop on Continuations [1, 2]. Hasegawa and Kakutani's article is the journal version of an article presented at FOSSACS 2001 and that received the EATCS...

  20. Item analysis of the Spanish version of the Boston Naming Test with a Spanish speaking adult population from Colombia.

    Science.gov (United States)

    Kim, Stella H; Strutt, Adriana M; Olabarrieta-Landa, Laiene; Lequerica, Anthony H; Rivera, Diego; De Los Reyes Aragon, Carlos Jose; Utria, Oscar; Arango-Lasprilla, Juan Carlos

    2018-02-23

    The Boston Naming Test (BNT) is a widely used measure of confrontation naming ability that has been criticized for its questionable construct validity for non-English speakers. This study investigated item difficulty and construct validity of the Spanish version of the BNT to assess cultural and linguistic impact on performance. Subjects were 1298 healthy Spanish-speaking adults from Colombia. They were administered the 60- and 15-item Spanish versions of the BNT. A Rasch analysis was conducted to assess dimensionality, item hierarchy, targeting, reliability, and item fit. Both versions of the BNT satisfied requirements for unidimensionality. Although internal consistency was excellent for the 60-item BNT, order of difficulty did not increase consistently with item number and there were a number of items that did not fit the Rasch model. For the 15-item BNT, a total of 5 items changed position on the item hierarchy, with 7 poorly fitting items. Internal consistency was acceptable. Construct validity of the BNT remains a concern when it is administered to non-English speaking populations. Similar to previous findings, the order of item presentation did not correspond with increasing item difficulty, and both versions were inadequate at assessing high naming ability.
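    The Rasch analysis referred to above models the probability of a correct naming response from the gap between a person's ability and an item's difficulty. A minimal sketch of the dichotomous Rasch model; the ability and difficulty values below are hypothetical, not estimates from the BNT data:

```python
import math

def rasch_prob(ability, difficulty):
    """Dichotomous Rasch model: P(correct) = 1 / (1 + exp(-(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# In a well-ordered test, later items are harder, so for a fixed ability the
# success probability should fall monotonically with item position.
difficulties = [-2.0, -0.5, 0.8, 2.1]          # hypothetical item parameters
probs = [rasch_prob(0.5, b) for b in difficulties]
```

    A violation of that monotone pattern, items appearing out of difficulty order, is exactly the kind of problem the study reports for the Spanish BNT.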

  1. Nuclear Criticality Safety Handbook, Version 2. English translation

    International Nuclear Information System (INIS)

    2001-08-01

    The Nuclear Criticality Safety Handbook, Version 2 essentially incorporates the Supplement Report to the Nuclear Criticality Safety Handbook, released in 1995, into the first version of the Nuclear Criticality Safety Handbook, published in 1988. The following two points are new: (1) exemplifying safety margins related to modeled dissolution and extraction processes, and (2) describing evaluation methods and an alarm system for criticality accidents. The chapter that treats modeling of the fuel system has been revised based on previous studies: e.g., the fuel grain size below which the system can be regarded as homogeneous, the non-uniformity effect of fuel solution, and burnup credit. This revision resolves the inconsistencies found in the first version between the evaluation of errors in the JACS code system and the criticality condition data that were calculated based on that evaluation. This report is an English translation of the Nuclear Criticality Safety Handbook, Version 2, originally published in Japanese as JAERI 1340 in 1999. (author)

  2. Research in computer science

    Science.gov (United States)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.
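    One item above, failure probabilities in multi-version software, has a standard textbook baseline: if N independently failing versions vote and a majority must be wrong for the system to fail, the system failure probability is a binomial tail. The sketch below implements that idealization; the research interest is precisely that real versions fail dependently, so this is an optimistic bound:

```python
from math import comb

def majority_failure_prob(n_versions, p_fail):
    """P(system failure) under majority voting with independent version failures.

    Independence is the textbook idealization; multi-version software studies
    find correlated failures that break this assumption.
    """
    need = n_versions // 2 + 1   # failures needed to outvote the correct versions
    return sum(comb(n_versions, k) * p_fail ** k * (1 - p_fail) ** (n_versions - k)
               for k in range(need, n_versions + 1))
```

    With three versions each failing at p = 0.1, the independent-failure model gives a majority-vote failure probability of 0.028, well below the single-version 0.1; correlated faults erode exactly this gain.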

  3. MULTIPLE PROJECTIONS SYSTEM (MPS) - USER'S MANUAL VERSION 1.0

    Science.gov (United States)

    The report is a user's manual for version 1.0 of the Multiple Projections Systems (MPS), a computer system that can perform "what if" scenario analysis and report the final results (i.e., Rate of Further Progress - ROP - inventories) to EPA (i.e., the Aerometric Information Retri...

  4. Underestimation of Severity of Previous Whiplash Injuries

    Science.gov (United States)

    Naqui, SZH; Lovell, SJ; Lovell, ME

    2008-01-01

    INTRODUCTION We noted a report that more significant symptoms may be expressed after second whiplash injuries by a suggested cumulative effect, including degeneration. We wondered if patients were underestimating the severity of their earlier injury. PATIENTS AND METHODS We studied recent medicolegal reports, to assess subjects with a second whiplash injury. They had been asked whether their earlier injury was worse, the same or lesser in severity. RESULTS From the study cohort, 101 patients (87%) felt that they had fully recovered from their first injury and 15 (13%) had not. Seventy-six subjects considered their first injury of lesser severity, 24 worse and 16 the same. Of the 24 that felt the violence of their first accident was worse, only 8 had worse symptoms, and 16 felt their symptoms were mainly the same or less than their symptoms from their second injury. Statistical analysis of the data revealed that the proportion of those claiming a difference who said the previous injury was lesser was 76% (95% CI 66–84%). The observed proportion with a lesser injury was considerably higher than the 50% anticipated. CONCLUSIONS We feel that subjects may underestimate the severity of an earlier injury and associated symptoms. Reasons for this may include secondary gain rather than any proposed cumulative effect. PMID:18201501

  5. [Electronic cigarettes - effects on health. Previous reports].

    Science.gov (United States)

    Napierała, Marta; Kulza, Maksymilian; Wachowiak, Anna; Jabłecka, Katarzyna; Florek, Ewa

    2014-01-01

    Electronic cigarettes (e-cigarettes) have recently become very popular on the tobacco-products market. These products are considered potentially less harmful than traditional tobacco products. However, current reports indicate that manufacturers' statements about the composition of e-liquids are not always sufficient, and consumers often lack reliable information on the quality of the product they use. This paper reviews previous reports on the composition of e-cigarettes and their impact on health. Most of the observed health effects involved the respiratory tract, mouth, throat, and sensory organs, as well as neurological complications. Particularly hazardous effects of e-cigarettes included pneumonia, congestive heart failure, confusion, convulsions, hypotension, aspiration pneumonia, second-degree facial burns, blindness, chest pain and rapid heartbeat. The literature contains no information on passive exposure to the aerosols released during e-cigarette use, and information on long-term use of these products is likewise unavailable.

  6. TOUGH2 User's Guide Version 2

    International Nuclear Information System (INIS)

    Pruess, K.; Oldenburg, C.M.; Moridis, G.J.

    1999-01-01

    TOUGH2 is a numerical simulator for nonisothermal flows of multicomponent, multiphase fluids in one, two, and three-dimensional porous and fractured media. The chief applications for which TOUGH2 is designed are in geothermal reservoir engineering, nuclear waste disposal, environmental assessment and remediation, and unsaturated and saturated zone hydrology. TOUGH2 was first released to the public in 1991; the 1991 code was updated in 1994 when a set of preconditioned conjugate gradient solvers was added to allow a more efficient solution of large problems. The current Version 2.0 features several new fluid property modules and offers enhanced process modeling capabilities, such as coupled reservoir-wellbore flow, precipitation and dissolution effects, and multiphase diffusion. Numerous improvements in previously released modules have been made and new user features have been added, such as enhanced linear equation solvers, and writing of graphics files. The T2VOC module for three-phase flows of water, air and a volatile organic chemical (VOC), and the T2DM module for hydrodynamic dispersion in 2-D flow systems have been integrated into the overall structure of the code and are included in the Version 2.0 package. Data inputs are upwardly compatible with the previous version. Coding changes were generally kept to a minimum, and were only made as needed to achieve the additional functionalities desired. TOUGH2 is written in standard FORTRAN77 and can be run on any platform, such as workstations, PCs, Macintosh, mainframe and supercomputers, for which appropriate FORTRAN compilers are available. This report is a self-contained guide to application of TOUGH2 to subsurface flow problems. It gives a technical description of the TOUGH2 code, including a discussion of the physical processes modeled, and the mathematical and numerical methods used. Illustrative sample problems are presented along with detailed instructions for preparing input data

  7. Portable computers - portable operating systems

    International Nuclear Information System (INIS)

    Wiegandt, D.

    1985-01-01

    Hardware development has made rapid progress over the past decade. Computers used to have attributes like "general purpose" or "universal"; nowadays they are labelled "personal" and "portable". Recently, a major manufacturing company started marketing a portable version of their personal computer. But even for these small computers the old truth still holds that the biggest disadvantage of a computer is that it must be programmed; hardware by itself does not make a computer. (orig.)

  8. Conservation Reasoning Ability and Performance on BSCS Blue Version Examinations.

    Science.gov (United States)

    Lawson, Anton E.; Nordland, Floyd H.

    Twenty-three high school biology students were individually administered three conservation tasks (weight, volume, volume displacement). During one semester, they were examined over the course material using published Biological Sciences Curriculum Study (BSCS) Blue Version examination questions which were previously classified as requiring either…

  9. [Fetal version as ambulatory intervention].

    Science.gov (United States)

    Nohe, G; Hartmann, W; Klapproth, C E

    1996-06-01

    The external cephalic version (ECV) of the fetus at term reduces the maternal and fetal risks of intrapartum breech presentation and Caesarean delivery. Since 1986, over 800 external cephalic versions have been performed in the outpatient Department of Obstetrics and Gynaecology of the Städtische Frauenklinik Stuttgart; 60.5% were successful. No severe complications occurred. Sufficient amniotic fluid and the mobility of the fetal breech are major criteria for the success of the ECV. Management requires a safe technique for mother and fetus. This includes ultrasonography, electronic fetal monitoring and the ability to perform immediate caesarean delivery, as well as the performance of ECV without analgesics and sedatives. More than 70% of the ECVs were successful without tocolysis. In unsuccessful cases the additional use of tocolysis improves the success rate only slightly. Therefore routine use of tocolysis does not appear necessary. External cephalic version can be recommended as an outpatient treatment without tocolysis.

  10. Efficient conjugate gradient algorithms for computation of the manipulator forward dynamics

    Science.gov (United States)

    Fijany, Amir; Scheid, Robert E.

    1989-01-01

    The applicability of conjugate gradient algorithms for computation of the manipulator forward dynamics is investigated. The redundancies in the previously proposed conjugate gradient algorithm are analyzed. A new version is developed which, by avoiding these redundancies, achieves a significantly greater efficiency. A preconditioned conjugate gradient algorithm is also presented. A diagonal matrix whose elements are the diagonal elements of the inertia matrix is proposed as the preconditioner. In order to increase the computational efficiency, an algorithm is developed which exploits the synergism between the computation of the diagonal elements of the inertia matrix and that required by the conjugate gradient algorithm.
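The preconditioning scheme described above, a diagonal matrix whose elements are the diagonal elements of the inertia matrix, is the classical Jacobi preconditioner. A minimal sketch of preconditioned conjugate gradient with that choice (Python/NumPy here; the paper's own implementation is not shown in the abstract):

```python
import numpy as np

def pcg_jacobi(A, b, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradient with a diagonal (Jacobi)
    preconditioner M = diag(A). A must be symmetric positive definite,
    as an inertia matrix is."""
    M_inv = 1.0 / np.diag(A)          # inverse of the diagonal preconditioner
    x = np.zeros_like(b)
    r = b - A @ x                     # initial residual
    z = M_inv * r                     # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p     # update search direction
        rz = rz_new
    return x
```

Because only the diagonal is inverted, applying the preconditioner costs O(n) per iteration, which is why it pairs well with the synergistic diagonal-element computation the abstract describes.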

  11. A hybrid version of swan for fast and efficient practical wave modelling

    NARCIS (Netherlands)

    M. Genseberger (Menno); J. Donners

    2016-01-01

    In the Netherlands, for coastal and inland water applications, wave modelling with SWAN has become a main ingredient. However, computational times are relatively high. Therefore we investigated the parallel efficiency of the current MPI and OpenMP versions of SWAN. The MPI version is

  12. BehavePlus fire modeling system, version 5.0: Variables

    Science.gov (United States)

    Patricia L. Andrews

    2009-01-01

    This publication has been revised to reflect updates to version 4.0 of the BehavePlus software. It was originally published as The BehavePlus fire modeling system, version 4.0: Variables in July 2008. The BehavePlus fire modeling system is a computer program based on mathematical models that describe wildland fire behavior and effects and the...

  13. TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (DEC RISC ULTRIX VERSION)

    Science.gov (United States)

    TAE SUPPORT OFFICE

    1994-01-01

    workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.

  14. Fuzzy Versions of Epistemic and Deontic Logic

    Science.gov (United States)

    Gounder, Ramasamy S.; Esterline, Albert C.

    1998-01-01

    Epistemic and deontic logics are modal logics, respectively, of knowledge and of the normative concepts of obligation, permission, and prohibition. Epistemic logic is useful in formalizing systems of communicating processes and knowledge and belief in AI (Artificial Intelligence). Deontic logic is useful in computer science wherever we must distinguish between actual and ideal behavior, as in fault tolerance and database integrity constraints. We here discuss fuzzy versions of these logics. In the crisp versions, various axioms correspond to various properties of the structures used in defining the semantics of the logics. Thus, any axiomatic theory will be characterized not only by its axioms but also by the set of properties holding of the corresponding semantic structures. Fuzzy logic does not proceed with axiomatic systems, but fuzzy versions of the semantic properties exist and can be shown to correspond to some of the axioms for the crisp systems in special ways that support dependency networks among assertions in a modal domain. This in turn allows one to implement truth maintenance systems. To our knowledge, we are the first to address fuzzy epistemic and fuzzy deontic logic explicitly and to consider the different systems and semantic properties available. We give the syntax and semantics of epistemic logic and discuss the correspondence between axioms of epistemic logic and properties of semantic structures. The same topics are covered for deontic logic. For the fuzzy versions of both logics, we then discuss the relationship between axioms and semantic properties. Our results can be exploited in truth maintenance systems.
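The correspondence between fuzzy semantic structures and graded modal assertions can be illustrated with a toy fuzzy Kripke model. Everything below (the worlds, the accessibility degrees, and the choice of the Kleene-Dienes implication) is an illustrative assumption, not the authors' formalism:

```python
WORLDS = ["w1", "w2", "w3"]

# Fuzzy accessibility: degree to which w2 is considered possible from w1.
R = {
    ("w1", "w1"): 1.0, ("w1", "w2"): 0.8, ("w1", "w3"): 0.3,
    ("w2", "w1"): 0.5, ("w2", "w2"): 1.0, ("w2", "w3"): 0.0,
    ("w3", "w1"): 0.2, ("w3", "w2"): 0.4, ("w3", "w3"): 1.0,
}

# Fuzzy valuation of an atomic proposition p at each world.
v_p = {"w1": 0.9, "w2": 0.6, "w3": 0.2}

def know(v, w):
    """Degree of the epistemic 'K p' at world w: the infimum over all
    worlds of the Kleene-Dienes implication R(w,w') -> v(w'),
    i.e. max(1 - R(w,w'), v(w'))."""
    return min(max(1.0 - R[(w, w2)], v[w2]) for w2 in WORLDS)

degree = know(v_p, "w1")   # 0.6 for this model
```

Worlds with low accessibility degree contribute weakly to the infimum, which is the fuzzy analogue of restricting the crisp box operator to accessible worlds.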

  15. Measuring Engagement at Work: Validation of the Chinese Version of the Utrecht Work Engagement Scale

    OpenAIRE

    Ng, Sm; Fong, TCt

    2011-01-01

    Background: Work engagement is a positive work-related state of fulfillment characterized by vigor, dedication, and absorption. Previous studies have operationalized the construct through development of the Utrecht Work Engagement Scale. Apart from the original three-factor 17-item version of the instrument (UWES-17), there exists a nine-item shortened revised version (UWES-9). Purpose: The current study explored the psychometric properties of the Chinese version of the Utrecht Work Engagemen...

  16. Measuring Engagement at Work: Validation of the Chinese Version of the Utrecht Work Engagement Scale

    OpenAIRE

    Fong, Ted Chun-tat; Ng, Siu-man

    2011-01-01

    Background: Work engagement is a positive work-related state of fulfillment characterized by vigor, dedication, and absorption. Previous studies have operationalized the construct through development of the Utrecht Work Engagement Scale. Apart from the original three-factor 17-item version of the instrument (UWES-17), there exists a nine-item shortened revised version (UWES-9). Purpose: The current study explored the psychometric properties of the Chinese version of the Utrecht Work Engagement ...

  17. Antepartum transabdominal amnioinfusion to facilitate external cephalic version after initial failure.

    Science.gov (United States)

    Benifla, J L; Goffinet, F; Darai, E; Madelenat, P

    1994-12-01

    Transabdominal amnioinfusion can be used to facilitate external cephalic version. Our technique involves filling the uterine cavity with 700 or 900 mL of 37 °C saline under continuous echographic monitoring. External cephalic version is done the next morning. We have used this procedure in six women, all of whom had previous unsuccessful attempts at external cephalic version. After amnioinfusion, all six patients were converted to cephalic presentation and delivered normally, without obstetric or neonatal complications.

  18. Development of the unified version of COBRA/RELAP5

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, J. J.; Ha, K. S.; Chung, B. D.; Lee, W. J.; Sim, S. K. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-12-31

    The COBRA/RELAP5 code, an integrated version of the COBRA-TF and RELAP5/MOD3 codes, has been developed for realistic simulations of complicated, multi-dimensional, two-phase, thermal-hydraulic system transients in light water reactors. Recently, KAERI developed a unified version of the COBRA/RELAP5 code, which can run in serial mode on both workstations and personal computers. This paper provides a brief overview of the code integration scheme, the recent code modifications, the developmental assessments, and the future development plan. 13 refs., 5 figs., 2 tabs. (Author)

  19. Development of the unified version of COBRA/RELAP5

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, J J; Ha, K S; Chung, B D; Lee, W J; Sim, S K [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    The COBRA/RELAP5 code, an integrated version of the COBRA-TF and RELAP5/MOD3 codes, has been developed for realistic simulations of complicated, multi-dimensional, two-phase, thermal-hydraulic system transients in light water reactors. Recently, KAERI developed a unified version of the COBRA/RELAP5 code, which can run in serial mode on both workstations and personal computers. This paper provides a brief overview of the code integration scheme, the recent code modifications, the developmental assessments, and the future development plan. 13 refs., 5 figs., 2 tabs. (Author)

  20. A new version of Scilab software package for the study of dynamical systems

    Science.gov (United States)

    Bordeianu, C. C.; Felea, D.; Beşliu, C.; Jipa, Al.; Grossu, I. V.

    2009-11-01

    This work presents a new version of a software package for the study of chaotic flows, maps and fractals [1]. The codes were written using Scilab, a software package for numerical computations providing a powerful open computing environment for engineering and scientific applications. It was found that Scilab provides various functions for ordinary differential equation solving, Fast Fourier Transform, autocorrelation, and excellent 2D and 3D graphical capabilities. The chaotic behaviors of the nonlinear dynamical systems were analyzed using phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropy. Various well-known examples are implemented, with the capability of the users inserting their own ODE or iterative equations. New version program summary. Program title: Chaos v2.0. Catalogue identifier: AEAP_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAP_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 1275. No. of bytes in distributed program, including test data, etc.: 7135. Distribution format: tar.gz. Programming language: Scilab 5.1.1. Scilab 5.1.1 should be installed before running the program; information about the installation can be found at http://wiki.scilab.org/howto/install/windows. Computer: PC-compatible running Scilab on MS Windows or Linux. Operating system: Windows XP, Linux. RAM: below 150 Megabytes. Classification: 6.2. Catalogue identifier of previous version: AEAP_v1_0. Journal reference of previous version: Comput. Phys. Comm. 178 (2008) 788. Does the new version supersede the previous version?: Yes. Nature of problem: Any physical model containing linear or nonlinear ordinary differential equations (ODE).
Solution method: Numerical solving of
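One of the analyses the package performs, the Lyapunov exponent, can be sketched compactly. This stand-in uses Python rather than Scilab, and the map, seed, and iteration counts are illustrative choices, not the package's defaults:

```python
import math

def lyapunov_logistic(r, n=100_000, x0=0.3, transient=1000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| = log|r*(1-2x)|."""
    x = x0
    for _ in range(transient):          # discard transient iterations
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        # guard against log(0) if the orbit lands exactly on x = 0.5
        total += math.log(max(abs(r * (1.0 - 2.0 * x)), 1e-300))
    return total / n
```

At r = 4 the exact value is ln 2 ≈ 0.693, a useful sanity check for this kind of chaos-analysis code; a positive estimate indicates chaotic behavior.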

  1. A COMETHE version with transient capability

    International Nuclear Information System (INIS)

    Vliet, J. van; Lebon, G.; Mathieu, P.

    1980-01-01

    A version of the COMETHE code is under development to simulate transient situations. This paper focuses on some aspects of the transient heat transfer models. Initially the coupling between transient heat transfer and other thermomechanical models is discussed. An estimation of the thermal characteristic times shows that the cladding temperatures are often in quasi-steady state. In order to reduce the computing time, calculations are therefore switched from a transient to a quasi-static numerical procedure as soon as such a quasi-equilibrium is detected. The temperature calculation is performed by use of the Lebon-Lambermont restricted variational principle, with piecewise polynomials as trial functions. The method has been checked by comparison with some exact results and yields good agreement for transient as well as for quasi-static situations. This method therefore provides a valuable tool for the simulation of the transient behaviour of nuclear reactor fuel rods. (orig.)

  2. An improved version of the HULLAC code

    Energy Technology Data Exchange (ETDEWEB)

    Busquet, M.; Bar-Shalom, A.; Klapisch, M.; Oreg, J. [ARTEP, a contractor to the Naval Research Lab., Washington, DC (United States)]

    2006-06-15

    Accurate and detailed atomic structure codes are needed for simulation of the spectrally resolved X-ray output of laser-driven targets. As such, the HULLAC code has already been presented several times. First of all, an overhaul was performed, modernizing many parts to make them easier to understand and adding many comments. The source, in Fortran-77, was compiled and checked on many different systems with different compilers. In the new version, we have added the possibility to directly compute the relativistic configuration averages, skipping the fine structure. However, in this case configuration interactions can be accounted for only within each non-relativistic configuration. Therefore we added the possibility of a mixed description, where not all configurations are described at the fine-structure level. Recently, cooperation was proposed to anyone interested in extending or developing the code. HULLAC is now ready to be distributed on a collaborative basis.

  3. An improved version of the HULLAC code

    International Nuclear Information System (INIS)

    Busquet, M.; Bar-Shalom, A.; Klapisch, M.; Oreg, J.

    2006-01-01

    Accurate and detailed atomic structure codes are needed for simulation of the spectrally resolved X-ray output of laser-driven targets. As such, the HULLAC code has already been presented several times. First of all, an overhaul was performed, modernizing many parts to make them easier to understand and adding many comments. The source, in Fortran-77, was compiled and checked on many different systems with different compilers. In the new version, we have added the possibility to directly compute the relativistic configuration averages, skipping the fine structure. However, in this case configuration interactions can be accounted for only within each non-relativistic configuration. Therefore we added the possibility of a mixed description, where not all configurations are described at the fine-structure level. Recently, cooperation was proposed to anyone interested in extending or developing the code. HULLAC is now ready to be distributed on a collaborative basis.

  4. Navier-Stokes computer

    International Nuclear Information System (INIS)

    Hayder, M.E.

    1988-01-01

    A new scientific supercomputer, known as the Navier-Stokes Computer (NSC), has been designed. The NSC is a multi-purpose machine, and for applications in the field of computational fluid dynamics (CFD), this supercomputer is expected to yield a computational speed far exceeding that of present-day supercomputers. This computer has a few very powerful processors (known as nodes) connected by an internodal network. There are three versions of the NSC nodes: micro-, mini- and full-node. The micro-node was developed to prove, to demonstrate and to refine the key architectural features of the NSC. Architectures of the two recent versions of the NSC nodes are presented, with the main focus on the full-node. At a clock speed of 20 MHz, the mini- and the full-node have peak computational speeds of 200 and 640 MFLOPS, respectively. The full-node is the final version for the NSC nodes, and an NSC is expected to have 128 full-nodes. To test the suitability of different algorithms on the NSC architecture, an NSC simulator was developed. Some of the existing computational fluid dynamics codes were placed on this simulator to determine important and relevant issues relating to the efficient use of the NSC architecture

  5. Procedure guideline for thyroid scintigraphy (version 3); Verfahrensanweisung fuer die Schilddruesenszintigraphie (Version 3)

    Energy Technology Data Exchange (ETDEWEB)

    Dietlein, M.; Schicha, H. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Koeln Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin; Dressler, J. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Nuklearmedizinische Klinik der Henriettenstiftung, Hannover (Germany); Eschner, W. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Deutsche Gesellschaft fuer Medizinische Physik (DGMP) (Germany); Koeln Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin; Leisner, B. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Allgemeines Krankenhaus St. Georg, Hamburg (Germany). Abt. fuer Nuklearmedizin; Reiners, C. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Wuerzburg Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin

    2007-07-01

    The version 3 of the procedure guideline for thyroid scintigraphy is an update of the procedure guideline previously published in 2003. The interpretation of the scintigraphy requires knowledge of the patient's history, the palpation of the neck, the laboratory parameters and the sonography. The interpretation of the technetium-99m uptake requires knowledge of the TSH level. As a consequence of the improved alimentary iodine supply, the 99mTc uptake has decreased; 100 000 counts per scintigraphy should be acquired. For this, an imaging time of 10 minutes is generally needed using a high-resolution collimator for thyroid imaging. (orig.)

  6. Zgoubi user's guide. Version 4

    Energy Technology Data Exchange (ETDEWEB)

    Meot, F. [Fermi National Accelerator Lab., Batavia, IL (United States). Dept. of Physics; Valero, S. [CEA, Gif-sur-Yvette (France)

    1997-10-15

    The computer code Zgoubi calculates trajectories of charged particles in magnetic and electric fields. Originally adapted specifically to the definition and adjustment of beam lines and magnetic spectrometers, it has so evolved that it allows the study of systems including complex sequences of optical elements such as dipoles, quadrupoles, arbitrary multipoles and other magnetic or electric devices, and is able as well to handle periodic structures. Compared to other codes, it presents several peculiarities: (1) a numerical method for integrating the Lorentz equation, based on Taylor series, which optimizes computing time and provides high accuracy and strong symplecticity, (2) spin tracking, using the same numerical method as for the Lorentz equation, (3) calculation of the synchrotron radiation electric field and spectra in arbitrary magnetic fields, from the ray-tracing outcomes, (4) the possibility of using a mesh, which allows ray-tracing from simulated or measured (1-D, 2-D or 3-D) field maps, (5) Monte Carlo procedures: unlimited number of trajectories, in-flight decay, etc., (6) a built-in fitting procedure, (7) multiturn tracking in circular accelerators, including many features proper to machine parameter calculation and survey, and also the simulation of time-varying power supplies. The initial version of the code, dedicated to ray-tracing in magnetic fields, was developed by D. Garreta and J.C. Faivre at CEN-Saclay in the early 1970s. It was perfected for the purpose of studying the four spectrometers (SPES I, II, III, IV) at the Laboratoire National Saturne (CEA-Saclay, France), and SPEG at Ganil (Caen, France). It is now in use in several national and foreign laboratories. This manual is intended only to describe the details of the most recent version of Zgoubi, which is far from being a "finished product".
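Peculiarity (1), truncated Taylor-series integration of the Lorentz equation, can be illustrated for the special case of a uniform magnetic field, where each successive derivative of the particle's direction vector is one more cross product with the field. This is only a sketch of the idea; Zgoubi itself expands to higher order and incorporates field derivatives for non-uniform fields:

```python
import numpy as np

def taylor_step(u, b, ds, order=4):
    """One truncated-Taylor step for du/ds = u x b, where u is the unit
    direction of a charged particle and b = q*B/p for a uniform field B.
    For uniform b, the k-th derivative of u is the k-fold cross product
    with b, so the Taylor series can be summed term by term."""
    term = u.copy()
    result = u.copy()
    fact = 1.0
    for k in range(1, order + 1):
        term = np.cross(term, b)                  # next derivative of u
        fact *= k
        result = result + (ds ** k / fact) * term  # add Taylor term
    return result / np.linalg.norm(result)         # keep |u| = 1
```

For b = (0, 0, 1) the exact solution is a rotation, u(s) = (cos s, -sin s, 0) from u(0) = (1, 0, 0), so the step can be validated against a known trajectory.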

  7. Development and Evaluation of a Chinese Version of the Questionnaire on Teacher Interaction (QTI)

    Science.gov (United States)

    Sun, Xiaojing; Mainhard, Tim; Wubbels, Theo

    2018-01-01

    Teacher-student interpersonal relationships play an important role in education. The Questionnaire on Teacher Interaction (QTI) was designed to measure students' interpersonal perceptions of their teachers. There are two Chinese versions of the QTI for student use, which inherited the weaknesses of the previous English versions, such as items…

  8. An update to the Surface Ocean CO2 Atlas (SOCAT version 2)

    NARCIS (Netherlands)

    Bakker, D.C.E.; Pfeil, B.; Smith, K.; Hankin, S.; Olsen, A.; Alin, S. R.; Cosca, C.; Harasawa, S.; Kozyr, A.; Nojiri, Y.; O'Brien, K. M.; Schuster, U.; Telszewski, M.; Tilbrook, B.; Wada, C.; Akl, J.; Barbero, L.; Bates, N. R.; Boutin, J.; Bozec, Y.; Cai, W. -J.; Castle, R. D.; Chavez, F. P.; Chen, L.; Chierici, M.; Currie, K.; de Baar, H. J. W.; Evans, W.; Feely, R. A.; Fransson, A.; Gao, Z.; Hales, B.; Hardman-Mountford, N. J.; Hoppema, M.; Huang, W. -J.; Hunt, C. W.; Huss, B.; Ichikawa, T.; Johannessen, T.; Jones, E. M.; Jones, S. D.; Jutterstrom, S.; Kitidis, V.; Koertzinger, A.; Landschuetzer, P.; Lauvset, S. K.; Lefevre, N.; Manke, A. B.; Mathis, J. T.; Merlivat, L.; Metzl, N.; Murata, A.; Newberger, T.; Omar, A. M.; Ono, T.; Park, G. -H.; Paterson, K.; Pierrot, D.; Rios, A. F.; Sabine, C. L.; Saito, S.; Salisbury, J.; Sarma, V. V. S. S.; Schlitzer, R.; Sieger, R.; Skjelvan, I.; Steinhoff, T.; Sullivan, K. F.; Sun, H.; Sutton, A. J.; Suzuki, T.; Sweeney, C.; Takahashi, T.; Tjiputra, J.; Tsurushima, N.; van Heuven, S. M. A. C.; Vandemark, D.; Vlahos, P.; Wallace, D. W. R.; Wanninkhof, R.; Watson, A.J.

    2014-01-01

    The Surface Ocean CO2 Atlas (SOCAT), an activity of the international marine carbon research community, provides access to synthesis and gridded fCO2 (fugacity of carbon dioxide) products for the surface oceans. Version 2 of SOCAT is an update of the previous release (version 1) with more data

  9. Ariadne version 4 - a program for simulation of QCD cascades implementing the colour dipole model

    International Nuclear Information System (INIS)

    Loennblad, L.

    1992-01-01

    The fourth version of the Ariadne program for generating QCD cascades in the colour dipole approximation is presented. The underlying physics issues are discussed and a manual for using the program is given together with a few sample programs. The major changes from previous versions are the introduction of photon radiation from quarks and inclusion of interfaces to the LEPTO and PYTHIA programs. (orig.)

  10. SAGE Version 7.0 Algorithm: Application to SAGE II

    Science.gov (United States)

    Damadeo, R. P; Zawodny, J. M.; Thomason, L. W.; Iyer, N.

    2013-01-01

    This paper details the Stratospheric Aerosol and Gas Experiment (SAGE) version 7.0 algorithm and how it is applied to SAGE II. Changes made between the previous (v6.2) and current (v7.0) versions are described and their impacts on the data products explained for both coincident event comparisons and time-series analysis. Users of the data will notice a general improvement in all of the SAGE II data products, which are now in better agreement with more modern data sets (e.g. SAGE III) and more robust for use with trend studies.

  11. Upon Further Review: V. An Examination of Previous Lightcurve Analysis from the Palmer Divide Observatory

    Science.gov (United States)

    Warner, Brian D.

    2011-01-01

    Updated results are given for nine asteroids previously reported from the Palmer Divide Observatory (PDO). The original images were re-measured to obtain new data sets using the latest version of MPO Canopus photometry software, analysis tools, and revised techniques for linking multiple observing runs covering several days to several weeks. Results that were previously not reported or were moderately different were found for 1659 Punkajarju, 1719 Jens, 1987 Kaplan, 2105 Gudy, 2961 Katsurahama, 3285 Ruth Wolfe, 3447 Burckhalter, 7816 Hanoi, and (34817) 2000 SE116. This is one in a series of papers that will examine results obtained during the initial years of the asteroid lightcurve program at PDO.

  12. GEMPAK 5.1 - A GENERAL METEOROLOGICAL PACKAGE (UNIX VERSION)

    Science.gov (United States)

    Desjardins, M. L.

    1994-01-01

    GEMPAK is a general meteorological software package developed at NASA/Goddard Space Flight Center. It includes programs to analyze and display surface, upper-air, and gridded data, including model output. There are very general programs to list, edit, and plot data on maps, to display profiles and time series, to draw and fill contours, to draw streamlines, to plot symbols for clouds, sky cover, and pressure tendency, and draw cross sections in the case of gridded data and sounding data. In addition, there are Barnes objective analysis programs to grid surface and upper-air data. The programs include the capabilities to derive meteorological parameters from those found in the dataset, to perform vertical interpolations of sounding data to different coordinate systems, and to compute an extensive set of gridded diagnostic quantities by specifying various nested combinations of scalars and vector arithmetic, algebraic, and differential operators. The GEMPAK 5.1 graphics/transformation subsystem, GEMPLT, provides device-independent graphics. GEMPLT also has the capability to display output in a variety of map projections or overlaid on satellite imagery. GEMPAK 5.1 is written in FORTRAN 77 and C and has been implemented on VAX computers under VMS and on computers running the UNIX operating system. During installation and normal use, this package occupies approximately 100 MB of hard disk space. The UNIX version of GEMPAK includes drivers for several graphic output systems including MIT's X Window System (X11R4), Sun GKS, PostScript (color and monochrome), Silicon Graphics, and others. The VMS version of GEMPAK also includes drivers for several graphic output systems including PostScript (color and monochrome). The VMS version is delivered with the object code for the Transportable Applications Environment (TAE) program, version 4.1, which serves as a user interface. A color monitor is recommended for displaying maps on video display devices. 
Data for rendering
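The Barnes objective analysis mentioned above assigns each grid point a Gaussian-weighted mean of the observations. A single-pass sketch follows; GEMPAK's actual scheme applies successive correction passes, and the function name and parameters here are illustrative:

```python
import numpy as np

def barnes_analysis(obs_xy, obs_val, grid_xy, kappa):
    """Single-pass Barnes objective analysis: each grid value is the
    weighted mean of all observations, with Gaussian weights
    w = exp(-r^2 / kappa), where r is the obs-to-gridpoint distance
    and kappa controls the smoothing length scale."""
    obs_xy = np.asarray(obs_xy, dtype=float)
    obs_val = np.asarray(obs_val, dtype=float)
    grid_xy = np.asarray(grid_xy, dtype=float)
    grid = np.empty(len(grid_xy))
    for i, g in enumerate(grid_xy):
        r2 = np.sum((obs_xy - g) ** 2, axis=1)   # squared distances
        w = np.exp(-r2 / kappa)                   # Gaussian weights
        grid[i] = np.sum(w * obs_val) / np.sum(w)
    return grid
```

Successive-correction variants re-run this pass on the residuals with a reduced kappa, sharpening small-scale detail while keeping the first pass's smooth background field.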

  13. Procedure guideline for radioiodine test (version 3)

    International Nuclear Information System (INIS)

    Dietlein, M.; Schicha, H.; Eschner, W.; Deutsche Gesellschaft fuer Medizinische Physik; Koeln Univ.; Lassmann, M.; Deutsche Gesellschaft fuer Medizinische Physik; Wuerzburg Univ.; Leisner, B.; Allgemeines Krankenhaus St. Georg, Hamburg; Reiners, C.; Wuerzburg Univ.

    2007-01-01

The version 3 of the procedure guideline for radioiodine testing is an update of the guideline previously published in 2003. The procedure guideline discusses the pros and cons of a single measurement or of repeated measurements of the iodine-131 uptake and their optimal timing. Different formulas are described for the cases in which one, two or three values of the radioiodine kinetics are available. Iodine-131 uptake is measured with a sodium iodide probe or, alternatively or additionally, with a gamma camera using the ROI technique. A possible source of error is an inappropriate measurement (sonography) of the target volume. The patients' preparation includes the withdrawal of antithyroid drugs 2-3 days before radioiodine administration. The patient has to avoid iodine-containing medication, and the possibility of iodine additives in vitamin and electrolyte supplementation has to be considered. (orig.)
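The kinetics the guideline refers to can be illustrated with a simple mono-exponential model: given uptake values at two times, the effective half-life follows from the fitted decay constant. A hedged sketch (function names and the numbers are illustrative, not taken from the guideline's formulas):

```python
import math

def uptake(counts_thyroid, counts_standard):
    """Iodine-131 uptake as the ratio of background-corrected thyroid
    counts to the counts of a standard representing the administered
    activity (decay-corrected to the same time)."""
    return counts_thyroid / counts_standard

def effective_half_life(u1, t1, u2, t2):
    """Effective half-life (days) fitted to two uptake values measured
    at times t1 < t2 (days), assuming mono-exponential clearance."""
    lam = math.log(u1 / u2) / (t2 - t1)
    return math.log(2.0) / lam

# Hypothetical example: uptake falls from 0.40 at day 1 to 0.20 at day 8.
print(round(effective_half_life(0.40, 1.0, 0.20, 8.0), 2))  # → 7.0
```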

  14. System software for the NMFECC CRAY-1 version of GIFTS 4B

    International Nuclear Information System (INIS)

    Gray, W.H.; Baudry, T.V.

    1981-01-01

The Oak Ridge National Laboratory (ORNL) maintains a version of the GIFTS system structural analysis computer programs. Executable modules are supported on two different types of computer hardware, a DECsystem-10 and a CRAY-1. With no externally visible difference to the user, these modules execute equivalently on both types of hardware. Presented herein are the local software enhancements for the ORNL version of GIFTS for the National Magnetic Fusion Energy Computer Center (NMFECC) CRAY-1 computer, as well as a description of the ORNL implementation of the system-dependent portions of the GIFTS software library for the NMFECC CRAY-1.

  15. Perceived parental rejection mediates the effects of previous ...

    African Journals Online (AJOL)

    Behavioural problems, parental rejection scores and child abuse ... evaluated by the Child Behavior Checklist (parental version), the Memories of Parental Rearing ... However, mental illness had no moderating effect on these relationships.

  16. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V&V) manual. Volume 9

    International Nuclear Information System (INIS)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M.

    1995-03-01

A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models And Results Database (MAR-D), and Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE; a previous effort was the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan, with revisions to incorporate lessons learned from the previous effort. Likewise, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0, with revisions to cover the new SAPHIRE 5.0 features as well as to incorporate lessons learned. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified. Modifications made to SAPHIRE are identified.
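The core quantification step in fault-tree tools of this kind combines minimal cut sets into a top-event probability. A minimal sketch of the standard minimal-cut-set upper bound (the event names and probabilities are hypothetical, not from SAPHIRE):

```python
from functools import reduce

def cut_set_prob(cut_set, q):
    """Probability of one minimal cut set: the product of its basic-event
    probabilities (events assumed independent)."""
    return reduce(lambda a, b: a * b, (q[e] for e in cut_set), 1.0)

def top_event_prob(cut_sets, q):
    """Minimal-cut-set upper bound on the top-event probability:
    1 - prod(1 - P(cs)) over all minimal cut sets."""
    p = 1.0
    for cs in cut_sets:
        p *= 1.0 - cut_set_prob(cs, q)
    return 1.0 - p

q = {"pump_fails": 1e-3, "valve_fails": 2e-3, "power_loss": 5e-4}
cut_sets = [{"pump_fails", "valve_fails"}, {"power_loss"}]
print(top_event_prob(cut_sets, q))  # ~5.02e-4
```

The bound is tight when cut-set probabilities are small, which is the usual regime in PRA work; the rare-event approximation (a plain sum) gives nearly the same number here.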

  17. In-vessel source term analysis code TRACER version 2.3. User's manual

    International Nuclear Information System (INIS)

    Toyohara, Daisuke; Ohno, Shuji; Hamada, Hirotsugu; Miyahara, Shinya

    2005-01-01

A computer code TRACER (Transport Phenomena of Radionuclides for Accident Consequence Evaluation of Reactor) version 2.3 has been developed to evaluate the species and quantities of fission products (FPs) released into the cover gas during a fuel pin failure accident in an LMFBR. The TRACER version 2.3 includes the new or modified models shown below. a) Booth model: a new model for FP release from fuel. b) Modified model for FP transfer from fuel to bubbles or sodium coolant. c) Modified model for bubble dynamics in the coolant. Computational models, input data and output data of the TRACER version 2.3 are described in this user's manual. (author)

  18. High Performance Computing Multicast

    Science.gov (United States)

    2012-02-01

[Front-matter fragment; only a citation and part of an acronym glossary survive:] "A History of the Virtual Synchrony Replication Model," in Replication: Theory and Practice, Charron-Bost, B., Pedone, F., and Schiper, A. (Eds.); acronyms include HPC (High Performance Computing), IP/IPv4 (Internet Protocol, version 4.0), IPMC (Internet Protocol Multicast), LAN (Local Area Network), MCMD (Dr. Multicast), and MPI.

  19. Fourier coefficients computation in two variables, a distributional version

    Directory of Open Access Journals (Sweden)

    Carlos Manuel Ulate R.

    2015-01-01

The present article, by considering the distributional summations of Euler-Maclaurin and a suitable choice of the distribution, obtains representations for the Fourier coefficients in two variables. These representations may be used for the numerical evaluation of the coefficients.
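Independently of the distributional machinery, two-variable Fourier coefficients can be approximated numerically by a discrete sum over an equispaced grid, which is exact for trigonometric polynomials of low enough degree. A self-contained sketch (the function and parameter names are illustrative):

```python
import cmath
import math

def fourier_coeff_2d(f, m, n, N=64):
    """Approximate the (m, n) Fourier coefficient of a 2*pi-periodic
    function f(x, y) by a Riemann sum on an N x N grid:
    c_mn = (1/4pi^2) * integral of f(x,y) * exp(-i(mx+ny)) dx dy."""
    h = 2 * math.pi / N
    s = 0.0 + 0.0j
    for j in range(N):
        for k in range(N):
            x, y = j * h, k * h
            s += f(x, y) * cmath.exp(-1j * (m * x + n * y))
    return s * h * h / (2 * math.pi) ** 2

# f(x, y) = cos(x + 2y) has coefficient 1/2 at (m, n) = (1, 2).
c = fourier_coeff_2d(lambda x, y: math.cos(x + 2 * y), 1, 2)
print(round(c.real, 6))  # → 0.5
```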

  20. Fourier coefficients computation in two variables, a distributional version

    OpenAIRE

    Carlos Manuel Ulate R.

    2015-01-01

The present article, by considering the distributional summations of Euler-Maclaurin and a suitable choice of the distribution, obtains representations for the Fourier coefficients in two variables. These representations may be used for the numerical evaluation of the coefficients.

  1. Qualification of the new version of HAMMER computer code

    International Nuclear Information System (INIS)

    Chia, C.T.

    1984-06-01

(HTEC) code were tested with a great number of different types of experiments. These experiments cover the most important parameters in neutronic calculations, such as cell geometry and composition. The HTEC code results have been analysed and compared with experimental data and with results given in the literature and simulated by the HAMMER and LEOPARD codes. The quantities used for the analysis were Keff and the following integral parameters: R28 - ratio of epicadmium-to-subcadmium 238 U captures; D25 - ratio of epicadmium-to-subcadmium 235 U fissions; D28 - ratio of 238 U fissions to 235 U fissions; C - ratio of 238 U captures to 235 U fissions; RC02 - ratio of epicadmium-to-subcadmium 232 Th captures. The analysis shows that the results given by the code are in good agreement with the experimental data and with the results given by the other codes. The calculations performed with the detailed resonance profile tabulations of the plutonium isotopes show worse results than those obtained with the resonance parameters. In almost all the simulated cases, the HTEC results are closer to the experimental data than the HAMMER results when the detailed resonance profile tabulations of the plutonium isotopes are not used. (Author) [pt

  2. Transfer and development of the PC version of ABAQUS program

    International Nuclear Information System (INIS)

    Li Xiaofeng; Zhu Yuqiao

    1998-01-01

The transfer and development of the PC version of ABAQUS, a large nonlinear mechanical finite element analysis program, are carried out. Some special problems, such as the difference in floating-point data formats between computers and unexpected computer halts during data transfer, are solved, and a visualized I/O capability is added in the redevelopment. By utilizing this visual capability, the workload of the analysis is reduced and the correctness of the analysis is ensured. PC ABAQUS was tested against the standard examples from the VAX version of ABAQUS, and the calculation results are correct. The results of stress and deformation calculations for the CEFR shell structure with the PC ABAQUS and ADINA codes agree very well.
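The floating-point format problem mentioned above arises whenever binary results files move between machines. Byte order is the easy half of it and can be handled mechanically; VAX F/D-floats additionally differ from IEEE 754 in exponent bias and layout, which this sketch does not cover. An illustrative byte-order conversion (the record layout is hypothetical):

```python
import struct

def rewrite_float_record(raw, src_order="<", dst_order=">"):
    """Re-encode a record of IEEE-754 doubles from one byte order to the
    other, as needed when moving binary data between machines."""
    n = len(raw) // 8
    values = struct.unpack(f"{src_order}{n}d", raw)
    return struct.pack(f"{dst_order}{n}d", *values)

# Little-endian record written on one machine, re-read big-endian.
rec = struct.pack("<3d", 1.0, -2.5, 3.25)
big = rewrite_float_record(rec)
print(struct.unpack(">3d", big))  # → (1.0, -2.5, 3.25)
```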

  3. Development of the Brazilian version of the Child Hayling Test

    Directory of Open Access Journals (Sweden)

    Larissa de Souza Siqueira

Abstract Introduction: The Hayling Test assesses the components of initiation, inhibition, cognitive flexibility and verbal speed by means of a sentence completion task. This study presents the process of developing the Brazilian version of the Child Hayling Test (CHT) and reports evidence of its content validity. Methods: 139 people took part in the study. The adaptation was performed by seven translators and 12 specialist judges. An initial sample of 92 healthy children was recruited to test a selection of sentences adapted from previous adult and pediatric versions of the instrument, and a sample of 28 healthy children was recruited for pilot testing of the final version. The instrument was developed in seven stages: (1) translation; (2) back-translation; (3) comparison of translated versions; (4) preparation of new stimuli; (5) data collection with healthy children to analyze comprehension of the stimuli, with analyses by the authors against the psycholinguistic criteria adopted; (6) analyses conducted by judges who are specialists in neuropsychology or linguistics; and (7) the pilot study. Results: Twenty-four of the 72 sentences constructed were selected on the basis of 70-100% agreement between the judges evaluating what they assessed and their level of comprehensibility. The pilot study revealed better performance by older children, providing evidence of the instrument's sensitivity to developmental factors. Conclusions: Future studies employing this version of the CHT with clinical pediatric populations who have frontal lesions and dysfunctions, and in related areas, are needed to test functional and differential diagnoses of preserved or impaired executive functions.

  4. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  5. National Radiobiology Archives Distributed Access User's Manual, Version 1.1. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.K.; Prather, J.C.; Ligotke, E.K.; Watson, C.R.

    1992-06-01

    This supplement to the NRA Distributed Access User's manual (PNL-7877), November 1991, describes installation and use of Version 1.1 of the software package; this is not a replacement of the previous manual. Version 1.1 of the NRA Distributed Access Package is a maintenance release. It eliminates several bugs, and includes a few new features which are described in this manual. Although the appearance of some menu screens has changed, we are confident that the Version 1.0 User's Manual will provide an adequate introduction to the system. Users who are unfamiliar with Version 1.0 may wish to experiment with that version before moving on to Version 1.1.

  6. National Radiobiology Archives Distributed Access User's Manual, Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.K.; Prather, J.C.; Ligotke, E.K.; Watson, C.R.

    1992-06-01

    This supplement to the NRA Distributed Access User's manual (PNL-7877), November 1991, describes installation and use of Version 1.1 of the software package; this is not a replacement of the previous manual. Version 1.1 of the NRA Distributed Access Package is a maintenance release. It eliminates several bugs, and includes a few new features which are described in this manual. Although the appearance of some menu screens has changed, we are confident that the Version 1.0 User's Manual will provide an adequate introduction to the system. Users who are unfamiliar with Version 1.0 may wish to experiment with that version before moving on to Version 1.1.

  7. Fiscal impacts model documentation. Version 1.0

    International Nuclear Information System (INIS)

    Beck, S.L.; Scott, M.J.

    1986-05-01

    The Fiscal Impacts (FI) Model, Version 1.0 was developed under Pacific Northwest Laboratory's Monitored Retrievable Storage (MRS) Program to aid in development of the MRS Reference Site Environmental Document (PNL 5476). It computes estimates of 182 fiscal items for state and local government jurisdictions, using input data from the US Census Bureau's 1981 Survey of Governments and local population forecasts. The model can be adapted for any county or group of counties in the United States
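The FI Model's core operation of estimating fiscal items from census baselines and population forecasts can be caricatured with per-capita scaling. A hedged stand-in (the item names, the scaling rule, and the numbers are hypothetical; the real model covers 182 items with more structure than this):

```python
def fiscal_estimates(base_revenue, base_population, forecast_population):
    """Scale a jurisdiction's baseline fiscal items per capita by a
    population forecast -- a simple stand-in for the FI Model's approach."""
    per_capita = {k: v / base_population for k, v in base_revenue.items()}
    return {k: v * forecast_population for k, v in per_capita.items()}

base = {"property_tax": 2_000_000.0, "sales_tax": 1_200_000.0}
print(fiscal_estimates(base, 10_000, 12_000))
# → {'property_tax': 2400000.0, 'sales_tax': 1440000.0}
```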

  8. The NJOY Nuclear Data Processing System, Version 2016

    Energy Technology Data Exchange (ETDEWEB)

    Macfarlane, Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Muir, Douglas W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Boicourt, R. M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kahler, III, Albert Comstock [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-09

    The NJOY Nuclear Data Processing System, version 2016, is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from evaluated nuclear data in the ENDF-4 through ENDF-6 legacy card-image formats. NJOY works with evaluated files for incident neutrons, photons, and charged particles, producing libraries for a wide variety of particle transport and reactor analysis codes.
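The step from pointwise to multigroup cross sections that NJOY performs (in its GROUPR module) is a flux-weighted average over each energy group. A minimal sketch with invented numbers (real processing integrates on a dense union grid with a chosen weighting spectrum rather than summing a handful of points):

```python
def collapse_to_multigroup(energies, sigma, flux, group_bounds):
    """Flux-weighted collapse of a pointwise cross section to group
    constants: sigma_g = sum(sigma*phi) / sum(phi) over points in group g."""
    groups = []
    for lo, hi in group_bounds:
        num = den = 0.0
        for e, s, p in zip(energies, sigma, flux):
            if lo <= e < hi:
                num += s * p
                den += p
        groups.append(num / den if den else 0.0)
    return groups

energies = [0.1, 0.5, 2.0, 8.0]   # eV, illustrative points
sigma    = [10.0, 6.0, 3.0, 1.0]  # barns
flux     = [1.0, 1.0, 2.0, 2.0]   # weighting spectrum
print(collapse_to_multigroup(energies, sigma, flux, [(0.0, 1.0), (1.0, 10.0)]))
# → [8.0, 2.0]
```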

  9. AERO2S - SUBSONIC AERODYNAMIC ANALYSIS OF WINGS WITH LEADING- AND TRAILING-EDGE FLAPS IN COMBINATION WITH CANARD OR HORIZONTAL TAIL SURFACES (IBM PC VERSION)

    Science.gov (United States)

    Carlson, H. W.

    1994-01-01

    necessary only to add an identification record and the namelist data that are to be changed from the previous run. This code was originally developed in 1989 in FORTRAN V on a CDC 6000 computer system, and was later ported to an MS-DOS environment. Both versions are available from COSMIC. There are only a few differences between the PC version (LAR-14458) and CDC version (LAR-14178) of AERO2S distributed by COSMIC. The CDC version has one main source code file while the PC version has two files which are easier to edit and compile on a PC. The PC version does not require a FORTRAN compiler which supports NAMELIST because a special INPUT subroutine has been added. The CDC version includes two MODIFY decks which can be used to improve the code and prevent the possibility of some infrequently occurring errors while PC-version users will have to make these code changes manually. The PC version includes an executable which was generated with the Ryan McFarland/FORTRAN compiler and requires 253K RAM and an 80x87 math co-processor. Using this executable, the sample case requires about four hours to execute on an 8MHz AT-class microcomputer with a co-processor. The source code conforms to the FORTRAN 77 standard except that it uses variables longer than six characters. With two minor modifications, the PC version should be portable to any computer with a FORTRAN compiler and sufficient memory. The CDC version of AERO2S is available in CDC NOS Internal format on a 9-track 1600 BPI magnetic tape. The PC version is available on a set of two 5.25 inch 360K MS-DOS format diskettes. IBM AT is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. CDC is a registered trademark of Control Data Corporation. NOS is a trademark of Control Data Corporation.

  10. AERO2S - SUBSONIC AERODYNAMIC ANALYSIS OF WINGS WITH LEADING- AND TRAILING-EDGE FLAPS IN COMBINATION WITH CANARD OR HORIZONTAL TAIL SURFACES (CDC VERSION)

    Science.gov (United States)

    Darden, C. M.

    1994-01-01

    necessary only to add an identification record and the namelist data that are to be changed from the previous run. This code was originally developed in 1989 in FORTRAN V on a CDC 6000 computer system, and was later ported to an MS-DOS environment. Both versions are available from COSMIC. There are only a few differences between the PC version (LAR-14458) and CDC version (LAR-14178) of AERO2S distributed by COSMIC. The CDC version has one main source code file while the PC version has two files which are easier to edit and compile on a PC. The PC version does not require a FORTRAN compiler which supports NAMELIST because a special INPUT subroutine has been added. The CDC version includes two MODIFY decks which can be used to improve the code and prevent the possibility of some infrequently occurring errors while PC-version users will have to make these code changes manually. The PC version includes an executable which was generated with the Ryan McFarland/FORTRAN compiler and requires 253K RAM and an 80x87 math co-processor. Using this executable, the sample case requires about four hours to execute on an 8MHz AT-class microcomputer with a co-processor. The source code conforms to the FORTRAN 77 standard except that it uses variables longer than six characters. With two minor modifications, the PC version should be portable to any computer with a FORTRAN compiler and sufficient memory. The CDC version of AERO2S is available in CDC NOS Internal format on a 9-track 1600 BPI magnetic tape. The PC version is available on a set of two 5.25 inch 360K MS-DOS format diskettes. IBM AT is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. CDC is a registered trademark of Control Data Corporation. NOS is a trademark of Control Data Corporation.

  11. Inclusion in the Workplace - Text Version | NREL

    Science.gov (United States)

    Careers » Inclusion in the Workplace - Text Version Inclusion in the Workplace - Text Version This is the text version for the Inclusion: Leading by Example video. I'm Martin Keller. I'm the NREL of the laboratory. Another very important element in inclusion is diversity. Because if we have a

  12. A constructive version of AIP revisited

    NARCIS (Netherlands)

    Barros, A.; Hou, T.

    2008-01-01

    In this paper, we review a constructive version of the Approximation Induction Principle. This version states that bisimilarity of regular processes can be decided by observing only a part of their behaviour. We use this constructive version to formulate a complete inference system for the Algebra

  13. Embrittlement data base, version 1

    International Nuclear Information System (INIS)

    Wang, J.A.

    1997-08-01

The aging and degradation of light-water-reactor (LWR) pressure vessels is of particular concern because of their relevance to plant integrity and the magnitude of the expected irradiation embrittlement. The radiation embrittlement of reactor pressure vessel (RPV) materials depends on many different factors such as flux, fluence, fluence spectrum, irradiation temperature, and preirradiation material history and chemical compositions. These factors must be considered to reliably predict pressure vessel embrittlement and to ensure the safe operation of the reactor. Based on embrittlement predictions, decisions must be made concerning operating parameters and issues such as low-leakage-fuel management, possible life extension, and the need for annealing the pressure vessel. Large amounts of data from surveillance capsules and test reactor experiments, comprising many different materials and different irradiation conditions, are needed to develop generally applicable damage prediction models that can be used for industry standards and regulatory guides. Version 1 of the Embrittlement Data Base (EDB) is such a comprehensive collection of data, resulting from merging version 2 of the Power Reactor Embrittlement Data Base (PR-EDB). Fracture toughness data were also integrated into Version 1 of the EDB. For power reactor data, the current EDB lists 1,029 Charpy transition-temperature shift data points, which include 321 from plates, 125 from forgings, 115 from correlation monitor materials, 246 from welds, and 222 from heat-affected-zone (HAZ) materials that were irradiated in 271 capsules from 101 commercial power reactors. For test reactor data, information is available for 1,308 different irradiated sets (352 from plates, 186 from forgings, 303 from correlation monitor materials, 396 from welds and 71 from HAZs) and 268 different irradiated plus annealed data sets
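Damage prediction models of the kind this database supports typically express the Charpy transition-temperature shift as a chemistry factor times a fluence function. A sketch in the form used by U.S. Regulatory Guide 1.99 Rev. 2 (the numeric inputs here are illustrative, and real use looks the chemistry factor up from copper/nickel content tables):

```python
import math

def transition_temp_shift(chemistry_factor, fluence):
    """Charpy transition-temperature shift (deg F) in the RG 1.99 Rev. 2
    form: CF * f**(0.28 - 0.10*log10(f)), with f the fast fluence in
    units of 1e19 n/cm^2."""
    f = fluence / 1e19
    return chemistry_factor * f ** (0.28 - 0.10 * math.log10(f))

# At f = 1e19 n/cm^2 the fluence factor is exactly 1, so shift == CF.
print(transition_temp_shift(100.0, 1e19))  # → 100.0
```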

  14. PAV ontology: provenance, authoring and versioning.

    Science.gov (United States)

    Ciccarese, Paolo; Soiland-Reyes, Stian; Belhajjame, Khalid; Gray, Alasdair Jg; Goble, Carole; Clark, Tim

    2013-11-22

    Provenance is a critical ingredient for establishing trust of published scientific content. This is true whether we are considering a data set, a computational workflow, a peer-reviewed publication or a simple scientific claim with supportive evidence. Existing vocabularies such as Dublin Core Terms (DC Terms) and the W3C Provenance Ontology (PROV-O) are domain-independent and general-purpose, and they allow and encourage extensions to cover more specific needs. In particular, to track authoring and versioning information of web resources, PROV-O provides a basic methodology but not any specific classes and properties for identifying or distinguishing between the various roles assumed by agents manipulating digital artifacts, such as author, contributor and curator. We present the Provenance, Authoring and Versioning ontology (PAV, namespace http://purl.org/pav/): a lightweight ontology for capturing "just enough" descriptions essential for tracking the provenance, authoring and versioning of web resources. We argue that such descriptions are essential for digital scientific content. PAV distinguishes between contributors, authors and curators of content and creators of representations, in addition to the provenance of originating resources that have been accessed, transformed and consumed. We explore five projects (and communities) that have adopted PAV, illustrating their usage through concrete examples. Moreover, we present mappings that show how PAV extends the W3C PROV-O ontology to support broader interoperability. The initial design of the PAV ontology was driven by requirements from the AlzSWAN project, with further requirements incorporated later from other projects detailed in this paper. The authors strove to keep PAV lightweight and compact by including only those terms that have been demonstrated to be pragmatically useful in existing applications, and by recommending terms from existing ontologies when plausible. We analyze and compare PAV with related
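PAV terms are plain RDF properties, so a minimal provenance record can be written as triples without any ontology tooling. A sketch that prints Turtle-like lines (the property URIs are from the PAV namespace; the dataset and agent URIs are hypothetical):

```python
# Minimal PAV-style provenance record as subject-predicate-object triples.
PAV = "http://purl.org/pav/"
DATASET = "http://example.org/dataset/2"

triples = [
    (DATASET, PAV + "authoredBy",      "http://example.org/agent/alice"),
    (DATASET, PAV + "curatedBy",       "http://example.org/agent/bob"),
    (DATASET, PAV + "previousVersion", "http://example.org/dataset/1"),
    (DATASET, PAV + "derivedFrom",     "http://example.org/source/raw"),
]

for s, p, o in triples:
    print(f"<{s}> <{p}> <{o}> .")
```

In real use these would be serialized with an RDF library and combined with DC Terms or PROV-O statements; the point is only that PAV distinguishes roles (author vs. curator) and version links at the property level.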

  15. Nuclear data library table (Version November 1998)

    International Nuclear Information System (INIS)

    Baard, J.H.

    1998-11-01

This report presents the edition of the Nuclear Data Library Table valid from 1998-11-01. This library contains data for the conversion of activity values to fluence rate and fluence values. The revised table is a modified version of the older library coded 1990-12-12. The older library has been extended with 23 reactions; the special 'background' reaction has been deleted. A table has been incorporated in this report which indicates the changes in this revised library in comparison to previously used data. The data are presented as obtained as output from the program SAPNDLT. A table with half-lives of product nuclides is presented; in Appendix 2 these values have been calculated using the decay constants from this library. Surveys of thermal and fast cross sections are given for the various reactions in Appendices 3 and 4, respectively. A table with activities per mg mass for a fluence rate of 10^18 m^-2 s^-1 is presented in Appendix 5 for various irradiation intervals. Appendix 6 gives the Kerma rate value for the various reactions. 8 refs
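The activity-to-fluence conversion such a library supports rests on the activation equation A = N·σ·φ·(1 − e^(−λt)). A hedged sketch with invented inputs (the cross section, half-life and molar mass below are illustrative, not values from the library; φ is a fluence rate in m^-2 s^-1 as in the table):

```python
import math

AVOGADRO = 6.02214076e23
BARN = 1e-28  # m^2

def activity_per_mg(sigma_barn, phi, half_life_s, t_irr_s, molar_mass):
    """Activity (Bq) produced in 1 mg of a pure target nuclide after
    irradiation time t_irr: A = N * sigma * phi * (1 - exp(-lambda*t))."""
    n_atoms = 1e-3 / molar_mass * AVOGADRO          # atoms in 1 mg
    lam = math.log(2.0) / half_life_s
    return n_atoms * sigma_barn * BARN * phi * (1.0 - math.exp(-lam * t_irr_s))

# Illustrative: 100-barn capture, T1/2 = 2.7 d, phi = 1e18 m^-2 s^-1.
half_life = 2.7 * 86400
a_one_hl = activity_per_mg(100.0, 1e18, half_life, half_life, 197.0)
a_sat = activity_per_mg(100.0, 1e18, half_life, 50 * half_life, 197.0)
print(a_one_hl / a_sat)  # ~0.5: one half-life reaches half of saturation
```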

  16. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry also has application in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  17. Strong versions of Bell's theorem

    International Nuclear Information System (INIS)

    Stapp, H.P.

    1994-01-01

    Technical aspects of a recently constructed strong version of Bell's theorem are discussed. The theorem assumes neither hidden variables nor factorization, and neither determinism nor counterfactual definiteness. It deals directly with logical connections. Hence its relationship with modal logic needs to be described. It is shown that the proof can be embedded in an orthodox modal logic, and hence its compatibility with modal logic assured, but that this embedding weakens the theorem by introducing as added assumptions the conventionalities of the particular modal logic that is adopted. This weakening is avoided in the recent proof by using directly the set-theoretic conditions entailed by the locality assumption

  18. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.
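The paper's central empirical point is that a spike train is neither a continuous signal nor a string of digits: it is a sequence of discrete events whose timing is graded (real-valued). A toy illustration of the two classic readouts of such a train (the numbers are invented):

```python
# A spike train: discrete functional elements (spikes) with graded,
# real-valued timing -- not a continuous waveform, not a digit string.
spike_times = [0.0012, 0.0078, 0.0151, 0.0230]  # seconds, illustrative

def mean_firing_rate(times, window):
    """Rate readout: spikes per second over an observation window."""
    return len(times) / window

def interspike_intervals(times):
    """Timing readout: the real-valued gaps between successive spikes."""
    return [b - a for a, b in zip(times, times[1:])]

print(mean_firing_rate(spike_times, 0.025))  # → 160.0
print(interspike_intervals(spike_times))
```

Nothing in this representation is a digit string, and discretizing the times would discard exactly the graded component the authors argue matters, which is the sense in which neural computation is sui generis.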

  19. NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACHINE INDEPENDENT VERSION)

    Science.gov (United States)

    Baffes, P. T.

    1994-01-01

    allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard

  20. NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACINTOSH VERSION)

    Science.gov (United States)

    Phillips, T. A.

    1994-01-01

    allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard

  1. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two ... up to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show ... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority ...
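    Ordinary secret sharing (SS), the primitive the reduction above compares VSS against, can be sketched in a few lines. The following is an illustrative Shamir-style scheme over a prime field; it is not code from the thesis, and it addresses only the passive, static adversary mentioned in the abstract, not the active, adaptive one.

```python
# Illustrative sketch (not from the thesis): Shamir secret sharing over a
# prime field. Any t+1 shares reconstruct the secret; t or fewer reveal
# nothing to a passive adversary.
import random

P = 2**61 - 1  # a Mersenne prime used as the field modulus

def share(secret, n, t):
    """Split `secret` into n shares of a random degree-t polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(123456789, n=5, t=2)
assert reconstruct(shares[:3]) == 123456789  # any 3 of the 5 shares suffice
```

    Verifiable secret sharing additionally lets the players check that the dealer distributed consistent shares; the thesis shows that, up to a polynomial-time black-box reduction, the adaptively secure version is no harder than the plain scheme above.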

  2. School version of ESTE EU

    International Nuclear Information System (INIS)

    Carny, P.; Suchon, D.; Chyly, M.; Smejkalova, E.; Fabova, V.

    2008-01-01

    ESTE EU is an information system and software for assessing the radiological impact on the territory of a country in case of a radiation accident inside or outside that country. The program enables modelling of the dispersion of radioactive clouds on the small scale and meso scale. The system enables the user to estimate a prediction of the source term (release to the atmosphere) for any point of a radiation/nuclear accident in Europe (for any point of release, but especially for the sites of European power reactors). The system can utilize results of real radiological monitoring in the process of source term estimation. Radiological impacts of a release to the atmosphere are modelled and calculated across Europe and displayed in a geographical information system (GIS). The school version of ESTE EU is intended for university students who are interested in, or could work in, the fields of emergency response, radiological and nuclear accidents, dispersion modelling, radiological impact calculation, and the implementation of urgent or preventive protective measures. The school version of ESTE EU is planned to be donated to specialized departments of faculties in Slovakia, the Czech Republic, etc. The system can be fully operated in Slovak, Czech or English. (authors)

  3. School version of ESTE EU

    International Nuclear Information System (INIS)

    Carny, P.; Suchon, D.; Chyly, M.; Smejkalova, E.; Fabova, V.

    2009-01-01

    ESTE EU is an information system and software for assessing the radiological impact on the territory of a country in case of a radiation accident inside or outside that country. The program enables modelling of the dispersion of radioactive clouds on the small scale and meso scale. The system enables the user to estimate a prediction of the source term (release to the atmosphere) for any point of a radiation/nuclear accident in Europe (for any point of release, but especially for the sites of European power reactors). The system can utilize results of real radiological monitoring in the process of source term estimation. Radiological impacts of a release to the atmosphere are modelled and calculated across Europe and displayed in a geographical information system (GIS). The school version of ESTE EU is intended for university students who are interested in, or could work in, the fields of emergency response, radiological and nuclear accidents, dispersion modelling, radiological impact calculation, and the implementation of urgent or preventive protective measures. The school version of ESTE EU is planned to be donated to specialized departments of faculties in Slovakia, the Czech Republic, etc. The system can be fully operated in Slovak, Czech or English. (authors)

  4. Quinoa, Version 0.1

    Energy Technology Data Exchange (ETDEWEB)

    2016-05-06

    Quinoa is a set of computational tools that enables research and numerical analysis in fluid dynamics. At this time it is a test-bed to experiment with various algorithms using fully asynchronous runtime systems.

  5. Labour Outcomes After Successful External Cephalic Version Compared With Spontaneous Cephalic Version.

    Science.gov (United States)

    Krueger, Samantha; Simioni, Julia; Griffith, Lauren E; Hutton, Eileen K

    2018-01-01

    This study sought to compare obstetrical outcomes of women whose cephalic presentation at birth resulted from successful external cephalic version (ECV) with those of women whose fetus underwent spontaneous cephalic version (SCV). Secondary analysis was performed on Early External Cephalic Version Trial data. A total of 931 study participants had breech presentations between 34 and 36 weeks' gestation and cephalic presentations at birth. The incidence of intrapartum interventions was compared between patients with successful ECV (557) and those with SCV (374). A generalized linear mixed model was used to determine ORs for our primary outcomes. Parity, maternal BMI, previous CS, and enrolment centre were controlled for in the analysis. No differences were found after ECV compared with SCV in the incidence of CS (96 of 557 and 76 of 374, respectively; adjusted OR [aOR] 0.89; 95% CI 0.63-1.26), instrumental birth (68 of 557 and 29 of 373, respectively; aOR 1.55; 95% CI 0.96-2.50), or normal vaginal birth (393 of 557 and 268 of 373, respectively; aOR 0.92; 95% CI 0.68-1.24). Multiparous women with successful ECV were half as likely to require a CS compared with those with SCV and no ECV (28 of 313 and 42 of 258, respectively; aOR 0.45; 95% CI 0.26-0.80). This is the first study to compare birth outcomes of breech pregnancies that convert to cephalic presentation by means of SCV with those of breech pregnancies that have ECV. Women with a cephalic-presenting fetus at birth as a result of successful ECV are not at greater risk of obstetrical interventions at birth when compared with women whose fetuses spontaneously turn to a cephalic presentation in the third trimester. Copyright © 2018. Published by Elsevier Inc.

  6. Code OK3 - An upgraded version of OK2 with beam wobbling function

    Science.gov (United States)

    Ogoyski, A. I.; Kawata, S.; Popov, P. H.

    2010-07-01

    For computer simulations on heavy ion beam (HIB) irradiation onto a target with an arbitrary shape and structure in heavy ion fusion (HIF), the code OK2 was developed and presented in Computer Physics Communications 161 (2004). Code OK3 is an upgrade of OK2 that adds an important capability for wobbling-beam illumination. The wobbling beam introduces a unique possibility for a smooth mechanism of inertial fusion target implosion, so that sufficient fusion energy is released to construct a fusion reactor in future.
    New version program summary
    Program title: OK3
    Catalogue identifier: ADST_v3_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADST_v3_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 221 517
    No. of bytes in distributed program, including test data, etc.: 2 471 015
    Distribution format: tar.gz
    Programming language: C++
    Computer: PC (Pentium 4, 1 GHz or more recommended)
    Operating system: Windows or UNIX
    RAM: 2048 MBytes
    Classification: 19.7
    Catalogue identifier of previous version: ADST_v2_0
    Journal reference of previous version: Comput. Phys. Comm. 161 (2004) 143
    Does the new version supersede the previous version?: Yes
    Nature of problem: In heavy ion fusion (HIF), ion cancer therapy, material processing, etc., a precise beam energy deposition is essentially important [1]. Codes OK1 and OK2 have been developed to simulate the heavy ion beam energy deposition in three-dimensional arbitrarily shaped targets [2, 3]. Wobbling-beam illumination is important to smooth the beam energy deposition nonuniformity in HIF, so that a uniform target implosion is realized and a sufficient fusion output energy is released.
    Solution method: OK3 works on the base of OK1 and OK2 [2, 3]. The code simulates a multi-beam illumination on a target with arbitrary shape and

  7. Charged-particle thermonuclear reaction rates: IV. Comparison to previous work

    International Nuclear Information System (INIS)

    Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.

    2010-01-01

    We compare our Monte Carlo reaction rates (see Paper II of this issue) to previous results that were obtained by using the classical method of computing thermonuclear reaction rates. For each reaction, the comparison is presented using two types of graph: the first shows the change in reaction rate uncertainties, while the second displays our new results normalized to the previously recommended reaction rate. We find that the rates have changed significantly for almost all reactions considered here. The changes are caused by (i) our new Monte Carlo method of computing reaction rates (see Paper I of this issue), and (ii) newly available nuclear physics information (see Paper III of this issue).
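    The normalization behind the second type of graph is simply the ratio of the new recommended rate to the previously recommended one, evaluated on a temperature grid. A sketch with invented numbers (not values from the paper):

```python
# Ratio of new to previously recommended reaction rates on a temperature
# grid, as plotted in the second type of comparison graph. All numbers
# below are made up for illustration.
temperatures  = [0.01, 0.1, 1.0, 10.0]             # temperature in GK
rate_previous = [1.2e-20, 3.4e-10, 5.6e-2, 7.8e3]  # cm^3 mol^-1 s^-1
rate_new      = [1.0e-20, 3.1e-10, 6.0e-2, 7.8e3]

ratios = [new / prev for new, prev in zip(rate_new, rate_previous)]
for T, r in zip(temperatures, ratios):
    # a ratio of 1.0 means the recommended rate is unchanged at that T
    print(f"T = {T:6.2f} GK   new/previous = {r:.3f}")
```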

  8. Multithreaded transactions in scientific computing. The Growth06_v2 program

    Science.gov (United States)

    Daniluk, Andrzej

    2009-07-01

    Writing a concurrent program can be more difficult than writing a sequential one: the programmer needs to think about synchronization, race conditions and shared variables. Transactions help reduce the inconvenience of using threads. A transaction is an abstraction that allows programmers to group a sequence of actions on the program into a logical, higher-level computation unit. This paper presents a new version of the GROWTHGr and GROWTH06 programs.
    New version program summary
    Program title: GROWTH06_v2
    Catalogue identifier: ADVL_v2_1
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVL_v2_1.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 65 255
    No. of bytes in distributed program, including test data, etc.: 865 985
    Distribution format: tar.gz
    Programming language: Object Pascal
    Computer: Pentium-based PC
    Operating system: Windows 9x, XP, NT, Vista
    RAM: more than 1 MB
    Classification: 4.3, 7.2, 6.2, 8, 14
    Catalogue identifier of previous version: ADVL_v2_0
    Journal reference of previous version: Comput. Phys. Comm. 175 (2006) 678
    Does the new version supersede the previous version?: Yes
    Nature of problem: The programs compute the RHEED intensities during the growth of thin epitaxial structures prepared using molecular beam epitaxy (MBE). The computations are based on the use of kinematical diffraction theory.
    Solution method: Epitaxial growth of thin films is modelled by a set of non-linear differential equations [1]. The Runge-Kutta method with adaptive stepsize control was used for solving the initial value problem for the non-linear differential equations [2].
    Reasons for new version: According to the users' suggestions, the functionality of the program has been improved. Moreover, new use cases have been added which make the handling of the program easier and more
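    The solution method cited above, Runge-Kutta integration with adaptive stepsize control, can be illustrated with a step-doubling error estimate. The ODE below (dy/dt = -y) is a stand-in for demonstration only, not the growth model from the paper:

```python
# Sketch of Runge-Kutta integration with adaptive stepsize control via
# step doubling: take one full RK4 step and two half steps, and use their
# difference as a local error estimate.
import math

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(f, t0, y0, t_end, h=0.1, tol=1e-8):
    t, y = t0, y0
    while t < t_end:
        h = min(h, t_end - t)
        y_big = rk4_step(f, t, y, h)                        # one full step
        y_half = rk4_step(f, t + h / 2,
                          rk4_step(f, t, y, h / 2), h / 2)  # two half steps
        if abs(y_half - y_big) < tol:
            t, y = t + h, y_half
            h *= 1.5        # accept the step and grow the stepsize
        else:
            h /= 2          # reject and retry with a smaller step
    return y

# dy/dt = -y has the exact solution y(t) = exp(-t)
y1 = integrate(lambda t, y: -y, 0.0, 1.0, 1.0)
assert abs(y1 - math.exp(-1)) < 1e-6
```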

  9. Procedure guideline for radioiodine test (version 3); Verfahrensanweisung zum Radioiodtest (Version 3)

    Energy Technology Data Exchange (ETDEWEB)

    Dietlein, M.; Schicha, H. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Koeln Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin; Dressler, J. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Nuklearmedizinische Klinik der Henriettenstiftung, Hannover (Germany); Eschner, W. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Deutsche Gesellschaft fuer Medizinische Physik (DGMP) (Germany); Koeln Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin; Lassmann, M. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Deutsche Gesellschaft fuer Medizinische Physik (DGMP) (Germany); Wuerzburg Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin; Leisner, B. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Allgemeines Krankenhaus St. Georg, Hamburg (Germany). Abt. fuer Nuklearmedizin; Reiners, C. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Wuerzburg Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin

    2007-07-01

    Version 3 of the procedure guideline for the radioiodine test is an update of the guideline previously published in 2003. The procedure guideline discusses the pros and cons of a single measurement or of repeated measurements of the iodine-131 uptake and their optimal timing. Different formulas are described for when one, two or three values of the radioiodine kinetics are available. The probe with a sodium iodide crystal, alternatively or additionally the gamma camera using the ROI technique, serves as instrumentation for the measurement of iodine-131 uptake. A possible source of error is an inappropriate measurement (sonography) of the target volume. The patients' preparation includes the withdrawal of antithyroid drugs 2-3 days before radioiodine administration. The patient has to avoid iodine-containing medication, and the possibility of iodine additives in vitamin and electrolyte supplementation has to be considered. (orig.)

  10. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A version management system (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection, and merging. Traditional VMS systems are file-based and consider software systems as a set of text files. File-based VMS systems are not adequate for performing software configuration management activities such as version control on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise when using models as the central artifact. The goal of this work is to present a generic model-based VMS framework which can be used to overcome the problems of traditional file-based VMS systems and to provide model versioning services. (author)
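    One of the activities listed above, identification of differences between versions, reduces to comparing sets of model elements rather than lines of text. A minimal sketch (the element names and attributes below are invented, not from the paper):

```python
# Hypothetical illustration of model differencing: each model version is a
# mapping from element name to its attributes, and the diff classifies
# elements as added, removed, or changed.
def diff_models(old, new):
    added   = {k: new[k] for k in new.keys() - old.keys()}
    removed = {k: old[k] for k in old.keys() - new.keys()}
    changed = {k: (old[k], new[k])
               for k in old.keys() & new.keys() if old[k] != new[k]}
    return added, removed, changed

v1 = {"ClassA": {"attrs": ["id"]}, "ClassB": {"attrs": ["name"]}}
v2 = {"ClassA": {"attrs": ["id", "label"]}, "ClassC": {"attrs": []}}

added, removed, changed = diff_models(v1, v2)
assert set(added) == {"ClassC"}      # element introduced in v2
assert set(removed) == {"ClassB"}    # element deleted in v2
assert set(changed) == {"ClassA"}    # element modified between versions
```

    A file-based VMS would instead report meaningless textual deltas on the serialized model, which is precisely the inadequacy the paper addresses.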

  11. The quantum computer game: citizen science

    Science.gov (United States)

    Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob

    2013-05-01

    Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context, with a global high-score list that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models such as Gross-Pitaevskii and Bose-Hubbard are currently under development. The game has also been incorporated into science education at the high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research, CODER.

  12. Development of EASYQAD version β: A Visualization Code System for QAD-CGGP-A Gamma and Neutron Shielding Calculation Code

    International Nuclear Information System (INIS)

    Kim, Jae Cheon; Lee, Hwan Soo; Ha, Pham Nhu Viet; Kim, Soon Young; Shin, Chang Ho; Kim, Jong Kyung

    2007-01-01

    EASYQAD had previously been developed at Hanyang University using the MATLAB GUI (Graphical User Interface) in order to perform gamma and neutron shielding calculations conveniently. It had been completed as version α of the radiation shielding analysis code. In this study, EASYQAD was upgraded to version β with many additional functions and more user-friendly graphical interfaces. So that general users can run it in a Windows XP environment without any MATLAB installation, this version was developed as a standalone code system

  13. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION WITH CLIPSITS)

    Science.gov (United States)

    Riley

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer-based artificial intelligence tools. CLIPS is a forward-chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and the actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches them against the rule network. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented.
The PC version and the Macintosh
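    The forward-chaining cycle that CLIPS implements can be sketched as a fixed-point loop over rules. This sketch is not CLIPS syntax and omits the Rete optimization (which avoids re-matching unchanged facts); the rules and fact names are invented:

```python
# Toy forward-chaining inference: a rule fires when all of its condition
# facts are present, asserting its action fact, until no rule can fire.
rules = [
    ({"sensor-hot", "valve-open"}, "alarm"),
    ({"alarm"}, "notify-operator"),
]

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, action in rules:
            # fire the rule if its conditions hold and it adds a new fact
            if conditions <= facts and action not in facts:
                facts.add(action)
                changed = True
    return facts

result = forward_chain({"sensor-hot", "valve-open"}, rules)
assert "notify-operator" in result  # chained through the "alarm" fact
```

    The Rete algorithm mentioned in the abstract gets the same result far more efficiently by compiling the conditions into a shared match network, so each asserted fact is tested only against the patterns it can affect.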

  14. HECTR [Hydrogen Event Containment Transient Response] Version 1.5N: A modification of HECTR Version 1.5 for application to N Reactor

    International Nuclear Information System (INIS)

    Camp, A.L.; Dingman, S.E.

    1987-05-01

    This report describes HECTR Version 1.5N, which is a special version of HECTR developed specifically for application to the N Reactor. HECTR is a fast-running, lumped-parameter containment analysis computer program that is most useful for performing parametric studies. The main purpose of HECTR is to analyze nuclear reactor accidents involving the transport and combustion of hydrogen, but HECTR can also function as an experiment analysis tool and can solve a limited set of other types of containment problems. Version 1.5N is a modification of Version 1.5 and includes changes to the spray actuation logic, and models for steam vents, vacuum breakers, and building cross-vents. Thus, all of the key features of the N Reactor confinement can be modeled. HECTR is designed for flexibility and provides for user control of many important parameters, if built-in correlations and default values are not desired

  15. XTALOPT version r11: An open-source evolutionary algorithm for crystal structure prediction

    Science.gov (United States)

    Avery, Patrick; Falls, Zackary; Zurek, Eva

    2018-01-01

    Version 11 of XTALOPT, an evolutionary algorithm for crystal structure prediction, has now been made available for download from the CPC library or the XTALOPT website, http://xtalopt.github.io. Whereas the previous versions of XTALOPT were published under the Gnu Public License (GPL), the current version is made available under the 3-Clause BSD License, which is an open source license that is recognized by the Open Source Initiative. Importantly, the new version can be executed via a command line interface (i.e., it does not require the use of a Graphical User Interface). Moreover, the new version is written as a stand-alone program, rather than an extension to AVOGADRO.
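    The evolutionary-algorithm skeleton behind a code like XTALOPT can be sketched generically. The scalar "energy" function and all parameters below are invented stand-ins: the real program evolves crystal structures whose energies come from external electronic-structure codes.

```python
# Generic evolutionary loop: rank a population by fitness, keep the best
# half, and refill with crossover plus mutation. Toy 1-D "energy" with a
# minimum at x = 3.
import random

random.seed(0)  # deterministic for this demonstration

def energy(x):
    return (x - 3.0) ** 2

def evolve(pop_size=20, generations=50):
    pop = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy)
        parents = pop[: pop_size // 2]        # survival of the fittest
        children = [
            (random.choice(parents) + random.choice(parents)) / 2  # crossover
            + random.gauss(0, 0.1)                                 # mutation
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return min(pop, key=energy)

best = evolve()
assert abs(best - 3.0) < 0.5  # population converges near the minimum
```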

  16. Impact of previously disadvantaged land-users on sustainable ...

    African Journals Online (AJOL)

    Impact of previously disadvantaged land-users on sustainable agricultural ... about previously disadvantaged land users involved in communal farming systems ... of input, capital, marketing, information and land use planning, with effect on ...

  17. The Integrated Tiger Series version 5.0

    International Nuclear Information System (INIS)

    Laub, Th.W.; Kensek, R.P.; Franke, B.C.; Lorence, L.J.; Crawford, M.J.; Quirk, Th.J.

    2005-01-01

    The Integrated Tiger Series (ITS) is a powerful and user-friendly software package permitting Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. The package contains programs to perform 1-, 2-, and 3-dimensional simulations. Improvements in the ITS code package since the release of version 3.0 include improved physics, multigroup and adjoint capabilities, Computer-Aided Design geometry tracking, parallel implementations of all ITS codes, and more automated sub-zoning capabilities. These improvements and others are described as current or planned development efforts. The ITS package is currently at version 5.0. (authors)

  18. RASCAL Version 2.1 workbook. Volume 2, Revision 2

    International Nuclear Information System (INIS)

    Athey, G.F.; Sjoreen, A.L.; McKenna, T.J.

    1994-12-01

    The Radiological Assessment System for Consequence Analysis, Version 2.1 (RASCAL 2.1) was developed for use by the NRC personnel who respond to radiological emergencies. This workbook complements the RASCAL 2.1 User's guide (NUREG/CR-5247, Vol. 1, Rev. 2). The workbook contains exercises designed to familiarize the user with the computer-based tools of RASCAL through hands-on problem solving. The workbook contains four major sections. The first is a RASCAL familiarization exercise to acquaint the user with the operation of the forms, menus, online help, and documentation. The latter three sections contain exercises in using the three tools of RASCAL Version 2.1: DECAY, FM-DOSE, and ST-DOSE. A discussion section describing how the tools could be used to solve the problems follows each set of exercises

  19. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic. Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. 
It also requires a math
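    The transitive-closure preprocessing described above can be illustrated with Warshall's algorithm on an adjacency matrix. FEAT's multi-step phantom-bridge algorithm is more elaborate; the three-node failure-propagation digraph below is invented for illustration:

```python
# Warshall's algorithm: reach[i][j] becomes true if failure at node i can
# propagate to node j through any chain of directed edges. Once computed,
# propagation queries are simple table lookups (hence the fast queries
# after FEAT's slow preprocessing).
def transitive_closure(adj):
    n = len(adj)
    reach = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    return reach

# failure at node 0 propagates to node 1, and node 1 propagates to node 2
adj = [[0, 1, 0],
       [0, 0, 1],
       [0, 0, 0]]
reach = transitive_closure(adj)
assert reach[0][2]      # a failure at node 0 can reach node 2
assert not reach[2][0]  # but not the other way around
```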

  20. 22 CFR 40.91 - Certain aliens previously removed.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Certain aliens previously removed. 40.91... IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.91 Certain aliens previously removed. (a) 5-year bar. An alien who has been found inadmissible, whether as a result...

  1. Quantum analogue computing.

    Science.gov (United States)

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
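    The exponential cost of the direct encoding described above is easy to quantify for a classical simulator that holds the full state vector: n qubits require 2**n complex amplitudes, so each additional qubit doubles the memory footprint.

```python
# Back-of-envelope memory cost of direct (state-vector) simulation,
# assuming one complex double (16 bytes) per amplitude.
BYTES_PER_AMPLITUDE = 16

def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

# each extra qubit doubles the size of the computer, as the text notes
assert statevector_bytes(11) == 2 * statevector_bytes(10)
print(f"30 qubits: {statevector_bytes(30) / 2**30:.0f} GiB")  # prints "30 qubits: 16 GiB"
```

    This is the sense in which quantum simulation behaves like classical analogue computing: there is no binary encoding over which error correction and extra precision come cheaply.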

  2. Development of IMPACTS-BRC, Version 2.1

    International Nuclear Information System (INIS)

    Rao, R.R.; Kozak, M.W.; Rollstin, J.A.

    1991-01-01

    IMPACTS-BRC is a computer program developed to conduct scoping analyses for use in supporting rulemaking on petitions for exemption of waste streams from multiple producers. It was not initially intended for use on individual license applications for specific sites. However, the Federal Register, Volume 51, Number 168, specifies that IMPACTS-BRC be used to evaluate incoming license applications. This creates a problem since IMPACTS-BRC is not being used for its intended purpose. It is a generic code that is now being used for site specific applications. This is only a valid procedure if it can be shown that generic results from IMPACTS-BRC are conservative when compared to results from site specific models. Otherwise, IMPACTS-BRC should not be used. The purpose of this work was to verify that IMPACTS-BRC works as specified in its user's guide. In other words, Sandia National Laboratories (SNL) has determined that the mathematical models given in the user's guide are correctly implemented into the computer code. No direct work has been done to verify that the mathematical models used in the code are appropriate for the purpose that they are being used. In fact, scrutiny of the groundwater transport models in IMPACTS-BRC has led us to recommend that alternate geosphere models should be used. Other work carried out for this project included verifying that the input data for IMPACTS-BRC is correct and traceable. This was carried out, and a new version of the data with these qualities was produced. The new version of the data was used with the verified IMPACTS-BRC, Version 2.0 to produce IMPACTS-BRC, Version 2.1

  3. Previous medical history of diseases in children with attention deficit hyperactivity disorder and their parents

    Directory of Open Access Journals (Sweden)

    Ayyoub Malek

    2014-02-01

    Full Text Available Introduction: The etiology of attention deficit hyperactivity disorder (ADHD) is complex and most likely includes genetic and environmental factors. This study was conducted to evaluate the role of the previous medical history of diseases in ADHD children and their parents during the earlier years of the ADHD children's lives. Methods: In this case-control study, 164 ADHD children attending the Child and Adolescent Psychiatric Clinics of Tabriz University of Medical Sciences, Iran, were compared with 166 normal children selected by a random-cluster method from primary and guidance schools. The ADHD rating scale (parents' version) and a clinical interview based on the Schedule for Affective Disorders and Schizophrenia for School-Age Children-Present and Lifetime Version (K-SADS) were used to diagnose ADHD cases and to select the control group. The two groups were compared for the existence of a previous medical history of diseases in children and parents. Fisher's exact test and a logistic regression model were used for data analysis. Results: The frequency of maternal history of medical disorders (28.7% vs. 12.0%; P = 0.001) was significantly higher in children with ADHD compared with the control group. The frequencies of jaundice, dysentery, epilepsy, asthma, allergy, and head trauma in the medical history of the children did not differ significantly between the two groups. Conclusion: According to this preliminary study, it may be concluded that a maternal history of medical disorders is one of the contributing risk factors for ADHD.

  4. APPLE-2: an improved version of APPLE code for plotting neutron and gamma ray spectra and reaction rates

    International Nuclear Information System (INIS)

    Kawasaki, Hiromitsu; Seki, Yasushi.

    1982-07-01

    A computer code, APPLE-2, which plots the spatial distribution of the energy spectra of multi-group neutron and/or gamma ray fluxes and reaction rates, has been developed. This code is an improved version of the previously developed APPLE code and has the following features: (1) It plots energy spectra of neutron and/or gamma ray fluxes calculated by ANISN, DOT and MORSE. (2) It calculates and plots the spatial distribution of neutron and gamma ray fluxes and various types of reaction rates, such as nuclear heating rates, operational dose rates, and displacement damage rates. (3) Input data specification is greatly simplified by the use of standard response libraries and by close coupling with the radiation transport calculation codes. (4) Plotting outputs are given in camera-ready form. (author)

  5. Determining root correspondence between previously and newly detected objects

    Science.gov (United States)

    Paglieroni, David W.; Beer, N Reginald

    2014-06-17

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
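
    The attribute-matching idea can be sketched in a toy form (this is not the patented system): previously detected objects, stored with a location attribute, are compared against new detections, and unmatched entries on either side are reported as changes.

```python
# Toy sketch of attribute-based change detection: pair new detections with
# previously detected objects by a location attribute; anything unmatched
# on either side is a change. All names and coordinates are hypothetical.
import math

def match_objects(previous, new, max_dist=1.0):
    """Greedily pair new detections with previous objects by location."""
    unmatched_prev = dict(previous)          # id -> (x, y)
    appeared, persisted = [], []
    for nid, (nx, ny) in new.items():
        best, best_d = None, max_dist
        for pid, (px, py) in unmatched_prev.items():
            d = math.hypot(nx - px, ny - py)
            if d <= best_d:
                best, best_d = pid, d
        if best is None:
            appeared.append(nid)             # object not seen before
        else:
            persisted.append((best, nid))
            del unmatched_prev[best]
    disappeared = sorted(unmatched_prev)     # previously seen, now gone
    return appeared, disappeared, persisted

prev = {"A": (0.0, 0.0), "B": (5.0, 5.0)}
new = {"n1": (0.2, 0.1), "n2": (9.0, 9.0)}
print(match_objects(prev, new))  # → (['n2'], ['B'], [('A', 'n1')])
```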

  6. Computing Equilibrium Chemical Compositions

    Science.gov (United States)

    Mcbride, Bonnie J.; Gordon, Sanford

    1995-01-01

    Chemical Equilibrium With Transport Properties, 1993 (CET93) computer program provides data on chemical-equilibrium compositions. Aids calculation of thermodynamic properties of chemical systems. Information essential in design and analysis of such equipment as compressors, turbines, nozzles, engines, shock tubes, heat exchangers, and chemical-processing equipment. CET93/PC is version of CET93 specifically designed to run within 640K memory limit of MS-DOS operating system. CET93/PC written in FORTRAN.

  7. Control rod computer code IAMCOS: general theory and numerical methods

    International Nuclear Information System (INIS)

    West, G.

    1982-11-01

    IAMCOS is a computer code for the description of mechanical and thermal behavior of cylindrical control rods for fast breeders. This code version was applied, tested and modified from 1979 to 1981. In this report are described the basic model (02 version), theoretical definitions and computation methods [fr]

  8. VAXNEWS version 3.12

    International Nuclear Information System (INIS)

    Callot, O.

    1994-02-01

    VAXNEWS can be used as a local news system or bulletin board. However, VAXNEWS is basically a networked news system, allowing any member of a large collaboration to read information, and also to write information that is distributed to all the computers where it should go. The VAXNEWS utility is described here from the manager's point of view. The installation procedure, an overview of the system, the file structure, the network setup and operation, remote management and management commands are presented. (author)

  9. Latest NASA Instrument Cost Model (NICM): Version VI

    Science.gov (United States)

    Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary

    2014-01-01

    The NASA Instrument Cost Model, NICM, is a suite of tools that allows probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost estimation by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper focuses on the new features released with NICM VI, which include: 1) the NICM-E cost estimating relationship, which is applicable to instruments flying on Explorer-like class missions; 2) a new cluster analysis capability which, alongside the results of the parametric cost estimation for the user's instrument, provides a visualization of the instrument's similarity to previously flown instruments; and 3) new cost estimating relationships for in-situ instruments.

  10. Development of the short version of the informal caregiver burden assessment questionnaire

    Directory of Open Access Journals (Sweden)

    Teresa Martins

    2015-04-01

    Full Text Available OBJECTIVE To create a reduced version of the QASCI that is structurally equivalent to the long one and meets the criteria of reliability and validity. METHOD Using secondary data from previous studies, the participants were divided into two samples, one for the development of the reduced version and the second for the study of its factorial validity. Participants responded to the QASCI, the SF-36, the ADHS and demographic questions. RESULTS A reduced version with 14 items showed adequate psychometric properties of validity and internal consistency, fitting a heptadimensional structure that assesses positive and negative aspects of care. CONCLUSION Confirmatory factor analysis revealed a good fit with the advocated theoretical model.

  11. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (MACINTOSH VERSION)

    Science.gov (United States)

    Rogers, J. L.

    1994-01-01

    effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.

  12. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SUN VERSION)

    Science.gov (United States)

    Rogers, J. L.

    1994-01-01

    effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.

  13. HANFORD TANK WASTE OPERATIONS SIMULATOR VERSION DESCRIPTION DOCUMENT

    International Nuclear Information System (INIS)

    ALLEN, G.K.

    2003-01-01

    This document describes the software version controls established for the Hanford Tank Waste Operations Simulator (HTWOS). It defines: the methods employed to control the configuration of HTWOS; the version of each of the 26 separate modules in version 1.0 of HTWOS; the numbering rules for incrementing the version number of each module; and a requirement to include module version numbers in the documentation of each case's results. Version 1.0 of HTWOS is the first version under formal software version control. HTWOS carries a separate revision number for each of its 26 modules; individual module version numbers do not reflect the configured version number of the major HTWOS release.
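
    The per-module numbering scheme described above can be illustrated with a hypothetical sketch (the module names and numbers are invented; this is not HTWOS code):

```python
# Hypothetical illustration: each module carries its own revision number,
# incremented independently; the configured release number of the simulator
# as a whole does not change when a single module does.

class Configuration:
    def __init__(self, release, modules):
        self.release = release        # configured version, e.g. "1.0"
        self.modules = dict(modules)  # module name -> revision number

    def bump_module(self, name):
        """Increment one module's revision; the release number is untouched."""
        self.modules[name] += 1

cfg = Configuration("1.0", {"solver": 3, "report": 1})
cfg.bump_module("solver")
print(cfg.release, cfg.modules["solver"])  # → 1.0 4
```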

  14. Schema Versioning for Multitemporal Relational Databases.

    Science.gov (United States)

    De Castro, Cristina; Grandi, Fabio; Scalas, Maria Rita

    1997-01-01

    Investigates new design options for extended schema versioning support for multitemporal relational databases. Discusses the improved functionalities they may provide. Outlines options and basic motivations for the new design solutions, as well as techniques for the management of proposed schema versioning solutions, includes algorithms and…

  15. Several versions of forward gas ionization calorimeter

    International Nuclear Information System (INIS)

    Babintsev, V.V.; Kholodenko, A.G.; Rodnov, Yu.V.

    1994-01-01

    The properties of several versions of a gas ionization calorimeter are analyzed by means of the simulation with the GEANT code. The jet energy and coordinate resolutions are evaluated. Some versions of the forward calorimeter meet the ATLAS requirements. 13 refs., 15 figs., 7 tabs

  16. CALIPSO lidar calibration at 532 nm: version 4 nighttime algorithm

    Science.gov (United States)

    Kar, Jayanta; Vaughan, Mark A.; Lee, Kam-Pui; Tackett, Jason L.; Avery, Melody A.; Garnier, Anne; Getzewich, Brian J.; Hunt, William H.; Josset, Damien; Liu, Zhaoyan; Lucker, Patricia L.; Magill, Brian; Omar, Ali H.; Pelon, Jacques; Rogers, Raymond R.; Toth, Travis D.; Trepte, Charles R.; Vernier, Jean-Paul; Winker, David M.; Young, Stuart A.

    2018-03-01

    Data products from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) on board Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) were recently updated following the implementation of new (version 4) calibration algorithms for all of the Level 1 attenuated backscatter measurements. In this work we present the motivation for and the implementation of the version 4 nighttime 532 nm parallel channel calibration. The nighttime 532 nm calibration is the most fundamental calibration of CALIOP data, since all of CALIOP's other radiometric calibration procedures - i.e., the 532 nm daytime calibration and the 1064 nm calibrations during both nighttime and daytime - depend either directly or indirectly on the 532 nm nighttime calibration. The accuracy of the 532 nm nighttime calibration has been significantly improved by raising the molecular normalization altitude from 30-34 km to the upper possible signal acquisition range of 36-39 km to substantially reduce stratospheric aerosol contamination. Due to the greatly reduced molecular number density and consequently reduced signal-to-noise ratio (SNR) at these higher altitudes, the signal is now averaged over a larger number of samples using data from multiple adjacent granules. Additionally, an enhanced strategy for filtering the radiation-induced noise from high-energy particles was adopted. Further, the meteorological model used in the earlier versions has been replaced by the improved Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), model. An aerosol scattering ratio of 1.01 ± 0.01 is now explicitly used for the calibration altitude. These modifications lead to globally revised calibration coefficients which are, on average, 2-3 % lower than in previous data releases. Further, the new calibration procedure is shown to eliminate biases at high altitudes that were present in earlier versions and consequently leads to an improved representation of

  17. Moxibustion for Cephalic Version of Breech Presentation.

    Science.gov (United States)

    Schlaeger, Judith M; Stoffel, Cynthia L; Bussell, Jeanie L; Cai, Hui Yan; Takayama, Miho; Yajima, Hiroyoshi; Takakura, Nobuari

    2018-05-01

    Moxibustion, a form of traditional Chinese medicine (TCM), is the burning of the herb moxa (Folium Artemisiae argyi or mugwort) over acupuncture points. It is often used in China to facilitate cephalic version of breech presentation. This article reviews the history, philosophy, therapeutic use, possible mechanisms of action, and literature pertaining to its use for this indication. For moxibustion, moxa can be rolled into stick form, placed directly on the skin, or placed on an acupuncture needle and ignited to warm acupuncture points. Studies have demonstrated that moxibustion may promote cephalic version of breech presentation and may facilitate external cephalic version. However, there is currently a paucity of research on the effects of moxibustion on cephalic version of breech presentation, and thus there is a need for further studies. Areas needing more investigation include efficacy, safety, optimal technique, and best protocol for cephalic version of breech presentation. © 2018 by the American College of Nurse-Midwives.

  18. Tracking code patterns over multiple software versions with Herodotos

    DEFF Research Database (Denmark)

    Palix, Nicolas Jean-Michel; Lawall, Julia; Muller, Gilles

    2010-01-01

    An important element of understanding a software code base is to identify the repetitive patterns of code it contains and how these evolve over time. Some patterns are useful to the software, and may be modularized. Others are detrimental to the software, such as patterns that represent defects… Herodotos tracks pattern occurrences over multiple versions of a software project, independent of other changes in the source files. Guided by a user-provided configuration file, Herodotos builds various graphs showing the evolution of the pattern occurrences and computes some statistics. We have evaluated this approach…
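
    The idea of tracking a pattern independent of other source changes can be sketched as follows (a toy illustration, not Herodotos itself, which is configuration-driven and computes graphs and richer statistics):

```python
# Toy illustration: count occurrences of a code pattern in successive
# versions of a source file, independent of other edits, to see how the
# pattern evolves over time. The pattern and snippets are hypothetical.
import re

def pattern_history(versions, pattern):
    """versions: list of (tag, source_text); returns [(tag, count), ...]."""
    rx = re.compile(pattern)
    return [(tag, len(rx.findall(src))) for tag, src in versions]

versions = [
    ("v1", "lock(a); lock(b); unlock(b); unlock(a);"),
    ("v2", "lock(a); unlock(a);"),
]
# \b keeps "unlock(" from matching as "lock("
print(pattern_history(versions, r"\block\("))  # → [('v1', 2), ('v2', 1)]
```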

  19. A Microsoft Windows version of the MCNP visual editor

    International Nuclear Information System (INIS)

    Schwarz, R.A.; Carter, L.L.; Pfohl, J.

    1999-01-01

    Work has started on a Microsoft Windows version of the MCNP visual editor. The MCNP visual editor provides a graphical user interface for displaying and creating MCNP geometries. The visual editor is currently available from the Radiation Safety Information Computational Center (RSICC) and the Nuclear Energy Agency (NEA) as software package PSR-358. It currently runs on the major UNIX platforms (IBM, SGI, HP, SUN) and Linux. Work has started on converting the visual editor to work in a Microsoft Windows environment. This initial work focuses on converting the display capabilities of the visual editor; the geometry creation capability of the visual editor may be included in future upgrades

  20. UQTk Version 3.0.3 User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Sargsyan, Khachik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Safta, Cosmin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chowdhary, Kamaljit Singh [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Castorena, Sarah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); De Bord, Sarah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Debusschere, Bert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 3.0.3 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.
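
    Non-intrusive propagation of input uncertainty, of the kind the toolkit supports, can be sketched generically (this does not use UQTk's own API; the model and input distribution are hypothetical):

```python
# Generic non-intrusive sketch: push an uncertain input through a
# black-box model by Monte Carlo sampling and summarize the output.
import random
import statistics

def model(x):
    return x * x + 1.0  # stand-in computational model

def propagate(n_samples, mu, sigma, seed=0):
    """Monte Carlo propagation of x ~ N(mu, sigma) through model()."""
    rng = random.Random(seed)
    outputs = [model(rng.gauss(mu, sigma)) for _ in range(n_samples)]
    return statistics.mean(outputs), statistics.stdev(outputs)

mean, std = propagate(20000, mu=1.0, sigma=0.1)
print(mean)  # close to the analytic mean mu**2 + sigma**2 + 1 = 2.01
```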

  1. Graphics Gems III IBM version

    CERN Document Server

    Kirk, David

    1994-01-01

    This sequel to Graphics Gems (Academic Press, 1990), and Graphics Gems II (Academic Press, 1991) is a practical collection of computer graphics programming tools and techniques. Graphics Gems III contains a larger percentage of gems related to modeling and rendering, particularly lighting and shading. This new edition also covers image processing, numerical and programming techniques, modeling and transformations, 2D and 3D geometry and algorithms, ray tracing and radiosity, rendering, and more clever new tools and tricks for graphics programming. Volume III also includes a

  2. VAXNEWS version 3.12

    International Nuclear Information System (INIS)

    Callot, O.

    1994-02-01

    VAXNEWS can be used as a local news system or bulletin board. However, VAXNEWS is basically a networked news system, allowing any member of a large collaboration to read information, and also to write information that is distributed to all the computers where it should go. This document describes the VAXNEWS utility. The concepts used in VAXNEWS and the usage and syntax of the user commands are documented. This document is intended for the normal user of VAXNEWS. It is divided into two parts: a section describing VAXNEWS and how to use it, and a section describing each user command in detail. (K.A.)

  3. IDSE Version 1 User's Manual

    Science.gov (United States)

    Mayer, Richard

    1988-01-01

    The integrated development support environment (IDSE) is a suite of integrated software tools that provide intelligent support for information modelling. These tools assist in function, information, and process modeling. Additional tools exist to assist in gathering and analyzing information to be modeled. This is a user's guide to application of the IDSE. Sections covering the requirements and design of each of the tools are presented. There are currently three integrated computer aided manufacturing definition (IDEF) modeling methodologies: IDEF0, IDEF1, and IDEF2. Also, four appendices exist to describe hardware and software requirements, installation procedures, and basic hardware usage.

  4. Verification of the 2.00 WAPPA-B [Waste Package Performance Assessment-B version] code

    International Nuclear Information System (INIS)

    Tylock, B.; Jansen, G.; Raines, G.E.

    1987-07-01

    The old version of the Waste Package Performance Assessment (WAPPA) code has been modified into a new code version, 2.00 WAPPA-B. The input files and the results for two benchmarks at repository conditions are fully documented in the appendixes of the EA reference report. The 2.00 WAPPA-B version of the code is suitable for computation of barrier failure due to uniform corrosion; however, an improved sub-version, 2.01 WAPPA-B, is recommended for general use due to minor errors found in 2.00 WAPPA-B during its verification procedures. The input files and input echoes have been modified to include behavior of both radionuclides and elements, but the 2.00 WAPPA-B version of the WAPPA code is not recommended for computation of radionuclide releases. The 2.00 WAPPA-B version computes only mass balances and the initial presence of radionuclides that can be released. Future code development in the 3.00 WAPPA-C version will include radionuclide release computations. 19 refs., 10 figs., 1 tab

  5. Reply to comment by Añel on "Most computational hydrology is not reproducible, so is it really science?"

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-03-01

    In this article, we reply to a comment made on our previous commentary regarding reproducibility in computational hydrology. Software licensing and version control of code are important technical aspects of making the code and workflows of scientific experiments open and reproducible. However, in our view, cultural change is the greatest challenge to overcome in achieving reproducible scientific research in computational hydrology. We believe that, as the culture and attitude among hydrological scientists change, the details will evolve to cover more (technical) aspects over time.

  6. Computational Science: Ensuring America's Competitiveness

    National Research Council Canada - National Science Library

    Reed, Daniel A; Bajcsy, Ruzena; Fernandez, Manuel A; Griffiths, Jose-Marie; Mott, Randall D; Dongarra, J. J; Johnson, Chris R; Inouye, Alan S; Miner, William; Matzke, Martha K; Ponick, Terry L

    2005-01-01

    ... previously deemed intractable. Yet, despite the great opportunities and needs, universities and the Federal government have not effectively recognized the strategic significance of computational science in either...

  7. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further enhanced by the use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32-bit process computers and comes in two versions: as a functional trainer or as an on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.

  8. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further enhanced by the use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32-bit process computers and comes in two versions: as a functional trainer or as an on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.) [de]

  9. Prediction of Success in External Cephalic Version under Tocolysis: Still a Challenge.

    Science.gov (United States)

    Vaz de Macedo, Carolina; Clode, Nuno; Mendes da Graça, Luís

    2015-01-01

    External cephalic version is a procedure of fetal rotation to a cephalic presentation through manoeuvres applied to the maternal abdomen. Several prognostic factors for external cephalic version success are described in the literature, and prediction scores have been proposed, but their true implication in clinical practice is controversial. We aim to identify possible factors that could contribute to the success of an external cephalic version attempt in our population. We retrospectively examined 207 consecutive external cephalic version attempts under tocolysis conducted between January 1997 and July 2012. We consulted the department's database for the following variables: race, age, parity, maternal body mass index, gestational age, estimated fetal weight, breech category, placental location and amniotic fluid index. We performed descriptive and analytical statistics for each variable and binary logistic regression. External cephalic version was successful in 46.9% of cases (97/207). None of the included variables was associated with the outcome of external cephalic version attempts after adjustment for confounding factors. We present a success rate similar to what has been previously described in the literature. However, in contrast to previous authors, we could not associate any of the analysed variables with the success of the external cephalic version attempt. We believe this discrepancy is partly related to the type of statistical analysis performed. Even though numerous prognostic factors for success in external cephalic version have been identified, care must be taken when counselling and selecting patients for this procedure. The data obtained suggest that external cephalic version should continue to be offered to all eligible patients regardless of prognostic factors for success.

  10. 49 CFR 173.23 - Previously authorized packaging.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Previously authorized packaging. 173.23 Section... REQUIREMENTS FOR SHIPMENTS AND PACKAGINGS Preparation of Hazardous Materials for Transportation § 173.23 Previously authorized packaging. (a) When the regulations specify a packaging with a specification marking...

  11. 28 CFR 10.5 - Incorporation of papers previously filed.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Incorporation of papers previously filed... CARRYING ON ACTIVITIES WITHIN THE UNITED STATES Registration Statement § 10.5 Incorporation of papers previously filed. Papers and documents already filed with the Attorney General pursuant to the said act and...

  12. 75 FR 76056 - FEDERAL REGISTER CITATION OF PREVIOUS ANNOUNCEMENT:

    Science.gov (United States)

    2010-12-07

    ... SECURITIES AND EXCHANGE COMMISSION Sunshine Act Meeting FEDERAL REGISTER CITATION OF PREVIOUS ANNOUNCEMENT: STATUS: Closed meeting. PLACE: 100 F Street, NE., Washington, DC. DATE AND TIME OF PREVIOUSLY ANNOUNCED MEETING: Thursday, December 9, 2010 at 2 p.m. CHANGE IN THE MEETING: Time change. The closed...

  13. No discrimination against previous mates in a sexually cannibalistic spider

    Science.gov (United States)

    Fromhage, Lutz; Schneider, Jutta M.

    2005-09-01

    In several animal species, females discriminate against previous mates in subsequent mating decisions, increasing the potential for multiple paternity. In spiders, female choice may take the form of selective sexual cannibalism, which has been shown to bias paternity in favor of particular males. If cannibalistic attacks function to restrict a male's paternity, females may have little interest to remate with males having survived such an attack. We therefore studied the possibility of female discrimination against previous mates in sexually cannibalistic Argiope bruennichi, where females almost always attack their mate at the onset of copulation. We compared mating latency and copulation duration of males having experienced a previous copulation either with the same or with a different female, but found no evidence for discrimination against previous mates. However, males copulated significantly shorter when inserting into a used, compared to a previously unused, genital pore of the female.

  14. Implant breast reconstruction after salvage mastectomy in previously irradiated patients.

    Science.gov (United States)

    Persichetti, Paolo; Cagli, Barbara; Simone, Pierfranco; Cogliandro, Annalisa; Fortunato, Lucio; Altomare, Vittorio; Trodella, Lucio

    2009-04-01

    The most common surgical approach in case of local tumor recurrence after quadrantectomy and radiotherapy is salvage mastectomy. Breast reconstruction is the subsequent phase of the treatment and the plastic surgeon has to operate on previously irradiated and manipulated tissues. The medical literature highlights that breast reconstruction with tissue expanders is not a pursuable option, considering previous radiotherapy a contraindication. The purpose of this retrospective study is to evaluate the influence of previous radiotherapy on 2-stage breast reconstruction (tissue expander/implant). Only patients with analogous timing of radiation therapy and the same demolitive and reconstructive procedures were recruited. The results of this study prove that, after salvage mastectomy in previously irradiated patients, implant reconstruction is still possible. Further comparative studies are, of course, advisable to draw any conclusion on the possibility to perform implant reconstruction in previously irradiated patients.

  15. 7th International Workshop on Natural Computing

    CERN Document Server

    Hagiya, Masami

    2015-01-01

    This book highlights recent advances in natural computing, including biology and its theory, bio-inspired computing, computational aesthetics, computational models and theories, computing with natural media, philosophy of natural computing and educational technology. It presents extended versions of the best papers selected from the symposium “7th International Workshop on Natural Computing” (IWNC7), held in Tokyo, Japan, in 2013. The target audience is not limited to researchers working in natural computing but also those active in biological engineering, fine/media art design, aesthetics and philosophy.

  16. 8th International Workshop on Natural Computing

    CERN Document Server

    Hagiya, Masami

    2016-01-01

    This book highlights recent advances in natural computing, including biology and its theory, bio-inspired computing, computational aesthetics, computational models and theories, computing with natural media, philosophy of natural computing, and educational technology. It presents extended versions of the best papers selected from the “8th International Workshop on Natural Computing” (IWNC8), a symposium held in Hiroshima, Japan, in 2014. The target audience is not limited to researchers working in natural computing but also includes those active in biological engineering, fine/media art design, aesthetics, and philosophy.

  17. Comparison of capability between two versions of reactor transient diagnosis expert system 'DISKET' programmed in different languages

    International Nuclear Information System (INIS)

    Yokobayashi, Masao; Yoshida, Kazuo

    1991-01-01

    An expert system, DISKET, has been developed at JAERI to apply knowledge engineering techniques to the transient diagnosis of nuclear power plants. The first version of DISKET, programmed in UTILISP, was developed on the mainframe computer FACOM M-780 at JAERI. The LISP language is not suitable for on-line diagnostic systems because it is highly dependent on the computer used and requires a large amount of memory. A large mainframe computer is also unsuitable because a multi-user computer system imposes various restrictions. The second version of DISKET, intended for practical use, has been developed in FORTRAN to realize on-line real-time diagnoses with limited computer resources. The two versions of DISKET, with the same knowledge base, have been compared in running capability, and it has been found that the LISP version of DISKET needs more than twice the memory and CPU time of the FORTRAN version. This result shows that this approach is a practical one for developing expert systems for on-line real-time diagnosis of transients with limited computer resources. (author)

  18. Computer graphics from basic to application

    International Nuclear Information System (INIS)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-01

    This book covers the conception of computer graphics, its background history and necessity, and applied fields such as construction design, image processing, automobile design, fashion design, and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress, including an introduction, applications, and operating environment; 3D graphics, with a summary and the differences between versions of 3D Studio and their systems; and AutoCAD applications.

  19. Computer graphics from basic to application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-15

    This book covers the conception of computer graphics, its background history and necessity, and applied fields such as construction design, image processing, automobile design, fashion design, and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress, including an introduction, applications, and operating environment; 3D graphics, with a summary and the differences between versions of 3D Studio and their systems; and AutoCAD applications.

  20. Radioimmunoassay data processing program for IBM PC computers

    International Nuclear Information System (INIS)

    1989-06-01

    The Medical Applications Section of the International Atomic Energy Agency (IAEA) has previously developed several programs for the Hewlett-Packard HP-41C programmable calculator to facilitate better quality control in radioimmunoassay through improved data processing. The program described in this document is designed for off-line analysis on an IBM PC (or compatible) of counting data from standards and unknown specimens (i.e. counting data previously recorded by a counter), together with internal quality control (IQC) data both within and between batches. The greater computing power of the IBM PC has made it possible to generate the imprecision profile and IQC control curves that were unavailable in the HP-41C version. The program is intended to make good data processing capability available to laboratories with limited financial resources and serious quality control problems. 3 refs

  1. Large-scale computer-mediated training for management teachers

    Directory of Open Access Journals (Sweden)

    Gilly Salmon

    1997-01-01

    In 1995/6 the Open University Business School (OUBS) trained 187 tutors in the UK and continental Western Europe in Computer Mediated Conferencing (CMC) for management education. The medium chosen for the training was FirstClassTM. In 1996/7 the OUBS trained a further 106 tutors in FirstClassTM using an improved version of the previous year's training. The online training was based on a previously developed model of online learning. The model was tested both through the structure of the training programme and through the improvements made. The training programme was evaluated and revised for the second cohort, and a comparison was made between the two training programmes.

  2. Aortic pseudoaneurysm detected on external jugular venous distention following a Bentall procedure 10 years previously.

    Science.gov (United States)

    Fukunaga, Naoto; Shomura, Yu; Nasu, Michihiro; Okada, Yukikatsu

    2010-11-01

    An asymptomatic 49-year-old woman was admitted for surgery for an aortic pseudoaneurysm. She had Marfan syndrome and had undergone an emergency Bentall procedure 10 years previously. About six months before admission, she had noticed that her bilateral external jugular veins became distended, but only in the supine position and without any other symptoms. Enhanced computed tomography revealed an aortic pseudoaneurysm originating from the previous distal anastomosis site. During induction of general anesthesia in the supine position, the bilateral external jugular venous distention was marked. Immediately after a successful operation, the distention completely resolved. The present case emphasizes the importance of physical examination in diagnosing asymptomatic life-threatening disease in patients with a history of previous aortic surgery.

  3. NDL-v2.0: A new version of the numerical differentiation library for parallel architectures

    Science.gov (United States)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Voglis, C.; Papageorgiou, D. G.; Lagaris, I. E.

    2014-07-01

    We present a new version of the numerical differentiation library (NDL) used for the numerical estimation of first and second order partial derivatives of a function by finite differencing. In this version we have restructured the serial implementation of the code so as to achieve optimal task-based parallelization. The pure shared-memory parallelization of the library has been based on the lightweight OpenMP tasking model allowing for the full extraction of the available parallelism and efficient scheduling of multiple concurrent library calls. On multicore clusters, parallelism is exploited by means of TORC, an MPI-based multi-threaded tasking library. The new MPI implementation of NDL provides optimal performance in terms of function calls and, furthermore, supports asynchronous execution of multiple library calls within legacy MPI programs. In addition, a Python interface has been implemented for all cases, exporting the functionality of our library to sequential Python codes. Catalog identifier: AEDG_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEDG_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 63036 No. of bytes in distributed program, including test data, etc.: 801872 Distribution format: tar.gz Programming language: ANSI Fortran-77, ANSI C, Python. Computer: Distributed systems (clusters), shared memory systems. Operating system: Linux, Unix. Has the code been vectorized or parallelized?: Yes. RAM: The library uses O(N) internal storage, N being the dimension of the problem. It can use up to O(N2) internal storage for Hessian calculations, if a task throttling factor has not been set by the user. Classification: 4.9, 4.14, 6.5. Catalog identifier of previous version: AEDG_v1_0 Journal reference of previous version: Comput. Phys. Comm. 180
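    NDL estimates partial derivatives by finite differencing, with every function evaluation independent of the others, which is what makes the task-based parallelization described above natural. A minimal serial sketch of the underlying central-difference gradient (the test function and step size are illustrative; NDL's actual interface differs):

```python
import numpy as np

def grad_central(f, x, h=1e-6):
    """First-order partials by central differencing, O(h^2) accurate:
    df/dx_i ~ (f(x + h*e_i) - f(x - h*e_i)) / (2h).
    Each of the 2N evaluations of f is independent, so in a parallel
    version each one can be dispatched as a separate task."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

f = lambda x: x[0] ** 2 + 3 * x[1]      # analytic gradient: (2*x0, 3)
print(grad_central(f, [1.0, 2.0]))      # approximately [2. 3.]
```

    A Hessian by the same scheme needs O(N^2) evaluations, which is why the library caps internal storage with a task throttling factor.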

  4. The ALICE Magnetic System Computation.

    CERN Document Server

    Klempt, W; CERN. Geneva; Swoboda, Detlef

    1995-01-01

    In this note we present the first results from the ALICE magnetic system computation, performed in three dimensions with the Vector Fields TOSCA code (version 6.5) [1]. For the calculations we used the IBM RISC System 6000-370 and 6000-550 machines combined in the CERN PaRC UNIX cluster.

  5. The National Energy Audit (NEAT) Engineering Manual (Version 6)

    Energy Technology Data Exchange (ETDEWEB)

    Gettings, M.B.

    2001-04-20

    Government-funded weatherization assistance programs resulted from increased oil prices caused by the 1973 oil embargo. These programs were instituted to reduce US consumption of oil and help low-income families afford the increasing cost of heating their homes. In the summer of 1988, Oak Ridge National Laboratory (ORNL) began providing technical support to the Department of Energy (DOE) Weatherization Assistance Program (WAP). A preliminary study found no suitable means of cost-effectively selecting energy efficiency improvements (measures) for single-family homes that incorporated all the factors seen as beneficial in improving cost-effectiveness and usability. In mid-1989, ORNL was authorized to begin development of a computer-based measure selection technique. In November 1992, a draft version of the program was made available to all WAP state directors for testing. The first production release, Version 4.3, was made available in October 1993. The Department of Energy's Weatherization Assistance Program has continued to fund improvements to the program, increasing its user-friendliness and applicability. Initial publication of this engineering manual coincides with the availability of Version 6.1 (November 1997), though the algorithms described generally apply to all prior versions. Periodic updates of specific sections will keep the manual relevant. This Engineering Manual delineates the assumptions used by NEAT in arriving at the measure recommendations based on the user's input of the building characteristics. Details of the actual data entry are available in the NEAT User's Manual (ORNL/Sub/91-SK078/1) and will not be discussed in this manual.

  6. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  7. Computer modelling of the WWER fuel elements under high burnup conditions by the computer codes PIN-W and RODQ2D

    International Nuclear Information System (INIS)

    Valach, M.; Zymak, J.; Svoboda, R.

    1997-01-01

    This paper presents the development status of computer codes for modelling the thermomechanical behavior of WWER fuel elements under high-burnup conditions at the Nuclear Research Institute Rez. The emphasis is placed on the analysis of results from parametric calculations performed with the programmes PIN-W and RODQ2D rather than on their detailed theoretical description. Several new optional correlations for the UO2 thermal conductivity, including the degradation effect caused by burnup, were implemented in both codes. Examples of the calculations performed document the differences between the previous and new versions of both programmes. Some recommendations for further development of the codes are given in the conclusion. (author). 6 refs, 9 figs

  8. Computer modelling of the WWER fuel elements under high burnup conditions by the computer codes PIN-W and RODQ2D

    Energy Technology Data Exchange (ETDEWEB)

    Valach, M; Zymak, J; Svoboda, R [Nuclear Research Inst. Rez plc, Rez (Czech Republic)

    1997-08-01

    This paper presents the development status of computer codes for modelling the thermomechanical behavior of WWER fuel elements under high-burnup conditions at the Nuclear Research Institute Rez. The emphasis is placed on the analysis of results from parametric calculations performed with the programmes PIN-W and RODQ2D rather than on their detailed theoretical description. Several new optional correlations for the UO2 thermal conductivity, including the degradation effect caused by burnup, were implemented in both codes. Examples of the calculations performed document the differences between the previous and new versions of both programmes. Some recommendations for further development of the codes are given in the conclusion. (author). 6 refs, 9 figs.

  9. Integrated Global Radiosonde Archive (IGRA) Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Integrated Global Radiosonde Archive (IGRA) Version 2 consists of quality-controlled radiosonde observations of temperature, humidity, and wind at stations across...

  10. Integrated Procurement Management System, Version II

    Science.gov (United States)

    Collier, L. J.

    1985-01-01

    Integrated Procurement Management System, Version II (IPMS II) is an online/batch system for collecting, developing, managing, and disseminating procurement-related data at NASA Johnson Space Center. Portions of IPMS II are adaptable to other procurement situations.

  11. Personality disorders in previously detained adolescent females: a prospective study

    NARCIS (Netherlands)

    Krabbendam, A.; Colins, O.F.; Doreleijers, T.A.H.; van der Molen, E.; Beekman, A.T.F.; Vermeiren, R.R.J.M.

    2015-01-01

    This longitudinal study investigated the predictive value of trauma and mental health problems for the development of antisocial personality disorder (ASPD) and borderline personality disorder (BPD) in previously detained women. The participants were 229 detained adolescent females who were assessed

  12. Payload specialist Reinhard Furrer show evidence of previous blood sampling

    Science.gov (United States)

    1985-01-01

    Payload specialist Reinhard Furrer shows evidence of previous blood sampling while Wubbo J. Ockels, Dutch payload specialist (only partially visible), extends his right arm after a sample has been taken. Both men show bruises on their arms.

  13. Choice of contraception after previous operative delivery at a family ...

    African Journals Online (AJOL)

    Choice of contraception after previous operative delivery at a family planning clinic in Northern Nigeria. Amina Mohammed‑Durosinlorun, Joel Adze, Stephen Bature, Caleb Mohammed, Matthew Taingson, Amina Abubakar, Austin Ojabo, Lydia Airede ...

  14. Previous utilization of service does not improve timely booking in ...

    African Journals Online (AJOL)

    Previous utilization of service does not improve timely booking in antenatal care: Cross sectional study ... Journal Home > Vol 24, No 3 (2010) > ... Results: Past experience on antenatal care service utilization did not come out as a predictor for ...

  15. TPDWR2: thermal power determination for Westinghouse reactors, Version 2. User's guide

    International Nuclear Information System (INIS)

    Kaczynski, G.M.; Woodruff, R.W.

    1985-12-01

    TPDWR2 is a computer program developed to determine the amount of thermal power generated by any Westinghouse nuclear power plant. From system conditions, TPDWR2 calculates the enthalpies of water and steam and the power transferred to or from various components in the reactor coolant system and to or from the chemical and volume control system. From these results, and assuming that the reactor core is operating at constant power and at thermal equilibrium, TPDWR2 calculates the thermal power generated by the reactor core. TPDWR2 runs on IBM PC and XT computers when IBM Personal Computer DOS, Version 2.00 or 2.10, and IBM Personal Computer BASIC, Version D2.00 or D2.10, are stored on the same diskette as TPDWR2.
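    The calorimetric balance the abstract describes reduces to summing mass flow times enthalpy rise across the steam generators and correcting for other heat additions and removals. A back-of-the-envelope sketch of that balance (all flow rates, enthalpies, and correction terms below are illustrative values, not steam-table data or TPDWR2's actual algorithm):

```python
def core_thermal_power(sg_flows_kg_s, h_steam_kj_kg, h_feed_kj_kg,
                       pump_heat_mw=0.0, net_losses_mw=0.0):
    """Core thermal power [MW] from a secondary-side energy balance:
    sum over steam generators of m_dot * (h_steam - h_feed), in kW,
    minus heat added by pumps and plus net losses from the coolant system."""
    q_sg_mw = sum(m * (h_steam_kj_kg - h_feed_kj_kg) for m in sg_flows_kg_s) / 1000.0
    return q_sg_mw - pump_heat_mw + net_losses_mw

# Four steam generators at an illustrative 470 kg/s each:
p = core_thermal_power([470.0] * 4,
                       h_steam_kj_kg=2770.0, h_feed_kj_kg=990.0,
                       pump_heat_mw=15.0, net_losses_mw=2.0)
print(round(p))  # 3333 (MW thermal) for these illustrative inputs
```

    In the real program the enthalpies themselves come from measured pressures and temperatures via steam-table correlations, which is the bulk of the computation.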

  16. A previous hamstring injury affects kicking mechanics in soccer players.

    Science.gov (United States)

    Navandar, Archit; Veiga, Santiago; Torres, Gonzalo; Chorro, David; Navarro, Enrique

    2018-01-10

    Although the kicking skill is influenced by limb dominance and sex, how a previous hamstring injury affects kicking has not been studied in detail. Thus, the objective of this study was to evaluate the effect of sex and limb dominance on kicking in limbs with and without a previous hamstring injury. 45 professional players (males: n=19, previously injured players=4, age=21.16 ± 2.00 years; females: n=19, previously injured players=10, age=22.15 ± 4.50 years) performed 5 kicks each with their preferred and non-preferred limb at a target 7 m away, which were recorded with a three-dimensional motion capture system. Kinematic and kinetic variables were extracted for the backswing, leg cocking, leg acceleration and follow through phases. A shorter backswing (20.20 ± 3.49% vs 25.64 ± 4.57%), and differences in knee flexion angle (58 ± 10° vs 72 ± 14°) and hip flexion velocity (8 ± 0 rad/s vs 10 ± 2 rad/s) were observed in previously injured, non-preferred limb kicks for females. A lower peak hip linear velocity (3.50 ± 0.84 m/s vs 4.10 ± 0.45 m/s) was observed in previously injured, preferred limb kicks of females. These differences occurred in the backswing and leg-cocking phases, where the hamstring muscles are most active. A variation in the functioning of the hamstring muscles, and of the gluteus maximus and iliopsoas, in the case of a previous injury could account for the differences observed in the kicking pattern. Therefore, the effects of a previous hamstring injury must be considered when designing rehabilitation programs to re-educate the kicking movement.

  17. Fetomaternal hemorrhage during external cephalic version.

    Science.gov (United States)

    Boucher, Marc; Marquette, Gerald P; Varin, Jocelyne; Champagne, Josette; Bujold, Emmanuel

    2008-07-01

    To estimate the frequency and volume of fetomaternal hemorrhage during external cephalic version for term breech singleton fetuses and to identify risk factors involved with this complication. A prospective observational study was performed including all patients undergoing a trial of external cephalic version for a breech presentation of at least 36 weeks of gestation between 1987 and 2001 in our center. A search for fetal erythrocytes using the standard Kleihauer-Betke test was obtained before and after each external cephalic version. The frequency and volume of fetomaternal hemorrhage were calculated. Putative risk factors for fetomaternal hemorrhage were evaluated by chi-square test and Mann-Whitney U test. A Kleihauer-Betke test result was available before and after 1,311 trials of external cephalic version. The Kleihauer-Betke test was positive in 67 (5.1%) before the procedure. Of the 1,244 women with a negative Kleihauer-Betke test before external cephalic version, 30 (2.4%) had a positive Kleihauer-Betke test after the procedure. Ten (0.8%) had an estimated fetomaternal hemorrhage greater than 1 mL, and one (0.08%) had an estimated fetomaternal hemorrhage greater than 30 mL. The risk of fetomaternal hemorrhage was not influenced by parity, gestational age, body mass index, number of attempts at version, placental location, or amniotic fluid index. The risk of detectable fetomaternal hemorrhage during external cephalic version was 2.4%, with fetomaternal hemorrhage more than 30 mL in less than 0.1% of cases. These data suggest that the performance of a Kleihauer-Betke test is unwarranted in uneventful external cephalic version and that in Rh-negative women, no further Rh immune globulin is necessary other than the routine 300-microgram dose at 28 weeks of gestation and postpartum. II.

  18. Anesthetic management of external cephalic version.

    Science.gov (United States)

    Chalifoux, Laurie A; Sullivan, John T

    2013-09-01

    Breech presentation is common at term and its reduction through external cephalic version represents a noninvasive opportunity to avoid cesarean delivery and the associated maternal morbidity. In addition to uterine relaxants, neuraxial anesthesia is associated with increased success of version procedures when surgical anesthetic dosing is used. The intervention is likely cost effective given the effect size and the avoided high costs of cesarean delivery. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. SHUFFLE. Windows 95/98/2000 version

    International Nuclear Information System (INIS)

    Slavic, S.; Zefran, B.

    2000-01-01

    The program package SHUFFLE was developed to help the user during fuel loading and unloading operations at a nuclear power plant. The first version, developed in 1992, was written in the CLIPPER programming language and ran under the DOS operating system. Since the DOS environment exhibits several drawbacks regarding code portability and flexibility, the recent SHUFFLE version has been ported to run under the MS Windows operating system. (author)

  20. Ecodesign Directive version 2.0

    DEFF Research Database (Denmark)

    This report presents the main findings of the project Ecodesign Directive version 2.0 - from Energy Efficiency to Resource Efficiency. The project was financed by the Danish Environmental Protection Agency and ran from December 2012 to June 2014.

  1. Cubical version of combinatorial differential forms

    DEFF Research Database (Denmark)

    Kock, Anders

    2010-01-01

    The theory of combinatorial differential forms is usually presented in simplicial terms. We present here a cubical version; it depends on the possibility of forming affine combinations of mutual neighbour points in a manifold, in the context of synthetic differential geometry.

  2. A kernel version of multivariate alteration detection

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2013-01-01

    Based on the established methods kernel canonical correlation analysis and multivariate alteration detection, we introduce a kernel version of multivariate alteration detection. A case study with SPOT HRV data shows that the kMAD variates focus on extreme change observations.

  3. Solution of the Skyrme-Hartree-Fock-Bogolyubov equations in the Cartesian deformed harmonic-oscillator basis. (VII) HFODD (v2.49t): A new version of the program

    International Nuclear Information System (INIS)

    Schunck, Nicolas F.; McDonnell, J.; Sheikh, J.A.; Staszczak, A.; Stoitsov, Mario; Dobaczewski, J.; Toivanen, P.

    2012-01-01

    We describe the new version (v2.49t) of the code HFODD which solves the nuclear Skyrme Hartree-Fock (HF) or Skyrme Hartree-Fock-Bogolyubov (HFB) problem by using the Cartesian deformed harmonic-oscillator basis. In the new version, we have implemented the following physics features: (i) the isospin mixing and projection, (ii) the finite temperature formalism for the HFB and HF+BCS methods, (iii) the Lipkin translational energy correction method, (iv) the calculation of the shell correction. A number of specific numerical methods have also been implemented in order to deal with large-scale multi-constraint calculations and hardware limitations: (i) the two-basis method for the HFB method, (ii) the Augmented Lagrangian Method (ALM) for multi-constraint calculations, (iii) the linear constraint method based on the approximation of the RPA matrix for multi-constraint calculations, (iv) an interface with the axial and parity-conserving Skyrme-HFB code HFBTHO, (v) the mixing of the HF or HFB matrix elements instead of the HF fields. Special care has been paid to using the code on massively parallel leadership-class computers. For this purpose, the following features are now available in this version: (i) the Message Passing Interface (MPI) framework, (ii) scalable input data routines, (iii) multi-threading via OpenMP pragmas, (iv) parallel diagonalization of the HFB matrix in the simplex-breaking case using the ScaLAPACK library. Finally, several minor errors of the previously published version have been corrected.

  4. A Lego version of ATLAS

    CERN Multimedia

    Laëtitia Pedroso

    2010-01-01

    There's nothing very unusual about a small child making simple objects out of Lego. But wouldn't you be surprised to learn that one six-year old has just made a life-like model of the ATLAS detector?   Bastian with his Lego ATLAS detector. © Photo provided by Kai Nicklas, Bastian's father. It all began a month ago when the boy's father was watching a video about the construction of the ATLAS detector on the Internet. He hadn't noticed that his son was watching it over his shoulder. The small boy was fascinated by what he was seeing on the computer screen and his first reaction was to exclaim: "Wow! That's a terrific machine! I think the people who built it must be really clever." The detector must have really fired his imagination because, after asking his father a few questions, he decided to make a Lego model of it. Look at the photo and you will see how closely the model he produced resembles the actual ATLAS detector. Is the little boy in question, Bastia...

  5. Validation of the Turkish Version of the Cognitive Test Anxiety Scale–Revised

    Directory of Open Access Journals (Sweden)

    Sati Bozkurt

    2017-01-01

    The current study explored the psychometric properties of the newly designed Turkish version of the Cognitive Test Anxiety Scale–Revised (CTAR). Results of an exploratory factor analysis revealed a unidimensional structure consistent with the conceptualized nature of cognitive test anxiety and with previous examinations of the English version of the CTAR. Examination of the factor loadings revealed two items that were weakly related to the test anxiety construct and were thus prime candidates for removal. Confirmatory factor analyses were conducted to compare model fit for the 25- and 23-item versions of the measure. Results indicated that the 23-item version provided a better fit to the data, supporting removal of the problematic items from the Turkish version of the CTAR. Additional analyses demonstrated the internal consistency, test–retest reliability, concurrent validity, and gender equivalence of responses on the Turkish version of the measure. Overall, the 23-item Turkish version (T-CTAR) is a valid and reliable measure of cognitive test anxiety for use among Turkish students.
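    Internal consistency of the kind reported above is conventionally quantified with Cronbach's alpha. A minimal sketch of that computation, using made-up item responses rather than the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) score matrix.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)      # variance of summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses from 4 students to 3 perfectly consistent items:
scores = [[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]
print(cronbach_alpha(scores))  # 1.0, since the items are perfectly correlated
```

    Real scales yield values below 1; a common rule of thumb treats alpha above about 0.7 as acceptable internal consistency.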

  6. External cephalic version for breech presentation at term

    International Nuclear Information System (INIS)

    Rauf, B.; Hassan, L.

    2007-01-01

    To assess the success rate of external cephalic version (ECV) at term and its effects on measures of pregnancy outcome, a total of 40 patients were offered ECV over a period of fourteen months. All singleton breech presentations with an otherwise normal antenatal course between 36-41 weeks of gestation were included in the study. Exclusion criteria were the contraindications to ECV: multiple pregnancy, oligohydramnios, growth retardation, antepartum hemorrhage, rupture of membranes, toxemias of pregnancy, non-reassuring fetal monitoring pattern, previous uterine scar, bad obstetric history, any contraindication to vaginal delivery, labour, and patient wishes after thorough counseling. The overall success rate of the procedure and its effect on maternal and fetal outcome were determined; significance was tested using the chi-square test. The overall success rate was 67.5%, with only 30% of successes in primigravidae (p<0.05); multigravidae showed a higher success rate of 80%. Following successful ECV, spontaneous vaginal delivery was attained in 77.7% (n=21), while caesarean section was performed for various indications in 6 cases (p<0.05). Following failed version, 61.5% (n=8) had elective caesarean section and only 5 delivered vaginally. The route of delivery did not affect the perinatal outcome except for congenital abnormalities. Following successful ECV, there was only one stillbirth; overall live births associated with successful version were 96.2% (p<0.05), while in failed versions there were no fetal deaths. ECV at term appears to be a useful procedure to reduce the number and the associated complications of term breech presentations. It is safe for the mother and the fetus and helps to avoid a significant number of caesarean sections. (author)

  7. Total hip arthroplasty after a previous pelvic osteotomy: A systematic review and meta-analysis.

    Science.gov (United States)

    Shigemura, T; Yamamoto, Y; Murata, Y; Sato, T; Tsuchiya, R; Wada, Y

    2018-06-01

    There are several reports regarding total hip arthroplasty (THA) after a previous pelvic osteotomy (PO). However, to our knowledge, no formal systematic review and meta-analysis has been published to summarize the clinical results of THA after a previous PO. We therefore conducted a systematic review and meta-analysis, focusing on the following question: does a previous PO affect the results of subsequent THA, such as clinical outcomes, operative time, operative blood loss, and radiological parameters? Using PubMed, Web of Science, and the Cochrane Library, we searched for relevant original papers. The pooling of data was performed using RevMan software (version 5.3, Cochrane Collaboration, Oxford, UK). A p-value <0.05 was considered statistically significant. When I²>50%, significant heterogeneity was assumed and a random-effects model was applied for the meta-analysis; a fixed-effects model was applied in the absence of significant heterogeneity. Eleven studies were included in this meta-analysis. The pooled results indicated no significant difference in postoperative Merle D'Aubigne-Postel score (I²=0%, SMD=-0.15, 95% CI: -0.36 to 0.06, p=0.17), postoperative Harris hip score (I²=60%, SMD=-0.23, 95% CI: -0.50 to 0.05, p=0.10), operative time (I²=86%, SMD=0.37, 95% CI: -0.09 to 0.82, p=0.11), operative blood loss (I²=82%, SMD=0.23, 95% CI: -0.17 to 0.63, p=0.25), or cup abduction angle (I²=43%, SMD=-0.08, 95% CI: -0.25 to 0.09, p=0.38) between THA with and without a previous PO. However, the cup anteversion angle of THA with a previous PO was significantly smaller than that without a previous PO (I²=77%, SMD=-0.63, 95% CI: -1.13 to -0.13, p=0.01). A previous PO did not affect the results of subsequent THA, except for cup anteversion. Because of the low quality of the evidence currently available, high-quality randomized controlled trials are required.
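    The fixed- versus random-effects decision described in this abstract hinges on Higgins' I² heterogeneity statistic. A minimal sketch of inverse-variance pooling and I², using three hypothetical studies rather than the review's data (RevMan's actual computations include further refinements):

```python
import math

def pool_fixed(effects, ses):
    """Inverse-variance fixed-effect pooling: weight each study's
    standardized mean difference by 1/SE^2. Returns (estimate, 95% CI)."""
    w = [1.0 / s ** 2 for s in ses]
    est = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return est, (est - 1.96 * se, est + 1.96 * se)

def i_squared(effects, ses):
    """Higgins' I^2 from Cochran's Q: the percentage of variability
    across studies attributable to heterogeneity rather than chance."""
    w = [1.0 / s ** 2 for s in ses]
    est = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - est) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    return 0.0 if q <= df else 100.0 * (q - df) / q

# Three hypothetical studies: (SMD, standard error)
effects, ses = [-0.20, -0.10, -0.15], [0.10, 0.12, 0.11]
est, ci = pool_fixed(effects, ses)
print(round(est, 3), round(i_squared(effects, ses), 1))
```

    When I² exceeds the chosen threshold (50% in this review), the random-effects model additionally widens the weights by a between-study variance term, so its confidence interval is never narrower than the fixed-effect one.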

  8. Secondary recurrent miscarriage is associated with previous male birth.

    LENUS (Irish Health Repository)

    Ooi, Poh Veh

    2012-01-31

    Secondary recurrent miscarriage (RM) is defined as three or more consecutive pregnancy losses after delivery of a viable infant. Previous reports suggest that a firstborn male child is associated with less favourable subsequent reproductive potential, possibly due to maternal immunisation against male-specific minor histocompatibility antigens. In a retrospective cohort study of 85 cases of secondary RM we aimed to determine if secondary RM was associated with (i) gender of previous child, maternal age, or duration of miscarriage history, and (ii) increased risk of pregnancy complications. Fifty-three women (62.0%; 53/85) gave birth to a male child prior to RM compared to 32 (38.0%; 32/85) who gave birth to a female child (p=0.002). The majority (91.7%; 78/85) had uncomplicated, term deliveries and normal birth weight neonates, with one quarter of the women previously delivered by Caesarean section. All had routine RM investigations and 19.0% (16/85) had an abnormal result. Fifty-seven women conceived again and 33.3% (19/57) miscarried, but there was no significant difference in failure rates between those with a previous male or female child (13/32 vs. 6/25, p=0.2). When patients with abnormal results were excluded, or when women with only one previous child were considered, there was still no difference in these rates. A previous male birth may be associated with an increased risk of secondary RM but numbers preclude concluding whether this increases recurrence risk. The suggested association with previous male birth provides a basis for further investigations at a molecular level.

  10. SAGE FOR WINDOWS (WSAGE) VERSION 1.0 SOLVENT ALTERNATIVES GUIDE - USER'S GUIDE

    Science.gov (United States)

    The guide provides instructions for using the Solvent Alternatives Guide (SAGE) for Windows, version 1.0. The guide assumes that the user is familiar with the fundamentals of operating Windows 3.1 (or higher) on a personal computer under the DOS 5.0 (or higher) operating system. ...

  11. SAGE FOR MACINTOSH (MSAGE) VERSION 1.0 SOLVENT ALTERNATIVES GUIDE - USER'S GUIDE

    Science.gov (United States)

    The guide provides instructions for using the Solvent Alternatives Guide (SAGE) for Macintosh, version 1.0. The guide assumes that the user is familiar with the fundamentals of operating a Macintosh personal computer under the System 7.0 (or higher) operating system. SAGE for ...

  12. Standard interface files and procedures for reactor physics codes. Version IV

    International Nuclear Information System (INIS)

    O'Dell, R.D.

    1977-09-01

    Standards, procedures, and recommendations of the Committee on Computer Code Coordination for promoting the exchange of reactor physics codes are updated to Version IV status. Standards and procedures covering general programming, program structure, standard interface files, and file management and handling subroutines are included.

  13. Corporations' Resistance to Innovation: The Adoption of the Internet Protocol Version 6

    Science.gov (United States)

    Pazdrowski, Tomasz

    2013-01-01

    Computer networks that brought unprecedented growth in global communication have been using Internet Protocol version 4 (IPv4) as a standard for routing. The exponential increase in the use of the networks caused an acute shortage of available identification numbers (IP addresses). The shortage and other network communication issues are…

  14. SHABERTH - ANALYSIS OF A SHAFT BEARING SYSTEM (CRAY VERSION)

    Science.gov (United States)

    Coe, H. H.

    1994-01-01

    shear forces in the inlet zone of lubricated contacts, which accounts for the degree of lubricant film starvation; modeling normal and friction forces between a ball and a cage pocket, which account for the transition between the hydrodynamic and elastohydrodynamic regimes of lubrication; and a model of the effect on fatigue life of the ratio of the EHD plateau film thickness to the composite surface roughness. SHABERTH is intended to be as general as possible. The models in SHABERTH allow for the complete mathematical simulation of real physical systems. Systems are limited to a maximum of five bearings supporting the shaft, a maximum of thirty rolling elements per bearing, and a maximum of one hundred temperature nodes. The SHABERTH program structure is modular and has been designed to permit refinement and replacement of various component models as the need and opportunities develop. A preprocessor is included in the IBM PC version of SHABERTH to provide a user friendly means of developing SHABERTH models and executing the resulting code. The preprocessor allows the user to create and modify data files with minimal effort and a reduced chance for errors. Data is utilized as it is entered; the preprocessor then decides what additional data is required to complete the model. Only this required information is requested. The preprocessor can accommodate data input for any SHABERTH compatible shaft bearing system model. The system may include ball bearings, roller bearings, and/or tapered roller bearings. SHABERTH is written in FORTRAN 77, and two machine versions are available from COSMIC. The CRAY version (LEW-14860) has a RAM requirement of 176K of 64 bit words. The IBM PC version (MFS-28818) is written for IBM PC series and compatible computers running MS-DOS, and includes a sample MS-DOS executable. For execution, the PC version requires at least 1Mb of RAM and an 80386 or 486 processor machine with an 80x87 math co-processor. 
The standard distribution medium for the

  15. TOUGH2 User's Guide Version 2

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Oldenburg, C.M.; Moridis, G.J.

    1999-11-01

    TOUGH2 is a numerical simulator for nonisothermal flows of multicomponent, multiphase fluids in one, two, and three-dimensional porous and fractured media. The chief applications for which TOUGH2 is designed are in geothermal reservoir engineering, nuclear waste disposal, environmental assessment and remediation, and unsaturated and saturated zone hydrology. TOUGH2 was first released to the public in 1991; the 1991 code was updated in 1994 when a set of preconditioned conjugate gradient solvers was added to allow a more efficient solution of large problems. The current Version 2.0 features several new fluid property modules and offers enhanced process modeling capabilities, such as coupled reservoir-wellbore flow, precipitation and dissolution effects, and multiphase diffusion. Numerous improvements in previously released modules have been made and new user features have been added, such as enhanced linear equation solvers, and writing of graphics files. The T2VOC module for three-phase flows of water, air and a volatile organic chemical (VOC), and the T2DM module for hydrodynamic dispersion in 2-D flow systems have been integrated into the overall structure of the code and are included in the Version 2.0 package. Data inputs are upwardly compatible with the previous version. Coding changes were generally kept to a minimum, and were only made as needed to achieve the additional functionalities desired. TOUGH2 is written in standard FORTRAN77 and can be run on any platform, such as workstations, PCs, Macintosh, mainframe and supercomputers, for which appropriate FORTRAN compilers are available. This report is a self-contained guide to application of TOUGH2 to subsurface flow problems. It gives a technical description of the TOUGH2 code, including a discussion of the physical processes modeled, and the mathematical and numerical methods used. Illustrative sample problems are presented along with detailed instructions for preparing input data.

  16. PR-EDB: Power Reactor Embrittlement Data Base, version 1: Program description

    International Nuclear Information System (INIS)

    Stallmann, F.W.; Kam, F.B.K.; Taylor, B.J.

    1990-06-01

    Data concerning radiation embrittlement of pressure vessel steels in commercial power reactors have been collected from available surveillance reports. The purpose of this NRC-sponsored program is to provide the technical bases for voluntary consensus standards, regulatory guides, standard review plans, and codes. The data can also be used for the exploration and verification of embrittlement prediction models. The data files are given in dBASE 3 Plus format and can be accessed with any personal computer using the DOS operating system. Menu-driven software is provided for easy access to the data, including curve fitting and plotting facilities. This software has drastically reduced the time and effort for data processing and evaluation compared to previous data bases. The current compilation of the Power Reactor Embrittlement Data Base (PR-EDB, version 1) contains results from surveillance capsule reports of 78 reactors with 381 data points from 110 different irradiated base materials (plates and forgings) and 161 data points from 79 different welds. Results from heat-affected-zone materials are also listed. Electric Power Research Institute (EPRI), reactor vendors, and utilities are in the process of providing back-up quality assurance checks of the PR-EDB and will be supplementing the data base with additional data and documentation. 2 figs., 28 tabs

  17. AUS98 - The 1998 version of the AUS modular neutronic code system

    International Nuclear Information System (INIS)

    Robinson, G.S.; Harrington, B.V.

    1998-07-01

    AUS is a neutronics code system which may be used for calculations of a wide range of fission reactors, fusion blankets and other neutron applications. The present version, AUS98, has a nuclear cross section library based on ENDF/B-VI and includes modules which provide for reactor lattice calculations, one-dimensional transport calculations, multi-dimensional diffusion calculations, cell and whole reactor burnup calculations, and flexible editing of results. Calculations of multi-region resonance shielding, coupled neutron and photon transport, energy deposition, fission product inventory and neutron diffusion are combined within the one code system. The major changes from the previous AUS publications are the inclusion of a cross-section library based on ENDF/B-VI, the addition of the MICBURN module for controlling whole reactor burnup calculations, and changes to the system as a consequence of moving from IBM main-frame computers to UNIX workstations. This report gives details of all system aspects of AUS and all modules except the POW3D multi-dimensional diffusion module.

  19. HALE UAS Concept of Operations. Version 3.0

    Science.gov (United States)

    2006-01-01

    This document is a system-level Concept of Operations (CONOPS) from the perspective of future High Altitude Long Endurance (HALE) Unmanned Aircraft Systems (UAS) service providers and National Airspace System (NAS) users. It describes current systems (existing UAS), describes HALE UAS functions and operations to be performed (via sample missions), and offers insight into the user's environment (i.e., the UAS as a system of systems). It is intended to be a source document for NAS UAS operational requirements, and provides a construct for government agencies to use in guiding their regulatory decisions, architecture requirements, and investment strategies. Although it does not describe the technical capabilities of a specific HALE UAS system (which do, and will, vary widely), it is intended to aid in requirements capture and to be used as input to the functional requirements and analysis process. The document provides a basis for development of functional requirements and operational guidelines to achieve unrestricted access into the NAS. This document is an FY06 update to the FY05 Access 5 Project-approved Concept of Operations document previously published in the Public Domain on the Access 5 open website. This version is also recommended for approval for public release. The updates are a reorganization of materials from the previous version, with the addition of an updated set of operational requirements, inclusion of sample mission scenarios, and identification of roles and responsibilities of interfaces within flight phases.

  20. USGS Spectral Library Version 7

    Science.gov (United States)

    Kokaly, Raymond F.; Clark, Roger N.; Swayze, Gregg A.; Livo, K. Eric; Hoefen, Todd M.; Pearson, Neil C.; Wise, Richard A.; Benzel, William M.; Lowers, Heather A.; Driscoll, Rhonda L.; Klein, Anna J.

    2017-04-10

    We have assembled a library of spectra measured with laboratory, field, and airborne spectrometers. The instruments used cover wavelengths from the ultraviolet to the far infrared (0.2 to 200 microns [μm]). Laboratory samples of specific minerals, plants, chemical compounds, and manmade materials were measured. In many cases, samples were purified, so that unique spectral features of a material can be related to its chemical structure. These spectro-chemical links are important for interpreting remotely sensed data collected in the field or from an aircraft or spacecraft. This library also contains physically constructed as well as mathematically computed mixtures. Four different spectrometer types were used to measure spectra in the library: (1) Beckman™ 5270 covering the spectral range 0.2 to 3 µm, (2) standard, high resolution (hi-res), and high-resolution Next Generation (hi-resNG) models of Analytical Spectral Devices (ASD) field portable spectrometers covering the range from 0.35 to 2.5 µm, (3) Nicolet™ Fourier Transform Infra-Red (FTIR) interferometer spectrometers covering the range from about 1.12 to 216 µm, and (4) the NASA Airborne Visible/Infra-Red Imaging Spectrometer AVIRIS, covering the range 0.37 to 2.5 µm. Measurements of rocks, soils, and natural mixtures of minerals were made in laboratory and field settings. Spectra of plant components and vegetation plots, comprising many plant types and species with varying backgrounds, are also in this library. Measurements by airborne spectrometers are included for forested vegetation plots, in which the trees are too tall for measurement by a field spectrometer. This report describes the instruments used, the organization of materials into chapters, metadata descriptions of spectra and samples, and possible artifacts in the spectral measurements. To facilitate greater application of the spectra, the library has also been convolved to selected spectrometer and imaging spectrometers sampling and
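The convolution of a finely sampled laboratory spectrum to an instrument's band passes, mentioned at the end of the abstract, is commonly done by weighting the spectrum with a spectral response function at each band center. A minimal sketch under a Gaussian-response assumption (the band centers and FWHM below are illustrative, not the library's actual values):

```python
import math

def convolve_to_bands(wavelengths, values, band_centers, fwhm):
    """Resample a finely sampled spectrum onto instrument bands using a
    Gaussian spectral response of the given full width at half maximum."""
    # Convert FWHM to the Gaussian standard deviation.
    sigma = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    out = []
    for center in band_centers:
        w = [math.exp(-0.5 * ((x - center) / sigma) ** 2) for x in wavelengths]
        total = sum(w)
        # Weighted average of the spectrum under this band's response.
        out.append(sum(wi * v for wi, v in zip(w, values)) / total)
    return out

# Hypothetical flat reflectance spectrum, 0.400-0.599 um at 1 nm sampling:
wl = [0.400 + 0.001 * i for i in range(200)]
bands = convolve_to_bands(wl, [1.0] * 200, [0.45, 0.55], fwhm=0.01)
```

A flat spectrum stays flat after convolution, which is a convenient sanity check for any band-resampling code.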

  1. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  2. Erlotinib-induced rash spares previously irradiated skin

    International Nuclear Information System (INIS)

    Lips, Irene M.; Vonk, Ernest J.A.; Koster, Mariska E.Y.; Houwing, Ronald H.

    2011-01-01

    Erlotinib is an epidermal growth factor receptor inhibitor prescribed to patients with locally advanced or metastasized non-small cell lung carcinoma after failure of at least one earlier chemotherapy treatment. Approximately 75% of the patients treated with erlotinib develop acneiform skin rashes. A patient treated with erlotinib 3 months after finishing concomitant treatment with chemotherapy and radiotherapy for non-small cell lung cancer is presented. Unexpectedly, the part of the skin that had been included in his previous radiotherapy field was completely spared from the erlotinib-induced acneiform skin rash. The exact mechanism of erlotinib-induced rash sparing in previously irradiated skin is unclear. The underlying mechanism of this phenomenon needs to be explored further, because the number of patients being treated with a combination of both therapeutic modalities is increasing. The therapeutic effect of erlotinib in the area of the previously irradiated lesion should be assessed. (orig.)

  3. Reasoning with Previous Decisions: Beyond the Doctrine of Precedent

    DEFF Research Database (Denmark)

    Komárek, Jan

    2013-01-01

    Practices of reasoning with previous decisions outside the Common law tradition may not follow the ‘case law method’, but they are no less rational and intellectually sophisticated. The reason for the rather conceited attitude of some comparatists is in the dominance of the common law paradigm of precedent and the accompanying ‘case law method’. If we want to understand how courts and lawyers in different jurisdictions use previous judicial decisions in their argument, we need to move beyond the concept of precedent to a wider notion, which would embrace practices and theories in legal systems outside the Common law tradition. This article presents the concept of ‘reasoning with previous decisions’ as such an alternative and develops its basic models. The article first points out several shortcomings inherent in limiting the inquiry into reasoning with previous decisions by the common law paradigm (1). On the basis of numerous examples provided in section (1), I will present two basic models of reasoning with previous decisions.

  4. [Prevalence of previously diagnosed diabetes mellitus in Mexico].

    Science.gov (United States)

    Rojas-Martínez, Rosalba; Basto-Abreu, Ana; Aguilar-Salinas, Carlos A; Zárate-Rojas, Emiliano; Villalpando, Salvador; Barrientos-Gutiérrez, Tonatiuh

    2018-01-01

    To compare the prevalence of previously diagnosed diabetes in 2016 with previous national surveys and to describe treatment and its complications. Mexico's national surveys Ensa 2000, Ensanut 2006, 2012 and 2016 were used. For 2016, logistic regression models and measures of central tendency and dispersion were obtained. The prevalence of previously diagnosed diabetes in 2016 was 9.4%. The increase of 2.2% relative to 2012 was not significant and was only observed in patients older than 60 years. While preventive measures have increased, access to medical treatment and lifestyle have not changed. The treatment has been modified, with an increase in insulin and a decrease in hypoglycaemic agents. Population aging, lack of screening actions and the increase in diabetes complications will lead to an increase in the burden of disease. Policy measures targeting primary and secondary prevention of diabetes are crucial.

  5. Cardiovascular magnetic resonance in adults with previous cardiovascular surgery.

    Science.gov (United States)

    von Knobelsdorff-Brenkenhoff, Florian; Trauzeddel, Ralf Felix; Schulz-Menger, Jeanette

    2014-03-01

    Cardiovascular magnetic resonance (CMR) is a versatile non-invasive imaging modality that serves a broad spectrum of indications in clinical cardiology and has proven evidence. Most of the numerous applications are appropriate in patients with previous cardiovascular surgery in the same manner as in non-surgical subjects. However, some specifics have to be considered. This review article is intended to provide information about the application of CMR in adults with previous cardiovascular surgery. In particular, the two main scenarios, i.e. following coronary artery bypass surgery and following heart valve surgery, are highlighted. Furthermore, several pictorial descriptions of other potential indications for CMR after cardiovascular surgery are given.

  6. Psychometric properties of the Hebrew short version of the Zimbardo Time Perspective Inventory.

    Science.gov (United States)

    Orkibi, Hod

    2015-06-01

    The purpose of this study was to develop a short Hebrew version of the Zimbardo Time Perspective Inventory that can be easily administered by health professionals in research, therapy, and counseling. First, the empirical links of time perspective (TP) to subjective well-being and health protective and health risk behaviors are reviewed. Then, a brief account of the instrument's previous modifications is provided. Results of confirmatory factor analysis (N = 572) verified the five-factor structure of the short version and yielded acceptable internal consistency reliability for each factor. The correlation coefficients between the five subscales of the short (20 items) and the original (56 items) instruments were all above .79, indicating the suitability of the short version for assessing the five TP factors. Support for the discriminant and concurrent validity was also achieved, largely in agreement with previous findings. Finally, limitations and future directions are addressed, and potential applications in therapy and counseling are offered. © The Author(s) 2014.
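The agreement check between the short-form and full-form subscale scores (all correlation coefficients above .79) is a plain Pearson correlation between paired scores. A self-contained sketch, with hypothetical subscale scores rather than the study's data:

```python
def pearson_r(x, y):
    """Pearson correlation between paired subscale scores."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical short-form vs. full-form scores for one TP subscale:
short_form = [2.1, 3.4, 4.0, 2.8, 3.1]
full_form = [2.3, 3.2, 4.1, 2.6, 3.3]
r = pearson_r(short_form, full_form)
```

An r near 1 for each subscale is what justifies substituting the 20-item short form for the 56-item original.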

  7. Third-Order Transport with MAD Input: A Computer Program for Designing Charged Particle Beam Transport Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Karl

    1998-10-28

    TRANSPORT has been in existence in various evolutionary versions since 1963. The present version of TRANSPORT is a first-, second-, and third-order matrix multiplication computer program intended for the design of static-magnetic beam transport systems.

  8. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  9. Is Contralateral Templating Reliable for Establishing Rotational Alignment During Intramedullary Stabilization of Femoral Shaft Fractures? A Study of Individual Bilateral Differences in Femoral Version.

    Science.gov (United States)

    Croom, William P; Lorenzana, Daniel J; Auran, Richard L; Cavallero, Matthew J; Heckmann, Nathanael; Lee, Jackson; White, Eric A

    2018-02-01

    To determine native individual bilateral differences (IBDs) in femoral version in a diverse population. Computed tomography scans with complete imaging of uninjured bilateral femora were used to determine femoral version and IBDs in version. Age, sex, and ethnicity of each subject were also collected. Femoral version and IBDs in version were correlated with demographic variables using univariate and multivariate regression models. One hundred sixty-four subjects were included in the study. The average femoral version was 9.4 degrees (±9.4 degrees). The mean IBD in femoral version was 5.4 degrees (±4.4 degrees, P < […]) […] alignment during intramedullary stabilization of diaphyseal femur fractures. This is also an important consideration when assessing malrotation of femur fractures, because most studies define malrotation as a greater than 10-15-degree difference compared with the contralateral side. Prognostic Level IV. See Instructions for Authors for a complete description of levels of evidence.
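Since most studies define malrotation relative to the contralateral side, the practical question raised by the abstract is how often native side-to-side differences already exceed that definition. A minimal sketch of the computation, using hypothetical version angles in degrees:

```python
def bilateral_difference_rate(version_a, version_b, threshold=10.0):
    """Fraction of subjects whose native bilateral difference in femoral
    version exceeds `threshold` degrees, plus the per-subject differences."""
    diffs = [abs(a - b) for a, b in zip(version_a, version_b)]
    exceeded = sum(d > threshold for d in diffs)
    return exceeded / len(diffs), diffs

# Hypothetical paired left/right femoral version measurements:
rate, diffs = bilateral_difference_rate([9, 15, 2, 12], [3, 14, 20, 11])
```

Any nonzero rate at the 10-degree threshold illustrates the abstract's caution: a contralateral template can differ from the native side by more than the malrotation definition itself.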

  10. Squamous cell carcinoma arising in previously burned or irradiated skin

    International Nuclear Information System (INIS)

    Edwards, M.J.; Hirsch, R.M.; Broadwater, J.R.; Netscher, D.T.; Ames, F.C.

    1989-01-01

    Squamous cell carcinoma (SCC) arising in previously burned or irradiated skin was reviewed in 66 patients treated between 1944 and 1986. Healing of the initial injury was complicated in 70% of patients. Mean interval from initial injury to diagnosis of SCC was 37 years. The overwhelming majority of patients presented with a chronic intractable ulcer in previously injured skin. The regional relapse rate after surgical excision was very high (58% of all patients). Predominant patterns of recurrence were in local skin and regional lymph nodes (93% of recurrences). Survival rates at 5, 10, and 20 years were 52%, 34%, and 23%, respectively. Five-year survival rates in previously burned and irradiated patients were not significantly different (53% and 50%, respectively). This review, one of the largest reported series, better defines SCC arising in previously burned or irradiated skin as a locally aggressive disease that is distinct from SCC arising in sunlight-damaged skin. An increased awareness of the significance of chronic ulceration in scar tissue may allow earlier diagnosis. Regional disease control and survival depend on surgical resection of all known disease and may require radical lymph node dissection or amputation.

  11. Outcome Of Pregnancy Following A Previous Lower Segment ...

    African Journals Online (AJOL)

    Background: A previous caesarean section is an important variable that influences patient management in subsequent pregnancies. A trial of vaginal delivery in such patients is a feasible alternative to a secondary section, thus helping to reduce the caesarean section rate and its associated co-morbidities. Objective: To ...

  12. 24 CFR 1710.552 - Previously accepted state filings.

    Science.gov (United States)

    2010-04-01

    ... of Substantially Equivalent State Law § 1710.552 Previously accepted state filings. (a) Materials... and contracts or agreements contain notice of purchaser's revocation rights. In addition see § 1715.15..., unless the developer is obligated to do so in the contract. (b) If any such filing becomes inactive or...

  13. The job satisfaction of principals of previously disadvantaged schools

    African Journals Online (AJOL)

    The aim of this study was to identify influences on the job satisfaction of previously disadvantaged ..... I am still riding the cloud … I hope it lasts. .... as a way of creating a climate and culture in schools where individuals are willing to explore.

  14. Haemophilus influenzae type f meningitis in a previously healthy boy

    DEFF Research Database (Denmark)

    Ronit, Andreas; Berg, Ronan M G; Bruunsgaard, Helle

    2013-01-01

    Non-serotype b strains of Haemophilus influenzae are extremely rare causes of acute bacterial meningitis in immunocompetent individuals. We report a case of acute bacterial meningitis in a 14-year-old boy, who was previously healthy and had been immunised against H influenzae serotype b (Hib...

  15. Research Note Effects of previous cultivation on regeneration of ...

    African Journals Online (AJOL)

    We investigated the effects of previous cultivation on regeneration potential under miombo woodlands in a resettlement area, a spatial product of Zimbabwe's land reforms. We predicted that cultivation would affect population structure, regeneration, recruitment and potential grazing capacity of rangelands. Plant attributes ...

  16. Cryptococcal meningitis in a previously healthy child | Chimowa ...

    African Journals Online (AJOL)

    An 8-year-old previously healthy female presented with a 3 weeks history of headache, neck stiffness, deafness, fever and vomiting and was diagnosed with cryptococcal meningitis. She had documented hearing loss and was referred to tertiary-level care after treatment with fluconazole did not improve her neurological ...

  17. Investigation of previously derived Hyades, Coma, and M67 reddenings

    International Nuclear Information System (INIS)

    Taylor, B.J.

    1980-01-01

    New Hyades polarimetry and field star photometry have been obtained to check the Hyades reddening, which was found to be nonzero in a previous paper. The new Hyades polarimetry implies essentially zero reddening; this is also true of polarimetry published by Behr (which was incorrectly interpreted in the previous paper). Four photometric techniques which are presumed to be insensitive to blanketing are used to compare the Hyades to nearby field stars; these four techniques also yield essentially zero reddening. When all of these results are combined with others which the author has previously published and a simultaneous solution for the Hyades, Coma, and M67 reddenings is made, the results are E(B-V) = 3 ± 2, -1 ± 3, and 46 ± 6 mmag (1σ errors), respectively. No support for a nonzero Hyades reddening is offered by the new results. When the newly obtained reddenings for the Hyades, Coma, and M67 are compared with results from techniques given by Crawford and by users of the David Dunlap Observatory photometric system, no differences between the new and other reddenings are found which are larger than about 2σ. The author had previously found that the M67 main-sequence stars have about the same blanketing as that of Coma and less blanketing than the Hyades; this conclusion is essentially unchanged by the revised reddenings.

  18. Rapid fish stock depletion in previously unexploited seamounts: the ...

    African Journals Online (AJOL)

    Rapid fish stock depletion in previously unexploited seamounts: the case of Beryx splendens from the Sierra Leone Rise (Gulf of Guinea) ... A spectral analysis and red-noise spectra procedure (REDFIT) algorithm was used to identify the red-noise spectrum from the gaps in the observed time-series of catch per unit effort by ...

  19. 18 CFR 154.302 - Previously submitted material.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Previously submitted material. 154.302 Section 154.302 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... concurrently with the rate change filing. There must be furnished to the Director, Office of Energy Market...

  20. Process cells dismantling of EUREX plant: previous activities

    International Nuclear Information System (INIS)

    Gili, M.

    1998-01-01

    In the '98-'99 period some process cells of the EUREX plant will be dismantled, in order to make room for the liquid wastes conditioning plant 'CORA'. This report summarizes the previous activities (plant rinsing campaigns and inactive Cell 014 dismantling) carried out in the past three years and the experience gained.

  1. The job satisfaction of principals of previously disadvantaged schools

    African Journals Online (AJOL)

    The aim of this study was to identify influences on the job satisfaction of previously disadvantaged school principals in North-West Province. Evans's theory of job satisfaction, morale and motivation was useful as a conceptual framework. A mixed-methods explanatory research design was important in discovering issues with ...

  2. Obstructive pulmonary disease in patients with previous tuberculosis ...

    African Journals Online (AJOL)

    Obstructive pulmonary disease in patients with previous tuberculosis: Pathophysiology of a community-based cohort. B.W. Allwood, R Gillespie, M Galperin-Aizenberg, M Bateman, H Olckers, L Taborda-Barata, G.L. Calligaro, Q Said-Hartley, R van Zyl-Smit, C.B. Cooper, E van Rikxoort, J Goldin, N Beyers, E.D. Bateman ...

  3. Abiraterone in metastatic prostate cancer without previous chemotherapy

    NARCIS (Netherlands)

    Ryan, Charles J.; Smith, Matthew R.; de Bono, Johann S.; Molina, Arturo; Logothetis, Christopher J.; de Souza, Paul; Fizazi, Karim; Mainwaring, Paul; Piulats, Josep M.; Ng, Siobhan; Carles, Joan; Mulders, Peter F. A.; Basch, Ethan; Small, Eric J.; Saad, Fred; Schrijvers, Dirk; van Poppel, Hendrik; Mukherjee, Som D.; Suttmann, Henrik; Gerritsen, Winald R.; Flaig, Thomas W.; George, Daniel J.; Yu, Evan Y.; Efstathiou, Eleni; Pantuck, Allan; Winquist, Eric; Higano, Celestia S.; Taplin, Mary-Ellen; Park, Youn; Kheoh, Thian; Griffin, Thomas; Scher, Howard I.; Rathkopf, Dana E.; Boyce, A.; Costello, A.; Davis, I.; Ganju, V.; Horvath, L.; Lynch, R.; Marx, G.; Parnis, F.; Shapiro, J.; Singhal, N.; Slancar, M.; van Hazel, G.; Wong, S.; Yip, D.; Carpentier, P.; Luyten, D.; de Reijke, T.

    2013-01-01

    Abiraterone acetate, an androgen biosynthesis inhibitor, improves overall survival in patients with metastatic castration-resistant prostate cancer after chemotherapy. We evaluated this agent in patients who had not received previous chemotherapy. In this double-blind study, we randomly assigned

  4. Response to health insurance by previously uninsured rural children.

    Science.gov (United States)

    Tilford, J M; Robbins, J M; Shema, S J; Farmer, F L

    1999-08-01

    To examine the healthcare utilization and costs of previously uninsured rural children. Four years of claims data from a school-based health insurance program located in the Mississippi Delta. All children who were not Medicaid-eligible or were uninsured were eligible for limited benefits under the program. The 1987 National Medical Expenditure Survey (NMES) was used to compare utilization of services. The study represents a natural experiment in the provision of insurance benefits to a previously uninsured population. Premiums for the claims cost were set with little or no information on expected use of services. Claims from the insurer were used to form a panel data set. Mixed model logistic and linear regressions were estimated to determine the response to insurance for several categories of health services. The use of services increased over time and approached the level of utilization in the NMES. Conditional medical expenditures also increased over time. Actuarial estimates of claims cost greatly exceeded actual claims cost. The provision of a limited medical, dental, and optical benefit package cost approximately $20-$24 per member per month in claims paid. An important uncertainty in providing health insurance to previously uninsured populations is whether a pent-up demand exists for health services. Evidence of a pent-up demand for medical services was not supported in this study of rural school-age children. States considering partnerships with private insurers to implement the State Children's Health Insurance Program could lower premium costs by assembling basic data on previously uninsured children.

  5. London SPAN version 4 parameter file format

    International Nuclear Information System (INIS)

    2004-06-01

    Powernext SA is a Multilateral Trading Facility in charge of managing the French power exchange through an optional and anonymous organised trading system. Powernext SA collaborates with the clearing organization LCH.Clearnet SA to secure and facilitate the transactions. The French Standard Portfolio Analysis of Risk (SPAN) is a system used by LCH.Clearnet to calculate the initial margins from and for its clearing members. SPAN is a computerized system which calculates the impact of several possible variations of rates and volatility on derivatives portfolios. The initial margin call is equal to the maximum probable loss calculated by the system. This document contains details of the format of the London SPAN version 4 parameter file. This file contains all the parameters and risk arrays required to calculate SPAN margins. London SPAN Version 4 is an upgrade from Version 3, which is also known as LME SPAN. This document contains the full revised file specification, highlighting the changes from Version 3 to Version 4.
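    The core SPAN computation described above (a risk array of scenario losses per contract, with the margin set to the maximum probable loss across scenarios) can be sketched as follows. The 16-scenario risk array and position size are illustrative assumptions, not actual LCH.Clearnet parameters.

    ```python
    # Minimal sketch of a SPAN-style "scanning risk" calculation: the margin
    # for a single-contract portfolio is the worst loss over a grid of
    # price/volatility scenarios. Positive risk-array values are losses.
    def scanning_risk(position_qty, scenario_losses_per_contract):
        """Worst probable loss over all scenarios; never below zero."""
        portfolio_losses = [position_qty * loss
                            for loss in scenario_losses_per_contract]
        return max(max(portfolio_losses), 0.0)

    # 16 hypothetical risk-array values (loss per contract under each
    # combined price/volatility shift) -- illustrative numbers only.
    risk_array = [-120.0, -60.0, 0.0, 55.0, 110.0, 160.0, -90.0, -30.0,
                  30.0, 85.0, 140.0, 190.0, 25.0, -25.0, 75.0, -75.0]

    margin = scanning_risk(position_qty=10,
                           scenario_losses_per_contract=risk_array)
    # margin is 10 * 190.0 = 1900.0 for this illustrative array
    ```

    A real SPAN file adds inter-month spread charges, inter-commodity credits, and short-option minimums on top of this scanning risk; the parameter file documented here supplies those inputs.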

  6. Reoperative sentinel lymph node biopsy after previous mastectomy.

    Science.gov (United States)

    Karam, Amer; Stempel, Michelle; Cody, Hiram S; Port, Elisa R

    2008-10-01

    Sentinel lymph node (SLN) biopsy is the standard of care for axillary staging in breast cancer, but many clinical scenarios questioning the validity of SLN biopsy remain. Here we describe our experience with reoperative-SLN (re-SLN) biopsy after previous mastectomy. Review of the SLN database from September 1996 to December 2007 yielded 20 procedures done in the setting of previous mastectomy. SLN biopsy was performed using radioisotope with or without blue dye injection superior to the mastectomy incision, in the skin flap in all patients. In 17 of 20 patients (85%), re-SLN biopsy was performed for local or regional recurrence after mastectomy. Re-SLN biopsy was successful in 13 of 20 patients (65%) after previous mastectomy. Of the 13 patients, 2 had positive re-SLN, and completion axillary dissection was performed, with 1 having additional positive nodes. In the 11 patients with negative re-SLN, 2 patients underwent completion axillary dissection demonstrating additional negative nodes. One patient with a negative re-SLN experienced chest wall recurrence combined with axillary recurrence 11 months after re-SLN biopsy. All others remained free of local or axillary recurrence. Re-SLN biopsy was unsuccessful in 7 of 20 patients (35%). In three of seven patients, axillary dissection was performed, yielding positive nodes in two of the three. The remaining four of seven patients all had previous modified radical mastectomy, so underwent no additional axillary surgery. In this small series, re-SLN was successful after previous mastectomy, and this procedure may play some role when axillary staging is warranted after mastectomy.

  7. IGT-Open: An open-source, computerized version of the Iowa Gambling Task.

    Science.gov (United States)

    Dancy, Christopher L; Ritter, Frank E

    2017-06-01

    The Iowa Gambling Task (IGT) is commonly used to understand the processes involved in decision-making. Though the task was originally run without a computer, using a computerized version of the task has become typical. These computerized versions of the IGT are useful, because they can make the task more standardized across studies and allow for the task to be used in environments where a physical version of the task may be difficult or impossible to use (e.g., while collecting brain imaging data). Though these computerized versions of the IGT have been useful for experimentation, having multiple software implementations of the task could present reliability issues. We present an open-source software version of the Iowa Gambling Task (called IGT-Open) that allows for millisecond visual presentation accuracy and is freely available to be used and modified. This software has been used to collect data from human subjects and also has been used to run model-based simulations with computational process models developed to run in the ACT-R architecture.
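    The task structure a computerized IGT implements can be sketched as four decks with distinct gain/loss schedules. The simplified probabilistic payoffs below are modeled loosely on the classic Bechara schedule; they are illustrative and not the exact card sequences used in IGT-Open.

    ```python
    import random

    # Sketch of an IGT payoff scheme: decks A/B pay large gains but carry
    # larger expected losses (disadvantageous); decks C/D pay small gains
    # with smaller expected losses (advantageous). Values are illustrative.
    DECKS = {
        "A": {"gain": 100, "loss": 250,  "p_loss": 0.5},   # disadvantageous
        "B": {"gain": 100, "loss": 1250, "p_loss": 0.1},   # disadvantageous
        "C": {"gain": 50,  "loss": 50,   "p_loss": 0.5},   # advantageous
        "D": {"gain": 50,  "loss": 250,  "p_loss": 0.1},   # advantageous
    }

    def draw(deck_name, rng=random):
        """Return the net payoff of one card drawn from the named deck."""
        deck = DECKS[deck_name]
        loss = deck["loss"] if rng.random() < deck["p_loss"] else 0
        return deck["gain"] - loss
    ```

    A trial loop would present the four decks, record the choice and its latency, apply `draw`, and update the on-screen balance; the millisecond presentation accuracy mentioned in the abstract concerns that display loop, which is omitted here.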

  8. Traduire Flaubert : Madame Bovary en version roumaine

    Directory of Open Access Journals (Sweden)

    Florica Courriol

    2012-02-01

    Full Text Available This paper is the result of a translational analysis aimed at re-evaluating a 1970s Romanian version of Madame Bovary (Doamna Bovary, translated by Demostene Botez) through concrete work on the text with Master's students. Since numerous anomalies were uncovered in this way, I was led to retranslate this great classic. It seemed profitable for Flaubert scholarship to set out the difficulties and their solutions, whether purely linguistic or cultural, at the level of reading (comprehension of the text) or of rendering into the target language. The choice of contested points is not limited to discourse alone but also touches on the problem of proper names (can they be translated? must one resort to the famous translator's notes?) and, as a logical consequence, on that of the title. Concrete examples support a translation practice and place it in relation to criticism, literary history, and Flaubert's work as a whole. I intend to address in this paper the concrete problems I encountered when translating Flaubert's major work, Madame Bovary, into Romanian. First of all, its nature as a retranslation has to be specified, since an existing version, published in the 1970s by a Romanian poet (Demostene Botez), was in fact in circulation when I started working on the original text. This previous translation was probably the result of a collective operation, as the split rhythm suggests at several points. In the meantime, other publishers saw fit to print additional versions in great haste. However, a mere examination of the problematic passages is sufficient to realize that the authors of the new versions merely took up the first one. They improved the text here and there, "updated" some expressions, while keeping the glaring errors of Botez's version.

  9. Risk factors for cesarean section and instrumental vaginal delivery after successful external cephalic version

    NARCIS (Netherlands)

    de Hundt, Marcella; Vlemmix, Floortje; Bais, Joke M. J.; de Groot, Christianne J.; Mol, Ben Willem; Kok, Marjolein

    2016-01-01

    The aim of this article is to examine whether we could identify factors that predict cesarean section and instrumental vaginal delivery in women who had a successful external cephalic version. We used data from a previous randomized trial among 25 hospitals and their referring midwife practices in the

  10. An update to the Surface Ocean CO2 Atlas (SOCAT version 2)

    Digital Repository Service at National Institute of Oceanography (India)

    Bakker, D.C.E.; Hankin, S.; Olsen, A; Pfeil, B.; Smith, K.; Alin, S.R.; Cosca, C.; Hales, B.; Harasawa, S.; Kozyr, A; Nojiri, Y.; OBrien, K.M.; Schuster, U.; Telszewski, M.; Tilbrook, B.; Wada, C.; Akl, J.; Barbero, L.; Bates, N.; Boutin, J.; Cai, W.J.; Castle, R.D.; Chavez, F.; Chen, L.; Chierici, M.; Currie, K.; Evans, W.; Feely, R.A; Fransson, A; Gao, Z.; Hardman-Mountford, N.; Hoppema, M.; Huang, W.J.; Hunt, C.W.; Huss, B.; Ichikawa, T.; Jacobson, A; Johannessen, T.; Jones, E.M.; Jones, S.; Sara, J.; Kitidis, V.; Kortzinger, A.; Lauvset, S.; Lefevre, N.; Manke, A.B.; Mathis, J.; Metzl, N.; Monteiro, P.; Murata, A.; Newberger, T.; Nobuo, T.; Ono, T.; Paterson, K.; Pierrot, D.; Rios, A.F.; Sabine, C.L.; Saito, S.; Salisbury, J.; Sarma, V.V.S.S.; Schlitzer, R.; Sieger, R.; Skjelvan, I.; Steinhoff, T.; Sullivan, K.; Sutherland, S.C.; Suzuki, T.; Sutton, A.; Sweeney, C.; Takahashi, T.; Tjiputra, J.; VanHeuven, S.; Vandemark, D.; Vlahos, P.; Wallace, D.W.R.; Wanninkhof, R.; Watson, A.J.

    of SOCAT is an update of the previous release (version 1) with more data (increased from 6.3 million to 10.1 million surface water fCO2 values) and extended data coverage (from 1968–2007 to 1968–2011). The quality control criteria, while...

  11. Factorial Structure of the French Version of the Rosenberg Self-Esteem Scale among the Elderly

    Science.gov (United States)

    Gana, Kamel; Alaphilippe, Daniel; Bailly, Nathalie

    2005-01-01

    Ten different confirmatory factor analysis models, including ones with correlated traits correlated methods, correlated traits correlated uniqueness, and correlated traits uncorrelated methods, were proposed to examine the factorial structure of the French version of the Rosenberg Self-Esteem Scale (Rosenberg, 1965). In line with previous studies…

  12. Milky Way Past Was More Turbulent Than Previously Known

    Science.gov (United States)

    2004-04-01

    Results of 1001 observing nights shed new light on our Galaxy [1] Summary A team of astronomers from Denmark, Switzerland and Sweden [2] has achieved a major breakthrough in our understanding of the Milky Way, the galaxy in which we live. After more than 1,000 nights of observations spread over 15 years, they have determined the spatial motions of more than 14,000 solar-like stars residing in the neighbourhood of the Sun. For the first time, the changing dynamics of the Milky Way since its birth can now be studied in detail and with a stellar sample sufficiently large to allow a sound analysis. The astronomers find that our home galaxy has led a much more turbulent and chaotic life than previously assumed. PR Photo 10a/04: Distribution on the sky of the observed stars. PR Photo 10b/04: Stars in the solar neighbourhood and the Milky Way galaxy (artist's view). PR Video Clip 04/04: The motions of the observed stars during the past 250 million years. Unknown history Home is the place we know best. But not so in the Milky Way - the galaxy in which we live. Our knowledge of our nearest stellar neighbours has long been seriously incomplete and - worse - skewed by prejudice concerning their behaviour. Stars were generally selected for observation because they were thought to be "interesting" in some sense, not because they were typical. This has resulted in a biased view of the evolution of our Galaxy. The Milky Way started out just after the Big Bang as one or more diffuse blobs of gas of almost pure hydrogen and helium. With time, it assembled into the flattened spiral galaxy which we inhabit today. Meanwhile, generation after generation of stars were formed, including our Sun some 4,700 million years ago. But how did all this really happen? Was it a rapid process? Was it violent or calm? When were all the heavier elements formed? How did the Milky Way change its composition and shape with time? Answers to these and many other questions are 'hot' topics for the

  13. A Comparative Investigation of the Previous and New Secondary History Curriculum: The Issues of the Definition of the Aims and Objectives and the Selection of Curriculum Content

    Science.gov (United States)

    Dinc, Erkan

    2011-01-01

    Discussions on history teaching in Turkey indicate that the previous versions of the history curriculum and the pedagogy of history in the country bear many problems and deficiencies. The problems of Turkish history curriculum mainly arise from the perspectives it takes and the selection of its content. Since 2003, there have been extensive…

  14. [Fatal amnioinfusion with previous choriocarcinoma in a parturient woman].

    Science.gov (United States)

    Hrgović, Z; Bukovic, D; Mrcela, M; Hrgović, I; Siebzehnrübl, E; Karelovic, D

    2004-04-01

    The case is described of a 36-year-old tertipara who had developed choriocarcinoma in a previous pregnancy. During her first term labour the patient suffered cardiac arrest, so resuscitation and cesarean section were performed. A male newborn was delivered in good condition, but despite intensive therapy and resuscitation the parturient died with a picture of disseminated intravascular coagulation (DIC). At autopsy and on histology there was no sign of malignant disease, so it was not possible to connect the previous choriocarcinoma with the amniotic fluid embolism. Possibly the site of the choriocarcinoma was a "locus minoris resistentiae" that later resulted in a failure of placentation, which was hard to prove. At autopsy we found pulmonary embolism with microthrombosis of the terminal circulation and punctiform bleeding in the mucosa, consistent with DIC.

  15. Challenging previous conceptions of vegetarianism and eating disorders.

    Science.gov (United States)

    Fisak, B; Peterson, R D; Tantleff-Dunn, S; Molnar, J M

    2006-12-01

    The purpose of this study was to replicate and expand upon previous research that has examined the potential association between vegetarianism and disordered eating. Limitations of previous research studies are addressed, including possible low reliability of measures of eating pathology within vegetarian samples, use of only a few dietary restraint measures, and a paucity of research examining potential differences in body image and food choice motives of vegetarians versus nonvegetarians. Two hundred and fifty-six college students completed a number of measures of eating pathology and body image, and a food choice motives questionnaire. Interestingly, no significant differences were found between vegetarians and nonvegetarians in measures of eating pathology or body image. However, significant differences in food choice motives were found. Implications for both researchers and clinicians are discussed.

  16. Previously unreported abnormalities in Wolfram Syndrome Type 2.

    Science.gov (United States)

    Akturk, Halis Kaan; Yasa, Seda

    2017-01-01

    Wolfram syndrome (WFS) is a rare autosomal recessive disease with non-autoimmune childhood-onset insulin-dependent diabetes and optic atrophy. WFS type 2 (WFS2) differs from WFS type 1 (WFS1) by upper intestinal ulcers, a bleeding tendency, and the lack of diabetes insipidus. Lifespan is short due to related comorbidities. Only a few families have been reported with this syndrome with the CISD2 mutation. Here we report two siblings with a clinical diagnosis of WFS2, previously misdiagnosed with type 1 diabetes mellitus and diabetic retinopathy-related blindness. We report possible additional clinical and laboratory findings that have not been previously reported, such as asymptomatic hypoparathyroidism, osteomalacia, growth hormone (GH) deficiency and hepatomegaly. Even though not currently a requirement for the diagnosis of WFS2, our case series confirms hypogonadotropic hypogonadism to be also a feature of this syndrome, as reported before. © Polish Society for Pediatric Endocrinology and Diabetology.

  17. A kernel version of spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2009-01-01

    Schölkopf et al. introduce kernel PCA. Shawe-Taylor and Cristianini is an excellent reference for kernel methods in general. Bishop and Press et al. describe kernel methods among many other subjects. Nielsen and Canty use kernel PCA to detect change in univariate airborne digital camera images. The kernel version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply kernel versions of PCA, maximum autocorrelation factor (MAF) analysis ...
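    The implicit transformation described in the abstract can be sketched as follows: a kernel matrix stands in for inner products in feature space, is centered there, and is eigendecomposed, which amounts to a linear PCA in that space. This is a generic kernel PCA sketch, not the author's code; the RBF kernel and the toy data are illustrative.

    ```python
    import numpy as np

    # Kernel PCA sketch with a Gaussian (RBF) kernel. The kernel matrix K
    # holds feature-space inner products; centering it corresponds to
    # centering in feature space, and its eigendecomposition gives the
    # principal components there.
    def rbf_kernel(X, sigma=1.0):
        sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq / (2.0 * sigma**2))

    def kernel_pca(X, n_components=2, sigma=1.0):
        n = X.shape[0]
        K = rbf_kernel(X, sigma)
        J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
        Kc = J @ K @ J                            # center in feature space
        eigvals, eigvecs = np.linalg.eigh(Kc)     # ascending eigenvalues
        idx = np.argsort(eigvals)[::-1][:n_components]
        # Normalize so projected components have unit feature-space norm.
        alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
        return Kc @ alphas                        # projections of the data

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(size=(20, 2)),
                   rng.normal(size=(20, 2)) + 5.0])
    Z = kernel_pca(X, n_components=2, sigma=2.0)
    ```

    MAF analysis can be kernelized in the same way, replacing the covariance matrices in the MAF eigenproblem with their kernel-matrix counterparts.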

  18. PCACE-Personal-Computer-Aided Cabling Engineering

    Science.gov (United States)

    Billitti, Joseph W.

    1987-01-01

    PCACE computer program developed to provide inexpensive, interactive system for learning and using engineering approach to interconnection systems. Basically database system that stores information as files of individual connectors and handles wiring information in circuit groups stored as records. Directly emulates typical manual engineering methods of handling data, thus making interface between user and program very natural. Apple version written in P-Code Pascal and IBM PC version of PCACE written in TURBO Pascal 3.0
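    The connector-file and circuit-record organization described above can be sketched as a small schema. The field names below are illustrative guesses, since the actual PCACE data layout is not given here.

    ```python
    from dataclasses import dataclass, field

    # Sketch of the record structure a PCACE-like cabling database implies:
    # connectors kept as individual files of pins, and wiring grouped into
    # circuit records. Field names are hypothetical placeholders.
    @dataclass
    class Pin:
        number: str
        signal: str

    @dataclass
    class Connector:
        name: str
        pins: list = field(default_factory=list)

    @dataclass
    class Circuit:
        group: str
        # each wire: (from_connector, from_pin, to_connector, to_pin)
        wires: list = field(default_factory=list)

    j1 = Connector("J1", [Pin("1", "PWR_28V"), Pin("2", "GND")])
    power = Circuit("power", [("J1", "1", "J2", "1"), ("J1", "2", "J2", "2")])
    ```

    Storing wiring per circuit group, rather than per wire, mirrors the manual engineering practice the abstract says the program emulates.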

  19. Poisson/Superfish codes for personal computers

    International Nuclear Information System (INIS)

    Humphries, S.

    1992-01-01

    The Poisson/Superfish codes calculate static E or B fields in two-dimensions and electromagnetic fields in resonant structures. New versions for 386/486 PCs and Macintosh computers have capabilities that exceed the mainframe versions. Notable improvements are interactive graphical post-processors, improved field calculation routines, and a new program for charged particle orbit tracking. (author). 4 refs., 1 tab., figs
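    The static 2-D field calculation that codes like Poisson perform can be sketched at its simplest as finite-difference relaxation of the potential on a grid. Grid size, source placement, and iteration count below are illustrative assumptions; the production codes use far more sophisticated solvers, meshes, and boundary handling.

    ```python
    import numpy as np

    # Jacobi relaxation for the 2-D Poisson equation, nabla^2 V = -rho,
    # with V = 0 on the boundary: each interior point is repeatedly set to
    # the average of its four neighbours plus the local source term.
    def solve_poisson(rho, n_iter=2000, h=1.0):
        V = np.zeros_like(rho)
        for _ in range(n_iter):
            V[1:-1, 1:-1] = 0.25 * (V[:-2, 1:-1] + V[2:, 1:-1] +
                                    V[1:-1, :-2] + V[1:-1, 2:] +
                                    h * h * rho[1:-1, 1:-1])
        return V

    rho = np.zeros((33, 33))
    rho[16, 16] = 1.0            # a point charge at the grid centre
    V = solve_poisson(rho)       # potential peaks at the source
    ```

    Field components then follow from finite differences of V; orbit tracking, as in the new program mentioned above, integrates particle motion through the resulting field map.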

  20. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. Selection of the terms is context sensitive: initial choices determine the subsequently presented sets of additional choices. This process automatically generates a report and feeds these features into a database ...
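    The context-sensitive term selection described here (initial choices determine the subsequently offered choices) can be sketched as a nested vocabulary walked choice by choice. The terms below are illustrative placeholders, not the actual SCORE lexicon.

    ```python
    # Sketch of context-sensitive term selection: the hierarchy is a nested
    # mapping, and the options offered next depend on the choices made so
    # far. Selections would feed both the generated report and a database.
    TERMS = {
        "background activity": {
            "posterior dominant rhythm": ["normal", "slow", "absent"],
            "asymmetry": ["none", "left", "right"],
        },
        "epileptiform activity": {
            "morphology": ["spike", "sharp wave", "spike-and-wave"],
            "location": ["frontal", "temporal", "occipital", "generalized"],
        },
    }

    def next_choices(selection_path):
        """Return the options available after the choices made so far."""
        node = TERMS
        for choice in selection_path:
            node = node[choice]
        return sorted(node) if isinstance(node, dict) else list(node)
    ```

    Because every entry is a controlled term rather than free text, the same selections that build the narrative report can be stored as structured database fields.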