WorldWideScience

Sample records for vhp computers include

  1. Dicty_cDB: VHP253 [Dicty_cDB]

    Lifescience Database Archive (English)

    Full Text Available. Clone VHP253, library VH, with links to the library, dictyBase, and the original site; contig Contig-U16349-1; sequence read VHP253Z (length 355); Atlas ID, NBRP ID, and dictyBase ID not assigned; sequence update 2002-09-10. (Truncated nucleotide and translated amino-acid sequence data omitted.)

  2. Dicty_cDB: VHP243 [Dicty_cDB]

    Lifescience Database Archive (English)

    Full Text Available. Clone VHP243, library VH, with links to the library, dictyBase, and the original site; contig Contig-U16236-1; sequence read VHP243F (length 134); Atlas ID, NBRP ID, and dictyBase ID not assigned; sequence update 2002-10-25. (Truncated nucleotide and translated amino-acid sequence data omitted.)

  3. Vaporized Hydrogen Peroxide (VHP) Decontamination of VX, GD, and HD

    National Research Council Canada - National Science Library

    Wagner, George W; Sorrick, David C; Procell, Lawrence R; Hess, Zoe A; Brickhouse, Mark D; McVey, Iain F; Schwartz, Lewis I

    2003-01-01

    Vaporized Hydrogen Peroxide (VHP) has been utilized for more than a decade to sterilize clean rooms and pharmaceutical processing equipment and, quite recently, to decontaminate anthrax-ridden buildings...

  4. Vaporized Hydrogen Peroxide (VHP) Decontamination of a Section of a Boeing 747 Cabin

    National Research Council Canada - National Science Library

    Shaffstall, Robert M; Garner, Robert P; Bishop, Joshua; Cameron-Landis, Lora; Eddington, Donald L; Hau, Gwen; Spera, Shawn; Mielnik, Thaddeus; Thomas, James A

    2006-01-01

    The use of STERIS Corporation's Vaporized Hydrogen Peroxide (VHP)* technology as a potential biocide for aircraft decontamination was demonstrated in a cabin section of the Aircraft Environment Research Facility...

  5. Large Scale Tests of Vaporous Hydrogen Peroxide (VHP®) for Chemical and Biological Weapons Decontamination

    National Research Council Canada - National Science Library

    Wagner, George; Procell, Larry; Sorrick, David; Maclver, Brian; Turetsky, Abe; Pfarr, Jerry; Dutt, Diane; Brickhouse, Mark

    2004-01-01

    Vaporous Hydrogen Peroxide (VHP) has been used for more than a decade to sterilize clean rooms and pharmaceutical processing equipment and, more recently, to decontaminate anthrax-contaminated buildings...

  6. VvVHP1;2 Is Transcriptionally Activated by VvMYBA1 and Promotes Anthocyanin Accumulation of Grape Berry Skins via Glucose Signal

    OpenAIRE

    Sun, Tianyu; Xu, Lili; Sun, Hong; Yue, Qianyu; Zhai, Heng; Yao, Yuxin

    2017-01-01

    In this work, four vacuolar H+-PPase (VHP) genes were identified in the grape genome. Among them, VvVHP1;2 was strongly expressed in berry skin, and its expression correlated strongly with the anthocyanin content of berry skin during berry ripening and under ABA and UVB treatments. VvVHP1;2 was transcriptionally activated directly by VvMYBA1, and VvVHP1;2 overexpression promoted anthocyanin accumulation in berry skins and Arabidopsis leaves; therefore, VvVHP1;2 mediated VvMYBA1-regulated...

  7. A design of a computer complex including vector processors

    International Nuclear Information System (INIS)

    Asai, Kiyoshi

    1982-12-01

    We, members of the Computing Center of the Japan Atomic Energy Research Institute (JAERI), have been engaged for the past six years in research on the adaptability of vector processing to large-scale nuclear codes. The research has been done in collaboration with researchers and engineers of JAERI and a computer manufacturer. In this research, forty large-scale nuclear codes were investigated from the viewpoint of vectorization. Among them, twenty-six codes were actually vectorized and executed. As a result of the investigation, it is now estimated that about seventy percent of nuclear codes, and about seventy percent of JAERI's total CPU time, are highly vectorizable. Based on the data obtained by the investigation, the report discusses (1) currently vectorizable CPU time, (2) the necessary number of vector processors, (3) the manpower needed to vectorize nuclear codes, (4) the computing speed, memory size, number of parallel I/O paths, and size and speed of the I/O buffer of a vector processor suitable for our applications, and (5) the software and operational policy needed for the use of vector processors; finally, (6) a computer complex including vector processors is presented. (author)
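
    The abstract's figure that roughly seventy percent of CPU time is highly vectorizable bounds the achievable gain via Amdahl's law. A minimal sketch (the 10x vector-unit speedup is an assumed illustration, not a number from the report):

```python
def vector_speedup(f_vec, s_vec):
    """Amdahl-style estimate: a fraction f_vec of the CPU time runs
    s_vec times faster on the vector unit; the rest stays scalar."""
    return 1.0 / ((1.0 - f_vec) + f_vec / s_vec)

# ~70% of total CPU time judged highly vectorizable (from the abstract);
# the 10x vector-unit factor is purely hypothetical.
print(round(vector_speedup(0.70, 10.0), 2))  # → 2.7
print(round(vector_speedup(0.70, 1e9), 2))   # → 3.33, the 1/0.3 ceiling
```

    Even an arbitrarily fast vector unit cannot push the overall speedup past 1/(1 - f_vec), which is why the vectorizable fraction of CPU time was the key quantity surveyed.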

  8. VvVHP1;2 Is Transcriptionally Activated by VvMYBA1 and Promotes Anthocyanin Accumulation of Grape Berry Skins via Glucose Signal.

    Science.gov (United States)

    Sun, Tianyu; Xu, Lili; Sun, Hong; Yue, Qianyu; Zhai, Heng; Yao, Yuxin

    2017-01-01

    In this work, four vacuolar H+-PPase (VHP) genes were identified in the grape genome. Among them, VvVHP1;2 was strongly expressed in berry skin, and its expression correlated strongly with the anthocyanin content of berry skin during berry ripening and under ABA and UVB treatments. VvVHP1;2 was transcriptionally activated directly by VvMYBA1, and VvVHP1;2 overexpression promoted anthocyanin accumulation in berry skins and Arabidopsis leaves; therefore, VvVHP1;2 mediated VvMYBA1-regulated berry pigmentation. On the other hand, RNA-Seq analysis of WT and transgenic berry skins revealed that carbohydrate metabolism, flavonoid metabolism and regulation, and solute carrier family expression were the most clearly altered biological processes. Further experiments elucidated that VvVHP1;2 overexpression up-regulated the expression of genes related to anthocyanin biosynthesis and transport via a hexokinase-mediated glucose signal and thereby promoted anthocyanin accumulation in berry skins and Arabidopsis leaves. Additionally, modifications of sugar status caused by enhanced hexokinase activities likely play a key role in VvVHP1;2-induced sugar signaling.

  9. VvVHP1;2 Is Transcriptionally Activated by VvMYBA1 and Promotes Anthocyanin Accumulation of Grape Berry Skins via Glucose Signal

    Directory of Open Access Journals (Sweden)

    Tianyu Sun

    2017-10-01

    Full Text Available In this work, four vacuolar H+-PPase (VHP) genes were identified in the grape genome. Among them, VvVHP1;2 was strongly expressed in berry skin, and its expression correlated strongly with the anthocyanin content of berry skin during berry ripening and under ABA and UVB treatments. VvVHP1;2 was transcriptionally activated directly by VvMYBA1, and VvVHP1;2 overexpression promoted anthocyanin accumulation in berry skins and Arabidopsis leaves; therefore, VvVHP1;2 mediated VvMYBA1-regulated berry pigmentation. On the other hand, RNA-Seq analysis of WT and transgenic berry skins revealed that carbohydrate metabolism, flavonoid metabolism and regulation, and solute carrier family expression were the most clearly altered biological processes. Further experiments elucidated that VvVHP1;2 overexpression up-regulated the expression of genes related to anthocyanin biosynthesis and transport via a hexokinase-mediated glucose signal and thereby promoted anthocyanin accumulation in berry skins and Arabidopsis leaves. Additionally, modifications of sugar status caused by enhanced hexokinase activities likely play a key role in VvVHP1;2-induced sugar signaling.

  10. Top 10 Threats to Computer Systems Include Professors and Students

    Science.gov (United States)

    Young, Jeffrey R.

    2008-01-01

    User awareness is growing in importance when it comes to computer security. Not long ago, keeping college networks safe from cyberattackers mainly involved making sure computers around campus had the latest software patches. New computer worms or viruses would pop up, taking advantage of some digital hole in the Windows operating system or in…

  11. Study of the relationship between the human papillomavirus (VHP) and cervical cancer

    International Nuclear Information System (INIS)

    Bravo, Maria Mercedes

    1999-01-01

    The close relationship between infection with the human papillomavirus (VHP) and cervical cancer is now well established; in this pathology the viral proteins E6 and E7, products of viral oncogenes, also play an important part in the course of the disease. The working hypothesis is that when a patient with cervical cancer is given radiotherapy, tumor lysis releases viral components; the immune system then mounts a cell-mediated response against the E7 protein, activating specific clones. When this immune response is positive, the outcome of treatment is more favorable, and vice versa. The objective was to determine whether the virus could still be detected after radiotherapy and whether antibody titers had increased or diminished. Age, tumor size, and disease stage were analyzed, and the papillomavirus was assayed before and after treatment, quantifying the increase or decrease in viral load; finally, it was examined whether the post-treatment prognosis was related to the patient's survival.

  12. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  13. 78 FR 1247 - Certain Electronic Devices, Including Wireless Communication Devices, Tablet Computers, Media...

    Science.gov (United States)

    2013-01-08

    ... Wireless Communication Devices, Tablet Computers, Media Players, and Televisions, and Components Thereof... devices, including wireless communication devices, tablet computers, media players, and televisions, and... wireless communication devices, tablet computers, media players, and televisions, and components thereof...

  14. The Model of the Software Running on a Computer Equipment Hardware Included in the Grid network

    Directory of Open Access Journals (Sweden)

    T. A. Mityushkina

    2012-12-01

    Full Text Available A new approach to building a cloud computing environment using Grid networks is proposed in this paper. The authors describe the functional capabilities, the algorithm, and a model of the software running on computer hardware included in the Grid network, which together allow a cloud computing environment to be implemented using Grid technologies.

  15. 77 FR 27078 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-05-08

    ... Phones and Tablet Computers, and Components Thereof; Notice of Receipt of Complaint; Solicitation of... entitled Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof... the United States after importation of certain electronic devices, including mobile phones and tablet...

  16. 31 CFR 359.31 - What definitive Series I savings bonds are included in the computation?

    Science.gov (United States)

    2010-07-01

    ... definitive Series I savings bonds are included in the computation? In computing the purchases for each person, we include the following outstanding definitive bonds purchased in that calendar year: (a) All bonds... bearing that person's TIN; and (c) All gift bonds registered in the name of that person but bearing the...

  17. 77 FR 34063 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-06-08

    ... Phones and Tablet Computers, and Components Thereof Institution of Investigation AGENCY: U.S... the United States after importation of certain electronic devices, including mobile phones and tablet... mobile phones and tablet computers, and components thereof that infringe one or more of claims 1-3 and 5...

  18. CERN’s Computing rules updated to include policy for control systems

    CERN Multimedia

    IT Department

    2008-01-01

    The use of CERN’s computing facilities is governed by rules defined in Operational Circular No. 5 and its subsidiary rules of use. These rules are available from the web site http://cern.ch/ComputingRules. Please note that the subsidiary rules for Internet/Network use have been updated to include a requirement that control systems comply with the CNIC (Computing and Network Infrastructure for Control) Security Policy. The security policy for control systems, which was approved earlier this year, can be accessed at https://edms.cern.ch/document/584092. IT Department

  19. 78 FR 63492 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2013-10-24

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-847] Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof; Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is...

  20. A method for the computation of turbulent polymeric liquids including hydrodynamic interactions and chain entanglements

    Energy Technology Data Exchange (ETDEWEB)

    Kivotides, Demosthenes, E-mail: demosthenes.kivotides@strath.ac.uk

    2017-02-12

    An asymptotically exact method for the direct computation of turbulent polymeric liquids is formulated that includes (a) fully resolved, creeping microflow fields due to hydrodynamic interactions between chains, (b) an exact account of (subfilter) residual stresses, (c) polymer Brownian motion, and (d) direct calculation of chain entanglements. Although developed in the context of polymeric fluids, the method is equally applicable to turbulent colloidal dispersions and aerosols. Highlights: an asymptotically exact method for the computation of polymer and colloidal fluids is developed; the method is valid for all flow inertia and all polymer volume fractions; the method models entanglements and hydrodynamic interactions between polymer chains.

  1. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y W [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Zhang, L F [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Huang, J P [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China)

    2007-07-20

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property.

  2. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    International Nuclear Information System (INIS)

    Chen, Y W; Zhang, L F; Huang, J P

    2007-01-01

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property
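
    The baseline these two records extend is the classic Watts-Strogatz construction: a ring lattice with k neighbours per node whose edges are rewired with probability p, giving high clustering and short paths. A minimal pure-Python sketch of that original model (parameter values are illustrative, and this does not include the paper's degree-distribution extension):

```python
import random

def watts_strogatz(n, k, p, seed=0):
    """Classic WS model: ring lattice with k neighbours per node (k even),
    each clockwise lattice edge rewired to a random target with probability p."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for v in range(n):                      # build the ring lattice
        for j in range(1, k // 2 + 1):
            adj[v].add((v + j) % n)
            adj[(v + j) % n].add(v)
    for v in range(n):                      # rewire clockwise edges
        for j in range(1, k // 2 + 1):
            w = (v + j) % n
            if w in adj[v] and rng.random() < p:
                candidates = [u for u in range(n)
                              if u != v and u not in adj[v]]
                if candidates:
                    u = rng.choice(candidates)
                    adj[v].remove(w); adj[w].remove(v)
                    adj[v].add(u); adj[u].add(v)
    return adj

def avg_clustering(adj):
    """Mean local clustering coefficient over all nodes."""
    total = 0.0
    for v, nbrs in adj.items():
        d = len(nbrs)
        if d < 2:
            continue
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        total += 2.0 * links / (d * (d - 1))
    return total / len(adj)

ring = watts_strogatz(200, 6, 0.0)  # pure lattice: clustering = 0.6
sw = watts_strogatz(200, 6, 0.1)    # small world: clustering drops
print(round(avg_clustering(ring), 2), round(avg_clustering(sw), 2))
```

    The paper's extension replaces this fixed-degree lattice with a model degree-distribution function; the quantities being compared against real networks are exactly the clustering coefficient computed here and the characteristic path length.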

  3. PTAC: a computer program for pressure-transient analysis, including the effects of cavitation. [LMFBR]

    Energy Technology Data Exchange (ETDEWEB)

    Kot, C A; Youngdahl, C K

    1978-09-01

    PTAC was developed to predict pressure transients in nuclear-power-plant piping systems in which the possibility of cavitation must be considered. The program performs linear or nonlinear fluid-hammer calculations, using a fixed-grid method-of-characteristics solution procedure. In addition to pipe friction and elasticity, the program can treat a variety of flow components, pipe junctions, and boundary conditions, including arbitrary pressure sources and a sodium/water reaction. Essential features of transient cavitation are modeled by a modified column-separation technique. Comparisons of calculated results with available experimental data, for a simple piping arrangement, show good agreement and provide validation of the computational cavitation model. Calculations for a variety of piping networks, containing either liquid sodium or water, demonstrate the versatility of PTAC and clearly show that neglecting cavitation leads to erroneous predictions of pressure-time histories.
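
    For scale, the classical Joukowsky relation gives the instantaneous fluid-hammer pressure rise that method-of-characteristics codes such as PTAC refine with friction, pipe elasticity, and cavitation effects; cavitation caps the downward pressure excursion near the vapor pressure, which is why neglecting it distorts the computed histories. A minimal sketch with assumed liquid-sodium properties (not values from the report):

```python
def joukowsky_surge(rho, c, dv):
    """Joukowsky estimate of the pressure rise for an instantaneous
    velocity change dv in a rigid pipe (no friction, no cavitation)."""
    return rho * c * dv

# Assumed hot liquid-sodium properties: ~860 kg/m^3, sound speed ~2400 m/s.
dp = joukowsky_surge(860.0, 2400.0, 1.0)  # arresting a 1 m/s flow
print(dp / 1e5)  # → 20.64 (pressure rise in bar)
```

    A fixed-grid method-of-characteristics solution recovers this limiting value for an instantaneous closure; friction, elasticity, and column separation then modify the pressure-time history, which is the regime PTAC targets.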

  4. CTmod—A toolkit for Monte Carlo simulation of projections including scatter in computed tomography

    Czech Academy of Sciences Publication Activity Database

    Malušek, Alexandr; Sandborg, M.; Alm Carlsson, G.

    2008-01-01

    Roč. 90, č. 2 (2008), s. 167-178 ISSN 0169-2607 Institutional research plan: CEZ:AV0Z10480505 Keywords : Monte Carlo * computed tomography * cone beam * scatter Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.220, year: 2008 http://dx.doi.org/10.1016/j.cmpb.2007.12.005

  5. High performance computation of landscape genomic models including local indicators of spatial association.

    Science.gov (United States)

    Stucki, S; Orozco-terWengel, P; Forester, B R; Duruz, S; Colli, L; Masembe, C; Negrini, R; Landguth, E; Jones, M R; Bruford, M W; Taberlet, P; Joost, S

    2017-09-01

    With the increasing availability of both molecular and topo-climatic data, the main challenges facing landscape genomics (that is, the combination of landscape ecology with population genomics) include processing large numbers of models and distinguishing between selection and demographic processes (e.g. population structure). Several methods address the latter, either by estimating a null model of population history or by simultaneously inferring environmental and demographic effects. Here we present samβada, an approach designed to study signatures of local adaptation, with special emphasis on high performance computing of large-scale genetic and environmental data sets. samβada identifies candidate loci using genotype-environment associations while also incorporating multivariate analyses to assess the effect of many environmental predictor variables. This enables the inclusion of explanatory variables representing population structure into the models to lower the occurrences of spurious genotype-environment associations. In addition, samβada calculates local indicators of spatial association for candidate loci to provide information on whether similar genotypes tend to cluster in space, which constitutes a useful indication of the possible kinship between individuals. To test the usefulness of this approach, we carried out a simulation study and analysed a data set from Ugandan cattle to detect signatures of local adaptation with samβada, bayenv, lfmm and an FST outlier method (the FDIST approach in arlequin), and compared their results. samβada, open-source software for Windows, Linux and Mac OS X available at http://lasig.epfl.ch/sambada, outperforms other approaches and better suits whole-genome sequence data processing. © 2016 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.

  6. Including Internet insurance as part of a hospital computer network security plan.

    Science.gov (United States)

    Riccardi, Ken

    2002-01-01

    Cyber attacks on a hospital's computer network is a new crime to be reckoned with. Should your hospital consider internet insurance? The author explains this new phenomenon and presents a risk assessment for determining network vulnerabilities.

  7. Computational and experimental analyses of the wave propagation through a bar structure including liquid-solid interface

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang Jin [UST Graduate School, Daejeon (Korea, Republic of); Rhee, Hui Nam [Division of Mechanical and Aerospace Engineering, Sunchon National University, Sunchon (Korea, Republic of); Yoon, Doo Byung; Park, Jin Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-08-15

    In this research, we study the propagation of longitudinal and transverse waves through a metal rod including a liquid layer, using computational and experimental analyses. The propagation characteristics of longitudinal and transverse waves obtained by the computational and experimental analyses were consistent with wave propagation theory in both cases, that is, for the homogeneous metal rod and for the metal rod including a liquid layer. The fluid-structure interaction modeling technique developed for the computational wave propagation analysis in this research can be applied to more complex structures including solid-liquid interfaces.
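
    The benchmark for such measurements in the homogeneous rod is the thin-rod longitudinal wave speed c = sqrt(E/rho). A minimal sketch with assumed steel properties (the record does not state the rod's material parameters):

```python
import math

def longitudinal_wave_speed(E, rho):
    """Longitudinal (bar) wave speed in a thin elastic rod: c = sqrt(E/rho)."""
    return math.sqrt(E / rho)

# Assumed steel: E ~ 200 GPa, density ~ 7850 kg/m^3.
print(round(longitudinal_wave_speed(200e9, 7850.0)))  # → 5048 (m/s)
```

    Measuring the transit time over a known rod length and comparing against this value is the standard consistency check; departures in the layered case can indicate the influence of the liquid-solid interface.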

  8. 29 CFR 779.253 - What is included in computing the total annual inflow volume.

    Science.gov (United States)

    2010-07-01

    ... FAIR LABOR STANDARDS ACT AS APPLIED TO RETAILERS OF GOODS OR SERVICES Employment to Which the Act May... taxes and other charges which the enterprise must pay for such goods. Generally, all charges will be... computing the total annual inflow volume. The goods which the establishment purchases or receives for resale...

  9. 31 CFR 351.66 - What book-entry Series EE savings bonds are included in the computation?

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false What book-entry Series EE savings... DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES EE Book-Entry Series EE Savings Bonds § 351.66 What book-entry Series EE savings bonds are included in the computation? (a) We include all bonds that...

  10. Areal rainfall estimation using moving cars - computer experiments including hydrological modeling

    Science.gov (United States)

    Rabiei, Ehsan; Haberlandt, Uwe; Sester, Monika; Fitzner, Daniel; Wallner, Markus

    2016-09-01

    The need for high temporal and spatial resolution precipitation data for hydrological analyses has been discussed in several studies. Although rain gauges provide valuable information, a very dense rain gauge network is costly. As a result, several new ideas have emerged to help estimate areal rainfall with higher temporal and spatial resolution. Rabiei et al. (2013) observed that moving cars, called RainCars (RCs), can potentially be a new source of data for measuring rain rate. The optical sensors used in that study are designed for operating the windscreen wipers and showed promising results for rainfall measurement purposes. Their measurement accuracy has been quantified in laboratory experiments. Considering those errors explicitly, the main objective of this study is to investigate the benefit of using RCs for estimating areal rainfall. For that, computer experiments are carried out, in which radar rainfall is considered as the reference and the other sources of data, i.e., RCs and rain gauges, are extracted from the radar data. Comparing the quality of areal rainfall estimation by RCs with that of rain gauges and the reference data helps to quantify the benefit of the RCs. The value of this additional source of data is assessed not only for areal rainfall estimation performance but also for use in hydrological modeling. Considering measurement errors derived from the laboratory experiments, the results show that the RCs provide useful additional information for areal rainfall estimation as well as for hydrological modeling. Moreover, when larger uncertainties were assumed for the RCs, they remained useful up to a certain error level for both areal rainfall estimation and discharge simulation.
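
    One way to see why extra mobile observations help is a simple interpolation experiment: estimate the areal mean from gauges alone, then from gauges plus car readings. The sketch below uses inverse-distance weighting with entirely made-up coordinates and rain rates; it illustrates the idea only and is not the estimation method used in the study:

```python
import math

def idw_rainfall(obs, grid, power=2.0):
    """Inverse-distance-weighted estimate of rain rate at each grid
    point from (x, y, value) observations, then the areal mean."""
    est = []
    for gx, gy in grid:
        num = den = 0.0
        for x, y, r in obs:
            d = math.hypot(gx - x, gy - y)
            if d < 1e-9:            # grid point coincides with a sensor
                num, den = r, 1.0
                break
            w = d ** -power
            num += w * r
            den += w
        est.append(num / den)
    return sum(est) / len(est)

# Entirely hypothetical readings (mm/h): two fixed gauges plus three
# RainCars sampling along a road across the middle of the unit square.
gauges = [(0.0, 0.0, 3.0), (1.0, 1.0, 7.0)]
cars = [(0.1, 0.5, 4.0), (0.4, 0.5, 5.0), (0.7, 0.5, 6.0)]
grid = [(x / 10, y / 10) for x in range(11) for y in range(11)]
print(round(idw_rainfall(gauges, grid), 2),
      round(idw_rainfall(gauges + cars, grid), 2))
```

    With only the two corner gauges, every interior estimate is pulled toward their values; adding the car readings shifts the areal mean toward the along-road observations, which is the kind of benefit the study quantifies under realistic sensor errors.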

  11. Human factors design of nuclear power plant control rooms including computer-based operator aids

    International Nuclear Information System (INIS)

    Bastl, W.; Felkel, L.; Becker, G.; Bohr, E.

    1983-01-01

    The scientific handling of human factors problems in control rooms began around 1970 on the basis of safety considerations. Some recent research work deals with the development of computerized systems like plant balance calculation, safety parameter display, alarm reduction and disturbance analysis. For disturbance analysis purposes it is necessary to homogenize the information presented to the operator according to the actual plant situation in order to supply the operator with the information he most urgently needs at the time. Different approaches for solving this problem are discussed, and an overview is given on what is being done. Other research projects concentrate on the detailed analysis of operators' diagnosis strategies in unexpected situations, in order to obtain a better understanding of their mental processes and the influences upon them when such situations occur. This project involves the use of a simulator and sophisticated recording and analysis methods. Control rooms are currently designed with the aid of mock-ups. They enable operators to contribute their experience to the optimization of the arrangement of displays and controls. Modern control rooms are characterized by increasing use of process computers and CRT (Cathode Ray Tube) displays. A general concept for the integration of the new computerized system and the conventional control panels is needed. The technical changes modify operators' tasks, and future ergonomic work in nuclear plants will need to consider the re-allocation of function between man and machine, the incorporation of task changes in training programmes, and the optimal design of information presentation using CRTs. Aspects of developments in control room design are detailed, typical research results are dealt with, and a brief forecast of the ergonomic contribution to be made in the Federal Republic of Germany is given

  12. Experience in nuclear materials accountancy, including the use of computers, in the UKAEA

    International Nuclear Information System (INIS)

    Anderson, A.R.; Adamson, A.S.; Good, P.T.; Terrey, D.R.

    1976-01-01

    The UKAEA have operated systems of nuclear materials accountancy in research and development establishments handling large quantities of material for over 20 years. In the course of that time changing requirements for nuclear materials control and increasing quantities of materials have required that accountancy systems be modified and altered to improve either the fundamental system or manpower utilization. The same accountancy principles are applied throughout the Authority but procedures at the different establishments vary according to the nature of their specific requirements; there is much in the cumulative experience of the UKAEA which could prove of value to other organizations concerned with nuclear materials accountancy or safeguards. This paper reviews the present accountancy system in the UKAEA and summarizes its advantages. Details are given of specific experience and solutions which have been found to overcome difficulties or to strengthen previous weak points. Areas discussed include the use of measurements, the establishment of measurement points (which is relevant to the designation of MBAs), the importance of regular physical stock-taking, and the benefits stemming from the existence of a separate accountancy section independent of operational management at large establishments. Some experience of a dual system of accountancy and criticality control is reported, and the present status of computerization of nuclear material accounts is summarized. Important aspects of the relationship between management systems of accountancy and safeguards' requirements are discussed briefly. (author)

  13. The utility of including pathology reports in improving the computational identification of patients

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2016-01-01

    Full Text Available Background: Celiac disease (CD) is a common autoimmune disorder. Efficient identification of patients may improve chronic management of the disease. Prior studies have shown that searching International Classification of Diseases-9 (ICD-9) codes alone is inaccurate for identifying patients with CD. In this study, we developed automated classification algorithms leveraging pathology reports and other clinical data in Electronic Health Records (EHRs) to refine the subset population preselected using ICD-9 code 579.0. Materials and Methods: EHRs were searched for the established ICD-9 code (579.0) suggesting CD, based on which an initial identification of cases was obtained. In addition, laboratory results for tissue transglutaminase were extracted. Using natural language processing, we analyzed pathology reports from upper endoscopy. Twelve machine learning classifiers using different combinations of variables related to ICD-9 CD status, laboratory result status, and pathology reports were tested to find the best possible CD classifier. Ten-fold cross-validation was used to assess the results. Results: A total of 1498 patient records were used, including 363 confirmed cases and 1135 false positive cases that served as controls. A logistic model based on both clinical and pathology report features produced the best results: a Kappa of 0.78, an F1 of 0.92, and an area under the curve (AUC) of 0.94, whereas using ICD-9 codes alone generated poor results: a Kappa of 0.28, an F1 of 0.75, and an AUC of 0.63. Conclusion: Our automated classification system presents an efficient and reliable way to improve the performance of CD patient identification.
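
    The Kappa and F1 scores reported above both summarize a 2x2 confusion matrix. A minimal sketch of how they are computed; the counts below are hypothetical (chosen only to total the study's 1498 records), not the study's actual confusion matrix:

```python
def binary_metrics(tp, fp, fn, tn):
    """Cohen's kappa and F1 score from a 2x2 confusion matrix."""
    n = tp + fp + fn + tn
    p_obs = (tp + tn) / n                     # observed agreement
    p_exp = ((tp + fp) * (tp + fn)
             + (fn + tn) * (fp + tn)) / n**2  # agreement expected by chance
    kappa = (p_obs - p_exp) / (1 - p_exp)
    f1 = 2 * tp / (2 * tp + fp + fn)
    return kappa, f1

# Hypothetical counts summing to the study's 1498 records.
kappa, f1 = binary_metrics(tp=330, fp=60, fn=33, tn=1075)
print(round(kappa, 2), round(f1, 2))  # → 0.84 0.88
```

    The two metrics can diverge on an imbalanced cohort like this one, since F1 ignores the true negatives that dominate it while kappa corrects for chance agreement over all four cells; that is partly why the study reports both.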

  14. PTA-1 computer program for treating pressure transients in hydraulic networks including the effect of pipe plasticity

    International Nuclear Information System (INIS)

    Youngdahl, C.K.; Kot, C.A.

    1977-01-01

    Pressure pulses in the intermediate sodium system of a liquid-metal-cooled fast breeder reactor, such as may originate from a sodium/water reaction in a steam generator, are propagated through the complex sodium piping network to system components such as the pump and intermediate heat exchanger. To assess the effects of such pulses on continued reliable operation of these components, and to contribute to system designs that mitigate these effects, Pressure Transient Analysis (PTA) computer codes are being developed for accurately computing the transmission of pressure pulses through a complicated fluid transport system consisting of piping, fittings, junctions, and components. PTA-1 extends the well-accepted and verified fluid-hammer formulation for computing hydraulic transients in elastic or rigid piping systems to include plastic deformation effects. The accuracy of the modeling of pipe plasticity effects on transient propagation has been validated using results from two sets of Stanford Research Institute experiments; validation of PTA-1 against the second set is described briefly. The comparisons of PTA-1 computations with experiments show that (1) elastic-plastic deformation of LMFBR-type piping can have a significant qualitative and quantitative effect on pressure pulse propagation, even in simple systems; (2) classical fluid-hammer theory gives erroneous results when applied to situations where piping deforms plastically; and (3) the computational model incorporated in PTA-1 for predicting plastic deformation and its effect on transient propagation is accurate.
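
For orientation, the classical fluid-hammer baseline that PTA-1 extends is the Joukowsky relation, which estimates the pressure surge from an instantaneous velocity change in an elastic or rigid pipe. A minimal sketch with illustrative property values (not taken from the report):

```python
# Joukowsky pressure surge: dP = rho * a * dv.
# rho: fluid density, a: pressure-wave speed in the pipe, dv: sudden
# velocity change. All numeric values below are assumed for illustration.

def joukowsky_dp(rho, wave_speed, dv):
    """Pressure surge in Pa from an instantaneous velocity change."""
    return rho * wave_speed * dv

rho_na = 850.0   # approximate liquid sodium density, kg/m^3
a = 2000.0       # assumed wave speed in a stiff pipe, m/s
dv = 5.0         # assumed sudden velocity change, m/s
print(joukowsky_dp(rho_na, a, dv) / 1e6)  # surge in MPa -> 8.5
```

Plastic pipe deformation, the effect PTA-1 adds, absorbs energy and lowers the effective wave speed, which is why the elastic formula alone overpredicts the transmitted pulse.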

  15. A computer software system for the generation of global ocean tides including self-gravitation and crustal loading effects

    Science.gov (United States)

    Estes, R. H.

    1977-01-01

    A computer software system is described which computes global numerical solutions of the integro-differential Laplace tidal equations, including dissipation terms and ocean loading and self-gravitation effects, for arbitrary diurnal and semidiurnal tidal constituents. The integration algorithm features a successive approximation scheme for the integro-differential system, with time stepping by forward differences in the time variable and central differences in the spatial variables. Solutions for the M2, S2, N2, K2, K1, O1, and P1 tidal constituents neglecting the effects of ocean loading and self-gravitation, and a converged M2 solution including ocean loading and self-gravitation effects, are presented in the form of cotidal and corange maps.
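
The time-stepping scheme described (forward differences in time, central differences in space) can be illustrated on a much simpler model problem, the 1-D advection equation with periodic boundaries. The grid size, time step, and test profile below are arbitrary choices for the sketch, not parameters of the tidal solver:

```python
# Forward-time, centered-space (FTCS) stepping for u_t + c u_x = 0 on a
# periodic 1-D grid. For a short integration with a small CFL number the
# numerical solution should track the exact translated profile closely.
import math

def ftcs_advect(u, c, dx, dt, steps):
    n = len(u)
    for _ in range(steps):
        u = [u[i] - c * dt / (2 * dx) * (u[(i + 1) % n] - u[i - 1])
             for i in range(n)]
    return u

n, c, dt, steps = 100, 1.0, 5e-4, 20
dx = 1.0 / n
u0 = [math.sin(2 * math.pi * i * dx) for i in range(n)]
u = ftcs_advect(u0, c, dx, dt, steps)
exact = [math.sin(2 * math.pi * (i * dx - c * dt * steps)) for i in range(n)]
err = max(abs(a - b) for a, b in zip(u, exact))
print(err < 1e-3)
```

In the tidal code the successive-approximation outer loop handles the integral (loading and self-gravitation) terms around a spatial/temporal discretization of this same flavor.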

  16. Effects of sugarcane juice clarification with moringa on inorganic compounds in VHP sugar

    Directory of Open Access Journals (Sweden)

    Gustavo H. G. Costa

    2015-02-01

    Full Text Available The objective of this work was to evaluate the effects of clarifying sugarcane juice with moringa (Moringa oleifera Lamarck) leaf and seed extracts as settling aids on the levels of inorganic compounds in the clarified juice and in the VHP (Very High Purity, export-type) sugar produced. The experimental design was a 5 x 2 factorial with four replicates; the first factor comprised the settling aids (moringa leaf extract, moringa seed extract, a synthetic polyelectrolyte, and a control), and the second factor comprised two sugarcane varieties (RB92579 and RB867515). The extracted juice was clarified by simple liming, concentrated to 60 °Brix, and subjected to the boiling process. The contents of phosphorus, potassium, calcium, sodium, magnesium, manganese, and iron, as well as total ash, were quantified in the original juice, the clarified juice, and the sugar produced. The moringa leaf and seed extracts proved effective in treating juice destined for sugar production, removing significant amounts of calcium and iron compared with the synthetic polyelectrolyte. The leaf extract was the best settling aid among those compared.

  17. ICECON: a computer program used to calculate containment back pressure for LOCA analysis (including ice condenser plants)

    International Nuclear Information System (INIS)

    1976-07-01

    The ICECON computer code provides a method for conservatively calculating the long-term back-pressure transient in the containment resulting from a hypothetical Loss-of-Coolant Accident (LOCA) for PWR plants, including ice condenser containment systems. The ICECON computer code was developed from the CONTEMPT/LT-022 code. A brief discussion of the salient features of a typical ice condenser containment is presented. Details of the ice condenser models are explained. The corrections and improvements made to CONTEMPT/LT-022 are included. The organization of the code, including the calculational procedure, is outlined. The user's manual (to be used in conjunction with the CONTEMPT/LT-022 user's manual), a sample problem, a time-step study (solution convergence), and a comparison of ICECON results with the results of the NSSS vendor are presented. In general, containment pressures calculated with the ICECON code agree with those calculated by the NSSS vendor using the same mass and energy release rates to the containment.

  18. Explicitly-correlated ring-coupled-cluster-doubles theory: Including exchange for computations on closed-shell systems

    Energy Technology Data Exchange (ETDEWEB)

    Hehn, Anna-Sophia; Holzer, Christof; Klopper, Wim, E-mail: klopper@kit.edu

    2016-11-10

    Highlights: • Ring-coupled-cluster-doubles approach now implemented with exchange terms. • Ring-coupled-cluster-doubles approach now implemented with F12 functions. • Szabo–Ostlund scheme (SO2) implemented for use in SAPT. • Fast convergence to the limit of a complete basis. • Implementation in the TURBOMOLE program system. - Abstract: Random-phase-approximation (RPA) methods have proven to be powerful tools in electronic-structure theory, being non-empirical, computationally efficient and broadly applicable to a variety of molecular systems including small-gap systems, transition-metal compounds and dispersion-dominated complexes. Applications are however hindered due to the slow basis-set convergence of the electron-correlation energy with the one-electron basis. As a remedy, we present approximate explicitly-correlated RPA approaches based on the ring-coupled-cluster-doubles formulation including exchange contributions. Test calculations demonstrate that the basis-set convergence of correlation energies is drastically accelerated through the explicitly-correlated approach, reaching 99% of the basis-set limit with triple-zeta basis sets. When implemented in close analogy to early work by Szabo and Ostlund [36], the new explicitly-correlated ring-coupled-cluster-doubles approach including exchange has the perspective to become a valuable tool in the framework of symmetry-adapted perturbation theory (SAPT) for the computation of dispersion energies of molecular complexes of weakly interacting closed-shell systems.
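
The slow basis-set convergence that motivates the explicitly-correlated (F12) approach is commonly quantified with a two-point 1/X^3 extrapolation of correlation energies toward the complete-basis-set (CBS) limit. A sketch with hypothetical correlation energies (in hartree), not values from the paper:

```python
# Two-point inverse-cubic extrapolation of correlation energies:
# E_CBS = (X^3 E_X - Y^3 E_Y) / (X^3 - Y^3), with X, Y the cardinal
# numbers of the basis sets. Sample energies are made-up illustrations.

def cbs_extrapolate(e_x, x, e_y, y):
    return (x**3 * e_x - y**3 * e_y) / (x**3 - y**3)

e_tz, e_qz = -0.2800, -0.2900  # assumed triple- and quadruple-zeta energies
e_cbs = cbs_extrapolate(e_qz, 4, e_tz, 3)
print(round(e_cbs, 4))  # extrapolated limit lies below both finite-basis values
```

F12 methods sidestep this extrapolation entirely: by building the interelectronic distance into the wavefunction, a triple-zeta F12 calculation can already recover ~99% of the CBS correlation energy, as the abstract notes.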

  19. Multiscale approach including microfibril scale to assess elastic constants of cortical bone based on neural network computation and homogenization method.

    Science.gov (United States)

    Barkaoui, Abdelwahed; Chamekh, Abdessalem; Merzouki, Tarek; Hambli, Ridha; Mkaddem, Ali

    2014-03-01

    The complexity and heterogeneity of bone tissue require multiscale modeling to understand its mechanical behavior and its remodeling mechanisms. In this paper, a novel multiscale hierarchical approach including the microfibril scale, based on hybrid neural network (NN) computation and homogenization equations, was developed to link nanoscopic and macroscopic scales and estimate the elastic properties of human cortical bone. The multiscale model is divided into three main phases: (i) in step 0, the elastic constants of collagen-water and mineral-water composites are calculated by averaging the upper and lower Hill bounds; (ii) in step 1, the elastic properties of the collagen microfibril are computed using a trained NN simulation, with finite element calculations performed at the nanoscopic level to provide a database for training an in-house NN program; and (iii) in steps 2-10, from fibril to continuum cortical bone tissue, homogenization equations are used to perform the computation at the higher scales. The NN outputs (elastic properties of the microfibril) are used as inputs for the homogenization computation to determine the properties of the mineralized collagen fibril. The mechanical and geometrical properties of the bone constituents (mineral, collagen, and cross-links) as well as the porosity were taken into consideration. This paper aims to predict analytically the effective elastic constants of cortical bone by modeling its elastic response at these different scales, ranging from the nanostructural to the mesostructural level. The output of each lower scale was well integrated with the higher levels and served as input for the next higher scale of modeling. Good agreement was obtained between our predicted results and literature data. Copyright © 2013 John Wiley & Sons, Ltd.
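
Step (i) of the scheme, averaging the upper and lower Hill bounds, corresponds to the classical Voigt-Reuss-Hill estimate for a two-phase composite. A minimal sketch with illustrative phase moduli and volume fractions (not the paper's values):

```python
# Voigt (upper) and Reuss (lower) bounds on the effective modulus of a
# two-phase mixture, and their Hill average. Moduli in GPa; the mineral/
# collagen numbers are rough literature-style values used only as an example.

def voigt(f1, m1, m2):
    """Upper bound: volume-weighted arithmetic mean (iso-strain)."""
    return f1 * m1 + (1 - f1) * m2

def reuss(f1, m1, m2):
    """Lower bound: volume-weighted harmonic mean (iso-stress)."""
    return 1.0 / (f1 / m1 + (1 - f1) / m2)

def hill(f1, m1, m2):
    """Hill estimate: average of the two bounds."""
    return 0.5 * (voigt(f1, m1, m2) + reuss(f1, m1, m2))

# e.g. ~40% stiff mineral (assumed 114 GPa) in soft collagen (assumed 2.5 GPa)
print(round(hill(0.4, 114.0, 2.5), 2))
```

The wide gap between the two bounds for strongly contrasting phases is exactly why the higher steps of the paper replace simple averaging with NN-trained microfibril properties and full homogenization.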

  20. Progress report of Physics Division including Applied Mathematics and Computing Section. 1st October 1970 - 31st March 1971

    International Nuclear Information System (INIS)

    2004-01-01

    The initial MOATA safety assessment was based on data and calculations available before the advent of multigroup diffusion theory codes in two dimensions. That assessment is being revised and extended to gain approval for 100 kW operation. The more detailed representation obtained in the new calculations has resulted in a much better understanding of the physics of this reactor. The properties of the reactor are determined to a large extent by neutron leakage from the rather thin core tanks. In particular, the effect of leakage on the coupling between the core tanks and on reactivity coefficients has been clarified and quantified. In neutron data studies, the theoretical fission product library was revised, checked against available experimental values, and distributed to interested overseas centres. Some further nubar work was done with much better neutron energy resolution, and confirmed our earlier measurements. A promising formulation of the R-matrix theory of nuclear interaction is expected to lead to a simpler multilevel resonance parameter description. With large amounts of digital data being collected, displayed and used by theoreticians and experimentalists, more attention was given to visual interactive computer displays. This interest is generating constructive proposals for use of the dataway now being installed between the Division and the IBM 360/50 computer. The study of gamma rays following the capture of keV neutrons continues to reveal new and interesting features of the physical processes involved. A detailed international compilation of the gamma rays emitted and their intensities is in progress. The work on nickel-68, amongst others, has enabled a partial capture cross section to be generated from the gamma ray parameters obtained by experiment. Much work still remains to be done, possibly at other establishments with more extensive facilities.
The electrical and mechanical components of our new zero power split table machine for reactor physics assemblies

  1. Contribution to the algorithmic and efficient programming of new parallel architectures including accelerators for neutron physics and shielding computations

    International Nuclear Information System (INIS)

    Dubois, J.

    2011-01-01

    In science, simulation is a key process for research and validation. Modern computer technology allows faster numerical experiments, which are cheaper than real models. In the field of neutron simulation, the calculation of eigenvalues is one of the key challenges, and the complexity of these problems is such that considerable computing power may be necessary. The work of this thesis is first the evaluation of new computing hardware, such as graphics cards and massively multi-core chips, and its application to eigenvalue problems in neutron simulation. Then, in order to exploit the massive parallelism of national supercomputers, we also study the use of asynchronous hybrid methods for solving eigenvalue problems at this very high level of parallelism. We test this work on several national supercomputers, such as the Titane hybrid machine of the Computing Centre for Research and Technology (CCRT), the Curie machine of the Very Large Computing Centre (TGCC), currently being installed, and the Hopper machine at the Lawrence Berkeley National Laboratory (LBNL). We also run our experiments on local workstations to illustrate the interest of this research for everyday use with local computing resources. (author) [fr

  2. Quantum wavepacket ab initio molecular dynamics: an approach for computing dynamically averaged vibrational spectra including critical nuclear quantum effects.

    Science.gov (United States)

    Sumner, Isaiah; Iyengar, Srinivasan S

    2007-10-18

    We have introduced a computational methodology to study vibrational spectroscopy in clusters inclusive of critical nuclear quantum effects. This approach is based on the recently developed quantum wavepacket ab initio molecular dynamics method, which combines quantum wavepacket dynamics with ab initio molecular dynamics. The computational efficiency of the dynamical procedure is drastically improved (by several orders of magnitude) through the utilization of wavelet-based techniques combined with the previously introduced time-dependent deterministic sampling procedure to achieve stable, picosecond-length, quantum-classical dynamics of electrons and nuclei in clusters. The dynamical information is employed to construct a novel cumulative flux/velocity correlation function, where the wavepacket flux from the quantized particle is combined with classical nuclear velocities to obtain the vibrational density of states. The approach is demonstrated by computing the vibrational density of states of [Cl-H-Cl]-, inclusive of critical quantum nuclear effects, and our results are in good agreement with experiment. A general hierarchical procedure is also provided, based on electronic structure harmonic frequencies, classical ab initio molecular dynamics, computation of nuclear quantum-mechanical eigenstates, and quantum wavepacket ab initio dynamics, for understanding vibrational spectroscopy in hydrogen-bonded clusters that display large degrees of anharmonicity.
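
As a generic illustration of how a vibrational density of states emerges from time-domain trajectory data (this sketch uses the standard velocity-autocorrelation route, not the paper's cumulative flux/velocity correlation function), a single-mode cosine "trajectory" should produce a spectral peak in its own frequency bin:

```python
# Vibrational density of states as the DFT of a velocity autocorrelation
# function, for a synthetic one-mode trajectory. All parameters are
# arbitrary choices for the illustration.
import cmath
import math

N = 128
k_true = 10                                   # place the mode in DFT bin 10
v = [math.cos(2 * math.pi * k_true * n / N) for n in range(N)]

# velocity autocorrelation C(tau) = <v(t) v(t+tau)>, periodic trajectory
C = [sum(v[t] * v[(t + tau) % N] for t in range(N)) / N for tau in range(N)]

# density of states ~ |DFT of C|; scan only the non-redundant half-spectrum
dos = [abs(sum(C[tau] * cmath.exp(-2j * math.pi * k * tau / N)
               for tau in range(N)))
       for k in range(N // 2)]
print(dos.index(max(dos)))  # peak recovered at bin 10
```

The paper's construction replaces the classical velocity in one slot of the correlation function with the quantum wavepacket flux, so the quantized proton's anharmonic dynamics imprint directly on the spectrum.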

  3. Progress report of Physics Division including Applied Mathematics and Computing Section. 1st April 1970 - 30th September 1970

    International Nuclear Information System (INIS)

    2004-01-01

    Several of the senior staff of the Division have assisted in the assessment of the tenders for the proposed Jervis Bay power station. This has involved studies on light water moderated reactor systems, where our experience has been limited. Several of the questions raised by the tenders are considered important, and effort on these topics will continue when the assessment is complete. Major effort, other than for the Jervis Bay Project, has been devoted to the improvement of facilities and the construction of the critical facility. Studies relevant to an improved understanding of MOATA have continued to support the proposed power uprating to 100 kW. The increasing number of shielding (neutron and gamma) problems referred to the Division has resulted in the procurement of several specialised codes and data libraries. These are now operational on our IBM 360 computer, and several problems are being investigated

  4. Evaluation and study of advanced optical contamination, deposition, measurement, and removal techniques. [including computer programs and ultraviolet reflection analysis

    Science.gov (United States)

    Linford, R. M. F.; Allen, T. H.; Dillow, C. F.

    1975-01-01

    A program is described to design, fabricate and install an experimental work chamber assembly (WCA) to provide a wide range of experimental capability. The WCA incorporates several techniques for studying the kinetics of contaminant films and their effect on optical surfaces. It incorporates the capability for depositing both optical and contaminant films on temperature-controlled samples, and for in-situ measurements of the vacuum ultraviolet reflectance. Ellipsometer optics are mounted on the chamber for film thickness determinations, and other features include access ports for radiation sources and instrumentation. Several supporting studies were conducted to define specific chamber requirements, to determine the sensitivity of the measurement techniques to be incorporated in the chamber, and to establish procedures for handling samples prior to their installation in the chamber. A bibliography and literature survey of contamination-related articles is included.

  5. Computation of transverse muon-spin relaxation functions including trapping-detrapping reactions, with application to electron-irradiated tantalum

    International Nuclear Information System (INIS)

    Doering, K.P.; Aurenz, T.; Herlach, D.; Schaefer, H.E.; Arnold, K.P.; Jacobs, W.; Orth, H.; Haas, N.; Seeger, A.; Max-Planck-Institut fuer Metallforschung, Stuttgart

    1986-01-01

    A new technique for the economical evaluation of transverse muon spin relaxation functions in situations involving μ+ trapping at and detrapping from crystal defects is applied to electron-irradiated Ta exhibiting relaxation maxima at about 35 K, 100 K, and 250 K. The long-range μ+ diffusion is shown to be limited by traps over the entire temperature range investigated. The (static) relaxation rates for several possible configurations of trapped muons are discussed, including the effect of the simultaneous presence of a proton in a vacancy. (orig.)

  6. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  7. Use of computational fluid dynamics codes for safety analysis of nuclear reactor systems, including containment. Summary report of a technical meeting

    International Nuclear Information System (INIS)

    2003-11-01

    Safety analysis is an important tool for justifying the safety of nuclear power plants. Typically, this type of analysis is performed by means of system computer codes with one dimensional approximation for modelling real plant systems. However, in the nuclear area there are issues for which traditional treatment using one dimensional system codes is considered inadequate for modelling local flow and heat transfer phenomena. There is therefore increasing interest in the application of three dimensional computational fluid dynamics (CFD) codes as a supplement to or in combination with system codes. There are a number of both commercial (general purpose) CFD codes as well as special codes for nuclear safety applications available. With further progress in safety analysis techniques, the increasing use of CFD codes for nuclear applications is expected. At present, the main objective with respect to CFD codes is generally to improve confidence in the available analysis tools and to achieve a more reliable approach to safety relevant issues. An exchange of views and experience can facilitate and speed up progress in the implementation of this objective. Both the International Atomic Energy Agency (IAEA) and the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA) believed that it would be advantageous to provide a forum for such an exchange. Therefore, within the framework of the Working Group on the Analysis and Management of Accidents of the NEA's Committee on the Safety of Nuclear Installations, the IAEA and the NEA agreed to jointly organize the Technical Meeting on the Use of Computational Fluid Dynamics Codes for Safety Analysis of Reactor Systems, including Containment. The meeting was held in Pisa, Italy, from 11 to 14 November 2002. The publication constitutes the report of the Technical Meeting. It includes short summaries of the presentations that were made and of the discussions as well as conclusions and

  8. An Accurate and Dynamic Computer Graphics Muscle Model

    Science.gov (United States)

    Levine, David Asher

    1997-01-01

    A computer based musculo-skeletal model was developed at the University in the departments of Mechanical and Biomedical Engineering. This model accurately represents human shoulder kinematics. The result of this model is the graphical display of bones moving through an appropriate range of motion based on inputs of EMGs and external forces. The need existed to incorporate a geometric muscle model in the larger musculo-skeletal model. Previous muscle models did not accurately represent muscle geometries, nor did they account for the kinematics of tendons. This thesis covers the creation of a new muscle model for use in the above musculo-skeletal model. This muscle model was based on anatomical data from the Visible Human Project (VHP) cadaver study. Two-dimensional digital images from the VHP were analyzed and reconstructed to recreate the three-dimensional muscle geometries. The recreated geometries were smoothed, reduced, and sliced to form data files defining the surfaces of each muscle. The muscle modeling function opened these files during run-time and recreated the muscle surface. The modeling function applied constant volume limitations to the muscle and constant geometry limitations to the tendons.

  9. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  10. 3-dimensional magnetotelluric inversion including topography using deformed hexahedral edge finite elements and direct solvers parallelized on symmetric multiprocessor computers - Part II: direct data-space inverse solution

    Science.gov (United States)

    Kordy, M.; Wannamaker, P.; Maris, V.; Cherkaev, E.; Hill, G.

    2016-01-01

    Following the creation described in Part I of a deformable edge finite-element simulator for 3-D magnetotelluric (MT) responses using direct solvers, in Part II we develop an algorithm named HexMT for 3-D regularized inversion of MT data including topography. Direct solvers parallelized on large-RAM, symmetric multiprocessor (SMP) workstations are used also for the Gauss-Newton model update. By exploiting the data-space approach, the computational cost of the model update becomes much less in both time and computer memory than the cost of the forward simulation. In order to regularize using the second norm of the gradient, we factor the matrix related to the regularization term and apply its inverse to the Jacobian, which is done using the MKL PARDISO library. For dense matrix multiplication and factorization related to the model update, we use the PLASMA library which shows very good scalability across processor cores. A synthetic test inversion using a simple hill model shows that including topography can be important; in this case depression of the electric field by the hill can cause false conductors at depth or mask the presence of resistive structure. With a simple model of two buried bricks, a uniform spatial weighting for the norm of model smoothing recovered more accurate locations for the tomographic images compared to weightings which were a function of parameter Jacobians. We implement joint inversion for static distortion matrices tested using the Dublin secret model 2, for which we are able to reduce nRMS to ~1.1 while avoiding oscillatory convergence. Finally we test the code on field data by inverting full impedance and tipper MT responses collected around Mount St Helens in the Cascade volcanic chain. Among several prominent structures, the north-south trending, eruption-controlling shear zone is clearly imaged in the inversion.
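
The data-space economy mentioned above can be sketched in a few lines: with n_d data and n_m model parameters, a damped Gauss-Newton update can be obtained by solving an n_d x n_d system instead of an n_m x n_m one, which is much cheaper when n_m >> n_d. The tiny Jacobian G, residual r, and damping value below are hypothetical:

```python
# Data-space form of a damped Gauss-Newton step:
#   solve (G G^T + lam*I) x = r   (n_d x n_d system),  then  dm = G^T x,
# instead of solving (G^T G + lam*I) dm = G^T r  (n_m x n_m system).
# G, r, and lam below are made-up toy values for illustration.

def data_space_step(G, r, lam):
    n_d = len(G)
    n_m = len(G[0])
    # A = G G^T + lam*I  (n_d x n_d)
    A = [[sum(G[i][k] * G[j][k] for k in range(n_m))
          + (lam if i == j else 0.0)
          for j in range(n_d)] for i in range(n_d)]
    # 2x2 solve by Cramer's rule -- enough for this sketch
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    x = [(r[0] * A[1][1] - r[1] * A[0][1]) / det,
         (A[0][0] * r[1] - A[1][0] * r[0]) / det]
    # dm = G^T x  (n_m-vector)
    return [sum(G[i][k] * x[i] for i in range(n_d)) for k in range(n_m)]

G = [[1.0, 0.0, 2.0],
     [0.0, 1.0, 1.0]]   # 2 data points, 3 model parameters
r = [1.0, 2.0]           # data residuals
dm = data_space_step(G, r, lam=0.1)
print(len(dm))  # model update has n_m = 3 components
```

In a 3-D MT inversion n_m (cells times conductivity) dwarfs n_d (stations times frequencies times response components), so working in data space shrinks both the dense factorization and the memory footprint, as the abstract states.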

  11. Olfactory neuroblastoma: the long-term outcome and late toxicity of multimodal therapy including radiotherapy based on treatment planning using computed tomography

    International Nuclear Information System (INIS)

    Mori, Takashi; Onimaru, Rikiya; Onodera, Shunsuke; Tsuchiya, Kazuhiko; Yasuda, Koichi; Hatakeyama, Hiromitsu; Kobayashi, Hiroyuki; Terasaka, Shunsuke; Homma, Akihiro; Shirato, Hiroki

    2015-01-01

    Olfactory neuroblastoma (ONB) is a rare tumor originating from the olfactory epithelium. Here we retrospectively analyzed the long-term treatment outcomes and toxicity of radiotherapy for ONB patients for whom computed tomography (CT)-based three-dimensional treatment planning was conducted, to reappraise the role of radiotherapy in the light of recent advances in technology and chemotherapy. Seventeen patients with ONB treated between July 1992 and June 2013 were included. Three patients were Kadish stage B and 14 were stage C. All patients were treated with radiotherapy with or without surgery or chemotherapy. The radiation dose ranged from 50 Gy to 66 Gy, except for one patient who received 40 Gy preoperatively. The median follow-up time was 95 months (range 8–173 months). The 5-year overall survival (OS) and relapse-free survival (RFS) rates were estimated at 88% and 74%, respectively. Five patients with stage C disease had recurrence, with a median time to recurrence of 59 months (range 7–115 months). Late adverse events equal to or above Grade 2 in CTCAE v4.03 were observed in three patients. Multimodal therapy including radiotherapy with precise treatment planning based on CT simulation achieved an excellent local control rate with acceptable toxicity and reasonable overall survival for patients with ONB

  12. The WECHSL-Mod2 code: A computer program for the interaction of a core melt with concrete including the long term behavior

    International Nuclear Information System (INIS)

    Reimann, M.; Stiefel, S.

    1989-06-01

    The WECHSL-Mod2 code is a mechanistic computer code developed for the analysis of the thermal and chemical interaction of initially molten LWR reactor materials with concrete in a two-dimensional, axisymmetrical concrete cavity. The code performs calculations from the time of initial contact of a hot molten pool over start of solidification processes until long term basemat erosion over several days with the possibility of basemat penetration. The code assumes that the metallic phases of the melt pool form a layer at the bottom overlayed by the oxide melt atop. Heat generation in the melt is by decay heat and chemical reactions from metal oxidation. Energy is lost to the melting concrete and to the upper containment by radiation or evaporation of sumpwater possibly flooding the surface of the melt. Thermodynamic and transport properties as well as criteria for heat transfer and solidification processes are internally calculated for each time step. Heat transfer is modelled taking into account the high gas flux from the decomposing concrete and the heat conduction in the crusts possibly forming in the long term at the melt/concrete interface. The WECHSL code in its present version was validated by the BETA experiments. The test samples include a typical BETA post test calculation and a WECHSL application to a reactor accident. (orig.) [de

  13. The WECHSL-Mod3 code: A computer program for the interaction of a core melt with concrete including the long term behavior. Model description and user's manual

    International Nuclear Information System (INIS)

    Foit, J.J.; Adroguer, B.; Cenerino, G.; Stiefel, S.

    1995-02-01

    The WECHSL-Mod3 code is a mechanistic computer code developed for the analysis of the thermal and chemical interaction of initially molten reactor materials with concrete in a two-dimensional as well as in a one-dimensional, axisymmetrical concrete cavity. The code performs calculations from the time of initial contact of a hot molten pool over start of solidification processes until long term basemat erosion over several days with the possibility of basemat penetration. It is assumed that an underlying metallic layer exists covered by an oxidic layer or that only one oxidic layer is present which can contain a homogeneously dispersed metallic phase. Heat generation in the melt is by decay heat and chemical reactions from metal oxidation. Energy is lost to the melting concrete and to the upper containment by radiation or evaporation of sumpwater possibly flooding the surface of the melt. Thermodynamic and transport properties as well as criteria for heat transfer and solidification processes are internally calculated for each time step. Heat transfer is modelled taking into account the high gas flux from the decomposing concrete and the heat conduction in the crusts possibly forming in the long term at the melt/concrete interface. The CALTHER code (developed at CEA, France) which models the radiative heat transfer from the upper surface of the corium melt to the surrounding cavity is implemented in the present WECHSL version. The WECHSL code in its present version was validated by the BETA, ACE and SURC experiments. The test samples include a BETA and the SURC2 post test calculations and a WECHSL application to a reactor accident. (orig.) [de

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a fourfold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  15. Three-dimensional magnetotelluric inversion including topography using deformed hexahedral edge finite elements, direct solvers and data space Gauss-Newton, parallelized on SMP computers

    Science.gov (United States)

    Kordy, M. A.; Wannamaker, P. E.; Maris, V.; Cherkaev, E.; Hill, G. J.

    2014-12-01

    We have developed an algorithm for 3D simulation and inversion of magnetotelluric (MT) responses using deformable hexahedral finite elements that permits incorporation of topography. Direct solvers parallelized on symmetric multiprocessor (SMP), single-chassis workstations with large RAM are used for the forward solution, parameter Jacobians, and model update. The forward simulator, Jacobian calculations, as well as synthetic and real data inversions are presented. We use first-order edge elements to represent the secondary electric field (E), yielding accuracy O(h) for E and its curl (magnetic field). For very low frequencies or small material admittivity, the E-field requires divergence correction. Using Hodge decomposition, the correction may be applied after the forward solution is calculated; this allows accurate E-field solutions in dielectric air. The system matrix factorization is computed using the MUMPS library, which shows moderately good scalability through 12 processor cores but limited gains beyond that. The factored matrix is used to calculate the forward response as well as the Jacobians of field and MT responses using the reciprocity theorem. Comparison with other codes demonstrates the accuracy of our forward calculations. We consider a popular conductive/resistive double-brick structure and several topographic models. In particular, the ability of finite elements to represent smooth topographic slopes permits accurate simulation of the refraction of electromagnetic waves normal to the slopes at high frequencies. Run-time tests indicate that for meshes as large as 150x150x60 elements, the MT forward response and Jacobians can be calculated in ~2.5 hours per frequency. For inversion, we implemented a data-space Gauss-Newton method, which offers a reduction in memory requirements and a significant speedup of the parameter step versus the model-space approach. For dense matrix operations we use the tiling approach of the PLASMA library, which shows very good scalability. In synthetic
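The memory advantage of a data-space Gauss-Newton step comes from a standard matrix identity: the same parameter update can be obtained by inverting a small N x N data-space system instead of a large M x M model-space system. A minimal NumPy sketch with toy sizes, identity covariances and a random Jacobian (none of it taken from the paper) illustrates the equivalence:

```python
# Hedged sketch: model-space vs data-space Gauss-Newton step.
# Identity:  (J^T J + lam*I)^-1 J^T  ==  J^T (J J^T + lam*I)^-1,
# so with N data << M parameters we solve an N x N system, not M x M.
import numpy as np

rng = np.random.default_rng(0)
N, M, lam = 8, 200, 0.1          # few data, many model parameters (toy sizes)
J = rng.standard_normal((N, M))  # Jacobian of responses w.r.t. parameters
r = rng.standard_normal(N)       # data residual

# Model-space step: solve an M x M system
dm_model = np.linalg.solve(J.T @ J + lam * np.eye(M), J.T @ r)

# Data-space step: solve an N x N system, then map back with J^T
dm_data = J.T @ np.linalg.solve(J @ J.T + lam * np.eye(N), r)

print(np.allclose(dm_model, dm_data))  # identical steps, much smaller solve
```

The paper's formulation additionally carries data and model covariances, which enter the same identity as weights; the toy version above drops them for clarity.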

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October, a model to account for service work at Tier-2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran smoothly. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production, and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  19. CASKS (Computer Analysis of Storage casKS): A microcomputer based analysis system for storage cask design review. User's manual to Version 1b (including program reference)

    International Nuclear Information System (INIS)

    Chen, T.F.; Gerhard, M.A.; Trummer, D.J.; Johnson, G.L.; Mok, G.C.

    1995-02-01

    CASKS (Computer Analysis of Storage casKS) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent-fuel storage casks. The bulk of the complete program and this user's manual are based upon the SCANS (Shipping Cask ANalysis System) program previously developed at LLNL. A number of enhancements and improvements were added to the original SCANS program to meet requirements unique to storage casks. CASKS is an easy-to-use system that calculates the global response of storage casks to impact loads, pressure loads and thermal conditions, providing reviewers with a tool for an independent check on analyses submitted by licensees. CASKS runs on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data are entered through fill-in-the-blank input screens that contain descriptive data requests.

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not least by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  3. Short- and medium-term efficacy of a Web-based computer-tailored nutrition education intervention for adults including cognitive and environmental feedback: randomized controlled trial.

    Science.gov (United States)

    Springvloet, Linda; Lechner, Lilian; de Vries, Hein; Candel, Math J J M; Oenema, Anke

    2015-01-19

    Web-based, computer-tailored nutrition education interventions can be effective in modifying self-reported dietary behaviors. Traditional computer-tailored programs primarily targeted individual cognitions (knowledge, awareness, attitude, self-efficacy). Tailoring on additional variables such as self-regulation processes and environmental-level factors (the home food environment arrangement and the perception of availability and prices of healthy food products in supermarkets) may improve the efficacy and effect sizes (ES) of Web-based computer-tailored nutrition education interventions. This study evaluated the short- and medium-term efficacy, and educational differences in efficacy, of a cognitive and environmental feedback version of a Web-based computer-tailored nutrition education intervention on self-reported fruit, vegetable, high-energy snack, and saturated fat intake, compared to generic nutrition information, in the total sample and among participants who did not comply with dietary guidelines (the risk groups). A randomized controlled trial was conducted with a basic (tailored intervention targeting individual cognition and self-regulation processes; n=456), plus (basic intervention additionally targeting environmental-level factors; n=459), and control (generic nutrition information; n=434) group. Participants were recruited from the general population and randomly assigned to a study group. Self-reported fruit, vegetable, high-energy snack, and saturated fat intake were assessed at baseline and at 1 month (T1) and 4 months (T2) post-intervention using online questionnaires. Linear mixed model analyses examined group differences in change over time. Educational differences were examined with group×time×education interaction terms. In the total sample, the basic (T1: ES=-0.30; T2: ES=-0.18) and plus intervention groups (T1: ES=-0.29; T2: ES=-0.27) had larger decreases in high-energy snack intake than the control group. The basic version resulted in a larger decrease in

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier-0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier-0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape-to-Buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successes prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  10. Planned development and evaluation protocol of two versions of a web-based computer-tailored nutrition education intervention aimed at adults, including cognitive and environmental feedback.

    Science.gov (United States)

    Springvloet, Linda; Lechner, Lilian; Oenema, Anke

    2014-01-17

    Despite decades of nutrition education, the prevalence of unhealthy dietary patterns is still high and inequalities in intake between high and low socioeconomic groups still exist. Therefore, it is important to innovate and improve existing nutrition education interventions. This paper describes the development, design and evaluation protocol of a web-based computer-tailored nutrition education intervention for adults targeting fruit, vegetable, high-energy snack and fat intake. This intervention innovates on existing computer-tailored interventions by targeting not only motivational factors, but also volitional and self-regulation processes and environmental-level factors. The intervention development was guided by the Intervention Mapping protocol, ensuring a theory-informed and evidence-based intervention. Two versions of the intervention were developed: a basic version targeting knowledge, awareness, attitude, self-efficacy and volitional and self-regulation processes, and a plus version additionally addressing the home environment arrangement and the availability and price of healthy food products in supermarkets. Both versions consist of four modules, one for each dietary behavior: fruit, vegetables, high-energy snacks and fat. Based on the self-regulation phases, each module is divided into three sessions. In the first session, feedback on dietary behavior is provided to increase awareness, feedback on attitude and self-efficacy is provided, and goals and action plans are stated. In the second session, goal achievement is evaluated, reasons for failure are explored, coping plans are stated and goals can be adapted. In the third session, participants can again evaluate their behavioral change and tips for maintenance are provided. Both versions will be evaluated in a three-group randomized controlled trial with measurements at baseline, 1 month, 4 months and 9 months post-intervention, using online questionnaires. Both versions will be compared with a generic

  11. TURTLE with MAD input (Trace Unlimited Rays Through Lumped Elements) -- A computer program for simulating charged particle beam transport systems and DECAY TURTLE including decay calculations

    Energy Technology Data Exchange (ETDEWEB)

    Carey, D.C.

    1999-12-09

    TURTLE is a computer program useful for determining many characteristics of a particle beam once an initial design has been achieved. Charged particle beams are usually designed by adjusting various beam line parameters to obtain desired values of certain elements of a transfer or beam matrix. Such beam line parameters may describe certain magnetic fields and their gradients, lengths and shapes of magnets, spacings between magnetic elements, or the initial beam accepted into the system. For such purposes one typically employs a matrix multiplication and fitting program such as TRANSPORT. TURTLE is designed to be used after TRANSPORT. For the convenience of the user, the input formats of the two programs have been made compatible. The use of TURTLE should be restricted to beams with small phase space. The lumped element approximation, described below, precludes the inclusion of the effect of conventional local geometric aberrations (due to large phase space) or of fourth and higher order. A reading of the discussion below will indicate clearly the exact uses and limitations of the approach taken in TURTLE.
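The matrix picture behind this kind of beam transport can be sketched in a few lines: a ray's transverse coordinates (x, x') are pushed through the beamline by multiplying first-order 2x2 transfer matrices. The drift and thin-quadrupole matrices below are textbook first-order optics; the toy lattice and ray coordinates are invented for illustration and are not TURTLE's actual input format.

```python
# Hedged sketch of lumped-element ray tracing with first-order transfer
# matrices.  TURTLE itself traces many randomly drawn rays through far more
# element types and to higher order; this toy shows only the core idea.
import numpy as np

def drift(L):
    """Field-free drift of length L [m]."""
    return np.array([[1.0, L],
                     [0.0, 1.0]])

def thin_quad(f):
    """Thin-lens quadrupole of focal length f [m] (f > 0 focusing in x)."""
    return np.array([[1.0, 0.0],
                     [-1.0 / f, 1.0]])

# Assumed toy lattice: drift - focusing quad - drift
lattice = [drift(2.0), thin_quad(1.5), drift(2.0)]

# Overall first-order transfer matrix; the first element in the list acts
# first, so it ends up rightmost in the matrix product.
R = np.eye(2)
for element in lattice:
    R = element @ R

# Trace a small bundle of rays through the whole line at once
rays = np.array([[0.001, 0.0],    # x = 1 mm, x' = 0
                 [0.0, 0.0005],   # on-axis, x' = 0.5 mrad
                 [-0.002, 0.001]]).T
print(R @ rays)
```

A fitting program like TRANSPORT adjusts the element parameters so that selected entries of R take desired values; a tracking program then sends an ensemble of rays through the fixed lattice to study the resulting beam distribution.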

  12. Spelling is just a click away – a user-centered brain-computer interface including auto-calibration and predictive text entry

    Directory of Open Access Journals (Sweden)

    Tobias eKaufmann

    2012-05-01

    Full Text Available Brain-Computer Interfaces (BCI) based on event-related potentials (ERP) allow for selection of characters from a visually presented character-matrix and thus provide a communication channel for users with neurodegenerative disease. Although they have been a topic of research for more than 20 years and have repeatedly been proven to be a reliable communication method, BCIs are almost exclusively used in experimental settings, handled by qualified experts. This study investigates whether ERP-BCIs can be handled independently by laymen without expert interference, which is inevitable for establishing BCIs in end-users’ daily life situations. Furthermore, we compared the classic character-by-character text entry against a predictive text entry (PTE) that directly incorporates predictive text into the character matrix. N=19 BCI novices handled a user-centred ERP-BCI application on their own without expert interference. The software individually adjusted classifier weights and control parameters in the background, invisible to the user (auto-calibration). All participants were able to operate the software on their own and to twice correctly spell a sentence with the auto-calibrated classifier (once with PTE, once without). Our PTE increased spelling speed and, importantly, did not reduce accuracy. In sum, this study demonstrates the feasibility of auto-calibrated ERP-BCI use, independently by laymen, and the strong benefit of integrating predictive text directly into the character matrix.

  13. Spelling is Just a Click Away - A User-Centered Brain-Computer Interface Including Auto-Calibration and Predictive Text Entry.

    Science.gov (United States)

    Kaufmann, Tobias; Völker, Stefan; Gunesch, Laura; Kübler, Andrea

    2012-01-01

    Brain-computer interfaces (BCI) based on event-related potentials (ERP) allow for selection of characters from a visually presented character-matrix and thus provide a communication channel for users with neurodegenerative disease. Although they have been a topic of research for more than 20 years and have repeatedly been proven to be a reliable communication method, BCIs are almost exclusively used in experimental settings, handled by qualified experts. This study investigates whether ERP-BCIs can be handled independently by laymen without expert support, which is inevitable for establishing BCIs in end-users' daily life situations. Furthermore, we compared the classic character-by-character text entry against a predictive text entry (PTE) that directly incorporates predictive text into the character-matrix. N = 19 BCI novices handled a user-centered ERP-BCI application on their own without expert support. The software individually adjusted classifier weights and control parameters in the background, invisible to the user (auto-calibration). All participants were able to operate the software on their own and to twice correctly spell a sentence with the auto-calibrated classifier (once with PTE, once without). Our PTE increased spelling speed and, importantly, did not reduce accuracy. In sum, this study demonstrates the feasibility of auto-calibrated ERP-BCI use, independently by laymen, and the strong benefit of integrating predictive text directly into the character-matrix.
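A minimal sketch of the predictive-text idea described above: a few cells of the speller matrix are replaced by whole-word completions of the text typed so far, so one correct selection can finish an entire word. The toy dictionary and matrix layout below are invented for illustration; the study's actual dictionary, layout and ERP classifier are not reproduced here.

```python
# Hedged sketch: filling part of an ERP speller matrix with predictive
# word completions.  WORDS and the 6x6 layout are assumptions.

WORDS = ["HELLO", "HELP", "HELMET", "WORLD", "WORK"]  # assumed toy dictionary

def build_matrix(typed: str, rows: int = 6, cols: int = 6) -> list:
    """Return a rows x cols speller matrix: ordinary characters in the
    first rows, then up to `cols` completions of the current partial word
    in the last row (blank-padded when there are fewer predictions)."""
    cells = [chr(c) for c in range(ord("A"), ord("Z") + 1)] + list("_1234")
    prefix = typed.split("_")[-1]  # word currently being spelled ('_' = space)
    predictions = [w for w in WORDS if prefix and w.startswith(prefix)]
    last_row = (predictions + [""] * cols)[:cols]
    grid = [cells[r * cols:(r + 1) * cols] for r in range(rows - 1)]
    grid.append(last_row)
    return grid

matrix = build_matrix("HEL")
print(matrix[-1])  # ['HELLO', 'HELP', 'HELMET', '', '', '']
```

In the actual BCI, each row and column of this matrix flashes in random order and the classifier picks the cell whose flashes evoke the strongest ERP response, whether that cell holds a single character or a predicted word.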

  14. TURTLE with MAD input (Trace Unlimited Rays Through Lumped Elements) -- A computer program for simulating charged particle beam transport systems and DECAY TURTLE including decay calculations

    International Nuclear Information System (INIS)

    Carey, D.C.

    1999-01-01

    TURTLE is a computer program useful for determining many characteristics of a particle beam once an initial design has been achieved. Charged particle beams are usually designed by adjusting various beam line parameters to obtain desired values of certain elements of a transfer or beam matrix. Such beam line parameters may describe certain magnetic fields and their gradients, lengths and shapes of magnets, spacings between magnetic elements, or the initial beam accepted into the system. For such purposes one typically employs a matrix multiplication and fitting program such as TRANSPORT. TURTLE is designed to be used after TRANSPORT. For the convenience of the user, the input formats of the two programs have been made compatible. The use of TURTLE should be restricted to beams with small phase space. The lumped element approximation, described below, precludes the inclusion of the effect of conventional local geometric aberrations (due to large phase space) or of fourth and higher order. A reading of the discussion below will indicate clearly the exact uses and limitations of the approach taken in TURTLE.

  15. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  16. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing for a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between the Tier-0 and Tier-1s. In addition, the capa...

  18. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier-1 and Tier-2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug-tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts, monitoring the services and infrastructure as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  19. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data were then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  2. NAIAD - a computer program for calculation of the steady state and transient behaviour (including LOCA) of compressible two-phase coolant in networks

    International Nuclear Information System (INIS)

    Trimble, G.D.; Turner, W.J.

    1976-04-01

    The three one-dimensional conservation equations of mass, momentum and energy are solved by a stable finite difference scheme which allows the time step to be varied in response to accuracy requirements. Consideration of numerical stability is not necessary. Slip between the phases is allowed and descriptions of complex hydraulic components can be added into specially provided user routines. Intrinsic choking using any of the nine slip models is possible. A pipe or fuel model and detailed surface heat transfer are included. (author)
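
    The variable-time-step scheme the abstract describes can be illustrated with a minimal, hedged sketch (not the NAIAD code): a first-order upwind update of a single 1-D conservation law whose step is halved when the per-step change exceeds an accuracy tolerance and doubled when the change is comfortably small. The equation, grid, tolerance, and CFL cap below are illustrative assumptions.

```python
# Hedged sketch, NOT NAIAD: time step varied in response to an accuracy
# requirement, applied to d(rho)/dt + u * d(rho)/dx = 0 with u > 0.

def step_density(rho, u, dx, dt):
    """One first-order upwind update of the 1-D advection equation."""
    new = rho[:]
    for i in range(1, len(rho)):
        new[i] = rho[i] - u * dt / dx * (rho[i] - rho[i - 1])
    return new

def advance(rho, u, dx, dt, t_end, tol=0.05):
    """Advance to t_end, halving dt on large changes, doubling on small ones."""
    dt_max = 0.9 * dx / u              # explicit upwind (CFL) stability limit
    t = 0.0
    while t < t_end:
        dt = min(dt, dt_max, t_end - t)
        trial = step_density(rho, u, dx, dt)
        change = max(abs(a - b) for a, b in zip(trial, rho))
        if change > tol and dt > 1e-6:  # step too aggressive: retry smaller
            dt *= 0.5
            continue
        rho, t = trial, t + dt          # accept the step
        if change < 0.25 * tol:         # very smooth: relax the step
            dt *= 2.0
    return rho
```

    Because dt is capped at the CFL limit, the scheme stays monotone even after repeated step doublings.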

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also installed at CERN, adding to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  4. Use of computer aids including expert systems to enhance diagnosis of NPP safety status and operator response. VDU displays in accidents - Interact

    International Nuclear Information System (INIS)

    Humble, P.; Welbourne, D.

    1998-01-01

    This report describes NNC's development of Interact, a demonstration concept for Visual Display Unit (VDU) displays that integrates on-screen control of plant actions. Most plant vendors now propose on-screen control, and it is being included on some plants. The integration of Station Operating Instructions (SOI) into VDU presentation of plants is being developed rapidly. With on-screen control, SOIs can be displayed with control targets able to initiate plant control directly, as called for in the SOIs. Interact displays information and control options, using a cursor to simulate on-screen display and plant control. The displays show a method which integrates soft control and SOI information into a single unified presentation. They simulate the SOI for an accident, on-screen, with simulated inserted plant values

  5. An appraisal of the computed axial tomographic appearance of the human mesentery based on mesenteric contiguity from the duodenojejunal flexure to the mesorectal level

    Energy Technology Data Exchange (ETDEWEB)

    Coffey, J.C.; Culligan, Kevin; Walsh, Leon G.; Sehgal, Rishab; Dunne, Colum; McGrath, Deirdre; Walsh, Dara [University Hospital Limerick, Centre for Interventions in Infection, Inflammation and Immunity (4i), Graduate Entry Medical School and Department of Surgery, Limerick (Ireland); Moore, Michael [Cork University Hospital, Department of Radiology, Cork (Ireland); Staunton, Marie [Mercy University Hospital, Department of Radiology, Cork (Ireland); Scanlon, Timothy; Dewhurst, Catherine; Kenny, Bryan; O'Brien, Julie M. [University Hospital Limerick, Department of Radiology, Limerick (Ireland); O'Riordan, Conor [Kilkenny General Hospital, Department of Radiology, Kilkenny (Ireland); Quondamatteo, Fabio; Dockery, Peter [National University of Ireland Galway, Anatomy, School of Medicine, Galway (Ireland)

    2016-03-15

    The human mesentery is now regarded as contiguous from the duodenojejunal (DJ) to anorectal level. This interpretation prompts re-appraisal of computed tomography (CT) images of the mesentery. A digital model and reference atlas of the mesentery were generated using the full-colour data set of the Visible Human Project (VHP). Seventy-one normal abdominal CT images were examined to identify mesenteric regions. CT appearances were correlated with cadaveric and histological appearances at corresponding levels. Ascending, descending and sigmoid mesocolons were identifiable in 75 %, 86 % and 88 % of the CTs, respectively. Flexural contiguity was evident in 66 %, 68 %, 71 % and 80 % for the ileocaecal, hepatic, splenic and rectosigmoid flexures, respectively. A posterior mesocolic boundary corresponding to the anterior renal fascia was evident in 40 % and 54 % of cases on the right and left, respectively. The anterior pararenal space (in front of the boundary) corresponded to the mesocolon. Using the VHP, a mesenteric digital model and reference atlas were developed. This enabled re-appraisal of CT images of the mesentery, in which contiguous flexural and non-flexural mesenteric regions were repeatedly identifiable. The anterior pararenal space corresponded to the mesocolon. (orig.)

  6. An appraisal of the computed axial tomographic appearance of the human mesentery based on mesenteric contiguity from the duodenojejunal flexure to the mesorectal level

    International Nuclear Information System (INIS)

    Coffey, J.C.; Culligan, Kevin; Walsh, Leon G.; Sehgal, Rishab; Dunne, Colum; McGrath, Deirdre; Walsh, Dara; Moore, Michael; Staunton, Marie; Scanlon, Timothy; Dewhurst, Catherine; Kenny, Bryan; O'Brien, Julie M.; O'Riordan, Conor; Quondamatteo, Fabio; Dockery, Peter

    2016-01-01

    The human mesentery is now regarded as contiguous from the duodenojejunal (DJ) to anorectal level. This interpretation prompts re-appraisal of computed tomography (CT) images of the mesentery. A digital model and reference atlas of the mesentery were generated using the full-colour data set of the Visible Human Project (VHP). Seventy-one normal abdominal CT images were examined to identify mesenteric regions. CT appearances were correlated with cadaveric and histological appearances at corresponding levels. Ascending, descending and sigmoid mesocolons were identifiable in 75 %, 86 % and 88 % of the CTs, respectively. Flexural contiguity was evident in 66 %, 68 %, 71 % and 80 % for the ileocaecal, hepatic, splenic and rectosigmoid flexures, respectively. A posterior mesocolic boundary corresponding to the anterior renal fascia was evident in 40 % and 54 % of cases on the right and left, respectively. The anterior pararenal space (in front of the boundary) corresponded to the mesocolon. Using the VHP, a mesenteric digital model and reference atlas were developed. This enabled re-appraisal of CT images of the mesentery, in which contiguous flexural and non-flexural mesenteric regions were repeatedly identifiable. The anterior pararenal space corresponded to the mesocolon. (orig.)

  7. ORCODE.77: a computer routine to control a nuclear physics experiment by a PDP-15 + CAMAC system, written in assembler language and including many new routines of general interest

    International Nuclear Information System (INIS)

    Dickens, J.K.; McConnell, J.W.

    1977-01-01

    ORCODE.77 is a versatile data-handling computer routine written in MACRO (assembler) language for a PDP-15 computer with EAE (extended arithmetic capability) connected to a CAMAC interface. The Interrupt feature of the computer is utilized. Although the code is oriented for a specific experimental problem, there are many routines of general interest, including a CAMAC Scaler handler, an executive routine to interpret and act upon three-character teletype commands, concise routines to type out double-precision integers (both octal and decimal) and floating-point numbers and to read in integers and floating-point numbers, a routine to convert to and from PDP-15 FORTRAN-IV floating-point format, a routine to handle clock interrupts, and our own DECTAPE handling routine. Routines having specific applications which are applicable to other very similar applications include a display routine using CAMAC instructions, control of external mechanical equipment using CAMAC instructions, storage of data from an Analog-to-digital Converter, analysis of stored data into time-dependent pulse-height spectra, and a routine to read the contents of a Nuclear Data 5050 Analyzer and to prepare DECTAPE output of these data for subsequent analysis by a code written in PDP-15-compiled FORTRAN-IV
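
    The executive routine described above, which interprets and acts upon three-character teletype commands, follows a classic dispatch-table pattern. The sketch below only illustrates that pattern in Python; the command names and handlers are invented for illustration, not taken from ORCODE.77.

```python
# Hedged sketch of a three-character command executive (dispatch table).
# Command mnemonics and handlers are invented examples.

def make_executive(handlers):
    """Return a function that parses 'CMDargs' lines and dispatches them."""
    def execute(line):
        cmd, arg = line[:3].upper(), line[3:].strip()
        if cmd not in handlers:
            return "?"                  # unrecognized command, echo a query
        return handlers[cmd](arg)
    return execute

log = []
executive = make_executive({
    "STA": lambda arg: log.append(("start", arg)) or "OK",
    "STO": lambda arg: log.append(("stop", arg)) or "OK",
    "TYP": lambda arg: f"value={arg}",
})
```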

  8. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  9. Dicty_cDB: VHP481 [Dicty_cDB

    Lifescience Database Archive (English)

    Full Text Available g significant alignments: (bits) Value AY698035_1( AY698035 |pid:none) Desmognath...us marmoratus isolate 69... 40 0.060 AY612344_1( AY612344 |pid:none) Desmognathus marmoratus voucher KH... 4...0 0.079 DQ018662_1( DQ018662 |pid:none) Plethodon dunni isolate 75356 NADH... 38 0.30 AY612348_1( AY612348 |pid:none) Desmog...nathus quadramaculatus vouch... 36 1.1 AY698038_1( AY698038 |pid:none) Desmog...nathus marmoratus isolate T0... 36 1.1 AY698041_1( AY698041 |pid:none) Desmognathus marmoratus i

  10. Dicty_cDB: VHP304 [Dicty_cDB

    Lifescience Database Archive (English)

    Full Text Available vgdmvmatvkkgkpelrkkvctglvvrqrkhwkrkdgvyiyfednagvmcnpkgevkgn ilgpvakecsdlwpkvatnag...yrvslglpvgavmnsadnsgaknlyviavkgikgrlnrlpsa gvgdmvmatvkkgkpelrkkvctglvvrqrkhwkrkdgvyiyfednagvmcnpkgevkgn ilgp

  11. Dicty_cDB: VHP154 [Dicty_cDB

    Lifescience Database Archive (English)

    Full Text Available PGYGMVVSKSQYPVAEMVKTLIQK*ki*i*rs*lvqmipsivhqlkiqiqlslvcqllkf gkqnivsl*nvh*qmlvmp...IVGAIVGSSGA ILSYIMCKAMNRNLMSVILGGVGTSSMGKGEAMKITGTHTEINVDQASEMITNSKNILIV PGYGMVVSKSQYPVAEMVKTLIQK*ki*i*rs*lvqmips

  12. Dicty_cDB: VHP706 [Dicty_cDB

    Lifescience Database Archive (English)

    Full Text Available 5 |pid:none) Homo sapiens full open reading fra... 76 6e-13 AF134593_1( AF134593 |pid:none) Homo sapiens L-pipecoli...3 BC114006_1( BC114006 |pid:none) Bos taurus L-pipecolic acid oxidas... 76 7e-13

  13. Dicty_cDB: VHP888 [Dicty_cDB

    Lifescience Database Archive (English)

    Full Text Available ng fra... 76 2e-12 AF134593_1( AF134593 |pid:none) Homo sapiens L-pipecolic acid oxid... 76 2e-12 AX882278_1...( AX882278 |pid:none) Sequence 17183 from Patent EP10746... 76 2e-12 BC114006_1( BC114006 |pid:none) Bos taurus L-pipecoli

  14. Dicty_cDB: VHP355 [Dicty_cDB

    Lifescience Database Archive (English)

    Full Text Available *kigalsgpflvsrtgdvg*tkywdktsnih**ipqkvlvh*dsrtvamevgir* gvcnnspakwtspeng*r*qwmvdaqslkeviilltcrka*r*rrslnvnss...dktsnih**ipqkvlvh*dsrtvamevgir* gvcnnspakwtspeng*r*qwmvdaqslkeviilltcrka*r*rrslnvnssgvvfsadl dgsskyskeftlkae

  15. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  16. Batteries not included

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, M.

    2001-09-08

    This article traces the development of clockwork wind-up battery chargers that can be used to recharge mobile phones, laptop computers, torches or radio batteries from the pioneering research of the British inventor Trevor Baylis to the marketing of the wind-up gadgets by Freeplay Energy who turned the idea into a commercial product. The amount of cranking needed to power wind-up devices is discussed along with a hand-cranked charger for mobile phones, upgrading the phone charger's mechanism, and drawbacks of the charger. Details are given of another invention using a hand-cranked generator with a supercapacitor as a storage device which has a very much higher capacity for storing electrical charge.

  17. Batteries not included

    International Nuclear Information System (INIS)

    Cooper, M.

    2001-01-01

    This article traces the development of clockwork wind-up battery chargers that can be used to recharge mobile phones, laptop computers, torches or radio batteries from the pioneering research of the British inventor Trevor Baylis to the marketing of the wind-up gadgets by Freeplay Energy who turned the idea into a commercial product. The amount of cranking needed to power wind-up devices is discussed along with a hand-cranked charger for mobile phones, upgrading the phone charger's mechanism, and drawbacks of the charger. Details are given of another invention using a hand-cranked generator with a supercapacitor as a storage device which has a very much higher capacity for storing electrical charge

  18. Neoclassical transport including collisional nonlinearity.

    Science.gov (United States)

    Candy, J; Belli, E A

    2011-06-10

    In the standard δf theory of neoclassical transport, the zeroth-order (Maxwellian) solution is obtained analytically via the solution of a nonlinear equation. The first-order correction δf is subsequently computed as the solution of a linear, inhomogeneous equation that includes the linearized Fokker-Planck collision operator. This equation admits analytic solutions only in extreme asymptotic limits (banana, plateau, Pfirsch-Schlüter), and so must be solved numerically for realistic plasma parameters. Recently, numerical codes have appeared which attempt to compute the total distribution f more accurately than in the standard ordering by retaining some nonlinear terms related to finite-orbit width, while simultaneously reusing some form of the linearized collision operator. In this work we show that higher-order corrections to the distribution function may be unphysical if collisional nonlinearities are ignored.
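
    The standard ordering summarized above can be written schematically as follows (generic δf notation is assumed here; this is a sketch, not the authors' equations):

```latex
% Zeroth order: Maxwellian f_M; first order: a small correction \delta f
f \;=\; f_M + \delta f , \qquad \frac{\delta f}{f_M} = O(\rho_*)

% Linear, inhomogeneous equation for \delta f with the linearized
% Fokker--Planck collision operator C_L (schematic drift-kinetic form):
v_\parallel\, \hat{\mathbf b}\cdot\nabla\, \delta f \;-\; C_L[\delta f]
  \;=\; -\,\mathbf v_d \cdot \nabla f_M
```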

  19. Epitaxial phase diagrams of SrTiO3, CaTiO3, and SrHfO3: Computational investigation including the role of antiferrodistortive and A-site displacement modes

    Science.gov (United States)

    Angsten, Thomas; Asta, Mark

    2018-04-01

    Ground-state epitaxial phase diagrams are calculated by density functional theory (DFT) for SrTiO3, CaTiO3, and SrHfO3 perovskite-based compounds, accounting for the effects of antiferrodistortive and A-site displacement modes. Biaxial strain states corresponding to epitaxial growth of (001)-oriented films are considered, with misfit strains ranging between -4% and 4%. Ground-state structures are determined using a computational procedure in which input structures for DFT optimizations are identified as local minima in expansions of the total energy with respect to strain and soft-mode degrees of freedom. Comparison to results of previous DFT studies demonstrates the effectiveness of the computational approach in predicting ground-state phases. The calculated results show that antiferrodistortive octahedral rotations and associated A-site displacement modes act to suppress polarization and reduce the epitaxial strain energy. A projection of calculated atomic displacements in the ground-state epitaxial structures onto soft-mode eigenvectors shows that three ferroelectric and six antiferrodistortive displacement modes are dominant at all misfit strains considered, with the relative contributions from each varying systematically with the strain. Additional A-site displacement modes contribute to the atomic displacements in CaTiO3 and SrHfO3, which serve to optimize the coordination of the undersized A-site cation.
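
    The total-energy expansion used to seed the DFT optimizations can be sketched in a generic Landau-type form (the symbols below are illustrative assumptions, not the paper's notation): misfit strain ε and soft-mode amplitudes u_i enter through strain-dependent quadratic coefficients, and local minima of the expansion supply candidate input structures.

```latex
% Schematic Landau-type expansion of the total energy in misfit strain
% \epsilon and soft-mode amplitudes u_i (ferroelectric and
% antiferrodistortive); A_i(\epsilon) < 0 signals a soft, condensing mode.
E(\epsilon, \{u_i\}) \;=\; E_0(\epsilon)
  \;+\; \sum_i A_i(\epsilon)\, u_i^{2}
  \;+\; \sum_{i \le j} B_{ij}\, u_i^{2} u_j^{2}
  \;+\; \cdots
```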

  20. New seismograph includes filters

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-02

    The new Nimbus ES-1210 multichannel signal enhancement seismograph from EG&G Geometrics has recently been redesigned to include multimode signal filters on each amplifier. The ES-1210F is a shallow exploration seismograph for near-subsurface exploration such as depth-to-bedrock, geological hazard location, mineral exploration, and landslide investigations.

  1. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  2. Analytic device including nanostructures

    KAUST Repository

    Di Fabrizio, Enzo M.; Fratalocchi, Andrea; Totero Gongora, Juan Sebastian; Coluccio, Maria Laura; Candeloro, Patrizio; Cuda, Gianni

    2015-01-01

    A device for detecting an analyte in a sample comprising: an array including a plurality of pixels, each pixel including a nanochain comprising: a first nanostructure, a second nanostructure, and a third nanostructure, wherein size of the first nanostructure is larger than that of the second nanostructure, and size of the second nanostructure is larger than that of the third nanostructure, and wherein the first nanostructure, the second nanostructure, and the third nanostructure are positioned on a substrate such that when the nanochain is excited by an energy, an optical field between the second nanostructure and the third nanostructure is stronger than an optical field between the first nanostructure and the second nanostructure, wherein the array is configured to receive a sample; and a detector arranged to collect spectral data from a plurality of pixels of the array.

  3. Saskatchewan resources. [including uranium

    Energy Technology Data Exchange (ETDEWEB)

    1979-09-01

    The production of chemicals and minerals for the chemical industry in Saskatchewan are featured, with some discussion of resource taxation. The commodities mentioned include potash, fatty amines, uranium, heavy oil, sodium sulfate, chlorine, sodium hydroxide, sodium chlorate and bentonite. Following the successful outcome of the Cluff Lake inquiry, the uranium industry is booming. Some developments and production figures for Gulf Minerals, Amok, Cenex and Eldorado are mentioned.

  4. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  5. Being Included and Excluded

    DEFF Research Database (Denmark)

    Korzenevica, Marina

    2016-01-01

    Following the civil war of 1996–2006, there was a dramatic increase in the labor mobility of young men and the inclusion of young women in formal education, which led to the transformation of the political landscape of rural Nepal. Mobility and schooling represent a level of prestige that rural...... politics. It analyzes how formal education and mobility either challenge or reinforce traditional gendered norms which dictate a lowly position for young married women in the household and their absence from community politics. The article concludes that women are simultaneously excluded and included from...... community politics. On the one hand, their mobility and decision-making powers decrease with the increase in the labor mobility of men and their newly gained education is politically devalued when compared to the informal education that men gain through mobility, but on the other hand, schooling strengthens...

  6. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  7. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than 'conventional' quantum computers.

  8. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  9. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  10. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  11. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  12. Vacuum hot pressing of titanium-alloy powders

    International Nuclear Information System (INIS)

    Malik, R.K.

    1975-01-01

    Fully or nearly fully dense products with wrought-metal properties have been obtained by vacuum hot pressing (VHP) of several prealloyed Ti-6Al-4V powders, including hydride, hydride/dehydride, and rotating electrode process (REP) spherical powders. The properties of billets vacuum hot pressed from Ti-6Al-4V hydride powder and from hydride/dehydride powders have been shown to be equivalent. REP spherical powder billets processed by VHP or by hot isostatic pressing (HIP) showed equivalent tensile properties. The potential of VHP for the fabrication of near-net aircraft parts such as complex fittings and engine disks offers considerable cost savings due to reduced material and machining requirements

  13. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  14. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
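The deblurring application mentioned above has a direct digital analogue: a blur is a convolution, so it can be undone by division in the Fourier domain. A minimal one-dimensional sketch of that principle (an illustration, not code from the article; the signal, kernel, and sizes are arbitrary choices):

```python
import numpy as np

# Deblurring as inverse filtering: if y = h (*) x (circular
# convolution), then X = Y / H in the Fourier domain recovers x.
rng = np.random.default_rng(0)
x = rng.random(64)                    # "sharp" signal
h = np.zeros(64)
h[:3] = 1.0 / 3.0                     # 3-tap box-blur kernel
y = np.real(np.fft.ifft(np.fft.fft(h) * np.fft.fft(x)))  # blurred signal

# Inverse filter: divide out the blur's frequency response.
x_hat = np.real(np.fft.ifft(np.fft.fft(y) / np.fft.fft(h)))
print(np.allclose(x_hat, x))          # exact recovery in the noiseless case
```

With noise present, plain inverse filtering amplifies frequencies where the kernel response is small, which is why practical deblurring uses regularized (e.g. Wiener) filters instead.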

  15. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration is carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying the location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.

  16. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility, and right to a place in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways it is formed in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved alongside the development of computer hardware and software. The practice-oriented interpretation of computational thinking, which is dominant among educators, is described along with some ways of forming it. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning, and describes the dynamics of the development of this concept, a process connected with the evolution of computer and information technologies and the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  17. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  18. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry also has application in studying catalysis and the properties of polymers, both of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  19. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per-application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable customized synthesis of fabric instances that enhance performance across a variety of applications. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  20. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  1. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  2. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). · Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics. · Emphasis on algorithmic advances that will allow re-application in other...

  3. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  4. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  5. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer. This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  6. Computed tomography for radiographers

    International Nuclear Information System (INIS)

    Brooker, M.

    1986-01-01

    Computed tomography is regarded by many as a complicated union of sophisticated x-ray equipment and computer technology. This book overcomes these complexities. The rigid technicalities of the machinery and the clinical aspects of computed tomography are discussed, including the preparation of patients, both physically and mentally, for scanning. Furthermore, the author also explains how to set up and run a computed tomography department, including advice on how the room should be designed

  7. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  8. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. 
The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from
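The entanglement that the review identifies as the common computational resource can be illustrated numerically. A toy state-vector sketch (not from the review itself; the gate matrices are the standard ones) that prepares the Bell state (|00⟩ + |11⟩)/√2 with a Hadamard followed by a CNOT:

```python
import numpy as np

# Standard single- and two-qubit gates as matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # control q0, target q1

state = np.zeros(4)
state[0] = 1.0                                 # start in |00>
state = CNOT @ np.kron(H, I) @ state           # H on q0, then CNOT

probs = np.abs(state) ** 2                     # measurement probabilities
print(probs)                                   # only |00> and |11> survive
```

Measuring either qubit alone gives a uniformly random bit, yet the two outcomes always agree; that perfect correlation with no local explanation is exactly the EPR-Bell phenomenon the abstract refers to.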

  10. MOS modeling hierarchy including radiation effects

    International Nuclear Information System (INIS)

    Alexander, D.R.; Turfler, R.M.

    1975-01-01

    A hierarchy of modeling procedures has been developed for MOS transistors, circuit blocks, and integrated circuits which include the effects of total dose radiation and photocurrent response. The models were developed for use with the SCEPTRE circuit analysis program, but the techniques are suitable for other modern computer aided analysis programs. The modeling hierarchy permits the designer or analyst to select the level of modeling complexity consistent with circuit size, parametric information, and accuracy requirements. Improvements have been made in the implementation of important second order effects in the transistor MOS model, in the definition of MOS building block models, and in the development of composite terminal models for MOS integrated circuits

  11. (including travel dates) Proposed itinerary

    Indian Academy of Sciences (India)

    Ashok

    31 July to 22 August 2012 (including travel dates). Proposed itinerary: Arrival in Bangalore on 1 August. 1-5 August: Bangalore, Karnataka. Suggested institutions: Indian Institute of Science, Bangalore. St Johns Medical College & Hospital, Bangalore. Jawaharlal Nehru Centre, Bangalore. 6-8 August: Chennai, TN.

  12. Computers in Nuclear Physics Division

    International Nuclear Information System (INIS)

    Kowalczyk, M.; Tarasiuk, J.; Srebrny, J.

    1997-01-01

    Improvements to the computer equipment in the Nuclear Physics Division are described. These include: new computer equipment and hardware upgrades, software development, new programs for computer booting, and modernization of data acquisition systems

  13. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  14. Theory including future not excluded

    DEFF Research Database (Denmark)

    Nagao, K.; Nielsen, H.B.

    2013-01-01

    We study a complex action theory (CAT) whose path runs over not only the past but also the future. We show that, if we regard a matrix element defined in terms of the future state at time T and the past state at time T_A as an expectation value in the CAT, then we are allowed to have the Heisenberg equation..., Ehrenfest's theorem, and the conserved probability current density. In addition, we show that the expectation value at the present time t of a future-included theory for large T - t and large t - T_A corresponds to that of a future-not-included theory with a proper inner product for large t - T_A. Hence, the CAT...

  15. A new computing principle

    International Nuclear Information System (INIS)

    Fatmi, H.A.; Resconi, G.

    1988-01-01

    In 1954 while reviewing the theory of communication and cybernetics the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers it was found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle

  16. Computers and Information Flow.

    Science.gov (United States)

    Patrick, R. L.

    This paper is designed to fill the need for an easily understood introduction to the computing and data processing field for the layman who has, or can expect to have, some contact with it. Information provided includes the unique terminology and jargon of the field, the various types of computers and the scope of computational capabilities, and…

  17. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research including theoretic developments, new computational alg

  18. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  19. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2018-01-01

    This paper is devoted to the research of the educational resources and possibilities of modern computer games. The “internal” educational aspects of computer games include an educational mechanism (a separate or integrated “tutorial”) and the representation of a real or even fantastic educational process within virtual worlds. The “external” dimension represents the educational opportunities of computer games for personal and professional development in different genres of computer games (various transport, so...

  20. Device including a contact detector

    DEFF Research Database (Denmark)

    2011-01-01

    The present invention relates to a probe for determining an electrical property of an area of a surface of a test sample, the probe being intended to be in a specific orientation relative to the test sample. The probe may comprise a supporting body defining a first surface. A plurality of cantilever arms (12) may extend from the supporting body in co-planar relationship with the first surface. The plurality of cantilever arms (12) may extend substantially parallel to each other, and each of the plurality of cantilever arms (12) may include an electrically conductive tip for contacting the area of the test sample by movement of the probe relative to the surface of the test sample into the specific orientation. The probe may further comprise a contact detector (14) extending from the supporting body, arranged so as to contact the surface of the test sample prior to any one of the plurality of cantilever arms (12).

  1. A formalization of computational trust

    NARCIS (Netherlands)

    Güven - Ozcelebi, C.; Holenderski, M.J.; Ozcelebi, T.; Lukkien, J.J.

    2018-01-01

    Computational trust aims to quantify trust and is studied by many disciplines including computer science, social sciences and business science. We propose a formal computational trust model, including its parameters and operations on these parameters, as well as a step by step guide to compute trust
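The record gives no details of the proposed model, but the flavor of quantifying trust can be shown with a standard beta-reputation estimate (a generic textbook formulation assumed here for illustration; the function name and parameters are hypothetical, not the paper's):

```python
# Beta-reputation trust: the expected probability of a good outcome
# given p positive and n negative past interactions, i.e. the mean
# of a Beta(p + 1, n + 1) distribution.
def trust(p: int, n: int) -> float:
    """Return a trust value in (0, 1) from interaction counts."""
    return (p + 1) / (p + n + 2)

print(trust(0, 0))   # 0.5 -- no history, neutral trust
print(trust(8, 2))   # 0.75 -- mostly positive evidence raises trust
```

Operations on such parameters (combining evidence from multiple sources, discounting old observations) are the kind of machinery a formal computational trust model makes precise.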

  2. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  3. HCI in Mobile and Ubiquitous Computing

    OpenAIRE

    椎尾, 一郎; 安村, 通晃; 福本, 雅明; 伊賀, 聡一郎; 増井, 俊之

    2003-01-01

    This paper provides some perspectives on human-computer interaction in mobile and ubiquitous computing. The review covers an overview of ubiquitous computing, mobile computing and wearable computing. It also summarizes HCI topics in these fields, including real-world oriented interfaces, multi-modal interfaces, context awareness and invisible computers. Finally we discuss killer applications for the coming ubiquitous computing era.

  4. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  5. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  6. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).
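At the signal-processing end of this field, the basic object is simply an array of audio samples. A minimal synthesis sketch (illustrative only; the sample rate and pitch are conventional choices, not taken from the chapter):

```python
import numpy as np

# One second of a 440 Hz (concert A) sine tone at CD-quality rate.
sample_rate = 44100                        # samples per second
t = np.arange(sample_rate) / sample_rate   # time axis, 0 .. 1 s
tone = 0.5 * np.sin(2 * np.pi * 440.0 * t) # amplitude 0.5 to leave headroom

print(len(tone))                           # 44100 samples
```

Writing `tone` to a WAV file, or summing several such arrays at different frequencies (additive synthesis), are the next steps most computer-music systems build on.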

  7. Prospective Evaluation of Magnetic Resonance Imaging and [18F]Fluorodeoxyglucose Positron Emission Tomography-Computed Tomography at Diagnosis and Before Maintenance Therapy in Symptomatic Patients With Multiple Myeloma Included in the IFM/DFCI 2009 Trial: Results of the IMAJEM Study.

    Science.gov (United States)

    Moreau, Philippe; Attal, Michel; Caillot, Denis; Macro, Margaret; Karlin, Lionel; Garderet, Laurent; Facon, Thierry; Benboubker, Lotfi; Escoffre-Barbe, Martine; Stoppa, Anne-Marie; Laribi, Kamel; Hulin, Cyrille; Perrot, Aurore; Marit, Gerald; Eveillard, Jean-Richard; Caillon, Florence; Bodet-Milin, Caroline; Pegourie, Brigitte; Dorvaux, Veronique; Chaleteix, Carine; Anderson, Kenneth; Richardson, Paul; Munshi, Nikhil C; Avet-Loiseau, Herve; Gaultier, Aurelie; Nguyen, Jean-Michel; Dupas, Benoit; Frampas, Eric; Kraeber-Bodere, Françoise

    2017-09-01

    Purpose Magnetic resonance imaging (MRI) and positron emission tomography-computed tomography (PET-CT) are important imaging techniques in multiple myeloma (MM). We conducted a prospective trial in patients with MM aimed at comparing MRI and PET-CT with respect to the detection of bone lesions at diagnosis and the prognostic value of the techniques. Patients and Methods One hundred thirty-four patients received a combination of lenalidomide, bortezomib, and dexamethasone (RVD) with or without autologous stem-cell transplantation, followed by lenalidomide maintenance. PET-CT and MRI were performed at diagnosis, after three cycles of RVD, and before maintenance therapy. The primary end point was the detection of bone lesions at diagnosis by MRI versus PET-CT. Secondary end points included the prognostic impact of MRI and PET-CT regarding progression-free (PFS) and overall survival (OS). Results At diagnosis, MRI results were positive in 127 of 134 patients (95%), and PET-CT results were positive in 122 of 134 patients (91%; P = .33). Normalization of MRI after three cycles of RVD and before maintenance was not predictive of PFS or OS. PET-CT became normal after three cycles of RVD in 32% of the patients with a positive evaluation at baseline, and PFS was improved in this group (30-month PFS, 78.7% v 56.8%, respectively). PET-CT normalization before maintenance was described in 62% of the patients who were positive at baseline. This was associated with better PFS and OS. Extramedullary disease at diagnosis was an independent prognostic factor for PFS and OS, whereas PET-CT normalization before maintenance was an independent prognostic factor for PFS. Conclusion There is no difference in the detection of bone lesions at diagnosis when comparing PET-CT and MRI. PET-CT is a powerful tool to evaluate the prognosis of de novo myeloma.

  8. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last 2 decades most basic algorithms have not changed, but what has is the huge increase in computer speed and ease of use, along with the corresponding orders of magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence data bases. While these are important applications they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  9. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves for identifying learner readiness levels and defining the basic conceptual framework. A language teacher also contributes to the process, since it caters for the creative function of the basic linguistic skills. The teaching technique is applied at the 9-11 age level. The lesson introduces an evaluation process covering the basic information, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered to be a good sample of planning for any subject, for the unpredicted convergence of visual and technical abilities with linguistic abilities.

  10. Scalable optical quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Manykin, E A; Mel'nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre 'Kurchatov Institute', Moscow (Russian Federation)]

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare-earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  11. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i
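The compositional use of distributional semantics mentioned in this record can be illustrated with a minimal sketch: toy word vectors (entirely invented here, not from any corpus) are composed by addition and sentence meanings are compared by cosine similarity.

```python
import numpy as np

# Toy distributional-composition sketch. The vectors are made up for
# illustration; real systems learn them from corpus co-occurrence statistics.
vec = {
    "dog":   np.array([0.9, 0.1, 0.0]),
    "barks": np.array([0.8, 0.2, 0.1]),
    "cat":   np.array([0.7, 0.3, 0.0]),
    "meows": np.array([0.6, 0.4, 0.1]),
    "stock": np.array([0.0, 0.1, 0.9]),
    "falls": np.array([0.1, 0.0, 0.8]),
}

def sentence(words):
    # simplest compositional model: vector addition
    return sum(vec[w] for w in words)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

s1 = sentence(["dog", "barks"])
s2 = sentence(["cat", "meows"])
s3 = sentence(["stock", "falls"])
assert cosine(s1, s2) > cosine(s1, s3)  # related sentences score higher
```

Additive composition is only the baseline; the research surveyed in such volumes replaces it with syntax-aware operators.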

  12. Scalable optical quantum computer

    International Nuclear Information System (INIS)

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare-earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  13. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  14. 78 FR 34669 - Certain Electronic Devices, Including Wireless Communication Devices, Portable Music and Data...

    Science.gov (United States)

    2013-06-10

    ..., Including Wireless Communication Devices, Portable Music and Data Processing Devices, and Tablet Computers... importing wireless communication devices, portable music and data processing devices, and tablet computers... certain electronic devices, including wireless communication devices, portable music and data processing...

  15. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  16. Center for computer security: Computer Security Group conference. Summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  17. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  18. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  19. Resource allocation in grid computing

    NARCIS (Netherlands)

    Koole, Ger; Righter, Rhonda

    2007-01-01

    Grid computing, in which a network of computers is integrated to create a very fast virtual computer, is becoming ever more prevalent. Examples include the TeraGrid and Planet-lab.org, as well as applications on the existing Internet that take advantage of unused computing and storage capacity of

  20. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs
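The transmission-CT fundamentals summarized above rest on Beer-Lambert attenuation: each detector reading gives a line integral of the attenuation coefficient, recovered as -log(I/I0). A minimal sketch with an invented square phantom (all values illustrative):

```python
import numpy as np

# Beer-Lambert sketch of one parallel-beam transmission-CT projection.
# Phantom, pixel size, and beam intensity are invented for illustration.
mu = np.zeros((64, 64))      # attenuation coefficients (1/cm)
mu[24:40, 24:40] = 0.2       # a square object in air
dx = 0.1                     # pixel size (cm)
I0 = 1.0                     # unattenuated beam intensity

# Rays travel along rows: I = I0 * exp(-integral of mu along the ray).
line_integrals = mu.sum(axis=1) * dx
I = I0 * np.exp(-line_integrals)

# CT works with projections p = -log(I/I0), which recover the line integrals;
# a full scanner collects p at many angles and inverts the Radon transform.
p = -np.log(I / I0)
assert np.allclose(p, line_integrals)
```

Rays passing through the object attenuate more, so p peaks over rows 24-39; reconstruction algorithms such as filtered back-projection start from exactly these projections.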

  1. Introduction to reversible computing

    CERN Document Server

    Perumalla, Kalyan S

    2013-01-01

    Few books comprehensively cover the software and programming aspects of reversible computing. Filling this gap, Introduction to Reversible Computing offers an expanded view of the field that includes the traditional energy-motivated hardware viewpoint as well as the emerging application-motivated software approach. Collecting scattered knowledge into one coherent account, the book provides a compendium of both classical and recently developed results on reversible computing. It explores up-and-coming theories, techniques, and tools for the application of rever

  2. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  3. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  4. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  5. Computer science I essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science I includes fundamental computer concepts, number representations, Boolean algebra, switching circuits, and computer architecture.

  6. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains the conceptual framework (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information networks). The text discusses algebra, particularly as it applies to computing, and concentrates on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasize

  7. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours......The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours...

  8. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high-performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  9. Including gauge corrections to thermal leptogenesis

    Energy Technology Data Exchange (ETDEWEB)

    Huetig, Janine

    2013-05-17

    . Furthermore, we have computed the Majorana neutrino production rate itself in chapter 6 to test our numerical procedure. In this context we have calculated the tree-level result as well as the gauge corrected result for the Majorana neutrino production rate. Finally in chapter 7, we have implemented the Majorana neutrino ladder rung diagram into our setup for leptogenesis: As a first consideration, we have collected all gauge corrected diagrams up to three-loop order for the asymmetry-causing two-loop diagrams. However, the results of chap. 5 showed that it is not sufficient to just include diagrams up to three-loop level. Due to the necessity of resumming all n-loop diagrams, we have constructed a cylindrical diagram that fulfils this condition. This diagram is the link between the Majorana neutrino ladder rung diagram calculated before on the one hand and the lepton asymmetry on the other. Therefore we have been able to derive a complete expression for the integrated lepton number matrix including all leading order corrections. The numerical analysis of this lepton number matrix needs a great computational effort since for the resulting eight-dimensional integral two ordinary differential equations have to be computed for each point the routine evaluates. Thus the result remains yet inaccessible. Research perspectives: Summarising, this thesis provides the basis for a systematic inclusion of gauge interactions in thermal leptogenesis scenarios. As a next step, one should evaluate the expression for the integrated lepton number numerically to gain a value, which can be used for comparison to earlier results such as the solutions of the Boltzmann equations as well as the Kadanoff-Baym ansatz with the implemented Standard Model widths. This numerical result would be the first quantitative number, which contains leading order corrections due to all interactions of the Majorana neutrino with the Standard Model particles. 
Further corrections by means of including washout effects

  10. Including gauge corrections to thermal leptogenesis

    International Nuclear Information System (INIS)

    Huetig, Janine

    2013-01-01

    . Furthermore, we have computed the Majorana neutrino production rate itself in chapter 6 to test our numerical procedure. In this context we have calculated the tree-level result as well as the gauge corrected result for the Majorana neutrino production rate. Finally in chapter 7, we have implemented the Majorana neutrino ladder rung diagram into our setup for leptogenesis: As a first consideration, we have collected all gauge corrected diagrams up to three-loop order for the asymmetry-causing two-loop diagrams. However, the results of chap. 5 showed that it is not sufficient to just include diagrams up to three-loop level. Due to the necessity of resumming all n-loop diagrams, we have constructed a cylindrical diagram that fulfils this condition. This diagram is the link between the Majorana neutrino ladder rung diagram calculated before on the one hand and the lepton asymmetry on the other. Therefore we have been able to derive a complete expression for the integrated lepton number matrix including all leading order corrections. The numerical analysis of this lepton number matrix needs a great computational effort since for the resulting eight-dimensional integral two ordinary differential equations have to be computed for each point the routine evaluates. Thus the result remains yet inaccessible. Research perspectives: Summarising, this thesis provides the basis for a systematic inclusion of gauge interactions in thermal leptogenesis scenarios. As a next step, one should evaluate the expression for the integrated lepton number numerically to gain a value, which can be used for comparison to earlier results such as the solutions of the Boltzmann equations as well as the Kadanoff-Baym ansatz with the implemented Standard Model widths. This numerical result would be the first quantitative number, which contains leading order corrections due to all interactions of the Majorana neutrino with the Standard Model particles. 
Further corrections by means of including washout effects

  11. Zγ production at NNLO including anomalous couplings

    Science.gov (United States)

    Campbell, John M.; Neumann, Tobias; Williams, Ciaran

    2017-11-01

    In this paper we present a next-to-next-to-leading order (NNLO) QCD calculation of the processes pp → l⁺l⁻γ and pp → νν̄γ that we have implemented in MCFM. Our calculation includes QCD corrections at NNLO both for the Standard Model (SM) and additionally in the presence of Zγγ and ZZγ anomalous couplings. We compare our implementation, obtained using the jettiness slicing approach, with a previous SM calculation and find broad agreement. Focusing on the sensitivity of our results to the slicing parameter, we show that using our setup we are able to compute NNLO cross sections with numerical uncertainties of about 0.1%, which is small compared to residual scale uncertainties of a few percent. We study potential improvements using two different jettiness definitions and the inclusion of power corrections. At √s = 13 TeV we present phenomenological results and consider Zγ as a background to H → Zγ production. We find that, with typical cuts, the inclusion of NNLO corrections represents a small effect and loosens the extraction of limits on anomalous couplings by about 10%.

  12. Langevin simulations of QCD, including fermions

    International Nuclear Information System (INIS)

    Kronfeld, A.S.

    1986-02-01

    We encounter critical slow down in updating when ξ/a → ∞ and in matrix inversion (needed to include fermions) when m_q a → 0. A simulation that purports to solve QCD numerically will encounter these limits, so to face the challenge in the title of this workshop, we must cure the disease of critical slow down. Physically, this critical slow down is due to the reluctance of changes at short distances to propagate to large distances. Numerically, the stability of an algorithm at short wavelengths requires a (moderately) small step size; critical slow down occurs when the effective long-wavelength step size becomes tiny. The remedy for this disease is an algorithm that propagates signals quickly throughout the system; i.e. one whose effective step size is not reduced for the long-wavelength components of the fields. (Here the effective "step size" is essentially an inverse decorrelation time.) To do so one must resolve various wavelengths of the system and modify the dynamics (in CPU time) of the simulation so that all modes evolve at roughly the same rate. This can be achieved by introducing Fourier transforms. I show how to implement Fourier acceleration for Langevin updating and for conjugate gradient matrix inversion. The crucial feature of these algorithms that lends them to Fourier acceleration is that they update the lattice globally; hence the Fourier transforms are computed once per sweep rather than once per hit. (orig./HSI)
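The mode-dependent step size the abstract describes can be sketched for a free 1D scalar field, where the Fourier-accelerated Langevin update is diagonal in momentum space. All parameters here are illustrative; the QCD case in the record additionally involves gauge fields and fermion matrix inversion.

```python
import numpy as np

# Fourier-accelerated Langevin sketch for a free 1D scalar field on a
# periodic lattice (illustrative parameters, not the QCD setting).
rng = np.random.default_rng(0)
N, m2, eps = 64, 0.5, 0.1
k = 2 * np.pi * np.fft.fftfreq(N)
omega2 = 4 * np.sin(k / 2) ** 2 + m2   # lattice dispersion: k_hat^2 + m^2
eps_k = eps / omega2                   # acceleration: every mode relaxes at rate ~eps

phi = np.zeros(N)
for _ in range(2000):
    # drift = -dS/dphi for S = (1/2) * sum[(grad phi)^2 + m^2 * phi^2]
    drift = np.roll(phi, 1) + np.roll(phi, -1) - 2 * phi - m2 * phi
    eta = rng.standard_normal(N)
    # apply the mode-dependent step size (and matching noise) in momentum space
    dphi_k = eps_k * np.fft.fft(drift) + np.sqrt(2 * eps_k) * np.fft.fft(eta)
    phi += np.fft.ifft(dphi_k).real
```

Without the 1/ω² rescaling, the long-wavelength modes would decorrelate a factor ω²_max/ω²_min more slowly, which is precisely the critical slow down the abstract describes.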

  13. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  14. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  15. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  16. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today's microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  17. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  18. Demonstration of blind quantum computing.

    Science.gov (United States)

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip

    2012-01-20

    Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.
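One of the delegated computations mentioned, Deutsch's algorithm, is small enough to sketch as a plain classical statevector simulation (this illustrates only the algorithm itself, not the blind delegation protocol):

```python
import numpy as np

# Statevector simulation of Deutsch's algorithm: one oracle query decides
# whether f: {0,1} -> {0,1} is constant or balanced.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def deutsch(f):
    # Oracle U_f|x,y> = |x, y XOR f(x)> as a 4x4 permutation matrix.
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    psi = np.kron(H, H) @ np.array([0.0, 1.0, 0.0, 0.0])  # |0>|1>, then H on both
    psi = np.kron(H, np.eye(2)) @ U @ psi                 # oracle, then H on qubit 1
    p1 = abs(psi[2]) ** 2 + abs(psi[3]) ** 2              # P(first qubit measures 1)
    return "balanced" if p1 > 0.5 else "constant"

assert deutsch(lambda x: 0) == "constant"
assert deutsch(lambda x: 1 - x) == "balanced"
```

The phase-kickback trick makes the first qubit end in |0> for constant f and |1> for balanced f, so a single measurement settles a property that classically needs two evaluations.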

  19. Computational mathematics in China

    CERN Document Server

    Shi, Zhong-Ci

    1994-01-01

    This volume describes the most significant contributions made by Chinese mathematicians over the past decades in various areas of computational mathematics. Some of the results are quite important and complement Western developments in the field. The contributors to the volume range from noted senior mathematicians to promising young researchers. The topics include finite element methods, computational fluid mechanics, numerical solutions of differential equations, computational methods in dynamical systems, numerical algebra, approximation, and optimization. Containing a number of survey articles, the book provides an excellent way for Western readers to gain an understanding of the status and trends of computational mathematics in China.

  20. Computer Games and Art

    Directory of Open Access Journals (Sweden)

    Anton Sukhov

    2015-10-01

    Full Text Available This article is devoted to the search for relevant sources (primary and secondary) and for characteristics of computer games that allow them to be included in the field of art (such as the creation of artistic games, computer graphics, active interaction with other forms of art, signs of a spiritual aesthetic act, the own temporality of computer games, “aesthetic illusion”, interactivity). In general, modern computer games can be attributed to commercial art and popular culture (blockbuster games) and to elite forms of contemporary media art (author’s games, visionary games).

  1. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn
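The single tridiagonal linear systems mentioned in this record are typically solved serially by the Thomas algorithm; the parallel variants such volumes discuss (e.g. cyclic reduction) are measured against this O(n) baseline. A minimal sketch with an invented example system:

```python
import numpy as np

# Thomas algorithm: O(n) solve of a tridiagonal system A x = d, with
# sub-diagonal a (a[0] unused), diagonal b, super-diagonal c (c[-1] unused).
def thomas(a, b, c, d):
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Example: the classic -u'' = f discretization (2 on the diagonal, -1 off it).
n = 5
a = np.full(n, -1.0); a[0] = 0.0
b = np.full(n, 2.0)
c = np.full(n, -1.0); c[-1] = 0.0
d = np.ones(n)
x = thomas(a, b, c, d)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
assert np.allclose(A @ x, d)
```

The serial data dependence in the forward sweep is what motivates the vectorized and parallel reformulations covered in the volume.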

  2. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  3. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly small size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into the design of future computers in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  4. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  5. Theoretical Computer Science

    DEFF Research Database (Denmark)

    2002-01-01

    The proceedings contains 8 papers from the Conference on Theoretical Computer Science. Topics discussed include: query by committee, linear separation and random walks; hardness results for neural network approximation problems; a geometric approach to leveraging weak learners; mind change...

  6. Paraconsistent Computational Logic

    DEFF Research Database (Denmark)

    Jensen, Andreas Schmidt; Villadsen, Jørgen

    2012-01-01

    In classical logic everything follows from inconsistency and this makes classical logic problematic in areas of computer science where contradictions seem unavoidable. We describe a many-valued paraconsistent logic, discuss the truth tables and include a small case study....
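A concrete instance of a many-valued paraconsistent logic with its truth tables is Priest's three-valued LP, used here purely as an illustrative stand-in (the paper's own logic may differ):

```python
# Truth tables for a three-valued paraconsistent logic (Priest's LP).
# Values are ordered F < B < T, where B ("both true and false") is a
# designated value, so a contradiction need not entail everything.
F, B, T = 0.0, 0.5, 1.0

def neg(a):      return 1.0 - a       # B is its own negation
def conj(a, b):  return min(a, b)
def disj(a, b):  return max(a, b)

def designated(a):                    # LP designates both T and B
    return a >= B

# Explosion fails: from the contradiction p AND not-p we cannot infer q.
p, q = B, F
assert designated(conj(p, neg(p)))    # the contradiction is designated
assert not designated(q)              # ...yet an arbitrary q does not follow
```

Because B survives negation and conjunction, an inconsistent knowledge base stays non-trivial, which is exactly the property that makes such logics attractive where contradictions are unavoidable.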

  7. Computational modeling in biomechanics

    CERN Document Server

    Mofrad, Mohammad

    2010-01-01

    This book provides a glimpse of the diverse and important roles that modern computational technology is playing in various areas of biomechanics. It includes unique chapters on ab initio quantum mechanical, molecular dynamic and scale coupling methods..

  8. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable to operate in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of the maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community

  9. 75 FR 4583 - In the Matter of: Certain Electronic Devices, Including Mobile Phones, Portable Music Players...

    Science.gov (United States)

    2010-01-28

    ..., Including Mobile Phones, Portable Music Players, and Computers; Notice of Investigation AGENCY: U.S... music players, and computers, by reason of infringement of certain claims of U.S. Patent Nos. 6,714,091... importation of certain electronic devices, including mobile phones, portable music players, or computers that...

  10. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten...

  11. McMaster University: College and University Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    The computing and information services (CIS) organization includes administrative computing, academic computing, and networking and has three divisions: computing services, development services, and information services. Other computing activities include Health Sciences, Humanities Computing Center, and Department of Computer Science and Systems.…

  12. An introduction to computer viruses

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D.R.

    1992-03-01

    This report on computer viruses is based upon a thesis written for the Master of Science degree in Computer Science from the University of Tennessee in December 1989 by David R. Brown. This thesis is entitled An Analysis of Computer Virus Construction, Proliferation, and Control and is available through the University of Tennessee Library. This paper contains an overview of the computer virus arena that can help the reader to evaluate the threat that computer viruses pose. The extent of this threat can only be determined by evaluating many different factors. These factors include the relative ease with which a computer virus can be written, the motivation involved in writing a computer virus, the damage and overhead incurred by infected systems, and the legal implications of computer viruses, among others. Based upon the research, the development of a computer virus seems to require more persistence than technical expertise. This is a frightening proclamation to the computing community. The education of computer professionals to the dangers that viruses pose to the welfare of the computing industry as a whole is stressed as a means of inhibiting the current proliferation of computer virus programs. Recommendations are made to assist computer users in preventing infection by computer viruses. These recommendations support solid general computer security practices as a means of combating computer viruses.

  13. Mathematics, the Computer, and the Impact on Mathematics Education.

    Science.gov (United States)

    Tooke, D. James

    2001-01-01

    Discusses the connection between mathematics and the computer; mathematics curriculum; mathematics instruction, including teachers learning to use computers; and the impact of the computer on learning mathematics. (LRW)

  14. 77 FR 60720 - Certain Electronic Devices, Including Wireless Communication Devices, Portable Music and Data...

    Science.gov (United States)

    2012-10-04

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-794] Certain Electronic Devices, Including Wireless Communication Devices, Portable Music and Data Processing Devices, and Tablet Computers... communication devices, portable music and data processing devices, and tablet computers, imported by Apple Inc...

  15. 77 FR 70464 - Certain Electronic Devices, Including Wireless Communication Devices, Portable Music and Data...

    Science.gov (United States)

    2012-11-26

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-794] Certain Electronic Devices, Including Wireless Communication Devices, Portable Music and Data Processing Devices, and Tablet Computers... wireless communication devices, portable music and data processing devices, and tablet computers, by reason...

  16. Analysis of Smart Composite Structures Including Debonding

    Science.gov (United States)

    Chattopadhyay, Aditi; Seeley, Charles E.

    1997-01-01

    Smart composite structures with distributed sensors and actuators have the capability to actively respond to a changing environment while offering significant weight savings and additional passive controllability through ply tailoring. Piezoelectric sensing and actuation of composite laminates is the most promising concept due to the static and dynamic control capabilities. Essential to the implementation of these smart composites are the development of accurate and efficient modeling techniques and experimental validation. This research addresses each of these important topics. A refined higher order theory is developed to model composite structures with surface bonded or embedded piezoelectric transducers. These transducers are used as both sensors and actuators for closed loop control. The theory accurately captures the transverse shear deformation through the thickness of the smart composite laminate while satisfying stress free boundary conditions on the free surfaces. The theory is extended to include the effect of debonding at the actuator-laminate interface. The developed analytical model is implemented using the finite element method utilizing an induced strain approach for computational efficiency. This allows general laminate geometries and boundary conditions to be analyzed. The state space control equations are developed to allow flexibility in the design of the control system. Circuit concepts are also discussed. Static and dynamic results of smart composite structures, obtained using the higher order theory, are correlated with available analytical data. Comparisons, including debonded laminates, are also made with a general purpose finite element code and available experimental data. Overall, very good agreement is observed. Convergence of the finite element implementation of the higher order theory is shown with exact solutions. Additional results demonstrate the utility of the developed theory to study piezoelectric actuation of composite

  17. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  18. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  19. LTRACK: Beam-transport calculation including wakefield effects

    International Nuclear Information System (INIS)

    Chan, K.C.D.; Cooper, R.K.

    1988-01-01

    LTRACK is a first-order beam-transport code that includes wakefield effects up to quadrupole modes. This paper will introduce the readers to this computer code by describing the history, the method of calculations, and a brief summary of the input/output information. Future plans for the code will also be described

  20. Computational thinking as an emerging competence domain

    NARCIS (Netherlands)

    Yadav, A.; Good, J.; Voogt, J.; Fisser, P.; Mulder, M.

    2016-01-01

    Computational thinking is a problem-solving skill set, which includes problem decomposition, algorithmic thinking, abstraction, and automation. Even though computational thinking draws upon concepts fundamental to computer science (CS), it has broad application to all disciplines. It has been

  1. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  2. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  3. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.
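The data-parallel kernel model this record describes can be sketched in plain Python: in CUDA, each thread computes one output element from its global index `blockIdx.x * blockDim.x + threadIdx.x`. The emulation below only illustrates that indexing scheme; real CUDA kernels are written in C/C++ and run on the GPU, and the `launch` helper here is a hypothetical stand-in for a kernel launch.

```python
# Pure-Python emulation of CUDA's data-parallel launch model:
# every "thread" runs the same kernel on a different global index.
def vector_add_kernel(tid, a, b, out):
    if tid < len(out):                 # CUDA kernels guard against surplus threads
        out[tid] = a[tid] + b[tid]

def launch(kernel, grid_dim, block_dim, *args):
    # Sequential stand-in for what the GPU does in parallel.
    for block in range(grid_dim):
        for thread in range(block_dim):
            tid = block * block_dim + thread   # blockIdx.x * blockDim.x + threadIdx.x
            kernel(tid, *args)

n = 10
a = list(range(n))
b = [10] * n
out = [0] * n
launch(vector_add_kernel, 3, 4, a, b, out)     # 12 threads cover 10 elements
# out is now [10, 11, ..., 19]
```

The bounds check inside the kernel is idiomatic CUDA: the grid is rounded up to a whole number of blocks, so some threads have no element to process.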

  4. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay, Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp. 69-81.

  5. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  6. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  7. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  8. Power throttling of collections of computing elements

    Science.gov (United States)

    Bellofatto, Ralph E [Ridgefield, CT; Coteus, Paul W [Yorktown Heights, NY; Crumley, Paul G [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Gooding, Thomas M [Rochester, MN; Haring, Rudolf A [Cortlandt Manor, NY; Megerian, Mark G [Rochester, MN; Ohmacht, Martin [Yorktown Heights, NY; Reed, Don D [Mantorville, MN; Swetz, Richard A [Mahopac, NY; Takken, Todd [Brewster, NY

    2011-08-16

    An apparatus and method for controlling power usage in a computer include a plurality of computers communicating with a local control device, and a power source supplying power to the local control device and the computer. A plurality of sensors communicate with the computer for ascertaining power usage of the computer, and a system control device communicates with the computer for controlling power usage of the computer.

  9. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  10. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  11. Computer games and prosocial behaviour.

    Science.gov (United States)

    Mengel, Friederike

    2014-01-01

    We relate different self-reported measures of computer use to individuals' propensity to cooperate in the Prisoner's dilemma. The average cooperation rate is positively related to the self-reported amount of time participants spend playing computer games. None of the other computer time use variables (including time spent on social media, browsing internet, working etc.) are significantly related to cooperation rates.

  12. The Need for Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Bernier, David

    2011-01-01

    Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…

  13. Computing architecture for autonomous microgrids

    Science.gov (United States)

    Goldsmith, Steven Y.

    2015-09-29

    A computing architecture that facilitates autonomously controlling operations of a microgrid is described herein. A microgrid network includes numerous computing devices that execute intelligent agents, each of which is assigned to a particular entity (load, source, storage device, or switch) in the microgrid. The intelligent agents can execute in accordance with predefined protocols to collectively perform computations that facilitate uninterrupted control of the microgrid.

  14. Administrative Computing in Continuing Education.

    Science.gov (United States)

    Broxton, Harry

    1982-01-01

    Describes computer applications in the Division of Continuing Education at Brigham Young University. These include instructional applications (computer assisted instruction, computer science education, and student problem solving) and administrative applications (registration, payment records, grades, reports, test scoring, mailing, and others).…

  15. AV Programs for Computer Know-How.

    Science.gov (United States)

    Mandell, Phyllis Levy

    1985-01-01

    Lists 44 audiovisual programs (most released between 1983 and 1984) grouped in seven categories: computers in society, introduction to computers, computer operations, languages and programing, computer graphics, robotics, computer careers. Excerpts from "School Library Journal" reviews, price, and intended grade level are included. Names…

  16. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

    This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...

  17. Offline computing and networking

    International Nuclear Information System (INIS)

    Appel, J.A.; Avery, P.; Chartrand, G.

    1985-01-01

    This note summarizes the work of the Offline Computing and Networking Group. The report is divided into two sections; the first deals with the computing and networking requirements and the second with the proposed way to satisfy those requirements. In considering the requirements, we have considered two types of computing problems. The first is CPU-intensive activity such as production data analysis (reducing raw data to DST), production Monte Carlo, or engineering calculations. The second is physicist-intensive computing such as program development, hardware design, physics analysis, and detector studies. For both types of computing, we examine a variety of issues. These included a set of quantitative questions: how much CPU power (for turn-around and for through-put), how much memory, mass-storage, bandwidth, and so on. There are also very important qualitative issues: what features must be provided by the operating system, what tools are needed for program design, code management, database management, and for graphics

  18. Applications of Computer Algebra Conference

    CERN Document Server

    Martínez-Moro, Edgar

    2017-01-01

    The Applications of Computer Algebra (ACA) conference covers a wide range of topics from Coding Theory to Differential Algebra to Quantum Computing, focusing on the interactions of these and other areas with the discipline of Computer Algebra. This volume provides the latest developments in the field as well as its applications in various domains, including communications, modelling, and theoretical physics. The book will appeal to researchers and professors of computer algebra, applied mathematics, and computer science, as well as to engineers and computer scientists engaged in research and development.

  19. Introduction to computer networking

    CERN Document Server

    Robertazzi, Thomas G

    2017-01-01

    This book gives a broad look at both fundamental networking technology and new areas that support it and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology such as wired and wireless networks, enabling technologies such as data centers, software defined networking, cloud and grid computing and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment makes this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  20. Home, Hearth and Computing.

    Science.gov (United States)

    Seelig, Anita

    1982-01-01

    Advantages of having children use microcomputers at school and home include learning about sophisticated concepts early in life without a great deal of prodding, playing games that expand knowledge, and becoming literate in computer knowledge needed later in life. Includes comments from parents on their experiences with microcomputers and…

  1. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  2. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  3. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  4. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on codes' structure and use, data preparation and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part shows the most important computational techniques: finite element formulation and boundary element formulation, and presents the solutions of viscoelastic problems with Abaqus.

  5. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  6. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot...... encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  7. Cloud computing basics for librarians.

    Science.gov (United States)

    Hoy, Matthew B

    2012-01-01

    "Cloud computing" is the name for the recent trend of moving software and computing resources to an online, shared-service model. This article briefly defines cloud computing, discusses different models, explores the advantages and disadvantages, and describes some of the ways cloud computing can be used in libraries. Examples of cloud services are included at the end of the article. Copyright © Taylor & Francis Group, LLC

  8. 12 CFR 1102.27 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Computing time. 1102.27 Section 1102.27 Banks... for Proceedings § 1102.27 Computing time. (a) General rule. In computing any period of time prescribed... time begins to run is not included. The last day so computed is included, unless it is a Saturday...
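
    The counting rule quoted above (the trigger day is excluded, the last day is included unless it falls on a weekend) can be sketched as a short Python helper. This is an illustrative simplification: it rolls weekend deadlines forward but ignores the federal holidays the regulation also covers.

    ```python
    from datetime import date, timedelta

    def deadline(trigger: date, period_days: int) -> date:
        """Compute a due date per the quoted rule: the trigger day itself
        is not counted, and if the last day lands on a Saturday or Sunday
        it rolls forward (holidays are ignored in this sketch)."""
        due = trigger + timedelta(days=period_days)  # day 1 is the day after the trigger
        while due.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
            due += timedelta(days=1)
        return due

    # A 10-day period triggered on Thursday 2009-12-31: day 10 falls on
    # Sunday 2010-01-10, so the deadline rolls to Monday 2010-01-11.
    print(deadline(date(2009, 12, 31), 10))  # → 2010-01-11
    ```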

  9. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  10. 12 CFR 622.21 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Computing time. 622.21 Section 622.21 Banks and... Formal Hearings § 622.21 Computing time. (a) General rule. In computing any period of time prescribed or... run is not to be included. The last day so computed shall be included, unless it is a Saturday, Sunday...

  11. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  12. Computational biology for ageing

    Science.gov (United States)

    Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M.

    2011-01-01

    High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions. PMID:21115530

  13. Digital computers in action

    CERN Document Server

    Booth, A D

    1965-01-01

    Digital Computers in Action is an introduction to the basics of digital computers as well as their programming and various applications in fields such as mathematics, science, engineering, economics, medicine, and law. Other topics include engineering automation, process control, special purpose games-playing devices, machine translation and mechanized linguistics, and information retrieval. This book consists of 14 chapters and begins by discussing the history of computers, from the idea of performing complex arithmetical calculations to the emergence of a modern view of the structure of a ge

  14. CONFERENCE: Computers and accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1984-01-15

    In September of last year a Conference on 'Computers in Accelerator Design and Operation' was held in West Berlin attracting some 160 specialists including many from outside Europe. It was a Europhysics Conference, organized by the Hahn-Meitner Institute with Roman Zelazny as Conference Chairman, postponed from an earlier intended venue in Warsaw. The aim was to bring together specialists in the fields of accelerator design, computer control and accelerator operation.

  15. Collectively loading an application in a parallel computer

    Science.gov (United States)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Miller, Samuel J.; Mundy, Michael B.

    2016-01-05

    Collectively loading an application in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: identifying, by a parallel computer control system, a subset of compute nodes in the parallel computer to execute a job; selecting, by the parallel computer control system, one of the subset of compute nodes in the parallel computer as a job leader compute node; retrieving, by the job leader compute node from computer memory, an application for executing the job; and broadcasting, by the job leader to the subset of compute nodes in the parallel computer, the application for executing the job.
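
    The abstract above describes a leader-based broadcast pattern: only one node touches storage, then fans the application out to its peers. A toy Python simulation of that flow is sketched below; all names (`Node`, `collective_load`, the job key) are invented for illustration and are not from the patent.

    ```python
    # Toy simulation of the collective-load pattern: a control system picks a
    # subset of nodes and a leader; only the leader reads the application from
    # storage, then broadcasts it to the rest of the subset.

    class Node:
        def __init__(self, node_id):
            self.node_id = node_id
            self.application = None

        def load_from_storage(self, storage, job):
            self.application = storage[job]      # only the leader does this

        def receive(self, application):
            self.application = application       # peers get it via broadcast


    def collective_load(nodes, job, storage, subset_size):
        subset = nodes[:subset_size]             # 1. identify subset for the job
        leader = subset[0]                       # 2. select a job leader
        leader.load_from_storage(storage, job)   # 3. leader retrieves the app
        for peer in subset:                      # 4. leader broadcasts to subset
            if peer is not leader:
                peer.receive(leader.application)
        return subset


    storage = {"job-42": "application-binary"}
    nodes = [Node(i) for i in range(8)]
    loaded = collective_load(nodes, "job-42", storage, subset_size=4)
    print([n.application for n in loaded])  # all four hold "application-binary"
    ```

    The point of the pattern is that storage is read once per job rather than once per node, which matters when thousands of compute nodes would otherwise hit the filesystem simultaneously.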

  16. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword; Preface; Computing Paradigms: Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading; Cloud Computing Fundamentals: Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact

  17. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  18. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers (1) an intermediary step between any theoretical construct and its targeted empirical space and (2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  19. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  20. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an .... Scientific and engineering applications (e.g., TeraGrid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  1. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations of, and developments in, transmission and emission computer tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of computer emission tomography (ECT), unknown in Poland, is described. An evaluation of two methods of ECT, namely positron and single-photon emission tomography, is made. (author)

  2. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    This book gives an overview of state-of-the-art research in Computational Sustainability, as well as case studies of different application scenarios. These cover topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, and industrial production and quality. The book describes computational methods and possible application scenarios.

  3. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  4. Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1984-12-01

    The principal objective of the computational physics group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. A summary of the group's activities is presented, including computational studies in MHD equilibria and stability, plasma transport, Fokker-Planck, and efficient numerical and programming algorithms. References are included

  5. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance, Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination, agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  6. Proposal for grid computing for nuclear applications

    International Nuclear Information System (INIS)

    Faridah Mohamad Idris; Wan Ahmad Tajuddin Wan Abdullah; Zainol Abidin Ibrahim; Zukhaimira Zolkapli

    2013-01-01

    Full-text: The use of computer clusters for computational sciences, including computational physics, is vital as it provides the computing power to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has now become a necessity. In this paper, we describe how clusters running a specific application can use resources within the grid to speed up the computing process. (author)
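
    As a concrete illustration of the compute-intensive, embarrassingly parallel workload mentioned above (not the authors' actual setup), a Monte Carlo estimate of π can be split across workers that each count hits independently, with the partial counts merged at the end. Worker and function names here are invented for the sketch.

    ```python
    import math
    import random
    from concurrent.futures import ThreadPoolExecutor

    def count_hits(seed, samples):
        """One 'node' of the toy grid: count random points falling inside the
        unit quarter-circle. Independent seeds keep workers uncorrelated."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(samples):
            x, y = rng.random(), rng.random()
            if x * x + y * y <= 1.0:
                hits += 1
        return hits

    def estimate_pi(workers=4, samples_per_worker=50_000):
        # Fan the work out, then reduce: pi/4 is the fraction of hits.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            partials = pool.map(count_hits, range(workers),
                                [samples_per_worker] * workers)
        return 4.0 * sum(partials) / (workers * samples_per_worker)

    print(estimate_pi())  # close to math.pi
    ```

    On a real grid the same fan-out/reduce shape applies; only the transport (job scheduler, message passing) changes, since each worker needs nothing but its seed and sample count.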

  7. Computational colour science using MATLAB

    CERN Document Server

    Westland, Stephen; Cheung, Vien

    2012-01-01

    Computational Colour Science Using MATLAB 2nd Edition offers a practical, problem-based approach to colour physics. The book focuses on the key issues encountered in modern colour engineering, including efficient representation of colour information, Fourier analysis of reflectance spectra and advanced colorimetric computation. Emphasis is placed on the practical applications rather than the techniques themselves, with material structured around key topics. These topics include colour calibration of visual displays, computer recipe prediction and models for colour-appearance prediction. Each t

  8. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  9. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  10. An Integrated Biochemistry Laboratory, Including Molecular Modeling

    Science.gov (United States)

    Wolfson, Adele J.; Hall, Mona L.; Branham, Thomas R.

    1996-11-01

    ) experience with methods of protein purification; (iii) incorporation of appropriate controls into experiments; (iv) use of basic statistics in data analysis; (v) writing papers and grant proposals in accepted scientific style; (vi) peer review; (vii) oral presentation of results and proposals; and (viii) introduction to molecular modeling. Figure 1 illustrates the modular nature of the lab curriculum. Elements from each of the exercises can be separated and treated as stand-alone exercises, or combined into short or long projects. We have been able to offer the opportunity to use sophisticated molecular modeling in the final module through funding from an NSF-ILI grant. However, many of the benefits of the research proposal can be achieved with other computer programs, or even by literature survey alone. Figure 1. Design of project-based biochemistry laboratory. Modules (projects, or portions of projects) are indicated as boxes. Each of these can be treated independently, or used as part of a larger project. Solid lines indicate some suggested paths from one module to the next. The skills and knowledge required for protein purification and design are developed in three units: (i) an introduction to critical assays needed to monitor degree of purification, including an evaluation of assay parameters; (ii) partial purification by ion-exchange techniques; and (iii) preparation of a grant proposal on protein design by mutagenesis. Brief descriptions of each of these units follow, with experimental details of each project at the end of this paper. Assays for Lysozyme Activity and Protein Concentration (4 weeks): The assays mastered during the first unit are a necessary tool for determining the purity of the enzyme during the second unit on purification by ion exchange. These assays allow an introduction to the concept of specific activity (units of enzyme activity per milligram of total protein) as a measure of purity. In this first sequence, students learn a turbidimetric assay
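
    The specific-activity bookkeeping described above is simple enough to capture in a few lines. The numbers below are invented for illustration, not taken from the lab exercise.

    ```python
    def specific_activity(units_activity, mg_total_protein):
        """Specific activity = units of enzyme activity per mg total protein."""
        return units_activity / mg_total_protein

    def fold_purification(sa_step, sa_crude):
        """Purity gain of a purification step relative to the crude extract."""
        return sa_step / sa_crude

    # Hypothetical lysozyme purification: crude extract vs. ion-exchange pool.
    sa_crude = specific_activity(1200.0, 300.0)  # 4 U/mg
    sa_pool = specific_activity(800.0, 40.0)     # 20 U/mg
    print(fold_purification(sa_pool, sa_crude))  # → 5.0
    ```

    Note that total activity drops across the step (1200 U to 800 U) while specific activity rises fivefold, which is exactly why specific activity, not raw activity, is the measure of purity.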

  11. Accreditation of academic programmes in computing South Africa

    CSIR Research Space (South Africa)

    Gerber, A

    2012-05-01

    Full Text Available Over the past two decades, strong technical convergence has been observed between computing and engineering. Computing in this context includes Computer Engineering, Computer Science, Information Systems, Information Technology and Software...

  12. CHEP95: Computing in high energy physics. Abstracts

    International Nuclear Information System (INIS)

    1995-01-01

    These proceedings cover the technical papers on computation in High Energy Physics, including computer codes, computer devices, control systems, simulations, data acquisition systems. New approaches on computer architectures are also discussed

  13. Computer control applied to accelerators

    CERN Document Server

    Crowley-Milling, Michael C

    1974-01-01

    The differences that exist between control systems for accelerators and other types of control systems are outlined. It is further indicated that earlier accelerators had manual control systems to which computers were added, but that it is essential for the new, large accelerators to include computers in the control systems right from the beginning. Details of the computer control designed for the Super Proton Synchrotron are presented. The method of choosing the computers is described, as well as the reasons for CERN having to design the message transfer system. The items discussed include: CAMAC interface systems, a new multiplex system, operator-to-computer interaction (such as touch screen, computer-controlled knob, and non- linear track-ball), and high-level control languages. Brief mention is made of the contributions of other high-energy research laboratories as well as of some other computer control applications at CERN. (0 refs).

  14. Recent trends in grid computing

    International Nuclear Information System (INIS)

    Miura, Kenichi

    2004-01-01

    Grid computing is a technology which allows uniform and transparent access to geographically dispersed computational resources, such as computers, databases, and experimental and observational equipment, via high-speed, high-bandwidth networking. The commonly used analogy is that of the electrical power grid, whereby household electricity is made available from outlets on the wall, and little thought needs to be given to where the electricity is generated and how it is transmitted. The usage of grid also includes distributed parallel computing, high-throughput computing, data-intensive computing (data grid) and collaborative computing. This paper reviews the historical background, software structure, current status and on-going grid projects, including applications of grid technology to nuclear fusion research. (author)

  15. 42 CFR 410.100 - Included services.

    Science.gov (United States)

    2010-10-01

    ... service; however, maintenance therapy itself is not covered as part of these services. (c) Occupational... increase respiratory function, such as graded activity services; these services include physiologic... rehabilitation plan of treatment, including physical therapy services, occupational therapy services, speech...

  16. Community Cloud Computing

    Science.gov (United States)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  17. Computer science II essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science II includes organization of a computer, memory and input/output, coding, data structures, and program development. Also included is an overview of the most commonly

  18. Applications of symbolic algebraic computation

    International Nuclear Information System (INIS)

    Brown, W.S.; Hearn, A.C.

    1979-01-01

    This paper is a survey of applications of systems for symbolic algebraic computation. In most successful applications, calculations that can be taken to a given order by hand are then extended one or two more orders by computer. Furthermore, with a few notable exceptions, these applications also involve numerical computation in some way. Therefore the authors emphasize the interface between symbolic and numerical computation, including: 1. Computations with both symbolic and numerical phases. 2. Data involving both the unpredictable size and shape that typify symbolic computation and the (usually inexact) numerical values that characterize numerical computation. 3. Applications of one field to the other. It is concluded that the fields of symbolic and numerical computation can advance most fruitfully in harmony rather than in competition. (Auth.)
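
    A minimal illustration of the symbolic/numerical interface the survey emphasizes: expressions built as nested tuples are differentiated symbolically, and the resulting expression is then evaluated numerically. This is a toy sketch of the two-phase idea, not the design of any particular algebra system.

    ```python
    # Expressions are numbers, the variable 'x', or tuples ('+'|'*', left, right).

    def diff(e):
        """Symbolic phase: derivative d/dx of a toy expression tree."""
        if e == 'x':
            return 1
        if isinstance(e, (int, float)):
            return 0
        op, a, b = e
        if op == '+':
            return ('+', diff(a), diff(b))
        if op == '*':  # product rule
            return ('+', ('*', diff(a), b), ('*', a, diff(b)))
        raise ValueError(op)

    def evaluate(e, x):
        """Numerical phase: evaluate an expression tree at a given x."""
        if e == 'x':
            return x
        if isinstance(e, (int, float)):
            return e
        op, a, b = e
        va, vb = evaluate(a, x), evaluate(b, x)
        return va + vb if op == '+' else va * vb

    # d/dx (x*x + 3*x) = 2x + 3; at x = 2 the value is 7.
    expr = ('+', ('*', 'x', 'x'), ('*', 3, 'x'))
    print(evaluate(diff(expr), 2.0))  # → 7.0
    ```

    The unsimplified derivative tree also shows the "unpredictable size and shape" point: symbolic results grow as they are manipulated, while the numerical phase collapses them to a single (inexact) value.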

  19. Static, Lightweight Includes Resolution for PHP

    NARCIS (Netherlands)

    M.A. Hills (Mark); P. Klint (Paul); J.J. Vinju (Jurgen)

    2014-01-01

    Dynamic languages include a number of features that are challenging to model properly in static analysis tools. In PHP, one of these features is the include expression, where an arbitrary expression provides the path of the file to include at runtime. In this paper we present two

  20. Article Including Environmental Barrier Coating System

    Science.gov (United States)

    Lee, Kang N. (Inventor)

    2015-01-01

    An enhanced environmental barrier coating for a silicon containing substrate. The enhanced barrier coating may include a bond coat doped with at least one of an alkali metal oxide and an alkali earth metal oxide. The enhanced barrier coating may include a composite mullite bond coat including BSAS and another distinct second phase oxide applied over said surface.

  1. Rare thoracic cancers, including peritoneum mesothelioma

    NARCIS (Netherlands)

    Siesling, Sabine; van der Zwan, Jan Maarten; Izarzugaza, Isabel; Jaal, Jana; Treasure, Tom; Foschi, Roberto; Ricardi, Umberto; Groen, Harry; Tavilla, Andrea; Ardanaz, Eva

    Rare thoracic cancers include those of the trachea, thymus and mesothelioma (including peritoneum mesothelioma). The aim of this study was to describe the incidence, prevalence and survival of rare thoracic tumours using a large database, which includes cancer patients diagnosed from 1978 to 2002,

  2. Rare thoracic cancers, including peritoneum mesothelioma

    NARCIS (Netherlands)

    Siesling, Sabine; Zwan, J.M.V.D.; Izarzugaza, I.; Jaal, J.; Treasure, T.; Foschi, R.; Ricardi, U.; Groen, H.; Tavilla, A.; Ardanaz, E.

    2012-01-01

    Rare thoracic cancers include those of the trachea, thymus and mesothelioma (including peritoneum mesothelioma). The aim of this study was to describe the incidence, prevalence and survival of rare thoracic tumours using a large database, which includes cancer patients diagnosed from 1978 to 2002,

  3. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study a problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two ... up to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show ... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority
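
    To give a flavor of the ordinary secret sharing (SS) the abstract compares against, here is a sketch of simple additive secret sharing over a prime field. This is the textbook scheme, not any of the VSS protocols from the thesis itself.

    ```python
    import random

    P = 2_147_483_647  # a prime modulus (2**31 - 1); shares live in the field mod P

    def share(secret, n):
        """Split `secret` into n additive shares mod P. Any n-1 shares are
        uniformly random and reveal nothing; all n shares sum to the secret."""
        shares = [random.randrange(P) for _ in range(n - 1)]
        shares.append((secret - sum(shares)) % P)
        return shares

    def reconstruct(shares):
        return sum(shares) % P

    secret = 123456789
    shares = share(secret, 5)
    print(reconstruct(shares))  # → 123456789
    ```

    Verifiable secret sharing adds exactly what this sketch lacks: a way for players to check that a (possibly cheating) dealer handed out consistent shares.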

  4. Data structures, computer graphics, and pattern recognition

    CERN Document Server

    Klinger, A; Kunii, T L

    1977-01-01

    Data Structures, Computer Graphics, and Pattern Recognition focuses on the computer graphics and pattern recognition applications of data structures methodology.This book presents design related principles and research aspects of the computer graphics, system design, data management, and pattern recognition tasks. The topics include the data structure design, concise structuring of geometric data for computer aided design, and data structures for pattern recognition algorithms. The survey of data structures for computer graphics systems, application of relational data structures in computer gr

  5. Workshop on Computational Optimization

    CERN Document Server

    2016-01-01

    This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2014, held at Warsaw, Poland, September 7-10, 2014. The book presents recent advances in computational optimization. The volume includes important real problems like parameter settings for controlling processes in bioreactor and other processes, resource constrained project scheduling, infection distribution, molecule distance geometry, quantum computing, real-time management and optimal control, bin packing, medical image processing, localization the abrupt atmospheric contamination source and so on. It shows how to develop algorithms for them based on new metaheuristic methods like evolutionary computation, ant colony optimization, constrain programming and others. This research demonstrates how some real-world problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks.

  6. Discrete and computational geometry

    CERN Document Server

    Devadoss, Satyan L

    2011-01-01

    Discrete geometry is a relatively new development in pure mathematics, while computational geometry is an emerging area in applications-driven computer science. Their intermingling has yielded exciting advances in recent years, yet what has been lacking until now is an undergraduate textbook that bridges the gap between the two. Discrete and Computational Geometry offers a comprehensive yet accessible introduction to this cutting-edge frontier of mathematics and computer science. This book covers traditional topics such as convex hulls, triangulations, and Voronoi diagrams, as well as more recent subjects like pseudotriangulations, curve reconstruction, and locked chains. It also touches on more advanced material, including Dehn invariants, associahedra, quasigeodesics, Morse theory, and the recent resolution of the Poincaré conjecture. Connections to real-world applications are made throughout, and algorithms are presented independently of any programming language. This richly illustrated textbook also fe...

  7. Resilient computer system design

    CERN Document Server

    Castano, Victor

    2015-01-01

    This book presents a paradigm for designing new generation resilient and evolving computer systems, including their key concepts, elements of supportive theory, methods of analysis and synthesis of ICT with new properties of evolving functioning, as well as implementation schemes and their prototyping. The book explains why new ICT applications require a complete redesign of computer systems to address challenges of extreme reliability, high performance, and power efficiency. The authors present a comprehensive treatment for designing the next generation of computers, especially addressing safety-critical, autonomous, real time, military, banking, and wearable health care systems.   §  Describes design solutions for new computer system - evolving reconfigurable architecture (ERA) that is free from drawbacks inherent in current ICT and related engineering models §  Pursues simplicity, reliability, scalability principles of design implemented through redundancy and re-configurability; targeted for energy-,...

  8. COMPUTING: International symposium

    International Nuclear Information System (INIS)

    Anon.

    1984-01-01

    Recent Developments in Computing, Processor, and Software Research for High Energy Physics, a four-day international symposium, was held in Guanajuato, Mexico, from 8-11 May, with 112 attendees from nine countries. The symposium was the third in a series of meetings exploring activities in leading-edge computing technology in both processor and software research and their effects on high energy physics. Topics covered included fixed-target on- and off-line reconstruction processors; lattice gauge and general theoretical processors and computing; multiprocessor projects; electron-positron collider on- and off-line reconstruction processors; state-of-the-art in university computer science and industry; software research; accelerator processors; and proton-antiproton collider on- and off-line reconstruction processors

  9. Computational neurology and psychiatry

    CERN Document Server

    Bhattacharya, Basabdatta; Cochran, Amy

    2017-01-01

    This book presents the latest research in computational methods for modeling and simulating brain disorders. In particular, it shows how mathematical models can be used to study the relationship between a given disorder and the specific brain structure associated with that disorder. It also describes the emerging field of computational psychiatry, including the study of pathological behavior due to impaired functional connectivity, pathophysiological activity, and/or aberrant decision-making. Further, it discusses the data analysis techniques that will be required to analyze the increasing amount of data being generated about the brain. Lastly, the book offers some tips on the application of computational models in the field of quantitative systems pharmacology. Mainly written for computational scientists eager to discover new application fields for their model, this book also benefits neurologists and psychiatrists wanting to learn about new methods.

  10. GPU computing and applications

    CERN Document Server

    See, Simon

    2015-01-01

    This book presents a collection of state of the art research on GPU Computing and Application. The major part of this book is selected from the work presented at the 2013 Symposium on GPU Computing and Applications held in Nanyang Technological University, Singapore (Oct 9, 2013). Three major domains of GPU application are covered in the book including (1) Engineering design and simulation; (2) Biomedical Sciences; and (3) Interactive & Digital Media. The book also addresses the fundamental issues in GPU computing with a focus on big data processing. Researchers and developers in GPU Computing and Applications will benefit from this book. Training professionals and educators can also benefit from this book to learn the possible application of GPU technology in various areas.

  11. Encyclopedia of cloud computing

    CERN Document Server

    Bojanova, Irena

    2016-01-01

    The Encyclopedia of Cloud Computing provides IT professionals, educators, researchers and students with a compendium of cloud computing knowledge. Authored by a spectrum of subject matter experts in industry and academia, this unique publication, in a single volume, covers a wide range of cloud computing topics, including technological trends and developments, research opportunities, best practices, standards, and cloud adoption. Providing multiple perspectives, it also addresses questions that stakeholders might have in the context of development, operation, management, and use of clouds. Furthermore, it examines cloud computing's impact now and in the future. The encyclopedia presents 56 chapters logically organized into 10 sections. Each chapter covers a major topic/area with cross-references to other chapters and contains tables, illustrations, side-bars as appropriate. Furthermore, each chapter presents its summary at the beginning and backend material, references and additional resources for further i...

  12. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature...... of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’....

  13. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  14. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  15. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  16. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important, both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helping...

  17. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  18. Calculation of Permeability inside the Basket including one Fuel Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Seung Hwan; Bang, Kyung Sik; Lee, Ju an; Choi, Woo Seok [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In general, the porous media model and the effective thermal conductivity were used to simplify the fuel assembly. The methods of calculating permeability were compared considering the flow inside a basket which includes a nuclear fuel assembly. The detailed fuel assembly was modeled computationally and its flow characteristics were investigated. The flow inside the basket which included a fuel assembly was analyzed by CFD. As the height of the fuel assembly increased, the pressure drop increased linearly, so the inertial resistance could be neglected. Three methods of calculating the permeability were compared. The permeability obtained from the friction factor is 50% less than the permeability obtained from the wall shear stress and from the pressure drop.
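    When inertial (Forchheimer) resistance is negligible, as the abstract reports, the pressure-drop method reduces to Darcy's law, dp/L = (mu/K)·u, which can be inverted for K. The sketch below uses illustrative values (not figures from the paper):

```python
# Hedged sketch: backing out a Darcy permeability from a CFD pressure drop.
# All numerical values below are illustrative assumptions, not the paper's.

def darcy_permeability(mu, velocity, length, dp):
    """Permeability K from Darcy's law dp/L = (mu/K) * u,
    valid when inertial (Forchheimer) resistance is negligible."""
    return mu * velocity * length / dp

mu = 1.0e-3   # coolant dynamic viscosity, Pa*s (water-like, assumed)
u = 0.5       # superficial velocity, m/s (assumed)
L = 1.0       # flow length along the assembly, m (assumed)
dp = 5.0e3    # CFD-computed pressure drop, Pa (assumed)

K = darcy_permeability(mu, u, L, dp)
print(f"K = {K:.3e} m^2")
```

    Because the reported pressure drop grows linearly with assembly height, K from this formula is height-independent, which is what makes the porous-media simplification usable.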

  19. HETC-3STEP included fragmentation process

    Energy Technology Data Exchange (ETDEWEB)

    Shigyo, Nobuhiro; Iga, Kiminori; Ishibashi, Kenji [Kyushu Univ., Fukuoka (Japan). Faculty of Engineering

    1997-03-01

    High Energy Transport Code (HETC) based on the cascade-evaporation model is modified to calculate the fragmentation cross section. For the cascade process, nucleon-nucleon cross sections are used for collision computation; effective in-medium-corrected cross sections are adopted instead of the original free-nucleon collision. The exciton model is adopted for improvement of backward nucleon-emission cross section for low-energy nucleon-incident events. The fragmentation reaction is incorporated into the original HETC as a subroutine set by the use of the systematics of the reaction. The modified HETC (HETC-3STEP/FRG) reproduces experimental fragment yields to a reasonable degree. (author)

  20. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2011-01-01

    The computing world today is in the middle of a revolution: mobile clients and cloud computing have emerged as the dominant paradigms driving programming and hardware innovation. The Fifth Edition of Computer Architecture focuses on this dramatic shift, exploring the ways in which software and technology in the cloud are accessed by cell phones, tablets, laptops, and other mobile computing devices. Each chapter includes two real-world examples, one mobile and one datacenter, to illustrate this revolutionary change. Updated to cover the mobile computing revolution. Emphasizes the two most im...

  1. Computational aspects of algebraic curves

    CERN Document Server

    Shaska, Tanush

    2005-01-01

    The development of new computational techniques and better computing power has made it possible to attack some classical problems of algebraic geometry. The main goal of this book is to highlight such computational techniques related to algebraic curves. The area of research in algebraic curves is receiving more interest not only from the mathematics community, but also from engineers and computer scientists, because of the importance of algebraic curves in applications including cryptography, coding theory, error-correcting codes, digital imaging, computer vision, and many more.This book cove

  2. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  3. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  4. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  5. Quantum Computation

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 16; Issue 9. Quantum Computation - Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article Volume 16 Issue 9 September 2011 pp 821-835. Fulltext. Click here to view fulltext PDF. Permanent link:

  6. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  7. Computer Recreations.

    Science.gov (United States)

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)

  8. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  9. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  10. Optical Computing

    Indian Academy of Sciences (India)

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  11. 78 FR 16865 - Certain Electronic Devices, Including Wireless Communication Devices, Portable Music and Data...

    Science.gov (United States)

    2013-03-19

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-794] Certain Electronic Devices, Including Wireless Communication Devices, Portable Music and Data Processing Devices, and Tablet Computers... certain electronic devices, including wireless communication devices, portable music and data processing...

  12. 76 FR 24051 - In the Matter of Certain Electronic Devices, Including Mobile Phones, Mobile Tablets, Portable...

    Science.gov (United States)

    2011-04-29

    ..., Including Mobile Phones, Mobile Tablets, Portable Music Players, and Computers, and Components Thereof... certain electronic devices, including mobile phones, mobile tablets, portable music players, and computers...''). The complaint further alleges that an industry in the United States exists or is in the process of...

  13. Internode data communications in a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Miller, Douglas R.; Parker, Jeffrey J.; Ratterman, Joseph D.; Smith, Brian E.

    2013-09-03

    Internode data communications in a parallel computer that includes compute nodes that each include main memory and a messaging unit, the messaging unit including computer memory and coupling compute nodes for data communications, in which, for each compute node at compute node boot time: a messaging unit allocates, in the messaging unit's computer memory, a predefined number of message buffers, each message buffer associated with a process to be initialized on the compute node; receives, prior to initialization of a particular process on the compute node, a data communications message intended for the particular process; and stores the data communications message in the message buffer associated with the particular process. Upon initialization of the particular process, the process establishes a messaging buffer in main memory of the compute node and copies the data communications message from the message buffer of the messaging unit into the message buffer of main memory.
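    The staging scheme described above can be sketched as a toy model; this is an illustrative simplification, not the patented implementation, and all class and method names are invented for the example:

```python
# Hedged sketch of boot-time message staging: messages that arrive before
# the target process initializes are held in a per-process buffer inside
# the messaging unit, then copied into main memory on initialization.

class MessagingUnit:
    def __init__(self, process_ids):
        # At compute-node boot: one message buffer per expected process.
        self.buffers = {pid: [] for pid in process_ids}

    def receive(self, pid, message):
        # May be called before the target process has initialized.
        self.buffers[pid].append(message)

    def drain(self, pid):
        # On process initialization: hand over the staged messages
        # (modeling the copy into the process's main-memory buffer).
        staged, self.buffers[pid] = self.buffers[pid], []
        return staged

unit = MessagingUnit(process_ids=[0, 1])
unit.receive(0, "early-message")          # arrives before process 0 starts
main_memory_buffer = unit.drain(0)        # process 0 initializes and copies
print(main_memory_buffer)
```

    The point of the design is that early senders never block and never lose data: the messaging unit absorbs traffic for processes that do not yet exist.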

  14. 8th International Workshop on Natural Computing

    CERN Document Server

    Hagiya, Masami

    2016-01-01

    This book highlights recent advances in natural computing, including biology and its theory, bio-inspired computing, computational aesthetics, computational models and theories, computing with natural media, philosophy of natural computing, and educational technology. It presents extended versions of the best papers selected from the “8th International Workshop on Natural Computing” (IWNC8), a symposium held in Hiroshima, Japan, in 2014. The target audience is not limited to researchers working in natural computing but also includes those active in biological engineering, fine/media art design, aesthetics, and philosophy.

  15. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.
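    For background, the Banach-space notions above generalize the classical frame condition in a Hilbert space H: a sequence (f_k) is a frame when there exist constants 0 < A ≤ B such that

```latex
A\,\|x\|^{2} \;\le\; \sum_{k} \bigl|\langle x, f_k \rangle\bigr|^{2} \;\le\; B\,\|x\|^{2}
\qquad \text{for all } x \in H .
```

    Roughly speaking, an Xd-frame replaces the ℓ² sum by the norm of a suitable sequence space Xd and the inner products by functionals from the dual space; this is background context, not notation taken from the article itself.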

  16. Computer Crafts for Kids.

    Science.gov (United States)

    Kuntz, Margy; Kuntz, Ann

    This work presents a collection of craft projects that can be created by children. Necessary for completion of each craft is an IBM-compatible computer running Windows, "Word for Windows 6.0," with a mouse, a printer, and plain white printer paper. Simple craft supplies, including glue and scissors, also are needed. In each activity the child is…

  17. Computer/Information Science

    Science.gov (United States)

    Birman, Ken; Roughgarden, Tim; Seltzer, Margo; Spohrer, Jim; Stolterman, Erik; Kearsley, Greg; Koszalka, Tiffany; de Jong, Ton

    2013-01-01

    Scholars representing the field of computer/information science were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Ken Birman, Jennifer Rexford, Tim Roughgarden, Margo Seltzer, Jim Spohrer, and…

  18. Computed tomography in traumatology

    International Nuclear Information System (INIS)

    Heller, M.; Jend, H.H.

    1986-01-01

    This volume offers a critical review and assessment of the new avenues opened up by computed tomography in traumatology. Over 200 illustrations, including numerous CT scans, aid the physician engaged in the emergency care and postoperative treatment of accident victims. Technical prerequisites, special techniques of investigation, the pathomorphology of organ changes caused by trauma, diagnostic leading symptoms and signs, and the diagnosis of iatrogenic injuries and lesions are presented.

  19. Computed tomography for radiographers

    International Nuclear Information System (INIS)

    Brooker, M.J.

    1986-01-01

    This book is directed towards giving radiographers an introduction to and a basic knowledge of computerized tomography. The technical section discusses gantries and x-ray production; computers, disc drives, image display and storage; artefacts; quality assurance; and the design of departments. The clinical section includes patient preparation, radiotherapy planning, and the interpretation of images from various areas of the anatomy. (U.K.)

  20. Computer Series, 115.

    Science.gov (United States)

    Birk, James P., Ed.

    1990-01-01

    Reviewed are six computer programs which may be useful in teaching college level chemistry. Topics include dynamic data storage in FORTRAN, "KC?DISCOVERER," pH of acids and bases, calculating percent boundary surfaces for orbitals, and laboratory interfacing with PT Nomograph for the Macintosh. (CW)

  1. Control by personal computer and Interface 1

    International Nuclear Information System (INIS)

    Kim, Eung Mug; Park, Sun Ho

    1989-03-01

    This book consists of three chapters. The first chapter deals with the basics of microcomputer control: the computer system, the microcomputer system, control by the microcomputer, and control systems for calculators. The second chapter describes interfacing, covering the 8255 parallel interface, the 6821 parallel interface, the parallel interface of a personal computer, reading BCD code through a parallel interface, the IEEE-488 interface, the RS-232C interface, and transmitting data between a personal computer and a measuring instrument. The third chapter covers control experiments by microcomputer, experiments with an eight-bit computer, and control experiments in machine code and BASIC.

  2. BCS Glossary of Computing and ICT

    CERN Document Server

    Panel, BCS Education and Training Expert; Burkhardt, Diana; Cumming, Aline; Hunter, Alan; Hurvid, Frank; Jaworski, John; Ng, Thomas; Scheer, Marianne; Southall, John; Vella, Alfred

    2008-01-01

    A glossary of computing designed to support those taking computer courses or courses where computers are used, including GCSE, A-Level, ECDL and 14-19 Diplomas in Functional Skills, in schools and Further Education colleges. It helps the reader build up knowledge and understanding of computing.

  3. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general-purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next, despite which there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of the capabilities of the principal available systems and to highlight one or two trends. A reference to the most recent full survey of computer algebra in relativity is given, together with brief descriptions of Maple, REDUCE, SHEEP and other applications. (author)

  4. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  5. Computed tomography

    International Nuclear Information System (INIS)

    Andre, M.; Resnick, D.

    1988-01-01

    Computed tomography (CT) has matured into a reliable and prominent tool for study of the muscoloskeletal system. When it was introduced in 1973, it was unique in many ways and posed a challenge to interpretation. It is in these unique features, however, that its advantages lie in comparison with conventional techniques. These advantages will be described in a spectrum of important applications in orthopedics and rheumatology

  6. Computed radiography

    International Nuclear Information System (INIS)

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  7. Computational universes

    International Nuclear Information System (INIS)

    Svozil, Karl

    2005-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing 'in the mind' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view

  8. Visualizing a silicon quantum computer

    International Nuclear Information System (INIS)

    Sanders, Barry C; Hollenberg, Lloyd C L; Edmundson, Darran; Edmundson, Andrew

    2008-01-01

    Quantum computation is a fast-growing, multi-disciplinary research field. The purpose of a quantum computer is to execute quantum algorithms that efficiently solve computational problems intractable within the existing paradigm of 'classical' computing built on bits and Boolean gates. While collaboration between computer scientists, physicists, chemists, engineers, mathematicians and others is essential to the project's success, traditional disciplinary boundaries can hinder progress and make communicating the aims of quantum computing and future technologies difficult. We have developed a four minute animation as a tool for representing, understanding and communicating a silicon-based solid-state quantum computer to a variety of audiences, either as a stand-alone animation to be used by expert presenters or embedded into a longer movie as short animated sequences. The paper includes a generally applicable recipe for successful scientific animation production.

  9. Visualizing a silicon quantum computer

    Science.gov (United States)

    Sanders, Barry C.; Hollenberg, Lloyd C. L.; Edmundson, Darran; Edmundson, Andrew

    2008-12-01

    Quantum computation is a fast-growing, multi-disciplinary research field. The purpose of a quantum computer is to execute quantum algorithms that efficiently solve computational problems intractable within the existing paradigm of 'classical' computing built on bits and Boolean gates. While collaboration between computer scientists, physicists, chemists, engineers, mathematicians and others is essential to the project's success, traditional disciplinary boundaries can hinder progress and make communicating the aims of quantum computing and future technologies difficult. We have developed a four minute animation as a tool for representing, understanding and communicating a silicon-based solid-state quantum computer to a variety of audiences, either as a stand-alone animation to be used by expert presenters or embedded into a longer movie as short animated sequences. The paper includes a generally applicable recipe for successful scientific animation production.

  10. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  11. Visualizing a silicon quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Sanders, Barry C [Institute for Quantum Information Science, University of Calgary, Calgary, Alberta T2N 1N4 (Canada); Hollenberg, Lloyd C L [ARC Centre of Excellence for Quantum Computer Technology, School of Physics, University of Melbourne, Victoria 3010 (Australia); Edmundson, Darran; Edmundson, Andrew [EDM Studio Inc., Level 2, 850 16 Avenue SW, Calgary, Alberta T2R 0S9 (Canada)], E-mail: bsanders@qis.ucalgary.ca, E-mail: lloydch@unimelb.edu.au, E-mail: darran@edmstudio.com

    2008-12-15

    Quantum computation is a fast-growing, multi-disciplinary research field. The purpose of a quantum computer is to execute quantum algorithms that efficiently solve computational problems intractable within the existing paradigm of 'classical' computing built on bits and Boolean gates. While collaboration between computer scientists, physicists, chemists, engineers, mathematicians and others is essential to the project's success, traditional disciplinary boundaries can hinder progress and make communicating the aims of quantum computing and future technologies difficult. We have developed a four minute animation as a tool for representing, understanding and communicating a silicon-based solid-state quantum computer to a variety of audiences, either as a stand-alone animation to be used by expert presenters or embedded into a longer movie as short animated sequences. The paper includes a generally applicable recipe for successful scientific animation production.

  12. NASA's computer science research program

    Science.gov (United States)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  13. Births and deaths including fetal deaths

    Data.gov (United States)

    U.S. Department of Health & Human Services — Access to a variety of United States birth and death files including fetal deaths: Birth Files, 1968-2009; 1995-2005; Fetal death file, 1982-2005; Mortality files,...

  14. Lumbar myelography with water-soluble contrast media including comparison with computed tomography

    International Nuclear Information System (INIS)

    Langlotz, M.

    1986-01-01

    The heart of this book lies in the bringing together of clinical data, CT, myelography (at full one to one scale), and early follow-up of individual cases. In order to achieve better visual understanding of the examples, as well as an overview of them, line drawings summarizing the most important pathologic and radiologic changes introduce each section. (orig./MG) With 373 figs

  15. Overlaid Alice: a statistical model computer code including fission and preequilibrium models

    International Nuclear Information System (INIS)

    Blann, M.

    1976-01-01

    This is the most recent edition of an evaporation code that has been frequently updated and improved since it was originally written; it replaces the previously described Alice version. A brief summary is given of the types of calculations that can be done. A listing of the code and the results of several sample calculations are presented

  16. Radiation doses to patients in computed tomography including a ready reckoner for dose estimation

    International Nuclear Information System (INIS)

    Szendroe, G.; Axelsson, B.; Leitz, W.

    1995-11-01

    The radiation burden from CT examinations is still growing in most countries and has reached a considerable part of the total from medical diagnostic x-ray procedures. Efforts to avoid excess radiation doses are therefore especially well motivated within this field. A survey of CT examination techniques practised in Sweden showed that standard settings for the exposure variables are used for the vast majority of examinations. Virtually no adjustments for differences in patient anatomy have been made; even for infants and children, on average, the same settings have been used. Adjusting the exposure variables to the individual anatomy offers a large potential for dose savings. Amongst the imaging parameters, a change of the radiation dose will primarily influence the noise. As a starting point it is assumed that, irrespective of the patient's anatomy, the same level of noise can be accepted for a given diagnostic task. To a large extent the noise level is determined by the number of photons registered in the detector. Hence, for different patient sizes and anatomies, the exposure should be adjusted so that the same transmitted photon fluence is achieved. An appendix with a ready reckoner for dose estimation for CT scanners used in Sweden is attached. 7 refs, 5 figs, 8 tabs
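    The record's core argument, keeping the transmitted photon fluence (and hence the noise) constant across patient sizes, can be made concrete with a toy attenuation model. The sketch below is illustrative only: the monoenergetic beam, the water-equivalent attenuation coefficient of 0.19 per cm, and the 32 cm reference diameter are assumptions for the example, not values from the report.

```python
import math

def mas_scaling(d_cm, d_ref_cm=32.0, mu_per_cm=0.19):
    """Relative tube current-time product (mAs) needed to keep the
    transmitted photon fluence constant for a water-equivalent patient
    diameter d_cm, relative to a reference diameter. Toy monoenergetic
    model: transmitted fluence is proportional to mAs * exp(-mu * d)."""
    return math.exp(mu_per_cm * (d_cm - d_ref_cm))

# A small (pediatric) diameter needs far less mAs than the reference adult:
for d in (16, 24, 32, 40):
    print(f"{d} cm -> relative mAs {mas_scaling(d):.2f}")
```

    Under this model the required exposure falls exponentially with patient diameter, which is exactly the kind of large dose saving for infants and children that the Swedish survey points to.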

  17. Areal rainfall estimation using moving cars – computer experiments including hydrological modeling

    OpenAIRE

    E. Rabiei; U. Haberlandt; M. Sester; D. Fitzner; M. Wallner

    2016-01-01

    The need for high temporal and spatial resolution precipitation data for hydrological analyses has been discussed in several studies. Although rain gauges provide valuable information, a very dense rain gauge network is costly. As a result, several new ideas have emerged to help estimating areal rainfall with higher temporal and spatial resolution. Rabiei et al. (2013) observed that moving cars, called RainCars (RCs), can potentially be a new source of data for measuring rai...

  18. Areal rainfall estimation using moving cars - computer experiments including hydrological modeling

    OpenAIRE

    Rabiei, Ehsan; Haberlandt, Uwe; Sester, Monika; Fitzner, Daniel; Wallner, Markus

    2016-01-01

    The need for high temporal and spatial resolution precipitation data for hydrological analyses has been discussed in several studies. Although rain gauges provide valuable information, a very dense rain gauge network is costly. As a result, several new ideas have emerged to help estimating areal rainfall with higher temporal and spatial resolution. Rabiei et al. (2013) observed that moving cars, called RainCars (RCs), can potentially be a new source of data for measuring rainfall amounts...

  19. GROGi-F. Modified version of GROGi 2 nuclear evaporation computer code including fission decay channel

    International Nuclear Information System (INIS)

    Delagrange, H.

    1977-01-01

    This report is the user manual for the GROGi-F code, a modified version of the GROGi-2 code. It calculates the cross sections for heavy-ion-induced fission. Fission probabilities are calculated via the Bohr-Wheeler formalism

  20. Synthesis of realistic driving cycles with high accuracy and computational speed, including slope information

    NARCIS (Netherlands)

    Silvas, E.; Hereijgers, K.; Peng, Huei; Hofman, T.; Steinbuch, M.

    2016-01-01

    This paper describes a new method to synthesize driving cycles, where not only the velocity is considered, yet also the road slope information of the real-world measured driving cycle. Driven by strict emission regulations and tight fuel targets, hybrid or electric vehicle manufacturers aim to

  1. Computational homogenization of sound propagation in a deformable porous material including microscopic viscous-thermal effects

    NARCIS (Netherlands)

    Gao, K.; van Dommelen, J. A. W.; Göransson, P.; Geers, M. G. D.

    2016-01-01

    Porous materials like acoustic foams can be used for acoustic shielding, which is important for high-tech systems and human comfort. In this paper, a homogenization model is proposed to investigate the relation between the microstructure and the resulting macroscopic acoustic properties. The

  2. Including Indigenous Minorities in Decision-Making

    DEFF Research Database (Denmark)

    Pristed Nielsen, Helene

    Based on theories of public sphere participation and deliberative democracy, this book presents empirical results from a study of experiences with including Aboriginal and Maori groups in political decision-making in respectively Western Australia and New Zealand.

  3. Gas storage materials, including hydrogen storage materials

    Science.gov (United States)

    Mohtadi, Rana F; Wicks, George G; Heung, Leung K; Nakamura, Kenji

    2013-02-19

    A material for the storage and release of gases comprises a plurality of hollow elements, each hollow element comprising a porous wall enclosing an interior cavity, the interior cavity including structures of a solid-state storage material. In particular examples, the storage material is a hydrogen storage material such as a solid state hydride. An improved method for forming such materials includes the solution diffusion of a storage material solution through a porous wall of a hollow element into an interior cavity.

  4. Security and policy driven computing

    CERN Document Server

    Liu, Lei

    2010-01-01

    Security and Policy Driven Computing covers recent advances in security, storage, parallelization, and computing as well as applications. The author incorporates a wealth of analysis, including studies on intrusion detection and key management, computer storage policy, and transactional management.The book first describes multiple variables and index structure derivation for high dimensional data distribution and applies numeric methods to proposed search methods. It also focuses on discovering relations, logic, and knowledge for policy management. To manage performance, the text discusses con

  5. Cloud Computing Security: A Survey

    OpenAIRE

    Khalil, Issa; Khreishah, Abdallah; Azeem, Muhammad

    2014-01-01

    Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing a...

  6. Computer Education and Computer Use by Preschool Educators

    Science.gov (United States)

    Towns, Bernadette

    2010-01-01

    Researchers have found that teachers seldom use computers in the preschool classroom. However, little research has examined why preschool teachers elect not to use computers. This case study focused on identifying whether community colleges that prepare teachers for early childhood education include in their curriculum how teachers can effectively…

  7. A Computer Security Course in the Undergraduate Computer Science Curriculum.

    Science.gov (United States)

    Spillman, Richard

    1992-01-01

    Discusses the importance of computer security and considers criminal, national security, and personal privacy threats posed by security breakdown. Several examples are given, including incidents involving computer viruses. Objectives, content, instructional strategies, resources, and a sample examination for an experimental undergraduate computer…

  8. Computers and conversation

    CERN Document Server

    Luff, Paul; Gilbert, Nigel G

    1986-01-01

    In the past few years a branch of sociology, conversation analysis, has begun to have a significant impact on the design of human-computer interaction (HCI). The investigation of human-human dialogue has emerged as a fruitful foundation for interactive system design. This book includes eleven original chapters by leading researchers who are applying conversation analysis to HCI. The fundamentals of conversation analysis are outlined, a number of systems are described, and a critical view of their value for HCI is offered. Computers and Conversation will be of interest to all concerne

  9. Computational Elastic Knots

    KAUST Repository

    Zhao, Xin

    2013-05-01

    Elastic rods have been studied intensively since the 18th century. Even now the theory of elastic rods is still developing and enjoying popularity in computer graphics and physics-based simulation. Elastic rods also draw attention from architects. Architectural structures, NODUS, were constructed from elastic rods as a new method of form-finding. We study discrete models of elastic rods and NODUS structures. We also develop computational tools to find the equilibria of elastic rods and the shape of NODUS. Applications of elastic rods in forming torus knots and closing the Bishop frame are included in this thesis.

  10. Essentials of Computational Electromagnetics

    CERN Document Server

    Sheng, Xin-Qing

    2012-01-01

    Essentials of Computational Electromagnetics provides an in-depth introduction of the three main full-wave numerical methods in computational electromagnetics (CEM); namely, the method of moments (MoM), the finite element method (FEM), and the finite-difference time-domain (FDTD) method. Numerous monographs can be found addressing one of the above three methods. However, few give a broad general overview of essentials embodied in these methods, or were published too early to include recent advances. Furthermore, many existing monographs only present the final numerical results without specifyin

  11. Simply computing for seniors

    CERN Document Server

    Clark, Linda

    2011-01-01

    Step-by-step instructions for seniors to get up and running on a home PC Answering the call for an up-to-date, straightforward computer guide targeted specifically for seniors, this helpful book includes easy-to-follow tutorials that escort you through the basics and show you how to get the most out of your PC. Boasting an elegant, full-color interior with a clean, sophisticated look and feel, the layout makes it easy for you to find the information you need quickly. Author Linda Clark has earned her highly respected reputation through years of teaching computers at both the beginnin

  12. Computational quantum chemistry website

    International Nuclear Information System (INIS)

    1997-01-01

    This report contains the contents of a web page related to research on the development of quantum chemistry methods for computational thermochemistry and the application of quantum chemistry methods to problems in material chemistry and chemical sciences. Research programs highlighted include: Gaussian-2 theory; Density functional theory; Molecular sieve materials; Diamond thin-film growth from buckyball precursors; Electronic structure calculations on lithium polymer electrolytes; Long-distance electronic coupling in donor/acceptor molecules; and Computational studies of NOx reactions in radioactive waste storage

  13. Archives and the computer

    CERN Document Server

    Cook, Michael Garnet

    1980-01-01

    Archives and the Computer deals with the use of the computer and its systems and programs in archiving data and other related materials. The book covers topics such as the scope of automated systems in archives; systems for records management, archival description, and retrieval; and machine-readable archives. The book also features examples of systems for records management from different institutions such as the Tyne and Wear Archive Department, Dyfed Record Office, and the University of Liverpool. Included in the last part are appendices. Appendix A is a directory of archival systems, Appen

  14. The computer graphics interface

    CERN Document Server

    Steinbrugge Chauveau, Karla; Niles Reed, Theodore; Shepherd, B

    2014-01-01

    The Computer Graphics Interface provides a concise discussion of computer graphics interface (CGI) standards. The title is comprised of seven chapters that cover the concepts of the CGI standard. Figures and examples are also included. The first chapter provides a general overview of CGI; this chapter covers graphics standards, functional specifications, and syntactic interfaces. Next, the book discusses the basic concepts of CGI, such as inquiry, profiles, and registration. The third chapter covers the CGI concepts and functions, while the fourth chapter deals with the concept of graphic obje

  15. Computer-controlled attenuator.

    Science.gov (United States)

    Mitov, D; Grozev, Z

    1991-01-01

    Various possibilities for applying electronic computer-controlled attenuators to the automation of physiological experiments are considered. A detailed description is given of the design of a 4-channel computer-controlled attenuator in which the output signal changes in linear steps in two of the channels and in logarithmic steps in the other two. This, together with additional programmable timers, allows the automation of a wide range of studies in different areas of physiology and psychophysics, including vision and hearing.
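    The record contrasts linear-step and logarithmic-step channels. The paper's actual hardware parameters are not given here, so the sketch below simply shows what the two stepping laws would produce, with the full-scale value and decibels-per-step chosen purely for illustration.

```python
def linear_levels(full_scale, n_steps):
    """Output levels of a linearly stepped channel: n_steps equal
    increments from 0 up to full_scale (units are whatever the
    attenuator drives, e.g. volts)."""
    return [full_scale * i / n_steps for i in range(n_steps + 1)]

def log_levels(db_per_step, n_steps):
    """Output levels of a logarithmically stepped channel: each step
    lowers the amplitude by a fixed number of decibels, i.e. multiplies
    it by a constant ratio. This is the natural choice for hearing and
    vision studies, where perceived intensity grows roughly
    logarithmically with the stimulus."""
    return [10 ** (-db_per_step * i / 20.0) for i in range(n_steps + 1)]

print(linear_levels(5.0, 5))   # [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
print(log_levels(6.0, 4))      # starts at 1.0, roughly halved each step
```

    The constant-ratio property of the logarithmic channel is what makes equal steps sound (or look) equal to a subject, whereas the linear channel gives equal physical increments.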

  16. Electric Power Monthly, August 1990. [Glossary included

    Energy Technology Data Exchange (ETDEWEB)

    1990-11-29

    The Electric Power Monthly (EPM) presents monthly summaries of electric utility statistics at the national, Census division, and State level. The purpose of this publication is to provide energy decisionmakers with accurate and timely information that may be used in forming various perspectives on electric issues that lie ahead. Data includes generation by energy source (coal, oil, gas, hydroelectric, and nuclear); generation by region; consumption of fossil fuels for power generation; sales of electric power, cost data; and unusual occurrences. A glossary is included.

  17. Electrochemical cell structure including an ionomeric barrier

    Science.gov (United States)

    Lambert, Timothy N.; Hibbs, Michael

    2017-06-20

    An apparatus includes an electrochemical half-cell comprising: an electrolyte, an anode; and an ionomeric barrier positioned between the electrolyte and the anode. The anode may comprise a multi-electron vanadium phosphorous alloy, such as VP.sub.x, wherein x is 1-5. The electrochemical half-cell is configured to oxidize the vanadium and phosphorous alloy to release electrons. A method of mitigating corrosion in an electrochemical cell includes disposing an ionomeric barrier in a path of electrolyte or ion flow to an anode and mitigating anion accumulation on the surface of the anode.

  18. Isolators Including Main Spring Linear Guide Systems

    Science.gov (United States)

    Goold, Ryan (Inventor); Buchele, Paul (Inventor); Hindle, Timothy (Inventor); Ruebsamen, Dale Thomas (Inventor)

    2017-01-01

    Embodiments of isolators, such as three parameter isolators, including a main spring linear guide system are provided. In one embodiment, the isolator includes first and second opposing end portions, a main spring mechanically coupled between the first and second end portions, and a linear guide system extending from the first end portion, across the main spring, and toward the second end portion. The linear guide system expands and contracts in conjunction with deflection of the main spring along the working axis, while restricting displacement and rotation of the main spring along first and second axes orthogonal to the working axis.

  19. Designing the next generation (fifth generation computers)

    International Nuclear Information System (INIS)

    Wallich, P.

    1983-01-01

    A description is given of the designs necessary to develop fifth generation computers. An analysis is offered of problems and developments in parallelism, VLSI, artificial intelligence, knowledge engineering and natural language processing. Software developments are outlined including logic programming, object-oriented programming and exploratory programming. Computer architecture is detailed including concurrent computer architecture

  20. 28 CFR 20.32 - Includable offenses.

    Science.gov (United States)

    2010-07-01

    ... Exchange of Criminal History Record Information § 20.32 Includable offenses. (a) Criminal history record... vehicular manslaughter, driving under the influence of drugs or liquor, and hit and run), when unaccompanied by a § 20.32(a) offense. These exclusions may not be applicable to criminal history records...

  1. Including Students with Visual Impairments: Softball

    Science.gov (United States)

    Brian, Ali; Haegele, Justin A.

    2014-01-01

    Research has shown that while students with visual impairments are likely to be included in general physical education programs, they may not be as active as their typically developing peers. This article provides ideas for equipment modifications and game-like progressions for one popular physical education unit, softball. The purpose of these…

  2. Extending flood damage assessment methodology to include ...

    African Journals Online (AJOL)

    Optimal and sustainable flood plain management, including flood control, can only be achieved when the impacts of flood control measures are considered for both the man-made and natural environments, and the sociological aspects are fully considered. Until now, methods/models developed to determine the influences ...

  3. BIOLOGIC AND ECONOMIC EFFECTS OF INCLUDING DIFFERENT ...

    African Journals Online (AJOL)

    The biologic and economic effects of including three agro-industrial by-products as ingredients in turkey poult diets were investigated using 48 turkey poults in a completely randomised design experiment. Diets were formulated to contain the three by-products – wheat offal, rice husk and palm kernel meal, each at 20% level ...

  4. Including Children Dependent on Ventilators in School.

    Science.gov (United States)

    Levine, Jack M.

    1996-01-01

    Guidelines for including ventilator-dependent children in school are offered, based on experience with six such students at a New York State school. Guidelines stress adherence to the medical management plan, the school-family partnership, roles of the social worker and psychologist, orientation, transportation, classroom issues, and steps toward…

  5. Broadcasting a message in a parallel computer

    Science.gov (United States)

    Berg, Jeremy E [Rochester, MN; Faraj, Ahmad A [Rochester, MN

    2011-08-02

    Methods, systems, and products are disclosed for broadcasting a message in a parallel computer. The parallel computer includes a plurality of compute nodes connected together using a data communications network. The data communications network is optimized for point-to-point data communications and is characterized by at least two dimensions. The compute nodes are organized into at least one operational group of compute nodes for collective parallel operations of the parallel computer. One compute node of the operational group is assigned to be a logical root. Broadcasting a message in a parallel computer includes: establishing a Hamiltonian path along all of the compute nodes in at least one plane of the data communications network and in the operational group; and broadcasting, by the logical root to the remaining compute nodes, the logical root's message along the established Hamiltonian path.
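    The patented method itself is not reproduced here, but the central idea, a broadcast that travels a Hamiltonian path through one plane of a mesh, can be sketched in a few lines. The snake-order path and the `inbox` dictionary below are illustrative stand-ins for the mesh links and the per-node receive buffers; they are not part of the disclosed system.

```python
def hamiltonian_path(rows, cols):
    """Boustrophedon ('snake') ordering of a rows x cols mesh: a
    Hamiltonian path that visits every compute node exactly once
    using only nearest-neighbour links."""
    path = []
    for r in range(rows):
        row = [(r, c) for c in range(cols)]
        path.extend(row if r % 2 == 0 else reversed(row))
    return path

def broadcast(rows, cols, message):
    """Forward the root's message hop by hop along the path: node i
    sends to node i+1, so every node receives the message exactly once
    over a single mesh link."""
    path = hamiltonian_path(rows, cols)
    inbox = {path[0]: message}            # the logical root originates the message
    for sender, receiver in zip(path, path[1:]):
        inbox[receiver] = inbox[sender]   # point-to-point send along one link
    return inbox

print(broadcast(2, 3, "hello"))  # every node of the 2x3 mesh ends up with "hello"
```

    Because consecutive path nodes are always mesh neighbours, every hop uses exactly one point-to-point link, which is why this pattern suits a network optimized for point-to-point traffic.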

  6. Link failure detection in a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Megerian, Mark G.; Smith, Brian E.

    2010-11-09

    Methods, apparatus, and products are disclosed for link failure detection in a parallel computer including compute nodes connected in a rectangular mesh network, each pair of adjacent compute nodes in the rectangular mesh network connected together using a pair of links, that includes: assigning each compute node to either a first group or a second group such that adjacent compute nodes in the rectangular mesh network are assigned to different groups; sending, by each of the compute nodes assigned to the first group, a first test message to each adjacent compute node assigned to the second group; determining, by each of the compute nodes assigned to the second group, whether the first test message was received from each adjacent compute node assigned to the first group; and notifying a user, by each of the compute nodes assigned to the second group, whether the first test message was received.
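    The two-group scheme described in this record amounts to a checkerboard colouring of the mesh: every link joins a first-group node to a second-group node, so one round of test messages exercises every link. The toy simulation below is a sketch of that idea only; the `broken_links` fault model and the function name are hypothetical, not the patent's interface.

```python
def detect_link_failures(rows, cols, broken_links):
    """Checkerboard sketch: nodes with (r + c) even form the first group,
    (r + c) odd the second. First-group nodes send a test message over
    each incident link; each second-group node reports any neighbour
    whose message never arrived. `broken_links` is a set of frozensets
    {node_a, node_b} naming failed links (a hypothetical fault model)."""
    failures = []
    for r in range(rows):
        for c in range(cols):
            if (r + c) % 2 == 0:
                continue                      # only second-group nodes check inboxes
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nb = (r + dr, c + dc)         # neighbour is first-group by construction
                if 0 <= nb[0] < rows and 0 <= nb[1] < cols:
                    if frozenset((nb, (r, c))) in broken_links:
                        failures.append((nb, (r, c)))  # test message lost on this link
    return failures

# A single broken link between (0, 0) and (0, 1) is pinpointed by node (0, 1):
print(detect_link_failures(2, 3, {frozenset({(0, 0), (0, 1)})}))
```

    The colouring guarantees no two senders share a link, so one parallel round of messages (plus the symmetric second round the abstract implies) covers the whole rectangular mesh.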

  7. Computational intelligence in medical informatics

    CERN Document Server

    Gunjan, Vinit

    2015-01-01

    This Brief highlights informatics and related techniques for computer science professionals, engineers, medical doctors, bioinformatics researchers and other interdisciplinary researchers. Chapters include the bioinformatics of diabetes and several computational algorithms and statistical analysis approaches to effectively study the disorders and their possible causes, along with medical applications.

  8. Workshop on Computational Optimization

    CERN Document Server

    2015-01-01

    Our everyday life is unthinkable without optimization. We try to minimize our effort and to maximize the achieved profit. Many real-world and industrial problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks. This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2013. It presents recent advances in computational optimization. The volume includes important real-life problems like parameter settings for controlling processes in a bioreactor, resource-constrained project scheduling, problems arising in transport services, error-correcting codes, optimal system performance and energy consumption, and so on. It shows how to develop algorithms for them based on new metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming and others.

  9. Applied computational physics

    CERN Document Server

    Boudreau, Joseph F; Bianchi, Riccardo Maria

    2018-01-01

    Applied Computational Physics is a graduate-level text stressing three essential elements: advanced programming techniques, numerical analysis, and physics. The goal of the text is to provide students with essential computational skills that they will need in their careers, and to increase the confidence with which they write computer programs designed for their problem domain. The physics problems give them an opportunity to reinforce their programming skills, while the acquired programming skills augment their ability to solve physics problems. The C++ language is used throughout the text. Physics problems include Hamiltonian systems, chaotic systems, percolation, critical phenomena, few-body and multi-body quantum systems, quantum field theory, simulation of radiation transport, and data modeling. The book, the fruit of a collaboration between a theoretical physicist and an experimental physicist, covers a broad range of topics from both viewpoints. Examples, program libraries, and additional documentatio...

  10. Computational problems in engineering

    CERN Document Server

    Mladenov, Valeri

    2014-01-01

    This book provides readers with modern computational techniques for solving variety of problems from electrical, mechanical, civil and chemical engineering. Mathematical methods are presented in a unified manner, so they can be applied consistently to problems in applied electromagnetics, strength of materials, fluid mechanics, heat and mass transfer, environmental engineering, biomedical engineering, signal processing, automatic control and more.   • Features contributions from distinguished researchers on significant aspects of current numerical methods and computational mathematics; • Presents actual results and innovative methods that provide numerical solutions, while minimizing computing times; • Includes new and advanced methods and modern variations of known techniques that can solve difficult scientific problems efficiently.  

  11. Controlling data transfers from an origin compute node to a target compute node

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-06-21

    Methods, apparatus, and products are disclosed for controlling data transfers from an origin compute node to a target compute node that include: receiving, by an application messaging module on the target compute node, an indication of a data transfer from an origin compute node to the target compute node; and administering, by the application messaging module on the target compute node, the data transfer using one or more messaging primitives of a system messaging module in dependence upon the indication.

  12. Mastering cloud computing foundations and applications programming

    CERN Document Server

    Buyya, Rajkumar; Selvi, SThamarai

    2013-01-01

    Mastering Cloud Computing is designed for undergraduate students learning to develop cloud computing applications. Tomorrow's applications won't live on a single computer but will be deployed from and reside on a virtual server, accessible anywhere, any time. Tomorrow's application developers need to understand the requirements of building apps for these virtual systems, including concurrent programming, high-performance computing, and data-intensive systems. The book introduces the principles of distributed and parallel computing underlying cloud architectures and specifical

  13. Photoactive devices including porphyrinoids with coordinating additives

    Science.gov (United States)

    Forrest, Stephen R; Zimmerman, Jeramy; Yu, Eric K; Thompson, Mark E; Trinh, Cong; Whited, Matthew; Diev, Vlacheslav

    2015-05-12

    Coordinating additives are included in porphyrinoid-based materials to promote intermolecular organization and improve one or more photoelectric characteristics of the materials. The coordinating additives are selected from fullerene compounds and organic compounds having free electron pairs. Combinations of different coordinating additives can be used to tailor the characteristic properties of such porphyrinoid-based materials, including porphyrin oligomers. Bidentate ligands are one type of coordinating additive that can form coordination bonds with a central metal ion of two different porphyrinoid compounds to promote porphyrinoid alignment and/or pi-stacking. The coordinating additives can shift the absorption spectrum of a photoactive material toward higher wavelengths, increase the external quantum efficiency of the material, or both.

  14. Electric power monthly, September 1990. [Glossary included

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-17

    The purpose of this report is to provide energy decision makers with accurate and timely information that may be used in forming various perspectives on electric issues. The power plants considered include coal, petroleum, natural gas, hydroelectric, and nuclear power plants. Data are presented for power generation, fuel consumption, fuel receipts and cost, sales of electricity, and unusual occurrences at power plants. Data are compared at the national, Census division, and state levels. 4 figs., 52 tabs. (CK)

  15. Nuclear reactor shield including magnesium oxide

    International Nuclear Information System (INIS)

    Rouse, C.A.; Simnad, M.T.

    1981-01-01

    An improvement is described for nuclear reactor shielding of a type used in reactor applications involving significant amounts of fast neutron flux. The reactor shielding includes means providing structural support, neutron moderator material, neutron absorber material and other components, wherein at least a portion of the neutron moderator material is magnesium in the form of magnesium oxide either alone or in combination with other moderator materials such as graphite and iron

  16. Model for safety reports including descriptive examples

    International Nuclear Information System (INIS)

    1995-12-01

    Several safety reports will be produced in the process of planning and constructing the system for disposal of high-level radioactive waste in Sweden. The present report gives a model, with detailed examples, of how these reports should be organized and what steps they should include. In the near future safety reports will deal with the encapsulation plant and the repository. Later reports will treat operation of the handling systems and the repository

  17. Jet-calculus approach including coherence effects

    International Nuclear Information System (INIS)

    Jones, L.M.; Migneron, R.; Narayanan, K.S.S.

    1987-01-01

    We show how integrodifferential equations typical of jet calculus can be combined with an averaging procedure to obtain jet-calculus-based results including the Mueller interference graphs. Results in longitudinal-momentum fraction x for physical quantities are higher at intermediate x and lower at large x than with the conventional ''incoherent'' jet calculus. These results resemble those of Marchesini and Webber, who used a Monte Carlo approach based on the same dynamics

  18. Computational Physics' Greatest Hits

    Science.gov (United States)

    Bug, Amy

    2011-03-01

    The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session ``What is computational physics?'' speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session ``Great Advances...Past, Present and Future'' in which five dramatic areas of discovery (five of our ``greatest hits'') are chronicled: The physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo Methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify ``greatest hits'' in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.

  19. Computer Registration Becoming Mandatory

    CERN Multimedia

    2003-01-01

    Following the decision by the CERN Management Board (see Weekly Bulletin 38/2003), registration of all computers connected to CERN's network will be enforced and only registered computers will be allowed network access. The implementation has started with the IT buildings, continues with building 40 and the Prevessin site (as of Tuesday 4th November 2003), and will cover the whole of CERN before the end of this year. We therefore recommend strongly that you register all your computers in CERN's network database, including all network access cards (Ethernet AND wireless), as soon as possible, without waiting for the access restriction to take effect. This will allow you to access the network without interruption and will help IT service providers to contact you in case of problems (e.g. security problems, viruses, etc.) Users WITH a CERN computing account register at: http://cern.ch/register/ (CERN Intranet page) Visitors WITHOUT a CERN computing account (e.g. short term visitors) register at: http://cern.ch/regis...

  20. Tools for computational finance

    CERN Document Server

    Seydel, Rüdiger U

    2017-01-01

    Computational and numerical methods are used in a number of ways across the field of finance. It is the aim of this book to explain how such methods work in financial engineering. By concentrating on the field of option pricing, a core task of financial engineering and risk analysis, this book explores a wide range of computational tools in a coherent and focused manner and will be of use to anyone working in computational finance. Starting with an introductory chapter that presents the financial and stochastic background, the book goes on to detail computational methods using both stochastic and deterministic approaches. Now in its sixth edition, Tools for Computational Finance has been significantly revised and contains:    Several new parts such as a section on extended applications of tree methods, including multidimensional trees, trinomial trees, and the handling of dividends; Additional material in the field of generating normal variates with acceptance-rejection methods, and on Monte Carlo methods...

  1. AGRIS: Description of computer programs

    International Nuclear Information System (INIS)

    Schmid, H.; Schallaboeck, G.

    1976-01-01

    The set of computer programs used at the AGRIS (Agricultural Information System) Input Unit at the IAEA, Vienna, Austria, to process the AGRIS computer-readable data is described. The processing flow is illustrated. The configuration of the IAEA's computer, a list of error messages generated by the computer, the EBCDIC code table extended for AGRIS and INIS, the AGRIS 6-bit code, the work sheet format, and job control listings are included as appendixes. The programs are written for an IBM 370, model 145, operating system OS or VS, and require a 130K partition. The programming languages are PL/1 (F-compiler) and Assembler.

  2. An introduction to digital computing

    CERN Document Server

    George, F H

    2014-01-01

    An Introduction to Digital Computing provides information pertinent to the fundamental aspects of digital computing. This book represents a major step towards the universal availability of programmed material.Organized into four chapters, this book begins with an overview of the fundamental workings of the computer, including the way it handles simple arithmetic problems. This text then provides a brief survey of the basic features of a typical computer that is divided into three sections, namely, the input and output system, the memory system for data storage, and a processing system. Other c

  3. Performing an allreduce operation on a plurality of compute nodes of a parallel computer

    Science.gov (United States)

    Faraj, Ahmad [Rochester, MN

    2012-04-17

    Methods, apparatus, and products are disclosed for performing an allreduce operation on a plurality of compute nodes of a parallel computer. Each compute node includes at least two processing cores. Each processing core has contribution data for the allreduce operation. Performing an allreduce operation on a plurality of compute nodes of a parallel computer includes: establishing one or more logical rings among the compute nodes, each logical ring including at least one processing core from each compute node; performing, for each logical ring, a global allreduce operation using the contribution data for the processing cores included in that logical ring, yielding a global allreduce result for each processing core included in that logical ring; and performing, for each compute node, a local allreduce operation using the global allreduce results for each processing core on that compute node.
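The two-phase scheme this record describes can be illustrated with a toy, sequential Python sketch (the node layout, values, and sum reduction are invented for illustration; a real implementation would run the rings concurrently over MPI or similar):

```python
# nodes[i][j] = contribution data of processing core j on compute node i
nodes = [[1, 2], [3, 4], [5, 6]]  # 3 nodes, 2 cores each (illustrative values)

num_cores = len(nodes[0])

# Phase 1: establish one logical ring per core index, containing that core
# from every compute node, and perform a global allreduce within each ring.
ring_results = []
for j in range(num_cores):
    ring = [node[j] for node in nodes]
    ring_results.append(sum(ring))  # global allreduce result for ring j

# Phase 2: each node locally combines the global results held by its cores.
total = sum(ring_results)  # local allreduce, same on every node
print(ring_results, total)  # → [9, 12] 21
```

Splitting the reduction this way lets every core participate in a ring in parallel, instead of funneling all contributions through one core per node.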

  4. Bibliography. Computer-Oriented Projects, 1987.

    Science.gov (United States)

    Smith, Richard L., Comp.

    1988-01-01

    Provides an annotated list of references on computer-oriented projects. Includes information on computers; hands-on versus simulations; games; instruction; students' attitudes and learning styles; artificial intelligence; tutoring; and application of spreadsheets. (RT)

  5. Computer vision for an autonomous mobile robot

    CSIR Research Space (South Africa)

    Withey, Daniel J

    2015-10-01

    Full Text Available Computer vision systems are essential for practical, autonomous, mobile robots – machines that employ artificial intelligence and control their own motion within an environment. As with biological systems, computer vision systems include the vision...

  6. Computer Graphics for Multimedia and Hypermedia Development.

    Science.gov (United States)

    Mohler, James L.

    1998-01-01

    Discusses several theoretical and technical aspects of computer-graphics development that are useful for creating hypermedia and multimedia materials. Topics addressed include primary bitmap attributes in computer graphics, the jigsaw principle, and raster layering. (MSE)

  7. Optical Computing

    Indian Academy of Sciences (India)

    Other advantages of optics include low manufacturing costs, immunity to ... It is now possible to control atoms by trapping single photons in small, ... cement, and optical spectrum analyzers. ... risk of noise is further reduced, as light is immune to electro- ... mode of operation including management of large multimedia.

  8. Computing Services and Assured Computing

    Science.gov (United States)

    2006-05-01

    fighters’ ability to execute the mission.” Computing Services: We run IT systems that provide medical care, pay the warfighters, and manage maintenance... users • 1,400 applications • 18 facilities • 180 software vendors • 18,000+ copies of executive software products • Virtually every type of mainframe and...

  9. 7th International Workshop on Natural Computing

    CERN Document Server

    Hagiya, Masami

    2015-01-01

    This book highlights recent advances in natural computing, including biology and its theory, bio-inspired computing, computational aesthetics, computational models and theories, computing with natural media, philosophy of natural computing and educational technology. It presents extended versions of the best papers selected from the symposium “7th International Workshop on Natural Computing” (IWNC7), held in Tokyo, Japan, in 2013. The target audience is not limited to researchers working in natural computing but also those active in biological engineering, fine/media art design, aesthetics and philosophy.

  10. [Renal patient's diet: Can fish be included?].

    Science.gov (United States)

    Castro González, M I; Maafs Rodríguez, A G; Galindo Gómez, C

    2012-01-01

    Medical and nutritional treatment for renal disease, now a major public health issue, is highly complicated. Nutritional therapy must seek to retard renal dysfunction, maintain an optimal nutritional status and prevent the development of underlying pathologies. To analyze ten fish species to identify those that, because of their low phosphorus content, high biological value protein and elevated n-3 fatty acids EPA and DHA, could be included in the renal patient's diet. The following fish species (Little tunny, Red drum, Spotted eagle ray, Escolar, Swordfish, Big-scale pomfret, Cortez flounder, Largemouth black bass, Periche mojarra, Florida pompano) were analyzed according to the AOAC and Keller techniques to determine their protein, phosphorus, sodium, potassium, cholesterol, vitamins D(3) and E, and n-3 EPA+DHA content. These results were used to calculate relations between nutrients. The protein in the analyzed species ranged from 16.5 g/100 g of fillet (Largemouth black bass) to 27.2 g/100 g (Red drum); the lowest phosphorus value was 28.6 mg/100 g (Periche mojarra) and the highest 216.3 mg/100 g (Spotted eagle ray). 80% of the fish presented > 100 mg EPA + DHA in 100 g of fillet. By its phosphorus/g protein ratio, Escolar and Swordfish could not be included in the renal diet; Little tunny, Escolar, Big-scale pomfret, Largemouth black bass, Periche mojarra and Florida pompano presented a lower phosphorus/EPA + DHA ratio. Florida pompano is the most recommended species for renal patients, due to its optimal nutrient relations. However, all analyzed species, except Escolar and Swordfish, could be included in renal diets.

  11. Drug delivery device including electrolytic pump

    KAUST Repository

    Foulds, Ian G.; Buttner, Ulrich; Yi, Ying

    2016-01-01

    Systems and methods are provided for a drug delivery device and use of the device for drug delivery. In various aspects, the drug delivery device combines a “solid drug in reservoir” (SDR) system with an electrolytic pump. In various aspects an improved electrolytic pump is provided including, in particular, an improved electrolytic pump for use with a drug delivery device, for example an implantable drug delivery device. A catalytic reformer can be incorporated in a periodically pulsed electrolytic pump to provide stable pumping performance and reduced actuation cycle.

  12. Drug delivery device including electrolytic pump

    KAUST Repository

    Foulds, Ian G.

    2016-03-31

    Systems and methods are provided for a drug delivery device and use of the device for drug delivery. In various aspects, the drug delivery device combines a “solid drug in reservoir” (SDR) system with an electrolytic pump. In various aspects an improved electrolytic pump is provided including, in particular, an improved electrolytic pump for use with a drug delivery device, for example an implantable drug delivery device. A catalytic reformer can be incorporated in a periodically pulsed electrolytic pump to provide stable pumping performance and reduced actuation cycle.

  13. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  14. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  15. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. Also we propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
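As a rough illustration of the policing function the record refers to, here is a minimal token-bucket sketch in Python (the class, parameters, and demo values are invented for illustration; the paper's dynamic model is considerably more elaborate):

```python
import time

class TokenBucket:
    """Minimal token-bucket policer: tokens accrue at `rate` per second,
    capped at `capacity`; a request is admitted only if its cost in tokens
    is available. (Names and parameters are illustrative, not from the paper.)"""

    def __init__(self, rate: float, capacity: float, clock=time.monotonic):
        self.rate, self.capacity = rate, capacity
        self.tokens = capacity
        self.clock = clock
        self.last = clock()

    def allow(self, cost: float = 1.0) -> bool:
        now = self.clock()
        # Refill proportionally to elapsed time, never beyond capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False

# Deterministic demo with a fake clock: a burst of 12 requests at t=0
# drains the 10-token bucket; one second later, 5 tokens have accrued.
t = [0.0]
bucket = TokenBucket(rate=5.0, capacity=10.0, clock=lambda: t[0])
burst = sum(bucket.allow() for _ in range(12))
t[0] = 1.0
later = sum(bucket.allow() for _ in range(12))
print(burst, later)  # → 10 5
```

The injectable clock keeps the demo deterministic; in production code the default monotonic clock would be used.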

  16. 3D integrated HYDRA simulations of hohlraums including fill tubes

    Science.gov (United States)

    Marinak, M. M.; Milovich, J.; Hammel, B. A.; Macphee, A. G.; Smalyuk, V. A.; Kerbel, G. D.; Sepke, S.; Patel, M. V.

    2017-10-01

    Measurements of fill tube perturbations from hydro growth radiography (HGR) experiments on the National Ignition Facility show spoke perturbations in the ablator radiating from the base of the tube. These correspond to the shadow of the 10 μm diameter glass fill tube cast by hot spots at early time. We present 3D integrated HYDRA simulations of these experiments which include the fill tube. Meshing techniques are described which were employed to resolve the fill tube structure and associated perturbations in the simulations. We examine the extent to which the specific illumination geometry necessary to accommodate a backlighter in the HGR experiment contributes to the spoke pattern. Simulations presented include high resolution calculations run on the Trinity machine operated by the Alliance for Computing at Extreme Scale (ACES) partnership. This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344.

  17. Microfluidic System Simulation Including the Electro-Viscous Effect

    Science.gov (United States)

    Rojas, Eileen; Chen, C. P.; Majumdar, Alok

    2007-01-01

    This paper describes a practical approach using a general-purpose lumped-parameter computer program, GFSSP (Generalized Fluid System Simulation Program), for calculating flow distribution in a network of micro-channels, including electro-viscous effects due to the existence of the electrical double layer (EDL). In this study, an empirical formulation for calculating an effective viscosity of ionic solutions based on dimensional analysis is described to account for surface charge and bulk fluid conductivity, which give rise to the electro-viscous effect in a microfluidic network. Two-dimensional slit microflow data were used to determine the model coefficients. Geometry effects are then included through a Poiseuille number correlation in GFSSP. The bi-power model was used to calculate the flow distribution of isotropically etched straight-channel and T-junction microflows involving ionic solutions. Performance of the proposed model is assessed against experimental test data.

  18. Positron scattering by atomic hydrogen including positronium formation

    International Nuclear Information System (INIS)

    Higgins, K.; Burke, P.G.

    1993-01-01

    Positron scattering by atomic hydrogen including positronium formation has been formulated using the R-matrix method and a general computer code written. Partial wave elastic and ground state positronium formation cross sections have been calculated for L ≤ 6 using a six-state approximation which includes the ground state and the 2s and 2p pseudostates of both hydrogen and positronium. The elastic scattering results obtained are in good agreement with those derived from a highly accurate calculation based upon the intermediate energy R-matrix approach. As in a previous coupled-channel static calculation, resonance effects are observed at intermediate energies in the S-wave positronium formation cross section. However, in the present results, the dominant resonance arises in the P-wave cross sections at an energy of 2.73 Ryd and with a width of 0.19 Ryd. (author)

  19. Nonlinear equilibrium in Tokamaks including convective terms and viscosity

    International Nuclear Information System (INIS)

    Martin, P.; Castro, E.; Puerta, J.

    2003-01-01

    MHD equilibrium in tokamaks becomes very complex when the non-linear convective term and viscosity are included in the momentum equation. In order to simplify the analysis, each new term has been separated into gradient-type terms and vorticity-dependent terms. For the special case in which the vorticity vanishes, an extended Grad-Shafranov type equation can be obtained. However, the magnetic surfaces are then no longer isobars or current surfaces, as in the usual Grad-Shafranov treatment. The non-linear convective term introduces gradients of Bernoulli-type kinetic terms. Montgomery and other authors have shown the importance of the viscosity terms in tokamaks [1,2]; here the treatment is carried out for the equilibrium condition, using generalized tokamak coordinates recently described [3], which simplify the equilibrium analysis. Calculation of the new isobar surfaces is difficult, and some computations have been carried out elsewhere for particular cases [3]. Here, our analysis is extended by discussing how the toroidal current density, plasma pressure and toroidal field are modified across the midplane because of the new terms (convective and viscous). New calculations and computations are also presented. (Author)

  20. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justify the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media is used in imaging to demonstrate scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions), are useful tools

  1. Mesenteric panniculitis: computed tomography aspects

    International Nuclear Information System (INIS)

    Moreira, Luiza Beatriz Melo; Alves, Jose Ricardo Duarte; Marchiori, Edson; Pinheiro, Ricardo Andrade; Melo, Alessandro Severo Alves de; Noro, Fabio

    2001-01-01

    Mesenteric panniculitis is an inflammatory process that represents the second stage of a rare progressive disease involving the adipose tissue of the mesentery. Imaging methods used in the diagnosis of mesenteric panniculitis include barium studies, ultrasonography, computed tomography and magnetic resonance imaging. Computed tomography is important both for diagnosis and for evaluating the extent of the disease and monitoring treatment. Computed tomography findings may vary according to the stage of the disease and the amount of inflammatory material or fibrosis. There is also good correlation between the computed tomography and anatomical pathology findings. The authors studied 10 patients with mesenteric panniculitis submitted to computed tomography. Magnetic resonance imaging was also performed in one patient. In all patients, computed tomography revealed a heterogeneous mass in the mesentery with fat density, interspersed with areas of soft-tissue density and dilated vessels. (author)

  2. Computer applications in radiation protection

    International Nuclear Information System (INIS)

    Cole, P.R.; Moores, B.M.

    1995-01-01

    Computer applications in general, and in diagnostic radiology in particular, are becoming more widespread. Their application to the field of radiation protection in medical imaging, including quality control initiatives, is similarly becoming more widespread. Advances in computer technology have enabled departments of diagnostic radiology to have access to powerful yet affordable personal computers. The application of databases, expert systems and computer-based learning is under way. The executive information systems for the management of dose and QA data that are under way at IRS are discussed. An important consideration in developing these pragmatic software tools has been the range of computer literacy within the end-user group. User interfaces have been specifically designed to reflect the requirements of the many end users who will have little or no computer knowledge. (Author)

  3. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to assign computation to a great number of distributed computers rather than to local computers ...

  4. Energy principle with included boundary conditions

    International Nuclear Information System (INIS)

    Lehnert, B.

    1994-01-01

    Earlier comments by the author on the limitations of the classical form of the extended energy principle are supported by a complementary analysis on the potential energy change arising from free-boundary displacements of a magnetically confined plasma. In the final formulation of the extended principle, restricted displacements, satisfying pressure continuity by means of plasma volume currents in a thin boundary layer, are replaced by unrestricted (arbitrary) displacements which can give rise to induced surface currents. It is found that these currents contribute to the change in potential energy, and that their contribution is not taken into account by such a formulation. A general expression is further given for surface currents induced by arbitrary displacements. The expression is used to reformulate the energy principle for the class of displacements which satisfy all necessary boundary conditions, including that of the pressure balance. This makes a minimization procedure of the potential energy possible, for the class of all physically relevant test functions which include the constraints imposed by the boundary conditions. Such a procedure is also consistent with a corresponding variational calculus. (Author)

  5. Aerosol simulation including chemical and nuclear reactions

    International Nuclear Information System (INIS)

    Marwil, E.S.; Lemmon, E.C.

    1985-01-01

    The numerical simulation of aerosol transport, including the effects of chemical and nuclear reactions, presents a challenging dynamic accounting problem. Particles of different sizes agglomerate and settle out due to various mechanisms, such as diffusion, diffusiophoresis, thermophoresis, gravitational settling, turbulent acceleration, and centrifugal acceleration. Particles also change size due to the condensation and evaporation of materials on the particle. Heterogeneous chemical reactions occur at the interface between a particle and the suspending medium, or a surface and the gas in the aerosol. Homogeneous chemical reactions occur within the aerosol suspending medium, within a particle, and on a surface. These reactions may include a phase change. Nuclear reactions occur in all locations. These spontaneous transmutations from one element form to another occur at greatly varying rates and may result in phase or chemical changes which complicate the accounting process. This paper presents an approach for including these effects in the transport of aerosols. The accounting system is very complex and results in a large set of stiff ordinary differential equations (ODEs). The techniques for numerical solution of these ODEs require special attention to achieve their solution in an efficient and affordable manner. 4 refs

  6. Addressing Stillbirth in India Must Include Men.

    Science.gov (United States)

    Roberts, Lisa; Montgomery, Susanne; Ganesh, Gayatri; Kaur, Harinder Pal; Singh, Ratan

    2017-07-01

    Millennium Development Goal 4, to reduce child mortality, can only be achieved by reducing stillbirths globally. A confluence of medical and sociocultural factors contribute to the high stillbirth rates in India. The psychosocial aftermath of stillbirth is a well-documented public health problem, though less is known of the experience for men, particularly outside of the Western context. Therefore, men's perceptions and knowledge regarding reproductive health, as well as maternal-child health are important. Key informant interviews (n = 5) were analyzed and 28 structured interviews were conducted using a survey based on qualitative themes. Qualitative themes included men's dual burden and right to medical and reproductive decision making power. Wives were discouraged from expressing grief and pushed to conceive again. If not successful, particularly if a son was not conceived, a second wife was considered a solution. Quantitative data revealed that men with a history of stillbirths had greater anxiety and depression, perceived less social support, but had more egalitarian views towards women than men without stillbirth experience. At the same time fathers of stillbirths were more likely to be emotionally or physically abusive. Predictors of mental health, attitudes towards women, and perceived support are discussed. Patriarchal societal values, son preference, deficient women's autonomy, and sex-selective abortion perpetuate the risk for future poor infant outcomes, including stillbirth, and compounds the already higher risk of stillbirth for males. Grief interventions should explore and take into account men's perceptions, attitudes, and behaviors towards reproductive decision making.

  7. Research on cloud computing solutions

    Directory of Open Access Journals (Sweden)

    Liudvikas Kaklauskas

    2015-07-01

    Full Text Available Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang identified six phases of computing paradigms, from dumb terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, hybrid cloud and community cloud. The most common and well-known deployment model is the public cloud. A private cloud is suited for sensitive data, where the customer is dependent on a certain degree of security. According to the different types of services offered, cloud computing can be considered to consist of three layers (service models): IaaS (infrastructure as a service), PaaS (platform as a service) and SaaS (software as a service). Main cloud computing solutions: web applications, data hosting, virtualization, database clusters and terminal services. The advantage of cloud computing is the ability to virtualize and share resources among different applications with the objective of better server utilization; without a clustering solution, a service may fail the moment the server crashes. DOI: 10.15181/csat.v2i2.914

  8. Future computing needs for Fermilab

    International Nuclear Information System (INIS)

    1983-12-01

    The following recommendations are made: (1) Significant additional computing capacity and capability beyond the present procurement should be provided by 1986. A working group with representation from the principal computer user community should be formed to begin immediately to develop the technical specifications. High priority should be assigned to providing a large user memory, software portability and a productive computing environment. (2) A networked system of VAX-equivalent super-mini computers should be established with at least one such computer dedicated to each reasonably large experiment for both online and offline analysis. The laboratory staff responsible for mini computers should be augmented in order to handle the additional work of establishing, maintaining and coordinating this system. (3) The laboratory should move decisively to a more fully interactive environment. (4) A plan for networking both inside and outside the laboratory should be developed over the next year. (5) The laboratory resources devoted to computing, including manpower, should be increased over the next two to five years. A reasonable increase would be 50% over the next two years increasing thereafter to a level of about twice the present one. (6) A standing computer coordinating group, with membership of experts from all the principal computer user constituents of the laboratory, should be appointed by and report to the director. This group should meet on a regularly scheduled basis and be charged with continually reviewing all aspects of the laboratory computing environment

  9. Computer Science Research at Langley

    Science.gov (United States)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  10. 76 FR 41522 - In the Matter of Certain Electronic Devices, Including Mobile Phones, Mobile Tablets, Portable...

    Science.gov (United States)

    2011-07-14

    ... Devices, Including Mobile Phones, Mobile Tablets, Portable Music Players, and Computers, and Components.... 1337, in the importation, sale for importation and sale within the United States after importation of certain mobile phones, mobile tablets, portable music players, and computers. 76 FR 24051 (Apr. 29, 2011...

  11. Computer loss experience and predictions

    Science.gov (United States)

    Parker, Donn B.

    1996-03-01

    The types of losses organizations must anticipate have become more difficult to predict because of the eclectic nature of computers and data communications, and the decrease in news media reporting of computer-related losses as they become commonplace. Total business crime is conjectured to be decreasing in frequency and increasing in loss per case as a result of increasing computer use. Computer crimes are probably increasing, however, as their share of the decreasing business crime rate grows. Ultimately all business crime will involve computers in some way, and we could see a decline of both together. The important information security measures in high-loss business crime generally concern controls over authorized people engaged in unauthorized activities. Such controls include authentication of users, analysis of detailed audit records, unannounced audits, segregation of development and production systems and duties, shielding the viewing of screens, and security awareness and motivation controls in high-value transaction areas. Computer crimes that involve highly publicized intriguing computer misuse methods, such as privacy violations, radio frequency emanations eavesdropping, and computer viruses, have been reported in waves that periodically have saturated the news media during the past 20 years. We must be able to anticipate such highly publicized crimes and reduce the impact and embarrassment they cause. On the basis of our most recent experience, I propose nine new types of computer crime to be aware of: computer larceny (theft and burglary of small computers), automated hacking (use of computer programs to intrude), electronic data interchange fraud (business transaction fraud), Trojan bomb extortion and sabotage (code secretly inserted into others' systems that can be triggered to cause damage), LANarchy (unknown equipment in use), desktop forgery (computerized forgery and counterfeiting of documents), information anarchy (indiscriminate use of

  12. Computer aided surface representation

    Energy Technology Data Exchange (ETDEWEB)

    Barnhill, R E

    1987-11-01

    The aims of this research are the creation of new surface forms and the determination of geometric and physical properties of surfaces. The full sweep from constructive mathematics through the implementation of algorithms and the interactive computer graphics display of surfaces is utilized. Both three-dimensional and multidimensional surfaces are considered. Particular emphasis is given to the scientific computing solution of Department of Energy problems. The methods that we have developed and that we are proposing to develop allow applications such as: producing smooth contour maps from measured data, such as weather maps; modeling the heat distribution inside a furnace from sample measurements; terrain modeling based on satellite pictures. The investigation of new surface forms includes the topics of triangular interpolants, multivariate interpolation, surfaces defined on surfaces, and monotone and/or convex surfaces. The geometric and physical properties considered include contours, the intersection of surfaces, curvatures as an interrogation tool, and numerical integration.
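
The simplest triangular interpolant of the kind this abstract mentions is linear interpolation via barycentric coordinates. The following is a minimal illustrative sketch, not code from the cited work; the triangle, data values, and query point are invented:

```python
def barycentric_coords(p, a, b, c):
    """Barycentric coordinates of point p with respect to triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    l1 = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    l2 = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return l1, l2, 1.0 - l1 - l2

def linear_triangular_interpolant(p, tri, values):
    """Interpolate scattered data linearly over one triangle."""
    l1, l2, l3 = barycentric_coords(p, *tri)
    return l1 * values[0] + l2 * values[1] + l3 * values[2]

tri = ((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))
values = (10.0, 20.0, 30.0)  # measured data at the three vertices
print(linear_triangular_interpolant((0.25, 0.25), tri, values))  # -> 17.5
```

A full scattered-data method of the sort discussed would triangulate the data sites and apply such an interpolant (or a smoother variant) per triangle.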

  13. The surgery of peripheral nerves (including tumors)

    DEFF Research Database (Denmark)

    Fugleholm, Kåre

    2013-01-01

    Surgical pathology of the peripheral nervous system includes traumatic injury, entrapment syndromes, and tumors. The recent significant advances in the understanding of the pathophysiology and cellular biology of peripheral nerve degeneration and regeneration has yet to be translated into improved...... surgical techniques and better outcome after peripheral nerve injury. Decision making in peripheral nerve surgery continues to be a complex challenge, where the mechanism of injury, repeated clinical evaluation, neuroradiological and neurophysiological examination, and detailed knowledge of the peripheral...... nervous system response to injury are prerequisite to obtain the best possible outcome. Surgery continues to be the primary treatment modality for peripheral nerve tumors and advances in adjuvant oncological treatment has improved outcome after malignant peripheral nerve tumors. The present chapter...

  14. AMS at the ANU including biomedical applications

    Energy Technology Data Exchange (ETDEWEB)

    Fifield, L K; Allan, G L; Cresswell, R G; Ophel, T R [Australian National Univ., Canberra, ACT (Australia); King, S J; Day, J P [Manchester Univ. (United Kingdom). Dept. of Chemistry

    1994-12-31

    An extensive accelerator mass spectrometry program has been conducted on the 14UD accelerator at the Australian National University since 1986. In the two years since the previous conference, the research program has expanded significantly to include biomedical applications of {sup 26}Al and studies of landform evolution using isotopes produced in situ in surface rocks by cosmic ray bombardment. The system is now used for the measurement of {sup 10}Be, {sup 14}C, {sup 26}Al, {sup 36}Cl, {sup 59}Ni and {sup 129}I, and research is being undertaken in hydrology, environmental geochemistry, archaeology and biomedicine. On the technical side, a new test system has permitted the successful off-line development of a high-intensity ion source. A new injection line to the 14UD has been established and the new source is now in position and providing beams to the accelerator. 4 refs.

  15. AMS at the ANU including biomedical applications

    Energy Technology Data Exchange (ETDEWEB)

    Fifield, L.K.; Allan, G.L.; Cresswell, R.G.; Ophel, T.R. [Australian National Univ., Canberra, ACT (Australia); King, S.J.; Day, J.P. [Manchester Univ. (United Kingdom). Dept. of Chemistry

    1993-12-31

    An extensive accelerator mass spectrometry program has been conducted on the 14UD accelerator at the Australian National University since 1986. In the two years since the previous conference, the research program has expanded significantly to include biomedical applications of {sup 26}Al and studies of landform evolution using isotopes produced in situ in surface rocks by cosmic ray bombardment. The system is now used for the measurement of {sup 10}Be, {sup 14}C, {sup 26}Al, {sup 36}Cl, {sup 59}Ni and {sup 129}I, and research is being undertaken in hydrology, environmental geochemistry, archaeology and biomedicine. On the technical side, a new test system has permitted the successful off-line development of a high-intensity ion source. A new injection line to the 14UD has been established and the new source is now in position and providing beams to the accelerator. 4 refs.

  16. CERN Technical Training: LABVIEW courses include RADE

    CERN Multimedia

    HR Department

    2009-01-01

    The contents of the "LabView Basic I" and "LabView Intermediate II" courses have recently been changed to include, respectively, an introduction to and expert training in the Rapid Application Development Environment (RADE). RADE is a LabView-based application developed at CERN to integrate LabView in the accelerator and experiment control infrastructure. It is a suitable solution to developing expert tools, machine development analysis and independent test facilities. The course names have also been changed to "LabVIEW Basics I with RADE Introduction" and "LabVIEW Intermediate II with Advanced RADE Application". " LabVIEW Basics I with RADE Introduction" is designed for: Users preparing to develop applications using LabVIEW, or NI Developer Suite; users and technical managers evaluating LabVIEW or NI Developer Suite in purchasing decisions; users pursuing the Certified LabVIEW Developer certification. The course pr...

  17. CERN Technical Training: LABVIEW courses include RADE

    CERN Multimedia

    HR Department

    2009-01-01

    The contents of the "LabView Basic I" and "LabView Intermediate II" courses have recently been changed to include, respectively, an introduction to and expert training in the Rapid Application Development Environment (RADE). RADE is a LabView-based application developed at CERN to integrate LabView in the accelerator and experiment control infrastructure. It is a suitable solution to developing expert tools, machine development analysis and independent test facilities. The course names have also been changed to "LabVIEW Basics I with RADE Introduction" and "LabVIEW Intermediate II with Advanced RADE Application". " LabVIEW Basics I with RADE Introduction" is designed for: Users preparing to develop applications using LabVIEW, or NI Developer Suite; users and technical managers evaluating LabVIEW or NI Developer Suite in purchasing decisions; users pursuing the Certified LabVIEW Developer certification. The course prepares participants to develop test and measurement, da...

  18. CERN Technical Training: LABVIEW courses include RADE

    CERN Multimedia

    HR Department

    2009-01-01

    The contents of "LabView Basic I" and "LabView Intermediate II" trainings have been recently changed to include, respectively, an introduction and an expert training on the Rapid Application Development Environment (RADE). RADE is a LabView-based application developed at CERN to integrate LabView in the accelerator and experiment control infrastructure. It is a suitable solution to develop expert tools, machine development analysis and independent test facilities. The course names have also been changed to "LabVIEW Basics I with RADE Introduction" and "LabVIEW Intermediate II with Advanced RADE Application". " LabVIEW Basics I with RADE Introduction" is designed for: Users preparing to develop applications using LabVIEW, or NI Developer Suite; users and technical managers evaluating LabVIEW or NI Developer Suite in purchasing decisions; users pursuing the Certified LabVIEW Developer certification. The course prepare...

  19. Critical point anomalies include expansion shock waves

    Energy Technology Data Exchange (ETDEWEB)

    Nannan, N. R., E-mail: ryan.nannan@uvs.edu [Mechanical Engineering Discipline, Anton de Kom University of Suriname, Leysweg 86, PO Box 9212, Paramaribo, Suriname and Process and Energy Department, Delft University of Technology, Leeghwaterstraat 44, 2628 CA Delft (Netherlands); Guardone, A., E-mail: alberto.guardone@polimi.it [Department of Aerospace Science and Technology, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Colonna, P., E-mail: p.colonna@tudelft.nl [Propulsion and Power, Delft University of Technology, Kluyverweg 1, 2629 HS Delft (Netherlands)

    2014-02-15

    From first-principle fluid dynamics, complemented by a rigorous state equation accounting for critical anomalies, we discovered that expansion shock waves may occur in the vicinity of the liquid-vapor critical point in the two-phase region. Due to universality of near-critical thermodynamics, the result is valid for any common pure fluid in which molecular interactions are only short-range, namely, for so-called 3-dimensional Ising-like systems, and under the assumption of thermodynamic equilibrium. In addition to rarefaction shock waves, diverse non-classical effects are admissible, including composite compressive shock-fan-shock waves, due to the change of sign of the fundamental derivative of gasdynamics.
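
The fundamental derivative of gasdynamics mentioned above is a standard quantity (stated here from general gasdynamics, not reproduced from the cited paper):

```latex
\Gamma = 1 + \frac{\rho}{c}\left(\frac{\partial c}{\partial \rho}\right)_{s}
```

where \rho is the density, c the speed of sound, and s the entropy. Classical theory admits only compressive shocks where \Gamma > 0; in thermodynamic states where \Gamma changes sign and becomes negative, rarefaction (expansion) shocks and composite shock-fan-shock waves become admissible.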

  20. CLIC expands to include the Southern Hemisphere

    CERN Multimedia

    Roberto Cantoni

    2010-01-01

    Australia has recently joined the CLIC collaboration: the enlargement will bring new expertise and resources to the project, and is especially welcome in the wake of CERN budget redistributions following the recent adoption of the Medium Term Plan.   The countries involved in CLIC collaboration With the signing of a Memorandum of Understanding on 26 August 2010, the ACAS network (Australian Collaboration for Accelerator Science) became the 40th member of in the multilateral CLIC collaboration making Australia the 22nd country to join the collaboration. “The new MoU was signed by the ACAS network, which includes the Australian Synchrotron and the University of Melbourne”, explains Jean-Pierre Delahaye, CLIC Study Leader. “Thanks to their expertise, the Australian institutes will contribute greatly to the CLIC damping rings and the two-beam test modules." Institutes from any country wishing to join the CLIC collaboration are invited to assume responsibility o...

  1. Should Broca's area include Brodmann area 47?

    Science.gov (United States)

    Ardila, Alfredo; Bernal, Byron; Rosselli, Monica

    2017-02-01

    Understanding brain organization of speech production has been a principal goal of neuroscience. Historically, brain speech production has been associated with so-called Broca's area (Brodmann areas [BA] 44 and 45); however, modern neuroimaging developments suggest speech production is associated with networks rather than with areas. The purpose of this paper was to analyze the connectivity of BA47 (pars orbitalis) in relation to language. A meta-analysis was conducted to assess the language network in which BA47 is involved. The BrainMap database was used. Twenty papers corresponding to 29 experimental conditions with a total of 373 subjects were included. Our results suggest that BA47 participates in a "frontal language production system" (or extended Broca's system). The BA47 connectivity found is also concordant with a minor role in language semantics. BA47 plays a central role in the language production system.

  2. Musculoskeletal ultrasound including definitions for ultrasonographic pathology

    DEFF Research Database (Denmark)

    Wakefield, RJ; Balint, PV; Szkudlarek, Marcin

    2005-01-01

    Ultrasound (US) has great potential as an outcome in rheumatoid arthritis trials for detecting bone erosions, synovitis, tendon disease, and enthesopathy. It has a number of distinct advantages over magnetic resonance imaging, including good patient tolerability and ability to scan multiple joints...... in a short period of time. However, there are scarce data regarding its validity, reproducibility, and responsiveness to change, making interpretation and comparison of studies difficult. In particular, there are limited data describing standardized scanning methodology and standardized definitions of US...... pathologies. This article presents the first report from the OMERACT ultrasound special interest group, which has compared US against the criteria of the OMERACT filter. Also proposed for the first time are consensus US definitions for common pathological lesions seen in patients with inflammatory arthritis....

  3. Grand unified models including extra Z bosons

    International Nuclear Information System (INIS)

    Li Tiezhong

    1989-01-01

    The grand unified theories (GUT) of the simple Lie groups including extra Z bosons are discussed. Under the authors' hypothesis there are only SU(5+m), SO(6+4n) and E6 groups. A general discussion of SU(5+m) is given, then SU(6) and SU(7) are considered. In SU(6) the 15+6*+6* fermion representations are used, which differ from others in fermion content, Yukawa coupling and breaking scales. A concept of clans of particles, which are not families, is suggested. These clans consist of extra Z bosons and the corresponding fermions at that scale. All of the fermions in the clans are down quarks, except for the standard-model clan, which consists of Z bosons and 15 fermions; therefore, the spectrum of the hadrons composed of these down quarks differs from that of presently known hadrons.

  4. Including climate change in energy investment decisions

    International Nuclear Information System (INIS)

    Ybema, J.R.; Boonekamp, P.G.M.; Smit, J.T.J.

    1995-08-01

    To properly take climate change into account in the analysis of energy investment decisions, it is required to apply decision analysis methods that are capable of considering the specific characteristics of climate change (large uncertainties, long time horizon). Such decision analysis methods do exist. They can explicitly include evolving uncertainties, multi-stage decisions, cumulative effects and risk-averse attitudes. Various methods are considered in this report and two of these methods have been selected: hedging calculations and sensitivity analysis. These methods are applied to illustrative examples, and their limitations are discussed. The examples are (1a) space heating and hot water for new houses from a private investor perspective and (1b) the same example from a government perspective, (2) electricity production with an integrated coal gasification combined cycle (ICGCC) with or without CO 2 removal, and (3) national energy strategy to hedge for climate change. 9 figs., 21 tabs., 42 refs., 1 appendix

  5. Education Program on Fossil Resources Including Coal

    Science.gov (United States)

    Usami, Masahiro

    Fossil fuels, including coal, play a key role as crucial energies contributing to economic development in Asia. On the other hand, their limited quantity and the environmental problems caused by their usage have become a serious global issue, and countermeasures to solve such problems are much in demand. Along with the pursuit of sustainable development, environmentally friendly use of highly efficient fossil resources should therefore be pursued. Kyushu University's sophisticated research, built on long years of accumulated experience in the fossil resources and environmental sectors, together with its advanced large-scale commercial and empirical equipment, will enable us to foster cooperative research and provide an internship program for future researchers. This program was executed as a project commissioned by the Ministry of Economy, Trade and Industry from fiscal year 2007 to fiscal year 2009. A course based on the textbooks developed in this program is scheduled to start in fiscal year 2010.

  6. Basics of Computer Networking

    CERN Document Server

    Robertazzi, Thomas

    2012-01-01

    Springer Brief Basics of Computer Networking provides a non-mathematical introduction to the world of networks. This book covers both technology for wired and wireless networks. Coverage includes transmission media, local area networks, wide area networks, and network security. Written in a very accessible style for the interested layman by the author of a widely used textbook with many years of experience explaining concepts to the beginner.

  7. Computed Tomography. Chapter 11

    Energy Technology Data Exchange (ETDEWEB)

    Geleijns, J. [Leiden University Medical Centre, Leiden (Netherlands)

    2014-09-15

    After its clinical introduction in 1971, computed tomography (CT) developed from an X ray modality that was limited to axial imaging of the brain in neuroradiology into a versatile 3-D whole body imaging modality for a wide range of applications, including oncology, vascular radiology, cardiology, traumatology and interventional radiology. CT is applied for diagnosis and follow-up studies of patients, for planning of radiotherapy, and even for screening of healthy subpopulations with specific risk factors.

  8. Three dimensional field computation

    International Nuclear Information System (INIS)

    Trowbridge, C.W.

    1981-06-01

    Recent research work carried out at Rutherford and Appleton Laboratories into the Computation of Electromagnetic Fields is summarised. The topics covered include algorithms for integral and differential methods for the solution of 3D magnetostatic fields, comparison of results with experiment and an investigation into the strengths and weaknesses of both methods for an analytic problem. The paper concludes with a brief summary of the work in progress on the solution of 3D eddy currents using differential finite elements. (author)

  9. Numerical computations with GPUs

    CERN Document Server

    Kindratenko, Volodymyr

    2014-01-01

    This book brings together research on numerical methods adapted for Graphics Processing Units (GPUs). It explains recent efforts to adapt classic numerical methods, including solution of linear equations and FFT, for massively parallel GPU architectures. This volume consolidates recent research and adaptations, covering widely used methods that are at the core of many scientific and engineering computations. Each chapter is written by authors working on a specific group of methods; these leading experts provide mathematical background, parallel algorithms and implementation details leading to

  10. Universal computer interfaces

    CERN Document Server

    Dheere, RFBM

    1988-01-01

    Presents a survey of the latest developments in the field of the universal computer interface, resulting from a study of the world patent literature. Illustrating the state of the art today, the book ranges from basic interface structure, through parameters and common characteristics, to the most important industrial bus realizations. Recent technical enhancements are also included, with special emphasis devoted to the universal interface adapter circuit. Comprehensively indexed.

  11. Alternating phase focussing including space charge

    International Nuclear Information System (INIS)

    Cheng, W.H.; Gluckstern, R.L.

    1992-01-01

    Longitudinal stability can be obtained in a non-relativistic drift tube accelerator by traversing each gap as the rf accelerating field rises. However, the rising accelerating field leads to a transverse defocussing force which is usually overcome by magnetic focussing inside the drift tubes. The radio frequency quadrupole is one way of providing simultaneous longitudinal and transverse focussing without the use of magnets. One can also avoid the use of magnets by traversing alternate gaps between drift tubes as the field is rising and falling, thus providing an alternation of focussing and defocussing forces in both the longitudinal and transverse directions. The stable longitudinal phase space area is quite small, but recent efforts suggest that alternating phase focussing (APF) may permit low velocity acceleration of currents in the 100-300 mA range. This paper presents a study of the parameter space and a test of crude analytic predictions by adapting the code PARMILA, which includes space charge, to APF. 6 refs., 3 figs

  12. Probabilistic production simulation including CHP plants

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, H.V.; Palsson, H.; Ravn, H.F.

    1997-04-01

    A probabilistic production simulation method is presented for an energy system containing combined heat and power plants. The method permits incorporation of stochastic failures (forced outages) of the plants and is well suited for analysis of the dimensioning of the system, that is, for finding the appropriate types and capacities of production plants in relation to expansion planning. The method is in the tradition of similar approaches for the analysis of power systems, based on the load duration curve. The present method extends on this by considering a two-dimensional load duration curve where the two dimensions represent heat and power. The method permits the analysis of a combined heat and power system which includes all the basic relevant types of plants, viz., condensing plants, back pressure plants, extraction plants and heat plants. The focus of the method is on the situation where the heat side has priority. This implies that on the power side there may be imbalances between demand and production. The method permits quantification of the expected power overflow, the expected unserviced power demand, and the expected unserviced heat demand. It is shown that a discretization method as well as double Fourier series may be applied in algorithms based on the method. (au) 1 tab., 28 ills., 21 refs.
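
The load-duration-curve tradition the abstract refers to can be illustrated with a deliberately tiny, power-only sketch (the article's two-dimensional heat/power extension and its Fourier-series algorithms are not reproduced here): joint forced-outage states of the units are enumerated with their probabilities, and expected unserved energy follows by probability-weighting the shortfalls. All numbers are invented:

```python
from itertools import product

# Invented example system: two units, each (capacity in MW, forced-outage rate).
units = [(60.0, 0.10), (40.0, 0.05)]
load = [30.0, 50.0, 90.0]  # a toy hourly load profile (MW)

def outage_states(units):
    """Enumerate joint up/down states with their probability and available capacity."""
    for avail in product((True, False), repeat=len(units)):
        p, cap = 1.0, 0.0
        for (c, q), up in zip(units, avail):
            p *= (1.0 - q) if up else q
            cap += c if up else 0.0
        yield p, cap

def expected_unserved_energy(load, units):
    """Expected energy not served (MWh) over the load profile."""
    return sum(p * sum(max(l - cap, 0.0) for l in load)
               for p, cap in outage_states(units))

print(expected_unserved_energy(load, units))  # -> 7.9 MWh
```

Expected power overflow is computed analogously by weighting surpluses, max(cap - l, 0), instead of shortfalls.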

  13. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for the Wolsong and Qinshan called the I A and to use a new hardware platform in order to ensure successful operation for the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures

  14. Mathematical model of thyristor inverter including a series-parallel resonant circuit

    OpenAIRE

    Luft, M.; Szychta, E.

    2008-01-01

    The article presents a mathematical model of thyristor inverter including a series-parallel resonant circuit with the aid of state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  15. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    OpenAIRE

    Miroslaw Luft; Elzbieta Szychta

    2008-01-01

    The article presents a mathematical model of thyristor inverter including a series-parallel resonant circuit with the aid of state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  16. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    Directory of Open Access Journals (Sweden)

    Miroslaw Luft

    2008-01-01

    Full Text Available The article presents a mathematical model of thyristor inverter including a series-parallel resonant circuit with the aid of state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.
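
The state-variable method these abstracts name amounts to writing the resonant tank as x' = f(t, x) and integrating numerically. The following is a hypothetical sketch in Python rather than Maple, with invented component values; it is not the authors' model, just an LCC (series-parallel) tank driven by an ideal square wave and stepped with classical RK4:

```python
import math

# Invented example parameters for a series-parallel resonant tank.
L, Cs, Cp = 100e-6, 1e-6, 1e-6   # series inductance (H), series/parallel capacitances (F)
R, Rload = 0.5, 50.0             # series loss and load resistance (ohm)
U, f_sw = 100.0, 15e3            # inverter square-wave amplitude (V) and switching frequency (Hz)

def derivs(t, x):
    """State equations; x = (inductor current, series-cap voltage, parallel-cap voltage)."""
    iL, vCs, vCp = x
    u = U if math.sin(2 * math.pi * f_sw * t) >= 0.0 else -U  # ideal square-wave source
    return ((u - R * iL - vCs - vCp) / L,   # L diL/dt
            iL / Cs,                        # Cs dvCs/dt
            (iL - vCp / Rload) / Cp)        # Cp dvCp/dt

def rk4(t, x, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = derivs(t, x)
    k2 = derivs(t + dt / 2, tuple(xi + dt / 2 * ki for xi, ki in zip(x, k1)))
    k3 = derivs(t + dt / 2, tuple(xi + dt / 2 * ki for xi, ki in zip(x, k2)))
    k4 = derivs(t + dt, tuple(xi + dt * ki for xi, ki in zip(x, k3)))
    return tuple(xi + dt / 6 * (a + 2 * b + 2 * c + d)
                 for xi, a, b, c, d in zip(x, k1, k2, k3, k4))

def simulate(t_end=2e-3, dt=5e-8):
    t, x, trace = 0.0, (0.0, 0.0, 0.0), []
    while t < t_end:
        x = rk4(t, x, dt)
        t += dt
        trace.append((t, x[0], x[2]))  # time, inductor current, load voltage
    return trace

trace = simulate()
print(max(abs(i) for _, i, _ in trace))  # peak inductor current (A)
```

The same state equations could be handed to a symbolic or stiff-ODE solver; the fixed-step RK4 here is only the simplest numerical stand-in for the Maple procedures mentioned.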

  17. The principles of computer hardware

    CERN Document Server

    Clements, Alan

    2000-01-01

    Principles of Computer Hardware, now in its third edition, provides a first course in computer architecture or computer organization for undergraduates. The book covers the core topics of such a course, including Boolean algebra and logic design; number bases and binary arithmetic; the CPU; assembly language; memory systems; and input/output methods and devices. It then goes on to cover the related topics of computer peripherals such as printers; the hardware aspects of the operating system; and data communications, and hence provides a broader overview of the subject. Its readable, tutorial-based approach makes it an accessible introduction to the subject. The book has extensive in-depth coverage of two microprocessors, one of which (the 68000) is widely used in education. All chapters in the new edition have been updated. Major updates include: powerful software simulations of digital systems to accompany the chapters on digital design; a tutorial-based introduction to assembly language, including many exam...

  18. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Computed tomography (CT) of the sinuses ... Computed tomography, more commonly known ...

  19. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; atlas of computed tomography of the normal adult; clinical application of computed tomography; and radiotherapy planning and computed tomography.

  20. SEEPAGE MODEL FOR PA INCLUDING DRIFT COLLAPSE

    International Nuclear Information System (INIS)

    C. Tsang

    2004-01-01

    The purpose of this report is to document the predictions and analyses performed using the seepage model for performance assessment (SMPA) for both the Topopah Spring middle nonlithophysal (Tptpmn) and lower lithophysal (Tptpll) lithostratigraphic units at Yucca Mountain, Nevada. Look-up tables of seepage flow rates into a drift (and their uncertainty) are generated by performing numerical simulations with the seepage model for many combinations of the three most important seepage-relevant parameters: the fracture permeability, the capillary-strength parameter 1/α, and the percolation flux. The percolation flux values chosen take into account flow focusing effects, which are evaluated based on a flow-focusing model. Moreover, multiple realizations of the underlying stochastic permeability field are conducted. Selected sensitivity studies are performed, including the effects of an alternative drift geometry representing a partially collapsed drift from an independent drift-degradation analysis (BSC 2004 [DIRS 166107]). The intended purpose of the seepage model is to provide results of drift-scale seepage rates under a series of parameters and scenarios in support of the Total System Performance Assessment for License Application (TSPA-LA). The SMPA is intended for the evaluation of drift-scale seepage rates under the full range of parameter values for three parameters found to be key (fracture permeability, the van Genuchten 1/α parameter, and percolation flux) and drift degradation shape scenarios in support of the TSPA-LA during the period of compliance for postclosure performance [Technical Work Plan for: Performance Assessment Unsaturated Zone (BSC 2002 [DIRS 160819], Section I-4-2-1)]. The flow-focusing model in the Topopah Spring welded (TSw) unit is intended to provide an estimate of flow focusing factors (FFFs) that (1) bridge the gap between the mountain-scale and drift-scale models, and (2) account for variability in local percolation flux due to

  1. Theories of computational complexity

    CERN Document Server

    Calude, C

    1988-01-01

    This volume presents four machine-independent theories of computational complexity, which have been chosen for their intrinsic importance and practical relevance. The book includes a wealth of results - classical, recent, and others which have not been published before.In developing the mathematics underlying the size, dynamic and structural complexity measures, various connections with mathematical logic, constructive topology, probability and programming theories are established. The facts are presented in detail. Extensive examples are provided, to help clarify notions and constructions. The lists of exercises and problems include routine exercises, interesting results, as well as some open problems.

  2. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  3. Parameterized algorithmics for computational social choice : nine research challenges

    NARCIS (Netherlands)

    Bredereck, R.; Chen, J.; Faliszewski, P.; Guo, J.; Niedermeier, R.; Woeginger, G.J.

    2014-01-01

    Computational Social Choice is an interdisciplinary research area involving Economics, Political Science, and Social Science on the one side, and Mathematics and Computer Science (including Artificial Intelligence and Multiagent Systems) on the other side. Typical computational problems studied in

  4. Cloud computing for comparative genomics

    Directory of Open Access Journals (Sweden)

    Pivovarov Rimma

    2010-05-01

    Full Text Available Abstract Background Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  5. Cloud computing for comparative genomics.

    Science.gov (United States)

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
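
    RSD pairs genes whose evolutionary distance is reciprocally minimal across two genomes. The reciprocal selection logic at its core can be sketched as follows (a simplified illustration over hypothetical precomputed distances; the published algorithm additionally derives these distances from alignments and maximum-likelihood estimation):

```python
def closest(query, dists):
    """Return the subject gene with the smallest distance to the query."""
    return min(dists[query], key=dists[query].get)

def reciprocal_smallest(dists_ab, dists_ba):
    """Pairs (a, b) where b is a's closest match in genome B
    and a is, reciprocally, b's closest match in genome A."""
    return [(a, closest(a, dists_ab)) for a in dists_ab
            if closest(closest(a, dists_ab), dists_ba) == a]

# Hypothetical evolutionary distances between genes of genomes A and B
dists_ab = {"geneA1": {"geneB1": 0.1, "geneB2": 0.8},
            "geneA2": {"geneB1": 0.7, "geneB2": 0.2}}
dists_ba = {"geneB1": {"geneA1": 0.1, "geneA2": 0.7},
            "geneB2": {"geneA1": 0.8, "geneA2": 0.2}}

print(reciprocal_smallest(dists_ab, dists_ba))
# → [('geneA1', 'geneB1'), ('geneA2', 'geneB2')]
```

    Because each genome pair is independent, such comparisons parallelize naturally, which is what makes the map-reduce farming across EC2 nodes described above effective.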

  6. Quantum computers: Definition and implementations

    International Nuclear Information System (INIS)

    Perez-Delgado, Carlos A.; Kok, Pieter

    2011-01-01

    The DiVincenzo criteria for implementing a quantum computer have been seminal in focusing both experimental and theoretical research in quantum-information processing. These criteria were formulated specifically for the circuit model of quantum computing. However, several new models for quantum computing (paradigms) have been proposed that do not seem to fit the criteria well. Therefore, the question arises: what are the general criteria for implementing quantum computers? To this end, a formal operational definition of a quantum computer is introduced. It is then shown that, according to this definition, a device is a quantum computer if it obeys the following criteria: Any quantum computer must consist of a quantum memory, with an additional structure that (1) facilitates a controlled quantum evolution of the quantum memory; (2) includes a method for information theoretic cooling of the memory; and (3) provides a readout mechanism for subsets of the quantum memory. The criteria are met when the device is scalable and operates fault tolerantly. We discuss various existing quantum computing paradigms and how they fit within this framework. Finally, we present a decision tree for selecting an avenue toward building a quantum computer. This is intended to help experimentalists determine the most natural paradigm given a particular physical implementation.

  7. Computing in Hydraulic Engineering Education

    Science.gov (United States)

    Duan, J. G.

    2011-12-01

    Civil engineers, pioneers of our civilization, are rarely perceived as leaders and innovators in modern society because of lags in technology innovation. This crisis has resulted in the decline of the prestige of the civil engineering profession, reduction of federal funding for deteriorating infrastructures, and problems with attracting the most talented high-school students. Infusing cutting-edge computer technology and stimulating creativity and innovation are therefore the critical challenges for civil engineering education. To better prepare our graduates to innovate, this paper discusses the adoption of a problem-based collaborative learning technique and the integration of civil engineering computing into a traditional civil engineering curriculum. Three interconnected courses, Open Channel Flow, Computational Hydraulics, and Sedimentation Engineering, were developed with an emphasis on computational simulations. In Open Channel Flow, the focus is on principles of free surface flow and the application of computational models. This prepares students for the second course, Computational Hydraulics, which introduces the fundamental principles of computational hydraulics, including finite difference and finite element methods. This course complements the Open Channel Flow class to provide students with an in-depth understanding of computational methods. The third course, Sedimentation Engineering, covers the fundamentals of sediment transport and river engineering, so students can apply the knowledge and programming skills gained from the previous courses to develop computational models for simulating sediment transport. These courses effectively equipped students with important skills and knowledge to complete thesis and dissertation research.
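
    The finite difference methods mentioned above can be illustrated with the simplest scheme of the kind taught in such courses: a first-order upwind discretization of a kinematic-wave-type transport equation (a minimal sketch with hypothetical values, not actual course material):

```python
def upwind_advect(h, c, dx, dt, steps):
    """Explicit first-order upwind scheme for h_t + c*h_x = 0 (c > 0).

    Stable when the Courant number c*dt/dx <= 1; the left boundary
    value is held fixed (inflow condition)."""
    cfl = c * dt / dx
    assert cfl <= 1.0, "CFL condition violated: reduce dt or increase dx"
    h = list(h)
    for _ in range(steps):
        h = [h[0]] + [h[i] - cfl * (h[i] - h[i - 1]) for i in range(1, len(h))]
    return h

# A step in water depth advected downstream; with cfl = 1 the scheme
# reproduces the exact solution (a pure shift of the profile).
h0 = [1.0] * 5 + [0.5] * 5
h1 = upwind_advect(h0, c=1.0, dx=1.0, dt=1.0, steps=3)
print(h1)  # → [1.0]*8 + [0.5]*2
```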

  8. 77 FR 18860 - Certain Consumer Electronics, Including Mobile Phones and Tablets; Notice of Receipt of Complaint...

    Science.gov (United States)

    2012-03-28

    ... INTERNATIONAL TRADE COMMISSION [DN 2885] Certain Consumer Electronics, Including Mobile Phones and.... International Trade Commission has received a complaint entitled Certain Consumer Electronics, Including Mobile... electronics, including mobile phones and tablets. The complaint names as respondents ASUSTeK Computer, Inc. of...

  9. Efficient Algorithms for Electrostatic Interactions Including Dielectric Contrasts

    Directory of Open Access Journals (Sweden)

    Christian Holm

    2013-10-01

    Full Text Available Coarse-grained models of soft matter are usually combined with implicit solvent models that take the electrostatic polarizability into account via a dielectric background. In biophysical or nanoscale simulations that include water, this constant can vary greatly within the system. Performing molecular dynamics or other simulations that need to compute exact electrostatic interactions between charges in those systems is computationally demanding. We review here several algorithms developed by us that perform exactly this task. For planar dielectric surfaces in partial periodic boundary conditions, the arising image charges can be either treated with the MMM2D algorithm in a very efficient and accurate way or with the electrostatic layer correction term, which enables the user to use their favorite 3D periodic Coulomb solver. Arbitrarily-shaped interfaces can be dealt with using induced surface charges with the induced charge calculation (ICC*) algorithm. Finally, the local electrostatics algorithm MEMD (Maxwell Equations Molecular Dynamics) even allows one to employ a smoothly varying dielectric constant in the systems. We introduce the concepts of these three algorithms and an extension for the inclusion of boundaries that are to be held fixed at a constant potential (metal conditions). For each method, we present a showcase application to highlight the importance of dielectric interfaces.
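
    The image charges that arise at planar dielectric interfaces can be illustrated with the textbook single-interface result (a minimal sketch in Gaussian units; function names and numeric values are illustrative, not part of the reviewed algorithms):

```python
import math

def image_charge(q, eps1, eps2):
    """Image charge for a point charge q in medium 1 near a planar
    interface with medium 2: q' = q * (eps1 - eps2) / (eps1 + eps2)."""
    return q * (eps1 - eps2) / (eps1 + eps2)

def potential_in_medium1(q, z_charge, eps1, eps2, x, z):
    """Potential at (x, z) in medium 1 (z > 0): the real charge at
    (0, z_charge) plus its image at (0, -z_charge), screened by eps1."""
    q_img = image_charge(q, eps1, eps2)
    r_real = math.hypot(x, z - z_charge)
    r_img = math.hypot(x, z + z_charge)
    return (q / r_real + q_img / r_img) / eps1

# A unit charge in water (eps ~ 80) near a low-dielectric region (eps ~ 2):
# the image has the same sign, so the charge is repelled from the interface.
print(image_charge(1.0, 80.0, 2.0))
```

    Methods such as MMM2D handle sums of such images efficiently under partial periodic boundary conditions; the point here is only the sign and magnitude of the polarization response.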

  10. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  11. International Conference on Computer, Communication and Computational Sciences

    CERN Document Server

    Mishra, Krishn; Tiwari, Shailesh; Singh, Vivek

    2017-01-01

    The exchange of information and innovative ideas is necessary to accelerate the development of technology. With the advent of technology, intelligent and soft computing techniques came into existence with a wide scope of implementation in engineering sciences. With this ideology in mind, this book includes insights that reflect the ‘Advances in Computer and Computational Sciences’ from upcoming researchers and leading academicians across the globe. It contains high-quality peer-reviewed papers of the ‘International Conference on Computer, Communication and Computational Sciences’ (ICCCCS 2016), held during 12-13 August, 2016 in Ajmer, India. These papers are arranged in the form of chapters. The content of the book is divided into two volumes that cover a variety of topics such as intelligent hardware and software design, advanced communications, power and energy optimization, intelligent techniques used in internet of things, intelligent image processing, advanced software engineering, evolutionary and ...

  12. Factors Influencing Organization Adoption Decision On Cloud Computing

    OpenAIRE

    Ailar Rahimli

    2013-01-01

    Cloud computing is a developing field, used by organizations that require computing resources to meet their organizational computing needs. The goal of this research is to evaluate the factors that influence an organization's decision to adopt cloud computing in Malaysia. Factors that relate to cloud computing adoption include: need for cloud computing, cost effectiveness, security effectiveness of cloud computing, and reliability. This paper evaluated the factors that influence on ado...

  13. 1st International Conference on Computational Intelligence and Informatics

    CERN Document Server

    Prasad, V; Rani, B; Udgata, Siba; Raju, K

    2017-01-01

    The book covers a variety of topics, which include data mining and data warehousing, high performance computing, parallel and distributed computing, computational intelligence, soft computing, big data, cloud computing, grid computing, cognitive computing, image processing, computer networks, wireless networks, social networks, wireless sensor networks, information and network security, web security, internet of things, bioinformatics and geoinformatics. The book is a collection of the best papers submitted to the First International Conference on Computational Intelligence and Informatics (ICCII 2016), held during 28-30 May 2016 at JNTUH CEH, Hyderabad, India. It was hosted by the Department of Computer Science and Engineering, JNTUH College of Engineering in association with Division V (Education & Research) CSI, India.

  14. Computational chemistry research

    Science.gov (United States)

    Levin, Eugene

    1987-01-01

    Task 41 is composed of two parts: (1) analysis and design studies related to the Numerical Aerodynamic Simulation (NAS) Extended Operating Configuration (EOC) and (2) computational chemistry. During the first half of 1987, Dr. Levin served as a member of an advanced system planning team to establish the requirements, goals, and principal technical characteristics of the NAS EOC. A paper entitled 'Scaling of Data Communications for an Advanced Supercomputer Network' is included. The high temperature transport properties (such as viscosity, thermal conductivity, etc.) of the major constituents of air (oxygen and nitrogen) were correctly determined. The results of prior ab initio computer solutions of the Schroedinger equation were combined with the best available experimental data to obtain complete interaction potentials for both neutral and ion-atom collision partners. These potentials were then used in a computer program to evaluate the collision cross-sections from which the transport properties could be determined. A paper entitled 'High Temperature Transport Properties of Air' is included.

  15. Quasicrystals and Quantum Computing

    Science.gov (United States)

    Berezin, Alexander A.

    1997-03-01

    In Quantum (Q) Computing qubits form Q-superpositions for macroscopic times. One scheme for ultra-fast (Q) computing can be based on quasicrystals. Ultrafast processing in Q-coherent structures (and the very existence of durable Q-superpositions) may be 'consequence' of presence of entire manifold of integer arithmetic (A0, aleph-naught of Georg Cantor) at any 4-point of space-time, furthermore, at any point of any multidimensional phase space of (any) N-particle Q-system. The latter, apart from quasicrystals, can include dispersed and/or diluted systems (Berezin, 1994). In such systems such alleged centrepieces of Q-Computing as ability for fast factorization of long integers can be processed by sheer virtue of the fact that entire infinite pattern of prime numbers is instantaneously available as 'free lunch' at any instant/point. Infinitely rich pattern of A0 (including pattern of primes and almost primes) acts as 'independent' physical effect which directly generates Q-dynamics (and physical world) 'out of nothing'. Thus Q-nonlocality can be ultimately based on instantaneous interconnectedness through ever- the-same structure of A0 ('Platonic field' of integers).

  16. Computer systems: What the future holds

    Science.gov (United States)

    Stone, H. S.

    1976-01-01

    Development of computer architecture is discussed in terms of the proliferation of the microprocessor, the utility of the medium-scale computer, and the sheer computational power of the large-scale machine. Changes in new applications brought about by ever-lowering costs, smaller sizes, and faster switching times are included.

  17. Skills and the appreciation of computer art

    Science.gov (United States)

    Boden, Margaret A.

    2016-04-01

    The appreciation of art normally includes recognition of the artist's skills in making it. Most people cannot appreciate computer art in that way, because they know little or nothing about coding. Various suggestions are made about how computer artists and/or curators might design and present computer art in such a way as to make the relevant making-skills more intelligible.

  18. Cognitive engineering in mental health computing

    NARCIS (Netherlands)

    Brinkman, W.P.

    2011-01-01

    Computer applications in support of mental health care and rehabilitation are becoming more widely used. They include technologies such as virtual reality, electronic diaries, multimedia, brain computing and computer games. Research in this area is emerging, and focussing on a variety of issues,

  19. Semiotics, Information Science, Documents and Computers.

    Science.gov (United States)

    Warner, Julian

    1990-01-01

    Discusses the relationship and value of semiotics to the established domains of information science. Highlights include documentation; computer operations; the language of computing; automata theory; linguistics; speech and writing; and the written language as a unifying principle for the document and the computer. (93 references) (LRW)

  20. Ordinateur et communication (Computer and Communication).

    Science.gov (United States)

    Mangenot, Francois

    1994-01-01

    Because use of computers in second-language classrooms may tend to decrease interpersonal interaction, and therefore communication, ways to promote interaction are offered. These include small group computer projects, and suggestions are made for use with various computer functions and features: tutorials, word processing, voice recording,…

  1. Computer Virus Bibliography, 1988-1989.

    Science.gov (United States)

    Bologna, Jack, Comp.

    This bibliography lists 14 books, 154 journal articles, 34 newspaper articles, and 3 research papers published during 1988-1989 on the subject of computer viruses, software protection and 'cures', virus hackers, and other related issues. Some of the sources listed include Computers and Security, Computer Security Digest, PC Week, Time, the New…

  2. Computers in Schools: White Boys Only?

    Science.gov (United States)

    Hammett, Roberta F.

    1997-01-01

    Discusses the role of computers in today's world and the construction of computer use attitudes, such as gender gaps. Suggests how schools might close the gaps. Includes a brief explanation about how facility with computers is important for women in their efforts to gain equitable treatment in all aspects of their lives. (PA)

  3. Computing Education in Korea--Current Issues and Endeavors

    Science.gov (United States)

    Choi, Jeongwon; An, Sangjin; Lee, Youngjun

    2015-01-01

    Computer education has been provided for a long period of time in Korea. Starting as a vocational program, the content of computer education for students evolved to include content on computer literacy, Information Communication Technology (ICT) literacy, and brand-new computer science. While a new curriculum related to computer science was…

  4. Digital Da Vinci computers in the arts and sciences

    CERN Document Server

    Lee, Newton

    2014-01-01

    Explores polymathic education through unconventional and creative applications of computer science in the arts and sciences Examines the use of visual computation, 3d printing, social robotics and computer modeling for computational art creation and design Includes contributions from leading researchers and practitioners in computer science, architecture and digital media

  5. Solving computationally expensive engineering problems

    CERN Document Server

    Leifsson, Leifur; Yang, Xin-She

    2014-01-01

    Computational complexity is a serious bottleneck for the design process in virtually any engineering area. While migration from prototyping and experimental-based design validation to verification using computer simulation models is inevitable and has a number of advantages, high computational costs of accurate, high-fidelity simulations can be a major issue that slows down the development of computer-aided design methodologies, particularly those exploiting automated design improvement procedures, e.g., numerical optimization. The continuous increase of available computational resources does not always translate into shortening of the design cycle because of the growing demand for higher accuracy and necessity to simulate larger and more complex systems. Accurate simulation of a single design of a given system may be as long as several hours, days or even weeks, which often makes design automation using conventional methods impractical or even prohibitive. Additional problems include numerical noise often pr...

  6. What do reversible programs compute?

    DEFF Research Database (Denmark)

    Axelsen, Holger Bock; Glück, Robert

    2011-01-01

    Reversible computing is the study of computation models that exhibit both forward and backward determinism. Understanding the fundamental properties of such models is not only relevant for reversible programming, but has also been found important in other fields, e.g., bidirectional model...... transformation, program transformations such as inversion, and general static prediction of program properties. Historically, work on reversible computing has focussed on reversible simulations of irreversible computations. Here, we take the viewpoint that the property of reversibility itself should...... are not strictly classically universal, but that they support another notion of universality; we call this RTM-universality. Thus, even though the RTMs are sub-universal in the classical sense, they are powerful enough as to include a self-interpreter. Lifting this to other computation models, we propose r...
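
    The forward and backward determinism discussed above can be made concrete with a toy instruction set in which every operation has an exact inverse (an illustrative sketch, not an actual reversible Turing machine):

```python
def run(program, state):
    """Run a list of (op, target, source) instructions forward.
    Each instruction is an invertible update of the state."""
    s = dict(state)
    for op, t, src in program:
        if op == "add":
            s[t] = s[t] + s[src]
        elif op == "sub":
            s[t] = s[t] - s[src]
        elif op == "swap":
            s[t], s[src] = s[src], s[t]
    return s

def invert(program):
    """Invert a program by reversing the instruction order and
    swapping add/sub (swap is its own inverse)."""
    inverse_op = {"add": "sub", "sub": "add", "swap": "swap"}
    return [(inverse_op[op], t, src) for op, t, src in reversed(program)]

prog = [("add", "x", "y"), ("swap", "x", "y"), ("sub", "y", "x")]
start = {"x": 3, "y": 5}
end = run(prog, start)
assert run(invert(prog), end) == start  # running backwards recovers the input
```

    Because each instruction is injective on states, the whole program is too; this is the forward/backward determinism that reversible computation models require by construction.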

  7. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  8. Discrete mathematics using a computer

    CERN Document Server

    Hall, Cordelia

    2000-01-01

    Several areas of mathematics find application throughout computer science, and all students of computer science need a practical working understanding of them. These core subjects are centred on logic, sets, recursion, induction, relations and functions. The material is often called discrete mathematics, to distinguish it from the traditional topics of continuous mathematics such as integration and differential equations. The central theme of this book is the connection between computing and discrete mathematics. This connection is useful in both directions: • Mathematics is used in many branches of computer science, in applications including program specification, data structures, design and analysis of algorithms, database systems, hardware design, reasoning about the correctness of implementations, and much more; • Computers can help to make the mathematics easier to learn and use, by making mathematical terms executable, making abstract concepts more concrete, and through the use of software tools su...

  9. User perspectives on computer applications

    International Nuclear Information System (INIS)

    Trammell, H.E.

    1979-04-01

    Experiences of a technical group that uses the services of computer centers are recounted. An orientation on the ORNL Engineering Technology Division and its missions is given to provide background on the diversified efforts undertaken by the Division and its opportunities to benefit from computer technology. Specific ways in which computers are used within the Division are described; these include facility control, data acquisition, data analysis, theory applications, code development, information processing, cost control, management of purchase requisitions, maintenance of personnel information, and control of technical publications. Problem areas found to need improvement are the overloading of computers during normal working hours, lack of code transportability, delay in obtaining routine programming, delay in key punching services, bewilderment in the use of large computer centers, complexity of job control language, and uncertain quality of software. 20 figures

  10. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  11. High energy physics and cloud computing

    International Nuclear Information System (INIS)

    Cheng Yaodong; Liu Baoxu; Sun Gongxing; Chen Gang

    2011-01-01

    High Energy Physics (HEP) has been a strong promoter of computing technology, for example the WWW (World Wide Web) and grid computing. In the new era of cloud computing, HEP still has a strong demand, and major international high energy physics laboratories have launched a number of projects to research cloud computing technologies and applications. This paper describes the current developments in cloud computing and its applications in high energy physics. Some ongoing projects in the institutes of high energy physics, Chinese Academy of Sciences, including cloud storage, virtual computing clusters, and BESⅢ elastic cloud, are also described briefly in the paper. (authors)

  12. Planning Computer-Aided Distance Learning

    Directory of Open Access Journals (Sweden)

    Nadja Dobnik

    1996-12-01

    Full Text Available The didactics of autonomous learning changes under the influence of new technologies. Computer technology can cover all the functions that a teacher develops in personal contact with the learner. People organizing distance learning must realize all the possibilities offered by computers. Computers can take over and also combine the functions of many tools and systems, e.g. typewriter, video, telephone. Thus the contents can be offered in the form of classic media by means of text, speech, picture, etc. Computers take over data processing and function as study materials. A computer included in a computer network can also function as a medium for interactive communication.

  13. High energy physics computing in Japan

    International Nuclear Information System (INIS)

    Watase, Yoshiyuki

    1989-01-01

    A brief overview of the computing provision for high energy physics in Japan is presented. Most of the computing power for high energy physics is concentrated in KEK. Here there are two large scale systems: one providing a general computing service including vector processing and the other dedicated to TRISTAN experiments. Each university group has a smaller sized mainframe or VAX system to facilitate both their local computing needs and the remote use of the KEK computers through a network. The large computer system for the TRISTAN experiments is described. An overview of a prospective future large facility is also given. (orig.)

  14. Computers as components principles of embedded computing system design

    CERN Document Server

    Wolf, Marilyn

    2012-01-01

    Computers as Components: Principles of Embedded Computing System Design, 3e, presents essential knowledge on embedded systems technology and techniques. Updated for today's embedded systems design methods, this edition features new examples including digital signal processing, multimedia, and cyber-physical systems. Author Marilyn Wolf covers the latest processors from Texas Instruments, ARM, and Microchip Technology plus software, operating systems, networks, consumer devices, and more. Like the previous editions, this textbook: Uses real processors to demonstrate both technology and tec

  15. Computational synthetic geometry

    CERN Document Server

    Bokowski, Jürgen

    1989-01-01

    Computational synthetic geometry deals with methods for realizing abstract geometric objects in concrete vector spaces. This research monograph considers a large class of problems from convexity and discrete geometry including constructing convex polytopes from simplicial complexes, vector geometries from incidence structures and hyperplane arrangements from oriented matroids. It turns out that algorithms for these constructions exist if and only if arbitrary polynomial equations are decidable with respect to the underlying field. Besides such complexity theorems a variety of symbolic algorithms are discussed, and the methods are applied to obtain new mathematical results on convex polytopes, projective configurations and the combinatorics of Grassmann varieties. Finally algebraic varieties characterizing matroids and oriented matroids are introduced providing a new basis for applying computer algebra methods in this field. The necessary background knowledge is reviewed briefly. The text is accessible to stud...

  16. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  17. Computer applications in thermochemistry

    International Nuclear Information System (INIS)

    Vana Varamban, S.

    1996-01-01

    Knowledge of equilibrium is needed in many practical situations. Simple stoichiometric calculations can be performed with hand calculators, but multi-component, multi-phase gas-solid chemical equilibrium calculations are far beyond conventional devices and methods; iterative techniques have to be resorted to. Such problems are most elegantly handled by the use of modern computers. This report demonstrates the possible use of computers for chemical equilibrium calculations in the fields of thermochemistry and chemical metallurgy. Four modules are explained: fitting the experimental C_p data and generating the thermal functions; performing equilibrium calculations for the defined conditions; preparing the elaborate input for the equilibrium calculation; and analysing the calculated results graphically. The principles of thermochemical calculations are briefly described. An extensive input guide is given. Several illustrations are included to help understanding and usage. (author)
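
    As a rough illustration of the first module described above (fitting C_p data to generate thermal functions), the sketch below fits made-up heat-capacity data to an assumed Maier-Kelley form C_p(T) = a + bT + c/T^2 by linear least squares and integrates the fit analytically to obtain an enthalpy increment. The data values and functional form are illustrative assumptions, not taken from the report.

```python
# Illustrative only: fit assumed Maier-Kelley form C_p(T) = a + b*T + c/T**2
# to made-up heat-capacity data, then integrate the fit analytically to get
# an enthalpy increment H(T) - H(298.15). Data values are hypothetical.
import numpy as np

T = np.array([300.0, 400.0, 500.0, 700.0, 900.0, 1100.0])   # temperature, K
cp = np.array([25.1, 26.4, 27.2, 28.3, 29.0, 29.5])          # C_p, J/(mol*K)

# Linear least squares: C_p is linear in the coefficients (a, b, c)
A = np.column_stack([np.ones_like(T), T, 1.0 / T**2])
(a, b, c), *_ = np.linalg.lstsq(A, cp, rcond=None)

def H_increment(T2, T1=298.15):
    """Integral of the fitted C_p dT from T1 to T2, in J/mol."""
    F = lambda t: a * t + 0.5 * b * t**2 - c / t   # antiderivative of C_p
    return F(T2) - F(T1)

print(f"a = {a:.3f}, b = {b:.3e}, c = {c:.3e}")
print(f"H(1000) - H(298.15) = {H_increment(1000.0):.0f} J/mol")
```

    Because the fit is linear in its coefficients, no iteration is needed at this stage; the iterative techniques the report mentions enter later, in the multi-phase equilibrium solve itself.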

  18. Computing with Mathematica

    CERN Document Server

    Hoft, Margret H

    2002-01-01

    Computing with Mathematica, 2nd edition is engaging and interactive. It is designed to teach readers how to use Mathematica efficiently for solving problems arising in fields such as mathematics, computer science, physics, and engineering. The text moves from simple to complex, often following a specific example on a number of different levels. This gradual increase in complexity allows readers to steadily build their competence without being overwhelmed. The 2nd edition of this acclaimed book features:* An enclosed CD for Mac and Windows that contains the entire text as a collection of Mathematica notebooks* Substantive real world examples* Challenging exercises, moving from simple to complex* A collection of interactive projects from a variety of applications "I really think this is an almost perfect text." -Stephen Brick, University of South Alabama

  19. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low-power, highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It covers aspects from the device to the system level, including magnetic memory cells, device modeling, hybrid circuit structures, design methodology, CAD tools, and technological integration methods. This book is accessible to a variety of readers, and little or no background in magnetism and spin electronics is required to understand its content. The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveal to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.

  20. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  1. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  2. Computing with impure numbers - Automatic consistency checking and units conversion using computer algebra

    Science.gov (United States)

    Stoutemyer, D. R.

    1977-01-01

    The computer algebra language MACSYMA enables the programmer to include symbolic physical units in computer calculations, and features automatic detection of dimensionally-inhomogeneous formulas and conversion of inconsistent units in a dimensionally homogeneous formula. Some examples illustrate these features.
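
    The idea the abstract describes can be sketched outside MACSYMA as well. Below is a minimal, illustrative Python sketch (not MACSYMA code) in which each quantity carries a dictionary of base-unit exponents: multiplication combines exponents, and addition rejects dimensionally inhomogeneous terms.

```python
# Minimal sketch (not MACSYMA): quantities carry a dict of base-unit
# exponents; multiplication combines exponents, and addition rejects
# dimensionally inhomogeneous terms, mimicking the automatic detection
# described above.
class Quantity:
    def __init__(self, value, units):
        self.value = value
        self.units = dict(units)          # e.g. {"m": 1, "s": -1}

    def __add__(self, other):
        if self.units != other.units:     # inhomogeneous formula detected
            raise ValueError(f"cannot add {self.units} and {other.units}")
        return Quantity(self.value + other.value, self.units)

    def __mul__(self, other):
        units = dict(self.units)
        for u, e in other.units.items():  # exponents add under multiplication
            units[u] = units.get(u, 0) + e
            if units[u] == 0:
                del units[u]              # drop cancelled dimensions
        return Quantity(self.value * other.value, units)

v = Quantity(3.0, {"m": 1, "s": -1})      # velocity, 3 m/s
t = Quantity(2.0, {"s": 1})               # time, 2 s
d = v * t                                 # distance: units reduce to {"m": 1}

try:
    d + t                                 # metres plus seconds: rejected
except ValueError as err:
    print("caught:", err)
```

    Running the script derives a distance from a velocity and a time, then flags the meaningless sum of a distance and a time; unit conversion, which MACSYMA also automates, would be an additional scaling step on top of this bookkeeping.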

  3. Quantum computing from the ground up

    CERN Document Server

    Perry, Riley Tipton

    2012-01-01

    Quantum computing - the application of quantum mechanics to information - represents a fundamental break from classical information and promises to dramatically increase a computer's power. Many difficult problems, such as the factorization of large numbers, have so far resisted attack by classical computers yet are easily solved with quantum computers. If they become feasible, quantum computers will end standard practices such as RSA encryption. Most of the books or papers on quantum computing require (or assume) prior knowledge of certain areas such as linear algebra or quantum mechanics. The majority of the currently-available literature is hard to understand for the average computer enthusiast or interested layman. This text attempts to teach quantum computing from the ground up in an easily readable way, providing a comprehensive tutorial that includes all the necessary mathematics, computer science and physics.

  4. Computational complementarity

    International Nuclear Information System (INIS)

    Finkelstein, D.; Finkelstein, S.R.

    1983-01-01

    Interactivity generates paradox in that the interactive control by one system C of predicates about another system-under-study S may falsify these predicates. An ''interactive logic'' is formulated to resolve this paradox of interactivity. The construction generalizes one, the Galois connection, used by Von Neumann for the similar quantum paradox. The construction is applied to a transition system, a concept that includes general systems, automata, and quantum systems. In some (classical) automata S, the interactive predicates about S show quantumlike complementarity arising from interactivity. The interactive paradox generates the quantum paradox. Some classical S's have noncommutative algebras of interactively observable coordinates similar to the Heisenberg algebra of a quantum system. Such S's are ''hidden variable'' models of quantum theory not covered by the hidden variable studies of Von Neumann, Bohm, Bell, or Kochen and Specker. It is conceivable that some quantum effects in Nature arise from interactivity. (author)

  5. Computational complementarity

    Energy Technology Data Exchange (ETDEWEB)

    Finkelstein, D; Finkelstein, S R

    1983-08-01

    Interactivity generates paradox in that the interactive control by one system C of predicates about another system-under-study S may falsify these predicates. An ''interactive logic'' is formulated to resolve this paradox of interactivity. The construction generalizes one, the Galois connection, used by Von Neumann for the similar quantum paradox. The construction is applied to a transition system, a concept that includes general systems, automata, and quantum systems. In some (classical) automata S, the interactive predicates about S show quantumlike complementarity arising from interactivity. The interactive paradox generates the quantum paradox. Some classical S's have noncommutative algebras of interactively observable coordinates similar to the Heisenberg algebra of a quantum system. Such S's are ''hidden variable'' models of quantum theory not covered by the hidden variable studies of Von Neumann, Bohm, Bell, or Kochen and Specker. It is conceivable that some quantum effects in Nature arise from interactivity.

  6. Advances in Computer Science and its Applications

    CERN Document Server

    Yen, Neil; Park, James; CSA 2013

    2014-01-01

    The theme of CSA is focused on the various aspects of computer science and its applications, and the conference provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of computer science and its applications. This book therefore includes various theories and practical applications in computer science and its applications.

  7. Quantum Computation--The Ultimate Frontier

    OpenAIRE

    Adami, Chris; Dowling, Jonathan P.

    2002-01-01

    The discovery of an algorithm for factoring which runs in polynomial time on a quantum computer has given rise to a concerted effort to understand the principles, advantages, and limitations of quantum computing. At the same time, many different quantum systems are being explored for their suitability to serve as a physical substrate for the quantum computer of the future. I discuss some of the theoretical foundations of quantum computer science, including algorithms and error correction, and...

  8. Advances in Computer Science and Engineering

    CERN Document Server

    Second International Conference on Advances in Computer Science and Engineering (CES 2012)

    2012-01-01

    This book includes the proceedings of the second International Conference on Advances in Computer Science and Engineering (CES 2012), which was held during January 13-14, 2012 in Sanya, China. The papers in these proceedings of CES 2012 focus on the researchers’ advanced works in their fields of Computer Science and Engineering mainly organized in four topics, (1) Software Engineering, (2) Intelligent Computing, (3) Computer Networks, and (4) Artificial Intelligence Software.

  9. Why Don't All Professors Use Computers?

    Science.gov (United States)

    Drew, David Eli

    1989-01-01

    Discusses the adoption of computer technology at universities and examines reasons why some professors don't use computers. Topics discussed include computer applications, including artificial intelligence, social science research, statistical analysis, and cooperative research; appropriateness of the technology for the task; the Computer Aptitude…

  10. GPU Computing Gems Emerald Edition

    CERN Document Server

    Hwu, Wen-mei W

    2011-01-01

    "...the perfect companion to Programming Massively Parallel Processors by Hwu & Kirk." -Nicolas Pinto, Research Scientist at Harvard & MIT, NVIDIA Fellow 2009-2010 Graphics processing units (GPUs) can do much more than render graphics. Scientists and researchers increasingly look to GPUs to improve the efficiency and performance of computationally-intensive experiments across a range of disciplines. GPU Computing Gems: Emerald Edition brings their techniques to you, showcasing GPU-based solutions including: Black hole simulations with CUDA GPU-accelerated computation and interactive display of

  11. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  12. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect -- including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affective computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news

  13. Practical advantages of evolutionary computation

    Science.gov (United States)

    Fogel, David B.

    1997-10-01

    Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.
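
    The self-adaptation mentioned above can be illustrated with a toy (1+1) evolution strategy that adjusts its mutation step size on the fly via the classic 1/5-success rule; this is a generic sketch, not code from the paper.

```python
# Toy (1+1) evolution strategy minimising a sphere function; the mutation
# step size sigma self-adapts on the fly via the classic 1/5-success rule.
# Generic illustration, not code from the paper.
import random

random.seed(1)

def sphere(x):
    return sum(xi * xi for xi in x)

x = [5.0, -3.0, 4.0]        # parent solution
sigma = 1.0                  # mutation step size (adapted during the run)
fx = sphere(x)

for _ in range(2000):
    child = [xi + random.gauss(0.0, sigma) for xi in x]
    fc = sphere(child)
    if fc <= fx:             # success: keep child, widen the search
        x, fx = child, fc
        sigma *= 1.22
    else:                    # failure: narrow the search
        sigma *= 0.95

print(f"best objective value: {fx:.3e}")
```

    The flexibility the paper highlights shows up here: nothing in the loop assumes the objective is differentiable or even continuous, and the step-size rule tunes the search without any problem-specific knowledge.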

  14. COMPUTATIONAL MODELS FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Monendra Grover; Rajesh Kumar; Tapan Kumar Mondal; S. Rajkumar

    2011-01-01

    Genetic erosion is a serious problem and computational models have been developed to prevent it. The computational modeling in this field not only includes (terrestrial) reserve design, but also decision modeling for related problems such as habitat restoration, marine reserve design, and nonreserve approaches to conservation management. Models have been formulated for evaluating tradeoffs between socioeconomic, biophysical, and spatial criteria in establishing marine reserves. The percolatio...

  15. Ionic liquids, electrolyte solutions including the ionic liquids, and energy storage devices including the ionic liquids

    Science.gov (United States)

    Gering, Kevin L.; Harrup, Mason K.; Rollins, Harry W.

    2015-12-08

    An ionic liquid including a phosphazene compound that has a plurality of phosphorus-nitrogen units and at least one pendant group bonded to each phosphorus atom of the plurality of phosphorus-nitrogen units. One pendant group of the at least one pendant group comprises a positively charged pendant group. Additional embodiments of ionic liquids are disclosed, as are electrolyte solutions and energy storage devices including the embodiments of the ionic liquid.

  16. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; NA NA NA Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96)

  17. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  18. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  19. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  20. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  1. Olkiluoto surface hydrological modelling: Update 2012 including salt transport modelling

    International Nuclear Information System (INIS)

    Karvonen, T.

    2013-11-01

    Posiva Oy is responsible for implementing a final disposal program for spent nuclear fuel of its owners Teollisuuden Voima Oyj and Fortum Power and Heat Oy. The spent nuclear fuel is planned to be disposed at a depth of about 400-450 meters in the crystalline bedrock at the Olkiluoto site. Leakages located at or close to the spent fuel repository may give rise to the upconing of deep highly saline groundwater, and this is a concern with regard to the performance of the tunnel backfill material after the closure of the tunnels. Therefore a salt transport sub-model was added to the Olkiluoto surface hydrological model (SHYD). The other improvements include an update of the particle tracking algorithm and the possibility of estimating the influence of open drillholes in a case where overpressure in inflatable packers decreases, causing a hydraulic short-circuit between hydrogeological zones HZ19 and HZ20 along the drillhole. Four new hydrogeological zones HZ056, HZ146, BFZ100 and HZ039 were added to the model. In addition, zones HZ20A and HZ20B intersect with each other in the new structure model, which influences salinity upconing caused by leakages in shafts. The aim of the modelling of the long-term influence of ONKALO, shafts and repository tunnels is to provide computational results that can be used to suggest limits for allowed leakages. The model input data included all the existing leakages into ONKALO (35-38 l/min) and shafts under present-day conditions. The influence of shafts was computed using eight different values for total shaft leakage: 5, 11, 20, 30, 40, 50, 60 and 70 l/min. The selection of the leakage criteria for shafts was influenced by the fact that upconing of saline water increases TDS-values close to the repository areas although HZ20B does not intersect any deposition tunnels. The total limit for all leakages was suggested to be 120 l/min. The limit for HZ20 zones was proposed to be 40 l/min: about 5 l/min the present day leakages to access tunnel, 25 l/min from

  2. Olkiluoto surface hydrological modelling: Update 2012 including salt transport modelling

    Energy Technology Data Exchange (ETDEWEB)

    Karvonen, T. [WaterHope, Helsinki (Finland)

    2013-11-15

    Posiva Oy is responsible for implementing a final disposal program for spent nuclear fuel of its owners Teollisuuden Voima Oyj and Fortum Power and Heat Oy. The spent nuclear fuel is planned to be disposed at a depth of about 400-450 meters in the crystalline bedrock at the Olkiluoto site. Leakages located at or close to the spent fuel repository may give rise to the upconing of deep highly saline groundwater, and this is a concern with regard to the performance of the tunnel backfill material after the closure of the tunnels. Therefore a salt transport sub-model was added to the Olkiluoto surface hydrological model (SHYD). The other improvements include an update of the particle tracking algorithm and the possibility of estimating the influence of open drillholes in a case where overpressure in inflatable packers decreases, causing a hydraulic short-circuit between hydrogeological zones HZ19 and HZ20 along the drillhole. Four new hydrogeological zones HZ056, HZ146, BFZ100 and HZ039 were added to the model. In addition, zones HZ20A and HZ20B intersect with each other in the new structure model, which influences salinity upconing caused by leakages in shafts. The aim of the modelling of the long-term influence of ONKALO, shafts and repository tunnels is to provide computational results that can be used to suggest limits for allowed leakages. The model input data included all the existing leakages into ONKALO (35-38 l/min) and shafts under present-day conditions. The influence of shafts was computed using eight different values for total shaft leakage: 5, 11, 20, 30, 40, 50, 60 and 70 l/min. The selection of the leakage criteria for shafts was influenced by the fact that upconing of saline water increases TDS-values close to the repository areas although HZ20B does not intersect any deposition tunnels. The total limit for all leakages was suggested to be 120 l/min. The limit for HZ20 zones was proposed to be 40 l/min: about 5 l/min the present day leakages to access tunnel, 25 l/min from

  3. Control rod calibration including the rod coupling effect

    International Nuclear Information System (INIS)

    Szilard, R.; Nelson, G.W.

    1984-01-01

    In a reactor containing more than one control rod, which includes all reactors licensed in the United States, there will be a 'coupling' or 'shadowing' of control rod flux at the location of a control rod as a result of the flux depression caused by another control rod. It was decided to investigate this phenomenon further, and eventually to put calibration table data or formulae in a small computer in the control room, so one could insert the positions of the three control rods and receive the excess reactivity without referring to separate tables. For this to be accomplished, a 'three-control-rod reactivity function' would be used which would include the flux coupling between the rods. The function was designed, and measured data were fitted to it to determine the calibration constants. The input data for fitting the trial functions consisted of 254 data points, each consisting of the position of the reg, shim, and transient rods, and the total excess reactivity. (About 200 of these points were 'critical balance points', that is, the rod positions for which the reactor was critical, and the remainder were determined by positive period measurements.) Although this may be unrealistic from a physical viewpoint, the function derived gave a very accurate recalculation of the input data, and thus would faithfully give the excess reactivity for any possible combination of the locations of the three control rods. The next step, incorporation of the three-rod function into the minicomputer, will be pursued in the summer and fall of 1984.
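
    The record does not give the actual three-control-rod function, so the fitting step can only be sketched with a hypothetical linear-in-coefficients form: a cubic worth curve per rod plus pairwise product terms for the coupling, fitted to synthetic stand-in calibration points by linear least squares.

```python
# Hypothetical sketch of the fitting step: the actual three-control-rod
# function is not given in the record, so assume a linear-in-coefficients
# form (a cubic worth curve per rod plus pairwise product terms for the
# coupling) and fit it to synthetic calibration points by least squares.
import numpy as np

rng = np.random.default_rng(0)

def features(p):
    """p = (reg, shim, transient) rod positions, normalised to [0, 1]."""
    r1, r2, r3 = p
    singles = [r1, r1**2, r1**3, r2, r2**2, r2**3, r3, r3**2, r3**3]
    coupling = [r1 * r2, r1 * r3, r2 * r3]   # assumed shadowing terms
    return singles + coupling

# Synthetic stand-in for the 254 measured calibration points
true_w = rng.normal(size=12)
P = rng.uniform(0.0, 1.0, size=(254, 3))
X = np.array([features(p) for p in P])
rho = X @ true_w + rng.normal(scale=1e-3, size=254)  # small measurement noise

w, *_ = np.linalg.lstsq(X, rho, rcond=None)

def excess_reactivity(p):
    """Recalculated excess reactivity for any three-rod configuration."""
    return float(np.dot(features(p), w))

print("max coefficient error:", np.max(np.abs(w - true_w)))
```

    Once fitted, `excess_reactivity` plays the role of the control-room lookup: insert three rod positions and read back the excess reactivity without consulting separate tables.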

  4. Propulsion controlled aircraft computer

    Science.gov (United States)

    Cogan, Bruce R. (Inventor)

    2010-01-01

    A low-cost, easily retrofit Propulsion Controlled Aircraft (PCA) system for use on a wide range of commercial and military aircraft consists of a propulsion controlled aircraft computer that reads in aircraft data, including aircraft state, pilot commands and other related data; calculates aircraft throttle position for a given maneuver commanded by the pilot; and then displays both current and calculated throttle position on a cockpit display to show the pilot where to move the throttles to achieve the commanded maneuver, or sends the calculated position digitally to command the engines directly.

  5. Fault Tolerant Computer Architecture

    CERN Document Server

    Sorin, Daniel

    2009-01-01

    For many years, most computer architects have pursued one primary goal: performance. Architects have translated the ever-increasing abundance of ever-faster transistors provided by Moore's law into remarkable increases in performance. Recently, however, the bounty provided by Moore's law has been accompanied by several challenges that have arisen as devices have become smaller, including a decrease in dependability due to physical faults. In this book, we focus on the dependability challenge and the fault tolerance solutions that architects are developing to overcome it. The two main purposes

  6. IGMtransmission: Transmission curve computation

    Science.gov (United States)

    Harrison, Christopher M.; Meiksin, Avery; Stock, David

    2015-04-01

    IGMtransmission is a Java graphical user interface that implements Monte Carlo simulations to compute the corrections to colors of high-redshift galaxies due to intergalactic attenuation based on current models of the Intergalactic Medium. The effects of absorption due to neutral hydrogen are considered, with particular attention to the stochastic effects of Lyman Limit Systems. Attenuation curves are produced, as well as colors for a wide range of filter responses and model galaxy spectra. Photometric filters are included for the Hubble Space Telescope, the Keck telescope, the Mt. Palomar 200-inch, the SUBARU telescope and UKIRT; alternative filter response curves and spectra may be readily uploaded.

  7. Optical Computing Research.

    Science.gov (United States)

    1987-10-30

    ...1489-1496, 1985. 13. W.T. Welford and R. Winston, The Optics of Nonimaging Concentrators, Academic Press, New York, N.Y., 1978 (see Appendix A). 14. R.H. ... [Remainder of record is garbled OCR of the DTIC report documentation page: Optical Computing Research, Stanford University Electronics Labs, AFOSR technical report dated 30 October 1987.]

  8. CLOUD COMPUTING SECURITY ISSUES

    Directory of Open Access Journals (Sweden)

    Florin OGIGAU-NEAMTIU

    2012-01-01

    The term “cloud computing” has been in the spotlight of IT specialists in recent years because of its potential to transform this industry. The promised benefits have led companies to invest great sums of money in researching and developing this domain, and great steps have been made towards implementing this technology. Managers have traditionally viewed IT as difficult and expensive, and the promise of cloud computing leads many to think that IT will now be easy and cheap. The reality is that cloud computing has simplified some technical aspects of building computer systems, but the myriad challenges facing the IT environment still remain. Organizations which consider adopting cloud-based services must also understand the many major problems of information policy, including issues of privacy, security, reliability, access, and regulation. The goal of this article is to identify the main security issues and to draw the attention of both decision makers and users to the potential risks of moving data into “the cloud”.

  9. Cloud Computing: An Overview

    Directory of Open Access Journals (Sweden)

    Libor Sarga

    2012-10-01

    As cloud computing is gaining acclaim as a cost-effective alternative to acquiring processing resources for corporations, scientific applications and individuals, various challenges are rapidly coming to the fore. While academia struggles to procure a concise definition, corporations are more interested in the competitive advantages it may generate, and individuals view it as a way of speeding up data access times or a convenient backup solution. Properties of the cloud architecture largely preclude the use of existing practices, while achieving end-users’ and companies’ compliance requires considering multiple infrastructural as well as commercial factors, such as sustainability in case of cloud-side interruptions, identity management and off-site corporate data handling policies. The article overviews recent attempts at formal definitions of cloud computing, summarizes and critically evaluates proposed delimitations, and specifies challenges associated with its further proliferation. Based on the conclusions, future directions in the field of cloud computing are also briefly hypothesized to include a deeper focus on community clouds and the bolstering of innovative cloud-enabled platforms and devices such as tablets and smart phones, as well as entertainment applications.

  10. Cloud Computing Law

    CERN Document Server

    Millard, Christopher

    2013-01-01

    This book is about the legal implications of cloud computing. In essence, ‘the cloud’ is a way of delivering computing resources as a utility service via the internet. It is evolving very rapidly with substantial investments being made in infrastructure, platforms and applications, all delivered ‘as a service’. The demand for cloud resources is enormous, driven by such developments as the deployment on a vast scale of mobile apps and the rapid emergence of ‘Big Data’. Part I of this book explains what cloud computing is and how it works. Part II analyses contractual relationships between cloud service providers and their customers, as well as the complex roles of intermediaries. Drawing on primary research conducted by the Cloud Legal Project at Queen Mary University of London, cloud contracts are analysed in detail, including the appropriateness and enforceability of ‘take it or leave it’ terms of service, as well as the scope for negotiating cloud deals. Specific arrangements for public sect...

  11. Computational fluid dynamic applications

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S.-L.; Lottes, S. A.; Zhou, C. Q.

    2000-04-03

    The rapid advancement of computational capability, including speed and memory size, has prompted the wide use of computational fluid dynamics (CFD) codes to simulate complex flow systems. CFD simulations are used to study the operating problems encountered in a system, to evaluate the impacts of operation/design parameters on the performance of a system, and to investigate novel design concepts. CFD codes are generally developed based on the conservation laws of mass, momentum, and energy that govern the characteristics of a flow. The governing equations are simplified and discretized for a selected computational grid system. Numerical methods are selected to simplify and calculate approximate flow properties. For turbulent, reacting, and multiphase flow systems, the complex processes relating to these aspects of the flow, i.e., turbulent diffusion, combustion kinetics, interfacial drag and heat and mass transfer, etc., are described in mathematical models, based on a combination of fundamental physics and empirical data, that are incorporated into the code. CFD simulation has been applied to a large variety of practical and industrial scale flow systems.
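
    As a minimal illustration of discretizing a conservation law on a grid, the toy sketch below marches the 1-D diffusion equation u_t = alpha * u_xx with an explicit finite-difference scheme on a periodic grid, so the grid total of the conserved quantity stays constant. It is a generic example, not taken from any particular CFD code.

```python
# Toy discretisation of a conservation law: the 1-D diffusion equation
# u_t = alpha * u_xx on a periodic grid with an explicit scheme. The
# update conserves the grid total of u exactly, which is one basic check
# applied to codes built on conservation laws.
n, alpha, dx, dt = 50, 1.0, 1.0, 0.2
c = alpha * dt / dx**2                 # stability requires c <= 0.5
u = [0.0] * n
u[n // 2] = 100.0                      # initial hot spot

for _ in range(500):
    un = u[:]                          # previous time level
    for i in range(n):
        u[i] = un[i] + c * (un[(i + 1) % n] - 2.0 * un[i] + un[(i - 1) % n])

print(f"conserved grid total: {sum(u):.6f}")
```

    Production CFD codes add turbulence, reaction, and multiphase sub-models on top of this same skeleton: discretize the conservation laws, march the solution, and verify that the conserved quantities balance.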

  12. Catalyst support structure, catalyst including the structure, reactor including a catalyst, and methods of forming same

    Science.gov (United States)

    Van Norman, Staci A.; Aston, Victoria J.; Weimer, Alan W.

    2017-05-09

    Structures, catalysts, and reactors suitable for use for a variety of applications, including gas-to-liquid and coal-to-liquid processes and methods of forming the structures, catalysts, and reactors are disclosed. The catalyst material can be deposited onto an inner wall of a microtubular reactor and/or onto porous tungsten support structures using atomic layer deposition techniques.

  13. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented at the 19th Advanced Computer Systems conference, ACS-2014. From its beginning, the Advanced Computer Systems conference concentrated on methods and algorithms of artificial intelligence. Over time, new areas of interest emerged in technical informatics related to soft computing, along with more technological aspects of computer science such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security, and Software Technologies.

  14. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  15. Computers: Instruments of Change.

    Science.gov (United States)

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  16. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  17. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  18. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    Computational Modeling. NREL uses computational modeling to study plant cell walls, which are the source of biofuels and biomaterials; our modeling investigates their properties. Under Quantum Mechanical Models, NREL studies chemical and electronic properties and processes to reduce barriers.

  19. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  20. 76 FR 45860 - In the Matter of Certain Electronic Devices, Including Wireless Communication Devices, Portable...

    Science.gov (United States)

    2011-08-01

    ..., Including Wireless Communication Devices, Portable Music and Data Processing Devices, and Tablet Computers... electronic devices, including wireless communication devices, portable music and data processing devices, and...''). The complaint further alleges that an industry in the United States exists or is in the process of...