WorldWideScience

Sample records for possibly including computer

  1. Computing possibilities in the mid 1990s

    International Nuclear Information System (INIS)

    Nash, T.

    1988-09-01

    This paper describes the kind of computing resources it may be possible to make available for experiments in high energy physics in the mid and late 1990s. We outline some of the work going on today, particularly at Fermilab's Advanced Computer Program, that projects to the future. We attempt to define areas in which coordinated R and D efforts should prove fruitful to provide for on- and off-line computing in the SSC era. Because of the extraordinary components anticipated from industry, we can be optimistic even to the level of predicting million-VAX-equivalent on-line multiprocessor/data acquisition systems for SSC detectors. Managing this scale of computing will require a new approach to large hardware and software systems. 15 refs., 6 figs

  2. The NEA computer program library: a possible GDMS application

    International Nuclear Information System (INIS)

    Schuler, W.

    1978-01-01

    The NEA Computer Program Library maintains a series of eleven sequential computer files, used for linked applications in managing its stock of computer codes for nuclear reactor calculations, storing index and program abstract information, and administering its service to requesters. The high data redundancy between the files suggests that a database approach would be valid, and this paper proposes a possible 'schema' for a CODASYL GDMS

  3. On the problem of possibilities of X-ray computer tomography in the diagnosis of endophytic tumors of the stomach

    International Nuclear Information System (INIS)

    Gorshkov, A.N.; Akberov, R.F.

    1996-01-01

    The possibilities of X-ray computer tomography in the diagnosis of endophytic tumors of the stomach, including tumors of small size, are considered on the basis of examinations of 100 patients with stomach diseases. The computer-tomographic semiotics of small endophytic tumors of the stomach is presented, and the place of computer tomography in the diagnosis of stomach tumors, as well as its potential for revealing small tumors with predominantly endophytic spread, is discussed. 10 refs.; 3 figs

  4. Theoretical calculation possibilities of the computer code HAMMER

    International Nuclear Information System (INIS)

    Onusic Junior, J.

    1978-06-01

    With the aim of establishing the theoretical calculation possibilities of the computer code HAMMER, developed at the Savannah River Laboratory, an analysis of critical cell assemblies of the kind utilized in PWR reactors is made. (L.F.S.) [pt]

  5. Computer simulation of forest fire and its possible usage

    International Nuclear Information System (INIS)

    Halada, L.; Weisenpacher, P.; Glasa, J.

    2005-01-01

    In this presentation the authors deal with computer modelling of forest fires and discuss its possible uses. Results of the modelling are compared with a real forest fire that occurred in the Slovensky Raj (Slovak Paradise) National Park in the year 2000

  6. Infinite possibilities: Computational structures technology

    Science.gov (United States)

    Beam, Sherilee F.

    1994-12-01

    Computational Fluid Dynamics (or CFD) methods are very familiar to the research community. Even the general public has had some exposure to CFD images, primarily through the news media. However, very little attention has been paid to CST--Computational Structures Technology. Yet, no important design can be completed without it. During the first half of this century, researchers only dreamed of designing and building structures on a computer. Today their dreams have become practical realities as computational methods are used in all phases of design, fabrication and testing of engineering systems. Increasingly complex structures can now be built in even shorter periods of time. Over the past four decades, computer technology has been developing, and early finite element methods have grown from small in-house programs to numerous commercial software programs. When coupled with advanced computing systems, they help engineers make dramatic leaps in designing and testing concepts. The goals of CST include: predicting how a structure will behave under actual operating conditions; designing and complementing other experiments conducted on a structure; investigating microstructural damage or chaotic, unpredictable behavior; helping material developers in improving material systems; and being a useful tool in design systems optimization and sensitivity techniques. Applying CST to a structure problem requires five steps: (1) observe the specific problem; (2) develop a computational model for numerical simulation; (3) develop and assemble software and hardware for running the codes; (4) post-process and interpret the results; and (5) use the model to analyze and design the actual structure. Researchers in both industry and academia continue to make significant contributions to advance this technology with improvements in software, collaborative computing environments and supercomputing systems. As these environments and systems evolve, computational structures technology will

  7. Computing possible worlds in the history of modern astronomy

    Directory of Open Access Journals (Sweden)

    Osvaldo Pessoa Jr.

    2016-09-01

    http://dx.doi.org/10.5007/1808-1711.2016v20n1p117 As part of an ongoing study of causal models in the history of science, a counterfactual scenario in the history of modern astronomy is explored with the aid of computer simulations. After the definition of “linking advance”, a possible world involving technological antecedence is described, branching out in 1510, in which the telescope is invented 70 years before its actual construction, at the time at which Fracastoro actually built the first prototelescope. By using the principle of the closest possible world (PCP), we estimate that in this scenario the discovery of the elliptical orbit of Mars would be anticipated by only 28 years. The second part of the paper involves an estimate of the probability of the previous scenario, guided by the principle that the actual world is the mean (PAM) and using computer simulations to create possible worlds in which the time spans between advances are varied according to a gamma distribution function. Taking into account the importance of the use of the diaphragm for the invention of the telescope, the probability that the telescope would be built by 1538 for a branching time at 1510 is found to be smaller than 1%. The work shows that one of the important features of computational simulations in philosophy of science is to serve as a consistency check for the intuitions and speculations of the philosopher.
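
    The gamma-sampling procedure described above can be sketched in a few lines of Python. This is a minimal illustration of the Monte Carlo idea only; the shape parameter, mean time span and number of intervening advances below are hypothetical placeholders, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical parameters: gamma-distributed time spans (in years) between
    # successive advances, and the number of advances separating the 1510
    # branch point from a working telescope.
    shape, mean_span, n_advances = 2.0, 14.0, 4
    scale = mean_span / shape

    trials = 100_000
    # total elapsed time after the 1510 branch in each simulated world
    elapsed = rng.gamma(shape, scale, size=(trials, n_advances)).sum(axis=1)
    p = np.mean(1510.0 + elapsed <= 1538.0)
    print(f"P(telescope built by 1538) ~ {p:.4f}")
    ```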

  8. The Impossibility of the Counterfactual Computation for all Possible Outcomes

    OpenAIRE

    Vaidman, Lev

    2006-01-01

    A recent proposal for counterfactual computation [Hosten et al., Nature, 439, 949 (2006)] is analyzed. It is argued that the method does not provide counterfactual computation for all possible outcomes. The explanation involves a novel paradoxical feature of pre- and post-selected quantum particles: a particle can reach a certain location without being on the path that leads to this location.

  9. Possibilities of computer tomography in multiple sclerosis

    International Nuclear Information System (INIS)

    Vymazal, J.; Bauer, J.

    1983-01-01

    Computed tomography was performed in 41 patients with multiple sclerosis, the average age of the patients being 40.8 years. Native examinations were made in 17 patients, examinations with contrast medium in 19, and both methods were used in the examination of 5 patients. In 26 patients, i.e. in almost two-thirds, cerebral atrophy was found, in 11 of a severe type. In 9 patients atrophy affected only the hemispheres, in 16 also the stem and cerebellum. The stem and cerebellum only were affected in 1 patient. Hypodense foci were found in 21 patients, i.e. more than half of those examined; in 9 there were multiple foci. In most of the 19 examined patients the hypodense changes were in the hemispheres, and only in 2 in the cerebellum and brain stem. No hyperdense changes were detected. The value and possibilities of computed tomography examinations in multiple sclerosis are discussed. (author)

  10. A design of a computer complex including vector processors

    International Nuclear Information System (INIS)

    Asai, Kiyoshi

    1982-12-01

    We, members of the Computing Center of the Japan Atomic Energy Research Institute, have been engaged for the past six years in research on the adaptability of vector processing to large-scale nuclear codes. The research has been done in collaboration with researchers and engineers of JAERI and a computer manufacturer. In this research, forty large-scale nuclear codes were investigated from the viewpoint of vectorization. Among them, twenty-six codes were actually vectorized and executed. As a result of the investigation, it is now estimated that about seventy percent of nuclear codes, and seventy percent of the total CPU time at JAERI, are highly vectorizable. Based on the data obtained by the investigation, (1) currently vectorizable CPU time, (2) the necessary number of vector processors, (3) the manpower necessary for vectorization of nuclear codes, (4) the computing speed, memory size, number of parallel I/O paths, and size and speed of the I/O buffer of a vector processor suitable for our applications, and (5) the necessary software and operational policy for use of vector processors are discussed, and finally (6) a computer complex including vector processors is presented in this report. (author)

  11. 78 FR 1247 - Certain Electronic Devices, Including Wireless Communication Devices, Tablet Computers, Media...

    Science.gov (United States)

    2013-01-08

    ... Wireless Communication Devices, Tablet Computers, Media Players, and Televisions, and Components Thereof... devices, including wireless communication devices, tablet computers, media players, and televisions, and... wireless communication devices, tablet computers, media players, and televisions, and components thereof...

  12. CERN’s Computing rules updated to include policy for control systems

    CERN Multimedia

    IT Department

    2008-01-01

    The use of CERN’s computing facilities is governed by rules defined in Operational Circular No. 5 and its subsidiary rules of use. These rules are available from the web site http://cern.ch/ComputingRules. Please note that the subsidiary rules for Internet/Network use have been updated to include a requirement that control systems comply with the CNIC (Computing and Network Infrastructure for Control) Security Policy. The security policy for control systems, which was approved earlier this year, can be accessed at https://edms.cern.ch/document/584092

  13. Possibilities of the new hybrid technology single photon emission computed tomography/computed tomography (SPECT/CT) and the first impressions of its application

    International Nuclear Information System (INIS)

    Kostadinova, I.

    2010-01-01

    With the help of the new hybrid technique SPECT/CT it is possible, with a single investigation, to acquire a combined image of the investigated organ, visualizing both its function and its structure. The new multimodality method combines the possibilities of single photon emission computed tomography (SPECT) and computed tomography (CT), making it possible to precisely localize pathologically changed organ function. With the further combination of the tomographic gamma camera with diagnostic CT, a detailed morphological evaluation of the finding becomes possible. The main clinical applications of the new hybrid diagnostics are in the fields of cardiology, oncology and orthopedics, with increasing extension to applications not connected with oncology, such as thyroid, parathyroid and brain studies (especially localization of epileptic foci), visualization of local infection and, recently, radiotherapy planning. According to the literature data, around 35% of SPECT investigations have to be combined with CT in order to increase the specificity of the diagnosis, which changes the interpretation of the result in 56% of the cases. After installation of the SPECT/CT camera in the University hospital 'Alexandrovska' in January 2009, the following changes have occurred: the number of investigated patients has increased, including the number of heart, thyroid (especially scintigraphy with 131I), bone and parathyroid gland studies. As a result of the application of the hybrid technique, the investigation time was shortened and the price decreased in comparison with the individual application of the investigations. Summarizing the literature data and the preliminary impressions of the first multimodality scanner in our country, in the University hospital 'Alexandrovska', it can be said that there is continuously increasing information on new clinical applications of SPECT/CT. It is now accepted that its usage will increase in

  14. PTAC: a computer program for pressure-transient analysis, including the effects of cavitation. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Kot, C A; Youngdahl, C K

    1978-09-01

    PTAC was developed to predict pressure transients in nuclear-power-plant piping systems in which the possibility of cavitation must be considered. The program performs linear or nonlinear fluid-hammer calculations, using a fixed-grid method-of-characteristics solution procedure. In addition to pipe friction and elasticity, the program can treat a variety of flow components, pipe junctions, and boundary conditions, including arbitrary pressure sources and a sodium/water reaction. Essential features of transient cavitation are modeled by a modified column-separation technique. Comparisons of calculated results with available experimental data, for a simple piping arrangement, show good agreement and provide validation of the computational cavitation model. Calculations for a variety of piping networks, containing either liquid sodium or water, demonstrate the versatility of PTAC and clearly show that neglecting cavitation leads to erroneous predictions of pressure-time histories.
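
    The fixed-grid method-of-characteristics procedure named in the abstract can be illustrated on the classic reservoir-pipe-valve problem. The following is a generic fluid-hammer sketch under simple assumptions (instantaneous valve closure, no cavitation model, illustrative pipe parameters); it is not PTAC itself.

    ```python
    import numpy as np

    # Generic fixed-grid method-of-characteristics (MOC) fluid hammer:
    # upstream reservoir, instantaneous valve closure downstream.
    a, g = 1200.0, 9.81              # wave speed (m/s), gravity (m/s^2)
    Lp, D, f = 600.0, 0.5, 0.018     # pipe length (m), diameter (m), friction factor
    A = np.pi * D**2 / 4.0
    N = 20                           # computational reaches
    dx = Lp / N
    dt = dx / a                      # fixed grid ties dt to dx via the wave speed
    B = a / (g * A)                  # characteristic impedance
    R = f * dx / (2.0 * g * D * A**2)  # friction coefficient per reach

    H0, Q0 = 100.0, 0.2              # reservoir head (m), steady flow (m^3/s)
    H = H0 - np.arange(N + 1) * R * Q0 * abs(Q0)   # steady hydraulic grade line
    Q = np.full(N + 1, Q0)
    peak = H[-1]

    for _ in range(400):             # march in time after the valve slams shut
        Cp = H[:-1] + B * Q[:-1] - R * Q[:-1] * np.abs(Q[:-1])  # C+ characteristics
        Cm = H[1:] - B * Q[1:] + R * Q[1:] * np.abs(Q[1:])      # C- characteristics
        Hn, Qn = np.empty_like(H), np.empty_like(Q)
        Hn[1:-1] = 0.5 * (Cp[:-1] + Cm[1:])        # interior nodes
        Qn[1:-1] = (Cp[:-1] - Cm[1:]) / (2.0 * B)
        Hn[0], Qn[0] = H0, (H0 - Cm[0]) / B        # reservoir boundary
        Qn[-1], Hn[-1] = 0.0, Cp[-1]               # closed-valve boundary
        H, Q = Hn, Qn
        peak = max(peak, H[-1])

    print(f"peak head at valve: {peak:.1f} m (Joukowsky estimate {H0 + B*Q0:.1f} m)")
    ```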

  15. 77 FR 27078 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-05-08

    ... Phones and Tablet Computers, and Components Thereof; Notice of Receipt of Complaint; Solicitation of... entitled Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof... the United States after importation of certain electronic devices, including mobile phones and tablet...

  16. 31 CFR 359.31 - What definitive Series I savings bonds are included in the computation?

    Science.gov (United States)

    2010-07-01

    ... definitive Series I savings bonds are included in the computation? In computing the purchases for each person, we include the following outstanding definitive bonds purchased in that calendar year: (a) All bonds... bearing that person's TIN; and (c) All gift bonds registered in the name of that person but bearing the...

  17. The possible usability of three-dimensional cone beam computed dental tomography in dental research

    Science.gov (United States)

    Yavuz, I.; Rizal, M. F.; Kiswanjaya, B.

    2017-08-01

    The innovations and advantages of three-dimensional cone beam computed dental tomography (3D CBCT) are continually growing, as is its potential use in dental research. Imaging techniques are important for planning research in dentistry. Newly improved 3D CBCT imaging systems and accessory computer programs have recently been proven effective for use in dental research. The aim of this study is to introduce 3D CBCT and open a window for future research possibilities that should be given attention in dental research.

  18. Computational and experimental analyses of the wave propagation through a bar structure including liquid-solid interface

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang Jin [UST Graduate School, Daejeon (Korea, Republic of); Rhee, Hui Nam [Division of Mechanical and Aerospace Engineering, Sunchon National University, Sunchon (Korea, Republic of); Yoon, Doo Byung; Park, Jin Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-08-15

    In this research, we study the propagation of longitudinal and transverse waves through a metal rod including a liquid layer using computational and experimental analyses. The propagation characteristics of longitudinal and transverse waves obtained by the computational and experimental analyses were consistent with wave propagation theory for both cases, that is, the homogeneous metal rod and the metal rod including a liquid layer. The fluid-structure interaction modeling technique developed for the computational wave propagation analysis in this research can be applied to more complex structures including solid-liquid interfaces.

  19. Accurate computations of monthly average daily extraterrestrial irradiation and the maximum possible sunshine duration

    International Nuclear Information System (INIS)

    Jain, P.C.

    1985-12-01

    The monthly average daily values of the extraterrestrial irradiation on a horizontal plane and the maximum possible sunshine duration are two important parameters that are frequently needed in various solar energy applications. They are generally calculated by solar scientists and engineers each time they are needed, often using approximate short-cut methods. Using the accurate analytical expressions developed by Spencer for the declination and the eccentricity correction factor, computations for these parameters have been made for all latitude values from 90 deg. N to 90 deg. S at intervals of 1 deg. and are presented in a convenient tabular form. Monthly average daily values of the maximum possible sunshine duration as recorded on a Campbell-Stokes sunshine recorder are also computed and presented. These tables avoid the need for repetitive and approximate calculations and serve as a useful ready reference providing accurate values to solar energy scientists and engineers
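
    For orientation, the quantities involved can be computed directly from Spencer's series for the declination and eccentricity correction factor, together with the standard sunset-hour-angle relations. The sketch below is one possible implementation; the solar-constant value and the function layout are implementation choices, not taken from the paper. Monthly averages follow by averaging these daily values over the days of each month.

    ```python
    import numpy as np

    ISC = 1367.0  # solar constant, W/m^2 (an assumed reference value)

    def day_angle(n):
        """Day angle (radians) for day of year n = 1..365."""
        return 2.0 * np.pi * (n - 1) / 365.0

    def declination(n):
        """Solar declination in radians (Spencer's Fourier series)."""
        g = day_angle(n)
        return (0.006918 - 0.399912 * np.cos(g) + 0.070257 * np.sin(g)
                - 0.006758 * np.cos(2 * g) + 0.000907 * np.sin(2 * g)
                - 0.002697 * np.cos(3 * g) + 0.00148 * np.sin(3 * g))

    def eccentricity_correction(n):
        """Eccentricity correction factor E0 = (r0/r)^2 (Spencer)."""
        g = day_angle(n)
        return (1.000110 + 0.034221 * np.cos(g) + 0.001280 * np.sin(g)
                + 0.000719 * np.cos(2 * g) + 0.000077 * np.sin(2 * g))

    def sunset_hour_angle(lat_rad, decl):
        """Sunset hour angle (radians), clipped for polar day/night."""
        x = np.clip(-np.tan(lat_rad) * np.tan(decl), -1.0, 1.0)
        return np.arccos(x)

    def max_sunshine_hours(lat_deg, n):
        """Maximum possible sunshine duration (hours) on day n."""
        ws = sunset_hour_angle(np.radians(lat_deg), declination(n))
        return 24.0 / np.pi * ws

    def extraterrestrial_irradiation(lat_deg, n):
        """Daily extraterrestrial irradiation on a horizontal plane (J/m^2)."""
        lat = np.radians(lat_deg)
        d = declination(n)
        ws = sunset_hour_angle(lat, d)
        return (86400.0 / np.pi) * ISC * eccentricity_correction(n) * (
            np.cos(lat) * np.cos(d) * np.sin(ws) + ws * np.sin(lat) * np.sin(d))
    ```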

  20. Possibilities and importance of using computer games and simulations in educational process

    OpenAIRE

    Danilović Mirčeta S.

    2003-01-01

    The paper discusses if it is possible and appropriate to use simulations (simulation games) and traditional games in the process of education. It is stressed that the terms "game" and "simulation" can and should be taken in a broader sense, although they are chiefly investigated herein as video-computer games and simulations. Any activity combining the properties of game (competition, rules, players) and the properties of simulation (i.e. operational presentation of reality) should be underst...

  1. Possibilities of differentiation of solitary focal liver lesions by computed tomography perfusion

    Directory of Open Access Journals (Sweden)

    Irmina Sefić Pašić

    2015-08-01

    Aim: To evaluate the possibilities of computed tomography (CT) perfusion in the differentiation of solitary focal liver lesions based on their characteristic vascularization, through perfusion parameter analysis. Methods: A prospective study was conducted on 50 patients in the period 2009-2012. Patients were divided into two groups: benign and malignant lesions. The following CT perfusion parameters were analyzed: blood flow (BF), blood volume (BV), mean transit time (MTT), capillary permeability surface area product (PS), hepatic arterial fraction (HAF), and impulse residual function (IRF). During the study another perfusion parameter was analyzed: the hepatic perfusion index (HPI). All patients were examined on a 64-slice multidetector CT machine (GE) with application of a liver perfusion protocol with i.v. administration of contrast agent. Results: In both groups an increase of vascularization and arterial blood flow was noticed, but there was no statistically significant difference between the groups in any of the 6 analyzed parameters. Hepatic perfusion index values were increased in all lesions in comparison with normal liver parenchyma. Conclusion: Computed tomography perfusion in our study did not allow differentiation of benign and malignant liver lesions based on analysis of functional perfusion parameters. The hepatic perfusion index should be investigated in further studies as a parameter for detection of the possible presence of micro-metastases in visually homogeneous liver in cases with no lesions found during the standard CT protocol

  2. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Science.gov (United States)

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  3. 77 FR 34063 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-06-08

    ... Phones and Tablet Computers, and Components Thereof Institution of Investigation AGENCY: U.S... the United States after importation of certain electronic devices, including mobile phones and tablet... mobile phones and tablet computers, and components thereof that infringe one or more of claims 1-3 and 5...

  4. The Model of the Software Running on a Computer Equipment Hardware Included in the Grid network

    Directory of Open Access Journals (Sweden)

    T. A. Mityushkina

    2012-12-01

    A new approach to building a cloud computing environment using Grid networks is proposed in this paper. The authors describe the functional capabilities, algorithm, and model of software running on computer equipment hardware included in the Grid network, which will allow a cloud computing environment to be implemented using Grid technologies.

  5. Possible UIP pattern on high-resolution computed tomography is associated with better survival than definite UIP in IPF patients.

    Science.gov (United States)

    Salisbury, Margaret L; Tolle, Leslie B; Xia, Meng; Murray, Susan; Tayob, Nabihah; Nambiar, Anoop M; Schmidt, Shelley L; Lagstein, Amir; Myers, Jeffery L; Gross, Barry H; Kazerooni, Ella A; Sundaram, Baskaran; Chughtai, Aamer R; Martinez, Fernando J; Flaherty, Kevin R

    2017-10-01

    Idiopathic pulmonary fibrosis (IPF) is a progressive fibrosing lung disease of unknown etiology. Inter-society consensus guidelines on IPF diagnosis and management outline radiologic patterns including definite usual interstitial pneumonia (UIP), possible UIP, and inconsistent with UIP. We evaluate these diagnostic categories as prognostic markers among patients with IPF. Included subjects had biopsy-proven UIP, a multidisciplinary team diagnosis of IPF, and a baseline high-resolution computed tomography (HRCT). Thoracic radiologists assigned the radiologic pattern and documented the presence and extent of specific radiologic findings. The outcome of interest was lung transplant-free survival. IPF patients with a possible UIP pattern on HRCT had significantly longer Kaplan-Meier event-free survival compared to those with a definite UIP pattern (5.21 and 3.57 years, respectively, p = 0.002). In a multivariable Cox proportional hazards model adjusted for baseline age, gender, %-predicted FVC, and %-predicted DLCO via the GAP stage, extent of fibrosis (via the traction bronchiectasis score), and ever-smoker status, a possible UIP pattern on HRCT (versus definite UIP) was associated with a reduced hazard of death or lung transplant (HR = 0.42, 95% CI 0.23-0.78, p = 0.006). The radiologic diagnosis categories outlined by inter-society consensus guidelines are a widely reported and potentially useful prognostic marker in IPF patients, with a possible UIP pattern on HRCT associated with a favorable prognosis compared to a definite UIP pattern, after adjusting for relevant covariates.
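
    The model form reported above can be sketched with standard survival-analysis tools. The snippet below uses the lifelines package on a tiny, entirely hypothetical dataset (the real analysis adjusted for GAP stage, traction bronchiectasis score and smoking status on a much larger cohort); it illustrates the Cox proportional hazards setup only.

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical toy data mirroring the variables named above (8 subjects
    # only, for illustration; far too small for a real analysis).
    df = pd.DataFrame({
        "years":        [1.2, 3.5, 5.1, 2.0, 4.4, 0.8, 6.0, 2.9],
        "event":        [1,   1,   0,   1,   0,   1,   0,   1],   # death/transplant
        "possible_uip": [0,   1,   1,   0,   1,   0,   1,   0],   # vs definite UIP
        "gap_stage":    [2,   1,   1,   3,   1,   2,   1,   2],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="years", event_col="event")
    print(cph.summary[["coef", "exp(coef)", "p"]])   # hazard ratios per covariate
    ```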

  6. On the possibility of non-invasive multilayer temperature estimation using soft-computing methods.

    Science.gov (United States)

    Teixeira, C A; Pereira, W C A; Ruano, A E; Ruano, M Graça

    2010-01-01

    This work reports original results on the possibility of non-invasive temperature estimation (NITE) in a multilayered phantom by applying soft-computing methods. The existence of reliable non-invasive temperature estimator models would improve the safety and efficacy of thermal therapies, which would lead to broader acceptance of this kind of therapy. Several approaches based on medical imaging technologies have been proposed, with magnetic resonance imaging (MRI) singled out as the only one achieving acceptable temperature resolution for hyperthermia purposes. However, intrinsic characteristics of MRI (e.g., high instrumentation cost) led us to use backscattered ultrasound (BSU) instead. Among the different BSU features, temporal echo-shifts have received major attention. These shifts are due to changes of the speed of sound and expansion of the medium. The originality of this work involves two aspects: the estimator model itself is original (based on soft-computing methods), and the application to temperature estimation in a three-layer phantom is also not reported in the literature. In this work a three-layer (non-homogeneous) phantom was developed. The two external layers were composed of (in % of weight): 86.5% degassed water, 11% glycerin and 2.5% agar-agar. The intermediate layer was obtained by adding graphite powder, in the amount of 2% of the water weight, to the above composition. The phantom was developed to have attenuation and speed of sound similar to in vivo muscle, according to the literature. BSU signals were collected and cumulative temporal echo-shifts computed. These shifts and past temperature values were then considered as possible estimator inputs. A soft-computing methodology was applied to look for appropriate multilayered temperature estimators. The methodology involves radial-basis-function neural networks (RBFNN) with structure optimized by a multi-objective genetic algorithm (MOGA). In this work 40 operating conditions were
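
    A toy sketch of the estimator family used here: a radial-basis-function network whose output weights are fitted by least squares. In the paper the structure (centres, widths, input lags) is selected by MOGA; everything below (synthetic data, fixed centres, a single width) is a simplified stand-in, not the authors' model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # hypothetical inputs: e.g. cumulative echo-shift and a past temperature,
    # with a synthetic smooth target standing in for the true temperature
    X = rng.uniform(0.0, 1.0, size=(200, 2))
    y = 37.0 + 5.0 * np.sin(2.0 * X[:, 0]) + 2.0 * X[:, 1]

    centers = X[rng.choice(len(X), 10, replace=False)]   # 10 Gaussian centres
    width = 0.3                                          # shared RBF width

    def design(X):
        """Gaussian RBF design matrix with a bias column."""
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        Phi = np.exp(-d2 / (2.0 * width ** 2))
        return np.hstack([Phi, np.ones((len(X), 1))])

    # linear output weights by least squares (centres and width held fixed)
    w, *_ = np.linalg.lstsq(design(X), y, rcond=None)
    rmse = np.sqrt(np.mean((design(X) @ w - y) ** 2))
    print(f"training RMSE: {rmse:.3f} deg C")
    ```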

  7. Characteristics and possibilities of computer program for fast assessment of primary frequency control of electric power interconnections

    Directory of Open Access Journals (Sweden)

    Ivanović Milan

    2011-01-01

    This paper presents the possibilities and practical features of a computer program for fast assessment of the effects of primary frequency regulation of electric power interconnections. It is based on two methods. The first is an analytical method, which applies analytical expressions for non-zero initial conditions, with a range of benefits provided by the analytical form, allowing consideration of possible structural changes in the power system during the analysis process. The second is a simulation method, with recurrent application of suitably drafted, fully decoupled difference equations. The capabilities and features of this computer program are demonstrated first for the isolated power system of Serbia, and then for the case of its wider interconnection with the surrounding systems.

  8. Top 10 Threats to Computer Systems Include Professors and Students

    Science.gov (United States)

    Young, Jeffrey R.

    2008-01-01

    User awareness is growing in importance when it comes to computer security. Not long ago, keeping college networks safe from cyberattackers mainly involved making sure computers around campus had the latest software patches. New computer worms or viruses would pop up, taking advantage of some digital hole in the Windows operating system or in…

  9. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  10. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2018-01-01

    This paper is devoted to the research of educational resources and possibilities of modern computer games. The “internal” educational aspects of computer games include an educational mechanism (a separate or integrated “tutorial”) and the representation of a real or even fantastic educational process within virtual worlds. The “external” dimension represents the educational opportunities of computer games for personal and professional development in different genres of computer games (various transport, so...

  11. Internet governance origins, current issues, and future possibilities

    CERN Document Server

    Balleste, Roy

    2015-01-01

    Internet Governance: Origins, Current Issues, and Future Possibilities provides an introductory, multidisciplinary account of the forces at work in the evolving concept of internet governance and includes computer history, Internet beginnings, institutions and stakeholders, proposed models of governance, and human rights.

  12. Posture, Musculoskeletal Activities, and Possible Musculoskeletal Discomfort Among Children Using Laptops or Tablet Computers for Educational Purposes: A Literature Review

    Science.gov (United States)

    Binboğa, Elif; Korhan, Orhan

    2014-10-01

    Educational ergonomics focuses on the interaction between educational performance and educational design. By improving the design or pointing out possible problems, educational ergonomics can be utilized to have positive impacts on student performance and thus on the education process. Laptops and tablet computers are becoming widely used by school children and beginning to be used effectively for educational purposes. As the latest generation of laptops and tablet computers are mobile and lightweight compared to conventional personal computers, they support student-centred, interaction-based learning. However, these technologies have been introduced into schools with minimal adaptations to furniture or attention to ergonomics. There are increasing reports of an association between increased musculoskeletal (MSK) problems in children and use of such technologies. Although children are among the users of laptops and tablet computers both in their everyday lives and at schools, the literature investigating MSK activities and possible MSK discomfort in children using portable technologies is limited. This study reviews the literature to identify published studies that investigated posture, MSK activities, and possible MSK discomfort among children using mobile technologies (laptops or tablet computers) for educational purposes. An electronic search of the literature published in English between January 1994 and January 2014 was performed in several databases. The literature search terms were identified and combined to search the databases. The search results show that resources investigating MSK outcomes of laptop or tablet use by children are very scarce. This review points out the research gaps in this field and identifies areas for future studies.

  13. Security issues of cloud computing environment in possible military applications

    OpenAIRE

    Samčović, Andreja B.

    2013-01-01

    The evolution of cloud computing over the past few years is potentially one of major advances in the history of computing and telecommunications. Although there are many benefits of adopting cloud computing, there are also some significant barriers to adoption, security issues being the most important of them. This paper introduces the concept of cloud computing; looks at relevant technologies in cloud computing; takes into account cloud deployment models and some military applications. Addit...

  14. Possibilities and importance of using computer games and simulations in educational process

    Directory of Open Access Journals (Sweden)

    Danilović Mirčeta S.

    2003-01-01

    The paper discusses whether it is possible and appropriate to use simulations (simulation games) and traditional games in the process of education. It is stressed that the terms "game" and "simulation" can and should be taken in a broader sense, although they are chiefly investigated herein as video-computer games and simulations. Any activity combining the properties of a game (competition, rules, players) and the properties of a simulation (i.e. operational presentation of reality) should be understood as a simulation game, where role-play constitutes its essence and basis. In those games the student assumes a new identity, identifies himself with another personality and responds similarly. Game rules are the basic and most important conditions for a game's existence, accomplishment and goal achievement. Games and simulations make it possible for a student to acquire experience and practice, i.e. to do exercises in nearly similar or identical life situations, to develop cognitive and psycho-motor abilities and skills, to acquire knowledge, to develop, create and change attitudes and value criteria, and to develop perception of other people's feelings and attitudes. It is obligatory for the teacher to conduct preparations to use and apply simulation games in the process of teaching.

  15. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease with which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  16. Computational intelligence in medical informatics

    CERN Document Server

    Gunjan, Vinit

    2015-01-01

    This Brief highlights informatics and related techniques for computer science professionals, engineers, medical doctors, bioinformatics researchers and other interdisciplinary researchers. Chapters include the bioinformatics of diabetes and several computational algorithms and statistical analysis approaches for effectively studying the disorders and their possible causes, along with medical applications.

  17. [Realistic possibilities of utilization of a personal computer in the office of a general practitioner].

    Science.gov (United States)

    Masopust, V

    1991-04-01

    In May 1990, work was completed on the programme "Computer system of the health community doctor Mic DOKI", which resolves more than 70 basic tasks pertaining to the keeping of health documentation by health community doctors. It handles automatically the entire administrative work in the health community, makes it possible to evaluate the activity of doctors and nurses, will facilitate the work of the control organs of future health insurance companies, and will contribute to investigations of the health status of the population. Despite some problems ensuing from the contemporary economic situation of the country, the validity of contemporary health regulations and the minimal training of our health personnel in the use of personal computers, computerization of the health community system can be considered an asset to the reform of the health services which is under way.

  18. 78 FR 63492 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2013-10-24

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-847] Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof; Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is...

  19. 31 CFR 351.66 - What book-entry Series EE savings bonds are included in the computation?

    Science.gov (United States)

    2010-07-01

    ... DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES EE, Book-Entry Series EE Savings Bonds, § 351.66 What book-entry Series EE savings bonds are included in the computation? (a) We include all bonds that...

  20. Spin-neurons: A possible path to energy-efficient neuromorphic computers

    Energy Technology Data Exchange (ETDEWEB)

    Sharad, Mrigank; Fan, Deliang; Roy, Kaushik [School of Electrical and Computer Engineering, Purdue University, West Lafayette, Indiana 47907 (United States)

    2013-12-21

    Recent years have witnessed growing interest in the field of brain-inspired computing based on neural-network architectures. In order to translate the related algorithmic models into powerful, yet energy-efficient cognitive-computing hardware, computing devices beyond CMOS may need to be explored. The suitability of such devices to this field of computing would strongly depend upon how closely their physical characteristics match the essential computing primitives employed in such models. In this work, we discuss the rationale of applying emerging spin-torque devices to bio-inspired computing. Recent spin-torque experiments have shown the path to low-current, low-voltage, and high-speed magnetization switching in nano-scale magnetic devices. Such magneto-metallic, current-mode spin-torque switches can mimic the analog summing and “thresholding” operation of an artificial neuron with high energy-efficiency. Comparison with a CMOS-based analog circuit model of a neuron shows that “spin-neurons” (spin-based circuit models of neurons) can achieve more than two orders of magnitude lower energy and more than three orders of magnitude reduction in energy-delay product. The application of spin-neurons can therefore be an attractive option for the neuromorphic computers of the future.

  1. High performance computation of landscape genomic models including local indicators of spatial association.

    Science.gov (United States)

    Stucki, S; Orozco-terWengel, P; Forester, B R; Duruz, S; Colli, L; Masembe, C; Negrini, R; Landguth, E; Jones, M R; Bruford, M W; Taberlet, P; Joost, S

    2017-09-01

    With the increasing availability of both molecular and topo-climatic data, the main challenges facing landscape genomics - that is, the combination of landscape ecology with population genomics - include processing large numbers of models and distinguishing between selection and demographic processes (e.g. population structure). Several methods address the latter, either by estimating a null model of population history or by simultaneously inferring environmental and demographic effects. Here we present samβada, an approach designed to study signatures of local adaptation, with special emphasis on high-performance computing of large-scale genetic and environmental data sets. samβada identifies candidate loci using genotype-environment associations while also incorporating multivariate analyses to assess the effect of many environmental predictor variables. This enables the inclusion of explanatory variables representing population structure into the models to lower the occurrence of spurious genotype-environment associations. In addition, samβada calculates local indicators of spatial association for candidate loci to provide information on whether similar genotypes tend to cluster in space, which constitutes a useful indication of the possible kinship between individuals. To test the usefulness of this approach, we carried out a simulation study and analysed a data set from Ugandan cattle to detect signatures of local adaptation with samβada, bayenv, lfmm and an F_ST outlier method (the FDIST approach in arlequin) and compared their results. samβada - an open source software for Windows, Linux and Mac OS X available at http://lasig.epfl.ch/sambada - outperforms other approaches and better suits whole-genome sequence data processing.
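
    At its core, the candidate-locus detection described above is a logistic genotype-environment association test. The sketch below shows that core idea with statsmodels on synthetic data, using a likelihood-ratio G score that compares the environmental model against a constant-only null; the population-structure covariates and the local indicators of spatial association are omitted, and it is not samβada itself.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    rng = np.random.default_rng(1)

    # synthetic data: one binary genotype, one environmental predictor
    env = rng.normal(size=500)
    p_true = 1.0 / (1.0 + np.exp(-(0.8 * env - 0.5)))
    geno = rng.binomial(1, p_true)

    # null model (intercept only) vs. model including the environment
    null = sm.Logit(geno, np.ones((len(geno), 1))).fit(disp=0)
    alt = sm.Logit(geno, sm.add_constant(env)).fit(disp=0)

    g_score = 2.0 * (alt.llf - null.llf)     # likelihood-ratio statistic
    p_value = stats.chi2.sf(g_score, df=1)   # 1 df: the environmental coefficient
    print(f"G = {g_score:.2f}, p = {p_value:.2e}")
    ```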

  2. A method for the computation of turbulent polymeric liquids including hydrodynamic interactions and chain entanglements

    Energy Technology Data Exchange (ETDEWEB)

    Kivotides, Demosthenes, E-mail: demosthenes.kivotides@strath.ac.uk

    2017-02-12

    An asymptotically exact method for the direct computation of turbulent polymeric liquids that includes (a) fully resolved, creeping microflow fields due to hydrodynamic interactions between chains, (b) exact account of (subfilter) residual stresses, (c) polymer Brownian motion, and (d) direct calculation of chain entanglements, is formulated. Although developed in the context of polymeric fluids, the method is equally applicable to turbulent colloidal dispersions and aerosols. - Highlights: • An asymptotically exact method for the computation of polymer and colloidal fluids is developed. • The method is valid for all flow inertia and all polymer volume fractions. • The method models entanglements and hydrodynamic interactions between polymer chains.

  3. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y W [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Zhang, L F [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Huang, J P [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China)

    2007-07-20

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property.

  4. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    International Nuclear Information System (INIS)

    Chen, Y W; Zhang, L F; Huang, J P

    2007-01-01

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property
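
    For orientation, the two quantities these records compare - the characteristic path length and the clustering coefficient - can be computed for the original (unmodified) Watts-Strogatz model with networkx; the degree-distribution extension developed in the paper is not part of that library.

    ```python
    import networkx as nx

    # original Watts-Strogatz small-world graph: n nodes on a ring, each
    # linked to its k nearest neighbours, edges rewired with probability p;
    # the "connected" variant retries until the graph is connected.
    G = nx.connected_watts_strogatz_graph(n=1000, k=10, p=0.05, seed=1)

    L = nx.average_shortest_path_length(G)   # characteristic path length
    C = nx.average_clustering(G)             # clustering coefficient
    print(f"L = {L:.2f}, C = {C:.3f}")
    ```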

  5. Possible new basis for fast reactor subassembly instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, A G

    1977-03-01

    This is a digest of a paper presented to the Risley Engineering Society. The theme is a speculation that the core instrumentation problem for a liquid metal fast breeder reactor might be transformed by developments in the realm of infrared television and in pattern recognition by computer. There is a possible need to measure coolant flow and coolant exit temperature for each subassembly, with familiar fail-to-safety characteristics. Present methods use electrical devices, for example thermocouples, but this gives rise to cabling problems. It might be possible, however, to install at the top of each subassembly a mechanical device that gives a direct indication of temperature and flow visible to an infrared television camera. Signal transmission by cable would then be replaced by direct observation. A possible arrangement for such a system is described and is shown in schematic form. It includes pattern recognition by computer. It may also be possible to infer coolant temperature directly from the characteristics of the infrared radiation emitted by a thin stainless steel sheet in contact with the sodium, and an arrangement for this is shown. The type of pattern produced for on-line interpretation by computer is also shown. It is thought that this new approach to the problem of subassembly instrumentation is sufficiently attractive to justify a close study of the problems involved.

  6. Computational aspects of algebraic curves

    CERN Document Server

    Shaska, Tanush

    2005-01-01

    The development of new computational techniques and better computing power has made it possible to attack some classical problems of algebraic geometry. The main goal of this book is to highlight such computational techniques related to algebraic curves. The area of research in algebraic curves is receiving more interest not only from the mathematics community, but also from engineers and computer scientists, because of the importance of algebraic curves in applications including cryptography, coding theory, error-correcting codes, digital imaging, computer vision, and many more.This book cove

  7. Planning Computer-Aided Distance Learning

    Directory of Open Access Journals (Sweden)

    Nadja Dobnik

    1996-12-01

    The didactics of autonomous learning changes under the influence of new technologies. Computer technology can cover all the functions that a teacher develops in personal contact with the learner. People organizing distance learning must realize all the possibilities offered by computers. Computers can take over and also combine the functions of many tools and systems, e.g. typewriter, video, telephone. Thus the contents can be offered in the form of classic media by means of text, speech, picture, etc. Computers take over data processing and function as study materials. A computer included in a computer network can also function as a medium for interactive communication.

  8. ICECON: a computer program used to calculate containment back pressure for LOCA analysis (including ice condenser plants)

    International Nuclear Information System (INIS)

    1976-07-01

    The ICECON computer code provides a method for conservatively calculating the long-term back-pressure transient in the containment resulting from a hypothetical Loss-of-Coolant Accident (LOCA) for PWR plants, including ice condenser containment systems. The ICECON computer code was developed from the CONTEMPT/LT-022 code. A brief discussion of the salient features of a typical ice condenser containment is presented, and details of the ice condenser models are explained. The corrections and improvements made to CONTEMPT/LT-022 are included. The organization of the code, including the calculational procedure, is outlined. The user's manual (to be used in conjunction with the CONTEMPT/LT-022 user's manual), a sample problem, a time-step study (solution convergence), and a comparison of ICECON results with the results of the NSSS vendor are presented. In general, containment pressures calculated with the ICECON code agree with those calculated by the NSSS vendor using the same mass and energy release rates to the containment

  9. POSSIBILITIES OF COMPUTED TOMOGRAPHY AND MAGNETIC RESONANCE IMAGING IN FORENSIC MEDICAL EXAMINATION OF MECHANICAL TRAUMA AND SUDDEN DEATH (A LITERATURE REVIEW

    Directory of Open Access Journals (Sweden)

    L. S. Kokov

    2015-01-01

    The review analyzes the possibilities of multislice computed tomography (MSCT) and magnetic resonance imaging (MRI) use in the forensic examination of corpses of adults. We present a critical analysis of the literature on post-mortem imaging in terms of forensic thanatology. The review is based on basic Internet resources: the Scientific Electronic Library (elibrary), Scopus, and PubMed. The review includes articles that discuss both the advantages and the limitations of post-mortem MSCT and MRI imaging in forensic examination of the corpse. Through studying the available literature, the authors attempted to answer two questions: 1) which method is more suitable for the purposes of forensic examination of the corpse, MSCT or MRI; and 2) whether virtual autopsy will replace traditional autopsy in the near future. Conclusion: a comprehensive study of the corpse often requires both imaging methods; in cases of death from mechanical trauma, MSCT exceeds the range of possibilities of MRI; today, virtual autopsy cannot completely replace traditional autopsy in forensic science, since there are no convincing evidence-based comparative studies, nor a legal framework for the method.

  10. A possible new basis for fast reactor subassembly instrumentation

    International Nuclear Information System (INIS)

    Edwards, A.G.

    1977-01-01

    This is a digest of a paper presented to the Risley Engineering Society. The theme is a speculation that the core instrumentation problem for a liquid metal fast breeder reactor might be transformed by developments in the realm of infrared television and in pattern recognition by computer. There is a possible need to measure coolant flow and coolant exit temperature for each subassembly, with familiar fail-to-safety characteristics. Present methods use electrical devices, for example thermocouples, but this gives rise to cabling problems. It might be possible, however, to install at the top of each subassembly a mechanical device that gives a direct indication of temperature and flow visible to an infrared television camera. Signal transmission by cable would then be replaced by direct observation. A possible arrangement for such a system is described and is shown in schematic form. It includes pattern recognition by computer. It may also be possible to infer coolant temperature directly from the characteristics of the infrared radiation emitted by a thin stainless steel sheet in contact with the sodium, and an arrangement for this is shown. The type of pattern produced for on-line interpretation by computer is also shown. It is thought that this new approach to the problem of subassembly instrumentation is sufficiently attractive to justify a close study of the problems involved. (U.K.)

  11. The WECHSL-Mod2 code: A computer program for the interaction of a core melt with concrete including the long term behavior

    International Nuclear Information System (INIS)

    Reimann, M.; Stiefel, S.

    1989-06-01

    The WECHSL-Mod2 code is a mechanistic computer code developed for the analysis of the thermal and chemical interaction of initially molten LWR reactor materials with concrete in a two-dimensional, axisymmetric concrete cavity. The code performs calculations from the time of initial contact of a hot molten pool, through the start of solidification processes, until long-term basemat erosion over several days, with the possibility of basemat penetration. The code assumes that the metallic phases of the melt pool form a layer at the bottom, overlaid by the oxide melt atop. Heat generation in the melt is by decay heat and chemical reactions from metal oxidation. Energy is lost to the melting concrete and to the upper containment by radiation or by evaporation of sump water possibly flooding the surface of the melt. Thermodynamic and transport properties as well as criteria for heat transfer and solidification processes are internally calculated for each time step. Heat transfer is modelled taking into account the high gas flux from the decomposing concrete and the heat conduction in the crusts possibly forming in the long term at the melt/concrete interface. The WECHSL code in its present version was validated by the BETA experiments. The test examples include a typical BETA post-test calculation and a WECHSL application to a reactor accident. (orig.) [de]

  12. Explicitly-correlated ring-coupled-cluster-doubles theory: Including exchange for computations on closed-shell systems

    Energy Technology Data Exchange (ETDEWEB)

    Hehn, Anna-Sophia; Holzer, Christof; Klopper, Wim, E-mail: klopper@kit.edu

    2016-11-10

    Highlights: • Ring-coupled-cluster-doubles approach now implemented with exchange terms. • Ring-coupled-cluster-doubles approach now implemented with F12 functions. • Szabo–Ostlund scheme (SO2) implemented for use in SAPT. • Fast convergence to the limit of a complete basis. • Implementation in the TURBOMOLE program system. - Abstract: Random-phase-approximation (RPA) methods have proven to be powerful tools in electronic-structure theory, being non-empirical, computationally efficient and broadly applicable to a variety of molecular systems including small-gap systems, transition-metal compounds and dispersion-dominated complexes. Applications are however hindered by the slow basis-set convergence of the electron-correlation energy with the one-electron basis. As a remedy, we present approximate explicitly-correlated RPA approaches based on the ring-coupled-cluster-doubles formulation including exchange contributions. Test calculations demonstrate that the basis-set convergence of correlation energies is drastically accelerated through the explicitly-correlated approach, reaching 99% of the basis-set limit with triple-zeta basis sets. When implemented in close analogy to early work by Szabo and Ostlund [36], the new explicitly-correlated ring-coupled-cluster-doubles approach including exchange has the potential to become a valuable tool in the framework of symmetry-adapted perturbation theory (SAPT) for the computation of dispersion energies of molecular complexes of weakly interacting closed-shell systems.

  13. A computational method for comparing the behavior and possible failure of prosthetic implants

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, C.; Hollerbach, K.; Perfect, S.; Underhill, K.

    1995-05-01

    Prosthetic joint implants currently in use exhibit high failure rates. Realistic computer modeling of prosthetic implants provides an opportunity for orthopedic biomechanics researchers and physicians to understand possible in vivo failure modes without having to resort to lengthy and costly clinical trials. The research presented here is part of a larger effort to develop realistic models of implanted joint prostheses. The example used here is the thumb carpo-metacarpal (cmc) joint. The work, however, can be applied to any other human joint for which prosthetic implants have been designed. Preliminary results of prosthetic joint loading, without surrounding human tissue (i.e., simulating conditions under which the prosthetic joint has not yet been implanted into the human joint), are presented, based on a three-dimensional, nonlinear finite element analysis of three different joint implant designs.

  14. Computer-controlled attenuator.

    Science.gov (United States)

    Mitov, D; Grozev, Z

    1991-01-01

    Various possibilities for applying electronic computer-controlled attenuators to the automation of physiological experiments are considered. A detailed description is given of the design of a 4-channel computer-controlled attenuator in which the output signal changes by a linear step in two of the channels and by a logarithmic step in the other two. This, together with additional programmable timers, makes it possible to automate a wide range of studies in different spheres of physiology and psychophysics, including vision and hearing.
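
    To make the two channel types concrete, here is a minimal Python sketch of how the output levels differ; the step sizes and channel length are hypothetical, as the article does not specify them:

        # Hypothetical illustration of the two attenuator channel types.
        step, v0 = 0.1, 1.0                      # linear increment and reference level
        ratio = 10 ** (-1.0 / 20.0)              # one 1-dB attenuation step
        linear_levels = [n * step for n in range(16)]        # equal voltage steps
        log_levels = [v0 * ratio ** n for n in range(16)]    # equal dB steps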

  15. A computer software system for the generation of global ocean tides including self-gravitation and crustal loading effects

    Science.gov (United States)

    Estes, R. H.

    1977-01-01

    A computer software system is described which computes global numerical solutions of the integro-differential Laplace tidal equations, including dissipation terms and ocean loading and self-gravitation effects, for arbitrary diurnal and semidiurnal tidal constituents. The integration algorithm features a successive approximation scheme for the integro-differential system, with forward differences in the time variable and central differences in the spatial variables. Solutions for the M2, S2, N2, K2, K1, O1, and P1 tidal constituents neglecting the effects of ocean loading and self-gravitation, and a converged M2 solution including ocean loading and self-gravitation effects, are presented in the form of cotidal and corange maps.
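
    The differencing pattern described (forward differences in time, central differences in space) is the classic FTCS scheme. A one-dimensional Python sketch of that pattern applied to a diffusion equation, illustrating the pattern only and not the author's tidal solver:

        import numpy as np

        # FTCS sketch for du/dt = D d2u/dx2 with periodic boundaries; shows the
        # forward-time, centred-space pattern only, not the tidal code itself.
        nx, nt = 100, 500
        D, dx = 1.0, 1.0
        dt = 0.4 * dx**2 / D                     # within the FTCS stability bound
        u = np.zeros(nx)
        u[nx // 2] = 1.0                         # initial condition
        for _ in range(nt):
            lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2  # central in space
            u = u + dt * D * lap                                    # forward in time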

  16. Design, functioning and possible applications of process computers

    International Nuclear Information System (INIS)

    Kussl, V.

    1975-01-01

    Process computers are useful as automation instruments a) when large amounts of data are processed in analog or digital form, b) for low data flow (data rate), and c) when data must be stored over short or long periods of time. (orig./AK) [de

  17. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    OpenAIRE

    Dang Hung; Dinh Tien Tuan Anh; Chang Ee-Chien; Ooi Beng Chin

    2017-01-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using a generic solution such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation effi...
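
    The intuition behind scramble-then-compute, as the abstract sketches it, is that secretly permuting the data before processing decouples the observed access order from the logical record order. A toy Python illustration of that idea only, not the authors' system, which additionally relies on encryption and trusted hardware:

        import random

        def scramble_then_compute(records, compute):
            """Toy sketch: apply a secret random permutation, then compute on the
            shuffled copy, so the observed access order reveals little about the
            original order. Real systems add encryption and oblivious I/O."""
            perm = list(range(len(records)))
            random.shuffle(perm)                  # the secret permutation
            shuffled = [records[i] for i in perm]
            return compute(shuffled)

        total = scramble_then_compute([3, 1, 4, 1, 5], sum)  # order-insensitive use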

  18. Posture, Musculoskeletal Activities, and Possible Musculoskeletal Discomfort among Children Using Laptops or Tablet Computers for Educational Purposes: A Literature Review

    Science.gov (United States)

    Binboga, Elif; Korhan, Orhan

    2014-01-01

    Educational ergonomics focuses on the interaction between educational performance and educational design. By improving the design or pointing out the possible problems, educational ergonomics can be utilized to have positive impacts on the student performance and thus on education process. Laptops and tablet computers are becoming widely used by…

  19. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  20. GPU computing and applications

    CERN Document Server

    See, Simon

    2015-01-01

    This book presents a collection of state-of-the-art research on GPU Computing and Applications. The major part of this book is selected from the work presented at the 2013 Symposium on GPU Computing and Applications held at Nanyang Technological University, Singapore (Oct 9, 2013). Three major domains of GPU application are covered in the book, including (1) Engineering design and simulation; (2) Biomedical Sciences; and (3) Interactive & Digital Media. The book also addresses the fundamental issues in GPU computing, with a focus on big data processing. Researchers and developers in GPU Computing and Applications will benefit from this book, as will training professionals and educators seeking to learn the possible applications of GPU technology in various areas.

  1. Legal issues of computer imaging in plastic surgery: a primer.

    Science.gov (United States)

    Chávez, A E; Dagum, P; Koch, R J; Newman, J P

    1997-11-01

    Although plastic surgeons are increasingly incorporating computer imaging techniques into their practices, many fear the possibility of legally binding themselves to achieve surgical results identical to those reflected in computer images. Computer imaging allows surgeons to manipulate digital photographs of patients to project possible surgical outcomes. Among the many benefits imaging techniques offer are improved doctor-patient communication, facilitation of the education and training of residents, and reduced administrative and storage costs. Despite the many advantages computer imaging systems offer, however, surgeons understandably worry that imaging systems expose them to immense legal liability. The possible exploitation of computer imaging by novice surgeons as a marketing tool, coupled with the lack of consensus regarding the treatment of computer images, adds to the concern of surgeons. A careful analysis of the law, however, reveals that surgeons who use computer imaging carefully and conservatively, and who adopt a few simple precautions, substantially reduce their vulnerability to legal claims. In particular, surgeons face possible claims of implied contract, failure to instruct, and malpractice from their use of, or failure to use, computer imaging. Nevertheless, legal and practical obstacles frustrate each of those causes of action. Moreover, surgeons who incorporate a few simple safeguards into their practice may further reduce their legal susceptibility.

  2. A personal computer-based nuclear magnetic resonance spectrometer

    Science.gov (United States)

    Job, Constantin; Pearson, Robert M.; Brown, Michael F.

    1994-11-01

    Nuclear magnetic resonance (NMR) spectroscopy using personal computer-based hardware has the potential of enabling the application of NMR methods to fields where conventional state of the art equipment is either impractical or too costly. With such a strategy for data acquisition and processing, disciplines including civil engineering, agriculture, geology, archaeology, and others have the possibility of utilizing magnetic resonance techniques within the laboratory or conducting applications directly in the field. Another aspect is the possibility of utilizing existing NMR magnets which may be in good condition but unused because of outdated or nonrepairable electronics. Moreover, NMR applications based on personal computer technology may open up teaching possibilities at the college or even secondary school level. The goal of developing such a personal computer (PC)-based NMR standard is facilitated by existing technologies including logic cell arrays, direct digital frequency synthesis, use of PC-based electrical engineering software tools to fabricate electronic circuits, and the use of permanent magnets based on neodymium-iron-boron alloy. Utilizing such an approach, we have been able to place essentially an entire NMR spectrometer console on two printed circuit boards, with the exception of the receiver and radio frequency power amplifier. Future upgrades to include the deuterium lock and the decoupler unit are readily envisioned. The continued development of such PC-based NMR spectrometers is expected to benefit from the fast growing, practical, and low cost personal computer market.
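
    Direct digital frequency synthesis, one of the enabling technologies listed above, generates an RF waveform from a phase accumulator and a sine lookup table. A schematic software model of the principle, with illustrative parameters rather than the authors' hardware design:

        import math

        # Schematic DDS model: a phase accumulator advances by a tuning word on
        # each clock tick; its top bits index a sine lookup table.
        ACC_BITS, TABLE_BITS = 32, 10
        TABLE = [math.sin(2 * math.pi * i / 2**TABLE_BITS)
                 for i in range(2**TABLE_BITS)]

        def dds_samples(f_out, f_clock, n):
            tuning_word = round(f_out / f_clock * 2**ACC_BITS)
            acc, out = 0, []
            for _ in range(n):
                out.append(TABLE[acc >> (ACC_BITS - TABLE_BITS)])
                acc = (acc + tuning_word) % 2**ACC_BITS
            return out

        samples = dds_samples(5.0e6, 50.0e6, 100)   # 5 MHz tone at a 50 MHz clock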

  3. PTA-1 computer program for treating pressure transients in hydraulic networks including the effect of pipe plasticity

    International Nuclear Information System (INIS)

    Youngdahl, C.K.; Kot, C.A.

    1977-01-01

    Pressure pulses in the intermediate sodium system of a liquid-metal-cooled fast breeder reactor, such as may originate from a sodium/water reaction in a steam generator, are propagated through the complex sodium piping network to system components such as the pump and intermediate heat exchanger. To assess the effects of such pulses on continued reliable operation of these components, and to contribute to system designs which mitigate these effects, Pressure Transient Analysis (PTA) computer codes are being developed for accurately computing the transmission of pressure pulses through a complicated fluid transport system consisting of piping, fittings and junctions, and components. PTA-1 extends the well-accepted and verified fluid-hammer formulation for computing hydraulic transients in elastic or rigid piping systems to include plastic deformation effects. The accuracy of the modeling of pipe plasticity effects on transient propagation has been validated using results from two sets of Stanford Research Institute experiments; validation of PTA-1 using the latter set is described briefly. The comparisons of PTA-1 computations with experiments show that (1) elastic-plastic deformation of LMFBR-type piping can have a significant qualitative and quantitative effect on pressure pulse propagation, even in simple systems; (2) classical fluid-hammer theory gives erroneous results when applied to situations where piping deforms plastically; and (3) the computational model incorporated in PTA-1 for predicting plastic deformation and its effect on transient propagation is accurate.
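
    For orientation, the elastic fluid-hammer formulation that PTA-1 extends is conventionally solved by the method of characteristics. A minimal Python sketch of the frictionless interior-node update, using textbook relations rather than the PTA-1 source:

        # Method-of-characteristics update for elastic water hammer, friction
        # neglected (textbook relations, not the PTA-1 source). B = a/(g*A)
        # couples head H and flow Q along the C+ and C- characteristics.
        def interior_node(H_up, Q_up, H_down, Q_down, B):
            cp = H_up + B * Q_up        # C+ characteristic from the upstream node
            cm = H_down - B * Q_down    # C- characteristic from the downstream node
            H_new = 0.5 * (cp + cm)
            Q_new = (cp - cm) / (2.0 * B)
            return H_new, Q_new

        # Example: a = 1000 m/s wave speed, g = 9.81 m/s2, A = 0.05 m2 pipe area.
        B = 1000.0 / (9.81 * 0.05)
        print(interior_node(100.0, 0.010, 98.0, 0.012, B))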

  4. The Gender Factor in Computer Anxiety and Interest among Some Australian High School Students.

    Science.gov (United States)

    Okebukola, Peter Akinsola

    1993-01-01

    Western Australia eleventh graders (142 boys, 139 girls) were compared on such variables as computers at home, computer classes, experience with computers, and socioeconomic status. Girls had higher anxiety levels, boys higher computer interest. Possible causes included social beliefs about computer use, teacher sex bias, and software (games) more…

  5. New possibilities of three-dimensional reconstruction of computed tomography scans

    International Nuclear Information System (INIS)

    Herman, M.; Tarjan, Z.; Pozzi-Mucelli, R.S.

    1996-01-01

    Three-dimensional (3D) reconstructions of computed tomography (CT) scans provide impressive and illustrative images of various parts of the human body. Such images are reconstructed from a series of basic CT scans by dedicated software. The state of the art in 3D computed tomography is demonstrated, with emphasis on the imaging of soft tissues. Examples are presented of imaging the craniofacial and maxillofacial complex, central nervous system, cardiovascular system, musculoskeletal system, gastrointestinal and urogenital systems, and respiratory system, and their potential in clinical practice is discussed. Although 3D scans contribute no essentially new diagnostic information compared with conventional CT scans, they can help in spatial orientation. 11 figs., 25 refs

  6. Classical and quantum computing with C++ and Java simulations

    CERN Document Server

    Hardy, Y

    2001-01-01

    Classical and Quantum Computing provides a self-contained, systematic and comprehensive introduction to all the subjects and techniques important in scientific computing. The style and presentation are readily accessible to undergraduates and graduates. A large number of examples, accompanied by complete C++ and Java code wherever possible, cover every topic. Features and benefits: - Comprehensive coverage of the theory with many examples - Topics in classical computing include boolean algebra, gates, circuits, latches, error detection and correction, neural networks, Turing machines, cryptography, genetic algorithms - For the first time, genetic expression programming is presented in a textbook - Topics in quantum computing include mathematical foundations, quantum algorithms, quantum information theory, and hardware used in quantum computing This book serves as a textbook for courses in scientific computing and is also very suitable for self-study. Students, professionals and practitioners in computer...

  7. VibroCV: a computer vision-based vibroarthrography platform with possible application to Juvenile Idiopathic Arthritis.

    Science.gov (United States)

    Wiens, Andrew D; Prahalad, Sampath; Inan, Omer T

    2016-08-01

    Vibroarthrography, a method for interpreting the sounds emitted by a knee during movement, has been studied for several joint disorders since 1902. However, to our knowledge, the usefulness of this method for management of Juvenile Idiopathic Arthritis (JIA) has not been investigated. To study joint sounds as a possible new biomarker for pediatric cases of JIA we designed and built VibroCV, a platform to capture vibroarthrograms from four accelerometers; electromyograms (EMG) and inertial measurements from four wireless EMG modules; and joint angles from two Sony Eye cameras and six light-emitting diodes with commercially-available off-the-shelf parts and computer vision via OpenCV. This article explains the design of this turn-key platform in detail, and provides a sample recording captured from a pediatric subject.

  8. Adult-onset photosensitivity: clinical significance and epilepsy syndromes including idiopathic (possibly genetic) photosensitive occipital epilepsy.

    Science.gov (United States)

    Koutroumanidis, Michalis; Tsirka, Vasiliki; Panayiotopoulos, Chrysostomos

    2015-09-01

    To evaluate the clinical associations of adult-onset photosensitivity, we studied the clinical and EEG data of patients who were referred due to a possible first seizure and who had a photoparoxysmal response on their EEG. Patients with clinical evidence of photosensitivity before the age of 20 were excluded. Of a total of 30 patients, four had acute symptomatic seizures, two had vasovagal syncope, and 24 were diagnosed with epilepsy. Nine of the 24 patients had idiopathic (genetic) generalized epilepsies and predominantly generalized photoparoxysmal response, but also rare photically-induced seizures, while 15 had exclusively, or almost exclusively, reflex photically-induced occipital seizures with frequent secondary generalization and posterior photoparoxysmal response. Other important differences included a significantly older age at seizure onset and paucity of spontaneous interictal epileptic discharges in patients with photically-induced occipital seizures; only a quarter of these had occasional occipital spikes, in contrast to the idiopathic (genetic) generalized epilepsy patients with typically generalized epileptic discharges. On the other hand, both groups shared a positive family history of epilepsy, common seizure threshold modulators (such as tiredness and sleep deprivation), normal neurological examination and MRI, a generally benign course, and good response to valproic acid. We demonstrated that photosensitivity can first occur in adult life and manifest, either as idiopathic (possibly genetic) photosensitive occipital epilepsy with secondary generalization or as an EEG, and less often, a clinical/EEG feature of idiopathic (genetic) generalized epilepsies. Identification of idiopathic photosensitive occipital epilepsy fills a diagnostic gap in adult first-seizure epileptology and is clinically important because of its good response to antiepileptic drug treatment and fair prognosis.

  9. Computer Registration Becoming Mandatory

    CERN Multimedia

    2003-01-01

    Following the decision by the CERN Management Board (see Weekly Bulletin 38/2003), registration of all computers connected to CERN's network will be enforced and only registered computers will be allowed network access. The implementation has started with the IT buildings, continues with building 40 and the Prevessin site (as of Tuesday 4th November 2003), and will cover the whole of CERN before the end of this year. We therefore strongly recommend that you register all your computers in CERN's network database, including all network access cards (Ethernet AND wireless), as soon as possible without waiting for the access restriction to take force. This will allow you to access the network without interruption and will help IT service providers to contact you in case of problems (e.g. security problems, viruses, etc.) Users WITH a CERN computing account register at: http://cern.ch/register/ (CERN Intranet page) Visitors WITHOUT a CERN computing account (e.g. short term visitors) register at: http://cern.ch/regis...

  10. GPU-computing in econophysics and statistical physics

    Science.gov (United States)

    Preis, T.

    2011-03-01

    A recent trend in computer science and related fields is general purpose computing on graphics processing units (GPUs), which can yield impressive performance. With multiple cores connected by high memory bandwidth, today's GPUs offer resources for non-graphics parallel processing. This article provides a brief introduction to the field of GPU computing and includes examples. In particular, computationally expensive analyses employed in a financial market context are coded on a graphics card architecture, leading to a significant reduction of computing time. In order to demonstrate the wide range of possible applications, a standard model in statistical physics - the Ising model - is ported to a graphics card architecture as well, resulting in large speedup values.
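
    As a reference point for the Ising-model port mentioned above, the per-spin Metropolis update that GPU codes parallelize over a checkerboard decomposition can be sketched in NumPy. This is a CPU sketch of the algorithm, not the article's GPU code:

        import numpy as np

        rng = np.random.default_rng(0)

        def checkerboard_sweep(spins, beta):
            """One Metropolis sweep of a 2-D Ising model, done per sublattice:
            same-colour sites share no bonds, so they can update simultaneously,
            which is the property GPU ports exploit."""
            ii, jj = np.indices(spins.shape)
            for colour in (0, 1):
                mask = (ii + jj) % 2 == colour
                nbr = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0)
                       + np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
                dE = 2.0 * spins * nbr                    # cost of flipping each spin
                flip = mask & (rng.random(spins.shape) < np.exp(-beta * dE))
                spins[flip] *= -1
            return spins

        spins = rng.choice(np.array([-1, 1]), size=(64, 64))
        spins = checkerboard_sweep(spins, beta=0.44)      # near the critical point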

  11. Possible Subclinical Leaflet Thrombosis in Bioprosthetic Aortic Valves

    DEFF Research Database (Denmark)

    Makkar, Raj R; Fontana, Gregory; Jilaihawi, Hasan

    2015-01-01

    BACKGROUND: A finding of reduced aortic-valve leaflet motion was noted on computed tomography (CT) in a patient who had a stroke after transcatheter aortic-valve replacement (TAVR) during an ongoing clinical trial. This finding raised a concern about possible subclinical leaflet thrombosis...... patients and 1 of 115 patients, respectively; P=0.007). CONCLUSIONS: Reduced aortic-valve leaflet motion was shown in patients with bioprosthetic aortic valves. The condition resolved with therapeutic anticoagulation. The effect of this finding on clinical outcomes including stroke needs further...

  12. Computational Design of Batteries from Materials to Systems

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Kandler A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Santhanagopalan, Shriram [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Yang, Chuanbo [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Graf, Peter A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Usseglio Viretta, Francois L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Li, Qibo [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Finegan, Donal [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Pesaran, Ahmad A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Yao, Koffi (Pierre) [Argonne National Laboratory]; Abraham, Daniel [Argonne National Laboratory]; Dees, Dennis [Argonne National Laboratory]; Jansen, Andy [Argonne National Laboratory]; Mukherjee, Partha [Texas A&M University]; Mistry, Aashutosh [Texas A&M University]; Verma, Ankit [Texas A&M University]; Lamb, Josh [Sandia National Laboratories]; Darcy, Eric [NASA]

    2017-09-01

    Computer models are helping to accelerate the design and validation of next generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting electrochemical performance, thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification including under aging, and predicting the performance of a proposed electrode material recipe a priori using microstructure models.

  13. Pancreas divisum as a possible cause of misinterpretation in ERCP, computed tomography, ultrasound and barium meal

    International Nuclear Information System (INIS)

    Brambs, H.J.; Schuetz, B.; Wimmer, B.; Hoppe-Seyler, P.; Freiburg Univ.

    1986-01-01

    In 488 patients, endoscopic retrograde pancreatography (ERP) revealed a pancreas divisum in 21 (4.3%): in 17/21 patients we found a complete, in 4/21 an incomplete separation of the pancreatic ducts. The pancreas divisum is caused by a malfusion of the ductal system. On examination by ultrasound, computed tomography or hypotonic duodenography this variant can suggest an inflammation or tumour of the head of the pancreas; a definite diagnosis is possible by ERP only. Since the small ventral duct can be confused with an alteration caused by inflammation or by a tumour, too much contrast medium may be injected. Pancreas divisum is often associated with a chronic pancreatitis, which can be demonstrated via ERP of the dorsal duct through the accessory papilla. (orig.) [de

  14. Design and applications of Computed Industrial Tomographic Imaging System (CITIS)

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishna, G S; Kumar, Umesh; Datta, S S [Bhabha Atomic Research Centre, Bombay (India). Isotope Div.

    1994-12-31

    This paper highlights the design and development of a prototype Computed Tomographic (CT) imaging system and its software for image reconstruction, simulation and display. It also describes the results obtained with several test specimens, including a Dhruva reactor uranium fuel assembly, and the possibility of using neutrons as well as high-energy x-rays in computed tomography. 5 refs., 4 figs.

  15. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
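
    One member of this family of transformations (a Dubois-Prade style mapping; the abstract does not say which variants were compared) sends probabilities sorted in descending order, p_1 >= ... >= p_n, to the possibilities pi_i = p_i + p_{i+1} + ... + p_n. A short Python sketch of that one choice:

        def prob_to_poss(p):
            """Dubois-Prade style probability-to-possibility transformation:
            with probabilities ranked in descending order, pi_i is the tail sum
            from rank i onward. One illustrative variant only."""
            order = sorted(range(len(p)), key=lambda i: p[i], reverse=True)
            poss, tail = [0.0] * len(p), sum(p)
            for i in order:
                poss[i] = tail
                tail -= p[i]
            return poss

        print(prob_to_poss([0.5, 0.3, 0.2]))   # -> [1.0, 0.5, 0.2]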

  16. Structure problems in the analog computation

    International Nuclear Information System (INIS)

    Braffort, P.L.

    1957-01-01

    Recent mathematical developments have shown the importance of the elementary structures (algebraic, topological, etc.) underlying the great domains of classical analysis. Such structures in analog computation are brought into evidence, and possible developments of applied mathematics are discussed. The topological structures of the standard representations of analog schemes, such as addition triangles, integrators, phase inverters and function generators, are also studied. The analog method yields only functions of the variable time as the results of its computations, but the course of a computation, for systems including reactive circuits, introduces order structures which are called 'chronological'. Finally, it is shown that the approximation methods of ordinary numerical and digital computation present the same structures as analog computation. This structural analysis permits fruitful comparisons between the several domains of applied mathematics and suggests important new domains of application for the analog method. (M.P.)

  17. The implementation of AI technologies in computer wargames

    Science.gov (United States)

    Tiller, John A.

    2004-08-01

    Computer wargames involve the most in-depth analysis of general game theory. The enumerated turns of a game like chess are dwarfed by the exponentially larger possibilities of even a simple computer wargame. Implementing challenging AI in computer wargames is an important goal in both the commercial and military environments. In the commercial marketplace, customers demand a challenging AI opponent when they play a computer wargame and are frustrated by a lack of competence on the part of the AI. In the military environment, challenging AI opponents are important for several reasons. A challenging AI opponent will force military professionals to avoid routine or set-piece approaches to situations and cause them to think much more deeply about military situations before taking action. A good AI opponent would also include the national characteristics of the opponent being simulated, thus providing the military professional with even more of a challenge in planning and approach. Implementing current AI technologies in computer wargames is a technological challenge. The goal is to join the needs of AI in computer wargames with the solutions of current AI technologies. This talk will address several of those issues, possible solutions, and currently unsolved problems.

  18. Second Annual AEC Scientific Computer Information Exhange Meeting. Proceedings of the technical program theme: computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Peskin,A.M.; Shimamoto, Y.

    1974-01-01

    The topic of computer graphics serves well to illustrate that AEC affiliated scientific computing installations are well represented in the forefront of computing science activities. The participant response to the technical program was overwhelming--both in number of contributions and quality of the work described. Session I, entitled Advanced Systems, contains presentations describing systems that contain features not generally found in graphics facilities. These features can be roughly classified as extensions of standard two-dimensional monochromatic imaging to higher dimensions including color and time as well as multidimensional metrics. Session II presents seven diverse applications ranging from high energy physics to medicine. Session III describes a number of important developments in establishing facilities, techniques and enhancements in the computer graphics area. Although an attempt was made to schedule as many of these worthwhile presentations as possible, it appeared impossible to do so given the scheduling constraints of the meeting. A number of prospective presenters 'came to the rescue' by graciously withdrawing from the sessions. Some of their abstracts have been included in the Proceedings.

  19. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  20. Computer facilities for ISABELLE data handling

    International Nuclear Information System (INIS)

    Kramer, M.A.; Love, W.A.; Miller, R.J.; Zeller, M.

    1977-01-01

    The analysis of data produced by ISABELLE experiments will need a large system of computers. An official group of prospective users and operators of that system should begin planning now. Included in the array will be a substantial computer system at each ISABELLE intersection in use. These systems must include enough computer power to keep experimenters aware of the health of the experiment. This will require at least one very fast, sophisticated processor in the system, its size depending on the experiment. Other required features of the intersection systems are a good, high-speed graphic display, the ability to record data on magnetic tape at 500 to 1000 KB, and a high-speed link to a central computer. The operating system software must support multiple interactive users. A substantially larger-capacity computer system, shared by the six intersection region experiments, must be available with good turnaround for experimenters while ISABELLE is running. A computer support group will be required to maintain the computer system and to provide and maintain software common to all experiments. Special superfast computing hardware or special-function processors constructed with microprocessor circuitry may be necessary both in the data gathering and the data processing work. Thus both the local and central processors should be chosen with the possibility of interfacing such devices in mind.

  1. Automatization of physical experiments on-line with the MINSK-32 computer

    International Nuclear Information System (INIS)

    Fefilov, B.V.; Mikhushkin, A.V.; Morozov, V.M.; Sukhov, A.M.; Chelnokov, L.P.

    1978-01-01

    The system for data acquisition and processing in complex multi-dimensional experiments is described. The system includes autonomous modules in the CAMAC standard, the NAIRI-4 small computer and the MINSK-32 base computer. The NAIRI-4 computer carries out preliminary storage, data processing and experiment control. Its software includes the microprogram software of the NAIRI-4 computer, the software of the NAIRI-2 computer, the software of the PDP-11 computer, and the technological software on the ES computers. A crate controller and a display driver are connected to the main channel so that the NAIRI-4 computer can operate on line with the experimental devices. An input-output channel commutator, which converts the MINSK-32 computer levels to TTL levels and vice versa, was developed to widen the possibilities for connecting measurement modules to the MINSK-32 computer. The graphic display, based on the HP-1300A monitor with a light pen, is used for highly effective spectrum processing.

  2. Computational structural mechanics for engine structures

    Science.gov (United States)

    Chamis, C. C.

    1989-01-01

    The computational structural mechanics (CSM) program at Lewis encompasses: (1) fundamental aspects of formulating and solving structural mechanics problems, and (2) development of integrated software systems to computationally simulate the performance, durability, and life of engine structures. It is structured mainly to supplement, complement, and whenever possible replace costly experimental efforts, which are unavoidable during engineering research and development programs. Specific objectives include: investigating the unique advantages of parallel and multiprocessor systems for reformulating and solving structural mechanics problems and for formulating and solving multidisciplinary mechanics problems; and developing integrated structural system computational simulators for predicting structural performance, evaluating newly developed methods, and identifying and prioritizing improved or missing methods. Herein the CSM program is summarized with emphasis on the Engine Structures Computational Simulator (ESCS). Typical results obtained using ESCS are described to illustrate its versatility.

  3. CTmod—A toolkit for Monte Carlo simulation of projections including scatter in computed tomography

    Czech Academy of Sciences Publication Activity Database

    Malušek, Alexandr; Sandborg, M.; Alm Carlsson, G.

    2008-01-01

    Roč. 90, č. 2 (2008), s. 167-178 ISSN 0169-2607 Institutional research plan: CEZ:AV0Z10480505 Keywords : Monte Carlo * computed tomography * cone beam * scatter Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.220, year: 2008 http://dx.doi.org/10.1016/j.cmpb.2007.12.005

  4. The Voice as Computer Interface: A Look at Tomorrow's Technologies.

    Science.gov (United States)

    Lange, Holley R.

    1991-01-01

    Discussion of voice as the communications device for computer-human interaction focuses on voice recognition systems for use within a library environment. Voice technologies are described, including voice response and voice recognition; examples of voice systems in use in libraries are examined; and further possibilities, including use with…

  5. Including Internet insurance as part of a hospital computer network security plan.

    Science.gov (United States)

    Riccardi, Ken

    2002-01-01

    Cyber attacks on a hospital's computer network are a new crime to be reckoned with. Should your hospital consider Internet insurance? The author explains this new phenomenon and presents a risk assessment for determining network vulnerabilities.

  6. Possible Simple Structures of the Universe to Include General Relativity Effects

    Directory of Open Access Journals (Sweden)

    Corneliu BERBENTE

    2017-12-01

    Full Text Available The general theory of relativity describes the properties of the universe, with gravity playing a fundamental role. One uses a metric tensor g_{μν} in a Riemann space, which should be in agreement with a mass (or energy) tensor in order to satisfy the Einstein equation of general relativity [1]. This equation contains the Ricci curvature as well. In general, applications are done considering that a chosen metric is valid without regional limits. In fact, the density of energy, whose distribution is however unknown, is variable in the universe; therefore, the metrics need to be adapted to different regions. For this reason, the authors suggest starting from a simple, average mass-energy distribution that could represent the actual universe in a first step. This suggestion is in agreement with the symmetrical distribution of equal spheres existing in a model of the early universe given by one of the authors. Two kinds of distribution are given. The possibility of black hole formation is studied and a criterion is given.
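
    For readability, the Einstein equation invoked above, relating the Ricci curvature R_{μν}, its scalar R, the metric g_{μν} and the energy-momentum tensor T_{μν}, has the standard form (quoted here in the usual notation, not taken from the paper):

        R_{\mu\nu} - \tfrac{1}{2}\, R\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}.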

  7. Quantum computing without wavefunctions: time-dependent density functional theory for universal quantum computation.

    Science.gov (United States)

    Tempel, David G; Aspuru-Guzik, Alán

    2012-01-01

    We prove that the theorems of TDDFT can be extended to a class of qubit Hamiltonians that are universal for quantum computation. The theorems of TDDFT applied to universal Hamiltonians imply that single-qubit expectation values can be used as the basic variables in quantum computation and information theory, rather than wavefunctions. From a practical standpoint this opens the possibility of approximating observables of interest in quantum computations directly in terms of single-qubit quantities (i.e. as density functionals). Additionally, we also demonstrate that TDDFT provides an exact prescription for simulating universal Hamiltonians with other universal Hamiltonians that have different, and possibly easier-to-realize two-qubit interactions. This establishes the foundations of TDDFT for quantum computation and opens the possibility of developing density functionals for use in quantum algorithms.

  8. Computer tomography in otolaryngology

    International Nuclear Information System (INIS)

    Gradzki, J.

    1981-01-01

    The principles of design and operation of computer tomography, which has also been applied to the diagnosis of nose, ear and throat diseases, are discussed. Computer tomography makes possible the visualization of the structures of the nose, nasal sinuses and facial skeleton in transverse and coronal planes. The method enables an accurate evaluation of the position and size of neoplasms in these regions and the differentiation of inflammatory exudates from malignant masses. In otology, computer tomography is used particularly in the diagnosis of pontocerebellar angle tumours and otogenic brain abscesses. Computer tomography of the larynx and pharynx provides new diagnostic data owing to the possibility of obtaining transverse sections and visualizing cartilage. Computer tomograms of some cases are presented. (author)

  9. Description of the tasks of control room operators in German nuclear power plants and support possibilities by advanced computer systems

    International Nuclear Information System (INIS)

    Buettner, W.E.

    1984-01-01

    In the course of the development of nuclear power plants, the instrumentation and control systems and the amount of information in the control room have increased substantially. Against this background, it is described which operator tasks might be supported by advanced computer-aided systems, with the main emphasis on safety-related information and diagnosis facilities. Some of the systems under development may nevertheless be helpful for normal operation modes too. As far as possible, recommendations for the realization and testing of such systems are made. (orig.) [de

  10. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into the design of future computers in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  11. [Possibilities of computer graphics simulation in orthopedic surgery].

    Science.gov (United States)

    Kessler, P; Wiltfang, J; Teschner, M; Girod, B; Neukam, F W

    2000-11-01

    In addition to standard X-rays, photographic documentation, and cephalometric and model analysis, a computer-aided, three-dimensional (3D) simulation system has been developed in close cooperation with the Institute of Communications of the Friedrich-Alexander-Universität Erlangen-Nürnberg. With this simulation system, a photorealistic prediction of the expected soft tissue changes can be made. Prerequisites are a 3D reconstruction of the facial skeleton and a 3D laser scan of the face. After data reduction, the two data sets can be matched. Cutting planes enable the transposition of bony segments. The laser scan of the facial surface is coupled to the underlying bone via a five-layered soft tissue model, so that bone movements are transferred realistically to the soft tissue cover. Further research is necessary to replace the virtual subcutaneous soft tissue model with correct, topographic tissue anatomy.

  12. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  13. Recent advances in the reconstruction of cranio-maxillofacial defects using computer-aided design/computer-aided manufacturing.

    Science.gov (United States)

    Oh, Ji-Hyeon

    2018-12-01

    With the development of computer-aided design/computer-aided manufacturing (CAD/CAM) technology, it has become possible to reconstruct cranio-maxillofacial defects with more accurate preoperative planning, precise patient-specific implants (PSIs), and shorter operation times. The manufacturing processes include subtractive manufacturing and additive manufacturing and should be selected in consideration of the material type, available technology, post-processing, accuracy, lead time, properties, and surface quality. Materials such as titanium, polyethylene, polyetheretherketone (PEEK), hydroxyapatite (HA), poly-DL-lactic acid (PDLLA), polylactide-co-glycolide acid (PLGA), and calcium phosphate are used. Design methods for the reconstruction of cranio-maxillofacial defects include the use of a preoperative model printed from preoperative data, printing a cutting guide or template after virtual surgery, printing a model after virtual surgery from data reconstructed using a mirror image, and manufacturing PSIs directly from PSI data obtained after reconstruction using a mirror image. By selecting the appropriate design method, manufacturing process, and implant material according to the case, it is possible to obtain a more accurate surgical procedure, a reduced operation time, the prevention of various complications that can occur with the traditional method, and more predictable results.

  14. Learning Universal Computations with Spikes

    Science.gov (United States)

    Thalmeier, Dominik; Uhlmann, Marvin; Kappen, Hilbert J.; Memmesheimer, Raoul-Martin

    2016-01-01

    Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g. for locomotion. Many such computations require previous building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. Firstly, we derive constraints under which classes of spiking neural networks lend themselves as substrates of powerful general purpose computing. The networks contain dendritic or synaptic nonlinearities and have a constrained connectivity. We then combine such networks with learning rules for outputs or recurrent connections. We show that this allows the networks to learn even difficult benchmark tasks such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them. PMID:27309381
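
    To make the substrate concrete: the units in such networks are of broadly leaky integrate-and-fire type. A minimal Python simulation of one such unit, using generic textbook dynamics rather than the paper's full model with dendritic or synaptic nonlinearities:

        import numpy as np

        # Generic leaky integrate-and-fire neuron: dv/dt = (-v + R*I)/tau, with a
        # spike and reset whenever v crosses threshold. Textbook dynamics only.
        tau, R, v_th, v_reset, dt = 20.0, 1.0, 1.0, 0.0, 0.1  # ms and arbitrary units
        v, spike_times = 0.0, []
        drive = 1.2 + 0.1 * np.random.default_rng(1).normal(size=5000)  # noisy input
        for step, I in enumerate(drive):
            v += dt * (-v + R * I) / tau
            if v >= v_th:
                spike_times.append(step * dt)   # record the spike time
                v = v_reset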

  15. Learner Perceptions of Realism and Magic in Computer Simulations.

    Science.gov (United States)

    Hennessy, Sara; O'Shea, Tim

    1993-01-01

    Discusses the possible lack of credibility in educational interactive computer simulations. Topics addressed include "Shopping on Mars," a collaborative adventure game for arithmetic calculation that uses direct manipulation in the microworld; the Alternative Reality Kit, a graphical animated environment for creating interactive…

  16. Towards the computational design of solid catalysts

    DEFF Research Database (Denmark)

    Nørskov, Jens Kehlet; Bligaard, Thomas; Rossmeisl, Jan

    2009-01-01

    Over the past decade the theoretical description of surface reactions has undergone a radical development. Advances in density functional theory mean it is now possible to describe catalytic reactions at surfaces with the detail and accuracy required for computational results to compare favourably...... with experiments. Theoretical methods can be used to describe surface chemical reactions in detail and to understand variations in catalytic activity from one catalyst to another. Here, we review the first steps towards using computational methods to design new catalysts. Examples include screening for catalysts...

  17. Nuclear Computational Low Energy Initiative (NUCLEI)

    Energy Technology Data Exchange (ETDEWEB)

    Reddy, Sanjay K. [University of Washington

    2017-08-14

    This is the final report for the University of Washington for the NUCLEI SciDAC-3. The NUCLEI project, as defined by the scope of work, will develop, implement and run codes for large-scale computations of many topics in low-energy nuclear physics. The physics to be studied includes the properties of nuclei and nuclear decays, nuclear structure and reactions, and the properties of nuclear matter. The computational techniques to be used include Quantum Monte Carlo, Configuration Interaction, Coupled Cluster, and Density Functional methods. The research program will emphasize areas of high interest to current and possible future DOE nuclear physics facilities, including ATLAS and FRIB (nuclear structure and reactions, and nuclear astrophysics), TJNAF (neutron distributions in nuclei, few-body systems, and electroweak processes), NIF (thermonuclear reactions), MAJORANA and FNPB (neutrino-less double-beta decay and physics beyond the Standard Model), and LANSCE (fission studies).

  18. Reforming Lao Teacher Education to Include Females and Ethnic Minorities--Exploring Possibilities and Constraints

    Science.gov (United States)

    Berge, Britt-Marie; Chounlamany, Kongsy; Khounphilaphanh, Bounchanh; Silfver, Ann-Louise

    2017-01-01

    This article explores possibilities and constraints for the inclusion of female and ethnic minority students in Lao education in order to provide education for all. Females and ethnic minorities have traditionally been disadvantaged in Lao education and reforms for the inclusion of these groups are therefore welcome. The article provides rich…

  19. 'Cloud computing' and clinical trials: report from an ECRIN workshop.

    Science.gov (United States)

    Ohmann, Christian; Canham, Steve; Danielyan, Edgar; Robertshaw, Steve; Legré, Yannick; Clivio, Luca; Demotes, Jacques

    2015-07-29

    Growing use of cloud computing in clinical trials prompted the European Clinical Research Infrastructures Network, a European non-profit organisation established to support multinational clinical research, to organise a one-day workshop on the topic to clarify potential benefits and risks. The issues that arose in that workshop are summarised and include the following: the nature of cloud computing and the cloud computing industry; the risks in using cloud computing services now; the lack of explicit guidance on this subject, both generally and with reference to clinical trials; and some possible ways of reducing risks. There was particular interest in developing and using a European 'community cloud' specifically for academic clinical trial data. It was recognised that the day-long workshop was only the start of an ongoing process. Future discussion needs to include clarification of trial-specific regulatory requirements for cloud computing and involve representatives from the relevant regulatory bodies.

  20. Computer Graphics 2: More of the Best Computer Art and Design.

    Science.gov (United States)

    1994

    This collection of computer generated images aims to present media tools and processes, stimulate ideas, and inspire artists and art students working in computer-related design. The images are representative of state-of-the-art editorial, broadcast, packaging, fine arts, and graphic techniques possible through computer generation. Each image is…

  1. Diffuse abnormalities of the trachea: computed tomography findings

    International Nuclear Information System (INIS)

    Marchiori, Edson; Araujo Neto, Cesar de

    2008-01-01

    The aim of this pictorial essay was to present the main computed tomography findings seen in diffuse diseases of the trachea. The diseases studied included amyloidosis, tracheobronchopathia osteochondroplastica, tracheobronchomegaly, laryngotracheobronchial papillomatosis, lymphoma, neurofibromatosis, relapsing polychondritis, Wegener's granulomatosis, tuberculosis, paracoccidioidomycosis, and tracheobronchomalacia. The most common computed tomography finding was thickening of the walls of the trachea, with or without nodules, parietal calcifications, or involvement of the posterior wall. Although computed tomography allows the detection and characterization of diseases of the central airways, and the correlation with clinical data reduces the diagnostic possibilities, bronchoscopy with biopsy remains the most useful procedure for the diagnosis of diffuse lesions of the trachea. (author)

  2. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

    The usage of the computer code MLCOSP (Multiple Correlation and Spectrum) is described for a hybrid computer installed in JAERI. The functions of the hybrid computer and its terminal devices are utilized ingeniously in the code to reduce the complexity of the data handling that occurs in the analysis of multivariable experimental data, and to keep the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder; the computed results are displayed in figures, and hardcopies are taken when necessary; messages from the code are shown on the terminal, so man-machine communication is possible; and, further, data can be entered through a keyboard, so case studies based on the results of the analysis are possible. (auth.)
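
    The quantities the code produces, multiple (cross-)correlations and spectra, are nowadays routinely estimated with FFTs. A compact Python sketch of a cross-power-spectrum estimate, as a generic method rather than the MLCOSP algorithm itself:

        import numpy as np

        def cross_spectrum(x, y, dt):
            """Raw FFT-based cross-power-spectrum estimate of two records sampled
            at interval dt. Generic periodogram sketch, not the MLCOSP algorithm;
            practical use would segment and average (Welch's method)."""
            n = len(x)
            X = np.fft.rfft(x - np.mean(x))
            Y = np.fft.rfft(y - np.mean(y))
            freqs = np.fft.rfftfreq(n, dt)
            return freqs, np.conj(X) * Y * dt / n

        t = np.arange(4096) * 0.01
        x = np.sin(2 * np.pi * 5 * t) + np.random.default_rng(2).normal(size=t.size)
        y = np.roll(x, 3)                        # a delayed copy of x
        freqs, Pxy = cross_spectrum(x, y, 0.01)  # the phase of Pxy encodes the delay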

  3. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    Science.gov (United States)

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  4. Review of the RNA Interference Pathway in Molluscs Including Some Possibilities for Use in Bivalves in Aquaculture

    Directory of Open Access Journals (Sweden)

    Leigh Owens

    2015-03-01

    Full Text Available Generalised reviews of RNA interference (RNAi) in invertebrates, and of its use in aquaculture, have taken for granted that RNAi pathways operate in molluscs, but inspection of such reviews shows little specific evidence of such activity in molluscs. This review was undertaken to understand what specific research had been conducted on RNAi in molluscs, particularly with regard to aquaculture. The questions were whether RNAi in molluscs functions similarly to the paradigm established for most eukaryotes or, alternatively, is more similar to that of the Ecdysozoa, and how RNAi may relate to disease control in aquaculture. RNAi in molluscs appears to have been investigated in only about 14 species, mostly as a gene-silencing phenomenon. We can infer that microRNAs, including let-7, are functional in molluscs. The genes/proteins involved in the actual RNAi pathways have been investigated only rudimentarily, so how homologous the genes and proteins are to those of other metazoa is unknown, as is the number of different genes for each activity in the RNAi pathway. The cephalopods have been greatly overlooked, with only a single RNAi gene-silencing study found. The long dsRNA-linked interferon pathways seem to be present in molluscs, unlike in some other invertebrates, and could be used to reduce disease states in aquaculture. In particular, interferon regulatory factor genes have been found in molluscs of aquacultural importance such as Crassostrea, Mytilus, Pinctada and Haliotis. Two possible aquaculture scenarios, zoonotic norovirus and ostreid herpesvirus 1, are discussed to illustrate the possibilities. The entire field of RNAi in molluscs looks ripe for scientific exploitation and practical application.

  5. Computer Prediction of Air Quality in Livestock Buildings

    DEFF Research Database (Denmark)

    Svidt, Kjeld; Bjerg, Bjarne

    In modern livestock buildings the design of ventilation systems is important in order to obtain good air quality. The use of Computational Fluid Dynamics for predicting the air distribution makes it possible to include the effect of room geometry and heat sources in the design process. This paper... presents numerical prediction of air flow in a livestock building compared with laboratory measurements. An example of the calculation of contaminant distribution is given, and the future possibilities of the method are discussed....

  6. Current state and future direction of computer systems at NASA Langley Research Center

    Science.gov (United States)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

    Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased, there has been an equally dramatic reduction in cost. This constant cost-performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  7. Experience gained in using a computer-aided teaching system in Azov maritime institute

    Directory of Open Access Journals (Sweden)

    Олександр Миколайович Зиновченко

    2017-06-01

    Full Text Available A brief analysis of the known computer-based teaching methods is given. The computer-aided teaching system includes an interactive lecture, laboratory works, an application for online testing and evaluation of the assimilation of new knowledge, and software used by the teacher. The virtual lecture presents information as sound-tracked dynamic pictures accompanied by continual practical work that fixes the acquired knowledge in the student's mind; each teaching step in the virtual lecture is followed by practical work evaluated by the computer. Virtual labs make it possible to consolidate the new knowledge by practice; they provide for the individual activity of the student, monitor his progress and automatically evaluate his knowledge. These applications are installed on the student's computer. The computer applications for the teacher include a generator of tests for evaluating the assimilation of new knowledge, a base of typical problems, a generator of personal information files for each student, and an application forming the final mark of the student. The results of testing this teaching system show that it is efficient, making it possible to organize a flexible schedule of the educational process and cutting down the working hours of the teacher.

  8. Evaluating Computer Screen Time and Its Possible Link to Psychopathology in the Context of Age: A Cross-Sectional Study of Parents and Children.

    Science.gov (United States)

    Segev, Aviv; Mimouni-Bloch, Aviva; Ross, Sharon; Silman, Zmira; Maoz, Hagai; Bloch, Yuval

    2015-01-01

    Several studies have suggested that high levels of computer use are linked to psychopathology. However, there is ambiguity about what should be considered normal or over-use of computers. Furthermore, the nature of the link between computer usage and psychopathology is controversial. The current study utilized the context of age to address these questions. Our hypothesis was that the context of age would be paramount for differentiating normal from excessive use, and that this context would allow a better understanding of the link to psychopathology. In a cross-sectional study, 185 parents and children aged 3-18 years were recruited in clinical and community settings. They were asked to fill out questionnaires regarding demographics, functional and academic variables, and computer use, as well as psychiatric screening questionnaires. Using a regression model, we identified 3 groups of normal-use, over-use and under-use and examined known factors as putative differentiators between the over-users and the other groups. After modeling computer screen time according to age, the factors linked to over-use were: decreased socialization (OR 3.24, Confidence interval [CI] 1.23-8.55, p = 0.018), difficulty disengaging from the computer (OR 1.56, CI 1.07-2.28, p = 0.022) and age, though borderline-significant (OR 1.1 each year, CI 0.99-1.22, p = 0.058). While psychopathology was not linked to over-use, post-hoc analysis revealed that the link between increased computer screen time and psychopathology was age-dependent and solidified as age progressed (p = 0.007). Unlike computer usage, the use of small screens and smartphones was not associated with psychopathology. The results suggest that computer screen time follows an age-based course. We conclude that differentiating normal from over-use, as well as defining over-use as a possible marker for psychiatric difficulties, must be performed within the context of age. If verified by additional studies, future research should integrate

  9. Enhancing Electrical Troubleshooting Skills in a Computer-Coached Practice Environment.

    Science.gov (United States)

    Johnson, Scott D.; And Others

    1993-01-01

    This study examines the effect of the "Technical Troubleshooting Tutor," a computer-coached training program, on aircraft electrical system troubleshooting. Performance ability differences between control groups are noted, and troubleshooting models and flow diagram examples are included. The study demonstrates the possibilities for…

  10. Vibrating crystals as possible neutron monochromators

    International Nuclear Information System (INIS)

    Stoica, A.D.; Popovici, M.

    1983-09-01

    The Bragg reflection of neutrons by vibrating perfect crystals is considered. The additional possibilities offered by the Doppler effect for shaping neutron beams in the k-space are discussed. A simple model for computing the vibrating crystal reflectivity is proposed. (author)

  11. Advanced computations in plasma physics

    International Nuclear Information System (INIS)

    Tang, W.M.

    2002-01-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  12. From humans to computers cognition through visual perception

    CERN Document Server

    Alexandrov, Viktor Vasilievitch

    1991-01-01

    This book considers computer vision to be an integral part of the artificial intelligence system. The core of the book is an analysis of possible approaches to the creation of artificial vision systems, which simulate human visual perception. Much attention is paid to the latest achievements in visual psychology and physiology, the description of the functional and structural organization of the human perception mechanism, the peculiarities of artistic perception and the expression of reality. Computer vision models based on these data are investigated. They include the processes of external d

  13. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  14. Computer simulation on molten ionic salts

    International Nuclear Information System (INIS)

    Kawamura, K.; Okada, I.

    1978-01-01

    The extensive advances in computer technology have made it possible to apply computer simulation to the evaluation of the macroscopic and microscopic properties of molten salts. The evaluation of the potential energy in molten salt systems is complicated by the presence of long-range energy, i.e. Coulomb energy, in contrast to simple liquids where the potential energy is easily evaluated. It has been shown, however, that no difficulties are encountered when the Ewald method is applied to the evaluation of Coulomb energy. After a number of attempts had been made to approximate the pair potential, the Huggins-Mayer potential based on ionic crystals became the most often employed. Since it is thought that the only appreciable contribution to the many-body potential, not included in the Huggins-Mayer potential, arises from the internal electrostatic polarization of ions in molten ionic salts, computer simulation with a provision for ion polarization has been tried recently. The computations, which are employed mainly for molten alkali halides, can provide: (1) thermodynamic data such as internal energy, internal pressure and isothermal compressibility; (2) microscopic configurational data such as radial distribution functions; (3) transport data such as the diffusion coefficient and electrical conductivity; and (4) spectroscopic data such as the intensity of inelastic scattering and the stretching frequency of simple molecules. The computed results seem to agree well with the measured results. Computer simulation can also be used to test the effectiveness of a proposed pair potential and the adequacy of postulated models of molten salts, and to obtain experimentally inaccessible data. A further application of MD computation employing the pair potential based on an ionic model to BeF2, ZnCl2 and SiO2 shows the possibility of quantitative interpretation of structures and glass transformation phenomena
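
    As a hedged illustration of the pair-potential evaluation described above (not the authors' code; parameter values are illustrative, not fitted), a minimal sketch of the Huggins-Mayer (Born-Mayer-Huggins) form commonly used for alkali halides. A real periodic simulation would evaluate the Coulomb term with the Ewald method rather than the bare truncated form used here:

        import numpy as np

        def huggins_mayer(r, q_i, q_j, A, rho, C, D):
            """Huggins-Mayer pair potential: Born exponential repulsion,
            van der Waals attraction terms, and a bare Coulomb term."""
            repulsion = A * np.exp(-r / rho)
            dispersion = -C / r**6 - D / r**8
            coulomb = q_i * q_j / r
            return repulsion + dispersion + coulomb

        # Illustrative (not fitted) parameters for a cation-anion pair:
        r = np.linspace(2.0, 8.0, 200)  # separation
        u = huggins_mayer(r, +1.0, -1.0, A=2000.0, rho=0.33, C=100.0, D=50.0)
        print("potential minimum near r =", r[np.argmin(u)])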

  15. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.
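
    The serial follow-up changes mentioned above are often summarized quantitatively as a volume doubling time. As a hedged illustration (a standard formula in the nodule literature, not code from the article itself):

        import math

        def volume_doubling_time(v1_mm3, v2_mm3, dt_days):
            """Volume doubling time from two volume measurements taken
            dt_days apart; shorter VDTs raise suspicion of malignancy."""
            return dt_days * math.log(2) / math.log(v2_mm3 / v1_mm3)

        # Example: a nodule growing from 110 mm^3 to 180 mm^3 in 90 days.
        print(f"VDT = {volume_doubling_time(110, 180, 90):.0f} days")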

  16. Constructing Precisely Computing Networks with Biophysical Spiking Neurons.

    Science.gov (United States)

    Schwemmer, Michael A; Fairhall, Adrienne L; Denéve, Sophie; Shea-Brown, Eric T

    2015-07-15

    While spike timing has been shown to carry detailed stimulus information at the sensory periphery, its possible role in network computation is less clear. Most models of computation by neural networks are based on population firing rates. In equivalent spiking implementations, firing is assumed to be random such that averaging across populations of neurons recovers the rate-based approach. Recently, however, Denéve and colleagues have suggested that the spiking behavior of neurons may be fundamental to how neuronal networks compute, with precise spike timing determined by each neuron's contribution to producing the desired output (Boerlin and Denéve, 2011; Boerlin et al., 2013). By postulating that each neuron fires to reduce the error in the network's output, it was demonstrated that linear computations can be performed by networks of integrate-and-fire neurons that communicate through instantaneous synapses. This left open, however, the possibility that realistic networks, with conductance-based neurons with subthreshold nonlinearity and the slower timescales of biophysical synapses, may not fit into this framework. Here, we show how the spike-based approach can be extended to biophysically plausible networks. We then show that our network reproduces a number of key features of cortical networks including irregular and Poisson-like spike times and a tight balance between excitation and inhibition. Lastly, we discuss how the behavior of our model scales with network size or with the number of neurons "recorded" from a larger computing network. These results significantly increase the biological plausibility of the spike-based approach to network computation. We derive a network of neurons with standard spike-generating currents and synapses with realistic timescales that computes based upon the principle that the precise timing of each spike is important for the computation. We then show that our network reproduces a number of key features of cortical networks
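
    A hedged, drastically simplified sketch of the "spike to reduce the output error" principle described above (a scalar signal, instantaneous synapses, and greedy one-spike-per-step dynamics rather than the paper's conductance-based model; all parameters are illustrative):

        import numpy as np

        rng = np.random.default_rng(1)
        dt, lam = 1e-3, 10.0
        steps = 2000
        N = 20
        w = rng.choice([-0.1, 0.1], size=N)   # decoding weight of each neuron
        thresh = w**2 / 2                     # firing threshold per neuron

        r = np.zeros(N)                       # leaky-filtered spike trains
        errs = []
        for t in range(steps):
            x = np.sin(2 * np.pi * t * dt)    # target signal to encode
            x_hat = w @ r                     # decoded network estimate
            V = w * (x - x_hat)               # membrane potential = projected error
            i = int(np.argmax(V - thresh))    # neuron furthest above threshold
            if V[i] > thresh[i]:              # fire only if the spike reduces error
                r[i] += 1.0
            r -= dt * lam * r                 # leak of the read-out filter
            errs.append((x - w @ r) ** 2)

        print(f"mean squared tracking error: {np.mean(errs):.4f}")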

  17. Noninvasive coronary angioscopy using electron beam computed tomography and multidetector computed tomography

    NARCIS (Netherlands)

    van Ooijen, PMA; Nieman, K; de Feyter, PJ; Oudkerk, M

    2002-01-01

    With the advent of noninvasive coronary imaging techniques like multidetector computed tomography and electron beam computed tomography, new representation methods such as intracoronary visualization have been introduced. We explore the possibilities of these novel visualization techniques and

  18. Progress report of Physics Division including Applied Mathematics and Computing Section. 1st October 1970 - 31st March 1971

    International Nuclear Information System (INIS)

    2004-01-01

    The initial MOATA safety assessment was based on data and calculations available before the advent of multigroup diffusion theory codes in two dimensions. That assessment is being revised and extended to gain approval for 100 kW operation. The more detailed representation obtained in the new calculations has resulted in a much better understanding of the physics of this reactor. The properties of the reactor are determined to a large extent by neutron leakage from the rather thin core tanks. In particular the effect of leakage on the coupling between the core tanks and on reactivity coefficients has been clarified and quantified. In neutron data studies, the theoretical fission product library was revised, checked against available experimental values and distributed to interested overseas centres. Some further nubar work was done with much better neutron energy resolution, and confirmed our earlier measurements. A promising formulation of R matrix theory of nuclear interaction is expected to lead to a simpler multilevel resonance parameter description. With large amounts of digital data being collected, displayed and used by theoreticians and experimentalists, more attention was given to visual interactive computer displays. This interest is generating constructive proposals for use of the dataway now being installed between the Division and the IBM 360/50 computer. The study of gamma rays following the capture of keV neutrons continues to reveal new and interesting features of the physical processes involved. A detailed international compilation of the gamma rays emitted and their intensities is in progress. The work on nickel-68, amongst others, has enabled a partial capture cross section to be generated from the gamma ray parameters obtained by experiment. Much work still remains to be done, possibly at other establishments with more extensive facilities. The electrical and mechanical components of our new zero power split table machine for reactor physics assemblies

  19. Dry eye syndrome among computer users

    Science.gov (United States)

    Gajta, Aurora; Turkoanje, Daniela; Malaescu, Iosif; Marin, Catalin-Nicolae; Koos, Marie-Jeanne; Jelicic, Biljana; Milutinovic, Vuk

    2015-12-01

    Dry eye syndrome is characterized by eye irritation due to changes of the tear film. Symptoms include itching, foreign body sensations, mucous discharge and transitory vision blurring. Less frequent symptoms include photophobia and eye tiredness. The aim of the work was to determine the quality of the tear film and the potential risk of ocular dryness in persons who spend more than 8 hours a day using computers, and to look for possible correlations between severity of symptoms (dry eye symptom anamnesis) and clinical signs assessed by: Schirmer test I, TBUT (tear break-up time), and TFT (tear ferning test). The results show that computer users have significantly shorter TBUT (less than 5 s for 56% of subjects and less than 10 s for 37% of subjects), and TFT type II/III in 50% of subjects and type III in 31% of subjects, compared to computer non-users (TFT type I or II was present in 85.71% of subjects). Visual display terminal use for more than 8 hours daily has been identified as a significant risk factor for dry eye. All persons who spend substantial time using computers are advised to use artificial tear drops in order to minimize the symptoms of dry eye syndrome and prevent serious complications.

  20. Brain-Computer Symbiosis

    Science.gov (United States)

    Schalk, Gerwin

    2009-01-01

    The theoretical groundwork of the 1930’s and 1940’s and the technical advance of computers in the following decades provided the basis for dramatic increases in human efficiency. While computers continue to evolve, and we can still expect increasing benefits from their use, the interface between humans and computers has begun to present a serious impediment to full realization of the potential payoff. This article is about the theoretical and practical possibility that direct communication between the brain and the computer can be used to overcome this impediment by improving or augmenting conventional forms of human communication. It is about the opportunity that the limitations of our body’s input and output capacities can be overcome using direct interaction with the brain, and it discusses the assumptions, possible limitations, and implications of a technology that I anticipate will be a major source of pervasive changes in the coming decades. PMID:18310804

  1. About possibility of temperature trace observing on a human skin through clothes by using computer processing of IR image

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Shestakov, Ivan L.; Blednov, Roman G.

    2017-05-01

    One urgent security problem is the detection of objects placed inside the human body. Obviously, for safety reasons one cannot use X-rays widely and often for such object detection. For this purpose, we propose to use a THz camera and an IR camera. Below we continue to investigate the possibility of using an IR camera to detect a temperature trace on a human body. In contrast to a passive THz camera, the IR camera does not allow one to see an object under clothing very distinctly. Of course, this is a big disadvantage for a security solution based on the IR camera. To find possible ways of overcoming this disadvantage, we perform experiments with an IR camera produced by FLIR Company and develop a novel approach for computer processing of the images it captures. This allows us to increase the effective temperature resolution of the IR camera as well as to enhance what the human eye can effectively perceive. As a consequence, it becomes possible to see a change in human body temperature through clothing. We analyze IR images of a person who drinks water and eats chocolate, and we follow the temperature trace on the skin caused by the temperature change inside the body. Some experiments are made observing the temperature trace of objects placed behind a thick overall. The demonstrated results are very important for the detection of forbidden objects concealed inside the human body, using non-destructive control without X-rays.

  2. Computer applications in thermochemistry

    International Nuclear Information System (INIS)

    Vana Varamban, S.

    1996-01-01

    Knowledge of equilibrium is needed in many practical situations. Simple stoichiometric calculations can be performed with hand calculators, but multi-component, multi-phase gas-solid chemical equilibrium calculations are far beyond conventional devices and methods; iterative techniques have to be resorted to. Such problems are most elegantly handled by modern computers. This report demonstrates the possible use of computers for chemical equilibrium calculations in the field of thermochemistry and chemical metallurgy. Four modules are explained: fitting experimental Cp data and generating the thermal functions; performing equilibrium calculations for the defined conditions; preparing the elaborate input to the equilibrium calculation; and analysing the calculated results graphically. The principles of thermochemical calculations are briefly described. An extensive input guide is given. Several illustrations are included to help understanding and usage. (author)
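
    As a hedged sketch of the first module (fitting Cp data and generating thermal functions; the report's own formulation is not reproduced here, and the data below are synthetic), a least-squares fit to the common Cp(T) = a + bT + c/T^2 form, followed by the enthalpy increment obtained by integrating it:

        import numpy as np

        # Synthetic heat-capacity data (J/mol/K) over a temperature range (K).
        T = np.array([300.0, 400.0, 500.0, 600.0, 800.0, 1000.0])
        Cp = np.array([44.1, 47.3, 49.2, 50.5, 52.3, 53.5])

        # Fit Cp(T) = a + b*T + c/T^2 by linear least squares.
        A = np.column_stack([np.ones_like(T), T, 1.0 / T**2])
        a, b, c = np.linalg.lstsq(A, Cp, rcond=None)[0]

        def enthalpy_increment(T1, T2):
            """H(T2) - H(T1) = integral of the fitted Cp from T1 to T2."""
            H = lambda t: a * t + b * t**2 / 2 - c / t
            return H(T2) - H(T1)

        print(f"Cp(T) = {a:.2f} + {b:.4f} T + {c:.3g}/T^2")
        print(f"H(1000) - H(298.15) = {enthalpy_increment(298.15, 1000.0):.0f} J/mol")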

  3. Computation of transverse muon-spin relaxation functions including trapping-detrapping reactions, with application to electron-irradiated tantalum

    International Nuclear Information System (INIS)

    Doering, K.P.; Aurenz, T.; Herlach, D.; Schaefer, H.E.; Arnold, K.P.; Jacobs, W.; Orth, H.; Haas, N.; Seeger, A.; Max-Planck-Institut fuer Metallforschung, Stuttgart

    1986-01-01

    A new technique for the economical evaluation of transverse muon spin relaxation functions in situations involving μ+ trapping at and detrapping from crystal defects is applied to electron-irradiated Ta exhibiting relaxation maxima at about 35 K, 100 K, and 250 K. The long-range μ+ diffusion is shown to be limited by traps over the entire temperature range investigated. The (static) relaxation rates for several possible configurations of trapped muons are discussed, including the effect of the simultaneous presence of a proton in a vacancy. (orig.)

  4. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  5. “Future Directions”: m-government computer systems accessed via cloud computing – advantages and possible implementations

    OpenAIRE

    Daniela LIŢAN

    2015-01-01

    In recent years, the activities of companies and Public Administration had been automated and adapted to the current information system. Therefore, in this paper, I will present and exemplify the benefits of m-government computer systems development and implementation (which can be accessed from mobile devices and which are specific to the workflow of Public Administrations) starting from the “experience” of e-government systems implementation in the context of their access and usage through ...

  6. Workstation computer systems for in-core fuel management

    International Nuclear Information System (INIS)

    Ciccone, L.; Casadei, A.L.

    1992-01-01

    The advancement of powerful engineering workstations has made it possible to have thermal-hydraulics and accident analysis computer programs operating efficiently with a significant performance/cost ratio compared to large mainframe computers. Today, nuclear utilities are acquiring independent engineering analysis capability for fuel management and safety analyses. Computer systems currently available to utility organizations vary widely, thus requiring that this software be operational on a number of computer platforms. Recognizing these trends, Westinghouse adopted a software development life cycle process for its software development activities which strictly controls the development, testing and qualification of design computer codes. In addition, software standards to ensure maximum portability were developed and implemented, including adherence to FORTRAN 77 and use of uniform system interface and auxiliary routines. A comprehensive test matrix was developed for each computer program to ensure that the evolution of code versions preserves the licensing basis. In addition, the results of such test matrices establish the Quality Assurance basis and consistency for the same software operating on different computer platforms. (author). 4 figs

  7. Factors affecting the possibility to detect buccal bone condition around dental implants using cone beam computed tomography

    DEFF Research Database (Denmark)

    Liedke, Gabriela S; Spin-Neto, Rubens; da Silveira, Heloisa E D

    2016-01-01

    OBJECTIVES: To evaluate factors with impact on the conspicuity (possibility to detect) of the buccal bone condition around dental implants in cone beam computed tomography (CBCT) imaging. MATERIAL AND METHODS: Titanium (Ti) or zirconia (Zr) implants and abutments were inserted into 40 bone blocks in a way to obtain variable buccal bone thicknesses. Three combinations regarding the implant-abutment metal (TiTi, TiZr, or ZrZr) and the number of implants (one, two, or three) were assessed. Two CBCT units (Scanora 3D - Sc and Cranex 3D - Cr) and two voxel resolutions (0.2 and 0.13 mm) were used ... variable. Odds ratios (OR) were calculated separately for each CBCT unit. RESULTS: Implant-abutment combination (ZrZr) (OR Sc = 19.18, OR Cr = 11.89) and number of implants (3) (OR Sc = 12.10, OR Cr = 4.25) had major impact on buccal bone conspicuity. The thinner the buccal bone, the higher the risk

  8. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  9. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  10. The WECHSL-Mod3 code: A computer program for the interaction of a core melt with concrete including the long term behavior. Model description and user's manual

    International Nuclear Information System (INIS)

    Foit, J.J.; Adroguer, B.; Cenerino, G.; Stiefel, S.

    1995-02-01

    The WECHSL-Mod3 code is a mechanistic computer code developed for the analysis of the thermal and chemical interaction of initially molten reactor materials with concrete in a two-dimensional as well as in a one-dimensional, axisymmetrical concrete cavity. The code performs calculations from the time of initial contact of a hot molten pool over the start of solidification processes until long term basemat erosion over several days, with the possibility of basemat penetration. It is assumed that an underlying metallic layer exists covered by an oxidic layer, or that only one oxidic layer is present which can contain a homogeneously dispersed metallic phase. Heat generation in the melt is by decay heat and chemical reactions from metal oxidation. Energy is lost to the melting concrete and to the upper containment by radiation or evaporation of sump water possibly flooding the surface of the melt. Thermodynamic and transport properties as well as criteria for heat transfer and solidification processes are internally calculated for each time step. Heat transfer is modelled taking into account the high gas flux from the decomposing concrete and the heat conduction in the crusts possibly forming in the long term at the melt/concrete interface. The CALTHER code (developed at CEA, France), which models the radiative heat transfer from the upper surface of the corium melt to the surrounding cavity, is implemented in the present WECHSL version. The WECHSL code in its present version was validated by the BETA, ACE and SURC experiments. The test samples include BETA and SURC2 post-test calculations and a WECHSL application to a reactor accident. (orig.)

  11. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  12. Physics of quantum computation

    International Nuclear Information System (INIS)

    Belokurov, V.V.; Khrustalev, O.A.; Sadovnichij, V.A.; Timofeevskaya, O.D.

    2003-01-01

    In the paper, the modern status of the theory of quantum computation is considered. The fundamental principles of quantum computers and their basic notions such as quantum processors and computational basis states of the quantum Turing machine as well as the quantum Fourier transform are discussed. Some possible experimental realizations on the basis of NMR methods are given

  13. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics, with emphasis on algorithmic advances that will allow re-application in other...

  14. Computing on the grid and in the cloud

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    "The results today are only possible because of the extraordinary performance of the accelerators, including the infrastructure, the experiments, and the Grid computing." These were the words of the CERN Director General Rolf Heuer when the observation of a new particle consistent with a Higgs Boson was revealed to the world on the 4th July 2012. The end result of the all investments made to build and operate the LHC is the data that are recorded and the knowledge that can be extracted. It is the role of the global computing infrastructure to unlock the value that is encapsulated in the data. This lecture provides a detailed overview of the Worldwide LHC Computing Grid, an international collaboration to distribute and analyse the LHC data.

  15. Computing on the grid and in the cloud

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    "The results today are only possible because of the extraordinary performance of the accelerators, including the infrastructure, the experiments, and the Grid computing." These were the words of the CERN Director General Rolf Heuer when the observation of a new particle consistent with a Higgs Boson was revealed to the world on the 4th July 2012. The end result of the all investments made to build and operate the LHC is the data that are recorded and the knowledge that can be extracted. It is the role of the global computing infrastructure to unlock the value that is encapsulated in the data. This lecture provides a detailed overview of the Worldwide LHC Computing Grid, an international collaboration to distribute and analyse the LHC data.

  16. Blind Quantum Computation

    DEFF Research Database (Denmark)

    Salvail, Louis; Arrighi, Pablo

    2006-01-01

    We investigate the possibility of "having someone carry out the work of executing a function for you, but without letting him learn anything about your input". Say Alice wants Bob to compute some known function f upon her input x, but wants to prevent Bob from learning anything about x. The situation arises for instance if client Alice has limited computational resources in comparison with mistrusted server Bob, or if x is an inherently mobile piece of data. Could there be a protocol whereby Bob is forced to compute f(x) "blindly", i.e. without observing x? We provide such a blind computation protocol for the class of functions which admit an efficient procedure to generate random input-output pairs, e.g. factorization. The cheat-sensitive security achieved relies only upon quantum theory being true. The security analysis carried out assumes the eavesdropper performs individual attacks.

  17. Sex differences in perceived attributes of computer-mediated communication.

    Science.gov (United States)

    Harper, Vernon B

    2003-02-01

    Researchers have pointed to the influence of sex with respect to the attributes of the computer medium. The author elaborates upon possible sex differences in reference to perceived attributes of the computer medium, i.e., Richness, Accessibility, Velocity, Interactivity, Plasticity, and Immediacy. Data from both a pilot and main study are reported and interpreted. The pilot study included 78 participants, while the main study involved 211. The independent samples were composed of Communication Studies students enrolled at two Mid-Atlantic universities. Nine items with anchors of 1: strongly disagree and 7: strongly agree were taken from the 2000 Computer Mediated Communication Competence Scale of Spitzberg to assess the attributes of computer-mediated interaction. The results indicate that women scored higher than men on perceptions of Accessibility, Velocity, Interactivity, and Immediacy.

  18. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    Directory of Open Access Journals (Sweden)

    Dang Hung

    2017-07-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using a generic solution such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation efficiency, it is critical to keep trusted code bases lean, for large ones are unwieldy to vet and verify. In this paper, we advocate a simple approach wherein many basic algorithms (e.g., sorting) can be made privacy-preserving by adding a step that securely scrambles the data before feeding it to the original algorithms. We call this approach Scramble-then-Compute (StC), and give a sufficient condition whereby existing external memory algorithms can be made privacy-preserving via StC. This approach facilitates code reuse, and its simplicity contributes to a smaller trusted code base. It is also general, allowing algorithm designers to leverage an extensive body of known efficient algorithms for better performance. Our experiments show that StC could offer up to 4.1× speedups over known, application-specific alternatives.
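
    A hedged toy sketch of the scramble-then-compute idea (not the paper's system, which runs inside a trusted environment and uses an oblivious shuffle): permute the records with a secret random permutation before handing them to an unmodified algorithm, so the data movement of the subsequent computation is decoupled from the original input order:

        import secrets

        def secure_scramble(records):
            """Fisher-Yates shuffle driven by a cryptographic RNG.
            A real StC deployment would use an oblivious shuffle so that
            even the shuffle's own memory accesses leak nothing."""
            records = list(records)
            for i in range(len(records) - 1, 0, -1):
                j = secrets.randbelow(i + 1)
                records[i], records[j] = records[j], records[i]
            return records

        def scramble_then_sort(records, key=None):
            # Step 1: scramble, so the sort's access pattern no longer
            # reveals anything about the original input order.
            # Step 2: run the original, unmodified algorithm.
            return sorted(secure_scramble(records), key=key)

        print(scramble_then_sort([5, 3, 9, 1, 7]))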

  19. Method for Statically Checking an Object-oriented Computer Program Module

    Science.gov (United States)

    Bierhoff, Kevin M. (Inventor); Aldrich, Jonathan (Inventor)

    2012-01-01

    A method for statically checking an object-oriented computer program module includes the step of identifying objects within a computer program module, at least one of the objects having a plurality of references thereto, possibly from multiple clients. A discipline of permissions is imposed on the objects identified within the computer program module. The permissions enable tracking, from among a discrete set of changeable states, a subset of states each object might be in. A determination is made regarding whether the imposed permissions are violated by a potential reference to any of the identified objects. The results of the determination are output to a user.
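
    As a hedged illustration of the idea (a small dynamic simulation of what the patented method checks statically; the class, method names, and permission scheme below are hypothetical):

        class PermissionViolation(Exception):
            """Raised when a reference is used outside its permitted states."""

        class TrackedObject:
            """Tracks, from a discrete set of changeable states, the subset
            of states an object might currently be in."""
            def __init__(self, possible_states):
                self.possible_states = set(possible_states)

            def require(self, allowed_states):
                # A reference may perform an operation only if every state
                # the object might be in is allowed for that operation.
                bad = self.possible_states - set(allowed_states)
                if bad:
                    raise PermissionViolation(f"object may be in {bad}")

            def transition(self, new_state):
                self.possible_states = {new_state}

        # A file-like object that might be open or closed:
        f = TrackedObject({"open", "closed"})
        f.transition("open")
        f.require({"open"})        # a read is fine: the object must be open
        f.transition("closed")
        try:
            f.require({"open"})    # violation: reading a possibly-closed file
        except PermissionViolation as e:
            print("permission violated:", e)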

  20. INTERRUPTION TO COMPUTING SERVICES, SATURDAY 9 FEBRUARY

    CERN Multimedia

    2002-01-01

    In order to allow the rerouting of electrical cables which power most of the B513 Computer Room, there will be a complete shutdown of central computing services on Saturday 9th February. This shutdown affects all Central Computing services, including all NICE services (for Windows 95, Windows NT and Windows 2000), Mail and Web services, sitewide printing services, all Unix interactive and batch services, the ACB service, all AIS services and databases (such as EDH, BHT, CFU and HR), dedicated Engineering services, and general purpose database services. Services will be run down progressively from early on Saturday morning and reestablished as soon as possible, starting in the afternoon. However, it is unlikely that full computing services will be available before the Saturday evening. For operational reasons, some services may be shut down on the evening of Friday 8th February and restarted on Monday 11th February. More detailed information about the stoppage and restart schedules will be given nearer...

  1. INTERRUPTION TO COMPUTING SERVICES, SATURDAY 9 FEBRUARY

    CERN Multimedia

    2002-01-01

    In order to allow the rerouting of electrical cables which power most of the B513 Computer Room, there will be a complete shutdown of central computing services on Saturday 9th February. This shutdown affects all Central Computing services, including all NICE services (for Windows 95, Windows NT and Windows 2000), Mail and Web services, sitewide printing services, all Unix interactive and batch services, the ACB service, all AIS services and databases (such as EDH, BHT, CFU and HR), dedicated Engineering services, and general purpose database services. Services will be run down progressively from early on Saturday morning and reestablished as soon as possible, starting in the afternoon. However, it is unlikely that full computing services will be available before the Saturday evening. For operational reasons, some services may be shut down on the evening of Friday 8th February and restarted on Monday 11th February. More detailed information about the stoppage and restart schedules will be given nearer ...

  2. The Computational Materials Repository

    DEFF Research Database (Denmark)

    Landis, David D.; Hummelshøj, Jens S.; Nestorov, Svetlozar

    2012-01-01

    The possibilities for designing new materials based on quantum physics calculations are rapidly growing, but these design efforts lead to a significant increase in the amount of computational data created. The Computational Materials Repository (CMR) addresses this data challenge and provides...

  3. Taxonomy of cloud computing services

    NARCIS (Netherlands)

    Hoefer, C.N.; Karagiannis, Georgios

    2010-01-01

    Cloud computing is a highly discussed topic, and many big players of the software industry are entering the development of cloud services. Several companies want to explore the possibilities and benefits of cloud computing, but with the amount of cloud computing services increasing quickly, the need

  4. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    The inherently limited processing power and battery lifetime of mobile phones hinder the execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware, which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications which involve costly computations can benefit from offloading, with around 95% energy savings and significant performance gains compared to local execution only.
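
    A hedged sketch of the kind of partitioning decision such middleware makes (the actual MACS cost model is not reproduced here; all parameter names and values are illustrative): offload a task when the estimated remote time plus transfer time beats local execution, and likewise for energy:

        def should_offload(input_bytes, local_time_s, remote_time_s,
                           bandwidth_bps, local_power_w, radio_power_w):
            """Return True if offloading saves both time and energy.
            All inputs are estimates gathered by resource monitoring."""
            transfer_time = 8 * input_bytes / bandwidth_bps
            offload_time = transfer_time + remote_time_s
            local_energy = local_power_w * local_time_s
            # Assume the device mostly idles while the remote part runs.
            offload_energy = radio_power_w * transfer_time
            return offload_time < local_time_s and offload_energy < local_energy

        # A 2 MB video-analysis task: 12 s locally vs 1.5 s remotely over 10 Mbit/s.
        print(should_offload(2_000_000, 12.0, 1.5, 10_000_000, 2.5, 1.2))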

  5. A Global Computing Grid for LHC; Una red global de computacion para LHC

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Calama, J. M.; Colino Arriero, N.

    2013-06-01

    An innovative computing infrastructure has played an instrumental role in the recent discovery of the Higgs boson in the LHC and has enabled scientists all over the world to store, process and analyze enormous amounts of data in record time. The Grid computing technology has made it possible to integrate computing center resources spread around the planet, including the CIEMAT, into a distributed system where these resources can be shared and accessed via Internet on a transparent, uniform basis. A global supercomputer for the LHC experiments. (Author)

  6. CERN Computing Colloquium | Computer Security in 2016: Where are we and what to expect | 8 February

    CERN Multimedia

    2016-01-01

    Computer Security in 2016: Where are we and what to expect  by Sebastian Lopienski, CERN-IT Monday 8 February from 11 a.m. to 12 p.m http://cseminar.web.cern.ch/cseminar/ at CERN, Council Chamber (503-1-001)  Description: Attacks against computer systems, belonging both to individuals and organisations, are an everyday reality. How many times have we heard about supposedly well protected companies and online services at the mercy of cyber criminals, or governments accusing other nation states of cyber espionage. Only the most serious breaches and biggest data leaks continue to make the headlines. But really, how secure is our data, computers and networks? What is happening behind the scenes? Is it actually possible to avoid the vulnerabilities, or detect the resulting exploits? This talk will address these questions and provide a high-level overview of security trends in the last year or two. It will include information on emerging typ...

  7. Computation as Medium

    DEFF Research Database (Denmark)

    Jochum, Elizabeth Ann; Putnam, Lance

    2017-01-01

    Artists increasingly utilize computational tools to generate art works. Computational approaches to art making open up new ways of thinking about agency in interactive art because they invite participation and allow for unpredictable outcomes. Computational art is closely linked to the participatory turn in visual art, wherein spectators physically participate in visual art works. Unlike purely physical methods of interaction, computer assisted interactivity affords artists and spectators more nuanced control of artistic outcomes. Interactive art brings together human bodies, computer code, and nonliving objects to create emergent art works. Computation is more than just a tool for artists, it is a medium for investigating new aesthetic possibilities for choreography and composition. We illustrate this potential through two artistic projects: an improvisational dance performance between a human...

  8. Ubiquitous human computing.

    Science.gov (United States)

    Zittrain, Jonathan

    2008-10-28

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  9. Multiscale approach including microfibril scale to assess elastic constants of cortical bone based on neural network computation and homogenization method.

    Science.gov (United States)

    Barkaoui, Abdelwahed; Chamekh, Abdessalem; Merzouki, Tarek; Hambli, Ridha; Mkaddem, Ali

    2014-03-01

    The complexity and heterogeneity of bone tissue require multiscale modeling to understand its mechanical behavior and its remodeling mechanisms. In this paper, a novel multiscale hierarchical approach including the microfibril scale, based on hybrid neural network (NN) computation and homogenization equations, was developed to link nanoscopic and macroscopic scales to estimate the elastic properties of human cortical bone. The multiscale model is divided into three main phases: (i) in step 0, the elastic constants of collagen-water and mineral-water composites are calculated by averaging the upper and lower Hill bounds; (ii) in step 1, the elastic properties of the collagen microfibril are computed using a trained NN simulation, where finite element calculations performed at the nanoscopic level provide a database to train an in-house NN program; and (iii) in steps 2-10, from fibril to continuum cortical bone tissue, homogenization equations are used to perform the computation at the higher scales. The NN outputs (elastic properties of the microfibril) are used as inputs for the homogenization computation to determine the properties of the mineralized collagen fibril. The mechanical and geometrical properties of bone constituents (mineral, collagen, and cross-links) as well as the porosity were taken into consideration. This paper aims to predict analytically the effective elastic constants of cortical bone by modeling its elastic response at these different scales, ranging from the nanostructural to mesostructural levels. The output of each lower scale integrates well with the higher levels and serves as input for the next higher scale's modeling. Good agreement was obtained between our predicted results and literature data. Copyright © 2013 John Wiley & Sons, Ltd.
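
    A hedged sketch of step 0 as described above (averaging the upper and lower Hill bounds for a two-phase composite; the moduli below are illustrative, not the paper's values). The Voigt (upper) bound is the volume-weighted arithmetic mean of the phase moduli, the Reuss (lower) bound is the harmonic mean, and the Hill estimate is their average:

        def hill_average(f1, m1, m2):
            """Voigt-Reuss-Hill estimate of an effective elastic modulus
            for a two-phase composite; f1 is the volume fraction of phase 1."""
            f2 = 1.0 - f1
            voigt = f1 * m1 + f2 * m2          # uniform-strain (upper) bound
            reuss = 1.0 / (f1 / m1 + f2 / m2)  # uniform-stress (lower) bound
            return 0.5 * (voigt + reuss)

        # Illustrative mineral-water composite: 40% mineral (modulus ~114 GPa)
        # dispersed in water (~2.3 GPa).
        print(f"effective modulus = {hill_average(0.4, 114.0, 2.3):.1f} GPa")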

  10. Computer vision for sports

    DEFF Research Database (Denmark)

    Thomas, Graham; Gade, Rikke; Moeslund, Thomas B.

    2017-01-01

    fixed to players or equipment is generally not possible. This provides a rich set of opportunities for the application of computer vision techniques to help the competitors, coaches and audience. This paper discusses a selection of current commercial applications that use computer vision for sports...

  11. Data Intensive Computing on Amazon Web Services

    Energy Technology Data Exchange (ETDEWEB)

    Magana-Zook, S. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-04-21

    The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as “big data” tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster (“Cluster A”) was set up as a collaboration between GMP and Livermore Computing (LC).

  12. Spatial Computing and Spatial Practices

    DEFF Research Database (Denmark)

    Brodersen, Anders; Büsher, Monika; Christensen, Michael

    2007-01-01

    The gathering momentum behind the research agendas of pervasive, ubiquitous and ambient computing, set in motion by Mark Weiser (1991), offers dramatic opportunities for information systems design. These agendas raise the possibility of "putting computation where it belongs" by exploding computing power out ... the "disappearing computer" we have, therefore, carried over from previous research an interdisciplinary perspective, and a focus on the sociality of action (Suchman 1987).

  13. Trends in scientific computing applied to petroleum exploration and production

    International Nuclear Information System (INIS)

    Guevara, Saul E; Piedrahita, Carlos E; Arroyo, Elkin R; Soto Rodolfo

    2002-01-01

    Current trends of computational tools in the upstream of the petroleum industry are presented herein. Several results and images obtained through commercial programs and through in-house software developments illustrate the topics discussed. They include several types of problems and programming paradigms. Emphasis is placed on the future of parallel processing through the use of affordable, open systems, such as the Linux system. These kinds of technologies will likely make possible new research and industry applications, since quite advanced computational resources will be available to many people working in the area

  14. Advanced methods for the computation of particle beam transport and the computation of electromagnetic fields and beam-cavity interactions

    International Nuclear Information System (INIS)

    Dragt, A.J.; Gluckstern, R.L.

    1992-11-01

    The University of Maryland Dynamical Systems and Accelerator Theory Group carries out research in two broad areas: the computation of charged particle beam transport using Lie algebraic methods and advanced methods for the computation of electromagnetic fields and beam-cavity interactions. Important improvements in the state of the art are believed to be possible in both of these areas. In addition, applications of these methods are made to problems of current interest in accelerator physics including the theoretical performance of present and proposed high energy machines. The Lie algebraic method of computing and analyzing beam transport handles both linear and nonlinear beam elements. Tests show this method to be superior to the earlier matrix or numerical integration methods. It has wide application to many areas including accelerator physics, intense particle beams, ion microprobes, high resolution electron microscopy, and light optics. With regard to the area of electromagnetic fields and beam cavity interactions, work is carried out on the theory of beam breakup in single pulses. Work is also done on the analysis of the high frequency behavior of longitudinal and transverse coupling impedances, including the examination of methods which may be used to measure these impedances. Finally, work is performed on the electromagnetic analysis of coupled cavities and on the coupling of cavities to waveguides

  15. Performing an allreduce operation on a plurality of compute nodes of a parallel computer

    Science.gov (United States)

    Faraj, Ahmad [Rochester, MN

    2012-04-17

    Methods, apparatus, and products are disclosed for performing an allreduce operation on a plurality of compute nodes of a parallel computer. Each compute node includes at least two processing cores. Each processing core has contribution data for the allreduce operation. Performing an allreduce operation on a plurality of compute nodes of a parallel computer includes: establishing one or more logical rings among the compute nodes, each logical ring including at least one processing core from each compute node; performing, for each logical ring, a global allreduce operation using the contribution data for the processing cores included in that logical ring, yielding a global allreduce result for each processing core included in that logical ring; and performing, for each compute node, a local allreduce operation using the global allreduce results for each processing core on that compute node.
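
    A hedged, single-process toy simulation of the logical-ring pattern described above (summation as the reduction, one ring; the patented method additionally runs one ring per processing core and finishes with a node-local combine):

        def ring_allreduce(contributions):
            """Simulate one logical ring of processing cores.
            Each step, every core forwards the value it last received to
            the next core on the ring and folds the incoming value into
            its total; after n-1 steps every core holds the global sum."""
            n = len(contributions)
            totals = list(contributions)     # running result on each core
            messages = list(contributions)   # value each core sends next
            for _ in range(n - 1):
                messages = [messages[(i - 1) % n] for i in range(n)]  # ring shift
                totals = [t + m for t, m in zip(totals, messages)]
            return totals

        # Four processing cores (e.g., one per compute node) in one logical ring:
        print(ring_allreduce([1, 2, 3, 4]))  # -> [10, 10, 10, 10]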

  16. On the Use of Computers for Teaching Fluid Mechanics

    Science.gov (United States)

    Benson, Thomas J.

    1994-01-01

    Several approaches for improving the teaching of basic fluid mechanics using computers are presented. There are two objectives to these approaches: to increase the involvement of the student in the learning process and to present information to the student in a variety of forms. Items discussed include: the preparation of educational videos using the results of computational fluid dynamics (CFD) calculations, the analysis of CFD flow solutions using workstation based post-processing graphics packages, and the development of workstation or personal computer based simulators which behave like desk top wind tunnels. Examples of these approaches are presented along with observations from working with undergraduate co-ops. Possible problems in the implementation of these approaches as well as solutions to these problems are also discussed.

  17. Virtually going green: The role of quantum computational chemistry in reducing pollution and toxicity in chemistry

    Science.gov (United States)

    Stevens, Jonathan

    2017-07-01

    Continuing advances in computational chemistry have permitted quantum mechanical calculation to assist in research in green chemistry and to contribute to the greening of chemical practice. Presented here are recent examples illustrating the contribution of computational quantum chemistry to green chemistry, including the possibility of using computation as a green alternative to experiments, but also illustrating contributions to greener catalysis and the search for greener solvents. Examples of applications of computation to ambitious projects for green synthetic chemistry using carbon dioxide are also presented.

  18. All-optical reservoir computing.

    Science.gov (United States)

    Duport, François; Schneider, Bendix; Smerieri, Anteo; Haelterman, Marc; Massar, Serge

    2012-09-24

    Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.
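
    A hedged, drastically simplified numerical sketch of the single-nonlinear-node-plus-delay-loop architecture described above (tanh standing in for the optical amplifier's saturation; the coupling, mask, and task are illustrative, and the only trained part is the linear readout):

        import numpy as np

        rng = np.random.default_rng(0)
        N = 50                          # virtual nodes along the delay loop
        mask = rng.uniform(-1, 1, N)    # input mask spreading u(t) over the loop
        alpha, beta = 0.8, 0.5          # feedback and input scaling

        def reservoir_states(u):
            """Each virtual node sees its own value one delay ago plus the
            masked input, passed through the saturating nonlinearity."""
            x = np.zeros((len(u), N))
            for t in range(len(u)):
                prev = x[t - 1] if t > 0 else np.zeros(N)
                x[t] = np.tanh(alpha * prev + beta * mask * u[t])
            return x

        # Toy task: reproduce a delayed, squared version of the input.
        u = rng.uniform(-1, 1, 1000)
        y = np.roll(u, 2) ** 2
        X = reservoir_states(u)
        # Linear readout trained by ridge regression.
        W = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
        print("training MSE:", np.mean((X @ W - y) ** 2))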

  19. Soft computing analysis of the possible correlation between temporal and energy release patterns in seismic activity

    Science.gov (United States)

    Konstantaras, Anthony; Katsifarakis, Emmanouil; Artzouxaltzis, Xristos; Makris, John; Vallianatos, Filippos; Varley, Martin

    2010-05-01

    This paper is a preliminary investigation of the possible correlation of temporal and energy release patterns of seismic activity involving the preparation processes of consecutive sizeable seismic events [1,2]. The background idea is that during periods of low-level seismic activity, stress processes in the crust accumulate energy at the seismogenic area, whilst larger seismic events act as a decongesting mechanism releasing considerable energy [3,4]. A dynamic algorithm is being developed aiming to identify and cluster pre- and post-seismic events to the main earthquake, following on research carried out by Zubkov [5] and Dobrovolsky [6,7]. This clustering technique, along with energy release equations dependent on Richter's scale [8,9], allows for an estimate to be drawn regarding the amount of energy being released by the seismic sequence. The above approach is being implemented as a monitoring tool to investigate the behaviour of the underlying energy management system by introducing this information to various neural [10,11] and soft computing models [1,12,13,14]. The incorporation of intelligent systems aims towards the detection and simulation of the possible relationship between energy release patterns and time-intervals among consecutive sizeable earthquakes [1,15]. Anticipated successful training of the imported intelligent systems may result in a real-time, on-line processing methodology [1,16] capable of dynamically approximating the time-interval between the latest and the next forthcoming sizeable seismic event by monitoring the energy release process in a specific seismogenic area. Indexing terms: pattern recognition, long-term earthquake precursors, neural networks, soft computing, earthquake occurrence intervals References [1] Konstantaras A., Vallianatos F., Varley M.R. and Makris J. P.: ‘Soft computing modelling of seismicity in the southern Hellenic arc', IEEE Geoscience and Remote Sensing Letters, vol. 5 (3), pp. 323-327, 2008 [2] Eneva M. and
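
    The "energy release equations dependent on Richter's scale" referred to above are commonly variants of the Gutenberg-Richter energy-magnitude relation; a hedged sketch (the paper's exact formulation is not reproduced here):

        import math

        def released_energy_joules(magnitude):
            """Gutenberg-Richter energy-magnitude relation:
            log10 E[erg] = 11.8 + 1.5 M, converted to joules (1 J = 1e7 erg)."""
            return 10 ** (11.8 + 1.5 * magnitude) / 1e7

        def cumulative_energy(magnitudes):
            """Total energy released by a clustered seismic sequence."""
            return sum(released_energy_joules(m) for m in magnitudes)

        # A magnitude 6.0 event releases ~1000x the energy of a magnitude 4.0:
        print(released_energy_joules(6.0) / released_energy_joules(4.0))
        print(f"sequence energy: {cumulative_energy([3.2, 4.1, 5.6]):.3e} J")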

  20. Robust and Adaptive OMR System Including Fuzzy Modeling, Fusion of Musical Rules, and Possible Error Detection

    Directory of Open Access Journals (Sweden)

    Bloch Isabelle

    2007-01-01

    This paper describes a system for optical music recognition (OMR) for monophonic typeset scores. After clarifying the difficulties specific to this domain, we propose appropriate solutions at both the image analysis level and the high-level interpretation level. Thus, a recognition and segmentation method is designed that allows dealing with common printing defects and numerous symbol interconnections. Then, musical rules are modeled and integrated in order to make a consistent decision. This high-level interpretation step relies on the fuzzy sets and possibility framework, since it allows dealing with symbol variability, flexibility, and imprecision of music rules, and merging all these heterogeneous pieces of information. Other innovative features are the indication of potential errors and the possibility of applying learning procedures in order to gain in robustness. Experiments conducted on a large database show that the proposed method constitutes an interesting contribution to OMR.

  1. Ubiquitous Computing and Changing Pedagogical Possibilities: Representations, Conceptualizations and Uses of Knowledge

    Science.gov (United States)

    Swan, Karen; Van 'T Hooft, Mark; Kratcoski, Annette; Schenker, Jason

    2007-01-01

    This article reports on preliminary findings from an ongoing study of teaching and learning in a ubiquitous computing classroom. The research employed mixed methods and multiple measures to document changes in teaching and learning that result when teachers and students have access to a variety of digital devices wherever and whenever they need…

  2. Computers, Nanotechnology and Mind

    Science.gov (United States)

    Ekdahl, Bertil

    2008-10-01

    In 1958, two years after the Dartmouth conference, where the term artificial intelligence was coined, Herbert Simon and Allen Newell asserted the existence of "machines that think, that learn and create." They further prophesied that the machines' capacity would increase and be on par with the human mind. Now, 50 years later, computers perform many more tasks than one could imagine in the 1950s but, virtually, no computer can do more than could the first digital computer, developed by John von Neumann in the 1940s. Computers still follow algorithms; they do not create them. However, the development of nanotechnology seems to have given rise to new hopes. With nanotechnology two things are supposed to happen. Firstly, due to the small scale it will be possible to construct huge computer memories, which are supposed to be the precondition for building an artificial brain; secondly, nanotechnology will make it possible to scan the brain, which in turn will make reverse engineering possible: the mind will be decoded by studying the brain. The consequence of such a belief is that the brain is no more than a calculator, i.e., all that the mind can do is in principle the result of arithmetical operations. Computers are equivalent to formal systems, which in turn were an answer to an idea by Hilbert that proofs should contain ideal statements to which operations cannot be applied in a contentual way. The advocates of artificial intelligence would thus place content in a machine that is developed not only to be free of content but that also cannot contain content. In this paper I argue that the hope for artificial intelligence is in vain.

  3. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  4. Advances and challenges in computational plasma science

    International Nuclear Information System (INIS)

    Tang, W M; Chan, V S

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This

  5. The ripple electromagnetic calculation: accuracy demand and possible responses

    International Nuclear Information System (INIS)

    Cocilovo, V.; Ramogida, G.; Formisano, A.; Martone, R.; Portone, A.; Roccella, M.; Roccella, R.

    2006-01-01

    Due to a number of causes (the finite number of toroidal field coils or the presence of concentrated blocks of magnetic material, such as the neutral beam shielding) the actual magnetic configuration in a tokamak differs from the desired one. For example, a ripple is added to the ideal axisymmetric toroidal field, impacting the equilibrium and stability of the plasma column; as a further example, the magnetic field outside the plasma affects the operation of a number of critical components, including the diagnostic system and the neutral beam. Therefore the actual magnetic field has to be suitably calculated and its shape controlled within the required limits. Due to the complexity of its design, the problem is quite critical for the ITER project. In this paper the problem is discussed from both the mathematical and the numerical point of view. In particular, a complete formulation is proposed, taking into account both the presence of nonlinear magnetic materials and the fully 3D geometry. Then the quality level requirements are discussed, including the accuracy of calculations and the spatial resolution. As a consequence, numerical tools able to fulfil the quality needs while requiring a reasonable computational burden are considered. In particular, possible tools based on numerical FEM schemes are considered; in addition, in spite of the presence of nonlinear materials, the practical possibility of using Biot-Savart based approaches as cross-check tools is also discussed. The paper also analyses possible simplifications of the geometry able to make the actual calculation feasible while guaranteeing the required accuracy. Finally, the characteristics required for a correction system able to effectively counteract the magnetic field degradation are presented. A number of examples are also reported and commented on. (author)
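
    The Biot-Savart cross-check mentioned above can be sketched in a few lines (a simplified filament model with midpoint quadrature; the circular coil and 1 kA current are illustrative, not ITER values):

        # Field of a coil discretized into straight segments (Biot-Savart).
        import numpy as np

        MU0 = 4e-7 * np.pi  # vacuum permeability [T m/A]

        def biot_savart(points, current, where):
            """B at 'where' from a closed polyline 'points' carrying 'current'."""
            B = np.zeros(3)
            for p0, p1 in zip(points, np.roll(points, -1, axis=0)):
                dl = p1 - p0
                r = where - (p0 + p1) / 2.0  # field point minus segment midpoint
                B += MU0 * current * np.cross(dl, r) / (4 * np.pi * np.linalg.norm(r) ** 3)
            return B

        # One circular coil of radius 1 m carrying 1 kA; field at the center:
        theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
        coil = np.stack([np.cos(theta), np.sin(theta), np.zeros_like(theta)], axis=1)
        print(biot_savart(coil, 1e3, np.zeros(3)))  # analytic: mu0*I/2R = 6.283e-4 T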

  6. Quantum chemistry simulation on quantum computers: theories and experiments.

    Science.gov (United States)

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, with polynomially scaling resources. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with an exponential growth of the required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the developments in both theory and experiment. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations on a small quantum computer, which include the evaluation of static molecular eigenenergies and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, beyond classical computation.

  7. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  8. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  9. Exascale for Energy: The Role of Exascale Computing in Energy Security

    International Nuclear Information System (INIS)

    2010-01-01

    How will the United States satisfy energy demand in a tightening global energy marketplace while, at the same time, reducing greenhouse gas emissions? Exascale computing - expected to be available within the next eight to ten years - may play a crucial role in answering that question by enabling a paradigm shift from test-based to science-based design and engineering. Computational modeling of complete power generation systems and engines, based on scientific first principles, will accelerate the improvement of existing energy technologies and the development of new transformational technologies by pre-selecting the designs most likely to be successful for experimental validation, rather than relying on trial and error. The predictive understanding of complex engineered systems made possible by computational modeling will also reduce the construction and operations costs, optimize performance, and improve safety. Exascale computing will make possible fundamentally new approaches to quantifying the uncertainty of safety and performance engineering. This report discusses potential contributions of exascale modeling in four areas of energy production and distribution: nuclear power, combustion, the electrical grid, and renewable sources of energy, which include hydrogen fuel, bioenergy conversion, photovoltaic solar energy, and wind turbines.

  10. Neuroradiology computer-assisted instruction using interactive videodisk: Pilot project

    International Nuclear Information System (INIS)

    Andrews, C.L.; Goldsmith, D.G.; Osborn, A.G.; Stensaas, S.S.; Davidson, H.C.; Quigley, A.C.

    1987-01-01

    The availability of microcomputers, high-resolution monitors, high-level authoring languages, and videodisk technology makes sophisticated neuroradiology instruction a cost-effective possibility. The authors developed a laser videodisk and interactive software to teach normal and pathologic gross and radiologic anatomy of the sellar/juxtasellar region. A spectrum of lesions is presented with information for differential diagnosis included. The exhibit permits conference participants to review the pilot module and experience the self-paced learning and self-evaluation possible with computer-assisted instruction. They may also choose to peruse a "visual database" by instant random access to the videodisk by hand control.

  11. Computational intelligence techniques for comparative genomics dedicated to Prof. Allam Appa Rao on the occasion of his 65th birthday

    CERN Document Server

    Gunjan, Vinit

    2015-01-01

    This Brief highlights informatics and related techniques for computer science professionals, engineers, medical doctors, bioinformatics researchers and other interdisciplinary researchers. Chapters include the bioinformatics of diabetes and several computational algorithms and statistical analysis approaches to effectively study the disorders and their possible causes, along with medical applications.

  12. Computing and Visualizing Reachable Volumes for Maneuvering Satellites

    Science.gov (United States)

    Jiang, M.; de Vries, W.; Pertica, A.; Olivier, S.

    2011-09-01

    Detecting and predicting maneuvering satellites is an important problem for Space Situational Awareness. The spatial envelope of all possible locations within reach of such a maneuvering satellite is known as the Reachable Volume (RV). As soon as custody of a satellite is lost, calculating the RV and its subsequent time evolution is a critical component in the rapid recovery of the satellite. In this paper, we present a Monte Carlo approach to computing the RV for a given object. Essentially, our approach samples all possible trajectories by randomizing thrust-vectors, thrust magnitudes and time of burn. At any given instance, the distribution of the "point-cloud" of the virtual particles defines the RV. For short orbital time-scales, the temporal evolution of the point-cloud can result in complex, multi-reentrant manifolds. Visualization plays an important role in gaining insight and understanding into this complex and evolving manifold. In the second part of this paper, we focus on how to effectively visualize the large number of virtual trajectories and the computed RV. We present a real-time out-of-core rendering technique for visualizing the large number of virtual trajectories. We also examine different techniques for visualizing the computed volume of probability density distribution, including volume slicing, convex hull and isosurfacing. We compare and contrast these techniques in terms of computational cost and visualization effectiveness, and describe the main implementation issues encountered during our development process. Finally, we will present some of the results from our end-to-end system for computing and visualizing RVs using examples of maneuvering satellites.
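
    The sampling step can be sketched as follows (a deliberately simplified toy: straight-line coasting replaces orbital propagation, and all numbers are illustrative):

        # Monte Carlo point-cloud for a toy reachable volume.
        import math, random

        def sample_reachable_points(r0, v0, dv_max, horizon, n=10000):
            cloud = []
            for _ in range(n):
                t_burn = random.uniform(0.0, horizon)  # randomized time of burn
                mag = random.uniform(0.0, dv_max)      # randomized delta-v magnitude
                z = random.uniform(-1, 1)              # isotropic thrust direction
                phi = random.uniform(0, 2 * math.pi)
                s = math.sqrt(1 - z * z)
                dv = (mag * s * math.cos(phi), mag * s * math.sin(phi), mag * z)
                dt = horizon - t_burn                  # coast time after the burn
                cloud.append(tuple(r0[i] + v0[i] * horizon + dv[i] * dt
                                   for i in range(3)))
            return cloud  # the envelope of this point-cloud approximates the RV

        pts = sample_reachable_points((0, 0, 0), (7.5, 0, 0), dv_max=0.1, horizon=600)
        print(len(pts), "virtual particles sampled")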

  13. Computing and Visualizing Reachable Volumes for Maneuvering Satellites

    International Nuclear Information System (INIS)

    Jiang, M.; de Vries, W.H.; Pertica, A.J.; Olivier, S.S.

    2011-01-01

    Detecting and predicting maneuvering satellites is an important problem for Space Situational Awareness. The spatial envelope of all possible locations within reach of such a maneuvering satellite is known as the Reachable Volume (RV). As soon as custody of a satellite is lost, calculating the RV and its subsequent time evolution is a critical component in the rapid recovery of the satellite. In this paper, we present a Monte Carlo approach to computing the RV for a given object. Essentially, our approach samples all possible trajectories by randomizing thrust-vectors, thrust magnitudes and time of burn. At any given instance, the distribution of the 'point-cloud' of the virtual particles defines the RV. For short orbital time-scales, the temporal evolution of the point-cloud can result in complex, multi-reentrant manifolds. Visualization plays an important role in gaining insight and understanding into this complex and evolving manifold. In the second part of this paper, we focus on how to effectively visualize the large number of virtual trajectories and the computed RV. We present a real-time out-of-core rendering technique for visualizing the large number of virtual trajectories. We also examine different techniques for visualizing the computed volume of probability density distribution, including volume slicing, convex hull and isosurfacing. We compare and contrast these techniques in terms of computational cost and visualization effectiveness, and describe the main implementation issues encountered during our development process. Finally, we will present some of the results from our end-to-end system for computing and visualizing RVs using examples of maneuvering satellites.

  14. Nonlinear simulations with and computational issues for NIMROD

    International Nuclear Information System (INIS)

    Sovinec, C.R.

    1998-01-01

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this

  15. Nonlinear simulations with and computational issues for NIMROD

    Energy Technology Data Exchange (ETDEWEB)

    Sovinec, C.R. [Los Alamos National Lab., NM (United States)

    1998-12-31

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.

  16. Towards distributed multiscale computing for the VPH

    NARCIS (Netherlands)

    Hoekstra, A.G.; Coveney, P.

    2010-01-01

    Multiscale modeling is fundamental to the Virtual Physiological Human (VPH) initiative. Most detailed three-dimensional multiscale models lead to prohibitive computational demands. As a possible solution we present MAPPER, a computational science infrastructure for Distributed Multiscale Computing

  17. Fibonacci’s Computation Methods vs Modern Algorithms

    Directory of Open Access Journals (Sweden)

    Ernesto Burattini

    2013-12-01

    Full Text Available In this paper we discuss some computational procedures given by Leonardo Pisano Fibonacci in his famous Liber Abaci, and we propose their translation into a modern computer language (C++). Among others, we describe the method of "cross" multiplication, evaluate its computational complexity in algorithmic terms, and show the output of a C++ code that traces the development of the method applied to the product of two integers. In a similar way we show the operations performed on fractions introduced by Fibonacci. Thanks to the possibility of reproducing Fibonacci's different computational procedures on a computer, it was possible to identify some calculation errors present in the different versions of the original text.
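
    As an illustration (in Python rather than the paper's C++), cross multiplication forms each result digit k by summing all digit products a_i * b_j with i + j = k and propagating carries, which also makes the O(nm) complexity evident:

        # Fibonacci-style "cross" multiplication on digit lists,
        # stored least-significant digit first.
        def cross_multiply(a, b):
            n, m = len(a), len(b)
            result, carry = [], 0
            for k in range(n + m - 1):
                s = carry + sum(a[i] * b[k - i]
                                for i in range(max(0, k - m + 1), min(k + 1, n)))
                result.append(s % 10)
                carry = s // 10
            while carry:
                result.append(carry % 10)
                carry //= 10
            return result

        print(cross_multiply([2, 1], [4, 3]))  # 12 * 34 = 408 -> [8, 0, 4]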

  18. Computers for lattice field theories

    International Nuclear Information System (INIS)

    Iwasaki, Y.

    1994-01-01

    Parallel computers dedicated to lattice field theories are reviewed with emphasis on the three recent projects, the Teraflops project in the US, the CP-PACS project in Japan and the 0.5-Teraflops project in the US. Some new commercial parallel computers are also discussed. Recent development of semiconductor technologies is briefly surveyed in relation to possible approaches toward Teraflops computers. (orig.)

  19. Learning With Computers; Today and Tomorrow.

    Science.gov (United States)

    Bork, Alfred

    This paper describes the present practical use of computers in two large beginning physics courses at the University of California, Irvine; discusses the versatility and desirability of computers in the field of education; and projects the possible future directions of computer-based learning. The advantages and disadvantages of educational…

  20. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration is carried out in a parallel computer that includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying the location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
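
    The routing idea can be sketched with a toy breadth-first search over two hypothetical network topologies (the source's actual routing mechanism is not reproduced here):

        # Route over network A unless the path needs a defective link;
        # then fall back to the independent second network B.
        from collections import deque

        def route(adj, src, dst, bad_links=frozenset()):
            """BFS path from src to dst avoiding links listed in bad_links."""
            prev, queue = {src: None}, deque([src])
            while queue:
                u = queue.popleft()
                if u == dst:
                    path = []
                    while u is not None:
                        path.append(u)
                        u = prev[u]
                    return path[::-1]
                for v in adj[u]:
                    if v not in prev and frozenset((u, v)) not in bad_links:
                        prev[v] = u
                        queue.append(v)
            return None

        net_a = {0: [1], 1: [0, 2], 2: [1]}   # three compute nodes in a line
        net_b = {0: [2], 1: [2], 2: [0, 1]}   # independent second network
        bad = {frozenset((1, 2))}             # defective link identified in net A
        print("route used:", route(net_a, 0, 2, bad) or route(net_b, 0, 2))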

  1. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  2. 29 CFR 779.253 - What is included in computing the total annual inflow volume.

    Science.gov (United States)

    2010-07-01

    ... FAIR LABOR STANDARDS ACT AS APPLIED TO RETAILERS OF GOODS OR SERVICES Employment to Which the Act May... taxes and other charges which the enterprise must pay for such goods. Generally, all charges will be... computing the total annual inflow volume. The goods which the establishment purchases or receives for resale...

  3. DIAGNOSTIC POSSIBILITIES OF 3D-COMPUTED TOMOGRAPHY WITH INTRALESIONAL APPLICATION OF CONTRAST MATERIAL IN A CASE OF VERY LARGE RADICULAR MAXILLARY CYST - A CASE REPORT

    Directory of Open Access Journals (Sweden)

    Galina Gavazova

    2017-09-01

    Full Text Available Introduction: The diagnosis of odontogenic cysts, despite their benign nature, is a critical and challenging problem. Aim: The aim of this article is to demonstrate a different diagnostic approach in a case of a very large odontogenic cyst. Materials and Methods: This study was carried out on one male patient aged 38 using 3D computed tomography and contrast material inside the lesion. The differential diagnosis made by the residents was compared to the histopathological examination as the gold standard for identifying the nature of the cyst. Results: This diagnostic approach, using 3D computed tomography combined with contrast material injected inside the lesion, shows the real borders of the cyst of the maxilla and helps the oral surgeon in planning the volume of the surgical intervention. Conclusion: A precise diagnosis makes the optimal surgical intervention possible - a precondition for the best wound healing.

  4. Education in interactive media: a survey on the potentials of computers for visual literacy

    OpenAIRE

    Güleryüz, Hakan

    1996-01-01

    Ankara : Bilkent University, Department of Graphic Design and Institute of Fine Arts, 1996. Thesis (Master's) -- Bilkent University, 1996. Includes bibliographical references leaves 89-94. This study aims at investigating the potentials of multimedia and computers in design. For this purpose, a general survey on the historical development of computers for their use in education and possibilities related to the use of technology in education is conducted. Based on this survey, the dep...

  5. An Applet-based Anonymous Distributed Computing System.

    Science.gov (United States)

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael

    2001-01-01

    Defines anonymous distributed computing systems and focuses on the specifics of a Java applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains the possibility of a large number of computers participating in a single computation and describes a test of the functionality of the system. (Author/LRW)

  6. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a ‘spatial turn' in computing provide the base for place-specific computing as a suggested new genre of interaction design. In the REcult project place-specific computing is explored through design-oriented research. This article reports six pilot studies where design students have designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments for place-specific computing as a genre of interaction design are described. A total number of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented, indicating potentials, possibilities and problems as directions for future research.

  7. Review your Computer Security Now and Frequently!

    CERN Multimedia

    IT Department

    2009-01-01

    The start-up of the LHC is foreseen to take place in the autumn and we will be in the public spotlight again. This increases the necessity to be vigilant with respect to computer security, and the defacement of an experiment's Web page in September last year shows that we should be particularly attentive. Attackers are permanently probing CERN and so we must all do the maximum to reduce future risks. Security is a hierarchical responsibility and requires balancing the allocation of resources between making systems work and making them secure. Thus all of us, whether users, developers, system experts, administrators, or managers, are responsible for securing our computing assets. These include computers, software applications, documents, accounts and passwords. There is no "silver bullet" for securing systems, which can only be achieved by a painstaking search for all possible vulnerabilities followed by their mitigation. Additional advice on particular topics can be obtained from the relevant I...

  8. Internet messenger based smart virtual class learning using ubiquitous computing

    Science.gov (United States)

    Umam, K.; Mardi, S. N. S.; Hariadi, M.

    2017-06-01

    Internet messenger (IM) has become an important educational technology component in college education. IM makes it possible for students to engage in learning and collaboration in smart virtual class learning (SVCL) using ubiquitous computing. However, models of IM-based smart virtual class learning using ubiquitous computing, and empirical evidence that would favor their broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in a smart class cannot be confirmed, because the majority of the reviewed studies followed instructional paradigms. This article aims to present a model of IM-based SVCL using ubiquitous computing and to show learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The methods applied in this paper include a design process and quantitative analysis techniques, with the purpose of identifying ubiquitous computing scenarios and capturing the impressions of learners and lecturers about engagement and behavior and their contribution to learning.

  9. Computed Potential Energy Surfaces and Minimum Energy Pathways for Chemical Reactions

    Science.gov (United States)

    Walch, Stephen P.; Langhoff, S. R. (Technical Monitor)

    1994-01-01

    Computed potential energy surfaces are often required for the computation of such parameters as rate constants as a function of temperature, product branching ratios, and other detailed properties. For some dynamics methods, global potential energy surfaces are required. In this case, it is necessary to obtain the energy at a complete sampling of all the possible arrangements of the nuclei which are energetically accessible, and then a fitting function must be obtained to interpolate between the computed points. In other cases, characterization of the stationary points and the reaction pathway connecting them is sufficient. These properties may be readily obtained using analytical derivative methods. We have found that computation of the stationary points/reaction pathways using CASSCF/derivative methods, followed by use of the internally contracted CI method to obtain accurate energetics, gives useful results for a number of chemically important systems. The talk will focus on a number of applications, including global potential energy surfaces, H + O2, H + N2, O(3P) + H2, and reaction pathways for complex reactions, including reactions leading to NO and soot formation in hydrocarbon combustion.
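
    The "compute points, then fit" step can be sketched with synthetic data (a toy 1-D Morse-like curve standing in for ab initio energies; the surfaces discussed above are of course multidimensional):

        # Fit an interpolating function to sampled potential energies.
        import numpy as np

        r = np.linspace(0.8, 3.0, 12)              # sampled bond lengths [angstrom]
        E = (1 - np.exp(-1.5 * (r - 1.1))) ** 2    # synthetic "computed" energies
        surface = np.poly1d(np.polyfit(r, E, 6))   # polynomial interpolating fit
        print("interpolated E at r = 1.35:", surface(1.35))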

  10. The possibilities of cloud storage for business and education. Contemporary aspect

    Directory of Open Access Journals (Sweden)

    Fomicheva T.L.

    2017-01-01

    Full Text Available The article describes the possibilities of applying cloud computing to make business and educational processes more effective. Cloud computing gives an opportunity to optimize activity and to increase the effectiveness of work in almost every aspect of life, right down to the personal tasks of an ordinary user.

  11. Does It Matter Whether One Takes a Test on an iPad or a Desktop Computer?

    Science.gov (United States)

    Ling, Guangming

    2016-01-01

    To investigate a possible iPad-related mode effect, we tested 403 8th graders in Indiana, Maryland, and New Jersey under three mode conditions through random assignment: a desktop computer, an iPad alone, and an iPad with an external keyboard. All students had used an iPad or computer for six months or longer. The 2-hour test included reading, math,…

  12. Wireless Technologies, Ubiquitous Computing and Mobile Health: Application to Drug Abuse Treatment and Compliance with HIV Therapies.

    Science.gov (United States)

    Boyer, Edward W; Smelson, David; Fletcher, Richard; Ziedonis, Douglas; Picard, Rosalind W

    2010-06-01

    Beneficial advances in the treatment of substance abuse and compliance with medical therapies, including HAART, are possible with new mobile technologies related to personal physiological sensing and computational methods. When incorporated into mobile platforms that allow for ubiquitous computing, these technologies have great potential for extending the reach of behavioral interventions from clinical settings where they are learned into natural environments.

  13. PEDAGOGICAL ASPECTS OF CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    N. Morze

    2011-05-01

    Full Text Available Recent progress in computer science in the fields of redundancy and protection has led to the sharing of data across many different repositories. Modern infrastructure has made cloud computing safe and reliable, and the advancement of such computing radically changes the understanding of the use of resources and services. The materials in this article concern the pedagogical possibilities of using cloud computing to provide education on the basis of a competence-based approach and the monitoring of learners (students).

  14. Computer Graphics and Administrative Decision-Making.

    Science.gov (United States)

    Yost, Michael

    1984-01-01

    Reduction in prices now makes it possible for almost any institution to use computer graphics for administrative decision making and research. Current and potential uses of computer graphics in these two areas are discussed. (JN)

  15. SALP-PC, a computer program for fault tree analysis on personal computers

    International Nuclear Information System (INIS)

    Contini, S.; Poucet, A.

    1987-01-01

    The paper presents the main characteristics of the SALP-PC computer code for fault tree analysis. The program has been developed in Fortran 77 on an Olivetti M24 personal computer (IBM compatible) in order to reach a high degree of portability. It is composed of six processors implementing the different phases of the analysis procedure. This particular structure presents some advantages, such as the restart facility and the possibility of developing an event tree analysis code. The set of allowed logical operators, i.e. AND, OR, NOT, K/N, XOR and INH, together with the possibility of defining boundary conditions, makes the SALP-PC code a powerful tool for risk assessment. (orig.)
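
    The listed gate set can be illustrated with a small point-evaluation sketch (boolean event states only; the actual SALP-PC processors compute minimal cut sets, which this does not attempt):

        # Evaluate fault tree gates on boolean basic-event states.
        def gate(kind, inputs, k=None):
            if kind == "AND":
                return all(inputs)
            if kind == "OR":
                return any(inputs)
            if kind == "NOT":
                return not inputs[0]
            if kind == "XOR":
                return sum(inputs) == 1
            if kind == "K/N":
                return sum(inputs) >= k   # at least k of the n inputs failed
            raise ValueError(kind)

        # Top event: (pump A AND pump B fail) OR (2-of-3 sensors fail).
        pumps = [True, False]
        sensors = [True, True, False]
        top = gate("OR", [gate("AND", pumps), gate("K/N", sensors, k=2)])
        print("top event occurs:", top)

    (The INH gate, an AND conditioned on an enabling event, is omitted for brevity.)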

  16. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  17. Center for computer security: Computer Security Group conference. Summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  18. Galaxy CloudMan: delivering cloud compute clusters.

    Science.gov (United States)

    Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James

    2010-12-21

    Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on-demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The knowledge and effort expected in deploying a compute cluster in the Amazon EC2 cloud are not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.

  19. From transistor to trapped-ion computers for quantum chemistry.

    Science.gov (United States)

    Yung, M-H; Casanova, J; Mezzacapo, A; McClean, J; Lamata, L; Aspuru-Guzik, A; Solano, E

    2014-01-07

    Over the last few decades, quantum chemistry has progressed through the development of computational methods based on modern digital computers. However, these methods can hardly fulfill the exponentially-growing resource requirements when applied to large quantum systems. As pointed out by Feynman, this restriction is intrinsic to all computational models based on classical physics. Recently, the rapid advancement of trapped-ion technologies has opened new possibilities for quantum control and quantum simulations. Here, we present an efficient toolkit that exploits both the internal and motional degrees of freedom of trapped ions for solving problems in quantum chemistry, including molecular electronic structure, molecular dynamics, and vibronic coupling. We focus on applications that go beyond the capacity of classical computers, but may be realizable on state-of-the-art trapped-ion systems. These results allow us to envision a new paradigm of quantum chemistry that shifts from the current transistor to a near-future trapped-ion-based technology.

  20. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  1. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  2. New computer systems

    International Nuclear Information System (INIS)

    Faerber, G.

    1975-01-01

    Process computers have already become indispensable technical aids for monitoring and automation tasks in nuclear power stations. Yet there are still some problems connected with their use, whose elimination should be the main objective in the development of new computer systems. In the paper, some of these problems are summarized, new tendencies in hardware development are outlined, and finally some new system concepts made possible by the hardware development are explained. (orig./AK) [de

  3. Viking Afterbody Heating Computations and Comparisons to Flight Data

    Science.gov (United States)

    Edquist, Karl T.; Wright, Michael J.; Allen, Gary A., Jr.

    2006-01-01

    Computational fluid dynamics predictions of Viking Lander 1 entry vehicle afterbody heating are compared to flight data. The analysis includes a derivation of heat flux from temperature data at two base cover locations, as well as a discussion of available reconstructed entry trajectories. Based on the raw temperature-time history data, convective heat flux is derived to be 0.63-1.10 W/cm2 for the aluminum base cover at the time of thermocouple failure. Peak heat flux at the fiberglass base cover thermocouple is estimated to be 0.54-0.76 W/cm2, occurring 16 seconds after peak stagnation point heat flux. Navier-Stokes computational solutions are obtained with two separate codes using an 8-species Mars gas model in chemical and thermal non-equilibrium. Flowfield solutions using local time-stepping did not result in converged heating at either thermocouple location. A global time-stepping approach improved the computational stability, but steady state heat flux was not reached for either base cover location. Both thermocouple locations lie within a separated flow region of the base cover that is likely unsteady. Heat flux computations averaged over the solution history are generally below the flight data and do not vary smoothly over time for both base cover locations. Possible reasons for the mismatch between flight data and flowfield solutions include underestimated conduction effects and limitations of the computational methods.

  4. Use of computational fluid dynamics codes for safety analysis of nuclear reactor systems, including containment. Summary report of a technical meeting

    International Nuclear Information System (INIS)

    2003-11-01

    Safety analysis is an important tool for justifying the safety of nuclear power plants. Typically, this type of analysis is performed by means of system computer codes with one dimensional approximation for modelling real plant systems. However, in the nuclear area there are issues for which traditional treatment using one dimensional system codes is considered inadequate for modelling local flow and heat transfer phenomena. There is therefore increasing interest in the application of three dimensional computational fluid dynamics (CFD) codes as a supplement to or in combination with system codes. There are a number of both commercial (general purpose) CFD codes as well as special codes for nuclear safety applications available. With further progress in safety analysis techniques, the increasing use of CFD codes for nuclear applications is expected. At present, the main objective with respect to CFD codes is generally to improve confidence in the available analysis tools and to achieve a more reliable approach to safety relevant issues. An exchange of views and experience can facilitate and speed up progress in the implementation of this objective. Both the International Atomic Energy Agency (IAEA) and the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA) believed that it would be advantageous to provide a forum for such an exchange. Therefore, within the framework of the Working Group on the Analysis and Management of Accidents of the NEA's Committee on the Safety of Nuclear Installations, the IAEA and the NEA agreed to jointly organize the Technical Meeting on the Use of Computational Fluid Dynamics Codes for Safety Analysis of Reactor Systems, including Containment. The meeting was held in Pisa, Italy, from 11 to 14 November 2002. The publication constitutes the report of the Technical Meeting. It includes short summaries of the presentations that were made and of the discussions as well as conclusions and

  5. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows, so does the need for efficient experimental designs and analysis methods, since the complex computer models often are expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models, and Paper A introduces a new statistic for waiting times in health care units. The statistic...

  6. POSSIBILITIES FOR RADIODIAGNOSIS OF TUBERCULOUS SPONDYLITIS

    Directory of Open Access Journals (Sweden)

    S. V. Smerdin

    2014-01-01

    Full Text Available The presented case illustrates the possibilities of complex radiodiagnosis in a patient with tuberculous spondylitis. The specific features of displaying a spinal tuberculous lesion during X-ray study, tomosynthesis, computed tomography, and magnetic resonance imaging are described. A rational algorithm for the examination and treatment of patients with this disease is proposed, by comparing the clinical manifestations of spinal tuberculous lesion and the results of its radiological studies.

  7. Experiment Dashboard for Monitoring of the LHC Distributed Computing Systems

    International Nuclear Information System (INIS)

    Andreeva, J; Campos, M Devesas; Cros, J Tarragon; Gaidioz, B; Karavakis, E; Kokoszkiewicz, L; Lanciotti, E; Maier, G; Ollivier, W; Nowotka, M; Rocha, R; Sadykov, T; Saiz, P; Sargsyan, L; Sidorova, I; Tuckett, D

    2011-01-01

    LHC experiments are currently taking collision data. The distributed computing model chosen by the four main LHC experiments allows physicists to benefit from resources spread all over the world. The distributed model and the scale of LHC computing activities increase the level of complexity of the middleware, and also the chances of possible failures or inefficiencies in the components involved. In order to ensure the required performance and functionality of the LHC computing system, monitoring the status of the distributed sites and services, as well as monitoring LHC computing activities, are among the key factors. Over the last years, the Experiment Dashboard team has been working on a number of applications that facilitate the monitoring of different activities, including the following up of jobs and transfers as well as site and service availabilities. This presentation describes the Experiment Dashboard applications used by the LHC experiments and the experience gained during the first months of data taking.

  8. Minimal mobile human computer interaction

    NARCIS (Netherlands)

    el Ali, A.

    2013-01-01

    In the last 20 years, the widespread adoption of personal, mobile computing devices in everyday life, has allowed entry into a new technological era in Human Computer Interaction (HCI). The constant change of the physical and social context in a user's situation made possible by the portability of

  9. Extreme Scale Computing for First-Principles Plasma Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Choogn-Seock [Princeton University

    2011-10-12

    World superpowers are in the middle of the “Computnik” race. The US Department of Energy (and National Nuclear Security Administration) wishes to launch exascale computer systems into the scientific (and national security) world by 2018. The objective is to solve important scientific problems and to predict the outcomes using the most fundamental scientific laws, which would not be possible otherwise. Being chosen into the next “frontier” group can be of great benefit to a scientific discipline. An extreme scale computer system requires different types of algorithms and a different programming philosophy from those we have been accustomed to. Only a handful of scientific codes are blessed to be capable of scalable usage of today’s largest computers in operation at petascale (using more than 100,000 cores concurrently). Fortunately, a few magnetic fusion codes are competing well in this race using the “first principles” gyrokinetic equations. These codes are beginning to study the fusion plasma dynamics in full-scale realistic diverted device geometry on natural nonlinear multiple scales, including the large-scale neoclassical and small-scale turbulence physics, but excluding some ultra-fast dynamics. In this talk, most of the above mentioned topics will be introduced at executive level. Representative properties of extreme scale computers, modern programming exercises to take advantage of them, and different philosophies in the data flows and analyses will be presented. Examples of the multi-scale multi-physics scientific discoveries made possible by solving the gyrokinetic equations on extreme scale computers will be described. Future directions into “virtual tokamak experiments” will also be discussed.

  10. Quantum Genetic Algorithms for Computer Scientists

    Directory of Open Access Journals (Sweden)

    Rafael Lahoz-Beltra

    2016-10-01

    Full Text Available Genetic algorithms (GAs) are a class of evolutionary algorithms inspired by Darwinian natural selection. They are popular heuristic optimisation methods based on simulated genetic mechanisms (i.e., mutation, crossover, etc.) and population dynamical processes such as reproduction and selection. Over the last decade, the possibility to emulate a quantum computer (a computer using quantum-mechanical phenomena to perform operations on data) has led to a new class of GAs known as “Quantum Genetic Algorithms” (QGAs). In this review, we present a discussion of this new class of GAs, including its future potential, pros and cons. The review is oriented towards computer scientists interested in QGAs while “avoiding” the possible difficulties of quantum-mechanical phenomena.
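
    A compact sketch of the QGA scheme described above (a common textbook variant, not a specific algorithm from the review): each gene is a qubit angle theta with P(1) = sin^2(theta); measurement collapses the population to classical bitstrings, and a rotation gate nudges the angles toward the best observed solution:

        # Toy quantum-inspired GA on the OneMax problem.
        import math, random

        N_BITS, POP, GENS = 8, 10, 30
        DELTA = 0.05 * math.pi               # rotation-gate step size

        def fitness(bits):
            return sum(bits)                 # OneMax: count the 1 bits

        pop = [[math.pi / 4] * N_BITS for _ in range(POP)]  # |0>,|1> equally likely
        best = None
        for _ in range(GENS):
            observed = [[1 if random.random() < math.sin(t) ** 2 else 0
                         for t in ind] for ind in pop]
            gen_best = max(observed, key=fitness)
            if best is None or fitness(gen_best) > fitness(best):
                best = gen_best
            for ind in pop:                  # rotate each qubit toward best's bit
                for i, t in enumerate(ind):
                    t += DELTA if best[i] == 1 else -DELTA
                    ind[i] = min(max(t, 0.0), math.pi / 2)
        print("best:", best, "fitness:", fitness(best))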

  11. Complete Fairness in Secure Two-Party Computation

    DEFF Research Database (Denmark)

    Gordon, S. Dov; Hazay, Carmit; Katz, Jonathan

    2011-01-01

    In the setting of secure two-party computation, two mutually distrusting parties wish to compute some function of their inputs while preserving, to the extent possible, various security properties such as privacy, correctness, and more. One desirable property is fairness, which guarantees, informally, that if one party receives its output then so does the other. It has long been folklore that nothing nontrivial can be computed with complete fairness in the two-party setting. We demonstrate that this folklore belief is false by showing completely fair protocols for various nontrivial functions in the two-party setting based on standard cryptographic assumptions. We first show feasibility of obtaining complete fairness when computing any function over polynomial-size domains that does not contain an 'embedded XOR'; we also show that completely fair protocols for such functions must have round complexity super-logarithmic in the security parameter. Our results demonstrate that the question of completely fair secure computation without an honest majority is far from closed.

  12. The utility of including pathology reports in improving the computational identification of patients

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2016-01-01

    Background: Celiac disease (CD) is a common autoimmune disorder. Efficient identification of patients may improve chronic management of the disease. Prior studies have shown that searching International Classification of Diseases-9 (ICD-9) codes alone is inaccurate for identifying patients with CD. In this study, we developed automated classification algorithms leveraging pathology reports and other clinical data in Electronic Health Records (EHRs) to refine the subset population preselected using the ICD-9 code (579.0). Materials and Methods: EHRs were searched for the established ICD-9 code (579.0) suggesting CD, from which an initial identification of cases was obtained. In addition, laboratory results for tissue transglutaminase were extracted. Using natural language processing, we analyzed pathology reports from upper endoscopy. Twelve machine learning classifiers using different combinations of variables related to ICD-9 CD status, laboratory result status, and pathology reports were evaluated to find the best possible CD classifier. Ten-fold cross-validation was used to assess the results. Results: A total of 1498 patient records were used, including 363 confirmed cases and 1135 false positive cases that served as controls. A logistic model based on both clinical and pathology report features produced the best results: Kappa of 0.78, F1 of 0.92, and area under the curve (AUC) of 0.94, whereas using ICD-9 codes alone generated poor results: Kappa of 0.28, F1 of 0.75, and AUC of 0.63. Conclusion: Our automated classification system presents an efficient and reliable way to improve the performance of CD patient identification.
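
    As a hedged illustration of the evaluation pipeline the abstract describes (a logistic model scored with ten-fold cross-validation using Kappa, F1, and AUC), the sketch below uses scikit-learn on synthetic data; the generated features are placeholders for the study's EHR variables, not the actual data.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import cohen_kappa_score, f1_score, roc_auc_score
        from sklearn.model_selection import StratifiedKFold, cross_val_predict

        # Placeholder stand-ins for ICD-9 status, lab results, and pathology features:
        # 1498 records with roughly the 1135/363 control/case split reported above.
        X, y = make_classification(n_samples=1498, n_features=20,
                                   weights=[0.76], random_state=0)

        clf = LogisticRegression(max_iter=1000)
        cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

        prob = cross_val_predict(clf, X, y, cv=cv, method="predict_proba")[:, 1]
        pred = (prob >= 0.5).astype(int)

        print("Kappa:", cohen_kappa_score(y, pred))
        print("F1:   ", f1_score(y, pred))
        print("AUC:  ", roc_auc_score(y, prob))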

  13. Challenges in computational materials science: Multiple scales, multi-physics and evolving discontinuities

    NARCIS (Netherlands)

    Borst, de R.

    2008-01-01

    Novel experimental possibilities, together with improvements in computer hardware as well as new concepts in computational mathematics and mechanics, in particular multiscale methods, are now, in principle, making it possible to derive and compute phenomena and material parameters at a macroscopic

  14. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  15. Scientific Computing Strategic Plan for the Idaho National Laboratory

    International Nuclear Information System (INIS)

    Whiting, Eric Todd

    2015-01-01

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  16. Computer games addiction

    OpenAIRE

    Nejepínský, Adam

    2010-01-01

    This bachelor thesis deals with the problem of computer games addiction, with attention paid mainly to online games for multiple players. The purpose of the thesis was to describe this problem and to check, through a questionnaire investigation, whether addiction to computer games and its associated impacts really deserve the close attention of experts and the lay public. The thesis has two parts, theoretical and practical. The theoretical part describes the possibilities of diagnosin...

  17. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  18. Computer Attack and Cyber Terrorism: Vulnerabilities and Policy Issues for Congress

    National Research Council Canada - National Science Library

    Wilson, Clay

    2003-01-01

    Persistent computer security vulnerabilities may expose U.S. critical infrastructure and government computer systems to possible cyber attack by terrorists, possibly affecting the economy or other areas of national security...

  19. Contribution to the algorithmic and efficient programming of new parallel architectures including accelerators for neutron physics and shielding computations

    International Nuclear Information System (INIS)

    Dubois, J.

    2011-01-01

    In science, simulation is a key process for research and validation. Modern computer technology allows faster numerical experiments, which are cheaper than physical models. In the field of neutron simulation, the calculation of eigenvalues is one of the key challenges, and the complexity of these problems is such that a lot of computing power may be necessary. The work of this thesis is, first, the evaluation of new computing hardware, such as graphics cards or massively multi-core chips, and its application to eigenvalue problems for neutron simulation. Then, in order to exploit the massive parallelism of national supercomputers, we also study the use of asynchronous hybrid methods for solving eigenvalue problems at this very high level of parallelism. We test this work on several national supercomputers, such as the Titane hybrid machine of the Computing Center, Research and Technology (CCRT), the Curie machine of the Very Large Computing Centre (TGCC), which was being installed at the time, and the Hopper machine at the Lawrence Berkeley National Laboratory (LBNL). We also run our experiments on local workstations to illustrate the value of this research for everyday use with local computing resources. (author) [fr]
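
    The dominant-eigenvalue problems at the heart of such neutron calculations are typically attacked with power-type iterations, which map naturally onto graphics cards and multi-core chips because each step is dominated by a matrix-vector product. A generic power-iteration sketch follows (an illustration of the problem class, not the thesis's hybrid asynchronous method):

        import numpy as np

        def power_iteration(A, tol=1e-10, max_iter=10_000):
            """Return the dominant eigenvalue (in magnitude) and its eigenvector."""
            x = np.random.default_rng(0).random(A.shape[0])
            lam = 0.0
            for _ in range(max_iter):
                y = A @ x                      # the step worth offloading to a GPU
                lam_new = np.linalg.norm(y)
                x = y / lam_new
                if abs(lam_new - lam) < tol:
                    break
                lam = lam_new
            return lam, x

        A = np.array([[4.0, 1.0],
                      [2.0, 3.0]])             # eigenvalues 5 and 2
        lam, x = power_iteration(A)
        print(lam)                             # ~5.0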

  20. Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1984-12-01

    The principal objective of the computational physics group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. A summary of the group's activities is presented, including computational studies in MHD equilibria and stability, plasma transport, Fokker-Planck calculations, and efficient numerical and programming algorithms. References are included.

  1. 2 December 2003: Registration of Computers Mandatory for the entire CERN Site

    CERN Multimedia

    2003-01-01

    Following the decision by the CERN Management Board (see Weekly Bulletin 38/2003), registration of all computers connected to CERN's network will be enforced and only registered computers will be allowed network access. The implementation has been put into place in the IT buildings, building 40 and the Prévessin site, and will cover the whole of CERN by 2 December 2003. We therefore strongly recommend that you register all your computers in CERN's network database, including all network access cards (Ethernet AND wireless), as soon as possible, without waiting for the access restriction to take effect. This will allow you to access the network without interruption and will help IT service providers to contact you in case of problems (security problems, viruses, etc.). - If you have a CERN NICE/mail computing account, register at: http://cern.ch/register/ (CERN Intranet page) - If you don't have a CERN NICE/mail computing account (e.g. short term visitors), register at: http://cern.ch/registerVisitorComputer/...

  2. High performance computing system in the framework of the Higgs boson studies

    CERN Document Server

    Belyaev, Nikita; The ATLAS collaboration

    2017-01-01

    Higgs boson physics is one of the most important and promising fields of study in modern High Energy Physics. To perform precision measurements of the Higgs boson properties, fast and efficient instruments for Monte Carlo event simulation are required. Due to the increasing amount of data and the growing complexity of the simulation software tools, the computing resources currently available for Monte Carlo simulation on the LHC GRID are not sufficient. One possibility to address this shortfall of computing resources is the usage of institutes' computer clusters, commercial computing resources, and supercomputers. In this paper, a brief description of Higgs boson physics and of Monte Carlo generation and event simulation techniques is presented. Modern high performance computing systems and tests of their performance are also discussed. These studies have been performed on the Worldwide LHC Computing Grid and the Kurchatov Institute Data Processing Center, including Tier...

  3. Computational Science and Innovation

    International Nuclear Information System (INIS)

    Dean, David Jarvis

    2011-01-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  4. A formalization of computational trust

    NARCIS (Netherlands)

    Güven - Ozcelebi, C.; Holenderski, M.J.; Ozcelebi, T.; Lukkien, J.J.

    2018-01-01

    Computational trust aims to quantify trust and is studied by many disciplines, including computer science, the social sciences, and business science. We propose a formal computational trust model, including its parameters and operations on these parameters, as well as a step-by-step guide to compute trust
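
    The abstract does not reproduce the model itself, but a common concrete instance of a computational trust operation, assumed here purely for illustration rather than drawn from this paper, is the beta-reputation expectation computed from counts of positive and negative interactions:

        def beta_trust(positive: int, negative: int) -> float:
            """Expected trust under a Beta(positive + 1, negative + 1) belief."""
            return (positive + 1) / (positive + negative + 2)

        # 8 good interactions and 2 bad ones yield a trust value of 0.75.
        print(beta_trust(8, 2))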

  5. Blending Synchronous Face-to-Face and Computer-Supported Cooperative Learning in a Hybrid Doctoral Seminar

    Science.gov (United States)

    Roseth, Cary; Akcaoglu, Mete; Zellner, Andrea

    2013-01-01

    Online education is often assumed to be synonymous with asynchronous instruction, existing apart from or supplementary to face-to-face instruction in traditional bricks-and-mortar classrooms. However, expanding access to computer-mediated communication technologies now makes new models possible, including distance learners synchronous online…

  6. Current algorithms used in reactor safety codes and the impact of future computer development on these algorithms

    International Nuclear Information System (INIS)

    Mahaffy, J.H.; Liles, D.R.; Woodruff, S.B.

    1985-01-01

    Computational methods and solution procedures used in the US Nuclear Regulatory Commission's reactor safety systems codes, Transient Reactor Analysis Code (TRAC) and Reactor Leak and Power Safety Excursion Code (RELAP), are reviewed. Methods used in TRAC-PF1/MOD1, including the stability-enhancing two-step (SETS) technique, which permits fast computations by allowing time steps larger than the material Courant stability limit, are described in detail, and the differences from RELAP5/MOD2 are noted. Developments in computing, including parallel and vector processing, and their applicability to nuclear reactor safety codes are described. These developments, coupled with appropriate numerical methods, make detailed faster-than-real-time reactor safety analysis a realistic near-term possibility
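
    The material Courant limit mentioned above caps an explicit scheme's time step at the time a fluid parcel takes to cross one mesh cell; semi-implicit treatments such as SETS relax this cap. A minimal sketch of the limit itself (illustrative only, not TRAC's implementation):

        def material_courant_dt(dx, velocities, cfl=1.0):
            """Largest explicit time step: dt <= cfl * dx / max|u| over the mesh."""
            return cfl * dx / max(abs(u) for u in velocities)

        # 1 cm cells and a 10 m/s peak coolant speed cap an explicit scheme at 1 ms;
        # a SETS-like two-step method may take time steps well beyond this limit.
        print(material_courant_dt(0.01, [2.0, -7.5, 10.0]))   # 0.001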

  7. Quantum computational webs

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2010-01-01

    We discuss the notion of quantum computational webs: These are quantum states universal for measurement-based computation, which can be built up from a collection of simple primitives. The primitive elements--reminiscent of building blocks in a construction kit--are (i) one-dimensional states (computational quantum wires) with the power to process one logical qubit and (ii) suitable couplings, which connect the wires to a computationally universal web. All elements are preparable by nearest-neighbor interactions in a single pass, of the kind accessible in a number of physical architectures. We provide a complete classification of qubit wires, a physically well-motivated class of universal resources that can be fully understood. Finally, we sketch possible realizations in superlattices and explore the power of coupling mechanisms based on Ising or exchange interactions.

  8. The computer boys take over computers, programmers, and the politics of technical expertise

    CERN Document Server

    Ensmenger, Nathan L

    2010-01-01

    This is a book about the computer revolution of the mid-twentieth century and the people who made it possible. Unlike most histories of computing, it is not a book about machines, inventors, or entrepreneurs. Instead, it tells the story of the vast but largely anonymous legions of computer specialists -- programmers, systems analysts, and other software developers -- who transformed the electronic computer from a scientific curiosity into the defining technology of the modern era. As the systems that they built became increasingly powerful and ubiquitous, these specialists became the focus of a series of critiques of the social and organizational impact of electronic computing. To many of their contemporaries, it seemed the "computer boys" were taking over, not just in the corporate setting, but also in government, politics, and society in general. In The Computer Boys Take Over, Nathan Ensmenger traces the rise to power of the computer expert in modern American society. His rich and nuanced portrayal of the ...

  9. EXPERIENCE WITH FPGA-BASED PROCESSOR CORE AS FRONT-END COMPUTER

    International Nuclear Information System (INIS)

    HOFF, L.T.

    2005-01-01

    The RHIC control system architecture follows the familiar "standard model". LINUX workstations are used as operator consoles. Front-end computers are distributed around the accelerator, close to equipment being controlled or monitored. These computers are generally based on VMEbus CPU modules running the VxWorks operating system. I/O is typically performed via the VMEbus, or via PMC daughter cards (via an internal PCI bus), or via on-board I/O interfaces (Ethernet or serial). Advances in FPGA size and sophistication now permit running virtual processor "cores" within the FPGA logic, including "cores" with advanced features such as memory management. Such systems offer certain advantages over traditional VMEbus front-end computers. Advantages include tighter coupling with FPGA logic, and therefore higher I/O bandwidth, and flexibility in packaging, possibly resulting in a lower noise environment and/or lower cost. This paper presents the experience acquired while porting the RHIC control system to a PowerPC 405 core within a Xilinx FPGA for use in low-level RF control

  10. Cloud Computing Organizational Benefits : A Managerial concern

    OpenAIRE

    Mandala, Venkata Bhaskar Reddy; Chandra, Marepalli Sharat

    2012-01-01

    Context: The software industry is looking for new methods and opportunities to reduce project management problems and operational costs. The Cloud Computing concept provides answers to these problems. Cloud Computing is made possible by the availability of high internet bandwidth, and it provides a wide range of services to a varied customer base. Cloud Computing has some key elements, such as on-demand services, a large pool of configurable computing resources, and minimal man...

  11. Use of declarative statements in creating and maintaining computer-interpretable knowledge bases for guideline-based care.

    Science.gov (United States)

    Tu, Samson W; Hrabak, Karen M; Campbell, James R; Glasgow, Julie; Nyman, Mark A; McClure, Robert; McClay, James; Abarbanel, Robert; Mansfield, James G; Martins, Susana M; Goldstein, Mary K; Musen, Mark A

    2006-01-01

    Developing computer-interpretable clinical practice guidelines (CPGs) to provide decision support for guideline-based care is an extremely labor-intensive task. In the EON/ATHENA and SAGE projects, we formulated substantial portions of CPGs as computable statements that express declarative relationships between patient conditions and possible interventions. We developed query and expression languages that allow a decision-support system (DSS) to evaluate these statements in specific patient situations. A DSS can use these guideline statements in multiple ways, including: (1) as inputs for determining preferred alternatives in decision-making, and (2) as a way to provide targeted commentaries in the clinical information system. The use of these declarative statements significantly reduces the modeling expertise and effort required to create and maintain computer-interpretable knowledge bases for decision-support purposes. We discuss possible implications for sharing of such knowledge bases.
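
    A hedged sketch of the general idea, declarative condition-to-intervention statements evaluated against patient data, is shown below; the rules and field names are invented placeholders, not EON/ATHENA or SAGE knowledge content:

        # Each statement declares a condition/intervention relationship; the DSS
        # decides how to use the statements that hold for a given patient.
        RULES = [
            {"if": lambda p: p["systolic_bp"] >= 140,
             "then": "consider antihypertensive therapy"},
            {"if": lambda p: p["on_ace_inhibitor"] and p["potassium"] > 5.5,
             "then": "review ACE inhibitor dose"},
        ]

        def evaluate(patient):
            """Return the interventions whose declarative conditions hold."""
            return [r["then"] for r in RULES if r["if"](patient)]

        patient = {"systolic_bp": 152, "on_ace_inhibitor": True, "potassium": 4.9}
        print(evaluate(patient))   # ['consider antihypertensive therapy']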

  12. HeNCE: A Heterogeneous Network Computing Environment

    Directory of Open Access Journals (Sweden)

    Adam Beguelin

    1994-01-01

    Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.
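
    The graph model can be mimicked in a few lines: nodes are subroutines, arcs are dependencies, and a node runs as soon as all of its predecessors have finished. A schematic stand-in using ordinary Python threads in place of PVM processes (illustrative, not HeNCE itself):

        from concurrent.futures import ThreadPoolExecutor

        # node -> (subroutine, dependency nodes); a toy stand-in for a HeNCE graph
        graph = {
            "load":   (lambda: 21,                    []),
            "double": (lambda load: load * 2,         ["load"]),
            "report": (lambda double: print(double),  ["double"]),
        }

        def run(graph):
            done, pending = {}, dict(graph)
            with ThreadPoolExecutor() as pool:
                while pending:
                    # A node is ready once every one of its dependencies is done.
                    ready = [n for n, (_, deps) in pending.items()
                             if all(d in done for d in deps)]
                    futures = {n: pool.submit(pending[n][0],
                                              *[done[d] for d in pending[n][1]])
                               for n in ready}
                    for n, f in futures.items():
                        done[n] = f.result()
                        del pending[n]
            return done

        run(graph)   # prints 42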

  13. The computer support of diagnostics of circle crystallizers

    Directory of Open Access Journals (Sweden)

    J. David

    2014-04-01

    This paper is focused on computer-aided control of technological processes in continuous steel casting devices. It characterizes the fundamental aspects of creating computer-aided control of continuous steel casting operations and describes in detail the software system AMKO (Algorithm Modelling conicities). The importance of AMKO lies in extending the usable life of the copper mold enclosure as it wears and in assessing the possibility of its further use. The software creates a graphical model of the wear, which shows the conicity of the studied mold along its whole length and specifies the possibility of its further use in various casting sequences. Such modelling and prediction is made possible by cybernetic modelling principles and methods known as soft computing.

  14. Protean appearance of craniopharyngioma on computed tomography

    International Nuclear Information System (INIS)

    Danziger, A.; Price, H.I.

    1979-01-01

    Craniopharyngiomas present a diverse appearance on computed tomography. Histological diagnosis is not always possible, but computed tomography is of great assistance in the delineation of the tumour as well as of the degree of associated hydrocephalus. Computed tomography also enables rapid non-invasive follow-up after surgery or radiotherapy, or both

  15. Introduction to lattice theory with computer science applications

    CERN Document Server

    Garg, Vijay K

    2015-01-01

    A computational perspective on partial order and lattice theory, focusing on algorithms and their applications This book provides a uniform treatment of the theory and applications of lattice theory. The applications covered include tracking dependency in distributed systems, combinatorics, detecting global predicates in distributed systems, set families, and integer partitions. The book presents algorithmic proofs of theorems whenever possible. These proofs are written in the calculational style advocated by Dijkstra, with arguments explicitly spelled out step by step. The author's intent
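
    One of the listed applications, tracking dependency in distributed systems, gives a concrete feel for the lattice operations: vector clocks ordered componentwise form a lattice whose join is the pointwise maximum. A small illustrative sketch (not taken from the book):

        def leq(a, b):
            """Partial order on vector clocks: a happened before (or equals) b."""
            return all(x <= y for x, y in zip(a, b))

        def join(a, b):
            """Least upper bound in the lattice: the pointwise maximum."""
            return tuple(max(x, y) for x, y in zip(a, b))

        a, b = (2, 0, 1), (1, 3, 1)
        print(leq(a, b), leq(b, a))   # False False -> the events are concurrent
        print(join(a, b))             # (2, 3, 1), the earliest clock above both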

  16. REACHING THE COMPUTING HELP DESK

    CERN Multimedia

    Miguel Marquina

    2000-01-01

    You may find it useful to glue the information below, e.g. near/at your computer, for those occasions when access to computer services is not possible. It presents the way to contact the Computing Help Desk (hosted by IT Division as an entry point for general computing issues). Do not hesitate to contact us (by email to User.Relations@cern.ch) for additional information or feedback regarding this matter.
    Your contact for general computing problems or queries:
    Phone number: (+41 22 76) 78888
    Opening hours: Monday to Friday, 8:30-17:30
    Email: Helpdesk@cern.ch
    Web: http://consult.cern.ch/service/helpdesk
    Miguel Marquina, IT Division/User Support

  17. Structure problems in the analog computation; Problemes de structure dans le calcul analogique

    Energy Technology Data Exchange (ETDEWEB)

    Braffort, P.L. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1957-07-01

    Recent mathematical developments have shown the importance of elementary structures (algebraic, topological, etc.) underlying the great domains of classical analysis. Such structures in analog computation are brought into evidence, and possible developments of applied mathematics are discussed. The topological structures of the standard representation of analog schemes, such as addition triangles, integrators, phase inverters and function generators, are also studied. The analog method gives only functions of the variable time as the results of its computations; but the course of computation, for systems including reactive circuits, introduces order structures which are called 'chronological'. Finally, it is shown that the approximation methods of ordinary numerical and digital computation present the same structures as analog computation. This structure analysis permits fruitful comparisons between the several domains of applied mathematics and suggests important new domains of application for the analog method. (M.P.)

  18. 22nd International Conference on Soft Computing

    CERN Document Server

    2017-01-01

    These proceedings contain a collection of selected accepted papers from the Mendel conference held in Brno, Czech Republic, in June 2016. The book contains three chapters presenting recent advances in soft computing, including intelligent image processing. The Mendel conference was established in 1995 and is named after the scientist and Augustinian priest Gregor J. Mendel, who discovered the famous Laws of Heredity. The main aim of the conference is to give students, academics and researchers a regular opportunity to exchange ideas and novel research methods on a yearly basis.

  19. Computed tomography for neurological intensive care patients

    International Nuclear Information System (INIS)

    Rodiek, S.; Neu, I.

    1977-01-01

    The first 100 computed tomographic (CT) examinations of patients on the neurological intensive care ward are discussed and reported on the basis of selected typical findings. Characteristic patterns of CT findings in certain cerebral diseases are explained. The possibility and necessity of CT observation of the development of inflammatory and cerebrovascular processes in particular are emphasized. A comparison of our experience with CT and other neuroradiological methods is made. The clinical diagnoses, including the respective number of cases and the pertinent CT findings, are presented in a table. (orig.) [de]

  1. Visual implementation of computer communication

    OpenAIRE

    Gunnarsson, Tobias; Johansson, Hans

    2010-01-01

    Communication is a fundamental part of life, and during the 20th century several new ways of communicating were developed and created, from the first telegraph, which made it possible to send messages over long distances, to radio communication and the telephone. In the last decades, computer-to-computer communication at high speed has become increasingly important, and so has the need for understanding computer communication. Since data communication today works at speeds that are so high...

  2. Dialogue with computers

    Energy Technology Data Exchange (ETDEWEB)

    Filippazzi, F.

    1991-03-01

    As to whether or not it would be possible to make a computer maintain a dialogue with its operator and give plausible statements without actually 'understanding' what is being spoken about, the answer is, within certain limits, yes. An idea of this was given about 25 years ago by J. Weizenbaum's ELIZA program at MIT, named after G. B. Shaw's Pygmalion Cockney flower-seller who learned to talk like a duchess. The operating mechanism by which a computer can do this must satisfy three prerequisites: the language must be natural; the speech coherent; and the answers should be consistent for any given question, even when that question is asked in a slightly different form. To make this possible, the dialogue must take place within a limited context (in fact, the ELIZA experiment involved a simulated doctor/patient in-studio conversation). This article presents a portion of that conversation, in which the doctor, i.e., the computer, evasively answers the patient's questions without ever actually coming to grips with the issue, to illustrate how such a man-machine interface mechanism works.
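
    The mechanism behind such a dialogue is simple keyword matching with canned reassembly templates and pronoun reflection; no understanding is involved. A minimal Python sketch in the spirit of ELIZA (the patterns below are invented for illustration, not Weizenbaum's original script):

        import random, re

        REFLECT = {"i": "you", "my": "your", "am": "are", "me": "you"}
        RULES = [
            (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
            (r"i am (.*)",   ["Why do you say you are {0}?"]),
            (r".*",          ["Please go on.", "Tell me more."]),  # evasive fallback
        ]

        def reflect(text):
            """Swap pronouns so the captured fragment reads back naturally."""
            return " ".join(REFLECT.get(w, w) for w in text.split())

        def respond(utterance):
            for pattern, templates in RULES:
                m = re.match(pattern, utterance.lower())
                if m:
                    return random.choice(templates).format(*map(reflect, m.groups()))

        print(respond("I feel nobody listens to my ideas"))
        # e.g. "Why do you feel nobody listens to your ideas?"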

  3. Computed tomography for radiographers

    International Nuclear Information System (INIS)

    Brooker, M.

    1986-01-01

    Computed tomography is regarded by many as a complicated union of sophisticated x-ray equipment and computer technology. This book overcomes these complexities. The rigid technicalities of the machinery and the clinical aspects of computed tomography are discussed, including the preparation of patients, both physically and mentally, for scanning. Furthermore, the author also explains how to set up and run a computed tomography department, including advice on how the room should be designed

  4. Toward a Fault Tolerant Architecture for Vital Medical-Based Wearable Computing.

    Science.gov (United States)

    Abdali-Mohammadi, Fardin; Bajalan, Vahid; Fathi, Abdolhossein

    2015-12-01

    Advancements in computers and electronic technologies have led to the emergence of a new generation of efficient small intelligent systems. The products of such technologies include smartphones and wearable devices, which have attracted the attention of medical applications. These products are used less in critical medical applications because of their resource constraints and failure sensitivity, since without safety considerations, small integrated hardware will endanger patients' lives. Therefore, some principles are required for constructing wearable systems in healthcare so that the existing concerns are dealt with. Accordingly, this paper proposes an architecture for constructing wearable systems in critical medical applications. The proposed architecture is a three-tier one, supporting data flow from body sensors to the cloud. The tiers of this architecture are wearable computers, mobile computing, and mobile cloud computing. One feature of this architecture is the high degree of fault tolerance made possible by the nature of its components. Moreover, the required protocols are presented to coordinate the components of this architecture. Finally, the reliability of this architecture is assessed by simulating the architecture and its components, and other aspects of the proposed architecture are discussed.

  5. Computers for Lattice QCD

    International Nuclear Information System (INIS)

    Christ, Norman H

    2000-01-01

    The architecture and capabilities of the computers currently in use for large-scale lattice QCD calculations are described and compared. Based on this present experience, possible future directions are discussed

  6. Upgrade plan for HANARO control computer system

    International Nuclear Information System (INIS)

    Kim, Min Jin; Kim, Young Ki; Jung, Hwan Sung; Choi, Young San; Woo, Jong Sub; Jun, Byung Jin

    2001-01-01

    A microprocessor-based digital control system, the Multi-Loop Controller (MLC), was chosen to control HANARO. Introduced to the market in the early 1980s, it had been used to control petrochemical plants, paper mills and the Slowpoke reactor in Canada. With developments in computer technology it has become an outdated model, and its production was discontinued a few years ago, so difficulty in acquiring spare parts is expected. To achieve stable reactor control over the reactor's lifetime and to avoid technical dependency on the manufacturer, a long-term replacement plan for the HANARO control computer system is under way. The plan will proceed in several steps. This paper briefly introduces the methods of implementation and discusses the engineering activities of the plan.

  7. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  8. EXPLORATIONS IN QUANTUM COMPUTING FOR FINANCIAL APPLICATIONS

    OpenAIRE

    Gare, Jesse

    2010-01-01

    Quantum computers have the potential to increase the solution speed for many computational problems. This paper is a first step into possible applications for quantum computing in the context of computational finance. The fundamental ideas of quantum computing are introduced, followed by an exposition of the algorithms of Deutsch and Grover. Improved mean and median estimation are shown as results of Grover's generalized framework. The algorithm for mean estimation is refined to an improved M...
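
    The speedups referred to are the standard amplitude-amplification asymptotics (textbook figures, assumed here rather than taken from this paper's exact bounds): an unstructured search over N items needs about (π/4)√N Grover iterations versus O(N) classical queries, and mean estimation to accuracy ε needs O(1/ε) quantum queries versus O(1/ε²) classical samples:

        T_{\mathrm{Grover}} \approx \frac{\pi}{4}\sqrt{N}
        \quad\text{vs.}\quad O(N) \ \text{classically},
        \qquad
        T_{\mathrm{mean}}(\epsilon) = O(1/\epsilon)
        \quad\text{vs.}\quad O(1/\epsilon^{2}) \ \text{classically.}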

  9. Perceptually-Inspired Computing

    Directory of Open Access Journals (Sweden)

    Ming Lin

    2015-08-01

    Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, as well as perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multi-modal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experience. I will conclude by discussing possible future research directions.

  10. Computational Nanoelectronics and Nanotechnology at NASA ARC

    Science.gov (United States)

    Saini, Subhash

    1998-01-01

    Both physical and economic considerations indicate that the scaling era of CMOS will run out of steam around the year 2010. However, physical laws also indicate that it is possible to compute at a rate of a billion times present speeds with the expenditure of only one watt of electrical power. NASA has long-term needs for ultra-small semiconductor devices in critical applications: high-performance, low-power, compact computers for intelligent autonomous vehicles and petaflop computing technology are some key examples. To advance the design, development, and production of future generations of micro- and nano-devices, the IT Modeling and Simulation Group has been started at NASA Ames with the goal of developing an integrated simulation environment that addresses problems related to nanoelectronics and molecular nanotechnology. An overview of the nanoelectronics and nanotechnology research activities being carried out at Ames Research Center will be presented, together with the vision and research objectives of the IT Modeling and Simulation Group, including applications of nanoelectronics-based devices relevant to NASA missions.

  11. Scientific Discovery through Advanced Computing in Plasma Science

    Science.gov (United States)

    Tang, William

    2005-03-01

    per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.

  12. SLAC B Factory computing

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1992-02-01

    As part of the research and development program in preparation for a possible B Factory at SLAC, a group has been studying various aspects of HEP computing. In particular, the group is investigating the use of UNIX for all computing, from data acquisition, through analysis, and word processing. A summary of some of the results of this study will be given, along with some personal opinions on these topics

  13. A practical link between medical and computer groups in image data processing

    Energy Technology Data Exchange (ETDEWEB)

    Ollivier, J Y

    1975-01-01

    An acquisition and processing system of scintigraphic images should not be exclusively constructed for a computer specialist. Primarily it should be designed to be easily and quickly handled by a nurse or a doctor and be programmed by the doctor or the computer specialist. This consideration led Intertechnique to construct the CINE 200 system. In fact, the CINE 200 includes a computer and so offers the programming possibilities which are the tools of the computer specialist, even more it was conceived especially for clinic use and offers some functions which cannot be carried out by classical computer and some standard peripherals. In addition, the CINE 200 allows the doctor who is not a computer specialist to familiarize himself with this science by the progressive levels of language, the first level being a link of simple processing on images or curves, the second being an interpretative language identical to BASIC, very easy to learn. Before showing the offered facilities for the doctor and the computer specialist by the CINE 200, its characteristics are briefly reviewed.

  14. Possible rotation-power nature of SGRs and AXPs

    International Nuclear Information System (INIS)

    Malheiro, M.; Lobato, R. V.; Coelho, Jaziel G.; Cáceres, D. L.; De Lima, R. C. R.; Rueda, J. A.; Ruffini, R.

    2017-01-01

    We investigate the possibility that some Soft Gamma-ray Repeaters (SGRs) and Anomalous X-ray Pulsars (AXPs) could be described as rotation-powered neutron stars (NSs). The analysis was carried out by computing the structure properties of NSs; we then focus on giving estimates for the surface magnetic field using both realistic structure parameters of NSs and a general relativistic model of a rotating magnetic dipole. We show that the use of realistic parameters of rotating neutron stars, obtained from numerical integration of the self-consistent axisymmetric general relativistic equations of equilibrium, leads to values of the magnetic field and radiation efficiency of SGRs/AXPs very different from estimates based on fiducial parameters. This analysis leads to a precise prediction of the range of NS masses, obtained here by making use of selected up-to-date nuclear equations of state (EOS). We show that 40% (nine) of the entire observed population of SGRs and AXPs can be described as canonical pulsars driven by the rotational energy of neutron stars, for which we give their possible range of masses. We also show that if the blackbody component in soft X-rays is due to the surface temperature of NSs, then 50% of the sources could be explained as ordinary rotation-powered pulsars. Besides, amongst these sources we find the four SGRs/AXPs with observed radio emission and six that are possibly associated with supernova remnants (including Swift J1834.9-0846, the first magnetar shown to have a surrounding wind nebula), suggesting a natural explanation as ordinary pulsars. (paper)
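
    For orientation, the fiducial magneto-dipole relations that such realistic estimates are benchmarked against are the textbook formulas for a pulsar of period P and period derivative Ṗ, with a canonical moment of inertia I ≈ 10^45 g cm² (an assumption of this note; these are precisely the fiducial parameters the paper moves beyond):

        \dot{E}_{\rm rot} = \frac{4\pi^{2} I \dot{P}}{P^{3}},
        \qquad
        B_{\rm surf} \simeq 3.2\times 10^{19}\,\sqrt{P\,\dot{P}}\ {\rm G}
        \quad (P\ \text{in seconds}).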

  15. Cloud computing services: taxonomy and comparison

    NARCIS (Netherlands)

    Höfer, C.N.; Karagiannis, Georgios

    2011-01-01

    Cloud computing is a highly discussed topic in the technical and economic world, and many of the big players of the software industry have entered the development of cloud services. Several companies want to explore the possibilities and benefits of incorporating such cloud computing services in

  16. ALGORITHMS AND PROGRAMS FOR STRONG GRAVITATIONAL LENSING IN KERR SPACE-TIME INCLUDING POLARIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Bin; Maddumage, Prasad [Research Computing Center, Department of Scientific Computing, Florida State University, Tallahassee, FL 32306 (United States); Kantowski, Ronald; Dai, Xinyu; Baron, Eddie, E-mail: bchen3@fsu.edu [Homer L. Dodge Department of Physics and Astronomy, University of Oklahoma, Norman, OK 73019 (United States)

    2015-05-15

    Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.

  17. ALGORITHMS AND PROGRAMS FOR STRONG GRAVITATIONAL LENSING IN KERR SPACE-TIME INCLUDING POLARIZATION

    International Nuclear Information System (INIS)

    Chen, Bin; Maddumage, Prasad; Kantowski, Ronald; Dai, Xinyu; Baron, Eddie

    2015-01-01

    Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python

  18. Towards ubiquitous access of computer-assisted surgery systems.

    Science.gov (United States)

    Liu, Hui; Lufei, Hanping; Shi, Weishong; Chaudhary, Vipin

    2006-01-01

    Traditional stand-alone computer-assisted surgery (CAS) systems impede ubiquitous and simultaneous access by multiple users. With advances in computing and networking technologies, ubiquitous access to CAS systems becomes possible and promising. Based on our preliminary work, CASMIL, a stand-alone CAS server developed at Wayne State University, we propose a novel mobile CAS system, UbiCAS, which allows surgeons to retrieve, review and interpret multimodal medical images, and to perform some critical neurosurgical procedures, on heterogeneous devices from anywhere at any time. Furthermore, various optimization techniques, including caching, prefetching, a pseudo-streaming model, and compression, are used to guarantee the QoS of the UbiCAS system. UbiCAS enables doctors at remote locations to actively participate in remote surgeries and to share patient information in real time before, during, and after the surgery.

  19. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  20. Neoclassical transport including collisional nonlinearity.

    Science.gov (United States)

    Candy, J; Belli, E A

    2011-06-10

    In the standard δf theory of neoclassical transport, the zeroth-order (Maxwellian) solution is obtained analytically via the solution of a nonlinear equation. The first-order correction δf is subsequently computed as the solution of a linear, inhomogeneous equation that includes the linearized Fokker-Planck collision operator. This equation admits analytic solutions only in extreme asymptotic limits (banana, plateau, Pfirsch-Schlüter), and so must be solved numerically for realistic plasma parameters. Recently, numerical codes have appeared which attempt to compute the total distribution f more accurately than in the standard ordering by retaining some nonlinear terms related to finite-orbit width, while simultaneously reusing some form of the linearized collision operator. In this work we show that higher-order corrections to the distribution function may be unphysical if collisional nonlinearities are ignored.
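
    Schematically, the ordering described above splits the distribution into a Maxwellian and a correction obeying a linear, inhomogeneous equation; a simplified drift-kinetic form (with geometry and drive terms suppressed, so an illustration rather than this paper's equations) is

        f = f_{M} + \delta f,
        \qquad
        v_{\parallel}\,\nabla_{\parallel}\,\delta f \;-\; C_{L}[\delta f]
        \;=\; -\,\mathbf{v}_{d}\cdot\nabla f_{M},

    where C_L is the linearized Fokker-Planck collision operator and v_d the magnetic drift velocity.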

  1. Possibilities of computer and magnetic-resonance tomography in liver neoplasm diagnostics

    International Nuclear Information System (INIS)

    Momot, N.V.; Shpak, S.A.

    2003-01-01

    To compare the capabilities of CT and MRI in the diagnosis of focal liver lesions, 238 patients were studied by CT and 38 by MRI. The results were verified by surgery, fine-needle biopsy, and dynamic observation. CT is the method of choice in the diagnosis of focal liver lesions. MRI has some advantages in revealing small metastases and neoplasms located on the diaphragmatic surface of the liver, in the evaluation of hepatic portal structures, and in assessing the relation of the tumor to surrounding tissues and vessels.

  2. Piping stress analysis with personal computers

    International Nuclear Information System (INIS)

    Revesz, Z.

    1987-01-01

    The growing market for personal computers is providing an increasing number of professionals with unprecedented and surprisingly inexpensive computing capacity which, when used with powerful software, can immensely enhance an engineer's capabilities. This paper focuses on the possibilities that the widespread distribution of personal computers has opened in piping stress analysis, on the necessary changes in the software, and on the limitations of using personal computers for engineering design and analysis. Reliability and quality assurance aspects of using personal computers for nuclear applications are also mentioned. The paper concludes with personal views of the author and experiences gained during the development of interactive graphic piping software for personal computers. (orig./GL)

  3. The Potential for Computer Based Systems in Modular Engineering

    DEFF Research Database (Denmark)

    Miller, Thomas Dedenroth

    1998-01-01

    The paper elaborates on knowledge management and the possibility of computer support for the design process of pharmaceutical production plants, in relation to the Ph.D. project on modular engineering.

  4. How to Build a Quantum Computer

    Science.gov (United States)

    Sanders, Barry C.

    2017-11-01

    Quantum computer technology is progressing rapidly with dozens of qubits and hundreds of quantum logic gates now possible. Although current quantum computer technology is distant from being able to solve computational problems beyond the reach of non-quantum computers, experiments have progressed well beyond simply demonstrating the requisite components. We can now operate small quantum logic processors with connected networks of qubits and quantum logic gates, which is a great stride towards functioning quantum computers. This book aims to be accessible to a broad audience with basic knowledge of computers, electronics and physics. The goal is to convey key notions relevant to building quantum computers and to present state-of-the-art quantum-computer research in various media such as trapped ions, superconducting circuits, photonics and beyond.

  5. Bacterial computing: a form of natural computing and its applications.

    Science.gov (United States)

    Lahoz-Beltra, Rafael; Navarro, Jorge; Marijuán, Pedro C

    2014-01-01

    The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Both bacterial computing and bacterial intelligence are general traits manifested in adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different modes of genetic change become better known and more carefully explored, the full adaptive possibilities of bacteria may be studied from new angles. For instance, instances of molecular "learning" appear along the mechanisms of evolution. More concretely, and looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two different and related mechanisms for adaptation to the environment: the former in somatic time, the latter in evolutionary time. The present chapter reviews the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes as well as to the solution of real-world problems.

  6. Personal computers in high energy physics

    International Nuclear Information System (INIS)

    Quarrie, D.R.

    1987-01-01

    The role of personal computers within HEP is expanding as their capabilities increase and their cost decreases. Already they offer greater flexibility than many low-cost graphics terminals for a comparable cost and in addition they can significantly increase the productivity of physicists and programmers. This talk will discuss existing uses for personal computers and explore possible future directions for their integration into the overall computing environment. (orig.)

  7. Implementation of the equivalence theory inside the computational chain DRAGON/DONJON-NDF

    International Nuclear Information System (INIS)

    Dufour, P.

    2005-01-01

    The work accomplished in the scope of this master's project consists of introducing equivalence theory into the computational scheme DRAGON/DONJON-NDF. This theory takes into account the possible discontinuity of the homogeneous flux at surfaces in problems that involve a homogenization procedure. To do so, the theory includes new factors called discontinuity factors, which in theory give more exact solutions. Because we use the cell code DRAGON to generate all our homogeneous parameters, we also used DRAGON to compute the heterogeneous surface fluxes, which are essential for obtaining the discontinuity factors. The project has been divided into two parts. The first part consists of computing the heterogeneous surface fluxes with the cell code DRAGON. In the second part we performed reactor computations using the code DONJON-NDF (over CANDU-6 geometry) with discontinuity factors and compared the results thus obtained with those computed without discontinuity factors.
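
    In generalized equivalence theory, the discontinuity factor on each face of a homogenized node is the ratio of the reference heterogeneous surface flux to the homogeneous surface flux, and the nodal interface condition becomes continuity of the factor-weighted flux rather than of the flux itself (standard definitions, given as an illustration rather than this thesis's exact notation):

        f_{s}^{\pm} = \frac{\phi_{s}^{\rm het}}{\phi_{s}^{{\rm hom},\pm}},
        \qquad
        f_{s}^{+}\,\phi_{s}^{{\rm hom},+} \;=\; f_{s}^{-}\,\phi_{s}^{{\rm hom},-}
        \quad \text{at each nodal interface } s.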

  8. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function network computer based on FPGA technology. The multi-function network supports conflict-free processor access to data in memory, and supports processor-to-processor data access over an enhanced MESH network. The ABC95 instruction system includes control instructions, scalar instructions, and vector instructions; the network instructions in particular are introduced. A programming environment for ABC95 array computer assembly language is designed, and a programming environment for the ABC95 array computer under VC++ is presented. It includes functions to load ABC95 array computer programs and data, store results, run programs, and so on. In particular, the data type for conflict-free access on the ABC95 array computer is defined. The results show that these technologies allow programs for the ABC95 array computer to be developed effectively.

  9. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  10. Singular problems in shell theory. Computing and asymptotics

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Palencia, Evariste [Institut Jean Le Rond d' Alembert, Paris (France); Millet, Olivier [La Rochelle Univ. (France). LEPTIAB; Bechet, Fabien [Metz Univ. (France). LPMM

    2010-07-01

    It is known that deformations of thin shells exhibit peculiarities such as propagation of singularities, edge and internal layers, piecewise quasi-inextensional deformations, sensitive problems and others, leading in most cases to numerical locking phenomena in several forms and to very poor quality of computations for small relative thickness. Most of these phenomena have a local and often anisotropic character (elongated in some directions), so that efficient numerical schemes should take them into consideration. This book deals with various topics in this context: general geometric formalism, analysis of singularities, numerical computing of thin shell problems, estimates for finite element approximation (including non-uniform and anisotropic meshes), mathematical considerations on boundary value problems in connection with sensitive problems encountered for very thin shells, and others. Most of the numerical computations presented here use an adaptive anisotropic mesh procedure, which allows a good computation of the physical peculiarities on one hand, and the possibility of performing automatic computations (without a previous mathematical description of the singularities) on the other. The book is recommended for PhD students, postgraduates and researchers who want to improve their knowledge of shell theory, in particular in the areas addressed (analysis of singularities, numerical computing of thin and very thin shell problems, sensitive problems). The book need not be read sequentially; the reader may refer directly to the chapters of interest. (orig.)

  11. Proceedings of the 2011 2nd International Congress on Computer Applications and Computational Science

    CERN Document Server

    Nguyen, Quang

    2012-01-01

    The latest inventions in computer technology influence most human daily activities. In the near future, there is a tendency for every aspect of human life to depend on computer applications. In manufacturing, robotics and automation have become vital for high-quality products. In education, the model of teaching and learning is focusing more on electronic media than on traditional ones. Issues related to energy savings and the environment are becoming critical. Computational Science should enhance the quality of human life, not only solve its problems. Computational Science should help humans make wise decisions by presenting choices and their possible consequences. Computational Science should help us make sense of observations, understand natural language, and plan and reason with extensive background knowledge. Intelligence with wisdom is perhaps an ultimate goal of human-oriented science. This book is a compilation of some recent research findings in computer application and computational sci...

  12. In this issue: Time to replace doctors’ judgement with computers

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2015-11-01

    Informaticians continue to rise to the challenge, set by the English Health Minister, of trying to replace doctors’ judgement with computers. This issue describes successes and where there are barriers. However, whilst there is progress, it tends to be incremental, and there are grand challenges to be overcome before computers can replace clinicians. These grand challenges include: (1) improving usability so it is possible to more readily incorporate technology into clinical workflow; (2) rigorous new analytic methods that make use of the mass of available data, ‘Big data’, to create real-world evidence; (3) faster ways of meeting regulatory and legal requirements, including ensuring privacy; (4) provision of reimbursement models to fund innovative technology that can substitute for clinical time; and (5) recognition that innovations that improve quality also often increase cost. Informatics is more likely to support and augment clinical decision making than to replace clinicians.

  13. A new stereotactic apparatus guided by computed tomography

    International Nuclear Information System (INIS)

    Huk, W.J.

    1981-01-01

    The accurate information provided by computed tomography about the existence, shape, and localization of intracranial neoplasms, at an early phase and in inaccessible regions, has greatly improved diagnostics, so that diagnosis now lies far ahead of the therapeutic possibilities for brain tumors. To reduce this wide margin we have developed a new targeting device which makes a stereotactic approach to central lesions possible under sight control by computed tomography, within the computed tomography scanner. With the help of this simple device we are now able to perform computed tomography-guided stereotactic procedures for tumor biopsy, for needling and drainage of abscesses and cysts, and finally for the implantation of radioactive material for the interstitial radiotherapy of inoperable cysts and tumors. (orig.) [de

  14. Building a High Performance Computing Infrastructure for Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Belov, S; Kaplin, V; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2011-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including Budker Institute of Nuclear Physics (BINP), Institute of Computational Technologies (ICT), and Institute of Computational Mathematics and Mathematical Geophysics (ICM and MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, there are currently several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks; the largest of these are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM and MG), and the Grid Computing Facility of BINP. Recently a dedicated optical network with an initial bandwidth of 10 Gbps connecting these three facilities was built in order to make it possible to share the computing resources among the research communities of the participating institutes, thus providing a common platform for building the computing infrastructure for various scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technologies based on the XEN and KVM platforms. The solution implemented was tested thoroughly within the computing environment of the KEDR detector experiment being carried out at BINP, and is foreseen to be applied to the use cases of other HEP experiments in the near future.

  15. Exascale for Energy: The Role of Exascale Computing in Energy Security

    Energy Technology Data Exchange (ETDEWEB)

    Authors, Various

    2010-07-15

    How will the United States satisfy energy demand in a tightening global energy marketplace while, at the same time, reducing greenhouse gas emissions? Exascale computing -- expected to be available within the next eight to ten years -- may play a crucial role in answering that question by enabling a paradigm shift from test-based to science-based design and engineering. Computational modeling of complete power generation systems and engines, based on scientific first principles, will accelerate the improvement of existing energy technologies and the development of new transformational technologies by pre-selecting the designs most likely to be successful for experimental validation, rather than relying on trial and error. The predictive understanding of complex engineered systems made possible by computational modeling will also reduce construction and operations costs, optimize performance, and improve safety. Exascale computing will make possible fundamentally new approaches to quantifying the uncertainty of safety and performance engineering. This report discusses potential contributions of exascale modeling in four areas of energy production and distribution: nuclear power, combustion, the electrical grid, and renewable sources of energy, which include hydrogen fuel, bioenergy conversion, photovoltaic solar energy, and wind turbines. Examples of current research are taken from projects funded by the U.S. Department of Energy (DOE) Office of Science at universities and national laboratories, with a special focus on research conducted at Lawrence Berkeley National Laboratory.

  16. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics, running on a network connecting a parallel computing server and a client terminal, was developed. Using the system, a user at the client terminal can visualize the results of a CFD (Computational Fluid Dynamics) simulation on the parallel computer while the computation is still running on the server. Using a GUI (Graphical User Interface) on the client terminal, the user is also able to change parameters of the analysis and visualization in real time during the calculation. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer, and compresses the data. The amount of data sent from the parallel computer to the client is therefore so much smaller than without compression that the user enjoys swift image display. Parallelization of image data generation is based on the Owner Computation Rule. The GUI on the client is built on a Java applet, so real-time visualization is possible on any client PC on which a Web browser is installed. (author)
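
    As a rough illustration of the compress-before-sending idea described above, the following toy sketch streams length-prefixed, zlib-compressed frames over TCP. It is not the authors' system; NumPy, the port number, and the frame size are illustrative assumptions.

    ```python
    # Toy stand-in for the visualization server: render a frame,
    # compress it, and send it length-prefixed to the client terminal.
    import socket
    import struct
    import zlib

    import numpy as np

    def serve_frames(port=5000, shape=(480, 640), n_steps=100):
        srv = socket.socket()
        srv.bind(("", port))
        srv.listen(1)
        conn, _ = srv.accept()
        for _ in range(n_steps):                # stand-in for CFD time steps
            frame = np.random.rand(*shape).astype(np.float32)  # "rendered" pixels
            payload = zlib.compress(frame.tobytes())  # shrink data before sending
            conn.sendall(struct.pack("!I", len(payload)) + payload)
        conn.close()
    ```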

  17. Computed tomography of splenic trauma

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey, R.B.; Laing, F.C.; Federle, M.P.; Goodman, P.C.

    1981-12-01

    Fifty patients with abdominal trauma and possible splenic injury were evaluated by computed tomography (CT). CT correctly diagnosed 21 of 22 surgically proved traumatic lesions of the spleen (96%). Twenty-seven patients had no evidence of splenic injury. This was confirmed at operation in 1 patient and by clinical follow-up in 26. There were one false negative and one false positive. In 5 patients (10%), CT demonstrated other clinically significant lesions, including hepatic or renal lacerations in 3 and large retroperitoneal hematomas in 2. In adolescents and adults, CT is an accurate, noninvasive method of rapidly diagnosing splenic trauma and associated injuries. Further experience is needed to assess its usefulness in evaluating splenic injuries in infants and small children.

  18. Computed tomography of splenic trauma

    International Nuclear Information System (INIS)

    Jeffrey, R.B.; Laing, F.C.; Federle, M.P.; Goodman, P.C.

    1981-01-01

    Fifty patients with abdominal trauma and possible splenic injury were evaluated by computed tomography (CT). CT correctly diagnosed 21 of 22 surgically proved traumatic lesions of the spleen (96%). Twenty-seven patients had no evidence of splenic injury. This was confirmed at operation in 1 patient and by clinical follow-up in 26. There were one false negative and one false positive. In 5 patients (10%), CT demonstrated other clinically significant lesions, including hepatic or renal lacerations in 3 and large retroperitoneal hematomas in 2. In adolescents and adults, CT is an accurate, noninvasive method of rapidly diagnosing splenic trauma and associated injuries. Further experience is needed to assess its usefulness in evaluating splenic injuries in infants and small children.

  19. Computation of emotions in man and machines.

    Science.gov (United States)

    Robinson, Peter; el Kaliouby, Rana

    2009-12-12

    The importance of emotional expression as part of human communication has been understood since Aristotle, and the subject has been explored scientifically since Charles Darwin and others in the nineteenth century. Advances in computer technology now allow machines to recognize and express emotions, paving the way for improved human-computer and human-human communications. Recent advances in psychology have greatly improved our understanding of the role of affect in communication, perception, decision-making, attention and memory. At the same time, advances in technology mean that it is becoming possible for machines to sense, analyse and express emotions. We can now consider how these advances relate to each other and how they can be brought together to influence future research in perception, attention, learning, memory, communication, decision-making and other applications. The computation of emotions includes both recognition and synthesis, using channels such as facial expressions, non-verbal aspects of speech, posture, gestures, physiology, brain imaging and general behaviour. The combination of new results in psychology with new techniques of computation is leading to new technologies with applications in commerce, education, entertainment, security, therapy and everyday life. However, there are important issues of privacy and personal expression that must also be considered.

  20. Introduction to morphogenetic computing

    CERN Document Server

    Resconi, Germano; Xu, Guanglin

    2017-01-01

    This book offers a concise introduction to morphogenetic computing, showing that its use makes global and local relations, defects in crystal non-Euclidean geometry, databases with source and sink, genetic algorithms, and neural networks more stable and efficient. It also presents applications to databases, language, nanotechnology with defects, biological genetic structure, electrical circuits, and big data structures. In Turing machines, input and output states form a system: when the system is in one state, the input is transformed into output. This computation is always deterministic, without any possible contradiction or defect. In natural computation there are defects and contradictions that have to be resolved to give a coherent and effective computation. The new computation generates the morphology of the system, which assumes different forms in time. The genetic process is the prototype of morphogenetic computing. At the Boolean logic truth value, we substitute a set of truth (active sets) values with...

  1. Computed 3D visualisation of an extinct cephalopod using computer tomographs.

    Science.gov (United States)

    Lukeneder, Alexander

    2012-08-01

    The first 3D visualisation of a heteromorph cephalopod species from the Southern Alps (Dolomites, northern Italy) is presented. Computed tomography, palaeontological data and 3D reconstructions were combined in the production of a movie showing a life reconstruction of the extinct organism. This detailed reconstruction accords with current knowledge of the shape, mode of life and habitat of this animal. The results are based on the most complete shell known thus far of the genus Dissimilites. Object-based combined analyses from computed tomography and various 3D computing programmes help in understanding morphological details as well as their ontogenetic changes in fossil material. In this study, an additional goal was to show changes in locomotion during different ontogenetic phases of such fossil, marine shell-bearing animals (ammonoids). Hence, the presented models and tools can serve as starting points for discussions of the morphology and locomotion of extinct cephalopods in general, and of the genus Dissimilites in particular. The heteromorph ammonoid genus Dissimilites is interpreted here as an active swimmer of the Tethyan Ocean. This study portrays non-destructive methods of 3D visualisation applied to palaeontological material, starting with computed tomography and resulting in animated, high-quality video clips. The 3D geometrical models and animation presented here, which are based on palaeontological material, demonstrate the wide range of applications and analytical techniques, and also outline possible limitations of 3D models in earth sciences and palaeontology. The realistic 3D models and motion pictures can easily be shared amongst palaeontologists. Data, images and short clips can be discussed online and, if necessary, adapted in morphological detail and motion style to better represent the cephalopod animal.

  2. Computed 3D visualisation of an extinct cephalopod using computer tomographs

    Science.gov (United States)

    Lukeneder, Alexander

    2012-08-01

    The first 3D visualisation of a heteromorph cephalopod species from the Southern Alps (Dolomites, northern Italy) is presented. Computed tomography, palaeontological data and 3D reconstructions were combined in the production of a movie showing a life reconstruction of the extinct organism. This detailed reconstruction accords with current knowledge of the shape, mode of life and habitat of this animal. The results are based on the most complete shell known thus far of the genus Dissimilites. Object-based combined analyses from computed tomography and various 3D computing programmes help in understanding morphological details as well as their ontogenetic changes in fossil material. In this study, an additional goal was to show changes in locomotion during different ontogenetic phases of such fossil, marine shell-bearing animals (ammonoids). Hence, the presented models and tools can serve as starting points for discussions of the morphology and locomotion of extinct cephalopods in general, and of the genus Dissimilites in particular. The heteromorph ammonoid genus Dissimilites is interpreted here as an active swimmer of the Tethyan Ocean. This study portrays non-destructive methods of 3D visualisation applied to palaeontological material, starting with computed tomography and resulting in animated, high-quality video clips. The 3D geometrical models and animation presented here, which are based on palaeontological material, demonstrate the wide range of applications and analytical techniques, and also outline possible limitations of 3D models in earth sciences and palaeontology. The realistic 3D models and motion pictures can easily be shared amongst palaeontologists. Data, images and short clips can be discussed online and, if necessary, adapted in morphological detail and motion style to better represent the cephalopod animal.

  3. Preschool Cookbook of Computer Programming Topics

    Science.gov (United States)

    Morgado, Leonel; Cruz, Maria; Kahn, Ken

    2010-01-01

    A common problem in computer programming use for education in general, not simply as a technical skill, is that children and teachers find themselves constrained by what is possible through limited expertise in computer programming techniques. This is particularly noticeable at the preliterate level, where constructs tend to be limited to…

  4. Computer Education and Computer Use by Preschool Educators

    Science.gov (United States)

    Towns, Bernadette

    2010-01-01

    Researchers have found that teachers seldom use computers in the preschool classroom. However, little research has examined why preschool teachers elect not to use computers. This case study focused on identifying whether community colleges that prepare teachers for early childhood education include in their curriculum how teachers can effectively…

  5. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  6. PaCAL: A Python Package for Arithmetic Computations with Random Variables

    Directory of Open Access Journals (Sweden)

    Marcin Korzeń

    2014-05-01

    In this paper we present PaCAL, a Python package for arithmetical computations on random variables. The package is capable of performing the four arithmetic operations: addition, subtraction, multiplication and division, as well as computing many standard functions of random variables. Summary statistics, random number generation, plots, and histograms of the resulting distributions can easily be obtained, and distribution parameter fitting is also available. The operations are performed numerically and their results interpolated, allowing for arbitrary arithmetic operations on random variables following practically any probability distribution encountered in practice. The package is easy to use, as operations on random variables are performed just as they are on standard Python variables. Independence of random variables is, by default, assumed at each step, but some computations on dependent random variables are also possible. We demonstrate on several examples that the results are very accurate, often close to machine precision. Practical applications include statistics, physical measurements, and estimation of error distributions in scientific computations.
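
    As a flavour of the interface described above, a minimal usage sketch follows. The class names NormalDistr and UniformDistr follow the published PaCAL API, but the exact signatures should be checked against the package documentation.

    ```python
    # Minimal PaCAL-style session: arithmetic on random variables
    # reads like arithmetic on ordinary Python variables.
    from pacal import NormalDistr, UniformDistr

    X = NormalDistr(0, 1)   # standard normal random variable
    Y = UniformDistr(0, 1)  # uniform on [0, 1]

    Z = X + Y * X           # distributions are computed numerically

    print(Z.mean())         # summary statistics of the result
    print(Z.cdf(0.5))       # interpolated numerical CDF
    ```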

  7. Cyber Security on Nuclear Power Plant's Computer Systems

    International Nuclear Information System (INIS)

    Shin, Ick Hyun

    2010-01-01

    Computer systems are used in many different fields of industry, and most of us take great advantage of them. Because of the effectiveness and performance of computer systems, we have become highly dependent on computers; but the more dependent we are, the greater the risk we face when a computer system becomes unavailable, inaccessible or uncontrollable. SCADA (Supervisory Control And Data Acquisition) systems are broadly used for critical infrastructure such as transportation, electricity, and water management, and if a SCADA system is vulnerable to cyber attack, a national-scale disaster can follow. Especially if a nuclear power plant's main control systems are attacked by cyber terrorists, the consequences may be enormous: the release of radioactive material could be the terrorists' main purpose, achieved without the use of physical force. In this paper, different types of cyber attacks are described, and a possible structure of an NPP's computer network system is presented. The paper also describes possible ways of destroying the NPP's computer system, along with some suggestions for protection against cyber attacks.

  8. Selective population rate coding: a possible computational role of gamma oscillations in selective attention.

    Science.gov (United States)

    Masuda, Naoki

    2009-12-01

    Selective attention is often accompanied by gamma oscillations in local field potentials and by spike-field coherence in brain areas related to visual, motor, and cognitive information processing. Gamma oscillations are thought to play an important role in, for example, visual tasks including object search, shape perception, and speed detection. However, the mechanism by which gamma oscillations enhance the cognitive and behavioral performance of attentive subjects is still elusive. Using feedforward fan-in networks composed of spiking neurons, we examine a possible role for gamma oscillations in selective attention and population rate coding of external stimuli. We implement the concept proposed by Fries (2005) that under dynamic stimuli, neural populations effectively communicate with each other only when there is a good phase relationship among the associated gamma oscillations. We show that the downstream neural population selects a specific dynamic stimulus received by an upstream population and represents it by population rate coding. The encoded stimulus is the one for which the gamma rhythm in the corresponding upstream population is resonant with the downstream gamma rhythm. The proposed role for gamma oscillations in stimulus selection is to enable top-down control, a neural version of the time-division multiple access used in communication engineering.

  9. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and for quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  10. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Streetscapes have been of long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding of streetscape phenomena, and more vistas, data, and analysis of them, than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  11. Radiometric installations for automatic control of industrial processes and some possibilities of the specialized computers application

    International Nuclear Information System (INIS)

    Kuzino, S.; Shandru, P.

    1979-01-01

    It is noted that the application of radioisotope devices in circuits for the automation of industrial processes makes it possible to obtain on-line information about some parameters of these processes. This information, passed to the computer controlling the process, permits optimum technological parameters to be obtained and maintained. Some elements of automation system design are discussed from the point of view of radiometric device tuning; calibration of the radiometric devices so as to provide a digital answer on-line, with preset accuracy and trustworthiness levels, to the controlling computer; determination of the system's reaction on the basis of preset statistical criteria; and development, from the data obtained from the computer, of an algorithm for functional checking of the radiometric devices' characteristics, i.e. stability and reproducibility of readings in the operating regime, as well as determination of the threshold value of the response, depending on the measured parameter [ru

  12. Program MASTERCALC: an interactive computer program for radioanalytical computations. Description and operating instructions

    International Nuclear Information System (INIS)

    Goode, W.

    1980-10-01

    MASTERCALC is a computer program written to support radioanalytical computations in the Los Alamos Scientific Laboratory (LASL) Environmental Surveillance Group. Included in the program are routines for gross alpha and beta, tritium (3H), gross gamma, 90Sr, and alpha spectroscopic determinations. A description of MASTERCALC is presented and its source listing is included. Operating instructions and example computing sessions are given for each type of analysis.
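
    For orientation, the sketch below shows the kind of gross-count arithmetic such radioanalytical routines perform: net count rate, efficiency correction, and Poisson counting uncertainty. It is generic counting statistics with assumed parameter names, not MASTERCALC's actual algorithm.

    ```python
    # Generic gross alpha/beta activity computation with 1-sigma
    # counting uncertainty (illustrative, not MASTERCALC's code).
    import math

    def activity(gross_counts, count_time, bkg_rate, efficiency, volume):
        """Activity concentration and its counting uncertainty per unit volume."""
        net_rate = gross_counts / count_time - bkg_rate
        sigma_rate = math.sqrt(gross_counts) / count_time  # Poisson error
        return (net_rate / (efficiency * volume),
                sigma_rate / (efficiency * volume))

    print(activity(gross_counts=1500, count_time=600.0,
                   bkg_rate=0.5, efficiency=0.25, volume=1.0))
    ```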

  13. Computer aided plant engineering: An analysis and suggestions for computer use

    International Nuclear Information System (INIS)

    Leinemann, K.

    1979-09-01

    To obtain indications of and boundary conditions for computer use in plant engineering, an analysis of the engineering process was carried out. The structure of plant engineering is represented by a network of subtasks and subsets of data which are to be manipulated. The main tool for integration of CAD subsystems in plant engineering should be a central database, which is described here by its characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer-aided plant engineering are briefly illustrated by two examples. The analysis leads to the conclusion that an interactive graphic system for manipulation of net-like structured data, usable for various subtasks, should be the basis for computer-aided plant engineering. (orig.) [de

  14. The social impact of computers

    CERN Document Server

    Rosenberg, Richard S

    1992-01-01

    The Social Impact of Computers should be read as a guide to the social implications of current and future applications of computers. Among the basic themes presented are the following: the changing nature of work in response to technological innovation as well as the threat to jobs; personal freedom in the machine age as manifested by challenges to privacy, dignity, and work; the relationship between advances in computer and communications technology and the possibility of increased centralization of authority; and the emergence and influence of artificial intelligence and its role in decision

  15. Hardware and software maintenance strategies for upgrading vintage computers

    International Nuclear Information System (INIS)

    Wang, B.C.; Buijs, W.J.; Banting, R.D.

    1992-01-01

    The paper focuses on the maintenance of the computer hardware and software for digital control computers (DCC). Specific design and problems related to various maintenance strategies are reviewed. A foundation was required for a reliable computer maintenance and upgrading program to provide operation of the DCC with high availability and reliability for 40 years. This involved a carefully planned and executed maintenance and upgrading program, involving complementary hardware and software strategies. The computer system was designed on a modular basis, with large sections easily replaceable, to facilitate maintenance and improve availability of the system. Advances in computer hardware have made it possible to replace DCC peripheral devices with reliable, inexpensive, and widely available components from PC-based systems (PC = personal computer). By providing a high speed link from the DCC to a PC, it is now possible to use many commercial software packages to process data from the plant. 1 fig

  16. Improving the reliability of nuclear reprocessing by application of computers and mathematical modelling

    International Nuclear Information System (INIS)

    Gabowitsch, E.; Trauboth, H.

    1982-01-01

    After a brief survey of the present and expected future state of nuclear energy utilization, which should demonstrate the significance of nuclear reprocessing, safety and reliability aspects of nuclear reprocessing plants (NRP) are considered. Then, the principal possibilities of modern computer technology including computer systems architecture and application-oriented software for improving the reliability and availability are outlined. In this context, two information systems being developed at the Nuclear Research Center Karlsruhe (KfK) are briefly described. For design evaluation of certain areas of a large NRP mathematical methods and computer-aided tools developed, used or being designed by KfK are discussed. In conclusion, future research to be pursued in information processing and applied mathematics in support of reliable operation of NRP's is proposed. (Auth.)

  17. Networking and distance learning for teachers: A classification of possibilities

    NARCIS (Netherlands)

    Collis, Betty

    1995-01-01

    Computer based communication technologies, or what could be more conveniently called networking, are bringing new possibilities into teacher education in many different ways. As with distance education more generally they can facilitate flexibility in time and place of learning, but the range of

  18. Computational neuroscience a first course

    CERN Document Server

    Mallot, Hanspeter A

    2013-01-01

    Computational Neuroscience - A First Course provides an essential introduction to computational neuroscience and  equips readers with a fundamental understanding of modeling the nervous system at the membrane, cellular, and network level. The book, which grew out of a lecture series held regularly for more than ten years to graduate students in neuroscience with backgrounds in biology, psychology and medicine, takes its readers on a journey through three fundamental domains of computational neuroscience: membrane biophysics, systems theory and artificial neural networks. The required mathematical concepts are kept as intuitive and simple as possible throughout the book, making it fully accessible to readers who are less familiar with mathematics. Overall, Computational Neuroscience - A First Course represents an essential reference guide for all neuroscientists who use computational methods in their daily work, as well as for any theoretical scientist approaching the field of computational neuroscience.

  19. Introduction: a brief overview of iterative algorithms in X-ray computed tomography.

    Science.gov (United States)

    Soleimani, M; Pengpen, T

    2015-06-13

    This paper presents a brief overview of some basic iterative algorithms; more sophisticated methods are presented in the research papers in this issue. A range of algebraic iterative algorithms are covered here, including ART, SART and OS-SART. A major limitation of the traditional iterative methods is their computational time. The Krylov subspace based methods, such as the conjugate gradient (CG) algorithm and its variants, can be used to solve the linear systems of equations arising from large-scale CT, with possible implementation using modern high-performance computing tools. The overall aim of this theme issue is to stimulate international efforts to develop the next generation of X-ray computed tomography (CT) image reconstruction software. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
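
    To make the algebraic iteration concrete, here is a minimal sketch of the classical ART (Kaczmarz) update for a linear system Ax = b; it is illustrative NumPy code, not drawn from the papers in this issue.

    ```python
    # ART/Kaczmarz: sweep over rows of A, projecting the current
    # estimate onto the hyperplane defined by each measurement.
    import numpy as np

    def art(A, b, n_sweeps=10, relax=1.0):
        x = np.zeros(A.shape[1])
        row_norms = np.einsum('ij,ij->i', A, A)  # squared row norms
        for _ in range(n_sweeps):
            for i in range(A.shape[0]):
                if row_norms[i] > 0:
                    residual = b[i] - A[i] @ x
                    x += relax * (residual / row_norms[i]) * A[i]
        return x

    # Toy system standing in for a tiny "scan" of a 2-pixel object
    A = np.array([[1.0, 1.0], [1.0, -1.0], [2.0, 1.0]])
    x_true = np.array([0.7, 0.3])
    print(art(A, A @ x_true))  # approaches x_true
    ```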

  20. On the possibility of study the surface structure of small bio-objects, including fragments of nucleotide chains, by means of electron interference

    Energy Technology Data Exchange (ETDEWEB)

    Namiot, V.A., E-mail: vnamiot@gmail.co [Institute of Nuclear Physics, Moscow State University, Vorobyovy Gory, 119992 Moscow (Russian Federation)

    2009-07-20

    We propose a new method to study the surface of small bio-objects, including macromolecules and their complexes. The method is based on the interference of low-energy electrons. Theoretically, this type of interference may allow the construction of a hologram of a biological object but, unlike an optical hologram, with a spatial resolution of the order of inter-atomic distances. The method provides the possibility of constructing a series of such holograms at various electron energies. In theory, such information would be enough to identify the types of molecular groups present on the surface of the studied object. The method could also be used for 'fast reading' of nucleotide chains. It is shown how to deposit a long linear molecule as a straight line on a substrate before carrying out such 'reading'.

  1. Computer-controlled 3-D treatment delivery

    International Nuclear Information System (INIS)

    Fraass, Benedick A.

    1995-01-01

    Purpose/Objective: This course will describe the use of computer-controlled treatment delivery techniques for treatment of patients with sophisticated conformal therapy. In particular, research and implementation issues related to clinical use of computer-controlled conformal radiation therapy (CCRT) techniques will be discussed. The possible/potential advantages of CCRT techniques will be highlighted using results from clinical 3-D planning studies. Materials and Methods: In recent years, 3-D treatment planning has been used to develop and implement 3-D conformal therapy treatment techniques, and studies based on these conformal treatments have begun to show the promise of conformal therapy. This work has been followed by the development of commercially-available multileaf collimator and computer control systems for treatment machines. Using these (and other) CCRT devices, various centers are beginning to clinically use complex computer-controlled treatments. Both research and clinical CCRT treatment techniques will be discussed in this presentation. General concepts and requirements for CCRT will be mentioned. Developmental and clinical experience with CCRT techniques from a number of centers will be utilized. Results: Treatment planning, treatment preparation and treatment delivery must be approached in an integrated fashion in order to clinically implement CCRT treatment techniques, and the entire process will be discussed. Various CCRT treatment methodologies will be reviewed from operational, dosimetric, and technical points of view. The discussion will concentrate on CCRT techniques which are likely to see rather wide dissemination over the next several years, including particularly the use of multileaf collimators (MLC), dynamic and segmental conformal therapy, conformal field shaping, and other related techniques. More advanced CCRT techniques, such as the use of individualized intensity modulation of beams or segments, and the use of computer

  2. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  3. The value of computed tomography (CT) in the treatment of lung cancer

    International Nuclear Information System (INIS)

    Striggaris, K.; Gouliamos, A.; Garmatis, C.; Kaklamanis, N.; Vlahos, L.; Pontifex, G.

    1982-01-01

    The extensive recent literature on computed tomography (CT) includes several reports demonstrating the usefulness of body scanners in radiotherapy treatment planning, following earlier experience indicating the potential application of the technique in chest disease. The fast scan times possible with newer CT systems eliminate motion degradation and provide accurate localization of thoracic tumors. This paper reports the authors' experience with CT in the treatment planning of 38 patients with bronchogenic carcinoma after pretherapy evaluation by CT. They conclude that the availability of CT scan data helps to define the target volume accurately and provides the information needed by treatment planning computers to estimate the desired dose. (Auth.)

  4. Cloud computing as a new technology trend in education

    OpenAIRE

    Шамина, Ольга Борисовна; Буланова, Татьяна Валентиновна

    2014-01-01

    The construction and operation of extremely large-scale commodity-computer datacenters was the key enabler of cloud computing. Cloud computing can offer services that are well suited for use in education. With cloud computing it is possible to increase the quality of education, improve communicative culture, and give teachers and students new application opportunities.

  5. Computed tomography in diagnostics of effluent otitis media

    International Nuclear Information System (INIS)

    Imomova, L.S.; Norboev, Z.; Kalandarov, S.Ch.

    2011-01-01

    This article is devoted to computed tomography in the diagnosis of effluent otitis media. The purpose of the present work is to assess the possibilities of computed tomography of the temporal bone in the diagnosis of otitis media.

  6. FCJ-135 Feral Computing: From Ubiquitous Calculation to Wild Interactions

    Directory of Open Access Journals (Sweden)

    Matthew Fuller

    2011-12-01

    In ‘The Coming Age of Calm Technology’, Mark Weiser and John Seely Brown are clear in their assertions: what really ‘matters’ about technology is not technology in itself but rather its capacity to continuously recreate our relationship with the world at large (Brown and Weiser, 1996). Even though they promote such an idea under the banner of ‘calm technology’, what is central to their thesis is the mutational capacities brought into the world by the spillage of computation out of its customary boxes. What their work tends to occlude is that in setting the almost imperceptible but deep sinking of technology into the ‘everyday’ as a target for ubiquitous computing, other possibilities are masked, for instance those of greater hackability or interrogability of such technologies. Our contention is that making ubicomp seamless (MacColl et al., 2002) tends to obfuscate the potential of computation in reworking computational subjects, including societies, modes of life, and inter-relations with the dynamics of thought and the composition of experience and understanding.

  7. Recent trends in grid computing

    International Nuclear Information System (INIS)

    Miura, Kenichi

    2004-01-01

    Grid computing is a technology which allows uniform and transparent access to geographically dispersed computational resources, such as computers, databases, and experimental and observational equipment, via high-speed, high-bandwidth networking. The commonly used analogy is that of the electrical power grid, whereby household electricity is made available from outlets on the wall, and little thought needs to be given to where the electricity is generated and how it is transmitted. The usage of the grid also includes distributed parallel computing, high-throughput computing, data-intensive computing (data grid) and collaborative computing. This paper reviews the historical background, software structure, current status and on-going grid projects, including applications of grid technology to nuclear fusion research. (author)

  8. Stress-intensity factors for surface cracks in pipes: a computer code for evaluation by use of influence functions. Final report

    International Nuclear Information System (INIS)

    Dedhia, D.D.; Harris, D.O.

    1982-06-01

    A user-oriented computer program for the evaluation of stress intensity factors for cracks in pipes is presented. Stress intensity factors for semi-elliptical, complete circumferential, and long longitudinal cracks can be obtained using this computer program. The code is based on the method of influence functions, which makes it possible to treat arbitrary stresses on the plane of the crack. The stresses on the crack plane can be entered as a mathematical or tabulated function. A user's manual and background information are included in this report.
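
    As an illustration of the influence-function idea, the sketch below evaluates K = ∫ h(x) σ(x) dx over the crack plane by numerical quadrature. Both the influence function and the stress profile used here are made-up placeholders, not the functions tabulated in the report.

    ```python
    # Numerical quadrature of influence function times crack-plane stress.
    import numpy as np

    def stress_intensity(influence, stress, a, n=200):
        """Integrate influence(x) * stress(x) over crack depth [0, a]."""
        x = np.linspace(0.0, a, n)
        return np.trapz(influence(x) * stress(x), x)

    a = 0.01  # crack depth in metres (placeholder)
    h = lambda x: 2.0 / np.sqrt(np.pi * a * np.maximum(1.0 - (x / a) ** 2, 1e-12))
    sigma = lambda x: 100e6 * (1.0 - x / a)  # arbitrary linear stress profile, Pa
    print(stress_intensity(h, sigma, a))     # K in Pa*sqrt(m)
    ```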

  9. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should therefore involve sets of computational modelling activities with computer software systems which give students the possibility to improve their mathematical or programming knowledge while simultaneously focusing on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) easy and intuitive creation of mathematical models using just standard mathematical notation; 2) simultaneous exploration of images, tables, graphs and object animations; 3) attribution of mathematical properties expressed in the models to animated objects; and finally 4) computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students the possibility to overcome their lack of advanced mathematical or programming knowledge and focus on the learning of seismological concepts and processes, taking advantage of basic scientific computation methods and tools.
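
    To indicate the level of such exercises, the fragment below sketches the straight-ray forward model that underlies many introductory tomography activities: travel time as a path sum of cell slownesses on a grid. It is purely illustrative and not taken from the Modellus materials.

    ```python
    # Straight-ray travel time over a gridded slowness model.
    import numpy as np

    slowness = np.array([[0.50, 0.40],
                         [0.45, 0.55]])  # s/km for each 1-km cell

    def travel_time(model, cells, lengths):
        """Sum slowness * path length over the cells a ray crosses."""
        return sum(model[c] * L for c, L in zip(cells, lengths))

    # A ray crossing the top row horizontally, 1 km in each cell:
    print(travel_time(slowness, [(0, 0), (0, 1)], [1.0, 1.0]))  # 0.9 s
    ```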

  10. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  11. Computational modeling applied to stress gradient analysis for metallic alloys

    International Nuclear Information System (INIS)

    Iglesias, Susana M.; Assis, Joaquim T. de; Monine, Vladimir I.

    2009-01-01

    Nowadays composite materials, including materials reinforced by particles, are at the center of researchers' attention. There are problems with stress measurements in these materials, connected with the superficial stress gradient caused by the difference between the stress state of particles at the surface and in the matrix of the composite material. Computer simulation of the diffraction profile formed by the superficial layers of material allows the diffraction experiment to be simulated and gives the possibility of resolving the problem of stress measurements when the stress state is characterized by a strong gradient. The aim of this paper is the application of the computer simulation technique, initially developed for homogeneous materials, to diffraction line simulation for composite materials and alloys. Specifically, we applied this technique to silumin fabricated by powder metallurgy. (author)

  12. McMaster University: College and University Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    The computing and information services (CIS) organization includes administrative computing, academic computing, and networking and has three divisions: computing services, development services, and information services. Other computing activities include Health Sciences, Humanities Computing Center, and Department of Computer Science and Systems.…

  13. GRAPH-BASED POST INCIDENT INTERNAL AUDIT METHOD OF COMPUTER EQUIPMENT

    Directory of Open Access Journals (Sweden)

    I. S. Pantiukhin

    2016-05-01

    A graph-based post-incident internal audit method for computer equipment is proposed. The essence of the proposed solution consists in establishing relationships among hard disk dumps (images), RAM dumps and network data. The method is intended for describing the properties of an information security incident during a post-incident internal audit of computer equipment. Hard disk dumps are received and formed at the first step, followed by separation of these dumps into a set of components. The set of components includes a large set of attributes that forms the basis for the construction of the graph. The separated data are recorded into a non-relational database management system (NoSQL) that is adapted for graph storage, fast access and processing. A dump-linking method is applied at the final step. The presented method gives a human expert in information security or computer forensics the possibility of a more precise and informative internal audit of computer equipment. The proposed method allows reducing the time spent on internal audits of computer equipment while increasing the accuracy and informativeness of such audits. The method has development potential and can be applied along with other components in the tasks of user identification and computer forensics.
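
    As a sketch of the linking step, the fragment below builds such an artifact graph with networkx standing in for the NoSQL graph store mentioned in the paper; the node names, attributes and relations are illustrative placeholders.

    ```python
    # Link components extracted from disk, RAM and network dumps in a graph.
    import networkx as nx

    G = nx.Graph()
    # Components extracted from the dumps become attributed nodes...
    G.add_node("file:cmd.exe", source="disk")
    G.add_node("proc:4242", source="ram", image_name="cmd.exe")
    G.add_node("conn:10.0.0.5:443", source="network")
    # ...and shared attributes become edges between them.
    G.add_edge("file:cmd.exe", "proc:4242", relation="executed_from")
    G.add_edge("proc:4242", "conn:10.0.0.5:443", relation="opened")

    # The auditor can then walk the relationships around a suspicious artifact:
    print(list(nx.neighbors(G, "proc:4242")))
    ```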

  14. Possibilities of computed bronchophonography in the diagnosis of external respiratory dysfunction in patients with cystic fibrosis

    Directory of Open Access Journals (Sweden)

    E. B. Pavlinova

    2016-01-01

    The degree of respiratory organ injury in cystic fibrosis determines the prognosis of the disease. Objective: to evaluate external respiratory function in children with cystic fibrosis. The study enrolled 48 children followed up at the Omsk Cystic Fibrosis Center. A control group consisted of 42 non-smoking children with no history of respiratory disease. External respiratory function was evaluated using computed bronchophonography; spirography was additionally carried out in children over 6 years of age. Computed bronchophonography revealed obstructive respiratory failure in all children with severe cystic fibrosis. Chronic respiratory tract infection with Pseudomonas aeruginosa and bronchiectasis were associated with higher values of the acoustic work of breathing at frequencies over 5000 Hz. A moderate negative correlation was established between the acoustic work of breathing in the high-frequency range and the forced expiratory volume in 1 second (as a percentage). Conclusion: computed bronchophonography can reveal obstructive external respiratory dysfunction in children less than 6 years of age.

  15. Tracing monadic computations and representing effects

    Directory of Open Access Journals (Sweden)

    Maciej Piróg

    2012-02-01

    In functional programming, monads are supposed to encapsulate computations, effectfully producing the final result but keeping to themselves the means of acquiring it. For various reasons, we sometimes want to reveal the internals of a computation. To make that possible, in this paper we introduce monad transformers that add the ability to automatically accumulate observations about the course of execution as an effect. We discover that if we treat the resulting trace as the actual result of the computation, we can find new functionality in existing monads, notably when working with non-terminating computations.
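
    To convey the idea in imperative terms, here is a minimal Python transcription of a Writer-style tracing effect; the paper itself works with Haskell-style monad transformers, not this code.

    ```python
    # A value paired with an accumulated trace, plus unit/bind to chain steps.
    def unit(x):
        return (x, [])

    def bind(m, f):
        value, trace = m
        result, more = f(value)
        return (result, trace + more)  # traces accumulate as an effect

    def step(label, f):
        return lambda x: (f(x), ["%s(%r)" % (label, x)])

    # Treating the trace as part of the result of the computation:
    m = bind(bind(unit(3), step("double", lambda x: 2 * x)),
             step("inc", lambda x: x + 1))
    print(m)  # (7, ["double(3)", "inc(6)"])
    ```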

  16. Playing with your Brain : Brain-Computer Interfaces and Games

    NARCIS (Netherlands)

    Nijholt, Anton; Tan, Desney; Bernhaupt, Regina; Tscheligi, Manfred

    2007-01-01

    In this workshop we investigate a possible role of brain-computer interaction in computer games and entertainment computing. The assumption is that brain activity, whether it is consciously controlled and directed by the user or just recorded in order to obtain information about the user’s affective ...

  17. Playing with your Brain: Brain-Computer Interfaces and Games

    NARCIS (Netherlands)

    Nijholt, Antinus; Tan, Desney; Bernhaupt, R.; Tscheligi, M.

    2007-01-01

    In this workshop we investigate a possible role of brain-computer interaction in computer games and entertainment computing. The assumption is that brain activity, whether it is consciously controlled and directed by the user or just recorded in order to obtain information about the user’s affective ...

  18. Bacterial computing: a form of natural computing and its applications

    Directory of Open Access Journals (Sweden)

    Rafael eLahoz-Beltra

    2014-03-01

    Full Text Available The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Bacterial computing and bacterial intelligence are two general traits manifested in adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different modes of genetic change become better known and more carefully explored, the whole range of adaptive possibilities of bacteria can be studied from new angles. For instance, instances of molecular learning appear along the mechanisms of evolution. More concretely, looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two distinct but related mechanisms of adaptation to the environment: the former acting in somatic time, the latter in evolutionary time. The present chapter reviews the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes as well as to the solution of real-world problems.

  19. Advanced topics in security computer system design

    International Nuclear Information System (INIS)

    Stachniak, D.E.; Lamb, W.R.

    1989-01-01

    The capability, performance, and speed of contemporary computer processors, plus the associated performance capability of the operating systems accommodating the processors, have enormously expanded the scope of possibilities for designers of nuclear power plant security computer systems. This paper addresses the choices that could be made by a designer of security computer systems working with contemporary computers and describes the improvement in functionality of contemporary security computer systems based on an optimally chosen design. Primary initial considerations concern the selection of (a) the computer hardware and (b) the operating system. Considerations for hardware selection concern processor and memory word length, memory capacity, and numerous processor features

  20. Computed tomography of the skeletal system

    International Nuclear Information System (INIS)

    Maas, R.; Heller, M.

    1990-01-01

    Patients showing severe multiple injuries require special care and attention in the hospital. In these cases, the range of the diagnostic measures taken subsequent to computed tomography of the cranium must be broadened to include examinations of the vertebral column and pelvic ring for traumatic lesions. Radiological routine procedures are discussed with a view to throwing some light on the problems involved in computed tomography of the vertebral disks. In degenerative processes associated with spinal stenosis and hypertrophic facets, angular sagittal reconstruction has been found to be quite useful. Computed tomography provides valuable information on morphological factors and has great discriminating power in the diagnosis of skeletal tumours of the extremities. Quantitative computed tomography offers unprecedented possibilities in the diagnosis and treatment of osteoporosis; here, particular care must be taken to avoid inaccuracies of measurement as a result of incorrectly performed examinations. In malignant bone tumours, the method of dynamic scanning permits the success or failure of any radiotherapeutic or chemotherapeutic measures taken to be evaluated at an early stage. (orig.) [de

  1. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination, and agent-based modeling of liquidity costs and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  2. Analysis of possibility to apply new mathematical methods (R-function theory) in Monte Carlo simulation of complex geometry

    International Nuclear Information System (INIS)

    Altiparmakov, D.

    1988-12-01

    This analysis is part of the report on 'Implementation of the geometry module of the 05R code in another Monte Carlo code', chapter 6.0: establishment of future activity related to geometry in the Monte Carlo method. The introduction points out some problems in solving complex three-dimensional models, which induce the need for developing more efficient geometry modules for Monte Carlo calculations. The second part includes the formulation of the problem and of the geometry module. Two fundamental questions to be solved are defined: (1) for a given point, determine the material region or boundary to which it belongs, and (2) for a given direction, determine all crossing points with material regions. The third part deals with a possible connection with Monte Carlo calculations for computer simulation of geometry objects. R-function theory enables the creation of a geometry module based on the same logic (complex regions are constructed by set operations on elementary regions) as constructive geometry codes. R-functions can efficiently replace functions of three-valued logic in all significant models. They are all the more appropriate for application since three-valued logic is not typical for digital computers, which operate in two-valued logic. This shows that there is a need for work in this field. It is shown that it is possible to develop an interactive code for computer modeling of geometry objects in parallel with the development of the geometry module [sr
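
    For readers unfamiliar with R-functions, the sketch below shows the basic idea the abstract relies on: a region is encoded as a real-valued function (positive inside, negative outside), and set operations on regions become arithmetic on those functions, replacing three-valued logic. The particular R-function family and the example geometry are illustrative choices, not taken from the report.

    ```python
    # R-functions: set operations on implicitly defined regions become
    # arithmetic on real-valued functions (alpha = 0 family shown).
    import math

    def sphere(cx, cy, cz, r):
        """Positive inside the sphere, zero on it, negative outside."""
        return lambda p: r**2 - ((p[0]-cx)**2 + (p[1]-cy)**2 + (p[2]-cz)**2)

    def r_and(f, g):
        """R-conjunction: intersection of two regions."""
        return lambda p: f(p) + g(p) - math.hypot(f(p), g(p))

    def r_not(f):
        """Complement of a region."""
        return lambda p: -f(p)

    # A spherical shell: inside the outer sphere AND outside the inner one.
    outer = sphere(0, 0, 0, 2.0)
    inner = sphere(0, 0, 0, 1.0)
    shell = r_and(outer, r_not(inner))

    for point in [(0, 0, 0), (1.5, 0, 0), (3, 0, 0)]:
        print(point, "inside" if shell(point) > 0 else "outside/boundary")
    ```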

  3. Computer simulation and experimental self-assembly of irradiated glycine amino acid under magnetic fields: Its possible significance in prebiotic chemistry.

    Science.gov (United States)

    Heredia, Alejandro; Colín-García, María; Puig, Teresa Pi I; Alba-Aldave, Leticia; Meléndez, Adriana; Cruz-Castañeda, Jorge A; Basiuk, Vladimir A; Ramos-Bernal, Sergio; Mendoza, Alicia Negrón

    2017-12-01

    Ionizing radiation may have played a relevant role in chemical reactions for prebiotic biomolecule formation on ancient Earth. Environmental conditions such as the presence of water and magnetic fields were possibly relevant in the formation of organic compounds such as amino acids. ATR-FTIR, Raman, EPR and X-ray spectroscopies provide valuable information about molecular organization of different glycine polymorphs under static magnetic fields. γ-glycine polymorph formation increases in irradiated samples interacting with static magnetic fields. The increase in γ-glycine polymorph agrees with the computer simulations. The AM1 semi-empirical simulations show a change in the catalyst behavior and dipole moment values in α and γ-glycine interaction with the static magnetic field. The simulated crystal lattice energy in α-glycine is also affected by the free radicals under the magnetic field, which decreases its stability. Therefore, solid α and γ-glycine containing free radicals under static magnetic fields might have affected the prebiotic scenario on ancient Earth by causing the oligomerization of glycine in prebiotic reactions. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Computer Registration Becoming Mandatory

    CERN Multimedia

    2003-01-01

    Following the decision by the CERN Management Board (see Weekly Bulletin 38/2003), registration of all computers connected to CERN's network will be enforced and only registered computers will be allowed network access. The implementation has started with the IT buildings, continues with building 40 and the Prevessin site (as of Tuesday 4th November 2003), and will cover the whole of CERN before the end of this year. We therefore strongly recommend that you register all your computers in CERN's network database (Ethernet and wireless cards) as soon as possible, without waiting for the access restriction to come into force. This will allow you to access the network without interruption and help IT service providers to contact you in case of problems (security problems, viruses, etc.) • Users WITH a CERN computing account register at: http://cern.ch/register/ (CERN Intranet page) • Visitors WITHOUT a CERN computing account (e.g. short term visitors) register at: http://cern.ch/registerVisitorComp...

  5. Surface roughness effect on ultracold neutron interaction with a wall and implications for computer simulations

    International Nuclear Information System (INIS)

    Steyerl, A.; Malik, S. S.; Desai, A. M.; Kaufman, C.

    2010-01-01

    We review the diffuse scattering and the loss coefficient in ultracold neutron reflection from slightly rough surfaces, report a surprising reduction in loss coefficient due to roughness, and discuss the possibility of transition from quantum treatment to ray optics. The results are used in a computer simulation of neutron storage in a recent neutron lifetime experiment that reported a large discrepancy of neutron lifetime with the current particle data value. Our partial reanalysis suggests the possibility of systematic effects that were not included in this publication.

  6. 12 CFR 1102.27 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Computing time. 1102.27 Section 1102.27 Banks... for Proceedings § 1102.27 Computing time. (a) General rule. In computing any period of time prescribed... time begins to run is not included. The last day so computed is included, unless it is a Saturday...

  7. 12 CFR 622.21 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Computing time. 622.21 Section 622.21 Banks and... Formal Hearings § 622.21 Computing time. (a) General rule. In computing any period of time prescribed or... run is not to be included. The last day so computed shall be included, unless it is a Saturday, Sunday...
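
    Both of these regulations state the same counting rule: the day of the act is excluded, and the last day is included unless it falls on a Saturday, Sunday or holiday. A minimal sketch of that rule (the holiday list is omitted here for brevity):

    ```python
    # Period computation as described in the rule: exclude the day of the
    # act, include the last day unless it lands on a weekend.
    from datetime import date, timedelta

    def period_ends(act_day: date, days: int) -> date:
        end = act_day + timedelta(days=days)   # day of the act not counted
        while end.weekday() >= 5:              # 5 = Saturday, 6 = Sunday
            end += timedelta(days=1)           # roll forward to a business day
        return end

    print(period_ends(date(2010, 1, 1), 10))   # example: a 10-day period
    ```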

  8. Including gauge corrections to thermal leptogenesis

    International Nuclear Information System (INIS)

    Huetig, Janine

    2013-01-01

    ... Furthermore, we have computed the Majorana neutrino production rate itself in chapter 6 to test our numerical procedure. In this context we have calculated the tree-level result as well as the gauge corrected result for the Majorana neutrino production rate. Finally in chapter 7, we have implemented the Majorana neutrino ladder rung diagram into our setup for leptogenesis: As a first consideration, we have collected all gauge corrected diagrams up to three-loop order for the asymmetry-causing two-loop diagrams. However, the results of chap. 5 showed that it is not sufficient to just include diagrams up to three-loop level. Due to the necessity of resumming all n-loop diagrams, we have constructed a cylindrical diagram that fulfils this condition. This diagram is the link between the Majorana neutrino ladder rung diagram calculated before on the one hand and the lepton asymmetry on the other. Therefore we have been able to derive a complete expression for the integrated lepton number matrix including all leading order corrections. The numerical analysis of this lepton number matrix needs a great computational effort since for the resulting eight-dimensional integral two ordinary differential equations have to be computed for each point the routine evaluates. Thus the result remains yet inaccessible. Research perspectives: Summarising, this thesis provides the basis for a systematic inclusion of gauge interactions in thermal leptogenesis scenarios. As a next step, one should evaluate the expression for the integrated lepton number numerically to gain a value, which can be used for comparison to earlier results such as the solutions of the Boltzmann equations as well as the Kadanoff-Baym ansatz with the implemented Standard Model widths. This numerical result would be the first quantitative number, which contains leading order corrections due to all interactions of the Majorana neutrino with the Standard Model particles. Further corrections by means of including washout effects ...

  9. Including gauge corrections to thermal leptogenesis

    Energy Technology Data Exchange (ETDEWEB)

    Huetig, Janine

    2013-05-17

    ... Furthermore, we have computed the Majorana neutrino production rate itself in chapter 6 to test our numerical procedure. In this context we have calculated the tree-level result as well as the gauge corrected result for the Majorana neutrino production rate. Finally in chapter 7, we have implemented the Majorana neutrino ladder rung diagram into our setup for leptogenesis: As a first consideration, we have collected all gauge corrected diagrams up to three-loop order for the asymmetry-causing two-loop diagrams. However, the results of chap. 5 showed that it is not sufficient to just include diagrams up to three-loop level. Due to the necessity of resumming all n-loop diagrams, we have constructed a cylindrical diagram that fulfils this condition. This diagram is the link between the Majorana neutrino ladder rung diagram calculated before on the one hand and the lepton asymmetry on the other. Therefore we have been able to derive a complete expression for the integrated lepton number matrix including all leading order corrections. The numerical analysis of this lepton number matrix needs a great computational effort since for the resulting eight-dimensional integral two ordinary differential equations have to be computed for each point the routine evaluates. Thus the result remains yet inaccessible. Research perspectives: Summarising, this thesis provides the basis for a systematic inclusion of gauge interactions in thermal leptogenesis scenarios. As a next step, one should evaluate the expression for the integrated lepton number numerically to gain a value, which can be used for comparison to earlier results such as the solutions of the Boltzmann equations as well as the Kadanoff-Baym ansatz with the implemented Standard Model widths. This numerical result would be the first quantitative number, which contains leading order corrections due to all interactions of the Majorana neutrino with the Standard Model particles. Further corrections by means of including washout effects ...

  10. A computational approach to chemical etiologies of diabetes

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Brunak, Søren; Grandjean, Philippe

    2013-01-01

    Computational meta-analysis can link environmental chemicals to genes and proteins involved in human diseases, thereby elucidating possible etiologies and pathogeneses of non-communicable diseases. We used an integrated computational systems biology approach to examine possible pathogenetic linkages in type 2 diabetes (T2D) through genome-wide associations, disease similarities, and published empirical evidence. Ten environmental chemicals were found to be potentially linked to T2D; the highest scores were observed for arsenic, 2,3,7,8-tetrachlorodibenzo-p-dioxin, hexachlorobenzene...

  11. Quantum algorithms for computational nuclear physics

    Directory of Open Access Journals (Sweden)

    Višňák Jakub

    2015-01-01

    Full Text Available While quantum algorithms have been studied as an efficient tool for stationary-state energy determination in the case of molecular quantum systems, no similar study for analogous problems in computational nuclear physics (computation of energy levels of nuclei from empirical nucleon-nucleon or quark-quark potentials) has been realized yet. Although the difference between the above-mentioned problems might seem negligible, it will be examined. First steps towards a particular simulation (on a classical computer) of the Iterative Phase Estimation Algorithm for the computation of deuterium and tritium nuclei energy levels will be carried out, with the aim of proving the algorithm's feasibility (and extensibility to heavier nuclei) for possible practical realization on a real quantum computer.
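
    To give a flavor of what such a classical simulation involves, the sketch below mimics the per-round measurement statistics of a textbook iterative phase estimation run for a single known eigenphase. It is a feasibility toy in the spirit of the abstract, not the authors' nuclear-specific setup:

    ```python
    # Classical toy simulation of iterative phase estimation (IPEA) for
    # one eigenphase phi of U|psi> = exp(2*pi*i*phi)|psi>, measured bit
    # by bit from the least significant end with phase feedback.
    import math, random

    def ipea(phi, bits, shots=200):
        estimate_bits = []
        omega = 0.0                        # binary fraction of known lower bits
        for k in range(bits, 0, -1):
            # Ancilla's probability of reading 1 in this round:
            theta = 2 * math.pi * (phi * 2 ** (k - 1)) - math.pi * omega
            p1 = math.sin(theta / 2) ** 2
            bit = int(sum(random.random() < p1 for _ in range(shots)) > shots / 2)
            estimate_bits.insert(0, bit)   # bits arrive least significant first
            omega = omega / 2 + bit / 2    # feed the bit back into the next round
        return sum(b / 2 ** (i + 1) for i, b in enumerate(estimate_bits))

    print(ipea(phi=0.359375, bits=6))      # 0.359375 = 23/64, exactly 6 bits
    ```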

  12. Basic Principles of Industrial Electric Power Network Computer Aided Design and Engineering

    Directory of Open Access Journals (Sweden)

    M. I. Fursanov

    2012-01-01

    Full Text Available A conceptual model for a computer-aided design and engineering system has been developed in the paper. The paper presents basic automation principles, including a graphical representation of the network and calculation results, a convenient user interface, automatic mode calculation, and selection of transformer rated power and wire cross-section areas. The developed algorithm and program make it possible to save time and improve the quality of project implementation.

  13. Movement of the patient and the cone beam computed tomography scanner: objectives and possible solutions

    Czech Academy of Sciences Publication Activity Database

    Hanzelka, T.; Dušek, J.; Ocásek, F.; Kučera, J.; Šedý, Jiří; Beneš, J.; Pavlíková, G.; Foltán, R.

    2013-01-01

    Roč. 116, č. 6 (2013), s. 769-773 ISSN 2212-4403 Institutional support: RVO:67985823 Keywords : cone beam computed tomography * movement artifacts * dry-run scan Subject RIV: ED - Physiology Impact factor: 1.265, year: 2013

  14. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  15. On the Clouds: A New Way of Computing

    Directory of Open Access Journals (Sweden)

    Yan Han

    2010-06-01

    Full Text Available This article introduces cloud computing and discusses the author’s experience “on the clouds.” The author reviews cloud computing services and providers, then presents his experience of running multiple systems (e.g., integrated library systems, content management systems, and repository software. He evaluates costs, discusses advantages, and addresses some issues about cloud computing. Cloud computing fundamentally changes the ways institutions and companies manage their computing needs. Libraries can take advantage of cloud computing to start an IT project with low cost, to manage computing resources cost-effectively, and to explore new computing possibilities.

  16. High performance computing system in the framework of the Higgs boson studies

    CERN Document Server

    Belyaev, Nikita; The ATLAS collaboration; Velikhov, Vasily; Konoplich, Rostislav

    2017-01-01

    The Higgs boson physics is one of the most important and promising fields of study in modern high energy physics. It is important to note that GRID computing resources are becoming strictly limited due to the increasing amount of statistics required for physics analyses and the unprecedented LHC performance. One of the possibilities to address the shortfall of computing resources is the usage of computer institutes' clusters, commercial computing resources and supercomputers. To perform precision measurements of the Higgs boson properties under these conditions, it is also highly desirable to have effective instruments to simulate kinematic distributions of signal events. In this talk we give a brief description of the modern distribution reconstruction method called Morphing and perform a few efficiency tests to demonstrate its potential. These studies have been performed on the WLCG and the Kurchatov Institute's Data Processing Center, including a Tier-1 GRID site and a supercomputer. We also analyze the CPU efficienc...

  17. Boxers--computed tomography, EEG, and neurological evaluation

    International Nuclear Information System (INIS)

    Ross, R.J.; Cole, M.; Thompson, J.S.; Kim, K.H.

    1983-01-01

    During the last three years, 40 ex-boxers were examined to determine the effects of boxing in regard to their neurological status and the computed tomographic (CT) appearance of the brain. Thirty-eight of these patients had a CT scan of the brain, and 24 had a complete neurological examination including an EEG. The results demonstrate a significant relationship between the number of bouts fought and CT changes indicating cerebral atrophy. Positive neurological findings were not significantly correlated with the number of bouts. Electroencephalographic abnormalities were significantly correlated with the number of bouts fought. Computed tomography and EEG of the brain should be considered as part of a regular neurological examination for active boxers and, if possible, before and after each match, to detect not only the effects of acute life-threatening brain trauma such as subdural hematomas and brain hemorrhages, but the more subtle and debilitating long-term changes of cerebral atrophy

  18. Computer aided surface representation

    Energy Technology Data Exchange (ETDEWEB)

    Barnhill, R.E.

    1990-02-19

    The central research problem of this project is the effective representation, computation, and display of surfaces interpolating to information in three or more dimensions. If the given information is located on another surface, then the problem is to construct a "surface defined on a surface". Sometimes properties of an already defined surface are desired, which is "geometry processing". Visualization of multivariate surfaces is possible by means of contouring higher dimensional surfaces. These problems and more are discussed below. The broad sweep from constructive mathematics through computational algorithms to computer graphics illustrations is utilized in this research. The breadth and depth of this research activity makes this research project unique.

  19. [The Psychomat computer complex for psychophysiologic studies].

    Science.gov (United States)

    Matveev, E V; Nadezhdin, D S; Shemsudov, A I; Kalinin, A V

    1991-01-01

    The authors analyze the principles of the design of a computerized psychophysiological system for universal use. They show the effectiveness of computer technology as a combination of the universal computation and control capabilities of a personal computer with problem-oriented, specialized facilities for stimulus presentation and for detecting the test subject's reactions. They define the hardware and software configuration of the microcomputer psychophysiological system "Psychomat", describe its functional possibilities and basic medico-technical characteristics, and review organizational issues of the maintenance of its full-scale production.

  20. 78 FR 34669 - Certain Electronic Devices, Including Wireless Communication Devices, Portable Music and Data...

    Science.gov (United States)

    2013-06-10

    ..., Including Wireless Communication Devices, Portable Music and Data Processing Devices, and Tablet Computers... importing wireless communication devices, portable music and data processing devices, and tablet computers... certain electronic devices, including wireless communication devices, portable music and data processing...

  1. Molecular Magnets for Quantum Computation

    Science.gov (United States)

    Kuroda, Takayoshi

    2009-06-01

    We review recent progress in molecular magnets, especially from the viewpoint of their application to quantum computing. After a brief introduction to single-molecule magnets (SMMs), a method for qubit manipulation using the non-equidistant spin sublevels of an SMM is introduced. A weakly coupled dimer of two SMMs, which shows no quantum tunneling of magnetization (QTM) at zero field, is also a candidate for quantum computing. In the AF ring Cr7Ni system, the large tunnel splitting is a great advantage for reducing decoherence during manipulation, which makes it a possible candidate for realizing quantum computing devices in the future.

  2. Wearable Computing in E-education

    Directory of Open Access Journals (Sweden)

    Aleksandra Labus

    2015-03-01

    Full Text Available Emerging technologies such as mobile computing, sensors and sensor networks, and augmented reality have led to innovations in the field of wearable computing. Devices such as smart watches and smart glasses allow users to interact with devices worn under, with, or on top of clothing. This paper analyzes the possibilities of application of wearable computing in e-education. The focus is on integration of wearables into e-learning systems, in order to support ubiquitous learning, interaction and collaborative work. We present a model for integration of wearable technology in an e-education system and discuss technical, pedagogical and social aspects.

  3. Early Childhood Teacher Candidates' Attitudes towards Computer and Computer Assisted Instruction

    OpenAIRE

    Oğuz, Evrim; Ellez, A. Murat; Akamca, Güzin Özyılmaz; Kesercioğlu, Teoman İ.; Girgin, Günseli

    2011-01-01

    The aim of this research is to evaluate preschool teacher candidates' attitudes towards computers and towards the use of computer assisted instruction. The sample of this study includes 481 early childhood education students who attended Dokuz Eylül University's department of Early Childhood Education. Data were collected by using the “Scale of Computer Assisted Instruction Attitudes” developed by Arslan (2006), the “Computer Attitudes Scale” developed by Çelik & Bindak (2005) and “General Info...

  4. Areal rainfall estimation using moving cars - computer experiments including hydrological modeling

    Science.gov (United States)

    Rabiei, Ehsan; Haberlandt, Uwe; Sester, Monika; Fitzner, Daniel; Wallner, Markus

    2016-09-01

    The need for high temporal and spatial resolution precipitation data for hydrological analyses has been discussed in several studies. Although rain gauges provide valuable information, a very dense rain gauge network is costly. As a result, several new ideas have emerged to help estimate areal rainfall with higher temporal and spatial resolution. Rabiei et al. (2013) observed that moving cars, called RainCars (RCs), can potentially be a new source of data for measuring rain rate. The optical sensors used in that study are designed for operating the windscreen wipers and showed promising results for rainfall measurement purposes. Their measurement accuracy has been quantified in laboratory experiments. Considering those errors explicitly, the main objective of this study is to investigate the benefit of using RCs for estimating areal rainfall. For that, computer experiments are carried out, where radar rainfall is considered as the reference and the other sources of data, i.e., RCs and rain gauges, are extracted from radar data. Comparing the quality of areal rainfall estimation by RCs with rain gauges and reference data helps to investigate the benefit of the RCs. The value of this additional source of data is assessed not only for areal rainfall estimation performance but also for use in hydrological modeling. Considering measurement errors derived from laboratory experiments, the results show that the RCs provide useful additional information for areal rainfall estimation as well as for hydrological modeling. Moreover, when larger uncertainties were tested for RCs, they were observed to be useful up to a certain level for areal rainfall estimation and discharge simulation.

  5. Computer assisted analysis of medical x-ray images

    Science.gov (United States)

    Bengtsson, Ewert

    1996-01-01

    X-rays were originally used to expose film. The early computers did not have enough capacity to handle images with useful resolution. The rapid development of computer technology over the last few decades has, however, led to the introduction of computers into radiology. In this overview paper, the various possible roles of computers in radiology are examined. The state of the art is briefly presented, and some predictions about the future are made.

  6. [Basic concept in computer assisted surgery].

    Science.gov (United States)

    Merloz, Philippe; Wu, Hao

    2006-03-01

    To investigate the application of medical digital imaging systems and computer technologies in orthopedics. The main computer-assisted surgery systems comprise the four following subcategories. (1) A collection and recording process for digital data on each patient, including preoperative images (CT scans, MRI, standard X-rays), intraoperative visualization (fluoroscopy, ultrasound), and the intraoperative position and orientation of surgical instruments or bone sections (using 3D localisers). Data merging is based on the matching of preoperative imaging (CT scans, MRI, standard X-rays) and intraoperative visualization (anatomical landmarks, or bone surfaces digitized intraoperatively via a 3D localiser; intraoperative ultrasound images processed for delineation of bone contours). (2) In cases where only intraoperative images are used for computer-assisted surgical navigation, the calibration of the intraoperative imaging system replaces the merged data system, which is then no longer necessary. (3) A system that provides aid in decision-making, so that the surgical approach is planned on the basis of multimodal information: the interactive positioning of surgical instruments or bone sections transmitted via pre- or intraoperative images, display of elements to guide surgical navigation (direction, axis, orientation, length and diameter of a surgical instrument, impingement, etc.). (4) A system that monitors the surgical procedure, thereby ensuring that the optimal strategy defined at the preoperative stage is taken into account. It is possible that computer-assisted orthopedic surgery systems will enable surgeons to better assess the accuracy and reliability of the various operative techniques, an indispensable stage in the optimization of surgery.
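
    The data-merging step in (1) is commonly solved by point-based rigid registration between landmarks in the preoperative images and the same landmarks digitized with the 3D localiser. The abstract does not name an algorithm, so the sketch below shows the standard SVD (Kabsch) solution purely for illustration:

    ```python
    # Point-based rigid registration (Kabsch/SVD): align CT landmarks P
    # with localiser-digitized landmarks Q (columns are 3D points).
    import numpy as np

    def rigid_register(P, Q):
        """Find rotation R and translation t minimising ||R @ P + t - Q||."""
        cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
        H = (P - cp) @ (Q - cq).T                # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
        R = Vt.T @ D @ U.T
        t = cq - R @ cp
        return R, t

    # Synthetic check: rotate and shift known landmarks, then recover.
    P = np.array([[0, 10, 0, 5], [0, 0, 10, 5], [0, 0, 0, 8.0]])
    a = np.deg2rad(30)
    R_true = np.array([[np.cos(a), -np.sin(a), 0],
                       [np.sin(a),  np.cos(a), 0],
                       [0, 0, 1]])
    Q = R_true @ P + np.array([[1.0], [2.0], [3.0]])

    R, t = rigid_register(P, Q)
    print(np.allclose(R, R_true), t.ravel())     # True [1. 2. 3.]
    ```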

  7. Words and possible words in early language acquisition.

    Science.gov (United States)

    Marchetto, Erika; Bonatti, Luca L

    2013-11-01

    In order to acquire language, infants must extract its building blocks-words-and master the rules governing their legal combinations from speech. These two problems are not independent, however: words also have internal structure. Thus, infants must extract two kinds of information from the same speech input. They must find the actual words of their language. Furthermore, they must identify its possible words, that is, the sequences of sounds that, being morphologically well formed, could be words. Here, we show that infants' sensitivity to possible words appears to be more primitive and fundamental than their ability to find actual words. We expose 12- and 18-month-old infants to an artificial language containing a conflict between statistically coherent and structurally coherent items. We show that 18-month-olds can extract possible words when the familiarization stream contains marks of segmentation, but cannot do so when the stream is continuous. Yet, they can find actual words from a continuous stream by computing statistical relationships among syllables. By contrast, 12-month-olds can find possible words when familiarized with a segmented stream, but seem unable to extract statistically coherent items from a continuous stream that contains minimal conflicts between statistical and structural information. These results suggest that sensitivity to word structure is in place earlier than the ability to analyze distributional information. The ability to compute nontrivial statistical relationships becomes fully effective relatively late in development, when infants have already acquired a considerable amount of linguistic knowledge. Thus, mechanisms for structure extraction that do not rely on extensive sampling of the input are likely to have a much larger role in language acquisition than general-purpose statistical abilities. Copyright © 2013. Published by Elsevier Inc.

  8. Progress Towards an LES Wall Model Including Unresolved Roughness

    Science.gov (United States)

    Craft, Kyle; Redman, Andrew; Aikens, Kurt

    2015-11-01

    Wall models used in large eddy simulations (LES) are often based on theories for hydraulically smooth walls. While this is reasonable for many applications, there are also many where the impact of surface roughness is important. A previously developed wall model has been used primarily for jet engine aeroacoustics. However, jet simulations have not accurately captured thick initial shear layers found in some experimental data. This may partly be due to nozzle wall roughness used in the experiments to promote turbulent boundary layers. As a result, the wall model is extended to include the effects of unresolved wall roughness through appropriate alterations to the log-law. The methodology is tested for incompressible flat plate boundary layers with different surface roughness. Correct trends are noted for the impact of surface roughness on the velocity profile. However, velocity deficit profiles and the Reynolds stresses do not collapse as well as expected. Possible reasons for the discrepancies as well as future work will be presented. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1053575. Computational resources on TACC Stampede were provided under XSEDE allocation ENG150001.
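
    As an illustration of the kind of log-law alteration described, a roughness function ΔU⁺ can be subtracted from the smooth-wall profile. The Colebrook-type form below is one common fit and is an assumption here, not necessarily the authors' exact model:

    ```python
    # Rough-wall log law: roughness shifts the mean velocity profile
    # down by delta_u_plus(ks_plus); the fit used is a common choice.
    import numpy as np

    KAPPA, B = 0.41, 5.0

    def u_plus(y_plus, ks_plus=0.0):
        """Mean velocity in wall units over a (possibly rough) wall."""
        delta_u = (1.0 / KAPPA) * np.log(1.0 + 0.3 * ks_plus)  # roughness shift
        return (1.0 / KAPPA) * np.log(y_plus) + B - delta_u

    for ks in (0.0, 10.0, 100.0):
        print(f"ks+ = {ks:5.1f}:  u+(y+=1000) = {u_plus(1000.0, ks):.2f}")
    ```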

  9. Control rod calibration including the rod coupling effect

    International Nuclear Information System (INIS)

    Szilard, R.; Nelson, G.W.

    1984-01-01

    In a reactor containing more than one control rod, which includes all reactors licensed in the United States, there will be a 'coupling' or 'shadowing' of control rod flux at the location of a control rod as a result of the flux depression caused by another control rod. It was decided to investigate this phenomenon further, and eventually to put calibration table data or formulae into a small computer in the control room, so one could insert the positions of the three control rods and receive the excess reactivity without referring to separate tables. For this to be accomplished, a 'three-control-rod reactivity function' would be used which would include the flux coupling between the rods. The function was designed, and measured data were fitted to it to determine the calibration constants. The input data for fitting the trial functions consisted of 254 data points, each consisting of the positions of the regulating, shim, and transient rods and the total excess reactivity. (About 200 of these points were 'critical balance points', that is, the rod positions for which the reactor was critical; the remainder were determined by positive period measurements.) Although this may be unrealistic from a physical viewpoint, the function derived gave a very accurate recalculation of the input data, and thus would faithfully give the excess reactivity for any possible combination of the locations of the three control rods. The next step, incorporation of the three-rod function into the minicomputer, will be pursued in the summer and fall of 1984.
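
    The fitting step can be pictured as an ordinary nonlinear least-squares problem over the three rod positions. The functional form and the data in the sketch below are invented for illustration; the abstract does not give the actual three-rod function:

    ```python
    # Fitting a three-rod reactivity function with pairwise coupling
    # terms to calibration points (form and data are hypothetical).
    import numpy as np
    from scipy.optimize import curve_fit

    def reactivity(X, a1, a2, a3, c12, c13, c23):
        """Quadratic single-rod worths plus pairwise coupling terms."""
        x1, x2, x3 = X
        single = a1 * x1**2 + a2 * x2**2 + a3 * x3**2
        coupling = c12 * x1 * x2 + c13 * x1 * x3 + c23 * x2 * x3
        return single + coupling

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, size=(3, 254))       # 254 points, as in the text
    true = (1.2, 0.9, 1.5, -0.2, -0.1, -0.15)
    rho = reactivity(X, *true) + rng.normal(0, 0.01, 254)

    params, _ = curve_fit(reactivity, X, rho)
    print(np.round(params, 3))                 # recovers the coefficients
    ```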

  10. Blind topological measurement-based quantum computation.

    Science.gov (United States)

    Morimae, Tomoyuki; Fujii, Keisuke

    2012-01-01

    Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf-Harrington-Goyal scheme. The error threshold of our scheme is 4.3 × 10⁻³, which is comparable to that (7.5 × 10⁻³) of non-blind topological quantum computation. As an error per gate of the order of 10⁻³ has already been achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach.

  11. Computational design gains momentum in enzyme catalysis engineering

    NARCIS (Netherlands)

    Wijma, Hein J.; Janssen, Dick B.

    Computational protein design is becoming a powerful tool for tailoring enzymes for specific biotechnological applications. When applied to existing enzymes, computational re-design makes it possible to obtain orders of magnitude improvement in catalytic activity towards a new target substrate.

  12. A Computer Program for Assessing Nuclear Safety Culture Impact

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of)

    2014-10-15

    Several NPP accidents, including Fukushima Daiichi in 2011 and Chernobyl in 1986, pointed to a lack of safety culture as one of their root causes. Due to its latent influence on safety performance, safety culture has become an important issue in safety research. Most studies describe how to evaluate the state of an organization's safety culture; however, they do not include the possibility that an accident occurs due to a lack of safety culture. Because of that, a methodology for evaluating the impact of safety culture on NPP safety is required. In this study, a methodology for assessing safety culture impact is suggested and a computer program is developed for its application. The SCII model is a new methodology for assessing safety culture impact quantitatively by using a PSA model, and the computer program developed here applies it. The program visualizes the SCIs and the SCIIs. It might contribute to comparing the level of safety culture among NPPs as well as to improving the safety management of NPPs.

  13. A Computer Security Course in the Undergraduate Computer Science Curriculum.

    Science.gov (United States)

    Spillman, Richard

    1992-01-01

    Discusses the importance of computer security and considers criminal, national security, and personal privacy threats posed by security breakdown. Several examples are given, including incidents involving computer viruses. Objectives, content, instructional strategies, resources, and a sample examination for an experimental undergraduate computer…

  14. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  15. Including robustness in multi-criteria optimization for intensity-modulated proton therapy

    Science.gov (United States)

    Chen, Wei; Unkelbach, Jan; Trofimov, Alexei; Madden, Thomas; Kooy, Hanne; Bortfeld, Thomas; Craft, David

    2012-02-01

    ... each Pareto optimal plan takes less than 5 min on a standard computer, making a computationally friendly interface to the planner possible. In conclusion, the uncertainty pertinent to the IMPT procedure can be reduced during treatment planning by optimizing plans that emphasize different treatment objectives, including robustness, and then interactively searching for the most-preferred one on the solution Pareto surface.

  16. Computer-aided software understanding systems to enhance confidence of scientific codes

    International Nuclear Information System (INIS)

    Sheng, G.; Oeren, T.I.

    1991-01-01

    A unique characteristic of nuclear waste disposal is the very long time span over which the combined engineered and natural containment system must remain effective: hundreds of thousands of years. Since there is no precedent in human history for such an endeavour, simulation with the use of computers is the only means we have of forecasting possible future outcomes quantitatively. The need for reliable models and software to make such forecasts so far into the future is obvious. One of the critical elements necessary to ensure reliability is the degree of reviewability of the computer program. Among others, there are two very important reasons for this. Firstly, if there is to be any chance at all of validating the conceptual models as implemented by the computer code, peer reviewers must be able to see and understand what the program is doing. It is all but impossible to achieve this understanding by just looking at the code, due to possible unfamiliarity with the language and often also to the length and complexity of the code. Secondly, a thorough understanding of the code is also necessary to carry out code maintenance activities, which include, among others, error detection, error correction and code modification for the purposes of enhancing its performance or functionality, or adapting it to a changed environment. The emerging concepts of computer-aided software understanding and reverse engineering can answer precisely these needs. This paper discusses the role they can play in enhancing the confidence one has in computer codes, and several examples are provided. Finally, a brief discussion of combining state-of-the-art forward engineering systems with reverse engineering systems shows how powerfully they can contribute to the overall quality assurance of a computer program. (13 refs., 7 figs.)

  17. Seventh Medical Image Computing and Computer Assisted Intervention Conference (MICCAI 2012)

    CERN Document Server

    Miller, Karol; Nielsen, Poul; Computational Biomechanics for Medicine : Models, Algorithms and Implementation

    2013-01-01

    One of the greatest challenges for mechanical engineers is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, biomedical sciences, and medicine. This book is an opportunity for computational biomechanics specialists to present and exchange opinions on the opportunities of applying their techniques to computer-integrated medicine. Computational Biomechanics for Medicine: Models, Algorithms and Implementation collects the papers from the Seventh Computational Biomechanics for Medicine Workshop held in Nice in conjunction with the Medical Image Computing and Computer Assisted Intervention conference. The topics covered include: medical image analysis, image-guided surgery, surgical simulation, surgical intervention planning, disease prognosis and diagnostics, injury mechanism analysis, implant and prostheses design, and medical robotics.

  18. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  19. LTRACK: Beam-transport calculation including wakefield effects

    International Nuclear Information System (INIS)

    Chan, K.C.D.; Cooper, R.K.

    1988-01-01

    LTRACK is a first-order beam-transport code that includes wakefield effects up to quadrupole modes. This paper introduces readers to the code by describing its history and method of calculation, and by giving a brief summary of the input/output information. Future plans for the code are also described.
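
    As background on what "first-order beam transport" means, the sketch below propagates a particle's transverse coordinates through a drift-quadrupole-drift line with 2×2 transfer matrices; the wakefield kicks that distinguish LTRACK are omitted here:

    ```python
    # First-order transport: (x, x') is advanced element by element via
    # transfer matrices; wake kicks would be inserted between elements.
    import numpy as np

    def drift(L):
        return np.array([[1.0, L], [0.0, 1.0]])

    def quad(f):                      # thin-lens focusing quadrupole
        return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

    line = drift(0.5) @ quad(f=0.8) @ drift(0.5)   # rightmost acts first
    x0 = np.array([1e-3, 0.0])                     # 1 mm offset, zero slope
    print(line @ x0)
    ```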

  20. Nature-inspired computing and optimization theory and applications

    CERN Document Server

    Yang, Xin-She; Nakamatsu, Kazumi

    2017-01-01

    The book provides readers with a snapshot of the state of the art in the field of nature-inspired computing and its application in optimization. The approach is mainly practice-oriented: each bio-inspired technique or algorithm is introduced together with one of its possible applications. Applications cover a wide range of real-world optimization problems: from feature selection and image enhancement to scheduling and dynamic resource management, from wireless sensor networks and wiring network diagnosis to sports training planning and gene expression, from topology control and morphological filters to nutritional meal design and antenna array design. There are a few theoretical chapters comparing different existing techniques, exploring the advantages of nature-inspired computing over other methods, and investigating the mixing time of genetic algorithms. The book also introduces a wide range of algorithms, including the ant colony optimization, the bat algorithm, genetic algorithms, the collision-based opti...

  1. Compact Gaussian quantum computation by multi-pixel homodyne detection

    International Nuclear Information System (INIS)

    Ferrini, G; Fabre, C; Treps, N; Gazeau, J P; Coudreau, T

    2013-01-01

    We study the possibility of producing and detecting continuous variable cluster states in an extremely compact optical setup. This method is based on a multi-pixel homodyne detection system recently demonstrated experimentally, which includes classical data post-processing. It allows the incorporation of the linear optics network, usually employed in standard experiments for the production of cluster states, in the stage of the measurement. After giving an example of cluster state generation by this method, we further study how this procedure can be generalized to perform Gaussian quantum computation. (paper)

  2. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are "universal", in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics

  3. Why Don't All Professors Use Computers?

    Science.gov (United States)

    Drew, David Eli

    1989-01-01

    Discusses the adoption of computer technology at universities and examines reasons why some professors don't use computers. Topics discussed include computer applications, including artificial intelligence, social science research, statistical analysis, and cooperative research; appropriateness of the technology for the task; the Computer Aptitude…

  4. SHEAT for PC. A computer code for probabilistic seismic hazard analysis for personal computer, user's manual

    International Nuclear Information System (INIS)

    Yamada, Hiroyuki; Tsutsumi, Hideaki; Ebisawa, Katsumi; Suzuki, Masahide

    2002-03-01

    The SHEAT code developed at the Japan Atomic Energy Research Institute is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. SHEAT was first developed as a large-computer version; in 2001, a personal computer version was provided to improve the operating efficiency and generality of the code. It is possible to perform the earthquake hazard analysis, display and print functions through a Graphical User Interface. With the SHEAT for PC code, seismic hazard, which is defined as the annual exceedance frequency of earthquake ground motions at various levels of intensity at a given site, is calculated in the following two steps, as with the large-computer version. The first is the modeling of earthquake generation around a site: future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modeled based on historical earthquake records, active fault data and expert judgment. The second is the calculation of the probabilistic seismic hazard at the site: an earthquake ground motion is calculated for each postulated earthquake using an attenuation model that takes its standard deviation into account, and the seismic hazard at the site is obtained by summing the frequencies of ground motions from all the earthquakes. This document is the user's manual of the SHEAT for PC code. It includes: (1) an outline of the code, covering the overall concept, logical process, code structure, data files used and special characteristics of the code; (2) the functions of the subprograms and the analytical models within them; (3) guidance on input and output data; (4) a sample run result; and (5) an operational manual. (author)
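
    The two-step calculation described in the manual can be pictured with a toy hazard curve: postulated earthquakes (annual rate, magnitude, distance) are combined with a lognormal attenuation law and summed into an annual exceedance frequency. The source list and attenuation coefficients below are placeholders, not SHEAT's models or data:

    ```python
    # Toy probabilistic seismic hazard: sum, over postulated earthquakes,
    # of rate * P(ground motion > level) under lognormal attenuation.
    import numpy as np
    from scipy.stats import norm

    sources = [          # (annual rate, magnitude, distance in km)
        (0.01, 7.0, 30.0),
        (0.05, 6.0, 15.0),
        (0.20, 5.0, 10.0),
    ]

    def annual_exceedance(pga_level_g, sigma_ln=0.6):
        freq = 0.0
        for rate, mag, dist in sources:
            # Placeholder attenuation law: median PGA (g) from M and R.
            ln_median = -3.5 + 0.8 * mag - 1.1 * np.log(dist + 10.0)
            z = (np.log(pga_level_g) - ln_median) / sigma_ln
            freq += rate * (1.0 - norm.cdf(z))   # P(ground motion > level)
        return freq

    for level in (0.05, 0.1, 0.2, 0.4):
        print(f"PGA > {level:.2f} g: {annual_exceedance(level):.2e} / yr")
    ```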

  5. Mass prophylactic screening of the organized female populaton using the Thermograph-Computer System

    International Nuclear Information System (INIS)

    Vepkhvadze, R.Ya.; Khvedelidze, E.Sh.

    1984-01-01

    Organizational aspects of the use of the Thermograph-Computer System have been analyzed. It has been shown that the results of thermodiagnosis completely coincide with the clinical conclusion, whereas the roentgenological method revealed disease in only 19 of 36 patients. It is possible to examine 120 women per working day for the early diagnosis of mammary gland diseases using the Thermograph-Computer System. A mobile thermodiagnostic room simultaneously served as an inspection room to discover visual forms of tumor diseases, including diseases of the cervix uteri, and may be used for mass preventive examination of the organized female population.

  6. Computational Psychometrics for Modeling System Dynamics during Stressful Disasters

    Directory of Open Access Journals (Sweden)

    Pietro Cipresso

    2017-08-01

    Full Text Available Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.

  7. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research including theoretic developments, new computational alg...

  8. The challenge of computer mathematics.

    Science.gov (United States)

    Barendregt, Henk; Wiedijk, Freek

    2005-10-15

    Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations are dealing with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. Also there are very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.

  9. Computer programs as accounting object

    Directory of Open Access Journals (Sweden)

    I.V. Perviy

    2015-03-01

    Full Text Available Existing approaches to the regulation of accounting for software as one of the types of intangible assets are considered. The features and current state of the legal protection of computer programs are analyzed. The reasons why patent law is needed as a means of legal protection for individual elements of computer programs are identified. The influence of the legal aspects of the use of computer programs on their reflection in accounting under national legislation is analyzed. The possible options for the transfer of rights from the copyright owners of computer programs, which should be considered when creating a software accounting system at an enterprise, are analyzed. The characteristics of computer software as an intangible asset under current law are identified and analyzed. The general economic characteristics of computer programs as one of the types of intangible assets are substantiated. The main features distinguishing software from other types of intellectual property are identified

  10. Hypothermic death: Possibility of diagnosis by post-mortem computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kawasumi, Yusuke, E-mail: ssu@rad.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Onozuka, Naoki; Kakizaki, Ayana [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Usui, Akihito, E-mail: t7402r0506@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Hosokai, Yoshiyuki, E-mail: hosokai@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Sato, Miho, E-mail: meifan58@m.tains.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Saito, Haruo, E-mail: hsaito@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Ishibashi, Tadashi, E-mail: tisibasi@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Hayashizaki, Yoshie, E-mail: yoshie@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Funayama, Masato, E-mail: funayama@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan)

    2013-02-15

    Referring to our experience with post-mortem computed tomography (CT), many hypothermic death cases presented a lack of increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and urine retention in the bladder. Thus we evaluated the diagnostic performance of post-mortem CT on hypothermic death based on the above-mentioned three findings. Twenty-four hypothermic death subjects and 53 non-hypothermic death subjects were examined. Two radiologists assessed the presence or lack of an increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and measured urine volume in the bladder. Pearson's chi-square test and Mann–Whitney U-test were used to assess the relationship between the three findings and hypothermic death. The sensitivity, specificity, accuracy, positive predictive value (PPV) and negative predictive value (NPV) of the diagnosis were also calculated. Lack of an increase in lung-field concentration and blood clotting in the heart, thoracic aorta or pulmonary artery were significantly associated with hypothermic death (p = 0.0007, p < 0.0001, respectively). The hypothermic death cases had significantly more urine in the bladder than the non-hypothermic death cases (p = 0.0011). Regarding the diagnostic performance with all three findings, the sensitivity was 29.2% but the specificity was 100%. These three findings were more common in hypothermic death cases. Although the sensitivity was low, these findings will assist forensic physicians in diagnosing hypothermic death since the specificity was high.
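
    To make the reported figures concrete, the Python sketch below recomputes the diagnostic metrics from a 2x2 table. The counts are inferred from the stated values (29.2% sensitivity over 24 cases implies 7 true positives; 100% specificity over 53 controls implies no false positives); they are assumptions for illustration, not data taken from the study.

      # Hypothetical 2x2 table inferred from the reported performance figures.
      tp, fn = 7, 17   # 7/24 is roughly the stated 29.2% sensitivity
      fp, tn = 0, 53   # no false positives over 53 controls -> 100% specificity

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      accuracy = (tp + tn) / (tp + fn + fp + tn)
      ppv = tp / (tp + fp)    # every positive call is correct here
      npv = tn / (tn + fn)

      print(f"sens={sensitivity:.1%}  spec={specificity:.1%}  "
            f"acc={accuracy:.1%}  PPV={ppv:.1%}  NPV={npv:.1%}")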

  11. Hypothermic death: Possibility of diagnosis by post-mortem computed tomography

    International Nuclear Information System (INIS)

    Kawasumi, Yusuke; Onozuka, Naoki; Kakizaki, Ayana; Usui, Akihito; Hosokai, Yoshiyuki; Sato, Miho; Saito, Haruo; Ishibashi, Tadashi; Hayashizaki, Yoshie; Funayama, Masato

    2013-01-01

    Referring to our experience with post-mortem computed tomography (CT), many hypothermic death cases presented a lack of increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and urine retention in the bladder. Thus we evaluated the diagnostic performance of post-mortem CT on hypothermic death based on the above-mentioned three findings. Twenty-four hypothermic death subjects and 53 non-hypothermic death subjects were examined. Two radiologists assessed the presence or lack of an increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and measured urine volume in the bladder. Pearson's chi-square test and Mann–Whitney U-test were used to assess the relationship between the three findings and hypothermic death. The sensitivity, specificity, accuracy, positive predictive value (PPV) and negative predictive value (NPV) of the diagnosis were also calculated. Lack of an increase in lung-field concentration and blood clotting in the heart, thoracic aorta or pulmonary artery were significantly associated with hypothermic death (p = 0.0007, p < 0.0001, respectively). The hypothermic death cases had significantly more urine in the bladder than the non-hypothermic death cases (p = 0.0011). Regarding the diagnostic performance with all three findings, the sensitivity was 29.2% but the specificity was 100%. These three findings were more common in hypothermic death cases. Although the sensitivity was low, these findings will assist forensic physicians in diagnosing hypothermic death since the specificity was high

  12. The IceCube Computing Infrastructure Model

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Besides the big LHC experiments, a number of mid-size experiments are coming online which need to define new computing models to meet their demands on processing and storage. We present the hybrid computing model of IceCube, which leverages Grid models with a more flexible direct user model, as an example of a possible solution. In IceCube a central datacenter at UW-Madison serves as Tier-0, with a single Tier-1 datacenter at DESY Zeuthen. We describe the setup of the IceCube computing infrastructure and report on our experience in successfully provisioning the IceCube computing needs.

  13. Analysis of control room computers at nuclear power plants

    International Nuclear Information System (INIS)

    Leijonhufvud, S.; Lindholm, L.

    1984-03-01

    The following problems are analyzed: the development of a system (hardware and software), data, the acquisition of the system, and operation and service. The findings are: most reliability problems can be solved by doubling critical units; reliability in software is a quality that can only be created through development; and reliability of computer systems in extremely unusual situations cannot be quantified or verified, except possibly for very small and functionally simple systems. To attain the highest possible reliability, such simple systems have to contain one or very few functions, be functionally simple, and be application-transparent, viz. the internal function of the system should be independent of the status of the process. A computer system will compete successfully with other possible systems regarding reliability for the following reasons: if the function is simple enough for other systems, the computer system would be small; if the functions cannot be realized by other systems, the computer system would complement the human effort, and the man-machine system would be a better solution than no system, possibly better than human function alone. (Aa)

  14. Designing Ubiquitous Computing to Enhance Children's Learning in Museums

    Science.gov (United States)

    Hall, T.; Bannon, L.

    2006-01-01

    In recent years, novel paradigms of computing have emerged, which enable computational power to be embedded in artefacts and in environments in novel ways. These developments may create new possibilities for using computing to enhance learning. This paper presents the results of a design process that set out to explore interactive techniques,…

  15. Computer organization and design the hardware/software interface

    CERN Document Server

    Hennessy, John L

    1994-01-01

    Computer Organization and Design: The Hardware/Software Interface presents the interaction between hardware and software at a variety of levels, which offers a framework for understanding the fundamentals of computing. This book focuses on the concepts that are the basis for computers.Organized into nine chapters, this book begins with an overview of the computer revolution. This text then explains the concepts and algorithms used in modern computer arithmetic. Other chapters consider the abstractions and concepts in memory hierarchies by starting with the simplest possible cache. This book di

  16. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  17. Adiabatic Quantum Computing

    Science.gov (United States)

    Landahl, Andrew

    2012-10-01

    Quantum computers promise to exploit counterintuitive quantum physics principles like superposition, entanglement, and uncertainty to solve problems using fundamentally fewer steps than any conventional computer ever could. The mere possibility of such a device has sharpened our understanding of quantum coherent information, just as lasers did for our understanding of coherent light. The chief obstacle to developing quantum computer technology is decoherence--one of the fastest phenomena in all of physics. In principle, decoherence can be overcome by using clever entangled redundancies in a process called fault-tolerant quantum error correction. However, the quality and scale of technology required to realize this solution appears distant. An exciting alternative is a proposal called ``adiabatic'' quantum computing (AQC), in which adiabatic quantum physics keeps the computer in its lowest-energy configuration throughout its operation, rendering it immune to many decoherence sources. The Adiabatic Quantum Architectures In Ultracold Systems (AQUARIUS) Grand Challenge Project at Sandia seeks to demonstrate this robustness in the laboratory and point a path forward for future hardware development. We are building devices in AQUARIUS that realize the AQC architecture on up to three quantum bits (``qubits'') in two platforms: Cs atoms laser-cooled to below 5 microkelvin and Si quantum dots cryo-cooled to below 100 millikelvin. We are also expanding theoretical frontiers by developing methods for scalable universal AQC in these platforms. We have successfully demonstrated operational qubits in both platforms and have even run modest one-qubit calculations using our Cs device. In the course of reaching our primary proof-of-principle demonstrations, we have developed multiple spinoff technologies including nanofabricated diffractive optical elements that define optical-tweezer trap arrays and atomic-scale Si lithography commensurate with placing individual donor atoms with
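
    The idea sketched above has a compact standard formulation, given here for orientation (textbook material, not taken from this record): the machine is prepared in the ground state of an easy Hamiltonian, which is slowly deformed into one whose ground state encodes the answer,

      \[
        H(s) \;=\; (1-s)\,H_{\mathrm{init}} \;+\; s\,H_{\mathrm{problem}},
        \qquad s = t/T \in [0,1].
      \]

    The adiabatic theorem keeps the system near the instantaneous ground state provided the total runtime satisfies, roughly, \( T \gg \max_{s} \lVert \partial_s H(s) \rVert / \Delta(s)^{2} \), where \( \Delta(s) \) is the spectral gap; the gap, rather than the qubit count alone, therefore sets the runtime.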

  18. Computer code for general analysis of radon risks (GARR)

    International Nuclear Information System (INIS)

    Ginevan, M.

    1984-09-01

    This document presents a computer model for general analysis of radon risks that allows the user to specify a large number of possible models with a small number of simple commands. The model is written in a version of BASIC which conforms closely to the American National Standards Institute (ANSI) definition for minimal BASIC and is thus readily modified for use on a wide variety of computers and, in particular, microcomputers. Model capabilities include generation of single-year life tables from 5-year abridged data, calculation of multiple-decrement life tables for lung cancer for the general population, smokers, and nonsmokers, and a cohort lung cancer risk calculation that allows specification of the level and duration of radon exposure, the form of the risk model, and the specific population assumed at risk. 36 references, 8 figures, 7 tables
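
    As a hedged illustration of the kind of cohort calculation described, the sketch below computes a lifetime lung-cancer risk under a simple excess-relative-risk model. Every number is an assumption chosen for illustration; none comes from GARR itself, and competing causes of death (which GARR's multiple-decrement tables handle) are ignored.

      # Toy cohort sketch: baseline hazard scaled by an excess relative risk
      # proportional to cumulative radon exposure (all parameters assumed).
      base_hazard = 0.002    # assumed annual baseline lung-cancer hazard
      err_per_wlm = 0.005    # assumed excess relative risk per WLM
      annual_wlm = 10.0      # assumed exposure rate during ages 20-39

      cum_wlm, surv, lifetime_risk = 0.0, 1.0, 0.0
      for age in range(20, 80):
          if age < 40:
              cum_wlm += annual_wlm            # accumulate exposure
          hazard = base_hazard * (1.0 + err_per_wlm * cum_wlm)
          lifetime_risk += surv * hazard       # die of lung cancer this year
          surv *= 1.0 - hazard                 # otherwise survive the year

      print(f"{cum_wlm:.0f} WLM -> lifetime lung-cancer risk {lifetime_risk:.1%}")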

  19. Controlling data transfers from an origin compute node to a target compute node

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-06-21

    Methods, apparatus, and products are disclosed for controlling data transfers from an origin compute node to a target compute node that include: receiving, by an application messaging module on the target compute node, an indication of a data transfer from an origin compute node to the target compute node; and administering, by the application messaging module on the target compute node, the data transfer using one or more messaging primitives of a system messaging module in dependence upon the indication.

  20. Ammonia-based quantum computer

    International Nuclear Information System (INIS)

    Ferguson, Andrew J.; Cain, Paul A.; Williams, David A.; Briggs, G. Andrew D.

    2002-01-01

    We propose a scheme for quantum computation using two eigenstates of ammonia or similar molecules. Individual ammonia molecules are confined inside fullerenes and used as two-level qubit systems. Interaction between these ammonia qubits takes place via the electric dipole moments, and in particular we show how a controlled-NOT gate could be implemented. After computation the qubit is measured with a single-electron electrometer sensitive enough to differentiate between the dipole moments of different states. We also discuss a possible implementation based on a quantum cellular automaton

  1. 75 FR 4583 - In the Matter of: Certain Electronic Devices, Including Mobile Phones, Portable Music Players...

    Science.gov (United States)

    2010-01-28

    ..., Including Mobile Phones, Portable Music Players, and Computers; Notice of Investigation AGENCY: U.S... music players, and computers, by reason of infringement of certain claims of U.S. Patent Nos. 6,714,091... importation of certain electronic devices, including mobile phones, portable music players, or computers that...

  2. Quantum wavepacket ab initio molecular dynamics: an approach for computing dynamically averaged vibrational spectra including critical nuclear quantum effects.

    Science.gov (United States)

    Sumner, Isaiah; Iyengar, Srinivasan S

    2007-10-18

    We have introduced a computational methodology to study vibrational spectroscopy in clusters inclusive of critical nuclear quantum effects. This approach is based on the recently developed quantum wavepacket ab initio molecular dynamics method that combines quantum wavepacket dynamics with ab initio molecular dynamics. The computational efficiency of the dynamical procedure is drastically improved (by several orders of magnitude) through the utilization of wavelet-based techniques combined with the previously introduced time-dependent deterministic sampling procedure to achieve stable, picosecond length, quantum-classical dynamics of electrons and nuclei in clusters. The dynamical information is employed to construct a novel cumulative flux/velocity correlation function, where the wavepacket flux from the quantized particle is combined with classical nuclear velocities to obtain the vibrational density of states. The approach is demonstrated by computing the vibrational density of states of [Cl-H-Cl]-, inclusive of critical quantum nuclear effects, and our results are in good agreement with experiment. A general hierarchical procedure is also provided, based on electronic structure harmonic frequencies, classical ab initio molecular dynamics, computation of nuclear quantum-mechanical eigenstates, and employing quantum wavepacket ab initio dynamics to understand vibrational spectroscopy in hydrogen-bonded clusters that display large degrees of anharmonicities.

  3. Internode data communications in a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Miller, Douglas R.; Parker, Jeffrey J.; Ratterman, Joseph D.; Smith, Brian E.

    2013-09-03

    Internode data communications in a parallel computer that includes compute nodes that each include main memory and a messaging unit, the messaging unit including computer memory and coupling compute nodes for data communications, in which, for each compute node at compute node boot time: a messaging unit allocates, in the messaging unit's computer memory, a predefined number of message buffers, each message buffer associated with a process to be initialized on the compute node; receives, prior to initialization of a particular process on the compute node, a data communications message intended for the particular process; and stores the data communications message in the message buffer associated with the particular process. Upon initialization of the particular process, the process establishes a messaging buffer in main memory of the compute node and copies the data communications message from the message buffer of the messaging unit into the message buffer of main memory.
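
    The boot-time staging scheme can be caricatured in a few lines of Python; the class and method names are illustrative stand-ins, not the actual messaging-unit interface.

      # Sketch: pre-allocate one buffer per expected process at boot, park
      # messages that arrive before the process starts, drain them on init.
      class MessagingUnit:
          def __init__(self, expected_procs):
              self.staged = {pid: [] for pid in expected_procs}

          def receive(self, pid, message):
              self.staged[pid].append(message)   # process may not exist yet

          def drain_on_init(self, pid):
              msgs, self.staged[pid] = self.staged[pid], []
              return msgs                        # copied into main memory

      mu = MessagingUnit(expected_procs=[0, 1])
      mu.receive(1, b"early message")            # arrives before process 1 starts
      print(mu.drain_on_init(1))                 # [b'early message']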

  4. Pacing a data transfer operation between compute nodes on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A [Rochester, MN

    2011-09-13

    Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (`DMA`) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.
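
    The control flow of this claim reduces to 'send a chunk, ask the target's DMA engine for a pacing response, wait, repeat'. A minimal sketch with hypothetical callables follows (the real mechanism is a remote-get DMA operation, not Python functions); the pacing response throttles the origin so the target is never flooded faster than it can absorb chunks.

      # Sketch of paced chunked transfer; send_chunk, request_pacing and
      # await_pacing are hypothetical stand-ins for the DMA operations above.
      def send_paced(message, chunk_size, send_chunk, request_pacing, await_pacing):
          for offset in range(0, len(message), chunk_size):
              send_chunk(message[offset:offset + chunk_size])
              request_pacing()   # remote-get pacing request to the target DMA
              await_pacing()     # block until the pacing response arrives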

  5. Application of computational systems biology to explore environmental toxicity hazards

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Grandjean, Philippe

    2011-01-01

    Background: Computer-based modeling is part of a new approach to predictive toxicology. Objectives: We investigated the usefulness of an integrated computational systems biology approach in a case study involving the isomers and metabolites of the pesticide dichlorodiphenyltrichloroethane (DDT) to ascertain their possible links to relevant adverse effects. Methods: We extracted chemical-protein association networks for each DDT isomer and its metabolites using ChemProt, a disease chemical biology database that includes both binding and gene expression data, and we explored protein-protein interactions using a human interactome network. To identify associated dysfunctions and diseases, we integrated protein-disease annotations into the protein complexes using the Online Mendelian Inheritance in Man database and the Comparative Toxicogenomics Database. Results: We found 175 human proteins linked to p,p´-DDT...

  6. Concentrator optical characterization using computer mathematical modelling and point source testing

    Science.gov (United States)

    Dennison, E. W.; John, S. L.; Trentelman, G. F.

    1984-01-01

    The optical characteristics of a paraboloidal solar concentrator are analyzed using the intercept factor curve (a format for image data) to describe the results of a mathematical model and to represent reduced data from experimental testing. This procedure makes it possible not only to test an assembled concentrator, but also to evaluate single optical panels or to conduct non-solar tests of an assembled concentrator. The use of three-dimensional ray tracing computer programs to calculate the mathematical model is described. These ray tracing programs can include any type of optical configuration from simple paraboloids to arrays of spherical facets and can be adapted to microcomputers or larger computers, which can graphically display real-time comparison of calculated and measured data.

  7. Overhead Crane Computer Model

    Science.gov (United States)

    Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.

    2018-03-01

    The paper describes a computer model of an overhead crane system. The modeled overhead crane consists of hoisting, trolley and crane mechanisms, as well as a two-axis payload system. With the help of the differential equations of motion of these mechanisms, derived from Lagrange's equations of the second kind, an overhead crane computer model can be built. The computer model was implemented in Matlab. Transients of coordinate, linear speed and motor torque of the trolley and crane mechanism systems were simulated. In addition, transients of payload sway were obtained with respect to the vertical axis. A trajectory of the trolley mechanism operating simultaneously with the crane mechanism is presented in the paper, as well as a two-axis trajectory of the payload. The designed computer model of an overhead crane is a useful means for studying positioning control and anti-sway control systems.
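
    A minimal sketch of the payload-sway part of such a model, assuming a simple pendulum-on-trolley idealization integrated with SciPy (the full model, with hoisting and two-axis sway, is considerably richer; all parameters here are assumed):

      # Payload as a pendulum hanging from an accelerating trolley; the
      # equation of motion is derivable from Lagrange's equations.
      import numpy as np
      from scipy.integrate import solve_ivp

      g, L = 9.81, 5.0                    # gravity [m/s^2], assumed rope length [m]

      def trolley_accel(t):
          return 0.5 if t < 2.0 else 0.0  # assumed acceleration profile [m/s^2]

      def sway(t, y):
          theta, omega = y                # sway angle [rad], angular rate [rad/s]
          a = trolley_accel(t)
          return [omega, -(g * np.sin(theta) + a * np.cos(theta)) / L]

      sol = solve_ivp(sway, (0.0, 10.0), [0.0, 0.0], max_step=0.01)
      print(f"peak sway: {np.degrees(np.abs(sol.y[0]).max()):.2f} deg")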

  8. COMPUGIRLS: Stepping Stone to Future Computer-Based Technology Pathways

    Science.gov (United States)

    Lee, Jieun; Husman, Jenefer; Scott, Kimberly A.; Eggum-Wilkens, Natalie D.

    2015-01-01

    COMPUGIRLS, a culturally relevant technology program for adolescent girls, was developed to promote underrepresented girls' future possible selves and career pathways in computer-related technology fields. We hypothesized that COMPUGIRLS would promote academic possible selves and the self-regulation needed to achieve these possible selves. We compared…

  9. IBM Cloud Computing Powering a Smarter Planet

    Science.gov (United States)

    Zhu, Jinzy; Fang, Xing; Guo, Zhe; Niu, Meng Hua; Cao, Fan; Yue, Shuang; Liu, Qin Yu

    With the increasing need for intelligent systems supporting the world's businesses, cloud computing has emerged as a dominant trend providing a dynamic infrastructure to make such intelligence possible. The article introduces how to build a smarter planet with cloud computing technology. First, it explains why we need the cloud and traces the evolution of cloud technology. Second, it analyzes the value of cloud computing and how to apply cloud technology. Finally, it predicts the future of the cloud in the smarter planet.

  10. Computers in Nuclear Physics Division

    International Nuclear Information System (INIS)

    Kowalczyk, M.; Tarasiuk, J.; Srebrny, J.

    1997-01-01

    Improvements to the computer equipment in the Nuclear Physics Division are described. These include new computer equipment and hardware upgrades, software development, new programs for computer booting, and modernization of data acquisition systems

  11. Statistical properties of dynamical systems – Simulation and abstract computation

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Hoyrup, Mathieu; Rojas, Cristóbal

    2012-01-01

    Highlights: ► A survey on results about computation and computability on the statistical properties of dynamical systems. ► Computability and non-computability results for invariant measures. ► A short proof for the computability of the convergence speed of ergodic averages. ► A kind of “constructive” version of the pointwise ergodic theorem. - Abstract: We survey an area of recent development, relating dynamics to theoretical computer science. We discuss some aspects of the theoretical simulation and computation of the long term behavior of dynamical systems. We will focus on the statistical limiting behavior and invariant measures. We present a general method allowing the algorithmic approximation at any given accuracy of invariant measures. The method can be applied in many interesting cases, as we shall explain. On the other hand, we exhibit some examples where the algorithmic approximation of invariant measures is not possible. We also explain how it is possible to compute the speed of convergence of ergodic averages (when the system is known exactly) and how this entails the computation of arbitrarily good approximations of points of the space having typical statistical behaviour (a sort of constructive version of the pointwise ergodic theorem).
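
    A naive numerical illustration of such ergodic averages (not the paper's rigorous algorithmic framework): for the logistic map x -> 4x(1-x), whose invariant density is 1/(pi*sqrt(x(1-x))), the Birkhoff time average of the observable f(x) = x converges to the space average 1/2.

      # Time average along one orbit of the logistic map; purely illustrative
      # floating-point simulation, with convergence visible as n grows.
      x, total = 0.123456789, 0.0
      for n in range(1, 10**6 + 1):
          x = 4.0 * x * (1.0 - x)
          total += x                      # observable f(x) = x
          if n in (10**3, 10**4, 10**5, 10**6):
              print(f"n={n:>7}: average = {total / n:.4f}")   # tends to 0.5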

  12. NAIAD - a computer program for calculation of the steady state and transient behaviour (including LOCA) of compressible two-phase coolant in networks

    International Nuclear Information System (INIS)

    Trimble, G.D.; Turner, W.J.

    1976-04-01

    The three one-dimensional conservation equations of mass, momentum and energy are solved by a stable finite difference scheme which allows the time step to be varied in response to accuracy requirements. Consideration of numerical stability is not necessary. Slip between the phases is allowed and descriptions of complex hydraulic components can be added into specially provided user routines. Intrinsic choking using any of the nine slip models is possible. A pipe or fuel model and detailed surface heat transfer are included. (author)
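
    For orientation, the three one-dimensional conservation equations solved by codes of this kind can be written in a generic homogeneous-mixture form (a textbook form, not NAIAD's exact formulation, which adds slip between the phases):

      \[
        \frac{\partial \rho}{\partial t} + \frac{\partial (\rho u)}{\partial x} = 0,
        \qquad
        \frac{\partial (\rho u)}{\partial t} + \frac{\partial (\rho u^{2})}{\partial x}
          = -\frac{\partial p}{\partial x} - \rho g \cos\theta - F_{w},
        \qquad
        \frac{\partial (\rho h)}{\partial t} + \frac{\partial (\rho u h)}{\partial x}
          = \frac{\partial p}{\partial t} + q',
      \]

    where \( F_{w} \) is the wall friction per unit volume and \( q' \) the volumetric heating rate. The slip models mentioned above modify the flux terms when the phases move at different velocities, and choking arises where the local flow speed reaches the two-phase sound speed.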

  13. Computing all hybridization networks for multiple binary phylogenetic input trees.

    Science.gov (United States)

    Albrecht, Benjamin

    2015-07-30

    The computation of phylogenetic trees on the same set of species that are based on different orthologous genes can lead to incongruent trees. One possible explanation for this behavior is interspecific hybridization events recombining genes of different species. An important approach to analyze such events is the computation of hybridization networks. This work presents the first algorithm computing the hybridization number as well as a set of representative hybridization networks for multiple binary phylogenetic input trees on the same set of taxa. To improve its practical runtime, we show how this algorithm can be parallelized. Moreover, we demonstrate the efficiency of the software Hybroscale, containing an implementation of our algorithm, by comparing it to PIRNv2.0, which is so far the best available software computing the exact hybridization number for multiple binary phylogenetic trees on the same set of taxa. The algorithm is part of the software Hybroscale, which was developed specifically for the investigation of hybridization networks including their computation and visualization. Hybroscale is freely available and runs on all three major operating systems. Our simulation study indicates that our approach is on average 100 times faster than PIRNv2.0. Moreover, we show how Hybroscale improves the interpretation of the reported hybridization networks by adding certain features to its graphical representation.

  14. Computer-Based Career Interventions.

    Science.gov (United States)

    Mau, Wei-Cheng

    The possible utilities and limitations of computer-assisted career guidance systems (CACG) have been widely discussed although the effectiveness of CACG has not been systematically considered. This paper investigates the effectiveness of a theory-based CACG program, integrating Sequential Elimination and Expected Utility strategies. Three types of…

  15. Computer Security Incident Response Planning at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    The purpose of this publication is to assist Member States in developing comprehensive contingency plans for computer security incidents with the potential to impact nuclear security and/or nuclear safety. It provides an outline and recommendations for establishing a computer security incident response capability as part of a computer security programme, and considers the roles and responsibilities of the system owner, operator, competent authority, and national technical authority in responding to a computer security incident with possible nuclear security repercussions

  16. International Conference on Computer, Communication and Computational Sciences

    CERN Document Server

    Mishra, Krishn; Tiwari, Shailesh; Singh, Vivek

    2017-01-01

    Exchange of information and innovative ideas is necessary to accelerate the development of technology. With the advent of technology, intelligent and soft computing techniques came into existence with a wide scope of implementation in the engineering sciences. Keeping this ideology in mind, this book includes insights that reflect the 'Advances in Computer and Computational Sciences' from upcoming researchers and leading academicians across the globe. It contains high-quality peer-reviewed papers from the 'International Conference on Computer, Communication and Computational Sciences' (ICCCCS 2016), held during 12-13 August 2016 in Ajmer, India. These papers are arranged in the form of chapters. The content of the book is divided into two volumes that cover a variety of topics such as intelligent hardware and software design, advanced communications, power and energy optimization, intelligent techniques used in the internet of things, intelligent image processing, advanced software engineering, evolutionary and ...

  17. The ATLAS computing challenge for HL-LHC

    CERN Document Server

    Campana, Simone; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment successfully commissioned a software and computing infrastructure to support the physics program during LHC Run 2. The next phases of the accelerator upgrade will present new challenges in the offline area. In particular, at the High Luminosity LHC (also known as Run 4) the data-taking conditions will be very demanding in terms of computing resources: an event rate of between 5 and 10 kHz from the HLT to be reconstructed (and possibly further reprocessed) with an average pile-up of up to 200 events per collision, and an equivalent number of simulated samples to be produced. The same parameters for the current run are lower by up to an order of magnitude. While processing and storage resources would need to scale accordingly, the funding situation allows one at best to consider a flat budget over the next few years for offline computing needs. In this paper we present a study quantifying the challenge in terms of computing resources for HL-LHC and present ideas about the possible evolution of the ...

  18. From Computational Thinking to Computational Empowerment: A 21st Century PD Agenda

    DEFF Research Database (Denmark)

    Iversen, Ole Sejer; Smith, Rachel Charlotte; Dindler, Christian

    2018-01-01

    We propose computational empowerment as an approach, and a Participatory Design response, to challenges related to the digitalization of society and the emerging need for digital literacy in K12 education. Our approach extends the current focus on computational thinking to include contextual, human-centred and societal challenges and impacts involved in students' creative and critical engagement with digital technology. We argue that PD has the potential to drive a computational empowerment agenda in education, by connecting political PD with contemporary visions for addressing a future digitalized labor market and society. Our research is based on the FabLab@School project, in which a PD approach to computational empowerment provided opportunities as well as further challenges for the complex agenda of digital...

  19. ORCODE.77: a computer routine to control a nuclear physics experiment by a PDP-15 + CAMAC system, written in assembler language and including many new routines of general interest

    International Nuclear Information System (INIS)

    Dickens, J.K.; McConnell, J.W.

    1977-01-01

    ORCODE.77 is a versatile data-handling computer routine written in MACRO (assembler) language for a PDP-15 computer with EAE (extended arithmetic capability) connected to a CAMAC interface. The Interrupt feature of the computer is utilized. Although the code is oriented for a specific experimental problem, there are many routines of general interest, including a CAMAC Scaler handler, an executive routine to interpret and act upon three-character teletype commands, concise routines to type out double-precision integers (both octal and decimal) and floating-point numbers and to read in integers and floating-point numbers, a routine to convert to and from PDP-15 FORTRAN-IV floating-point format, a routine to handle clock interrupts, and our own DECTAPE handling routine. Routines having specific applications which are applicable to other very similar applications include a display routine using CAMAC instructions, control of external mechanical equipment using CAMAC instructions, storage of data from an Analog-to-digital Converter, analysis of stored data into time-dependent pulse-height spectra, and a routine to read the contents of a Nuclear Data 5050 Analyzer and to prepare DECTAPE output of these data for subsequent analysis by a code written in PDP-15-compiled FORTRAN-IV

  20. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  1. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
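
    The image deblurring mentioned here is, at its core, deconvolution, which coherent optics performs in the Fourier plane. A minimal digital analogue using a Wiener filter is sketched below (numpy; the noise-to-signal constant K is an assumed regularization parameter):

      # Wiener deconvolution: divide out the blur transfer function H in the
      # Fourier domain, regularized by the assumed noise-to-signal ratio K.
      import numpy as np

      def wiener_deblur(blurred, psf, K=0.01):
          H = np.fft.fft2(psf, s=blurred.shape)          # blur transfer function
          G = np.fft.fft2(blurred)                       # blurred-image spectrum
          F_hat = np.conj(H) / (np.abs(H) ** 2 + K) * G  # regularized inverse
          return np.real(np.fft.ifft2(F_hat))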

  2. Interventions on central computing services during the weekend of 21 and 22 August

    CERN Multimedia

    2004-01-01

    As part of the planned upgrade of the computer centre infrastructure to meet the LHC computing needs, approximately 150 servers, hosting in particular the NICE home directories, Mail services and Web services, will need to be physically relocated to another part of the computing hall during the weekend of 21 and 22 August. On Saturday 21 August, starting from 8:30 a.m., interruptions of typically 60 minutes will take place on the following central computing services: NICE and the whole Windows infrastructure, Mail services, file services (including home directories and DFS workspaces), Web services, VPN access, and Windows Terminal Services. During any interruption, incoming mail from outside CERN will be queued and delivered as soon as the service is operational again. All services should be available again on Saturday 21 at 17:30, but a few additional interruptions will be possible after that time and on Sunday 22 August. IT Department

  3. Monte Carlo simulation of fast neutron scattering experiments including DD-breakup neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, D.; Siebert, B.R.L.

    1993-06-01

    The computational simulation of the deuteron breakup in a scattering experiment has been investigated. Experimental breakup spectra measured at 16 deuteron energies and at 7 angles for each energy served as the data base. Analysis of these input data and of the conditions of the scattering experiment made it possible to reduce the input data. The use of one weighted breakup spectrum is sufficient to simulate the scattering spectra at one incident neutron energy. A number of tests were carried out to prove the validity of this result. The simulation of neutron scattering on carbon, including the breakup, was compared with measured spectra. Differences between calculated and measured spectra were for the most part within the experimental uncertainties. Certain significant deviations can be attributed to erroneous scattering cross sections taken from an evaluation and used in the simulation. Scattering on higher-lying states in 12C can be analyzed by subtracting the simulated breakup-scattering from the experimental spectra. (orig.)

  4. Monte Carlo simulation of fast neutron scattering experiments including DD-breakup neutrons

    International Nuclear Information System (INIS)

    Schmidt, D.; Siebert, B.R.L.

    1993-06-01

    The computational simulation of the deuteron breakup in a scattering experiment has been investigated. Experimental breakup spectra measured at 16 deuteron energies and at 7 angles for each energy served as the data base. Analysis of these input data and of the conditions of the scattering experiment made it possible to reduce the input data. The use of one weighted breakup spectrum is sufficient to simulate the scattering spectra at one incident neutron energy. A number of tests were carried out to prove the validity of this result. The simulation of neutron scattering on carbon, including the breakup, was compared with measured spectra. Differences between calculated and measured spectra were for the most part within the experimental uncertainties. Certain significant deviations can be attributed to erroneous scattering cross sections taken from an evaluation and used in the simulation. Scattering on higher-lying states in 12C can be analyzed by subtracting the simulated breakup-scattering from the experimental spectra. (orig.)

  5. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    International Nuclear Information System (INIS)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. The new cloud technologies also come with new challenges, and one such is the contextualization of computing resources with regard to the requirements of the user and his experiment. In particular, on Google's new cloud platform, Google Compute Engine (GCE), upload of users' virtual machine images is not possible. This precludes application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  6. Identifying logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Science.gov (United States)

    Davis, Kristan D.; Faraj, Daniel A.

    2016-03-01

    In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.
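
    A rough sketch of the plane-building step on a 2-D grid may help decode the claim language; the coordinate model and names are illustrative, not the patented implementation.

      # From a plane building node, extend along the +/- first dimension, then
      # grow in the positive second dimension while complete rows of
      # subcommunicator nodes exist; each growth step yields a logical plane.
      def planes_from(anchor, members):
          ax, ay = anchor
          found = set()
          for step in (1, -1):               # positive, then negative, direction
              x_end = ax
              while (x_end + step, ay) in members:
                  x_end += step
              lo, hi = sorted((ax, x_end))
              y = ay
              while all((x, y + 1) in members for x in range(lo, hi + 1)):
                  y += 1
                  found.add((lo, ay, hi, y)) # plane spanning [lo, hi] x [ay, y]
          return sorted(found)

      grid = {(x, y) for x in range(3) for y in range(2)}   # a 3x2 node block
      print(planes_from((0, 0), grid))   # [(0, 0, 0, 1), (0, 0, 2, 1)]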

  7. Design principles and clinical possibilities with a new generation of radiation therapy equipment

    Energy Technology Data Exchange (ETDEWEB)

    Ruden, B P [Department of Hospital Physics, Karolinska Institute (Sweden)]

    1997-12-31

    The main steps in the development of isocentric megavoltage external beam radiation therapy machines are briefly reviewed, identifying three principal types or generations of equipment to date. The new fourth generation of equipment presented here is characterized by considerably increased flexibility in dose delivery through the use of scanned elementary electron and photon beams of very high quality. Furthermore, the wide energy range and the possibility of using high-resolution multileaf collimation with all beam modalities make it possible to simplify irradiation techniques and increase the accuracy of dose delivery. The main design features are described, including a dual-magnet scanning system, a photon beam purging magnet, a helium atmosphere in the treatment head, a beam's-eye-view video read-out system of the collimator setting and a radiotherapeutic computed tomography facility. Some of the clinical applications of this new type of radiation therapy machine are then reviewed, such as the ease of performing beam flattening, beam filtering and compensation, and the simplification of many treatment techniques using the wide spectrum of high-quality electron and photon beams. Finally, the interesting possibility of performing conformal and more general isocentric treatments with non-uniform beams using the multileaf collimator and the scanning systems is demonstrated. 9 figs., 1 tab.

  8. Computed Tomography (CT) Perfusion in Abdominal Cancer

    DEFF Research Database (Denmark)

    Hansen, Martin Lundsgaard; Norling, Rikke; Lauridsen, Carsten

    2013-01-01

    Computed Tomography (CT) Perfusion is an evolving method to visualize perfusion in organs and tissue. With the introduction of multidetector CT scanners, it is now possible to cover up to 16 cm in one rotation, thereby making it possible to scan entire organs such as the liver with a fixed

  9. Smart learning services based on smart cloud computing.

    Science.gov (United States)

    Kim, Svetlana; Song, Su-Mi; Yoon, Yong-Ik

    2011-01-01

    Context-aware technologies can make e-learning services smarter and more efficient since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include awareness of the user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest the elastic four smarts (E4S)--smart pull, smart prospect, smart content, and smart push--concept for the cloud services so that smart learning services are possible. The E4S focuses on meeting users' needs by collecting and analyzing users' behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users' behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.

  10. Smart Learning Services Based on Smart Cloud Computing

    Directory of Open Access Journals (Sweden)

    Yong-Ik Yoon

    2011-08-01

    Full Text Available Context-aware technologies can make e-learning services smarter and more efficient since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include awareness of the user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest the elastic four smarts (E4S)—smart pull, smart prospect, smart content, and smart push—concept for the cloud services so that smart learning services are possible. The E4S focuses on meeting users' needs by collecting and analyzing users' behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users' behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.

  11. Hypothermic death: possibility of diagnosis by post-mortem computed tomography.

    Science.gov (United States)

    Kawasumi, Yusuke; Onozuka, Naoki; Kakizaki, Ayana; Usui, Akihito; Hosokai, Yoshiyuki; Sato, Miho; Saito, Haruo; Ishibashi, Tadashi; Hayashizaki, Yoshie; Funayama, Masato

    2013-02-01

    Referring to our experience with post-mortem computed tomography (CT), many hypothermic death cases presented a lack of increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and urine retention in the bladder. Thus we evaluated the diagnostic performance of post-mortem CT on hypothermic death based on the above-mentioned three findings. Twenty-four hypothermic death subjects and 53 non-hypothermic death subjects were examined. Two radiologists assessed the presence or lack of an increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and measured urine volume in the bladder. Pearson's chi-square test and Mann-Whitney U-test were used to assess the relationship between the three findings and hypothermic death. The sensitivity, specificity, accuracy, positive predictive value (PPV) and negative predictive value (NPV) of the diagnosis were also calculated. Lack of an increase in lung-field concentration and blood clotting in the heart, thoracic aorta or pulmonary artery were significantly associated with hypothermic death (p=0.0007, p<0.0001, respectively). The hypothermic death cases had significantly more urine in the bladder than the non-hypothermic death cases (p=0.0011). Regarding the diagnostic performance with all three findings, the sensitivity was 29.2% but the specificity was 100%. These three findings were more common in hypothermic death cases. Although the sensitivity was low, these findings will assist forensic physicians in diagnosing hypothermic death since the specificity was high. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  12. Computer Science Research at Langley

    Science.gov (United States)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  13. Resource allocation in grid computing

    NARCIS (Netherlands)

    Koole, Ger; Righter, Rhonda

    2007-01-01

    Grid computing, in which a network of computers is integrated to create a very fast virtual computer, is becoming ever more prevalent. Examples include the TeraGrid and Planet-lab.org, as well as applications on the existing Internet that take advantage of unused computing and storage capacity of

  14. 8th International Workshop on Natural Computing

    CERN Document Server

    Hagiya, Masami

    2016-01-01

    This book highlights recent advances in natural computing, including biology and its theory, bio-inspired computing, computational aesthetics, computational models and theories, computing with natural media, philosophy of natural computing, and educational technology. It presents extended versions of the best papers selected from the “8th International Workshop on Natural Computing” (IWNC8), a symposium held in Hiroshima, Japan, in 2014. The target audience is not limited to researchers working in natural computing but also includes those active in biological engineering, fine/media art design, aesthetics, and philosophy.

  15. Using x-ray computed tomography in hydrology: Systems, resolutions, and limitations

    DEFF Research Database (Denmark)

    Wildenschild, Dorthe; Hopmans, J.W.; Vaz, C.M.P.

    2002-01-01

    A combination of advances in experimental techniques and mathematical analysis has made it possible to characterize phase distribution and pore geometry in porous media using non-destructive X-ray computed tomography (CT). We present qualitative and quantitative CT results for partially saturated media, obtained with different scanning systems and sample sizes, to illustrate advantages and limitations of these various systems, including topics of spatial resolution and contrast. In addition, we present examples of our most recent three-dimensional high-resolution images, for which...

  16. The influence of the radiation pressure force on possible critical surfaces in binary systems

    International Nuclear Information System (INIS)

    Vanbeveren, D.

    1978-01-01

    Using a spherically symmetric approximation for the radiation pressure force to compute a possible critical surface for binary systems, previous authors found that the surface opens up at the far side of the companion. It is shown that this effect may be unreal, and could be a consequence of the simple approximation for the radiation pressure force. Due to the influence of the radiation force, mass will be lost over the whole surface of the star. In this way much mass could leave massive binary systems. On the basis of evolutionary models, including mass loss by stellar wind, the results were applied to the X-ray binaries 3U 1700 - 37 and HD 77581. (Auth.)
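
    In the spherically symmetric approximation at issue, radiation pressure scales down the primary's effective gravity by a constant Eddington factor \( \Gamma = L/L_{\mathrm{Edd}} \), so the modified Roche potential takes the standard form (quoted for orientation, not from this paper):

      \[
        \Phi \;=\; -\,\frac{(1-\Gamma)\,G M_{1}}{r_{1}}
               \;-\; \frac{G M_{2}}{r_{2}}
               \;-\; \tfrac{1}{2}\,\omega^{2} \varpi^{2},
      \]

    with \( r_{1}, r_{2} \) the distances to the two components and \( \varpi \) the distance to the rotation axis. For large enough \( \Gamma \) the critical equipotential opens up behind the companion, which is precisely the feature argued here to be an artifact of the spherically symmetric treatment.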

  17. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms: from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  18. Computational models reveal a passive mechanism for cell migration in the crypt.

    Directory of Open Access Journals (Sweden)

    Sara-Jane Dunn

    Full Text Available Cell migration in the intestinal crypt is essential for the regular renewal of the epithelium, and the continued upward movement of cells is a key characteristic of healthy crypt dynamics. However, the driving force behind this migration is unknown. Possibilities include mitotic pressure, active movement driven by motility cues, or negative pressure arising from cell loss at the crypt collar. It is possible that a combination of factors together coordinate migration. Here, three different computational models are used to provide insight into the mechanisms that underpin cell movement in the crypt, by examining the consequence of eliminating cell division on cell movement. Computational simulations agree with existing experimental results, confirming that migration can continue in the absence of mitosis. Importantly, however, simulations allow us to infer mechanisms that are sufficient to generate cell movement, which is not possible through experimental observation alone. The results produced by the three models agree and suggest that cell loss due to apoptosis and extrusion at the crypt collar relieves cell compression below, allowing cells to expand and move upwards. This finding suggests that future experiments should focus on the role of apoptosis and cell extrusion in controlling cell migration in the crypt.
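
    The passive mechanism these models converge on can be caricatured in one dimension: treat the cell column as an overdamped chain of compressed springs fixed at the crypt base, remove the top cell, and the column drifts upward with no division at all. A minimal sketch with illustrative parameters (not one of the three published models):

      # Cells as an overdamped chain of springs compressed below rest length;
      # cell 0 is fixed at the crypt base. Removing the top cell (extrusion at
      # the collar) relieves compression, so the rest move up without mitosis.
      rest, k, dt = 1.0, 1.0, 0.1
      pos = [0.8 * i for i in range(20)]       # compressed column of 20 cells

      def relax(pos, steps=2000):
          for _ in range(steps):
              new = pos[:]
              for i in range(1, len(pos)):
                  below = pos[i] - pos[i - 1] - rest
                  above = pos[i + 1] - pos[i] - rest if i + 1 < len(pos) else 0.0
                  new[i] = pos[i] + dt * k * (above - below)
              pos = new
          return pos

      top_before = pos[-2]                     # the cell just below the lost one
      pos = relax(pos[:-1])                    # extrude the top cell, then relax
      print(f"new top cell moved up by {pos[-1] - top_before:.2f} cell lengths")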

  19. Computer control applied to accelerators

    CERN Document Server

    Crowley-Milling, Michael C

    1974-01-01

    The differences that exist between control systems for accelerators and other types of control systems are outlined. It is further indicated that earlier accelerators had manual control systems to which computers were added, but that it is essential for the new, large accelerators to include computers in the control systems right from the beginning. Details of the computer control designed for the Super Proton Synchrotron are presented. The method of choosing the computers is described, as well as the reasons for CERN having to design the message transfer system. The items discussed include: CAMAC interface systems, a new multiplex system, operator-to-computer interaction (such as touch screen, computer-controlled knob, and non- linear track-ball), and high-level control languages. Brief mention is made of the contributions of other high-energy research laboratories as well as of some other computer control applications at CERN. (0 refs).

  20. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  1. [Problem list in computer-based patient records].

    Science.gov (United States)

    Ludwig, C A

    1997-01-14

    Computer-based clinical information systems are capable of effectively processing even large amounts of patient-related data. However, physicians depend on rapid access to summarized, clearly laid-out data on the computer screen to inform themselves about a patient's current clinical situation. In introducing a clinical workplace system, we therefore transformed the problem list, which for decades has been successfully used in clinical information management, into an electronic equivalent and integrated it into the medical record. The table contains a concise overview of diagnoses and problems as well as related findings. Graphical information can also be integrated into the table, and an additional space is provided for a summary of planned examinations or interventions. The digital form of the problem list makes it possible to use the entire list or selected text elements for generating medical documents. Diagnostic terms for medical reports are transferred automatically to the corresponding documents. Computer technology has immense potential for the further development of problem list concepts. With multimedia applications, sound and images will be included in the problem list. Through hyperlinks, the problem list could become a central information board and table of contents of the medical record, thus serving as the starting point for database searches and supporting the user in navigating through the medical record.

  2. Computational tools for high-throughput discovery in biology

    OpenAIRE

    Jones, Neil Christopher

    2007-01-01

    High throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the fields of the computational life sciences which each require computing capaci...

  3. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  4. HCI in Mobile and Ubiquitous Computing

    OpenAIRE

    椎尾, 一郎; 安村, 通晃; 福本, 雅明; 伊賀, 聡一郎; 増井, 俊之

    2003-01-01

    This paper provides some perspectives on human computer interaction in mobile and ubiquitous computing. The review covers an overview of ubiquitous computing, mobile computing and wearable computing. It also summarizes HCI topics in these fields, including real-world-oriented interfaces, multi-modal interfaces, context awareness and invisible computers. Finally, we discuss killer applications for the coming ubiquitous computing era.

  5. Distributed computing system with dual independent communications paths between computers and employing split tokens

    Science.gov (United States)

    Rasmussen, Robert D. (Inventor); Manning, Robert M. (Inventor); Lewis, Blair F. (Inventor); Bolotin, Gary S. (Inventor); Ward, Richard S. (Inventor)

    1990-01-01

    This is a distributed computing system providing flexible fault tolerance; ease of software design and concurrency specification; and dynamic load balancing. The system comprises a plurality of computers, each having a first input/output interface and a second input/output interface for interfacing to communications networks, each second input/output interface including a bypass for bypassing the associated computer. A global communications network interconnects the first input/output interfaces, providing each computer the ability to broadcast messages simultaneously to the remainder of the computers. A meshwork communications network interconnects the second input/output interfaces, providing each computer with the ability to establish a communications link with another of the computers, bypassing the remainder of the computers. Each computer is controlled by a resident copy of a common operating system. Communication between respective computers is by means of split tokens, each having a moving first portion, which is sent from computer to computer, and a resident second portion, which is disposed in the memory of at least one of the computers; the location of the second portion is part of the first portion. The split tokens represent both functions to be executed by the computers and data to be employed in the execution of the functions. The first input/output interfaces each include logic for detecting a collision between messages and for terminating the broadcasting of a message, whereby collisions between messages are detected and avoided.
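
    A single-process Python sketch of the split-token idea (illustrative only:
    it ignores the dual networks, collision detection and fault tolerance
    described above, and all names are invented):

        from dataclasses import dataclass

        @dataclass
        class MovingPart:
            function: str   # function the receiving computer should execute
            home: int       # computer holding the resident second portion...
            key: int        # ...and where in that computer's memory it lives

        class Computer:
            def __init__(self, ident):
                self.ident, self.memory = ident, {}

        # the token is split: data stays resident, the moving part circulates
        computers = [Computer(i) for i in range(3)]
        computers[1].memory[7] = [4, 9, 2]
        token = MovingPart(function="sort", home=1, key=7)

        def execute(token, computers):
            # any computer receiving the moving part can locate the data,
            # because the location of the resident portion travels with it
            data = computers[token.home].memory[token.key]
            if token.function == "sort":
                data.sort()
            return data

        print(execute(token, computers))   # [2, 4, 9]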

  6. Software of the BESM-4 computer for operating in on-line regime with the ''ALPhA'' installation

    International Nuclear Information System (INIS)

    Piskunov, N.M.; Sadovnikov, V.N.; Sitnik, I.M.; Strokovskij, E.A.; Sharov, V.I.

    1976-01-01

    A program for information storage and equipment control for a magnetic spectrometer, which includes scintillation counters and proportional chambers, is described. The entire electronic equipment is designed according to the CAMAC standard. The program makes extensive use of two-way ''device-computer'' communication, employing a display and computer-controlled logic blocks of the electronic equipment. The information is written to tape according to the international ISO standard with the aid of an ES-5012 storage device. The total amount of information received during a single pulse of accelerator radiation is 18K bits

  7. Computer program for diagnostic X-ray exposure conversion

    International Nuclear Information System (INIS)

    Lewis, S.L.

    1984-01-01

    Presented is a computer program designed to convert any given set of diagnostic X-ray exposure factors sequentially into another, yielding either an equivalent photographic density or one increased or decreased by a specifiable proportion. In addition to containing the wherewithal with which to manipulate a set of exposure factors, the facility to print hard (paper) copy is included, enabling the results to be pasted into a notebook and used at any time. This program was originally written as an exercise investigating the potential use of computers for practical radiographic purposes as conventionally encountered. At the same time, its possible use as an educational tool was borne in mind. To these ends, the current version of the program may be used in a diagnostic department as a means whereby exposure factors are altered to suit a particular requirement, or in the school as a mathematical model describing the behaviour of exposure factors under manipulation without patient exposure. (author)
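
    The abstract does not give the program's actual conversion rules, so the
    sketch below assumes the common radiographic "15% rule" (raising kVp by
    15% roughly doubles exposure, so mAs must be halved to keep density),
    which is one plausible basis for such a program:

        def convert_exposure(kv_old, mas_old, kv_new, density_scale=1.0):
            """mAs giving an equivalent photographic density at kv_new.

            Assumes the 15% rule; density_scale > 1 deliberately raises
            the resulting density by the given proportion.
            """
            steps = (kv_new - kv_old) / (0.15 * kv_old)  # no. of 15% steps
            mas_new = mas_old / (2.0 ** steps)           # keep density equal
            return mas_new * density_scale               # then rescale it

        # 70 kVp / 20 mAs converted to 80 kVp at equal density, then +30%
        print(round(convert_exposure(70, 20, 80), 1))       # ~10.3 mAs
        print(round(convert_exposure(70, 20, 80, 1.3), 1))  # ~13.4 mAs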

  8. CERN School of Computing 2012 - Registration is open!

    CERN Multimedia

    IT Department

    2012-01-01

    The registration for the CERN School of Computing is now open. CSC2012 will take place in Uppsala from the 13th to the 24th of August. The programme comprises three main themes: Data Technologies, Base Technologies and Physics Computing, and will address a number of timely questions, including: Do you know how to bridge Grids and Clouds using virtualization technology? Is it possible to simplify LHC physics analysis using virtual machines? How can reliable storage services be built from unreliable hardware? Why are tapes still used in high energy physics data storage? How can I write code for tomorrow’s hardware, today? Do you want to see your software with an attacker's eyes? Can you hack your own code? Do you know what 'code injection' and 'integer overflow' have in common? What's so special about High Energy Physics' data formats? What are the key statistical methods used in physics data analysis? The CSC is a true Summer Un...

  9. CERN School of Computing 2011 - Registration is open

    CERN Multimedia

    IT Department

    2011-01-01

    The registration for the CERN School of Computing is open. CSC2011 will take place in Copenhagen from the 15th to the 26th of August. The programme comprises three main themes: Data Technologies, Base Technologies and Physics Computing, and will address a number of timely questions, including: Do you know how to bridge Grids and Clouds using virtualization technology? Is it possible to simplify LHC physics analysis using virtual machines? How can reliable storage services be built from unreliable hardware? Why are tapes still used in high energy physics data storage? How can I write code for tomorrow’s hardware, today? Do you want to see your software with an attacker's eyes? Can you hack your own code? What's so special about High Energy Physics' data formats? What are the key statistical methods used in physics data analysis? The CSC is a true Summer University. The focus is on delivering knowledge rather than know-how, which can better be provi...

  10. Creation of 'Ukrytie' objects computer model

    International Nuclear Information System (INIS)

    Mazur, A.B.; Kotlyarov, V.T.; Ermolenko, A.I.; Podbereznyj, S.S.; Postil, S.D.; Shaptala, D.V.

    1999-01-01

    A partial computer model of the 'Ukrytie' object was created with the use of geoinformation technologies. The computer model makes it possible to provide information support for work related to the stabilization of the 'Ukrytie' object and its conversion into an ecologically safe system, by analyzing, forecasting and controlling the processes occurring in the 'Ukrytie' object. Elements and structures of the 'Ukrytie' object were designed and input into the model

  11. Framework of Resource Management for Intercloud Computing

    Directory of Open Access Journals (Sweden)

    Mohammad Aazam

    2014-01-01

    There has been a very rapid increase in digital media content, due to which the media cloud is gaining importance. The cloud computing paradigm provides management of resources and helps create an extended portfolio of services. Through cloud computing, not only are services managed more efficiently, but service discovery is also made possible. To handle the rapid increase in content, the media cloud plays a vital role. But it is not possible for standalone clouds to handle everything as user demands increase. For scalability and better service provisioning, clouds at times have to communicate with other clouds and share their resources. This scenario is called Intercloud computing, or cloud federation. The study of Intercloud computing is still in its infancy, and resource management is one of its key concerns. Existing studies discuss this issue only in a trivial and simplistic way. In this study, we present a resource management model that takes into account different types of services, different customer types, customer characteristics, pricing, and refunding. The presented framework was implemented using Java and NetBeans 8.0 and evaluated using the CloudSim 3.0.3 toolkit. The presented results and their discussion validate our model and its efficiency.
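
    The model itself is not reproduced in the abstract; the following sketch
    only illustrates the kind of bookkeeping it involves, with hypothetical
    service rates, customer classes and a proportional refund rule:

        RATES = {"compute": 0.10, "storage": 0.02, "media": 0.15}  # $/unit-h
        DISCOUNT = {"new": 0.00, "regular": 0.05, "premium": 0.15}

        def bill(service, units, hours, customer, sla_met_fraction):
            base = RATES[service] * units * hours
            price = base * (1.0 - DISCOUNT[customer])
            refund = price * (1.0 - sla_met_fraction)  # refund unmet share
            return price - refund

        # a premium media customer whose SLA was met 90% of the time
        print(bill("media", units=4, hours=10, customer="premium",
                   sla_met_fraction=0.9))   # 4.59: pays the delivered share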

  12. Cyber Security on Nuclear Power Plant's Computer Systems

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Ick Hyun [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2010-10-15

    Computer systems are used in many different fields of industry, and most of us take great advantage of them. Because of the effectiveness and performance of computer systems, we have become highly dependent on them. But the more dependent we are on a computer system, the greater the risk we face when that system becomes unavailable, inaccessible or uncontrollable. SCADA (Supervisory Control And Data Acquisition) systems are broadly used for critical infrastructure such as transportation, electricity and water management, and if a SCADA system is vulnerable to cyber attack, the result can be a national disaster. Especially if a nuclear power plant's main control systems were attacked by cyber terrorists, the consequences could be severe: the release of radioactive material could be the terrorists' main objective, achieved without the use of physical force. In this paper, different types of cyber attacks are described, and a possible structure of an NPP's computer network system is presented. The paper also discusses possible ways the NPP's computer system could be compromised, along with some suggestions for protection against cyber attacks

  13. AV Programs for Computer Know-How.

    Science.gov (United States)

    Mandell, Phyllis Levy

    1985-01-01

    Lists 44 audiovisual programs (most released between 1983 and 1984) grouped in seven categories: computers in society, introduction to computers, computer operations, languages and programing, computer graphics, robotics, computer careers. Excerpts from "School Library Journal" reviews, price, and intended grade level are included. Names…

  14. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains the conceptual framework (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information networks). The text discusses algebra, particularly as it applies to computing, and concentrates on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasize

  15. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i

  16. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping up to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  17. Compendium of computer codes for the researcher in magnetic fusion energy

    International Nuclear Information System (INIS)

    Porter, G.D.

    1989-01-01

    This is a compendium of computer codes available to the fusion researcher. It is intended to be a document that permits a quick evaluation of the tools available to the experimenter who wants both to analyze his data and to compare the results of his analysis with the predictions of available theories. This document will be updated frequently to maintain its usefulness. I would appreciate receiving further information, from anyone who has used them, about codes not included here. The information required includes a brief description of the code (including any special features), a bibliography of the documentation available for the code and/or the underlying physics, a list of people to contact for help in running the code, instructions on how to access the code, and a description of the output from the code. Wherever possible, the code contacts should include people from each of the fusion facilities so that the novice can talk to someone ''down the hall'' when he first tries to use a code. I would also appreciate any comments about possible additions and improvements to the index, and I encourage any additional criticism of this document. 137 refs

  18. East-West paths to unconventional computing.

    Science.gov (United States)

    Adamatzky, Andrew; Akl, Selim; Burgin, Mark; Calude, Cristian S; Costa, José Félix; Dehshibi, Mohammad Mahdi; Gunji, Yukio-Pegio; Konkoli, Zoran; MacLennan, Bruce; Marchal, Bruno; Margenstern, Maurice; Martínez, Genaro J; Mayne, Richard; Morita, Kenichi; Schumann, Andrew; Sergeyev, Yaroslav D; Sirakoulis, Georgios Ch; Stepney, Susan; Svozil, Karl; Zenil, Hector

    2017-12-01

    Unconventional computing is about breaking boundaries in thinking, acting and computing. Typical topics of this non-typical field include, but are not limited to, the physics of computation, non-classical logics, new complexity measures, novel hardware, and mechanical, chemical and quantum computing. Unconventional computing encourages a new style of thinking, while practical applications are obtained from uncovering and exploiting principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems; in particular, efficient algorithms are developed, (almost) optimal architectures are designed, and working prototypes of future computing devices are manufactured. This article includes idiosyncratic accounts of 'unconventional computing' scientists reflecting on their personal experiences, what attracted them to the field, their inspirations and discoveries. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This dilemma has a solution: technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  20. (Some) Computer Futures: Mainframes.

    Science.gov (United States)

    Joseph, Earl C.

    Possible futures for the world of mainframe computers can be forecast through studies identifying forces of change and their impact on current trends. Some new prospects for the future have been generated by advances in information technology; for example, recent United States successes in applied artificial intelligence (AI) have created new…

  1. Computer vision based nacre thickness measurement of Tahitian pearls

    Science.gov (United States)

    Loesdau, Martin; Chabrier, Sébastien; Gabillon, Alban

    2017-03-01

    The Tahitian Pearl is the most valuable export product of French Polynesia, contributing over 61 million Euros to more than 50% of the total export income. To maintain its excellent reputation on the international market, an obligatory quality control for every pearl deemed for exportation has been established by the local government. One of the controlled quality parameters is the pearl's nacre thickness. The evaluation is currently done manually by experts who visually analyze X-ray images of the pearls. In this article, a computer vision based approach to automate this procedure is presented. Even though computer vision based approaches for pearl nacre thickness measurement exist in the literature, the very specific features of the Tahitian pearl, namely the large shape variety and the occurrence of cavities, have so far not been considered. The presented work closes this gap. Our method consists of segmenting the pearl from X-ray images with a model-based approach, segmenting the pearl's nucleus with a custom-developed heuristic circle detection, and segmenting possible cavities with region growing. From the obtained boundaries, the 2-dimensional nacre thickness profile can be calculated. A certainty measure that accounts for imaging and segmentation imprecision is included in the procedure. The proposed algorithms are tested on 298 manually evaluated Tahitian pearls, showing that it is generally possible to automatically evaluate the nacre thickness of Tahitian pearls with computer vision. Furthermore, the results show that the automatic measurement is more precise and faster than the manual one.
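
    As an illustration of the cavity-segmentation step, here is a minimal
    4-connected region-growing routine on a toy grey-level image (the paper's
    actual seeds, tolerances and pre-processing are not given in the abstract):

        def region_grow(img, seed, tol):
            h, w = len(img), len(img[0])
            ref = img[seed[0]][seed[1]]       # grey level at the seed pixel
            region, stack = set(), [seed]
            while stack:
                r, c = stack.pop()
                if (r, c) in region or not (0 <= r < h and 0 <= c < w):
                    continue
                if abs(img[r][c] - ref) <= tol:
                    region.add((r, c))        # accept and grow to neighbours
                    stack += [(r+1, c), (r-1, c), (r, c+1), (r, c-1)]
            return region

        img = [[9, 9, 9, 9],
               [9, 2, 3, 9],
               [9, 2, 2, 9],
               [9, 9, 9, 9]]                  # dark "cavity" in bright nacre
        print(sorted(region_grow(img, seed=(1, 1), tol=2)))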

  2. Global Conference on Applied Computing in Science and Engineering

    CERN Document Server

    2016-01-01

    The Global Conference on Applied Computing in Science and Engineering is organized by academics and researchers belonging to different scientific areas of the C3i/Polytechnic Institute of Portalegre (Portugal) and the University of Extremadura (Spain), with the technical support of ScienceKnow Conferences. The event has the objective of creating an international forum for academics, researchers and scientists from around the world to discuss results and proposals regarding the most pressing issues related to Applied Computing in Science and Engineering. This event will include the participation of renowned keynote speakers, oral presentations, poster sessions and technical conferences related to the topics dealt with in the Scientific Program, as well as an attractive social and cultural program. The papers will be published in the Proceedings e-books. The proceedings of the conference will be sent for possible indexing on Thomson Reuters (selective by Thomson Reuters, not all-inclusive) and Google Scholar...

  3. Computational Humor 2012 : extended abstacts of the (3rd international) Workshop on computational Humor

    NARCIS (Netherlands)

    Nijholt, Antinus; Unknown, [Unknown

    2012-01-01

    Like its predecessors in 1996 (University of Twente, the Netherlands) and 2002 (ITC-irst, Trento, Italy), this Third International Workshop on Computational Humor (IWCH 2012) focuses on the possibility of finding algorithms that allow understanding and generation of humor. There is the general aim of

  4. Classical versus Computer Algebra Methods in Elementary Geometry

    Science.gov (United States)

    Pech, Pavel

    2005-01-01

    Computer algebra methods based on results of commutative algebra, like Groebner bases of ideals and elimination of variables, make it possible to solve complex, elementary and non-elementary problems of geometry which are difficult to solve using a classical approach. Computer algebra methods permit the proof of geometric theorems, automatic…
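
    A small example of the approach (not taken from the paper): proving
    Thales' theorem, that an angle inscribed in a semicircle is a right angle,
    by checking that the conclusion polynomial lies in the ideal generated by
    the hypothesis, using SymPy's Groebner-basis routines:

        from sympy import symbols, groebner, reduced, expand

        x, y, r = symbols('x y r')

        # Hypothesis: P = (x, y) lies on the circle of radius r at the origin
        h = x**2 + y**2 - r**2
        # Conclusion: with A = (-r, 0), B = (r, 0), PA is perpendicular to PB
        g = expand((x + r)*(x - r) + y*y)

        G = groebner([h], x, y, r, order='lex')
        _, remainder = reduced(g, list(G), x, y, r, order='lex')
        print(remainder)   # 0 -> conclusion lies in the hypothesis ideal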

  5. Knowledge-based computer security advisor

    International Nuclear Information System (INIS)

    Hunteman, W.J.; Squire, M.B.

    1991-01-01

    The rapid expansion of computer security information and technology has included little support to help the security officer identify the safeguards needed to comply with a policy and to secure a computing system. This paper reports that Los Alamos is developing a knowledge-based computer security system to provide expert knowledge to the security officer. This system includes a model for expressing the complex requirements in computer security policy statements. The model is part of an expert system that allows a security officer to describe a computer system and then determine compliance with the policy. The model contains a generic representation that captures network relationships among the policy concepts, to support inferencing based on information represented in the generic policy description
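
    A toy forward-chaining sketch of how such a compliance check might work
    (the rules and facts below are invented, not the Los Alamos model):

        RULES = [
            # (derived concept, premises that must all hold)
            ("needs_encryption", {"stores_classified", "network_connected"}),
            ("compliant", {"needs_encryption", "disk_encrypted"}),
        ]

        def infer(facts):
            facts, changed = set(facts), True
            while changed:   # keep firing rules until a fixed point
                changed = False
                for head, body in RULES:
                    if body <= facts and head not in facts:
                        facts.add(head)
                        changed = True
            return facts

        system = {"stores_classified", "network_connected", "disk_encrypted"}
        print("compliant" in infer(system))   # True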

  6. 25 years of industrial computer tomography in Europe

    International Nuclear Information System (INIS)

    Sauerwein, C.; Simon, M.

    2003-01-01

    In recent years X-ray computed tomography (CT) has received growing interest in industry. Most of the mathematical and physical basis for CT was developed many decades ago and is still valid. Already 25 years ago, pioneers were applying X-ray CT to inspect and examine industrial objects. Since then, advances especially in sensor technology and computer hardware have led to dramatic changes in CT hardware. Together with refinements of the CT reconstruction algorithms and their implementation in modern object-oriented software environments on fast PC hardware, it has become possible to enhance resolution and reduce scan and reconstruction times. Especially the development of 3D CT systems based on cone-beam reconstruction algorithms has improved the acceptance of CT technology for industrial applications. Besides the classical NDT applications, CT is becoming more and more a versatile tool for defect detection and dimensional measurement, and is even entering the field of reverse engineering. Due to this advanced technology, a multitude of applications in various fields has become possible, such that CT is now an indispensable instrument in many fields of industrial product development and manufacturing. This paper presents an overview of the development of industrial CT systems and technologies, which includes advances in system concepts and a cross-section of a variety of applications

  7. The IAEA transport regulations: main modifications included in the 1996 edition and the possible impact of its adoption in Argentina

    International Nuclear Information System (INIS)

    Lopez Vietri, J.R.; Novo, R.G.; Bianchi, A.J.

    1998-01-01

    Full text: This paper presents a comparative analysis between the requirements of the 1985 edition (as amended 1990), in force in almost all countries including Argentina, and the 1996 edition, foreseen to enter into force on 1 January 2001, of the Regulations for the Safe Transport of Radioactive Material published by the International Atomic Energy Agency (IAEA). The English version of the 1996 edition was published in December 1996 and the Spanish one in September 1997. That edition was the culmination of a difficult consensus and harmonisation reached after an analysis process lasting several years between the IAEA Member States and related international organisations (United Nations, International Civil Aviation Organisation, International Air Transport Association, International Federation of Air Line Pilots Associations, International Maritime Organisation) as well as regional organisations (Economic Commission for Europe, Commission of the European Communities). Both editions of the Regulations include a set of design, operational and administrative requirements that do not differ substantially in their basic safety philosophy. However, the 1996 edition introduces numerous modifications of different magnitude, which will have technological, economic and operational consequences. Of these modifications, the paper analyses only the relevant ones, which update the state of the art in the subject and allow the Regulations to continue to maintain an acceptable level of control of the radiation, criticality and thermal hazards to persons, property and the environment during the transport of radioactive material. In addition, the paper briefly describes the possible impact that the main modifications introduced in the 1996 edition of the Regulations may have, depending on the type of user considered, either in Argentina or in other Latin American countries. However, it is desirable that the personnel of competent authorities of each country involved in transport

  8. Computer simulation of ductile fracture

    International Nuclear Information System (INIS)

    Wilkins, M.L.; Streit, R.D.

    1979-01-01

    Finite difference computer simulation programs are capable of very accurate solutions to problems in plasticity with large deformations and rotation. This opens the possibility of developing models of ductile fracture by correlating experiments with equivalent computer simulations. Selected experiments were done to emphasize different aspects of the model. A difficult problem is the establishment of a fracture-size effect. This paper is a study of the strain field around notched tensile specimens of aluminum 6061-T651. A series of geometrically scaled specimens are tested to fracture. The scaled experiments are conducted for different notch radius-to-diameter ratios. The strains at fracture are determined from computer simulations. An estimate is made of the fracture-size effect

  9. Numerical simulation of information recovery in quantum computers

    International Nuclear Information System (INIS)

    Salas, P.J.; Sanz, A.L.

    2002-01-01

    Decoherence is the main problem to be solved before quantum computers can be built. To control decoherence, it is possible to use error correction methods, but these methods are themselves noisy quantum computation processes. In this work, we study the ability of Steane's and Shor's fault-tolerant recovery methods, as well as a modification of Steane's ancilla network, to correct errors in qubits. We test a way to correctly measure ancilla fidelity for these methods, and establish the possibility of carrying out effective error correction through a noisy quantum channel, even using noisy error correction methods
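
    The codes studied in the paper are too long to reproduce here, but the
    principle of error correction under noise can be shown with the far
    simpler 3-qubit repetition code under independent bit flips: majority-vote
    recovery turns a physical error rate p into a logical rate of about
    3p^2 - 2p^3.

        import random

        def logical_error_rate(p, trials=100_000):
            failures = 0
            for _ in range(trials):
                flips = sum(random.random() < p for _ in range(3))
                if flips >= 2:   # majority vote corrects single flips only
                    failures += 1
            return failures / trials

        p = 0.05
        print("unencoded:", p)
        print("encoded:  ", round(logical_error_rate(p), 4))
        # ~0.0073 (= 3p^2 - 2p^3) < p, so encoding helps whenever p < 1/2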

  10. Cumulative keyboard strokes: a possible risk factor for carpal tunnel syndrome

    Directory of Open Access Journals (Sweden)

    Eleftheriou Andreas

    2012-08-01

    Background: Contradictory reports have been published regarding the association of Carpal Tunnel Syndrome (CTS) with the use of computer keyboards. Previous studies did not take into account the cumulative exposure to keyboard strokes among computer workers. The aim of the present study was to investigate the association between cumulative keyboard use (keyboard strokes) and CTS. Methods: Employees (461) from a governmental data entry and processing unit agreed to participate (response rate: 84.1%) in a cross-sectional study. A questionnaire was distributed to the participants to obtain information on socio-demographics and risk factors for CTS. The participants were examined for signs and symptoms related to CTS and were asked whether they had a previous history of, or surgery for, CTS. The cumulative number of keyboard strokes per worker per year was calculated using the payroll registry. Two case definitions for CTS were used: the first included subjects with a personal history of, or surgery for, CTS, while the second included subjects belonging to the first case definition plus those identified through clinical examination. Results: Multivariate analysis, used for both case definitions, indicated that employees with high cumulative exposure to keyboard strokes were at increased risk of CTS (case definition A: OR = 2.23; 95% CI = 1.09-4.52; case definition B: OR = 2.41; 95% CI = 1.36-4.25). A dose-response pattern between cumulative exposure to keyboard strokes and CTS was revealed (p …). Conclusions: The present study indicated a possible association between cumulative exposure to keyboard strokes and the development of CTS. Cumulative exposure to keyboard strokes should be taken into account as an exposure indicator in the exposure assessment of computer workers. Further research is needed in order to test the results of the current study and assess causality between cumulative keyboard strokes and
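
    For readers unfamiliar with the reported statistics, this is how an odds
    ratio and its 95% confidence interval are computed from a 2x2
    exposure/outcome table (the counts below are hypothetical, not the
    study's data):

        from math import exp, log, sqrt

        a, b = 40, 60   # high cumulative keystrokes: CTS yes / CTS no
        c, d = 20, 80   # low  cumulative keystrokes: CTS yes / CTS no

        odds_ratio = (a * d) / (b * c)
        se = sqrt(1/a + 1/b + 1/c + 1/d)       # SE of log(OR) (Woolf method)
        lo = exp(log(odds_ratio) - 1.96 * se)
        hi = exp(log(odds_ratio) + 1.96 * se)
        print(f"OR = {odds_ratio:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")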

  11. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  12. On the computational content of the Axiom of Choice

    NARCIS (Netherlands)

    Berardi, S.; Bezem, M.A.; Coquand, T.

    We present a possible computational content of the negative translation of classical analysis with the Axiom of Choice. Our interpretation seems computationally more direct than the one based on Gödel's Dialectica interpretation [10,18]. Interestingly, this interpretation uses a refinement of the

  13. Occupational stress in human computer interaction.

    Science.gov (United States)

    Smith, M J; Conway, F T; Karsh, B T

    1999-04-01

    There have been a variety of research approaches that have examined the stress issues related to human computer interaction, including laboratory studies, cross-sectional surveys, longitudinal case studies and intervention studies. A critical review of these studies indicates that there are important physiological, biochemical, somatic and psychological indicators of stress that are related to work activities where human computer interaction occurs. Many of the stressors of human computer interaction at work are similar to those stressors that have historically been observed in other automated jobs. These include high workload, high work pressure, diminished job control, inadequate employee training to use new technology, monotonous tasks, poor supervisory relations, and fear for job security. New stressors have emerged that can be tied primarily to human computer interaction. These include technology breakdowns, technology slowdowns, and electronic performance monitoring. The effects of the stress of human computer interaction in the workplace are increased physiological arousal; somatic complaints, especially of the musculoskeletal system; mood disturbances, particularly anxiety, fear and anger; and diminished quality of working life, such as reduced job satisfaction. Interventions to reduce the stress of computer technology have included improved technology implementation approaches and increased employee participation in implementation. Recommendations for ways to reduce the stress of human computer interaction at work are presented. These include proper ergonomic conditions, increased organizational support, improved job content, proper workload to decrease work pressure, and enhanced opportunities for social support. A model approach to the design of human computer interaction at work that focuses on the system "balance" is proposed.

  14. Entertainment computing, social transformation and the quantum field

    OpenAIRE

    Rauterberg, G.W.M.; Nijholt, A.; Reidsma, D.; Hondorp, H.

    2009-01-01

    Entertainment computing is on its way to becoming an established academic discipline. The scope of entertainment computing is quite broad (see the scope of the international journal Entertainment Computing). One unifying idea in this diverse community of entertainment researchers and developers might be a normative position to enhance human living through social transformation. One possible option in this direction is a shared 'conscious' field. Several ideas about a new kind of field based on qu...

  15. Computers and Information Flow.

    Science.gov (United States)

    Patrick, R. L.

    This paper is designed to fill the need for an easily understood introduction to the computing and data processing field for the layman who has, or can expect to have, some contact with it. Information provided includes the unique terminology and jargon of the field, the various types of computers and the scope of computational capabilities, and…

  16. Administrative Computing in Continuing Education.

    Science.gov (United States)

    Broxton, Harry

    1982-01-01

    Describes computer applications in the Division of Continuing Education at Brigham Young University. These include instructional applications (computer assisted instruction, computer science education, and student problem solving) and administrative applications (registration, payment records, grades, reports, test scoring, mailing, and others).…

  17. Nonadiabatic holonomic quantum computation using Rydberg blockade

    Science.gov (United States)

    Kang, Yi-Hao; Chen, Ye-Hong; Shi, Zhi-Cheng; Huang, Bi-Hua; Song, Jie; Xia, Yan

    2018-04-01

    In this paper, we propose a scheme for realizing nonadiabatic holonomic computation assisted by two atoms and shortcuts to adiabaticity (STA). The blockade effect induced by the strong Rydberg-mediated interaction between two Rydberg atoms provides us with the possibility of simplifying the dynamics of the system, and the STA helps us design pulses for implementing the holonomic computation with high fidelity. Numerical simulations show the scheme is noise-immune and decoherence-resistant. Therefore, the current scheme may provide some useful perspectives for realizing nonadiabatic holonomic computation.

  18. Generalized Bell-inequality experiments and computation

    Energy Technology Data Exchange (ETDEWEB)

    Hoban, Matty J. [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Department of Computer Science, University of Oxford, Wolfson Building, Parks Road, Oxford OX1 3QD (United Kingdom); Wallman, Joel J. [School of Physics, The University of Sydney, Sydney, New South Wales 2006 (Australia); Browne, Dan E. [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom)

    2011-12-15

    We consider general settings of Bell inequality experiments with many parties, where each party chooses from a finite number of measurement settings each with a finite number of outcomes. We investigate the constraints that Bell inequalities place upon the correlations possible in local hidden variable theories using a geometrical picture of correlations. We show that local hidden variable theories can be characterized in terms of limited computational expressiveness, which allows us to characterize families of Bell inequalities. The limited computational expressiveness for many settings (each with many outcomes) generalizes previous results about the many-party situation each with a choice of two possible measurements (each with two outcomes). Using this computational picture we present generalizations of the Popescu-Rohrlich nonlocal box for many parties and nonbinary inputs and outputs at each site. Finally, we comment on the effect of preprocessing on measurement data in our generalized setting and show that it becomes problematic outside of the binary setting, in that it allows local hidden variable theories to simulate maximally nonlocal correlations such as those of these generalized Popescu-Rohrlich nonlocal boxes.
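
    In the simplest (CHSH) setting, the local-hidden-variable bound that this
    computational picture generalizes can be verified by brute force over all
    deterministic local strategies (an illustration, not code from the paper):

        from itertools import product

        # Alice outputs a[x], Bob outputs b[y]; outcomes are +/-1
        best = 0
        for a0, a1, b0, b1 in product([1, -1], repeat=4):
            a, b = (a0, a1), (b0, b1)
            s = a[0]*b[0] + a[0]*b[1] + a[1]*b[0] - a[1]*b[1]
            best = max(best, s)
        print("LHV bound:", best)   # 2, the CHSH inequality

    A Popescu-Rohrlich box has correlators E(x, y) = (-1)^(x*y) and so reaches
    S = 4, the maximum allowed by no-signalling alone.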

  19. Generalized Bell-inequality experiments and computation

    International Nuclear Information System (INIS)

    Hoban, Matty J.; Wallman, Joel J.; Browne, Dan E.

    2011-01-01

    We consider general settings of Bell inequality experiments with many parties, where each party chooses from a finite number of measurement settings each with a finite number of outcomes. We investigate the constraints that Bell inequalities place upon the correlations possible in local hidden variable theories using a geometrical picture of correlations. We show that local hidden variable theories can be characterized in terms of limited computational expressiveness, which allows us to characterize families of Bell inequalities. The limited computational expressiveness for many settings (each with many outcomes) generalizes previous results about the many-party situation each with a choice of two possible measurements (each with two outcomes). Using this computational picture we present generalizations of the Popescu-Rohrlich nonlocal box for many parties and nonbinary inputs and outputs at each site. Finally, we comment on the effect of preprocessing on measurement data in our generalized setting and show that it becomes problematic outside of the binary setting, in that it allows local hidden variable theories to simulate maximally nonlocal correlations such as those of these generalized Popescu-Rohrlich nonlocal boxes.

  20. Strictly contractive quantum channels and physically realizable quantum computers

    International Nuclear Information System (INIS)

    Raginsky, Maxim

    2002-01-01

    We study the robustness of quantum computers under the influence of errors modeled by strictly contractive channels. A channel T is defined to be strictly contractive if, for any pair of density operators ρ, σ in its domain, ||Tρ − Tσ||₁ ≤ k ||ρ − σ||₁ for some 0 ≤ k < 1 (||·||₁ denotes the trace norm). In other words, strictly contractive channels render the states of the computer less distinguishable in the sense of quantum detection theory. Starting from the premise that all experimental procedures can be carried out with finite precision, we argue that there exists a physically meaningful connection between strictly contractive channels and errors in physically realizable quantum computers. We show that, in the absence of error correction, sensitivity of quantum memories and computers to strictly contractive errors grows exponentially with storage time and computation time, respectively, and depends only on the constant k and the measurement precision. We prove that strict contractivity rules out the possibility of perfect error correction, and give an argument that approximate error correction, which covers previous work on fault-tolerant quantum computation as a special case, is possible
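
    A quick numerical illustration (not from the paper): the qubit
    depolarizing channel T(ρ) = (1 - p)ρ + p·I/2 is strictly contractive with
    k = 1 - p, since T(ρ) - T(σ) = (1 - p)(ρ - σ).

        import numpy as np

        def trace_norm(m):
            # sum of singular values equals the trace norm
            return np.sum(np.linalg.svd(m, compute_uv=False))

        def depolarize(rho, p):
            return (1 - p) * rho + p * np.eye(2) / 2

        rng = np.random.default_rng(0)
        def random_pure_state():
            v = rng.normal(size=2) + 1j * rng.normal(size=2)
            v /= np.linalg.norm(v)
            return np.outer(v, v.conj())

        p = 0.3
        rho, sigma = random_pure_state(), random_pure_state()
        ratio = (trace_norm(depolarize(rho, p) - depolarize(sigma, p))
                 / trace_norm(rho - sigma))
        print(ratio, "<= k =", 1 - p)   # ratio equals 1 - p exactly here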

  1. Wearable computing from modeling to implementation of wearable systems based on body sensor networks

    CERN Document Server

    Fortino, Giancarlo; Galzarano, Stefano

    2018-01-01

    This book provides the most up-to-date research and development on wearable computing, wireless body sensor networks, wearable systems integrated with mobile computing, wireless networking and cloud computing. This book has a specific focus on advanced methods for programming Body Sensor Networks (BSNs) based on the reference SPINE project. It features an on-line website (http://spine.deis.unical.it) to support readers in developing their own BSN application/systems and covers new emerging topics on BSNs such as collaborative BSNs, BSN design methods, autonomic BSNs, integration of BSNs and pervasive environments, and integration of BSNs with cloud computing. The book provides a description of real BSN prototypes with the possibility to see on-line demos and download the software to test them on specific sensor platforms and includes case studies for more practical applications. * Provides a future roadmap by learning advanced technology and open research issues * Gathers the background knowledge to tackl...

  2. Noise thresholds for optical quantum computers.

    Science.gov (United States)

    Dawson, Christopher M; Haselgrove, Henry L; Nielsen, Michael A

    2006-01-20

    In this Letter we numerically investigate the fault-tolerant threshold for optical cluster-state quantum computing. We allow both photon loss noise and depolarizing noise (as a general proxy for all local noise), and obtain a threshold region of allowed pairs of values for the two types of noise. Roughly speaking, our results show that scalable optical quantum computing is possible for photon loss probabilities < 3 × 10⁻³ and for depolarization probabilities < 10⁻⁴.

  3. Quantum Genetic Algorithms for Computer Scientists

    OpenAIRE

    Lahoz Beltrá, Rafael

    2016-01-01

    Genetic algorithms (GAs) are a class of evolutionary algorithms inspired by Darwinian natural selection. They are popular heuristic optimisation methods based on simulated genetic mechanisms, i.e., mutation, crossover, etc. and population dynamical processes such as reproduction, selection, etc. Over the last decade, the possibility to emulate a quantum computer (a computer using quantum-mechanical phenomena to perform operations on data) has led to a new class of GAs known as “Quantum Geneti...

  4. Computational colour science using MATLAB

    CERN Document Server

    Westland, Stephen; Cheung, Vien

    2012-01-01

    Computational Colour Science Using MATLAB 2nd Edition offers a practical, problem-based approach to colour physics. The book focuses on the key issues encountered in modern colour engineering, including efficient representation of colour information, Fourier analysis of reflectance spectra and advanced colorimetric computation. Emphasis is placed on the practical applications rather than the techniques themselves, with material structured around key topics. These topics include colour calibration of visual displays, computer recipe prediction and models for colour-appearance prediction. Each t

  5. On possible technologies for creating nanostructured coatings

    International Nuclear Information System (INIS)

    Blednova, Zh.M.; Chaevskij, M.I.; Rusinov, P.O.

    2008-01-01

    Possible technologies for forming nanostructured coatings are considered: a method of thermal mass transfer under a high temperature gradient; a combined method including cathode-plasma nitriding at low pressure and deposition of titanium nitride in a single work cycle; and a combined method including high-frequency ion nitriding and deposition of chromium carbide by pyrolysis of chromium organic compounds in the plasma of a decaying discharge. The possibility of forming layered nanostructured coatings is shown.

  6. Open Compute Project at CERN

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The Open Compute Project, OCP (http://www.opencompute.org/), was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware designs, with the goal of developing servers and data centers following the model traditionally associated with open-source software projects. We have been following the OCP project for some time and decided to buy two OCP twin servers in 2013 to get some hands-on experience. The servers have been tested and compared with our standard hardware regularly acquired through large tenders. In this presentation we will give some relevant results from this testing and also discuss some of the more important differences that can matter for a larger deployment at CERN. Finally, we will outline the details of a possible project for a larger deployment of OCP hardware for production use at CERN.

  7. Is it possible to design universal multi-phase flow analyzer?

    International Nuclear Information System (INIS)

    Ivanov Kolev, N.

    2005-01-01

    Transient 3D multiphase flows consisting of many chemical constituents are the common case of flows in nature and technology (Figs. 1 and 2). In many technical applications we have to do with particular realizations of multi-phase flows, like steady-state flows, single-component flows or single-phase flows, etc. Engineers and scientists have created hundreds of computer codes for the description of more or less specific realizations of multi-phase flows. If one compares the structure of these codes, one is astonished by the waste of human resources on programming repeated model elements such as equations of state, friction laws in a variety of geometries, heat transfer coefficients, mathematical equation solvers, data handling procedures, graphical environments, etc. It is hardly to be expected that the best solution for a specific sub-phenomenon is available in all codes. Looking at other branches of technology, like computer chip production, we realize that the revolutionary idea of having common ''chips'' within complex applications is very far from practical realization in computational multi-phase fluid dynamics. Following this line of argument, I have expressed several times in my publications, explicitly or implicitly, the idea that it is possible to create a universal multi-phase flow analyzer, in the sense of a computer code architecture capable of absorbing the adequate multi-phase knowledge data base specified in Appendix 1. The subject of this paper is to summarize some of the main ideas on the way to creating such a computer code architecture, some of them already realized by this author, to illustrate how they work, and to give an outlook on the challenges in future developments. We deliberately confine our attention to the solution of the so-called local volume- and time-averaged system of PDEs for a simple reason: direct numerical resolution of interacting fluids is possible, as demonstrated for small scales by many researchers, but for

  8. Modern computer hardware and the role of central computing facilities in particle physics

    International Nuclear Information System (INIS)

    Zacharov, V.

    1981-01-01

    Important recent changes in the hardware technology of computer system components are reviewed, and the impact of these changes assessed on the present and future pattern of computing in particle physics. The place of central computing facilities is particularly examined, to answer the important question as to what, if anything, should be their future role. Parallelism in computing system components is considered to be an important property that can be exploited with advantage. The paper includes a short discussion of the position of communications and network technology in modern computer systems. (orig.)

  9. Updates and solutions to the 21st century computer virus scourge ...

    African Journals Online (AJOL)

    The computer virus scourge continues to be a problem that the Information Technology (IT) industries must address. A computer virus is malicious program code which can replicate and spread infections to a large number of possible hosts, causing damage to computer programs, files, databases and data in general.

  10. Accelerating statistical image reconstruction algorithms for fan-beam x-ray CT using cloud computing

    Science.gov (United States)

    Srivastava, Somesh; Rao, A. Ravishankar; Sheinin, Vadim

    2011-03-01

    Statistical image reconstruction algorithms potentially offer many advantages to x-ray computed tomography (CT), e.g. lower radiation dose. But, their adoption in practical CT scanners requires extra computation power, which is traditionally provided by incorporating additional computing hardware (e.g. CPU-clusters, GPUs, FPGAs etc.) into a scanner. An alternative solution is to access the required computation power over the internet from a cloud computing service, which is orders-of-magnitude more cost-effective. This is because users only pay a small pay-as-you-go fee for the computation resources used (i.e. CPU time, storage etc.), and completely avoid purchase, maintenance and upgrade costs. In this paper, we investigate the benefits and shortcomings of using cloud computing for statistical image reconstruction. We parallelized the most time-consuming parts of our application, the forward and back projectors, using MapReduce, the standard parallelization library on clouds. From preliminary investigations, we found that a large speedup is possible at a very low cost. But, communication overheads inside MapReduce can limit the maximum speedup, and a better MapReduce implementation might become necessary in the future. All the experiments for this paper, including development and testing, were completed on the Amazon Elastic Compute Cloud (EC2) for less than $20.
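
    The authors' implementation is not shown in the abstract, but the reason
    back projection maps naturally onto MapReduce can be sketched: each mapper
    back-projects an independent chunk of angles and the reducer sums the
    partial images (here multiprocessing stands in for the cloud):

        import numpy as np
        from multiprocessing import Pool

        N, ANGLES = 64, 60
        sino = np.ones((ANGLES, N))      # stand-in sinogram

        def backproject_chunk(angle_indices):
            img = np.zeros((N, N))
            ys, xs = np.mgrid[0:N, 0:N] - N / 2
            for k in angle_indices:
                theta = np.pi * k / ANGLES
                t = xs * np.cos(theta) + ys * np.sin(theta) + N / 2
                idx = np.clip(t.astype(int), 0, N - 1)
                img += sino[k][idx]      # smear each ray across the image
            return img

        if __name__ == "__main__":
            chunks = np.array_split(np.arange(ANGLES), 4)   # "map" phase
            with Pool(4) as pool:
                partials = pool.map(backproject_chunk, chunks)
            image = sum(partials) / ANGLES                  # "reduce" phase
            print(image.shape, float(image.mean()))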

  11. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  12. Computer tomography in the diagnosis of liver diseases

    International Nuclear Information System (INIS)

    Petkov, D.; Zhelyazkov, S.; Nedelkov, G.

    1983-01-01

    The modern achievements in the clinical study and diagnosis of liver diseases have been closely associated with the introduction of whole-body computer tomography (CT) into practice. The diagnostic possibilities of the method derive from its high contrast and spatial resolution. Visualization of focal lesions depends on their size and on the difference between their densitometric values and those of the normal parenchyma. The advantages of computer tomography in the diagnosis of liver diseases are discussed. They stem from the possibility of densitometric analysis of the pathologic changes, which opens the way to a probable qualitative diagnosis. Diffuse processes in the liver are a relative indication for performing computer tomography. Examination under conditions of contrast enhancement is indicated in cases when the nature of the lesion has to be specified and a ''negative'' result does not concur with the clinical manifestations. (authors)

  13. The challenge of networked enterprises for cloud computing interoperability

    OpenAIRE

    Mezgár, István; Rauschecker, Ursula

    2014-01-01

    Manufacturing enterprises have to organize themselves into effective system architectures, forming different types of Networked Enterprises (NE) to match fast-changing market demands. Cloud Computing (CC) is an important up-to-date computing concept for NE, as it offers significant financial and technical advantages besides high-level collaboration possibilities. As cloud computing is a new concept, the solutions for handling interoperability, portability, security, privacy and standardization c...

  14. Development of computer models for fuel element behaviour in water reactors

    International Nuclear Information System (INIS)

    Gittus, J.H.

    1987-03-01

    Description of fuel behaviour during normal operation, transients and accident conditions has always represented a most challenging and important problem. Reliable predictions constitute a basic demand for safety-based calculations, for design purposes and for fuel performance. Therefore, computer codes based on deterministic and probabilistic models were developed. A comprehensive description of the phenomena is precluded by the great number of individual processes, involving physical, chemical, thermal-hydraulic and mechanical parameters, to be considered in a wide range of situations. In the case of fast thermal transients, predictive capability is limited by the kinetics of evolution of the system and its eventual dynamic behaviour. Evidently, probabilistic approaches are also limited by the sparsity and limited breadth of the empirical data base. Code predictions have to be evaluated against power reactor data and results from simulation experiments and, if possible, include cross-validation of different codes and validation of sub-models. Progress on this subject is reviewed in this report, which completes the co-ordinated research programme on 'Development of Computer Models for Fuel Element Behaviour in Water Reactors' (D-COM), initiated under the auspices of the IAEA in 1981

  15. Enhanced delegated computing using coherence

    Science.gov (United States)

    Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.

    2016-03-01

    A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m. The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.
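
    The classical flavour of the client's XOR-only role can be sketched with one-time-pad masking: a client limited to XOR hides its bits under random masks, the server computes on the masked data, and the client unmasks the result. This sketch covers only XOR-linear functions; the universality result above relies on quantum coherence, which a purely classical sketch cannot reproduce, and all names here are illustrative:

        import secrets

        def client_mask(bits):
            masks = [secrets.randbelow(2) for _ in bits]   # client's secret pad
            masked = [b ^ r for b, r in zip(bits, masks)]  # only XOR is needed
            return masked, masks

        def server_xor_all(masked_bits):
            # The server never sees the true inputs, only masked bits.
            acc = 0
            for m in masked_bits:
                acc ^= m
            return acc

        def client_unmask(result, masks):
            for r in masks:
                result ^= r                                # again, only XOR
            return result

        data = [1, 0, 1, 1]
        masked, masks = client_mask(data)
        assert client_unmask(server_xor_all(masked), masks) == (1 ^ 0 ^ 1 ^ 1)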

  16. Current puzzles and future possibilities

    International Nuclear Information System (INIS)

    Nagamiya, S.

    1982-02-01

    Four current puzzles and several future experimental possibilities in high-energy nuclear collision research are discussed. These puzzles are (1) entropy, (2) hydrodynamic flow, (3) anomalon, and (4) particle emission at backward angles in proton-nucleus collisions. The last one seems not to be directly related to the subject of the present school. But it is, because particle emission into the region far beyond the nucleon-nucleon kinematical limit is an interesting subject common for both proton-nucleus and nucleus-nucleus collisions, and the basic mechanism involved is strongly related in these two cases. Future experimental possibilities are described which include: (1) possibilities of studying multibaryonic excited states, (2) applications of neutron-rich isotopes, and (3) other needed experimental tasks. 72 references

  17. Impact of computer use on children's vision.

    Science.gov (United States)

    Kozeis, N

    2009-10-01

    Today, millions of children use computers on a daily basis. Extensive viewing of the computer screen can lead to eye discomfort, fatigue, blurred vision and headaches, dry eyes and other symptoms of eyestrain. These symptoms may be caused by poor lighting, glare, an improper work station set-up, vision problems of which the person was not previously aware, or a combination of these factors. Children can experience many of the same symptoms related to computer use as adults. However, some unique aspects of how children use computers may make them more susceptible than adults to the development of these problems. In this study, the most common eye symptoms related to computer use in childhood, the possible causes and ways to avoid them are reviewed.

  18. Effects of Computer Course on Computer Self-Efficacy, Computer Attitudes and Achievements of Young Individuals in Siirt, Turkey

    Science.gov (United States)

    Çelik, Halil Coskun

    2015-01-01

    The purpose of this study is to investigate the effects of computer courses on young individuals' computer self-efficacy, attitudes and achievement. The study group of this research included 60 unemployed young individuals (18-25 ages) in total; 30 in the experimental group and 30 in the control group. An experimental research model with pretest…

  19. Some algorithms for the solution of the symmetric eigenvalue problem on a multiprocessor electronic computer

    International Nuclear Information System (INIS)

    Molchanov, I.N.; Khimich, A.N.

    1984-01-01

    This article shows how a reflection method can be used to find the eigenvalues of a matrix by transforming the matrix to tridiagonal form. The method of conjugate gradients is used to find the smallest eigenvalue and the corresponding eigenvector of symmetric positive-definite band matrices. Topics considered include the computational scheme of the reflection method, the organization of parallel calculations by the reflection method, the computational scheme of the conjugate gradient method, the organization of parallel calculations by the conjugate gradient method, and the effectiveness of parallel algorithms. It is concluded that the overall effectiveness of multiprocessor computers can be increased either by assigning newly freed processors to a new problem in multiprocessor mode or by improving the uniformity with which the original data are partitioned.
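
    A serial sketch of the two building blocks named above, assuming dense NumPy matrices (the article's concern is parallelizing these steps for band matrices); plain gradient descent on the Rayleigh quotient stands in here for the article's conjugate-gradient scheme:

        import numpy as np

        def householder_tridiagonalize(A):
            # Reduce a symmetric matrix to tridiagonal form with reflections.
            A = A.astype(float).copy()
            n = A.shape[0]
            for k in range(n - 2):
                x = A[k + 1:, k]
                v = x.copy()
                v[0] += np.copysign(np.linalg.norm(x), x[0])
                nv = np.linalg.norm(v)
                if nv == 0.0:
                    continue
                v /= nv
                H = np.eye(n)
                H[k + 1:, k + 1:] -= 2.0 * np.outer(v, v)
                A = H @ A @ H          # reflections preserve symmetry
            return A

        def smallest_eigenpair(A, iters=2000):
            # Minimize the Rayleigh quotient x.A.x / x.x over unit vectors.
            x = np.random.default_rng(0).standard_normal(A.shape[0])
            x /= np.linalg.norm(x)
            step = 0.5 / np.linalg.norm(A, 2)
            for _ in range(iters):
                lam = x @ A @ x
                g = 2.0 * (A @ x - lam * x)   # Rayleigh-quotient gradient
                x -= step * g
                x /= np.linalg.norm(x)
            return x @ A @ x, x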

  20. A new computing principle

    International Nuclear Information System (INIS)

    Fatmi, H.A.; Resconi, G.

    1988-01-01

    In 1954 while reviewing the theory of communication and cybernetics the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers it was found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle

  1. Computer Assisted Language Learning. Routledge Studies in Computer Assisted Language Learning

    Science.gov (United States)

    Pennington, Martha

    2011-01-01

    Computer-assisted language learning (CALL) is an approach to language teaching and learning in which computer technology is used as an aid to the presentation, reinforcement and assessment of material to be learned, usually including a substantial interactive element. This book provides an up-to-date and comprehensive overview of…

  2. Ancilla-driven quantum computation for qudits and continuous variables

    Science.gov (United States)

    Proctor, Timothy; Giulian, Melissa; Korolkova, Natalia; Andersson, Erika; Kendon, Viv

    2017-05-01

    Although qubits are the leading candidate for the basic elements in a quantum computer, there are also a range of reasons to consider using higher-dimensional qudits or quantum continuous variables (QCVs). In this paper, we use a general "quantum variable" formalism to propose a method of quantum computation in which ancillas are used to mediate gates on a well-isolated "quantum memory" register and which may be applied to the setting of qubits, qudits (for d > 2), or QCVs. More specifically, we present a model in which universal quantum computation may be implemented on a register using only repeated applications of a single fixed two-body ancilla-register interaction gate, ancillas prepared in a single state, and local measurements of these ancillas. In order to maintain determinism in the computation, adaptive measurements via a classical feed forward of measurement outcomes are used, with the method similar to that in measurement-based quantum computation (MBQC). We show that our model has the same hybrid quantum-classical processing advantages as MBQC, including the power to implement any Clifford circuit in essentially one layer of quantum computation. In some physical settings, high-quality measurements of the ancillas may be highly challenging or not possible, and hence we also present a globally unitary model which replaces the need for measurements of the ancillas with the requirement for ancillas to be prepared in states from a fixed orthonormal basis. Finally, we discuss settings in which these models may be of practical interest.

  3. Assessing Practical Skills in Physics Using Computer Simulations

    Science.gov (United States)

    Walsh, Kevin

    2018-01-01

    Computer simulations have been used very effectively for many years in the teaching of science but the focus has been on cognitive development. This study, however, is an investigation into the possibility that a student's experimental skills in the real-world environment can be judged via the undertaking of a suitably chosen computer simulation…

  4. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  5. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  6. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  7. Preface to Computational Humor 2012

    NARCIS (Netherlands)

    Nijholt, Antinus; Nijholt, A.

    2012-01-01

    Like its predecessors in 1996 (University of Twente, the Netherlands) and 2002 (ITC-irst, Trento, Italy), this Third International Workshop on Computational Humor (IWCH 2012) focuses on the possibility of finding algorithms that allow the understanding and generation of humor. There is the general aim of

  8. A technique of including the effect of aging of passive components in probabilistic risk assessments

    International Nuclear Information System (INIS)

    Phillips, J.H.; Weidenhamer, G.H.

    1992-01-01

    The probabilistic risk assessments (PRAs) being developed at most nuclear power plants to calculate the risk of core damage generally focus on the possible failure of active components; the possible failure of passive components is given little consideration. We are developing methods for selecting risk-significant passive components and including them in PRAs. These methods provide effective ways to prioritize passive components for inspection, and where inspection reveals aging damage, mitigation or repair can be employed to reduce the likelihood of component failure. We demonstrated one such method by selecting a weld in the auxiliary feedwater (AFW) system, basing our selection on expert judgement of the likelihood of failure and on an estimate of the consequence of component failure to plant safety. We then modified and used the Piping Reliability Analysis Including Seismic Events (PRAISE) computer code to perform a probabilistic structural analysis to calculate the probability that crack growth due to aging would cause the weld to fail. The PRAISE code was modified to include the effects of changing design material properties with age and changing stress cycles. The calculation included the effects of mechanical loads and thermal transients typical of the service loads for this piping design, and the effects of thermal cycling caused by a leaking check valve. This particular calculation showed little change in the (low) component failure probability and plant risk over 48 years of service; sensitivity studies, however, showed that if the probability of component failure is high, the effect on plant risk is significant. The success of this demonstration shows that the method could be applied to nuclear power plants. It also showed that the method is too involved (PRAISE takes a long time to perform the calculation and the input information is extensive) for handling a large number of passive components, and therefore simpler methods are needed
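
    The kind of calculation described above can be illustrated with a hedged Monte Carlo sketch: sample initial crack depths, grow them with a Paris-law model over service cycles, and estimate the probability that a crack exceeds a critical depth. This is not the PRAISE code, and every parameter value below is invented purely for illustration:

        import numpy as np

        rng = np.random.default_rng(42)

        def weld_failure_probability(n_samples=100_000, years=48,
                                     cycles_per_year=300, C=1e-12, m=3.0,
                                     dk_factor=15.0, a_crit=0.02):
            # Initial crack depths in metres, lognormally distributed (toy prior).
            a = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n_samples)
            for _ in range(years):
                dK = dk_factor * np.sqrt(np.pi * a)     # toy stress intensity range
                a = a + cycles_per_year * C * dK**m     # Paris-law growth per year
            return float(np.mean(a >= a_crit))          # fraction of failed welds

        print(f"P(weld failure over 48 y) ~ {weld_failure_probability():.2e}")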

  9. Can the possibility of transverse iliosacral screw fixation for first sacral segment be predicted preoperatively? Results of a computational cadaveric study.

    Science.gov (United States)

    Jeong, Jin-Hoon; Jin, Jin Woo; Kang, Byoung Youl; Jung, Gu-Hee

    2017-10-01

    The purpose of this study was to predict the possibility of transverse iliosacral (TIS) screw fixation into the first sacral segment (S1) and introduce practical anatomical variables using conventional computed tomography (CT) scans. A total of 82 cadaveric sacra (42 males and 40 females) were used for continuous 1.0-mm slice CT scans, which were imported into Mimics® software to produce a three-dimensional pelvis model. The anterior height (BH) and superior width (BW) of the elevated sacral segment were measured, followed by verification of the safe zones (SZS1 and SZS2) in a true lateral view. Their vertical (VDS1 and VDS2) and horizontal (HDS1 and HDS2) distances were measured. A VDS1 of less than 7 mm was classified as an impossible sacrum, since a 7.0-mm IS screw could not then be fixed transversely in safety. Fourteen models (16.7%; six females, eight males) were assigned as impossible sacra. There was no statistical significance regarding gender (p=0.626) or height (p=0.419). The average values were as follows: BW, 31.4 mm (SD 2.9); BH, 16.7 mm (SD 6.8); VDS1, 13.4 mm (SD 6.1); HDS1, 22.5 mm (SD 4.5); SZS1, 239.5 mm² (SD 137.1); VDS2, 15.5 mm (SD 3.0); HDS2, 18.3 mm (SD 2.9); and SZS2, 221.1 mm² (SD 68.5). Logistic regression analysis identified BH (p=0.001) and HDS1 (p=0.02) as the only statistically significant variables for predicting the possibility. Receiver operating characteristic curve analysis established cut-off values for the BH and HDS1 of an impossible sacrum of 20.6 mm and 18.6 mm, respectively. BH and HDS1 could thus be used to predict the possibility of TIS screw fixation: if the BH exceeds 20.6 mm or the HDS1 is less than 18.6 mm, TIS screw fixation for S1 should not be undertaken because of the narrowed safe zone. Copyright © 2017 Elsevier Ltd. All rights reserved.
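
    The reported cut-offs translate directly into a screening rule; a minimal sketch encoding the thresholds from the abstract (illustrative only, not clinical software):

        def tis_screw_feasible_s1(bh_mm: float, hd_s1_mm: float) -> bool:
            """True if transverse iliosacral screw fixation at S1 looks feasible."""
            if bh_mm > 20.6 or hd_s1_mm < 18.6:
                return False   # safe zone likely too narrow for a 7.0-mm screw
            return True

        # Mean values reported above (BH 16.7 mm, HDS1 22.5 mm) pass the rule.
        print(tis_screw_feasible_s1(bh_mm=16.7, hd_s1_mm=22.5))   # True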

  10. Community Cloud Computing

    Science.gov (United States)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  11. Bacterial computing with engineered populations.

    Science.gov (United States)

    Amos, Martyn; Axmann, Ilka Maria; Blüthgen, Nils; de la Cruz, Fernando; Jaramillo, Alfonso; Rodriguez-Paton, Alfonso; Simmel, Friedrich

    2015-07-28

    We describe strategies for the construction of bacterial computing platforms, presenting a number of results from the recently completed 'bacterial computing with engineered populations' project. In general, the implementation of such systems requires a framework containing various components such as intracellular circuits, single cell input/output and cell-cell interfacing, as well as extensive analysis. In this overview paper, we describe our approach to each of these, and suggest possible areas for future research. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  12. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  13. Volunteered Cloud Computing for Disaster Management

    Science.gov (United States)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster management relies increasingly on interpreting earth observations and running numerical models, which require significant computing capacity, usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects
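
    A minimal sketch of the work-distribution pattern behind volunteered computing, assuming an in-process task queue standing in for the Internet-facing protocol (all names are illustrative; the platform described above is not a public API):

        from concurrent.futures import ThreadPoolExecutor
        import queue

        tasks = queue.Queue()
        for tile_id in range(100):       # e.g. 100 image tiles to interpret
            tasks.put(tile_id)

        def volunteer(worker_id):
            # Each volunteer pulls subtasks until none remain.
            done = []
            while True:
                try:
                    tile = tasks.get_nowait()
                except queue.Empty:
                    return done
                done.append((worker_id, tile))   # stand-in for the real work

        with ThreadPoolExecutor(max_workers=8) as pool:   # 8 "volunteered" cores
            futures = [pool.submit(volunteer, w) for w in range(8)]
            results = [f.result() for f in futures]

        assert sum(len(r) for r in results) == 100        # every subtask done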

  14. An introduction to quantum computing algorithms

    CERN Document Server

    Pittenger, Arthur O

    2000-01-01

    In 1994 Peter Shor [65] published a factoring algorithm for a quantum computer that finds the prime factors of a composite integer N more efficiently than is possible with the known algorithms for a classical computer. Since the difficulty of the factoring problem is crucial for the security of a public key encryption system, interest (and funding) in quantum computing and quantum computation suddenly blossomed. Quantum computing had arrived. The study of the role of quantum mechanics in the theory of computation seems to have begun in the early 1980s with the publications of Paul Benioff [6], [7], who considered a quantum mechanical model of computers and the computation process. A related question was discussed shortly thereafter by Richard Feynman [35], who began from a different perspective by asking what kind of computer should be used to simulate physics. His analysis led him to the belief that with a suitable class of "quantum machines" one could imitate any quantum system.

  15. Computing quantum discord is NP-complete

    International Nuclear Information System (INIS)

    Huang, Yichen

    2014-01-01

    We study the computational complexity of quantum discord (a measure of quantum correlation beyond entanglement), and prove that computing quantum discord is NP-complete. Therefore, quantum discord is computationally intractable: the running time of any algorithm for computing quantum discord is believed to grow exponentially with the dimension of the Hilbert space so that computing quantum discord in a quantum system of moderate size is not possible in practice. As by-products, some entanglement measures (namely entanglement cost, entanglement of formation, relative entropy of entanglement, squashed entanglement, classical squashed entanglement, conditional entanglement of mutual information, and broadcast regularization of mutual information) and constrained Holevo capacity are NP-hard/NP-complete to compute. These complexity-theoretic results are directly applicable in common randomness distillation, quantum state merging, entanglement distillation, superdense coding, and quantum teleportation; they may offer significant insights into quantum information processing. Moreover, we prove the NP-completeness of two typical problems: linear optimization over classical states and detecting classical states in a convex set, providing evidence that working with classical states is generically computationally intractable. (paper)

  16. Effect of computer game playing on baseline laparoscopic simulator skills.

    Science.gov (United States)

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and no relationship between computer game playing and baseline performance on laparoscopic simulators has been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. Setting: a local high school, Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and for prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.

  17. Introduction to computer networking

    CERN Document Server

    Robertazzi, Thomas G

    2017-01-01

    This book gives a broad look at both fundamental networking technology and new areas that support it and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology such as wired and wireless networks, enabling technologies such as data centers, software defined networking, cloud and grid computing and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment makes this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  18. Scalable optical quantum computer

    International Nuclear Information System (INIS)

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr3+, regularly located in the lattice of the orthosilicate (Y2SiO5) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  19. Computer loss experience and predictions

    Science.gov (United States)

    Parker, Donn B.

    1996-03-01

    The types of losses organizations must anticipate have become more difficult to predict because of the eclectic nature of computers and data communications and the decrease in news media reporting of computer-related losses as they become commonplace. Total business crime is conjectured to be decreasing in frequency and increasing in loss per case as a result of increasing computer use. Computer crimes are probably increasing, however, as their share of the decreasing business crime rate grows. Ultimately all business crime will involve computers in some way, and we could see a decline of both together. The important information security measures in high-loss business crime generally concern controls over authorized people engaged in unauthorized activities. Such controls include authentication of users, analysis of detailed audit records, unannounced audits, segregation of development and production systems and duties, shielding the viewing of screens, and security awareness and motivation controls in high-value transaction areas. Computer crimes that involve highly publicized intriguing computer misuse methods, such as privacy violations, radio frequency emanations eavesdropping, and computer viruses, have been reported in waves that periodically have saturated the news media during the past 20 years. We must be able to anticipate such highly publicized crimes and reduce the impact and embarrassment they cause. On the basis of our most recent experience, I propose nine new types of computer crime to be aware of: computer larceny (theft and burglary of small computers), automated hacking (use of computer programs to intrude), electronic data interchange fraud (business transaction fraud), Trojan bomb extortion and sabotage (code secretly inserted into others' systems that can be triggered to cause damage), LANarchy (unknown equipment in use), desktop forgery (computerized forgery and counterfeiting of documents), information anarchy (indiscriminate use of

  20. Broadcasting a message in a parallel computer

    Science.gov (United States)

    Berg, Jeremy E [Rochester, MN]; Faraj, Ahmad A [Rochester, MN]

    2011-08-02

    Methods, systems, and products are disclosed for broadcasting a message in a parallel computer. The parallel computer includes a plurality of compute nodes connected together using a data communications network. The data communications network is optimized for point-to-point data communications and is characterized by at least two dimensions. The compute nodes are organized into at least one operational group of compute nodes for collective parallel operations of the parallel computer. One compute node of the operational group is assigned to be a logical root. Broadcasting a message in a parallel computer includes: establishing a Hamiltonian path along all of the compute nodes in at least one plane of the data communications network and in the operational group; and broadcasting, by the logical root to the remaining compute nodes, the logical root's message along the established Hamiltonian path.
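
    On a two-dimensional grid of nodes, one simple Hamiltonian path is a boustrophedon ("snake") walk; a minimal sketch of forwarding the root's message hop by hop along such a path (the patent's actual path construction may differ):

        def hamiltonian_snake(rows, cols):
            # Visit every (row, col) exactly once; consecutive entries are
            # grid neighbours, so each hop is a single network link.
            path = []
            for r in range(rows):
                cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
                path.extend((r, c) for c in cs)
            return path

        def broadcast(message, rows, cols):
            path = hamiltonian_snake(rows, cols)   # path[0] is the logical root
            inbox = {}
            for node in path:                      # each node receives, forwards
                inbox[node] = message
            return inbox

        inbox = broadcast("payload", 4, 6)
        assert len(inbox) == 24                    # all 24 nodes got the message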