WorldWideScience

Sample records for possibly including computer

  1. Infinite possibilities: Computational structures technology

    Science.gov (United States)

    Beam, Sherilee F.

    1994-12-01

    Computational Fluid Dynamics (or CFD) methods are very familiar to the research community. Even the general public has had some exposure to CFD images, primarily through the news media. However, very little attention has been paid to CST--Computational Structures Technology. Yet no important design can be completed without it. During the first half of this century, researchers only dreamed of designing and building structures on a computer. Today their dreams have become practical realities, as computational methods are used in all phases of design, fabrication and testing of engineering systems. Increasingly complex structures can now be built in ever shorter periods of time. Over the past four decades, computer technology has been developing, and early finite element methods have grown from small in-house programs into numerous commercial software packages. When coupled with advanced computing systems, they help engineers make dramatic leaps in designing and testing concepts. The goals of CST include: predicting how a structure will behave under actual operating conditions; designing and complementing other experiments conducted on a structure; investigating microstructural damage or chaotic, unpredictable behavior; helping material developers improve material systems; and serving as a useful tool in design optimization and sensitivity techniques. Applying CST to a structural problem requires five steps: (1) observe the specific problem; (2) develop a computational model for numerical simulation; (3) develop and assemble software and hardware for running the codes; (4) post-process and interpret the results; and (5) use the model to analyze and design the actual structure. Researchers in both industry and academia continue to make significant contributions to advance this technology with improvements in software, collaborative computing environments and supercomputing systems. As these environments and systems evolve, computational structures technology will
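
    As a deliberately tiny illustration of steps (2)-(4) above, the sketch below assembles and solves a linear finite element model of a one-dimensional elastic bar in Python. The geometry, material values, and load are invented for illustration and are not taken from the article.

```python
# Minimal sketch of CST steps (2)-(4): a 1D elastic bar under an axial tip
# load, discretized with linear finite elements. All values are assumptions.
import numpy as np

E, A, L, n_el = 210e9, 1e-4, 1.0, 10         # steel bar: modulus, area, length, elements
n_nodes = n_el + 1
h = L / n_el
K = np.zeros((n_nodes, n_nodes))

# Assemble the global stiffness matrix from identical 2x2 element matrices.
ke = (E * A / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_el):
    K[e:e+2, e:e+2] += ke

f = np.zeros(n_nodes)
f[-1] = 1000.0                               # 1 kN axial tip load

# Apply the fixed-end boundary condition at node 0 and solve K u = f.
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

# Post-process: the tip displacement should match the analytic value F*L/(E*A).
print(f"tip displacement: {u[-1]:.3e} m (analytic {1000.0*L/(E*A):.3e} m)")
```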

  2. Computing possibilities in the mid 1990s

    International Nuclear Information System (INIS)

    Nash, T.

    1988-09-01

    This paper describes the kind of computing resources it may be possible to make available for experiments in high energy physics in the mid and late 1990s. We outline some of the work going on today, particularly at Fermilab's Advanced Computer Program, that projects to the future. We attempt to define areas in which coordinated R and D efforts should prove fruitful to provide for on- and off-line computing in the SSC era. Because of the extraordinary components anticipated from industry, we can be optimistic even to the level of predicting million-VAX-equivalent on-line multiprocessor/data acquisition systems for SSC detectors. Managing this scale of computing will require a new approach to large hardware and software systems. 15 refs., 6 figs

  3. Possibilities of computer tomography in multiple sclerosis

    International Nuclear Information System (INIS)

    Vymazal, J.; Bauer, J.

    1983-01-01

    Computer tomography was performed in 41 patients with multiple sclerosis, the average age of the patients being 40.8 years. Native examinations were made in 17 patients, examinations with contrast medium in 19, and both methods were used in the examination of 5 patients. In 26 patients, i.e. in almost two-thirds, cerebral atrophy was found, in 11 of a severe type. In 9 patients atrophy affected only the hemispheres, in 16 also the stem and cerebellum. The stem and cerebellum only were affected in 1 patient. Hypodense foci were found in 21 patients, i.e. more than half of those examined. In 9 there were multiple foci. In most of the 19 examined patients the hypodense changes were in the hemispheres, and only in 2 in the cerebellum and brain stem. No hyperdense changes were detected. The value and possibilities of examination by computer tomography in multiple sclerosis are discussed. (author)

  4. A design of a computer complex including vector processors

    International Nuclear Information System (INIS)

    Asai, Kiyoshi

    1982-12-01

    We, members of the Computing Center of the Japan Atomic Energy Research Institute (JAERI), have been engaged for the past six years in research on the adaptability of vector processing to large-scale nuclear codes. The research has been done in collaboration with researchers and engineers of JAERI and a computer manufacturer. In this research, forty large-scale nuclear codes were investigated from the viewpoint of vectorization. Among them, twenty-six codes were actually vectorized and executed. As a result of the investigation, it is now estimated that about seventy percent of nuclear codes, and seventy percent of JAERI's total CPU time, are highly vectorizable. Based on the data obtained by the investigation, (1) currently vectorizable CPU time, (2) the necessary number of vector processors, (3) the manpower necessary for vectorization of nuclear codes, (4) the computing speed, memory size, number of parallel I/O paths, and size and speed of the I/O buffer of a vector processor suitable for our applications, and (5) the necessary software and operational policy for the use of vector processors are discussed, and finally (6) a computer complex including vector processors is presented in this report. (author)

  5. Computer simulation of forest fire and its possible usage

    International Nuclear Information System (INIS)

    Halada, L.; Weisenpacher, P.; Glasa, J.

    2005-01-01

    In this presentation the authors deal with computer modelling of forest fires and discuss its possible usage. Results of the modelling are compared with the real forest fire in the National Park Slovensky Raj (Slovak Paradise) in 2000

  6. Theoretical calculation possibilities of the computer code HAMMER

    International Nuclear Information System (INIS)

    Onusic Junior, J.

    1978-06-01

    With the aim of establishing the theoretical calculation possibilities of the computer code HAMMER, developed at Savannah River Laboratory, an analysis of critical cell assemblies of the kind utilized in PWR reactors is made. (L.F.S.) [pt

  7. The Impossibility of the Counterfactual Computation for all Possible Outcomes

    OpenAIRE

    Vaidman, Lev

    2006-01-01

    A recent proposal for counterfactual computation [Hosten et al., Nature 439, 949 (2006)] is analyzed. It is argued that the method does not provide counterfactual computation for all possible outcomes. The explanation involves a novel paradoxical feature of pre- and post-selected quantum particles: the particle can reach a certain location without being on the path that leads to this location.

  8. The NEA computer program library: a possible GDMS application

    International Nuclear Information System (INIS)

    Schuler, W.

    1978-01-01

    The NEA Computer Program Library maintains a series of eleven sequential computer files, used in linked applications for managing its stock of computer codes for nuclear reactor calculations, storing index and program abstract information, and administering its service to requesters. The high data redundancy between the files suggests that a database approach would be valid, and this paper suggests a possible 'schema' for a CODASYL GDMS

  9. Top 10 Threats to Computer Systems Include Professors and Students

    Science.gov (United States)

    Young, Jeffrey R.

    2008-01-01

    User awareness is growing in importance when it comes to computer security. Not long ago, keeping college networks safe from cyberattackers mainly involved making sure computers around campus had the latest software patches. New computer worms or viruses would pop up, taking advantage of some digital hole in the Windows operating system or in…

  10. Computing possible worlds in the history of modern astronomy

    Directory of Open Access Journals (Sweden)

    Osvaldo Pessoa Jr.

    2016-09-01

    http://dx.doi.org/10.5007/1808-1711.2016v20n1p117 As part of an ongoing study of causal models in the history of science, a counterfactual scenario in the history of modern astronomy is explored with the aid of computer simulations. After the definition of “linking advance”, a possible world involving technological antecedence is described, branching out in 1510, in which the telescope is invented 70 years before its actual construction, at the time at which Fracastoro actually built the first prototelescope. By using the principle of the closest possible world (PCP), we estimate that in this scenario the discovery of the elliptical orbit of Mars would be anticipated by only 28 years. The second part of the paper involves an estimate of the probability of the previous scenario, guided by the principle that the actual world is the mean (PAM) and using computer simulations to create possible worlds in which the time spans between advances are varied according to a gamma distribution function. Taking into account the importance of the use of the diaphragm for the invention of the telescope, the probability that the telescope would be built by 1538, for a branching time of 1510, is found to be smaller than 1%. The work shows that one of the important features of computational simulations in philosophy of science is to serve as a consistency check for the intuitions and speculations of the philosopher.
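
    A minimal sketch of the logic of the second experiment, under assumed parameters: draw the time span from one advance to the next from a gamma distribution and count the fraction of simulated possible worlds in which the telescope appears by the deadline. The mean span and shape parameter below are placeholders, not the paper's fitted values.

```python
# Monte Carlo estimate of P(follow-on advance by a deadline), with the span
# between advances gamma-distributed. Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
branch_year, deadline = 1510, 1538   # branching point and target year (from the abstract)
mean_span, shape = 98.0, 2.0         # hypothetical mean years between advances
scale = mean_span / shape            # gamma mean = shape * scale

spans = rng.gamma(shape, scale, size=1_000_000)
p = np.mean(branch_year + spans <= deadline)
print(f"P(telescope by {deadline} | branch at {branch_year}) = {p:.3%}")
```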

  11. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  12. 78 FR 1247 - Certain Electronic Devices, Including Wireless Communication Devices, Tablet Computers, Media...

    Science.gov (United States)

    2013-01-08

    ... Wireless Communication Devices, Tablet Computers, Media Players, and Televisions, and Components Thereof... devices, including wireless communication devices, tablet computers, media players, and televisions, and... wireless communication devices, tablet computers, media players, and televisions, and components thereof...

  13. Adult-onset photosensitivity: clinical significance and epilepsy syndromes including idiopathic (possibly genetic) photosensitive occipital epilepsy.

    Science.gov (United States)

    Koutroumanidis, Michalis; Tsirka, Vasiliki; Panayiotopoulos, Chrysostomos

    2015-09-01

    To evaluate the clinical associations of adult-onset photosensitivity, we studied the clinical and EEG data of patients who were referred due to a possible first seizure and who had a photoparoxysmal response on their EEG. Patients with clinical evidence of photosensitivity before the age of 20 were excluded. Of a total of 30 patients, four had acute symptomatic seizures, two had vasovagal syncope, and 24 were diagnosed with epilepsy. Nine of the 24 patients had idiopathic (genetic) generalized epilepsies and a predominantly generalized photoparoxysmal response, but also rare photically-induced seizures, while 15 had exclusively, or almost exclusively, reflex photically-induced occipital seizures with frequent secondary generalization and a posterior photoparoxysmal response. Other important differences included a significantly older age at seizure onset and a paucity of spontaneous interictal epileptic discharges in patients with photically-induced occipital seizures; only a quarter of these had occasional occipital spikes, in contrast to the idiopathic (genetic) generalized epilepsy patients with typically generalized epileptic discharges. On the other hand, both groups shared a positive family history of epilepsy, common seizure threshold modulators (such as tiredness and sleep deprivation), normal neurological examination and MRI, a generally benign course, and good response to valproic acid. We demonstrated that photosensitivity can first occur in adult life and manifest either as idiopathic (possibly genetic) photosensitive occipital epilepsy with secondary generalization, or as an EEG (and, less often, clinical) feature of idiopathic (genetic) generalized epilepsies. Identification of idiopathic photosensitive occipital epilepsy fills a diagnostic gap in adult first-seizure epileptology and is clinically important because of its good response to antiepileptic drug treatment and fair prognosis.

  14. Security issues of cloud computing environment in possible military applications

    OpenAIRE

    Samčović, Andreja B.

    2013-01-01

    The evolution of cloud computing over the past few years is potentially one of major advances in the history of computing and telecommunications. Although there are many benefits of adopting cloud computing, there are also some significant barriers to adoption, security issues being the most important of them. This paper introduces the concept of cloud computing; looks at relevant technologies in cloud computing; takes into account cloud deployment models and some military applications. Addit...

  15. Reforming Lao Teacher Education to Include Females and Ethnic Minorities--Exploring Possibilities and Constraints

    Science.gov (United States)

    Berge, Britt-Marie; Chounlamany, Kongsy; Khounphilaphanh, Bounchanh; Silfver, Ann-Louise

    2017-01-01

    This article explores possibilities and constraints for the inclusion of female and ethnic minority students in Lao education in order to provide education for all. Females and ethnic minorities have traditionally been disadvantaged in Lao education and reforms for the inclusion of these groups are therefore welcome. The article provides rich…

  16. Robust and Adaptive OMR System Including Fuzzy Modeling, Fusion of Musical Rules, and Possible Error Detection

    Directory of Open Access Journals (Sweden)

    Bloch Isabelle

    2007-01-01

    This paper describes a system for optical music recognition (OMR) for monophonic typeset scores. After clarifying the difficulties specific to this domain, we propose appropriate solutions at both the image analysis level and the high-level interpretation level. Thus, a recognition and segmentation method is designed that can deal with common printing defects and numerous symbol interconnections. Then, musical rules are modeled and integrated in order to make a consistent decision. This high-level interpretation step relies on the fuzzy sets and possibility framework, since it allows dealing with symbol variability, flexibility, and imprecision of music rules, and merging all these heterogeneous pieces of information. Other innovative features are the indication of potential errors and the possibility of applying learning procedures in order to gain robustness. Experiments conducted on a large database show that the proposed method constitutes an interesting contribution to OMR.

  17. On the problem of possibilities of X-ray computer tomography in the diagnosis of endophitic tumors of the stomach

    International Nuclear Information System (INIS)

    Gorshkov, A.N.; Akberov, R.F.

    1996-01-01

    The possibilities of X-ray computer tomography in the diagnosis of endophytic tumors of the stomach, including tumors of small size, are considered using examinations of 100 patients with stomach diseases. The computer-tomographic semiotics of small endophytic tumors of the stomach is presented, along with the place of computer tomography in the diagnosis of tumors of the stomach and its potential for revealing small tumors of the stomach with principally endophytic spreading. 10 refs.; 3 figs

  18. Possible Simple Structures of the Universe to Include General Relativity Effects

    Directory of Open Access Journals (Sweden)

    Corneliu BERBENTE

    2017-12-01

    General relativity describes the properties of the universe, with gravity playing a fundamental role. One uses a metric tensor in a Riemann space, g_μν, which should be in agreement with a mass (or energy) tensor in order to satisfy the Einstein equation of general relativity [1]. This equation contains the Ricci curvature as well. In general, applications are done considering that a chosen metric is valid without regional limits. In fact, the energy density, whose distribution is however unknown, varies across the universe; therefore, the metrics need to be adapted to different regions. For this reason we suggest starting with a simple, average mass-energy distribution that could represent, in a first step, the actual universe. This suggestion is in agreement with the symmetrical distribution of equal spheres existing in a model of the early universe given by one of the authors. Two kinds of distribution are given. The possibility of black hole formation is studied and a criterion is given.
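
    For reference (a standard fact, not specific to this paper), the Einstein field equation the abstract invokes, relating the Ricci curvature built from the metric g_μν to the mass-energy tensor T_μν, with the cosmological term omitted:

```latex
% Einstein field equation: R_{\mu\nu} and R are built from g_{\mu\nu};
% T_{\mu\nu} is the mass-energy tensor.
R_{\mu\nu} - \tfrac{1}{2}\,R\,g_{\mu\nu} = \frac{8\pi G}{c^{4}}\,T_{\mu\nu}
```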

  19. The Model of the Software Running on a Computer Equipment Hardware Included in the Grid network

    Directory of Open Access Journals (Sweden)

    T. A. Mityushkina

    2012-12-01

    Full Text Available A new approach to building a cloud computing environment using Grid networks is proposed in this paper. The authors describe the functional capabilities, algorithm, model of software running on a computer equipment hardware included in the Grid network, that will allow to implement cloud computing environment using Grid technologies.

  20. Expanding Canadian Medicare to include a national pharmaceutical benefit while controlling expenditures: possible lessons from Israel.

    Science.gov (United States)

    Rosen, Bruce

    2018-02-05

    In Canada, there is an ongoing debate about whether to expand Medicare to include a national pharmaceutical benefit on a universal basis. The potential health benefits are understood to be significant, but there are ongoing concerns about affordability. In Israel, the National Health Insurance benefits package includes a comprehensive pharmaceutical benefit. Nonetheless, per capita pharmaceutical spending is well below that of Canada and the Organization for Economic Co-operation and Development average. This paper highlights seven strategies that Israel has employed to constrain pharmaceutical spending: (1) prioritizing new technologies, subject to a global budget constraint; (2) using regulations and market power to secure fair and reasonable prices; (3) establishing an efficient pharmaceutical distribution system; (4) promoting effective prescribing behavior; (5) avoiding artificial inflation of consumer demand; (6) striking an appropriate balance between respect for IP rights, access and cost containment; and (7) developing a shared societal understanding about the value and limits of pharmaceutical spending. Some of these strategies are already in place in some parts of Canada. Others could be introduced into Canada, and might contribute to the affordability of a national pharmaceutical benefit, but substantial adaptation would be needed. For example, in Israel the health maintenance organizations (HMOs) play a central role in promoting effective prescribing behavior, whereas in HMO-free Canada other mechanisms are needed to advance this important goal.

  1. PTAC: a computer program for pressure-transient analysis, including the effects of cavitation. [LMFBR]

    Energy Technology Data Exchange (ETDEWEB)

    Kot, C A; Youngdahl, C K

    1978-09-01

    PTAC was developed to predict pressure transients in nuclear-power-plant piping systems in which the possibility of cavitation must be considered. The program performs linear or nonlinear fluid-hammer calculations, using a fixed-grid method-of-characteristics solution procedure. In addition to pipe friction and elasticity, the program can treat a variety of flow components, pipe junctions, and boundary conditions, including arbitrary pressure sources and a sodium/water reaction. Essential features of transient cavitation are modeled by a modified column-separation technique. Comparisons of calculated results with available experimental data, for a simple piping arrangement, show good agreement and provide validation of the computational cavitation model. Calculations for a variety of piping networks, containing either liquid sodium or water, demonstrate the versatility of PTAC and clearly show that neglecting cavitation leads to erroneous predictions of pressure-time histories.
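
    PTAC's core solver is described as a fixed-grid method-of-characteristics fluid-hammer calculation; the sketch below shows that step for a single reservoir-pipe-valve line with instantaneous valve closure. All numerical values are illustrative, and the modified column-separation (cavitation) model is deliberately omitted, which is exactly the regime in which the abstract notes such a calculation becomes erroneous.

```python
# Fixed-grid method-of-characteristics water hammer, single pipe:
# reservoir upstream, valve closed instantly at t = 0. Illustrative values.
import numpy as np

g, a = 9.81, 1200.0                               # gravity [m/s^2], wave speed [m/s]
L_pipe, D, f = 600.0, 0.5, 0.02                   # length, diameter, friction factor
A = np.pi * D**2 / 4
n = 21                                            # grid nodes along the pipe
dx = L_pipe / (n - 1)
dt = dx / a                                       # Courant number = 1 (fixed grid)
B = a / (g * A)                                   # characteristic impedance
R = f * dx / (2 * g * D * A**2)                   # lumped friction coefficient

H_res, Q0 = 100.0, 0.2                            # reservoir head [m], initial flow [m^3/s]
H = H_res - np.arange(n) * R * Q0 * abs(Q0)       # steady-state head line
Q = np.full(n, Q0)
H_max = H[-1]

for _ in range(400):                              # march 400 time steps
    Cp = H[:-1] + B * Q[:-1] - R * Q[:-1] * np.abs(Q[:-1])   # C+ from node i-1
    Cm = H[1:] - B * Q[1:] + R * Q[1:] * np.abs(Q[1:])       # C- from node i+1
    Hn, Qn = np.empty(n), np.empty(n)
    Hn[1:-1] = 0.5 * (Cp[:-1] + Cm[1:])           # interior nodes
    Qn[1:-1] = (Cp[:-1] - Cm[1:]) / (2 * B)
    Hn[0], Qn[0] = H_res, (H_res - Cm[0]) / B     # upstream: fixed-head reservoir
    Qn[-1], Hn[-1] = 0.0, Cp[-1]                  # downstream: closed valve, Q = 0
    H, Q = Hn, Qn
    H_max = max(H_max, H[-1])

# Without a cavitation model the computed down-surge can fall below vapor
# pressure; PTAC's column-separation treatment corrects that error.
print(f"peak head at valve = {H_max:.1f} m; "
      f"Joukowsky estimate = {H_res + a * Q0 / (g * A):.1f} m")
```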

  2. Design, functioning and possible applications of process computers

    International Nuclear Information System (INIS)

    Kussl, V.

    1975-01-01

    Process computers are useful as automation instruments: a) when large numbers of data are processed in analog or digital form, b) for low data flow (data rate), and c) when data must be stored over short or long periods of time. (orig./AK) [de

  3. 77 FR 27078 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-05-08

    ... Phones and Tablet Computers, and Components Thereof; Notice of Receipt of Complaint; Solicitation of... entitled Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof... the United States after importation of certain electronic devices, including mobile phones and tablet...

  4. 31 CFR 359.31 - What definitive Series I savings bonds are included in the computation?

    Science.gov (United States)

    2010-07-01

    ... definitive Series I savings bonds are included in the computation? In computing the purchases for each person, we include the following outstanding definitive bonds purchased in that calendar year: (a) All bonds... bearing that person's TIN; and (c) All gift bonds registered in the name of that person but bearing the...

  5. [Possibilities of computer graphics simulation in orthopedic surgery].

    Science.gov (United States)

    Kessler, P; Wiltfang, J; Teschner, M; Girod, B; Neukam, F W

    2000-11-01

    In addition to standard X-rays, photographic documentation, and cephalometric and model analysis, a computer-aided, three-dimensional (3D) simulation system has been developed in close cooperation with the Institute of Communications of the Friedrich-Alexander-Universität Erlangen-Nürnberg. With this simulation system a photorealistic prediction of the expected soft tissue changes can be made. Prerequisites are a 3D reconstruction of the facial skeleton and a 3D laser scan of the face. After data reduction, the two data sets can be matched. Cutting planes enable the transposition of bony segments. The laser scan of the facial surface is linked to the underlying bone via a five-layered soft tissue model, so that bone movements are realistically translated into changes of the soft tissue cover. Further research is necessary to replace the virtual subcutaneous soft tissue model with correct, topographic tissue anatomy.

  6. 77 FR 34063 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-06-08

    ... Phones and Tablet Computers, and Components Thereof Institution of Investigation AGENCY: U.S... the United States after importation of certain electronic devices, including mobile phones and tablet... mobile phones and tablet computers, and components thereof that infringe one or more of claims 1-3 and 5...

  7. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Science.gov (United States)

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  8. CERN’s Computing rules updated to include policy for control systems

    CERN Multimedia

    IT Department

    2008-01-01

    The use of CERN’s computing facilities is governed by rules defined in Operational Circular No. 5 and its subsidiary rules of use. These rules are available from the web site http://cern.ch/ComputingRules. Please note that the subsidiary rules for Internet/Network use have been updated to include a requirement that control systems comply with the CNIC (Computing and Network Infrastructure for Control) Security Policy. The security policy for control systems, which was approved earlier this year, can be accessed at https://edms.cern.ch/document/584092 IT Department

  9. High performance computation of landscape genomic models including local indicators of spatial association.

    Science.gov (United States)

    Stucki, S; Orozco-terWengel, P; Forester, B R; Duruz, S; Colli, L; Masembe, C; Negrini, R; Landguth, E; Jones, M R; Bruford, M W; Taberlet, P; Joost, S

    2017-09-01

    With the increasing availability of both molecular and topo-climatic data, the main challenges facing landscape genomics - that is, the combination of landscape ecology with population genomics - include processing large numbers of models and distinguishing between selection and demographic processes (e.g. population structure). Several methods address the latter, either by estimating a null model of population history or by simultaneously inferring environmental and demographic effects. Here we present samβada, an approach designed to study signatures of local adaptation, with special emphasis on high performance computing of large-scale genetic and environmental data sets. samβada identifies candidate loci using genotype-environment associations while also incorporating multivariate analyses to assess the effect of many environmental predictor variables. This enables the inclusion of explanatory variables representing population structure into the models to lower the occurrence of spurious genotype-environment associations. In addition, samβada calculates local indicators of spatial association for candidate loci to provide information on whether similar genotypes tend to cluster in space, which constitutes a useful indication of the possible kinship between individuals. To test the usefulness of this approach, we carried out a simulation study and analysed a data set from Ugandan cattle to detect signatures of local adaptation with samβada, bayenv, lfmm and an F_ST outlier method (FDIST approach in Arlequin), and compared their results. samβada - an open source software for Windows, Linux and Mac OS X available at http://lasig.epfl.ch/sambada - outperforms other approaches and better suits whole-genome sequence data processing. © 2016 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
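
    The core genotype-environment association test in this family of methods can be sketched as a logistic regression scored against a constant-only null model. The snippet below uses synthetic data and statsmodels; it is not the authors' samβada implementation, which additionally handles population-structure covariates and the LISA computation.

```python
# Genotype-environment association at one locus: logistic regression of a
# binary genotype indicator on an environmental variable, with a
# likelihood-ratio test against a constant-only model. Synthetic data.
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
env = rng.normal(size=n)                         # standardized environmental predictor
p_true = 1 / (1 + np.exp(-(-0.5 + 1.2 * env)))   # genotype frequency vs. environment
geno = rng.binomial(1, p_true)                   # 0/1 genotype at one locus

full = sm.Logit(geno, sm.add_constant(env)).fit(disp=0)
null = sm.Logit(geno, np.ones((n, 1))).fit(disp=0)

G = 2 * (full.llf - null.llf)                    # likelihood-ratio (G) statistic
p_val = stats.chi2.sf(G, df=1)
print(f"G = {G:.1f}, p = {p_val:.2e}")           # small p -> candidate locus
```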

  10. 78 FR 63492 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2013-10-24

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-847] Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof; Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is...

  11. A method for the computation of turbulent polymeric liquids including hydrodynamic interactions and chain entanglements

    Energy Technology Data Exchange (ETDEWEB)

    Kivotides, Demosthenes, E-mail: demosthenes.kivotides@strath.ac.uk

    2017-02-12

    An asymptotically exact method for the direct computation of turbulent polymeric liquids that includes (a) fully resolved, creeping microflow fields due to hydrodynamic interactions between chains, (b) exact account of (subfilter) residual stresses, (c) polymer Brownian motion, and (d) direct calculation of chain entanglements, is formulated. Although developed in the context of polymeric fluids, the method is equally applicable to turbulent colloidal dispersions and aerosols.

    Highlights:
    • An asymptotically exact method for the computation of polymer and colloidal fluids is developed.
    • The method is valid for all flow inertia and all polymer volume fractions.
    • The method models entanglements and hydrodynamic interactions between polymer chains.

  12. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y W [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Zhang, L F [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Huang, J P [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China)

    2007-07-20

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property.
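
    For orientation, a minimal sketch of the original Watts-Strogatz construction and the two statistics the developed model targets, using networkx with illustrative parameters; the degree-distribution extension itself is not reproduced here.

```python
# Original Watts-Strogatz small-world graph and its two key statistics:
# characteristic path length and clustering coefficient. Parameters assumed.
import networkx as nx

n, k, p = 1000, 10, 0.1          # nodes, nearest neighbors, rewiring probability
G = nx.connected_watts_strogatz_graph(n, k, p, seed=42)

L_char = nx.average_shortest_path_length(G)   # characteristic path length
C = nx.average_clustering(G)                  # clustering coefficient
print(f"L = {L_char:.2f}, C = {C:.3f}")
# Small-world signature: L stays near the random-graph value while C stays
# well above it; a degree-distribution extension changes how G is constructed.
```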

  13. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    International Nuclear Information System (INIS)

    Chen, Y W; Zhang, L F; Huang, J P

    2007-01-01

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property

  14. Review of the RNA Interference Pathway in Molluscs Including Some Possibilities for Use in Bivalves in Aquaculture

    Directory of Open Access Journals (Sweden)

    Leigh Owens

    2015-03-01

    Generalised reviews of RNA interference (RNAi) in invertebrates, and of its use in aquaculture, have taken for granted that RNAi pathways operate in molluscs, but inspection of such reviews shows little specific evidence of such activity in molluscs. This review set out to establish what specific research had been conducted on RNAi in molluscs, particularly with regard to aquaculture. The questions were whether RNAi in molluscs functions similarly to the paradigm established for most eukaryotes or, alternatively, is more similar to that of the ecdysozoa, and how RNAi may relate to disease control in aquaculture. RNAi in molluscs appears to have been investigated in only about 14 species, mostly as a gene-silencing phenomenon. We can infer that microRNAs including let-7 are functional in molluscs. The genes/proteins involved in the actual RNAi pathways have only been rudimentarily investigated, so how homologous the genes and proteins are to those of other metazoa is unknown. Furthermore, how many different genes underlie each activity in the RNAi pathway is also unknown. The cephalopods have been greatly overlooked, with only a single RNAi gene-silencing study found. The long dsRNA-linked interferon pathways seem to be present in molluscs, unlike in some other invertebrates, and could be used to reduce disease states in aquaculture. In particular, interferon regulatory factor genes have been found in molluscs of aquacultural importance such as Crassostrea, Mytilus, Pinctada and Haliotis. Two possible aquaculture scenarios, zoonotic norovirus and ostreid herpesvirus 1, are discussed to illustrate the possibilities. The entire field of RNAi in molluscs looks ripe for scientific exploitation and practical application.

  15. The possible usability of three-dimensional cone beam computed dental tomography in dental research

    Science.gov (United States)

    Yavuz, I.; Rizal, M. F.; Kiswanjaya, B.

    2017-08-01

    Interest in the innovations and advantages of three-dimensional cone beam computed dental tomography (3D CBCT) is continually growing with respect to its potential use in dental research. Imaging techniques are important for planning research in dentistry. Newly improved 3D CBCT imaging systems and accessory computer programs have recently been proven effective for use in dental research. The aim of this study is to introduce 3D CBCT and open a window to future research possibilities that should be given attention in dental research.

  16. The IAEA transport regulations: main modifications included in the 1996 edition and the possible impact of its adoption in Argentina

    International Nuclear Information System (INIS)

    Lopez Vietri, J.R.; Novo, R.G.; Bianchi, A.J.

    1998-01-01

    Full text: This paper presents a comparative analysis of the requirements of the 1985 edition (as amended 1990), in force in almost all countries including Argentina, and the 1996 edition, foreseen to come into force on 1 January 2001, of the Regulations for the safe transport of radioactive material, published by the International Atomic Energy Agency (IAEA). The English version of the 1996 edition was published in December 1996 and the Spanish one in September 1997. That edition was the culmination of a difficult consensus and harmonisation reached after a years-long analysis process among the IAEA Member States and related international organisations (United Nations, International Civil Aviation Organisation, International Air Transport Association, International Federation of Air Line Pilots' Associations, International Maritime Organisation) as well as regional organisations (Economic Commission for Europe, Commission of the European Communities). Both editions of the Regulations include a set of design, operational and administrative requirements that do not differ substantially in their basic safety philosophy. However, the 1996 edition introduces numerous modifications of different magnitude, which will have technological, economic and operational consequences. Of these modifications, the paper analyses only the relevant ones, which update the state of the art in the subject and allow the Regulations to continue maintaining an acceptable level of control of the radiation, criticality and thermal hazards to persons, property and the environment during the transport of radioactive material. In addition, the paper briefly describes the possible impact that the main modifications introduced in the 1996 edition of the Regulations may have, depending on the type of user considered, either in Argentina or in other Latin American countries. However, it is desirable that the personnel of competent authorities of each country involved in transport

  17. Accurate computations of monthly average daily extraterrestrial irradiation and the maximum possible sunshine duration

    International Nuclear Information System (INIS)

    Jain, P.C.

    1985-12-01

    The monthly average daily values of the extraterrestrial irradiation on a horizontal plane and the maximum possible sunshine duration are two important parameters that are frequently needed in various solar energy applications. These are generally calculated by solar scientists and engineers each time they are needed, often using approximate short-cut methods. Using the accurate analytical expressions developed by Spencer for the declination and the eccentricity correction factor, computations for these parameters have been made for all latitude values from 90 deg. N to 90 deg. S at intervals of 1 deg. and are presented in a convenient tabular form. Monthly average daily values of the maximum possible sunshine duration as recorded on a Campbell-Stokes sunshine recorder are also computed and presented. These tables avoid the need for repetitive and approximate calculations and serve as a useful ready reference providing accurate values to solar energy scientists and engineers
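
    A sketch of the underlying computation, using Spencer's published series for the declination and eccentricity correction factor to produce one table entry; treat it as a consistency check under assumed constants rather than a reproduction of the report's tables.

```python
# Monthly-average daily extraterrestrial irradiation H0 on a horizontal plane
# and maximum possible sunshine duration N, via Spencer's series.
import numpy as np

GSC = 1367.0                       # solar constant, W/m^2 (assumed value)

def spencer(day):
    """Spencer series: declination delta [rad] and eccentricity factor E0."""
    g = 2 * np.pi * (day - 1) / 365
    delta = (0.006918 - 0.399912*np.cos(g) + 0.070257*np.sin(g)
             - 0.006758*np.cos(2*g) + 0.000907*np.sin(2*g)
             - 0.002697*np.cos(3*g) + 0.001480*np.sin(3*g))
    e0 = (1.000110 + 0.034221*np.cos(g) + 0.001280*np.sin(g)
          + 0.000719*np.cos(2*g) + 0.000077*np.sin(2*g))
    return delta, e0

def daily_h0_and_n(lat_deg, day):
    """H0 in MJ/m^2/day and day length N in hours for one latitude and day."""
    phi = np.radians(lat_deg)
    delta, e0 = spencer(day)
    cos_ws = np.clip(-np.tan(phi) * np.tan(delta), -1.0, 1.0)  # polar day/night
    ws = np.arccos(cos_ws)                                     # sunset hour angle
    h0 = (24*3600/np.pi) * GSC * e0 * (np.cos(phi)*np.cos(delta)*np.sin(ws)
                                       + ws*np.sin(phi)*np.sin(delta)) / 1e6
    return h0, 24 * ws / np.pi                                 # N = 2*ws_deg/15

# Monthly average for June at 30 deg N (days 152-181), like one table entry:
h0s, ns = zip(*(daily_h0_and_n(30.0, d) for d in range(152, 182)))
print(f"June @ 30N: H0 = {np.mean(h0s):.1f} MJ/m2/day, N = {np.mean(ns):.1f} h")
```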

  18. Possibilities and importance of using computer games and simulations in educational process

    OpenAIRE

    Danilović Mirčeta S.

    2003-01-01

    The paper discusses if it is possible and appropriate to use simulations (simulation games) and traditional games in the process of education. It is stressed that the terms "game" and "simulation" can and should be taken in a broader sense, although they are chiefly investigated herein as video-computer games and simulations. Any activity combining the properties of game (competition, rules, players) and the properties of simulation (i.e. operational presentation of reality) should be underst...

  19. CTmod—A toolkit for Monte Carlo simulation of projections including scatter in computed tomography

    Czech Academy of Sciences Publication Activity Database

    Malušek, Alexandr; Sandborg, M.; Alm Carlsson, G.

    2008-01-01

    Roč. 90, č. 2 (2008), s. 167-178 ISSN 0169-2607 Institutional research plan: CEZ:AV0Z10480505 Keywords : Monte Carlo * computed tomography * cone beam * scatter Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.220, year: 2008 http://dx.doi.org/10.1016/j.cmpb.2007.12.005

  20. Effects of Lactobacillus salivarius, Lactobacillus reuteri, and Pediococcus acidilactici on the nematode Caenorhabditis elegans include possible antitumor activity.

    Science.gov (United States)

    Fasseas, Michael K; Fasseas, Costas; Mountzouris, Konstantinos C; Syntichaki, Popi

    2013-03-01

    This study examined the effects of three lactic acid bacteria (LAB) strains on the nematode Caenorhabditis elegans. Lactobacillus salivarius, Lactobacillus reuteri, and Pediococcus acidilactici were found to inhibit the development and growth of the worm. Compared to Escherichia coli used as the control, L. reuteri and P. acidilactici reduced the lifespan of wild-type and short-lived daf-16 worms. On the contrary, L. salivarius extended the lifespan of daf-16 worms when used live, but reduced it when used as UV-killed bacteria. The three LAB induced the expression of genes involved in pathogen response and inhibited the growth of tumor-like germ cells, without affecting DAF-16 localization or increasing corpse cells. Our results suggest the possible use of C. elegans as a model for studying the antitumor attributes of LAB. The negative effects of these LAB strains on the nematode also indicate their potential use against parasitic nematodes.

  1. Including Internet insurance as part of a hospital computer network security plan.

    Science.gov (United States)

    Riccardi, Ken

    2002-01-01

    Cyber attacks on a hospital's computer network are a new crime to be reckoned with. Should your hospital consider Internet insurance? The author explains this new phenomenon and presents a risk assessment for determining network vulnerabilities.

  2. The utility of including pathology reports in improving the computational identification of patients

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2016-01-01

    Background: Celiac disease (CD) is a common autoimmune disorder. Efficient identification of patients may improve chronic management of the disease. Prior studies have shown that searching International Classification of Diseases-9 (ICD-9) codes alone is inaccurate for identifying patients with CD. In this study, we developed automated classification algorithms leveraging pathology reports and other clinical data in Electronic Health Records (EHRs) to refine the subset population preselected using ICD-9 code 579.0. Materials and Methods: EHRs were searched for the established ICD-9 code (579.0) suggesting CD, based on which an initial identification of cases was obtained. In addition, laboratory results for tissue transglutaminase were extracted. Using natural language processing, we analyzed pathology reports from upper endoscopy. Twelve machine learning classifiers using different combinations of variables related to ICD-9 CD status, laboratory result status, and pathology reports were evaluated to find the best possible CD classifier. Ten-fold cross-validation was used to assess the results. Results: A total of 1498 patient records were used, including 363 confirmed cases and 1135 false positive cases that served as controls. A logistic model based on both clinical and pathology report features produced the best results: Kappa of 0.78, F1 of 0.92, and area under the curve (AUC) of 0.94, whereas using ICD-9 codes alone generated poor results: Kappa of 0.28, F1 of 0.75, and AUC of 0.63. Conclusion: Our automated classification system presents an efficient and reliable way to improve the performance of CD patient identification.
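
    A hedged sketch of the setup described: combine an ICD-9 flag, a laboratory-result flag, and a pathology-text feature, then score a logistic model with ten-fold cross-validation. The data and feature names below are synthetic placeholders, not the study's actual variables.

```python
# Logistic classifier over clinical + pathology-report features, evaluated
# with 10-fold cross-validation. All data here is simulated for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 1498                                  # cohort size from the abstract
icd9 = np.ones(n)                         # constant: all preselected by code 579.0
ttg_pos = rng.binomial(1, 0.3, n)         # hypothetical serology-positive flag
path_villous = rng.binomial(1, 0.25, n)   # hypothetical "villous atrophy" text flag
X = np.column_stack([icd9, ttg_pos, path_villous])
y = rng.binomial(1, 1 / (1 + np.exp(-(-2.0 + 2.5*ttg_pos + 2.0*path_villous))))

clf = LogisticRegression()
auc = cross_val_score(clf, X, y, cv=10, scoring="roc_auc").mean()
f1 = cross_val_score(clf, X, y, cv=10, scoring="f1").mean()
print(f"10-fold AUC = {auc:.2f}, F1 = {f1:.2f}")
```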

  3. Psychological impact of a possible radiation exposure including psychosocial support required in case of such a scenario

    International Nuclear Information System (INIS)

    Mazumdar, Kaustubh

    2014-01-01

    In the early years of the Atomic Age, radiation accidents and exposures were limited to laboratories or facilities. After the major accidents at TMI, Goiânia and Chernobyl, when large proportions of the population were exposed, interest in the psychosocial aspects developed. In order to understand the psychological impact, an understanding of the causation of symptoms is necessary. Stress, anxiety, fear, physiological correlates and psychological consequences are thus explained. The different clinical entities and the ways and means of tackling them are described. Further, 'psychological first aid' and ameliorating measures are discussed too. Finally, prevention of psychological impact, including education, community support, information dissemination etc., is described. (author)

  4. Computational and experimental analyses of the wave propagation through a bar structure including liquid-solid interface

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang Jin [UST Graduate School, Daejeon (Korea, Republic of); Rhee, Hui Nam [Division of Mechanical and Aerospace Engineering, Sunchon National University, Sunchon (Korea, Republic of); Yoon, Doo Byung; Park, Jin Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-08-15

    In this research, we study the propagation of longitudinal and transverse waves through a metal rod including a liquid layer using computational and experimental analyses. The propagation characteristics of longitudinal and transverse waves obtained by the computational and experimental analyses were consistent with the wave propagation theory for both cases, that is, the homogeneous metal rod and the metal rod including a liquid layer. The fluid-structure interaction modeling technique developed for the computational wave propagation analysis in this research can be applied to the more complex structures including solid-liquid interfaces.

  5. [Realistic possibilities of utilization of a personal computer in the office of a general practitioner].

    Science.gov (United States)

    Masopust, V

    1991-04-01

    In May 1990, work was completed on the programme "Computer system of the health community doctor Mic DOKI", which resolves more than 70 basic tasks pertaining to the keeping of health documentation by health community doctors; it automates the entire administrative work in the health community, makes it possible to evaluate the activity of doctors and nurses, will facilitate the work of the control organs of future health insurance companies, and will contribute to investigations of the health status of the population. Despite some problems ensuing from the country's current economic situation, the validity of contemporary health regulations, and minimal training of our health personnel in the use of personal computers, computerization of the health community system can be considered an asset to the reform of the health services which is under way.

  6. 29 CFR 779.253 - What is included in computing the total annual inflow volume.

    Science.gov (United States)

    2010-07-01

    ... FAIR LABOR STANDARDS ACT AS APPLIED TO RETAILERS OF GOODS OR SERVICES Employment to Which the Act May... taxes and other charges which the enterprise must pay for such goods. Generally, all charges will be... computing the total annual inflow volume. The goods which the establishment purchases or receives for resale...

  7. Possibilities of differentiation of solitary focal liver lesions by computed tomography perfusion

    Directory of Open Access Journals (Sweden)

    Irmina Sefić Pašić

    2015-08-01

    Aim: To evaluate the possibilities of computed tomography (CT) perfusion in the differentiation of solitary focal liver lesions based on their characteristic vascularization, through perfusion parameter analysis. Methods: A prospective study was conducted on 50 patients in the period 2009-2012. Patients were divided into two groups: benign and malignant lesions. The following CT perfusion parameters were analyzed: blood flow (BF), blood volume (BV), mean transit time (MTT), capillary permeability surface area product (PS), hepatic arterial fraction (HAF), and impulse residual function (IRF). During the study another perfusion parameter was analyzed: the hepatic perfusion index (HPI). All patients were examined on a multidetector 64-slice CT machine (GE) with application of a liver perfusion protocol with i.v. administration of contrast agent. Results: In both groups an increase of vascularization and arterial blood flow was noticed, but there was no statistically significant difference in any of the 6 analyzed parameters. Hepatic perfusion index values were increased in all lesions in comparison with normal liver parenchyma. Conclusion: Computed tomography perfusion in our study did not allow differentiation of benign and malignant liver lesions based on analysis of functional perfusion parameters. The hepatic perfusion index should be investigated in further studies as a parameter for detecting the possible presence of micro-metastases in visually homogeneous liver in cases with no lesions found during a standard CT protocol.
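
    The hepatic perfusion index mentioned above is conventionally the arterial fraction of total liver perfusion; a minimal sketch, with invented example values:

```python
# Hepatic perfusion index from CT perfusion flow estimates. Example values
# are invented; units cancel, so any consistent flow unit works.
def hepatic_perfusion_index(arterial_flow: float, portal_flow: float) -> float:
    """HPI = arterial / (arterial + portal)."""
    return arterial_flow / (arterial_flow + portal_flow)

# Normal liver is predominantly portal (HPI roughly 0.2-0.3); hypervascular
# lesions and micro-metastases tend to raise the arterial fraction.
print(hepatic_perfusion_index(arterial_flow=30.0, portal_flow=90.0))  # 0.25
```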

  8. On the possibility of non-invasive multilayer temperature estimation using soft-computing methods.

    Science.gov (United States)

    Teixeira, C A; Pereira, W C A; Ruano, A E; Ruano, M Graça

    2010-01-01

    This work reports original results on the possibility of non-invasive temperature estimation (NITE) in a multilayered phantom by applying soft-computing methods. The existence of reliable non-invasive temperature estimator models would improve the safety and efficacy of thermal therapies, and would lead to a broader acceptance of this kind of therapy. Several approaches based on medical imaging technologies have been proposed, with magnetic resonance imaging (MRI) identified as the only one achieving acceptable temperature resolution for hyperthermia purposes. However, MRI's intrinsic characteristics (e.g., high instrumentation cost) led us to use backscattered ultrasound (BSU) instead. Among the different BSU features, temporal echo-shifts have received major attention. These shifts are due to changes of speed-of-sound and expansion of the medium. The originality of this work involves two aspects: the estimator model itself is original (based on soft-computing methods), and the application to temperature estimation in a three-layer phantom is also not reported in the literature. In this work a three-layer (non-homogeneous) phantom was developed. The two external layers were composed of (in % of weight): 86.5% degassed water, 11% glycerin and 2.5% agar-agar. The intermediate layer was obtained by adding graphite powder, in the amount of 2% of the water weight, to the above composition. The phantom was developed to have attenuation and speed-of-sound similar to in vivo muscle, according to the literature. BSU signals were collected and cumulative temporal echo-shifts computed. These shifts and past temperature values were then considered as possible estimator inputs. A soft-computing methodology was applied to look for appropriate multilayered temperature estimators. The methodology involves radial-basis-function neural networks (RBFNN) with structure optimized by a multi-objective genetic algorithm (MOGA). In this work 40 operating conditions were
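
    A minimal sketch of the estimator family described, an RBF neural network mapping cumulative echo-shifts and past temperature to current temperature, with an ad hoc structure standing in for the paper's MOGA-selected one and synthetic data throughout:

```python
# RBF network regression: inputs (cumulative echo shift, previous temperature),
# output current temperature. Data, centers, and width are illustrative.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 200)
temp = 37.0 + 8.0 * t                                       # synthetic heating curve, deg C
shift = 0.05 * (temp - 37.0) + rng.normal(0, 0.01, t.size)  # echo-shift proxy

X = np.column_stack([shift[1:], temp[:-1]])   # inputs: shift, past temperature
y = temp[1:]

centers = X[rng.choice(len(X), 10, replace=False)]          # 10 ad hoc RBF centers
width = np.median(np.linalg.norm(X[:, None] - centers, axis=2))

def design(X):
    """Gaussian RBF design matrix: one column per center."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-(d / width) ** 2)

w, *_ = np.linalg.lstsq(design(X), y, rcond=None)           # linear output layer
pred = design(X) @ w
print(f"max abs error = {np.max(np.abs(pred - y)):.2f} deg C")
```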

  9. Possibilities of the new hybrid technology single photon emission computer technology/computer tomography (SPECT/CT) and the first impressions of its application

    International Nuclear Information System (INIS)

    Kostadinova, I.

    2010-01-01

    With the help of the new hybrid technique SPECT/CT it is possible, in a single investigation, to acquire a combined image of the investigated organ, visualizing both its function and its structure. The new multimodality method combines the possibilities of Single Photon Emission Computed Tomography (SPECT) and Computed Tomography (CT), making it possible to precisely localize pathologically changed organ function. With the further combination of the tomographic gamma camera with diagnostic CT, a detailed morphological evaluation of the finding becomes possible. The main clinical applications of the new hybrid diagnostic are in the fields of cardiology, oncology and orthopedics, with a growing number of applications not connected with oncology, such as thyroid, parathyroid and brain studies (especially localization of epileptic foci), visualization of local infection and, recently, radiotherapy planning. According to the literature data, around 35% of SPECT investigations have to be combined with CT in order to increase the specificity of the diagnosis, which changes the interpretation of the result in 56% of the cases. After installation of the SPECT/CT camera in the University hospital 'Alexandrovska' in January 2009, the following changes have been observed: the number of investigated patients has increased, including the numbers of heart, thyroid (especially scintigraphy with 131I), bone and parathyroid gland studies. As a result of the application of the hybrid technique, the investigation time and the cost were reduced in comparison with the individual application of the investigations. Summarizing the literature data and the preliminary impressions of the first multimodality scanner in our country, in the University hospital 'Alexandrovska', it could be said that there is continuously increasing information on new clinical applications of SPECT/CT. It is now accepted that its usage will increase in

  10. 31 CFR 351.66 - What book-entry Series EE savings bonds are included in the computation?

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false What book-entry Series EE savings... DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES EE Book-Entry Series EE Savings Bonds § 351.66 What book-entry Series EE savings bonds are included in the computation? (a) We include all bonds that...

  11. Pancreas divisum as a possible cause of misinterpretation in ERCP, computed tomography, ultrasound and barium meal

    International Nuclear Information System (INIS)

    Brambs, H.J.; Schuetz, B.; Wimmer, B.; Hoppe-Seyler, P.; Freiburg Univ.

    1986-01-01

    In 488 patients, endoscopic retrograde pancreatography (ERP) revealed a pancreas divisum in 21 (4.3%): in 17/21 patients we found a complete, and in 4/21 an incomplete, separation of the pancreatic ducts. The pancreas divisum is caused by a malfusion of the ductal system. On examination by ultrasound, computed tomography or hypotonic duodenography this variant can suggest an inflammation or tumour of the head of the pancreas. A definite diagnosis is possible by ERP only. Since the small ventral duct can be confused with an alteration caused by inflammation or by a tumour, too much contrast medium may be injected. Pancreas divisum is often associated with a chronic pancreatitis which can be demonstrated via ERP of the dorsal duct through the accessory papilla. (orig.) [de

  12. A computational method for comparing the behavior and possible failure of prosthetic implants

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, C.; Hollerbach, K.; Perfect, S.; Underhill, K.

    1995-05-01

    Prosthetic joint implants currently in use exhibit high failure rates. Realistic computer modeling of prosthetic implants provides an opportunity for orthopedic biomechanics researchers and physicians to understand possible in vivo failure modes without having to resort to lengthy and costly clinical trials. The research presented here is part of a larger effort to develop realistic models of implanted joint prostheses. The example used here is the thumb carpo-metacarpal (cmc) joint. The work, however, can be applied to any other human joint for which prosthetic implants have been designed. Preliminary results of prosthetic joint loading, without surrounding human tissue (i.e., simulating conditions under which the prosthetic joint has not yet been implanted into the human joint), are presented, based on a three-dimensional, nonlinear finite element analysis of three different joint implant designs.

  13. Areal rainfall estimation using moving cars - computer experiments including hydrological modeling

    Science.gov (United States)

    Rabiei, Ehsan; Haberlandt, Uwe; Sester, Monika; Fitzner, Daniel; Wallner, Markus

    2016-09-01

    The need for high temporal and spatial resolution precipitation data for hydrological analyses has been discussed in several studies. Although rain gauges provide valuable information, a very dense rain gauge network is costly. As a result, several new ideas have emerged to help estimate areal rainfall with higher temporal and spatial resolution. Rabiei et al. (2013) observed that moving cars, called RainCars (RCs), can potentially be a new source of data for measuring rain rate. The optical sensors used in that study are designed for operating the windscreen wipers and showed promising results for rainfall measurement purposes. Their measurement accuracy has been quantified in laboratory experiments. Considering those errors explicitly, the main objective of this study is to investigate the benefit of using RCs for estimating areal rainfall. For that, computer experiments are carried out, where radar rainfall is considered as the reference and the other sources of data, i.e., RCs and rain gauges, are extracted from the radar data. Comparing the quality of areal rainfall estimation by RCs with rain gauges and reference data helps to investigate the benefit of the RCs. The value of this additional source of data is assessed not only for areal rainfall estimation performance but also for use in hydrological modeling. Considering measurement errors derived from laboratory experiments, the results show that the RCs provide useful additional information for areal rainfall estimation as well as for hydrological modeling. Moreover, by testing larger uncertainties for RCs, they were observed to be useful up to a certain level for areal rainfall estimation and discharge simulation.
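
    One way to picture the computer experiment: many noisy moving sensors plus a few accurate gauges, interpolated to a grid and averaged. The sketch below uses simple inverse-distance weighting and synthetic data; the study's radar reference, laboratory error models, and hydrological model are not reproduced.

```python
# Areal rainfall from mixed point observations (gauges + RainCars) via
# inverse-distance weighting. Field, error levels, and counts are invented.
import numpy as np

rng = np.random.default_rng(5)
true_field = lambda x, y: 2.0 + 1.5 * np.sin(3 * x) * np.cos(2 * y)  # mm/h

gauges = rng.uniform(0, 1, (5, 2))                 # few fixed, accurate gauges
cars = rng.uniform(0, 1, (200, 2))                 # many moving, noisy sensors
pts = np.vstack([gauges, cars])
obs = true_field(pts[:, 0], pts[:, 1])
obs[5:] += rng.normal(0, 0.3, len(cars))           # assumed RC measurement error

def idw(xy, pts, obs, power=2.0):
    """Inverse-distance-weighted estimate at one grid point."""
    d = np.linalg.norm(pts - xy, axis=1) + 1e-9
    w = d ** -power
    return np.sum(w * obs) / np.sum(w)

grid = [(x, y) for x in np.linspace(0, 1, 20) for y in np.linspace(0, 1, 20)]
est = np.mean([idw(np.array(g), pts, obs) for g in grid])
ref = np.mean([true_field(*g) for g in grid])
print(f"areal estimate {est:.2f} mm/h vs reference {ref:.2f} mm/h")
```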

  14. Human factors design of nuclear power plant control rooms including computer-based operator aids

    International Nuclear Information System (INIS)

    Bastl, W.; Felkel, L.; Becker, G.; Bohr, E.

    1983-01-01

    The scientific handling of human factors problems in control rooms began around 1970 on the basis of safety considerations. Some recent research work deals with the development of computerized systems such as plant balance calculation, safety parameter display, alarm reduction and disturbance analysis. For disturbance analysis purposes it is necessary to homogenize the information presented to the operator according to the actual plant situation, in order to supply the operator with the information he most urgently needs at the time. Different approaches for solving this problem are discussed, and an overview is given of current work. Other research projects concentrate on the detailed analysis of operators' diagnosis strategies in unexpected situations, in order to obtain a better understanding of their mental processes and the influences upon them when such situations occur. This project involves the use of a simulator and sophisticated recording and analysis methods. Control rooms are currently designed with the aid of mock-ups, which enable operators to contribute their experience to the optimization of the arrangement of displays and controls. Modern control rooms are characterized by increasing use of process computers and CRT (cathode ray tube) displays. A general concept for the integration of the new computerized systems and the conventional control panels is needed. The technical changes modify operators' tasks, and future ergonomic work in nuclear plants will need to consider the re-allocation of function between man and machine, the incorporation of task changes in training programmes, and the optimal design of information presentation using CRTs. Aspects of developments in control room design are detailed, typical research results are dealt with, and a brief forecast of the ergonomic contribution to be made in the Federal Republic of Germany is given.
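
    As a toy illustration of the alarm-reduction idea described above (suppressing signals that are expected in the current plant situation so the operator sees what is most urgent), the sketch below filters an alarm list by plant state. The states, alarm tags, and suppression table are hypothetical, not taken from any real plant system.

```python
# State-dependent alarm reduction, minimal sketch; all names are hypothetical.
SUPPRESSED_IN_STATE = {
    "startup":  {"RCS_PRESSURE_LOW"},   # expected while the plant heats up
    "shutdown": {"TURBINE_SPEED_LOW"},  # expected with the turbine off-line
}

def reduce_alarms(active_alarms, plant_state):
    """Drop alarms that are expected in the current plant situation,
    leaving the operator with the most urgently needed information."""
    suppressed = SUPPRESSED_IN_STATE.get(plant_state, set())
    return [a for a in active_alarms if a not in suppressed]

print(reduce_alarms(["RCS_PRESSURE_LOW", "SG_LEVEL_LOW"], "startup"))
# -> ['SG_LEVEL_LOW']
```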

  15. Possibilities and importance of using computer games and simulations in educational process

    Directory of Open Access Journals (Sweden)

    Danilović Mirčeta S.

    2003-01-01

    Full Text Available The paper discusses whether it is possible and appropriate to use simulations (simulation games) and traditional games in the process of education. It is stressed that the terms "game" and "simulation" can and should be taken in a broader sense, although they are chiefly investigated herein as video-computer games and simulations. Any activity combining the properties of a game (competition, rules, players) and the properties of a simulation (i.e. an operational presentation of reality) should be understood as a simulation game, where role-play constitutes its essence and basis. In those games the student assumes a new identity, identifies himself with another personality and responds similarly. Game rules are the basic and most important conditions for a game's existence, accomplishment and goal achievement. Games and simulations make it possible for a student to acquire experience and practice, i.e. to do exercises in nearly similar or identical life situations, to develop cognitive and psycho-motor abilities and skills, to acquire knowledge, to develop, create and change attitudes and value criteria, and to develop perception of other people's feelings and attitudes. It is obligatory for the teacher to prepare thoroughly in order to use and apply simulation games in the process of teaching.

  16. Progress report of Physics Division including Applied Mathematics and Computing Section. 1st October 1970 - 31st March 1971

    International Nuclear Information System (INIS)

    2004-01-01

    The initial MOATA safety assessment was based on data and calculations available before the advent of multigroup diffusion theory codes in two dimensions. That assessment is being revised and extended to gain approval for 100 kW operation. The more detailed representation obtained in the new calculations has resulted in a much better understanding of the physics of this reactor. The properties of the reactor are determined to a large extent by neutron leakage from the rather thin core tanks. In particular, the effect of leakage on the coupling between the core tanks and on reactivity coefficients has been clarified and quantified. In neutron data studies, the theoretical fission product library was revised, checked against available experimental values and distributed to interested overseas centres. Some further nubar work was done with much better neutron energy resolution, and confirmed our earlier measurements. A promising formulation of R-matrix theory of nuclear interaction is expected to lead to a simpler multilevel resonance parameter description. With large amounts of digital data being collected, displayed and used by theoreticians and experimentalists, more attention was given to visual interactive computer displays. This interest is generating constructive proposals for use of the dataway now being installed between the Division and the IBM 360/50 computer. The study of gamma rays following the capture of keV neutrons continues to reveal new and interesting features of the physical processes involved. A detailed international compilation of the gamma rays emitted and their intensities is in progress. The work on nickel-68, amongst others, has enabled a partial capture cross section to be generated from the gamma ray parameters obtained by experiment. Much work still remains to be done, possibly at other establishments with more extensive facilities. The electrical and mechanical components of our new zero power split table machine for reactor physics assemblies

  17. Experience in nuclear materials accountancy, including the use of computers, in the UKAEA

    International Nuclear Information System (INIS)

    Anderson, A.R.; Adamson, A.S.; Good, P.T.; Terrey, D.R.

    1976-01-01

    The UKAEA have operated systems of nuclear materials accountancy in research and development establishments handling large quantities of material for over 20 years. In the course of that time changing requirements for nuclear materials control and increasing quantities of materials have required that accountancy systems be modified and altered to improve either the fundamental system or manpower utilization. The same accountancy principles are applied throughout the Authority but procedures at the different establishments vary according to the nature of their specific requirements; there is much in the cumulative experience of the UKAEA which could prove of value to other organizations concerned with nuclear materials accountancy or safeguards. This paper reviews the present accountancy system in the UKAEA and summarizes its advantages. Details are given of specific experience and solutions which have been found to overcome difficulties or to strengthen previous weak points. Areas discussed include the use of measurements, the establishment of measurement points (which is relevant to the designation of MBAs), the importance of regular physical stock-taking, and the benefits stemming from the existence of a separate accountancy section independent of operational management at large establishments. Some experience of a dual system of accountancy and criticality control is reported, and the present status of computerization of nuclear material accounts is summarized. Important aspects of the relationship between management systems of accountancy and safeguards' requirements are discussed briefly. (author)

  18. Possible UIP pattern on high-resolution computed tomography is associated with better survival than definite UIP in IPF patients.

    Science.gov (United States)

    Salisbury, Margaret L; Tolle, Leslie B; Xia, Meng; Murray, Susan; Tayob, Nabihah; Nambiar, Anoop M; Schmidt, Shelley L; Lagstein, Amir; Myers, Jeffery L; Gross, Barry H; Kazerooni, Ella A; Sundaram, Baskaran; Chughtai, Aamer R; Martinez, Fernando J; Flaherty, Kevin R

    2017-10-01

    Idiopathic pulmonary fibrosis (IPF) is a progressive fibrosing lung disease of unknown etiology. Inter-society consensus guidelines on IPF diagnosis and management outline radiologic patterns including definite usual interstitial pneumonia (UIP), possible UIP, and inconsistent with UIP. We evaluate these diagnostic categories as prognostic markers among patients with IPF. Included subjects had biopsy-proven UIP, a multidisciplinary team diagnosis of IPF, and a baseline high-resolution computed tomography (HRCT). Thoracic radiologists assigned the radiologic pattern and documented the presence and extent of specific radiologic findings. The outcome of interest was lung transplant-free survival. IPF patients with a possible UIP pattern on HRCT had significantly longer Kaplan-Meier event-free survival than those with a definite UIP pattern (5.21 and 3.57 years, respectively, p = 0.002). In a multivariable Cox proportional hazards model adjusted for baseline age, gender, %-predicted FVC, and %-predicted DLCO via the GAP stage, extent of fibrosis (via the traction bronchiectasis score) and ever-smoker status, a possible UIP pattern on HRCT (versus definite UIP) was associated with a reduced hazard of death or lung transplant (HR = 0.42, 95% CI 0.23-0.78, p = 0.006). The radiologic diagnosis categories outlined by inter-society consensus guidelines are a widely reported and potentially useful prognostic marker in IPF patients, with a possible UIP pattern on HRCT associated with a favorable prognosis compared to a definite UIP pattern, after adjusting for relevant covariates. Copyright © 2017 Elsevier Ltd. All rights reserved.
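
    The multivariable analysis described above can be reproduced in outline with a Cox proportional hazards fit. The sketch below uses the Python lifelines package on a small invented cohort whose columns merely mirror the abstract's covariates (HRCT pattern, GAP stage, traction bronchiectasis score, smoking history); none of the numbers are the study's data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented illustrative cohort: follow-up time, event indicator, covariates.
df = pd.DataFrame({
    "years_followup": [5.2, 3.6, 2.1, 6.0, 1.4, 4.8, 3.0, 2.5, 5.5, 1.9],
    "death_or_tx":    [0, 1, 1, 0, 1, 0, 1, 1, 0, 1],
    "possible_uip":   [1, 0, 0, 1, 0, 1, 0, 1, 1, 0],   # 1 = possible, 0 = definite UIP
    "gap_stage":      [1, 2, 3, 1, 3, 2, 1, 2, 1, 3],
    "tb_score":       [2.0, 4.5, 5.0, 1.5, 6.0, 4.2, 2.2, 4.0, 1.0, 5.5],
    "ever_smoker":    [1, 1, 0, 0, 1, 1, 0, 1, 0, 1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followup", event_col="death_or_tx")
cph.print_summary()   # an HR < 1 on possible_uip would mirror the reported finding
```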

  19. PTA-1 computer program for treating pressure transients in hydraulic networks including the effect of pipe plasticity

    International Nuclear Information System (INIS)

    Youngdahl, C.K.; Kot, C.A.

    1977-01-01

    Pressure pulses in the intermediate sodium system of a liquid-metal-cooled fast breeder reactor, such as may originate from a sodium/water reaction in a steam generator, are propagated through the complex sodium piping network to system components such as the pump and intermediate heat exchanger. To assess the effects of such pulses on continued reliable operation of these components, and to contribute to system designs which mitigate these effects, Pressure Transient Analysis (PTA) computer codes are being developed for accurately computing the transmission of pressure pulses through a complicated fluid transport system consisting of piping, fittings, junctions, and components. PTA-1 extends the well-accepted and verified fluid-hammer formulation for computing hydraulic transients in elastic or rigid piping systems to include plastic deformation effects. The accuracy of the modeling of pipe plasticity effects on transient propagation has been validated using results from two sets of Stanford Research Institute experiments; validation of PTA-1 using the latter set is described briefly. The comparisons of PTA-1 computations with experiments show that (1) elastic-plastic deformation of LMFBR-type piping can have a significant qualitative and quantitative effect on pressure pulse propagation, even in simple systems; (2) classical fluid-hammer theory gives erroneous results when applied to situations where piping deforms plastically; and (3) the computational model incorporated in PTA-1 for predicting plastic deformation and its effect on transient propagation is accurate.
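
    The classical elastic fluid-hammer baseline that PTA-1 extends can be sketched with the method of characteristics on a single frictionless pipe (upstream reservoir, instantaneous downstream valve closure). The plasticity coupling that is the code's actual contribution is omitted, and all parameters are illustrative.

```python
import numpy as np

a, g, A, L = 1000.0, 9.81, 0.05, 100.0   # wave speed (m/s), gravity, pipe area (m^2), length (m)
n = 50                                   # number of reaches
dx = L / n
dt = dx / a                              # method of characteristics: dt = dx / a
B = a / (g * A)                          # characteristic impedance

H = np.full(n + 1, 50.0)                 # initial head (m)
Q = np.full(n + 1, 0.01)                 # initial steady flow (m^3/s)

for _ in range(400):
    Hn, Qn = H.copy(), Q.copy()
    cp = H[:-2] + B * Q[:-2]             # C+ characteristic from the left neighbour
    cm = H[2:] - B * Q[2:]               # C- characteristic from the right neighbour
    Hn[1:-1] = 0.5 * (cp + cm)
    Qn[1:-1] = (cp - cm) / (2.0 * B)
    Hn[0] = 50.0                         # reservoir holds head constant
    Qn[0] = (Hn[0] - (H[1] - B * Q[1])) / B
    Qn[-1] = 0.0                         # valve closed instantaneously at t = 0
    Hn[-1] = H[-2] + B * Q[-2]
    H, Q = Hn, Qn

print("head at valve (m):", H[-1])       # oscillates around the Joukowsky surge
```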

  20. A computer software system for the generation of global ocean tides including self-gravitation and crustal loading effects

    Science.gov (United States)

    Estes, R. H.

    1977-01-01

    A computer software system is described which computes global numerical solutions of the integro-differential Laplace tidal equations, including dissipation terms and ocean loading and self-gravitation effects, for arbitrary diurnal and semidiurnal tidal constituents. The integration algorithm features a successive approximation scheme for the integro-differential system, with forward differences in the time variable and central differences in the spatial variables. Solutions for the M2, S2, N2, K2, K1, O1 and P1 tidal constituents neglecting the effects of ocean loading and self-gravitation, and a converged M2 solution including ocean loading and self-gravitation effects, are presented in the form of cotidal and corange maps.
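
    The time-stepping scheme described above (forward differences in time, central differences in space) can be illustrated on a 1D linearized shallow-water analogue; the real software solves the full integro-differential Laplace tidal equations with dissipation, loading and self-gravitation, all of which this toy omits.

```python
import numpy as np

g, h0 = 9.81, 4000.0                 # gravity, mean ocean depth (m)
nx, dx = 200, 50e3                   # grid points, spacing (m)
c = np.sqrt(g * h0)                  # gravity-wave speed
dt = 0.4 * dx / c                    # CFL-limited time step

x = np.arange(nx) * dx
eta = np.exp(-((x - x.mean()) / (10 * dx)) ** 2)   # initial surface elevation
u = np.zeros(nx)                                    # depth-averaged velocity

for _ in range(500):
    # central differences in space, forward step in time (periodic domain);
    # the continuity step uses the freshly updated u (forward-backward scheme)
    detadx = (np.roll(eta, -1) - np.roll(eta, 1)) / (2 * dx)
    u = u - dt * g * detadx
    dudx = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    eta = eta - dt * h0 * dudx

print("elevation range after 500 steps:", eta.min(), eta.max())
```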

  1. Soft computing analysis of the possible correlation between temporal and energy release patterns in seismic activity

    Science.gov (United States)

    Konstantaras, Anthony; Katsifarakis, Emmanouil; Artzouxaltzis, Xristos; Makris, John; Vallianatos, Filippos; Varley, Martin

    2010-05-01

    This paper is a preliminary investigation of the possible correlation between the temporal and energy release patterns of seismic activity involved in the preparation processes of consecutive sizeable seismic events [1,2]. The background idea is that during periods of low-level seismic activity, stress processes in the crust accumulate energy at the seismogenic area, whilst larger seismic events act as a decongesting mechanism releasing considerable energy [3,4]. A dynamic algorithm is being developed aiming to identify and cluster pre- and post-seismic events around the main earthquake, following on research carried out by Zubkov [5] and Dobrovolsky [6,7]. This clustering technique, along with energy release equations dependent on the Richter scale [8,9], allows an estimate to be drawn of the amount of energy released by the seismic sequence. The above approach is being implemented as a monitoring tool to investigate the behaviour of the underlying energy management system by introducing this information to various neural [10,11] and soft computing models [1,12,13,14]. The incorporation of intelligent systems aims towards the detection and simulation of the possible relationship between energy release patterns and time-intervals among consecutive sizeable earthquakes [1,15]. Successful training of the imported intelligent systems may result in a real-time, on-line processing methodology [1,16] capable of dynamically approximating the time-interval between the latest and the next forthcoming sizeable seismic event by monitoring the energy release process in a specific seismogenic area. Indexing terms: pattern recognition, long-term earthquake precursors, neural networks, soft computing, earthquake occurrence intervals. References [1] Konstantaras A., Vallianatos F., Varley M.R. and Makris J. P.: 'Soft computing modelling of seismicity in the southern Hellenic arc', IEEE Geoscience and Remote Sensing Letters, vol. 5 (3), pp. 323-327, 2008 [2] Eneva M. and
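
    The "energy release equations dependent on the Richter scale" are conventionally of the Gutenberg-Richter form log10(E) = 1.5·M + 4.8 with E in joules; whether the authors use exactly this constant is an assumption here. A minimal cumulative-release sketch over a hypothetical event sequence:

```python
import numpy as np

def energy_joules(magnitude):
    # Gutenberg-Richter energy-magnitude relation, E in joules
    return 10.0 ** (1.5 * magnitude + 4.8)

sequence = np.array([3.2, 3.5, 2.9, 4.1, 5.8])   # invented magnitudes, not real data
cumulative = np.cumsum(energy_joules(sequence))
print("cumulative energy release (J):", cumulative)
# Curves like this, together with inter-event times, would be the inputs fed to
# the neural / soft-computing models described above.
```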

  2. Spin-neurons: A possible path to energy-efficient neuromorphic computers

    Energy Technology Data Exchange (ETDEWEB)

    Sharad, Mrigank; Fan, Deliang; Roy, Kaushik [School of Electrical and Computer Engineering, Purdue University, West Lafayette, Indiana 47907 (United States)

    2013-12-21

    Recent years have witnessed growing interest in the field of brain-inspired computing based on neural-network architectures. In order to translate the related algorithmic models into powerful, yet energy-efficient cognitive-computing hardware, computing devices beyond CMOS may need to be explored. The suitability of such devices to this field of computing strongly depends upon how closely their physical characteristics match the essential computing primitives employed in such models. In this work, we discuss the rationale of applying emerging spin-torque devices to bio-inspired computing. Recent spin-torque experiments have shown the path to low-current, low-voltage, and high-speed magnetization switching in nano-scale magnetic devices. Such magneto-metallic, current-mode spin-torque switches can mimic the analog summing and “thresholding” operation of an artificial neuron with high energy-efficiency. Comparison with a CMOS-based analog circuit model of a neuron shows that “spin-neurons” (spin-based circuit models of neurons) can achieve more than two orders of magnitude lower energy and beyond three orders of magnitude reduction in energy-delay product. The application of spin-neurons can therefore be an attractive option for the neuromorphic computers of the future.

  3. Selective population rate coding: a possible computational role of gamma oscillations in selective attention.

    Science.gov (United States)

    Masuda, Naoki

    2009-12-01

    Selective attention is often accompanied by gamma oscillations in local field potentials and spike-field coherence in brain areas related to visual, motor, and cognitive information processing. Gamma oscillations are implicated as playing an important role in, for example, visual tasks including object search, shape perception, and speed detection. However, the mechanism by which gamma oscillations enhance the cognitive and behavioral performance of attentive subjects is still elusive. Using feedforward fan-in networks composed of spiking neurons, we examine a possible role for gamma oscillations in selective attention and population rate coding of external stimuli. We implement the concept proposed by Fries (2005) that under dynamic stimuli, neural populations effectively communicate with each other only when there is a good phase relationship among the associated gamma oscillations. We show that the downstream neural population selects a specific dynamic stimulus received by an upstream population and represents it by population rate coding. The encoded stimulus is the one for which the gamma rhythm in the corresponding upstream population is resonant with the downstream gamma rhythm. The proposed role for gamma oscillations in stimulus selection is to enable top-down control, a neural version of the time-division multiple access used in communication engineering.
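
    A toy numerical illustration of the communication-through-coherence concept implemented in the paper: a downstream excitability window passes the gamma-modulated input whose phase is aligned with the downstream rhythm and attenuates the anti-phase one. The multiplicative gating and the 40 Hz figure are illustrative simplifications of the spiking-network model.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 10000)
gamma = 2 * np.pi * 40 * t                       # downstream 40 Hz rhythm
phase_a, phase_b = 0.0, np.pi                    # input A aligned, input B anti-phase

gate = 0.5 * (1 + np.cos(gamma))                 # downstream excitability window
drive_a = 0.5 * (1 + np.cos(gamma + phase_a))    # equally strong upstream inputs
drive_b = 0.5 * (1 + np.cos(gamma + phase_b))

print("transmitted A:", np.mean(gate * drive_a))  # ~0.375 -> selected
print("transmitted B:", np.mean(gate * drive_b))  # ~0.125 -> suppressed
```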

  4. “Future Directions”: m-government computer systems accessed via cloud computing – advantages and possible implementations

    OpenAIRE

    Daniela LIŢAN

    2015-01-01

    In recent years, the activities of companies and Public Administration have been automated and adapted to the current information system. Therefore, in this paper, I will present and exemplify the benefits of m-government computer systems development and implementation (systems which can be accessed from mobile devices and which are specific to the workflow of Public Administrations), starting from the “experience” of e-government systems implementation in the context of their access and usage through ...

  5. ICECON: a computer program used to calculate containment back pressure for LOCA analysis (including ice condenser plants)

    International Nuclear Information System (INIS)

    1976-07-01

    The ICECON computer code provides a method for conservatively calculating the long-term back pressure transient in the containment resulting from a hypothetical Loss-of-Coolant Accident (LOCA) for PWR plants, including ice condenser containment systems. The ICECON computer code was developed from the CONTEMPT/LT-022 code. A brief discussion of the salient features of a typical ice condenser containment is presented. Details of the ice condenser models are explained. The corrections and improvements made to CONTEMPT/LT-022 are included. The organization of the code, including the calculational procedure, is outlined. The user's manual (to be used in conjunction with the CONTEMPT/LT-022 user's manual), a sample problem, a time-step study (solution convergence) and a comparison of ICECON results with the results of the NSSS vendor are presented. In general, containment pressures calculated with the ICECON code agree with those calculated by the NSSS vendor using the same mass and energy release rates to the containment.

  6. Explicitly-correlated ring-coupled-cluster-doubles theory: Including exchange for computations on closed-shell systems

    Energy Technology Data Exchange (ETDEWEB)

    Hehn, Anna-Sophia; Holzer, Christof; Klopper, Wim, E-mail: klopper@kit.edu

    2016-11-10

    Highlights:
    • Ring-coupled-cluster-doubles approach now implemented with exchange terms.
    • Ring-coupled-cluster-doubles approach now implemented with F12 functions.
    • Szabo–Ostlund scheme (SO2) implemented for use in SAPT.
    • Fast convergence to the limit of a complete basis.
    • Implementation in the TURBOMOLE program system.
    Abstract: Random-phase-approximation (RPA) methods have proven to be powerful tools in electronic-structure theory, being non-empirical, computationally efficient and broadly applicable to a variety of molecular systems, including small-gap systems, transition-metal compounds and dispersion-dominated complexes. Applications are, however, hindered by the slow basis-set convergence of the electron-correlation energy with the one-electron basis. As a remedy, we present approximate explicitly-correlated RPA approaches based on the ring-coupled-cluster-doubles formulation including exchange contributions. Test calculations demonstrate that the basis-set convergence of correlation energies is drastically accelerated through the explicitly-correlated approach, reaching 99% of the basis-set limit with triple-zeta basis sets. When implemented in close analogy to early work by Szabo and Ostlund [36], the new explicitly-correlated ring-coupled-cluster-doubles approach including exchange has the prospect of becoming a valuable tool in the framework of symmetry-adapted perturbation theory (SAPT) for the computation of dispersion energies of molecular complexes of weakly interacting closed-shell systems.
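
    For context on the basis-set convergence problem that the F12 terms address: conventional correlation energies are often extrapolated to the complete-basis-set (CBS) limit with the familiar X^{-3} form shown below (a standard technique in the field, not one introduced by this paper):

```latex
% Two-point X^{-3} extrapolation of the correlation energy with cardinal
% number X of the basis set; explicitly-correlated (F12) approaches such as
% the one above largely remove the need for it.
\begin{align}
  E_{\mathrm{corr}}(X) &\approx E_{\mathrm{corr}}^{\mathrm{CBS}} + A\,X^{-3}, \\
  E_{\mathrm{corr}}^{\mathrm{CBS}} &\approx
    \frac{X^{3}E_{\mathrm{corr}}(X) - Y^{3}E_{\mathrm{corr}}(Y)}{X^{3}-Y^{3}},
    \qquad Y = X - 1.
\end{align}
```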

  7. Multiscale approach including microfibril scale to assess elastic constants of cortical bone based on neural network computation and homogenization method.

    Science.gov (United States)

    Barkaoui, Abdelwahed; Chamekh, Abdessalem; Merzouki, Tarek; Hambli, Ridha; Mkaddem, Ali

    2014-03-01

    The complexity and heterogeneity of bone tissue require multiscale modeling to understand its mechanical behavior and its remodeling mechanisms. In this paper, a novel multiscale hierarchical approach including the microfibril scale, based on hybrid neural network (NN) computation and homogenization equations, was developed to link the nanoscopic and macroscopic scales and to estimate the elastic properties of human cortical bone. The multiscale model is divided into three main phases: (i) in step 0, the elastic constants of the collagen-water and mineral-water composites are calculated by averaging the upper and lower Hill bounds; (ii) in step 1, the elastic properties of the collagen microfibril are computed using a trained NN simulation, where finite element calculations at the nanoscopic level provide the database used to train an in-house NN program; and (iii) in steps 2-10, from fibril to continuum cortical bone tissue, homogenization equations are used to perform the computation at the higher scales. The NN outputs (elastic properties of the microfibril) are used as inputs for the homogenization computation to determine the properties of the mineralized collagen fibril. The mechanical and geometrical properties of the bone constituents (mineral, collagen, and cross-links) as well as the porosity were taken into consideration. This paper aims to predict analytically the effective elastic constants of cortical bone by modeling its elastic response at these different scales, ranging from the nanostructural to the mesostructural level. The outputs of the lowest scale were well integrated with the higher levels and serve as inputs for the modeling at the next higher scale. Good agreement was obtained between our predicted results and literature data. Copyright © 2013 John Wiley & Sons, Ltd.
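
    Step (i) above, averaging the upper and lower Hill bounds, reduces in its simplest scalar form (a simplification of the full stiffness-tensor bounds) to the Voigt-Reuss-Hill estimate sketched below; the moduli are placeholder values, not the paper's inputs.

```python
def hill_average(f1, e1, e2):
    """Hill estimate for a two-phase composite.
    f1: volume fraction of phase 1; e1, e2: phase stiffnesses (GPa)."""
    voigt = f1 * e1 + (1.0 - f1) * e2             # upper bound (uniform strain)
    reuss = 1.0 / (f1 / e1 + (1.0 - f1) / e2)     # lower bound (uniform stress)
    return 0.5 * (voigt + reuss)

# e.g. a mineral-water mixture with placeholder moduli
print("Hill estimate (GPa):", hill_average(f1=0.6, e1=114.0, e2=2.3))
```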

  8. Posture, Musculoskeletal Activities, and Possible Musculoskeletal Discomfort Among Children Using Laptops or Tablet Computers for Educational Purposes: A Literature Review

    Science.gov (United States)

    Binboğa, Elif; Korhan, Orhan

    2014-10-01

    Educational ergonomics focuses on the interaction between educational performance and educational design. By improving the design or pointing out possible problems, educational ergonomics can be utilized to have a positive impact on student performance and thus on the education process. Laptops and tablet computers are becoming widely used by school children and are beginning to be used effectively for educational purposes. As the latest generation of laptops and tablet computers are mobile and lightweight compared to conventional personal computers, they support student-centred, interaction-based learning. However, these technologies have been introduced into schools with minimal adaptation of furniture or attention to ergonomics. There are increasing reports of an association between increased musculoskeletal (MSK) problems in children and the use of such technologies. Although children are among the users of laptops and tablet computers both in their everyday lives and at school, the literature investigating MSK activities and possible MSK discomfort in children using portable technologies is limited. This study reviews the literature to identify published studies that investigated posture, MSK activities, and possible MSK discomfort among children using mobile technologies (laptops or tablet computers) for educational purposes. An electronic search of the literature published in English between January 1994 and January 2014 was performed in several databases; the search terms were identified and combined to search the databases. The search results show that resources investigating MSK outcomes of laptop or tablet computer use by children are very scarce. This review points out the research gaps in this field and identifies areas for future studies.

  9. Posture, Musculoskeletal Activities, and Possible Musculoskeletal Discomfort among Children Using Laptops or Tablet Computers for Educational Purposes: A Literature Review

    Science.gov (United States)

    Binboga, Elif; Korhan, Orhan

    2014-01-01

    Educational ergonomics focuses on the interaction between educational performance and educational design. By improving the design or pointing out the possible problems, educational ergonomics can be utilized to have positive impacts on the student performance and thus on education process. Laptops and tablet computers are becoming widely used by…

  10. New possibilities of three-dimensional reconstruction of computed tomography scans

    International Nuclear Information System (INIS)

    Herman, M.; Tarjan, Z.; Pozzi-Mucelli, R.S.

    1996-01-01

    Three-dimensional (3D) computed tomography (CT) scan reconstructions provide impressive and illustrative images of various parts of the human body. Such images are reconstructed from a series of basic CT scans by dedicated software. The state of the art in 3D computed tomography is demonstrated with emphasis on the imaging of soft tissues. Examples are presented of imaging the craniofacial and maxillofacial complex, central nervous system, cardiovascular system, musculoskeletal system, gastrointestinal and urogenital systems, and respiratory system, and their potential in clinical practice is discussed. Although they contribute no essential new diagnostic information beyond conventional CT scans, 3D reconstructions can help in spatial orientation. 11 figs., 25 refs

  11. Ubiquitous Computing and Changing Pedagogical Possibilities: Representations, Conceptualizations and Uses of Knowledge

    Science.gov (United States)

    Swan, Karen; Van 'T Hooft, Mark; Kratcoski, Annette; Schenker, Jason

    2007-01-01

    This article reports on preliminary findings from an ongoing study of teaching and learning in a ubiquitous computing classroom. The research employed mixed methods and multiple measures to document changes in teaching and learning that result when teachers and students have access to a variety of digital devices wherever and whenever they need…

  12. Movement of the patient and the cone beam computed tomography scanner: objectives and possible solutions

    Czech Academy of Sciences Publication Activity Database

    Hanzelka, T.; Dušek, J.; Ocásek, F.; Kučera, J.; Šedý, Jiří; Beneš, J.; Pavlíková, G.; Foltán, R.

    2013-01-01

    Vol. 116, No. 6 (2013), pp. 769-773, ISSN 2212-4403. Institutional support: RVO:67985823. Keywords: cone beam computed tomography * movement artifacts * dry-run scan. Subject RIV: ED - Physiology. Impact factor: 1.265, year: 2013

  13. Computation of transverse muon-spin relaxation functions including trapping-detrapping reactions, with application to electron-irradiated tantalum

    International Nuclear Information System (INIS)

    Doering, K.P.; Aurenz, T.; Herlach, D.; Schaefer, H.E.; Arnold, K.P.; Jacobs, W.; Orth, H.; Haas, N.; Seeger, A.; Max-Planck-Institut fuer Metallforschung, Stuttgart

    1986-01-01

    A new technique for the economical evaluation of transverse muon spin relaxation functions in situations involving μ+ trapping at and detrapping from crystal defects is applied to electron-irradiated Ta exhibiting relaxation maxima at about 35 K, 100 K, and 250 K. The long-range μ+ diffusion is shown to be limited by traps over the entire temperature range investigated. The (static) relaxation rates for several possible configurations of trapped muons are discussed, including the effect of the simultaneous presence of a proton in a vacancy. (orig.)

  14. On the possibility of study the surface structure of small bio-objects, including fragments of nucleotide chains, by means of electron interference

    Energy Technology Data Exchange (ETDEWEB)

    Namiot, V.A., E-mail: vnamiot@gmail.co [Institute of Nuclear Physics, Moscow State University, Vorobyovy Gory, 119992 Moscow (Russian Federation)

    2009-07-20

    We propose a new method to study the surface of small bio-objects, including macromolecules and their complexes. The method is based on the interference of low-energy electrons. Theoretically, this type of interference may allow the construction of a hologram of a biological object but, unlike an optical hologram, with a spatial resolution of the order of inter-atomic distances. The method makes it possible to construct a series of such holograms at various electron energies. In theory, such information would be enough to identify the types of molecular groups existing on the surface of the studied object. This method could also be used for 'fast reading' of nucleotide chains. It has been shown how to deposit a long linear molecule as a straight line on a substrate before carrying out such 'reading'.

  15. Didactic possibilities of computer communications - the foundation of didactic model of training teachers in remote form

    Directory of Open Access Journals (Sweden)

    Vasilchenko L.V.

    2010-04-01

    Full Text Available The didactic model of the distance form of in-service teacher training is considered. Directions for the creation of a single educational space of continuous education are outlined. The combined didactic possibilities of information and telecommunication technologies make it possible to stimulate the independent educational activity of teachers. The system described helps teachers develop the ability to present information concisely, to create short information messages, and to sort the necessary information according to given criteria. All of this develops the communicative capabilities which play an important role in the development of the personal qualities of the individual.

  16. Possibilities of computed bronchophonography in the diagnosis of external respiratory dysfunction in patients with cystic fibrosis

    Directory of Open Access Journals (Sweden)

    E. B. Pavlinova

    2016-01-01

    Full Text Available The degree of respiratory organ injury in cystic fibrosis determines the prognosis of the disease. Objective: to evaluate external respiratory function in children with cystic fibrosis. The study enrolled 48 children followed up at the Omsk Cystic Fibrosis Center. A control group consisted of 42 non-smoking children with no evidence of respiratory disease in their history. External respiratory function was evaluated using computed bronchophonography; spirography was additionally carried out in children over 6 years of age. Computed bronchophonography revealed obstructive respiratory failure in all children with severe cystic fibrosis. Chronic respiratory tract infection with Pseudomonas aeruginosa and bronchiectasis were associated with higher values of the acoustic work of breathing at frequencies over 5000 Hz. It was established that there was a moderate negative correlation between the value of the acoustic work of breathing in the high-frequency range and the forced expiratory volume in 1 second in % predicted. Conclusion: computed bronchophonography can reveal obstructive external respiratory dysfunction in children less than 6 years of age.

  17. Radiometric installations for automatic control of industrial processes and some possibilities of the specialized computers application

    International Nuclear Information System (INIS)

    Kuzino, S.; Shandru, P.

    1979-01-01

    It is noted that the application of radioisotope devices in circuits for the automation of industrial processes makes it possible to obtain on-line information about some parameters of these processes. When this information is passed to a computer controlling the process, optimum technological parameters of the process can be obtained and maintained. Some elements of designing the automation system are discussed from the point of view of tuning the radiometric devices; calibrating the radiometric devices so as to obtain a digital response on-line with preset accuracy and trustworthiness levels for delivery to the controlling computer; determining the system's reaction on the basis of preset statistical criteria; and developing, from the data obtained from the computer, an algorithm for the functional checking of the radiometric devices' characteristics - stability and reproducibility of readings in the operating regime - as well as determining the threshold value of a response depending on the measured parameter.

  18. Characteristics and possibilities of computer program for fast assessment of primary frequency control of electric power interconnections

    Directory of Open Access Journals (Sweden)

    Ivanović Milan

    2011-01-01

    Full Text Available This paper presents the practical features and possibilities of a computer program for fast assessment of the effects of primary frequency control in electric power interconnections. It is based on two methods. The first is an analytical method, which applies analytical expressions for non-zero initial conditions; the benefits of the analytical form allow consideration of possible structural changes in the power system during the analysis. The second is a simulation method, based on the recurrent application of suitably formulated, fully decoupled difference equations. The capabilities and features of this computer program are demonstrated first for the isolated power system of Serbia, and then for its wider interconnection with the surrounding systems.
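
    A minimal example of the simulation method's recurrent difference equations: a one-machine-equivalent swing equation with governor droop, stepped forward in time after a sudden load increase. All constants are illustrative textbook values, not the program's parameters.

```python
H, D, R = 5.0, 1.0, 0.05   # inertia constant (s), load damping, droop (all per-unit)
Tg = 0.5                   # governor/turbine time constant (s)
dP_load = 0.1              # sudden load increase (p.u.)
dt, T = 0.01, 20.0

f, Pm = 0.0, 0.0           # frequency deviation and mechanical power (p.u.)
for _ in range(int(T / dt)):
    Pm += dt * (-(f / R) - Pm) / Tg              # primary (droop) response
    f += dt * (Pm - dP_load - D * f) / (2 * H)   # swing equation
print("quasi-steady frequency deviation (p.u.):", f)
# analytic check: f_ss = -dP_load / (D + 1/R) = -0.1 / 21 ≈ -0.00476 p.u.
```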

  19. The limits and possibilities of including students from remedial learning programs in regular schooling

    Directory of Open Access Journals (Sweden)

    Clarilza Prado de Sousa

    1999-11-01

    Full Text Available This article analyzes the limits and possibilities for schools to integrate students with schooling deficits who received support from accelerated learning programs into regular educational processes. Based on evaluations of these programs carried out by professors of the Post-Graduate Program in Educational Psychology of PUCSP and by researchers of the Nucleus for Educational Evaluation of the Carlos Chagas Foundation, I discuss the results actually achieved in terms of two analytical categories. In the first category, I analyze the effects of the teaching strategies promoted by the programs on the learning and progress of the participating students. In the second category, I analyze the possibilities for integration/inclusion of these students in the regular educational process. Finally, by way of conclusion, I offer some theoretical-methodological considerations: distinguishing integration from inclusion, I discuss the limits and possibilities of the programs' actions to really promote the development of a school without exclusion.

  20. Sensitivity analysis with regard to variations of physical forcing including two possible future hydrographic regimes for the Oeregrundsgrepen. A follow-up baroclinic 3D-model study

    International Nuclear Information System (INIS)

    Engqvist, A.; Andrejev, O.

    2000-02-01

    A sensitivity analysis with regard to variations of physical forcing has been performed using a 3D baroclinic model of the Oeregrundsgrepen area for a whole-year period with data pertaining to 1992. The results of these variations are compared to a nominal run with unaltered physical forcing. This nominal simulation is based on the experience gained in an earlier whole-year modelling of the same area; the difference is mainly that the present nominal simulation is run with identical parameters for the whole year. From a computational economy point of view it has been necessary to vary the time step between the month-long simulation periods. For all simulations with varied forcing, the same time step as for the nominal run has been used. The analysis also comprises the water turnover of a hypsographically defined subsection, the Bio Model area, located above the SFR depository. The external forcing factors that have been varied are the following (with their found relative impact on the volume average of the retention time of the Bio Model area over one year given within parentheses): atmospheric temperature increased/reduced by 2.5 deg C (-0.1% resp. +0.6%), local freshwater discharge rate doubled/halved (-1.6% resp. +0.01%), salinity range at the border increased/reduced a factor 2 (-0.84% resp. 0.00%), wind speed forcing reduced 10% (+8.6%). The results of these simulations, at least the yearly averages, permit a reasonably direct physical explanation, while the detailed dynamics is for natural reasons more intricate. Two additional full-year simulations of possible future hydrographic regimes have also been performed. The first mimics a hypothetical situation with permanent ice cover, which increases the average retention time 87%. The second regime entails the future hypsography with its anticipated shoreline displacement by an 11 m land-rise in the year 4000 AD, which also considerably increases the average retention times for the two remaining layers of the

  1. Low density in liver of idiopathic portal hypertension. A computed tomographic observation with possible diagnostic significance

    Energy Technology Data Exchange (ETDEWEB)

    Ishito, Hiroyuki

    1988-01-01

    In order to evaluate the diagnostic value of low density in the liver on computed tomography (CT), CT scans of 11 patients with idiopathic portal hypertension (IPH) were compared with those from 22 cirrhotic patients, two patients with scarred liver and 16 normal subjects. Low densities on plain CT scans in patients with IPH were distinctly different from those observed in normal liver. Some of the low densities had an irregular shape with unclear margins and were scattered near the liver surface; others had vessel-like structures with unclear margins and extended close to the liver surface. Ten of the 11 patients with IPH had the low densities described above, while none of the 22 cirrhotic patients had such low densities. The present results suggest that the presence of low densities in the liver on plain CT scan is clinically useful in the diagnosis of IPH.

  2. Primary graft dysfunction; possible evaluation by high resolution computed tomography, and suggestions for a scoring system

    DEFF Research Database (Denmark)

    Belmaati, Esther; Jensen, Claus; Kofoed, Klaus F

    2009-01-01

    Inclusion/exclusion criteria for patients, pilot testing, and training investigators through review of disagreements were suggested as possibilities for decreasing inter/intra-observer variability. Factors affecting the image attenuation (Hounsfield numbers), and thus the reproducibility of CT densitometric measurements, were … of parenchymal change in the lung. HRCT is considered relevant and superior in evaluating disease severity and disease progression, and in evaluating the effects of therapy regimes in the lung. It is, however, not clear to what extent these scoring methods may be implemented for grading PGD. Further efforts could …

  3. Possibilities of computer and magnetic-resonance tomography in liver neoplasm diagnostics

    International Nuclear Information System (INIS)

    Momot, N.V.; Shpak, S.A.

    2003-01-01

    To compare the capabilities of CT and MRI in the diagnosis of focal liver lesions, 238 patients were studied by CT and 38 by MRI. The results were verified by surgery, fine-needle biopsy, or dynamic observation. CT is the method of choice in the diagnosis of focal liver lesions. MRI has some advantages in revealing small metastases and neoplasms located on the diaphragmatic surface of the liver, in evaluating hepatic portal structures, and in assessing a tumor's relation to the surrounding tissues and vessels.

  4. The WECHSL-Mod2 code: A computer program for the interaction of a core melt with concrete including the long term behavior

    International Nuclear Information System (INIS)

    Reimann, M.; Stiefel, S.

    1989-06-01

    The WECHSL-Mod2 code is a mechanistic computer code developed for the analysis of the thermal and chemical interaction of initially molten LWR reactor materials with concrete in a two-dimensional, axisymmetrical concrete cavity. The code performs calculations from the time of initial contact of a hot molten pool, over the start of solidification processes, until long-term basemat erosion over several days, with the possibility of basemat penetration. The code assumes that the metallic phases of the melt pool form a layer at the bottom, overlaid by the oxide melt atop. Heat generation in the melt is by decay heat and chemical reactions from metal oxidation. Energy is lost to the melting concrete and to the upper containment by radiation or by evaporation of sump water possibly flooding the surface of the melt. Thermodynamic and transport properties as well as criteria for heat transfer and solidification processes are internally calculated for each time step. Heat transfer is modelled taking into account the high gas flux from the decomposing concrete and the heat conduction in the crusts possibly forming in the long term at the melt/concrete interface. The WECHSL code in its present version was validated against the BETA experiments. The test samples include a typical BETA post-test calculation and a WECHSL application to a reactor accident. (orig.)
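
    The long-term erosion behaviour such codes compute is governed by an energy balance at the melt/concrete interface. As a back-of-the-envelope sketch only (not a WECHSL model), a quasi-steady erosion rate follows from the net downward heat flux and the effective decomposition enthalpy of concrete; all numbers are placeholders.

```python
q_net = 3.0e4    # net heat flux from melt into concrete (W/m^2), placeholder
rho_c = 2300.0   # concrete density (kg/m^3)
h_dec = 2.0e6    # effective decomposition enthalpy (J/kg), placeholder

erosion_rate = q_net / (rho_c * h_dec)                     # m/s
print("erosion rate: %.2f cm/h" % (erosion_rate * 3.6e5))  # ~2.3 cm/h here
```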

  5. Computational Models Describing Possible Mechanisms for Generation of Excessive Beta Oscillations in Parkinson's Disease.

    Directory of Open Access Journals (Sweden)

    Alex Pavlides

    2015-12-01

    Full Text Available In Parkinson's disease, an increase in beta oscillations within the basal ganglia nuclei has been shown to be associated with difficulty in movement initiation. An important role in the generation of these oscillations is thought to be played by the motor cortex and by a network composed of the subthalamic nucleus (STN) and the external segment of the globus pallidus (GPe). Several alternative models have been proposed to describe the mechanisms for generation of the Parkinsonian beta oscillations. However, a recent experimental study by Tachibana and colleagues yielded results which are challenging for all published computational models of beta generation. That study investigated how the presence of beta oscillations in a primate model of Parkinson's disease is affected by blocking different connections of the STN-GPe circuit. Due to the large number of experimental conditions, the study provides strong constraints that any mechanistic model of beta generation should satisfy. In this paper we present two models consistent with the data of Tachibana et al. The first model assumes that Parkinsonian beta oscillations are generated in the cortex and that the STN-GPe circuit resonates at this frequency. The second model additionally assumes that the feedback from the STN-GPe circuit to the cortex is important for maintaining the oscillations in the network. Predictions are made about the experimental evidence that would be required to differentiate between the two models, both of which are able to reproduce the firing rates, oscillation frequency and effects of lesions reported by Tachibana and colleagues. Furthermore, an analysis of the models reveals how the amplitude and frequency of the generated oscillations depend on parameters.
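
    A minimal firing-rate sketch of the STN-GPe loop with transmission delays, in the spirit of the models compared above (not either model's actual equations): an excitatory STN population and an inhibitory GPe population with constant cortical and striatal drives. With sufficiently strong gains and delays the loop can resonate at beta-band frequencies; all parameters here are illustrative, not fitted.

```python
import numpy as np

dt, T, delay = 1e-4, 2.0, 6e-3       # time step, duration, transmission delay (s)
nd = int(delay / dt)
tau_s, tau_g = 6e-3, 14e-3           # population time constants (s)
w_gs, w_sg, w_gg = 4.0, 5.0, 1.0     # coupling gains (GPe->STN, STN->GPe, GPe->GPe)
ctx, stri = 5.0, 3.0                 # constant cortical / striatal drives

def f(x):                            # sigmoid rate function (spk/s)
    return 100.0 / (1.0 + np.exp(-0.2 * (x - 10.0)))

n = int(T / dt)
S = np.full(n, 20.0)                 # STN rate history (doubles as delay buffer)
G = np.full(n, 40.0)                 # GPe rate history

for k in range(nd, n - 1):
    S[k + 1] = S[k] + dt / tau_s * (-S[k] + f(ctx - w_gs * G[k - nd]))
    G[k + 1] = G[k] + dt / tau_g * (-G[k] + f(w_sg * S[k - nd] - w_gg * G[k] - stri))

# a sustained min/max spread in the last 0.5 s indicates a limit-cycle oscillation
print("STN rate range (spk/s):", S[-5000:].min(), S[-5000:].max())
```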

  6. Hypothermic death: Possibility of diagnosis by post-mortem computed tomography

    International Nuclear Information System (INIS)

    Kawasumi, Yusuke; Onozuka, Naoki; Kakizaki, Ayana; Usui, Akihito; Hosokai, Yoshiyuki; Sato, Miho; Saito, Haruo; Ishibashi, Tadashi; Hayashizaki, Yoshie; Funayama, Masato

    2013-01-01

    Referring to our experience with post-mortem computed tomography (CT), many hypothermic death cases presented a lack of increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and urine retention in the bladder. Thus we evaluated the diagnostic performance of post-mortem CT on hypothermic death based on the above-mentioned three findings. Twenty-four hypothermic death subjects and 53 non-hypothermic death subjects were examined. Two radiologists assessed the presence or lack of an increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and measured urine volume in the bladder. Pearson's chi-square test and Mann–Whitney U-test were used to assess the relationship between the three findings and hypothermic death. The sensitivity, specificity, accuracy, positive predictive value (PPV) and negative predictive value (NPV) of the diagnosis were also calculated. Lack of an increase in lung-field concentration and blood clotting in the heart, thoracic aorta or pulmonary artery were significantly associated with hypothermic death (p = 0.0007, p < 0.0001, respectively). The hypothermic death cases had significantly more urine in the bladder than the non-hypothermic death cases (p = 0.0011). Regarding the diagnostic performance with all three findings, the sensitivity was 29.2% but the specificity was 100%. These three findings were more common in hypothermic death cases. Although the sensitivity was low, these findings will assist forensic physicians in diagnosing hypothermic death since the specificity was high

  7. Hypothermic death: possibility of diagnosis by post-mortem computed tomography.

    Science.gov (United States)

    Kawasumi, Yusuke; Onozuka, Naoki; Kakizaki, Ayana; Usui, Akihito; Hosokai, Yoshiyuki; Sato, Miho; Saito, Haruo; Ishibashi, Tadashi; Hayashizaki, Yoshie; Funayama, Masato

    2013-02-01

    Referring to our experience with post-mortem computed tomography (CT), many hypothermic death cases presented a lack of increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and urine retention in the bladder. Thus we evaluated the diagnostic performance of post-mortem CT on hypothermic death based on the above-mentioned three findings. Twenty-four hypothermic death subjects and 53 non-hypothermic death subjects were examined. Two radiologists assessed the presence or lack of an increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and measured urine volume in the bladder. Pearson's chi-square test and Mann-Whitney U-test were used to assess the relationship between the three findings and hypothermic death. The sensitivity, specificity, accuracy, positive predictive value (PPV) and negative predictive value (NPV) of the diagnosis were also calculated. Lack of an increase in lung-field concentration and blood clotting in the heart, thoracic aorta or pulmonary artery were significantly associated with hypothermic death (p=0.0007, p<0.0001, respectively). The hypothermic death cases had significantly more urine in the bladder than the non-hypothermic death cases (p=0.0011). Regarding the diagnostic performance with all three findings, the sensitivity was 29.2% but the specificity was 100%. These three findings were more common in hypothermic death cases. Although the sensitivity was low, these findings will assist forensic physicians in diagnosing hypothermic death since the specificity was high. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  8. Hypothermic death: Possibility of diagnosis by post-mortem computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kawasumi, Yusuke, E-mail: ssu@rad.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Onozuka, Naoki; Kakizaki, Ayana [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Usui, Akihito, E-mail: t7402r0506@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Hosokai, Yoshiyuki, E-mail: hosokai@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Sato, Miho, E-mail: meifan58@m.tains.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Saito, Haruo, E-mail: hsaito@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Ishibashi, Tadashi, E-mail: tisibasi@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Hayashizaki, Yoshie, E-mail: yoshie@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Funayama, Masato, E-mail: funayama@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan)

    2013-02-15

    Referring to our experience with post-mortem computed tomography (CT), many hypothermic death cases presented a lack of increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and urine retention in the bladder. Thus we evaluated the diagnostic performance of post-mortem CT on hypothermic death based on the above-mentioned three findings. Twenty-four hypothermic death subjects and 53 non-hypothermic death subjects were examined. Two radiologists assessed the presence or lack of an increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and measured urine volume in the bladder. Pearson's chi-square test and Mann–Whitney U-test were used to assess the relationship between the three findings and hypothermic death. The sensitivity, specificity, accuracy, positive predictive value (PPV) and negative predictive value (NPV) of the diagnosis were also calculated. Lack of an increase in lung-field concentration and blood clotting in the heart, thoracic aorta or pulmonary artery were significantly associated with hypothermic death (p = 0.0007, p < 0.0001, respectively). The hypothermic death cases had significantly more urine in the bladder than the non-hypothermic death cases (p = 0.0011). Regarding the diagnostic performance with all three findings, the sensitivity was 29.2% but the specificity was 100%. These three findings were more common in hypothermic death cases. Although the sensitivity was low, these findings will assist forensic physicians in diagnosing hypothermic death since the specificity was high.

  9. Contribution to the algorithmic and efficient programming of new parallel architectures including accelerators for neutron physics and shielding computations

    International Nuclear Information System (INIS)

    Dubois, J.

    2011-01-01

    In science, simulation is a key process for research and validation. Modern computer technology allows faster numerical experiments, which are cheaper than real models. In the field of neutron simulation, the calculation of eigenvalues is one of the key challenges, and the complexity of these problems is such that a lot of computing power may be necessary. The work of this thesis is, first, the evaluation of new computing hardware, such as graphics cards and massively multi-core chips, and its application to eigenvalue problems in neutron simulation. Then, in order to address the massive parallelism of national supercomputers, we also study the use of asynchronous hybrid methods for solving eigenvalue problems at this very high level of parallelism. We experiment with the results of this research on several national supercomputers, such as the Titane hybrid machine of the Computing Center for Research and Technology (CCRT), the Curie machine of the Very Large Computing Centre (TGCC), currently being installed, and the Hopper machine at the Lawrence Berkeley National Laboratory (LBNL). We also run our experiments on local workstations to illustrate the interest of this research for everyday use with local computing resources. (author)
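
    The eigenvalue kernel at the centre of such criticality calculations is, in its simplest form, the power iteration for the dominant eigenpair. The plain-NumPy sketch below, with a random nonnegative matrix standing in for the fission/transport operator, is the serial baseline that the thesis's GPU and asynchronous hybrid methods accelerate.

```python
import numpy as np

def power_iteration(A, tol=1e-10, max_iter=10000):
    """Return the dominant eigenvalue and eigenvector of A by power iteration."""
    x = np.ones(A.shape[0])
    k_old = 0.0
    for _ in range(max_iter):
        y = A @ x
        k = np.linalg.norm(y)        # eigenvalue estimate (k-effective analogue)
        x = y / k
        if abs(k - k_old) < tol:
            break
        k_old = k
    return k, x

rng = np.random.default_rng(1)
A = rng.random((100, 100))           # nonnegative, so a dominant mode exists
k_eff, flux = power_iteration(A)
print("dominant eigenvalue:", k_eff)
```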

  10. Description of the tasks of control room operators in German nuclear power plants and support possibilities by advanced computer systems

    International Nuclear Information System (INIS)

    Buettner, W.E.

    1984-01-01

    In the course of the development of nuclear power plants, the instrumentation and control systems and the information presented in the control room have been increasing substantially. Against this background, it is described which operator tasks might be supported by advanced computer-aided systems, with the main emphasis on safety-related information and diagnosis facilities. Nevertheless, some of these systems under development may be helpful for normal operation modes too. As far as possible, recommendations for the realization and testing of such systems are made. (orig.)

  11. The WECHSL-Mod3 code: A computer program for the interaction of a core melt with concrete including the long term behavior. Model description and user's manual

    International Nuclear Information System (INIS)

    Foit, J.J.; Adroguer, B.; Cenerino, G.; Stiefel, S.

    1995-02-01

    The WECHSL-Mod3 code is a mechanistic computer code developed for the analysis of the thermal and chemical interaction of initially molten reactor materials with concrete in a two-dimensional as well as in a one-dimensional, axisymmetrical concrete cavity. The code performs calculations from the time of initial contact of a hot molten pool, over the start of solidification processes, until long-term basemat erosion over several days, with the possibility of basemat penetration. It is assumed either that an underlying metallic layer exists covered by an oxidic layer atop, or that only one oxidic layer is present which can contain a homogeneously dispersed metallic phase. Heat generation in the melt is by decay heat and chemical reactions from metal oxidation. Energy is lost to the melting concrete and to the upper containment by radiation or by evaporation of sump water possibly flooding the surface of the melt. Thermodynamic and transport properties as well as criteria for heat transfer and solidification processes are internally calculated for each time step. Heat transfer is modelled taking into account the high gas flux from the decomposing concrete and the heat conduction in the crusts possibly forming in the long term at the melt/concrete interface. The CALTHER code (developed at CEA, France), which models the radiative heat transfer from the upper surface of the corium melt to the surrounding cavity, is implemented in the present WECHSL version. The WECHSL code in its present version was validated against the BETA, ACE and SURC experiments. The test samples include a BETA and a SURC2 post-test calculation and a WECHSL application to a reactor accident. (orig.)

  12. Quantum wavepacket ab initio molecular dynamics: an approach for computing dynamically averaged vibrational spectra including critical nuclear quantum effects.

    Science.gov (United States)

    Sumner, Isaiah; Iyengar, Srinivasan S

    2007-10-18

    We have introduced a computational methodology to study vibrational spectroscopy in clusters inclusive of critical nuclear quantum effects. This approach is based on the recently developed quantum wavepacket ab initio molecular dynamics method, which combines quantum wavepacket dynamics with ab initio molecular dynamics. The computational efficiency of the dynamical procedure is drastically improved (by several orders of magnitude) through the utilization of wavelet-based techniques combined with the previously introduced time-dependent deterministic sampling procedure, to achieve stable, picosecond-length, quantum-classical dynamics of electrons and nuclei in clusters. The dynamical information is employed to construct a novel cumulative flux/velocity correlation function, where the wavepacket flux from the quantized particle is combined with classical nuclear velocities to obtain the vibrational density of states. The approach is demonstrated by computing the vibrational density of states of [Cl-H-Cl]-, inclusive of critical quantum nuclear effects, and our results are in good agreement with experiment. A general hierarchical procedure is also provided, based on electronic structure harmonic frequencies, classical ab initio molecular dynamics, computation of nuclear quantum-mechanical eigenstates, and quantum wavepacket ab initio dynamics, for understanding vibrational spectroscopy in hydrogen-bonded clusters that display large degrees of anharmonicity.
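
    In the classical limit, the cumulative flux/velocity correlation function described above reduces to the familiar recipe of Fourier-transforming a velocity autocorrelation function to obtain a vibrational density of states. The sketch below applies that reduced recipe to a synthetic two-mode trajectory; the paper's quantum wavepacket flux ingredient is not reproduced here.

```python
import numpy as np

dt = 1.0e-15                                    # 1 fs time step
t = np.arange(8192) * dt
# Synthetic "velocity" with modes near 667 and 2000 cm^-1, slowly damped.
v = (np.cos(2 * np.pi * 2.0e13 * t) +
     0.5 * np.cos(2 * np.pi * 6.0e13 * t)) * np.exp(-t / 2.0e-12)

vacf = np.correlate(v, v, mode="full")[v.size - 1:]     # velocity autocorrelation
vacf /= vacf[0]
spectrum = np.abs(np.fft.rfft(vacf * np.hanning(vacf.size)))
freq_cm = np.fft.rfftfreq(vacf.size, d=dt) / 2.998e10   # Hz -> cm^-1

print(f"dominant band near {freq_cm[spectrum.argmax()]:.0f} cm^-1")
```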

  13. About possibility of temperature trace observing on a human skin through clothes by using computer processing of IR image

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Shestakov, Ivan L.; Blednov, Roman G.

    2017-05-01

    One of the urgent security problems is the detection of objects placed inside the human body. Obviously, for safety reasons one cannot use X-rays for such object detection widely and often. For this purpose, we propose to use a THz camera and an IR camera. Here we continue to investigate the possibility of using an IR camera for the detection of a temperature trace on a human body. In contrast to a passive THz camera, the IR camera does not show an object under clothing very distinctly, which is a significant disadvantage for security applications based on IR imaging. To find possible ways of overcoming this disadvantage, we performed experiments with an IR camera produced by FLIR and developed a novel approach for computer processing of the images it captures. This allows us to increase the effective temperature resolution of the IR camera and to enhance the effective sensitivity of the human eye, so that it becomes possible to see changes in body temperature through clothing. We analyze IR images of a person who drinks water and eats chocolate, and follow the temperature trace on the skin caused by the temperature change inside the body. Further experiments observe the temperature trace from objects placed behind a thick overall. The demonstrated results are very important for the detection of forbidden objects concealed inside the human body by non-destructive inspection without the use of X-rays.
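
    The abstract does not spell out the image-processing algorithm, but the simplest way to raise the effective temperature resolution is temporal frame averaging plus subtraction of a pre-stimulus reference, which shrinks sensor noise by roughly the square root of the number of frames. The sketch below demonstrates the principle on synthetic frames; it is a plausible minimal version, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
H, W, N = 120, 160, 400
scene = 30.0 + rng.normal(0.0, 0.02, (H, W))           # static background [deg C]
trace = np.zeros((H, W)); trace[50:70, 70:90] = 0.05   # weak 0.05 K temperature trace

noise = lambda: rng.normal(0.0, 0.5, (H, W))           # 0.5 K per-frame sensor noise
ref = np.mean([scene + noise() for _ in range(N)], axis=0)         # pre-stimulus stack
obs = np.mean([scene + trace + noise() for _ in range(N)], axis=0)

diff = obs - ref                                       # noise shrinks ~ 1/sqrt(N)
inside = diff[50:70, 70:90].mean()
outside = np.delete(diff, slice(50, 70), axis=0).std()
print(f"recovered trace {inside:.3f} K vs residual noise {outside:.3f} K per pixel")
```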

  14. VibroCV: a computer vision-based vibroarthrography platform with possible application to Juvenile Idiopathic Arthritis.

    Science.gov (United States)

    Wiens, Andrew D; Prahalad, Sampath; Inan, Omer T

    2016-08-01

    Vibroarthrography, a method for interpreting the sounds emitted by a knee during movement, has been studied for several joint disorders since 1902. However, to our knowledge, the usefulness of this method for management of Juvenile Idiopathic Arthritis (JIA) has not been investigated. To study joint sounds as a possible new biomarker for pediatric cases of JIA we designed and built VibroCV, a platform to capture vibroarthrograms from four accelerometers; electromyograms (EMG) and inertial measurements from four wireless EMG modules; and joint angles from two Sony Eye cameras and six light-emitting diodes with commercially-available off-the-shelf parts and computer vision via OpenCV. This article explains the design of this turn-key platform in detail, and provides a sample recording captured from a pediatric subject.
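
    To make the camera/LED part of such a platform concrete, the sketch below finds bright LED blobs with OpenCV, takes their centroids, and computes a joint angle from three markers. The marker layout and thresholds are assumptions for illustration, not VibroCV's actual configuration.

```python
import cv2
import numpy as np

def led_centroids(gray, thresh=200):
    """Centroids of bright LED blobs in a grayscale frame."""
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    n, _, stats, cents = cv2.connectedComponentsWithStats(mask)
    # Skip label 0 (background); keep blobs of a plausible LED size.
    return [tuple(c) for i, c in enumerate(cents[1:], 1)
            if stats[i, cv2.CC_STAT_AREA] > 3]

def knee_angle(hip, knee, ankle):
    """Angle at the knee marker, in degrees, from three 2-D points."""
    a = np.asarray(hip) - np.asarray(knee)
    b = np.asarray(ankle) - np.asarray(knee)
    cosang = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Synthetic frame with three "LEDs" standing in for hip/knee/ankle markers.
frame = np.zeros((240, 320), np.uint8)
for x, y in [(60, 40), (100, 120), (180, 190)]:
    cv2.circle(frame, (x, y), 3, 255, -1)
pts = sorted(led_centroids(frame), key=lambda p: p[1])   # order top to bottom
print(f"knee angle ~ {knee_angle(*pts):.1f} degrees")
```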

  15. Progress report of Physics Division including Applied Mathematics and Computing Section. 1st April 1970 - 30th September 1970

    International Nuclear Information System (INIS)

    2004-01-01

    Several of the senior staff of the Division have assisted in the assessment of the tenders for the proposed Jervis Bay power station. This has involved studies on light water moderated reactor systems where our experience has been limited. Several of the questions raised by the tenders are considered important and effort on these topics will continue when the assessment is complete. Major effort, other than for the Jervis Bay Project, has been devoted to the improvement of facilities and the construction of the critical facility. Studies relevant to an improved understanding of MOATA have continued to support the proposed power uprating to 100 W. The increasing number of shielding (neutron and gamma) problems referred to the Division has resulted in the procurement of several specialised codes and data libraries. These are now operational on our IBM 360 computer, and several problems are being investigated

  16. Investigating Direct Links between Depression, Emotional Control, and Physical Punishment with Adolescent Drive for Thinness and Bulimic Behaviors, Including Possible Moderation by the Serotonin Transporter 5-HTTLPR Polymorphism.

    Science.gov (United States)

    Rozenblat, Vanja; Ryan, Joanne; Wertheim, Eleanor H; King, Ross; Olsson, Craig A; Krug, Isabel

    2017-01-01

    Objectives: To examine the relationship between psychological and social factors (depression, emotional control, sexual abuse, and parental physical punishment) and adolescent Drive for Thinness and Bulimic behaviors in a large community sample, and to investigate possible genetic moderation. Method: Data were drawn from the Australian Temperament Project (ATP), a population-based cohort study that has followed a representative sample of 2443 participants from infancy to adulthood across 16 waves since 1983. A subsample of 650 participants (50.2% female) of Caucasian descent who provided DNA were genotyped for a serotonin transporter promoter polymorphism (5-HTTLPR). Adolescent disordered eating attitudes and behaviors were assessed using the Bulimia and Drive for Thinness scales of the Eating Disorder Inventory-2 (15-16 years). Depression and emotional control were examined at the same age using the Short Mood and Feelings Questionnaire, and an ATP-devised measure of emotional control. History of sexual abuse and physical punishment were assessed retrospectively (23-24 years) in a subsample of 467 of those providing DNA. Results: EDI-2 scores were associated with depression, emotional control, and retrospectively reported parental physical punishment. Although there was statistically significant moderation of the relationship between parental physical punishment and bulimic behaviors by 5-HTTLPR (p = 0.0048), genotypes in this subsample were not in Hardy-Weinberg Equilibrium. No other G×E interactions were significant. Conclusion: Findings from this study affirm the central importance of psychosocial processes in disordered eating patterns in adolescence. Evidence of moderation by 5-HTTLPR was not conclusive; however, genetic moderation observed in a subsample not in Hardy-Weinberg Equilibrium warrants further investigation.
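
    The Hardy-Weinberg caveat in the results is a standard genotyping quality check. For a biallelic locus such as 5-HTTLPR it amounts to a one-degree-of-freedom chi-square test, sketched below on made-up genotype counts (the study's actual counts are not given in the abstract).

```python
from math import erfc, sqrt

def hwe_chi_square(n_ss, n_sl, n_ll):
    """Chi-square test of Hardy-Weinberg equilibrium at a biallelic locus."""
    n = n_ss + n_sl + n_ll
    p = (2 * n_ss + n_sl) / (2 * n)            # S-allele frequency
    q = 1.0 - p
    expected = (n * p * p, 2 * n * p * q, n * q * q)
    observed = (n_ss, n_sl, n_ll)
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    pval = erfc(sqrt(stat / 2.0))              # chi-square survival function, df = 1
    return stat, pval

# Hypothetical genotype counts for a sample of 650 (S/L alleles as in 5-HTTLPR).
stat, pval = hwe_chi_square(120, 280, 250)
print(f"chi2 = {stat:.2f}, p = {pval:.4f}")    # p < 0.05 -> HWE violated
```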

  17. Investigating Direct Links between Depression, Emotional Control, and Physical Punishment with Adolescent Drive for Thinness and Bulimic Behaviors, Including Possible Moderation by the Serotonin Transporter 5-HTTLPR Polymorphism

    Directory of Open Access Journals (Sweden)

    Vanja Rozenblat

    2017-08-01

    Objectives: To examine the relationship between psychological and social factors (depression, emotional control, sexual abuse, and parental physical punishment) and adolescent Drive for Thinness and Bulimic behaviors in a large community sample, and to investigate possible genetic moderation. Method: Data were drawn from the Australian Temperament Project (ATP), a population-based cohort study that has followed a representative sample of 2443 participants from infancy to adulthood across 16 waves since 1983. A subsample of 650 participants (50.2% female) of Caucasian descent who provided DNA were genotyped for a serotonin transporter promoter polymorphism (5-HTTLPR). Adolescent disordered eating attitudes and behaviors were assessed using the Bulimia and Drive for Thinness scales of the Eating Disorder Inventory-2 (15-16 years). Depression and emotional control were examined at the same age using the Short Mood and Feelings Questionnaire, and an ATP-devised measure of emotional control. History of sexual abuse and physical punishment were assessed retrospectively (23-24 years) in a subsample of 467 of those providing DNA. Results: EDI-2 scores were associated with depression, emotional control, and retrospectively reported parental physical punishment. Although there was statistically significant moderation of the relationship between parental physical punishment and bulimic behaviors by 5-HTTLPR (p = 0.0048), genotypes in this subsample were not in Hardy-Weinberg Equilibrium. No other G×E interactions were significant. Conclusion: Findings from this study affirm the central importance of psychosocial processes in disordered eating patterns in adolescence. Evidence of moderation by 5-HTTLPR was not conclusive; however, genetic moderation observed in a subsample not in Hardy-Weinberg Equilibrium warrants further investigation.

  18. Enteric bacterial metabolites propionic and butyric acid modulate gene expression, including CREB-dependent catecholaminergic neurotransmission, in PC12 cells--possible relevance to autism spectrum disorders.

    Directory of Open Access Journals (Sweden)

    Bistra B Nankova

    Alterations in gut microbiome composition have an emerging role in health and disease, including brain function and behavior. Short chain fatty acids (SCFA) like propionic acid (PPA) and butyric acid (BA), which are present in the diet and are fermentation products of many gastrointestinal bacteria, are of increasing importance in host health, but may also be environmental contributors in neurodevelopmental disorders including autism spectrum disorders (ASD). Further to this, we have shown that SCFA administration to rodents over a variety of routes (intracerebroventricular, subcutaneous, intraperitoneal) or developmental time periods can elicit behavioral, electrophysiological, neuropathological and biochemical effects consistent with findings in ASD patients. SCFA are capable of altering host gene expression, partly due to their histone deacetylase inhibitor activity. We have previously shown that BA can regulate tyrosine hydroxylase (TH) mRNA levels in a PC12 cell model. Since monoamine concentration is known to be elevated in the brain and blood of ASD patients and in many ASD animal models, we hypothesized that SCFA may directly influence brain monoaminergic pathways. When PC12 cells were transiently transfected with plasmids having a luciferase reporter gene under the control of the TH promoter, PPA was found to induce reporter gene activity over a wide concentration range. The CREB transcription factor(s) was necessary for the transcriptional activation of the TH gene by PPA. At lower concentrations PPA also caused accumulation of TH mRNA and protein, indicative of increased cell capacity to produce catecholamines. PPA and BA induced broad alterations in gene expression including neurotransmitter systems, neuronal cell adhesion molecules, inflammation, oxidative stress, lipid metabolism and mitochondrial function, all of which have been implicated in ASD. In conclusion, our data are consistent with a molecular mechanism through which gut related environmental signals

  19. Evaluation and study of advanced optical contamination, deposition, measurement, and removal techniques. [including computer programs and ultraviolet reflection analysis

    Science.gov (United States)

    Linford, R. M. F.; Allen, T. H.; Dillow, C. F.

    1975-01-01

    A program is described to design, fabricate and install an experimental work chamber assembly (WCA) to provide a wide range of experimental capability. The WCA incorporates several techniques for studying the kinetics of contaminant films and their effect on optical surfaces. It incorporates the capability for depositing both optical and contaminant films on temperature-controlled samples, and for in-situ measurements of the vacuum ultraviolet reflectance. Ellipsometer optics are mounted on the chamber for film thickness determinations, and other features include access ports for radiation sources and instrumentation. Several supporting studies were conducted to define specific chamber requirements, to determine the sensitivity of the measurement techniques to be incorporated in the chamber, and to establish procedures for handling samples prior to their installation in the chamber. A bibliography and literature survey of contamination-related articles is included.

  20. Factors affecting the possibility to detect buccal bone condition around dental implants using cone beam computed tomography

    DEFF Research Database (Denmark)

    Liedke, Gabriela S; Spin-Neto, Rubens; da Silveira, Heloisa E D

    2016-01-01

    OBJECTIVES: To evaluate factors with an impact on the conspicuity (possibility to detect) of the buccal bone condition around dental implants in cone beam computed tomography (CBCT) imaging. MATERIAL AND METHODS: Titanium (Ti) or zirconia (Zr) implants and abutments were inserted into 40 bone blocks...... in a way to obtain variable buccal bone thicknesses. Three combinations regarding the implant-abutment metal (TiTi, TiZr, or ZrZr) and the number of implants (one, two, or three) were assessed. Two CBCT units (Scanora 3D - Sc and Cranex 3D - Cr) and two voxel resolutions (0.2 and 0.13 mm) were used...... variable. Odds ratios (OR) were calculated separately for each CBCT unit. RESULTS: The implant-abutment combination (ZrZr) (OR Sc = 19.18, OR Cr = 11.89) and the number of implants (3) (OR Sc = 12.10, OR Cr = 4.25) had the greatest impact on buccal bone conspicuity. The thinner the buccal bone, the higher the risk...

  1. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in preparation of the large samples needed for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  2. POSSIBILITIES OF COMPUTED TOMOGRAPHY AND MAGNETIC RESONANCE IMAGING IN FORENSIC MEDICAL EXAMINATION OF MECHANICAL TRAUMA AND SUDDEN DEATH (A LITERATURE REVIEW

    Directory of Open Access Journals (Sweden)

    L. S. Kokov

    2015-01-01

    The review analyzes the possibilities of multislice computed tomography (MSCT) and magnetic resonance imaging (MRI) in the forensic examination of corpses of adults. We present a critical analysis of the literature on post-mortem imaging in terms of forensic thanatology. The review is based on basic Internet resources: Scientific Electronic Library (elibrary), Scopus, PubMed. The review includes articles that discuss both advantages and limitations of post-mortem MSCT and MRI imaging in forensic examination of the corpse. Through studying the available literature, the authors attempted to answer two questions: (1) which method is more suitable for the purposes of forensic examination of the corpse, MSCT or MRI; and (2) whether virtual autopsy will replace traditional autopsy in the near future. Conclusion: a comprehensive study of the corpse often requires both imaging methods; in cases of death from mechanical damage, MSCT exceeds the range of possibilities of MRI; at present, virtual autopsy cannot completely replace traditional autopsy in forensic science, since there are no convincing evidence-based comparative studies, as well as no legal framework for the method.

  3. Computational Search for Two-Dimensional MX2 Semiconductors with Possible High Electron Mobility at Room Temperature

    Directory of Open Access Journals (Sweden)

    Zhishuo Huang

    2016-08-01

    Neither of the two typical two-dimensional materials, graphene and single-layer MoS2, is good enough for developing semiconductor logic devices. We calculated the electron mobility of 14 two-dimensional semiconductors with composition MX2, where M (= Mo, W, Sn, Hf, Zr and Pt) are transition metals and X is S, Se or Te. We approximated the electron-phonon scattering matrix by deformation potentials, within which long-wave longitudinal acoustic and optical phonon scattering was included. Piezoelectric scattering in the compounds without inversion symmetry is also taken into account. We found that out of the 14 compounds, WS2, PtS2 and PtSe2 are promising for logic devices regarding their possibly high electron mobility and finite band gap. In particular, the phonon-limited electron mobility in PtSe2 reaches about 4000 cm²·V⁻¹·s⁻¹ at room temperature, which is the highest among the compounds with an indirect bandgap of about 1.25 eV under the local density approximation. Our results can serve as a first guide for experiments to synthesize better two-dimensional materials for future semiconductor devices.
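
    In two dimensions the deformation-potential approximation mentioned above has a well-known closed form (a Takagi-type formula), which makes the scale of these mobilities easy to reproduce. The sketch below evaluates it with order-of-magnitude, MoS2-like parameter values chosen for illustration; they are not the paper's computed inputs.

```python
# 2-D deformation-potential (Takagi-type) electron mobility:
#   mu = e * hbar^3 * C2d / (kB * T * m_eff * m_d * E1^2)
# Parameter values below are illustrative, not the paper's results.
e    = 1.602e-19        # elementary charge [C]
hbar = 1.055e-34        # reduced Planck constant [J s]
kB   = 1.381e-23        # Boltzmann constant [J/K]
m0   = 9.109e-31        # electron rest mass [kg]

def mobility_2d(C2d, E1_eV, m_eff, m_d=None, T=300.0):
    """Phonon-limited 2-D mobility in cm^2 V^-1 s^-1.

    C2d: 2-D elastic modulus [N/m]; E1_eV: deformation potential [eV];
    m_eff, m_d: effective and density-of-states masses in units of m0.
    """
    m_d = m_eff if m_d is None else m_d
    E1 = E1_eV * e
    mu = e * hbar**3 * C2d / (kB * T * (m_eff * m0) * (m_d * m0) * E1**2)
    return mu * 1e4                        # m^2/(V s) -> cm^2/(V s)

# Hypothetical monolayer parameters, roughly MoS2-like in magnitude.
print(f"mu ~ {mobility_2d(C2d=130.0, E1_eV=5.0, m_eff=0.45):.0f} cm2/Vs")
```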

  4. Use of computational fluid dynamics codes for safety analysis of nuclear reactor systems, including containment. Summary report of a technical meeting

    International Nuclear Information System (INIS)

    2003-11-01

    Safety analysis is an important tool for justifying the safety of nuclear power plants. Typically, this type of analysis is performed by means of system computer codes with one dimensional approximation for modelling real plant systems. However, in the nuclear area there are issues for which traditional treatment using one dimensional system codes is considered inadequate for modelling local flow and heat transfer phenomena. There is therefore increasing interest in the application of three dimensional computational fluid dynamics (CFD) codes as a supplement to or in combination with system codes. There are a number of both commercial (general purpose) CFD codes as well as special codes for nuclear safety applications available. With further progress in safety analysis techniques, the increasing use of CFD codes for nuclear applications is expected. At present, the main objective with respect to CFD codes is generally to improve confidence in the available analysis tools and to achieve a more reliable approach to safety relevant issues. An exchange of views and experience can facilitate and speed up progress in the implementation of this objective. Both the International Atomic Energy Agency (IAEA) and the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA) believed that it would be advantageous to provide a forum for such an exchange. Therefore, within the framework of the Working Group on the Analysis and Management of Accidents of the NEA's Committee on the Safety of Nuclear Installations, the IAEA and the NEA agreed to jointly organize the Technical Meeting on the Use of Computational Fluid Dynamics Codes for Safety Analysis of Reactor Systems, including Containment. The meeting was held in Pisa, Italy, from 11 to 14 November 2002. The publication constitutes the report of the Technical Meeting. It includes short summaries of the presentations that were made and of the discussions as well as conclusions and

  5. Detection of a possible epilepsy focus in a preoperated patient by perfusion SPECT and computer-aided subtraction analysis

    International Nuclear Information System (INIS)

    Apostolova, I.; Wilke, F.; Clausen, M.; Buchert, R.; Lindenau, M.; Stodieck, S.; Fiehler, J.; Heese, O.

    2008-01-01

    Ictal perfusion SPECT with either 99mTc-hexamethylpropylene amine oxime (HMPAO) or 99mTc-ethylcysteinate dimer (ECD) has been reported to provide very good sensitivity for the determination of the SOA in fTLE. Sensitivity of interictal perfusion SPECT is much lower. However, interictal perfusion SPECT might enhance the specificity of ictal SPECT findings by improving the discrimination between seizure related local hyperperfusion and intersubject variability of perfusion (physiologic hyperperfusion). In addition, the combination of interictal and ictal perfusion SPECT might provide improved sensitivity compared to ictal SPECT alone, particularly when computer-aided subtraction of ictal and interictal SPECT is used instead of traditional side-by-side visual comparison. The combination of ictal and interictal perfusion SPECT eliminates not only physiological inter-subject variance, but it can also eliminate severe partial volume effects. Computer-aided subtraction analysis appears particularly useful in this case
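
    Computer-aided subtraction of ictal and interictal SPECT is commonly done SISCOM-style: normalize the global counts of the two co-registered volumes, subtract, and threshold the difference in units of its standard deviation. The sketch below shows this on synthetic volumes; real use requires prior spatial co-registration, and the threshold here is stricter than the classical 2 SD so that the toy output stays clean.

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (32, 32, 32)
interictal = 100.0 + rng.normal(0.0, 5.0, shape)
ictal = interictal * 1.10 + rng.normal(0.0, 5.0, shape)  # global 10% count change
ictal[10:14, 10:14, 10:14] += 40.0                       # seizure-related hyperperfusion

# Global count normalization removes the injected-dose difference.
ictal_n = ictal * (interictal.mean() / ictal.mean())
diff = ictal_n - interictal
z = (diff - diff.mean()) / diff.std()

focus = np.argwhere(z > 4.0)        # candidate focus voxels (strict 4 SD cut)
print(f"{len(focus)} voxels flagged, centroid {focus.mean(axis=0).round(1)}")
```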

  6. Detection of a possible epilepsy focus in a preoperated patient by perfusion SPECT and computer-aided subtraction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Apostolova, I.; Wilke, F.; Clausen, M.; Buchert, R. [Univ. Medical Center Hamburg-Eppendorf (Germany). Dept. of Nuclear Medicine; Lindenau, M.; Stodieck, S. [Protestant Hospital Alsterdorf, Hamburg (Germany). Dept. of Neurology and Epileptology; Fiehler, J. [Univ. Medical Center Hamburg-Eppendorf (Germany). Dept. of Neuroradiology; Heese, O. [Univ. Medical Center Hamburg-Eppendorf (Germany). Neurological Surgery

    2008-07-01

    Ictal perfusion SPECT with either {sup 99m}Tc-hexamethylpropylene amine oxime (HMPAO) or {sup 99m}Tc-ethylcysteinate dimer (ECD) has been reported to provide very good sensitivity for the determination of the SOA in fTLE. Sensitivity of interictal perfusion SPECT is much lower. However, interictal perfusion SPECT might enhance the specificity of ictal SPECT findings by improving the discrimination between seizure related local hyperperfusion and intersubject variability of perfusion (physiologic hyperperfusion). In addition, the combination of interictal and ictal perfusion SPECT might provide improved sensitivity compared to ictal SPECT alone, particularly when computer-aided subtraction of ictal and interictal SPECT is used instead of traditional side-by-side visual comparison. The combination of ictal and interictal perfusion SPECT eliminates not only physiological inter-subject variance, but it can also eliminate severe partial volume effects. Computer-aided subtraction analysis appears particularly useful in this case.

  7. Retroperitoneal Endometriosis: A Possible Cause of False Positive Finding at 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography

    International Nuclear Information System (INIS)

    Maffione, Anna Margherita; Panzavolta, Riccardo; Lisato, Laura Camilla; Ballotta, Maria; D'Isanto, Mariangela Zanforlini; Rubello, Domenico

    2015-01-01

    Endometriosis is a frequent and clinically relevant problem in young women. Laparoscopy is still the gold standard for the diagnosis of endometriosis, but frequently both morphologic and functional imaging techniques are involved in the diagnostic course before achieving a conclusive diagnosis. We present a case of a patient affected by infiltrating retroperitoneal endometriosis falsely interpreted as a malignant mass by contrast-enhanced magnetic resonance imaging and 18F-fluorodeoxyglucose positron emission tomography/computed tomography

  8. The possibility of coexistence and co-development in language competition: ecology-society computational model and simulation.

    Science.gov (United States)

    Yun, Jian; Shang, Song-Chao; Wei, Xiao-Dan; Liu, Shuang; Li, Zhi-Jie

    2016-01-01

    Language is characterized by both ecological properties and social properties, and competition is the basic form of language evolution. The rise and decline of one language is a result of competition between languages. Moreover, this rise and decline directly influences the diversity of human culture. Mathematics and computer modeling for language competition has been a popular topic in the fields of linguistics, mathematics, computer science, ecology, and other disciplines. Currently, there are several problems in the research on language competition modeling. First, comprehensive mathematical analysis is absent in most studies of language competition models. Next, most language competition models are based on the assumption that one language in the model is stronger than the other. These studies tend to ignore cases where there is a balance of power in the competition. The competition between two well-matched languages is more practical, because it can facilitate the co-development of two languages. A third issue with current studies is that many studies have an evolution result where the weaker language inevitably goes extinct. From the integrated point of view of ecology and sociology, this paper improves the Lotka-Volterra model and basic reaction-diffusion model to propose an "ecology-society" computational model for describing language competition. Furthermore, a strict and comprehensive mathematical analysis was made for the stability of the equilibria. Two languages in competition may be either well-matched or greatly different in strength, which was reflected in the experimental design. The results revealed that language coexistence, and even co-development, are likely to occur during language competition.
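
    The baseline the paper improves upon is the classical Lotka-Volterra competition system. The sketch below integrates that baseline with illustrative coefficients for two "well-matched" languages (competition coefficients with a12*a21 < 1), for which the system settles into the stable coexistence equilibrium the paper is interested in.

```python
# Lotka-Volterra competition between two languages' speaker fractions x and y.
# Coefficients are illustrative, not from the paper's "ecology-society" model.
def step(x, y, dt=0.01, r1=0.8, r2=0.8, K1=1.0, K2=1.0, a12=0.5, a21=0.5):
    dx = r1 * x * (1 - (x + a12 * y) / K1)
    dy = r2 * y * (1 - (y + a21 * x) / K2)
    return x + dt * dx, y + dt * dy

x, y = 0.9, 0.1                      # language x starts dominant
for _ in range(20_000):              # explicit Euler up to t = 200
    x, y = step(x, y)
print(f"coexistence equilibrium: x = {x:.3f}, y = {y:.3f}")
# Analytic fixed point for these symmetric coefficients: x* = y* = 1/(1 + 0.5) = 0.667
```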

  9. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  11. NAIAD - a computer program for calculation of the steady state and transient behaviour (including LOCA) of compressible two-phase coolant in networks

    International Nuclear Information System (INIS)

    Trimble, G.D.; Turner, W.J.

    1976-04-01

    The three one-dimensional conservation equations of mass, momentum and energy are solved by a stable finite difference scheme which allows the time step to be varied in response to accuracy requirements. Consideration of numerical stability is not necessary. Slip between the phases is allowed and descriptions of complex hydraulic components can be added into specially provided user routines. Intrinsic choking using any of the nine slip models is possible. A pipe or fuel model and detailed surface heat transfer are included. (author)
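
    The abstract gives only the outline of the scheme, but the core idea (a one-dimensional conservation law advanced by finite differences, with the time step adapted to an accuracy estimate) can be illustrated generically. The sketch below does this for simple upwind advection of mass with step-size control; it is not NAIAD's actual discretization.

```python
import numpy as np

N, L, u = 200, 1.0, 1.0                  # cells, domain length, flow speed
dx = L / N
x = np.linspace(0.0, L, N)
rho = np.where(np.abs(x - 0.3) < 0.1, 2.0, 1.0)   # density with a square pulse

t, t_end, dt = 0.0, 0.4, 1e-4
while t < t_end:
    flux = u * rho                        # first-order upwind flux (u > 0), periodic
    drho = -(flux - np.roll(flux, 1)) / dx
    max_rel = np.max(np.abs(drho) * dt / rho)
    if max_rel > 0.05:                    # accuracy control: shrink the step
        dt *= 0.5
        continue
    rho = rho + dt * drho
    t += dt
    if max_rel < 0.01:                    # grow the step, but respect CFL
        dt = min(dt * 1.2, 0.9 * dx / u)

print(f"mean density {rho.mean():.6f} (initial 1.200, conserved exactly)")
```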

  12. The possibilities of spiral computed tomography in evaluating of pathologic changes of the inner surface of the stomach

    International Nuclear Information System (INIS)

    Pomakov, P.; Mlachkova, D.; Naraliev, V.; Lyutckanova, E.

    2006-01-01

    Full text: The aim of this presentation is to determine the possibilities of spiral CT for evaluating pathologic changes affecting predominantly the stomach mucosa. 20 patients with the preliminary clinical diagnosis 'cancer of the stomach' were examined. The patients had different complaints. Gas CT with medical myorelaxation of the stomach was performed. The stomach cavity was inflated by swallowing three effervescent vitamin C 500 mg tablets with a little water. After the native CT examination, a contrast series was performed in the arterial phase, 20 min after the beginning of the injection of 100 ml of nonionic contrast. 16 patients had no pathologic changes in the stomach. In the remaining patients the following were established: lymphoma of the stomach in 1 patient, stage I carcinoma in 1, polyps in 1, and erythemato-erosive gastritis in 1. The CT findings were compared with the histologic ones. Gas CT in the arterial contrast phase is capable of evaluating changes affecting predominantly the inner surface (mucosa) of the stomach

  13. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  14. Regional cerebral blood flow studies with single photon emission computed tomography (SPECT); Clinical experiences, possibilities

    Energy Technology Data Exchange (ETDEWEB)

    Pavics, Laszlo; Csernay, Laszlo; Doczi, Tamas; Lang, Jenoe; Blaho, Gabor; Janka, Zoltan; Bodosi, Mihaly [Szegedi Orvostudomanyi Egyetem, Szeged (Hungary)

    1990-01-07

    Clinical experiences based on regional cerebral blood flow investigations with {sup 99m}Tc hexamethylpropyleneamin-oxime (HMPAO) SPECT in 164 patients are reported. The pharmacokinetics of the {sup 99m}Tc HMPAO are summarized, and the important indications of the investigations are interpreted in case reports (stroke, surgical solution of intracavernous aneurysm, Alzheimer and multiinfarct types of dementia). The literature data suggest that the diagnostic possibilities with this method are advantageous, even in other diseases. (author) 36 refs.; 7 figs.

  15. 3-dimensional magnetotelluric inversion including topography using deformed hexahedral edge finite elements and direct solvers parallelized on symmetric multiprocessor computers - Part II: direct data-space inverse solution

    Science.gov (United States)

    Kordy, M.; Wannamaker, P.; Maris, V.; Cherkaev, E.; Hill, G.

    2016-01-01

    Following the creation described in Part I of a deformable edge finite-element simulator for 3-D magnetotelluric (MT) responses using direct solvers, in Part II we develop an algorithm named HexMT for 3-D regularized inversion of MT data including topography. Direct solvers parallelized on large-RAM, symmetric multiprocessor (SMP) workstations are used also for the Gauss-Newton model update. By exploiting the data-space approach, the computational cost of the model update becomes much less in both time and computer memory than the cost of the forward simulation. In order to regularize using the second norm of the gradient, we factor the matrix related to the regularization term and apply its inverse to the Jacobian, which is done using the MKL PARDISO library. For dense matrix multiplication and factorization related to the model update, we use the PLASMA library which shows very good scalability across processor cores. A synthetic test inversion using a simple hill model shows that including topography can be important; in this case depression of the electric field by the hill can cause false conductors at depth or mask the presence of resistive structure. With a simple model of two buried bricks, a uniform spatial weighting for the norm of model smoothing recovered more accurate locations for the tomographic images compared to weightings which were a function of parameter Jacobians. We implement joint inversion for static distortion matrices tested using the Dublin secret model 2, for which we are able to reduce nRMS to ~1.1 while avoiding oscillatory convergence. Finally we test the code on field data by inverting full impedance and tipper MT responses collected around Mount St Helens in the Cascade volcanic chain. Among several prominent structures, the north-south trending, eruption-controlling shear zone is clearly imaged in the inversion.
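
    The data-space approach credited here with the cost reduction rests on a matrix identity: for a linearized problem, the model-space normal equations (an M x M solve in the number of parameters) and the data-space form (an N x N solve in the number of data) give the same update. The toy sketch below verifies this with a random Jacobian and a simple lambda*I regularization, a simplification of HexMT's gradient-norm regularizer.

```python
import numpy as np

# Data-space vs model-space Gauss-Newton step for a linear(ized) problem
#   d = J m + noise, with N data << M parameters.
rng = np.random.default_rng(3)
N, M, lam = 50, 2000, 1e-2
J = rng.normal(size=(N, M))
d = rng.normal(size=N)

# Model-space step: (J^T J + lam I) dm = J^T d        -- M x M solve
dm_model = np.linalg.solve(J.T @ J + lam * np.eye(M), J.T @ d)

# Data-space step: dm = J^T (J J^T + lam I)^{-1} d    -- N x N solve
dm_data = J.T @ np.linalg.solve(J @ J.T + lam * np.eye(N), d)

print(np.allclose(dm_model, dm_data))   # True: identical update, far smaller solve
```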

  16. Soft computing model for optimized siRNA design by identifying off target possibilities using artificial neural network model.

    Science.gov (United States)

    Murali, Reena; John, Philips George; Peter S, David

    2015-05-15

    The ability of small interfering RNA (siRNA) to effect posttranscriptional gene regulation by knocking down targeted genes is an important research topic in functional genomics, biomedical research and cancer therapeutics. Many tools have been developed to design exogenous siRNA with high experimental inhibition. Even though a considerable amount of work has been done on designing exogenous siRNA, the design of effective siRNA sequences is still challenging, because the target mRNAs must be selected such that their corresponding siRNAs are likely to be efficient against that target and unlikely to accidentally silence other transcripts due to sequence similarity. In some cases, siRNAs may tolerate mismatches with the target mRNA, but knockdown of genes other than the intended target could have serious consequences. Hence, to design siRNAs, two important concepts must be considered: the ability to knock down target genes and the off-target possibility on any nontarget genes. So before doing gene silencing with siRNAs, it is essential to analyze their off-target effects in addition to their inhibition efficacy against a particular target. Only a few methods have been developed that consider both the efficacy and the off-target possibility of an siRNA against a gene. In this paper we present a new neural network model incorporating whole stacking energy (ΔG) that is able to identify the efficacy and off-target effect of siRNAs against target genes. The tool lists all siRNAs against a particular target with their inhibition efficacy and number of matches or sequence similarity with other genes in the database. We achieve an excellent performance of Pearson Correlation Coefficient (R = 0.74) and Area Under Curve (AUC = 0.906) when the threshold of whole stacking energy is ≥ -34.6 kcal/mol. To the best of the authors' knowledge, this is one of the best scores for the combined efficacy and off-target possibility of siRNA for silencing a gene. The proposed model
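
    As a structural illustration of such a model, the sketch below one-hot encodes a 19-mer siRNA, appends the whole stacking energy as an extra input (scaled by the abstract's -34.6 kcal/mol threshold), and runs a small feedforward network. The architecture and weights are invented and untrained, so the output is meaningless; only the feature layout and forward pass are the point.

```python
import numpy as np

rng = np.random.default_rng(0)

def featurize(sirna, dG):
    """One-hot 19-mer sequence features plus a scaled stacking-energy input."""
    idx = ["ACGU".index(c) for c in sirna]
    onehot = np.zeros((len(sirna), 4))
    onehot[np.arange(len(sirna)), idx] = 1.0
    # -34.6 kcal/mol is the threshold quoted in the abstract, used here as a scale.
    return np.concatenate([onehot.ravel(), [dG / -34.6]])

# Untrained, randomly initialized network: 77 inputs -> 16 hidden -> 1 output.
W1 = rng.normal(0.0, 0.1, (77, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.1, 16);       b2 = 0.0

def predict(x):
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # efficacy score in (0, 1)

x = featurize("GAUCGAUCGAUCGAUCGAU", -36.2)       # hypothetical siRNA and dG
print(f"forward-pass efficacy score: {predict(x):.2f}")
```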

  17. Evaluating Computer Screen Time and Its Possible Link to Psychopathology in the Context of Age: A Cross-Sectional Study of Parents and Children.

    Science.gov (United States)

    Segev, Aviv; Mimouni-Bloch, Aviva; Ross, Sharon; Silman, Zmira; Maoz, Hagai; Bloch, Yuval

    2015-01-01

    Several studies have suggested that high levels of computer use are linked to psychopathology. However, there is ambiguity about what should be considered normal or over-use of computers. Furthermore, the nature of the link between computer usage and psychopathology is controversial. The current study utilized the context of age to address these questions. Our hypothesis was that the context of age will be paramount for differentiating normal from excessive use, and that this context will allow a better understanding of the link to psychopathology. In a cross-sectional study, 185 parents and children aged 3-18 years were recruited in clinical and community settings. They were asked to fill out questionnaires regarding demographics, functional and academic variables, computer use as well as psychiatric screening questionnaires. Using a regression model, we identified 3 groups of normal-use, over-use and under-use and examined known factors as putative differentiators between the over-users and the other groups. After modeling computer screen time according to age, factors linked to over-use were: decreased socialization (OR 3.24, Confidence interval [CI] 1.23-8.55, p = 0.018), difficulty to disengage from the computer (OR 1.56, CI 1.07-2.28, p = 0.022) and age, though borderline-significant (OR 1.1 each year, CI 0.99-1.22, p = 0.058). While psychopathology was not linked to over-use, post-hoc analysis revealed that the link between increased computer screen time and psychopathology was age-dependent and solidified as age progressed (p = 0.007). Unlike computer usage, the use of small-screens and smartphones was not associated with psychopathology. The results suggest that computer screen time follows an age-based course. We conclude that differentiating normal from over-use as well as defining over-use as a possible marker for psychiatric difficulties must be performed within the context of age. If verified by additional studies, future research should integrate

  18. Olfactory neuroblastoma: the long-term outcome and late toxicity of multimodal therapy including radiotherapy based on treatment planning using computed tomography

    International Nuclear Information System (INIS)

    Mori, Takashi; Onimaru, Rikiya; Onodera, Shunsuke; Tsuchiya, Kazuhiko; Yasuda, Koichi; Hatakeyama, Hiromitsu; Kobayashi, Hiroyuki; Terasaka, Shunsuke; Homma, Akihiro; Shirato, Hiroki

    2015-01-01

    Olfactory neuroblastoma (ONB) is a rare tumor originating from olfactory epithelium. Here we retrospectively analyzed the long-term treatment outcomes and toxicity of radiotherapy for ONB patients for whom computed tomography (CT) and three-dimensional treatment planning were conducted, to reappraise the role of radiotherapy in the light of recent advances in technology and chemotherapy. Seventeen patients with ONB treated between July 1992 and June 2013 were included. Three patients were Kadish stage B and 14 were stage C. All patients were treated with radiotherapy with or without surgery or chemotherapy. The radiation dose ranged from 50 Gy to 66 Gy, except for one patient who received 40 Gy preoperatively. The median follow-up time was 95 months (range 8–173 months). The 5-year overall survival (OS) and relapse-free survival (RFS) rates were estimated at 88% and 74%, respectively. Five patients with stage C disease had recurrence, with a median time to recurrence of 59 months (range 7–115 months). Late adverse events equal to or above Grade 2 in CTCAE v4.03 were observed in three patients. Multimodal therapy including radiotherapy with precise treatment planning based on CT simulation achieved an excellent local control rate with acceptable toxicity and reasonable overall survival for patients with ONB

  19. DIAGNOSTIC POSSIBILITIES OF 3D-COMPUTED TOMOGRAPHY WITH INTRALESIONAL APPLICATION OF CONTRAST MATERIAL IN A CASE OF VERY LARGE RADICULAR MAXILLARY CYST - A CASE REPORT

    Directory of Open Access Journals (Sweden)

    Galina Gavazova

    2017-09-01

    Introduction: The diagnosis of odontogenic cysts, despite their benign nature, is a critical and challenging problem. Aim: The aim of this article is to demonstrate a different diagnostic approach in a case of a very large odontogenic cyst. Materials and Methods: The study was performed on one male patient aged 38, using 3D computed tomography with contrast material inside the lesion. The differential diagnosis made by the residents was compared to the histopathological examination as the gold standard for identifying the nature of the cyst. Results: This diagnostic approach, using 3D computed tomography combined with contrast material injected inside the lesion, shows the real borders of the cyst of the maxilla and helps the oral surgeon in planning the volume of the surgical intervention. Conclusion: A precise diagnosis ensures the possibility of performing the optimal surgical intervention - a precondition for the best wound healing.

  20. Intravenous Contrast Medium Administration for Computed Tomography Scan in Emergency: A Possible Cause of Contrast-Induced Nephropathy

    International Nuclear Information System (INIS)

    Sonhaye, Lantam; Kolou, Bérésa; Tchaou, Mazamaesso; Amadou, Abdoulatif; Assih, Kouméabalo; N'Timon, Bidamin; Adambounou, Kokou; Agoda-Koussema, Lama; Adjenou, Komlavi; N'Dakena, Koffi

    2015-01-01

    The goal of this study was to assess the risk for CIN after CT scanning in the emergency setting and to identify patient risk factors. We prospectively reviewed all patients admitted to the emergency room (ER) of the Teaching Hospital of Lomé (Togo) during a 2-year period. CIN was defined as an increase in serum creatinine by 0.5 mg/dL from admission after undergoing a CT scan with intravenous contrast. A total of 620 patients underwent a CT scan in the emergency room using intravenous contrast and 672 patients underwent a CT scan without intravenous contrast. Of the patients who received intravenous contrast, three percent developed CIN during their admission. Upon discharge, no patient had persistent renal impairment, and no patient required dialysis during admission. The multivariate analysis of all patients who had serial creatinine levels (including those who did not receive any contrast load) shows no increased risk for acute kidney injury associated with intravenous contrast (odds ratio = 0.619, p value = 0.886); only diabetes remains an independent risk factor for acute kidney injury (odds ratio = 6.26, p value = 0.031)

  1. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in every way they can. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, in order to increase the availability of sites so that they can participate in CSA07 and be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  3. Three-dimensional magnetotelluric inversion including topography using deformed hexahedral edge finite elements, direct solvers and data space Gauss-Newton, parallelized on SMP computers

    Science.gov (United States)

    Kordy, M. A.; Wannamaker, P. E.; Maris, V.; Cherkaev, E.; Hill, G. J.

    2014-12-01

    We have developed an algorithm for 3D simulation and inversion of magnetotelluric (MT) responses using deformable hexahedral finite elements that permits incorporation of topography. Direct solvers parallelized on symmetric multiprocessor (SMP), single-chassis workstations with large RAM are used for the forward solution, parameter Jacobians, and model update. The forward simulator, Jacobian calculations, as well as synthetic and real data inversion are presented. We use first-order edge elements to represent the secondary electric field (E), yielding accuracy O(h) for E and its curl (magnetic field). For very low frequency or small material admittivity, the E-field requires divergence correction. Using Hodge decomposition, the correction may be applied after the forward solution is calculated, allowing accurate E-field solutions in dielectric air. The system matrix factorization is computed using the MUMPS library, which shows moderately good scalability through 12 processor cores but limited gains beyond that. The factored matrix is used to calculate the forward response as well as the Jacobians of field and MT responses using the reciprocity theorem. Comparison with other codes demonstrates the accuracy of our forward calculations. We consider a popular conductive/resistive double-brick structure and several topographic models. In particular, the ability of finite elements to represent smooth topographic slopes permits accurate simulation of refraction of electromagnetic waves normal to the slopes at high frequencies. Run-time tests indicate that for meshes as large as 150x150x60 elements, the MT forward response and Jacobians can be calculated in ~2.5 hours per frequency. For inversion, we implemented a data-space Gauss-Newton method, which offers a reduction in memory requirements and a significant speedup of the parameter step versus the model-space approach. For dense matrix operations we use the tiling approach of the PLASMA library, which shows very good scalability. In synthetic

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  7. CASKS (Computer Analysis of Storage casKS): A microcomputer based analysis system for storage cask design review. User's manual to Version 1b (including program reference)

    International Nuclear Information System (INIS)

    Chen, T.F.; Gerhard, M.A.; Trummer, D.J.; Johnson, G.L.; Mok, G.C.

    1995-02-01

    CASKS (Computer Analysis of Storage casKS) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent-fuel storage casks. The bulk of the complete program and this user's manual are based upon the SCANS (Shipping Cask ANalysis System) program previously developed at LLNL. A number of enhancements and improvements were added to the original SCANS program to meet requirements unique to storage casks. CASKS is an easy-to-use system that calculates global response of storage casks to impact loads, pressure loads and thermal conditions. This provides reviewers with a tool for an independent check on analyses submitted by licensees. CASKS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not least by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  9. Clinical significance of cerebrospinal fluid tap test and magnetic resonance imaging/computed tomography findings of tight high convexity in patients with possible idiopathic normal pressure hydrocephalus

    International Nuclear Information System (INIS)

    Ishikawa, Masatsune; Furuse, Motomasa; Nishida, Namiko; Oowaki, Hisayuki; Matsumoto, Atsuhito; Suzuki, Takayuki

    2010-01-01

    Idiopathic normal pressure hydrocephalus (iNPH) is a treatable syndrome with a classical triad of symptoms. The Japanese iNPH guidelines indicate that the cerebrospinal fluid (CSF) tap test and tight high-convexity on magnetic resonance (MR) imaging are important for the diagnosis. The relationships between the effectiveness of CSF shunt surgery in possible iNPH patients, the tap test result, and the MR imaging/computed tomography (CT) findings of tight high-convexity were evaluated in 88 possible iNPH patients (mean age 75 years) with one or more of the classical triad of symptoms, and mild to moderate ventricular dilation. All patients underwent the tap test in the outpatient clinic, and patients and caregivers assessed the clinical changes during one week. The tap test was positive in 47 patients and negative in 41 patients. Surgery was performed in 19 patients with positive tap test, and was effective in 17 patients. Although the findings were inconsistent in some patients, the result of the tap test was found to be highly correlated with the MR imaging/CT finding of tight high-convexity (p<0.0001), confirming that both these diagnostic tests are promising predictors of shunt effectiveness. (author)

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  12. Comparison of different application systems and CT- assisted treatment planning procedures in primary endometrium cancer: Is it technically possible to include the whole uterus volume in the volume treated by brachytherapy

    International Nuclear Information System (INIS)

    Mock, U.; Knocke, Th.; Fellner, C.; Poetter, R.

    1996-01-01

    Purpose: Brachytherapy is regarded as the definitive component of treatment for inoperable patients with endometrium cancer. In published series the whole uterus has been claimed to represent the target volume, independently of the individual tumor spread. The purpose of this work is to compare different planning and application procedures and to analyze the target volumes (whole uterus), treatment volumes and their respective relation under the given conditions. Material and Methods: In ten patients with primary endometrium cancer the correlation between target and treatment volume was analysed based on standard one-channel applicators or individual Heyman applicators. A comparative analysis of target volumes resulting from two different planning procedures for Heyman applications was performed. CT was carried out after insertion of the Heyman ovoids. Target volume was estimated by measuring the uterus size at different cross sections of the CT images. Dose calculation was performed with (PLATO system) or without (NPS system) transferring these data directly to the planning system. We report on the differences in treatment volumes resulting from the two application and planning systems. Results: The mean uterus volume was 180 ccm (range 57 ccm to 316 ccm). Four out of 10 patients had an asymmetric uterus configuration with a side difference (in the longitudinal or transverse direction) of more than 1 cm. On average 70% (range 48-95%) of the uterus volume was included in the treatment volume when Heyman applicators were used, compared to 45% (range 25-89%) with standard one-channel applicators. This represents an improvement of 25% (range 11-35%). By utilizing the more sophisticated treatment planning, a more adequate coverage of the uterus volume was achieved in five out of ten patients. The treated volume increased on average by 20% (range 11-32%). In three cases changes in the irradiation volume were less than 5%. In ...

  13. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  14. Patient-specific scatter correction in clinical cone beam computed tomography imaging made possible by the combination of Monte Carlo simulations and a ray tracing algorithm

    International Nuclear Information System (INIS)

    Thing, Rune S.; Bernchou, Uffe; Brink, Carsten; Mainegra-Hing, Ernesto

    2013-01-01

    Purpose: Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability of predicting the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from being fully implemented in a clinical setting. This study investigates the combination of using fast MC simulations to predict scatter distributions with a ray tracing algorithm to allow calibration between simulated and clinical CBCT images. Material and methods: An EGSnrc-based user code (egs_cbct) was used to perform MC simulations of an Elekta XVI CBCT imaging system. A 60 keV x-ray source was used, and air kerma scored at the detector plane. Several variance reduction techniques (VRTs) were used to increase the scatter calculation efficiency. Three patient phantoms based on CT scans were simulated, namely a brain, a thorax and a pelvis scan. A ray tracing algorithm was used to calculate the detector signal due to primary photons. A total of 288 projections were simulated, one for each thread on the computer cluster used for the investigation. Results: Scatter distributions for the brain, thorax and pelvis scan were simulated within 2% statistical uncertainty in two hours per scan. Within the same time, the ray tracing algorithm provided the primary signal for each of the projections. Thus, all the data needed for MC-based scatter correction in clinical CBCT imaging was obtained within two hours per patient, using a full simulation of the clinical CBCT geometry. Conclusions: This study shows that use of MC-based scatter corrections in CBCT imaging has a great potential to improve CBCT image quality. By use of powerful VRTs to predict scatter distributions and a ray tracing algorithm to calculate the primary signal, it is possible to obtain the necessary data for patient-specific MC scatter correction within two hours per patient.
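
    The correction scheme described above can be sketched in a few lines: the MC-estimated scatter is scaled against the measured projection (using the ray-traced primary for calibration) and then subtracted. The Python sketch below is illustrative only; the array names and the simple per-projection scaling model are our assumptions, not the authors' implementation.

        import numpy as np

        def scatter_correct(raw_projection, mc_primary, mc_scatter):
            """Remove MC-predicted scatter from one measured CBCT projection.

            raw_projection : measured detector signal (primary + scatter)
            mc_primary     : primary signal from the ray tracing algorithm
            mc_scatter     : scatter estimate from the Monte Carlo simulation
            All arrays share the detector-pixel shape of a single projection.
            """
            # Calibrate simulated units against the measurement: the ratio of
            # the measured total to the simulated total (primary + scatter)
            # gives a per-projection scale factor (a simplifying assumption).
            scale = raw_projection.sum() / (mc_primary + mc_scatter).sum()
            # Subtract the scaled scatter estimate, clipping negative pixels.
            return np.clip(raw_projection - scale * mc_scatter, 0.0, None)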

  15. Short- and medium-term efficacy of a Web-based computer-tailored nutrition education intervention for adults including cognitive and environmental feedback: randomized controlled trial.

    Science.gov (United States)

    Springvloet, Linda; Lechner, Lilian; de Vries, Hein; Candel, Math J J M; Oenema, Anke

    2015-01-19

    Web-based, computer-tailored nutrition education interventions can be effective in modifying self-reported dietary behaviors. Traditional computer-tailored programs primarily targeted individual cognitions (knowledge, awareness, attitude, self-efficacy). Tailoring on additional variables such as self-regulation processes and environmental-level factors (the home food environment arrangement and perception of availability and prices of healthy food products in supermarkets) may improve efficacy and effect sizes (ES) of Web-based computer-tailored nutrition education interventions. This study evaluated the short- and medium-term efficacy and educational differences in efficacy of a cognitive and environmental feedback version of a Web-based computer-tailored nutrition education intervention on self-reported fruit, vegetable, high-energy snack, and saturated fat intake compared to generic nutrition information in the total sample and among participants who did not comply with dietary guidelines (the risk groups). A randomized controlled trial was conducted with a basic (tailored intervention targeting individual cognition and self-regulation processes; n=456), plus (basic intervention additionally targeting environmental-level factors; n=459), and control (generic nutrition information; n=434) group. Participants were recruited from the general population and randomly assigned to a study group. Self-reported fruit, vegetable, high-energy snack, and saturated fat intake were assessed at baseline and at 1 month (T1) and 4 months (T2) post-intervention using online questionnaires. Linear mixed model analyses examined group differences in change over time. Educational differences were examined with group×time×education interaction terms. In the total sample, the basic (T1: ES=-0.30; T2: ES=-0.18) and plus intervention groups (T1: ES=-0.29; T2: ES=-0.27) had larger decreases in high-energy snack intake than the control group. The basic version resulted in a larger decrease in ...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness of the Tier0, processing a massive number of very large files and writing to tape at high speed.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for tape-to-buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  2. Planned development and evaluation protocol of two versions of a web-based computer-tailored nutrition education intervention aimed at adults, including cognitive and environmental feedback.

    Science.gov (United States)

    Springvloet, Linda; Lechner, Lilian; Oenema, Anke

    2014-01-17

    Despite decades of nutrition education, the prevalence of unhealthy dietary patterns is still high and inequalities in intake between high and low socioeconomic groups still exist. Therefore, it is important to innovate and improve existing nutrition education interventions. This paper describes the development, design and evaluation protocol of a web-based computer-tailored nutrition education intervention for adults targeting fruit, vegetable, high-energy snack and fat intake. This intervention innovates existing computer-tailored interventions by not only targeting motivational factors, but also volitional and self-regulation processes and environmental-level factors. The intervention development was guided by the Intervention Mapping protocol, ensuring a theory-informed and evidence-based intervention. Two versions of the intervention were developed: a basic version targeting knowledge, awareness, attitude, self-efficacy and volitional and self-regulation processes, and a plus version additionally addressing the home environment arrangement and the availability and price of healthy food products in supermarkets. Both versions consist of four modules: one for each dietary behavior, i.e. fruit, vegetables, high-energy snacks and fat. Based on the self-regulation phases, each module is divided into three sessions. In the first session, feedback on dietary behavior is provided to increase awareness, feedback on attitude and self-efficacy is provided and goals and action plans are stated. In the second session goal achievement is evaluated, reasons for failure are explored, coping plans are stated and goals can be adapted. In the third session, participants can again evaluate their behavioral change and tips for maintenance are provided. Both versions will be evaluated in a three-group randomized controlled trial with measurements at baseline, 1-month, 4-months and 9-months post-intervention, using online questionnaires. Both versions will be compared with a generic

  3. Computer simulation and experimental self-assembly of irradiated glycine amino acid under magnetic fields: Its possible significance in prebiotic chemistry.

    Science.gov (United States)

    Heredia, Alejandro; Colín-García, María; Puig, Teresa Pi I; Alba-Aldave, Leticia; Meléndez, Adriana; Cruz-Castañeda, Jorge A; Basiuk, Vladimir A; Ramos-Bernal, Sergio; Mendoza, Alicia Negrón

    2017-12-01

    Ionizing radiation may have played a relevant role in chemical reactions for prebiotic biomolecule formation on ancient Earth. Environmental conditions such as the presence of water and magnetic fields were possibly relevant in the formation of organic compounds such as amino acids. ATR-FTIR, Raman, EPR and X-ray spectroscopies provide valuable information about molecular organization of different glycine polymorphs under static magnetic fields. γ-glycine polymorph formation increases in irradiated samples interacting with static magnetic fields. The increase in γ-glycine polymorph agrees with the computer simulations. The AM1 semi-empirical simulations show a change in the catalyst behavior and dipole moment values in α and γ-glycine interaction with the static magnetic field. The simulated crystal lattice energy in α-glycine is also affected by the free radicals under the magnetic field, which decreases its stability. Therefore, solid α and γ-glycine containing free radicals under static magnetic fields might have affected the prebiotic scenario on ancient Earth by causing the oligomerization of glycine in prebiotic reactions. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. TURTLE with MAD input (Trace Unlimited Rays Through Lumped Elements) -- A computer program for simulating charged particle beam transport systems and DECAY TURTLE including decay calculations

    Energy Technology Data Exchange (ETDEWEB)

    Carey, D.C.

    1999-12-09

    TURTLE is a computer program useful for determining many characteristics of a particle beam once an initial design has been achieved. Charged particle beams are usually designed by adjusting various beam line parameters to obtain desired values of certain elements of a transfer or beam matrix. Such beam line parameters may describe certain magnetic fields and their gradients, lengths and shapes of magnets, spacings between magnetic elements, or the initial beam accepted into the system. For such purposes one typically employs a matrix multiplication and fitting program such as TRANSPORT. TURTLE is designed to be used after TRANSPORT. For convenience of the user, the input formats of the two programs have been made compatible. The use of TURTLE should be restricted to beams with small phase space. The lumped element approximation, described below, precludes the inclusion of the effect of conventional local geometric aberrations (due to large phase space) or fourth and higher order. A reading of the discussion below will indicate clearly the exact uses and limitations of the approach taken in TURTLE.
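
    As a schematic illustration of the lumped-element idea (not TURTLE's actual input language or internals), the sketch below pushes randomly generated rays through a sequence of first-order transfer matrices, one per beamline element; the toy element values are invented for the example.

        import numpy as np

        def drift(length):
            """Transfer matrix of a field-free drift (one transverse plane)."""
            return np.array([[1.0, length], [0.0, 1.0]])

        def thin_quad(focal_length):
            """Thin-lens approximation of a focusing quadrupole."""
            return np.array([[1.0, 0.0], [-1.0 / focal_length, 1.0]])

        # A toy beamline as a list of lumped elements (lengths in metres).
        beamline = [drift(1.0), thin_quad(0.5), drift(1.0)]

        # Trace 10,000 rays (x in mm, x' in mrad) from a small initial phase space.
        rng = np.random.default_rng(0)
        rays = rng.normal(scale=[1.0, 0.5], size=(10_000, 2))
        for element in beamline:
            rays = rays @ element.T

        print("rms beam size at exit:", rays[:, 0].std())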

  5. Spelling is just a click away – a user-centered brain-computer interface including auto-calibration and predictive text entry

    Directory of Open Access Journals (Sweden)

    Tobias eKaufmann

    2012-05-01

    Full Text Available Brain-Computer Interfaces (BCIs) based on event-related potentials (ERPs) allow for selection of characters from a visually presented character-matrix and thus provide a communication channel for users with neurodegenerative disease. Although they have been a topic of research for more than 20 years and were repeatedly proven to be a reliable communication method, BCIs are almost exclusively used in experimental settings, handled by qualified experts. This study investigates whether ERP-BCIs can be handled independently by laymen without expert interference, which is inevitable for establishing BCIs in end-users’ daily life situations. Furthermore we compared the classic character-by-character text entry against a predictive text entry (PTE) that directly incorporates predictive text into the character matrix. N=19 BCI novices handled a user-centred ERP-BCI application on their own without expert interference. The software individually adjusted classifier weights and control parameters in the background, invisible to the user (auto-calibration). All participants were able to operate the software on their own and to twice correctly spell a sentence with the auto-calibrated classifier (once with PTE, once without). Our PTE increased spelling speed and, importantly, did not reduce accuracy. In sum, this study demonstrates the feasibility of auto-calibrated ERP-BCI use, independently by laymen, and the strong benefit of integrating predictive text directly into the character matrix.

  6. Spelling is Just a Click Away - A User-Centered Brain-Computer Interface Including Auto-Calibration and Predictive Text Entry.

    Science.gov (United States)

    Kaufmann, Tobias; Völker, Stefan; Gunesch, Laura; Kübler, Andrea

    2012-01-01

    Brain-computer interfaces (BCI) based on event-related potentials (ERP) allow for selection of characters from a visually presented character-matrix and thus provide a communication channel for users with neurodegenerative disease. Although they have been topic of research for more than 20 years and were multiply proven to be a reliable communication method, BCIs are almost exclusively used in experimental settings, handled by qualified experts. This study investigates if ERP-BCIs can be handled independently by laymen without expert support, which is inevitable for establishing BCIs in end-user's daily life situations. Furthermore we compared the classic character-by-character text entry against a predictive text entry (PTE) that directly incorporates predictive text into the character-matrix. N = 19 BCI novices handled a user-centered ERP-BCI application on their own without expert support. The software individually adjusted classifier weights and control parameters in the background, invisible to the user (auto-calibration). All participants were able to operate the software on their own and to twice correctly spell a sentence with the auto-calibrated classifier (once with PTE, once without). Our PTE increased spelling speed and, importantly, did not reduce accuracy. In sum, this study demonstrates feasibility of auto-calibrating ERP-BCI use, independently by laymen and the strong benefit of integrating predictive text directly into the character-matrix.
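
    The integration of predictive text into the character matrix can be pictured as a simple prefix lookup: after each selected letter, free matrix cells are filled with the most frequent dictionary completions, so a whole word can be selected in a single step. The sketch below is a schematic analogue of this idea, not the authors' software; the small frequency dictionary and cell count are invented.

        def predict(prefix, dictionary, n_cells=4):
            """Return up to n_cells word completions for the spelled prefix,
            ordered by corpus frequency (most frequent first)."""
            matches = [(w, f) for w, f in dictionary.items()
                       if w.startswith(prefix) and w != prefix]
            matches.sort(key=lambda wf: -wf[1])
            return [w for w, _ in matches[:n_cells]]

        # Hypothetical frequency dictionary for illustration.
        freq = {"hello": 120, "help": 300, "helmet": 25, "hero": 80}
        print(predict("he", freq))   # ['help', 'hello', 'hero', 'helmet']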

  7. TURTLE with MAD input (Trace Unlimited Rays Through Lumped Elements) -- A computer program for simulating charged particle beam transport systems and DECAY TURTLE including decay calculations

    International Nuclear Information System (INIS)

    Carey, D.C.

    1999-01-01

    TURTLE is a computer program useful for determining many characteristics of a particle beam once an initial design has been achieved. Charged particle beams are usually designed by adjusting various beam line parameters to obtain desired values of certain elements of a transfer or beam matrix. Such beam line parameters may describe certain magnetic fields and their gradients, lengths and shapes of magnets, spacings between magnetic elements, or the initial beam accepted into the system. For such purposes one typically employs a matrix multiplication and fitting program such as TRANSPORT. TURTLE is designed to be used after TRANSPORT. For convenience of the user, the input formats of the two programs have been made compatible. The use of TURTLE should be restricted to beams with small phase space. The lumped element approximation, described below, precludes the inclusion of the effect of conventional local geometric aberrations (due to large phase space) or fourth and higher order. A reading of the discussion below will indicate clearly the exact uses and limitations of the approach taken in TURTLE

  8. Can the possibility of transverse iliosacral screw fixation for first sacral segment be predicted preoperatively? Results of a computational cadaveric study.

    Science.gov (United States)

    Jeong, Jin-Hoon; Jin, Jin Woo; Kang, Byoung Youl; Jung, Gu-Hee

    2017-10-01

    The purpose of this study was to predict the possibility of transverse iliosacral (TIS) screw fixation into the first sacral segment (S1) and to introduce practical anatomical variables using conventional computed tomography (CT) scans. A total of 82 cadaveric sacra (42 males and 40 females) were used for continuous 1.0-mm slice CT scans, which were imported into Mimics® software to produce a three-dimensional pelvis model. The anterior height (BH) and superior width (BW) of the elevated sacral segment were measured, followed by verification of the safe zones (SZS1 and SZS2) in a true lateral view. Their vertical (VDS1 and VDS2) and horizontal (HDS1 and HDS2) distances were measured. A VDS1 of less than 7 mm was classified as an impossible sacrum, since transverse fixation of a 7.0-mm IS screw could not be done safely there. Fourteen models (16.7%; six females, eight males) were classified as impossible sacra. There was no statistical significance regarding gender (p=0.626) or height (p=0.419). The average values were as follows: BW, 31.4 mm (SD 2.9); BH, 16.7 mm (SD 6.8); VDS1, 13.4 mm (SD 6.1); HDS1, 22.5 mm (SD 4.5); SZS1, 239.5 mm² (SD 137.1); VDS2, 15.5 mm (SD 3.0); HDS2, 18.3 mm (SD 2.9); and SZS2, 221.1 mm² (SD 68.5). Logistic regression analysis identified BH (p=0.001) and HDS1 (p=0.02) as the only statistically significant variables for predicting the possibility. Receiver operating characteristic curve analysis established cut-off values of 20.6 mm for BH and 18.6 mm for HDS1 for the impossible sacrum. BH and HDS1 could therefore be used to predict the possibility of TIS screw fixation. If BH exceeds 20.6 mm or HDS1 is less than 18.6 mm, TIS screw fixation for S1 should not be undertaken because of the narrowed safe zone. Copyright © 2017 Elsevier Ltd. All rights reserved.
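
    Applied prospectively, the reported cut-offs reduce to a two-variable screening rule. The helper below merely restates those published thresholds; the function name and interface are ours, not the authors'.

        def tis_screw_feasible(bh_mm, hd_s1_mm):
            """Preoperative screen for transverse iliosacral screw fixation at S1,
            using the ROC cut-offs reported above: BH > 20.6 mm or HDS1 < 18.6 mm
            predicts an 'impossible sacrum' with a narrowed safe zone."""
            if bh_mm > 20.6 or hd_s1_mm < 18.6:
                return False   # do not attempt TIS fixation at S1
            return True

        print(tis_screw_feasible(bh_mm=16.7, hd_s1_mm=22.5))  # average sacrum: True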

  9. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  10. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  11. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing were more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. GlideInWMS and its components are now deployed at CERN, in addition to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  15. Use of computer aids including expert systems to enhance diagnosis of NPP safety status and operator response. VDU displays in accidents - Interact

    International Nuclear Information System (INIS)

    Humble, P.; Welbourne, D.

    1998-01-01

    This report describes NNC's development of a demonstration concept called Interact: Visual Display Unit (VDU) displays integrating on-screen control of plant actions. Most plant vendors now propose on-screen control, and it is being included on some plants. The integration of Station Operating Instructions (SOI) into the VDU presentation of plants is being developed rapidly. With on-screen control, SOIs can be displayed with control targets able to initiate plant control directly, as called for in the SOIs. Interact displays information and control options, using a cursor to simulate on-screen display and plant control. The displays demonstrate a method that integrates soft control and SOI information into a single unified presentation. They simulate the SOI for an accident, on-screen, with simulated inserted plant values

  16. Preoperative planning of calcium deposit removal in calcifying tendinitis of the rotator cuff - possible contribution of computed tomography, ultrasound and conventional X-Ray.

    Science.gov (United States)

    Izadpanah, Kaywan; Jaeger, Martin; Maier, Dirk; Südkamp, Norbert P; Ogon, Peter

    2014-11-20

    The purpose of the present study was to investigate the accuracy of ultrasound (US), conventional X-ray (CX) and computed tomography (CT) in estimating the total count, localization, morphology and consistency of calcium deposits (CDs) in the rotator cuff. US, CX and CT imaging was performed pre-operatively in 151 patients who underwent arthroscopic removal of CDs from the rotator cuff. In all procedures: (1) total CD counts were determined, (2) the appearance of the CDs in each imaging modality was correlated with the intraoperative consistency and (3) CDs were localized in relation to the acromion using US, CX and CT. Using US, 158 CDs were identified; using CT, 188 CDs; and using CX, 164 CDs. Reliable localization of the CDs was possible with all the diagnostic modalities used. CT revealed 49% of the CDs to be septated, of which 85% were uni- and 15% multiseptated. CX was not suitable for predicting CD consistency. US reliably predicted viscous-solid CD consistency only when deposits presented with full sound extinction (PPV 84.6%). CT had high positive and negative predictive values for the detection of liquid-soft (PPV 92.9%) and viscous-solid (PPV 87.8%) CDs. US and CX are sufficient for preoperative planning of CD removal with regard to localization and prediction of consistency if the deposits present with full sound extinction, which is the case in the majority of patients. However, in patients with missing sound extinction, CT can be recommended if the consistency of the deposits is to be determined. Satellite deposits or septations are regularly present, which is of importance if complete CD removal is aspired to.

  17. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2018-01-01

    This paper is devoted to the research of educational resources and possibilities of modern computer games. The “internal” educational aspects of computer games include an educational mechanism (a separate or integrated “tutorial”) and the representation of a real or even fantastic educational process within virtual worlds. The “external” dimension represents the educational opportunities of computer games for personal and professional development across different genres of computer games (various transport, so...

  18. Multidetector-row computed tomography for prosthetic heart valve dysfunction: is concomitant non-invasive coronary angiography possible before redo-surgery?

    Energy Technology Data Exchange (ETDEWEB)

    Tanis, Wilco [Haga Teaching Hospital, Department of Cardiology, The Hague (Netherlands); Haga Teaching Hospital, The Hague (Netherlands); Sucha, Dominika; Habets, Jesse [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Laufer, Ward; Chamuleau, Steven [University Medical Center Utrecht, Department of Cardiology, Utrecht (Netherlands); Herwerden, Lex.A. van [University Medical Center Utrecht, Department of Cardiothoracic Surgery, Utrecht (Netherlands); Symersky, Petr [Vrije Universiteit, Department of Cardiothoracic Surgery, Amsterdam (Netherlands); Budde, Ricardo P.J. [Erasmus Medical Center, Department of Radiology, Rotterdam (Netherlands)

    2015-06-01

    Retrospective ECG-gated multidetector-row computed tomography (MDCT) is increasingly used for the assessment of prosthetic heart valve (PHV) dysfunction, but is also hampered by PHV-related artefacts/cardiac arrhythmias. Furthermore, it is performed without nitroglycerine or heart rate correction. The purpose was to determine whether MDCT performed before potential redo-PHV surgery is feasible for concomitant coronary artery stenosis assessment and can replace invasive coronary angiography (CAG). PHV patients with CAG and MDCT were identified. Based on medical history, two groups were created: (I) patients with no known coronary artery disease (CAD), (II) patients with known CAD. All images were scored for the presence of significant (>50 %) stenosis. CAG was the reference test. Fifty-one patients were included. In group I (n = 38), MDCT accurately ruled out significant stenosis in 19/38 (50 %) patients, but could not replace CAG in the remaining 19/38 (50 %) patients due to non-diagnostic image quality (n = 16) or significant stenosis (n = 3) detection. In group II (n = 13), MDCT correctly found no patients without significant stenosis, requiring CAG imaging in all. MDCT assessed patency in 16/19 (84 %) grafts and detected a hostile anatomy in two. MDCT performed for PHV dysfunction assessment can replace CAG (100 % accurate) in approximately half of patients without previously known CAD. (orig.)

  19. Multidetector-row computed tomography for prosthetic heart valve dysfunction: is concomitant non-invasive coronary angiography possible before redo-surgery?

    International Nuclear Information System (INIS)

    Tanis, Wilco; Sucha, Dominika; Habets, Jesse; Laufer, Ward; Chamuleau, Steven; Herwerden, Lex.A. van; Symersky, Petr; Budde, Ricardo P.J.

    2015-01-01

    Retrospective ECG-gated multidetector-row computed tomography (MDCT) is increasingly used for the assessment of prosthetic heart valve (PHV) dysfunction, but is also hampered by PHV-related artefacts/cardiac arrhythmias. Furthermore, it is performed without nitroglycerine or heart rate correction. The purpose was to determine whether MDCT performed before potential redo-PHV surgery is feasible for concomitant coronary artery stenosis assessment and can replace invasive coronary angiography (CAG). PHV patients with CAG and MDCT were identified. Based on medical history, two groups were created: (I) patients with no known coronary artery disease (CAD), (II) patients with known CAD. All images were scored for the presence of significant (>50 %) stenosis. CAG was the reference test. Fifty-one patients were included. In group I (n = 38), MDCT accurately ruled out significant stenosis in 19/38 (50 %) patients, but could not replace CAG in the remaining 19/38 (50 %) patients due to non-diagnostic image quality (n = 16) or significant stenosis (n = 3) detection. In group II (n = 13), MDCT correctly found no patients without significant stenosis, requiring CAG imaging in all. MDCT assessed patency in 16/19 (84 %) grafts and detected a hostile anatomy in two. MDCT performed for PHV dysfunction assessment can replace CAG (100 % accurate) in approximately half of patients without previously known CAD. (orig.)

  20. Development of a modified prognostic index for patients with aggressive adult T-cell leukemia-lymphoma aged 70 years or younger: possible risk-adapted management strategies including allogeneic transplantation.

    Science.gov (United States)

    Fuji, Shigeo; Yamaguchi, Takuhiro; Inoue, Yoshitaka; Utsunomiya, Atae; Moriuchi, Yukiyoshi; Uchimaru, Kaoru; Owatari, Satsuki; Miyagi, Takashi; Taguchi, Jun; Choi, Ilseung; Otsuka, Eiichi; Nakachi, Sawako; Yamamoto, Hisashi; Kurosawa, Saiko; Tobinai, Kensei; Fukuda, Takahiro

    2017-07-01

    Adult T-cell leukemia-lymphoma is a distinct type of peripheral T-cell lymphoma caused by human T-cell lymphotropic virus type I. Although allogeneic stem cell transplantation after chemotherapy is a recommended treatment option for patients with aggressive adult T-cell leukemia-lymphoma, there is no consensus about indications for allogeneic stem cell transplantation because there is no established risk stratification system for transplant eligible patients. We conducted a nationwide survey of patients with aggressive adult T-cell leukemia-lymphoma in order to construct a new, large database that includes 1,792 patients aged 70 years or younger with aggressive adult T-cell leukemia-lymphoma who were diagnosed between 2000 and 2013 and received intensive first-line chemotherapy. We randomly divided patients into two groups (training and validation sets). Acute type, poor performance status, high soluble interleukin-2 receptor levels (> 5,000 U/mL), high adjusted calcium levels (≥ 12 mg/dL), and high C-reactive protein levels (≥ 2.5 mg/dL) were independent adverse prognostic factors in the training set. We used these five variables to divide patients into three risk groups. In the validation set, median overall survival for the low-, intermediate-, and high-risk groups was 626 days, 322 days, and 197 days, respectively. In the intermediate- and high-risk groups, transplanted recipients had significantly better overall survival than non-transplanted patients. We developed a promising new risk stratification system to identify patients aged 70 years or younger with aggressive adult T-cell leukemia-lymphoma who may benefit from upfront allogeneic stem cell transplantation. Prospective studies are warranted to confirm the benefit of this treatment strategy. Copyright© 2017 Ferrata Storti Foundation.
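
    A minimal sketch of such a prognostic index follows, assuming one point per adverse factor and illustrative group boundaries; the paper's exact weighting and thresholds are not reproduced here, so treat this purely as a schematic of risk stratification.

        def atl_risk_group(acute_type, poor_ps, sil2r_u_ml, adj_ca_mg_dl, crp_mg_dl):
            """Count the five adverse factors named above and map the total to a
            risk group. Equal weights and the 0 / 1-2 / 3+ boundaries are
            assumptions for illustration only."""
            points = sum([
                acute_type,                 # acute type
                poor_ps,                    # poor performance status
                sil2r_u_ml > 5000,          # high soluble IL-2 receptor
                adj_ca_mg_dl >= 12.0,       # high adjusted calcium
                crp_mg_dl >= 2.5,           # high C-reactive protein
            ])
            if points == 0:
                return "low"
            return "intermediate" if points <= 2 else "high"

        print(atl_risk_group(True, False, 6200, 10.1, 1.0))  # 'intermediate'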

  1. Lumbar spine spondylolysis in the adult population: using computed tomography to evaluate the possibility of adult onset lumbar spondylosis as a cause of back pain

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, Benjamin K.; Southam, Samuel L.; Mlady, Gary W.; Logan, Jeremy; Rosett, Matthew [University of New Mexico School of Medicine, Department of Radiology, Albuquerque, NM (United States)

    2010-07-15

    To determine if new onset of low back pain in adults could be secondary to lumbar spondylolysis by establishing the age-related prevalence in the general population by examining patients undergoing computed tomography (CT) for reasons unrelated to back pain. The records of 2,555 patients who had undergone abdominal and pelvic CT in 2008 were reviewed electronically. In order to determine a true representation of the general population, we reviewed all indications for CT, excluding patients with a primary complaint of low back pain as the primary indication for imaging. Equal numbers of patients were separated into age groups by decade to ensure an even distribution of ages for statistical analysis. Patients older than 70 years were grouped together to provide case numbers comparable to those of the other decades. Logistic regression analysis was performed to evaluate the significance of the results. Three board-certified radiologists, including two musculoskeletal fellows and a radiology resident, retrospectively evaluated CT scans for lumbar spondylolysis, including unilateral and bilateral defects. Of the 2,555 cases evaluated, there were 203 positive cases of defects of the lumbar pars interarticularis. This corresponded to an overall prevalence of 8.0%. Prevalence per decade was fairly evenly distributed and ranged from 7.0% (ages 30-39 years) to 9.2% (ages 70 years and above). Prevalence of ages 20-49 years was 7.9%, and that of ages 50 years and older was 8.0%. Male to female ratio was 1.5:1. Logistic regression showed no significant increase in spondylolysis based on age. No significant increase in the prevalence of lumbar spondylolysis was demonstrated in patients older than 20 years. This suggests that the development of symptomatic lumbar pars defects does not occur in this population and should not be considered as a rare but potentially treatable cause of new onset low back pain in adults. This study demonstrated an overall prevalence of pars defects of 8...

  2. Lumbar spine spondylolysis in the adult population: using computed tomography to evaluate the possibility of adult onset lumbar spondylosis as a cause of back pain.

    Science.gov (United States)

    Brooks, Benjamin K; Southam, Samuel L; Mlady, Gary W; Logan, Jeremy; Rosett, Matthew

    2010-07-01

    To determine if new onset of low back pain in adults could be secondary to lumbar spondylolysis by establishing the age-related prevalence in the general population by examining patients undergoing computed tomography (CT) for reasons unrelated to back pain. The records of 2,555 patients who had undergone abdominal and pelvic CT in 2008 were reviewed electronically. In order to determine a true representation of the general population, we reviewed all indications for CT, excluding patients with a primary complaint of low back pain as the primary indication for imaging. Equal numbers of patients were separated into age groups by decade to ensure an even distribution of ages for statistical analysis. Patients older than 70 years were grouped together to provide case numbers comparable to those of the other decades. Logistic regression analysis was performed to evaluate the significance of the results. Three board-certified radiologists, including two musculoskeletal fellows and a radiology resident, retrospectively evaluated CT scans for lumbar spondylolysis, including unilateral and bilateral defects. Of the 2,555 cases evaluated, there were 203 positive cases of defects of the lumbar pars interarticularis. This corresponded to an overall prevalence of 8.0%. Prevalence per decade was fairly evenly distributed and ranged from 7.0% (ages 30-39 years) to 9.2% (ages 70 years and above). Prevalence of ages 20-49 years was 7.9%, and that of ages 50 years and older was 8.0%. Male to female ratio was 1.5:1. Logistic regression showed no significant increase in spondylolysis based on age. No significant increase in the prevalence of lumbar spondylolysis was demonstrated in patients older than 20 years. This suggests that the development of symptomatic lumbar pars defects does not occur in this population and should not be considered as a rare but potentially treatable cause of new onset low back pain in adults. This study demonstrated an overall prevalence of pars defects of 8...

  3. Lumbar spine spondylolysis in the adult population: using computed tomography to evaluate the possibility of adult onset lumbar spondylosis as a cause of back pain

    International Nuclear Information System (INIS)

    Brooks, Benjamin K.; Southam, Samuel L.; Mlady, Gary W.; Logan, Jeremy; Rosett, Matthew

    2010-01-01

    To determine if new onset of low back pain in adults could be secondary to lumbar spondylolysis by establishing the age-related prevalence in the general population by examining patients undergoing computed tomography (CT) for reasons unrelated to back pain. The records of 2,555 patients who had undergone abdominal and pelvic CT in 2008 were reviewed electronically. In order to determine a true representation of the general population, we reviewed all indications for CT, excluding patients with a primary complaint of low back pain as the primary indication for imaging. Equal numbers of patients were separated into age groups by decade to ensure an even distribution of ages for statistical analysis. Patients older than 70 years were grouped together to provide case numbers comparable to those of the other decades. Logistic regression analysis was performed to evaluate the significance of the results. Three board-certified radiologists, including two musculoskeletal fellows and a radiology resident, retrospectively evaluated CT scans for lumbar spondylolysis, including unilateral and bilateral defects. Of the 2,555 cases evaluated, there were 203 positive cases of defects of the lumbar pars interarticularis. This corresponded to an overall prevalence of 8.0%. Prevalence per decade was fairly evenly distributed and ranged from 7.0% (ages 30-39 years) to 9.2% (ages 70 years and above). Prevalence of ages 20-49 years was 7.9%, and that of ages 50 years and older was 8.0%. Male to female ratio was 1.5:1. Logistic regression showed no significant increase in spondylolysis based on age. No significant increase in the prevalence of lumbar spondylolysis was demonstrated in patients older than 20 years. This suggests that the development of symptomatic lumbar pars defects does not occur in this population and should not be considered as a rare but potentially treatable cause of new onset low back pain in adults. This study demonstrated an overall prevalence of pars defects of 8...
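
    The prevalence figures above follow from simple counting. The sketch below recomputes the reported overall prevalence and shows the form of a per-decade calculation; the per-decade case counts are hypothetical, since the abstract reports only the pooled percentages.

        # Overall prevalence as reported in the abstract: 203 positives of 2,555.
        print(f"overall: {100 * 203 / 2555:.0f}%")   # ~8%

        # Hypothetical per-decade counts (positive cases, examined) for illustration.
        counts = {"30-39": (28, 400), "70+": (37, 400)}
        for group, (pos, n) in counts.items():
            print(f"{group}: prevalence = {100 * pos / n:.1f}%")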

  4. Possible use of a computer for processing technological information of daily reports on drilling in order to optimize the drilling regimes

    Energy Technology Data Exchange (ETDEWEB)

    Sukhanov, V B; Kovalev, A A; Rezchikov, A V; Sukhanova, L G; Vyazenkin, S N; Zakolyuzhnyy, V D

    1982-01-01

    It is suggested that a computer be used to process the technological information contained in daily reports on drilling. This will make it possible in the future to monitor adherence to the assigned regime-technological parameters of well drilling by compiling planning recommendations and factual information about their fulfillment. Comprehensive analysis of the factual data on well-drilling regimes, based on the information of daily drilling reports processed by a computer in the OAIS drilling system of the Ministry of the Gas Industry, will, at the existing stage of computer support of the associations, soon permit the production of exhaustive regime-technological information on the operation of bits in each well and the development of RTK for drilling future wells in intervals of the same drillability.

  5. Patient-specific scatter correction in clinical cone beam computed tomography imaging made possible by the combination of Monte Carlo simulations and a ray tracing algorithm

    DEFF Research Database (Denmark)

    Slot Thing, Rune; Bernchou, Uffe; Mainegra-Hing, Ernesto

    2013-01-01

    Abstract Purpose. Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability of predicting the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from...

  6. ORCODE.77: a computer routine to control a nuclear physics experiment by a PDP-15 + CAMAC system, written in assembler language and including many new routines of general interest

    International Nuclear Information System (INIS)

    Dickens, J.K.; McConnell, J.W.

    1977-01-01

    ORCODE.77 is a versatile data-handling computer routine written in MACRO (assembler) language for a PDP-15 computer with EAE (extended arithmetic capability) connected to a CAMAC interface. The interrupt feature of the computer is utilized. Although the code is oriented toward a specific experimental problem, it contains many routines of general interest, including a CAMAC scaler handler, an executive routine to interpret and act upon three-character teletype commands, concise routines to type out double-precision integers (both octal and decimal) and floating-point numbers and to read in integers and floating-point numbers, a routine to convert to and from PDP-15 FORTRAN-IV floating-point format, a routine to handle clock interrupts, and our own DECTAPE handling routine. Routines written for specific purposes but adaptable to very similar applications include a display routine using CAMAC instructions, control of external mechanical equipment using CAMAC instructions, storage of data from an analog-to-digital converter, analysis of stored data into time-dependent pulse-height spectra, and a routine to read the contents of a Nuclear Data 5050 Analyzer and to prepare DECTAPE output of these data for subsequent analysis by a code written in PDP-15-compiled FORTRAN-IV.
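
    The executive routine's three-character command scheme maps naturally onto a dispatch table. The Python analogue below is purely illustrative of that pattern (ORCODE.77 itself is PDP-15 MACRO assembler), and the command names are invented.

        def start_run():  print("run started")
        def stop_run():   print("run stopped")
        def dump_data():  print("spectra written to tape")

        # Three-character teletype commands dispatched to handler routines.
        COMMANDS = {"RUN": start_run, "STP": stop_run, "DMP": dump_data}

        def execute(command):
            handler = COMMANDS.get(command.strip().upper()[:3])
            if handler is None:
                print(f"? unknown command: {command!r}")
            else:
                handler()

        execute("run")   # -> run started
        execute("xyz")   # -> ? unknown command: 'xyz'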

  7. Dispersed flow film boiling: An investigation of the possibility to improve the models implemented in the NRC computer codes for the reflooding phase of the LOCA

    International Nuclear Information System (INIS)

    Andreani, M.; Yadigaroglu, G.; Paul Scherrer Inst.

    1992-08-01

    Dispersed Flow Film Boiling is the heat transfer regime that occurs at high void fractions in a heated channel. The way this heat transfer mode is modelled in the NRC computer codes (RELAP5 and TRAC) and the validity of the assumptions and empirical correlations used are discussed. An extensive review of the theoretical and experimental work related to heat transfer in highly dispersed mixtures reveals the basic deficiencies of these models: the investigation refers mostly to the typical conditions of low-rate bottom reflooding, since the simulation of this physical situation by the computer codes has often shown poor results. The alternative models that are available in the literature are reviewed, and their merits and limits are highlighted. The modifications that could improve the physics of the models implemented in the codes are identified.

  8. Clinico-pathological consideration of diffuse brain injury. Is it possible to diagnose diffuse axonal injury using computed imaging?

    International Nuclear Information System (INIS)

    Fukuda, Tadaharu; Takeda, Hiroki; Matsumura, Hiroyuki; Izawa, Hitoshi; Nakanishi, Fuminori; Onitsuka, Toshirou; Nakajima, Satoshi; Hasue, Masamichi; Ito, Hiroshi

    1998-01-01

    We studied 146 cases with clinical symptoms equivalent to diffuse axonal injury (DAI) according to Gennarelli et al., and initial CT findings equivalent to diffuse injury I, II and III according to Marshall et al. The relationships between computed imaging (CI) assessments, various monitorings, higher-order cerebral function during follow-up, and long-term prognosis were investigated, and the feasibility of assessing the nature of diffuse brain damage by CI was examined. Each of the three CI modalities used (computed tomography (CT), T2-weighted magnetic resonance imaging (MRI), and 123I-IMP single photon emission computed tomography (SPECT)) by itself failed to achieve accurate assessment of outcome. No correlation was observed between acute-phase CT findings and the highest WAIS or WISC score from 3 months to within 2 years after injury. Fifty percent of survivors had a normal cerebro-ventricular index (CVI) calculated from chronic-phase CT images. The correlation between chronic-phase CVI and acute-phase CT or MRI grade was weak. Even in cases with increased CVI and severe cerebral atrophy, improvement in higher-order cerebral function was observed in young patients. In acute-phase DAI, cerebral circulatory disorders such as cerebral blood flow and cerebral blood volume changes and electro-physiological disorders such as abnormal somato-sensory evoked potentials were present, and these disorders resolved with time in survivors. Recovery from these disorders tended to correlate with improvement in the level of consciousness. These findings indicate that the clinical symptoms of DAI include reversible functional impairments. Current CI techniques cannot differentiate between irreversible neural damage and transient functional impairment, and also cannot predict the recovery process due to sprouting and other mechanisms. For these reasons, discrepancies exist between acute-phase CI findings and outcomes, especially functional prognosis. (author)

  9. Computed tomography in endocrine orbitopathy: Effects of different gantry tilt and patient positioning on measurements of eye muscle thickness, and possibilities for correction

    International Nuclear Information System (INIS)

    Markl, A.; Hilbertz, T.; Mayr, B.; Lissner, J.; Pickardt, C.R.

    1986-01-01

    Thickening of the eye muscles in endocrine orbitopathy can be demonstrated particularly impressively on coronal computed tomograms. However, when measuring the height and width of rectus eye muscles manifesting pathologic changes, the measured value is increased by any deviation from the coronal section plane caused by different tilting of the gantry. This often leads to an incorrect stage classification and makes objective observation of the course (e.g., under therapy) impossible. By converting the measured values into the actual extent of the muscles by means of the cosine rule, appreciable changes in the pattern and frequency of involvement of the rectus eye muscles were found in the 121 patients examined. (orig.)
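
    A minimal sketch of the kind of cosine correction described. The exact formula used in the study is not given in this record; the form below assumes the measured cross-section is elongated by the secant of the tilt angle, so the true extent is recovered by multiplying by its cosine.

        import math

        def true_muscle_width(measured_mm: float, tilt_deg: float) -> float:
            """Correct a width measured in a plane tilted by `tilt_deg`
            away from the ideal coronal section (assumed geometry)."""
            return measured_mm * math.cos(math.radians(tilt_deg))

        # Example: a 7.0 mm reading at a 20 degree deviation
        print(round(true_muscle_width(7.0, 20.0), 2))  # -> 6.58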

  10. [Possibilities of a software-based hybrid single photon emission computed tomography/magnetic resonance imaging in the diagnosis of complicated diabetic foot syndrome].

    Science.gov (United States)

    Zavadovskaya, V D; Zorkal'tsev, M A; Udodov, V D; Zamyshevskaya, M A; Kilina, O Yu; Kurazhov, A P; Popov, K M

    2015-01-01

    To present the results of software-based hybrid single photon emission computed tomography/magnetic resonance imaging (SPECT/MRI) in detecting osteomyelitis (OM) in patients with diabetic foot syndrome (DFS). Seventy-six patients (35 men and 41 women; mean age, 59.4 +/- 7.1 years) with type 1 and 2 diabetes mellitus and suspected OM were examined. The investigation enrolled patients with neuropathic (n = 25), ischemic (n = 13), and mixed (n = 38) DFS. All the patients underwent (99m)Tc-HMPAO/(99m)Tc-technefit labeled leukocyte scintigraphy; magnetic resonance imaging was performed in 30 patients. The results were combined using RView 9.06 software (Colin Studholme). Labeled leukocyte SPECT for diagnosing OM yielded 25 true positive (TP), 38 true negative (TN), 12 false positive (FP), and 1 false negative (FN) results. The accuracy of the technique was 82.9%. The FP results were due to the low resolution of the technique and to the small sizes of the objects under study. One FN result was detected in a patient with ischemic DFS because of reduced blood flow. MRI to identify OM in patients with DFS provided 20 TP, 16 TN, 4 FP, and 2 FN results. Its diagnostic accuracy was 85.7%. The relatively low specificity of MRI was associated with FP results due to the complexity of the differential diagnosis of bone marrow edema and inflammatory infiltration. Assessing 42 hybrid SPECT/MR images revealed 21 TP, 17 TN, 3 FP, and 1 FN results. The diagnostic accuracy was 90.5%. Thus, comparing MRI (90.9% sensitivity and 80.0% specificity), labeled leukocyte scintigraphy (96.2% sensitivity and 76.0% specificity), and hybrid SPECT/MRI (95.5% sensitivity and 85.0% specificity) showed the high diagnostic efficiency of the latter.
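
    The reported figures can be reproduced directly from the confusion-matrix counts; a small sketch using the values quoted in the abstract above:

        def metrics(tp, tn, fp, fn):
            """Sensitivity, specificity and accuracy from confusion-matrix counts."""
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            acc = (tp + tn) / (tp + tn + fp + fn)
            return sens, spec, acc

        for name, counts in {
            "SPECT":     (25, 38, 12, 1),
            "MRI":       (20, 16, 4, 2),
            "SPECT/MRI": (21, 17, 3, 1),
        }.items():
            sens, spec, acc = metrics(*counts)
            print(f"{name}: sensitivity {sens:.1%}, specificity {spec:.1%}, accuracy {acc:.1%}")
        # SPECT:     sensitivity 96.2%, specificity 76.0%, accuracy 82.9%
        # MRI:       sensitivity 90.9%, specificity 80.0%, accuracy 85.7%
        # SPECT/MRI: sensitivity 95.5%, specificity 85.0%, accuracy 90.5%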

  11. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  12. Prediction of Chlamydia pneumoniae protein localization in host mitochondria and cytoplasm and possible involvements in lung cancer etiology: a computational approach

    Directory of Open Access Journals (Sweden)

    Aws Alshamsan

    2017-12-01

    Full Text Available Accumulating evidence suggests that intracellular infection of the lungs by Chlamydia pneumoniae contributes to the etiology of lung cancer. Many proteins of Chlamydia pneumoniae outmanoeuvre various systems of the host. The infection may regulate various factors which can influence the growth of lung cancer in affected persons. In this in-silico study, we predict potential targeting of Chlamydia pneumoniae proteins to the mitochondrial and cytoplasmic compartments of the host cell and their possible involvement in the growth and development of lung cancer. Various cellular activities are controlled in the mitochondria and cytoplasm, where the localization of Chlamydia pneumoniae proteins may alter the normal functioning of host cells. The rationale of this study is to identify and explain the connection between Chlamydia pneumoniae infection and lung cancer. A total of 183 and 513 proteins, out of 1112 Chlamydia pneumoniae proteins, were predicted to target the mitochondria and cytoplasm of the host cell, respectively. In particular, many targeted proteins may interfere with the normal growth behaviour of host cells, thereby altering decisions of programmed cell death. The present article outlines a potential connection between Chlamydia pneumoniae protein targeting and lung cancer etiology, and proposes that various targeted proteins may play a crucial role through diverse mechanisms.

  13. Possibilities for exposure reduction in computed tomography examination of acute chest pain; Moeglichkeiten der Dosisreduktion bei CT-Untersuchungen des akuten Thoraxschmerzes

    Energy Technology Data Exchange (ETDEWEB)

    Becker, H.C. [Klinikum der Ludwig-Maximilians-Universitaet Muenchen, Campus Grosshadern, Institut fuer Klinische Radiologie, Muenchen (Germany)

    2012-10-15

    Electrocardiogram (ECG)-gated computed tomography (CT) investigations can be accompanied by high radiation exposure. This is particularly true for the investigation of the entire chest in patients with unclear acute chest pain. The common approach in patients with acute chest pain has been standard spiral CT of the chest without ECG gating. The chest pain or triple-rule-out CT protocol is a relatively new ECG-gated protocol covering the entire chest. This article reviews and discusses different techniques for the CT investigation of patients with acute chest pain. With the appropriate scan technique, the radiation exposure of an ECG-gated protocol need not be higher than that of a standard chest CT scan. Aortic pathologies are far better depicted by ECG-gated scan protocols and, depending on the heart rate, coronary artery disease can be detected at the same time. ECG triggering does not aid the diagnostic assessment of the pulmonary arteries; however, in unspecific chest pain an ECG-triggered scan protocol can provide information for the differential diagnosis. (orig.)

  14. Remote possibilities

    International Nuclear Information System (INIS)

    Fernandes, J.

    1995-01-01

    The impact that wireless communication has had on gas and oil producers is discussed. Wireless communication, which has been replacing the traditional formats of radio and telephone data networks, has proved to be cheaper, smaller, and faster than creating privately owned communication networks. With highly developed supervisory control and data acquisition systems - combined with cellular or satellite technology - information from drill sites can be online at corporate headquarters instantaneously. Eighty percent of Canada's land mass is beyond the reach of traditional wireline and wireless services. Research into advanced communications, including telecommunication and mobile applications, has yielded lucrative results for service providers such as BCTel, SaskTel, Bell Mobility and AGT. The latest data transmission technology is cellular digital packet data (CDPD), which will operate over existing cellular networks. However, unlike circuit-switched cellular, CDPD technology provides an airlink where data is secure. It will be available to the marketplace over the course of the coming year. Among other advantages, CDPD will allow producers to remotely monitor production information and downtime alarms from wells and compressor stations. It will also provide fleet operators with the means to monitor operating vital signs on rolling stock.

  15. Computational intelligence in medical informatics

    CERN Document Server

    Gunjan, Vinit

    2015-01-01

    This Brief highlights informatics and related techniques for computer science professionals, engineers, medical doctors, bioinformatics researchers and other interdisciplinary researchers. Chapters include the bioinformatics of diabetes and several computational algorithms and statistical analysis approaches for effectively studying the disorders and their possible causes, along with medical applications.

  16. Making extreme computations possible with virtual machines

    International Nuclear Information System (INIS)

    Reuter, J.; Chokoufe Nejad, B.

    2016-02-01

    State-of-the-art algorithms generate scattering amplitudes for high-energy physics at leading order for high-multiplicity processes as compiled code (in Fortran, C or C++). For complicated processes the size of these libraries can become tremendous (many GiB). We show that amplitudes can be translated to byte-code instructions, which even reduce the size by one order of magnitude. The byte-code is interpreted by a Virtual Machine with runtimes comparable to compiled code and a better scaling with additional legs. We study the properties of this algorithm, as an extension of the Optimizing Matrix Element Generator (O'Mega). The bytecode matrix elements are available as alternative input for the event generator WHIZARD. The bytecode interpreter can be implemented very compactly, which will help with a future implementation on massively parallel GPUs.
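
    To illustrate the general idea (not the actual O'Mega/WHIZARD instruction set, which is not described in this record), a toy stack-based interpreter that evaluates an amplitude-like expression from bytecode might look as follows:

        # Toy stack-based virtual machine: opcodes are data, not compiled code.
        # The instruction set below is invented for illustration.
        PUSH, ADD, MUL = 0, 1, 2

        def run(bytecode, env):
            """Interpret a list of (opcode, arg) instructions; `env` maps
            symbolic coupling/propagator names to complex values."""
            stack = []
            for op, arg in bytecode:
                if op == PUSH:
                    stack.append(env[arg])
                elif op == ADD:
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b)
                elif op == MUL:
                    b, a = stack.pop(), stack.pop()
                    stack.append(a * b)
            return stack.pop()

        # amplitude ~ g * (prop1 + prop2), encoded once, interpreted at runtime
        code = [(PUSH, "prop1"), (PUSH, "prop2"), (ADD, None), (PUSH, "g"), (MUL, None)]
        print(run(code, {"g": 0.65, "prop1": 1.2 + 0.3j, "prop2": 0.8 - 0.1j}))

    Because the instructions are data rather than machine code, the same compact interpreter can evaluate arbitrarily large expressions, which is what makes the order-of-magnitude size reduction described above possible.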

  17. Computational aspects of algebraic curves

    CERN Document Server

    Shaska, Tanush

    2005-01-01

    The development of new computational techniques and better computing power has made it possible to attack some classical problems of algebraic geometry. The main goal of this book is to highlight such computational techniques related to algebraic curves. The area of research in algebraic curves is receiving more interest not only from the mathematics community, but also from engineers and computer scientists, because of the importance of algebraic curves in applications including cryptography, coding theory, error-correcting codes, digital imaging, computer vision, and many more.This book cove

  18. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  19. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into computers in future in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  20. The Gender Factor in Computer Anxiety and Interest among Some Australian High School Students.

    Science.gov (United States)

    Okebukola, Peter Akinsola

    1993-01-01

    Western Australia eleventh graders (142 boys, 139 girls) were compared on such variables as computers at home, computer classes, experience with computers, and socioeconomic status. Girls had higher anxiety levels, boys higher computer interest. Possible causes included social beliefs about computer use, teacher sex bias, and software (games)…

  1. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  2. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  3. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  4. Planning Computer-Aided Distance Learning

    Directory of Open Access Journals (Sweden)

    Nadja Dobnik

    1996-12-01

    Full Text Available Didactics of autonomous learning changes under the influence of new technologies. Computer technology can cover all the functions that a teacher develops in personal contact with the learner. People organizing distance learning must realize all the possibilities offered by computers. Computers can take over and also combine the functions of many tools and systems, e.g. typewriter, video, telephone. Thus the contents can be offered in the form of classic media by means of text, speech, picture, etc. Computers take over data processing and function as study materials. A computer included in a computer network can also function as a medium for interactive communication.

  5. Batteries not included

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, M.

    2001-09-08

    This article traces the development of clockwork wind-up battery chargers that can be used to recharge mobile phones, laptop computers, torches or radio batteries from the pioneering research of the British inventor Trevor Baylis to the marketing of the wind-up gadgets by Freeplay Energy who turned the idea into a commercial product. The amount of cranking needed to power wind-up devices is discussed along with a hand-cranked charger for mobile phones, upgrading the phone charger's mechanism, and drawbacks of the charger. Details are given of another invention using a hand-cranked generator with a supercapacitor as a storage device which has a very much higher capacity for storing electrical charge.

  6. Batteries not included

    International Nuclear Information System (INIS)

    Cooper, M.

    2001-01-01

    This article traces the development of clockwork wind-up battery chargers that can be used to recharge mobile phones, laptop computers, torches or radio batteries from the pioneering research of the British inventor Trevor Baylis to the marketing of the wind-up gadgets by Freeplay Energy who turned the idea into a commercial product. The amount of cranking needed to power wind-up devices is discussed along with a hand-cranked charger for mobile phones, upgrading the phone charger's mechanism, and drawbacks of the charger. Details are given of another invention using a hand-cranked generator with a supercapacitor as a storage device which has a very much higher capacity for storing electrical charge

  7. Neoclassical transport including collisional nonlinearity.

    Science.gov (United States)

    Candy, J; Belli, E A

    2011-06-10

    In the standard δf theory of neoclassical transport, the zeroth-order (Maxwellian) solution is obtained analytically via the solution of a nonlinear equation. The first-order correction δf is subsequently computed as the solution of a linear, inhomogeneous equation that includes the linearized Fokker-Planck collision operator. This equation admits analytic solutions only in extreme asymptotic limits (banana, plateau, Pfirsch-Schlüter), and so must be solved numerically for realistic plasma parameters. Recently, numerical codes have appeared which attempt to compute the total distribution f more accurately than in the standard ordering by retaining some nonlinear terms related to finite-orbit width, while simultaneously reusing some form of the linearized collision operator. In this work we show that higher-order corrections to the distribution function may be unphysical if collisional nonlinearities are ignored.
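
    The record gives no formulas, but in standard drift-kinetic notation (an assumption, not taken from the paper) the ordering described reads:

        f = f_M + \delta f , \qquad |\delta f| \ll f_M ,

        v_\parallel \, \nabla_\parallel \delta f \;-\; C_L[\delta f] \;=\; -\,\mathbf{v}_d \cdot \nabla f_M ,

    where C_L is the linearized Fokker-Planck collision operator and \mathbf{v}_d the magnetic drift velocity. The bilinear term C[\delta f, \delta f] dropped in this linearization is the collisional nonlinearity whose neglect, the authors argue, can render higher-order corrections unphysical.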

  8. Internet governance origins, current issues, and future possibilities

    CERN Document Server

    Balleste, Roy

    2015-01-01

    Internet Governance: Origins, Current Issues, and Future Possibilities provides an introductory, multidisciplinary account of the forces at work in the evolving concept of internet governance and includes computer history, Internet beginnings, institutions and stakeholders, proposed models of governance, and human rights.

  9. Epitaxial phase diagrams of SrTiO3, CaTiO3, and SrHfO3: Computational investigation including the role of antiferrodistortive and A-site displacement modes

    Science.gov (United States)

    Angsten, Thomas; Asta, Mark

    2018-04-01

    Ground-state epitaxial phase diagrams are calculated by density functional theory (DFT) for SrTiO3, CaTiO3, and SrHfO3 perovskite-based compounds, accounting for the effects of antiferrodistortive and A-site displacement modes. Biaxial strain states corresponding to epitaxial growth of (001)-oriented films are considered, with misfit strains ranging between -4% and 4%. Ground-state structures are determined using a computational procedure in which input structures for DFT optimizations are identified as local minima in expansions of the total energy with respect to strain and soft-mode degrees of freedom. Comparison to results of previous DFT studies demonstrates the effectiveness of the computational approach in predicting ground-state phases. The calculated results show that antiferrodistortive octahedral rotations and associated A-site displacement modes act to suppress polarization and reduce the epitaxial strain energy. A projection of calculated atomic displacements in the ground-state epitaxial structures onto soft-mode eigenvectors shows that three ferroelectric and six antiferrodistortive displacement modes are dominant at all misfit strains considered, with the relative contributions from each varying systematically with the strain. Additional A-site displacement modes contribute to the atomic displacements in CaTiO3 and SrHfO3, which serve to optimize the coordination of the undersized A-site cation.
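
    A schematic of the search procedure described, reduced to a toy two-mode energy expansion. The polynomial and its coefficients are invented for illustration; the actual work uses DFT-fitted expansions in strain and soft-mode amplitudes.

        import itertools
        import numpy as np
        from scipy.optimize import minimize

        def energy(x, misfit):
            """Toy Landau-type expansion: p = polar mode, r = octahedral rotation.
            Coefficients are illustrative only."""
            p, r = x
            return (-0.5 * p**2 + 0.25 * p**4      # double well for the polar mode
                    - 0.4 * r**2 + 0.25 * r**4     # double well for the rotation
                    + 0.8 * p**2 * r**2            # competition between the modes
                    + 1.5 * misfit * p**2 - 1.0 * misfit * r**2)

        def ground_state(misfit):
            # Start from several mode-amplitude guesses and keep the lowest minimum,
            # mimicking the enumeration of candidate local minima in the paper.
            best = None
            for p0, r0 in itertools.product([0.0, 1.0, -1.0], repeat=2):
                res = minimize(energy, x0=[p0, r0], args=(misfit,))
                if best is None or res.fun < best.fun:
                    best = res
            return best.x, best.fun

        for eps in np.linspace(-0.04, 0.04, 5):
            (p, r), e = ground_state(eps)
            print(f"misfit {eps:+.2f}: |p|={abs(p):.2f}, |r|={abs(r):.2f}, E={e:.3f}")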

  10. New seismograph includes filters

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-02

    The new Nimbus ES-1210 multichannel signal enhancement seismograph from EG&G Geometrics has recently been redesigned to include multimode signal filters on each amplifier. The ES-1210F is a shallow exploration seismograph for near subsurface exploration such as depth-to-bedrock, geological hazard location, mineral exploration, and landslide investigations.

  11. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  12. Analytic device including nanostructures

    KAUST Repository

    Di Fabrizio, Enzo M.; Fratalocchi, Andrea; Totero Gongora, Juan Sebastian; Coluccio, Maria Laura; Candeloro, Patrizio; Cuda, Gianni

    2015-01-01

    A device for detecting an analyte in a sample comprising: an array including a plurality of pixels, each pixel including a nanochain comprising: a first nanostructure, a second nanostructure, and a third nanostructure, wherein size of the first nanostructure is larger than that of the second nanostructure, and size of the second nanostructure is larger than that of the third nanostructure, and wherein the first nanostructure, the second nanostructure, and the third nanostructure are positioned on a substrate such that when the nanochain is excited by an energy, an optical field between the second nanostructure and the third nanostructure is stronger than an optical field between the first nanostructure and the second nanostructure, wherein the array is configured to receive a sample; and a detector arranged to collect spectral data from a plurality of pixels of the array.

  13. GPU computing and applications

    CERN Document Server

    See, Simon

    2015-01-01

    This book presents a collection of state-of-the-art research on GPU computing and applications. The major part of this book is selected from the work presented at the 2013 Symposium on GPU Computing and Applications held at Nanyang Technological University, Singapore (Oct 9, 2013). Three major domains of GPU application are covered in the book: (1) Engineering design and simulation; (2) Biomedical Sciences; and (3) Interactive & Digital Media. The book also addresses fundamental issues in GPU computing, with a focus on big data processing. Researchers and developers in GPU computing and applications will benefit from this book, and training professionals and educators can also use it to learn about possible applications of GPU technology in various areas.

  14. Saskatchewan resources. [including uranium]

    Energy Technology Data Exchange (ETDEWEB)

    1979-09-01

    The production of chemicals and minerals for the chemical industry in Saskatchewan are featured, with some discussion of resource taxation. The commodities mentioned include potash, fatty amines, uranium, heavy oil, sodium sulfate, chlorine, sodium hydroxide, sodium chlorate and bentonite. Following the successful outcome of the Cluff Lake inquiry, the uranium industry is booming. Some developments and production figures for Gulf Minerals, Amok, Cenex and Eldorado are mentioned.

  15. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
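
    The specific transformations compared are not listed in this record, but a commonly used pair (assumed here for illustration) is the ratio-scale transformation and the Dubois-Prade transformation; a sketch:

        def ratio_scale(p):
            """pi_i = p_i / max(p): preserves ratios; the modal element gets possibility 1."""
            m = max(p)
            return [pi / m for pi in p]

        def dubois_prade(p):
            """For p sorted descending, pi_i = sum of p_j for j >= i
            (ties handled by simple ordering here, an assumption)."""
            order = sorted(range(len(p)), key=lambda i: -p[i])
            pos = [0.0] * len(p)
            running = 1.0
            for i in order:
                pos[i] = running
                running -= p[i]
            return pos

        p = [0.5, 0.3, 0.2]
        print(ratio_scale(p))   # [1.0, 0.6, 0.4]
        print(dubois_prade(p))  # [1.0, 0.5, 0.2]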

  16. SU-E-J-243: Possibility of Exposure Dose Reduction of Cone-Beam Computed Tomography in An Image Guided Patient Positioning System by Using Various Noise Suppression Filters

    International Nuclear Information System (INIS)

    Kamezawa, H; Arimura, H; Ohki, M; Shirieda, K; Kameda, N

    2014-01-01

    Purpose: To investigate the possibility of exposure dose reduction of cone-beam computed tomography (CBCT) in an image guided patient positioning system by using 6 noise suppression filters. Methods: First, reference-dose (RD) and low-dose (LD) CBCT (X-ray volume imaging system, Elekta Co.) images were acquired with a reference dose of 86.2 mGy (weighted CT dose index: CTDIw) and various low doses of 1.4 to 43.1 mGy, respectively. Second, an automated rigid registration for three axes was performed for estimating setup errors between a planning CT image and the LD-CBCT images, which were processed by 6 noise suppression filters, i.e., averaging filter (AF), median filter (MF), Gaussian filter (GF), bilateral filter (BF), edge preserving smoothing filter (EPF) and adaptive partial median filter (AMF). Third, residual errors representing the patient positioning accuracy were calculated as the Euclidean distance between the setup error vectors estimated using the LD-CBCT image and the RD-CBCT image. Finally, the relationships between the residual error and CTDIw were obtained for the 6 noise suppression filters, and the CTDIw values for LD-CBCT images processed by the noise suppression filters were measured at the same residual error as that obtained with the RD-CBCT. This approach was applied to an anthropomorphic pelvic phantom and two cancer patients. Results: For the phantom, the exposure dose could be reduced by 61% (GF) to 78% (AMF) by applying the noise suppression filters to the CBCT images. The exposure dose in a prostate cancer case could be reduced by 8% (AF) to 61% (AMF), and the exposure dose in a lung cancer case could be reduced by 9% (AF) to 37% (AMF). Conclusion: Using noise suppression filters, particularly the adaptive partial median filter, could make it feasible to decrease the additional exposure dose to patients in image guided patient positioning systems.
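
    A small sketch of the evaluation metric described (residual error as the Euclidean distance between two three-axis setup-error vectors) and of one noise suppression step, using SciPy's median filter as a stand-in for the filters named above:

        import numpy as np
        from scipy.ndimage import median_filter

        def residual_error(setup_ld, setup_rd):
            """Euclidean distance between setup-error vectors (x, y, z in mm)
            estimated from low-dose and reference-dose CBCT registrations."""
            return float(np.linalg.norm(np.asarray(setup_ld) - np.asarray(setup_rd)))

        # Example: LD-CBCT registration vs. RD-CBCT registration (invented values)
        print(residual_error([1.2, -0.4, 0.3], [1.0, -0.1, 0.5]))  # ~0.41 mm

        # Noise suppression before registration, e.g. a median filter on a volume
        noisy = np.random.default_rng(0).normal(size=(32, 32, 32))
        smoothed = median_filter(noisy, size=3)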

  17. SU-E-J-243: Possibility of Exposure Dose Reduction of Cone-Beam Computed Tomography in An Image Guided Patient Positioning System by Using Various Noise Suppression Filters

    Energy Technology Data Exchange (ETDEWEB)

    Kamezawa, H [Graduate School of Medical Sciences, Kyushu University, Higashi-ku, Fukuoka (Japan); Fujimoto General Hospital, Miyakonojo, Miyazaki (Japan); Arimura, H; Ohki, M [Faculty of Medical Sciences, Kyushu University, Higashi-ku, Fukuoka (Japan); Shirieda, K; Kameda, N [Fujimoto General Hospital, Miyakonojo, Miyazaki (Japan)

    2014-06-01

    Purpose: To investigate the possibility of exposure dose reduction of cone-beam computed tomography (CBCT) in an image guided patient positioning system by using 6 noise suppression filters. Methods: First, reference-dose (RD) and low-dose (LD) CBCT (X-ray volume imaging system, Elekta Co.) images were acquired with a reference dose of 86.2 mGy (weighted CT dose index: CTDIw) and various low doses of 1.4 to 43.1 mGy, respectively. Second, an automated rigid registration for three axes was performed for estimating setup errors between a planning CT image and the LD-CBCT images, which were processed by 6 noise suppression filters, i.e., averaging filter (AF), median filter (MF), Gaussian filter (GF), bilateral filter (BF), edge preserving smoothing filter (EPF) and adaptive partial median filter (AMF). Third, residual errors representing the patient positioning accuracy were calculated as the Euclidean distance between the setup error vectors estimated using the LD-CBCT image and the RD-CBCT image. Finally, the relationships between the residual error and CTDIw were obtained for the 6 noise suppression filters, and the CTDIw values for LD-CBCT images processed by the noise suppression filters were measured at the same residual error as that obtained with the RD-CBCT. This approach was applied to an anthropomorphic pelvic phantom and two cancer patients. Results: For the phantom, the exposure dose could be reduced by 61% (GF) to 78% (AMF) by applying the noise suppression filters to the CBCT images. The exposure dose in a prostate cancer case could be reduced by 8% (AF) to 61% (AMF), and the exposure dose in a lung cancer case could be reduced by 9% (AF) to 37% (AMF). Conclusion: Using noise suppression filters, particularly the adaptive partial median filter, could make it feasible to decrease the additional exposure dose to patients in image guided patient positioning systems.

  18. Computer-controlled attenuator.

    Science.gov (United States)

    Mitov, D; Grozev, Z

    1991-01-01

    Various possibilities for applying electronic computer-controlled attenuators to the automation of physiological experiments are considered. A detailed description is given of the design of a 4-channel computer-controlled attenuator, in two channels of which the output signal changes in linear steps, and in the other two channels in logarithmic steps. This, together with additional programmable timers, makes it possible to automate a wide range of studies in different spheres of physiology and psychophysics, including vision and hearing.
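
    A minimal sketch of the two stepping modes described; the step sizes are invented, and the original device is driven over a hardware interface not specified in this record:

        def linear_steps(start_db, step_db, n):
            """Attenuation sequence with a constant additive step (dB)."""
            return [start_db + i * step_db for i in range(n)]

        def log_steps(start, ratio, n):
            """Attenuation sequence with a constant multiplicative step."""
            return [start * ratio**i for i in range(n)]

        print(linear_steps(0.0, 2.0, 5))  # [0.0, 2.0, 4.0, 6.0, 8.0]
        print(log_steps(1.0, 2.0, 5))     # [1.0, 2.0, 4.0, 8.0, 16.0]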

  19. Being Included and Excluded

    DEFF Research Database (Denmark)

    Korzenevica, Marina

    2016-01-01

    Following the civil war of 1996–2006, there was a dramatic increase in the labor mobility of young men and the inclusion of young women in formal education, which led to the transformation of the political landscape of rural Nepal. Mobility and schooling represent a level of prestige that rural… politics. It analyzes how formal education and mobility either challenge or reinforce traditional gendered norms which dictate a lowly position for young married women in the household and their absence from community politics. The article concludes that women are simultaneously excluded and included from… community politics. On the one hand, their mobility and decision-making powers decrease with the increase in the labor mobility of men and their newly gained education is politically devalued when compared to the informal education that men gain through mobility, but on the other hand, schooling strengthens…

  20. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  1. Future possibilities at ISOLDE

    International Nuclear Information System (INIS)

    Haas, H.

    1991-01-01

    The performance of the ISOLDE facility at CERN is summarized and recently achieved target/ion-source and separator improvements are presented. New ion source principles being tested include multistep laser ionization, bunched surface ionization, and ECR ion source. The prospects for the high-resolution separator ISOLDE-3 are described. The possibilities of a new ISOLDE installation at the PS Booster at CERN are presented together with the planned features of the new facility. Add-on devices to improve the performance that are presently being discussed are an electrostatic sector field, an electron beam ion stripper, and a high voltage platform for energy boosting. (author) 7 refs.; 5 figs

  2. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  3. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  4. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  5. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  6. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  7. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  8. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  9. Possible future HERA analyses

    International Nuclear Information System (INIS)

    Geiser, Achim

    2015-12-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented upon. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures, the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is no longer available, this contribution targets not only experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  10. The IAEA transport regulations: main modifications included in the 1996 edition and the possible impact of its adoption in Argentina; El reglamento de transporte del OIEA: principales modificaciones incorporadas en la edicion de 1996 y el posible impacto de su adopcion en Argentina

    Energy Technology Data Exchange (ETDEWEB)

    Lopez Vietri, J R; Novo, R G; Bianchi, A J [Autoridad Regulatoria Nuclear, Buenos Aires (Argentina)

    1999-12-31

    Full text: This paper presents a comparative analysis between the requirements of the 1985 edition (as amended 1990), in force in almost all countries including Argentina, and the 1996 edition, foreseen to enter into force on 1 January 2001, of the Regulations for the safe transport of radioactive material, published by the International Atomic Energy Agency (IAEA). The English version of the 1996 edition was published in December 1996 and the Spanish one in September 1997. That edition was the culmination of a difficult consensus and harmonisation reached after a years-long analysis process between the IAEA Member States and related international organisations (United Nations, International Civil Aviation Organisation, International Air Transport Association, International Federation of Air Line Pilots' Associations, International Maritime Organisation) as well as regional organisations (Economic Commission for Europe, Commission of the European Communities). Both editions of the Regulations include a set of design, operational and administrative requirements that do not differ substantially in their basic safety philosophy. However, the 1996 edition introduces numerous modifications of different magnitude, which will have technological, economic and operational consequences. Of such modifications the paper analyses only the relevant ones, which update the state of the art in the subject and allow the Regulations to continue maintaining an acceptable level of control of the radiation, criticality and thermal hazards to persons, property and the environment during the transport of radioactive material. In addition, the paper briefly describes the possible impact that the main modifications introduced in the 1996 edition should have, depending on the type of user considered, either in Argentina or in other Latin American countries. However, it is desirable that the personnel of competent authorities of each country involved in transport

  11. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  12. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  13. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  14. Optimisation of heat transformers for an economical utilisation of waste heat. Project Pt. C: Investigation of appropriate application possibilities for heat transforming processes including the characterisation of waste heat potentials. Final report; Optimierung von Waermetransformatoren zur wirtschaftlichen Nutzung von Abwaerme. Teilprojekt C: Untersuchung geeigneter Einsatzmoeglichkeiten fuer Waermetransformationsprozesse einschliesslich der Charakterisierung von Abwaermepotentialen. Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-07-27

    The aim of this project was to carry out energy analyses of typical industrial processes in which low-temperature heat is used. In the frame of these investigations, waste heat potentials of the production processes were identified. Furthermore, the possibility of re-integrating the waste heat by means of heat transformation plants was investigated. The application of absorption heat transformers or absorption heat pumps is linked to technical boundary conditions, which have to be clarified in the frame of an energy analysis. Waste heat arising from processes between 60 and 100 °C can be lifted to temperatures between 100 and 140 °C, especially by absorption cycle processes. The incorporation of this upgraded waste heat into the technology was a further aim of the investigations, which is demonstrated by several examples. By means of an absorption heat transformer (performance number ε = 0.49), a reduction of heating steam of approx. 40% and an amortisation time of five years were achieved. The same result was achieved in sugar crystallisation in a sugar factory. In the fruit juice industry the exhaust vapours can be used to heat the fruit juice by means of an absorption cycle for heat recovery. Thus it was possible to save energy costs of DM 360,000 during one fruit harvest. (orig./GL)
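
    For orientation, the performance number quoted above is conventionally defined for an absorption heat transformer as the ratio of useful heat delivered at the absorber to the driving heat supplied at evaporator and generator (a standard definition, assumed here; the record itself does not spell it out):

        \epsilon \;=\; \frac{\dot{Q}_{\text{absorber}}}{\dot{Q}_{\text{evaporator}} + \dot{Q}_{\text{generator}}} \;\approx\; 0.49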

  15. Computer Registration Becoming Mandatory

    CERN Multimedia

    2003-01-01

    Following the decision by the CERN Management Board (see Weekly Bulletin 38/2003), registration of all computers connected to CERN's network will be enforced and only registered computers will be allowed network access. The implementation has started with the IT buildings, continues with building 40 and the Prevessin site (as of Tuesday 4th November 2003), and will cover the whole of CERN before the end of this year. We therefore recommend strongly that you register all your computers in CERN's network database including all network access cards (Ethernet AND wireless) as soon as possible without waiting for the access restriction to take force. This will allow you accessing the network without interruption and help IT service providers to contact you in case of problems (e.g. security problems, viruses, etc.) Users WITH a CERN computing account register at: http://cern.ch/register/ (CERN Intranet page) Visitors WITHOUT a CERN computing account (e.g. short term visitors) register at: http://cern.ch/regis...

  16. Possible Radiation-Induced Damage to the Molecular Structure of Wooden Artifacts Due to Micro-Computed Tomography, Handheld X-Ray Fluorescence, and X-Ray Photoelectron Spectroscopic Techniques

    Directory of Open Access Journals (Sweden)

    Madalena Kozachuk

    2016-05-01

    Full Text Available This study was undertaken to ascertain whether radiation produced by X-ray photoelectron spectroscopy (XPS, micro-computed tomography (μCT and/or portable handheld X-ray fluorescence (XRF equipment might damage wood artifacts during analysis. Changes at the molecular level were monitored by Fourier transform infrared (FTIR analysis. No significant changes in FTIR spectra were observed as a result of μCT or handheld XRF analysis. No substantial changes in the collected FTIR spectra were observed when XPS analytical times on the order of minutes were used. However, XPS analysis collected over tens of hours did produce significant changes in the FTIR spectra.

  17. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
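
    A toy sketch of the routing idea described: two independent networks, with traffic steered around a link flagged as defective. The graph layout and helper names are invented for illustration and do not reflect the patented implementation.

        from collections import deque

        def shortest_path(links, src, dst):
            """BFS shortest path over undirected links given as (a, b) node pairs."""
            frontier, seen = deque([[src]]), {src}
            while frontier:
                path = frontier.popleft()
                if path[-1] == dst:
                    return path
                for a, b in links:
                    nxt = b if path[-1] == a else a if path[-1] == b else None
                    if nxt is not None and nxt not in seen:
                        seen.add(nxt)
                        frontier.append(path + [nxt])
            return None

        net1 = {(0, 1), (1, 2), (2, 3)}   # first data communications network
        net2 = {(0, 2), (2, 3), (0, 3)}   # second, independent network
        defective = (1, 2)                # link identified as faulty in net1

        healthy1 = {link for link in net1 if link != defective}
        route = shortest_path(healthy1, 0, 3) or shortest_path(net2, 0, 3)
        print(route)  # falls back to the second network: [0, 3]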

  18. A possible new basis for fast reactor subassembly instrumentation

    International Nuclear Information System (INIS)

    Edwards, A.G.

    1977-01-01

    This is a digest of a paper presented to the Risley Engineering Society. The theme is a speculation that the core instrumentation problem for a liquid metal fast breeder reactor might be transformed by developments in the realm of infrared television and in pattern recognition by computer. There is a possible need to measure coolant flow and coolant exit temperature for each subassembly, with familiar fail-to-safety characteristics. Present methods use electrical devices, for example thermocouples, but this gives rise to cabling problems. It might be possible, however, to instal at the top of each subassembly a mechanical device that gives a direct indication of temperature and flow visible to an infrared television camera. Signal transmission by cable would then be replaced by direct observation. A possible arrangement for such a system is described and is shown in schematic form. It includes pattern recognition by computer. It may also be possible to infer coolant temperature directly from the characteristics of the infrared radiation emitted by a thin stainless steel sheet in contact with the sodium, and an arrangement for this is shown. The type of pattern produced for on-line interpretation by computer is also shown. It is thought that this new approach to the problem of subassembly instrumentation is sufficiently attractive to justify a close study of the problems involved. (U.K.)

  19. Possible new basis for fast reactor subassembly instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, A G

    1977-03-01

    This is a digest of a paper presented to the Risley Engineering Society. The theme is a speculation that the core instrumentation problem for a liquid metal fast breeder reactor might be transformed by developments in the realm of infrared television and in pattern recognition by computer. There is a possible need to measure coolant flow and coolant exit temperature for each subassembly, with familiar fail-to-safety characteristics. Present methods use electrical devices, for example thermocouples, but this gives rise to cabling problems. It might be possible, however, to install at the top of each subassembly a mechanical device that gives a direct indication of temperature and flow visible to an infrared television camera. Signal transmission by cable would then be replaced by direct observation. A possible arrangement for such a system is described and is shown in schematic form. It includes pattern recognition by computer. It may also be possible to infer coolant temperature directly from the characteristics of the infrared radiation emitted by a thin stainless steel sheet in contact with the sodium, and an arrangement for this is shown. The type of pattern produced for on-line interpretation by computer is also shown. It is thought that this new approach to the problem of subassembly instrumentation is sufficiently attractive to justify a close study of the problems involved.

  20. Possibility Fuzzy Soft Set

    Directory of Open Access Journals (Sweden)

    Shawkat Alkhazaleh

    2011-01-01

    Full Text Available We introduce the concept of a possibility fuzzy soft set and its operations and study some of its properties. We give applications of this theory to solving a decision-making problem. We also introduce a similarity measure of two possibility fuzzy soft sets and discuss its application in a medical diagnosis problem.
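    For concreteness, a possibility fuzzy soft set attaches to each parameter e and object u a pair (membership mu, possibility p). The sketch below implements one simple similarity measure in this spirit (our own choice of formula; the paper's definition may differ): one minus the mean absolute difference, computed separately for memberships and possibilities and then multiplied.

        def similarity(F, G):
            """F, G: dict parameter -> {object: (membership, possibility)}."""
            mem, pos = [], []
            for e in F:
                for u in F[e]:
                    (m1, p1), (m2, p2) = F[e][u], G[e][u]
                    mem.append(abs(m1 - m2))
                    pos.append(abs(p1 - p2))
            return (1 - sum(mem) / len(mem)) * (1 - sum(pos) / len(pos))

        # Hypothetical symptom scores for two objects against a diagnosis template:
        F = {'fever': {'u1': (0.7, 0.9), 'u2': (0.2, 0.5)}}
        G = {'fever': {'u1': (0.6, 0.8), 'u2': (0.3, 0.6)}}
        print(round(similarity(F, G), 3))  # 0.81

    In a diagnosis setting one would compute such similarities between a patient's possibility fuzzy soft set and those of candidate illnesses, flagging an illness when the similarity exceeds a chosen threshold.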

  1. Clinical study on the cardiac hemodynamics and the possibility of demonstration of the left intraatrial thrombi by echocardiography, angiocardiography and computed tomography and the neurological symptoms in patients with heart disorder and cerebral embolism

    Energy Technology Data Exchange (ETDEWEB)

    Nakajima, Kazuo

    1987-03-01

    In an attempt to elucidate risk factors for developing cerebral embolism (CE) in patients with heart disease, hemodynamic, sonographic or radiologic, and neurologic manifestations of heart disease developing into CE were retrospectively analyzed in 44 patients with CE and 122 patients with mitral valve disease (MVD). The most common underlying disease of CE was valve disease (50 %), followed by myocardial infarction, atrial fibrillation, and infectious endocarditis. In MVD patients, risk factors for CE were considered to be atrial fibrillation, mitral stenosis, and intraatrial thrombi. Combined use of various imaging modalities revealed the presence of intraatrial thrombi in 65 % of the CE patients. Cranial computed tomography showed hemorrhagic infarction in 22 %, and found the middle cerebral artery to be the commonest responsible region (81 %). The most frequent initial neurologic symptom was hemiplegia. Half of the patients had disturbance of consciousness on admission. Prognosis was better in patients with MVD than in those with the other types of heart disease. (Namekawa, K.). 117 refs.

  2. Clinical study on the cardiac hemodynamics and the possibility of demonstration of the left intraatrial thrombi by echocardiography, angiocardiography and computed tomography and the neurological symptoms in patients with heart disorder and cerebral embolism

    International Nuclear Information System (INIS)

    Nakajima, Kazuo

    1987-01-01

    In an attempt to elucidate risk factors for developing cerebral embolism (CE) in patients with heart disease, hemodynamic, sonographic or radiologic, and neurologic manifestations of heart disease developing into CE were retrospectively analyzed in 44 patients with CE and 122 patients with mitral valve disease (MVD). The most common underlying disease of CE was valve disease (50 %), followed by myocardial infarction, atrial fibrillation, and infectious endocarditis. In MVD patients, risk factors for CE were considered to be atrial fibrillation, mitral stenosis, and intraatrial thrombi. Combined use of various imaging modalities revealed the presence of intraatrial thrombi in 65 % of the CE patients. Cranial computed tomography showed hemorrhagic infarction in 22 %, and found the middle cerebral artery to be the commonest responsible region (81 %). The most frequent initial neurologic symptom was hemiplegia. Half of the patients had disturbance of consciousness on admission. Prognosis was better in patients with MVD than in those with the other types of heart disease. (Namekawa, K.). 117 refs

  3. Including gauge corrections to thermal leptogenesis

    Energy Technology Data Exchange (ETDEWEB)

    Huetig, Janine

    2013-05-17

    … Furthermore, we have computed the Majorana neutrino production rate itself in chapter 6 to test our numerical procedure. In this context we have calculated the tree-level result as well as the gauge-corrected result for the Majorana neutrino production rate. Finally, in chapter 7, we have implemented the Majorana neutrino ladder rung diagram into our setup for leptogenesis: As a first consideration, we have collected all gauge-corrected diagrams up to three-loop order for the asymmetry-causing two-loop diagrams. However, the results of chap. 5 showed that it is not sufficient to include diagrams only up to the three-loop level. Due to the necessity of resumming all n-loop diagrams, we have constructed a cylindrical diagram that fulfils this condition. This diagram is the link between the Majorana neutrino ladder rung diagram calculated before on the one hand and the lepton asymmetry on the other. We have therefore been able to derive a complete expression for the integrated lepton number matrix including all leading-order corrections. The numerical analysis of this lepton number matrix requires great computational effort, since for the resulting eight-dimensional integral two ordinary differential equations have to be solved for each point the routine evaluates. Thus the result remains as yet inaccessible. Research perspectives: Summarising, this thesis provides the basis for a systematic inclusion of gauge interactions in thermal leptogenesis scenarios. As a next step, one should evaluate the expression for the integrated lepton number numerically to obtain a value that can be used for comparison with earlier results, such as the solutions of the Boltzmann equations as well as the Kadanoff-Baym ansatz with the implemented Standard Model widths. This numerical result would be the first quantitative result containing leading-order corrections due to all interactions of the Majorana neutrino with the Standard Model particles. Further corrections by means of including washout effects …

  4. Including gauge corrections to thermal leptogenesis

    International Nuclear Information System (INIS)

    Huetig, Janine

    2013-01-01

    … Furthermore, we have computed the Majorana neutrino production rate itself in chapter 6 to test our numerical procedure. In this context we have calculated the tree-level result as well as the gauge-corrected result for the Majorana neutrino production rate. Finally, in chapter 7, we have implemented the Majorana neutrino ladder rung diagram into our setup for leptogenesis: As a first consideration, we have collected all gauge-corrected diagrams up to three-loop order for the asymmetry-causing two-loop diagrams. However, the results of chap. 5 showed that it is not sufficient to include diagrams only up to the three-loop level. Due to the necessity of resumming all n-loop diagrams, we have constructed a cylindrical diagram that fulfils this condition. This diagram is the link between the Majorana neutrino ladder rung diagram calculated before on the one hand and the lepton asymmetry on the other. We have therefore been able to derive a complete expression for the integrated lepton number matrix including all leading-order corrections. The numerical analysis of this lepton number matrix requires great computational effort, since for the resulting eight-dimensional integral two ordinary differential equations have to be solved for each point the routine evaluates. Thus the result remains as yet inaccessible. Research perspectives: Summarising, this thesis provides the basis for a systematic inclusion of gauge interactions in thermal leptogenesis scenarios. As a next step, one should evaluate the expression for the integrated lepton number numerically to obtain a value that can be used for comparison with earlier results, such as the solutions of the Boltzmann equations as well as the Kadanoff-Baym ansatz with the implemented Standard Model widths. This numerical result would be the first quantitative result containing leading-order corrections due to all interactions of the Majorana neutrino with the Standard Model particles. Further corrections by means of including washout effects …
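    To make the cost concrete, here is a schematic (toy integrand, not the thesis's actual expression) of a Monte Carlo routine for an eight-dimensional integral in which every sample point requires solving two ordinary differential equations:

        import numpy as np
        from scipy.integrate import solve_ivp

        rng = np.random.default_rng(0)

        def integrand(x):
            # Each evaluation solves two toy ODEs whose rates depend on the point x.
            y1 = solve_ivp(lambda t, y: -x[0] * y, (0, 1), [1.0]).y[0, -1]
            y2 = solve_ivp(lambda t, y: -x[1] * y, (0, 1), [1.0]).y[0, -1]
            return y1 * y2 * np.prod(np.cos(x[2:]))

        samples = rng.uniform(0, 1, size=(2000, 8))
        vals = np.array([integrand(x) for x in samples])
        print(vals.mean(), vals.std() / np.sqrt(len(vals)))  # estimate and error

    Even this toy version performs four thousand ODE solves for a modest sample; the realistic integrand makes the full evaluation correspondingly expensive.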

  5. Physics of quantum computation

    International Nuclear Information System (INIS)

    Belokurov, V.V.; Khrustalev, O.A.; Sadovnichij, V.A.; Timofeevskaya, O.D.

    2003-01-01

    In the paper, the modern status of the theory of quantum computation is considered. The fundamental principles of quantum computers and their basic notions such as quantum processors and computational basis states of the quantum Turing machine as well as the quantum Fourier transform are discussed. Some possible experimental realizations on the basis of NMR methods are given
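    As a small concrete illustration of one of these notions (our sketch, not taken from the paper): the quantum Fourier transform on n qubits is the N x N unitary with entries F[j,k] = exp(2*pi*i*j*k/N)/sqrt(N), where N = 2^n.

        import numpy as np

        def qft_matrix(n_qubits):
            N = 2 ** n_qubits
            j, k = np.meshgrid(np.arange(N), np.arange(N), indexing='ij')
            return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

        F = qft_matrix(2)
        print(np.allclose(F @ F.conj().T, np.eye(4)))  # True: F is unitary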

  6. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to recognition in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and of the ways it is formed in the process of education, and on a comparison of the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking that is dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is a metasubject result of general education as well as a tool of that education. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described, a process connected with the evolution of computer and information technologies and with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  7. Design and applications of Computed Industrial Tomographic Imaging System (CITIS)

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishna, G S; Kumar, Umesh; Datta, S S [Bhabha Atomic Research Centre, Bombay (India). Isotope Div.

    1994-12-31

    This paper highlights the design and development of a prototype Computed Tomographic (CT) imaging system and its software for image reconstruction, simulation and display. It also describes results obtained with several test specimens, including a Dhruva reactor uranium fuel assembly, and the possibility of using neutrons as well as high-energy X-rays in computed tomography. 5 refs., 4 figs.
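    For context, the classic parallel-beam reconstruction step such software performs is filtered back-projection; the sketch below is a textbook version (our illustration; the record does not say which algorithm the BARC system implements).

        import numpy as np

        def fbp(sinogram, angles_deg):
            """Filtered back-projection. sinogram: (n_angles, n_det)."""
            n_angles, n_det = sinogram.shape
            # Ramp-filter each projection in the Fourier domain.
            ramp = np.abs(np.fft.fftfreq(n_det))
            filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp,
                                           axis=1))
            # Smear each filtered projection back across the image grid.
            axis = np.arange(n_det) - n_det / 2
            x, y = np.meshgrid(axis, axis)
            image = np.zeros_like(x, dtype=float)
            for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
                t = x * np.cos(theta) + y * np.sin(theta) + n_det / 2
                image += np.interp(t.ravel(), np.arange(n_det),
                                   proj).reshape(x.shape)
            return image * np.pi / n_angles

        # e.g. image = fbp(sinogram, np.arange(180)) for 180 one-degree views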

  8. Computer applications in thermochemistry

    International Nuclear Information System (INIS)

    Vana Varamban, S.

    1996-01-01

    Knowledge of equilibrium is needed in many practical situations. Simple stoichiometric calculations can be performed with hand calculators, but multi-component, multi-phase gas-solid chemical equilibrium calculations are far beyond such conventional devices and methods; iterative techniques have to be resorted to. Such problems are most elegantly handled by the use of modern computers. This report demonstrates the possible use of computers for chemical equilibrium calculations in the fields of thermochemistry and chemical metallurgy. Four modules are explained: fitting the experimental Cp data and generating the thermal functions; performing equilibrium calculations for the defined conditions; preparing the elaborate input to the equilibrium calculation; and analysing the calculated results graphically. The principles of thermochemical calculations are briefly described. An extensive input guide is given. Several illustrations are included to help understanding and usage. (author)
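    A minimal sketch of what the first module does, under stated assumptions (illustrative data, and a Maier-Kelley form Cp = a + b*T + c/T**2, which is our choice rather than necessarily the report's): fit the coefficients, then generate enthalpy and entropy increments by analytic integration.

        import numpy as np

        T  = np.array([300., 400., 500., 600., 800., 1000.])   # K
        Cp = np.array([26.5, 27.8, 28.7, 29.4, 30.5, 31.3])    # J/(mol K), made up

        A = np.column_stack([np.ones_like(T), T, T**-2])
        a, b, c = np.linalg.lstsq(A, Cp, rcond=None)[0]

        def dH(T1, T2):   # integral of Cp dT, J/mol
            return a*(T2 - T1) + b/2*(T2**2 - T1**2) - c*(1/T2 - 1/T1)

        def dS(T1, T2):   # integral of Cp/T dT, J/(mol K)
            return a*np.log(T2/T1) + b*(T2 - T1) - c/2*(1/T2**2 - 1/T1**2)

        print(dH(298.15, 1000.), dS(298.15, 1000.))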

  9. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  10. Optical Computing

    Indian Academy of Sciences (India)

    Other advantages of optics include low manufacturing costs, immunity to ... It is now possible to control atoms by trapping single photons in small, .... cement, and optical spectrum analyzers. ... risk of noise is further reduced, as light is immune to electro- ..... mode of operation including management of large multimedia.

  11. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general-purpose computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  12. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  13. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and of modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  14. Studies on validation possibilities for computational codes for criticality and burnup calculations of boiling water reactor fuel; Untersuchungen zu Validierungsmoeglichkeiten von Rechencodes fuer Kritikalitaets- und Abbrandrechnungen von Siedewasserreaktor-Brennstoff

    Energy Technology Data Exchange (ETDEWEB)

    Behler, Matthais; Hannstein, Volker; Kilger, Robert; Sommer, Fabian; Stuke, Maik

    2017-06-15

    The application of the burn-up credit method to Boiling Water Reactor fuel is much more complex than in the case of Pressurized Water Reactors, owing to the increased heterogeneity and complexity of the fuel assemblies. Strongly varying enrichments, complex fuel assembly geometries, partial-length fuel rods, and strong axial variations of the moderator density make the verification of conservative irradiation conditions difficult. In this report, it was investigated whether it is possible to take the burn-up into account in criticality analyses for systems with irradiated Boiling Water Reactor fuel on the basis of freely available experimental data and by additionally applying stochastic methods. To achieve this goal, existing methods for stochastic analysis were adapted and further developed so as to be applicable to the specific conditions of Boiling Water Reactor analysis. The aim was to gain a first insight into whether a workable scheme for using burn-up credit in Boiling Water Reactor applications can be derived. Because the relevant quantities, e.g. the moderator density and the axial power profile, are strongly correlated, the GRS tool SUnCISTT for Monte Carlo uncertainty quantification was used in the analysis. This tool was coupled to a simplified, consistent model of the irradiation conditions. In contrast to conventional methods, this approach allows all of the effects involved to be analyzed simultaneously.
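    A schematic of this style of analysis (a toy, linear k_eff surrogate with made-up numbers, not SUnCISTT or real data): sample the correlated parameters jointly, evaluate the response for each sample, and, in the GRS tradition of Wilks' formula, take the maximum of 59 samples as a one-sided 95%/95% tolerance bound.

        import numpy as np

        rng = np.random.default_rng(1)
        mean = np.array([0.40, 1.00])          # moderator density (g/cm3), peaking
        cov = np.array([[0.02**2,            -0.6 * 0.02 * 0.05],
                        [-0.6 * 0.02 * 0.05,  0.05**2]])   # correlated inputs

        def keff_surrogate(density, peaking):
            # Hypothetical smooth response standing in for a criticality code.
            return 0.92 + 0.10 * (density - 0.40) + 0.03 * (peaking - 1.00)

        samples = rng.multivariate_normal(mean, cov, size=59)
        keff = keff_surrogate(samples[:, 0], samples[:, 1])
        print(keff.max())   # 95/95 upper tolerance limit estimate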

  15. Triphenylmethane, a possible moderator material

    International Nuclear Information System (INIS)

    Hügle, Th.; Mocko, M.; Hartl, M.A.; Daemen, L.L.; Muhrer, G.

    2014-01-01

    New challenges in neutron scattering result in an increased demand in novel moderator concepts. The most direct way to address the problem would be to change the moderator material itself. However the range of available neutron moderator materials is small. In this paper, we discuss triphenylmethane, a possible moderator material especially promising for cold neutron moderator applications. Our investigations include a parallel experimental and theoretical approach ranging from cross-section measurements and inelastic neutron spectroscopy to molecular modeling. -- Highlights: • Triphenylmethane as a potential moderator material is discussed. • Parallel theoretical and experimental approach. • Possibly very useful for cold neutrons

  16. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per-application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor with random access to all variables and instructions associated with the polymorphous computing fabric.

  17. Including Children with Selective Mutism in Mainstream Schools and Kindergartens: Problems and Possibilities

    Science.gov (United States)

    Omdal, Heidi

    2008-01-01

    There is little research on inclusion of children with selective mutism in school/kindergarten. Moreover, few studies have tried to understand selectively mute children's interactions in the natural surroundings of their home and school/kindergarten. Five children meeting the DSM-IV criteria for selective mutism were video-observed in social…

  18. Possible Solution to Publication Bias Through Bayesian Statistics, Including Proper Null Hypothesis Testing

    NARCIS (Netherlands)

    Konijn, Elly A.; van de Schoot, Rens; Winter, Sonja D.; Ferguson, Christopher J.

    2015-01-01

    The present paper argues that an important cause of publication bias resides in traditional frequentist statistics forcing binary decisions. An alternative approach through Bayesian statistics provides various degrees of support for any hypothesis, allowing balanced decisions and proper null hypothesis testing.
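    To make the contrast with binary decisions concrete, here is a minimal sketch (our illustration, not the authors' code) of a Bayes factor that grades support for a point null p = 0.5 against a uniform alternative for k successes in n binomial trials:

        from math import comb

        def bayes_factor_01(k, n):
            m0 = comb(n, k) * 0.5**n     # marginal likelihood under H0: p = 0.5
            m1 = 1.0 / (n + 1)           # under H1 with p ~ Uniform(0, 1)
            return m0 / m1

        print(bayes_factor_01(52, 100))  # > 1: mild support FOR the null
        print(bayes_factor_01(70, 100))  # << 1: strong support against it

    A Bayes factor near one expresses genuine indecision, a state that a reject/retain rule must force into one of two bins.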

  19. Churn Possibilities and Impossibilities

    OpenAIRE

    Foreback , Dianne; Nesterenko , Mikhail; Tixeuil , Sébastien

    2018-01-01

    Churn refers to processes joining or leaving the peer-to-peer overlay network. We study the handling of several churn variants. Cooperative churn requires leaving processes to participate in the churn algorithm, while adversarial churn allows the processes to just quit. Infinite churn considers an unbounded number of churning processes throughout a single computation. Unlimited churn does not place a bound on the number of concurrently churning processes. Fair churn handling requires that each churn request...

  20. Telemedicine. Possibilities and perspectives

    International Nuclear Information System (INIS)

    Lenzen, H.; Meier, N.; Bick, U.

    1997-01-01

    Radiological teaching files on the Internet suffer from certain restrictions such as limited user interactivity. The Internet teaching project CONRAD (computer online network for radiological didactics) circumvents these restrictions by using a new database structure that also reflects the development of a diagnosis over time. The cases are presented on the Internet with different HTML-based teaching programs. To support interactivity, CONRAD offers the building of online learning groups over the Internet. (orig.) [de]

  1. Current puzzles and future possibilities

    International Nuclear Information System (INIS)

    Nagamiya, S.

    1982-02-01

    Four current puzzles and several future experimental possibilities in high-energy nuclear collision research are discussed. These puzzles are (1) entropy, (2) hydrodynamic flow, (3) anomalon, and (4) particle emission at backward angles in proton-nucleus collisions. The last one seems not to be directly related to the subject of the present school. But it is, because particle emission into the region far beyond the nucleon-nucleon kinematical limit is an interesting subject common for both proton-nucleus and nucleus-nucleus collisions, and the basic mechanism involved is strongly related in these two cases. Future experimental possibilities are described which include: (1) possibilities of studying multibaryonic excited states, (2) applications of neutron-rich isotopes, and (3) other needed experimental tasks. 72 references

  2. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  3. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  4. Another Theory is Possible

    DEFF Research Database (Denmark)

    Manners, Ian James

    2016-01-01

    The article argues that dissident voices which attempt to theorise Europe differently and advocate another European trajectory have been largely excluded and left unheard in mainstream discussions over the past decade of scholarship and analysis. Dissident voices in European Union studies are tho...... theory, is possible – indeed, probable....

  5. Possibilities of roentgenological method

    International Nuclear Information System (INIS)

    Sivash, Eh.S.; Sal'man, M.M.

    1980-01-01

    Literature and experimental data on the capabilities of roentgenologic investigations using an electron-optical amplifier, X-ray television and roentgen cinematography are surveyed. Different methods of studying the gastro-intestinal tract are compared. The advantage of the roentgenologic method over the endoscopic method after stomach resection is shown [ru]

  6. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). Key features: illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics; emphasis on algorithmic advances that will allow re-application in other...

  7. Evaluation of the Possibility of Applying Spatial 3D Imaging Using X-Ray Computed Tomography Reconstruction Methods for Quantitative Analysis of Multiphase Materials / Rentgenowska Analiza Ilościowa Materiałów Wielofazowych Z Wykorzystaniem Przestrzennego Obrazowania (3D Przy Użyciu Metod Rekonstrukcji Tomografii Komputerowej

    Directory of Open Access Journals (Sweden)

    Matysik P.

    2015-12-01

    Full Text Available In this paper the possibility of using X-ray computed tomography (CT) in quantitative metallographic studies of homogeneous and composite materials is presented. Samples of spheroidal cast iron, an Fe-Ti powder mixture compact and an epoxy composite reinforced with glass fibers were subjected to comparative structural tests. Volume fractions of each of the phase structure components were determined by conventional methods with the use of scanning electron microscopy (SEM) and X-ray diffraction (XRD) quantitative analysis methods. These results were compared with those obtained by the method of spatial analysis of the reconstructed CT image. Based on the comparative analysis, taking into account the selectivity of data verification methods and the accuracy of the obtained results, the authors conclude that the method of computed tomography is suitable for quantitative analysis of several types of structural materials.
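    The spatial-analysis step reduces, at its simplest, to segmenting the reconstructed voxel volume by grey level and counting voxels per phase. A minimal sketch with a synthetic volume (real data would come from the CT reconstruction):

        import numpy as np

        rng = np.random.default_rng(2)
        volume = rng.normal(100, 5, size=(64, 64, 64))   # matrix-phase grey levels
        volume[:20] += 80                                # denser second phase

        phase2 = volume > 140                            # threshold segmentation
        print(f"second-phase volume fraction: {phase2.mean():.3f}")  # ~0.312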

  8. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  9. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. Achieving this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  10. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  11. Classical and quantum computing with C++ and Java simulations

    CERN Document Server

    Hardy, Y

    2001-01-01

    Classical and Quantum computing provides a self-contained, systematic and comprehensive introduction to all the subjects and techniques important in scientific computing. The style and presentation are readily accessible to undergraduates and graduates. A large number of examples, accompanied by complete C++ and Java code wherever possible, cover every topic. Features and benefits: - Comprehensive coverage of the theory with many examples - Topics in classical computing include boolean algebra, gates, circuits, latches, error detection and correction, neural networks, Turing machines, cryptography, genetic algorithms - For the first time, genetic expression programming is presented in a textbook - Topics in quantum computing include mathematical foundations, quantum algorithms, quantum information theory, hardware used in quantum computing This book serves as a textbook for courses in scientific computing and is also very suitable for self-study. Students, professionals and practitioners in computer...

  12. Computed tomography for radiographers

    International Nuclear Information System (INIS)

    Brooker, M.

    1986-01-01

    Computed tomography is regarded by many as a complicated union of sophisticated X-ray equipment and computer technology. This book overcomes these complexities. The rigid technicalities of the machinery and the clinical aspects of computed tomography are discussed, including the preparation of patients, both physically and mentally, for scanning. Furthermore, the author also explains how to set up and run a computed tomography department, including advice on how the room should be designed.

  13. Vibrating crystals as possible neutron monochromators

    International Nuclear Information System (INIS)

    Stoica, A.D.; Popovici, M.

    1983-09-01

    The Bragg reflection of neutrons by vibrating perfect crystals is considered. The additional possibilities offered by the Doppler effect for shaping neutron beams in k-space are discussed. A simple model for computing the reflectivity of a vibrating crystal is proposed. (author)
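    A minimal kinematic sketch of why motion helps (our back-of-envelope argument, not necessarily the model proposed here): for a lattice moving with velocity \vec{v}_c, imposing elastic Bragg reflection in the crystal rest frame gives, back in the laboratory frame, a neutron energy change

        \Delta E \;=\; \frac{\hbar^{2}}{2m}\left(|\vec{k}+\vec{G}|^{2}-|\vec{k}|^{2}\right) \;=\; \hbar\,\vec{G}\cdot\vec{v}_{c},

    where \vec{G} is the reciprocal lattice vector of the reflection. The shift vanishes for a static crystal, so a vibrating monochromator can tune, and not merely select, neutron energies.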

  14. Possible Subclinical Leaflet Thrombosis in Bioprosthetic Aortic Valves

    DEFF Research Database (Denmark)

    Makkar, Raj R; Fontana, Gregory; Jilaihawi, Hasan

    2015-01-01

    BACKGROUND: A finding of reduced aortic-valve leaflet motion was noted on computed tomography (CT) in a patient who had a stroke after transcatheter aortic-valve replacement (TAVR) during an ongoing clinical trial. This finding raised a concern about possible subclinical leaflet thrombosis...... patients and 1 of 115 patients, respectively; P=0.007). CONCLUSIONS: Reduced aortic-valve leaflet motion was shown in patients with bioprosthetic aortic valves. The condition resolved with therapeutic anticoagulation. The effect of this finding on clinical outcomes including stroke needs further...

  15. Towards the Possible Film

    OpenAIRE

    Dawood, S.

    2014-01-01

    Towards the Possible Film is a study in parallel universes – and the sparks that fly when worlds collide. As much of a projection into a far-off future as a flashback to a long-forgotten past, Dawood’s vivid 20-minute tableau combines the resonance of a mythic fable with the hallucinatory haziness of a waking dream. Emerging from the waves, as if transported from another dimension, two blue-skinned astronauts materialise on a red-rocked shoreline (Sidi Ifni in Southern Morocco). Blinking into...

  16. Structure problems in the analog computation

    International Nuclear Information System (INIS)

    Braffort, P.L.

    1957-01-01

    Recent mathematical developments have shown the importance of the elementary structures (algebraic, topological, etc.) that underlie the great domains of classical analysis. Such structures in analog computation are brought out, and possible developments of applied mathematics are discussed. The topological structures of the standard representation of analog schemes, such as addition triangles, integrators, phase inverters and function generators, are also studied. The analog method yields as the result of its computations only functions of the variable time; but the course of the computation, for systems including reactive circuits, introduces order structures which are called 'chronological'. Finally, it is shown that the approximation methods of ordinary numerical and digital computation present the same structures as analog computation. The structure analysis permits fruitful comparisons between the several domains of applied mathematics and suggests important new domains of application for the analog method. (M.P.)

  17. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  18. A Garden of Possibilities

    CERN Document Server

    Carolyn Lee

    2010-01-01

    Renowned landscape architect and designer Charles Jencks recently visited CERN along with the architect of the Globe, Hervé Dessimoz, to investigate the possibility of creating a cosmic-inspired garden at the entrance to the Laboratory.   Left to right: Charles Jencks, Peter Higgs, Rolf Heuer in the garden of cosmic speculation. Photo credit: University of Edinburgh/Maverick photo agency Charles Jencks is a master at designing whimsical, intriguing outdoor spaces that hold a much deeper meaning than just an interesting view. His Garden of Cosmic Speculation at his home in Scotland uses designs recalling cosmic forces, DNA, organic cells, spirals of time, black holes and the Universe, made with landform, plants, sculpture and water to re-shape the natural landscape. One of the possible symbols for CERN that came to his mind was the cosmic uroborus, an ancient Egyptian symbol of a snake eating its own tail dating back to 1600 BC. “Many scientists have discussed this as a poss...

  19. Computers for Lattice QCD

    International Nuclear Information System (INIS)

    Christ, Norman H

    2000-01-01

    The architecture and capabilities of the computers currently in use for large-scale lattice QCD calculations are described and compared. Based on this present experience, possible future directions are discussed

  20. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  1. Possible State Approaches to Cryptocurrencies

    Directory of Open Access Journals (Sweden)

    Jan Lansky

    2018-01-01

    Full Text Available Cryptocurrencies are a type of digital currency that relies on cryptographic proofs for confirmation of transactions. Cryptocurrencies usually achieve a unique combination of three features: ensuring limited anonymity, independence from central authority and double spending attack protection. No other group of currencies, including fiat currencies, has this combination of features. We will define cryptocurrency ownership and account anonymity. We will introduce a classification of the types of approaches to regulation of cryptocurrencies by various individual countries. We will present the risks that the use of cryptocurrencies involves and the possibilities of prevention of those risks. We will present the possible use of cryptocurrencies for the benefit of the state. The conclusion addresses the implications of adoption of a cryptocurrency as a national currency.
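    As a toy illustration of the "cryptographic proofs" underpinning double-spending protection (vastly simplified relative to any real chain): hash-based proof-of-work ties each block of transactions to computational effort that an attacker would have to redo.

        import hashlib

        def mine(block: str, difficulty: int = 4) -> int:
            """Find a nonce whose SHA-256 digest of block+nonce starts
            with `difficulty` hex zeros."""
            nonce = 0
            while True:
                digest = hashlib.sha256(f"{block}{nonce}".encode()).hexdigest()
                if digest.startswith("0" * difficulty):
                    return nonce
                nonce += 1

        print(mine("alice->bob:1.5;prev=00ab"))  # rewriting history means redoing this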

  2. Ambient culture: a possible future for entertainment computing

    NARCIS (Netherlands)

    Rauterberg, G.W.M.; Lygmayr, A.; Golebiowski, P.

    2007-01-01

    We provide an overview of cultural differences between East and West, as a starting point for the development of entertainment technology towards cultural transformation. We argue for the importance of future entertainment technology contributing to cultural transformation processes in the large.

  3. Renewable resources - future possibilities

    International Nuclear Information System (INIS)

    Thomas, Martin H.

    1998-01-01

    The paper describes the Australian Cooperative Research Centre for Renewable Energy and Related Greenhouse Gas Abatement Technologies (ACRE), its technologies, commercial relationships and markets. The relevance of ACRE to developing-country communities which lack reliable, adequate power supplies is discussed. The opportunities for mutual collaboration between Australia and the developing countries in the application of renewable energy have never been stronger. Renewable energy promises real advantages to those who deploy it wisely, as well as significant job creation. Education at all levels, together with operational training, public awareness of what is possible and increased system reliability, are also vital ingredients for the acceptance of these new technologies. They underpin successful commercialisation. The author concludes with the hope for a united international cooperative approach to the development of the renewable energy industry. (author)

  4. MOS modeling hierarchy including radiation effects

    International Nuclear Information System (INIS)

    Alexander, D.R.; Turfler, R.M.

    1975-01-01

    A hierarchy of modeling procedures has been developed for MOS transistors, circuit blocks, and integrated circuits which include the effects of total dose radiation and photocurrent response. The models were developed for use with the SCEPTRE circuit analysis program, but the techniques are suitable for other modern computer aided analysis programs. The modeling hierarchy permits the designer or analyst to select the level of modeling complexity consistent with circuit size, parametric information, and accuracy requirements. Improvements have been made in the implementation of important second order effects in the transistor MOS model, in the definition of MOS building block models, and in the development of composite terminal models for MOS integrated circuits

  5. (including travel dates) Proposed itinerary

    Indian Academy of Sciences (India)

    Ashok

    31 July to 22 August 2012 (including travel dates). Proposed itinerary: Arrival in Bangalore on 1 August. 1-5 August: Bangalore, Karnataka. Suggested institutions: Indian Institute of Science, Bangalore. St Johns Medical College & Hospital, Bangalore. Jawaharlal Nehru Centre, Bangalore. 6-8 August: Chennai, TN.

  6. Computers in Nuclear Physics Division

    International Nuclear Information System (INIS)

    Kowalczyk, M.; Tarasiuk, J.; Srebrny, J.

    1997-01-01

    Improvements to the computer equipment in the Nuclear Physics Division are described, including new computer equipment and hardware upgrades, software development, new programs for computer booting, and modernization of data acquisition systems.

  7. Computer vision for sports

    DEFF Research Database (Denmark)

    Thomas, Graham; Gade, Rikke; Moeslund, Thomas B.

    2017-01-01

    fixed to players or equipment is generally not possible. This provides a rich set of opportunities for the application of computer vision techniques to help the competitors, coaches and audience. This paper discusses a selection of current commercial applications that use computer vision for sports...

  8. Ubiquitous human computing.

    Science.gov (United States)

    Zittrain, Jonathan

    2008-10-28

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  9. The Computational Materials Repository

    DEFF Research Database (Denmark)

    Landis, David D.; Hummelshøj, Jens S.; Nestorov, Svetlozar

    2012-01-01

    The possibilities for designing new materials based on quantum physics calculations are rapidly growing, but these design efforts lead to a significant increase in the amount of computational data created. The Computational Materials Repository (CMR) addresses this data challenge and provides...

  10. Computer facilities for ISABELLE data handling

    International Nuclear Information System (INIS)

    Kramer, M.A.; Love, W.A.; Miller, R.J.; Zeller, M.

    1977-01-01

    The analysis of data produced by ISABELLE experiments will need a large system of computers. An official group of prospective users and operators of that system should begin planning now. Included in the array will be a substantial computer system at each ISABELLE intersection in use. These systems must include enough computer power to keep experimenters aware of the health of the experiment. This will require at least one very fast sophisticated processor in the system, the size depending on the experiment. Other features of the intersection systems must be a good, high speed graphic display, ability to record data on magnetic tape at 500 to 1000 KB, and a high speed link to a central computer. The operating system software must support multiple interactive users. A substantially larger capacity computer system, shared by the six intersection region experiments, must be available with good turnaround for experimenters while ISABELLE is running. A computer support group will be required to maintain the computer system and to provide and maintain software common to all experiments. Special superfast computing hardware or special function processors constructed with microprocessor circuitry may be necessary both in the data gathering and data processing work. Thus both the local and central processors should be chosen with the possibility of interfacing such devices in mind

  11. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  12. Theory including future not excluded

    DEFF Research Database (Denmark)

    Nagao, K.; Nielsen, H.B.

    2013-01-01

    We study a complex action theory (CAT) whose path runs over not only the past but also the future. We show that, if we regard a matrix element defined in terms of the future state at time T and the past state at time T_A as an expectation value in the CAT, then we are allowed to have the Heisenberg equation, Ehrenfest's theorem, and the conserved probability current density. In addition, we show that the expectation value at the present time t of a future-included theory for large T - t and large t - T_A corresponds to that of a future-not-included theory with a proper inner product for large t - T_A. Hence, the CAT...
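    For orientation, the matrix element referred to here is commonly written in this line of work as (a sketch of the notation; the paper's precise definition and normalisation may differ)

        \langle \hat{O} \rangle^{BA} \;=\; \frac{\langle B(t)\,|\,\hat{O}\,|\,A(t)\rangle}{\langle B(t)\,|\,A(t)\rangle},

    where |A(t)> evolves forward from the past time T_A and <B(t)| evolves backward from the future time T.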

  13. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics

  14. Is self-regulation possible

    International Nuclear Information System (INIS)

    Barkenbus, J.N.

    1983-01-01

    The Nuclear Regulatory Commission's increasingly prescriptive regulation of the nuclear industry can have deleterious effects, perhaps the most serious being the shift in responsibility for safety from the utility to the NRC. Several factors account for this type of regulation including the nature and structure of the nuclear industry, public opinion and bureaucratic incentives, and the nature of the technology itself. The opportunities to create heightened industry self-regulation (performance-based regulation) deserve further examination. The key to self-regulation is to structure incentives so that it is clearly within the nuclear utilities' interests to build and operate nuclear power facilities in the safest manner possible. 27 references

  15. The Voice as Computer Interface: A Look at Tomorrow's Technologies.

    Science.gov (United States)

    Lange, Holley R.

    1991-01-01

    Discussion of voice as the communications device for computer-human interaction focuses on voice recognition systems for use within a library environment. Voice technologies are described, including voice response and voice recognition; examples of voice systems in use in libraries are examined; and further possibilities, including use with…

  16. ep possibility for ISABELLE

    International Nuclear Information System (INIS)

    Wilson, R.

    1977-01-01

    The feasibility of adding an electron ring to ISABELLE is discussed in terms of cost, physics goals, count rate estimates, the detector requirements, and the possibility of producing intermediate bosons. The purpose of adding an electron ring to ISABELLE must be considered to be the study of e⁺p → e⁺x, e⁻p → νx, and e⁺p → anti-νx. Other processes, such as W production, are less interesting for an ep ring than for a pp ring. However, there may be other new particles--such as leptonic quarks--that only turn up here. These processes are, however, exciting. In a 30-day run at L = 10³² cm⁻² sec⁻¹, 300 neutrino events are expected at q² > 5000 GeV², where the propagator is expected to be less than 1/4. Thus the value of the mass in the propagator can be measured to 5%. The ep cross section would be measured over the momentum transfer range 1 < q² < … GeV². This range is large enough that a logarithmic deviation from scaling can be distinguished easily from a power law approach to scaling.
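    A back-of-envelope reading of the quoted numbers (our own sketch, consistent with but not taken from the paper): with a charged-current propagator factor

        P(q^{2}) \;=\; \left(1+\frac{q^{2}}{M^{2}}\right)^{-2},

    the suppression reaches 1/4 at q² = M², so seeing it set in near q² ≈ 5000 GeV² points to M ≈ √5000 ≈ 71 GeV. Since ∂ln P/∂ln M = 4q²/(M² + q²) ≈ 2 there, a rate measured to relative accuracy ε pins M down to about ε/2; the ≈6% statistical precision of a 300-event sample is thus of the right size to give the quoted 5% mass determination.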

  17. A new computing principle

    International Nuclear Information System (INIS)

    Fatmi, H.A.; Resconi, G.

    1988-01-01

    In 1954 while reviewing the theory of communication and cybernetics the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers it was found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle

  18. Computers and Information Flow.

    Science.gov (United States)

    Patrick, R. L.

    This paper is designed to fill the need for an easily understood introduction to the computing and data processing field for the layman who has, or can expect to have, some contact with it. Information provided includes the unique terminology and jargon of the field, the various types of computers and the scope of computational capabilities, and…

  19. Legal issues of computer imaging in plastic surgery: a primer.

    Science.gov (United States)

    Chávez, A E; Dagum, P; Koch, R J; Newman, J P

    1997-11-01

    Although plastic surgeons are increasingly incorporating computer imaging techniques into their practices, many fear the possibility of legally binding themselves to achieve surgical results identical to those reflected in computer images. Computer imaging allows surgeons to manipulate digital photographs of patients to project possible surgical outcomes. Some of the many benefits imaging techniques pose include improving doctor-patient communication, facilitating the education and training of residents, and reducing administrative and storage costs. Despite the many advantages computer imaging systems offer, however, surgeons understandably worry that imaging systems expose them to immense legal liability. The possible exploitation of computer imaging by novice surgeons as a marketing tool, coupled with the lack of consensus regarding the treatment of computer images, adds to the concern of surgeons. A careful analysis of the law, however, reveals that surgeons who use computer imaging carefully and conservatively, and adopt a few simple precautions, substantially reduce their vulnerability to legal claims. In particular, surgeons face possible claims of implied contract, failure to instruct, and malpractice from their use or failure to use computer imaging. Nevertheless, legal and practical obstacles frustrate each of those causes of actions. Moreover, surgeons who incorporate a few simple safeguards into their practice may further reduce their legal susceptibility.

  20. Towards distributed multiscale computing for the VPH

    NARCIS (Netherlands)

    Hoekstra, A.G.; Coveney, P.

    2010-01-01

    Multiscale modeling is fundamental to the Virtual Physiological Human (VPH) initiative. Most detailed three-dimensional multiscale models lead to prohibitive computational demands. As a possible solution we present MAPPER, a computational science infrastructure for Distributed Multiscale Computing

  1. Advanced computations in plasma physics

    International Nuclear Information System (INIS)

    Tang, W.M.

    2002-01-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  2. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows, so does the need for efficient experimental designs and analysis methods, since the complex computer models often are expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models, and Paper A introduces a new statistic for waiting times in health care units. The statistic…

  3. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research, including theoretic developments, new computational alg

  4. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  5. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  6. Device including a contact detector

    DEFF Research Database (Denmark)

    2011-01-01

    The present invention relates to a probe for determining an electrical property of an area of a surface of a test sample, the probe being intended to be in a specific orientation relative to the test sample. The probe may comprise a supporting body defining a first surface. A plurality of cantilever arms (12) may extend from the supporting body in co-planar relationship with the first surface. The plurality of cantilever arms (12) may extend substantially parallel to each other, and each of the plurality of cantilever arms (12) may include an electrically conductive tip for contacting the area of the test sample by movement of the probe relative to the surface of the test sample into the specific orientation. The probe may further comprise a contact detector (14) extending from the supporting body, arranged so as to contact the surface of the test sample prior to any one of the plurality…

  7. A formalization of computational trust

    NARCIS (Netherlands)

    Güven - Ozcelebi, C.; Holenderski, M.J.; Ozcelebi, T.; Lukkien, J.J.

    2018-01-01

    Computational trust aims to quantify trust and is studied by many disciplines, including computer science, social sciences and business science. We propose a formal computational trust model, including its parameters and operations on these parameters, as well as a step-by-step guide to computing trust
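
    The abstract does not spell out the model's parameters or operations, so as a generic illustration only, here is a minimal sketch of one common family of computational-trust formalisms (a beta-reputation update); all names and values are invented for the example and are not the model of the paper.

        # Minimal beta-reputation sketch (generic illustration, NOT the paper's model).
        from dataclasses import dataclass

        @dataclass
        class TrustState:
            positive: float = 1.0  # pseudo-counts of a Beta(1, 1) (uniform) prior
            negative: float = 1.0

            def update(self, outcome_good: bool, weight: float = 1.0) -> None:
                """Fold one observed interaction into the trust state."""
                if outcome_good:
                    self.positive += weight
                else:
                    self.negative += weight

            @property
            def trust(self) -> float:
                """Expected probability that the next interaction is good."""
                return self.positive / (self.positive + self.negative)

        s = TrustState()
        for good in (True, True, False, True):
            s.update(good)
        print(f"trust = {s.trust:.2f}")  # 0.67 after 3 good and 1 bad interaction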

  8. Possible phthalates transport into plants

    Directory of Open Access Journals (Sweden)

    Alžbeta Jarošová

    2010-01-01

    Soils can be contaminated by high concentrations of phthalic acid esters (PAE) resulting from industrial and intensive agricultural activities. A plant receives water and substances (including pollutants) from soil by means of its rootage. The water solution received by the roots is distributed mainly by means of the xylem; reception by means of the phloem is not very considerable. Pollutants (including phthalates) can be absorbed by roots by diffusion either from the soil gas phase or from the soil liquid phase. Another possible way for a pollutant to enter the plant is diffusion from the atmosphere. The route by which a substance enters the plant is decided by the so-called Henry constant as well as by the octanol-water partition coefficient. In the case of phthalates, big differences between di-n-butyl phthalate (DBP) reception and dioctyl phthalate reception were detected: for example, DBP can enter the plant by means of the gas as well as the liquid phase, while dioctyl phthalate enters only by the gas phase. This publication summarizes fundamental knowledge on possible phthalate transport into plants.
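
    The Henry constant / K_ow argument lends itself to a toy decision rule. The thresholds and compound values below are invented for illustration (of a plausible order of magnitude for DBP and di-n-octyl phthalate, but assumptions nonetheless), not figures from the article.

        # Toy classifier for the uptake-pathway argument; thresholds are assumptions.
        def likely_uptake_routes(henry_dimensionless: float, log_kow: float) -> list[str]:
            routes = []
            if henry_dimensionless > 1e-5:   # volatile enough for the soil gas phase
                routes.append("soil gas phase / atmosphere")
            if log_kow < 5.0:                # water-soluble enough for the soil solution
                routes.append("soil liquid phase via roots")
            return routes or ["strongly sorbed; little plant uptake"]

        # Illustrative (assumed) property values:
        print("DBP :", likely_uptake_routes(5e-5, 4.5))   # gas and liquid phase
        print("DnOP:", likely_uptake_routes(1e-3, 8.1))   # gas phase only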

  9. Control rod calibration including the rod coupling effect

    International Nuclear Information System (INIS)

    Szilard, R.; Nelson, G.W.

    1984-01-01

    In a reactor containing more than one control rod, which includes all reactors licensed in the United States, there will be a 'coupling' or 'shadowing' of control rod flux at the location of a control rod as a result of the flux depression caused by another control rod. It was decided to investigate this phenomenon further, and eventually to put calibration table data or formulae in a small computer in the control room, so one could insert the positions of the three control rods and receive the excess reactivity without referring to separate tables. For this to be accomplished, a 'three-control-rod reactivity function' would be used which would include the flux coupling between the rods. The function was designed, and measured data were fitted to it to determine the calibration constants. The input data for fitting the trial functions consisted of 254 data points, each consisting of the positions of the regulating, shim, and transient rods, and the total excess reactivity. (About 200 of these points were 'critical balance points', that is, the rod positions for which the reactor was critical, and the remainder were determined by positive period measurements.) Although this may be unrealistic from a physical viewpoint, the function derived gave a very accurate recalculation of the input data, and thus would faithfully give the excess reactivity for any possible combination of the locations of the three control rods. The next step, incorporation of the three-rod function into the minicomputer, will be pursued in the summer and fall of 1984
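
    The abstract does not give the functional form of the three-rod function, so the following is only a minimal sketch of the fitting step: a linear least-squares fit of an assumed form with single-rod and pairwise coupling terms to synthetic data standing in for the 254 measured points.

        # Sketch of fitting a "three-rod reactivity function" with coupling terms.
        # Functional form and data are assumptions; the original form is unspecified.
        import numpy as np

        rng = np.random.default_rng(0)
        pos = rng.uniform(0.0, 1.0, size=(254, 3))   # normalized rod positions

        def design_matrix(p: np.ndarray) -> np.ndarray:
            r1, r2, r3 = p.T
            return np.column_stack([
                np.ones(len(p)),            # constant term
                r1, r2, r3,                 # single-rod terms
                r1 * r2, r1 * r3, r2 * r3,  # pairwise coupling ("shadowing") terms
            ])

        true_c = np.array([5.0, -1.2, -1.0, -0.8, 0.3, 0.2, 0.25])
        rho = design_matrix(pos) @ true_c + rng.normal(0.0, 0.01, 254)  # synthetic

        coef, *_ = np.linalg.lstsq(design_matrix(pos), rho, rcond=None)
        print("fitted coupling coefficients:", coef[4:].round(3))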

  10. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    OpenAIRE

    Dang Hung; Dinh Tien Tuan Anh; Chang Ee-Chien; Ooi Beng Chin

    2017-01-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using a generic solution such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation effi...
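
    Stripped of all trusted-hardware detail, the core idea can be caricatured in a few lines: apply a secret random permutation to the data before running an algorithm whose memory-access pattern an adversary can observe. This sketch is only an illustration of the principle, not the paper's construction.

        # "Scramble then compute" in caricature: shuffle secretly, then run the
        # observable algorithm, whose access pattern no longer correlates with
        # the original record order. Enclave/oblivious-shuffle details omitted.
        import random

        def scramble_then_sort(records: list, secret_seed: int) -> list:
            scrambled = records[:]
            random.Random(secret_seed).shuffle(scrambled)  # done privately
            return sorted(scrambled)  # access pattern decorrelated from input order

        print(scramble_then_sort([5, 3, 9, 1], secret_seed=42))  # [1, 3, 5, 9]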

  11. Possibility Theory and the Risk

    CERN Document Server

    Georgescu, Irina

    2012-01-01

    The book deals with some of the fundamental issues of risk assessment in grid computing environments. The book describes the development of a hybrid probabilistic and possibilistic model for assessing the success of a computing task in a grid environment

  12. Computer tomography in otolaryngology

    International Nuclear Information System (INIS)

    Gradzki, J.

    1981-01-01

    The principles of design and the action of computer tomography, which is applied also in the diagnosis of nose, ear and throat diseases, are discussed. Computer tomography makes possible visualization of the structures of the nose, nasal sinuses and facial skeleton in transverse and coronal planes. The method enables an accurate evaluation of the position and size of neoplasms in these regions and differentiation of inflammatory exudates from malignant masses. In otology computer tomography is used particularly in the diagnosis of pontocerebellar angle tumours and otogenic brain abscesses. Computer tomography of the larynx and pharynx provides new diagnostic data owing to the possibility of obtaining transverse sections and visualization of cartilage. Computer tomograms of some cases are presented. (author)

  13. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  14. HCI in Mobile and Ubiquitous Computing

    OpenAIRE

    椎尾, 一郎; 安村, 通晃; 福本, 雅明; 伊賀, 聡一郎; 増井, 俊之

    2003-01-01

    This paper provides some perspectives on human-computer interaction in mobile and ubiquitous computing. The review covers an overview of ubiquitous computing, mobile computing and wearable computing. It also summarizes HCI topics in these fields, including real-world oriented interfaces, multi-modal interfaces, context awareness and invisible computers. Finally we discuss killer applications for the coming ubiquitous computing era.

  15. Possibilities of urolithiasis crystallodiagnostics

    Directory of Open Access Journals (Sweden)

    A.K. Martusevich

    2017-06-01

    Nowadays, one of the most common groups of diseases in veterinary medicine is urinary system pathology. Urolithiasis is a widespread disease found in many species, including cats, dogs, rabbits, guinea pigs, turtles, etc. Despite the large scale of this pathology in the animal world, there are challenges with the diagnostic process and verification of the diagnosis. The aim of our study was to estimate the diagnostic value of advanced urine tesiocrystalloscopy in urolithiasis (by the example of cats). We studied the crystallogenic and initiated properties of urine from 24 healthy cats and 32 animals with urolithiasis. Both intrinsic and initiated crystallogenesis of the urine specimens were studied. For the tezigraphic test we used sodium chloride solutions (0.45%, 0.9%, and 3%), hydrochloric acid solution (0.1 N) and sodium hydroxide (0.1 N) as crystal-forming substances. We used original criteria to estimate crystalloscopic and tezigraphic facies. The main parameters were the structure index (SI), crystallizability (Cr), facies destruction degree (FDD) and edge-belt intensity (EB) for free crystallogenesis, and the main tezigraphic coefficient (Q), belt coefficient (B) and FDD for the comparative tezigraphy data. Results showed that cats' urine in normal conditions has moderate crystallogenic activity, but in urolithiasis it acquires a high level of crystallizability, with an intermediate value of the structure index and significant destruction of crystal-forming elements. Similar changes of the physical-chemical properties of the biomedium are detected in the analysis of tezigraphic microslides of urine of cats with urolithiasis, prepared using 0.9% sodium chloride as the basis substance. In conclusion, we found that the tesiocrystalloscopic "pattern" of cats' urine in urolithiasis is significantly transformed, with activation of crystal formation and an increase in the biomedium's initiating potential. So, the investigation of free or initialized urine

  16. Alcohol on Campus and Possible Liability.

    Science.gov (United States)

    Buchanan, E. T.

    1983-01-01

    Reviews laws and court cases relating to alcohol and possible civil and criminal liability. Suggests a number of risk management principles, including knowledge of the law, policies forbidding hazing, fostering alcohol awareness, and discipline. (JAC)

  17. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  18. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  19. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  20. POSSIBILITIES FOR RADIODIAGNOSIS OF TUBERCULOUS SPONDYLITIS

    Directory of Open Access Journals (Sweden)

    S. V. Smerdin

    2014-01-01

    The presented case illustrates the possibilities of complex radiodiagnosis in a patient with tuberculous spondylitis. The specific features of displaying a spinal tuberculous lesion during X-ray study, tomosynthesis, computed tomography, and magnetic resonance imaging are described. A rational algorithm for the examination and treatment of patients with this disease is proposed, by comparing the clinical manifestations of spinal tuberculous lesion and the results of its radiological studies.

  1. Prospective Evaluation of Magnetic Resonance Imaging and [18F]Fluorodeoxyglucose Positron Emission Tomography-Computed Tomography at Diagnosis and Before Maintenance Therapy in Symptomatic Patients With Multiple Myeloma Included in the IFM/DFCI 2009 Trial: Results of the IMAJEM Study.

    Science.gov (United States)

    Moreau, Philippe; Attal, Michel; Caillot, Denis; Macro, Margaret; Karlin, Lionel; Garderet, Laurent; Facon, Thierry; Benboubker, Lotfi; Escoffre-Barbe, Martine; Stoppa, Anne-Marie; Laribi, Kamel; Hulin, Cyrille; Perrot, Aurore; Marit, Gerald; Eveillard, Jean-Richard; Caillon, Florence; Bodet-Milin, Caroline; Pegourie, Brigitte; Dorvaux, Veronique; Chaleteix, Carine; Anderson, Kenneth; Richardson, Paul; Munshi, Nikhil C; Avet-Loiseau, Herve; Gaultier, Aurelie; Nguyen, Jean-Michel; Dupas, Benoit; Frampas, Eric; Kraeber-Bodere, Françoise

    2017-09-01

    Purpose Magnetic resonance imaging (MRI) and positron emission tomography-computed tomography (PET-CT) are important imaging techniques in multiple myeloma (MM). We conducted a prospective trial in patients with MM aimed at comparing MRI and PET-CT with respect to the detection of bone lesions at diagnosis and the prognostic value of the techniques. Patients and Methods One hundred thirty-four patients received a combination of lenalidomide, bortezomib, and dexamethasone (RVD) with or without autologous stem-cell transplantation, followed by lenalidomide maintenance. PET-CT and MRI were performed at diagnosis, after three cycles of RVD, and before maintenance therapy. The primary end point was the detection of bone lesions at diagnosis by MRI versus PET-CT. Secondary end points included the prognostic impact of MRI and PET-CT regarding progression-free (PFS) and overall survival (OS). Results At diagnosis, MRI results were positive in 127 of 134 patients (95%), and PET-CT results were positive in 122 of 134 patients (91%; P = .33). Normalization of MRI after three cycles of RVD and before maintenance was not predictive of PFS or OS. PET-CT became normal after three cycles of RVD in 32% of the patients with a positive evaluation at baseline, and PFS was improved in this group (30-month PFS, 78.7% v 56.8%, respectively). PET-CT normalization before maintenance was described in 62% of the patients who were positive at baseline. This was associated with better PFS and OS. Extramedullary disease at diagnosis was an independent prognostic factor for PFS and OS, whereas PET-CT normalization before maintenance was an independent prognostic factor for PFS. Conclusion There is no difference in the detection of bone lesions at diagnosis when comparing PET-CT and MRI. PET-CT is a powerful tool to evaluate the prognosis of de novo myeloma.

  2. Computational structural mechanics for engine structures

    Science.gov (United States)

    Chamis, C. C.

    1989-01-01

    The computational structural mechanics (CSM) program at Lewis encompasses: (1) fundamental aspects of formulating and solving structural mechanics problems, and (2) development of integrated software systems to computationally simulate the performance/durability/life of engine structures. It is structured to mainly supplement, complement, and whenever possible replace, costly experimental efforts which are unavoidable during engineering research and development programs. Specific objectives include: (1) investigating the unique advantages of parallel and multiprocessor computing for reformulating and solving structural mechanics problems and for formulating and solving multidisciplinary mechanics problems; and (2) developing integrated structural-system computational simulators for predicting structural performance, evaluating newly developed methods, and identifying and prioritizing improved or missing methods. Herein the CSM program is summarized with emphasis on the Engine Structures Computational Simulator (ESCS). Typical results obtained using ESCS are described to illustrate its versatility.

  3. INTERRUPTION TO COMPUTING SERVICES, SATURDAY 9 FEBRUARY

    CERN Multimedia

    2002-01-01

    In order to allow the rerouting of electrical cables which power most of the B513 Computer Room, there will be a complete shutdown of central computing services on Saturday 9th February. This shutdown affects all Central Computing services, including all NICE services (for Windows 95, Windows NT and Windows 2000), Mail and Web services, sitewide printing services, all Unix interactive and batch services, the ACB service, all AIS services and databases (such as EDH, BHT, CFU and HR), dedicated Engineering services, and general purpose database services. Services will be run down progressively from early on Saturday morning and reestablished as soon as possible, starting in the afternoon. However, it is unlikely that full computing services will be available before the Saturday evening. For operational reasons, some services may be shut down on the evening of Friday 8th February and restarted on Monday 11th February. More detailed information about the stoppage and restart schedules will be given nearer...

  5. Issues of nanoelectronics: a possible roadmap.

    Science.gov (United States)

    Wang, Kang L

    2002-01-01

    In this review, we will discuss a possible roadmap in scaling a nanoelectronic device from today's CMOS technology to the ultimate limit when the device fails. In other words, at the limit, CMOS will have a severe short channel effect, significant power dissipation in its quiescent (standby) state, and problems related to other essential characteristics. Efforts to use structures such as the double gate, vertical surround gate, and SOI to improve the gate control have continually been made. Other types of structures using SiGe source/drain, asymmetric Schottky source/drain, and the like will be investigated as viable structures to achieve ultimate CMOS. In reaching its scaling limit, tunneling will be an issue for CMOS. The tunneling current through the gate oxide and between the source and drain will limit the device operation. When tunneling becomes significant, circuits may incorporate tunneling devices with CMOS to further increase the functionality per device count. We will discuss both the top-down and bottom-up approaches in attaining the nanometer scale and eventually the atomic scale. Self-assembly is used as a bottom-up approach. The state of the art is reviewed, and the challenges of the multiple-step processing in using the self-assembly approach are outlined. Another facet of the scaling trend is to decrease the number of electrons in devices, ultimately leading to single electrons. If the size of a single-electron device is scaled in such a way that the Coulomb self-energy is higher than the thermal energy (at room temperature), a single-electron device will be able to operate at room temperature. In principle, the speed of the device will be fast as long as the capacitance of the load is also scaled accordingly. The single-electron device will have a small drive current, and thus the load capacitance, including those of interconnects and fanouts, must be small to achieve a reasonable speed. However, because the increase in the density (and

  6. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last two decades most basic algorithms have not changed, but what has changed is the huge increase in computer speed and ease of use, along with the corresponding orders-of-magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence databases. While these are important applications, they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  7. Scientific Computing Strategic Plan for the Idaho National Laboratory

    International Nuclear Information System (INIS)

    Whiting, Eric Todd

    2015-01-01

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  8. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    It is crucial that gifted and talented students be supported by different educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled "Computer Tree" serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since the lesson caters for the creative function of the basic linguistic skills. The teaching technique is applied at the 9-11 age level. The lesson introduces an evaluation process covering the basic knowledge, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered to be a good sample of planning for any subject, owing to the unpredicted convergence of visual and technical abilities with linguistic abilities.

  9. Are baryonic galactic halos possible

    International Nuclear Information System (INIS)

    Olive, K.A.; Hegyi, D.J.

    1986-01-01

    There is little doubt from the rotation curves of spiral galaxies that galactic halos must contain large amounts of dark matter. In this contribution, the authors review arguments which indicate that it is very unlikely that galactic halos contain substantial amounts of baryonic matter. While the authors would like to be able to present a single argument which would rule out baryonic matter, at the present time they are only able to present a collection of arguments, each of which argues against one form of baryonic matter. These include: 1) snowballs; 2) gas; 3) low-mass stars and Jupiters; 4) high-mass stars; and 5) high-metallicity objects such as rocks or dust. Black holes, which do not have a well-defined baryon number, are also a possible candidate for halo matter. They briefly discuss black holes

  10. [Possibilities of psychoprophylaxis in epilepsy].

    Science.gov (United States)

    Bilikiewicz, A

    1976-01-01

    Psychiatrists should also be given their share in the prevention of epilepsy by means of raising the psychiatric culture of society and teaching the population the principles of mental hygiene and psychoprophylaxis. The possibilities of psychiatry in the prophylactic management of patients with developed epilepsy include: 1. Energetic measures for controlling attacks, which have many psychoprophylactic aspects. 2. Prevention of psychotraumatizing situations leading to secondary neurotic, psychotic and other reactions and behaviour disorders of the type of homilopathy and sociopathy. 3. Counteracting the development of mental and social disability in epileptics. Treatment of epilepsy should be conducted from its very beginning in cooperation with psychiatrists and therapeutic psychologists. The problems of prophylaxis cannot be separated from prophylactic treatment, psychotherapy, sociotherapy and rehabilitation.

  11. Computers for lattice field theories

    International Nuclear Information System (INIS)

    Iwasaki, Y.

    1994-01-01

    Parallel computers dedicated to lattice field theories are reviewed with emphasis on the three recent projects, the Teraflops project in the US, the CP-PACS project in Japan and the 0.5-Teraflops project in the US. Some new commercial parallel computers are also discussed. Recent development of semiconductor technologies is briefly surveyed in relation to possible approaches toward Teraflops computers. (orig.)

  12. Taxonomy of cloud computing services

    NARCIS (Netherlands)

    Hoefer, C.N.; Karagiannis, Georgios

    2010-01-01

    Cloud computing is a highly discussed topic, and many big players of the software industry are entering the development of cloud services. Several companies want to explore the possibilities and benefits of cloud computing, but with the amount of cloud computing services increasing quickly, the need

  13. Computation as Medium

    DEFF Research Database (Denmark)

    Jochum, Elizabeth Ann; Putnam, Lance

    2017-01-01

    Artists increasingly utilize computational tools to generate art works. Computational approaches to art making open up new ways of thinking about agency in interactive art because they invite participation and allow for unpredictable outcomes. Computational art is closely linked to the participatory turn in visual art, wherein spectators physically participate in visual art works. Unlike purely physical methods of interaction, computer-assisted interactivity affords artists and spectators more nuanced control of artistic outcomes. Interactive art brings together human bodies, computer code, and nonliving objects to create emergent art works. Computation is more than just a tool for artists; it is a medium for investigating new aesthetic possibilities for choreography and composition. We illustrate this potential through two artistic projects: an improvisational dance performance between a human…

  14. Introduction to morphogenetic computing

    CERN Document Server

    Resconi, Germano; Xu, Guanglin

    2017-01-01

    This book offers a concise introduction to morphogenetic computing, showing that its use makes global and local relations, defects in crystal non-Euclidean geometry, databases with source and sink, genetic algorithms, and neural networks more stable and efficient. It also presents applications to databases, language, nanotechnology with defects, biological genetic structure, electrical circuits, and big-data structure. In Turing machines, input and output states form a system: when the system is in one state, the input is transformed into output. This computation is always deterministic, without any possible contradiction or defects. In natural computation there are defects and contradictions that have to be solved to give a coherent and effective computation. The new computation generates the morphology of the system, which assumes different forms in time. The genetic process is the prototype of morphogenetic computing. For the Boolean logic truth value we substitute a set of truth values (active sets) with…

  15. Towards the computational design of solid catalysts

    DEFF Research Database (Denmark)

    Nørskov, Jens Kehlet; Bligaard, Thomas; Rossmeisl, Jan

    2009-01-01

    Over the past decade the theoretical description of surface reactions has undergone a radical development. Advances in density functional theory mean it is now possible to describe catalytic reactions at surfaces with the detail and accuracy required for computational results to compare favourably with experiments. Theoretical methods can be used to describe surface chemical reactions in detail and to understand variations in catalytic activity from one catalyst to another. Here, we review the first steps towards using computational methods to design new catalysts. Examples include screening for catalysts…

  16. Scalable optical quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Manykin, E A; Mel'nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre 'Kurchatov Institute', Moscow (Russian Federation)]

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  17. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i

  19. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  20. 78 FR 34669 - Certain Electronic Devices, Including Wireless Communication Devices, Portable Music and Data...

    Science.gov (United States)

    2013-06-10

    [Fragmentary Federal Register notice text concerning the importation of certain electronic devices, including wireless communication devices, portable music and data processing devices, and tablet computers.]

  1. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  2. Center for computer security: Computer Security Group conference. Summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  3. Progress Towards an LES Wall Model Including Unresolved Roughness

    Science.gov (United States)

    Craft, Kyle; Redman, Andrew; Aikens, Kurt

    2015-11-01

    Wall models used in large eddy simulations (LES) are often based on theories for hydraulically smooth walls. While this is reasonable for many applications, there are also many where the impact of surface roughness is important. A previously developed wall model has been used primarily for jet engine aeroacoustics. However, jet simulations have not accurately captured thick initial shear layers found in some experimental data. This may partly be due to nozzle wall roughness used in the experiments to promote turbulent boundary layers. As a result, the wall model is extended to include the effects of unresolved wall roughness through appropriate alterations to the log-law. The methodology is tested for incompressible flat plate boundary layers with different surface roughness. Correct trends are noted for the impact of surface roughness on the velocity profile. However, velocity deficit profiles and the Reynolds stresses do not collapse as well as expected. Possible reasons for the discrepancies as well as future work will be presented. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1053575. Computational resources on TACC Stampede were provided under XSEDE allocation ENG150001.
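
    For orientation (standard textbook relations, not necessarily the authors' specific modification), roughness usually enters the log-law as a downward shift ΔU⁺ of the mean-velocity profile:

        # Smooth- and rough-wall log-law forms (textbook approximations; the
        # wall model described in the abstract may differ in detail).
        import math

        KAPPA, B = 0.41, 5.0  # von Karman constant, smooth-wall intercept

        def u_plus_smooth(y_plus: float) -> float:
            return math.log(y_plus) / KAPPA + B

        def u_plus_rough(y_plus: float, ks_plus: float) -> float:
            """Fully rough regime: the roughness function shifts the profile down."""
            delta_u = math.log(ks_plus) / KAPPA - 3.5  # Nikuradse-type fit (approx.)
            return u_plus_smooth(y_plus) - delta_u

        print(f"smooth: {u_plus_smooth(1000):.1f}   "
              f"rough (ks+ = 100): {u_plus_rough(1000, 100):.1f}")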

  4. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  5. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  6. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue-superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justify the cost. For imaging in the abdomen, a scanner with rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media are used in imaging to demonstrate […], one console provides control of the scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions), are useful tools
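
    Computationally, the region-of-interest quantitation described above amounts to masking the image and integrating pixel values; a schematic sketch (the geometry, pixel spacing and densities are assumed values, not from the paper):

        # Schematic region-of-interest quantitation on a synthetic CT slice.
        import numpy as np

        hu = np.full((512, 512), -1000.0)          # slice of air, Hounsfield units
        yy, xx = np.mgrid[:512, :512]
        lesion = (yy - 256) ** 2 + (xx - 256) ** 2 < 40 ** 2
        hu[lesion] = 60.0                          # soft-tissue-like density

        pixel_area_mm2 = 0.5 * 0.5                 # assumed in-plane spacing (mm)
        roi = lesion                               # in practice: cursor-drawn outline
        area_mm2 = roi.sum() * pixel_area_mm2
        print(f"ROI area = {area_mm2:.0f} mm^2, mean density = {hu[roi].mean():.0f} HU")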

  7. Adiabatic Quantum Computing

    Science.gov (United States)

    Landahl, Andrew

    2012-10-01

    Quantum computers promise to exploit counterintuitive quantum physics principles like superposition, entanglement, and uncertainty to solve problems using fundamentally fewer steps than any conventional computer ever could. The mere possibility of such a device has sharpened our understanding of quantum coherent information, just as lasers did for our understanding of coherent light. The chief obstacle to developing quantum computer technology is decoherence--one of the fastest phenomena in all of physics. In principle, decoherence can be overcome by using clever entangled redundancies in a process called fault-tolerant quantum error correction. However, the quality and scale of technology required to realize this solution appears distant. An exciting alternative is a proposal called ``adiabatic'' quantum computing (AQC), in which adiabatic quantum physics keeps the computer in its lowest-energy configuration throughout its operation, rendering it immune to many decoherence sources. The Adiabatic Quantum Architectures In Ultracold Systems (AQUARIUS) Grand Challenge Project at Sandia seeks to demonstrate this robustness in the laboratory and point a path forward for future hardware development. We are building devices in AQUARIUS that realize the AQC architecture on up to three quantum bits (``qubits'') in two platforms: Cs atoms laser-cooled to below 5 microkelvin and Si quantum dots cryo-cooled to below 100 millikelvin. We are also expanding theoretical frontiers by developing methods for scalable universal AQC in these platforms. We have successfully demonstrated operational qubits in both platforms and have even run modest one-qubit calculations using our Cs device. In the course of reaching our primary proof-of-principle demonstrations, we have developed multiple spinoff technologies including nanofabricated diffractive optical elements that define optical-tweezer trap arrays and atomic-scale Si lithography commensurate with placing individual donor atoms with

  8. Brain-Computer Symbiosis

    Science.gov (United States)

    Schalk, Gerwin

    2009-01-01

    The theoretical groundwork of the 1930’s and 1940’s and the technical advance of computers in the following decades provided the basis for dramatic increases in human efficiency. While computers continue to evolve, and we can still expect increasing benefits from their use, the interface between humans and computers has begun to present a serious impediment to full realization of the potential payoff. This article is about the theoretical and practical possibility that direct communication between the brain and the computer can be used to overcome this impediment by improving or augmenting conventional forms of human communication. It is about the opportunity that the limitations of our body’s input and output capacities can be overcome using direct interaction with the brain, and it discusses the assumptions, possible limitations, and implications of a technology that I anticipate will be a major source of pervasive changes in the coming decades. PMID:18310804

  9. 'Cloud computing' and clinical trials: report from an ECRIN workshop.

    Science.gov (United States)

    Ohmann, Christian; Canham, Steve; Danielyan, Edgar; Robertshaw, Steve; Legré, Yannick; Clivio, Luca; Demotes, Jacques

    2015-07-29

    Growing use of cloud computing in clinical trials prompted the European Clinical Research Infrastructures Network, a European non-profit organisation established to support multinational clinical research, to organise a one-day workshop on the topic to clarify potential benefits and risks. The issues that arose in that workshop are summarised and include the following: the nature of cloud computing and the cloud computing industry; the risks in using cloud computing services now; the lack of explicit guidance on this subject, both generally and with reference to clinical trials; and some possible ways of reducing risks. There was particular interest in developing and using a European 'community cloud' specifically for academic clinical trial data. It was recognised that the day-long workshop was only the start of an ongoing process. Future discussion needs to include clarification of trial-specific regulatory requirements for cloud computing and involve representatives from the relevant regulatory bodies.

  10. Resource allocation in grid computing

    NARCIS (Netherlands)

    Koole, Ger; Righter, Rhonda

    2007-01-01

    Grid computing, in which a network of computers is integrated to create a very fast virtual computer, is becoming ever more prevalent. Examples include the TeraGrid and Planet-lab.org, as well as applications on the existing Internet that take advantage of unused computing and storage capacity of

  11. Spatial Computing and Spatial Practices

    DEFF Research Database (Denmark)

    Brodersen, Anders; Büsher, Monika; Christensen, Michael

    2007-01-01

    The gathering momentum behind the research agendas of pervasive, ubiquitous and ambient computing, set in motion by Mark Weiser (1991), offers dramatic opportunities for information systems design. These agendas raise the possibility of "putting computation where it belongs" by exploding computing power out… In studying the "disappearing computer" we have, therefore, carried over from previous research an interdisciplinary perspective and a focus on the sociality of action (Suchman 1987)…

  12. Learning Universal Computations with Spikes

    Science.gov (United States)

    Thalmeier, Dominik; Uhlmann, Marvin; Kappen, Hilbert J.; Memmesheimer, Raoul-Martin

    2016-01-01

    Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g. for locomotion. Many such computations require previous building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. Firstly, we derive constraints under which classes of spiking neural networks lend themselves as substrates for powerful general-purpose computing. The networks contain dendritic or synaptic nonlinearities and have a constrained connectivity. We then combine such networks with learning rules for outputs or recurrent connections. We show that this allows the networks to learn even difficult benchmark tasks such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them. PMID:27309381

  13. New computer systems

    International Nuclear Information System (INIS)

    Faerber, G.

    1975-01-01

    Process computers have already become indispensable technical aids for monitoring and automation tasks in nuclear power stations. Yet there are still some problems connected with their use whose elimination should be the main objective in the development of new computer systems. In the paper, some of these problems are summarized, new tendencies in hardware development are outlined, and finally some new system concepts made possible by the hardware development are explained. (orig./AK)

  14. SLAC B Factory computing

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1992-02-01

    As part of the research and development program in preparation for a possible B Factory at SLAC, a group has been studying various aspects of HEP computing. In particular, the group is investigating the use of UNIX for all computing, from data acquisition, through analysis, and word processing. A summary of some of the results of this study will be given, along with some personal opinions on these topics

  15. Computer games addiction

    OpenAIRE

    Nejepínský, Adam

    2010-01-01

    This bachelor thesis deals with the problem of computer games addiction. Attention is paid mainly to on-line games for multiple players. The purpose of this thesis was to describe this problem and to check, through a questionnaire investigation, whether addiction to computer games and the impacts connected with the games really deserve the extensive attention of experts and laypeople. The thesis has two parts -- theoretical and practical ones. The theoretical part describes the possibilities of diagnosin...

  16. The surgery of peripheral nerves (including tumors)

    DEFF Research Database (Denmark)

    Fugleholm, Kåre

    2013-01-01

    Surgical pathology of the peripheral nervous system includes traumatic injury, entrapment syndromes, and tumors. The recent significant advances in the understanding of the pathophysiology and cellular biology of peripheral nerve degeneration and regeneration have yet to be translated into improved surgical techniques and better outcome after peripheral nerve injury. Decision making in peripheral nerve surgery continues to be a complex challenge, where the mechanism of injury, repeated clinical evaluation, neuroradiological and neurophysiological examination, and detailed knowledge of the peripheral nervous system response to injury are prerequisites to obtain the best possible outcome. Surgery continues to be the primary treatment modality for peripheral nerve tumors, and advances in adjuvant oncological treatment have improved outcome after malignant peripheral nerve tumors. The present chapter

  17. FOX, current state and possibilities

    Czech Academy of Sciences Publication Activity Database

    Černý, R.; Favre-Nicolin, V.; Rohlíček, Jan; Hušák, M.

    2017-01-01

    Vol. 7, No. 10 (2017), pp. 1-9, Article No. 322. ISSN 2073-4352. Institutional support: RVO:68378271. Keywords: powder diffraction * crystal structure solution * global optimization * reverse Monte Carlo * simulated annealing. Subject RIV: JC - Computer Hardware; Software. OECD field: Computer hardware and architecture. Impact factor: 1.566, year: 2016

  18. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs
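
    As a concrete companion to the mathematical fundamentals mentioned above, here is a minimal parallel-beam filtered-backprojection sketch (the standard textbook algorithm in its crudest discrete form, not code from the article; amplitudes are only approximate on such a coarse grid):

        # Minimal parallel-beam CT: project a phantom into a sinogram, then
        # reconstruct by filtered backprojection with a ramp filter.
        import numpy as np
        from scipy.ndimage import rotate

        n, n_angles = 128, 180
        phantom = np.zeros((n, n))
        phantom[40:90, 50:80] = 1.0
        angles = np.linspace(0.0, 180.0, n_angles, endpoint=False)

        # Forward projection: line integrals = column sums of the rotated image.
        sinogram = np.stack([rotate(phantom, a, reshape=False, order=1).sum(axis=0)
                             for a in angles])

        # Ramp-filter each projection in the Fourier domain.
        ramp = np.abs(np.fft.fftfreq(n))
        filtered = np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1).real

        # Backprojection: smear each filtered projection and rotate it back.
        recon = np.zeros((n, n))
        for proj, a in zip(filtered, angles):
            recon += rotate(np.tile(proj, (n, 1)), -a, reshape=False, order=1)
        recon *= np.pi / (2 * n_angles)
        print(f"phantom max 1.00, reconstruction max {recon.max():.2f}")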

  19. All-optical reservoir computing.

    Science.gov (United States)

    Duport, François; Schneider, Bendix; Smerieri, Anteo; Haelterman, Marc; Massar, Serge

    2012-09-24

    Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.
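
    To make the paradigm concrete, here is a minimal software sketch of a generic reservoir computer (an echo-state network in Python; it illustrates the train-only-the-readout idea, not the optical hardware of the paper, and all sizes and parameters are illustrative):

        import numpy as np

        # Minimal echo-state-network sketch of Reservoir Computing (software
        # analogue; parameters illustrative, not taken from the paper).
        # Only the linear readout W_out is trained.
        rng = np.random.default_rng(1)
        N, T = 200, 2000                      # reservoir size, time steps
        u = rng.uniform(-1, 1, T)             # scalar input signal
        target = np.roll(u, 3)                # task: recall the input 3 steps back

        W_in = rng.uniform(-0.5, 0.5, N)
        W = rng.normal(0, 1, (N, N))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

        x = np.zeros(N)
        states = np.zeros((T, N))
        for t in range(T):
            x = np.tanh(W @ x + W_in * u[t])  # nonlinear reservoir update
            states[t] = x

        # ridge-regression readout on the second half (first half = washout)
        S, y = states[T // 2:], target[T // 2:]
        W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ y)
        print("train NMSE:", np.mean((S @ W_out - y) ** 2) / np.var(y))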

  20. An Integrated Biochemistry Laboratory, Including Molecular Modeling

    Science.gov (United States)

    Wolfson, Adele J.; Hall, Mona L.; Branham, Thomas R.

    1996-11-01

    ) experience with methods of protein purification; (iii) incorporation of appropriate controls into experiments; (iv) use of basic statistics in data analysis; (v) writing papers and grant proposals in accepted scientific style; (vi) peer review; (vii) oral presentation of results and proposals; and (viii) introduction to molecular modeling. Figure 1 illustrates the modular nature of the lab curriculum. Elements from each of the exercises can be separated and treated as stand-alone exercises, or combined into short or long projects. We have been able to offer the opportunity to use sophisticated molecular modeling in the final module through funding from an NSF-ILI grant. However, many of the benefits of the research proposal can be achieved with other computer programs, or even by literature survey alone. [Figure 1. Design of project-based biochemistry laboratory. Modules (projects, or portions of projects) are indicated as boxes. Each of these can be treated independently, or used as part of a larger project. Solid lines indicate some suggested paths from one module to the next.] The skills and knowledge required for protein purification and design are developed in three units: (i) an introduction to critical assays needed to monitor degree of purification, including an evaluation of assay parameters; (ii) partial purification by ion-exchange techniques; and (iii) preparation of a grant proposal on protein design by mutagenesis. Brief descriptions of each of these units follow, with experimental details of each project at the end of this paper. Assays for Lysozyme Activity and Protein Concentration (4 weeks): The assays mastered during the first unit are a necessary tool for determining the purity of the enzyme during the second unit on purification by ion exchange. These assays allow an introduction to the concept of specific activity (units of enzyme activity per milligram of total protein) as a measure of purity. In this first sequence, students learn a turbidimetric assay

  1. Computer Prediction of Air Quality in Livestock Buildings

    DEFF Research Database (Denmark)

    Svidt, Kjeld; Bjerg, Bjarne

    In modern livestock buildings the design of ventilation systems is important in order to obtain good air quality. The use of Computational Fluid Dynamics for predicting the air distribution makes it possible to include the effect of room geometry and heat sources in the design process. This paper...... presents numerical prediction of air flow in a livestock building compared with laboratory measurements. An example of the calculation of contaminant distribution is given, and the future possibilities of the method are discussed....

  2. Introduction to reversible computing

    CERN Document Server

    Perumalla, Kalyan S

    2013-01-01

    Few books comprehensively cover the software and programming aspects of reversible computing. Filling this gap, Introduction to Reversible Computing offers an expanded view of the field that includes the traditional energy-motivated hardware viewpoint as well as the emerging application-motivated software approach. Collecting scattered knowledge into one coherent account, the book provides a compendium of both classical and recently developed results on reversible computing. It explores up-and-coming theories, techniques, and tools for the application of rever

  3. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  4. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  5. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  6. Computer science I essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As the name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams or doing homework, and they remain a lasting reference source for students, teachers, and professionals. Computer Science I includes fundamental computer concepts, number representations, Boolean algebra, switching circuits, and computer architecture.

  7. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains the conceptual framework (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information network). The text discusses algebra, concentrating on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasize

  8. Learner Perceptions of Realism and Magic in Computer Simulations.

    Science.gov (United States)

    Hennessy, Sara; O'Shea, Tim

    1993-01-01

    Discusses the possible lack of credibility in educational interactive computer simulations. Topics addressed include "Shopping on Mars," a collaborative adventure game for arithmetic calculation that uses direct manipulation in the microworld; the Alternative Reality Kit, a graphical animated environment for creating interactive…

  9. Olkiluoto surface hydrological modelling: Update 2012 including salt transport modelling

    International Nuclear Information System (INIS)

    Karvonen, T.

    2013-11-01

    Posiva Oy is responsible for implementing a final disposal program for the spent nuclear fuel of its owners Teollisuuden Voima Oyj and Fortum Power and Heat Oy. The spent nuclear fuel is planned to be disposed of at a depth of about 400-450 meters in the crystalline bedrock at the Olkiluoto site. Leakages located at or close to the spent fuel repository may give rise to the upconing of deep, highly saline groundwater, and this is a concern with regard to the performance of the tunnel backfill material after the closure of the tunnels. Therefore a salt transport sub-model was added to the Olkiluoto surface hydrological model (SHYD). The other improvements include an update of the particle tracking algorithm and the possibility to estimate the influence of open drillholes in a case where overpressure in inflatable packers decreases, causing a hydraulic short-circuit between hydrogeological zones HZ19 and HZ20 along the drillhole. Four new hydrogeological zones HZ056, HZ146, BFZ100 and HZ039 were added to the model. In addition, zones HZ20A and HZ20B intersect with each other in the new structure model, which influences salinity upconing caused by leakages in shafts. The aim of the modelling of the long-term influence of ONKALO, the shafts and the repository tunnels is to provide computational results that can be used to suggest limits for allowed leakages. The model input data included all the existing leakages into ONKALO (35-38 l/min) and the shafts in present-day conditions. The influence of shafts was computed using eight different values for total shaft leakage: 5, 11, 20, 30, 40, 50, 60 and 70 l/min. The selection of the leakage criteria for shafts was influenced by the fact that upconing of saline water increases TDS-values close to the repository areas although HZ20B does not intersect any deposition tunnels. The total limit for all leakages was suggested to be 120 l/min. The limit for HZ20 zones was proposed to be 40 l/min: about 5 l/min the present day leakages to access tunnel, 25 l/min from

  11. Computational Science and Innovation

    International Nuclear Information System (INIS)

    Dean, David Jarvis

    2011-01-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  12. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a 'spatial turn' in computing provide the base for place-specific computing as a suggested new genre of interaction design. In the REcult project, place-specific computing is explored through design-oriented research. This article reports six pilot studies where design students have designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments for place-specific computing as a genre of interaction design are described. A total of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented, indicating potentials, possibilities and problems as directions for future...

  13. The ripple electromagnetic calculation: accuracy demand and possible responses

    International Nuclear Information System (INIS)

    Cocilovo, V.; Ramogida, G.; Formisano, A.; Martone, R.; Portone, A.; Roccella, M.; Roccella, R.

    2006-01-01

    Due to a number of causes (the finite number of toroidal field coils, or the presence of concentrated blocks of magnetic materials such as the neutral beam shielding) the actual magnetic configuration in a Tokamak differs from the desired one. For example, a ripple is added to the ideal axisymmetric toroidal field, impacting the equilibrium and stability of the plasma column; as a further example, the magnetic field outside the plasma affects the operation of a number of critical components, including the diagnostic system and the neutral beam. Therefore the actual magnetic field has to be suitably calculated and its shape controlled within the required limits. Due to the complexity of its design, the problem is quite critical for the ITER project. In this paper the problem is discussed from both the mathematical and the numerical point of view. In particular, a complete formulation is proposed, taking into account both the presence of nonlinear magnetic materials and the fully 3D geometry. Then the quality requirements are discussed, including the accuracy of the calculations and the spatial resolution. As a consequence, numerical tools able to fulfil the quality needs while requiring a reasonable computational burden are considered. In particular, possible tools based on numerical FEM schemes are considered; in addition, in spite of the presence of nonlinear materials, the practical possibility to use Biot-Savart based approaches as cross-check tools is also discussed. The paper also analyses possible simplifications of the geometry able to make the actual calculation feasible while guaranteeing the required accuracy. Finally, the characteristics required for a correction system able to effectively counteract the magnetic field degradation are presented. A number of examples are also reported and commented upon. (author)
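
    The Biot-Savart cross-check mentioned above is easy to illustrate in miniature (a hypothetical stand-alone Python example, unrelated to the actual ITER tools): integrate the Biot-Savart law numerically around a circular coil and compare with the analytic on-axis field.

        import numpy as np

        # Hypothetical Biot-Savart cross-check: field on the axis of a circular
        # current loop by direct numerical integration of
        #   dB = (mu0 I / 4 pi) dl x r / |r|^3,
        # compared against the analytic on-axis result. Values are illustrative.
        mu0, I, R, z = 4e-7 * np.pi, 1.0e6, 3.0, 1.5   # SI units

        phi = np.linspace(0.0, 2.0 * np.pi, 2000, endpoint=False)
        dphi = phi[1] - phi[0]
        pts = np.stack([R * np.cos(phi), R * np.sin(phi), np.zeros_like(phi)], axis=1)
        dl = np.stack([-R * np.sin(phi), R * np.cos(phi), np.zeros_like(phi)], axis=1) * dphi

        r = np.array([0.0, 0.0, z]) - pts      # vectors from segments to field point
        B = mu0 * I / (4 * np.pi) * np.sum(
            np.cross(dl, r) / np.linalg.norm(r, axis=1)[:, None] ** 3, axis=0)

        B_analytic = mu0 * I * R**2 / (2 * (R**2 + z**2) ** 1.5)
        print(B[2], B_analytic)                # the z-components should agree closely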

  14. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  15. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  16. Energy principle with included boundary conditions

    International Nuclear Information System (INIS)

    Lehnert, B.

    1994-01-01

    Earlier comments by the author on the limitations of the classical form of the extended energy principle are supported by a complementary analysis on the potential energy change arising from free-boundary displacements of a magnetically confined plasma. In the final formulation of the extended principle, restricted displacements, satisfying pressure continuity by means of plasma volume currents in a thin boundary layer, are replaced by unrestricted (arbitrary) displacements which can give rise to induced surface currents. It is found that these currents contribute to the change in potential energy, and that their contribution is not taken into account by such a formulation. A general expression is further given for surface currents induced by arbitrary displacements. The expression is used to reformulate the energy principle for the class of displacements which satisfy all necessary boundary conditions, including that of the pressure balance. This makes a minimization procedure of the potential energy possible, for the class of all physically relevant test functions which include the constraints imposed by the boundary conditions. Such a procedure is also consistent with a corresponding variational calculus. (Author)

  17. Possible displacement of mercury's dipole

    International Nuclear Information System (INIS)

    Ng, K.H.; Beard, D.B.

    1979-01-01

    Earlier attempts to model the Hermean magnetospheric field based on a planet-centered magnetic multipole field have required the addition of a quadrupole moment to obtain a good fit to space vehicle observations. In this work we obtain an equally satisfactory fit by assuming a null quadrupole moment and least-squares fitting the displacement of the planetary dipole from the center of the planet. We find a best fit for a dipole displacement from the planet center of 0.033 R_m away from the solar direction, 0.025 R_m toward dawn in the magnetic equatorial plane, and 0.189 R_m northward along the magnetic dipole axis, where R_m is the planet radius. Therefore the presence of a magnetic quadrupole moment is not ruled out. The compressed dipole field more completely represents the field in the present work than in previous work, where the intrinsic quadrupole field was not included in the magnetopause surface and field calculations. Moreover, we have corrected a programming error in previous work in the computation of the dipole tilt λ away from the sun. We find a slight increase of the planet dipole moment to 190γ R_m^3 and a dipole tilt angle λ of only 1.2° away from the sun. All other parameters are essentially unchanged
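
    For reference, the displaced-dipole model amounts to evaluating the standard dipole field at a shifted position (a textbook expression, not quoted from the paper):

        \mathbf{B}(\mathbf{r}) = \frac{\mu_0}{4\pi}\,
        \frac{3(\mathbf{m}\cdot\hat{\mathbf{d}})\,\hat{\mathbf{d}} - \mathbf{m}}{d^{3}},
        \qquad
        \mathbf{d} = \mathbf{r} - \mathbf{r}_0 ,\quad d = |\mathbf{d}| ,

    where \mathbf{r}_0 is the dipole displacement from the planet center; the fit adjusts \mathbf{r}_0 and \mathbf{m} against the observations.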

  18. Zγ production at NNLO including anomalous couplings

    Science.gov (United States)

    Campbell, John M.; Neumann, Tobias; Williams, Ciaran

    2017-11-01

    In this paper we present a next-to-next-to-leading order (NNLO) QCD calculation of the processes pp → l+l-γ and pp → νν̄γ that we have implemented in MCFM. Our calculation includes QCD corrections at NNLO both for the Standard Model (SM) and additionally in the presence of Zγγ and ZZγ anomalous couplings. We compare our implementation, obtained using the jettiness slicing approach, with a previous SM calculation and find broad agreement. Focusing on the sensitivity of our results to the slicing parameter, we show that using our setup we are able to compute NNLO cross sections with numerical uncertainties of about 0.1%, which is small compared to residual scale uncertainties of a few percent. We study potential improvements using two different jettiness definitions and the inclusion of power corrections. At √s = 13 TeV we present phenomenological results and consider Zγ as a background to H → Zγ production. We find that, with typical cuts, the inclusion of NNLO corrections represents a small effect and weakens the limits extracted on anomalous couplings by about 10%.

  19. Langevin simulations of QCD, including fermions

    International Nuclear Information System (INIS)

    Kronfeld, A.S.

    1986-02-01

    We encounter critical slowing down in updating when ξ/a → ∞ and in matrix inversion (needed to include fermions) when m_q a → 0. A simulation that purports to solve QCD numerically will encounter these limits, so to face the challenge in the title of this workshop, we must cure the disease of critical slowing down. Physically, this critical slowing down is due to the reluctance of changes at short distances to propagate to large distances. Numerically, the stability of an algorithm at short wavelengths requires a (moderately) small step size; critical slowing down occurs when the effective long-wavelength step size becomes tiny. The remedy for this disease is an algorithm that propagates signals quickly throughout the system, i.e. one whose effective step size is not reduced for the long-wavelength components of the fields. (Here the effective 'step size' is essentially an inverse decorrelation time.) To do so one must resolve the various wavelengths of the system and modify the dynamics (in CPU time) of the simulation so that all modes evolve at roughly the same rate. This can be achieved by introducing Fourier transforms. I show how to implement Fourier acceleration for Langevin updating and for conjugate gradient matrix inversion. The crucial feature of these algorithms that lends them to Fourier acceleration is that they update the lattice globally; hence the Fourier transforms are computed once per sweep rather than once per hit. (orig./HSI)
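
    A minimal sketch of the idea (Fourier-accelerated Langevin updating for a one-dimensional free scalar field in Python; the toy field theory, step sizes and kernel are illustrative, not the QCD implementation described above):

        import numpy as np

        # Fourier-accelerated Langevin updating for a 1-D free scalar field with
        # action S = sum_x [ (phi(x+1)-phi(x))^2 / 2 + m^2 phi(x)^2 / 2 ].
        # The step size is made momentum dependent,
        #   eps(p) ~ eps * (p2_max + m^2) / (p^2 + m^2),
        # so long-wavelength modes evolve as fast as short-wavelength ones.
        N, m2, eps, nsweeps = 64, 0.01, 0.01, 1000
        rng = np.random.default_rng(0)
        phi = np.zeros(N)

        # lattice momenta: p^2 = 4 sin^2(pi k / N)
        p2 = 4.0 * np.sin(np.pi * np.arange(N) / N) ** 2
        accel = (p2.max() + m2) / (p2 + m2)        # acceleration kernel

        for sweep in range(nsweeps):
            # drift term: dS/dphi = -laplacian(phi) + m^2 phi
            drift = -(np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) + m2 * phi
            noise = rng.standard_normal(N)
            # global update in momentum space with mode-dependent step size
            dphi_k = (-eps * accel * np.fft.fft(drift)
                      + np.sqrt(2 * eps * accel) * np.fft.fft(noise))
            phi += np.real(np.fft.ifft(dphi_k))

        print("<phi^2> =", np.mean(phi ** 2))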

  20. Diffuse abnormalities of the trachea: computed tomography findings

    International Nuclear Information System (INIS)

    Marchiori, Edson; Araujo Neto, Cesar de

    2008-01-01

    The aim of this pictorial essay was to present the main computed tomography findings seen in diffuse diseases of the trachea. The diseases studied included amyloidosis, tracheobronchopathia osteochondroplastica, tracheobronchomegaly, laryngotracheobronchial papillomatosis, lymphoma, neurofibromatosis, relapsing polychondritis, Wegener's granulomatosis, tuberculosis, paracoccidioidomycosis, and tracheobronchomalacia. The most common computed tomography finding was thickening of the walls of the trachea, with or without nodules, parietal calcifications, or involvement of the posterior wall. Although computed tomography allows the detection and characterization of diseases of the central airways, and the correlation with clinical data reduces the diagnostic possibilities, bronchoscopy with biopsy remains the most useful procedure for the diagnosis of diffuse lesions of the trachea. (author)

  1. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  2. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  3. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today's microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  4. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  5. Internet at school: possibility for information literacy

    Directory of Open Access Journals (Sweden)

    Maria Conceição da Silva Linhares

    2015-06-01

    In this work, the contribution of teaching practices using the Internet's social networking tools to the information literacy of high school students is analyzed. According to authors such as Gasque (2012), Cervero (2007), Area (2006), Smith (2002) and Freire (1987), knowing how to use information and the means to express it requires a creative approach: understanding what we read by interconnecting keywords, concepts and ideas through intertextuality. This knowledge is highly valued in today's society, characterized by the exponential increase of information available in various formats and languages through information and communication technologies, including the Internet. The qualitative approach, from the perspective of participant observation, suits the object of this study, as it considers in its analysis the relationships between subjects and cultural mediations, objectified by Internet spaces and tools, in order to illuminate information literacy. Developing pedagogical practices that use the Internet's social media and tools for information literacy work contributes to a meaningful experience with information.

  6. Demonstration of blind quantum computing.

    Science.gov (United States)

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip

    2012-01-20

    Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.

  7. On possible technologies for creating nanostructured coatings

    International Nuclear Information System (INIS)

    Blednova, Zh.M.; Chaevskij, M.I.; Rusinov, P.O.

    2008-01-01

    Possible technologies for forming nanostructured coatings are considered: a method of thermal mass transfer under a high temperature gradient; a combined method including cathode-plasma nitriding at low pressure and deposition of titanium nitride in a single work cycle; and a combined method including high-frequency ion nitriding and deposition of chromium carbide by pyrolysis of chromium organic compounds in the plasma of a decaying discharge. The possibility of forming layered nanostructured coatings is shown.

  8. Computational mathematics in China

    CERN Document Server

    Shi, Zhong-Ci

    1994-01-01

    This volume describes the most significant contributions made by Chinese mathematicians over the past decades in various areas of computational mathematics. Some of the results are quite important and complement Western developments in the field. The contributors to the volume range from noted senior mathematicians to promising young researchers. The topics include finite element methods, computational fluid mechanics, numerical solutions of differential equations, computational methods in dynamical systems, numerical algebra, approximation, and optimization. Containing a number of survey articles, the book provides an excellent way for Western readers to gain an understanding of the status and trends of computational mathematics in China.

  9. Computer Games and Art

    Directory of Open Access Journals (Sweden)

    Anton Sukhov

    2015-10-01

    This article is devoted to the search for relevant sources (primary and secondary) and to the characteristics of computer games that allow them to be included in the field of art (such as the creation of artistic games, computer graphics, active interaction with other forms of art, signs of a spiritual aesthetic act, the distinct temporality of computer games, "aesthetic illusion", and interactivity). In general, modern computer games can be attributed both to commercial art and popular culture (blockbuster games) and to elite forms of contemporary media art (author's games, visionary games).

  10. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn
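
    Among the topics listed, the tridiagonal solver is the simplest to state; below is a hedged Python sketch of the classic serial Thomas algorithm (illustrative only; its forward sweep is an inherently sequential recurrence, which is exactly why parallel machines favour reformulations such as cyclic reduction):

        import numpy as np

        def thomas(a, b, c, d):
            """Solve a tridiagonal system with sub-diagonal a, diagonal b,
            super-diagonal c and right-hand side d (Thomas algorithm).
            The forward sweep is a sequential recurrence, which is why
            parallel machines favour methods like cyclic reduction."""
            n = len(d)
            cp, dp = np.empty(n), np.empty(n)
            cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
            for i in range(1, n):                      # forward elimination
                m = b[i] - a[i] * cp[i - 1]
                cp[i] = c[i] / m if i < n - 1 else 0.0
                dp[i] = (d[i] - a[i] * dp[i - 1]) / m
            x = np.empty(n)
            x[-1] = dp[-1]
            for i in range(n - 2, -1, -1):             # back substitution
                x[i] = dp[i] - cp[i] * x[i + 1]
            return x

        # quick check against a dense solver
        n = 6
        a = np.r_[0.0, np.full(n - 1, -1.0)]           # a[0] unused
        b = np.full(n, 2.0)
        c = np.r_[np.full(n - 1, -1.0), 0.0]           # c[-1] unused
        d = np.arange(1.0, n + 1)
        A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
        print(np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d)))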

  11. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  12. Nuclear Computational Low Energy Initiative (NUCLEI)

    Energy Technology Data Exchange (ETDEWEB)

    Reddy, Sanjay K. [University of Washington

    2017-08-14

    This is the final report for the University of Washington for the NUCLEI SciDAC-3 project. The NUCLEI project, as defined by the scope of work, will develop, implement and run codes for large-scale computations of many topics in low-energy nuclear physics. Physics to be studied includes the properties of nuclei and nuclear decays, nuclear structure and reactions, and the properties of nuclear matter. The computational techniques to be used include Quantum Monte Carlo, Configuration Interaction, Coupled Cluster, and Density Functional methods. The research program will emphasize areas of high interest to current and possible future DOE nuclear physics facilities, including ATLAS and FRIB (nuclear structure and reactions, and nuclear astrophysics), TJNAF (neutron distributions in nuclei, few body systems, and electroweak processes), NIF (thermonuclear reactions), MAJORANA and FNPB (neutrino-less double-beta decay and physics beyond the Standard Model), and LANSCE (fission studies).

  13. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  14. Theoretical Computer Science

    DEFF Research Database (Denmark)

    2002-01-01

    The proceedings contain 8 papers from the Conference on Theoretical Computer Science. Topics discussed include: query by committee, linear separation and random walks; hardness results for neural network approximation problems; a geometric approach to leveraging weak learners; mind change...

  15. Paraconsistent Computational Logic

    DEFF Research Database (Denmark)

    Jensen, Andreas Schmidt; Villadsen, Jørgen

    2012-01-01

    In classical logic everything follows from inconsistency and this makes classical logic problematic in areas of computer science where contradictions seem unavoidable. We describe a many-valued paraconsistent logic, discuss the truth tables and include a small case study....
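
    As a hedged illustration of what many-valued paraconsistent truth tables can look like (this sketch uses Priest's well-known three-valued logic LP, not necessarily the logic developed in the paper):

        # Truth tables for Priest's three-valued paraconsistent logic LP
        # (an illustration of the genre; not necessarily the logic of the paper).
        # Values: F < B < T, where B ("both") is a designated glut value, so
        # from a contradiction p & ~p not everything follows.
        F, B, T = 0.0, 0.5, 1.0
        names = {F: "F", B: "B", T: "T"}

        neg = lambda a: 1.0 - a
        conj = min
        disj = max

        print("a    b    ~a   a&b  a|b")
        for a in (F, B, T):
            for b in (F, B, T):
                print(names[a], names[b], names[neg(a)],
                      names[conj(a, b)], names[disj(a, b)], sep="    ")

        # Explosion fails: p & ~p evaluates to B (designated) when p = B,
        # while an arbitrary q may still be F.
        p, q = B, F
        print("p & ~p =", names[conj(p, neg(p))], " q =", names[q])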

  16. Computational modeling in biomechanics

    CERN Document Server

    Mofrad, Mohammad

    2010-01-01

    This book provides a glimpse of the diverse and important roles that modern computational technology is playing in various areas of biomechanics. It includes unique chapters on ab initio quantum mechanical, molecular dynamics and scale coupling methods...

  17. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community

  18. Review your Computer Security Now and Frequently!

    CERN Multimedia

    IT Department

    2009-01-01

    The start-up of LHC is foreseen to take place in the autumn and we will be in the public spotlight again. This increases the necessity to be vigilant with respect to computer security, and the defacement of an experiment's Web page in September last year shows that we should be particularly attentive. Attackers are permanently probing CERN and so we must all do the maximum to reduce future risks. Security is a hierarchical responsibility and requires balancing the allocation of resources between making systems work and making them secure. Thus all of us, whether users, developers, system experts, administrators, or managers, are responsible for securing our computing assets. These include computers, software applications, documents, accounts and passwords. There is no "silver bullet" for securing systems, which can only be achieved by a painstaking search for all possible vulnerabilities followed by their mitigation. Additional advice on particular topics can be obtained from the relevant I...

  19. (Some) Computer Futures: Mainframes.

    Science.gov (United States)

    Joseph, Earl C.

    Possible futures for the world of mainframe computers can be forecast through studies identifying forces of change and their impact on current trends. Some new prospects for the future have been generated by advances in information technology; for example, recent United States successes in applied artificial intelligence (AI) have created new…

  20. Advances and challenges in computational plasma science

    International Nuclear Information System (INIS)

    Tang, W M; Chan, V S

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This

  1. 75 FR 4583 - In the Matter of: Certain Electronic Devices, Including Mobile Phones, Portable Music Players...

    Science.gov (United States)

    2010-01-28

    ..., Including Mobile Phones, Portable Music Players, and Computers; Notice of Investigation AGENCY: U.S... music players, and computers, by reason of infringement of certain claims of U.S. Patent Nos. 6,714,091... importation of certain electronic devices, including mobile phones, portable music players, or computers that...

  2. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  3. Computer simulation on molten ionic salts

    International Nuclear Information System (INIS)

    Kawamura, K.; Okada, I.

    1978-01-01

    The extensive advances in computer technology have since made it possible to apply computer simulation to the evaluation of the macroscopic and microscopic properties of molten salts. The evaluation of the potential energy in molten salt systems is complicated by the presence of long-range energy, i.e. Coulomb energy, in contrast to simple liquids where the potential energy is easily evaluated. It has been shown, however, that no difficulties are encountered when the Ewald method is applied to the evaluation of Coulomb energy. After a number of attempts had been made to approximate the pair potential, the Huggins-Mayer potential based on ionic crystals became the most often employed. Since it is thought that the only appreciable contribution to many-body potential, not included in the Huggins-Mayer potential, arises from the internal electrostatic polarization of ions in molten ionic salts, computer simulation with a provision for ion polarization has been tried recently. The computations, which are employed mainly for molten alkali halides, can provide: (1) thermodynamic data such as internal energy, internal pressure and isothermal compressibility; (2) microscopic configurational data such as radial distribution functions; (3) transport data such as the diffusion coefficient and electrical conductivity; and (4) spectroscopic data such as the intensity of inelastic scattering and the stretching frequency of simple molecules. The computed results seem to agree well with the measured results. Computer simulation can also be used to test the effectiveness of a proposed pair potential and the adequacy of postulated models of molten salts, and to obtain experimentally inaccessible data. A further application of MD computation employing the pair potential based on an ionic model to BeF2, ZnCl2 and SiO2 shows the possibility of quantitative interpretation of structures and glass transformation phenomena
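
    The Ewald method referred to above splits the conditionally convergent Coulomb sum into two rapidly convergent sums plus a self-energy term; in standard form (a textbook expression in Gaussian units, not quoted from the review):

        E_{\mathrm{Coul}} =
        \frac{1}{2} \sum_{i \neq j} q_i q_j \frac{\operatorname{erfc}(\alpha r_{ij})}{r_{ij}}
        + \frac{2\pi}{V} \sum_{\mathbf{k} \neq 0}
          \frac{e^{-k^{2}/4\alpha^{2}}}{k^{2}}
          \Big| \sum_j q_j e^{i \mathbf{k} \cdot \mathbf{r}_j} \Big|^{2}
        - \frac{\alpha}{\sqrt{\pi}} \sum_j q_j^{2} ,

    where the splitting parameter \alpha trades work between the real-space and reciprocal-space sums.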

  4. Current possibilities of chorioretinites diagnostics

    Directory of Open Access Journals (Sweden)

    O. V. Chudinova

    2014-07-01

    Purpose: to study the morphometric changes in the retina and the state of regional hemodynamics in chorioretinitis of different etiology, and to draw parallels between these methods of study with an evaluation of their diagnostic significance. Methods: Clinical and instrumental examination was performed in 15 patients (15 eyes; group 1) with a verified diagnosis of toxoplasmic chorioretinitis and in 13 patients (13 eyes; group 2) with a diagnosis of tuberculous chorioretinitis. The control group (group 3) consisted of 20 subjects (40 eyes; 9 males, 11 females) without any pathology of the organ of vision. Complex ophthalmologic examination was performed in all patients, including determination of corrected visual acuity, computer perimetry, biomicroscopy of the eye fundus, inspection of the eye fundus using a Goldmann lens, optical coherence tomography (OCT), and ultrasound Dopplerography (USDG) of the eye vessels. Results: the following was determined from the OCT data: subclinical serous retinal detachment, isolated cells of cyst-like edema, cyst-like edema in the macular zone, unevenness of the hyperreflective band of the pigment epithelium, thinning of the neurosensory retina in the area of the scar focus, hyperreflectivity of the zone of forming fibrosis, and disturbed architectonics of the neuroepithelial (NE) layers in the foveolar zone and parafoveally due to the presence of small hyperreflective particles. In the presence of a proliferative process in the vascular coat, a reliable decrease of the maximal and minimal blood flow velocities in the posterior short ciliary arteries and in the posterior long ciliary arteries was found in comparison with the values of the control group. The data obtained suggest that proliferative processes in the vascular coat are accompanied by marked local hemodynamic disorders, which should be taken into consideration when complex therapy is prescribed. Conclusion: Dynamic

  5. McMaster University: College and University Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    The computing and information services (CIS) organization includes administrative computing, academic computing, and networking and has three divisions: computing services, development services, and information services. Other computing activities include Health Sciences, Humanities Computing Center, and Department of Computer Science and Systems.…

  6. An introduction to computer viruses

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D.R.

    1992-03-01

    This report on computer viruses is based upon a thesis written for the Master of Science degree in Computer Science from the University of Tennessee in December 1989 by David R. Brown. This thesis is entitled An Analysis of Computer Virus Construction, Proliferation, and Control and is available through the University of Tennessee Library. This paper contains an overview of the computer virus arena that can help the reader to evaluate the threat that computer viruses pose. The extent of this threat can only be determined by evaluating many different factors. These factors include the relative ease with which a computer virus can be written, the motivation involved in writing a computer virus, the damage and overhead incurred by infected systems, and the legal implications of computer viruses, among others. Based upon the research, the development of a computer virus seems to require more persistence than technical expertise. This is a frightening proclamation to the computing community. The education of computer professionals to the dangers that viruses pose to the welfare of the computing industry as a whole is stressed as a means of inhibiting the current proliferation of computer virus programs. Recommendations are made to assist computer users in preventing infection by computer viruses. These recommendations support solid general computer security practices as a means of combating computer viruses.

  7. GPU-computing in econophysics and statistical physics

    Science.gov (United States)

    Preis, T.

    2011-03-01

    A recent trend in computer science and related fields is general purpose computing on graphics processing units (GPUs), which can yield impressive performance. With multiple cores connected by high memory bandwidth, today's GPUs offer resources for non-graphics parallel processing. This article provides a brief introduction to the field of GPU computing and includes examples. In particular, computationally expensive analyses employed in the financial market context are coded on a graphics card architecture, which leads to a significant reduction of computing time. In order to demonstrate the wide range of possible applications, a standard model in statistical physics - the Ising model - is ported to a graphics card architecture as well, resulting in large speedup values.
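
    A hedged sketch of the data-parallel structure that makes the Ising model a natural GPU benchmark (written in NumPy for brevity; a CUDA port would assign one thread per site of each checkerboard sub-lattice; all parameters are illustrative):

        import numpy as np

        # Checkerboard Metropolis update for the 2-D Ising model: all sites of
        # one sub-lattice have independent neighbourhoods, so they can be
        # updated in parallel -- exactly the structure a GPU kernel exploits.
        L, beta, sweeps = 128, 0.44, 200          # illustrative parameters
        rng = np.random.default_rng(0)
        s = rng.choice([-1, 1], size=(L, L))
        ix, iy = np.indices((L, L))
        masks = [(ix + iy) % 2 == 0, (ix + iy) % 2 == 1]

        for _ in range(sweeps):
            for mask in masks:                    # two half-sweeps per sweep
                nb = (np.roll(s, 1, 0) + np.roll(s, -1, 0)
                      + np.roll(s, 1, 1) + np.roll(s, -1, 1))
                dE = 2.0 * s * nb                 # cost of flipping each spin
                acc = rng.random((L, L)) < np.exp(-beta * np.clip(dE, 0, None))
                s = np.where(mask & acc, -s, s)

        print("magnetization per site:", s.mean())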

  8. Mathematics, the Computer, and the Impact on Mathematics Education.

    Science.gov (United States)

    Tooke, D. James

    2001-01-01

    Discusses the connection between mathematics and the computer; mathematics curriculum; mathematics instruction, including teachers learning to use computers; and the impact of the computer on learning mathematics. (LRW)

  9. Computer Graphics and Administrative Decision-Making.

    Science.gov (United States)

    Yost, Michael

    1984-01-01

    Reduction in prices now makes it possible for almost any institution to use computer graphics for administrative decision making and research. Current and potential uses of computer graphics in these two areas are discussed. (JN)

  10. Wireless Technologies, Ubiquitous Computing and Mobile Health: Application to Drug Abuse Treatment and Compliance with HIV Therapies.

    Science.gov (United States)

    Boyer, Edward W; Smelson, David; Fletcher, Richard; Ziedonis, Douglas; Picard, Rosalind W

    2010-06-01

    Beneficial advances in the treatment of substance abuse and compliance with medical therapies, including HAART, are possible with new mobile technologies related to personal physiological sensing and computational methods. When incorporated into mobile platforms that allow for ubiquitous computing, these technologies have great potential for extending the reach of behavioral interventions from clinical settings where they are learned into natural environments.

  11. Computational intelligence techniques for comparative genomics dedicated to Prof. Allam Appa Rao on the occasion of his 65th birthday

    CERN Document Server

    Gunjan, Vinit

    2015-01-01

    This Brief highlights informatics and related techniques for computer science professionals, engineers, medical doctors, bioinformatics researchers and other interdisciplinary researchers. Chapters include the bioinformatics of diabetes and several computational algorithms and statistical analysis approaches for effectively studying the disorders and their possible causes, along with medical applications.

  12. 77 FR 60720 - Certain Electronic Devices, Including Wireless Commmunication Devices, Portable Music and Data...

    Science.gov (United States)

    2012-10-04

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-794] Certain Electronic Devices, Including Wireless Commmunication Devices, Portable Music and Data Processing Devices, and Tablet Computers... communication devices, portable music and data processing devices, and tablet computers, imported by Apple Inc...

  13. 77 FR 70464 - Certain Electronic Devices, Including Wireless Communication Devices, Portable Music and Data...

    Science.gov (United States)

    2012-11-26

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-794] Certain Electronic Devices, Including Wireless Communication Devices, Portable Music and Data Processing Devices, and Tablet Computers... wireless communication devices, portable music and data processing devices, and tablet computers, by reason...

  14. Analysis of Smart Composite Structures Including Debonding

    Science.gov (United States)

    Chattopadhyay, Aditi; Seeley, Charles E.

    1997-01-01

    Smart composite structures with distributed sensors and actuators have the capability to actively respond to a changing environment while offering significant weight savings and additional passive controllability through ply tailoring. Piezoelectric sensing and actuation of composite laminates is the most promising concept due to the static and dynamic control capabilities. Essential to the implementation of these smart composites are the development of accurate and efficient modeling techniques and experimental validation. This research addresses each of these important topics. A refined higher order theory is developed to model composite structures with surface bonded or embedded piezoelectric transducers. These transducers are used as both sensors and actuators for closed loop control. The theory accurately captures the transverse shear deformation through the thickness of the smart composite laminate while satisfying stress free boundary conditions on the free surfaces. The theory is extended to include the effect of debonding at the actuator-laminate interface. The developed analytical model is implemented using the finite element method utilizing an induced strain approach for computational efficiency. This allows general laminate geometries and boundary conditions to be analyzed. The state space control equations are developed to allow flexibility in the design of the control system. Circuit concepts are also discussed. Static and dynamic results of smart composite structures, obtained using the higher order theory, are correlated with available analytical data. Comparisons, including debonded laminates, are also made with a general purpose finite element code and available experimental data. Overall, very good agreement is observed. Convergence of the finite element implementation of the higher order theory is shown with exact solutions. Additional results demonstrate the utility of the developed theory to study piezoelectric actuation of composite

  15. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  16. Overhead Crane Computer Model

    Science.gov (United States)

    Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.

    2018-03-01

    The paper describes a computer model of an overhead crane system. The overhead crane system consists of hoisting, trolley and crane mechanisms, as well as a two-axis payload system. With the help of the differential equations of motion of the specified mechanisms, derived through the Lagrange equation of the second kind, it is possible to build an overhead crane computer model. The computer model was obtained using Matlab software. Transients of coordinate, linear speed and motor torque of the trolley and crane mechanism systems were simulated. In addition, transients of payload sway were obtained with respect to the vertical axis. A trajectory of the trolley mechanism operating simultaneously with the crane mechanism is presented in the paper, as well as a two-axis trajectory of the payload.
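
    A hedged miniature of this kind of model (trolley plus payload treated as a planar pendulum, with the equations of motion derived from the Lagrange equations of the second kind; integrated here in Python rather than Matlab, and all symbols and values are illustrative):

        import numpy as np

        # Hypothetical miniature of a crane/payload model: trolley (mass M,
        # drive force F) with the payload as a planar pendulum (mass m, cable
        # length l), theta measured from the downward vertical. From the
        # Lagrange equations of the second kind:
        #   ddx  = (F + m sin(th) (g cos(th) + l dth^2)) / (M + m sin^2(th))
        #   ddth = -(ddx cos(th) + g sin(th)) / l
        M, m, l, g = 100.0, 20.0, 5.0, 9.81       # illustrative values

        def rhs(y, F):
            x, dx, th, dth = y
            s, c = np.sin(th), np.cos(th)
            ddx = (F + m * s * (g * c + l * dth**2)) / (M + m * s**2)
            ddth = -(ddx * c + g * s) / l
            return np.array([dx, ddx, dth, ddth])

        # fixed-step RK4 integration of a 2 s constant-force acceleration phase
        y, dt, F = np.zeros(4), 1e-3, 200.0
        for n in range(2000):
            k1 = rhs(y, F)
            k2 = rhs(y + dt / 2 * k1, F)
            k3 = rhs(y + dt / 2 * k2, F)
            k4 = rhs(y + dt * k3, F)
            y += dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

        print("trolley position %.3f m, payload sway %.3f rad" % (y[0], y[2]))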

  17. Quantum computational webs

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2010-01-01

    We discuss the notion of quantum computational webs: These are quantum states universal for measurement-based computation, which can be built up from a collection of simple primitives. The primitive elements--reminiscent of building blocks in a construction kit--are (i) one-dimensional states (computational quantum wires) with the power to process one logical qubit and (ii) suitable couplings, which connect the wires to a computationally universal web. All elements are preparable by nearest-neighbor interactions in a single pass, of the kind accessible in a number of physical architectures. We provide a complete classification of qubit wires, a physically well-motivated class of universal resources that can be fully understood. Finally, we sketch possible realizations in superlattices and explore the power of coupling mechanisms based on Ising or exchange interactions.

  18. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  19. LTRACK: Beam-transport calculation including wakefield effects

    International Nuclear Information System (INIS)

    Chan, K.C.D.; Cooper, R.K.

    1988-01-01

    LTRACK is a first-order beam-transport code that includes wakefield effects up to quadrupole modes. This paper will introduce the readers to this computer code by describing the history, the method of calculations, and a brief summary of the input/output information. Future plans for the code will also be described
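
    The first-order transport idea behind such codes can be sketched in a few lines. The Python fragment below is a generic illustration with wakefields omitted and made-up element parameters, not LTRACK's actual input or algorithms: an (x, x') phase-space vector is pushed through drifts and thin-lens quadrupoles by matrix multiplication.

        # First-order (matrix) beam transport, wakefields omitted (illustrative).
        import numpy as np

        def drift(L):
            return np.array([[1.0, L], [0.0, 1.0]])

        def thin_quad(f):
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        # One FODO cell: focus, drift, defocus, drift (parameters illustrative);
        # rightmost matrix acts first, matching the order the beam sees.
        cell = drift(1.0) @ thin_quad(-2.0) @ drift(1.0) @ thin_quad(2.0)

        x0 = np.array([1e-3, 0.0])                 # 1 mm offset, no divergence
        x = np.linalg.matrix_power(cell, 10) @ x0  # transport through 10 cells
        print("after 10 cells: x = %.3e m, x' = %.3e rad" % (x[0], x[1]))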

  20. Possibilities for Proactive Library Services.

    Science.gov (United States)

    Morgan, Eric Lease

    1999-01-01

    Considers ways in which library services can be more proactive in today's networked-computer environment. Discusses how to find and use patterns of behavior, such as borrowing behavior and profiles of patrons' interests; making a CD-ROM with information describing the library's services and products; and reviving telephone reference. (LRW)

  1. Computational Design of Batteries from Materials to Systems

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Kandler A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Santhanagopalan, Shriram [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Yang, Chuanbo [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Graf, Peter A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Usseglio Viretta, Francois L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Li, Qibo [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Finegan, Donal [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Pesaran, Ahmad A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Yao, Koffi (Pierre) [Argonne National Laboratory]; Abraham, Daniel [Argonne National Laboratory]; Dees, Dennis [Argonne National Laboratory]; Jansen, Andy [Argonne National Laboratory]; Mukherjee, Partha [Texas A&M University]; Mistry, Aashutosh [Texas A&M University]; Verma, Ankit [Texas A&M University]; Lamb, Josh [Sandia National Laboratories]; Darcy, Eric [NASA]

    2017-09-01

    Computer models are helping to accelerate the design and validation of next generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting electrochemical performance, thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification including under aging, and predicting the performance of a proposed electrode material recipe a priori using microstructure models.
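
    For contrast with the 3-D physics-based models described above, the sketch below shows the simplest widely used cell abstraction, a first-order Thevenin equivalent circuit under constant-current discharge; all parameter values are assumed for illustration and are not from this work.

        # First-order Thevenin equivalent-circuit sketch (assumed parameters).
        import numpy as np

        ocv, r0, r1, c1 = 3.7, 0.02, 0.015, 2000.0   # V, ohm, ohm, F (assumed)
        i_load = 2.0                                  # discharge current, A
        dt, t_end = 1.0, 600.0                        # 10-minute simulation, 1 s steps

        v1 = 0.0                                      # voltage across the RC branch
        for t in np.arange(0.0, t_end, dt):
            dv1 = (i_load - v1 / r1) / c1             # RC branch dynamics
            v1 += dv1 * dt
            v_cell = ocv - i_load * r0 - v1

        print(f"terminal voltage after {t_end:.0f} s: {v_cell:.3f} V")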

  2. Computational thinking as an emerging competence domain

    NARCIS (Netherlands)

    Yadav, A.; Good, J.; Voogt, J.; Fisser, P.; Mulder, M.

    2016-01-01

    Computational thinking is a problem-solving skill set, which includes problem decomposition, algorithmic thinking, abstraction, and automation. Even though computational thinking draws upon concepts fundamental to computer science (CS), it has broad application to all disciplines. It has been

  3. ALGORITHMS AND PROGRAMS FOR STRONG GRAVITATIONAL LENSING IN KERR SPACE-TIME INCLUDING POLARIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Bin; Maddumage, Prasad [Research Computing Center, Department of Scientific Computing, Florida State University, Tallahassee, FL 32306 (United States); Kantowski, Ronald; Dai, Xinyu; Baron, Eddie, E-mail: bchen3@fsu.edu [Homer L. Dodge Department of Physics and Astronomy, University of Oklahoma, Norman, OK 73019 (United States)

    2015-05-15

    Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.
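
    The row-parallel pattern suggested by the abstract's MPI remark can be sketched with mpi4py. In the fragment below, trace_pixel is a hypothetical stand-in for KERTAP's backward ray-tracing routine, not its actual API; only the decomposition and gather pattern is the point.

        # Row-parallel image assembly with mpi4py (trace_pixel is hypothetical).
        from mpi4py import MPI
        import numpy as np

        def trace_pixel(ix, iy):
            # placeholder for tracing one photon back from the image plane
            return float(ix + iy)

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
        nx = ny = 64

        rows = range(rank, ny, size)               # round-robin row decomposition
        local = [(iy, [trace_pixel(ix, iy) for ix in range(nx)]) for iy in rows]

        gathered = comm.gather(local, root=0)      # collect partial images on rank 0
        if rank == 0:
            image = np.empty((ny, nx))
            for part in gathered:
                for iy, row in part:
                    image[iy] = row
            print("image computed, mean value:", image.mean())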

  4. ALGORITHMS AND PROGRAMS FOR STRONG GRAVITATIONAL LENSING IN KERR SPACE-TIME INCLUDING POLARIZATION

    International Nuclear Information System (INIS)

    Chen, Bin; Maddumage, Prasad; Kantowski, Ronald; Dai, Xinyu; Baron, Eddie

    2015-01-01

    Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python

  5. Minimal mobile human computer interaction

    NARCIS (Netherlands)

    el Ali, A.

    2013-01-01

    In the last 20 years, the widespread adoption of personal, mobile computing devices in everyday life, has allowed entry into a new technological era in Human Computer Interaction (HCI). The constant change of the physical and social context in a user's situation made possible by the portability of

  6. Visual implementation of computer communication

    OpenAIRE

    Gunnarsson, Tobias; Johansson, Hans

    2010-01-01

    Communication is a fundamental part of life, and during the 20th century several new ways of communicating were developed, from the first telegraph, which made it possible to send messages over long distances, to radio communication and the telephone. In the last decades, computer-to-computer communication at high speed has become increasingly important, and so also has the need for understanding computer communication. Since data communication today works at speeds that are so high...

  7. Possible refurbishment of Point Lepreau

    International Nuclear Information System (INIS)

    White, R.M.; Groom, S.H.; Thompson, P.D.; Barclay, J.M.; Allen, P.J.

    2001-01-01

    In February 2000, the NB Power Board of Directors approved Phase one of a project to produce a business case including a detailed scope and estimate associated with the possible refurbishment of the Point Lepreau Generating Station (PLGS). The preliminary plan for refurbishment projects an 18-month outage starting as early as the spring of 2006. If the station were to be refurbished, then it would be run for another 25 to 30 years. The decision on whether or not to refurbish PLGS has not been made and is not expected until the summer of 2002. The results of the first phase of the project will be used to prepare a detailed business case that will be presented to the NB Power Board of Directors in January of 2002. At that time a decision will be made as to whether to refurbish the unit, or obtain other means of replacing the energy produced by PLGS. The station currently produces about a third of the power generated within the province. If the business case is approved, all 380 Pressure Tubes and Calandria Tubes, along with their related End Fittings and Feeders, would be replaced. This material would be stored in new storage vaults to be constructed at the existing on-site Waste Management Facility. Replacement of other station components will be performed as required, as determined from the results of a comprehensive Plant Condition Assessment. The condition assessments build on work done under the Plant Life Management Program. Point Lepreau Generating Station has operated well since the start of commercial operation in early 1983. With a lifetime capacity factor of about 84% (up to the end of 2000), it has proven to be an economic and environmentally sound electricity provider. The station has also had a significant positive economic impact in Southern New Brunswick, employing over 600 people. However, the Pressure Tubes and Feeders are nearing the point in time at which they will exceed their fitness for service criteria. Although tubes can be replaced on an

  9. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  10. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  11. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Since the first idea of using GPUs for general purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (Stream), and the new framework, OpenCL, which tries to unify the GPGPU computing models.
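
    The CUDA execution model the paper focuses on can be illustrated compactly. The sketch below uses Numba's cuda.jit so the example stays in Python (the paper itself discusses CUDA C); it assumes the numba package and a CUDA-capable GPU.

        # Minimal CUDA-style kernel via Numba (assumes numba and a CUDA GPU).
        import numpy as np
        from numba import cuda

        @cuda.jit
        def saxpy(a, x, y, out):
            i = cuda.grid(1)                   # global thread index
            if i < x.size:                     # guard against overshoot
                out[i] = a * x[i] + y[i]

        n = 1_000_000
        x = np.ones(n, dtype=np.float32)
        y = np.full(n, 2.0, dtype=np.float32)
        out = np.empty_like(x)

        threads = 256                          # threads per block
        blocks = (n + threads - 1) // threads  # enough blocks to cover n elements
        saxpy[blocks, threads](np.float32(3.0), x, y, out)
        print(out[:4])                         # expect [5. 5. 5. 5.]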

  12. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay and Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp. 69-81.

  13. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  14. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  15. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  16. Computer-aided cleanup

    International Nuclear Information System (INIS)

    Williams, J.; Jones, B.

    1994-01-01

    In late 1992, the remedial investigation of operable unit 2 at the Department of Energy (DOE) Superfund site in Fernald, Ohio was in trouble. Despite years of effort--including an EPA-approved field-investigation work plan, 123 soil borings, 51 ground-water-monitoring wells, analysis of more than 650 soil and ground-water samples, and preparation of a draft remedial-investigation (RI) report--it was not possible to conclude if contaminated material in the unit was related to ground-water contamination previously detected beneath and beyond the site boundary. Compounding the problem, the schedule for the RI, feasibility study and record of decision for operable unit 2 was governed by a DOE-EPA consent agreement stipulating penalties of up to $10,000 per week for not meeting scheduled milestones--and time was running out. An advanced three-dimensional computer model confirmed that radioactive wastes dumped at the Fernald, Ohio Superfund site had contaminated ground water, after years of previous testing has been inconclusive. The system is now being used to aid feasibility and design work on the more-than-$1 billion remediation project

  17. Computed tomography intravenous cholangiography

    International Nuclear Information System (INIS)

    Nascimento, S.; Murray, W.; Wilson, P.

    1997-01-01

    Indications for direct visualization of the bile ducts include bile duct dilatation demonstrated by ultrasound or computed tomography (CT) scanning, where the cause of the bile duct dilatation is uncertain or where the anatomy of bile duct obstruction needs further clarification. Another indication is right upper quadrant pain, particularly in a post-cholecystectomy patient, where choledocholithiasis is suspected. A possible new indication is pre-operative evaluation prior to laparoscopic cholecystectomy. The bile ducts are usually studied by endoscopic retrograde cholangiopancreatography (ERCP), or, less commonly, trans-hepatic cholangiography. The old technique of intravenous cholangiography has fallen into disrepute because of inconsistent bile-duct opacification. The advent of spiral CT scanning has renewed interest in intravenous cholangiography. The CT technique is very sensitive to the contrast agent in the bile ducts, and angiographic and three-dimensional reconstructions of the biliary tree can readily be obtained using the CT intravenous cholangiogram technique (CT IVC). Seven patients have been studied using this CT IVC technique, between February 1995 and June 1996, and are the subject of the present report. Eight further studies have since been performed. The results suggest that CT IVC could replace ERCP as the primary means of direct cholangiography, where pancreatic duct visualization is not required. (authors)

  18. Power throttling of collections of computing elements

    Science.gov (United States)

    Bellofatto, Ralph E [Ridgefield, CT]; Coteus, Paul W [Yorktown Heights, NY]; Crumley, Paul G [Yorktown Heights, NY]; Gara, Alan G [Mount Kisco, NY]; Giampapa, Mark E [Irvington, NY]; Gooding, Thomas M [Rochester, MN]; Haring, Rudolf A [Cortlandt Manor, NY]; Megerian, Mark G [Rochester, MN]; Ohmacht, Martin [Yorktown Heights, NY]; Reed, Don D [Mantorville, MN]; Swetz, Richard A [Mahopac, NY]; Takken, Todd [Brewster, NY]

    2011-08-16

    An apparatus and method for controlling power usage in a computing system includes a plurality of computers communicating with a local control device, and a power source supplying power to the local control device and the computers. A plurality of sensors communicate with each computer for ascertaining its power usage, and a system control device communicates with the computers for controlling their power usage.

  19. DNS in Computer Forensics

    Directory of Open Access Journals (Sweden)

    Neil Fowler Wright

    2012-06-01

    The Domain Name Service (DNS) is a critical core component of the global Internet and integral to the majority of corporate intranets. It provides resolution services between the human-readable name-based system addresses and the machine-operable Internet Protocol (IP)-based addresses required for creating network level connections. Whilst structured as a globally dispersed resilient tree data structure, from the Global and Country Code Top Level Domains (gTLD/ccTLD) down to the individual site and system leaf nodes, it is highly resilient although vulnerable to various attacks, exploits and systematic failures. This paper examines the history along with the rapid growth of DNS up to its current critical status. It then explores the often overlooked value of DNS query data: packet traces, DNS cache data, and DNS logs, with their use in System Forensics and more frequently in Network Forensics, extrapolating examples and experiments that enhance knowledge. Continuing on, it details the common attacks that can be used directly against DNS systems and services, before following on with the malicious uses of DNS in direct system attacks, Distributed Denial of Service (DDoS) attacks, traditional Denial of Service (DoS) attacks and malware. It explores both cyber-criminal activities and cyber-warfare based attacks, and also extrapolates from a number of more recent attacks the possible methods for data exfiltration. It explores some of the potential analytical methodologies, including common uses in Intrusion Detection Systems (IDS), as well as infection and activity tracking in malware traffic analysis, and covers some of the associated methods around technology designed to defend against, mitigate, and/or manage these and other risks, plus the effect that ISPs and nation states can have by direct manipulation of DNS queries and return traffic. This paper also investigates potential behavioural analysis and time-lining, which can then be used for the
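
    The forensic value of DNS logs described above comes largely from simple aggregation. The toy Python fragment below is my illustration with a hypothetical log format (real resolver logs differ): counting query names and record types to surface repeated TXT lookups of the kind associated with tunnelling or exfiltration.

        # Toy DNS-log aggregation; the log format here is hypothetical.
        import re
        from collections import Counter

        log_lines = [
            "2012-06-01T10:00:01 query: www.example.com IN A",
            "2012-06-01T10:00:02 query: a1b2c3.exfil.example.net IN TXT",
            "2012-06-01T10:00:03 query: d4e5f6.exfil.example.net IN TXT",
        ]

        pattern = re.compile(r"query:\s+(\S+)\s+IN\s+(\S+)")
        counts = Counter()
        for line in log_lines:
            m = pattern.search(line)
            if m:
                name, rtype = m.groups()
                sld = ".".join(name.split(".")[-3:])  # crude registrable-domain cut
                counts[(sld, rtype)] += 1

        for (domain, rtype), n in counts.most_common():
            print(f"{n:4d}  {rtype:4s}  {domain}")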

  20. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  1. Possibilities for the prevention of cancer

    International Nuclear Information System (INIS)

    Doll, Richard

    1986-01-01

    Two types of evidence suggest that the prevention of cancer is a practical possibility: first, our increasing knowledge of the causes of cancer, many of which can be avoided without difficulty, and second, evidence that all common cancers whose causes are still unknown vary in incidence with place, time or social group. Many known causes still exist, however, and are responsible for hundreds of thousands of cases annually throughout the world. Practical possibilities for prevention now and in the near future include changes in personal habits (tobacco, alcohol, diet), control of exposure to known cancer-producing substances (carcinogens) in both industry and the general environment, and immunization against viruses causing cancer. (author)

  2. Building a High Performance Computing Infrastructure for Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Belov, S; Kaplin, V; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2011-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including the Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies (ICT), and the Institute of Computational Mathematics and Mathematical Geophysics (ICM and MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, several computing facilities are currently hosted by NSC institutes, each optimized for a particular set of tasks; the largest of these are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM and MG), and the Grid Computing Facility of BINP. Recently a dedicated optical network with an initial bandwidth of 10 Gbps connecting these three facilities was built in order to make it possible to share the computing resources among the research communities of the participating institutes, thus providing a common platform for building the computing infrastructure for various scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technologies based on the XEN and KVM platforms. The solution implemented was tested thoroughly within the computing environment of the KEDR detector experiment, which is being carried out at BINP, and is foreseen to be applied to the use cases of other HEP experiments in the near future.

  3. Blind Quantum Computation

    DEFF Research Database (Denmark)

    Salvail, Louis; Arrighi, Pablo

    2006-01-01

    We investigate the possibility of "having someone carry out the work of executing a function for you, but without letting him learn anything about your input". Say Alice wants Bob to compute some known function f upon her input x, but wants to prevent Bob from learning anything about x. The situation arises for instance if client Alice has limited computational resources in comparison with mistrusted server Bob, or if x is an inherently mobile piece of data. Could there be a protocol whereby Bob is forced to compute f(x) "blindly", i.e. without observing x? We provide such a blind computation protocol for the class of functions which admit an efficient procedure to generate random input-output pairs, e.g. factorization. The cheat-sensitive security achieved relies only upon quantum theory being true. The security analysis carried out assumes the eavesdropper performs individual attacks.

  4. Computer games and prosocial behaviour.

    Science.gov (United States)

    Mengel, Friederike

    2014-01-01

    We relate different self-reported measures of computer use to individuals' propensity to cooperate in the Prisoner's dilemma. The average cooperation rate is positively related to the self-reported amount of time participants spend playing computer games. None of the other computer time use variables (including time spent on social media, browsing the internet, working, etc.) are significantly related to cooperation rates.

  5. The Need for Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Bernier, David

    2011-01-01

    Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…

  6. Computing architecture for autonomous microgrids

    Science.gov (United States)

    Goldsmith, Steven Y.

    2015-09-29

    A computing architecture that facilitates autonomously controlling operations of a microgrid is described herein. A microgrid network includes numerous computing devices that execute intelligent agents, each of which is assigned to a particular entity (load, source, storage device, or switch) in the microgrid. The intelligent agents can execute in accordance with predefined protocols to collectively perform computations that facilitate uninterrupted control of the microgrid.
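
    A toy rendering of the agent-per-entity idea (my sketch, not the patented architecture): each microgrid element is represented by an agent, and a trivial collective computation balances supply against load; all names and numbers are invented for illustration.

        # Toy agent-per-entity sketch (illustrative, not the patented design).
        class Agent:
            def __init__(self, name, power):
                self.name, self.power = name, power   # +ve supplies, -ve consumes

            def report(self):
                return self.power

        agents = [Agent("solar", 4.0), Agent("battery", 1.5),
                  Agent("load_A", -3.0), Agent("load_B", -3.5)]

        imbalance = sum(a.report() for a in agents)   # collective computation step
        print(f"net power: {imbalance:+.1f} kW")
        if imbalance < 0:
            # a real protocol would negotiate; here the battery covers the deficit
            battery = next(a for a in agents if a.name == "battery")
            battery.power -= imbalance                # discharge more
            print(f"battery setpoint raised to {battery.power:.1f} kW")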

  7. Administrative Computing in Continuing Education.

    Science.gov (United States)

    Broxton, Harry

    1982-01-01

    Describes computer applications in the Division of Continuing Education at Brigham Young University. These include instructional applications (computer assisted instruction, computer science education, and student problem solving) and administrative applications (registration, payment records, grades, reports, test scoring, mailing, and others).…

  8. AV Programs for Computer Know-How.

    Science.gov (United States)

    Mandell, Phyllis Levy

    1985-01-01

    Lists 44 audiovisual programs (most released between 1983 and 1984) grouped in seven categories: computers in society, introduction to computers, computer operations, languages and programing, computer graphics, robotics, computer careers. Excerpts from "School Library Journal" reviews, price, and intended grade level are included. Names…

  9. Computers, Nanotechnology and Mind

    Science.gov (United States)

    Ekdahl, Bertil

    2008-10-01

    In 1958, two years after the Dartmouth conference, where the term artificial intelligence was coined, Herbert Simon and Allen Newell asserted the existence of "machines that think, that learn and create." They further prophesied that the machines' capacity would increase and be on par with the human mind. Now, 50 years later, computers perform many more tasks than one could imagine in the 1950s, but virtually no computer can do more than could the first digital computer, developed by John von Neumann in the 1940s. Computers still follow algorithms; they do not create them. However, the development of nanotechnology seems to have given rise to new hopes. With nanotechnology two things are supposed to happen. Firstly, due to the small scale it will be possible to construct huge computer memories, which are supposed to be the precondition for building an artificial brain; secondly, nanotechnology will make it possible to scan the brain, which in turn will make reverse engineering possible; the mind will be decoded by studying the brain. The consequence of such a belief is that the brain is no more than a calculator, i.e., all that the mind can do is in principle the result of arithmetical operations. Computers are equivalent to formal systems, which in turn were an answer to an idea by Hilbert that proofs should contain ideal statements to which operations cannot be applied in a contentual way. The advocates of artificial intelligence would thus place content in a machine that is not only developed to be free of content but also cannot contain it. In this paper I argue that the hope for artificial intelligence is in vain.

  10. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

    This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...

  11. Offline computing and networking

    International Nuclear Information System (INIS)

    Appel, J.A.; Avery, P.; Chartrand, G.

    1985-01-01

    This note summarizes the work of the Offline Computing and Networking Group. The report is divided into two sections; the first deals with the computing and networking requirements and the second with the proposed way to satisfy those requirements. In considering the requirements, we have considered two types of computing problems. The first is CPU-intensive activity such as production data analysis (reducing raw data to DST), production Monte Carlo, or engineering calculations. The second is physicist-intensive computing such as program development, hardware design, physics analysis, and detector studies. For both types of computing, we examine a variety of issues. These included a set of quantitative questions: how much CPU power (for turn-around and for through-put), how much memory, mass-storage, bandwidth, and so on. There are also very important qualitative issues: what features must be provided by the operating system, what tools are needed for program design, code management, database management, and for graphics

  12. Applications of Computer Algebra Conference

    CERN Document Server

    Martínez-Moro, Edgar

    2017-01-01

    The Applications of Computer Algebra (ACA) conference covers a wide range of topics from Coding Theory to Differential Algebra to Quantum Computing, focusing on the interactions of these and other areas with the discipline of Computer Algebra. This volume provides the latest developments in the field as well as its applications in various domains, including communications, modelling, and theoretical physics. The book will appeal to researchers and professors of computer algebra, applied mathematics, and computer science, as well as to engineers and computer scientists engaged in research and development.

  13. Virtually going green: The role of quantum computational chemistry in reducing pollution and toxicity in chemistry

    Science.gov (United States)

    Stevens, Jonathan

    2017-07-01

    Continuing advances in computational chemistry has permitted quantum mechanical calculation to assist in research in green chemistry and to contribute to the greening of chemical practice. Presented here are recent examples illustrating the contribution of computational quantum chemistry to green chemistry, including the possibility of using computation as a green alternative to experiments, but also illustrating contributions to greener catalysis and the search for greener solvents. Examples of applications of computation to ambitious projects for green synthetic chemistry using carbon dioxide are also presented.

  14. Neutrino geophysics - a future possibility

    International Nuclear Information System (INIS)

    Kiss, Dezsoe

    1988-01-01

    The history and basic properties of the neutrino are reviewed. A new idea, neutrino tomography of the Earth's interior, is discussed in detail. The main contradiction is pointed out: the high penetrating power of neutrinos, which makes transillumination of the Earth possible, versus the gigantic technical problems of detection caused by their small interaction cross section. The proposed detection possibilities (radio waves, sound, muons and Cherenkov light produced by neutrinos) are described. Proposed futuristic technical ideas (mobile muon detectors aboard trucks, floating proton accelerators of 100 km circumference moving in the ocean) and envisaged geological targets (the Earth's core, internal density anomalies, deposits of minerals and crude oil) are discussed. (D.Gy.) 5 figs

  15. Review on possible gravitational anomalies

    International Nuclear Information System (INIS)

    Amador, Xavier E

    2005-01-01

    This is an updated introductory review of two possible gravitational anomalies that have attracted the attention of part of the scientific community: the Allais effect that occurs during solar eclipses, and the Pioneer 10 spacecraft anomaly, experienced also by the Pioneer 11 and Ulysses spacecraft. It seems that, to date, no satisfactory conventional explanation exists for these phenomena, and this suggests that possibly new physics will be needed to account for them. The main purpose of this review is to announce 3 other new measurements that will be carried out during the 2005 solar eclipses in Panama and Colombia (Apr. 8) and in Portugal (Oct. 15).

  16. Cloud Computing Organizational Benefits : A Managerial concern

    OpenAIRE

    Mandala, Venkata Bhaskar Reddy; Chandra, Marepalli Sharat

    2012-01-01

    Context: The software industry is looking for new methods and opportunities to reduce project management problems and operational costs. The Cloud Computing concept provides answers to these problems. Cloud Computing is made possible by the availability of high internet bandwidth. Cloud Computing provides a wide range of services to a varied customer base. Cloud Computing has some key elements such as on-demand services, a large pool of configurable computing resources and minimal management effort...

  17. EXPLORATIONS IN QUANTUM COMPUTING FOR FINANCIAL APPLICATIONS

    OpenAIRE

    Gare, Jesse

    2010-01-01

    Quantum computers have the potential to increase the solution speed for many computational problems. This paper is a first step into possible applications for quantum computing in the context of computational finance. The fundamental ideas of quantum computing are introduced, followed by an exposition of the algorithms of Deutsch and Grover. Improved mean and median estimation are shown as results of Grover's generalized framework. The algorithm for mean estimation is refined to an improved M...
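
    Grover's algorithm, mentioned above, is easy to simulate at small scale with a statevector. The Python sketch below is my own toy implementation, not the paper's refined estimators: one marked item out of N = 2^n is amplified by alternating an oracle phase flip with the reflect-about-the-mean diffusion step.

        # Statevector toy simulation of Grover search (my illustration).
        import numpy as np

        n, marked = 4, 11                        # 16 items, item 11 marked
        N = 2 ** n
        state = np.full(N, 1.0 / np.sqrt(N))     # uniform superposition |s>

        iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
        for _ in range(iterations):
            state[marked] *= -1                  # oracle: phase-flip the marked item
            state = 2 * state.mean() - state     # diffusion: 2|s><s| - I

        prob = state[marked] ** 2
        print(f"{iterations} iterations -> P(marked) = {prob:.3f}")  # ~0.96 for N=16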

  18. Introduction to computer networking

    CERN Document Server

    Robertazzi, Thomas G

    2017-01-01

    This book gives a broad look at both fundamental networking technology and new areas that support it and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology such as wired and wireless networks, enabling technologies such as data centers, software defined networking, cloud and grid computing and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment make this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  19. Home, Hearth and Computing.

    Science.gov (United States)

    Seelig, Anita

    1982-01-01

    Advantages of having children use microcomputers at school and home include learning about sophisticated concepts early in life without a great deal of prodding, playing games that expand knowledge, and becoming literate in computer knowledge needed later in life. Includes comments from parents on their experiences with microcomputers and…

  20. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
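
    In the spirit of the book's Python examples, the snippet below (mine, not taken from the book) applies one of the named techniques, a central finite-difference approximation, and checks it against an analytic second derivative.

        # Central finite differences for d2/dx2 sin(x), compared with -sin(x).
        import numpy as np

        x = np.linspace(0.0, 2 * np.pi, 201)
        h = x[1] - x[0]
        f = np.sin(x)

        d2f = (f[2:] - 2 * f[1:-1] + f[:-2]) / h**2   # central difference, O(h^2)
        err = np.max(np.abs(d2f + np.sin(x[1:-1])))   # exact second derivative is -sin
        print(f"grid spacing h = {h:.4f}, max error = {err:.2e}")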

  1. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  2. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes' structure and use, data preparation and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part presents the most important computational techniques, finite element formulation and boundary element formulation, and presents solutions of viscoelastic problems with Abaqus.
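
    A worked micro-example of the behaviour such codes compute (with assumed material constants, unrelated to the book's examples): stress relaxation of a Maxwell element under a step strain follows sigma(t) = E * eps0 * exp(-t/tau) with tau = eta/E.

        # Maxwell-element stress relaxation under a step strain (assumed values).
        import numpy as np

        E = 2.0e9          # elastic modulus, Pa (assumed)
        eta = 1.0e11       # viscosity, Pa*s (assumed)
        eps0 = 0.01        # applied step strain
        tau = eta / E      # relaxation time = 50 s

        t = np.array([0.0, 10.0, 50.0, 200.0])
        sigma = E * eps0 * np.exp(-t / tau)
        for ti, si in zip(t, sigma):
            print(f"t = {ti:6.1f} s   sigma = {si / 1e6:8.3f} MPa")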

  3. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article "Info-computational Constructivism and Cognition" by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has been neither defined theoretically nor implemented practically. (2) It cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster's second-order...

  4. A personal computer-based nuclear magnetic resonance spectrometer

    Science.gov (United States)

    Job, Constantin; Pearson, Robert M.; Brown, Michael F.

    1994-11-01

    Nuclear magnetic resonance (NMR) spectroscopy using personal computer-based hardware has the potential of enabling the application of NMR methods to fields where conventional state of the art equipment is either impractical or too costly. With such a strategy for data acquisition and processing, disciplines including civil engineering, agriculture, geology, archaeology, and others have the possibility of utilizing magnetic resonance techniques within the laboratory or conducting applications directly in the field. Another aspect is the possibility of utilizing existing NMR magnets which may be in good condition but unused because of outdated or nonrepairable electronics. Moreover, NMR applications based on personal computer technology may open up teaching possibilities at the college or even secondary school level. The goal of developing such a personal computer (PC)-based NMR standard is facilitated by existing technologies including logic cell arrays, direct digital frequency synthesis, use of PC-based electrical engineering software tools to fabricate electronic circuits, and the use of permanent magnets based on neodymium-iron-boron alloy. Utilizing such an approach, we have been able to place essentially an entire NMR spectrometer console on two printed circuit boards, with the exception of the receiver and radio frequency power amplifier. Future upgrades to include the deuterium lock and the decoupler unit are readily envisioned. The continued development of such PC-based NMR spectrometers is expected to benefit from the fast growing, practical, and low cost personal computer market.
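
    The signal such a PC-based spectrometer digitizes can be mimicked in a few lines. The sketch below is my illustration, not the authors' design: a free-induction decay is modelled as a decaying cosine and recovered as a spectral peak with an FFT; the sample rate, decay constant and offset frequency are assumed.

        # Toy free-induction decay and its spectrum (assumed parameters).
        import numpy as np

        fs, T2, f0 = 10_000.0, 0.05, 1_200.0   # sample rate (Hz), decay (s), offset (Hz)
        t = np.arange(0, 0.2, 1 / fs)
        fid = np.cos(2 * np.pi * f0 * t) * np.exp(-t / T2)

        spectrum = np.abs(np.fft.rfft(fid))
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        print(f"peak at {freqs[np.argmax(spectrum)]:.0f} Hz (expected {f0:.0f} Hz)")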

  5. Cloud computing basics for librarians.

    Science.gov (United States)

    Hoy, Matthew B

    2012-01-01

    "Cloud computing" is the name for the recent trend of moving software and computing resources to an online, shared-service model. This article briefly defines cloud computing, discusses different models, explores the advantages and disadvantages, and describes some of the ways cloud computing can be used in libraries. Examples of cloud services are included at the end of the article. Copyright © Taylor & Francis Group, LLC

  6. Student Leadership: Challenges and Possibilities*

    African Journals Online (AJOL)

    Abstract. In my attempt to adhere to the request that I provide an interpretation of the theme for the session,. 'Critical Engagement, Innovation and Inclusivity', and cognisant of the primary audience,. I weave student leadership responsibilities, challenges and possibilities into the address. Events since the plenary address ...

  7. Student Leadership: Challenges and Possibilities*

    African Journals Online (AJOL)

    I weave student leadership responsibilities, challenges and possibilities into the ... The conundrum of fee-free higher education is not an abstract concept floating ... You can choose to engage with the issue by expressing yourself in many ways. .... Africa today, such as Recognition of Prior Learning (SAQA, 2004), we very ...

  8. Machiavelli’s Possibility Hypothesis

    OpenAIRE

    Holler, Manfred J.; Marciano, Alain

    2010-01-01

    This paper discusses the thesis that in Arrow's Possibility Theorem the dictator (merely) serves as a solution to the logical problem of aggregating preferences, while Machiavelli's dictator, the Prince, has the historical function of bringing order into a world of chaos and thus making society ready for the implementation of a republican structure.

  9. Possible nonvanishing mass of photon

    International Nuclear Information System (INIS)

    Nakazato, Hiromichi; Namiki, Mikio; Yamanaka, Yoshiya; Yokoyama, Kan-ichi.

    1985-05-01

    From phenomenological and field-theoretical considerations on the photon mass, we first show that the photon is not limited to being massless at the present stage. Next we illustrate the possibility of formulating a local field theory for massive photons coupled with nonconserved currents, which cannot be done for massless photons. (author)

  10. Student Leadership: Challenges and Possibilities

    Science.gov (United States)

    Nel, Willy

    2016-01-01

    In my attempt to adhere to the request that I provide an interpretation of the theme for the session, "Critical Engagement, Innovation and Inclusivity", and cognisant of the primary audience, I weave student leadership responsibilities, challenges and possibilities into the address. Events since the plenary address have however…

  11. 12 CFR 1102.27 - Computing time.

    Science.gov (United States)

    2010-01-01

    § 1102.27 Computing time. (a) General rule. In computing any period of time prescribed... time begins to run is not included. The last day so computed is included, unless it is a Saturday...

  12. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  13. 12 CFR 622.21 - Computing time.

    Science.gov (United States)

    2010-01-01

    § 622.21 Computing time. (a) General rule. In computing any period of time prescribed or... run is not to be included. The last day so computed shall be included, unless it is a Saturday, Sunday...

  14. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  15. Computational biology for ageing

    Science.gov (United States)

    Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M.

    2011-01-01

    High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions. PMID:21115530

  16. Digital computers in action

    CERN Document Server

    Booth, A D

    1965-01-01

    Digital Computers in Action is an introduction to the basics of digital computers as well as their programming and various applications in fields such as mathematics, science, engineering, economics, medicine, and law. Other topics include engineering automation, process control, special purpose games-playing devices, machine translation and mechanized linguistics, and information retrieval. This book consists of 14 chapters and begins by discussing the history of computers, from the idea of performing complex arithmetical calculations to the emergence of a modern view of the structure of a ge

  17. CONFERENCE: Computers and accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1984-01-15

    In September of last year a Conference on 'Computers in Accelerator Design and Operation' was held in West Berlin, attracting some 160 specialists, including many from outside Europe. It was a Europhysics Conference, organized by the Hahn-Meitner Institute with Roman Zelazny as Conference Chairman, postponed from an earlier intended venue in Warsaw. The aim was to bring together specialists in the fields of accelerator design, computer control and accelerator operation.

  18. Collectively loading an application in a parallel computer

    Science.gov (United States)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Miller, Samuel J.; Mundy, Michael B.

    2016-01-05

    Collectively loading an application in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: identifying, by a parallel computer control system, a subset of compute nodes in the parallel computer to execute a job; selecting, by the parallel computer control system, one of the subset of compute nodes in the parallel computer as a job leader compute node; retrieving, by the job leader compute node from computer memory, an application for executing the job; and broadcasting, by the job leader to the subset of compute nodes in the parallel computer, the application for executing the job.
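
    The claimed pattern maps naturally onto an MPI broadcast. The mpi4py sketch below is a hedged illustration of the idea only: the payload bytes and leader choice are hypothetical, and the patent's control-system machinery is replaced by MPI primitives. One leader reads the application image and all nodes receive it collectively.

        # Leader-reads, everyone-receives pattern via MPI broadcast (sketch).
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        job_leader = 0                            # one node selected as leader
        payload = None
        if rank == job_leader:
            # the leader alone touches storage, e.g. reads the application image
            payload = b"\x7fELF...application-bytes..."  # hypothetical contents

        payload = comm.bcast(payload, root=job_leader)   # one read, N copies
        print(f"rank {rank} received {len(payload)} bytes")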

  19. Computed tomography of splenic trauma

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey, R.B.; Laing, F.C.; Federle, M.P.; Goodman, P.C.

    1981-12-01

    Fifty patients with abdominal trauma and possible splenic injury were evaluated by computed tomography (CT). CT correctly diagnosed 21 of 22 surgically proved traumatic lesions of the spleen (96%). Twenty-seven patients had no evidence of splenic injury; this was confirmed at operation in 1 patient and by clinical follow-up in 26. There was one false negative and one false positive. In 5 patients (10%), CT demonstrated other clinically significant lesions, including hepatic or renal lacerations in 3 and large retroperitoneal hematomas in 2. In adolescents and adults, CT is an accurate, noninvasive method of rapidly diagnosing splenic trauma and associated injuries. Further experience is needed to assess its usefulness in evaluating splenic injuries in infants and small children.

  20. Computed tomography of splenic trauma

    International Nuclear Information System (INIS)

    Jeffrey, R.B.; Laing, F.C.; Federle, M.P.; Goodman, P.C.

    1981-01-01

    Fifty patients with abdominal trauma and possible splenic injury were evaluated by computed tomography (CT). CT correctly diagnosed 21 of 22 surgically proved traumatic lesions of the spleen (96%). Twenty-seven patients had no evidence of splenic injury; this was confirmed at operation in 1 patient and by clinical follow-up in 26. There was one false negative and one false positive. In 5 patients (10%), CT demonstrated other clinically significant lesions, including hepatic or renal lacerations in 3 and large retroperitoneal hematomas in 2. In adolescents and adults, CT is an accurate, noninvasive method of rapidly diagnosing splenic trauma and associated injuries. Further experience is needed to assess its usefulness in evaluating splenic injuries in infants and small children.

  1. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword. Preface. Computing Paradigms: Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading. Cloud Computing Fundamentals: Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact

  2. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  3. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  4. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  5. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an .... Scientific and engineering applications (e.g., Tera grid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  6. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    The physical foundations of, and developments in, transmission and emission computer tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A technique of emission computer tomography (ECT), new in Poland, is described. An evaluation of two ECT methods, positron emission tomography and single-photon emission tomography, is made. (author)

  7. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  8. Possible rotation-power nature of SGRs and AXPs

    International Nuclear Information System (INIS)

    Malheiro, M.; Lobato, R. V.; Coelho, Jaziel G.; Cáceres, D. L.; De Lima, R. C. R.; Rueda, J. A.; Ruffini, R.

    2017-01-01

    We investigate the possibility that some Soft Gamma-ray Repeaters (SGRs) and Anomalous X-ray Pulsars (AXPs) could be described as rotation-powered neutron stars (NSs). The analysis was carried out by computing the structure properties of NSs; we then focus on giving estimates for the surface magnetic field, using both realistic structure parameters of NSs and a general relativistic model of a rotating magnetic dipole. We show that the use of realistic parameters of rotating neutron stars, obtained from numerical integration of the self-consistent axisymmetric general relativistic equations of equilibrium, leads to values of the magnetic field and radiation efficiency of SGRs/AXPs very different from estimates based on fiducial parameters. This analysis leads to a precise prediction of the range of NS masses, obtained here by making use of selected up-to-date nuclear equations of state (EOS). We show that 40% (nine) of the entire observed population of SGRs and AXPs can be described as canonical pulsars driven by the rotational energy of neutron stars, for which we give their possible range of masses. We also show that if the blackbody component in soft X-rays is due to the surface temperature of NSs, then 50% of the sources could be explained as ordinary rotation-powered pulsars. Moreover, amongst these sources we find the four SGRs/AXPs with observed radio emission and six that are possibly associated with supernova remnants (including Swift J1834.9-0846, the first magnetar to show a surrounding wind nebula), suggesting a natural explanation as ordinary pulsars as well. (paper)
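
    For orientation, the fiducial estimate that the paper contrasts with its realistic-parameter results is the standard dipole formula B ~ 3.2 x 10^19 (P Pdot)^(1/2) gauss, which assumes a moment of inertia of 10^45 g cm^2, a 10 km radius and an orthogonal rotator. A minimal sketch under those textbook assumptions (the timing values are purely illustrative, not data from the paper):

      import math

      def b_dipole_fiducial(period_s, period_derivative):
          # Surface dipole field in gauss for fiducial NS parameters.
          return 3.2e19 * math.sqrt(period_s * period_derivative)

      # Magnetar-like timing values, for illustration only:
      print(f"B ~ {b_dipole_fiducial(5.0, 1e-11):.2e} G")   # ~2.3e14 G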

  9. The Possible "Proton Sponge " Effect of Polyethylenimine (PEI) Does Not Include Change in Lysosomal pH

    DEFF Research Database (Denmark)

    Søndergaard, Rikke Vicki; Mattebjerg, Maria Ahlm; Henriksen, Jonas Rosager

    2013-01-01

    … is still elusive. The "proton sponge" hypothesis remains the most generally accepted mechanism, although it is heavily debated. This hypothesis is associated with the large buffering capacity of PEI and other polycations, which has been interpreted to cause an increase in lysosomal pH, even though no conclusive proof has been provided. In the present study, we have used a nanoparticle pH sensor that was developed for pH measurements in the endosomal/lysosomal pathway. We have carried out quantitative measurements of lysosomal pH as a function of PEI content and correlate the results to the "proton sponge" hypothesis. Our measurements show that PEI does not induce change in lysosomal pH as previously suggested, and quantification of PEI concentrations in lysosomes makes it uncertain that the "proton sponge" effect is the dominant mechanism of polyplex escape. Molecular Therapy (2012); doi:10.1038/mt.2012.185.

  10. DISCOVERY OF PULSATIONS, INCLUDING POSSIBLE PRESSURE MODES, IN TWO NEW EXTREMELY LOW MASS, He-CORE WHITE DWARFS

    Energy Technology Data Exchange (ETDEWEB)

    Hermes, J. J.; Montgomery, M. H.; Winget, D. E.; Bell, Keaton J.; Harrold, Samuel T. [Department of Astronomy, University of Texas at Austin, Austin, TX 78712 (United States); Brown, Warren R.; Kenyon, Scott J. [Smithsonian Astrophysical Observatory, 60 Garden Street, Cambridge, MA 02138 (United States); Gianninas, A.; Kilic, Mukremin, E-mail: jjhermes@astro.as.utexas.edu [Homer L. Dodge Department of Physics and Astronomy, University of Oklahoma, 440 W. Brooks Street, Norman, OK 73019 (United States)

    2013-03-10

    We report the discovery of the second and third pulsating extremely low mass (ELM) white dwarfs (WDs), SDSS J111215.82+111745.0 (hereafter J1112) and SDSS J151826.68+065813.2 (hereafter J1518). Both have masses < 0.25 M_Sun and effective temperatures below 10,000 K, establishing these putatively He-core WDs as a cooler class of pulsating hydrogen-atmosphere WDs (DAVs, or ZZ Ceti stars). The short-period pulsations evidenced in the light curve of J1112 may also represent the first observation of acoustic (p-mode) pulsations in any WD, which provide an exciting opportunity to probe this WD in a complementary way compared to the long-period g-modes that are also present. J1112 is a T_eff = 9590 ± 140 K and log g = 6.36 ± 0.06 WD. The star displays sinusoidal variability at five distinct periodicities between 1792 and 2855 s. In this star, we also see short-period variability, strongest at 134.3 s, well short of the expected g-modes for such a low-mass WD. The other new pulsating WD, J1518, is a T_eff = 9900 ± 140 K and log g = 6.80 ± 0.05 WD. The light curve of J1518 is highly non-sinusoidal, with at least seven significant periods between 1335 and 3848 s. Consistent with the expectation that ELM WDs must be formed in binaries, these two new pulsating He-core WDs, in addition to the prototype SDSS J184037.78+642312.3, have close companions. However, the observed variability is inconsistent with tidally induced pulsations and is so far best explained by the same hydrogen partial-ionization driving mechanism at work in classic C/O-core ZZ Ceti stars.

  11. Present and possible utilization of PUSPATI reactor

    International Nuclear Information System (INIS)

    Gui Ah Auu.

    1983-01-01

    The utilization of the PUSPATI TRIGA Mark II Reactor (PTR) has increased reasonably well since its commissioning last year. PTR was used mainly for the training of operators, neutron flux measurements and neutron activation analysis. However, the present utilization data indicate that a further increase in PTR utilization, to include teaching and the usage of the beam ports, is desirable. Some possible future areas of PTR application relevant to our needs are also described in this paper. (author)

  12. PEDAGOGICAL ASPECTS OF CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    N. Morze

    2011-05-01

    Recent progress in computer science in the field of redundancy and protection has led to the sharing of data across many different repositories. Modern infrastructure has made cloud computing safe and reliable, and the advancement of such computing radically changes the understanding of the use of resources and services. The materials in this article are connected with the definition of the pedagogical possibilities of using cloud computing to provide education on the basis of a competence-based approach and with the monitoring of learners (students).

  13. Noninvasive coronary angioscopy using electron beam computed tomography and multidetector computed tomography

    NARCIS (Netherlands)

    van Ooijen, PMA; Nieman, K; de Feyter, PJ; Oudkerk, M

    2002-01-01

    With the advent of noninvasive coronary imaging techniques like multidetector computed tomography and electron beam computed tomography, new representation methods such as intracoronary visualization have been introduced. We explore the possibilities of these novel visualization techniques and

  14. Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1984-12-01

    The principal objective of the computational physics group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. A summary of the group's activities is presented, including computational studies in MHD equilibria and stability, plasma transport, Fokker-Planck, and efficient numerical and programming algorithms. References are included

  15. MOBILE LEARNING - possibilities and perspectives

    DEFF Research Database (Denmark)

    Larsen, Lasse Juel

    2009-01-01

    This paper proposes that SMS (Short Message Service), or text-messaging on mobile devices, can serve as an extension of, or possibly create, another way of learning traditional scholastic content normally associated with the school system. The potential of SMS is still very much untapped and largely unexplored as a pedagogical tool within teaching and learning domains. This paper is inspired by locative arts and ongoing experiments regarding not only SMS-based pervasive systems, but also the more complex usage of mobile devices in investigating urban living conditions and experiences, both existentially and as a mechanism for exploring the cityscape. This paper aims at discussing the potentials and outlining the possibilities for mobile learning in the traditional school setting. The complexity of these issues derives not only from the traditions of the school system, but also from diverging perspectives

  16. Sonification as a possible stroke rehabilitation strategy

    Science.gov (United States)

    Scholz, Daniel S.; Wu, Liming; Pirzer, Jonas; Schneider, Johann; Rollnik, Jens D.; Großbach, Michael; Altenmüller, Eckart O.

    2014-01-01

    Despite cerebral stroke being one of the main causes of acquired impairments of motor skills worldwide, well-established therapies to improve motor functions are sparse. Recently, attempts have been made to improve gross motor rehabilitation by mapping patient movements to sound, termed sonification. Sonification provides additional sensory input, supplementing impaired proprioception. However, to date no established sonification-supported rehabilitation protocol strategy exists. In order to examine and validate the effectiveness of sonification in stroke rehabilitation, we developed a computer program, termed "SonicPointer": Participants' computer mouse movements were sonified in real-time with complex tones. Tone characteristics were derived from an invisible parameter mapping overlaid on the computer screen. The parameters were tone pitch and tone brightness. One parameter varied along the x axis, the other along the y axis. The order of parameter assignment to axes was balanced in two blocks between subjects, so that each participant performed under both conditions. Subjects were naive to the overlaid parameter mappings and their change between blocks. In each trial a target tone was presented, and subjects were instructed to indicate its origin with respect to the overlaid parameter mappings on the screen as quickly and accurately as possible with a mouse click. Twenty-six elderly healthy participants were tested. Required time and two-dimensional accuracy were recorded. Trial duration times and learning curves were derived. We hypothesized that subjects would perform better under one of the two parameter-to-axis mappings, indicating the more natural sonification. Generally, subjects' localizing performance was better on the pitch axis as compared to the brightness axis. Furthermore, the learning curves were steepest when pitch was mapped onto the vertical and brightness onto the horizontal axis. This seems to be the optimal constellation for this two
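
    The parameter mapping itself is simple to prototype. The sketch below is our hedged reconstruction of the idea, not the authors' SonicPointer code: a normalized cursor position is mapped to a complex tone, with pitch along one axis and brightness (as harmonic roll-off) along the other; numpy is assumed.

      import numpy as np

      SR = 44100  # sample rate, Hz

      def sonify(x, y, dur=0.3):
          """x, y in [0, 1]; returns a mono float32 tone buffer."""
          f0 = 220.0 * 2 ** (2 * y)        # pitch: 220-880 Hz along the y axis
          rolloff = 0.2 + 1.8 * (1 - x)    # weaker roll-off = brighter timbre
          t = np.linspace(0, dur, int(SR * dur), endpoint=False)
          tone = sum((1 / k ** rolloff) * np.sin(2 * np.pi * k * f0 * t)
                     for k in range(1, 9))  # 8 harmonics
          return (tone / np.max(np.abs(tone))).astype(np.float32)

      buf = sonify(0.75, 0.25)   # a bright, low-pitched tone for this position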

  17. Another possible energy landscape; Un autre paysage energetique possible

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-06-01

    This analysis presents the national energy balances from the national energy accounting. The first part presents the accounting analysis of electric power consumption and production in France. The second part deals with global energy accounting, covering energy sources and uses together. From these analyses, the authors show how greater overall efficiency in energy production and use is possible. Solutions allowing a reduction in the consumption of non-renewable energies, and solutions for a nuclear power phase-out, are also proposed. (A.L.B.)

  18. Freedom: A Promise of Possibility.

    Science.gov (United States)

    Bunkers, Sandra Schmidt

    2015-10-01

    The idea of freedom as a promise of possibility is explored in this column. The core concepts from a research study on considering tomorrow (Bunkers, 1998) coupled with humanbecoming community change processes (Parse, 2003) are used to illuminate this notion. The importance of intentionality in human freedom is discussed from both a human science and a natural science perspective. © The Author(s) 2015.

  19. Bioplastics: Development, Possibilities and Difficulties

    OpenAIRE

    Karpušenkaitė, Aistė; Varžinskas, Visvaldas

    2014-01-01

    New possible ways of manufacturing plastics and treating plastic waste are being sought in the attempt to tackle the problems related to the growth of waste quantities and the decline in non-renewable resources. Presently, the most promising and effective way to solve these problems is the production of bioplastics, but its path to recognition is very slow. One of the barriers is the absence of a clear and united opinion throughout the EU. Arising new discussions about biodegradable and biobased plastics w...

  20. New possibilities in diagnostic radiology

    OpenAIRE

    Scheel, Michael

    2014-01-01

    Diffusion Tensor Imaging (DTI) allows a non-invasive diffusion-based tissue characterization and thus offers completely new possibilities in the field of diagnostic radiology. On the one hand, this method allows an improved detection of pathological changes at the microstructural level, which are frequently not detectable in conventional MRI methods. On the other hand new strategies for therapy monitoring are feasible by quantification of diffusion parameters (e.g., Parallel, Radial and Mean ...

  1. Steel refining possibilities in LF

    Science.gov (United States)

    Dumitru, M. G.; Ioana, A.; Constantin, N.; Ciobanu, F.; Pollifroni, M.

    2018-01-01

    This article presents the main possibilities for steel refining in the Ladle Furnace (LF). The following are presented: steelmaking stages, steel refining through argon bottom stirring, online control of the bottom stirring, the bottom-stirring diagram during LF treatment of a heat, the influence of the porous plug on argon stirring, the bottom-stirring porous plug, analysis of porous plug placement on the ladle bottom surface, bottom-stirring simulation with ANSYS, and bottom-stirring simulation with Autodesk CFD.

  2. Dry eye syndrome among computer users

    Science.gov (United States)

    Gajta, Aurora; Turkoanje, Daniela; Malaescu, Iosif; Marin, Catalin-Nicolae; Koos, Marie-Jeanne; Jelicic, Biljana; Milutinovic, Vuk

    2015-12-01

    Dry eye syndrome is characterized by eye irritation due to changes of the tear film. Symptoms include itching, foreign body sensations, mucous discharge and transitory vision blurring. Less frequent symptoms include photophobia and eye tiredness. The aim of the work was to determine the quality of the tear film and the potential risk of ocular dryness in persons who spend more than 8 hours a day using computers, and possible correlations between severity of symptoms (dry eye symptom anamnesis) and clinical signs assessed by: Schirmer test I, TBUT (tear break-up time), and TFT (tear ferning test). The results show that subjects using computers have significantly shorter TBUT (less than 5 s for 56% of subjects and less than 10 s for 37% of subjects), and TFT type II/III in 50% of subjects and type III in 31% of subjects, compared to computer non-users (TFT type I or II was present in 85.71% of subjects). Visual display terminal use for more than 8 hours daily has been identified as a significant risk factor for dry eye. All persons who spend substantial time using computers are advised to use artificial tear drops in order to minimize the symptoms of dry eye syndrome and prevent serious complications.

  3. The Possibilities of Network Sociality

    Science.gov (United States)

    Willson, Michele

    Technologically networked social forms are broad, extensive and in demand. The rapid development and growth of web 2.0, or the social web, is evidence of the need and indeed hunger for social connectivity: people are searching for many and varied ways of enacting being-together. However, the ways in which we think of, research and write about network(ed) sociality are relatively recent and arguably restricted, warranting further critique and development. This article attempts to do several things: it raises questions about the types of sociality enacted in contemporary techno-society; critically explores the notion of the networked individual and the focus on the individual evident in much of the technology and sociality literature; and asks questions about the place of the social in these discussions. It argues for a more well-balanced and multilevelled approach to questions of sociality in networked societies. The article starts from the position that possibilities enabled/afforded by the technologies we have in place have an effect upon the ways in which we understand being in the world together and our possible actions and futures. These possibilities are more than simply supplementary; in many ways they are transformative. The ways in which we grapple with these questions reveal as much about our understandings of sociality as they do about the technologies themselves.

  4. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance, Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination, and agent-based modeling of liquidity costs and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  5. Computers and clinical arrhythmias.

    Science.gov (United States)

    Knoebel, S B; Lovelace, D E

    1983-02-01

    Cardiac arrhythmias are ubiquitous in normal and abnormal hearts. These disorders may be life-threatening or benign, symptomatic or unrecognized. Arrhythmias may be the precursor of sudden death, a cause or effect of cardiac failure, a clinical reflection of acute or chronic disorders, or a manifestation of extracardiac conditions. Progress is being made toward unraveling the diagnostic and therapeutic problems involved in arrhythmogenesis. Many of the advances would not be possible, however, without the availability of computer technology. To preserve the proper balance and purposeful progression of computer usage, engineers and physicians have been exhorted not to work independently in this field. Both should learn some of the other's trade. The two disciplines need to come together to solve important problems with computers in cardiology. The intent of this article was to acquaint the practicing cardiologist with some of the extant and envisioned computer applications and some of the problems with both. We conclude that computer-based database management systems are necessary for sorting out the clinical factors of relevance for arrhythmogenesis, but computer database management systems are beset with problems that will require sophisticated solutions. The technology for detecting arrhythmias on routine electrocardiograms is quite good but human over-reading is still required, and the rationale for computer application in this setting is questionable. Systems for qualitative, continuous monitoring and review of extended time ECG recordings are adequate with proper noise rejection algorithms and editing capabilities. The systems are limited presently for clinical application to the recognition of ectopic rhythms and significant pauses. Attention should now be turned to the clinical goals for detection and quantification of arrhythmias. We should be asking the following questions: How quantitative do systems need to be? Are computers required for the detection of

  6. Applications of interval computations

    CERN Document Server

    Kreinovich, Vladik

    1996-01-01

    Primary Audience for the Book • Specialists in numerical computations who are interested in algorithms with automatic result verification. • Engineers, scientists, and practitioners who desire results with automatic verification and who would therefore benefit from the experience of successful applications. • Students in applied mathematics and computer science who want to learn these methods. Goal of the Book This book contains surveys of applications of interval computations, i.e., applications of numerical methods with automatic result verification, that were presented at an international workshop on the subject in El Paso, Texas, February 23-25, 1995. The purpose of this book is to disseminate detailed and surveyed information about existing and potential applications of this new growing field. Brief Description of the Papers At the most fundamental level, interval arithmetic operations work with sets: The result of a single arithmetic operation is the set of all possible results as the o...
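
    To make "the result of a single arithmetic operation is the set of all possible results" concrete, here is a minimal interval-arithmetic sketch (our illustration, not code from the book; production-grade verification would additionally use directed rounding, omitted here):

      class Interval:
          def __init__(self, lo, hi):
              self.lo, self.hi = lo, hi

          def __add__(self, other):
              return Interval(self.lo + other.lo, self.hi + other.hi)

          def __mul__(self, other):
              p = [self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi]
              return Interval(min(p), max(p))

          def __repr__(self):
              return f"[{self.lo}, {self.hi}]"

      # Every true result is guaranteed to lie inside the computed interval:
      x, y = Interval(1.0, 2.0), Interval(-0.5, 0.5)
      print(x + y, x * y)   # [0.5, 2.5] [-1.0, 1.0]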

  7. Computer aided surface representation

    Energy Technology Data Exchange (ETDEWEB)

    Barnhill, R.E.

    1990-02-19

    The central research problem of this project is the effective representation, computation, and display of surfaces interpolating to information in three or more dimensions. If the given information is located on another surface, then the problem is to construct a "surface defined on a surface". Sometimes properties of an already defined surface are desired, which is "geometry processing". Visualization of multivariate surfaces is possible by means of contouring higher dimensional surfaces. These problems and more are discussed below. The broad sweep from constructive mathematics through computational algorithms to computer graphics illustrations is utilized in this research. The breadth and depth of this research activity makes this research project unique.

  8. Computed Tomography Status

    Science.gov (United States)

    Hansche, B. D.

    1983-01-01

    Computed tomography (CT) is a relatively new radiographic technique which has become widely used in the medical field, where it is better known as computerized axial tomographic (CAT) scanning. This technique is also being adopted by the industrial radiographic community, although the greater range of densities, variation in sample sizes, plus the possible requirement for finer resolution, make it difficult to duplicate the excellent results that the medical scanners have achieved.

  9. Proposal for grid computing for nuclear applications

    International Nuclear Information System (INIS)

    Faridah Mohamad Idris; Wan Ahmad Tajuddin Wan Abdullah; Zainol Abidin Ibrahim; Zukhaimira Zolkapli

    2013-01-01

    The use of computer clusters for the computational sciences, including computational physics, is vital as it provides the computing power to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has become a necessity. In this paper, we describe how clusters running a specific application can use resources within the grid to run the application and speed up the computing process. (author)
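
    The idea can be sketched in a few lines. Below, Python's multiprocessing pool stands in for grid nodes in a toy Monte Carlo estimate of pi; a real deployment would dispatch the same independent work units to cluster or grid middleware instead:

      import random
      from multiprocessing import Pool

      def trials_in_circle(n):
          rng = random.Random()
          return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                     for _ in range(n))

      if __name__ == "__main__":
          chunks = [250_000] * 8              # 8 independent work units
          with Pool() as pool:                # workers ~ grid nodes
              hits = sum(pool.map(trials_in_circle, chunks))
          print("pi ~", 4 * hits / sum(chunks))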

  10. Computational colour science using MATLAB

    CERN Document Server

    Westland, Stephen; Cheung, Vien

    2012-01-01

    Computational Colour Science Using MATLAB 2nd Edition offers a practical, problem-based approach to colour physics. The book focuses on the key issues encountered in modern colour engineering, including efficient representation of colour information, Fourier analysis of reflectance spectra and advanced colorimetric computation. Emphasis is placed on the practical applications rather than the techniques themselves, with material structured around key topics. These topics include colour calibration of visual displays, computer recipe prediction and models for colour-appearance prediction. Each t
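
    As a taste of the colorimetric computation the book covers, here is a hedged numpy sketch of tristimulus integration (X = k * sum(S * R * xbar) over wavelength); the four-sample spectra below are invented placeholders, not real CIE tables:

      import numpy as np

      S    = np.array([0.9, 1.0, 1.1, 1.0])   # illuminant power (placeholder)
      R    = np.array([0.2, 0.5, 0.7, 0.4])   # sample reflectance (placeholder)
      xbar = np.array([0.1, 0.3, 0.8, 0.2])   # colour-matching fn (placeholder)
      ybar = np.array([0.0, 0.5, 0.9, 0.1])

      k = 100.0 / np.sum(S * ybar)   # normalize: perfect white gives Y = 100
      X = k * np.sum(S * R * xbar)
      Y = k * np.sum(S * R * ybar)
      print(f"X = {X:.1f}, Y = {Y:.1f}")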

  11. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    The inherently limited processing power and battery lifetime of mobile phones hinder the possible execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading computationally intensive parts of an application from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware, which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed by using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications which involve costly computations can benefit from offloading, with around 95% energy savings and significant performance gains compared to local execution only.
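
    A generic cost model makes the offloading trade-off concrete. This is a textbook-style sketch, not the MACS partitioning algorithm, and every constant in it is invented:

      def should_offload(cycles, bytes_io,
                         cpu_j_per_cycle=1e-9, radio_j_per_byte=5e-7):
          e_local = cycles * cpu_j_per_cycle       # energy to run on the phone
          e_offload = bytes_io * radio_j_per_byte  # energy to ship state out
          return e_offload < e_local, e_local, e_offload

      # A heavy task (1e10 cycles) with 1 MB of state to transfer:
      offload, e_l, e_o = should_offload(1e10, 1e6)
      print(offload, f"local={e_l:.1f} J, offload={e_o:.1f} J")  # True, 10 vs 0.5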

  12. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  13. Accreditation of academic programmes in computing South Africa

    CSIR Research Space (South Africa)

    Gerber, A

    2012-05-01

    Over the past two decades, strong technical convergence has been observed between computing and engineering. Computing in this context includes Computer Engineering, Computer Science, Information Systems, Information Technology and Software...

  14. CHEP95: Computing in high energy physics. Abstracts

    International Nuclear Information System (INIS)

    1995-01-01

    These proceedings cover the technical papers on computation in High Energy Physics, including computer codes, computer devices, control systems, simulations, and data acquisition systems. New approaches to computer architectures are also discussed

  15. Experiment Dashboard for Monitoring of the LHC Distributed Computing Systems

    International Nuclear Information System (INIS)

    Andreeva, J; Campos, M Devesas; Cros, J Tarragon; Gaidioz, B; Karavakis, E; Kokoszkiewicz, L; Lanciotti, E; Maier, G; Ollivier, W; Nowotka, M; Rocha, R; Sadykov, T; Saiz, P; Sargsyan, L; Sidorova, I; Tuckett, D

    2011-01-01

    LHC experiments are currently taking collision data. The distributed computing model chosen by the four main LHC experiments allows physicists to benefit from resources spread all over the world. The distributed model and the scale of LHC computing activities increase the level of complexity of the middleware, and also the chances of possible failures or inefficiencies in the components involved. In order to ensure the required performance and functionality of the LHC computing system, monitoring the status of the distributed sites and services, as well as monitoring LHC computing activities, is among the key factors. Over the last few years, the Experiment Dashboard team has been working on a number of applications that facilitate the monitoring of different activities, including the following-up of jobs and transfers as well as site and service availabilities. This presentation describes the Experiment Dashboard applications used by the LHC experiments and the experience gained during the first months of data taking.

  16. From transistor to trapped-ion computers for quantum chemistry.

    Science.gov (United States)

    Yung, M-H; Casanova, J; Mezzacapo, A; McClean, J; Lamata, L; Aspuru-Guzik, A; Solano, E

    2014-01-07

    Over the last few decades, quantum chemistry has progressed through the development of computational methods based on modern digital computers. However, these methods can hardly fulfill the exponentially-growing resource requirements when applied to large quantum systems. As pointed out by Feynman, this restriction is intrinsic to all computational models based on classical physics. Recently, the rapid advancement of trapped-ion technologies has opened new possibilities for quantum control and quantum simulations. Here, we present an efficient toolkit that exploits both the internal and motional degrees of freedom of trapped ions for solving problems in quantum chemistry, including molecular electronic structure, molecular dynamics, and vibronic coupling. We focus on applications that go beyond the capacity of classical computers, but may be realizable on state-of-the-art trapped-ion systems. These results allow us to envision a new paradigm of quantum chemistry that shifts from the current transistor to a near-future trapped-ion-based technology.

  17. Sex differences in perceived attributes of computer-mediated communication.

    Science.gov (United States)

    Harper, Vernon B

    2003-02-01

    Researchers have pointed to the influence of sex with respect to the attributes of the computer medium. The author elaborates upon possible sex differences in reference to perceived attributes of the computer medium, i.e., Richness, Accessibility, Velocity, Interactivity, Plasticity, and Immediacy. Data from both a pilot and main study are reported and interpreted. The pilot study included 78 participants, while the main study involved 211. The independent samples were composed of Communication Studies students enrolled at two Mid-Atlantic universities. Nine items with anchors of 1: strongly disagree and 7: strongly agree were taken from the 2000 Computer Mediated Communication Competence Scale of Spitzberg to assess the attributes of computer-mediated interaction. The results indicate that women scored higher than men on perceptions of Accessibility, Velocity, Interactivity, and Immediacy.

  18. Another Possibility for Boyajian's Star

    Science.gov (United States)

    Kohler, Susanna

    2017-07-01

    The unusual light curve of the star KIC 8462852, also known as Tabby's star or Boyajian's star, has puzzled us since its discovery last year. A new study now explores whether the star's missing flux is due to internal blockage rather than something outside of the star. Mysterious Dips. Most explanations for the flux dips of Boyajian's star rely on external factors, like this illustrated swarm of comets. [NASA/JPL-Caltech] Boyajian's star shows unusual episodes of dimming in its light curve by as much as 20%, each lasting a few to tens of days and separated by periods of typically hundreds of days. In addition, archival observations show that it has gradually faded by roughly 15% over the span of the last hundred years. What could be causing both the sporadic flux dips and the long-term fading of this odd star? Explanations thus far have varied from mundane to extreme. Alien megastructures, pieces of smashed planets or comets orbiting the star, and intervening interstellar medium have all been proposed as possible explanations, but these require some object external to the star. A new study by researcher Peter Foukal proposes an alternative: what if the source of the flux obstruction is the star itself? Analogy to the Sun. Decades ago, researchers discovered that our own star's total flux isn't as constant as we thought. When magnetic dark spots on the Sun's surface block the heat transport, the Sun's luminosity dips slightly. The diverted heat is redistributed in the Sun's interior, becoming stored as a very small global heating and expansion of the convective envelope. When the blocking starspot is removed, the Sun appears slightly brighter than it did originally. Its luminosity then gradually relaxes, decaying back to its original value. Model of a star's flux after a 1,000-km starspot is inserted at time t = 0 and removed at time t = ts at a depth of 10,000 km in the convective zone. The star's luminosity dips, then becomes brighter than originally, and then gradually decays. [Foukal

  19. Enhanced delegated computing using coherence

    Science.gov (United States)

    Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.

    2016-03-01

    A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m . The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.

  20. 22nd International Conference on Soft Computing

    CERN Document Server

    2017-01-01

    This proceedings book contains a collection of selected accepted papers from the Mendel conference held in Brno, Czech Republic, in June 2016. The book contains three chapters which present recent advances in soft computing, including intelligent image processing. The Mendel conference was established in 1995 and is named after the scientist and Augustinian priest Gregor J. Mendel, who discovered the famous Laws of Heredity. The main aim of the conference is to provide a regular opportunity for students, academics and researchers to exchange ideas and novel research methods on a yearly basis.