WorldWideScience

Sample records for computing longest common

  1. Faster Algorithms for Computing Longest Common Increasing Subsequences

    DEFF Research Database (Denmark)

    Kutz, Martin; Brodal, Gerth Stølting; Kaligosi, Kanela

    2011-01-01

    We present algorithms for finding a longest common increasing subsequence of two or more input sequences. For two sequences of lengths n and m, where m⩾n, we present an algorithm with an output-dependent expected running time of … and O(m) space, where ℓ is the length of an LCIS, σ is the size of the alphabet, and Sort is the time to sort each input sequence. For k⩾3 length-n sequences we present an algorithm which improves the previous best bound by more than a factor k for many inputs. In both cases, our algorithms are conceptually quite simple but rely on existing sophisticated data structures. Finally, we introduce the problem of longest common weakly-increasing (or non-decreasing) subsequences (LCWIS), for which we present an …-time algorithm for the 3-letter alphabet case. For the extensively studied longest common subsequence problem, comparable speedups have not been achieved for small...
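
    The faster algorithms of the paper rely on specialized data structures, but the underlying problem is easy to state. The following is a minimal sketch of the classic O(nm)-time, O(m)-space dynamic program for the length of a longest common increasing subsequence of two sequences; it only illustrates the problem and is not the paper's output-dependent algorithm.

        def lcis_length(a, b):
            """Length of a longest common increasing subsequence of a and b (classic O(len(a)*len(b)) DP)."""
            dp = [0] * len(b)          # dp[j] = length of an LCIS ending with b[j]
            for x in a:
                best = 0               # best LCIS length seen so far whose last element is < x
                for j, y in enumerate(b):
                    if y == x and best + 1 > dp[j]:
                        dp[j] = best + 1
                    elif y < x and dp[j] > best:
                        best = dp[j]
            return max(dp, default=0)

        # example: an LCIS of (3, 1, 2, 4) and (1, 2, 3, 4) is (1, 2, 4), so the length is 3
        assert lcis_length([3, 1, 2, 4], [1, 2, 3, 4]) == 3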

  2. Time-Space Trade-Offs for the Longest Common Substring Problem

    DEFF Research Database (Denmark)

    Starikovskaya, Tatiana; Vildhøj, Hjalte Wedel

    2013-01-01

    The Longest Common Substring problem is to compute the longest substring which occurs in at least d ≥ 2 of m strings of total length n. In this paper we ask the question whether this problem allows a deterministic time-space trade-off using O(n^(1+ε)) time and O(n^(1-ε)) space for 0 ≤ ε ≤ 1. We give a positive answer in the case of two strings (d = m = 2) and 0 …, and show that the problem can be solved in O(n^(1-ε)) space and O(n^(1+ε) log²n (d log²n + d²)) time for any 0 ≤ ε …
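
    For reference, the two-string case of the problem itself can be solved with a simple quadratic-time dynamic program over matching suffix lengths. The sketch below is only an illustration of the d = m = 2 problem; it says nothing about the linear-space suffix-tree solution or the sublinear-space trade-off studied in the paper.

        def longest_common_substring(s, t):
            """Longest string occurring in both s and t, via the O(|s|*|t|) suffix-match DP."""
            best_len, best_end = 0, 0          # length and end position (in s) of the best match
            prev = [0] * (len(t) + 1)          # prev[j] = length of common suffix of s[:i-1] and t[:j]
            for i in range(1, len(s) + 1):
                cur = [0] * (len(t) + 1)
                for j in range(1, len(t) + 1):
                    if s[i - 1] == t[j - 1]:
                        cur[j] = prev[j - 1] + 1
                        if cur[j] > best_len:
                            best_len, best_end = cur[j], i
                prev = cur
            return s[best_end - best_len:best_end]

        assert longest_common_substring("xabcdy", "zabcw") == "abc"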

  3. Sublinear Space Algorithms for the Longest Common Substring Problem

    DEFF Research Database (Denmark)

    Kociumaka, Tomasz; Starikovskaya, Tatiana; Vildhøj, Hjalte Wedel

    2014-01-01

    Given m documents of total length n, we consider the problem of finding a longest string common to at least d ≥ 2 of the documents. This problem is known as the longest common substring (LCS) problem and has a classic O(n) space and O(n) time solution (Weiner [FOCS'73], Hui [CPM'92]). However...

  4. FACC: A Novel Finite Automaton Based on Cloud Computing for the Multiple Longest Common Subsequences Search

    Directory of Open Access Journals (Sweden)

    Yanni Li

    2012-01-01

    Searching for the multiple longest common subsequences (MLCS) has significant applications in areas such as bioinformatics, information processing, and data mining. Although a few parallel MLCS algorithms have been proposed, their efficiency and effectiveness are not satisfactory for the increasing complexity and size of biological data. To overcome the shortcomings of the existing MLCS algorithms, and considering that the MapReduce parallel framework of cloud computing is a promising technology for cost-effective high-performance parallel computing, a novel finite automaton (FA) based on cloud computing, called FACC, is proposed under the MapReduce parallel framework, so as to obtain a more efficient and effective general parallel MLCS algorithm. FACC adopts the ideas of matched pairs and finite automata: it preprocesses the sequences, constructs successor tables, and builds a common-subsequence finite automaton to search for the MLCS. Simulation experiments on a set of benchmarks from both real DNA and amino acid sequences have been conducted, and the results show that the proposed FACC algorithm outperforms the current leading parallel MLCS algorithm, FAST-MLCS.
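
    The successor tables mentioned in the abstract are a standard building block of matched-pair MLCS algorithms: for each symbol they record, for every position, where that symbol next occurs. A minimal sketch of such a table is given below (names are illustrative; the surrounding FACC automaton construction and MapReduce parallelization are not shown).

        def successor_table(seq, alphabet):
            """succ[c][i] = 1-based position of the first occurrence of symbol c strictly after
            position i (i = 0 means 'before the start'); None if c does not occur again."""
            n = len(seq)
            succ = {c: [None] * (n + 1) for c in alphabet}
            nxt = {c: None for c in alphabet}
            for i in range(n, 0, -1):
                nxt[seq[i - 1]] = i
                for c in alphabet:
                    succ[c][i - 1] = nxt[c]
            return succ

        # example: starting before position 1 of "ACTAGC", the next 'G' is at position 5
        table = successor_table("ACTAGC", "ACGT")
        assert table["G"][0] == 5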

  5. Comparative Visualization of Vector Field Ensembles Based on Longest Common Subsequence

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Richen; Guo, Hanqi; Zhang, Jiang; Yuan, Xiaoru

    2016-04-19

    We propose a longest common subsequence (LCS) based approach to compute the distance among vector field ensembles. By measuring how many common blocks the ensemble pathlines pass through, the LCS distance defines the similarity among vector field ensembles by counting the number of shared domain data blocks. Compared to traditional methods (e.g. point-wise Euclidean distance or dynamic time warping distance), the proposed approach is robust to outliers, missing data, and the sampling rate of pathline timesteps. Taking advantage of smaller and reusable intermediate output, visualization based on the proposed LCS approach reveals temporal trends in the data at low storage cost and avoids tracing pathlines repeatedly. Finally, we evaluate our method on both synthetic data and simulation data, which demonstrates the robustness of the proposed approach.
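
    As a rough illustration of the idea, each pathline can be reduced to the sequence of domain block IDs it visits, and the similarity of two pathlines can then be scored by the LCS of those ID sequences. The sketch below uses the textbook quadratic LCS dynamic program and one plausible normalization; the paper's exact distance definition and its block decomposition are not reproduced here.

        def lcs_length(p, q):
            """Classic O(|p|*|q|) LCS length between two sequences of block IDs."""
            dp = [[0] * (len(q) + 1) for _ in range(len(p) + 1)]
            for i in range(1, len(p) + 1):
                for j in range(1, len(q) + 1):
                    if p[i - 1] == q[j - 1]:
                        dp[i][j] = dp[i - 1][j - 1] + 1
                    else:
                        dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
            return dp[len(p)][len(q)]

        def pathline_distance(p, q):
            """1 - normalized LCS: 0 for identical block sequences, 1 for disjoint ones."""
            if not p or not q:
                return 1.0
            return 1.0 - lcs_length(p, q) / max(len(p), len(q))

        # two pathlines visiting overlapping sets of blocks
        print(pathline_distance([3, 7, 8, 12, 13], [3, 7, 9, 12, 14]))  # 0.4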

  6. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

    A computer forensic analyst is a person in charge of investigation and evidence tracking. In certain cases, a file needed as digital evidence has been deleted. It is difficult to reconstruct such a file, because it has often lost its header and cannot be identified while being restored. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose a Longest Common Subsequence approach that consists of three steps, namely training, testing and validation, to identify the file type from file fragments. From all testing results we can conclude that our proposed method works well and achieves 92.91% accuracy in identifying the file type of file fragments for three data types.

  7. Longest common extensions in trees

    DEFF Research Database (Denmark)

    Bille, Philip; Gawrychowski, Pawel; Gørtz, Inge Li

    2016-01-01

    We generalize the longest common extension (LCE) problem to trees and suggest a few applications of LCE in trees to tries and XML databases. Given a labeled and rooted tree T of size n, the goal is to preprocess T into a compact data structure that supports the following LCE queries between subpaths and subtrees in T. Let v1, v2, w1, and w2 be nodes of T such that w1 and w2 are descendants of v1 and v2, respectively. - LCEPP(v1, w1, v2, w2) (path-path LCE): return the longest common prefix of the paths v1 ~→ w1 and v2 ~→ w2. - LCEPT(v1, w1, v2) (path-tree LCE): return a maximal path-path LCE of the path v1 ~→ w1 and any path from v2 to a descendant leaf. - LCETT(v1, v2) (tree-tree LCE): return a maximal path-path LCE of any pair of paths from v1 and v2 to descendant leaves. We present the first non-trivial bounds for supporting these queries. For LCEPP queries, we present a linear-space solution with O(log* n) query time. For LCEPT queries, we present...

  8. Longest Common Extensions in Trees

    DEFF Research Database (Denmark)

    Bille, Philip; Gawrychowski, Pawel; Gørtz, Inge Li

    2015-01-01

    We generalize the longest common extension (LCE) problem to trees and suggest a few applications of LCE in trees to tries and XML databases. Given a labeled and rooted tree T of size n, the goal is to preprocess T into a compact data structure that supports the following LCE queries between subpaths and subtrees in T. Let v1, v2, w1, and w2 be nodes of T such that w1 and w2 are descendants of v1 and v2, respectively. - LCEPP(v1, w1, v2, w2) (path-path LCE): return the longest common prefix of the paths v1 ~→ w1 and v2 ~→ w2. - LCEPT(v1, w1, v2) (path-tree LCE): return a maximal path-path LCE of the path v1 ~→ w1 and any path from v2 to a descendant leaf. - LCETT(v1, v2) (tree-tree LCE): return a maximal path-path LCE of any pair of paths from v1 and v2 to descendant leaves. We present the first non-trivial bounds for supporting these queries. For LCEPP queries, we present a linear-space solution with O(log* n) query time. For LCEPT queries, we present...

  9. Chords in longest cycles

    DEFF Research Database (Denmark)

    Thomassen, Carsten

    2017-01-01

    If a graph G is 3-connected and has minimum degree at least 4, then some longest cycle in G has a chord. If G is 2-connected and cubic, then every longest cycle in G has a chord.

  10. The symmetric longest queue system

    NARCIS (Netherlands)

    van Houtum, Geert-Jan; Adan, Ivo; van der Wal, Jan

    1997-01-01

    We derive the performance of the exponential symmetric longest queue system from two variants: a longest queue system with Threshold Rejection of jobs and one with Threshold Addition of jobs. It is shown that these two systems provide lower and upper bounds for the performance of the longest queue system.

  11. On the number of longest and almost longest cycles in cubic graphs

    DEFF Research Database (Denmark)

    Chia, Gek Ling; Thomassen, Carsten

    2012-01-01

    We consider the questions: How many longest cycles must a cubic graph have, and how many may it have? For each k >= 2 there are infinitely many p such that there is a cubic graph with p vertices and precisely one longest cycle of length p-k. On the other hand, if G is a graph with p vertices, all...

  12. A Note on Longest Paths in Circular Arc Graphs

    Directory of Open Access Journals (Sweden)

    Joos Felix

    2015-08-01

    As observed by Rautenbach and Sereni [SIAM J. Discrete Math. 28 (2014) 335-341] there is a gap in the proof of the theorem of Balister et al. [Combin. Probab. Comput. 13 (2004) 311-317], which states that the intersection of all longest paths in a connected circular arc graph is nonempty. In this paper we close this gap.

  13. Longest Common Extensions via Fingerprinting

    DEFF Research Database (Denmark)

    Bille, Philip; Gørtz, Inge Li; Kristensen, Jesper

    2012-01-01

    The LCE problem can be solved in linear space with constant query time and a preprocessing of sorting complexity. There are two known approaches achieving these bounds, which use nearest common ancestors and range minimum queries, respectively. However, in practice a much simpler approach with linear query time, no extra space and no preprocessing achieves significantly better average case performance. We show a new algorithm, Fingerprint_k, which for a parameter k, 1 ≤ k ≤ [log n], on a string of length n and alphabet size σ, gives O(k n^(1/k)) query time using O(k n) space and O(k n + sort(n,σ)) preprocessing time, where sort(n,σ) is the time it takes to sort n numbers from σ. Though this solution is asymptotically strictly worse than the asymptotically best previously known algorithms, it outperforms them in practice in the average case and is almost as fast as the simple linear time algorithm. On worst...
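
    To make the fingerprinting idea concrete, the following is a minimal sketch of a related (but different) approach: answer an LCE query by binary-searching the longest matching prefix length and comparing Karp-Rabin fingerprints of the two candidate substrings. It uses O(n) extra space with O(log n) query time, unlike the Fingerprint_k structure described in the abstract, and the hash comparison is correct only with high probability.

        import random

        class PrefixHashes:
            """Karp-Rabin prefix hashes of a string, for O(1) substring fingerprints."""
            MOD = (1 << 61) - 1

            def __init__(self, s):
                self.base = random.randrange(256, self.MOD)
                self.h = [0] * (len(s) + 1)
                self.pw = [1] * (len(s) + 1)
                for i, ch in enumerate(s):
                    self.h[i + 1] = (self.h[i] * self.base + ord(ch)) % self.MOD
                    self.pw[i + 1] = (self.pw[i] * self.base) % self.MOD

            def fingerprint(self, i, j):
                """Fingerprint of s[i:j]."""
                return (self.h[j] - self.h[i] * self.pw[j - i]) % self.MOD

        def lce(s, hashes, i, j):
            """Length of the longest common prefix of the suffixes s[i:] and s[j:]."""
            lo, hi = 0, min(len(s) - i, len(s) - j)
            while lo < hi:
                mid = (lo + hi + 1) // 2
                if hashes.fingerprint(i, i + mid) == hashes.fingerprint(j, j + mid):
                    lo = mid          # prefixes of length mid (probably) match
                else:
                    hi = mid - 1
            return lo

        s = "abracadabra"
        assert lce(s, PrefixHashes(s), 0, 7) == 4   # "abra" is the common extension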

  14. A Novel Efficient Graph Model for the Multiple Longest Common Subsequences (MLCS Problem

    Directory of Open Access Journals (Sweden)

    Zhan Peng

    2017-08-01

    Searching for the Multiple Longest Common Subsequences (MLCS) of multiple sequences is a classical NP-hard problem, which has been used in many applications. One of the most effective exact approaches for the MLCS problem is based on the dominant-point graph, which is a kind of directed acyclic graph (DAG). However, the time and space efficiency of the leading dominant-point graph based approaches is still unsatisfactory: constructing the dominant-point graph used by these approaches requires a huge amount of time and space, which hinders the application of these approaches to large-scale and long sequences. To address this issue, in this paper we propose a new time- and space-efficient graph model called the Leveled-DAG for the MLCS problem. The Leveled-DAG can promptly eliminate all nodes in the graph that cannot contribute to the construction of the MLCS during construction. At any moment, only the current level and some previously generated nodes in the graph need to be kept in memory, which can greatly reduce memory consumption. Also, the final graph contains only one node in which all of the wanted MLCS are saved; thus, no additional operations for searching the MLCS are needed. The experiments are conducted on real biological sequences with different numbers and lengths, and the proposed algorithm is compared with three state-of-the-art algorithms. The experimental results show that the time and space needed for the Leveled-DAG approach are smaller than those for the compared algorithms, especially on large-scale and long sequences.

  15. Longest cable-stayed bridge TATARA; Longest shachokyo Tatara Ohashi

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, K. [Hiroshima University, Hiroshima (Japan). Faculty of Engineering

    1998-06-15

    The world's longest cable-stayed bridge, Tatara, having a central span of 890 m, had both ends closed in August 1997, linking Namakuchi Island and Omishima Island. Final finishing work is continuing for the opening of the West Seto Expressway in the spring of 1999. A cable-stayed bridge supports the bridge girders by the perpendicular components of the tensile force of cables stayed obliquely. On the other hand, there is a concern that the girders may have axial compression force generated by the horizontal components of the cable tensile force, which can cause buckling of the girders. Therefore, in order to suspend the girders efficiently by increasing the perpendicular components of the cable force, and moreover to suppress the axial compression force on the girders, it is more advantageous to make the bridge towers high; hence the towers of this bridge are the highest among the bridges on the Shimanami Ocean Road. This bridge, whose long girders are stayed with 21-stage multi-cables, presented a problem in designing against buckling in the steel girders near the towers due to the horizontal components of the cable force. Discussions were therefore carried out using load-withstanding-force experiments on a whole-bridge model at 1/50 scale, buckling experiments on full-size reinforcing plate models, and load-withstanding-force analysis using a tower model. A number of other technical discussions were repeated, by which the world's longest cable-stayed bridge was completed. 9 figs., 1 tab.

  16. WMO World Record Lightning Extremes: Longest Reported Flash Distance and Longest Reported Flash Duration.

    Science.gov (United States)

    Lang, Timothy J; Pédeboy, Stéphane; Rison, William; Cerveny, Randall S; Montanyà, Joan; Chauzy, Serge; MacGorman, Donald R; Holle, Ronald L; Ávila, Eldo E; Zhang, Yijun; Carbin, Gregory; Mansell, Edward R; Kuleshov, Yuriy; Peterson, Thomas C; Brunet, Manola; Driouech, Fatima; Krahenbuhl, Daniel S

    2017-06-01

    A World Meteorological Organization weather and climate extremes committee has judged that the world's longest reported distance for a single lightning flash occurred with a horizontal distance of 321 km (199.5 mi) over Oklahoma in 2007, while the world's longest reported duration for a single lightning flash is an event that lasted continuously for 7.74 seconds over southern France in 2012. In addition, the committee has unanimously recommended amendment of the AMS Glossary of Meteorology definition of lightning discharge as a "series of electrical processes taking place within 1 second" by removing the phrase "within one second" and replacing with "continuously." Validation of these new world extremes (a) demonstrates the recent and on-going dramatic augmentations and improvements to regional lightning detection and measurement networks, (b) provides reinforcement regarding the dangers of lightning, and (c) provides new information for lightning engineering concerns.

  17. ONE OF THE LONGEST APPENDIX: A RARE CASE REPORT

    Directory of Open Access Journals (Sweden)

    Venkat Rao

    2015-03-01

    The vermiform appendix is an organ that can have variable sizes. We are prompted to report here one of the longest appendixes removed, measuring about 16 cm in length. INTRODUCTION: The vermiform appendix is an organ that can vary in size, site, and presence, as well as in other clinical and functional aspects. We describe here one of the longest appendixes removed, measuring about 16 cm in length, in a case of acute appendicitis.

  18. The longest telomeres: a general signature of adult stem cell compartments

    Science.gov (United States)

    Flores, Ignacio; Canela, Andres; Vera, Elsa; Tejera, Agueda; Cotsarelis, George; Blasco, María A.

    2008-01-01

    Identification of adult stem cells and their location (niches) is of great relevance for regenerative medicine. However, stem cell niches are still poorly defined in most adult tissues. Here, we show that the longest telomeres are a general feature of adult stem cell compartments. Using confocal telomere quantitative fluorescence in situ hybridization (telomapping), we find gradients of telomere length within tissues, with the longest telomeres mapping to the known stem cell compartments. In mouse hair follicles, we show that cells with the longest telomeres map to the known stem cell compartments, colocalize with stem cell markers, and behave as stem cells upon treatment with mitogenic stimuli. Using K15-EGFP reporter mice, which mark hair follicle stem cells, we show that GFP-positive cells have the longest telomeres. The stem cell compartments in small intestine, testis, cornea, and brain of the mouse are also enriched in cells with the longest telomeres. This constitutes the description of a novel general property of adult stem cell compartments. Finally, we make the novel finding that telomeres shorten with age in different mouse stem cell compartments, which parallels a decline in stem cell functionality, suggesting that telomere loss may contribute to stem cell dysfunction with age. PMID:18283121

  19. On the Core of Multiple Longest Traveling Salesman Games

    NARCIS (Netherlands)

    Estevez Fernandez, M.A.; Borm, P.E.M.; Hamers, H.J.M.

    2003-01-01

    In this paper we introduce multiple longest traveling salesman (MLTS) games. An MLTS game arises from a network in which a salesman has to visit each node (player) precisely once, except its home location, in an order that maximizes the total reward. First it is shown that the value of a coalition of...

  20. Destroying longest cycles in graphs and digraphs

    DEFF Research Database (Denmark)

    Van Aardt, Susan A.; Burger, Alewyn P.; Dunbar, Jean E.

    2015-01-01

    In 1978, C. Thomassen proved that in any graph one can destroy all the longest cycles by deleting at most one third of the vertices. We show that for graphs with circumference k≤8 it suffices to remove at most 1/k of the vertices. The Petersen graph demonstrates that this result cannot be extended to include k=9, but we show that in every graph with circumference nine we can destroy all 9-cycles by removing 1/5 of the vertices. We consider the analogous problem for digraphs and show that for digraphs with circumference k=2,3, it suffices to remove 1/k of the vertices. However this does not hold for k≥4.

  1. Collection of reports on use of computation fund utilized in common in 1988

    International Nuclear Information System (INIS)

    1989-05-01

    The Nuclear Physics Research Center, Osaka University, has provided a computation fund utilized in common since 1976 to support computation related to the activities of the Center. When this computation fund is used, a short report in a fixed format (printed in RCNP-Z together with the report of the committee on the computation fund utilized in common) and a detailed report on the contents of the computation are to be presented after the work is finished. The latter report includes an English abstract, an explanation of the results obtained by the computation and their physical content, new developments, difficult points encountered in computational techniques and the methods used to solve them, the subroutines and functions used for the computation together with their purposes and block diagrams, and so on. This book is the collection of the latter reports on the use of the computation fund utilized in common in fiscal year 1988. The invitation to apply for the computation fund utilized in common is announced in RCNP-Z in December every year. (K.I.)

  2. Second longest conveyor belt in UK installed and fully operational

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    A conveyor belt (which after the completion of the Selby complex will be the second longest conveyor belt in the UK) has been installed at the Prince Charles Drift Mine, Prince of Wales Colliery, United Kingdom. The 1706 m conveyor is the sole underground-to-surface conveyor at the Drift Mine, and is powered by a single 2240 kW, 3000 hp drive unit.

  3. Nature's longest threads new frontiers in the mathematics and physics of information in biology

    CERN Document Server

    Sreekantan, B V

    2014-01-01

    Organisms endowed with life show a sense of awareness, interacting with and learning from the universe in and around them. Each level of interaction involves transfer of information of various kinds, and at different levels. Each thread of information is interlinked with the other, and woven together, these constitute the universe — both the internal self and the external world — as we perceive it. They are, figuratively speaking, Nature's longest threads. This volume reports inter-disciplinary research and views on information and its transfer at different levels of organization by reputed scientists working on the frontier areas of science. It is a frontier where physics, mathematics and biology merge seamlessly, binding together specialized streams such as quantum mechanics, dynamical systems theory, and mathematics. The topics would interest a broad cross-section of researchers in life sciences, physics, cognition, neuroscience, mathematics and computer science, as well as interested amateurs, familia...

  4. Test Anxiety, Computer-Adaptive Testing and the Common Core

    Science.gov (United States)

    Colwell, Nicole Makas

    2013-01-01

    This paper highlights the current findings and issues regarding the role of computer-adaptive testing in test anxiety. The computer-adaptive test (CAT) proposed by one of the Common Core consortia brings these issues to the forefront. Research has long indicated that test anxiety impairs student performance. More recent research indicates that…

  5. World's longest underwater line part of new dc transmission link

    Energy Technology Data Exchange (ETDEWEB)

    1967-04-01

    The world's seventh dc transmission system, including the world's longest underwater power cable, is now operative. The system, linking the Italian mainland with Sardinia, was designed and engineered by the English Electric Co. Ltd. It will ensure a constant power supply for Sardinia and allow export of 200 MW of power to the Tuscany area in Italy. Proving tests began on the link in December and continued until full demand is made on it from Italy.

  6. Diversity Programme | Tasneem Zahra Husain presents her book “Only the Longest Threads” | 4 October

    CERN Multimedia

    Diversity Programme

    2016-01-01

    “Only the Longest Threads”, by Tasneem Zahra Husain. Tuesday 4 October 2016 - 15:30 Room Georges Charpak (Room F / 60-6-015) *Coffee will be served after the event* Tasneem Zehra Husain is a string theorist and the first Pakistani woman to obtain a PhD in this field. Husain’s first novel, “Only the Longest Threads” reimagines the stories of great breakthroughs and discoveries in physics from Newton’s classical mechanics to the Higgs Boson from the viewpoint of fictional characters. These tales promise to be great reads for both lay audiences and to those who have a more advanced understanding of physics. Registration is now open. Please register using the following link: https://indico.cern.ch/event/562079/.

  7. Monitoring the progressive increase of the longest episode of spontaneous movements in Guinea pig fetus

    Directory of Open Access Journals (Sweden)

    Sekulić S.

    2013-01-01

    The aim of this work was to determine the changes in the duration of spontaneous movements in the guinea pig fetus after the appearance of its first movements. Every day from the 25th to the 35th gestation day, one fetus from each of twenty pregnant animals was examined by ultrasound. Fetal movements were observed for 5 min. The episode with the longest period of movement was taken into consideration and was recorded as: … 3 s. Days 25 and 26 were characterized by episodes lasting … 3 s (χ2 = 140.51, p < 0.05). Tracking the dynamics of progressive increases in the longest episode of spontaneous movement could be a useful factor in estimating the maturity and condition of a fetus. [Project of the Ministry of Science of the Republic of Serbia, no. 175006/2011]

  8. Maximum likelihood as a common computational framework in tomotherapy

    International Nuclear Information System (INIS)

    Olivera, G.H.; Shepard, D.M.; Reckwerdt, P.J.; Ruchala, K.; Zachman, J.; Fitchard, E.E.; Mackie, T.R.

    1998-01-01

    Tomotherapy is a dose delivery technique using helical or axial intensity modulated beams. One of the strengths of the tomotherapy concept is that it can incorporate a number of processes into a single piece of equipment. These processes include treatment optimization planning, dose reconstruction and kilovoltage/megavoltage image reconstruction. A common computational technique that could be used for all of these processes would be very appealing. The maximum likelihood estimator, originally developed for emission tomography, can serve as a useful tool in imaging and radiotherapy. We believe that this approach can play an important role in the processes of optimization planning, dose reconstruction and kilovoltage and/or megavoltage image reconstruction. These processes involve computations that require comparable physical methods. They are also based on equivalent assumptions, and they have similar mathematical solutions. As a result, the maximum likelihood approach is able to provide a common framework for all three of these computational problems. We will demonstrate how maximum likelihood methods can be applied to optimization planning, dose reconstruction and megavoltage image reconstruction in tomotherapy. Results for planning optimization, dose reconstruction and megavoltage image reconstruction will be presented. Strengths and weaknesses of the methodology are analysed. Future directions for this work are also suggested. (author)
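
    As a concrete illustration of the shared computational core, the classic maximum likelihood expectation-maximization (MLEM) update for emission tomography can be written in a few lines; the same multiplicative forward-project/back-project pattern is what the authors propose to reuse for planning optimization, dose reconstruction and megavoltage imaging. This is a generic textbook sketch with an assumed dense system matrix, not the tomotherapy implementation described in the paper.

        import numpy as np

        def mlem(system_matrix, measurements, n_iterations=50):
            """Generic MLEM iteration: x <- x * A^T(y / Ax) / A^T(1)."""
            a = np.asarray(system_matrix, dtype=float)    # shape: (n_detectors, n_voxels)
            y = np.asarray(measurements, dtype=float)     # measured counts per detector
            x = np.ones(a.shape[1])                       # nonnegative initial estimate
            sensitivity = a.sum(axis=0)                   # A^T(1), column sums
            for _ in range(n_iterations):
                forward = a @ x                           # predicted measurements
                ratio = y / np.maximum(forward, 1e-12)    # avoid division by zero
                x *= (a.T @ ratio) / np.maximum(sensitivity, 1e-12)
            return x

        # tiny example: two detectors, two voxels, noiseless data
        a = np.array([[1.0, 0.5], [0.2, 1.0]])
        print(mlem(a, a @ np.array([2.0, 3.0])))          # converges toward [2, 3]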

  9. A Study on the Radiographic Diagnosis of Common Periapical Lesions by Using Computer

    International Nuclear Information System (INIS)

    Kim, Jae Duck; Kim, Seung Kug

    1990-01-01

    The purpose of this study was to estimate the diagnostic availability of the common periapical lesions by using a computer. The author used a domestic personal computer and adapted the applied program appropriately with RF (Rapid File), a program suited to the purpose of this study, and then input as basic data the findings made through collection, analysis and classification of the clinical and radiological features of the common periapical lesions. The 256 cases (cyst 91, periapical granuloma 74, periapical abscess 91) were obtained from the chart recordings and radiographs of the patients diagnosed or treated for the common periapical lesions during the past 8 years (1983-1990) at the infirmary of the Dental School, Chosun University. Next, the clinical and radiographic features of the 256 cases were applied to the RF program for diagnosis, and the diagnosis by computer was compared with the hidden final diagnosis by clinical and histopathological examination. The obtained results were as follows: 1. In cases of the cyst, diagnosis through the computer program showed rather lower accuracy (80.22%) compared with the accuracy (90.1%) of the radiologists. 2. In cases of the granuloma, diagnosis through the computer program showed rather higher accuracy (75.7%) compared with the accuracy (70.3%) of the radiologists. 3. In cases of periapical abscess, the diagnostic accuracy was 88% in both diagnoses. 4. The average diagnostic accuracy over the 256 cases through the computer program was rather lower (81.2%) compared with the accuracy (82.8%) of the radiologists. 5. The applied basic data for radiographic diagnosis of common periapical lesions by computer were estimated to be available.

  10. A Study on the Radiographic Diagnosis of Common Periapical Lesions by Using Computer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Duck; Kim, Seung Kug [Dept. of Oral Radiology, College of Dentistry, Chosun University, Kwangju (Korea, Republic of)

    1990-08-15

    The purpose of this study was to estimate the diagnostic availability of the common periapical lesions by using a computer. The author used a domestic personal computer and adapted the applied program appropriately with RF (Rapid File), a program suited to the purpose of this study, and then input as basic data the findings made through collection, analysis and classification of the clinical and radiological features of the common periapical lesions. The 256 cases (cyst 91, periapical granuloma 74, periapical abscess 91) were obtained from the chart recordings and radiographs of the patients diagnosed or treated for the common periapical lesions during the past 8 years (1983-1990) at the infirmary of the Dental School, Chosun University. Next, the clinical and radiographic features of the 256 cases were applied to the RF program for diagnosis, and the diagnosis by computer was compared with the hidden final diagnosis by clinical and histopathological examination. The obtained results were as follows: 1. In cases of the cyst, diagnosis through the computer program showed rather lower accuracy (80.22%) compared with the accuracy (90.1%) of the radiologists. 2. In cases of the granuloma, diagnosis through the computer program showed rather higher accuracy (75.7%) compared with the accuracy (70.3%) of the radiologists. 3. In cases of periapical abscess, the diagnostic accuracy was 88% in both diagnoses. 4. The average diagnostic accuracy over the 256 cases through the computer program was rather lower (81.2%) compared with the accuracy (82.8%) of the radiologists. 5. The applied basic data for radiographic diagnosis of common periapical lesions by computer were estimated to be available.

  11. A Computer-Based Instrument That Identifies Common Science Misconceptions

    Science.gov (United States)

    Larrabee, Timothy G.; Stein, Mary; Barman, Charles

    2006-01-01

    This article describes the rationale for and development of a computer-based instrument that helps identify commonly held science misconceptions. The instrument, known as the Science Beliefs Test, is a 47-item instrument that targets topics in chemistry, physics, biology, earth science, and astronomy. The use of an online data collection system…

  12. On Longest Cycles in Essentially 4-Connected Planar Graphs

    Directory of Open Access Journals (Sweden)

    Fabrici Igor

    2016-08-01

    A planar 3-connected graph G is essentially 4-connected if, for any 3-separator S of G, one component of the graph obtained from G by removing S is a single vertex. Jackson and Wormald proved that an essentially 4-connected planar graph on n vertices contains a cycle C such that … . For a cubic essentially 4-connected planar graph G, Grünbaum with Malkevitch, and Zhang showed that G has a cycle on at least ¾ n vertices. In the present paper the result of Jackson and Wormald is improved. Moreover, new lower bounds on the length of a longest cycle of G are presented if G is an essentially 4-connected planar graph of maximum degree 4 or G is an essentially 4-connected maximal planar graph.

  13. KEPLER-1647B: THE LARGEST AND LONGEST-PERIOD KEPLER TRANSITING CIRCUMBINARY PLANET

    Energy Technology Data Exchange (ETDEWEB)

    Kostov, Veselin B. [NASA Goddard Space Flight Center, Mail Code 665, Greenbelt, MD 20771 (United States); Orosz, Jerome A.; Welsh, William F.; Short, Donald R. [Department of Astronomy, San Diego State University, 5500 Campanile Drive, San Diego, CA 92182 (United States); Doyle, Laurance R. [SETI Institute, 189 Bernardo Avenue, Mountain View, CA 94043 (United States); Principia College, IMoP, One Maybeck Place, Elsah, IL 62028 (United States); Fabrycky, Daniel C. [Department of Astronomy and Astrophysics, University of Chicago, 5640 South Ellis Avenue, Chicago, IL 60637 (United States); Haghighipour, Nader [Institute for Astronomy, University of Hawaii-Manoa, Honolulu, HI 96822 (United States); Quarles, Billy [Department of Physics and Physical Science, The University of Nebraska at Kearney, Kearney, NE 68849 (United States); Cochran, William D.; Endl, Michael [McDonald Observatory, The University of Texas as Austin, Austin, TX 78712-0259 (United States); Ford, Eric B. [Department of Astronomy and Astrophysics, The Pennsylvania State University, 428A Davey Lab, University Park, PA 16802 (United States); Gregorio, Joao [Atalaia Group and Crow-Observatory, Portalegre (Portugal); Hinse, Tobias C. [Korea Astronomy and Space Science Institute (KASI), Advanced Astronomy and Space Science Division, Daejeon 305-348 (Korea, Republic of); Isaacson, Howard [Department of Astronomy, University of California Berkeley, 501 Campbell Hall, Berkeley, CA 94720 (United States); Jenkins, Jon M. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); Jensen, Eric L. N. [Department of Physics and Astronomy, Swarthmore College, Swarthmore, PA 19081 (United States); Kane, Stephen [Department of Physics and Astronomy, San Francisco State University, 1600 Holloway Avenue, San Francisco, CA 94132 (United States); Kull, Ilya, E-mail: veselin.b.kostov@nasa.gov [Department of Astronomy and Astrophysics, Tel Aviv University, 69978 Tel Aviv (Israel); and others

    2016-08-10

    We report the discovery of a new Kepler transiting circumbinary planet (CBP). This latest addition to the still-small family of CBPs defies the current trend of known short-period planets orbiting near the stability limit of binary stars. Unlike the previous discoveries, the planet revolving around the eclipsing binary system Kepler-1647 has a very long orbital period (∼1100 days) and was at conjunction only twice during the Kepler mission lifetime. Due to the singular configuration of the system, Kepler-1647b is not only the longest-period transiting CBP at the time of writing, but also one of the longest-period transiting planets. With a radius of 1.06 ± 0.01 R_Jup, it is also the largest CBP to date. The planet produced three transits in the light curve of Kepler-1647 (one of them during an eclipse, creating a syzygy) and measurably perturbed the times of the stellar eclipses, allowing us to measure its mass, 1.52 ± 0.65 M_Jup. The planet revolves around an 11-day period eclipsing binary consisting of two solar-mass stars on a slightly inclined, mildly eccentric (e_bin = 0.16), spin-synchronized orbit. Despite having an orbital period three times longer than Earth’s, Kepler-1647b is in the conservative habitable zone of the binary star throughout its orbit.

  14. KEPLER-1647B: THE LARGEST AND LONGEST-PERIOD KEPLER TRANSITING CIRCUMBINARY PLANET

    International Nuclear Information System (INIS)

    Kostov, Veselin B.; Orosz, Jerome A.; Welsh, William F.; Short, Donald R.; Doyle, Laurance R.; Fabrycky, Daniel C.; Haghighipour, Nader; Quarles, Billy; Cochran, William D.; Endl, Michael; Ford, Eric B.; Gregorio, Joao; Hinse, Tobias C.; Isaacson, Howard; Jenkins, Jon M.; Jensen, Eric L. N.; Kane, Stephen; Kull, Ilya

    2016-01-01

    We report the discovery of a new Kepler transiting circumbinary planet (CBP). This latest addition to the still-small family of CBPs defies the current trend of known short-period planets orbiting near the stability limit of binary stars. Unlike the previous discoveries, the planet revolving around the eclipsing binary system Kepler-1647 has a very long orbital period (∼1100 days) and was at conjunction only twice during the Kepler mission lifetime. Due to the singular configuration of the system, Kepler-1647b is not only the longest-period transiting CBP at the time of writing, but also one of the longest-period transiting planets. With a radius of 1.06 ± 0.01 R Jup , it is also the largest CBP to date. The planet produced three transits in the light curve of Kepler-1647 (one of them during an eclipse, creating a syzygy) and measurably perturbed the times of the stellar eclipses, allowing us to measure its mass, 1.52 ± 0.65 M Jup . The planet revolves around an 11-day period eclipsing binary consisting of two solar-mass stars on a slightly inclined, mildly eccentric ( e bin = 0.16), spin-synchronized orbit. Despite having an orbital period three times longer than Earth’s, Kepler-1647b is in the conservative habitable zone of the binary star throughout its orbit.

  15. Common accounting system for monitoring the ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Karavakis, E; Andreeva, J; Campana, S; Saiz, P; Gayazov, S; Jezequel, S; Sargsyan, L; Schovancova, J; Ueda, I

    2014-01-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources; either generic or ATLAS specific. This set of tools provides quality and scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  16. Common data buffer system. [communication with computational equipment utilized in spacecraft operations

    Science.gov (United States)

    Byrne, F. (Inventor)

    1981-01-01

    A high speed common data buffer system is described for providing an interface and communications medium between a plurality of computers utilized in a distributed computer complex forming part of a checkout, command and control system for space vehicles and associated ground support equipment. The system includes the capability for temporarily storing data to be transferred between computers, for transferring a plurality of interrupts between computers, for monitoring and recording these transfers, and for correcting errors incurred in these transfers. Validity checks are made on each transfer and appropriate error notification is given to the computer associated with that transfer.

  17. Empirical scaling of the length of the longest increasing subsequences of random walks

    Science.gov (United States)

    Mendonça, J. Ricardo G.

    2017-02-01

    We provide Monte Carlo estimates of the scaling of the length L_n of the longest increasing subsequences of n-step random walks for several different distributions of step lengths, short and heavy-tailed. Our simulations indicate that, barring possible logarithmic corrections, L_n ∼ n^θ with the leading scaling exponent 0.60 ≲ θ ≲ 0.69 for the heavy-tailed distributions of step lengths examined, with values increasing as the distribution becomes more heavy-tailed, and θ ≃ 0.57 for distributions of finite variance, irrespective of the particular distribution. The results are consistent with existing rigorous bounds for θ, although in a somewhat surprising manner. For random walks with step lengths of finite variance, we conjecture that the correct asymptotic behavior of L_n is given by √n ln n, and also propose the form for the subleading asymptotics. The distribution of L_n was found to follow a simple scaling form with scaling functions that vary with θ. Accordingly, when the step lengths are of finite variance they seem to be universal. The nature of this scaling remains unclear, since we lack a working model, microscopic or hydrodynamic, for the behavior of the length of the longest increasing subsequences of random walks.
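
    A small Monte Carlo experiment of this kind is easy to reproduce in principle: generate n-step walks, take the sequence of walk positions, and measure the length of its longest increasing subsequence with the O(n log n) patience-sorting algorithm. The sketch below (with assumed Gaussian and Cauchy step distributions standing in for the finite-variance and heavy-tailed cases) only illustrates the estimator, not the fitting of the exponent θ reported in the paper.

        import bisect
        import numpy as np

        def lis_length(seq):
            """Longest strictly increasing subsequence length via patience sorting, O(n log n)."""
            tails = []
            for x in seq:
                k = bisect.bisect_left(tails, x)
                if k == len(tails):
                    tails.append(x)
                else:
                    tails[k] = x
            return len(tails)

        def mean_lis(n_steps, n_samples=100, heavy_tailed=False, seed=0):
            rng = np.random.default_rng(seed)
            total = 0
            for _ in range(n_samples):
                steps = rng.standard_cauchy(n_steps) if heavy_tailed else rng.standard_normal(n_steps)
                walk = np.cumsum(steps)                # positions of the random walk
                total += lis_length(walk.tolist())
            return total / n_samples

        print(mean_lis(1000), mean_lis(1000, heavy_tailed=True))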

  18. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  19. Theoretical restrictions on longest implicit time scales in Markov state models of biomolecular dynamics

    Science.gov (United States)

    Sinitskiy, Anton V.; Pande, Vijay S.

    2018-01-01

    Markov state models (MSMs) have been widely used to analyze computer simulations of various biomolecular systems. They can capture conformational transitions much slower than an average or maximal length of a single molecular dynamics (MD) trajectory from the set of trajectories used to build the MSM. A rule of thumb claiming that the slowest implicit time scale captured by an MSM should be comparable by the order of magnitude to the aggregate duration of all MD trajectories used to build this MSM has been known in the field. However, this rule has never been formally proved. In this work, we present analytical results for the slowest time scale in several types of MSMs, supporting the above rule. We conclude that the slowest implicit time scale equals the product of the aggregate sampling and four factors that quantify: (1) how much statistics on the conformational transitions corresponding to the longest implicit time scale is available, (2) how good the sampling of the destination Markov state is, (3) the gain in statistics from using a sliding window for counting transitions between Markov states, and (4) a bias in the estimate of the implicit time scale arising from finite sampling of the conformational transitions. We demonstrate that in many practically important cases all these four factors are on the order of unity, and we analyze possible scenarios that could lead to their significant deviation from unity. Overall, we provide for the first time analytical results on the slowest time scales captured by MSMs. These results can guide further practical applications of MSMs to biomolecular dynamics and allow for higher computational efficiency of simulations.
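
    The quantity under discussion, the slowest implied time scale of an MSM, is computed from the eigenvalues of the lag-τ transition matrix as t_i = -τ / ln λ_i. A minimal sketch of that computation from a single discretized trajectory, using sliding-window transition counts as mentioned in the abstract, is given below; production MSM packages additionally enforce detailed balance and use more careful estimators.

        import numpy as np

        def implied_timescales(dtraj, n_states, lag, n_timescales=3):
            """Slowest implied time scales t_i = -lag / ln(lambda_i) from sliding-window counts."""
            counts = np.zeros((n_states, n_states))
            for t in range(len(dtraj) - lag):                     # sliding window, stride 1
                counts[dtraj[t], dtraj[t + lag]] += 1.0
            counts += 1e-12                                       # avoid empty rows
            transition_matrix = counts / counts.sum(axis=1, keepdims=True)
            eigvals = np.sort(np.abs(np.linalg.eigvals(transition_matrix)))[::-1]
            # skip the stationary eigenvalue (~1); keep eigenvalues strictly inside (0, 1)
            return [-lag / np.log(l) for l in eigvals[1:n_timescales + 1] if 0.0 < l < 1.0]

        # toy two-state trajectory that flips state with probability 0.001 per step
        rng = np.random.default_rng(0)
        dtraj = np.cumsum(rng.random(100000) < 0.001) % 2
        print(implied_timescales(dtraj, n_states=2, lag=10))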

  20. Transaction processing in the common node of a distributed function laboratory computer system

    International Nuclear Information System (INIS)

    Stubblefield, F.W.; Dimmler, D.G.

    1975-01-01

    A computer network architecture consisting of a common node processor for managing peripherals and files and a number of private node processors for laboratory experiment control is briefly reviewed. Central to the problem of private node-common node communication is the concept of a transaction. The collection of procedures and the data structure associated with a transaction are described. The common node properties assigned to a transaction and procedures required for its complete processing are discussed. (U.S.)

  1. Cultural Commonalities and Differences in Spatial Problem-Solving: A Computational Analysis

    Science.gov (United States)

    Lovett, Andrew; Forbus, Kenneth

    2011-01-01

    A fundamental question in human cognition is how people reason about space. We use a computational model to explore cross-cultural commonalities and differences in spatial cognition. Our model is based upon two hypotheses: (1) the structure-mapping model of analogy can explain the visual comparisons used in spatial reasoning; and (2) qualitative,…

  2. Personal best marathon time and longest training run, not anthropometry, predict performance in recreational 24-hour ultrarunners.

    Science.gov (United States)

    Knechtle, Beat; Knechtle, Patrizia; Rosemann, Thomas; Lepers, Romuald

    2011-08-01

    In recent studies, a relationship between both low body fat and low thicknesses of selected skinfolds has been demonstrated for running performance at distances from 100 m to the marathon, but not in ultramarathons. We investigated the association of anthropometric and training characteristics with race performance in 63 male recreational ultrarunners in a 24-hour run using bivariate and multivariate analysis. The athletes achieved an average distance of 146.1 (43.1) km. In the bivariate analysis, body mass (r = -0.25), the sum of 9 skinfolds (r = -0.32), the sum of upper body skinfolds (r = -0.34), body fat percentage (r = -0.32), weekly kilometers run (r = 0.31), longest training session before the 24-hour run (r = 0.56), and personal best marathon time (r = -0.58) were related to race performance. Stepwise multiple regression showed that both the longest training session before the 24-hour run (p = 0.0013) and the personal best marathon time (p = 0.0015) had the best correlation with race performance. Performance in these 24-hour runners may be predicted (r2 = 0.46) by the following equation: (Performance in a 24-hour run, km) = 234.7 + 0.481 (longest training session before the 24-hour run, km) - 0.594 (personal best marathon time, minutes). For practical applications, training variables such as volume and intensity were associated with performance but not anthropometric variables. To achieve maximum kilometers in a 24-hour run, recreational ultrarunners should have a personal best marathon time of ∼3 hours 20 minutes and complete a long training run of ∼60 km before the race, whereas anthropometric characteristics such as low body fat or low skinfold thicknesses showed no association with performance.
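
    The reported regression is simple enough to evaluate directly; the helper below (a hypothetical function name, merely restating the abstract's equation) reproduces the worked example in the final sentence.

        def predicted_24h_distance_km(longest_training_run_km, marathon_pb_minutes):
            """Regression from the abstract (r^2 = 0.46): predicted 24-hour race distance in km."""
            return 234.7 + 0.481 * longest_training_run_km - 0.594 * marathon_pb_minutes

        # a 60 km longest training run and a 3 h 20 min (200 min) marathon PB
        # predict roughly 145 km, close to the cohort mean of 146.1 km
        print(round(predicted_24h_distance_km(60, 200), 1))   # 144.8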

  3. Logical and physical resource management in the common node of a distributed function laboratory computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-01-01

    A scheme for managing resources required for transaction processing in the common node of a distributed function computer system has been given. The scheme has been found to be satisfactory for all common node services provided so far

  4. Fine tuning of work practices of common radiological investigations performed using computed radiography system

    International Nuclear Information System (INIS)

    Livingstone, Roshan S.; Timothy Peace, B.S.; Sunny, S.; Victor Raj, D.

    2007-01-01

    Introduction: The advent of computed radiography (CR) has brought about remarkable changes in the field of diagnostic radiology. A relatively large cross-section of the human population is exposed to ionizing radiation on account of common radiological investigations. This study is intended to audit radiation doses imparted to patients during common radiological investigations involving the use of CR systems. Method: The entrance surface doses (ESD) were measured using thermoluminescent dosimeters (TLD) for various radiological investigations performed using computed radiography (CR) systems. Optimization of radiographic techniques and radiation doses was done by fine-tuning the work practices. Results and conclusion: Reductions of radiation doses as high as 47% were achieved during certain investigations with the use of optimized exposure factors and fine-tuned work practices.

  5. System of common usage on the base of external memory devices and the SM-3 computer

    International Nuclear Information System (INIS)

    Baluka, G.; Vasin, A.Yu.; Ermakov, V.A.; Zhukov, G.P.; Zimin, G.N.; Namsraj, Yu.; Ostrovnoj, A.I.; Savvateev, A.S.; Salamatin, I.M.; Yanovskij, G.Ya.

    1980-01-01

    An easily modified system of common usage based on external memory devices and an SM-3 minicomputer, replacing some pulse analysers, is described. The system has the merits of a pulse analyser and is more advantageous with regard to the efficiency of equipment use, the possibility of changing configuration and functions, the protection of data against losses due to user errors and some failures, the price per registration channel, and the space occupied. The system of common usage is intended for the IBR-2 pulse reactor computing centre. It is designed using the SANPO system facilities for the SM-3 computer [ru]

  6. A common currency for the computation of motivational values in the human striatum

    NARCIS (Netherlands)

    Sescousse, G.T.; Li, Y.; Dreher, J.C.

    2015-01-01

    Reward comparison in the brain is thought to be achieved through the use of a 'common currency', implying that reward value representations are computed on a unique scale in the same brain regions regardless of the reward type. Although such a mechanism has been identified in the ventro-medial

  7. KIC 4552982: outbursts and pulsations in the longest-ever pseudo-continuous light curve of a ZZ Ceti

    Directory of Open Access Journals (Sweden)

    Bell K. J.

    2015-01-01

    KIC 4552982 was the first ZZ Ceti (hydrogen-atmosphere pulsating white dwarf) identified to lie in the Kepler field, resulting in the longest pseudo-continuous light curve ever obtained for this type of variable star. In addition to the pulsations, this light curve exhibits stochastic episodes of brightness enhancement unlike any previously studied white dwarf phenomenon. We briefly highlight the basic outburst and pulsation properties in these proceedings.

  8. NEGOTIATING COMMON GROUND IN COMPUTER-MEDIATED VERSUS FACE-TO-FACE DISCUSSIONS

    Directory of Open Access Journals (Sweden)

    Ilona Vandergriff

    2006-01-01

    To explore the impact of the communication medium on building common ground, this article presents research comparing learner use of reception strategies in traditional face-to-face (FTF) and in synchronous computer-mediated communication (CMC). Reception strategies, such as reprises, hypothesis testing and forward inferencing, provide evidence of comprehension and thus serve to establish common ground among participants. A number of factors, including communicative purpose or medium, are hypothesized to affect the use of such strategies (Clark & Brennan, 1991). In the data analysis, I (1) identify specific types of reception strategies, (2) compare their relative frequencies by communication medium, by task, and by learner, and (3) describe how these reception strategies function in the discussions. The findings of the quantitative analysis show that the medium alone seems to have little impact on grounding as indicated by the use of reception strategies. The qualitative analysis provides evidence that participants adapted the strategies to the goals of the communicative interaction, as they used them primarily to negotiate and update common ground on their collaborative activity rather than to compensate for L2 deficiencies.

  9. A common currency for the computation of motivational values in the human striatum

    NARCIS (Netherlands)

    Sescousse, G.T.; Li, Y.; Dreher, J.C.

    2014-01-01

    Reward comparison in the brain is thought to be achieved through the use of a ‘common currency’, implying that reward value representations are computed on a unique scale in the same brain regions regardless of the reward type. Although such a mechanism has been identified in the ventro-medial

  10. Longest interval between zeros of the tied-down random walk, the Brownian bridge and related renewal processes

    Science.gov (United States)

    Godrèche, Claude

    2017-05-01

    The probability distribution of the longest interval between two zeros of a simple random walk starting and ending at the origin, and of its continuum limit, the Brownian bridge, was analysed in the past by Rosén and Wendel, then extended by the latter to stable processes. We recover and extend these results using simple concepts of renewal theory, which allows to revisit past and recent works of the physics literature.
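
    The object of study can be simulated directly: a tied-down simple random walk of length 2n is a uniformly random arrangement of n up-steps and n down-steps, and the longest interval between zeros is read off from the positions where the walk returns to the origin. The following sketch is a simulation illustration only, not the renewal-theory calculation of the paper.

        import numpy as np

        def longest_interval_between_zeros(n_pairs, rng):
            """Sample a tied-down simple random walk of length 2*n_pairs and return
            the longest interval between two consecutive zeros."""
            steps = np.array([1] * n_pairs + [-1] * n_pairs)
            rng.shuffle(steps)                               # uniform random bridge
            walk = np.concatenate(([0], np.cumsum(steps)))   # walk[0] = walk[2n] = 0
            zeros = np.flatnonzero(walk == 0)
            return int(np.max(np.diff(zeros)))

        rng = np.random.default_rng(0)
        samples = [longest_interval_between_zeros(500, rng) for _ in range(2000)]
        # empirical mean of the longest interval, as a fraction of the total walk length 2n
        print(np.mean(samples) / 1000)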

  11. Longest interval between zeros of the tied-down random walk, the Brownian bridge and related renewal processes

    International Nuclear Information System (INIS)

    Godrèche, Claude

    2017-01-01

    The probability distribution of the longest interval between two zeros of a simple random walk starting and ending at the origin, and of its continuum limit, the Brownian bridge, was analysed in the past by Rosén and Wendel, then extended by the latter to stable processes. We recover and extend these results using simple concepts of renewal theory, which allows to revisit past and recent works of the physics literature. (paper)

  12. SSVEP recognition using common feature analysis in brain-computer interface.

    Science.gov (United States)

    Zhang, Yu; Zhou, Guoxu; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej

    2015-04-15

    Canonical correlation analysis (CCA) has been successfully applied to steady-state visual evoked potential (SSVEP) recognition for brain-computer interface (BCI) application. Although the CCA method outperforms the traditional power spectral density analysis through multi-channel detection, it requires additionally pre-constructed reference signals of sine-cosine waves. It is likely to encounter overfitting in using a short time window since the reference signals include no features from training data. We consider that a group of electroencephalogram (EEG) data trials recorded at a certain stimulus frequency on a same subject should share some common features that may bear the real SSVEP characteristics. This study therefore proposes a common feature analysis (CFA)-based method to exploit the latent common features as natural reference signals in using correlation analysis for SSVEP recognition. Good performance of the CFA method for SSVEP recognition is validated with EEG data recorded from ten healthy subjects, in contrast to CCA and a multiway extension of CCA (MCCA). Experimental results indicate that the CFA method significantly outperformed the CCA and the MCCA methods for SSVEP recognition in using a short time window (i.e., less than 1s). The superiority of the proposed CFA method suggests it is promising for the development of a real-time SSVEP-based BCI. Copyright © 2014 Elsevier B.V. All rights reserved.
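
    For context, the standard CCA baseline that CFA is compared against can be sketched in a few lines: correlate the multi-channel EEG segment with sine-cosine reference signals at each candidate stimulus frequency and pick the frequency with the highest canonical correlation. This is a generic illustration of that baseline (using scikit-learn's CCA, with an illustrative function name), not the proposed CFA algorithm.

        import numpy as np
        from sklearn.cross_decomposition import CCA

        def cca_ssvep_classify(eeg, stim_freqs, fs, n_harmonics=2):
            """eeg: (n_samples, n_channels) segment; returns the detected stimulus frequency."""
            n_samples = eeg.shape[0]
            t = np.arange(n_samples) / fs
            correlations = []
            for freq in stim_freqs:
                # sine-cosine reference signals at the fundamental and its harmonics
                ref = np.column_stack([wave(2 * np.pi * h * freq * t)
                                       for h in range(1, n_harmonics + 1)
                                       for wave in (np.sin, np.cos)])
                cca = CCA(n_components=1)
                cca.fit(eeg, ref)
                u, v = cca.transform(eeg, ref)
                correlations.append(np.corrcoef(u[:, 0], v[:, 0])[0, 1])
            return stim_freqs[int(np.argmax(correlations))]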

  13. Neurobiological roots of language in primate audition: common computational properties.

    Science.gov (United States)

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias; Small, Steven L; Rauschecker, Josef P

    2015-03-01

    Here, we present a new perspective on an old question: how does the neurobiology of human language relate to brain systems in nonhuman primates? We argue that higher-order language combinatorics, including sentence and discourse processing, can be situated in a unified, cross-species dorsal-ventral streams architecture for higher auditory processing, and that the functions of the dorsal and ventral streams in higher-order language processing can be grounded in their respective computational properties in primate audition. This view challenges an assumption, common in the cognitive sciences, that a nonhuman primate model forms an inherently inadequate basis for modeling higher-level language functions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Meta-heuristic algorithms for parallel identical machines scheduling problem with weighted late work criterion and common due date.

    Science.gov (United States)

    Xu, Zhenzhen; Zou, Yongxing; Kong, Xiangjie

    2015-01-01

    To our knowledge, this paper investigates the first application of meta-heuristic algorithms to tackle the parallel machines scheduling problem with weighted late work criterion and common due date ([Formula: see text]). Late work criterion is one of the performance measures of scheduling problems which considers the length of late parts of particular jobs when evaluating the quality of scheduling. Since this problem is known to be NP-hard, three meta-heuristic algorithms, namely ant colony system, genetic algorithm, and simulated annealing are designed and implemented, respectively. We also propose a novel algorithm named LDF (largest density first) which is improved from LPT (longest processing time first). The computational experiments compared these meta-heuristic algorithms with LDF, LPT and LS (list scheduling), and the experimental results show that SA performs the best in most cases. However, LDF is better than SA in some conditions, moreover, the running time of LDF is much shorter than SA.
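
    Of the list-scheduling baselines mentioned in the abstract, LPT (longest processing time first) is the classic one: sort jobs by decreasing processing time and always give the next job to the currently least-loaded machine. The sketch below illustrates only this baseline; LDF additionally orders jobs by a density measure defined in the paper, and the weighted-late-work objective with a common due date is not evaluated here.

        import heapq

        def lpt_schedule(processing_times, n_machines):
            """LPT assignment of jobs to identical parallel machines.
            Returns (assignment dict job -> machine, sorted per-machine loads)."""
            loads = [(0.0, m) for m in range(n_machines)]       # (current load, machine id)
            heapq.heapify(loads)
            assignment = {}
            jobs = sorted(enumerate(processing_times), key=lambda jp: -jp[1])
            for job, p in jobs:                                 # longest job first
                load, m = heapq.heappop(loads)                  # least-loaded machine
                assignment[job] = m
                heapq.heappush(loads, (load + p, m))
            return assignment, sorted(loads)

        # six jobs on two machines
        print(lpt_schedule([7, 5, 4, 3, 2, 2], 2))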

  15. MONTHLY VARIATION IN SPERM MOTILITY IN COMMON CARP ASSESSED USING COMPUTER-ASSISTED SPERM ANALYSIS (CASA)

    Science.gov (United States)

    Sperm motility variables from the milt of the common carp Cyprinus carpio were assessed using a computer-assisted sperm analysis (CASA) system across several months (March-August 1992) known to encompass the natural spawning period. Two-year-old pond-raised males obtained each mo...

  16. [Hyp-Au-Sn9(Hyp)3-Au-Sn9(Hyp)3-Au-Hyp]-: the longest intermetalloid chain compound of tin.

    Science.gov (United States)

    Binder, Mareike; Schrenk, Claudio; Block, Theresa; Pöttgen, Rainer; Schnepf, Andreas

    2017-10-12

    The reaction of the metalloid tin cluster [Sn10(Hyp)4]2- with (Ph3P)Au-SHyp (Hyp = Si(SiMe3)3) gave the intermetalloid cluster [Au3Sn18(Hyp)8]- (1), which is the longest intermetalloid chain compound of tin to date. Compound 1 shows a structural resemblance to binary AuSn phases, which is expected for intermetalloid clusters.

  17. Using the longest significance run to estimate region-specific p-values in genetic association mapping studies

    Directory of Open Access Journals (Sweden)

    Yang Hsin-Chou

    2008-05-01

    Full Text Available Abstract Background Association testing is a powerful tool for identifying disease susceptibility genes underlying complex diseases. Technological advances have yielded a dramatic increase in the density of available genetic markers, necessitating an increase in the number of association tests required for the analysis of disease susceptibility genes. As such, multiple-test corrections have become a critical issue. However, the conventional statistical corrections for locus-specific multiple tests usually result in lower power as the number of markers increases. Alternatively, we propose here the application of the longest significant run (LSR) method to estimate a region-specific p-value and provide an index for the most likely candidate region. Results An advantage of the LSR method relative to procedures based on genotypic data is that only p-value data are needed, and hence it can be applied extensively to different study designs. In this study the proposed LSR method was compared with commonly used methods such as Bonferroni's method and the FDR-controlling method. We found that while all methods provide good control of the false positive rate, LSR has much better power and a lower false discovery rate. In the analysis of real psoriasis and asthma disease data, the LSR method successfully identified important candidate regions and replicated the results of previous association studies. Conclusion The proposed LSR method provides an efficient exploratory tool for the analysis of sequences of dense genetic markers. Our results show that the LSR method has better power and a lower false discovery rate compared with locus-specific multiple tests.
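
    As a simplified illustration of the idea (not the authors' exact formulation), the sketch below takes the longest run of consecutive markers significant at a nominal threshold as the region statistic and calibrates it by Monte Carlo simulation under an independence assumption that ignores linkage disequilibrium.

    ```python
    # Longest-significant-run sketch: markers below a nominal threshold are flagged,
    # the longest run of consecutive flags is the region statistic, and a Monte Carlo
    # null of independent uniform p-values calibrates it (this ignores LD between
    # markers, so it is only an illustration).
    import numpy as np

    def longest_run(flags):
        """Length of the longest run of consecutive True values."""
        best = cur = 0
        for f in flags:
            cur = cur + 1 if f else 0
            best = max(best, cur)
        return best

    def lsr_region_pvalue(pvals, alpha=0.05, n_sim=2000, seed=0):
        rng = np.random.default_rng(seed)
        pvals = np.asarray(pvals)
        observed = longest_run(pvals < alpha)
        null = [longest_run(rng.uniform(size=pvals.size) < alpha) for _ in range(n_sim)]
        region_p = (1 + sum(r >= observed for r in null)) / (n_sim + 1)
        return observed, region_p

    locus_pvals = [0.40, 0.03, 0.01, 0.04, 0.02, 0.60, 0.20, 0.01, 0.50, 0.70]
    print(lsr_region_pvalue(locus_pvals))
    ```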

  18. Computer vision syndrome-A common cause of unexplained visual symptoms in the modern era.

    Science.gov (United States)

    Munshi, Sunil; Varghese, Ashley; Dhar-Munshi, Sushma

    2017-07-01

    The aim of this study was to assess the evidence and available literature on the clinical, pathogenetic, prognostic and therapeutic aspects of Computer vision syndrome. Information was collected from Medline, Embase & National Library of Medicine over the last 30 years up to March 2016. The bibliographies of relevant articles were searched for additional references. Patients with Computer vision syndrome present to a variety of different specialists, including General Practitioners, Neurologists, Stroke physicians and Ophthalmologists. While the condition is common, there is a poor awareness in the public and among health professionals. Recognising this condition in the clinic or in emergency situations like the TIA clinic is crucial. The implications are potentially huge in view of the extensive and widespread use of computers and visual display units. Greater public awareness of Computer vision syndrome and education of health professionals is vital. Preventive strategies should form part of work place ergonomics routinely. Prompt and correct recognition is important to allow management and avoid unnecessary treatments. © 2017 John Wiley & Sons Ltd.

  19. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path.

    Science.gov (United States)

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the problem of task scheduling in cloud computing systems, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). The algorithm applies a predecessor-task layer priority strategy to handle the constraint relations among task nodes: each task node is assigned a priority value based on the scheduling order imposed by those constraints, and the task node list is generated from these priority values. To break ties among task nodes with the same priority value, a dynamic essential path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes from the actual computation cost and communication cost of each task node during the scheduling process. The task node with the longest dynamic essential path is scheduled first, because the completion time of the task graph is indirectly determined by the finishing times of the task nodes on that path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm effectively reduces the task makespan in most cases and meets a high-quality performance objective.
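
    A minimal sketch of the underlying idea, under the simplifying assumption that the "dynamic essential path" of a task reduces to the longest remaining computation-plus-communication path in the task DAG; the task graph and costs are made up.

    ```python
    # Ranking ready tasks by the longest remaining computation-plus-communication
    # path in a task DAG, a simplified stand-in for the "dynamic essential path"
    # (DDEP recomputes these lengths from actual costs as scheduling proceeds).
    from functools import lru_cache

    comp = {"A": 4, "B": 3, "C": 2, "D": 5, "E": 1}     # computation cost per task
    succ = {"A": [("B", 2), ("C", 1)],                   # successor, communication cost
            "B": [("D", 3)],
            "C": [("D", 1), ("E", 2)],
            "D": [], "E": []}

    @lru_cache(maxsize=None)
    def longest_path(task):
        """Largest total cost of any path starting at `task` (its own cost included)."""
        tails = [c + longest_path(s) for s, c in succ[task]]
        return comp[task] + (max(tails) if tails else 0)

    ready = ["B", "C"]                                   # tasks whose predecessors finished
    ready.sort(key=longest_path, reverse=True)           # longest-path task is scheduled first
    print(ready, {t: longest_path(t) for t in comp})
    ```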

  20. QCI Common

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

    There are many common software patterns and utilities for the ORNL Quantum Computing Institute that can and should be shared across projects. Otherwise, we find duplication of code, which adds unwanted complexity. This software product seeks to alleviate this by providing common utilities such as object factories, graph data structures, parameter input mechanisms, etc., for other software products within the ORNL Quantum Computing Institute. This work enables pure basic research, has no export-controlled utilities, and has no real commercial value.

  1. THE LONGEST TIMESCALE X-RAY VARIABILITY REVEALS EVIDENCE FOR ACTIVE GALACTIC NUCLEI IN THE HIGH ACCRETION STATE

    International Nuclear Information System (INIS)

    Zhang Youhong

    2011-01-01

    The All Sky Monitor (ASM) on board the Rossi X-ray Timing Explorer has continuously monitored a number of active galactic nuclei (AGNs) with similar sampling rates for 14 years, from 1996 January to 2009 December. Utilizing the archival ASM data of 27 AGNs, we calculate the normalized excess variances of the 300-day binned X-ray light curves on the longest timescale (between 300 days and 14 years) explored so far. The observed variance appears to be independent of AGN black-hole mass and bolometric luminosity. According to the scaling relation of black-hole mass (and bolometric luminosity) from galactic black hole X-ray binaries (GBHs) to AGNs, the break timescales that correspond to the break frequencies detected in the power spectral density (PSD) of our AGNs are larger than the binsize (300 days) of the ASM light curves. As a result, the singly broken power-law (soft-state) PSD predicts the variance to be independent of mass and luminosity. Nevertheless, the doubly broken power-law (hard-state) PSD predicts, with the widely accepted ratio of the two break frequencies, that the variance increases with increasing mass and decreases with increasing luminosity. Therefore, the independence of the observed variance on mass and luminosity suggests that AGNs should have soft-state PSDs. Taking into account the scaling of the break timescale with mass and luminosity synchronously, the observed variances are also more consistent with the soft-state than the hard-state PSD predictions. With the averaged variance of AGNs and the soft-state PSD assumption, we obtain a universal PSD amplitude of 0.030 ± 0.022. By analogy with the GBH PSDs in the high/soft state, the longest timescale variability supports the standpoint that AGNs are scaled-up GBHs in the high accretion state, as already implied by the direct PSD analysis.
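
    The normalized excess variance used here is the sample variance of the binned light curve minus the mean squared measurement error, normalized by the squared mean flux; the sketch below shows that calculation on made-up count rates.

    ```python
    # Normalized excess variance of a binned light curve: sample variance minus the
    # mean squared measurement error, normalized by the squared mean flux. The
    # count rates below are made up for illustration.
    import numpy as np

    def normalized_excess_variance(flux, flux_err):
        flux, flux_err = np.asarray(flux, float), np.asarray(flux_err, float)
        sample_var = np.var(flux, ddof=1)        # S^2
        mean_err_sq = np.mean(flux_err ** 2)     # <sigma_err^2>
        return (sample_var - mean_err_sq) / np.mean(flux) ** 2

    flux = [1.2, 0.9, 1.4, 1.1, 0.8, 1.3]        # 300-day binned count rates (illustrative)
    err = [0.10, 0.10, 0.12, 0.09, 0.11, 0.10]
    print(normalized_excess_variance(flux, err))
    ```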

  2. Transfer Kernel Common Spatial Patterns for Motor Imagery Brain-Computer Interface Classification

    Science.gov (United States)

    Dai, Mengxi; Liu, Shucong; Zhang, Pengju

    2018-01-01

    Motor-imagery-based brain-computer interfaces (BCIs) commonly use the common spatial pattern (CSP) as a preprocessing step before classification. The CSP method is a supervised algorithm and therefore requires a large amount of time-consuming training data to build the model. To address this issue, one promising approach is transfer learning, which generalizes a learning model so that it can extract discriminative information from other subjects for the target classification task. To this end, we propose a transfer kernel CSP (TKCSP) approach that learns a domain-invariant kernel by directly matching the distributions of source subjects and target subjects. Dataset IVa of BCI Competition III is used to demonstrate the validity of the proposed method. In the experiment, we compare the classification performance of TKCSP against CSP, CSP for subject-to-subject transfer (CSP SJ-to-SJ), regularized CSP (RCSP), stationary subspace CSP (ssCSP), multitask CSP (mtCSP), and the combined mtCSP and ssCSP (ss + mtCSP) method. The results indicate that TKCSP achieves a superior mean classification performance of 81.14%, especially when source subjects provide fewer training samples. Comprehensive experimental evidence on the dataset verifies the effectiveness and efficiency of the proposed TKCSP approach over several state-of-the-art methods. PMID:29743934
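
    For reference, the sketch below shows the classical CSP preprocessing step that TKCSP extends (spatial filters from a generalized eigendecomposition of the two class covariance matrices, followed by log-variance features); the trial shapes and number of retained filters are illustrative assumptions, and the transfer-kernel part is not shown.

    ```python
    # Classical CSP preprocessing: spatial filters from the generalized
    # eigendecomposition of the two class covariance matrices, then log-variance
    # features per trial. Trial shapes and the number of retained filter pairs are
    # illustrative; the transfer-kernel extension is not shown.
    import numpy as np
    from scipy.linalg import eigh

    def csp_filters(trials_a, trials_b, n_pairs=2):
        """trials_*: arrays of shape (n_trials, n_channels, n_samples)."""
        def mean_cov(trials):
            return np.mean([np.cov(t) for t in trials], axis=0)
        Sa, Sb = mean_cov(trials_a), mean_cov(trials_b)
        _, W = eigh(Sa, Sa + Sb)                 # solves Sa w = lambda (Sa + Sb) w
        return np.hstack([W[:, :n_pairs], W[:, -n_pairs:]])  # both ends of the spectrum

    def csp_features(trial, W):
        """Log-variance of the spatially filtered single trial."""
        z = W.T @ trial
        var = np.var(z, axis=1)
        return np.log(var / var.sum())

    rng = np.random.default_rng(1)
    class_a = rng.standard_normal((20, 8, 250))  # 20 trials, 8 channels, 250 samples
    class_b = rng.standard_normal((20, 8, 250))
    W = csp_filters(class_a, class_b)
    print(csp_features(class_a[0], W).shape)     # -> (4,) features per trial
    ```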

  3. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    Science.gov (United States)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable about the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent t-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in

  4. An R package to compute commonality coefficients in the multiple regression case: an introduction to the package and a practical example.

    Science.gov (United States)

    Nimon, Kim; Lewis, Mitzi; Kane, Richard; Haynes, R Michael

    2008-05-01

    Multiple regression is a widely used technique for data analysis in social and behavioral research. The complexity of interpreting such results increases when correlated predictor variables are involved. Commonality analysis provides a method of determining the variance accounted for by respective predictor variables and is especially useful in the presence of correlated predictors. However, computing commonality coefficients is laborious. To make commonality analysis accessible to more researchers, a program was developed to automate the calculation of unique and common elements in commonality analysis, using the statistical package R. The program is described, and a heuristic example using data from the Holzinger and Swineford (1939) study, readily available in the MBESS R package, is presented.
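
    The paper's package targets R and the general k-predictor case; as a hand-rolled illustration of what the coefficients mean, the sketch below computes the unique and common variance components for two correlated predictors from the R-squared values of the full and reduced regressions, on synthetic data.

    ```python
    # Two-predictor commonality decomposition from R-squared values of the full and
    # reduced regressions, on synthetic data (the R package automates the general
    # k-predictor case).
    import numpy as np

    def r_squared(X, y):
        X1 = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        resid = y - X1 @ beta
        return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

    rng = np.random.default_rng(2)
    x1 = rng.standard_normal(200)
    x2 = 0.6 * x1 + 0.8 * rng.standard_normal(200)   # correlated predictors
    y = 0.5 * x1 + 0.3 * x2 + rng.standard_normal(200)

    r2_full = r_squared(np.column_stack([x1, x2]), y)
    r2_x1 = r_squared(x1[:, None], y)
    r2_x2 = r_squared(x2[:, None], y)
    unique_x1 = r2_full - r2_x2                      # variance only x1 explains
    unique_x2 = r2_full - r2_x1                      # variance only x2 explains
    common = r2_x1 + r2_x2 - r2_full                 # variance they share
    print(f"unique x1={unique_x1:.3f}  unique x2={unique_x2:.3f}  common={common:.3f}")
    ```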

  5. The worst case scenario: Locomotor and collision demands of the longest periods of gameplay in professional rugby union.

    Directory of Open Access Journals (Sweden)

    Cillian Reardon

    Full Text Available A number of studies have used global positioning systems (GPS to report on positional differences in the physical game demands of rugby union both on an average and singular bout basis. However, the ability of these studies to report quantitative data is limited by a lack of validation of certain aspects of measurement by GPS micro-technology. Furthermore no study has analyzed the positional physical demands of the longest bouts of ball-in-play time in rugby union. The aim of the present study is to compare the demands of the single longest period of ball-in-play, termed "worst case scenario" (WCS between positional groups, which have previously been reported to have distinguishable game demands. The results of this study indicate that WCS periods follow a similar sporadic pattern as average demands but are played at a far higher pace than previously reported for average game demands with average meters per minute of 116.8 m. The positional differences in running and collision activity previously reported are perpetuated within WCS periods. Backs covered greater total distances than forwards (318 m vs 289 m, carried out more high-speed running (11.1 m·min-1 vs 5.5 m·min-1 and achieved higher maximum velocities (MaxVel. Outside Backs achieved the highest MaxVel values (6.84 m·sec-1. Tight Five and Back Row forwards underwent significantly more collisions than Inside Back and Outside Backs (0.73 & 0.89 collisions·min-1 vs 0.28 & 0.41 collisions·min-1 respectively. The results of the present study provide information on the positional physical requirements of performance in prolonged periods involving multiple high intensity bursts of effort. Although the current state of GPS micro-technology as a measurement tool does not permit reporting of collision intensity or acceleration data, the combined use of video and GPS provides valuable information to the practitioner. This can be used to match and replicate game demands in training.

  6. The worst case scenario: Locomotor and collision demands of the longest periods of gameplay in professional rugby union

    Science.gov (United States)

    Reardon, Cillian; Tobin, Daniel P.; Tierney, Peter; Delahunt, Eamonn

    2017-01-01

    A number of studies have used global positioning systems (GPS) to report on positional differences in the physical game demands of rugby union both on an average and singular bout basis. However, the ability of these studies to report quantitative data is limited by a lack of validation of certain aspects of measurement by GPS micro-technology. Furthermore no study has analyzed the positional physical demands of the longest bouts of ball-in-play time in rugby union. The aim of the present study is to compare the demands of the single longest period of ball-in-play, termed “worst case scenario” (WCS) between positional groups, which have previously been reported to have distinguishable game demands. The results of this study indicate that WCS periods follow a similar sporadic pattern as average demands but are played at a far higher pace than previously reported for average game demands with average meters per minute of 116.8 m. The positional differences in running and collision activity previously reported are perpetuated within WCS periods. Backs covered greater total distances than forwards (318 m vs 289 m), carried out more high-speed running (11.1 m·min-1 vs 5.5 m·min-1) and achieved higher maximum velocities (MaxVel). Outside Backs achieved the highest MaxVel values (6.84 m·sec-1). Tight Five and Back Row forwards underwent significantly more collisions than Inside Back and Outside Backs (0.73 & 0.89 collisions·min-1 vs 0.28 & 0.41 collisions·min-1 respectively). The results of the present study provide information on the positional physical requirements of performance in prolonged periods involving multiple high intensity bursts of effort. Although the current state of GPS micro-technology as a measurement tool does not permit reporting of collision intensity or acceleration data, the combined use of video and GPS provides valuable information to the practitioner. This can be used to match and replicate game demands in training. PMID:28510582

  7. A common currency for the computation of motivational values in the human striatum

    Science.gov (United States)

    Li, Yansong; Dreher, Jean-Claude

    2015-01-01

    Reward comparison in the brain is thought to be achieved through the use of a ‘common currency’, implying that reward value representations are computed on a unique scale in the same brain regions regardless of the reward type. Although such a mechanism has been identified in the ventro-medial prefrontal cortex and ventral striatum in the context of decision-making, it is less clear whether it similarly applies to non-choice situations. To answer this question, we scanned 38 participants with fMRI while they were presented with single cues predicting either monetary or erotic rewards, without the need to make a decision. The ventral striatum was the main brain structure to respond to both cues while showing increasing activity with increasing expected reward intensity. Most importantly, the relative response of the striatum to monetary vs erotic cues was correlated with the relative motivational value of these rewards as inferred from reaction times. Similar correlations were observed in a fronto-parietal network known to be involved in attentional focus and motor readiness. Together, our results suggest that striatal reward value signals not only obey to a common currency mechanism in the absence of choice but may also serve as an input to adjust motivated behaviour accordingly. PMID:24837478

  8. Lithospheric controls on magma composition along Earth's longest continental hotspot track.

    Science.gov (United States)

    Davies, D R; Rawlinson, N; Iaffaldano, G; Campbell, I H

    2015-09-24

    Hotspots are anomalous regions of volcanism at Earth's surface that show no obvious association with tectonic plate boundaries. Classic examples include the Hawaiian-Emperor chain and the Yellowstone-Snake River Plain province. The majority are believed to form as Earth's tectonic plates move over long-lived mantle plumes: buoyant upwellings that bring hot material from Earth's deep mantle to its surface. It has long been recognized that lithospheric thickness limits the rise height of plumes and, thereby, their minimum melting pressure. It should, therefore, have a controlling influence on the geochemistry of plume-related magmas, although unambiguous evidence of this has, so far, been lacking. Here we integrate observational constraints from surface geology, geochronology, plate-motion reconstructions, geochemistry and seismology to ascertain plume melting depths beneath Earth's longest continental hotspot track, a 2,000-kilometre-long track in eastern Australia that displays a record of volcanic activity between 33 and 9 million years ago, which we call the Cosgrove track. Our analyses highlight a strong correlation between lithospheric thickness and magma composition along this track, with: (1) standard basaltic compositions in regions where lithospheric thickness is less than 110 kilometres; (2) volcanic gaps in regions where lithospheric thickness exceeds 150 kilometres; and (3) low-volume, leucitite-bearing volcanism in regions of intermediate lithospheric thickness. Trace-element concentrations from samples along this track support the notion that these compositional variations result from different degrees of partial melting, which is controlled by the thickness of overlying lithosphere. Our results place the first observational constraints on the sub-continental melting depth of mantle plumes and provide direct evidence that lithospheric thickness has a dominant influence on the volume and chemical composition of plume-derived magmas.

  9. Deep-sea octopus (Graneledone boreopacifica) conducts the longest-known egg-brooding period of any animal.

    Science.gov (United States)

    Robison, Bruce; Seibel, Brad; Drazen, Jeffrey

    2014-01-01

    Octopuses typically have a single reproductive period and then they die (semelparity). Once a clutch of fertilized eggs has been produced, the female protects and tends them until they hatch. In most shallow-water species this period of parental care can last from 1 to 3 months, but very little is known about the brooding of deep-living species. In the cold, dark waters of the deep ocean, metabolic processes are often slower than their counterparts at shallower depths. Extrapolations from data on shallow-water octopus species suggest that lower temperatures would prolong embryonic development periods. Likewise, laboratory studies have linked lower temperatures to longer brooding periods in cephalopods, but direct evidence has not been available. We found an opportunity to directly measure the brooding period of the deep-sea octopus Graneledone boreopacifica, in its natural habitat. At 53 months, it is by far the longest egg-brooding period ever reported for any animal species. These surprising results emphasize the selective value of prolonged embryonic development in order to produce competitive hatchlings. They also extend the known boundaries of physiological adaptations for life in the deep sea.

  10. The transcriptome of the bowhead whale Balaena mysticetus reveals adaptations of the longest-lived mammal

    Science.gov (United States)

    Seim, Inge; Ma, Siming; Zhou, Xuming; Gerashchenko, Maxim V.; Lee, Sang-Goo; Suydam, Robert; George, John C.; Bickham, John W.; Gladyshev, Vadim N.

    2014-01-01

    Mammals vary dramatically in lifespan, by at least two-orders of magnitude, but the molecular basis for this difference remains largely unknown. The bowhead whale Balaena mysticetus is the longest-lived mammal known, with an estimated maximal lifespan in excess of two hundred years. It is also one of the two largest animals and the most cold-adapted baleen whale species. Here, we report the first genome-wide gene expression analyses of the bowhead whale, based on the de novo assembly of its transcriptome. Bowhead whale or cetacean-specific changes in gene expression were identified in the liver, kidney and heart, and complemented with analyses of positively selected genes. Changes associated with altered insulin signaling and other gene expression patterns could help explain the remarkable longevity of bowhead whales as well as their adaptation to a lipid-rich diet. The data also reveal parallels in candidate longevity adaptations of the bowhead whale, naked mole rat and Brandt's bat. The bowhead whale transcriptome is a valuable resource for the study of this remarkable animal, including the evolution of longevity and its important correlates such as resistance to cancer and other diseases. PMID:25411232

  11. Overview of Parallel Platforms for Common High Performance Computing

    Directory of Open Access Journals (Sweden)

    T. Fryza

    2012-04-01

    Full Text Available The paper deals with various parallel platforms used for high-performance computing in the signal processing domain. More precisely, methods exploiting multicore central processing units, such as the message passing interface (MPI) and OpenMP, are taken into account. The properties of these programming methods are demonstrated experimentally on a fast Fourier transform (FFT) and a discrete cosine transform (DCT), and they are compared with the possibilities of MATLAB's built-in functions and of Texas Instruments digital signal processors with very-long-instruction-word architectures. New FFT and DCT implementations were proposed and tested. The implementations were also compared with CPU-based computing methods and with the Texas Instruments digital signal processing library on C6747 floating-point DSPs. An optimal combination of computing methods in the signal processing domain and the implementation of new, fast routines are proposed as well.
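
    The paper benchmarks OpenMP, MPI, MATLAB and DSP implementations; as a language-neutral illustration of the same workload-splitting idea, the toy below compares a serial batch of FFTs against the batch split across worker processes (timings are not comparable to the paper's results).

    ```python
    # Toy comparison of a serial batch of FFTs against the same batch split across
    # worker processes. This uses Python's multiprocessing purely to show the
    # workload-splitting pattern; it is not an OpenMP/MPI/DSP benchmark.
    import time
    import numpy as np
    from multiprocessing import Pool

    def fft_block(block):
        return np.fft.fft(block, axis=1)

    if __name__ == "__main__":
        data = np.random.standard_normal((64, 2 ** 16))   # 64 signals of 65536 samples

        t0 = time.perf_counter()
        serial = np.fft.fft(data, axis=1)
        t1 = time.perf_counter()

        with Pool(processes=4) as pool:                    # 4 worker processes (assumption)
            parts = pool.map(fft_block, np.array_split(data, 4))
        parallel = np.vstack(parts)
        t2 = time.perf_counter()

        print(f"serial {t1 - t0:.3f} s, parallel {t2 - t1:.3f} s, "
              f"match={np.allclose(serial, parallel)}")
    ```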

  12. Chichibu Park Bridge, Japan's longest PC cable suspension bridge that attaches importance to scenery. Keikan wo jushishita Nippon saidai no PC shachokyo 'Chichibu koenkyo'

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    This paper introduces the features of Chichibu Park Bridge, Japan's longest PC cable suspension bridge, whose design attaches importance to scenery. The maximum effective span of Chichibu Park Bridge, a two-span continuous PC cable suspension bridge, measures 195 m, which means the center span length is equivalent to about 400 m if converted to a three-span structure. With respect to the design that values the scenic effect, the main tower carries relief engravings in a stone-carving style using the Chichibu Night Festival as a motif; the main tower is lit up so that it can be viewed from far away; and a balcony is built at the center of the bridge. The bridge's axial direction is staggered from the river flow direction by 45° to reduce water resistance. The tensile force generated at the corbel section by the main tower reaction force is handled with reinforced concrete rather than prestressed concrete. The main tower adopts a two-chamber girder structure as its cross-sectional shape from the viewpoints of rigidity assurance and scenic effect. For construction control, microcomputers are used to correct the growing change in the bend of the main girder due to temperature change and cable tension change. 6 figs., 4 tabs.

  13. Common Agency and Computational Complexity : Theory and Experimental Evidence

    NARCIS (Netherlands)

    Kirchsteiger, G.; Prat, A.

    1999-01-01

    In a common agency game, several principals try to influence the behavior of an agent. Common agency games typically have multiple equilibria. One class of equilibria, called truthful, has been identified by Bernheim and Whinston and has found widespread use in the political economy literature. In

  14. Consumer attitudes towards computer-assisted self-care of the common cold.

    Science.gov (United States)

    Reis, J; Wrestler, F

    1994-04-01

    Knowledge of colds and flu and attitudes towards the use of computers for self-care are compared for 260 young adult users and 194 young adult non-users of computer-assisted self-care for colds and flu. Participants completed a knowledge questionnaire on colds and flu, used a computer program designed to enhance self-care for colds and flu, and then completed a questionnaire on their attitudes towards using a computer for self-care for colds and flu, the perceived importance of physician interactions, physician expertise, and patient-physician communication. Compared with users, non-users preferred personal contact with their physicians and felt that computerized health assessments would be limited in vocabulary and range of current medical information. Non-users were also more likely to agree that people could not be trusted to do an accurate computerized health assessment and that the average person was too computer illiterate to use computers for self-care.

  15. Single-trial detection of visual evoked potentials by common spatial patterns and wavelet filtering for brain-computer interface.

    Science.gov (United States)

    Tu, Yiheng; Huang, Gan; Hung, Yeung Sam; Hu, Li; Hu, Yong; Zhang, Zhiguo

    2013-01-01

    Event-related potentials (ERPs) are widely used in brain-computer interface (BCI) systems as input signals conveying a subject's intention. A fast and reliable single-trial ERP detection method can be used to develop a BCI system with both high speed and high accuracy. However, most single-trial ERP detection methods are developed for offline EEG analysis and thus have high computational complexity and require manual operations. They are therefore not applicable to practical BCI systems, which require a low-complexity, automatic ERP detection method. This work presents a joint spatial-time-frequency filter that combines common spatial patterns (CSP) and wavelet filtering (WF) to improve the signal-to-noise ratio (SNR) of visual evoked potentials (VEPs), which can lead to a single-trial ERP-based BCI.
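
    A sketch of the wavelet-filtering half of such a pipeline, applied after spatial filtering, using the PyWavelets package; the wavelet family, decomposition level and universal soft threshold are illustrative choices, not necessarily those used in the paper.

    ```python
    # Wavelet soft-threshold denoising of a single-trial waveform (applied after the
    # CSP spatial filter in such a pipeline), using the PyWavelets package. Wavelet
    # family, level and the universal threshold are illustrative choices.
    import numpy as np
    import pywt

    def wavelet_denoise(signal, wavelet="db4", level=4):
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise scale from finest level
        thr = sigma * np.sqrt(2 * np.log(len(signal)))        # universal threshold
        denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(denoised, wavelet)[: len(signal)]

    rng = np.random.default_rng(3)
    t = np.linspace(0, 1, 500)
    clean = np.exp(-((t - 0.3) ** 2) / 0.002)                 # crude ERP-like deflection
    trial = clean + 0.5 * rng.standard_normal(t.size)         # one noisy trial
    print(np.corrcoef(clean, wavelet_denoise(trial))[0, 1])   # similarity after denoising
    ```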

  16. The Tragedy of the Commons

    Science.gov (United States)

    Short, Daniel

    2016-01-01

    The tragedy of the commons is one of the principal tenets of ecology. Recent developments in experiential computer-based simulation of the tragedy of the commons are described. A virtual learning environment is developed using the popular video game "Minecraft". The virtual learning environment is used to experience first-hand depletion…

  17. Common envelope evolution

    NARCIS (Netherlands)

    Taam, Ronald E.; Ricker, Paul M.

    2010-01-01

    The common envelope phase of binary star evolution plays a central role in many evolutionary pathways leading to the formation of compact objects in short period systems. Using three dimensional hydrodynamical computations, we review the major features of this evolutionary phase, focusing on the

  18. Computational Fluid Dynamics (CFD) Computations With Zonal Navier-Stokes Flow Solver (ZNSFLOW) Common High Performance Computing Scalable Software Initiative (CHSSI) Software

    National Research Council Canada - National Science Library

    Edge, Harris

    1999-01-01

    ...), computational fluid dynamics (CFD) 6 project. Under the project, a proven zonal Navier-Stokes solver was rewritten for scalable parallel performance on both shared memory and distributed memory high performance computers...

  19. Approximate solutions of common fixed-point problems

    CERN Document Server

    Zaslavski, Alexander J

    2016-01-01

    This book presents results on the convergence behavior of algorithms which are known as vital tools for solving convex feasibility problems and common fixed point problems. The main goal for us in dealing with a known computational error is to find what approximate solution can be obtained and how many iterates one needs to find it. According to known results, these algorithms should converge to a solution. In this exposition, these algorithms are studied, taking into account computational errors which remain consistent in practice. In this case the convergence to a solution does not take place. We show that our algorithms generate a good approximate solution if computational errors are bounded from above by a small positive constant. Beginning with an introduction, this monograph moves on to study: · dynamic string-averaging methods for common fixed point problems in a Hilbert space · dynamic string methods for common fixed point problems in a metric space · dynamic string-averaging version of the proximal...

  20. A survey of common habits of computer users as indicators of ...

    African Journals Online (AJOL)

    Yomi

    2012-01-31

    Hygiene has been recognized as an infection control strategy and the extent of the problems of environmental contamination largely depends on personal hygiene. With the development of several computer applications in recent times, the uses of computer systems have greatly expanded. And with.

  1. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  2. Common cause failure analysis methodology for complex systems

    International Nuclear Information System (INIS)

    Wagner, D.P.; Cate, C.L.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complex system reliability analysis. This paper extends existing methods of computer aided common cause failure analysis by allowing analysis of the complex systems often encountered in practice. The methods presented here aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  3. Isolated and unexplained dilation of the common bile duct on computed tomography scans

    Directory of Open Access Journals (Sweden)

    Naveen B. Krishna

    2012-07-01

    Full Text Available Isolated dilation of the common bile duct (CBD) with a normal-sized pancreatic duct and without identifiable stones or mass lesion (unexplained) is frequently encountered on computed tomography/magnetic resonance imaging. We studied the final diagnoses in these patients and tried to elucidate factors that can predict a malignant etiology. This is a retrospective analysis of a prospective database from a university-based clinical practice (2002-2008). We included 107 consecutive patients who underwent endoscopic ultrasound (EUS) for evaluation of isolated and unexplained CBD dilation noted on contrast computed tomography scans. EUS examination was performed using a radial echoendoscope, followed by a linear echoendoscope if a focal mass lesion was identified. Fine-needle aspirates were assessed immediately by an attending cytopathologist. Main outcome measurements included (i) prevalence of neoplasms, CBD stones and chronic pancreatitis and (ii) performance characteristics of EUS/EUS-guided fine needle aspiration (EUS-FNA). A malignant neoplasm was found in 16 patients (14.9% of the study subjects), all with obstructive jaundice (ObJ). Six patients had CBD stones: three with ObJ and three with abnormal liver function tests. EUS findings suggestive of chronic pancreatitis were identified in 27 patients. EUS-FNA had 97.3% accuracy (94.1% in the subset with ObJ), with a sensitivity of 81.2% and specificity of 100% for diagnosing malignancy. Presence of ObJ and older patient age were the only significant predictors of malignancy in our cohort. Among patients with isolated and unexplained dilation of the CBD, the risk of malignancy is significantly higher in older patients presenting with ObJ. EUS-FNA can diagnose malignancy in these patients with high accuracy, besides identifying other potential etiologies including missed CBD stones and chronic pancreatitis.

  4. Common Readout System in ALICE

    CERN Document Server

    Jubin, Mitra

    2016-01-01

    The ALICE experiment at the CERN Large Hadron Collider is going for a major physics upgrade in 2018. This upgrade is necessary for getting high statistics and high precision measurement for probing into rare physics channels needed to understand the dynamics of the condensed phase of QCD. The high interaction rate and the large event size in the upgraded detectors will result in an experimental data flow traffic of about 1 TB/s from the detectors to the on-line computing system. A dedicated Common Readout Unit (CRU) is proposed for data concentration, multiplexing, and trigger distribution. CRU, as common interface unit, handles timing, data and control signals between on-detector systems and online-offline computing system. An overview of the CRU architecture is presented in this manuscript.

  5. Common Readout System in ALICE

    CERN Document Server

    Jubin, Mitra

    2017-01-01

    The ALICE experiment at the CERN Large Hadron Collider is going for a major physics upgrade in 2018. This upgrade is necessary for getting high statistics and high precision measurement for probing into rare physics channels needed to understand the dynamics of the condensed phase of QCD. The high interaction rate and the large event size in the upgraded detectors will result in an experimental data flow traffic of about 1 TB/s from the detectors to the on-line computing system. A dedicated Common Readout Unit (CRU) is proposed for data concentration, multiplexing, and trigger distribution. CRU, as common interface unit, handles timing, data and control signals between on-detector systems and online-offline computing system. An overview of the CRU architecture is presented in this manuscript.

  6. Extended Postnatal Brain Development in the Longest-Lived Rodent: Prolonged Maintenance of Neotenous Traits in the Naked Mole-Rat Brain.

    Science.gov (United States)

    Orr, Miranda E; Garbarino, Valentina R; Salinas, Angelica; Buffenstein, Rochelle

    2016-01-01

    The naked mole-rat (NMR) is the longest-lived rodent with a maximum lifespan >31 years. Intriguingly, fully-grown naked mole-rats (NMRs) exhibit many traits typical of neonatal rodents. However, little is known about NMR growth and maturation, and we question whether sustained neotenous features when compared to mice, reflect an extended developmental period, commensurate with their exceptionally long life. We tracked development from birth to 3 years of age in the slowest maturing organ, the brain, by measuring mass, neural stem cell proliferation, axonal, and dendritic maturation, synaptogenesis and myelination. NMR brain maturation was compared to data from similar sized rodents, mice, and to that of long-lived mammals, humans, and non-human primates. We found that at birth, NMR brains are significantly more developed than mice, and rather are more similar to those of newborn primates, with clearly laminated hippocampi and myelinated white matter tracts. Despite this more mature brain at birth than mice, postnatal NMR brain maturation occurs at a far slower rate than mice, taking four-times longer than required for mice to fully complete brain development. At 4 months of age, NMR brains reach 90% of adult size with stable neuronal cytostructural protein expression whereas myelin protein expression does not plateau until 9 months of age in NMRs, and synaptic protein expression continues to change throughout the first 3 years of life. Intriguingly, NMR axonal composition is more similar to humans than mice whereby NMRs maintain expression of three-repeat (3R) tau even after brain growth is complete; mice experience an abrupt downregulation of 3R tau by postnatal day 8 which continues to diminish through 6 weeks of age. We have identified key ages in NMR cerebral development and suggest that the long-lived NMR may provide neurobiologists an exceptional model to study brain developmental processes that are compressed in common short-lived laboratory animal models.

  7. Extended postnatal brain development in the longest-lived rodent: prolonged maintenance of neotenous traits in the naked mole-rat brain

    Directory of Open Access Journals (Sweden)

    Miranda E. Orr

    2016-11-01

    Full Text Available The naked mole-rat (NMR) is the longest-lived rodent with a maximum lifespan >31 years. Intriguingly, fully-grown naked mole-rats (NMRs) exhibit many traits typical of neonatal rodents. However, little is known about NMR growth and maturation, and we question whether sustained neotenous features when compared to mice, reflect an extended developmental period, commensurate with their exceptionally long life. We tracked development from birth to three years of age in the slowest maturing organ, the brain, by measuring mass, neural stem cell proliferation, axonal and dendritic maturation, synaptogenesis and myelination. NMR brain maturation was compared to data from similar sized rodents, mice, and to that of long-lived mammals, humans and non-human primates. We found that at birth, NMR brains are significantly more developed than mice, and rather are more similar to those of newborn primates, with clearly laminated hippocampi and myelinated white matter tracts. Despite this more mature brain at birth than mice, postnatal NMR brain maturation occurs at a far slower rate than mice, taking four-times longer than required for mice to fully complete brain development. At four months of age, NMR brains reach 90% of adult size with stable neuronal cytostructural protein expression whereas myelin protein expression does not plateau until nine months of age in NMRs, and synaptic protein expression continues to change throughout the first three years of life. Intriguingly, NMR axonal composition is more similar to humans than mice whereby NMRs maintain expression of three-repeat (3R) tau even after brain growth is complete; mice experience an abrupt downregulation of 3R tau by postnatal day 8 which continues to diminish through six weeks of age. We have identified key ages in NMR cerebral development and suggest that the long-lived NMR may provide neurobiologists an exceptional model to study brain developmental processes that are compressed in common short

  8. Computed Tomographic Measurement of Splenic Size in

    International Nuclear Information System (INIS)

    Sung, Nak Kwan; Woo, Seong Ku; Ko, Young Tae; Kim, Soon Young

    2010-01-01

    The authors analyzed 72 cases of abdominal computed tomography of Korean adults who had no medical reason to suspect an abnormal spleen. The following criteria were measured on multiple transverse scans of the entire length of the spleen: size (height, breadth, thickness) and relationship to a fixed midline structure, the spine (the shortest distance from the midline to the medial edge of the spleen, and the longest distance from the anterior margin of the vertebral body to the anterior tip of the spleen). The results were as follows: 1. The average size in adult males was 8.0±1.5cm in height, 8.6±1.2cm in breadth and 3.4±0.6cm in thickness; in adult females, 7.8±1.1cm, 8.4±1.0cm and 3.4±0.6cm, respectively; total average, 7.9±1.3cm, 8.5±1.1cm and 3.4±0.6cm, respectively. No remarkable difference was noted between the sexes or between age groups. 2. The shortest distance from the midline to the medial edge of the spleen was 4.1±1.1cm in males, 3.6±1.0cm in females, with a total average of 3.9±1.1cm. There was a remarkable difference between the sexes (P<0.005) but not between age groups. 3. The longest distance from the anterior margin of the vertebral body to the anterior edge of the spleen was 2.3±1.7cm in males, 2.0±1.4cm in females, with a total average of 2.2±1.6cm. No remarkable difference was seen between the sexes or between age groups.

  9. Walking the oxidative stress tightrope: a perspective from the naked mole-rat, the longest-living rodent.

    Science.gov (United States)

    Rodriguez, Karl A; Wywial, Ewa; Perez, Viviana I; Lambert, Adriant J; Edrey, Yael H; Lewis, Kaitlyn N; Grimes, Kelly; Lindsey, Merry L; Brand, Martin D; Buffenstein, Rochelle

    2011-01-01

    Reactive oxygen species (ROS), by-products of aerobic metabolism, cause oxidative damage to cells and tissue and not surprisingly many theories have arisen to link ROS-induced oxidative stress to aging and health. While studies clearly link ROS to a plethora of divergent diseases, their role in aging is still debatable. Genetic knock-down manipulations of antioxidants alter the levels of accrued oxidative damage, however, the resultant effect of increased oxidative stress on lifespan are equivocal. Similarly the impact of elevating antioxidant levels through transgenic manipulations yield inconsistent effects on longevity. Furthermore, comparative data from a wide range of endotherms with disparate longevity remain inconclusive. Many long-living species such as birds, bats and mole-rats exhibit high-levels of oxidative damage, evident already at young ages. Clearly, neither the amount of ROS per se nor the sensitivity in neutralizing ROS are as important as whether or not the accrued oxidative stress leads to oxidative-damage-linked age-associated diseases. In this review we examine the literature on ROS, its relation to disease and the lessons gleaned from a comparative approach based upon species with widely divergent responses. We specifically focus on the longest lived rodent, the naked mole-rat, which maintains good health and provides novel insights into the paradox of maintaining both an extended healthspan and lifespan despite high oxidative stress from a young age.

  10. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Computed tomography (CT) of the sinuses ... What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  11. Optimal design of wind barriers using 3D computational fluid dynamics simulations

    Science.gov (United States)

    Fang, H.; Wu, X.; Yang, X.

    2017-12-01

    Desertification is a significant global environmental and ecological problem that requires human-regulated control and management. Wind barriers are commonly used to reduce wind velocity or trap drifting sand in arid or semi-arid areas. Therefore, optimal design of wind barriers becomes critical in Aeolian engineering. In the current study, we perform 3D computational fluid dynamics (CFD) simulations for flow passing through wind barriers with different structural parameters. To validate the simulation results, we first inter-compare the simulated flow field results with those from both wind-tunnel experiments and field measurements. Quantitative analyses of the shelter effect are then conducted based on a series of simulations with different structural parameters (such as wind barrier porosity, row numbers, inter-row spacing and belt schemes). The results show that wind barriers with porosity of 0.35 could provide the longest shelter distance (i.e., where the wind velocity reduction is more than 50%) thus are recommended in engineering designs. To determine the optimal row number and belt scheme, we introduce a cost function that takes both wind-velocity reduction effects and economical expense into account. The calculated cost function show that a 3-row-belt scheme with inter-row spacing of 6h (h as the height of wind barriers) and inter-belt spacing of 12h is the most effective.

  12. Design and simulation of virtual telephone keypad control based on brain computer interface (BCI) with very high transfer rates

    Directory of Open Access Journals (Sweden)

    Rehab B. Ashari

    2011-03-01

    Full Text Available Brain Computer Interface (BCI is a communication and control mechanism, which does not rely on any kind of muscular response to send a message to the external world. This technique is used to help the paralyzed people with spinal cord injury to have the ability to communicate with the external world. In this paper we emphasize to increase the BCI System bit rate for controlling a virtual telephone keypad. To achieve the proposed algorithm, a simulated virtual telephone keypad based on Steady State Visual Evoked Potential (SSVEP BCI system is developed. Dynamic programming technique with specifically modified Longest Common Subsequence (LCS algorithm is used. By comparing the paralyzed user selection with the recent, and then the rest, of the stored records in the file of the telephone, the user can save the rest of his choices for controlling the keypad and thence improving the overall performance of the BCI system. This axiomatic approach, which is used in searching the web pages for increasing the performance of the searching, is urgent to be used for the paralyzed people rather than the normal user.

  13. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics

  14. Common Sense Planning for a Computer, or, What's It Worth to You?

    Science.gov (United States)

    Crawford, Walt

    1984-01-01

    Suggests factors to be considered in planning for the purchase of a microcomputer, including budgets, benefits, costs, and decisions. Major uses of a personal computer are described--word processing, financial analysis, file and database management, programming and computer literacy, education, entertainment, and thrill of high technology. (EJS)

  15. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly; it is very hard even for professionals to keep updated. Computer people do not

  16. Missile signal processing common computer architecture for rapid technology upgrade

    Science.gov (United States)

    Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul

    2004-10-01

    Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as the sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain comprises two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific, and required custom software development. They were developed using non-integrated toolsets and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's Law-driven improvement in computational throughput; a newly introduced vector computing capability in general purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. This application

  17. Topics in combinatorial pattern matching

    DEFF Research Database (Denmark)

    Vildhøj, Hjalte Wedel

    Problem. Given m documents of total length n, we consider the problem of finding a longest string common to at least d ≥ 2 of the documents. This problem is known as the longest common substring (LCS) problem and has a classic O(n) space and O(n) time solution (Weiner [FOCS’73], Hui [CPM’92]). However...
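
    The linear-time solutions cited use suffix trees; as a simpler point of reference, the sketch below is the textbook quadratic dynamic program for the longest common substring of two strings (the d = m = 2 case), tracking the longest common suffix of every prefix pair.

    ```python
    # Textbook quadratic dynamic program for the longest common substring of two
    # strings (d = m = 2), tracking the longest common suffix of every prefix pair;
    # the cited linear-time solutions use suffix trees instead.
    def longest_common_substring(a, b):
        best_len, best_end = 0, 0
        prev = [0] * (len(b) + 1)
        for i in range(1, len(a) + 1):
            cur = [0] * (len(b) + 1)
            for j in range(1, len(b) + 1):
                if a[i - 1] == b[j - 1]:
                    cur[j] = prev[j - 1] + 1
                    if cur[j] > best_len:
                        best_len, best_end = cur[j], i
            prev = cur
        return a[best_end - best_len:best_end]

    print(longest_common_substring("supercomputer", "microcomputing"))  # -> "comput"
    ```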

  18. Osteoid osteomas in common and in technically challenging locations treated with computed tomography-guided percutaneous radiofrequency ablation

    International Nuclear Information System (INIS)

    Mylona, Sophia; Patsoura, Sofia; Karapostolakis, Georgios; Galani, Panagiota; Pomoni, Anastasia; Thanos, Loukas

    2010-01-01

    To evaluate the efficacy of computed tomography (CT)-guided radiofrequency (RF) ablation for the treatment of osteoid osteomas in common and in technically challenging locations. Twenty-three patients with osteoid osteomas in common (nine cases) and technically challenging [14 cases: intra-articular (n = 7), spinal (n = 5), metaphyseal (n = 2)] positions were treated with CT-guided RF ablation. Therapy was performed under conscious sedation with a seven-array expandable RF electrode for 8-10 min at 80-110°C and a power of 90-110 W. Patients were discharged home with instructions. A brief pain inventory (BPI) score was calculated before and after (1 day, 4 weeks, 6 months and 1 year) treatment. All procedures were technically successful. Primary clinical success was 91.3% (21 of 23 patients), regardless of the lesions' locations. The BPI score was dramatically reduced after the procedure, and the decrease in BPI score was significant (P < 0.001, paired t-test; n - 1 = 22) for all periods during follow-up. Two patients had persistent pain after 1 month and were treated successfully with a second procedure (secondary success rate 100%). No immediate or delayed complications were observed. CT-guided RF ablation is safe and highly effective for treatment of osteoid osteomas, even in technically difficult positions. (orig.)

  19. Negligible senescence in the longest living rodent, the naked mole-rat: insights from a successfully aging species.

    Science.gov (United States)

    Buffenstein, Rochelle

    2008-05-01

    Aging refers to a gradual deterioration in function that, over time, leads to increased mortality risk, and declining fertility. This pervasive process occurs in almost all organisms, although some long-lived trees and cold water inhabitants reportedly show insignificant aging. Negligible senescence is characterized by attenuated age-related change in reproductive and physiological functions, as well as no observable age-related gradual increase in mortality rate. It was questioned whether the longest living rodent, the naked mole-rat, met these three strict criteria. Naked mole-rats live in captivity for more than 28.3 years, approximately 9 times longer than similar-sized mice. They maintain body composition from 2 to 24 years, and show only slight age-related changes in all physiological and morphological characteristics studied to date. Surprisingly breeding females show no decline in fertility even when well into their third decade of life. Moreover, these animals have never been observed to develop any spontaneous neoplasm. As such they do not show the typical age-associated acceleration in mortality risk that characterizes every other known mammalian species and may therefore be the first reported mammal showing negligible senescence over the majority of their long lifespan. Clearly physiological and biochemical processes in this species have evolved to dramatically extend healthy lifespan. The challenge that lies ahead is to understand what these mechanisms are.

  20. Joint Service Common Operating Environment (COE) Common Geographic Information System functional requirements

    Energy Technology Data Exchange (ETDEWEB)

    Meitzler, W.D.

    1992-06-01

    In the context of this document and COE, Geographic Information Systems (GIS) are decision support systems involving the integration of spatially referenced data in a problem-solving environment. They are digital computer systems for capturing, processing, managing, displaying, modeling, and analyzing geographically referenced spatial data, which are described by attribute data and location. The ability to perform spatial analysis and the ability to combine two or more data sets to create new spatial information differentiate a GIS from other computer mapping systems. While the CCGIS allows for data editing and input, its primary purpose is not to prepare data, but rather to manipulate, analyze, and clarify it. The CCGIS defined herein provides GIS services and resources, including the spatial and map-related functionality common to all subsystems contained within the COE suite of C4I systems. The CCGIS, which is an integral component of the COE concept, relies on the other COE standard components to provide the definition for the other required support computing services.

  1. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Computed Tomography (CT) - Head. Computed tomography (CT) of the head uses special x-ray ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT or CAT ...

  2. Reliability model for common mode failures in redundant safety systems

    International Nuclear Information System (INIS)

    Fleming, K.N.

    1974-12-01

    A method is presented for computing the reliability of redundant safety systems, considering both independent and common-mode failures. The model developed for the computation is a simple extension of classical reliability theory. The feasibility of the method is demonstrated with an example. The probability of failure of a typical diesel-generator emergency power system is computed based on data obtained from U.S. diesel-generator operating experience. The results are compared with reliability predictions based on the assumption that all failures are independent. The comparison shows a significant increase in the probability of redundant system failure when common failure modes are considered. (U.S.)
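    The abstract above does not reproduce the report's equations; as a rough illustration of why common-cause contributions matter, the sketch below uses a simple beta-factor style split of the per-train failure probability (an assumption of this example, not necessarily the model of the cited report) for a 1-out-of-2 redundant pair such as two emergency diesel generators.

```python
def redundant_pair_failure_prob(q_total: float, beta: float) -> dict:
    """Failure probability of a 1-out-of-2 redundant system.

    Beta-factor style split, used here purely for illustration:
      q_total -- total failure probability of a single train
      beta    -- fraction of q_total attributed to a shared (common) cause
    """
    q_ccf = beta * q_total          # common-cause contribution
    q_ind = (1.0 - beta) * q_total  # independent contribution
    independent_only = q_total ** 2  # naive estimate: both trains fail independently
    with_ccf = q_ind ** 2 + q_ccf    # independent double failure OR the shared cause
    return {"independent_only": independent_only, "with_ccf": with_ccf}

# Example: 5% per-demand failure probability, 10% of it from a common cause
print(redundant_pair_failure_prob(0.05, 0.10))
# {'independent_only': 0.0025, 'with_ccf': 0.007025} -- markedly higher with common cause
```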

  3. Diagnostic reference levels for common computed tomography (CT) examinations: results from the first Nigerian nationwide dose survey.

    Science.gov (United States)

    Ekpo, Ernest U; Adejoh, Thomas; Akwo, Judith D; Emeka, Owujekwe C; Modu, Ali A; Abba, Mohammed; Adesina, Kudirat A; Omiyi, David O; Chiegwu, Uche H

    2018-01-29

    To explore doses from common adult computed tomography (CT) examinations and propose national diagnostic reference levels (nDRLs) for Nigeria. This retrospective study was approved by the Nnamdi Azikiwe University and University Teaching Hospital Institutional Review Boards (IRB: NAUTH/CS/66/Vol8/84) and involved dose surveys of adult CT examinations across the six geographical regions of Nigeria and Abuja from January 2016 to August 2017. Dose data of adult head, chest and abdomen/pelvis CT examinations were extracted from patient folders. The median, 75th and 25th percentile CT dose index volume (CTDIvol) and dose-length-product (DLP) were computed for each of these procedures. Effective doses (E) for these examinations were estimated using the k conversion factor as described in ICRP publication 103 (E = k × DLP). The proposed 75th percentile CTDIvol for head, chest, and abdomen/pelvis are 61 mGy, 17 mGy, and 20 mGy, respectively. The corresponding DLPs are 1310 mGy.cm, 735 mGy.cm, and 1486 mGy.cm, respectively. The effective doses were 2.75 mSv (head), 10.29 mSv (chest), and 22.29 mSv (abdomen/pelvis). Findings demonstrate wide dose variations within and across centres in Nigeria. The results also show CTDIvol comparable to international standards, but considerably higher DLP and effective doses.
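    A minimal sketch of the E = k × DLP conversion used above; the region-specific k coefficients are commonly tabulated adult values (an assumption of this example that happens to reproduce the reported effective doses, not figures quoted from the survey itself).

```python
# k conversion coefficients in mSv per mGy.cm (commonly tabulated adult values,
# assumed here rather than quoted from the survey).
K_FACTOR = {"head": 0.0021, "chest": 0.014, "abdomen_pelvis": 0.015}

def effective_dose(dlp_mgy_cm: float, region: str) -> float:
    """Effective dose E = k * DLP for an adult CT examination."""
    return K_FACTOR[region] * dlp_mgy_cm

# 75th percentile DLPs reported in the survey
for region, dlp in [("head", 1310), ("chest", 735), ("abdomen_pelvis", 1486)]:
    print(f"{region}: {effective_dose(dlp, region):.2f} mSv")
# head: 2.75 mSv, chest: 10.29 mSv, abdomen_pelvis: 22.29 mSv
```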

  4. The influence of age, gender and other information technology use on young people's computer use at school and home.

    Science.gov (United States)

    Harris, C; Straker, L; Pollock, C

    2013-01-01

    Young people are exposed to a range of information technologies (IT) in different environments, including home and school; however, the factors influencing IT use at home and school are poorly understood. The aim of this study was to investigate young people's computer exposure patterns at home and school, and related factors such as age, gender and the types of IT used. 1351 children in Years 1, 6, 9 and 11 from 10 schools in metropolitan Western Australia were surveyed. Most children had access to computers at home and school, with computer exposures comparable to TV, reading and writing. Total computer exposure was greater at home than at school, and increased with age. Computer activities varied with age and gender and became more social with increased age, while parental involvement decreased. Bedroom computer use was found to result in higher exposure patterns. High levels of home and school computer use were associated with each other. Associations varied depending on the type of IT exposure measure (frequency, mean weekly hours, usual and longest duration). The frequency and duration of children's computer exposure were associated with a complex interplay of the environment of use, the participant's age and gender and other IT activities.

  5. Children's (Pediatric) CT (Computed Tomography)

    Medline Plus

    Full Text Available ... Children's (Pediatric) CT (Computed Tomography). Pediatric computed tomography (CT) ... are the limitations of Children's CT? What is Children's CT? Computed tomography, more commonly known as a ...

  6. Satellite tagging of rehabilitated green sea turtles Chelonia mydas from the United Arab Emirates, including the longest tracked journey for the species.

    Science.gov (United States)

    Robinson, David P; Jabado, Rima W; Rohner, Christoph A; Pierce, Simon J; Hyland, Kevin P; Baverstock, Warren R

    2017-01-01

    We collected movement data for eight rehabilitated and satellite-tagged green sea turtles Chelonia mydas released off the United Arab Emirates between 2005 and 2013. Rehabilitation periods ranged from 96 to 1353 days (mean = 437 ± 399 days). Seven of the eight tagged turtles survived after release; one turtle was killed by what is thought to be a post-release spear gun wound. The majority of turtles (63%) used shallow-water core habitats and established home ranges between Dubai and Abu Dhabi, the same area in which they had originally washed ashore prior to rescue. Four turtles made movements across international boundaries, highlighting that regional cooperation is necessary for the management of the species. One turtle swam from Fujairah to the Andaman Sea, a total distance of 8283 km, which is the longest published track of a green turtle. This study demonstrates that sea turtles can be successfully reintroduced into the wild after sustaining serious injury and undergoing prolonged periods of intense rehabilitation.

  7. Project Energise: Using participatory approaches and real time computer prompts to reduce occupational sitting and increase work time physical activity in office workers.

    Science.gov (United States)

    Gilson, Nicholas D; Ng, Norman; Pavey, Toby G; Ryde, Gemma C; Straker, Leon; Brown, Wendy J

    2016-11-01

    This efficacy study assessed the added impact real time computer prompts had on a participatory approach to reduce occupational sedentary exposure and increase physical activity. Quasi-experimental. 57 Australian office workers (mean [SD]; age=47 [11] years; BMI=28 [5] kg/m²; 46 men) generated a menu of 20 occupational 'sit less and move more' strategies through participatory workshops, and were then tasked with implementing strategies for five months (July-November 2014). During implementation, a sub-sample of workers (n=24) used a chair sensor/software package (Sitting Pad) that gave real time prompts to interrupt desk sitting. Baseline and intervention sedentary behaviour and physical activity (GENEActiv accelerometer; mean work time percentages), and minutes spent sitting at desks (Sitting Pad; mean total time and longest bout) were compared between non-prompt and prompt workers using a two-way ANOVA. Workers spent close to three quarters of their work time sedentary, mostly sitting at desks (mean [SD]; total desk sitting time=371 [71] min/day; longest bout spent desk sitting=104 [43] min/day). Intervention effects were four times greater in workers who used real time computer prompts (8% decrease in work time sedentary behaviour and a corresponding increase in light intensity physical activity). Real time computer prompts facilitated the impact of a participatory approach on reductions in occupational sedentary exposure, and increases in physical activity. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  8. A Case for Data Commons: Toward Data Science as a Service.

    Science.gov (United States)

    Grossman, Robert L; Heath, Allison; Murphy, Mark; Patterson, Maria; Wells, Walt

    2016-01-01

    Data commons collocate data, storage, and computing infrastructure with core services and commonly used tools and applications for managing, analyzing, and sharing data to create an interoperable resource for the research community. An architecture for data commons is described, as well as some lessons learned from operating several large-scale data commons.

  9. Digital dissection - using contrast-enhanced computed tomography scanning to elucidate hard- and soft-tissue anatomy in the Common Buzzard Buteo buteo.

    Science.gov (United States)

    Lautenschlager, Stephan; Bright, Jen A; Rayfield, Emily J

    2014-04-01

    Gross dissection has a long history as a tool for the study of human or animal soft- and hard-tissue anatomy. However, apart from being a time-consuming and invasive method, dissection is often unsuitable for very small specimens and often cannot capture spatial relationships of the individual soft-tissue structures. The handful of comprehensive studies on avian anatomy using traditional dissection techniques focus nearly exclusively on domestic birds, whereas raptorial birds, and in particular their cranial soft tissues, are essentially absent from the literature. Here, we digitally dissect, identify, and document the soft-tissue anatomy of the Common Buzzard (Buteo buteo) in detail, using the new approach of contrast-enhanced computed tomography using Lugol's iodine. The architecture of different muscle systems (adductor, depressor, ocular, hyoid, neck musculature), neurovascular, and other soft-tissue structures is three-dimensionally visualised and described in unprecedented detail. The three-dimensional model is further presented as an interactive PDF to facilitate the dissemination and accessibility of anatomical data. Due to the digital nature of the data derived from the computed tomography scanning and segmentation processes, these methods hold the potential for further computational analyses beyond descriptive and illustrative purposes. © 2013 The Authors. Journal of Anatomy published by John Wiley & Sons Ltd on behalf of Anatomical Society.

  10. Wave-equation Migration Velocity Analysis Using Plane-wave Common Image Gathers

    KAUST Repository

    Guo, Bowen; Schuster, Gerard T.

    2017-01-01

    Wave-equation migration velocity analysis (WEMVA) based on subsurface-offset, angle domain or time-lag common image gathers (CIGs) requires significant computational and memory resources because it computes higher dimensional migration images

  11. Children's (Pediatric) CT (Computed Tomography)

    Medline Plus

    Full Text Available ... Children's (Pediatric) CT (Computed Tomography). Pediatric computed tomography (CT) is a fast, painless exam that uses ... of Children's CT? What is Children's CT? Computed tomography, more commonly known as a CT or CAT ...

  12. Children's (Pediatric) CT (Computed Tomography)

    Medline Plus

    Full Text Available ... Children's (Pediatric) CT (Computed Tomography). Pediatric computed tomography (CT) is a fast, painless exam that uses special ... the limitations of Children's CT? What is Children's CT? Computed tomography, more commonly known as a CT or CAT ...

  13. Amyloid beta and the longest-lived rodent: the naked mole-rat as a model for natural protection from Alzheimer's disease.

    Science.gov (United States)

    Edrey, Yael H; Medina, David X; Gaczynska, Maria; Osmulski, Pawel A; Oddo, Salvatore; Caccamo, Antonella; Buffenstein, Rochelle

    2013-10-01

    Amyloid beta (Aβ) is implicated in Alzheimer's disease (AD) as an integral component of both neural toxicity and plaque formation. Brains of the longest-lived rodents, naked mole-rats (NMRs) approximately 32 years of age, had levels of Aβ similar to those of the 3xTg-AD mouse model of AD. Interestingly, there was no evidence of extracellular plaques, nor was there an age-related increase in Aβ levels in the individuals examined (2-20+ years). The NMR Aβ peptide showed greater homology to the human sequence than to the mouse sequence, differing by only 1 amino acid from the former. This subtle difference led to interspecies differences in aggregation propensity but not neurotoxicity; NMR Aβ was less prone to aggregation than human Aβ. Nevertheless, both NMR and human Aβ were equally toxic to mouse hippocampal neurons, suggesting that Aβ neurotoxicity and aggregation properties were not coupled. Understanding how NMRs acquire and tolerate high levels of Aβ with no plaque formation could provide useful insights into AD, and may elucidate protective mechanisms that delay AD progression. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. The common ancestry of life

    Directory of Open Access Journals (Sweden)

    Wolf Yuri I

    2010-11-01

    Full Text Available Abstract Background It is common belief that all cellular life forms on earth have a common origin. This view is supported by the universality of the genetic code and the universal conservation of multiple genes, particularly those that encode key components of the translation system. A remarkable recent study claims to provide a formal, homology-independent test of the Universal Common Ancestry hypothesis by comparing the ability of a common-ancestry model and a multiple-ancestry model to predict sequences of universally conserved proteins. Results We devised a computational experiment on a concatenated alignment of universally conserved proteins which shows that the purported demonstration of the universal common ancestry is a trivial consequence of significant sequence similarity between the analyzed proteins. The nature and origin of this similarity are irrelevant for the prediction of "common ancestry" by the model-comparison approach. Thus, homology (common origin) of the compared proteins remains an inference from sequence similarity rather than an independent property demonstrated by the likelihood analysis. Conclusion A formal demonstration of the Universal Common Ancestry hypothesis has not been achieved and is unlikely to be feasible in principle. Nevertheless, the evidence in support of this hypothesis provided by comparative genomics is overwhelming. Reviewers: this article was reviewed by William Martin, Ivan Iossifov (nominated by Andrey Rzhetsky), and Arcady Mushegian. For the complete reviews, see the Reviewers' Report section.

  15. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... are the limitations of CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed ... nasal cavity by small openings. What are some common uses of the procedure? CT ...

  16. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism: neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.

  17. Computer systems a programmer's perspective

    CERN Document Server

    Bryant, Randal E

    2016-01-01

    Computer systems: A Programmer’s Perspective explains the underlying elements common among all computer systems and how they affect general application performance. Written from the programmer’s perspective, this book strives to teach readers how understanding basic elements of computer systems and executing real practice can lead them to create better programs. Spanning across computer science themes such as hardware architecture, the operating system, and systems software, the Third Edition serves as a comprehensive introduction to programming. This book strives to create programmers who understand all elements of computer systems and will be able to engage in any application of the field--from fixing faulty software, to writing more capable programs, to avoiding common flaws. It lays the groundwork for readers to delve into more intensive topics such as computer architecture, embedded systems, and cybersecurity. This book focuses on systems that execute an x86-64 machine code, and recommends th...

  18. Automated quantification of pulmonary emphysema from computed tomography scans: comparison of variation and correlation of common measures in a large cohort

    Science.gov (United States)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.

    2010-03-01

    The purpose of this work was to retrospectively investigate the variation of standard indices of pulmonary emphysema from helical computed tomographic (CT) scans as related to inspiration differences over a 1 year interval, and to determine the strength of the relationship between these measures in a large cohort. 626 patients who had 2 scans taken at an interval of 9 months to 15 months (μ: 381 days, σ: 31 days) were selected for this work. All scans were acquired at a 1.25 mm slice thickness using a low dose protocol. For each scan, the emphysema index (EI), fractal dimension (FD), mean lung density (MLD), and 15th percentile of the histogram (HIST) were computed. The absolute and relative changes for each measure were computed and the empirical 95% confidence interval was reported on both non-normalized and normalized scales. Spearman correlation coefficients were computed between the relative change in each measure and the relative change in inspiration between each scan-pair, as well as between each pair-wise combination of the four measures. EI varied on a range of -10.5 to 10.5 on a non-normalized scale and -15 to 15 on a normalized scale, with FD and MLD showing slightly larger but comparable spreads, and HIST having a much larger variation. MLD was found to show the strongest correlation to inspiration change (r=0.85). Overall, the emphysema index and fractal dimension have the least variability of the commonly used measures of emphysema and offer the most unique quantification of emphysema relative to each other.
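    Three of the four measures above can be computed directly from the lung-voxel Hounsfield values; a hedged sketch is given below. The -950 HU threshold for the emphysema index is a commonly used choice assumed for this example (the abstract does not state one), and the fractal dimension is omitted.

```python
import numpy as np

def emphysema_measures(lung_hu: np.ndarray, threshold: float = -950.0) -> dict:
    """EI, MLD and HIST from lung-voxel HU values (FD omitted).

    The -950 HU threshold is a common choice assumed here, not a value
    taken from the cited study.
    """
    voxels = lung_hu.ravel()
    return {
        "EI": 100.0 * float(np.mean(voxels < threshold)),  # emphysema index: % voxels below threshold
        "MLD": float(np.mean(voxels)),                     # mean lung density (HU)
        "HIST": float(np.percentile(voxels, 15)),          # 15th percentile of the HU histogram
    }

# Toy example on synthetic voxel values only
rng = np.random.default_rng(0)
fake_lung = rng.normal(loc=-860, scale=60, size=100_000)
print(emphysema_measures(fake_lung))
```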

  19. Common-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest, M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CROSSER, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556) are used independently of one another. The point of equality between the reliability of the system and the common reliability of the components is found. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
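    As a hedged illustration of the kind of quantity such a program tabulates (a sketch under stated assumptions, not a reproduction of the original C code), the snippet below evaluates the cumulative-binomial reliability of a k-out-of-n system and locates the point where system reliability equals the common component reliability.

```python
from math import comb

def k_of_n_reliability(p: float, k: int, n: int) -> float:
    """Reliability of a k-out-of-n system of components with common reliability p
    (a cumulative binomial tail)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def crossing_point(k: int, n: int, lo: float = 0.01, hi: float = 0.99) -> float:
    """Bisection for the p where system reliability equals component reliability.
    Illustrative only; the exact numerics of the original program are not reproduced."""
    f = lambda p: k_of_n_reliability(p, k, n) - p
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

print(crossing_point(2, 3))  # -> 0.5 for a 2-out-of-3 system
```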

  20. Development of a common data model for scientific simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, J. [Los Alamos National Lab., NM (United States); Butler, D.M. [Limit Point Systems, Inc. (United States); Matarazzo, C.; Miller, M. [Lawrence Livermore National Lab., CA (United States); Schoof, L. [Sandia National Lab., Albuquerque, NM (United States)

    1999-06-01

    The problem of sharing data among scientific simulation models is a difficult and persistent one. Computational scientists employ an enormous variety of discrete approximations in modeling physical processes on computers. Problems occur when models based on different representations are required to exchange data with one another, or with some other software package. Within the DOE's Accelerated Strategic Computing Initiative (ASCI), a cross-disciplinary group called the Data Models and Formats (DMF) group has been working to develop a common data model. The current model is comprised of several layers of increasing semantic complexity. One of these layers is an abstract model based on set theory and topology called the fiber bundle kernel (FBK). This layer provides the flexibility needed to describe a wide range of mesh-approximated functions as well as other entities. This paper briefly describes the ASCI common data model, its mathematical basis, and ASCI prototype development. These prototypes include an object-oriented data management library developed at Los Alamos called the Common Data Model Library or CDMlib, the Vector Bundle API from the Lawrence Livermore Laboratory, and the DMF API from Sandia National Laboratory.

  1. Computer Lexis and Terminology

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2011-04-01

    Full Text Available Computers have become widely used tools in everyday work and at home. Every computer user sees texts on the screen containing many words that name new concepts. Those words come from the terminology used by specialists. A common vocabulary shared by computer terminology and the lexis of everyday language thus comes into existence. The article deals with the part of computer terminology that passes into everyday usage, and with the influence of ordinary language on computer terminology. The relation between English and Lithuanian computer terminology and the construction and pronunciation of acronyms are discussed as well.

  2. Structures for common-cause failure analysis

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1981-01-01

    Common-cause failure methodology and terminology have been reviewed and structured to provide a systematical basis for addressing and developing models and methods for quantification. The structure is based on (1) a specific set of definitions, (2) categories based on the way faults are attributable to a common cause, and (3) classes based on the time of entry and the time of elimination of the faults. The failure events are then characterized by their likelihood or frequency and the average residence time. The structure provides a basis for selecting computational models, collecting and evaluating data and assessing the importance of various failure types, and for developing effective defences against common-cause failure. The relationships of this and several other structures are described

  3. A survey of common habits of computer users as indicators of ...

    African Journals Online (AJOL)

    Other unhealthy practices found among computer users included eating (52.1%), drinking (56%), and coughing, sneezing and scratching of the head (48.2%). Since microorganisms can be transferred through contact, droplets or airborne routes, it follows that these habits exhibited by users may act as sources of bacteria on keyboards ...

  4. Study of the flying ability of Rhynchophorus ferrugineus (Coleoptera: Dryophthoridae) adults using a computer-monitored flight mill.

    Science.gov (United States)

    Ávalos, J A; Martí-Campoy, A; Soto, A

    2014-08-01

    The red palm weevil, Rhynchophorus ferrugineus (Olivier) (Coleoptera: Dryophthoridae), native to tropical Asian regions, has become a serious threat to palm trees all over the world. Knowledge of its flight potential is vital to improving the preventive and curative measures currently used to manage this pest. As R. ferrugineus is a quarantine pest, it is difficult to study its flight potential in the field. A computer-monitored flight mill was adapted to analyse the flying ability of R. ferrugineus through the study of different flight parameters (number of flights, total distance flown, longest single flight, flight duration, and average and maximum speed) and the influence of the weevil's sex, age, and body size on these flight parameters. Despite significant differences in the adult body size (body weight and length) of males and females, the sex of R. ferrugineus adults did not have an influence on their flight potential. Neither adult body size nor age was found to affect the weevil's flying abilities, although there was a significantly higher percentage of individuals flying that were 8-23 days old than 1-7 days old. Based on the longest single flight, 54% of the insects were classified as short-distance flyers, with the remainder classified as medium- or long-distance flyers (the latter covering more than 5000 m). The results are compared with similar studies on different insect species under laboratory and field conditions.

  5. Algorithms for solving common fixed point problems

    CERN Document Server

    Zaslavski, Alexander J

    2018-01-01

    This book details approximate solutions to common fixed point problems and convex feasibility problems in the presence of perturbations. Convex feasibility problems search for a common point of a finite collection of subsets in a Hilbert space; common fixed point problems pursue a common fixed point of a finite collection of self-mappings in a Hilbert space. A variety of algorithms are considered in this book for solving both types of problems, the study of which has fueled a rapidly growing area of research. This monograph is timely and highlights the numerous applications to engineering, computed tomography, and radiation therapy planning. Totaling eight chapters, this book begins with an introduction to foundational material and moves on to examine iterative methods in metric spaces. The dynamic string-averaging methods for common fixed point problems in normed space are analyzed in Chapter 3. Dynamic string methods for common fixed point problems in a metric space are introduced and discussed in Chapter ...
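    As a minimal, hedged illustration of the convex feasibility setting described above, the sketch below runs cyclic projections onto two convex sets in the plane (the classical method, not the book's dynamic string-averaging algorithms; the sets are made up for the example).

```python
import numpy as np

def project_ball(x, center, radius):
    """Euclidean projection onto a closed ball."""
    d = x - center
    norm = np.linalg.norm(d)
    return x if norm <= radius else center + radius * d / norm

def project_halfspace(x, a, b):
    """Euclidean projection onto the halfspace {y : a.y <= b}."""
    violation = a @ x - b
    return x if violation <= 0 else x - violation * a / (a @ a)

def cyclic_projections(x0, steps=100):
    """Cyclic projections onto a unit ball and a halfspace; converges to a
    point in their intersection (a common point of the two sets)."""
    x = np.asarray(x0, dtype=float)
    center, radius = np.array([0.0, 0.0]), 1.0
    a, b = np.array([1.0, 1.0]), 1.0  # halfspace x + y <= 1
    for _ in range(steps):
        x = project_halfspace(project_ball(x, center, radius), a, b)
    return x

print(cyclic_projections([3.0, 2.0]))  # a point lying in both sets
```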

  6. CERN's common Unix and X terminal environment

    International Nuclear Information System (INIS)

    Cass, Tony

    1996-01-01

    The Desktop Infrastructure Group of CERN's Computing and Networks Division has developed a Common Unix and X Terminal Environment to ease the migration to Unix-based interactive computing. The CUTE architecture relies on a distributed filesystem - currently Transarc's AFS - to enable essentially interchangeable client workstations to access both home directory and program files transparently. Additionally, we provide a suite of programs to configure workstations for CUTE and to ensure continued compatibility. This paper describes the different components and the development of the CUTE architecture. (author)

  7. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  8. From computer to brain foundations of computational neuroscience

    CERN Document Server

    Lytton, William W

    2002-01-01

    Biology undergraduates, medical students and life-science graduate students often have limited mathematical skills. Similarly, physics, math and engineering students have little patience for the detailed facts that make up much of biological knowledge. Teaching computational neuroscience as an integrated discipline requires that both groups be brought forward onto common ground. This book does this by making ancillary material available in an appendix and providing basic explanations without becoming bogged down in unnecessary details. The book will be suitable for undergraduates and beginning graduate students taking a computational neuroscience course and also to anyone with an interest in the uses of the computer in modeling the nervous system.

  9. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  10. Detection of common bile duct stones: comparison between endoscopic ultrasonography, magnetic resonance cholangiography, and helical-computed-tomographic cholangiography

    International Nuclear Information System (INIS)

    Kondo, Shintaro; Isayama, Hiroyuki; Akahane, Masaaki; Toda, Nobuo; Sasahira, Naoki; Nakai, Yosuke; Yamamoto, Natsuyo; Hirano, Kenji; Komatsu, Yutaka; Tada, Minoru; Yoshida, Haruhiko; Kawabe, Takao; Ohtomo, Kuni; Omata, Masao

    2005-01-01

    Objectives: New modalities, namely, endoscopic ultrasonography (EUS), magnetic resonance cholangiopancreatography (MRCP), and helical computed-tomographic cholangiography (HCT-C), have been introduced recently for the detection of common bile duct (CBD) stones and shown improved detectability compared to conventional ultrasound or computed tomography. We conducted this study to compare the diagnostic ability of EUS, MRCP, and HCT-C in patients with suspected choledocholithiasis. Methods: Twenty-eight patients clinically suspected of having CBD stones were enrolled, excluding those with cholangitis or a definite history of choledocholithiasis. Each patient underwent EUS, MRCP, and HCT-C prior to endoscopic retrograde cholangio-pancreatography (ERCP), the result of which served as the diagnostic gold standard. Results: CBD stones were detected in 24 (86%) of 28 patients by ERCP/IDUS. The sensitivity of EUS, MRCP, and HCT-C was 100%, 88%, and 88%, respectively. False negative cases for MRCP and HCT-C had a CBD stone smaller than 5 mm in diameter. No serious complications occurred while one patient complained of itching in the eyelids after the infusion of contrast agent on HCT-C. Conclusions: When examination can be scheduled, MRCP or HCT-C will be the first choice because they were less invasive than EUS. MRCP and HCT-C had similar detectability but the former may be preferable considering the possibility of allergic reaction in the latter. When MRCP is negative, EUS is recommended to check for small CBD stones

  11. Motivating Contributions for Home Computer Security

    Science.gov (United States)

    Wash, Richard L.

    2009-01-01

    Recently, malicious computer users have been compromising computers en masse and combining them to form coordinated botnets. The rise of botnets has brought the problem of home computers to the forefront of security. Home computer users commonly have insecure systems; these users do not have the knowledge, experience, and skills necessary to…

  12. Experts' views on digital competence: commonalities and differences

    NARCIS (Netherlands)

    Janssen, José; Stoyanov, Slavi; Ferrari, Anusca; Punie, Yves; Pannekeet, Kees; Sloep, Peter

    2013-01-01

    Janssen, J., Stoyanov, S., Ferrari, A., Punie, Y., Pannekeet, K., & Sloep, P. B. (2013). Experts’ views on digital competence: commonalities and differences. Computers & Education, 68, 473–481. doi:10.1016/j.compedu.2013.06.008

  13. Clinical diagnosis and computer analysis of headache symptoms.

    OpenAIRE

    Drummond, P D; Lance, J W

    1984-01-01

    The headache histories obtained from clinical interviews of 600 patients were analysed by computer to see whether patients could be separated systematically into clinical categories and to see whether sets of symptoms commonly reported together differed in distribution among the categories. The computer classification procedure assigned 537 patients to the same category as their clinical diagnosis, the majority of discrepancies between clinical and computer classifications involving common mi...

  14. Highly reliable computer network for real time system

    International Nuclear Information System (INIS)

    Mohammed, F.A.; Omar, A.A.; Ayad, N.M.A.; Madkour, M.A.I.; Ibrahim, M.K.

    1988-01-01

    Computer networks have been studied with different trends regarding the network architecture and the various protocols that govern data transfers and guarantee reliable communication among all nodes. A hierarchical network structure has been proposed to provide a simple and inexpensive way to realize a reliable real-time computer network. In such an architecture, all computers in the same level are connected to a common serial channel through intelligent nodes that collectively control data transfers over the serial channel. This level of computer network can be considered a local area computer network (LACN) that can be used in a nuclear power plant control system, since such a plant has geographically dispersed subsystems. Network expansion would be straightforward, since each added computer (HOST) simply connects to the common channel. All the nodes are designed around a microprocessor chip to provide the required intelligence. The node can be divided into two sections, namely a common section that interfaces with the serial data channel and a private section that interfaces with the host computer. The latter part would naturally tend to have some variations in the hardware details to match the requirements of individual host computers. fig 7

  15. Validity of two methods to assess computer use: Self-report by questionnaire and computer use software

    NARCIS (Netherlands)

    Douwes, M.; Kraker, H.de; Blatter, B.M.

    2007-01-01

    A long duration of computer use is known to be positively associated with Work Related Upper Extremity Disorders (WRUED). Self-report by questionnaire is commonly used to assess a worker's duration of computer use. The aim of the present study was to assess the validity of self-report and computer

  16. New Mexico district work-effort analysis computer program

    Science.gov (United States)

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    6600 computer system. Central processing computer time has seldom exceeded 5 minutes on the longest year-to-date runs.

  17. Computational force, mass, and energy

    International Nuclear Information System (INIS)

    Numrich, R.W.

    1997-01-01

    This paper describes a correspondence between computational quantities commonly used to report computer performance measurements and mechanical quantities from classical Newtonian mechanics. It defines a set of three fundamental computational quantities that are sufficient to establish a system of computational measurement. From these quantities, it defines derived computational quantities that have analogous physical counterparts. These computational quantities obey three laws of motion in computational space. The solutions to the equations of motion, with appropriate boundary conditions, determine the computational mass of the computer. Computational forces, with magnitudes specific to each instruction and to each computer, overcome the inertia represented by this mass. The paper suggests normalizing the computational mass scale by picking the mass of a register on the CRAY-1 as the standard unit of mass

  18. Computer information systems framework

    International Nuclear Information System (INIS)

    Shahabuddin, S.

    1989-01-01

    Management information systems (MIS) is a commonly used term in the computer profession. New information technology has caused management to expect more from computers. The process of supplying information follows a well-defined procedure. An MIS should be capable of providing usable information to the various areas and levels of an organization. MIS is different from data processing. MIS and the business hierarchy provide a good framework for many organizations which are using computers. (A.B.)

  19. Computational approaches for discovery of common immunomodulators in fungal infections: towards broad-spectrum immunotherapeutic interventions.

    Science.gov (United States)

    Kidane, Yared H; Lawrence, Christopher; Murali, T M

    2013-10-07

    Fungi are the second most abundant type of human pathogens. Invasive fungal pathogens are leading causes of life-threatening infections in clinical settings. Toxicity to the host and drug-resistance are two major deleterious issues associated with existing antifungal agents. Increasing a host's tolerance and/or immunity to fungal pathogens has potential to alleviate these problems. A host's tolerance may be improved by modulating the immune system such that it responds more rapidly and robustly in all facets, ranging from the recognition of pathogens to their clearance from the host. An understanding of biological processes and genes that are perturbed during attempted fungal exposure, colonization, and/or invasion will help guide the identification of endogenous immunomodulators and/or small molecules that activate host-immune responses such as specialized adjuvants. In this study, we present computational techniques and approaches using publicly available transcriptional data sets, to predict immunomodulators that may act against multiple fungal pathogens. Our study analyzed data sets derived from host cells exposed to five fungal pathogens, namely, Alternaria alternata, Aspergillus fumigatus, Candida albicans, Pneumocystis jirovecii, and Stachybotrys chartarum. We observed statistically significant associations between host responses to A. fumigatus and C. albicans. Our analysis identified biological processes that were consistently perturbed by these two pathogens. These processes contained both immune response-inducing genes such as MALT1, SERPINE1, ICAM1, and IL8, and immune response-repressing genes such as DUSP8, DUSP6, and SPRED2. We hypothesize that these genes belong to a pool of common immunomodulators that can potentially be activated or suppressed (agonized or antagonized) in order to render the host more tolerant to infections caused by A. fumigatus and C. albicans. Our computational approaches and methodologies described here can now be applied to

  20. Computed tomographic findings of intracranial pyogenic abscess

    International Nuclear Information System (INIS)

    Kim, S. J.; Suh, J. H.; Park, C. Y.; Lee, K. C.; Chung, S. S.

    1982-01-01

    The early diagnosis and effective treatment of brain abscess pose a difficult clinical problem. With the advent of computed tomography, however, it appears that mortality due to intracranial abscess has significantly diminished. 54 cases of intracranial pyogenic abscess are presented. Etiologic factors and computed tomographic findings are analyzed and the following results are obtained. 1. The common etiologic factors are otitis media, post operation, and head trauma, in order of frequency. 2. The most common initial computed tomographic finding of brain abscess is ring contrast enhancement with surrounding brain edema. 3. The most characteristic form of ring contrast enhancement is a smooth, thin-walled ring. 4. Most cases of thick irregular ring contrast enhancement are abscesses associated with cyanotic heart disease or poor operation. 5. The most common finding of epidural and subdural empyema is a crescentic radiolucent area with thin-walled contrast enhancement, without surrounding brain edema, in the convexity of the brain.

  1. Observations of long-term tide-gauge records for indications of accelerated sea-level rise

    International Nuclear Information System (INIS)

    Gornitz, V.; Solow, A.

    1990-01-01

    Long-term tide-gauge records have been examined for indications of accelerated sea-level rise. An initial evaluation of 21 records was made, using least-squares linear regression. Four of the longest records were selected for more formal statistical analysis. The regional-mean sea-level curve for west-central Europe displays an upswing after 1900. However, individual stations vary considerably over short distances. Results from other regions are less conclusive. Application of univariate and multivariate techniques to the four longest records yields strong evidence for the presence of a non-global, nonlinear component. However, the three longest European records show some common nonlinear features, implying the presence of a regional component. There is some weak statistical evidence for a common changepoint around 1895 in the long-term European records, particularly for Amsterdam and Brest
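    The per-station screening step described above amounts to fitting a least-squares linear trend to each record; a minimal sketch with synthetic data (the station values below are invented for illustration) is:

```python
import numpy as np

def sea_level_trend(years: np.ndarray, level_mm: np.ndarray) -> float:
    """Least-squares linear trend (mm/yr) of an annual-mean tide-gauge record."""
    slope, _intercept = np.polyfit(years, level_mm, deg=1)
    return slope

# Synthetic illustration only: roughly 1.8 mm/yr rise plus noise
rng = np.random.default_rng(1)
years = np.arange(1900, 1990)
level = 1.8 * (years - 1900) + rng.normal(0, 15, size=years.size)
print(f"fitted trend: {sea_level_trend(years, level):.2f} mm/yr")
```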

  2. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.

  3. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.) [de

  4. Recent trends in grid computing

    International Nuclear Information System (INIS)

    Miura, Kenichi

    2004-01-01

    Grid computing is a technology which allows uniform and transparent access to geographically dispersed computational resources, such as computers, databases, and experimental and observational equipment, via high-speed, high-bandwidth networking. The commonly used analogy is that of the electrical power grid, whereby household electricity is made available from outlets on the wall, and little thought needs to be given to where the electricity is generated and how it is transmitted. The usage of grid also includes distributed parallel computing, high-throughput computing, data-intensive computing (data grid) and collaborative computing. This paper reviews the historical background, software structure, current status and on-going grid projects, including applications of grid technology to nuclear fusion research. (author)

  5. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem...... conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined...... by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  6. Women in computer science: An interpretative phenomenological analysis exploring common factors contributing to women's selection and persistence in computer science as an academic major

    Science.gov (United States)

    Thackeray, Lynn Roy

    The purpose of this study is to understand the meaning that women make of the social and cultural factors that influence their reasons for entering and remaining in study of computer science. The twenty-first century presents many new challenges in career development and workforce choices for both men and women. Information technology has become the driving force behind many areas of the economy. As this trend continues, it has become essential that U.S. citizens need to pursue a career in technologies, including the computing sciences. Although computer science is a very lucrative profession, many Americans, especially women, are not choosing it as a profession. Recent studies have shown no significant differences in math, technical and science competency between men and women. Therefore, other factors, such as social, cultural, and environmental influences seem to affect women's decisions in choosing an area of study and career choices. A phenomenological method of qualitative research was used in this study, based on interviews of seven female students who are currently enrolled in a post-secondary computer science program. Their narratives provided meaning into the social and cultural environments that contribute to their persistence in their technical studies, as well as identifying barriers and challenges that are faced by female students who choose to study computer science. It is hoped that the data collected from this study may provide recommendations for the recruiting, retention and support for women in computer science departments of U.S. colleges and universities, and thereby increase the numbers of women computer scientists in industry. Keywords: gender access, self-efficacy, culture, stereotypes, computer education, diversity.

  7. Cross-sectional anatomy, computed tomography and magnetic resonance imaging of the head of common dolphin (Delphinus delphis) and striped dolphin (Stenella coeruleoalba).

    Science.gov (United States)

    Alonso-Farré, J M; Gonzalo-Orden, M; Barreiro-Vázquez, J D; Barreiro-Lois, A; André, M; Morell, M; Llarena-Reino, M; Monreal-Pawlowsky, T; Degollada, E

    2015-02-01

    Computed tomography (CT) and low-field magnetic resonance imaging (MRI) were used to scan seven by-caught dolphin cadavers, belonging to two species: four common dolphins (Delphinus delphis) and three striped dolphins (Stenella coeruleoalba). CT and MRI were obtained with the animals in ventral recumbency. After the imaging procedures, six dolphins were frozen at -20°C and sliced in the same position they were examined. Not only CT and MRI scans, but also cross sections of the heads were obtained in three body planes: transverse (slices of 1 cm thickness) in three dolphins, sagittal (5 cm thickness) in two dolphins and dorsal (5 cm thickness) in two dolphins. Relevant anatomical structures were identified and labelled on each cross section, obtaining a comprehensive bi-dimensional topographical anatomy guide of the main features of the common and the striped dolphin head. Furthermore, the anatomical cross sections were compared with their corresponding CT and MRI images, allowing an imaging identification of most of the anatomical features. CT scans produced an excellent definition of the bony and air-filled structures, while MRI allowed us to successfully identify most of the soft tissue structures in the dolphin's head. This paper provides a detailed anatomical description of the head structures of common and striped dolphins and compares anatomical cross sections with CT and MRI scans, becoming a reference guide for the interpretation of imaging studies. © 2014 Blackwell Verlag GmbH.

  8. Indirection and computer security.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Michael J.

    2011-09-01

    The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.

  9. An Evaluation of Windows-Based Computer Forensics Application Software Running on a Macintosh

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2008-09-01

    Full Text Available The two most common computer forensics applications run exclusively on Microsoft Windows Operating Systems, yet contemporary computer forensics examinations frequently encounter one or more of the three most common operating system environments, namely Windows, OS-X, or some form of UNIX or Linux. Additionally, government and private computer forensics laboratories frequently encounter budget constraints that limit their access to computer hardware. Currently, Macintosh computer systems are marketed with the ability to accommodate these three common operating system environments, including Windows XP in native and virtual environments. We performed a series of experiments to measure the functionality and performance of the two most commonly used Windows-based computer forensics applications on a Macintosh running Windows XP in native mode and in two virtual environments relative to a similarly configured Dell personal computer. The research results are directly beneficial to practitioners, and the process illustrates affective pedagogy whereby students were engaged in applied research.

  10. Comparative evaluation of the cadaveric, radiographic and computed tomographic anatomy of the heads of green iguana (Iguana iguana) , common tegu ( Tupinambis merianae) and bearded dragon ( Pogona vitticeps)

    OpenAIRE

    Banzato, Tommaso; Selleri, Paolo; Veladiano, Irene A; Martin, Andrea; Zanetti, Emanuele; Zotti, Alessandro

    2012-01-01

    Abstract Background Radiology and computed tomography are the most commonly available diagnostic tools for the diagnosis of pathologies affecting the head and skull in veterinary practice. Nevertheless, accurate interpretation of radiographic and CT studies requires a thorough knowledge of the gross and the cross-sectional anatomy. Despite the increasing success of reptiles as pets, only a few reports on their normal imaging features are currently available. The aim of this study is to desc...

  11. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...

  12. Common-mask guided image reconstruction (c-MGIR) for enhanced 4D cone-beam computed tomography

    International Nuclear Information System (INIS)

    Park, Justin C; Li, Jonathan G; Liu, Chihray; Lu, Bo; Zhang, Hao; Chen, Yunmei; Fan, Qiyong

    2015-01-01

    Compared to 3D cone beam computed tomography (3D CBCT), the image quality of commercially available four-dimensional (4D) CBCT is severely impaired due to the insufficient amount of projection data available for each phase. Since the traditional Feldkamp-Davis-Kress (FDK)-based algorithm is infeasible for reconstructing high quality 4D CBCT images with limited projections, investigators have developed several compressed-sensing (CS) based algorithms to improve image quality. The aim of this study is to develop a novel algorithm which can provide better image quality than the FDK and other CS based algorithms with limited projections. We named this algorithm ‘the common mask guided image reconstruction’ (c-MGIR). In c-MGIR, the unknown CBCT volume is mathematically modeled as a combination of phase-specific motion vectors and phase-independent static vectors. The common-mask matrix, which is the key concept behind the c-MGIR algorithm, separates the common static part across all phase images from the possible moving part in each phase image. The moving part and the static part of the volumes were then alternately updated by solving two sub-minimization problems iteratively. As the novel mathematical transformation allows the static volume and moving volumes to be updated (during each iteration) with global projections and ‘well’ solved static volume respectively, the algorithm was able to reduce the noise and under-sampling artifact (an issue faced by other algorithms) to the maximum extent. To evaluate the performance of our proposed c-MGIR, we utilized imaging data from both numerical phantoms and a lung cancer patient. The qualities of the images reconstructed with c-MGIR were compared with (1) standard FDK algorithm, (2) conventional total variation (CTV) based algorithm, (3) prior image constrained compressed sensing (PICCS) algorithm, and (4) motion-map constrained image reconstruction (MCIR) algorithm, respectively. To improve the efficiency of the

  13. Common-mask guided image reconstruction (c-MGIR) for enhanced 4D cone-beam computed tomography.

    Science.gov (United States)

    Park, Justin C; Zhang, Hao; Chen, Yunmei; Fan, Qiyong; Li, Jonathan G; Liu, Chihray; Lu, Bo

    2015-12-07

    Compared to 3D cone beam computed tomography (3D CBCT), the image quality of commercially available four-dimensional (4D) CBCT is severely impaired due to the insufficient amount of projection data available for each phase. Since the traditional Feldkamp-Davis-Kress (FDK)-based algorithm is infeasible for reconstructing high quality 4D CBCT images with limited projections, investigators have developed several compressed-sensing (CS) based algorithms to improve image quality. The aim of this study is to develop a novel algorithm which can provide better image quality than the FDK and other CS based algorithms with limited projections. We named this algorithm 'the common mask guided image reconstruction' (c-MGIR). In c-MGIR, the unknown CBCT volume is mathematically modeled as a combination of phase-specific motion vectors and phase-independent static vectors. The common-mask matrix, which is the key concept behind the c-MGIR algorithm, separates the common static part across all phase images from the possible moving part in each phase image. The moving part and the static part of the volumes were then alternately updated by solving two sub-minimization problems iteratively. As the novel mathematical transformation allows the static volume and moving volumes to be updated (during each iteration) with global projections and the 'well'-solved static volume, respectively, the algorithm was able to reduce the noise and under-sampling artifacts (an issue faced by other algorithms) to the maximum extent. To evaluate the performance of our proposed c-MGIR, we utilized imaging data from both numerical phantoms and a lung cancer patient. The qualities of the images reconstructed with c-MGIR were compared with (1) the standard FDK algorithm, (2) the conventional total variation (CTV) based algorithm, (3) the prior image constrained compressed sensing (PICCS) algorithm, and (4) the motion-map constrained image reconstruction (MCIR) algorithm, respectively. To improve the efficiency of the algorithm

  14. High-Speed Computation of the Kleene Star in Max-Plus Algebraic System Using a Cell Broadband Engine

    Science.gov (United States)

    Goto, Hiroyuki

    This research addresses a high-speed computation method for the Kleene star of the weighted adjacency matrix in a max-plus algebraic system. We focus on systems whose precedence constraints are represented by a directed acyclic graph and implement it on a Cell Broadband Engine™ (CBE) processor. Since the resulting matrix gives the longest travel times between two adjacent nodes, it is often utilized in scheduling problem solvers for a class of discrete event systems. This research, in particular, attempts to achieve a speedup by using two approaches: parallelization and SIMDization (Single Instruction, Multiple Data), both of which can be accomplished by a CBE processor. The former refers to a parallel computation using multiple cores, while the latter is a method whereby multiple elements are computed by a single instruction. Using the implementation on a Sony PlayStation 3™ equipped with a CBE processor, we found that the SIMDization is effective regardless of the system's size and the number of processor cores used. We also found that the scalability of using multiple cores is remarkable especially for systems with a large number of nodes. In a numerical experiment where the number of nodes is 2000, we achieved a speedup of 20 times compared with the method without the above techniques.
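
    The record above describes the Kleene star of a weighted adjacency matrix in the max-plus algebra but gives no code. The following is a minimal, single-threaded NumPy sketch of that computation for a directed acyclic graph; it is an illustration only and does not reproduce the paper's CBE parallelization or SIMDization.

      import numpy as np

      NEG_INF = -np.inf   # the max-plus "zero" element (no edge)

      def maxplus_matmul(A, B):
          """Max-plus matrix product: C[i, j] = max_k (A[i, k] + B[k, j])."""
          return np.max(A[:, :, None] + B[None, :, :], axis=1)

      def kleene_star(A):
          """Kleene star A* = E (+) A (+) A^2 (+) ... for an acyclic weighted graph.

          A[i, j] is the weight of edge i -> j, or -inf if there is no edge.
          The (i, j) entry of the result is the longest travel time from i to j.
          """
          n = A.shape[0]
          E = np.full((n, n), NEG_INF)
          np.fill_diagonal(E, 0.0)          # max-plus identity matrix
          star, power = E.copy(), E.copy()
          for _ in range(n - 1):            # paths in a DAG have fewer than n edges
              power = maxplus_matmul(power, A)
              star = np.maximum(star, power)
          return star

      # Small 4-node DAG: 0->1 (3), 0->2 (2), 1->3 (4), 2->3 (6)
      A = np.full((4, 4), NEG_INF)
      A[0, 1], A[0, 2], A[1, 3], A[2, 3] = 3, 2, 4, 6
      print(kleene_star(A)[0, 3])           # 8.0, the longest 0 -> 3 travel time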

  15. An Evaluation of Windows-Based Computer Forensics Application Software Running on a Macintosh

    OpenAIRE

    Gregory H. Carlton

    2008-01-01

    The two most common computer forensics applications perform exclusively on Microsoft Windows Operating Systems, yet contemporary computer forensics examinations frequently encounter one or more of the three most common operating system environments, namely Windows, OS-X, or some form of UNIX or Linux. Additionally, government and private computer forensics laboratories frequently encounter budget constraints that limit their access to computer hardware. Currently, Macintosh computer systems a...

  16. Bridging the digital divide through the integration of computer and information technology in science education: An action research study

    Science.gov (United States)

    Brown, Gail Laverne

    The presence of a digital divide, computer and information technology integration effectiveness, and barriers to continued usage of computer and information technology were investigated. Thirty-four African American and Caucasian American students (17 males and 17 females) in grades 9--11 from 2 Georgia high school science classes were exposed to 30 hours of hands-on computer and information technology skills. The purpose of the exposure was to improve students' computer and information technology skills. Pre-study and post-study skills surveys, and structured interviews were used to compare race, gender, income, grade-level, and age differences with respect to computer usage. A paired t-test and McNemar test determined mean differences between student pre-study and post-study perceived skills levels. The results were consistent with findings of the National Telecommunications and Information Administration (2000) that indicated the presence of a digital divide and digital inclusion. Caucasian American participants were found to have more at-home computer and Internet access than African American participants, indicating that there is a digital divide by ethnicity. Caucasian American females were found to have more computer and Internet access which was an indication of digital inclusion. Sophomores had more at-home computer access and Internet access than other levels indicating digital inclusion. Students receiving regular meals had more computer and Internet access than students receiving free/reduced meals. Older students had more computer and Internet access than younger students. African American males had been using computer and information technology the longest which is an indication of inclusion. The paired t-test and McNemar test revealed significant perceived student increases in all skills levels. Interviews did not reveal any barriers to continued usage of the computer and information technology skills.

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08: During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files with a high writing speed to tape. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  19. Computational Investigation of a Boundary-Layer Ingesting Propulsion System for the Common Research Model

    Science.gov (United States)

    Blumenthal, Brennan T.; Elmiligui, Alaa; Geiselhart, Karl A.; Campbell, Richard L.; Maughmer, Mark D.; Schmitz, Sven

    2016-01-01

    The present paper examines potential propulsive and aerodynamic benefits of integrating a Boundary-Layer Ingestion (BLI) propulsion system into a typical commercial aircraft using the Common Research Model (CRM) geometry and the NASA Tetrahedral Unstructured Software System (TetrUSS). The Numerical Propulsion System Simulation (NPSS) environment is used to generate engine conditions for CFD analysis. Improvements to the BLI geometry are made using the Constrained Direct Iterative Surface Curvature (CDISC) design method. Previous studies have shown reductions of up to 25% in terms of propulsive power required for cruise for other axisymmetric geometries using the BLI concept. An analysis of engine power requirements, drag, and lift coefficients using the baseline and BLI geometries coupled with the NPSS model are shown. Potential benefits of the BLI system relating to cruise propulsive power are quantified using a power balance method, and a comparison to the baseline case is made. Iterations of the BLI geometric design are shown and any improvements between subsequent BLI designs presented. Simulations are conducted for a cruise flight condition of Mach 0.85 at an altitude of 38,500 feet and an angle of attack of 2 deg for all geometries. A comparison between available wind tunnel data, previous computational results, and the original CRM model is presented for model verification purposes along with full results for BLI power savings. Results indicate a 14.4% reduction in engine power requirements at cruise for the BLI configuration over the baseline geometry. Minor shaping of the aft portion of the fuselage using CDISC has been shown to increase the benefit from Boundary-Layer Ingestion further, resulting in a 15.6% reduction in power requirements for cruise as well as a drag reduction of eighteen counts over the baseline geometry.

  20. Computational Investigation of a Boundary-Layer Ingestion Propulsion System for the Common Research Model

    Science.gov (United States)

    Blumenthal, Brennan

    2016-01-01

    This thesis will examine potential propulsive and aerodynamic benefits of integrating a boundary-layer ingestion (BLI) propulsion system with a typical commercial aircraft using the Common Research Model geometry and the NASA Tetrahedral Unstructured Software System (TetrUSS). The Numerical Propulsion System Simulation (NPSS) environment will be used to generate engine conditions for CFD analysis. Improvements to the BLI geometry will be made using the Constrained Direct Iterative Surface Curvature (CDISC) design method. Previous studies have shown reductions of up to 25% in terms of propulsive power required for cruise for other axisymmetric geometries using the BLI concept. An analysis of engine power requirements, drag, and lift coefficients using the baseline and BLI geometries coupled with the NPSS model are shown. Potential benefits of the BLI system relating to cruise propulsive power are quantified using a power balance method and a comparison to the baseline case is made. Iterations of the BLI geometric design are shown and any improvements between subsequent BLI designs presented. Simulations are conducted for a cruise flight condition of Mach 0.85 at an altitude of 38,500 feet and an angle of attack of 2deg for all geometries. A comparison between available wind tunnel data, previous computational results, and the original CRM model is presented for model verification purposes along with full results for BLI power savings. Results indicate a 14.3% reduction in engine power requirements at cruise for the BLI configuration over the baseline geometry. Minor shaping of the aft portion of the fuselage using CDISC has been shown to increase the benefit from boundary-layer ingestion further, resulting in a 15.6% reduction in power requirements for cruise as well as a drag reduction of eighteen counts over the baseline geometry.

  1. Computer hardware for radiologists: Part 2

    International Nuclear Information System (INIS)

    Indrajit, IK; Alam, A

    2010-01-01

    Computers are an integral part of modern radiology equipment. In the first half of this two-part article, we dwelt upon some fundamental concepts regarding computer hardware, covering components like motherboard, central processing unit (CPU), chipset, random access memory (RAM), and memory modules. In this article, we describe the remaining computer hardware components that are of relevance to radiology. “Storage drive” is a term describing a “memory” hardware used to store data for later retrieval. Commonly used storage drives are hard drives, floppy drives, optical drives, flash drives, and network drives. The capacity of a hard drive is dependent on many factors, including the number of disk sides, number of tracks per side, number of sectors on each track, and the amount of data that can be stored in each sector. “Drive interfaces” connect hard drives and optical drives to a computer. The connections of such drives require both a power cable and a data cable. The four most popular “input/output devices” used commonly with computers are the printer, monitor, mouse, and keyboard. The “bus” is a built-in electronic signal pathway in the motherboard to permit efficient and uninterrupted data transfer. A motherboard can have several buses, including the system bus, the PCI express bus, the PCI bus, the AGP bus, and the (outdated) ISA bus. “Ports” are the location at which external devices are connected to a computer motherboard. All commonly used peripheral devices, such as printers, scanners, and portable drives, need ports. A working knowledge of computers is necessary for the radiologist if the workflow is to realize its full potential and, besides, this knowledge will prepare the radiologist for the coming innovations in the ‘ever increasing’ digital future
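
    As a small worked example of the capacity relationship described above, the sketch below multiplies the four factors named in the record; the specific numbers are hypothetical and chosen only to show the arithmetic.

      # Illustrative hard-drive capacity calculation:
      # capacity = sides x tracks/side x sectors/track x bytes/sector.
      # All figures below are hypothetical examples, not values from the article.
      sides             = 8          # number of disk sides (platter surfaces)
      tracks_per_side   = 16_383     # tracks on each side
      sectors_per_track = 63         # sectors on each track
      bytes_per_sector  = 512        # amount of data stored in each sector

      capacity_bytes = sides * tracks_per_side * sectors_per_track * bytes_per_sector
      print(f"{capacity_bytes / 10**9:.1f} GB")   # ~4.2 GB for these example values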

  2. Computer hardware for radiologists: Part 2

    Directory of Open Access Journals (Sweden)

    Indrajit I

    2010-01-01

    Full Text Available Computers are an integral part of modern radiology equipment. In the first half of this two-part article, we dwelt upon some fundamental concepts regarding computer hardware, covering components like motherboard, central processing unit (CPU), chipset, random access memory (RAM), and memory modules. In this article, we describe the remaining computer hardware components that are of relevance to radiology. "Storage drive" is a term describing a "memory" hardware used to store data for later retrieval. Commonly used storage drives are hard drives, floppy drives, optical drives, flash drives, and network drives. The capacity of a hard drive is dependent on many factors, including the number of disk sides, number of tracks per side, number of sectors on each track, and the amount of data that can be stored in each sector. "Drive interfaces" connect hard drives and optical drives to a computer. The connections of such drives require both a power cable and a data cable. The four most popular "input/output devices" used commonly with computers are the printer, monitor, mouse, and keyboard. The "bus" is a built-in electronic signal pathway in the motherboard to permit efficient and uninterrupted data transfer. A motherboard can have several buses, including the system bus, the PCI express bus, the PCI bus, the AGP bus, and the (outdated) ISA bus. "Ports" are the location at which external devices are connected to a computer motherboard. All commonly used peripheral devices, such as printers, scanners, and portable drives, need ports. A working knowledge of computers is necessary for the radiologist if the workflow is to realize its full potential and, besides, this knowledge will prepare the radiologist for the coming innovations in the 'ever increasing' digital future.

  3. Common findings and pseudolesions at computed tomography colonography: pictorial essay

    International Nuclear Information System (INIS)

    Atzingen, Augusto Castelli von; Tiferes, Dario Ariel; Matsumoto, Carlos Alberto; Nunes, Thiago Franchi; Maia, Marcos Vinicius Alvim Soares; D'Ippolito, Giuseppe

    2012-01-01

    Computed tomography colonography is a minimally invasive method for screening for polyps and colorectal cancer, with extremely infrequent complications, that is increasingly used in clinical practice. In the last decade, developments in bowel preparation, imaging, and the training of investigators have led to a significant increase in the method's sensitivity. Image interpretation is accomplished through a combined analysis of two-dimensional source images and several types of three-dimensional renderings, with sensitivity around 96% in the detection of lesions equal to or greater than 10 mm in size when analyzed by experienced radiologists. The present pictorial essay includes examples of diseases and pseudolesions most frequently observed in this type of imaging study. The authors present examples of flat and polypoid lesions, benign and malignant lesions, diverticular disease of the colon, among other conditions, as well as pseudolesions, including those related to inappropriate bowel preparation and misinterpretation. (author)

  4. Common findings and pseudolesions at computed tomography colonography: pictorial essay

    Energy Technology Data Exchange (ETDEWEB)

    Atzingen, Augusto Castelli von [Clinical Radiology, Universidade Federal de Sao Paulo (UNIFESP), Sao Paulo, SP (Brazil); Tiferes, Dario Ariel; Matsumoto, Carlos Alberto; Nunes, Thiago Franchi; Maia, Marcos Vinicius Alvim Soares [Abdominal Imaging Section, Department of Imaging Diagnosis - Universidade Federal de Sao Paulo (UNIFESP), Sao Paulo, SP (Brazil); D' Ippolito, Giuseppe, E-mail: giuseppe_dr@uol.com.br [Department of Imaging Diagnosis, Universidade Federal de Sao Paulo (UNIFESP), Sao Paulo, SP (Brazil)

    2012-05-15

    Computed tomography colonography is a minimally invasive method for screening for polyps and colorectal cancer, with extremely infrequent complications, that is increasingly used in clinical practice. In the last decade, developments in bowel preparation, imaging, and the training of investigators have led to a significant increase in the method's sensitivity. Image interpretation is accomplished through a combined analysis of two-dimensional source images and several types of three-dimensional renderings, with sensitivity around 96% in the detection of lesions equal to or greater than 10 mm in size when analyzed by experienced radiologists. The present pictorial essay includes examples of diseases and pseudolesions most frequently observed in this type of imaging study. The authors present examples of flat and polypoid lesions, benign and malignant lesions, diverticular disease of the colon, among other conditions, as well as pseudolesions, including those related to inappropriate bowel preparation and misinterpretation. (author)

  5. Computational Design of Urban Layouts

    KAUST Repository

    Wonka, Peter

    2015-10-07

    A fundamental challenge in computational design is to compute layouts by arranging a set of shapes. In this talk I will present recent urban modeling projects with applications in computer graphics, urban planning, and architecture. The talk will look at different scales of urban modeling (streets, floorplans, parcels). A common challenge in all these modeling problems is the functional and aesthetic constraints that should be respected. The talk also highlights interesting links to geometry processing problems, such as field design and quad meshing.

  6. Common modelling approaches for training simulators for nuclear power plants

    International Nuclear Information System (INIS)

    1990-02-01

    Training simulators for nuclear power plant operating staff have gained increasing importance over the last twenty years. One of the recommendations of the 1983 IAEA Specialists' Meeting on Nuclear Power Plant Training Simulators in Helsinki was to organize a Co-ordinated Research Programme (CRP) on some aspects of training simulators. The goal statement was: ''To establish and maintain a common approach to modelling for nuclear training simulators based on defined training requirements''. Before adapting this goal statement, the participants considered many alternatives for defining the common aspects of training simulator models, such as the programming language used, the nature of the simulator computer system, the size of the simulation computers, the scope of simulation. The participants agreed that it was the training requirements that defined the need for a simulator, the scope of models and hence the type of computer complex that was required, the criteria for fidelity and verification, and was therefore the most appropriate basis for the commonality of modelling approaches. It should be noted that the Co-ordinated Research Programme was restricted, for a variety of reasons, to consider only a few aspects of training simulators. This report reflects these limitations, and covers only the topics considered within the scope of the programme. The information in this document is intended as an aid for operating organizations to identify possible modelling approaches for training simulators for nuclear power plants. 33 refs

  7. New results on classical problems in computational geometry in the plane

    DEFF Research Database (Denmark)

    Abrahamsen, Mikkel

    In this thesis, we revisit three classical problems in computational geometry in the plane. An obstacle that often occurs as a subproblem in more complicated problems is to compute the common tangents of two disjoint, simple polygons. For instance, the common tangents turn up in problems related to visibility, collision avoidance, shortest paths, etc. We provide a remarkably simple algorithm to compute all (at most four) common tangents of two disjoint simple polygons. Given each polygon as a read-only array of its corners in cyclic order, the algorithm runs in linear time and constant workspace and is the first to achieve the two complexity bounds simultaneously. The set of common tangents provides basic information about the convex hulls of the polygons—whether they are nested, overlapping, or disjoint—and our algorithm thus also decides this relationship. One of the best-known problems in computational...
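
    The linear-time, constant-workspace algorithm described above is not reproduced here. The sketch below is only a quadratic-time brute force for two disjoint convex polygons, included to make the notion of a common tangent concrete: a line through one vertex of each polygon that keeps each polygon entirely within one closed half-plane.

      from itertools import product

      def side(p, q, r):
          """Sign of the cross product (q - p) x (r - p): >0 left of pq, <0 right, 0 on the line."""
          return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

      def common_tangents(P, Q):
          """Brute-force common tangents of two disjoint convex polygons (vertex lists).

          Returns vertex pairs (p, q) such that the line through p and q has all of P
          on one side and all of Q on one side (outer tangent: same side; inner
          tangent: opposite sides).
          """
          tangents = []
          for p, q in product(P, Q):
              signs_P = {1 if side(p, q, r) > 0 else -1 for r in P if side(p, q, r) != 0}
              signs_Q = {1 if side(p, q, r) > 0 else -1 for r in Q if side(p, q, r) != 0}
              if len(signs_P) <= 1 and len(signs_Q) <= 1:   # each polygon on one side
                  tangents.append((p, q))
          return tangents

      # Two disjoint convex quadrilaterals, corners in cyclic order.
      P = [(0, 0), (2, 0), (2, 2), (0, 2)]
      Q = [(5, 1), (7, 1), (7, 3), (5, 3)]
      for p, q in common_tangents(P, Q):
          print(p, q)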

  8. SEAL: Common Core Libraries and Services for LHC Applications

    CERN Document Server

    Generowicz, J; Moneta, L; Roiser, S; Marino, M; Tuura, L A

    2003-01-01

    The CERN LHC experiments have begun the LHC Computing Grid project in 2001. One of the project's aims is to develop common software infrastructure based on a development vision shared by the participating experiments. The SEAL project will provide common foundation libraries, services and utilities identified by the project's architecture blueprint report. This requires a broad range of functionality that no individual package suitably covers. SEAL thus selects external and experiment-developed packages, integrates them in a coherent whole, develops new code for missing functionality, and provides support to the experiments. We describe the set of basic components identified by the LHC Computing Grid project and thought to be sufficient for development of higher level framework components and specializations. Examples of such components are a plug-in manager, an object dictionary, object whiteboards, an incident or event manager. We present the design and implementation of some of these components and the und...

  9. And the beat goes on: maintained cardiovascular function during aging in the longest-lived rodent, the naked mole-rat.

    Science.gov (United States)

    Grimes, Kelly M; Reddy, Anilkumar K; Lindsey, Merry L; Buffenstein, Rochelle

    2014-08-01

    The naked mole-rat (NMR) is the longest-lived rodent known, with a maximum lifespan potential (MLSP) of >31 years. Despite such extreme longevity, these animals display attenuation of many age-associated diseases and functional changes until the last quartile of their MLSP. We questioned if such abilities would extend to cardiovascular function and structure in this species. To test this, we assessed cardiac functional reserve, ventricular morphology, and arterial stiffening in NMRs ranging from 2 to 24 years of age. Dobutamine echocardiography (3 μg/g ip) revealed no age-associated changes in left ventricular (LV) function either at baseline or with exercise-like stress. Baseline and dobutamine-induced LV pressure parameters also did not change. Thus the NMR, unlike other mammals, maintains cardiac reserve with age. NMRs showed no cardiac hypertrophy, evidenced by no increase in cardiomyocyte cross-sectional area or LV dimensions with age. Age-associated arterial stiffening does not occur since there are no changes in aortic blood pressures or pulse-wave velocity. Only LV interstitial collagen deposition increased 2.5-fold from young to old NMRs (P < 0.01). However, its effect on LV diastolic function is likely minor since NMRs experience attenuated age-related increases in diastolic dysfunction in comparison with other species. Overall, these findings conform to the negligible senescence phenotype, as NMRs largely stave off cardiovascular changes for at least 75% of their MLSP. This suggests that using a comparative strategy to find factors that change with age in other mammals but not NMRs could provide novel targets to slow or prevent cardiovascular aging in humans.

  10. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them to run large scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications, and to utilize the power of Graphics Processing Units (GPU). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational chunks. A relational database system is utilized for data connection and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large scale hydrological simulations and model runs in an open and integrated environment.

  11. Common spatial pattern combined with kernel linear discriminate and generalized radial basis function for motor imagery-based brain computer interface applications

    Science.gov (United States)

    Hekmatmanesh, Amin; Jamaloo, Fatemeh; Wu, Huapeng; Handroos, Heikki; Kilpeläinen, Asko

    2018-04-01

    Brain-computer interfaces (BCIs) pose a challenge for the development of robotic, prosthetic, and human-controlled systems. This work focuses on the implementation of a common spatial pattern (CSP) based algorithm to detect event-related desynchronization patterns. Building on well-known previous work in this area, features are extracted with the filter bank common spatial pattern (FBCSP) method and then weighted by a sensitive learning vector quantization (SLVQ) algorithm. In the current work, applying the radial basis function (RBF) as the mapping kernel of a kernel linear discriminant analysis (KLDA) method to the weighted features transfers the data into a higher-dimensional space in which the classes are more clearly separated. Afterwards, a support vector machine (SVM) with a generalized radial basis function (GRBF) kernel is employed to improve the efficiency and robustness of the classification. On average, 89.60% accuracy and 74.19% robustness are achieved. The BCI Competition III data set IVa is used to evaluate the algorithm for detecting right-hand and foot motor imagery patterns. Results show that combining KLDA with the SVM-GRBF classifier yields improvements of 8.9% and 14.19% in accuracy and robustness, respectively. For all subjects, it is concluded that mapping the CSP features into a higher dimension by RBF and using GRBF as the SVM kernel improve the accuracy and reliability of the proposed method.
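
    The record's exact FBCSP/SLVQ/KLDA/GRBF pipeline is not reproduced here. The sketch below only illustrates the two core ingredients the abstract builds on: CSP spatial filters obtained from a generalized eigendecomposition of the two class covariance matrices, followed by an RBF-kernel SVM on log-variance features. SciPy and scikit-learn are assumed available, and the data are random stand-ins for band-pass filtered motor-imagery EEG epochs.

      import numpy as np
      from scipy.linalg import eigh
      from sklearn.svm import SVC

      def csp_filters(trials_a, trials_b, n_pairs=3):
          """Common spatial pattern filters from two classes of EEG trials.

          trials_*: arrays of shape (n_trials, n_channels, n_samples).
          Returns a (2 * n_pairs, n_channels) matrix of spatial filters.
          """
          def mean_cov(trials):
              return np.mean([np.cov(t) for t in trials], axis=0)

          Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
          vals, vecs = eigh(Ca, Ca + Cb)          # generalized eigenproblem
          order = np.argsort(vals)                # ascending eigenvalues
          picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
          return vecs[:, picks].T

      def log_var_features(trials, W):
          """Project each trial onto the CSP filters and take normalized log-variance."""
          feats = []
          for t in trials:
              var = np.var(W @ t, axis=1)
              feats.append(np.log(var / var.sum()))
          return np.array(feats)

      # Toy data standing in for two motor-imagery classes (e.g. right hand vs. foot).
      rng = np.random.default_rng(0)
      trials_a = rng.standard_normal((30, 8, 250))
      trials_b = 1.5 * rng.standard_normal((30, 8, 250))

      W = csp_filters(trials_a, trials_b)
      X = np.vstack([log_var_features(trials_a, W), log_var_features(trials_b, W)])
      y = np.array([0] * 30 + [1] * 30)

      clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)   # RBF-kernel SVM
      print("training accuracy:", clf.score(X, y))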

  12. Preschool Cookbook of Computer Programming Topics

    Science.gov (United States)

    Morgado, Leonel; Cruz, Maria; Kahn, Ken

    2010-01-01

    A common problem in computer programming use for education in general, not simply as a technical skill, is that children and teachers find themselves constrained by what is possible through limited expertise in computer programming techniques. This is particularly noticeable at the preliterate level, where constructs tend to be limited to…

  13. A SURVEY ON UBIQUITOUS COMPUTING

    Directory of Open Access Journals (Sweden)

    Vishal Meshram

    2016-01-01

    Full Text Available This work presents a survey of ubiquitous computing research, the emerging domain that integrates communication technologies into day-to-day life activities. This research paper provides a classification of the research areas within the ubiquitous computing paradigm. In this paper, we present common architecture principles of ubiquitous systems and analyze important aspects of context-aware ubiquitous systems. In addition, this research work presents a novel architecture for a ubiquitous computing system and a survey of sensors needed for applications in ubiquitous computing. The goals of this research work are three-fold: (i) serve as a guideline for researchers who are new to ubiquitous computing and want to contribute to this research area, (ii) provide a novel system architecture for ubiquitous computing systems, and (iii) provide further research directions for the quality-of-service assurance of ubiquitous computing.

  14. Promising role of single photon emission computed tomography/computed tomography in Meckel's scan

    International Nuclear Information System (INIS)

    Jain, Anurag; Chauhan, MS; Pandit, AG; Kumar, Rajeev; Sharma, Amit

    2012-01-01

    Meckel's scan is a common procedure performed in nuclear medicine. Single-photon emission computed tomography/computed tomography (SPECT/CT) in a suspected case of heterotopic location of gastric mucosa can increase the accuracy of its anatomic localization. We present two suspected cases of Meckel's diverticulum in which SPECT/CT co-registration has helped in better localization of the pathology.

  15. Context-aware computing and self-managing systems

    CERN Document Server

    Dargie, Waltenegus

    2009-01-01

    Bringing together an extensively researched area with an emerging research issue, Context-Aware Computing and Self-Managing Systems presents the core contributions of context-aware computing in the development of self-managing systems, including devices, applications, middleware, and networks. The expert contributors reveal the usefulness of context-aware computing in developing autonomous systems that have practical application in the real world.The first chapter of the book identifies features that are common to both context-aware computing and autonomous computing. It offers a basic definit

  16. Where is the most common site of DVT? Evaluation by CT venography

    International Nuclear Information System (INIS)

    Yoshimura, Norihiko; Hori, Yoshiro; Horii, Yosuke; Takano, Toru; Ishikawa, Hiroyuki; Aoyama, Hidefumi

    2012-01-01

    Our aim was to clarify the common site of deep venous thrombosis (DVT) in patients suspected of having pulmonary embolism using computed tomography pulmonary angiography with computed tomography venography (CTV). We evaluated 215 patients. For all studies, 100 ml of 370 mg I/ml nonionic contrast material was administered. CTV were scanned with helical acquisition starting at 3 min in four-slice multidetector-row computed tomography (MDCT) or 5 min in 64-MDCT after the start of contrast material injection. The site of DVT was divided into iliac vein, femoral vein, popliteal vein, or calf vein. Calf vein was divided into muscular (soleal and gastrocnemius) and nonmuscular (anterior/posterior tibial and peroneal) veins. The 2 x 2 chi-square test was used. One hundred and thirty-seven patients showed DVT; the muscular calf vein was more prevalent than other veins (P<0.01). Our study showed that the most common site of DVT was the muscular calf vein. (author)

  17. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... the limitations of CT Scanning of the Head? What is CT Scanning of the Head? Computed tomography, ... than regular radiographs (x-rays). What are some common uses of the procedure? CT ...

  18. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... of the Head? Computed tomography, more commonly known as a CT or CAT scan, is a diagnostic ... white on the x-ray; soft tissue, such as organs like the heart or liver, shows up ...

  19. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... of the Sinuses? Computed tomography, more commonly known as a CT or CAT scan, is a diagnostic ... white on the x-ray; soft tissue, such as organs like the heart or liver, shows up ...

  20. Common Nearly Best Linear Estimates of Location and Scale ...

    African Journals Online (AJOL)

    Common nearly best linear estimates of location and scale parameters of normal and logistic distributions, which are based on complete samples, are considered. Here, the population from which the samples are drawn is either normal or logistic population or a fusion of both distributions and the estimates are computed ...

  1. Cardiology office computer use: primer, pointers, pitfalls.

    Science.gov (United States)

    Shepard, R B; Blum, R I

    1986-10-01

    An office computer is a utility, like an automobile, with benefits and costs that are both direct and hidden and potential for disaster. For the cardiologist or cardiovascular surgeon, the increasing power and decreasing costs of computer hardware and the availability of software make use of an office computer system an increasingly attractive possibility. Management of office business functions is common; handling and scientific analysis of practice medical information are less common. The cardiologist can also access national medical information systems for literature searches and for interactive further education. Selection and testing of programs and the entire computer system before purchase of computer hardware will reduce the chances of disappointment or serious problems. Personnel pretraining and planning for office information flow and medical information security are necessary. Some cardiologists design their own office systems, buy hardware and software as needed, write programs for themselves and carry out the implementation themselves. For most cardiologists, the better course will be to take advantage of the professional experience of expert advisors. This article provides a starting point from which the practicing cardiologist can approach considering, specifying or implementing an office computer system for business functions and for scientific analysis of practice results.

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  3. Elementary software for the hand lens identification of some common iranian woods

    Science.gov (United States)

    Vahidreza Safdari; Margaret S. Devall

    2009-01-01

    A computer program, “Hyrcania”, has been developed for identifying some common woods (26 hardwoods and 6 softwoods) from the Hyrcanian forest type of Iran. The program has been written in JavaScript and is usable with computers as well as mobile phones. The databases use anatomical characteristics (visible with a hand lens) and wood colour, and can be searched in...

  4. Physical computation and cognitive science

    CERN Document Server

    Fresco, Nir

    2014-01-01

    This book presents a study of digital computation in contemporary cognitive science. Digital computation is a highly ambiguous concept, as there is no common core definition for it in cognitive science. Since this concept plays a central role in cognitive theory, an adequate cognitive explanation requires an explicit account of digital computation. More specifically, it requires an account of how digital computation is implemented in physical systems. The main challenge is to deliver an account encompassing the multiple types of existing models of computation without ending up in pancomputationalism, that is, the view that every physical system is a digital computing system. This book shows that only two accounts, among the ones examined by the author, are adequate for explaining physical computation. One of them is the instructional information processing account, which is developed here for the first time.   “This book provides a thorough and timely analysis of differing accounts of computation while adv...

  5. Research on the Classification Teaching of Common Computer Elective Courses

    Institute of Scientific and Technical Information of China (English)

    曾显峰; 张钰莎

    2012-01-01

    Through an example-based analysis of the current model of common computer elective courses in universities, this paper proposes classified teaching oriented toward the students' majors. The introductory computer content is divided into modules, from which different majors selectively learn; in the subsequent stage, major-oriented course groups are established for the follow-up elective courses, together with flexible teaching and assessment methods, so that the common computer courses better serve the individual majors.

  6. Open Data as a New Commons

    DEFF Research Database (Denmark)

    Morelli, Nicola; Mulder, Ingrid; Concilio, Grazia

    2017-01-01

    and environmental opportunities around them and government choices. Developing spaces and means for enabling citizens to harness the opportunities coming from the use of this new resource thus offers a substantial promise of social innovation. This means that open data is (still) virtually a new resource that could... become a new commons with the engagement of interested and active communities. The condition for open data becoming a new commons is that citizens become aware of the potential of this resource, that they use it for creating new services, and that new practices and infrastructures are defined, that would... An increasing computing capability is raising the opportunities to use a large amount of publicly available data for creating new applications and a new generation of public services. But while it is easy to find some early examples of services concerning control systems (e.g. traffic, meteo...

  7. Structure and dynamics of amorphous polymers: computer simulations compared to experiment and theory

    International Nuclear Information System (INIS)

    Paul, Wolfgang; Smith, Grant D

    2004-01-01

    This contribution considers recent developments in the computer modelling of amorphous polymeric materials. Progress in our capabilities to build models for the computer simulation of polymers from the detailed atomistic scale up to coarse-grained mesoscopic models, together with the ever-improving performance of computers, have led to important insights from computer simulations into the structural and dynamic properties of amorphous polymers. Structurally, chain connectivity introduces a range of length scales from that of the chemical bond to the radius of gyration of the polymer chain covering 2-4 orders of magnitude. Dynamically, this range of length scales translates into an even larger range of time scales observable in relaxation processes in amorphous polymers ranging from about 10⁻¹³ to 10⁻³ s or even to 10³ s when glass dynamics is concerned. There is currently no single simulation technique that is able to describe all these length and time scales efficiently. On large length and time scales basic topology and entropy become the governing properties and this fact can be exploited using computer simulations of coarse-grained polymer models to study universal aspects of the structure and dynamics of amorphous polymers. On the largest length and time scales chain connectivity is the dominating factor leading to the strong increase in longest relaxation times described within the reptation theory of polymer melt dynamics. Recently, many of the universal aspects of this behaviour have been further elucidated by computer simulations of coarse-grained polymer models. On short length scales the detailed chemistry and energetics of the polymer are important, and one has to be able to capture them correctly using chemically realistic modelling of specific polymers, even when the aim is to extract generic physical behaviour exhibited by the specific chemistry. Detailed studies of chemically realistic models highlight the central importance of torsional dynamics

  8. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common form of simulators of computer systems are software-based running on standard computers. One promising approach to improve simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches of using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  9. Mathematics and Computer Science: The Interplay

    OpenAIRE

    Madhavan, Veni CE

    2005-01-01

    Mathematics has been an important intellectual preoccupation of man for a long time. Computer science as a formal discipline is about seven decades young. However, one thing in common between all users and producers of mathematical thought is the almost involuntary use of computing. In this article, we bring to fore the many close connections and parallels between the two sciences of mathematics and computing. We show that, unlike in the other branches of human inquiry where mathematics is me...

  10. Common primary headaches in pregnancy

    Directory of Open Access Journals (Sweden)

    Anuradha Mitra

    2015-01-01

    Full Text Available Headache is a very common problem in pregnancy. Evaluation of a complaint of headache requires categorizing it as primary or secondary. Migrainous headaches are known to be influenced by fluctuations of estrogen levels, with high levels improving the symptoms and low levels deteriorating them. Tension-type headaches (TTHs) are the most common and usually less severe type of headache, with a female to male ratio of 3:1. Women known to have primary headache before conception who present with a headache that is different from their usual headache, or women not known to have primary headache before conception who present with new-onset headache during pregnancy, need neurologic assessment for a potential secondary cause of their headache. In addition to a proper history and physical examination, both non-contrast computed tomography (CT) and Magnetic Resonance Imaging (MRI) are considered safe to be performed in pregnant women when indicated. Both abortive and prophylactic therapy should include non-pharmacologic tools and the judicious use of drugs that are safe for mother and fetus.

  11. Improved algorithms for approximate string matching (extended abstract)

    Directory of Open Access Journals (Sweden)

    Papamichail Georgios

    2009-01-01

    Full Text Available Abstract Background The problem of approximate string matching is important in many different areas such as computational biology, text processing and pattern recognition. A great effort has been made to design efficient algorithms addressing several variants of the problem, including comparison of two strings, approximate pattern identification in a string or calculation of the longest common subsequence that two strings share. Results We designed an output sensitive algorithm solving the edit distance problem between two strings of lengths n and m respectively in time O((s - |n - m|)·min(m, n, s) + m + n) and linear space, where s is the edit distance between the two strings. This worst-case time bound sets the quadratic factor of the algorithm independent of the longest string length and improves existing theoretical bounds for this problem. The implementation of our algorithm also excels in practice, especially in cases where the two strings compared differ significantly in length. Conclusion We have provided the design, analysis and implementation of a new algorithm for calculating the edit distance of two strings with both theoretical and practical implications. Source code of our algorithm is available online.
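
    The authors' output-sensitive algorithm itself is not reproduced here. As a reference point for the quantity s in the bound above, the sketch below is just the classic dynamic program for the edit distance, written with two rows so that it uses linear space.

      def edit_distance(a, b):
          """Classic O(n*m)-time, linear-space dynamic program for the edit distance.

          This is a baseline illustration, not the output-sensitive algorithm of the
          record; its return value is the quantity s appearing in the stated bound.
          """
          if len(a) < len(b):
              a, b = b, a                          # keep the inner loop over the shorter string
          previous = list(range(len(b) + 1))
          for i, ca in enumerate(a, start=1):
              current = [i]
              for j, cb in enumerate(b, start=1):
                  current.append(min(
                      previous[j] + 1,                 # deletion
                      current[j - 1] + 1,              # insertion
                      previous[j - 1] + (ca != cb),    # substitution (or free match)
                  ))
              previous = current
          return previous[-1]

      print(edit_distance("kitten", "sitting"))   # 3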

  12. Efficient heuristics for maximum common substructure search.

    Science.gov (United States)

    Englert, Péter; Kovács, Péter

    2015-05-26

    Maximum common substructure search is a computationally hard optimization problem with diverse applications in the field of cheminformatics, including similarity search, lead optimization, molecule alignment, and clustering. Most of these applications have strict constraints on running time, so heuristic methods are often preferred. However, the development of an algorithm that is both fast enough and accurate enough for most practical purposes is still a challenge. Moreover, in some applications, the quality of a common substructure depends not only on its size but also on various topological features of the one-to-one atom correspondence it defines. Two state-of-the-art heuristic algorithms for finding maximum common substructures have been implemented at ChemAxon Ltd., and effective heuristics have been developed to improve both their efficiency and the relevance of the atom mappings they provide. The implementations have been thoroughly evaluated and compared with existing solutions (KCOMBU and Indigo). The heuristics have been found to greatly improve the performance and applicability of the algorithms. The purpose of this paper is to introduce the applied methods and present the experimental results.

  13. Free Boomerang-shaped Extended Rectus Abdominis Myocutaneous flap: The longest possible skin/myocutaneous free flap for soft tissue reconstruction of extremities.

    Science.gov (United States)

    Koul, Ashok R; Nahar, Sushil; Prabhu, Jagdish; Kale, Subhash M; Kumar, Praveen H P

    2011-09-01

    A soft tissue defect requiring flap cover which is longer than that provided by the conventional "long" free flaps like latissimus dorsi (LD) and anterolateral thigh (ALT) flap is a challenging problem. Often, in such a situation, a combination of flaps is required. Over the last 3 years, we have managed nine such defects successfully with a free "Boomerang-shaped" Extended Rectus Abdominis Myocutaneous (BERAM) flap. This flap is the slightly modified and "free" version of a similar flap described by Ian Taylor in 1983. This is a retrospective study of patients who underwent free BERAM flap reconstruction of soft tissue defects of extremity over the last 3 years. We also did a clinical study on 30 volunteers to compare the length of flap available using our design of BERAM flap with the maximum available flap length of LD and ALT flaps, using standard markings. Our clinical experience of nine cases combined with the results of our clinical study has confirmed that our design of BERAM flap consistently provides a flap length which is 32.6% longer than the standard LD flap and 42.2% longer than the standard ALT flap in adults. The difference is even more marked in children. The BERAM flap is consistently reliable as long as the distal end is not extended beyond the mid-axillary line. BERAM flap is simple in design, easy to harvest, reliable and provides the longest possible free skin/myocutaneous flap in the body. It is a useful new alternative for covering long soft tissue defects in the limbs.

  14. Introduction to computer networking

    CERN Document Server

    Robertazzi, Thomas G

    2017-01-01

    This book gives a broad look at both fundamental networking technology and new areas that support it and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology such as wired and wireless networks, enabling technologies such as data centers, software defined networking, cloud and grid computing and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment make this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  15. Comparison of four classification methods for brain-computer interface

    Czech Academy of Sciences Publication Activity Database

    Frolov, A.; Húsek, Dušan; Bobrov, P.

    2011-01-01

    Roč. 21, č. 2 (2011), s. 101-115 ISSN 1210-0552 R&D Projects: GA MŠk(CZ) 1M0567; GA ČR GA201/05/0079; GA ČR GAP202/10/0262 Institutional research plan: CEZ:AV0Z10300504 Keywords : brain computer interface * motor imagery * visual imagery * EEG pattern classification * Bayesian classification * Common Spatial Patterns * Common Tensor Discriminant Analysis Subject RIV: IN - Informatics, Computer Science Impact factor: 0.646, year: 2011

  16. Neuroscience-inspired computational systems for speech recognition under noisy conditions

    Science.gov (United States)

    Schafer, Phillip B.

    advantage of the neural representation's invariance in noise. The scheme centers on a speech similarity measure based on the longest common subsequence between spike sequences. The combined encoding and decoding scheme outperforms a benchmark system in extremely noisy acoustic conditions. Finally, I consider methods for decoding spike representations of continuous speech. To help guide the alignment of templates to words, I design a syllable detection scheme that robustly marks the locations of syllabic nuclei. The scheme combines SVM-based training with a peak selection algorithm designed to improve noise tolerance. By incorporating syllable information into the ASR system, I obtain strong recognition results in noisy conditions, although the performance in noiseless conditions is below the state of the art. The work presented here constitutes a novel approach to the problem of ASR that can be applied in the many challenging acoustic environments in which we use computer technologies today. The proposed spike-based processing methods can potentially be exploited in efficient hardware implementations and could significantly reduce the computational costs of ASR. The work also provides a framework for understanding the advantages of spike-based acoustic coding in the human brain.
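
    As a minimal illustration of the similarity measure mentioned above, the sketch below computes a normalized longest-common-subsequence score between two symbol sequences; the spike labels are invented stand-ins, and the dissertation's actual encoding and decoding pipeline is not reproduced.

      def lcs_length(seq_a, seq_b):
          """Length of the longest common subsequence of two sequences (O(n*m) DP)."""
          previous = [0] * (len(seq_b) + 1)
          for x in seq_a:
              current = [0]
              for j, y in enumerate(seq_b, start=1):
                  if x == y:
                      current.append(previous[j - 1] + 1)
                  else:
                      current.append(max(previous[j], current[j - 1]))
              previous = current
          return previous[-1]

      def lcs_similarity(seq_a, seq_b):
          """Normalized LCS similarity in [0, 1]."""
          return lcs_length(seq_a, seq_b) / min(len(seq_a), len(seq_b))

      # Hypothetical "spike label" sequences standing in for responses to two utterances.
      template  = ["n3", "n1", "n4", "n2", "n4", "n1"]
      utterance = ["n3", "n4", "n2", "n1", "n1"]
      print(lcs_similarity(template, utterance))   # 0.8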

  17. Computed tomographic findings of hepatocellular carcinoma

    International Nuclear Information System (INIS)

    Jo, In Su; Jong, Woo Yung; Lee, Jong Yul; Choi, Han Yong; Kim, Bong Ki

    1987-01-01

    With the development of computed tomography, detection of hepatocellular carcinoma is easily performed and frequently used throughout the world. During 15 months, from December 1985 to February 1987, 59 patients with hepatocellular carcinoma were evaluated with computed tomography in the department of radiology at Wallace Memorial Baptist Hospital. The results were as follows: 1. The most prevalent age group was the 5th to 7th decades; the male to female ratio was 4.9:1. 2. By computed tomographic appearance, the hepatocellular carcinomas were classified as solitary type in 28 cases (48%), multinodular type in 24 cases (40%), and diffuse type in 7 cases (12%). Association with liver cirrhosis was noted in 22 cases (38%). 3. Inhomogeneous internal consistency of hepatocellular carcinoma due to central necrosis was seen in 35 cases (60%). Portal vein invasion by hepatocellular carcinoma was noted in 15 cases (25%), and was most common in the diffuse type, 4 cases (55%). 4. On precontrast scan, all hepatocellular carcinomas were seen as areas of low density except for 3 cases (0.5%) of near isodensity, which turned out to be remarkably low density on postcontrast scan. 5. In the solitary type, the posterior segment of the right lobe was the most common site of involvement, 12 cases (43%). In the diffuse type, bilobar involvement was most common, 6 cases (85%)

  18. Sensitivity analysis on the effect of software-induced common cause failure probability in the computer-based reactor trip system unavailability

    International Nuclear Information System (INIS)

    Kamyab, Shahabeddin; Nematollahi, Mohammadreza; Shafiee, Golnoush

    2013-01-01

    Highlights: ► Importance and sensitivity analyses have been performed for a digitized reactor trip system. ► The results show acceptable trip unavailability for software failure probabilities below 1E-4. ► However, the value of Fussell–Vesely indicates that software common cause failure is still risk significant. ► Diversity and effective testing are found to be beneficial in reducing the software contribution. - Abstract: The reactor trip system has been digitized in advanced nuclear power plants, since the programmable nature of computer based systems has a number of advantages over non-programmable systems. However, software is still vulnerable to common cause failure (CCF). Residual software faults represent a CCF concern, which threatens the implemented achievements. This study attempts to assess the effectiveness of so-called defensive strategies against software CCF with respect to reliability. Sensitivity analysis has been performed by re-quantifying the models upon changing the software failure probability. Importance measures have then been estimated in order to reveal the specific contribution of software CCF to the trip failure probability. The results reveal the importance and effectiveness of signal and software diversity as applicable strategies to ameliorate inefficiencies due to software CCF in the reactor trip system (RTS). No significant change has been observed in the RTS failure probability for basic software CCF probabilities greater than 1 × 10⁻⁴. However, the related Fussell–Vesely value has been greater than 0.005 for the lower values. The study concludes that consideration of the risk associated with software based systems is a multi-variant function which requires compromising among them in more precise and comprehensive studies
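
    As a toy numerical illustration of the Fussell–Vesely importance named in the highlights (not the study's actual PSA model), the sketch below evaluates a hypothetical two-channel trip logic with a software common cause failure basic event, using the rare-event approximation, and shows how the importance grows as the software CCF probability rises.

      # Hypothetical trip logic: the trip fails if both independent channels fail,
      # or if a software common cause failure (CCF) defeats both channels at once.
      # All probabilities are illustrative values, not taken from the record.
      p_channel = 1e-3        # single-channel failure probability

      def trip_unavailability(p_ch, p_ccf):
          """Rare-event approximation of the top event (A AND B) OR CCF."""
          return p_ch * p_ch + p_ccf

      def fussell_vesely(p_ch, p_ccf):
          """Fraction of the top-event probability carried by cut sets containing the CCF."""
          return p_ccf / trip_unavailability(p_ch, p_ccf)

      for p_ccf in (1e-4, 1e-5, 1e-6):
          q = trip_unavailability(p_channel, p_ccf)
          fv = fussell_vesely(p_channel, p_ccf)
          print(f"P(software CCF) = {p_ccf:.0e} -> unavailability = {q:.2e}, FV = {fv:.3f}")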

  19. Computer-assisted Particle-in-Cell code development

    International Nuclear Information System (INIS)

    Kawata, S.; Boonmee, C.; Teramoto, T.; Drska, L.; Limpouch, J.; Liska, R.; Sinor, M.

    1997-12-01

    This report presents a new approach for electromagnetic Particle-in-Cell (PIC) code development by a computer: in general, PIC codes have a common structure and consist of a particle pusher, a field solver, charge and current density collection, and a field interpolation. Because of this common structure, the main part of the PIC code can be mechanically developed on a computer. In this report we use the packages FIDE and GENTRAN of the REDUCE computer algebra system for discretizations of the field equations and a particle equation, and for automatic generation of Fortran code. The proposed approach is successfully applied to the development of a 1.5-dimensional PIC code. By using the generated PIC code, the Weibel instability in a plasma is simulated. The obtained growth rate agrees well with the theoretical value. (author)
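
    The four building blocks named above (particle pusher, field solver, density collection, field interpolation) can be illustrated with a hand-written, 1-D electrostatic Python sketch. It is only a minimal stand-in for the 1.5-dimensional electromagnetic code generated with REDUCE/FIDE/GENTRAN, and all grid sizes and normalized parameters are arbitrary choices:

    import numpy as np

    # Minimal 1-D electrostatic PIC sketch (normalized units, periodic box).
    ng, npart, L, dt = 64, 10000, 2 * np.pi, 0.1
    dx = L / ng
    x = np.random.uniform(0, L, npart)            # particle positions
    v = np.random.normal(0, 1, npart)             # particle velocities

    for step in range(100):
        # 1) charge deposition (cloud-in-cell weighting onto the grid)
        g = (x / dx).astype(int) % ng
        w = x / dx - (x / dx).astype(int)
        rho = np.bincount(g, 1 - w, ng) + np.bincount((g + 1) % ng, w, ng)
        rho = rho / rho.mean() - 1.0              # neutralizing ion background

        # 2) field solve: Poisson equation via FFT, then E = -dphi/dx
        k = np.fft.fftfreq(ng, d=dx) * 2 * np.pi
        k[0] = 1.0                                # avoid division by zero (mean mode)
        phi_k = np.fft.fft(rho) / k**2
        phi_k[0] = 0.0
        E = -np.real(np.fft.ifft(1j * k * phi_k))

        # 3) field interpolation back to particle positions
        Ep = (1 - w) * E[g] + w * E[(g + 1) % ng]

        # 4) particle push (leapfrog) with periodic wrap
        v -= Ep * dt                              # electrons: charge -1, mass 1
        x = (x + v * dt) % L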

  20. Emphysema Is Common in Lungs of Cystic Fibrosis Lung Transplantation Patients: A Histopathological and Computed Tomography Study.

    Directory of Open Access Journals (Sweden)

    Onno M Mets

    Full Text Available Lung disease in cystic fibrosis (CF) involves excessive inflammation, repetitive infections and development of bronchiectasis. Recently, literature on emphysema in CF has emerged, which might become an increasingly important disease component due to the increased life expectancy. The purpose of this study was to assess the presence and extent of emphysema in end-stage CF lungs. In explanted lungs of 20 CF patients emphysema was semi-quantitatively assessed on histology specimens. Also, emphysema was automatically quantified on pre-transplantation computed tomography (CT) using the percentage of voxels below -950 Hounsfield units and was visually scored on CT. The relation between emphysema extent, pre-transplantation lung function and age was determined. All CF patients showed emphysema on histological examination: 3/20 (15%) showed mild, 15/20 (75%) moderate and 2/20 (10%) severe emphysema, defined as 0-20% emphysema, 20-50% emphysema and >50% emphysema in residual lung tissue, respectively. Visually, upper lobe bullous emphysema was identified in 13/20 and more diffuse non-bullous emphysema in 18/20. Histology showed a significant correlation to quantified CT emphysema (p = 0.03) and visual emphysema score (p = 0.001). CT and visual emphysema extent were positively correlated with age (p = 0.045 and p = 0.04, respectively). In conclusion, this study both pathologically and radiologically confirms that emphysema is common in end-stage CF lungs, and is age related. Emphysema might become an increasingly important disease component in the aging CF population.
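
    The CT-based quantification described (the percentage of lung voxels below -950 Hounsfield units) reduces to a short computation. The Python sketch below assumes a CT volume in HU and a lung segmentation mask are already available; it is an illustration, not the study's actual pipeline:

    import numpy as np

    def emphysema_index(hu_volume, lung_mask, threshold=-950):
        """Percentage of lung voxels below `threshold` HU (a %LAA-950-style score).

        hu_volume : 3-D array of CT values in Hounsfield units
        lung_mask : boolean array of the same shape marking lung voxels
        """
        lung_hu = hu_volume[lung_mask]
        return 100.0 * np.count_nonzero(lung_hu < threshold) / lung_hu.size

    # Illustrative use with a synthetic volume; a real pipeline would load a CT
    # scan and a lung segmentation, which are not part of this sketch.
    volume = np.random.normal(-850, 120, size=(64, 64, 64))
    mask = np.ones_like(volume, dtype=bool)
    print(f"emphysema extent: {emphysema_index(volume, mask):.1f}%")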

  1. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
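
    Drawing samples from a posterior whose shape is known only up to a normalizing constant is most simply done with a random-walk Metropolis sampler. The Python sketch below is a generic illustration of that idea; the Normal likelihood, prior and tuning constants are assumptions, not an example taken from the book:

    import numpy as np

    def log_unnorm_posterior(theta, data):
        """Log of an unnormalized posterior: Normal likelihood, Normal(0, 10) prior."""
        log_prior = -0.5 * (theta / 10.0) ** 2
        log_lik = -0.5 * np.sum((data - theta) ** 2)
        return log_prior + log_lik

    def metropolis(data, n_samples=5000, step=0.5, seed=0):
        rng = np.random.default_rng(seed)
        theta, samples = 0.0, []
        for _ in range(n_samples):
            prop = theta + step * rng.normal()
            # accept with probability min(1, posterior ratio)
            if np.log(rng.uniform()) < (log_unnorm_posterior(prop, data)
                                        - log_unnorm_posterior(theta, data)):
                theta = prop
            samples.append(theta)
        return np.array(samples)

    data = np.random.default_rng(1).normal(2.0, 1.0, size=50)
    draws = metropolis(data)
    print("posterior mean ~", draws[1000:].mean())    # discard burn-in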

  2. Computer science II essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science II includes organization of a computer, memory and input/output, coding, data structures, and program development. Also included is an overview of the most commonly

  3. Using a Cloud-Based Computing Environment to Support Teacher Training on Common Core Implementation

    Science.gov (United States)

    Robertson, Cory

    2013-01-01

    A cloud-based computing environment, Google Apps for Education (GAFE), has provided the Anaheim City School District (ACSD) a comprehensive and collaborative avenue for creating, sharing, and editing documents, calendars, and social networking communities. With this environment, teachers and district staff at ACSD are able to utilize the deep…

  4. Children's (Pediatric) CT (Computed Tomography)

    Medline Plus

    Full Text Available ... is Children's CT? Computed tomography, more commonly known as a CT or CAT scan, is a diagnostic ... is used to evaluate: complications from infections such as pneumonia a tumor that arises in the lung ...

  5. Distributed simulation of large computer systems

    International Nuclear Information System (INIS)

    Marzolla, M.

    2001-01-01

    Sequential simulation of large complex physical systems is often regarded as a computationally expensive task. In order to speed up complex discrete-event simulations, the paradigm of Parallel and Distributed Discrete Event Simulation (PDES) has been introduced since the late 70s. The authors analyze the applicability of PDES to the modeling and analysis of large computer systems; such systems are increasingly common in the area of High Energy and Nuclear Physics, because many modern experiments make use of large 'compute farms'. Some feasibility tests have been performed on a prototype distributed simulator

  6. Practical advantages of evolutionary computation

    Science.gov (United States)

    Fogel, David B.

    1997-10-01

    Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.
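
    The ability to "self-adapt the search on the fly" is often illustrated with an evolution strategy in which every candidate carries its own mutation step size. The Python sketch below shows such a (mu, lambda) scheme on a toy objective; all parameter choices are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)

    def sphere(x):
        """Toy objective to minimize."""
        return float(np.sum(x**2))

    # (mu, lambda) evolution strategy with self-adaptive step sizes: each parent
    # carries (x, sigma); sigma itself is mutated, so the search adapts on the fly.
    dim, mu, lam = 5, 5, 30
    tau = 1.0 / np.sqrt(2 * dim)                     # learning rate for step-size mutation
    parents = [(rng.normal(0, 3, dim), 1.0) for _ in range(mu)]

    for gen in range(200):
        offspring = []
        for _ in range(lam):
            x, sigma = parents[rng.integers(mu)]
            new_sigma = sigma * np.exp(tau * rng.normal())   # self-adapted step size
            new_x = x + new_sigma * rng.normal(size=dim)
            offspring.append((new_x, new_sigma))
        parents = sorted(offspring, key=lambda ind: sphere(ind[0]))[:mu]

    best = parents[0]
    print("best fitness:", sphere(best[0]), "step size:", best[1])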

  7. Distributed computing system with dual independent communications paths between computers and employing split tokens

    Science.gov (United States)

    Rasmussen, Robert D. (Inventor); Manning, Robert M. (Inventor); Lewis, Blair F. (Inventor); Bolotin, Gary S. (Inventor); Ward, Richard S. (Inventor)

    1990-01-01

    This is a distributed computing system providing flexible fault tolerance; ease of software design and concurrency specification; and dynamic balance of the loads. The system comprises a plurality of computers each having a first input/output interface and a second input/output interface for interfacing to communications networks, each second input/output interface including a bypass for bypassing the associated computer. A global communications network interconnects the first input/output interfaces for providing each computer the ability to broadcast messages simultaneously to the remainder of the computers. A meshwork communications network interconnects the second input/output interfaces providing each computer with the ability to establish a communications link with another of the computers bypassing the remainder of the computers. Each computer is controlled by a resident copy of a common operating system. Communications between respective ones of the computers is by means of split tokens, each having a moving first portion which is sent from computer to computer and a resident second portion which is disposed in the memory of at least one of the computers, and wherein the location of the second portion is part of the first portion. The split tokens represent both functions to be executed by the computers and data to be employed in the execution of the functions. The first input/output interfaces each include logic for detecting a collision between messages and for terminating the broadcasting of a message whereby collisions between messages are detected and avoided.

  8. Data mining in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ruxandra-Ştefania PETRE

    2012-10-01

    Full Text Available This paper describes how data mining is used in cloud computing. Data mining is used for extracting potentially useful information from raw data. The integration of data mining techniques into normal day-to-day activities has become commonplace. Every day people are confronted with targeted advertising, and data mining techniques help businesses to become more efficient by reducing costs. Data mining techniques and applications are very much needed in the cloud computing paradigm. The implementation of data mining techniques through cloud computing will allow users to retrieve meaningful information from a virtually integrated data warehouse that reduces the costs of infrastructure and storage.

  9. [Reconstruction of long polynucleotide sequences from fragments using the Iskra-226 personal computer]

    Science.gov (United States)

    Kostetskiĭ, P V; Dobrova, I E

    1988-04-01

    An algorithm for reconstructing long DNA sequences, i.e. arranging all overlapping gel readings into contigs, and the corresponding BASIC programme for the "Iskra-226" personal computer (USSR) are described. Contig construction begins with a search for all fragments overlapping the basic (longest) one, followed by determination of the coordinates of the 5' ends of the overlapping fragments. Then the gel reading with the minimal 5' end coordinate and the gel reading with the maximal 3' end coordinate are selected and used as the basic ones at the next assembly steps. The procedure finishes when no gel reading overlapping the basic one can be found. All gel readings that have entered the contig are ignored at the next steps of the assembly. Finally, one or several contigs consisting of DNA fragments are obtained. The effectiveness of the algorithm was tested on a model based on multiple assembly of the nucleotide sequence encoding the Na,K-ATPase alpha-subunit of pig kidney. The programme does not require the user's participation and can handle contigs up to 10,000 nucleotides long.
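
    The greedy procedure described, seeding a contig with the longest gel reading and repeatedly attaching overlapping readings at both ends until none remain, can be sketched in Python as follows. The exact suffix/prefix overlap test and the minimum overlap length are simplifying assumptions, not the published BASIC implementation:

    def overlap(a, b, min_len=10):
        """Length of the longest suffix of `a` matching a prefix of `b` (0 if < min_len)."""
        for k in range(min(len(a), len(b)), min_len - 1, -1):
            if a.endswith(b[:k]):
                return k
        return 0

    def assemble(readings, min_len=10):
        """Greedy contig assembly: seed with the longest reading, extend both ends."""
        pool = sorted(readings, key=len, reverse=True)
        contigs = []
        while pool:
            contig = pool.pop(0)                  # basic (longest remaining) reading
            merged = True
            while merged and pool:
                merged = False
                for r in list(pool):
                    if (k := overlap(contig, r, min_len)):      # extend the 3' end
                        contig += r[k:]; pool.remove(r); merged = True
                    elif (k := overlap(r, contig, min_len)):    # extend the 5' end
                        contig = r + contig[k:]; pool.remove(r); merged = True
            contigs.append(contig)
        return contigs

    reads = ["GATTACAGATT", "ACAGATTCCGA", "TTCCGATTGCA", "GGGTTT"]
    print(assemble(reads, min_len=5))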

  10. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi

    2016-01-01

    This book at hand explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically-intractable problems. It is intended for both the expert/researcher in the field of Intelligent Computing Systems, as well as for the general reader in t...

  11. Common approach to common interests

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-06-01

    In referring to issues confronting the energy field in this region and options to be exercised in the future, I would like to mention a fundamental condition of the utmost importance. It can be summed up as follows: no subject in the energy area can ever be solved by one country alone, given the geographical and geopolitical characteristics intrinsic to energy. So, a regional approach is needed, and it is especially necessary for the main players in the region to jointly address problems common to them. Though it may be a matter to be pursued in the distant future, I am personally dreaming of a 'Common Energy Market for Northeast Asia,' in which member countries' interests are adjusted so that the market can be integrated and the region can become a most economically efficient market, thus forming an effective power to face the outside. It should be noted that Europe needed forty years to integrate its market as the unified common market. It is necessary for us to follow a number of steps over that period to eventually materialize our common market concept, too. Now is the time for us to take the first step to lay the foundation for our descendants to enjoy prosperity from such a common market.

  12. CyVerse Data Commons: lessons learned in cyberinfrastructure management and data hosting from the Life Sciences

    Science.gov (United States)

    Swetnam, T. L.; Walls, R.; Merchant, N.

    2017-12-01

    CyVerse is a US National Science Foundation funded initiative "to design, deploy, and expand a national cyberinfrastructure for life sciences research, and to train scientists in its use," supporting and enabling cross-disciplinary collaborations across institutions. CyVerse's free, open-source cyberinfrastructure is being adopted into biogeoscience and space sciences research. CyVerse's data-science-agnostic platforms provide shared data storage, high-performance computing, and cloud computing that allow analysis of very large data sets (including incomplete or work-in-progress data sets). Part of CyVerse's success has been in addressing the handling of data through its entire lifecycle, from creation to final publication in a digital data repository to reuse in new analyses. CyVerse developers and user communities have learned many lessons that are germane to Earth and Environmental Science. We present an overview of the tools and services available through CyVerse, including: interactive computing with the Discovery Environment (https://de.cyverse.org/), an interactive data science workbench featuring data storage and transfer via the Data Store; cloud computing with Atmosphere (https://atmo.cyverse.org); and access to HPC via the Agave API (https://agaveapi.co/). Each CyVerse service emphasizes access to long-term data storage, including our own Data Commons (http://datacommons.cyverse.org), as well as external repositories. The Data Commons service manages, organizes, preserves, publishes, and allows for discovery and reuse of data. All data published to CyVerse's Curated Data receive a permanent identifier (PID) in the form of a DOI (Digital Object Identifier) or ARK (Archival Resource Key). Data that are more fluid can also be published in the Data Commons through Community Collaborated data. The Data Commons provides landing pages, permanent DOIs or ARKs, and supports data reuse and citation through features such as open data licenses and downloadable citations. The

  13. Bacterial contamination of computer touch screens.

    Science.gov (United States)

    Gerba, Charles P; Wuollet, Adam L; Raisanen, Peter; Lopez, Gerardo U

    2016-03-01

    The goal of this study was to determine the occurrence of opportunistic bacterial pathogens on the surfaces of computer touch screens used in hospitals and grocery stores. Opportunistic pathogenic bacteria were isolated from touch screens in hospitals (Clostridium difficile and vancomycin-resistant Enterococcus) and in grocery stores (methicillin-resistant Staphylococcus aureus). Enteric bacteria were more common on grocery store touch screens than on hospital computer touch screens. Published by Elsevier Inc.

  14. Research on cloud computing solutions

    Directory of Open Access Journals (Sweden)

    Liudvikas Kaklauskas

    2015-07-01

    Full Text Available Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, hybrid cloud and community cloud. The most common and well-known deployment model is the public cloud. A private cloud is suited for sensitive data, where the customer is dependent on a certain degree of security. According to the different types of services offered, cloud computing can be considered to consist of three layers (service models): IaaS (infrastructure as a service), PaaS (platform as a service) and SaaS (software as a service). The main cloud computing solutions are web applications, data hosting, virtualization, database clusters and terminal services. The advantage of cloud computing is the ability to virtualize and share resources among different applications with the objective of better server utilization; without a clustering solution, a service may fail at the moment the server crashes. DOI: 10.15181/csat.v2i2.914

  15. Classical Logic and Quantum Logic with Multiple and Common Lattice Models

    Directory of Open Access Journals (Sweden)

    Mladen Pavičić

    2016-01-01

    Full Text Available We consider a proper propositional quantum logic and show that it has multiple disjoint lattice models, only one of which is an orthomodular lattice (algebra) underlying Hilbert (quantum) space. We give an equivalent proof for the classical logic which turns out to have disjoint distributive and nondistributive ortholattices. In particular, we prove that both classical logic and quantum logic are sound and complete with respect to each of these lattices. We also show that there is one common nonorthomodular lattice that is a model of both quantum and classical logic. In technical terms, that enables us to run the same classical logic on both a digital (standard, two-subset, 0-1-bit) computer and a nondigital (say, a six-subset) computer (with appropriate chips and circuits). With quantum logic, the same six-element common lattice can serve us as a benchmark for an efficient evaluation of equations of bigger lattice models or theorems of the logic.

  16. Computer games and software engineering

    CERN Document Server

    Cooper, Kendra M L

    2015-01-01

    Computer games represent a significant software application domain for innovative research in software engineering techniques and technologies. Game developers, whether focusing on entertainment-market opportunities or game-based applications in non-entertainment domains, thus share a common interest with software engineers and developers on how to best engineer game software. Featuring contributions from leading experts in software engineering, the book provides a comprehensive introduction to computer game software development that includes its history as well as emerging research on the inte

  17. [Computer-assisted temporomandibular joint reconstruction].

    Science.gov (United States)

    Zwetyenga, N; Mommers, X-A; Cheynet, F

    2013-08-02

    Prosthetic replacement of the TMJ is gradually becoming a common procedure because of good functional and aesthetic results and low morbidity. Available prosthetic models can be standard or custom-made. Custom-made prostheses are usually reserved for complex cases, but we think that computer assistance for custom-made prostheses should be indicated in every case because it gives greater implant stability and fewer complications. Computer assistance will further enlarge the indications for TMJ prosthetic replacement. Copyright © 2013. Published by Elsevier Masson SAS.

  18. APC: A New Code for Atmospheric Polarization Computations

    Science.gov (United States)

    Korkin, Sergey V.; Lyapustin, Alexei I.; Rozanov, Vladimir V.

    2014-01-01

    A new polarized radiative transfer code Atmospheric Polarization Computations (APC) is described. The code is based on separation of the diffuse light field into anisotropic and smooth (regular) parts. The anisotropic part is computed analytically. The smooth regular part is computed numerically using the discrete ordinates method. Vertical stratification of the atmosphere, common types of bidirectional surface reflection and scattering by spherical particles or spheroids are included. A particular consideration is given to computation of the bidirectional polarization distribution function (BPDF) of the waved ocean surface.

  19. [Musculoskeletal disorders among university student computer users].

    Science.gov (United States)

    Lorusso, A; Bruno, S; L'Abbate, N

    2009-01-01

    Musculoskeletal disorders are a common problem among computer users. Many epidemiological studies have shown that ergonomic factors and aspects of work organization play an important role in the development of these disorders. We carried out a cross-sectional survey to estimate the prevalence of musculoskeletal symptoms among university students using personal computers and to investigate the features of occupational exposure and the prevalence of symptoms throughout the study course. Another objective was to assess the students' level of knowledge of computer ergonomics and the relevant health risks. A questionnaire was distributed to 183 students attending the lectures for the second and fourth year courses of the Faculty of Architecture. Data concerning personal characteristics, ergonomic and organizational aspects of computer use, and the presence of musculoskeletal symptoms in the neck and upper limbs were collected. Exposure to risk factors such as daily duration of computer use, time spent at the computer without breaks, duration of mouse use and poor workstation ergonomics was significantly higher among students of the fourth year course. Neck pain was the most commonly reported symptom (69%), followed by hand/wrist (53%), shoulder (49%) and arm (8%) pain. The prevalence of symptoms in the neck and hand/wrist area was significantly higher in the students of the fourth year course. In our survey we found a high prevalence of musculoskeletal symptoms among university students using computers for long periods on a daily basis. Exposure to computer-related ergonomic and organizational risk factors and the prevalence of musculoskeletal symptoms both seem to increase significantly throughout the study course. Furthermore, we found that the level of perception of computer-related health risks among the students was low. Our findings suggest the need for preventive intervention consisting of education in computer ergonomics.

  20. Learning Commons in Academic Libraries: Discussing Themes in the Literature from 2001 to the Present

    Science.gov (United States)

    Blummer, Barbara; Kenton, Jeffrey M.

    2017-01-01

    Although the term lacks a standard definition, learning commons represent academic library spaces that provide computer and library resources as well as a range of academic services that support learners and learning. Learning commons have been equated to a laboratory for creating knowledge and staffed with librarians that serve as facilitators of…

  1. Common Courses for Common Purposes:

    DEFF Research Database (Denmark)

    Schaub Jr, Gary John

    2014-01-01

    (PME)? I suggest three alternative paths that increased cooperation in PME at the level of the command and staff course could take: a Nordic Defence College, standardized national command and staff courses, and a core curriculum of common courses for common purposes. I conclude with a discussion of how...

  2. Free Boomerang-shaped Extended Rectus Abdominis Myocutaneous flap: The longest possible skin/myocutaneous free flap for soft tissue reconstruction of extremities

    Directory of Open Access Journals (Sweden)

    Ashok R Koul

    2011-01-01

    Full Text Available Background: A soft tissue defect requiring flap cover which is longer than that provided by the conventional "long" free flaps like the latissimus dorsi (LD) and anterolateral thigh (ALT) flap is a challenging problem. Often, in such a situation, a combination of flaps is required. Over the last 3 years, we have managed nine such defects successfully with a free "Boomerang-shaped" Extended Rectus Abdominis Myocutaneous (BERAM) flap. This flap is the slightly modified and "free" version of a similar flap described by Ian Taylor in 1983. Materials and Methods: This is a retrospective study of patients who underwent free BERAM flap reconstruction of soft tissue defects of the extremity over the last 3 years. We also did a clinical study on 30 volunteers to compare the length of flap available using our design of the BERAM flap with the maximum available flap length of LD and ALT flaps, using standard markings. Results: Our clinical experience of nine cases combined with the results of our clinical study has confirmed that our design of the BERAM flap consistently provides a flap length which is 32.6% longer than the standard LD flap and 42.2% longer than the standard ALT flap in adults. The difference is even more marked in children. The BERAM flap is consistently reliable as long as the distal end is not extended beyond the mid-axillary line. Conclusion: The BERAM flap is simple in design, easy to harvest, reliable and provides the longest possible free skin/myocutaneous flap in the body. It is a useful new alternative for covering long soft tissue defects in the limbs.

  3. Computing platforms for software-defined radio

    CERN Document Server

    Nurmi, Jari; Isoaho, Jouni; Garzia, Fabio

    2017-01-01

    This book addresses Software-Defined Radio (SDR) baseband processing from the computer architecture point of view, providing a detailed exploration of different computing platforms by classifying different approaches, highlighting the common features related to SDR requirements and by showing pros and cons of the proposed solutions. Coverage includes architectures exploiting parallelism by extending single-processor environment (such as VLIW, SIMD, TTA approaches), multi-core platforms distributing the computation to either a homogeneous array or a set of specialized heterogeneous processors, and architectures exploiting fine-grained, coarse-grained, or hybrid reconfigurability. Describes a computer engineering approach to SDR baseband processing hardware; Discusses implementation of numerous compute-intensive signal processing algorithms on single and multicore platforms; Enables deep understanding of optimization techniques related to power and energy consumption of multicore platforms using several basic a...

  4. Collaboration, Collusion and Plagiarism in Computer Science Coursework

    Science.gov (United States)

    Fraser, Robert

    2014-01-01

    We present an overview of the nature of academic dishonesty with respect to computer science coursework. We discuss the efficacy of various policies for collaboration with regard to student education, and we consider a number of strategies for mitigating dishonest behaviour on computer science coursework by addressing some common causes. Computer…

  5. Measuring the Common Component of Stock Market Fluctuations in the Asia-Pacific Region

    OpenAIRE

    Mapa, Dennis S.; Briones, Kristine Joy S.

    2006-01-01

    This paper fits Generalized Auto-Regressive Conditional Heteroskedasticity (GARCH) models to the daily closing stock market indices of Australia, China, Hong Kong, Indonesia, Japan, Korea, Malaysia, Philippines, Singapore, and Taiwan to compute time-varying weights associated with the volatilities of the individual indices. These weights and the returns of the various indices were then used to determine the common component of stock market returns. Our results suggest that a common component ...
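
    A rough Python sketch of the described workflow, computing each index's conditional volatility with a GARCH(1,1) recursion and combining returns with volatility-based weights, might look as follows. The fixed parameters and the inverse-volatility weighting rule are assumptions for illustration, not the paper's estimated model:

    import numpy as np

    def garch11_vol(returns, omega=1e-6, alpha=0.08, beta=0.9):
        """Conditional volatility from a GARCH(1,1) recursion with fixed parameters.
        (A real application would estimate omega, alpha, beta by maximum likelihood.)"""
        h = np.empty_like(returns)
        h[0] = returns.var()
        for t in range(1, len(returns)):
            h[t] = omega + alpha * returns[t - 1] ** 2 + beta * h[t - 1]
        return np.sqrt(h)

    def common_component(return_matrix):
        """Volatility-weighted average of several markets' returns (T x N matrix)."""
        vols = np.column_stack([garch11_vol(return_matrix[:, j])
                                for j in range(return_matrix.shape[1])])
        weights = 1.0 / vols                           # weight calm markets more heavily
        weights /= weights.sum(axis=1, keepdims=True)  # normalize the weights each day
        return (weights * return_matrix).sum(axis=1)

    rng = np.random.default_rng(0)
    R = rng.normal(0, 0.01, size=(500, 4))             # synthetic daily returns, 4 indices
    print(common_component(R)[:5])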

  6. An Electrical Analog Computer for Poets

    Science.gov (United States)

    Bruels, Mark C.

    1972-01-01

    Nonphysics majors are presented with a direct current experiment beyond Ohms law and series and parallel laws. This involves construction of an analog computer from common rheostats and student-assembled voltmeters. (Author/TS)

  7. Gender and computer games / video games : girls’ perspective orientation

    OpenAIRE

    Yan, Jingjing

    2010-01-01

    The topic of this thesis is “Gender Differences in Computer games/ Video games Industry”. Due to rapid development in technology and popularization of computers all around the world, computer games have already become a kind of common entertainment. Because computer games were designed especially for boys at the very beginning, there are still some remaining barriers when training female game designers and expanding game markets among female players. This thesis is mainly based on two studies ...

  8. Elements of matrix modeling and computing with Matlab

    CERN Document Server

    White, Robert E

    2006-01-01

    As discrete models and computing have become more common, there is a need to study matrix computation and numerical linear algebra. Encompassing a diverse mathematical core, Elements of Matrix Modeling and Computing with MATLAB examines a variety of applications and their modeling processes, showing you how to develop matrix models and solve algebraic systems. Emphasizing practical skills, it creates a bridge from problems with two and three variables to more realistic problems that have additional variables. Elements of Matrix Modeling and Computing with MATLAB focuses on seven basic applicat

  9. Correlation of the Deccan and Rajahmundry Trap lavas: Are these the longest and largest lava flows on Earth?

    Science.gov (United States)

    Self, S.; Jay, A. E.; Widdowson, M.; Keszthelyi, L. P.

    2008-05-01

    We propose that the Rajahmundry Trap lavas, found near the east coast of peninsular India, are remnants of the longest lava flows yet recognized on Earth (˜ 1000 km long). These outlying Deccan-like lavas are shown to belong to the main Deccan Traps. Several previous studies have already suggested this correlation, but have not demonstrated it categorically. The exposed Rajahmundry lavas are interpreted to be the distal parts of two very-large-volume pāhoehoe flow fields, one each from the Ambenali and Mahabaleshwar Formations of the Wai Sub-group in the Deccan Basalt Group. Eruptive conditions required to emplace such long flows are met by plausible values for cooling and eruption rates, and this is shown by applying a model for the formation of inflated pāhoehoe sheet flow lobes. The model predicts flow lobe thicknesses similar to those observed in the Rajahmundry lavas. For the last 400 km of flow, the lava flows were confined to the pre-existing Krishna valley drainage system that existed in the basement beyond the edge of the gradually expanding Deccan lava field, allowing the flows to extend across the subcontinent to the eastern margin where they were emplaced into a littoral and/or shallow marine environment. These lavas and other individual flow fields in the Wai Sub-group may exceed eruptive volumes of 5000 km³, which would place them amongst the largest magnitude effusive eruptive units yet known. We suggest that the length of flood basalt lava flows on Earth is restricted mainly by the size of land masses and topography. In the case of the Rajahmundry lavas, the flows reached estuaries and the sea, where their advance was perhaps effectively terminated by cooling and/or disruption. However, it is only during large igneous province basaltic volcanism that such huge volumes of lava are erupted in single events, and when the magma supply rate is sufficiently high and maintained to allow the formation of very long lava flows. The Rajahmundry lava

  10. Integrating Computational Chemistry into a Course in Classical Thermodynamics

    Science.gov (United States)

    Martini, Sheridan R.; Hartzell, Cynthia J.

    2015-01-01

    Computational chemistry is commonly addressed in the quantum mechanics course of undergraduate physical chemistry curricula. Since quantum mechanics traditionally follows the thermodynamics course, there is a lack of curricula relating computational chemistry to thermodynamics. A method integrating molecular modeling software into a semester long…

  11. Use of common time base for checkpointing and rollback recovery in a distributed system

    Science.gov (United States)

    Ramanathan, Parameswaran; Shin, Kang G.

    1993-01-01

    An approach to checkpointing and rollback recovery in a distributed computing system using a common time base is proposed. A common time base is established in the system using a hardware clock synchronization algorithm. This common time base is coupled with the idea of pseudo-recovery points to develop a checkpointing algorithm that has the following advantages: reduced wait for commitment for establishing recovery lines, fewer messages to be exchanged, and less memory requirement. These advantages are assessed quantitatively by developing a probabilistic model.

  12. HIGH PERFORMANCE PHOTOGRAMMETRIC PROCESSING ON COMPUTER CLUSTERS

    Directory of Open Access Journals (Sweden)

    V. N. Adrov

    2012-07-01

    Full Text Available Most CPU-consuming tasks in photogrammetric processing can be done in parallel. The algorithms take independent bits as input and produce independent bits as output. The independence of bits comes from the nature of such algorithms, since images, stereopairs or small image blocks can be processed independently. Many photogrammetric algorithms are fully automatic and do not require human intervention. Photogrammetric workstations can perform tie point measurements, DTM calculations, orthophoto construction, mosaicing and many other service operations in parallel using distributed calculations. Distributed calculations save time, reducing computations that take several days to several hours. Modern trends in computer technology show increasing numbers of CPU cores in workstations, faster local networks, and, as a result, dropping prices for supercomputers or computer clusters that can contain hundreds or even thousands of computing nodes. Common distributed processing in DPW is usually targeted at interactive work with a limited number of CPU cores and is not optimized for centralized administration. The bottleneck of common distributed computing in photogrammetry can be the limited LAN throughput and storage performance, since the processing of huge amounts of large raster images is needed.
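
    Because image blocks can be processed independently, the work maps directly onto an embarrassingly parallel worker pool. The Python sketch below distributes per-tile processing across CPU cores; the tile-processing function is only a placeholder, not an actual photogrammetric operation:

    from multiprocessing import Pool
    import numpy as np

    def process_tile(tile):
        """Stand-in for per-tile photogrammetric work (e.g. matching, resampling)."""
        return float(np.mean(tile))           # placeholder computation

    def split_tiles(image, size=256):
        """Cut a raster into independent square tiles."""
        h, w = image.shape
        return [image[r:r + size, c:c + size]
                for r in range(0, h, size) for c in range(0, w, size)]

    if __name__ == "__main__":
        image = np.random.rand(2048, 2048)    # synthetic raster instead of aerial imagery
        tiles = split_tiles(image)
        with Pool() as pool:                  # one worker per available CPU core
            results = pool.map(process_tile, tiles)
        print(len(results), "tiles processed")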

  13. Computer Vision Syndrome: Implications for the Occupational Health Nurse.

    Science.gov (United States)

    Lurati, Ann Regina

    2018-02-01

    Computers and other digital devices are commonly used both in the workplace and during leisure time. Computer vision syndrome (CVS) is a new health-related condition that negatively affects workers. This article reviews the pathology of and interventions for CVS with implications for the occupational health nurse.

  14. CytoMCS: A Multiple Maximum Common Subgraph Detection Tool for Cytoscape

    DEFF Research Database (Denmark)

    Larsen, Simon; Baumbach, Jan

    2017-01-01

    such analyses we have developed CytoMCS, a Cytoscape app for computing inexact solutions to the maximum common edge subgraph problem for two or more graphs. Our algorithm uses an iterative local search heuristic for computing conserved subgraphs, optimizing a squared edge conservation score that is able to detect not only fully conserved edges but also partially conserved edges. It can be applied to any set of directed or undirected, simple graphs loaded as networks into Cytoscape, e.g. protein-protein interaction networks or gene regulatory networks. CytoMCS is available as a Cytoscape app at http://apps.cytoscape.org/apps/cytomcs.

  15. Application of a common spatial pattern-based algorithm for an fNIRS-based motor imagery brain-computer interface.

    Science.gov (United States)

    Zhang, Shen; Zheng, Yanchun; Wang, Daifa; Wang, Ling; Ma, Jianai; Zhang, Jing; Xu, Weihao; Li, Deyu; Zhang, Dan

    2017-08-10

    Motor imagery is one of the most investigated paradigms in the field of brain-computer interfaces (BCIs). The present study explored the feasibility of applying a common spatial pattern (CSP)-based algorithm for a functional near-infrared spectroscopy (fNIRS)-based motor imagery BCI. Ten participants performed kinesthetic imagery of their left- and right-hand movements while 20-channel fNIRS signals were recorded over the motor cortex. The CSP method was implemented to obtain the spatial filters specific for both imagery tasks. The mean, slope, and variance of the CSP filtered signals were taken as features for BCI classification. Results showed that the CSP-based algorithm outperformed two representative channel-wise methods for classifying the two imagery statuses using either data from all channels or averaged data from imagery responsive channels only (oxygenated hemoglobin: CSP-based: 75.3±13.1%; all-channel: 52.3±5.3%; averaged: 64.8±13.2%; deoxygenated hemoglobin: CSP-based: 72.3±13.0%; all-channel: 48.8±8.2%; averaged: 63.3±13.3%). Furthermore, the effectiveness of the CSP method was also observed for the motor execution data to a lesser extent. A partial correlation analysis revealed significant independent contributions from all three types of features, including the often-ignored variance feature. To our knowledge, this is the first study demonstrating the effectiveness of the CSP method for fNIRS-based motor imagery BCIs. Copyright © 2017 Elsevier B.V. All rights reserved.
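
    The CSP step itself, spatial filters obtained from a generalized eigendecomposition of the two classes' channel covariance matrices followed by the mean/slope/variance features mentioned above, can be sketched in Python as follows. The array shapes, the number of retained filters and the synthetic data are assumptions, not the study's recording setup:

    import numpy as np
    from scipy.linalg import eigh

    def csp_filters(trials_a, trials_b, n_filters=4):
        """CSP spatial filters from two trial sets, each shaped (n_trials, channels, samples)."""
        cov = lambda trials: np.mean([np.cov(t) for t in trials], axis=0)
        Ca, Cb = cov(trials_a), cov(trials_b)
        # generalized eigenproblem: Ca w = lambda (Ca + Cb) w
        vals, vecs = eigh(Ca, Ca + Cb)
        order = np.argsort(vals)
        keep = np.concatenate([order[:n_filters // 2], order[-n_filters // 2:]])
        return vecs[:, keep].T                         # (n_filters, channels)

    def features(trial, W):
        """Mean, slope and variance of each CSP-filtered time course."""
        z = W @ trial                                  # (n_filters, samples)
        t = np.arange(z.shape[1])
        slope = np.polyfit(t, z.T, 1)[0]               # per-filter linear trend
        return np.concatenate([z.mean(axis=1), slope, z.var(axis=1)])

    rng = np.random.default_rng(0)
    left = rng.normal(size=(30, 20, 100))              # 30 trials, 20 channels, 100 samples
    right = rng.normal(size=(30, 20, 100))
    W = csp_filters(left, right)
    print(features(left[0], W).shape)                  # 3 features per spatial filter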

  16. Collaboration, Collusion and Plagiarism in Computer Science Coursework

    OpenAIRE

    Robert FRASER

    2014-01-01

    We present an overview of the nature of academic dishonesty with respect to computer science coursework. We discuss the efficacy of various policies for collaboration with regard to student education, and we consider a number of strategies for mitigating dishonest behaviour on computer science coursework by addressing some common causes. Computer science coursework is somewhat unique, in that there often exist ideal solutions for problems, and work may be shared and copied with very little ef...

  17. Natural computing for mechanical systems research: A tutorial overview

    Science.gov (United States)

    Worden, Keith; Staszewski, Wieslaw J.; Hensman, James J.

    2011-01-01

    A great many computational algorithms developed over the past half-century have been motivated or suggested by biological systems or processes, the most well-known being the artificial neural networks. These algorithms are commonly grouped together under the terms soft or natural computing. A property shared by most natural computing algorithms is that they allow exploration of, or learning from, data. This property has proved extremely valuable in the solution of many diverse problems in science and engineering. The current paper is intended as a tutorial overview of the basic theory of some of the most common methods of natural computing as they are applied in the context of mechanical systems research. The application of some of the main algorithms is illustrated using case studies. The paper also attempts to give some indication as to which of the algorithms emerging now from the machine learning community are likely to be important for mechanical systems research in the future.

  18. Impact of computer use on children's vision.

    Science.gov (United States)

    Kozeis, N

    2009-10-01

    Today, millions of children use computers on a daily basis. Extensive viewing of the computer screen can lead to eye discomfort, fatigue, blurred vision and headaches, dry eyes and other symptoms of eyestrain. These symptoms may be caused by poor lighting, glare, an improper work station set-up, vision problems of which the person was not previously aware, or a combination of these factors. Children can experience many of the same symptoms related to computer use as adults. However, some unique aspects of how children use computers may make them more susceptible than adults to the development of these problems. In this study, the most common eye symptoms related to computer use in childhood, the possible causes and ways to avoid them are reviewed.

  19. Mobile Cloud Computing for Telemedicine Solutions

    Directory of Open Access Journals (Sweden)

    Mihaela GHEORGHE

    2014-01-01

    Full Text Available Mobile Cloud Computing is a significant technology which combines emerging domains such as mobile computing and cloud computing, and which has led to the development of one of the IT industry's most challenging and innovative trends. It is still at an early stage of development, but its main characteristics, advantages and range of services, which are provided by an internet-based cluster system, have a strong impact on the process of developing telemedicine solutions for overcoming the wide challenges the medical system is confronted with. Mobile Cloud integrates cloud computing into the mobile environment and has the advantage of overcoming obstacles related to performance (e.g. battery life, storage, and bandwidth), environment (e.g. heterogeneity, scalability, availability) and security (e.g. reliability and privacy) which are commonly present at the mobile computing level. In this paper, I will present a comprehensive overview of mobile cloud computing, including definitions, services and the use of this technology for developing telemedicine applications.

  20. Virtualization and cloud computing in dentistry.

    Science.gov (United States)

    Chow, Frank; Muftu, Ali; Shorter, Richard

    2014-01-01

    The use of virtualization and cloud computing has changed the way we use computers. Virtualization is a method of placing software called a hypervisor on the hardware of a computer or a host operating system. It allows a guest operating system to run on top of the physical computer with a virtual machine (i.e., virtual computer). Virtualization allows multiple virtual computers to run on top of one physical computer and to share its hardware resources, such as printers, scanners, and modems. This increases the efficient use of the computer by decreasing costs (e.g., hardware, electricity, administration, and management) since only one physical computer is needed and running. This virtualization platform is the basis for cloud computing. It has expanded into areas of server and storage virtualization. One of the commonly used dental storage systems is cloud storage. Patient information is encrypted as required by the Health Insurance Portability and Accountability Act (HIPAA) and stored on off-site private cloud services for a monthly service fee. As computer costs continue to increase, so too will the need for more storage and processing power. Virtual and cloud computing will be a method for dentists to minimize costs and maximize computer efficiency in the near future. This article will provide some useful information on current uses of cloud computing.

  1. Fast DRR splat rendering using common consumer graphics hardware

    International Nuclear Information System (INIS)

    Spoerk, Jakob; Bergmann, Helmar; Wanschitz, Felix; Dong, Shuo; Birkfellner, Wolfgang

    2007-01-01

    Digitally rendered radiographs (DRR) are a vital part of various medical image processing applications such as 2D/3D registration for patient pose determination in image-guided radiotherapy procedures. This paper presents a technique to accelerate DRR creation by using conventional graphics hardware for the rendering process. DRR computation itself is done by an efficient volume rendering method named wobbled splatting. For programming the graphics hardware, NVIDIA's C for Graphics (Cg) is used. The description of an algorithm used for rendering DRRs on the graphics hardware is presented, together with a benchmark comparing this technique to a CPU-based wobbled splatting program. Results show a reduction of rendering time by about 70%-90% depending on the amount of data. For instance, rendering a volume of 2×10⁶ voxels is feasible at an update rate of 38 Hz compared to 6 Hz on a common Intel-based PC using the graphics processing unit (GPU) of a conventional graphics adapter. In addition, wobbled splatting using graphics hardware for DRR computation provides higher resolution DRRs with comparable image quality due to special processing characteristics of the GPU. We conclude that DRR generation on common graphics hardware using the freely available Cg environment is a major step toward 2D/3D registration in clinical routine
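
    At its simplest, a DRR is a set of attenuation line integrals through the CT volume. The Python sketch below shows only that naive parallel-projection baseline on the CPU, not the GPU-accelerated wobbled-splatting method of the paper, and the HU-to-attenuation conversion constant is an illustrative assumption:

    import numpy as np

    def simple_drr(ct_hu, axis=1):
        """Parallel-projection DRR: convert HU to attenuation and sum along one axis."""
        mu_water = 0.02                                   # illustrative attenuation, 1/mm
        mu = mu_water * (1.0 + ct_hu / 1000.0)            # HU -> linear attenuation
        mu = np.clip(mu, 0.0, None)
        line_integrals = mu.sum(axis=axis)                # integrate along the ray direction
        return np.exp(-line_integrals)                    # transmitted intensity image

    volume = np.random.normal(0, 200, size=(128, 128, 128))  # synthetic CT in HU
    drr = simple_drr(volume)
    print(drr.shape, drr.min(), drr.max())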

  2. [Clinical skills and outcomes of chair-side computer aided design and computer aided manufacture system].

    Science.gov (United States)

    Yu, Q

    2018-04-09

    Computer aided design and computer aided manufacture (CAD/CAM) technology is a kind of oral digital system which is applied to clinical diagnosis and treatment. It overturns the traditional pattern, and provides a solution to restore defect tooth quickly and efficiently. In this paper we mainly discuss the clinical skills of chair-side CAD/CAM system, including tooth preparation, digital impression, the three-dimensional design of prosthesis, numerical control machining, clinical bonding and so on, and review the outcomes of several common kinds of materials at the same time.

  3. Universal computer interfaces

    CERN Document Server

    Dheere, RFBM

    1988-01-01

    Presents a survey of the latest developments in the field of the universal computer interface, resulting from a study of the world patent literature. Illustrating the state of the art today, the book ranges from basic interface structure, through parameters and common characteristics, to the most important industrial bus realizations. Recent technical enhancements are also included, with special emphasis devoted to the universal interface adapter circuit. Comprehensively indexed.

  4. Cloud computing applications for biomedical science: A perspective.

    Science.gov (United States)

    Navale, Vivek; Bourne, Philip E

    2018-06-01

    Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research.

  5. Common Envelope Evolution: Implications for Post-AGB Stars and Planetary Nebulae

    Science.gov (United States)

    Nordhaus, J.

    2017-10-01

    Common envelopes (CE) are of broad interest as they represent one method by which binaries with initially long-period orbits of a few years can be converted into short-period orbits of a few hours. Despite their importance, the brief lifetimes of CE phases make them difficult to directly observe. Nevertheless, CE interactions are potentially common, can produce a diverse array of nebular shapes, and can accommodate current post-AGB and planetary nebula outflow constraints. Here, I discuss ongoing theoretical and computational work on CEs and speculate on what lies ahead for determining accurate outcomes of this elusive phase of evolution.

  6. Synchrotron Imaging Computations on the Grid without the Computing Element

    International Nuclear Information System (INIS)

    Curri, A; Pugliese, R; Borghes, R; Kourousias, G

    2011-01-01

    Besides the heavy use of the Grid in the Synchrotron Radiation Facility (SRF) Elettra, additional special requirements from the beamlines had to be satisfied through a novel solution that we present in this work. In the traditional Grid Computing paradigm the computations are performed on the Worker Nodes of the grid element known as the Computing Element. A Grid middleware extension that our team has been working on is that of the Instrument Element. In general it is used to Grid-enable instrumentation, and it can be seen as a neighbouring concept to that of the traditional Control Systems. As a further extension we demonstrate the Instrument Element as the steering mechanism for a series of computations. In our deployment it interfaces a Control System that manages a series of computationally demanding Scientific Imaging tasks in an online manner. The instrument control in Elettra is done through a suitable Distributed Control System, a common approach in the SRF community. The applications that we present are for a beamline working in medical imaging. The solution resulted in a substantial improvement of a Computed Tomography workflow. The near-real-time requirements could not have been easily satisfied by our Grid's middleware (gLite) due to the various latencies that often occurred during the job submission and queuing phases. Moreover, the required deployment of a set of TANGO devices could not have been done in a standard gLite WN. Besides the avoidance of certain core Grid components, the Grid Security infrastructure has been utilised in the final solution

  7. Computing quantum discord is NP-complete

    International Nuclear Information System (INIS)

    Huang, Yichen

    2014-01-01

    We study the computational complexity of quantum discord (a measure of quantum correlation beyond entanglement), and prove that computing quantum discord is NP-complete. Therefore, quantum discord is computationally intractable: the running time of any algorithm for computing quantum discord is believed to grow exponentially with the dimension of the Hilbert space so that computing quantum discord in a quantum system of moderate size is not possible in practice. As by-products, some entanglement measures (namely entanglement cost, entanglement of formation, relative entropy of entanglement, squashed entanglement, classical squashed entanglement, conditional entanglement of mutual information, and broadcast regularization of mutual information) and constrained Holevo capacity are NP-hard/NP-complete to compute. These complexity-theoretic results are directly applicable in common randomness distillation, quantum state merging, entanglement distillation, superdense coding, and quantum teleportation; they may offer significant insights into quantum information processing. Moreover, we prove the NP-completeness of two typical problems: linear optimization over classical states and detecting classical states in a convex set, providing evidence that working with classical states is generically computationally intractable. (paper)

  8. A summary of numerical computation for special functions

    International Nuclear Information System (INIS)

    Zhang Shanjie

    1992-01-01

    In the paper, special functions frequently encountered in science and engineering calculations are introduced. The computation of the values of Bessel functions and elliptic integrals is taken as an example, and some common algorithms for computing most special functions, such as series expansion for small argument, asymptotic approximation for large argument, polynomial approximation, recurrence formulas and iteration methods, are discussed. In addition, the determination of zeros of some special functions and other questions related to numerical computation are also discussed
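
    The two regimes mentioned for evaluating special functions, a power series for small argument and an asymptotic approximation for large argument, can be illustrated for the Bessel function J0 with the Python sketch below; the switch point and tolerance are arbitrary illustrative choices:

    import math

    def bessel_j0(x, switch=8.0):
        """J0(x) via its power series for small x and the leading asymptotic form for large x."""
        x = abs(x)
        if x < switch:
            # series: J0(x) = sum_{k>=0} (-1)^k (x/2)^(2k) / (k!)^2
            term, total, k = 1.0, 1.0, 0
            while abs(term) > 1e-15:
                k += 1
                term *= -(x / 2.0) ** 2 / (k * k)
                total += term
            return total
        # asymptotic form: J0(x) ~ sqrt(2/(pi x)) * cos(x - pi/4) for large x
        return math.sqrt(2.0 / (math.pi * x)) * math.cos(x - math.pi / 4.0)

    for x in (0.5, 2.0, 10.0):
        print(x, bessel_j0(x))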

  9. Engineering applications of soft computing

    CERN Document Server

    Díaz-Cortés, Margarita-Arimatea; Rojas, Raúl

    2017-01-01

    This book bridges the gap between Soft Computing techniques and their applications to complex engineering problems. In each chapter we endeavor to explain the basic ideas behind the proposed applications in an accessible format for readers who may not possess a background in some of the fields. Therefore, engineers or practitioners who are not familiar with Soft Computing methods will appreciate that the techniques discussed go beyond simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. At the same time, the book will show members of the Soft Computing community how engineering problems are now being solved and handled with the help of intelligent approaches. Highlighting new applications and implementations of Soft Computing approaches in various engineering contexts, the book is divided into 12 chapters. Further, it has been structured so that each chapter can be read independently of the others.

  10. Common mental disorders among medical students in Jimma University, SouthWest Ethiopia.

    Science.gov (United States)

    Kerebih, Habtamu; Ajaeb, Mohammed; Hailesilassie, Hailemariam

    2017-09-01

    Medical students are at risk of common mental disorders due to difficulties of adjustment to the medical school environment, exposure to death and human suffering. However there is limited data on this aspect. Therefore, the current study assessed the magnitude of common mental disorders and contributing factors among medical students. An institutional based cross-sectional study was conducted from May 12-16, 2015 using stratified sampling technique. Three hundred and five medical students participated in the study. Common mental disorders were assessed using the self-reported questionnaire (SRQ-20). Logistic regression analysis was used to identify factors associated with common mental disorders among students. Adjusted odds ratios with 95% confidence interval were computed to determine the level of significance. Prevalence of common mental disorders among medical students was 35.2%. Being female, younger age, married, having less than 250 birr monthly pocket money, attending pre-clinical class, khat chewing, smoking cigarettes, alcohol drinking and ganja/shisha use were significantly associated with common mental disorders. The overall prevalence of common mental disorders among medical students was high. Therefore, it is essential to institute effective intervention strategies giving emphasis to contributing factors to common mental disorders.

  11. Computed Tomography evaluation of maxillofacial injuries

    Directory of Open Access Journals (Sweden)

    V Natraj Prasad

    2017-01-01

    Full Text Available Background & Objectives: The maxillofacial region, a complex anatomical structure, can be evaluated by conventional (plain films), Tomography, Multidetector Computed Tomography, Three-Dimensional Computed Tomography, Orthopantomogram and Magnetic Resonance Imaging. The study was conducted with the objective of describing various forms of maxillofacial injuries, the imaging features of different types of maxillofacial fractures and the advantage of using Three-Dimensional Computed Tomography reconstructed images. Materials & Methods: A hospital based cross-sectional study was conducted among 50 patients from April 2014 to September 2016 using a Toshiba Aquilion Prime 160 slice Multi Detector Computed Tomography scanner. Results: Maxillofacial fractures were significantly more frequent in the male population (88%) than in the female population (12%). Road traffic accidents were the most common cause of injury, others being physical assault and fall from height. Fractures were most common in the 31-40 years (26%) and 21-30 years (24%) age groups. The maxillary sinus was the commonest fracture site (36%), followed by the nasal bone and zygomatic bone (30%), and the mandible and orbital bones (28%). Soft tissue swelling was the commonest associated finding. Three-dimensional (3D) images, compared to the axial scans, missed some fractures. However, the extension of the complex fracture lines and the degree of displacement were more accurately assessed. Complex fractures found were Le Fort (6%) and naso-orbito-ethmoid (4%) fractures. Conclusion: Proper evaluation of the complex anatomy of the facial bones requires Multidetector Computed Tomography, which offers excellent spatial resolution enabling multiplanar reformations and three-dimensional reconstructions for enhanced diagnostic accuracy and surgical planning.

  12. APC: A new code for Atmospheric Polarization Computations

    International Nuclear Information System (INIS)

    Korkin, Sergey V.; Lyapustin, Alexei I.; Rozanov, Vladimir V.

    2013-01-01

    A new polarized radiative transfer code Atmospheric Polarization Computations (APC) is described. The code is based on separation of the diffuse light field into anisotropic and smooth (regular) parts. The anisotropic part is computed analytically. The smooth regular part is computed numerically using the discrete ordinates method. Vertical stratification of the atmosphere, common types of bidirectional surface reflection and scattering by spherical particles or spheroids are included. A particular consideration is given to computation of the bidirectional polarization distribution function (BPDF) of the waved ocean surface. -- Highlights: •A new code, APC, has been developed. •The code was validated against well-known codes. •The BPDF for an arbitrary Mueller matrix is computed

  13. Light reflection models for computer graphics.

    Science.gov (United States)

    Greenberg, D P

    1989-04-14

    During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.
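
    The abstract above distinguishes early direct-lighting algorithms from later global-illumination techniques such as ray tracing, radiosity and Monte Carlo simulation. As a minimal illustration of the direct-lighting idea only (not any specific historical algorithm), the sketch below evaluates Lambertian diffuse shading at a single surface point; the normal, light direction, intensity and albedo are arbitrary example values.

```python
import numpy as np

def lambertian_shade(normal, light_dir, light_intensity, albedo):
    """Direct diffuse lighting at one surface point: I = albedo * E * max(0, N.L)."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    return albedo * light_intensity * max(0.0, float(np.dot(n, l)))

# Hypothetical surface point lit by a single directional light source.
print(lambertian_shade(np.array([0.0, 0.0, 1.0]),
                       np.array([0.3, 0.3, 0.9]),
                       light_intensity=1.0,
                       albedo=0.6))
```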

  14. International Symposium on Complex Computing-Networks

    CERN Document Server

    Sevgi, L; CCN2005; Complex computing networks: Brain-like and wave-oriented electrodynamic algorithms

    2006-01-01

    This book uniquely combines new advances in electromagnetic theory and circuits-and-systems theory. It integrates both fields with regard to computational aspects of common interest. Emphasized subjects are those methods which mimic brain-like and electrodynamic behaviour; among these are cellular neural networks, chaos and chaotic dynamics, attractor-based computation and stream ciphers. The book contains carefully selected contributions from the Symposium CCN2005. Pictures from the bestowal of Honorary Doctorate degrees to Leon O. Chua and Leopold B. Felsen are included.

  15. Wave-equation Migration Velocity Analysis Using Plane-wave Common Image Gathers

    KAUST Repository

    Guo, Bowen

    2017-06-01

    Wave-equation migration velocity analysis (WEMVA) based on subsurface-offset, angle domain or time-lag common image gathers (CIGs) requires significant computational and memory resources because it computes higher dimensional migration images in the extended image domain. To mitigate this problem, a WEMVA method using plane-wave CIGs is presented. Plane-wave CIGs reduce the computational cost and memory storage because they are directly calculated from prestack plane-wave migration, and the number of plane waves is often much smaller than the number of shots. In the case of an inaccurate migration velocity, the moveout of plane-wave CIGs is automatically picked by a semblance analysis method, which is then linked to the migration velocity update by a connective function. Numerical tests on two synthetic datasets and a field dataset validate the efficiency and effectiveness of this method.
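
    The automatic moveout picking mentioned above relies on semblance analysis. Purely as an illustration of the conventional semblance coefficient (not the connective-function scheme of the paper), the sketch below computes semblance along the time axis of a moveout-corrected gather; the window length and the toy gather are arbitrary assumptions.

```python
import numpy as np

def semblance(gather, win=5):
    """Semblance coefficient along the time axis of a gather.

    gather: 2D array (n_traces, n_samples), already moveout-corrected for a
    candidate velocity.  Returns a 1D array of semblance values in [0, 1].
    """
    n_traces, _ = gather.shape
    stack_sq = np.sum(gather, axis=0) ** 2          # (sum over traces)^2
    sq_stack = np.sum(gather ** 2, axis=0)          # sum of squares over traces
    kernel = np.ones(win)
    num = np.convolve(stack_sq, kernel, mode="same")
    den = n_traces * np.convolve(sq_stack, kernel, mode="same") + 1e-12
    return num / den

# Hypothetical gather: a flat (coherent) event plus noise scores close to 1.
rng = np.random.default_rng(1)
gather = np.tile(np.sin(np.linspace(0, 4 * np.pi, 200)), (12, 1))
gather += 0.1 * rng.standard_normal(gather.shape)
print(semblance(gather).max())
```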

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08 (CCRC08). CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and the Tier-1s. In addition, the capa...

  17. A Simulated Annealing Algorithm for Maximum Common Edge Subgraph Detection in Biological Networks

    DEFF Research Database (Denmark)

    Larsen, Simon; Alkærsig, Frederik G.; Ditzel, Henrik

    2016-01-01

    Network alignment is a challenging computational problem that identifies node or edge mappings between two or more networks, with the aim to unravel common patterns among them. Pairwise network alignment is already intractable, making multiple network comparison even more difficult. Here, we intr...
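
    The abstract is truncated, so the authors' specific annealing scheme is not reproduced here. As a generic, hedged illustration of the simulated annealing technique named in the title, the sketch below runs a standard annealing loop over a toy objective; the neighbourhood move, the scoring function and the cooling schedule are hypothetical stand-ins for a real maximum common edge subgraph score over node mappings.

```python
import math
import random

def simulated_annealing(initial, score, neighbor, t0=1.0, cooling=0.995, steps=5000):
    """Generic simulated-annealing loop: accept worse solutions with
    probability exp(delta / T) so the search can escape local optima."""
    current, best = initial, initial
    cur_s = best_s = score(initial)
    t = t0
    for _ in range(steps):
        cand = neighbor(current)
        cand_s = score(cand)
        delta = cand_s - cur_s
        if delta >= 0 or random.random() < math.exp(delta / t):
            current, cur_s = cand, cand_s
            if cur_s > best_s:
                best, best_s = current, cur_s
        t *= cooling
    return best, best_s

# Toy usage: maximize a hypothetical score over permutations of 6 labels,
# standing in for a node mapping between two small networks.
labels = list(range(6))
def toy_score(perm):            # hypothetical objective, not a real MCES score
    return -sum(abs(i - p) for i, p in enumerate(perm))
def swap_two(perm):
    i, j = random.sample(range(len(perm)), 2)
    out = list(perm)
    out[i], out[j] = out[j], out[i]
    return out

random.seed(0)
print(simulated_annealing(labels[::-1], toy_score, swap_two))
```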

  18. From Computer Forensics to Forensic Computing: Investigators Investigate, Scientists Associate

    OpenAIRE

    Dewald, Andreas; Freiling, Felix C.

    2014-01-01

    This paper draws a comparison of fundamental theories in traditional forensic science and the state of the art in current computer forensics, thereby identifying a certain disproportion between the perception of central aspects in common theory and the digital forensics reality. We propose a separation of what is currently demanded of practitioners in digital forensics into a rigorous scientific part on the one hand, and a more general methodology of searching and seizing digital evidence an...

  19. Advances in medical imaging for the diagnosis and management of common genitourinary cancers.

    Science.gov (United States)

    Bagheri, Mohammad H; Ahlman, Mark A; Lindenberg, Liza; Turkbey, Baris; Lin, Jeffrey; Cahid Civelek, Ali; Malayeri, Ashkan A; Agarwal, Piyush K; Choyke, Peter L; Folio, Les R; Apolo, Andrea B

    2017-07-01

    Medical imaging of the 3 most common genitourinary (GU) cancers (prostate adenocarcinoma, renal cell carcinoma, and urothelial carcinoma of the bladder) has evolved significantly during the last decades. The most commonly used imaging modalities for the diagnosis, staging, and follow-up of GU cancers are computed tomography, magnetic resonance imaging (MRI), and positron emission tomography (PET). Multiplanar multidetector computed tomography and multiparametric MRI with diffusion-weighted imaging are the main imaging modalities for renal cell carcinoma and urothelial carcinoma, and although multiparametric MRI is rapidly becoming the main imaging tool in the evaluation of prostate adenocarcinoma, biopsy is still required for diagnosis. Functional and molecular imaging using 18-fluorodeoxyglucose-PET and sodium fluoride-PET are essential for the diagnosis, and especially follow-up, of metastatic GU tumors. This review provides an overview of the latest advances in the imaging of these 3 major GU cancers. Published by Elsevier Inc.

  20. Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions

    Directory of Open Access Journals (Sweden)

    Leanne M. Hirshfield

    2014-01-01

    Full Text Available In today’s technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer users’ cognitive, emotional, and behavioral responses. An experiment was conducted in which participants performed a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS) and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions which had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure users’ perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users’ self-reported levels of suspicion and trust, and they in turn suggest future work that further explores the capability of fNIRS for the measurement of user experience during human-computer interactions.

  1. Computer vision syndrome (CVS) - Thermographic Analysis

    Science.gov (United States)

    Llamosa-Rincón, L. E.; Jaime-Díaz, J. M.; Ruiz-Cardona, D. F.

    2017-01-01

    The use of computers has grown exponentially in recent decades; the possibility of carrying out several tasks for both professional and leisure purposes has contributed to their wide acceptance by users. The consequences and impact of uninterrupted tasks with computer screens or displays on visual health have grabbed researchers' attention. When spending long periods of time in front of a computer screen, human eyes are subjected to great effort, which in turn triggers a set of symptoms known as Computer Vision Syndrome (CVS). The most common of these are blurred vision, visual fatigue and Dry Eye Syndrome (DES) due to inappropriate lubrication of the ocular surface when blinking decreases. An experimental protocol was designed and implemented to perform thermographic studies on healthy human eyes during exposure to computer displays, with the main purpose of comparing the existing differences in temperature variations of healthy ocular surfaces.

  2. A study of computer-related upper limb discomfort and computer vision syndrome.

    Science.gov (United States)

    Sen, A; Richardson, Stanley

    2007-12-01

    Personal computers are one of the commonest office tools in Malaysia today. Their usage, even for three hours per day, leads to a health risk of developing Occupational Overuse Syndrome (OOS), Computer Vision Syndrome (CVS), low back pain, tension headaches and psychosocial stress. The study was conducted to investigate how a multiethnic society in Malaysia is coping with these problems that are increasing at a phenomenal rate in the west. This study investigated computer usage, awareness of ergonomic modifications of computer furniture and peripherals, symptoms of CVS and risk of developing OOS. A cross-sectional questionnaire study of 136 computer users was conducted on a sample population of university students and office staff. A 'Modified Rapid Upper Limb Assessment (RULA) for office work' technique was used for evaluation of OOS. The prevalence of CVS was surveyed incorporating a 10-point scoring system for each of its various symptoms. It was found that many were using standard keyboard and mouse without any ergonomic modifications. Around 50% of those with some low back pain did not have an adjustable backrest. Many users had higher RULA scores of the wrist and neck suggesting increased risk of developing OOS, which needed further intervention. Many (64%) were using refractive corrections and still had high scores of CVS commonly including eye fatigue, headache and burning sensation. The increase of CVS scores (suggesting more subjective symptoms) correlated with increase in computer usage spells. It was concluded that further onsite studies are needed, to follow up this survey to decrease the risks of developing CVS and OOS amongst young computer users.

  3. High-performance computing for airborne applications

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Manuzatto, Andrea; Fairbanks, Tom; Dallmann, Nicholas; Desgeorges, Rose

    2010-01-01

    Recently, there have been attempts to move common satellite tasks to unmanned aerial vehicles (UAVs). UAVs are significantly cheaper to buy than satellites and easier to deploy on an as-needed basis. The more benign radiation environment also allows for an aggressive adoption of state-of-the-art commercial computational devices, which increases the amount of data that can be collected. There are a number of commercial computing devices currently available that are well-suited to high-performance computing. These devices range from specialized computational devices, such as field-programmable gate arrays (FPGAs) and digital signal processors (DSPs), to traditional computing platforms, such as microprocessors. Even though the radiation environment is relatively benign, these devices could be susceptible to single-event effects. In this paper, we will present radiation data for high-performance computing devices in an accelerated neutron environment. These devices include a multi-core digital signal processor, two field-programmable gate arrays, and a microprocessor. From these results, we found that all of these devices are suitable for many airplane environments without reliability problems.

  4. Multi-dimensional two-fluid flow computation. An overview

    International Nuclear Information System (INIS)

    Carver, M.B.

    1992-01-01

    This paper discusses a repertoire of three-dimensional computer programs developed to perform critical analysis of single-phase, two-phase and multi-fluid flow in reactor components. The basic numerical approach to solving the governing equations common to all the codes is presented and the additional constitutive relationships required for closure are discussed. Particular applications are presented for a number of computer codes. (author). 12 refs

  5. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Many computer-based image analysis methods of chest computed tomography (CT) used in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary functions, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologist for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
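
    Several of the measures listed above (the mean CT value of the lungs, density histogram analysis and the density mask technique) reduce to simple aggregations over Hounsfield-unit values inside a lung mask. The sketch below illustrates only that general idea on a synthetic volume; the mask, the -600 HU threshold and the histogram bins are arbitrary assumptions, not values taken from the cited literature.

```python
import numpy as np

def lung_ct_metrics(hu, lung_mask, threshold=-600):
    """Mean lung attenuation, a simple density histogram, and the fraction of
    lung voxels above a density-mask threshold (threshold is illustrative only)."""
    lung = hu[lung_mask]
    mean_hu = lung.mean()
    hist, edges = np.histogram(lung, bins=np.arange(-1000, 101, 50))
    high_attenuation_fraction = np.mean(lung > threshold)
    return mean_hu, (hist, edges), high_attenuation_fraction

# Hypothetical volume: normally aerated lung around -850 HU plus a denser patch.
rng = np.random.default_rng(0)
hu = rng.normal(-850, 60, size=(64, 64, 32))
hu[30:40, 30:40, :] = rng.normal(-400, 80, size=(10, 10, 32))   # fibrotic-like region
mask = np.ones_like(hu, dtype=bool)
mean_hu, _, frac = lung_ct_metrics(hu, mask)
print(round(float(mean_hu), 1), round(float(frac), 3))
```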

  6. EU grid computing effort takes on malaria

    CERN Multimedia

    Lawrence, Stacy

    2006-01-01

    Malaria is the world's most common parasitic infection, affecting more than 500 million people annually and killing more than 1 million. In order to help combat malaria, CERN has launched a grid computing effort (1 page)

  7. Trends in Continuity and Interpolation for Computer Graphics.

    Science.gov (United States)

    Gonzalez Garcia, Francisco

    2015-01-01

    In every computer graphics oriented application today, it is a common practice to texture 3D models as a way to obtain realistic material. As part of this process, mesh texturing, deformation, and visualization are all key parts of the computer graphics field. This PhD dissertation was completed in the context of these three important and related fields in computer graphics. The article presents techniques that improve on existing state-of-the-art approaches related to continuity and interpolation in texture space (texturing), object space (deformation), and screen space (rendering).

  8. ATLAS@Home: Harnessing Volunteer Computing for HEP

    International Nuclear Information System (INIS)

    Adam-Bourdarios, C; Cameron, D; Filipčič, A; Lancon, E; Wu, W

    2015-01-01

    A recent common theme among HEP computing is exploitation of opportunistic resources in order to provide the maximum statistics possible for Monte Carlo simulation. Volunteer computing has been used over the last few years in many other scientific fields and by CERN itself to run simulations of the LHC beams. The ATLAS@Home project was started to allow volunteers to run simulations of collisions in the ATLAS detector. So far many thousands of members of the public have signed up to contribute their spare CPU cycles for ATLAS, and there is potential for volunteer computing to provide a significant fraction of ATLAS computing resources. Here we describe the design of the project, the lessons learned so far and the future plans. (paper)

  9. ATLAS@Home: Harnessing Volunteer Computing for HEP

    CERN Document Server

    Bourdarios, Claire; Filipcic, Andrej; Lancon, Eric; Wu, Wenjing

    2015-01-01

    A recent common theme among HEP computing is exploitation of opportunistic resources in order to provide the maximum statistics possible for Monte-Carlo simulation. Volunteer computing has been used over the last few years in many other scientific fields and by CERN itself to run simulations of the LHC beams. The ATLAS@Home project was started to allow volunteers to run simulations of collisions in the ATLAS detector. So far many thousands of members of the public have signed up to contribute their spare CPU cycles for ATLAS, and there is potential for volunteer computing to provide a significant fraction of ATLAS computing resources. Here we describe the design of the project, the lessons learned so far and the future plans.

  10. Group Awareness and Self-Presentation in Computer-Supported Information Exchange

    Science.gov (United States)

    Kimmerle, Joachim; Cress, Ulrike

    2008-01-01

    A common challenge in many situations of computer-supported collaborative learning is increasing the willingness of those involved to share their knowledge with other group members. As a prototypical situation of computer-supported information exchange, a shared-database setting was chosen for the current study. This information-exchange situation…

  11. A CAMAC-based laboratory computer system

    International Nuclear Information System (INIS)

    Westphal, G.P.

    1975-01-01

    A CAMAC-based laboratory computer network is described. By sharing a common mass memory, it offers distinct advantages over slow and core-consuming single-processor installations. A fast compiler-BASIC, with extensions for CAMAC and real-time operation, provides a convenient means for interactive experiment control.

  12. [Common physicochemical characteristics of endogenous hormones-- liberins and statins].

    Science.gov (United States)

    Zamiatnin, A A; Voronina, O L

    1998-01-01

    The common chemical features of oligopeptide releasing hormones and release-inhibiting hormones were investigated with the aid of computer methods. 339 regulatory molecules of this type were extracted from the EROP-Moscow computer data bank. They contain from 2 to 47 amino acid residues, and their sequences include short sites which apparently play a decisive role in the interactions with the receptors. The analysis of chemical radicals shows that all liberins and statins contain a positively charged group and either a cyclic amino acid radical or a hydrophobic group. The results of this study indicate that most chemical radicals of the hormones are open for interaction with potential receptors of target cells. The mechanism of binding between hormone ligands and receptors, and the conceivable role of amino acid and neurotransmitter radicals in the hormonal properties of liberins and statins, are discussed.
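
    As a loose illustration of the kind of automated sequence screening the abstract describes (not the authors' actual procedure or data), the sketch below checks short peptide strings for the two chemical features mentioned: a positively charged group and a cyclic or hydrophobic side chain. The residue classifications and the example sequences are simplified assumptions.

```python
# Illustrative screen for the two features named in the abstract: a positively
# charged group and a cyclic or hydrophobic side chain.  Residue classes and
# the example sequences are simplified assumptions, not the paper's data.
POSITIVE = set("KRH")                     # lysine, arginine, histidine
CYCLIC_OR_HYDROPHOBIC = set("FWYPHLIVMA") # aromatic/cyclic or hydrophobic residues

def screen(peptide):
    has_positive = any(r in POSITIVE for r in peptide)
    has_cyclic_or_hydrophobic = any(r in CYCLIC_OR_HYDROPHOBIC for r in peptide)
    return has_positive, has_cyclic_or_hydrophobic

for seq in ["QHPG", "AGCKNFFWKT", "GGSS"]:   # hypothetical short peptides
    print(seq, screen(seq))
```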

  13. Common-image gathers using the excitation amplitude imaging condition

    KAUST Repository

    Kalita, Mahesh

    2016-06-06

    Common-image gathers (CIGs) are extensively used in migration velocity analysis. Any defocused events in the subsurface offset domain, or equivalently nonflat events in angle-domain CIGs, are accounted for by revising the migration velocities. However, CIGs from wave-equation methods such as reverse time migration are often expensive to compute, especially in 3D. Using the excitation amplitude imaging condition that simplifies the forward-propagated source wavefield, we have managed to extract extended images for space and time lags in conjunction with prestack reverse time migration. The extended images tend to be cleaner, and the memory cost/disk storage is extensively reduced because we do not need to store the source wavefield. In addition, by avoiding the crosscorrelation calculation, we reduce the computational cost. These features are demonstrated on a linear v(z) model, a two-layer velocity model, and the Marmousi model.
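
    The key idea of the excitation amplitude imaging condition is to keep, for every image point, only the time and amplitude of the source wavefield's peak rather than the whole source wavefield. The sketch below is a minimal NumPy illustration of that idea (not the authors' implementation); the toy wavefields are random arrays used only to show the data flow and shapes.

```python
import numpy as np

def excitation_image(src_wavefield, rcv_wavefield):
    """Excitation-amplitude imaging condition (minimal sketch).

    Both wavefields have shape (nt, nz, nx).  Instead of crosscorrelating the
    full wavefields, keep only the excitation time (time of peak source
    amplitude) and the peak amplitude at each point, then sample the receiver
    wavefield at that single time and scale by the stored amplitude.
    """
    excitation_time = np.argmax(np.abs(src_wavefield), axis=0)            # (nz, nx)
    peak_amp = np.take_along_axis(src_wavefield,
                                  excitation_time[None, ...], axis=0)[0]
    rcv_at_peak = np.take_along_axis(rcv_wavefield,
                                     excitation_time[None, ...], axis=0)[0]
    safe_amp = np.where(np.abs(peak_amp) > 1e-12, peak_amp, 1.0)
    return np.where(np.abs(peak_amp) > 1e-12, rcv_at_peak / safe_amp, 0.0)

# Toy wavefields, purely to illustrate shapes and data flow.
rng = np.random.default_rng(0)
src = rng.standard_normal((100, 20, 30))
rcv = rng.standard_normal((100, 20, 30))
print(excitation_image(src, rcv).shape)
```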

  14. Computed tomography after radical pancreaticoduodenectomy (Whipple's procedure)

    International Nuclear Information System (INIS)

    Smith, S.L.; Hampson, F.; Duxbury, M.; Rae, D.M.; Sinclair, M.T.

    2008-01-01

    Whipple's procedure (radical pancreaticoduodenectomy) is currently the only curative option for patients with periampullary malignancy. The surgery is highly complex and involves multiple anastomoses. Complications are common and can lead to significant postoperative morbidity. Early detection and treatment of complications is vital, and high-quality multidetector computed tomography (MDCT) is currently the best method of investigation. This review outlines the surgical technique and illustrates the range of normal postoperative appearances together with the common complications

  15. Grid Computing Making the Global Infrastructure a Reality

    CERN Document Server

    Fox, Geoffrey C; Hey, Anthony J G

    2003-01-01

    Grid computing is applying the resources of many computers in a network to a single problem at the same time. Grid computing appears to be a promising trend for three reasons: (1) its ability to make more cost-effective use of a given amount of computer resources; (2) as a way to solve problems that can't be approached without an enormous amount of computing power; and (3) because it suggests that the resources of many computers can be cooperatively and perhaps synergistically harnessed and managed as a collaboration toward a common objective. A number of corporations, professional groups, university consortiums, and other groups have developed or are developing frameworks and software for managing grid computing projects. The European Community (EU) is sponsoring a project for a grid for high-energy physics, earth observation, and biology applications. In the United States, the National Technology Grid is prototyping a computational grid for infrastructure and an access grid for people. Sun Microsystems offers Gri...

  16. An approach to quantum-computational hydrologic inverse analysis.

    Science.gov (United States)

    O'Malley, Daniel

    2018-05-02

    Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.
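
    Quantum annealers accept problems posed as quadratic unconstrained binary optimization (QUBO). Purely as an illustration of that formulation (unrelated to the hydrologic problems or hardware in the paper), the sketch below casts a tiny binary linear inverse problem as a QUBO and solves it by exhaustive enumeration in place of an annealer; the matrices, binary unknowns and problem size are hypothetical.

```python
import itertools
import numpy as np

def least_squares_qubo(A, b):
    """QUBO matrix Q such that x^T Q x = ||Ax - b||^2 - b^T b for binary x.
    Using x_i^2 = x_i, the linear term -2 A^T b folds into the diagonal."""
    return A.T @ A + np.diag(-2.0 * (A.T @ b))

def brute_force_qubo(Q):
    """Enumerate all binary vectors; a quantum annealer would sample this
    minimum instead of enumerating."""
    best_x, best_e = None, np.inf
    for bits in itertools.product([0, 1], repeat=Q.shape[0]):
        x = np.array(bits, dtype=float)
        e = x @ Q @ x
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# Hypothetical toy inverse problem: recover a binary indicator vector x
# (standing in for a coarse permeability field) from observations b = A x.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 6))
x_true = np.array([1, 0, 1, 1, 0, 0], dtype=float)
b = A @ x_true
x_hat, _ = brute_force_qubo(least_squares_qubo(A, b))
print(x_hat, x_true)
```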

  17. The prevalence of computer and Internet addiction among pupils.

    Science.gov (United States)

    Zboralski, Krzysztof; Orzechowska, Agata; Talarowska, Monika; Darmosz, Anna; Janiak, Aneta; Janiak, Marcin; Florkowski, Antoni; Gałecki, Piotr

    2009-02-02

    Media have an influence on the human psyche similar to the addictive actions of psychoactive substances or gambling. Computer overuse is claimed to be a cause of psychiatric disturbances such as computer and Internet addiction. It has not yet been recognized as a disease, but it evokes increasing controversy and results in mental disorders commonly defined as computer and Internet addiction. This study was based on a diagnostic survey in which 120 subjects participated. The participants were pupils of three kinds of schools: primary, middle, and secondary school (high school). Information for this study was obtained from a questionnaire prepared by the authors as well as the State-Trait Anxiety Inventory (STAI) and the Psychological Inventory of Aggression Syndrome (IPSA-II). The results confirmed that every fourth pupil was addicted to the Internet. Internet addiction was very common among the youngest users of computers and the Internet, especially those who had no brothers and sisters or came from families with some kind of problems. Moreover, more frequent use of the computer and the Internet was connected with higher levels of aggression and anxiety. Because computer and Internet addiction already constitute a real danger, it is worth considering preventive activities to treat this phenomenon. It is also necessary to make the youth and their parents aware of the dangers of uncontrolled Internet use and pay attention to behavior connected with Internet addiction.

  18. Can Tablet Computers Enhance Faculty Teaching?

    Science.gov (United States)

    Narayan, Aditee P; Whicker, Shari A; Benjamin, Robert W; Hawley, Jeffrey; McGann, Kathleen A

    2015-06-01

    Learner benefits of tablet computer use have been demonstrated, yet there is little evidence regarding faculty tablet use for teaching. Our study sought to determine if supplying faculty with tablet computers and peer mentoring provided benefits to learners and faculty beyond that of non-tablet-based teaching modalities. We provided faculty with tablet computers and three 2-hour peer-mentoring workshops on tablet-based teaching. Faculty used tablets to teach, in addition to their current, non-tablet-based methods. Presurveys, postsurveys, and monthly faculty surveys assessed feasibility, utilization, and comparisons to current modalities. Learner surveys assessed perceived effectiveness and comparisons to current modalities. All feedback received from open-ended questions was reviewed by the authors and organized into categories. Of 15 eligible faculty, 14 participated. Each participant attended at least 2 of the 3 workshops, with 10 to 12 participants at each workshop. All participants found the workshops useful, and reported that the new tablet-based teaching modality added value beyond that of current teaching methods. Respondents developed the following tablet-based outputs: presentations, photo galleries, evaluation tools, and online modules. Of the outputs, 60% were used in the ambulatory clinics, 33% in intensive care unit bedside teaching rounds, and 7% in inpatient medical unit bedside teaching rounds. Learners reported that common benefits of tablet computers were: improved access/convenience (41%), improved interactive learning (38%), and improved bedside teaching and patient care (13%). A common barrier faculty identified was inconsistent wireless access (14%), while no barriers were identified by the majority of learners. Providing faculty with tablet computers and having peer-mentoring workshops to discuss their use was feasible and added value.

  19. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang; Germain, Cé cile; Sebag, Michè le

    2010-01-01

    Detecting changes is a common issue in many application fields due to the non-stationary distribution of the application data, e.g., sensor network signals, web logs and grid-running logs. Toward Autonomic Grid Computing, adaptively detecting
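
    The abstract above is truncated, so the adaptive detection method it proposes is not shown. As a generic, hedged illustration of online change detection over a non-stationary stream (not the paper's method), the sketch below implements the Page-Hinkley test for an upward shift in the mean; the delta and threshold parameters and the simulated stream are arbitrary.

```python
import random

def page_hinkley(stream, delta=0.005, threshold=50.0):
    """Page-Hinkley test for an upward shift in the mean of a data stream.
    Returns the index at which a change is flagged, or None."""
    mean = 0.0
    cum = 0.0
    cum_min = 0.0
    for t, x in enumerate(stream, start=1):
        mean += (x - mean) / t            # running mean of the stream
        cum += x - mean - delta           # cumulative deviation from the mean
        cum_min = min(cum_min, cum)
        if cum - cum_min > threshold:     # drift exceeds the tolerance
            return t
    return None

# Hypothetical stream: stationary noise whose mean jumps upward at index 200.
random.seed(0)
stream = [random.gauss(0.0, 1.0) for _ in range(200)] + \
         [random.gauss(1.5, 1.0) for _ in range(200)]
print(page_hinkley(stream))
```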

  20. Linking computers for science

    CERN Multimedia

    2005-01-01

    After the success of SETI@home, many other scientists have found computer power donated by the public to be a valuable resource - and sometimes the only possibility to achieve their goals. In July, representatives of several “public resource computing” projects came to CERN to discuss technical issues and R&D activities on the common computing platform they are using, BOINC. This photograph shows the LHC@home screen-saver, which uses the BOINC platform: the dots represent protons and the position of the status bar indicates the progress of the calculations. This summer, CERN hosted the first “pangalactic workshop” on BOINC (Berkeley Open Infrastructure for Network Computing). BOINC is modelled on SETI@home, which millions of people have downloaded to help search for signs of extraterrestrial intelligence in radio-astronomical data. BOINC provides a general-purpose framework for scientists to adapt their software to, so that the public can install and run it. An important part of BOINC is managing the...

  1. Security in Computer Applications

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development. The last part of the lecture covers some miscellaneous issues like the use of cryptography, rules for networking applications, and social engineering threats. This lecture was first given on Thursd...

  2. Detailed requirements document for common software of shuttle program information management system

    Science.gov (United States)

    Everette, J. M.; Bradfield, L. D.; Horton, C. L.

    1975-01-01

    Common software was investigated as a method for minimizing the development and maintenance cost of the shuttle program information management system (SPIMS) applications while reducing the time-frame of their development. Those requirements satisfying these criteria are presented along with the stand-alone modules which may be used directly by applications. The SPIMS applications, operating on the CYBER 74 computer, are specialized information management systems which use System 2000 as a data base manager. Common software provides the features to support user interactions on a CRT terminal using form input and command response capabilities. These features are available as subroutines to the applications.

  3. Evolution of Cloud Storage as Cloud Computing Infrastructure Service

    OpenAIRE

    Rajan, Arokia Paul; Shanmugapriyaa

    2013-01-01

    Enterprises are driving towards less cost, more availability, agility, managed risk - all of which is accelerated towards Cloud Computing. Cloud is not a particular product, but a way of delivering IT services that are consumable on demand, elastic to scale up and down as needed, and follow a pay-for-usage model. Out of the three common types of cloud computing service models, Infrastructure as a Service (IaaS) is a service model that provides servers, computing power, network bandwidth and S...

  4. Ocular problems of computer vision syndrome: Review

    Directory of Open Access Journals (Sweden)

    Ayakutty Muni Raja

    2015-01-01

    Full Text Available Nowadays, ophthalmologists are facing a new group of patients having eye problems related to prolonged and excessive computer use. When the demand for near work exceeds the normal ability of the eye to perform the job comfortably, one develops discomfort, and prolonged exposure leads to a cascade of reactions that can be put together as computer vision syndrome (CVS). In India, the computer-using population is more than 40 million, and 80% have discomfort due to CVS. Eye strain, headache, blurring of vision and dryness are the most common symptoms. Workstation modification, voluntary blinking, adjustment of the brightness of the screen and breaks in between can reduce CVS.

  5. Java and its future in biomedical computing.

    Science.gov (United States)

    Rodgers, R P

    1996-01-01

    Java, a new object-oriented computing language related to C++, is receiving considerable attention due to its use in creating network-sharable, platform-independent software modules (known as "applets") that can be used with the World Wide Web. The Web has rapidly become the most commonly used information-retrieval tool associated with the global computer network known as the Internet, and Java has the potential to further accelerate the Web's application to medical problems. Java's potentially wide acceptance due to its Web association and its own technical merits also suggests that it may become a popular language for non-Web-based, object-oriented computing. PMID:8880677

  6. Computer Anxiety, Academic Stress, and Academic Procrastination on College Students

    OpenAIRE

    Wahyu Rahardjo; Juneman Juneman; Yeni Setiani

    2013-01-01

    Academic procrastination is fairly common among college students. A lack of understanding of how to make the best use of computer technology may lead to anxiety about operating computers and hence cause postponement in completing course assignments related to computer operation. On the other hand, failure to achieve certain academic targets expected by parents and/or the students themselves also makes students less focused and leads to a tendency to postpone many completions of...

  7. The SIMRAND 1 computer program: Simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The SIMRAND I Computer Program (Version 5.0 x 0.3), written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles, is described. The SIMRAND I Computer Program comprises eleven modules: a main routine and ten subroutines. Two additional files are used at compile time; one inserts the system or task equations into the source code, while the other inserts the dimension statements and common blocks. The SIMRAND I Computer Program can be run on most microcomputers or mainframe computers with only minor modifications to the computer code.

  8. Comparison and selection of client computer in nuclear instrument

    International Nuclear Information System (INIS)

    Ma Guizhen; Xie Yanhui; Peng Jing; Xu Feiyan

    2012-01-01

    Modern nuclear instruments provide many functions and require a high degree of informatization. Through close matching of the host computer and the client computer, the data processing functions can be carried out. This article puts forward several options for the client computer of a general nuclear instrument. The functions and features of several common client computers, such as FPGA, ARM and DSP, are analyzed and compared, and their scopes of application are discussed. At the same time, using a practical design as an example, the ideas behind selecting a client computer are described. This article can be used as a reference for the hardware design of the data acquisition and processing unit in nuclear instruments. (authors)

  9. Proposals for common definitions of reference points in gynecological brachytherapy

    International Nuclear Information System (INIS)

    Chassagne, D.; Horiot, J.C.

    1977-01-01

    In May 1975, the report of the European Curietherapy Group made recommendations for computer-based dosimetry in gynecology: use of reference points (a lymphatic trapezoid figure with 6 points and pelvic-wall points, all referred to bony structures); use of critical-organ reference points (maximum rectum dose, bladder dose, mean rectal dose); and use of the 6,000 rads reference isodose described by its height, width, and thickness dimensions. These proposals are the basis of a common language in gynecological brachytherapy. [fr]

  10. Critical services in the LHC computing

    International Nuclear Information System (INIS)

    Sciaba, A

    2010-01-01

    The LHC experiments (ALICE, ATLAS, CMS and LHCb) rely for the data acquisition, processing, distribution, analysis and simulation on complex computing systems, running using a variety of services, provided by the experiments, the Worldwide LHC Computing Grid and the different computing centres. These services range from the most basic (network, batch systems, file systems) to the mass storage services or the Grid information system, up to the different workload management systems, data catalogues and data transfer tools, often internally developed in the collaborations. In this contribution we review the status of the services most critical to the experiments by quantitatively measuring their readiness with respect to the start of the LHC operations. Shortcomings are identified and common recommendations are offered.

  11. Understanding failures in petascale computers

    International Nuclear Information System (INIS)

    Schroeder, Bianca; Gibson, Garth A

    2007-01-01

    With petascale computers only a year or two away there is a pressing need to anticipate and compensate for a probable increase in failure and application interruption rates. Researchers, designers and integrators have available to them far too little detailed information on the failures and interruptions that even smaller terascale computers experience. The information that is available suggests that application interruptions will become far more common in the coming decade, and the largest applications may surrender large fractions of the computer's resources to taking checkpoints and restarting from a checkpoint after an interruption. This paper reviews sources of failure information for compute clusters and storage systems, projects failure rates and the corresponding decrease in application effectiveness, and discusses coping strategies such as application-level checkpoint compression and system level process-pairs fault-tolerance for supercomputing. The need for a public repository for detailed failure and interruption records is particularly concerning, as projections from one architectural family of machines to another are widely disputed. To this end, this paper introduces the Computer Failure Data Repository and issues a call for failure history data to publish in it

  12. Computer Assisted Language Learning” (CALL

    Directory of Open Access Journals (Sweden)

    Nazlı Gündüz

    2005-10-01

    Full Text Available This article will provide an overview of computers; an overview of the history of CALL, its pros and cons; the internet, World Wide Web and multimedia; and research related to the uses of computers in the language classroom. It also aims to provide some background for beginners on using the Internet in language classes today. It discusses some of the common types of Internet activities that are being used today, what the minimum requirements are for using the Internet for language learning, and some easy activities you can adapt for your classes. Some special terminology related to computers will also be used in this paper. For example, computer assisted language learning (CALL) refers to the sets of instructions which need to be loaded into the computer for it to be able to work in the language classroom. It should be borne in mind that CALL does not refer to the use of a computer by a teacher to type out a worksheet or a class list, or to preparing his/her own teaching alone. Hardware refers to any computer equipment used, including the computer itself, the keyboard, the screen (or monitor), the disc drive, and the printer. Software (computer programs) refers to the sets of instructions which need to be loaded into the computer for it to be able to work.

  13. Fourth Thematic CERN School of Computing

    CERN Multimedia

    Alberto Pace, CSC Director

    2016-01-01

    The Fourth Thematic School of Computing (tCSC2016) takes place this year in Split, Croatia, from 22 to 28 May 2016.   The theme is "Efficient and Parallel Processing of Scientific Data", looking at: The challenge of scientific data processing: commonalities, analogies and the main differences between different sciences. Size of scientific software projects. Parallelism and asynchronism: computation and I/O. The School is open to postgraduate students and research workers with a few years' experience in elementary particle physics, computing, engineering or related fields.  All applicants are welcome, including former and future participants in the main CSC summer school. Registration will close on 15 February and participation is limited to 24 students. To register, please go here. About: The Thematic Schools are part of the annual series of CERN Schools of Computing, to promote advanced learning and knowledge exchange on the subject of scientific compu...

  14. Algorithms for image processing and computer vision

    CERN Document Server

    Parker, J R

    2010-01-01

    A cookbook of algorithms for common image processing applications Thanks to advances in computer hardware and software, algorithms have been developed that support sophisticated image processing without requiring an extensive background in mathematics. This bestselling book has been fully updated with the newest of these, including 2D vision methods in content-based searches and the use of graphics cards as image processing computational aids. It's an ideal reference for software engineers and developers, advanced programmers, graphics programmers, scientists, and other specialists wh

  15. Computer self-efficacy - is there a gender gap in tertiary level introductory computing classes?

    Directory of Open Access Journals (Sweden)

    Shirley Gibbs

    Full Text Available This paper explores the relationship between introductory computing students, self-efficacy, and gender. Since the use of computers has become more common, there has been speculation that the confidence and ability to use them differ between genders. Self-efficacy is an important and useful concept used to describe how a student may perceive their own ability or confidence in using and learning new technology. A survey of students in an introductory computing class has been completed intermittently since the late 1990s. Although some questions have been adapted to meet the changing technology, the aim of the survey has remained unchanged. In this study self-efficacy is measured using two self-rating questions. Students are asked to rate their confidence using a computer and also asked to give their perception of their computing knowledge. This paper examines these two aspects of a person's computer self-efficacy in order to identify any differences that may occur between genders in two introductory computing classes, one in 1999 and the other in 2012. Results from the 1999 survey are compared with those from the survey completed in 2012 and investigated to ascertain if the perception that males were more likely to display higher computer self-efficacy levels than their female classmates does or did exist in a class of this type. Results indicate that while overall there has been a general increase in self-efficacy levels in 2012 compared with 1999, there is no significant gender gap.

  16. Comparative evaluation of the cadaveric, radiographic and computed tomographic anatomy of the heads of green iguana (Iguana iguana), common tegu (Tupinambis merianae) and bearded dragon (Pogona vitticeps).

    Science.gov (United States)

    Banzato, Tommaso; Selleri, Paolo; Veladiano, Irene A; Martin, Andrea; Zanetti, Emanuele; Zotti, Alessandro

    2012-05-11

    Radiology and computed tomography are the most commonly available diagnostic tools for the diagnosis of pathologies affecting the head and skull in veterinary practice. Nevertheless, accurate interpretation of radiographic and CT studies requires a thorough knowledge of the gross and the cross-sectional anatomy. Despite the increasing success of reptiles as pets, only a few reports on their normal imaging features are currently available. The aim of this study is to describe the normal cadaveric, radiographic and computed tomographic features of the heads of the green iguana, tegu and bearded dragon. 6 adult green iguanas, 4 tegus and 3 bearded dragons, and the adult cadavers of 4 green iguanas, 4 tegus and 4 bearded dragons, were included in the study. 2 cadavers were dissected following a stratigraphic approach and 2 cadavers were cross-sectioned for each species. These latter specimens were stored in a freezer (-20°C) until completely frozen. Transversal sections at 5 mm intervals were obtained by means of an electric band-saw. Each section was cleaned and photographed on both sides. Radiographs of the head of each subject were obtained. Pre- and post-contrast computed tomographic studies of the head were performed on all the live animals. CT images were displayed in both bone and soft tissue windows. Individual anatomic structures were first recognised and labelled on the anatomic images and then matched on radiographs and CT images. Radiographic and CT images of the skull provided good detail of the bony structures in all species. In CT, contrast medium injection enabled good detail of the soft tissues to be obtained in the iguana, whereas only the eye was clearly distinguishable from the remaining soft tissues in both the tegu and the bearded dragon. The results provide an atlas of the normal anatomical and in vivo radiographic and computed tomographic features of the heads of lizards, and this may be useful in interpreting any imaging modality involving these

  17. Comparative evaluation of the cadaveric, radiographic and computed tomographic anatomy of the heads of green iguana (Iguana iguana), common tegu (Tupinambis merianae) and bearded dragon (Pogona vitticeps)

    Science.gov (United States)

    2012-01-01

    Background Radiology and computed tomography are the most commonly available diagnostic tools for the diagnosis of pathologies affecting the head and skull in veterinary practice. Nevertheless, accurate interpretation of radiographic and CT studies requires a thorough knowledge of the gross and the cross-sectional anatomy. Despite the increasing success of reptiles as pets, only a few reports on their normal imaging features are currently available. The aim of this study is to describe the normal cadaveric, radiographic and computed tomographic features of the heads of the green iguana, tegu and bearded dragon. Results 6 adult green iguanas, 4 tegus and 3 bearded dragons, and the adult cadavers of 4 green iguanas, 4 tegus and 4 bearded dragons, were included in the study. 2 cadavers were dissected following a stratigraphic approach and 2 cadavers were cross-sectioned for each species. These latter specimens were stored in a freezer (−20°C) until completely frozen. Transversal sections at 5 mm intervals were obtained by means of an electric band-saw. Each section was cleaned and photographed on both sides. Radiographs of the head of each subject were obtained. Pre- and post-contrast computed tomographic studies of the head were performed on all the live animals. CT images were displayed in both bone and soft tissue windows. Individual anatomic structures were first recognised and labelled on the anatomic images and then matched on radiographs and CT images. Radiographic and CT images of the skull provided good detail of the bony structures in all species. In CT, contrast medium injection enabled good detail of the soft tissues to be obtained in the iguana, whereas only the eye was clearly distinguishable from the remaining soft tissues in both the tegu and the bearded dragon. Conclusions The results provide an atlas of the normal anatomical and in vivo radiographic and computed tomographic features of the heads of lizards, and this may be useful in interpreting any

  18. Comparative evaluation of the cadaveric, radiographic and computed tomographic anatomy of the heads of green iguana (Iguana iguana), common tegu (Tupinambis merianae) and bearded dragon (Pogona vitticeps)

    Directory of Open Access Journals (Sweden)

    Banzato Tommaso

    2012-05-01

    Full Text Available Abstract Background Radiology and computed tomography are the most commonly available diagnostic tools for the diagnosis of pathologies affecting the head and skull in veterinary practice. Nevertheless, accurate interpretation of radiographic and CT studies requires a thorough knowledge of the gross and the cross-sectional anatomy. Despite the increasing success of reptiles as pets, only a few reports on their normal imaging features are currently available. The aim of this study is to describe the normal cadaveric, radiographic and computed tomographic features of the heads of the green iguana, tegu and bearded dragon. Results 6 adult green iguanas, 4 tegus and 3 bearded dragons, and the adult cadavers of 4 green iguanas, 4 tegus and 4 bearded dragons, were included in the study. 2 cadavers were dissected following a stratigraphic approach and 2 cadavers were cross-sectioned for each species. These latter specimens were stored in a freezer (−20°C) until completely frozen. Transversal sections at 5 mm intervals were obtained by means of an electric band-saw. Each section was cleaned and photographed on both sides. Radiographs of the head of each subject were obtained. Pre- and post-contrast computed tomographic studies of the head were performed on all the live animals. CT images were displayed in both bone and soft tissue windows. Individual anatomic structures were first recognised and labelled on the anatomic images and then matched on radiographs and CT images. Radiographic and CT images of the skull provided good detail of the bony structures in all species. In CT, contrast medium injection enabled good detail of the soft tissues to be obtained in the iguana, whereas only the eye was clearly distinguishable from the remaining soft tissues in both the tegu and the bearded dragon. Conclusions The results provide an atlas of the normal anatomical and in vivo radiographic and computed tomographic features of the heads of lizards, and this may be

  19. The role of computed tomography in uncertain obstructive jaundice

    International Nuclear Information System (INIS)

    Saito, Yoshihiro; Yoshino, Toyoaki; Takayanagi, Ryuichi; Negishi, Ken; Tanaka, Teruhiko; Ito, Ichiro.

    1985-01-01

    42 patients with uncertain obstructive jaundice were examined by computed tomography (CT). CT correctly diagnosed obstructive jaundice in 97% of 37 proven cases and the accuracy of CT in determining the level of obstruction was also 97%. But the sensitivity of CT in determining the cause of obstructive jaundice was 62.5%, particularly poor in common bile duct stone (61.5%), inflammation of the common bile duct (0%), and common bile duct carcinoma (50%). All cases of diagnosed malignant tumors were inoperable. (author)

  20. Computer Security: Security operations at CERN (4/4)

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Stefan Lueders, PhD, graduated from the Swiss Federal Institute of Technology in Zurich and joined CERN in 2002. Being initially developer of a common safety system used in all four experiments at the Large Hadron Collider, he gathered expertise in cyber-security issues of control systems. Consequently in 2004, he took over responsibilities in securing CERN's accelerator and infrastructure control systems against cyber-threats. Subsequently, he joined the CERN Computer Security Incident Response Team and is today heading this team as CERN's Computer Security Officer with the mandate to coordinate all aspects of CERN's computer security --- office computing security, computer centre security, GRID computing security and control system security --- whilst taking into account CERN's operational needs. Dr. Lueders has presented on these topics at many different occasions to international bodies, governments, and companies, and published several articles. With the prevalence of modern information technologies and...

  1. Personal computers in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.

    1988-01-01

    The advent of the personal computer has created a popular movement which has also made a strong impact on science and engineering. Flexible software environments combined with good computational performance and large storage capacities are becoming available at steadily decreasing costs. Of equal importance, however, is the quality of the user interface offered on many of these products. Graphics and screen interaction is available in ways that were only possible on specialized systems before. Accelerator engineers were quick to pick up the new technology. The first applications were probably for controllers and data gatherers for beam measurement equipment. Others followed, and today it is conceivable to make personal computer a standard component of an accelerator control system. This paper reviews the experience gained at CERN so far and describes the approach taken in the design of the common control center for the SPS and the future LEP accelerators. The design goal has been to be able to integrate personal computers into the accelerator control system and to build the operator's workplace around it. (orig.)

  2. Computer-integrated electric-arc melting process control system

    OpenAIRE

    Дёмин, Дмитрий Александрович

    2014-01-01

    Developing common principles for equipping melting-process automation systems with hardware, and creating on this basis rational variants of computer-integrated electric-arc melting control systems, is a relevant task, since it allows a comprehensive approach to the issue of modernizing the melting sections of workshops. This approach allows one to form the computer-integrated electric-arc furnace control system as part of a queuing system “electric-arc furnace - foundry conveyor” and to consider, when taking ...

  3. Dwindling Numbers of Female Computer Students: What Are We Missing?

    Science.gov (United States)

    Saulsberry, Donna

    2012-01-01

    There is common agreement among researchers that women are under-represented in both 2-year and 4-year collegiate computer study programs. This leads to women being under-represented in the computer industry, which may be limiting the progress of technology developments that will benefit mankind. It may also be depriving women of the opportunity to…

  4. Machine learning and computer vision approaches for phenotypic profiling.

    Science.gov (United States)

    Grys, Ben T; Lo, Dara S; Sahin, Nil; Kraus, Oren Z; Morris, Quaid; Boone, Charles; Andrews, Brenda J

    2017-01-02

    With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. © 2017 Grys et al.
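
    As a minimal, hedged illustration of the pipeline shape described above (segmentation, per-cell feature extraction, then clustering of the feature vectors), the sketch below uses scikit-image and scikit-learn on a tiny synthetic image; the Otsu threshold, the chosen features and the number of clusters are arbitrary assumptions rather than any method recommended in the review.

```python
import numpy as np
from skimage import filters, measure
from sklearn.cluster import KMeans

def phenotypic_profiles(image, n_clusters=2):
    """Segment bright objects, extract simple per-object features and
    cluster them into putative phenotypic groups (illustrative pipeline)."""
    mask = image > filters.threshold_otsu(image)          # crude segmentation
    labels = measure.label(mask)
    props = measure.regionprops(labels, intensity_image=image)
    feats = np.array([[p.area, p.eccentricity, p.mean_intensity] for p in props])
    if len(feats) < n_clusters:
        return feats, None
    clusters = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(feats)
    return feats, clusters

# Synthetic image with two "cell" blobs of different size and intensity.
img = np.zeros((64, 64))
img[10:20, 10:20] = 1.0
img[35:55, 35:55] = 0.5
feats, clusters = phenotypic_profiles(img)
print(feats.shape, clusters)
```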

  5. Present SLAC accelerator computer control system features

    International Nuclear Information System (INIS)

    Davidson, V.; Johnson, R.

    1981-02-01

    The current functional organization and state of software development of the computer control system of the Stanford Linear Accelerator is described. Included is a discussion of the distribution of functions throughout the system, the local controller features, and currently implemented features of the touch panel portion of the system. The functional use of our triplex of PDP11-34 computers sharing common memory is described. Also included is a description of the use of pseudopanel tables as data tables for closed loop control functions

  6. Computer facilities for ISABELLE data handling

    International Nuclear Information System (INIS)

    Kramer, M.A.; Love, W.A.; Miller, R.J.; Zeller, M.

    1977-01-01

    The analysis of data produced by ISABELLE experiments will need a large system of computers. An official group of prospective users and operators of that system should begin planning now. Included in the array will be a substantial computer system at each ISABELLE intersection in use. These systems must include enough computer power to keep experimenters aware of the health of the experiment. This will require at least one very fast, sophisticated processor in the system, the size depending on the experiment. Other features of the intersection systems must be a good, high-speed graphic display, the ability to record data on magnetic tape at 500 to 1000 KB, and a high-speed link to a central computer. The operating system software must support multiple interactive users. A substantially larger-capacity computer system, shared by the six intersection region experiments, must be available with good turnaround for experimenters while ISABELLE is running. A computer support group will be required to maintain the computer system and to provide and maintain software common to all experiments. Special superfast computing hardware or special-function processors constructed with microprocessor circuitry may be necessary both in the data gathering and the data processing work. Thus both the local and central processors should be chosen with the possibility of interfacing such devices in mind.

  7. Misleading Performance Claims in Parallel Computations

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.

    2009-05-29

    In a previous humorous note entitled 'Twelve Ways to Fool the Masses,' I outlined twelve common ways in which performance figures for technical computer systems can be distorted. In this paper and accompanying conference talk, I give a reprise of these twelve 'methods' and give some actual examples that have appeared in peer-reviewed literature in years past. I then propose guidelines for reporting performance, the adoption of which would raise the level of professionalism and reduce the level of confusion, not only in the world of device simulation but also in the larger arena of technical computing.

  8. [Computer program "PANCREAS"].

    Science.gov (United States)

    Jakubowicz, J; Jankowski, M; Szomański, B; Switka, S; Zagórowicz, E; Pertkiewicz, M; Szczygieł, B

    1998-01-01

    Contemporary computer technology allows precise and fast analysis of large databases. Widespread and common use, however, depends on appropriate, user-friendly software, which is usually lacking in specialized medical applications. The aim of this work was to develop an integrated system designed to store, explore and analyze data on patients treated for pancreatic cancer. For that purpose the database administration system MS Visual FoxPro 3.0 was used, and a dedicated application conforming to the ISO 9000 series was developed. The system works under MS Windows 95, with easy adaptation to MS Windows 3.11 or MS Windows NT through its graphical user interface. The system stores personal data, laboratory results, visual and histological analyses, and information on the treatment course and complications. It archives these data and enables the preparation of reports according to individual and statistical needs. Help and security settings also allow the system to be used by those not familiar with computer science.

  9. Comparison on Computed Tomography using industrial items

    DEFF Research Database (Denmark)

    Angel, Jais Andreas Breusch; De Chiffre, Leonardo

    2014-01-01

    In a comparison involving 27 laboratories from 8 countries, measurements on two common industrial items, a polymer part and a metal part, were carried out using X-ray Computed Tomography. All items were measured using coordinate measuring machines before and after circulation, with reference...

  10. A computational investigation of the red and blue shifts in hydrogen

    Indian Academy of Sciences (India)

    The present work reports results of computational investigations of hydrogen bonding, with regard to the most common red shift in the vibrational frequency, as well as the less common blue shift, in several hydrogen-bonded systems. A few new correlations of the frequency shifts with the calculated electrostatic parameters ...

  11. High performance computing and quantum trajectory method in CPU and GPU systems

    International Nuclear Information System (INIS)

    Wiśniewska, Joanna; Sawerwain, Marek; Leoński, Wiesław

    2015-01-01

    Nowadays, dynamic progress in computational techniques allows for the development of various methods which offer significant speed-up of computations, especially those related to problems of quantum optics and quantum computing. In this work, we propose computational solutions which re-implement the quantum trajectory method (QTM) algorithm in modern parallel computation environments in which multi-core CPUs and modern many-core GPUs can be used. In consequence, new computational routines are developed in a more effective way than those applied in other commonly used packages, such as the Quantum Optics Toolbox (QOT) for Matlab or QuTiP for Python.
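
    For readers unfamiliar with the quantum trajectory method, the following is a minimal sketch, not taken from the paper, of how a Monte Carlo trajectory simulation is typically run with QuTiP, one of the packages named above; the damped-cavity model and all parameter values are illustrative assumptions.

      # Minimal quantum trajectory (Monte Carlo) run with QuTiP; the damped
      # cavity, decay rate and trajectory count are arbitrary example choices.
      import numpy as np
      from qutip import basis, destroy, mcsolve

      N = 10                          # Fock-space truncation
      a = destroy(N)                  # annihilation operator of the cavity mode
      H = a.dag() * a                 # harmonic oscillator Hamiltonian
      psi0 = basis(N, 5)              # start with five photons in the cavity
      times = np.linspace(0.0, 10.0, 101)
      c_ops = [np.sqrt(0.1) * a]      # collapse operator: photon loss at rate 0.1

      # Average photon number estimated from 250 stochastic trajectories
      result = mcsolve(H, psi0, times, c_ops, e_ops=[a.dag() * a], ntraj=250)
      print(result.expect[0][:5])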

  12. Hispanic women overcoming deterrents to computer science: A phenomenological study

    Science.gov (United States)

    Herling, Lourdes

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines, what attracts women and minorities to computer science, and addressing the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determines whether being subjected to multiple marginalizations---female and Hispanic---played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but to persist as well. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. The aptitudes participants commonly believed are needed for success in computer science are the Twenty

  13. Age group athletes in inline skating: decrease in overall and increase in master athlete participation in the longest inline skating race in Europe - the Inline One-Eleven.

    Science.gov (United States)

    Teutsch, Uwe; Knechtle, Beat; Rüst, Christoph Alexander; Rosemann, Thomas; Lepers, Romuald

    2013-01-01

    Participation and performance trends in age group athletes have been investigated in endurance and ultraendurance races in swimming, cycling, running, and triathlon, but not in long-distance inline skating. The aim of this study was to investigate trends in participation, age, and performance in the longest inline race in Europe, the Inline One-Eleven over 111 km, held between 1998 and 2009. The total number, age distribution, age at the time of the competition, and race times of male and female finishers at the Inline One-Eleven were analyzed. Overall participation increased until 2003 but decreased thereafter. During the 12-year period, the relative participation in skaters younger than 40 years old decreased while relative participation increased for skaters older than 40 years. The mean top ten skating time was 199 ± 9 minutes (range: 189-220 minutes) for men and 234 ± 17 minutes (range: 211-271 minutes) for women, respectively. The gender difference in performance remained stable at 17% ± 5% across years. To summarize, although the participation of master long-distance inline skaters increased, the overall participation decreased across years in the Inline One-Eleven. The race times of the best female and male skaters stabilized across years with a gender difference in performance of 17% ± 5%. Further studies should focus on the participation in the international World Inline Cup races.

  14. The “Common Solutions” Strategy of the Experiment Support group at CERN for the LHC Experiments

    Science.gov (United States)

    Girone, M.; Andreeva, J.; Barreiro Megino, F. H.; Campana, S.; Cinquilli, M.; Di Girolamo, A.; Dimou, M.; Giordano, D.; Karavakis, E.; Kenyon, M. J.; Kokozkiewicz, L.; Lanciotti, E.; Litmaath, M.; Magini, N.; Negri, G.; Roiser, S.; Saiz, P.; Saiz Santos, M. D.; Schovancova, J.; Sciabà, A.; Spiga, D.; Trentadue, R.; Tuckett, D.; Valassi, A.; Van der Ster, D. C.; Shiers, J. D.

    2012-12-01

    After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments’ computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management monitoring, File Transfer monitoring) and the Site Status Board. This talk focuses primarily on the strategic aspects of providing such common solutions and how this relates to the overall goals of long-term sustainability and the relationship to the various WLCG Technical Evolution Groups. The success of the service components has given us confidence in the process, and has developed the trust of the stakeholders. We are now attempting to expand the development of common solutions into the more critical workflows. The first is a feasibility study of common analysis workflow execution elements between ATLAS and CMS. We look forward to additional common development in the future.

  15. The “Common Solutions” Strategy of the Experiment Support group at CERN for the LHC Experiments

    International Nuclear Information System (INIS)

    Girone, M; Andreeva, J; Barreiro Megino, F H; Campana, S; Cinquilli, M; Di Girolamo, A; Dimou, M; Giordano, D; Karavakis, E; Kenyon, M J; Kokozkiewicz, L; Lanciotti, E; Litmaath, M; Magini, N; Negri, G; Roiser, S; Saiz, P; Saiz Santos, M D; Schovancova, J; Sciabà, A

    2012-01-01

    After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments’ computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management monitoring, File Transfer monitoring) and the Site Status Board. This talk focuses primarily on the strategic aspects of providing such common solutions and how this relates to the overall goals of long-term sustainability and the relationship to the various WLCG Technical Evolution Groups. The success of the service components has given us confidence in the process, and has developed the trust of the stakeholders. We are now attempting to expand the development of common solutions into the more critical workflows. The first is a feasibility study of common analysis workflow execution elements between ATLAS and CMS. We look forward to additional common development in the future.

  16. Finding the most significant common sequence and structure motifs in a set of RNA sequences

    DEFF Research Database (Denmark)

    Gorodkin, Jan; Heyer, L.J.; Stormo, G.D.

    1997-01-01

    We present a computational scheme to locally align a collection of RNA sequences using sequence and structure constraints. In addition, the method searches for the resulting alignments with the most significant common motifs, among all possible collections. The first part utilizes a simplified

  17. Diffuse abnormalities of the trachea: computed tomography findings

    International Nuclear Information System (INIS)

    Marchiori, Edson; Araujo Neto, Cesar de

    2008-01-01

    The aim of this pictorial essay was to present the main computed tomography findings seen in diffuse diseases of the trachea. The diseases studied included amyloidosis, tracheobronchopathia osteochondroplastica, tracheobronchomegaly, laryngotracheobronchial papillomatosis, lymphoma, neurofibromatosis, relapsing polychondritis, Wegener's granulomatosis, tuberculosis, paracoccidioidomycosis, and tracheobronchomalacia. The most common computed tomography finding was thickening of the walls of the trachea, with or without nodules, parietal calcifications, or involvement of the posterior wall. Although computed tomography allows the detection and characterization of diseases of the central airways, and the correlation with clinical data reduces the diagnostic possibilities, bronchoscopy with biopsy remains the most useful procedure for the diagnosis of diffuse lesions of the trachea. (author)

  18. National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Van Arsdall, P.J. LLNL

    1998-01-01

    The NIF design team is developing the Integrated Computer Control System (ICCS), which is based on an object-oriented software framework applicable to event-driven control systems. The framework provides an open, extensible architecture that is sufficiently abstract to construct future mission-critical control systems. The ICCS will become operational when the first 8 out of 192 beams are activated in mid 2000. The ICCS consists of 300 front-end processors attached to 60,000 control points coordinated by a supervisory system. Computers running either Solaris or VxWorks are networked over a hybrid configuration of switched fast Ethernet and asynchronous transfer mode (ATM). ATM carries digital motion video from sensors to operator consoles. Supervisory software is constructed by extending the reusable framework components for each specific application. The framework incorporates services for database persistence, system configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. More than twenty collaborating software applications are derived from the common framework. The framework is interoperable among different kinds of computers and functions as a plug-in software bus by leveraging a common object request brokering architecture (CORBA). CORBA transparently distributes the software objects across the network. Because of the pivotal role played, CORBA was tested to ensure adequate performance

  19. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework

  20. Functional programming for computer vision

    Science.gov (United States)

    Breuel, Thomas M.

    1992-04-01

    Functional programming is a style of programming that avoids the use of side effects (like assignment) and uses functions as first class data objects. Compared with imperative programs, functional programs can be parallelized better, and provide better encapsulation, type checking, and abstractions. This is important for building and integrating large vision software systems. In the past, efficiency has been an obstacle to the application of functional programming techniques in computationally intensive areas such as computer vision. We discuss and evaluate several 'functional' data structures for efficiently representing data structures and objects common in computer vision. In particular, we will address: automatic storage allocation and reclamation issues; abstraction of control structures; efficient sequential update of large data structures; representing images as functions; and object-oriented programming. Our experience suggests that functional techniques are feasible for high-performance vision systems, and that a functional approach greatly simplifies the implementation and integration of vision systems. Examples in C++ and SML are given.
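
    As a small, hedged illustration of one idea mentioned above, representing images as functions, the sketch below is written in Python rather than the C++ and SML used in the paper; the checkerboard image and the operations on it are invented for this example.

      # An image is modelled as a function from coordinates to intensity; new
      # images are built by composing functions instead of mutating pixel buffers.
      from typing import Callable

      Image = Callable[[int, int], float]

      def shift(img: Image, dx: int, dy: int) -> Image:
          # a shifted image is just another function; no pixels are copied
          return lambda x, y: img(x - dx, y - dy)

      def threshold(img: Image, t: float) -> Image:
          return lambda x, y: 1.0 if img(x, y) >= t else 0.0

      checker: Image = lambda x, y: float((x // 8 + y // 8) % 2)
      binarized = threshold(shift(checker, 3, 0), 0.5)
      print(binarized(0, 0), binarized(5, 0))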

  1. Computational methods for fluid dynamics

    CERN Document Server

    Ferziger, Joel H

    2002-01-01

    In its 3rd revised and extended edition the book offers an overview of the techniques used to solve problems in fluid mechanics on computers and describes in detail those most often used in practice. Included are advanced methods in computational fluid dynamics, like direct and large-eddy simulation of turbulence, multigrid methods, parallel computing, moving grids, structured, block-structured and unstructured boundary-fitted grids, free surface flows. The 3rd edition contains a new section dealing with grid quality and an extended description of discretization methods. The book shows common roots and basic principles for many different methods. The book also contains a great deal of practical advice for code developers and users, it is designed to be equally useful to beginners and experts. The issues of numerical accuracy, estimation and reduction of numerical errors are dealt with in detail, with many examples. A full-feature user-friendly demo-version of a commercial CFD software has been added, which ca...

  2. Multisubject Learning for Common Spatial Patterns in Motor-Imagery BCI

    Directory of Open Access Journals (Sweden)

    Dieter Devlaminck

    2011-01-01

    Full Text Available Motor-imagery-based brain-computer interfaces (BCIs commonly use the common spatial pattern filter (CSP as preprocessing step before feature extraction and classification. The CSP method is a supervised algorithm and therefore needs subject-specific training data for calibration, which is very time consuming to collect. In order to reduce the amount of calibration data that is needed for a new subject, one can apply multitask (from now on called multisubject machine learning techniques to the preprocessing phase. Here, the goal of multisubject learning is to learn a spatial filter for a new subject based on its own data and that of other subjects. This paper outlines the details of the multitask CSP algorithm and shows results on two data sets. In certain subjects a clear improvement can be seen, especially when the number of training trials is relatively low.
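
    To make the preprocessing step concrete, here is a minimal sketch of the standard single-subject CSP computation that the multisubject method above generalizes; it is not the paper's algorithm, and the toy data, channel count and filter count are arbitrary.

      # Spatial filters as generalized eigenvectors of the two class covariances.
      import numpy as np
      from scipy.linalg import eigh

      def csp_filters(trials_a, trials_b, n_filters=6):
          """trials_*: arrays of shape (n_trials, n_channels, n_samples)."""
          mean_cov = lambda trials: np.mean([np.cov(t) for t in trials], axis=0)
          Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
          # Solve Ca w = lambda (Ca + Cb) w; extreme eigenvalues give the most
          # discriminative spatial filters for the two classes.
          evals, evecs = eigh(Ca, Ca + Cb)
          order = np.argsort(evals)
          picks = np.r_[order[:n_filters // 2], order[-(n_filters // 2):]]
          return evecs[:, picks].T                # (n_filters, n_channels)

      rng = np.random.default_rng(0)
      a = rng.standard_normal((20, 8, 200))       # 20 trials, 8 channels, 200 samples
      b = 2.0 * rng.standard_normal((20, 8, 200))
      print(csp_filters(a, b).shape)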

  3. An eLearning Standard Approach for Supporting PBL in Computer Engineering

    Science.gov (United States)

    Garcia-Robles, R.; Diaz-del-Rio, F.; Vicente-Diaz, S.; Linares-Barranco, A.

    2009-01-01

    Problem-based learning (PBL) has proved to be a highly successful pedagogical model in many fields, although it is not that common in computer engineering. PBL goes beyond the typical teaching methodology by promoting student interaction. This paper presents a PBL trial applied to a course in a computer engineering degree at the University of…

  4. Space Use in the Commons: Evaluating a Flexible Library Environment

    Directory of Open Access Journals (Sweden)

    Andrew D. Asher

    2017-06-01

    Full Text Available Abstract Objective – This article evaluates the usage and user experience of the Herman B Wells Library’s Learning Commons, a newly renovated technology and learning centre that provides services and spaces tailored to undergraduates’ academic needs at Indiana University Bloomington (IUB. Methods – A mixed-method research protocol combining time-lapse photography, unobtrusive observation, and random-sample surveys was employed to construct and visualize a representative usage and activity profile for the Learning Commons space. Results – Usage of the Learning Commons by particular student groups varied considerably from expectations based on student enrollments. In particular, business, first and second year students, and international students used the Learning Commons to a higher degree than expected, while humanities students used it to a much lower degree. While users were satisfied with the services provided and the overall atmosphere of the space, they also experienced the negative effects of insufficient space and facilities due to the space often operating at or near its capacity. Demand for collaboration rooms and computer workstations was particularly high, while additional evidence suggests that the Learning Commons furniture mix may not adequately match users’ needs. Conclusions – This study presents a unique approach to space use evaluation that enables researchers to collect and visualize representative observational data. This study demonstrates a model for quickly and reliably assessing space use for open-plan and learning-centred academic environments and for evaluating how well these learning spaces fulfill their institutional mission.

  5. Computational Intelligence Techniques for New Product Design

    CERN Document Server

    Chan, Kit Yan; Dillon, Tharam S

    2012-01-01

    Applying computational intelligence for product design is a fast-growing and promising research area in computer sciences and industrial engineering. However, there is currently a lack of books, which discuss this research area. This book discusses a wide range of computational intelligence techniques for implementation on product design. It covers common issues on product design from identification of customer requirements in product design, determination of importance of customer requirements, determination of optimal design attributes, relating design attributes and customer satisfaction, integration of marketing aspects into product design, affective product design, to quality control of new products. Approaches for refinement of computational intelligence are discussed, in order to address different issues on product design. Cases studies of product design in terms of development of real-world new products are included, in order to illustrate the design procedures, as well as the effectiveness of the com...

  6. High-order hydrodynamic algorithms for exascale computing

    Energy Technology Data Exchange (ETDEWEB)

    Morgan, Nathaniel Ray [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-05

    Hydrodynamic algorithms are at the core of many laboratory missions ranging from simulating ICF implosions to climate modeling. The hydrodynamic algorithms commonly employed at the laboratory and in industry (1) typically lack requisite accuracy for complex multi-material vortical flows and (2) are not well suited for exascale computing due to poor data locality and poor FLOP/memory ratios. Exascale computing requires advances in both computer science and numerical algorithms. We propose to research the second requirement and create a new high-order hydrodynamic algorithm that has superior accuracy, excellent data locality, and excellent FLOP/memory ratios. This proposal will impact a broad range of research areas including numerical theory, discrete mathematics, vorticity evolution, gas dynamics, interface instability evolution, turbulent flows, fluid dynamics and shock driven flows. If successful, the proposed research has the potential to radically transform simulation capabilities and help position the laboratory for computing at the exascale.

  7. Exploitation of a component event data bank for common cause failure analysis

    International Nuclear Information System (INIS)

    Games, A.M.; Amendola, A.; Martin, P.

    1985-01-01

    Investigations into using the European Reliability Data System Component Event Data Bank for common cause failure analysis have been carried out. Starting from early exercises where data were analyzed without computer aid, different types of linked multiple failures have been identified. A classification system is proposed based on this experience. It defines a multiple failure event space wherein each category defines causal, modal, temporal and structural links between failures. It is shown that a search algorithm which incorporates the specific interrogative procedures of the data bank can be developed in conjunction with this classification system. It is concluded that the classification scheme and the search algorithm are useful organizational tools in the field of common cause failure studies. However, it is also suggested that use of the term common cause failure should be avoided, since it embodies too many different types of linked multiple failures.

  8. Consolidation of cloud computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Cordeiro, Cristovao; Hover, John; Kouba, Tomas; Love, Peter; Mcnab, Andrew; Schovancova, Jaroslava; Sobie, Randall; Giordano, Domenico

    2017-01-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in resp...

  9. Introduction: 'History of computing'. Historiography of computers and computer use in the Netherlands

    Directory of Open Access Journals (Sweden)

    Adrienne van den Boogaard

    2008-06-01

    of the 2000 Paderborn meeting and by Martin Campbell-Kelly resonate in work done in The Netherlands and recently in a major research project sponsored by the European Science Foundation: Software for Europe.The four contributions to this issue offer a true cross-section of ongoing history of computing in The Netherlands. Gerard Alberts and Huub de Beer return to the earliest computers at the Mathematical Center. As they do so under the perspective of using the machines, the result is, let us say, remarkable. Adrienne van den Bogaard compares the styles of software as practiced by Van der Poel and Dijkstra: so much had these two pioneers in common, so different the consequences they took. Frank Veraart treats us with an excerpt from his recent dissertation on the domestication of the micro computer technology: appropriation of computing technology is shown by the role of intermediate actors. Onno de Wit, finally, gives an account of the development, prior to internet, of a national data communication network among large scale users and its remarkable persistence under competition with new network technologies.

  10. Software Defined Radio Datalink Implementation Using PC-Type Computers

    National Research Council Canada - National Science Library

    Zafeiropoulos, Georgios

    2003-01-01

    The objective of this thesis was to examine the feasibility of implementation and the performance of a Software Defined Radio datalink, using a common PC type host computer and a high level programming language...

  11. Efficient computation of spaced seeds

    Directory of Open Access Journals (Sweden)

    Ilie Silvana

    2012-02-01

    Full Text Available Abstract Background The most frequently used tools in bioinformatics are those searching for similarities, or local alignments, between biological sequences. Since the exact dynamic programming algorithm is quadratic, linear-time heuristics such as BLAST are used. Spaced seeds are much more sensitive than the consecutive seed of BLAST and using several seeds represents the current state of the art in approximate search for biological sequences. The most important aspect is computing highly sensitive seeds. Since the problem seems hard, heuristic algorithms are used. The leading software in the common Bernoulli model is the SpEED program. Findings SpEED uses a hill climbing method based on the overlap complexity heuristic. We propose a new algorithm for this heuristic that improves its speed by over one order of magnitude. We use the new implementation to compute improved seeds for several software programs. We compute as well multiple seeds of the same weight as MegaBLAST, that greatly improve its sensitivity. Conclusion Multiple spaced seeds are being successfully used in bioinformatics software programs. Enabling researchers to compute very fast high quality seeds will help expanding the range of their applications.
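
    As a hedged aside for readers new to the topic, the toy sketch below shows what a spaced seed is and why it can be more sensitive than a contiguous seed; it is not the SpEED or overlap-complexity algorithm, and the example sequences are invented.

      # A spaced seed is a 0/1 pattern; a "hit" requires matches only at the '1'
      # positions, so a single mismatch under a '0' position is tolerated.
      def seed_hits(seed, s1, s2):
          ones = [p for p, c in enumerate(seed) if c == "1"]
          k = len(seed)
          for i in range(len(s1) - k + 1):
              for j in range(len(s2) - k + 1):
                  if all(s1[i + p] == s2[j + p] for p in ones):
                      yield (i, j)

      spaced = "111010010100110111"        # a well-known weight-11 spaced seed
      contiguous = "11111111111"           # BLAST-style weight-11 consecutive seed
      s1 = "GATTACACCTGAGGACTAAGC"
      s2 = "GATTACACCTCAGGACTAAGC"         # one mismatch relative to s1
      # The spaced seed can still hit across the mismatch; the contiguous one cannot.
      print(len(list(seed_hits(spaced, s1, s2))), len(list(seed_hits(contiguous, s1, s2))))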

  12. An Overview of the Advanced CompuTational Software (ACTS)Collection

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, Leroy A.; Marques, Osni A.

    2005-02-02

    The ACTS Collection brings together a number of general-purpose computational tools that were developed by independent research projects mostly funded and supported by the U.S. Department of Energy. These tools tackle a number of common computational issues found in many applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. In this article, we introduce the numerical tools in the collection and their functionalities, present a model for developing more complex computational applications on top of ACTS tools, and summarize applications that use these tools. Lastly, we present a vision of the ACTS project for deployment of the ACTS Collection by the computational sciences community.

  13. Performance of the engineering analysis and data system 2 common file system

    Science.gov (United States)

    Debrunner, Linda S.

    1993-01-01

    The Engineering Analysis and Data System (EADS) was used from April 1986 to July 1993 to support large scale scientific and engineering computation (e.g. computational fluid dynamics) at Marshall Space Flight Center. The need for an updated system resulted in a RFP in June 1991, after which a contract was awarded to Cray Grumman. EADS II was installed in February 1993, and by July 1993 most users were migrated. EADS II is a network of heterogeneous computer systems supporting scientific and engineering applications. The Common File System (CFS) is a key component of this system. The CFS provides a seamless, integrated environment to the users of EADS II including both disk and tape storage. UniTree software is used to implement this hierarchical storage management system. The performance of the CFS suffered during the early months of the production system. Several of the performance problems were traced to software bugs which have been corrected. Other problems were associated with hardware. However, the use of NFS in UniTree UCFM software limits the performance of the system. The performance issues related to the CFS have led to a need to develop a greater understanding of the CFS organization. This paper will first describe the EADS II with emphasis on the CFS. Then, a discussion of mass storage systems will be presented, and methods of measuring the performance of the Common File System will be outlined. Finally, areas for further study will be identified and conclusions will be drawn.

  14. Numerical computation of molecular integrals via optimized (vectorized) FORTRAN code

    International Nuclear Information System (INIS)

    Scott, T.C.; Grant, I.P.; Saunders, V.R.

    1997-01-01

    The calculation of molecular properties based on quantum mechanics is an area of fundamental research whose horizons have always been determined by the power of state-of-the-art computers. A computational bottleneck is the numerical calculation of the required molecular integrals to sufficient precision. Herein, we present a method for the rapid numerical evaluation of molecular integrals using optimized FORTRAN code generated by Maple. The method is based on the exploitation of common intermediates and the optimization can be adjusted to both serial and vectorized computations. (orig.)
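
    The exploitation of common intermediates mentioned above can be illustrated, in a loose way, with common-subexpression elimination in SymPy; this Python sketch is only an analogy for the Maple-generated FORTRAN described in the paper, and the expressions are arbitrary.

      # Shared sub-expressions are computed once and reused, which is the idea
      # behind exploiting common intermediates in generated integral code.
      import sympy as sp

      x, y, z = sp.symbols("x y z")
      exprs = [
          sp.exp(-(x**2 + y**2)) * (x**2 + y**2),
          sp.exp(-(x**2 + y**2)) * z,
      ]
      intermediates, reduced = sp.cse(exprs)
      for name, expr in intermediates:
          print(name, "=", expr)            # e.g. x0 = x**2 + y**2
      print(reduced)                        # original expressions in terms of x0, x1, ...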

  15. "Classical Blalock-Taussig shunt" gone wrong: Confusing the right common carotid with right subclavian artery.

    Science.gov (United States)

    Idhrees, A Mohammed; Cherian, Vijay Thomas; Menon, Sabarinath; Mathew, Thomas; Dharan, Baiju S; Jayakumar, Karunakaran

    2015-01-01

    A 14-year-old girl underwent classical Blalock-Taussig shunt at 5 months of age. Computed tomography evaluation showed "Adachi type H" pattern of aortic arch vessels with the right common carotid artery being anastomosed to the right pulmonary artery mistaking it for the right subclavian artery.

  16. Componential analysis of kinship terminology a computational perspective

    CERN Document Server

    Pericliev, V

    2013-01-01

    This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.

  17. The "Common Solutions" Strategy of the Experiment Support group at CERN for the LHC Experiments

    CERN Document Server

    Girone, M; Barreiro Megino, F H; Campana, S; Cinquilli, M; Di Girolamo, A; Dimou, M; Giordano, D; Karavakis, E; Kenyon, M J; Kokozkiewicz, L; Lanciotti, E; Litmaath, M; Magini, N; Negri, G; Roiser, S; Saiz, P; Saiz Santos, M D; Schovancova, J; Sciabà, A; Spiga, D; Trentadue, R; Tuckett, D; Valassi, A; Van der Ster, D C; Shiers, J D

    2012-01-01

    After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments' computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management m...

  18. Ontology-Driven Discovery of Scientific Computational Entities

    Science.gov (United States)

    Brazier, Pearl W.

    2010-01-01

    Many geoscientists use modern computational resources, such as software applications, Web services, scientific workflows and datasets that are readily available on the Internet, to support their research and many common tasks. These resources are often shared via human contact and sometimes stored in data portals; however, they are not necessarily…

  19. CASE SERIES Multi-detector computer tomography venography ...

    African Journals Online (AJOL)

    Aim. To evaluate the role of multi-detector computer tomography venography (MDCTV), compared with conventional venography, as a diagnostic tool in the management of patients with ... in the curved coronal plane with particular reference to the course of the common and external iliac veins through the pelvis. Axial venous ...

  20. ATLAS experience with HEP software at the Argonne leadership computing facility

    International Nuclear Information System (INIS)

    Uram, Thomas D; LeCompte, Thomas J; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  1. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  2. Hand hygiene after touching a patient's surroundings: the opportunities most commonly missed.

    Science.gov (United States)

    FitzGerald, G; Moore, G; Wilson, A P R

    2013-05-01

    Healthcare workers generally underestimate the role of environmental surfaces in the transmission of infection, and compliance with hand hygiene following contact with the environment is generally lower than following direct patient contact. To reduce the risk of onward transmission, healthcare workers must identify the need to wash hands with specific tasks or events. To observe the movement of staff in critical care and general wards and determine the routes most commonly travelled and the surfaces most frequently touched with and without appropriate hand hygiene. Fifty-eight 90 min sessions of unobtrusive observation were made in open bays and isolation rooms. Link analysis was used to record staff movement from one location to another as well as the frequency of motion. Hand-hygiene audits were conducted using the World Health Organization 'five moments for hand hygiene' observational tool. In critical care, the majority of movement occurred within the bed space. The bedside computer and equipment trolley were the surfaces most commonly touched, often immediately after patient contact. In the general ward, movement between bed spaces was more common and observed hand hygiene ranged from 25% to 33%. Regardless of ward type, observed hand-hygiene compliance when touching the patient immediately on entering an isolation room was less than 30%. Healthcare workers must be made aware that bacterial spread can occur even during activities of perceived low risk. Education and intervention programmes should focus on the potential contamination of ward computers, case notes and door handles. Copyright © 2013 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  3. A common feature pharmacophore for FDA-approved drugs inhibiting the Ebola virus.

    Science.gov (United States)

    Ekins, Sean; Freundlich, Joel S; Coffee, Megan

    2014-01-01

    We are currently faced with a global infectious disease crisis which has been anticipated for decades. While many promising biotherapeutics are being tested, the search for a small molecule has yet to deliver an approved drug or therapeutic for the Ebola or similar filoviruses that cause haemorrhagic fever. Two recent high throughput screens published in 2013 did however identify several hits that progressed to animal studies that are FDA approved drugs used for other indications. The current computational analysis uses these molecules from two different structural classes to construct a common features pharmacophore. This ligand-based pharmacophore implicates a possible common target or mechanism that could be further explored. A recent structure based design project yielded nine co-crystal structures of pyrrolidinone inhibitors bound to the viral protein 35 (VP35). When receptor-ligand pharmacophores based on the analogs of these molecules and the protein structures were constructed, the molecular features partially overlapped with the common features of solely ligand-based pharmacophore models based on FDA approved drugs. These previously identified FDA approved drugs with activity against Ebola were therefore docked into this protein. The antimalarials chloroquine and amodiaquine docked favorably in VP35. We propose that these drugs identified to date as inhibitors of the Ebola virus may be targeting VP35. These computational models may provide preliminary insights into the molecular features that are responsible for their activity against Ebola virus in vitro and in vivo and we propose that this hypothesis could be readily tested.

  4. Embedded systems for supporting computer accessibility.

    Science.gov (United States)

    Mulfari, Davide; Celesti, Antonio; Fazio, Maria; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Nowadays, customized AT software solutions allow their users to interact with various kinds of computer systems. Such tools are generally available on personal devices (e.g., smartphones, laptops and so on) commonly used by a person with a disability. In this paper, we investigate a way of using this AT equipment to access many different devices that offer no assistive preferences of their own. The solution takes advantage of open source hardware, and its core component is an affordable Linux embedded system: it grabs data coming from the assistive software running on the user's personal device and then, after processing, generates native keyboard and mouse HID commands for the target computing device controlled by the end user. This process supports any operating system available on the target machine and requires no specialized software installation; therefore the user with a disability can rely on a single assistive tool to control a wide range of computing platforms, including conventional computers and many kinds of mobile devices, which receive input commands through the USB HID protocol.
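
    As a hedged sketch of the last step described above, generating native keyboard HID commands, the snippet below packs a character into an 8-byte USB boot-keyboard report of the kind such a gadget could emit; it is not the authors' implementation, and only lowercase letters and the space bar are handled.

      # Boot-protocol keyboard report: [modifiers, reserved, key1..key6].
      # Usage IDs follow the USB HID Usage Tables (a-z are 0x04-0x1D, space 0x2C).
      def keyboard_report(char):
          if char == " ":
              usage = 0x2C
          elif "a" <= char <= "z":
              usage = 0x04 + ord(char) - ord("a")
          else:
              raise ValueError("only a-z and space are handled in this sketch")
          return bytes([0x00, 0x00, usage, 0x00, 0x00, 0x00, 0x00, 0x00])

      RELEASE = bytes(8)                    # all-zero report releases every key
      for report in (keyboard_report("h"), RELEASE):
          print(report.hex())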

  5. Developing a framework for digital objects in the Big Data to Knowledge (BD2K) commons: Report from the Commons Framework Pilots workshop.

    Science.gov (United States)

    Jagodnik, Kathleen M; Koplev, Simon; Jenkins, Sherry L; Ohno-Machado, Lucila; Paten, Benedict; Schurer, Stephan C; Dumontier, Michel; Verborgh, Ruben; Bui, Alex; Ping, Peipei; McKenna, Neil J; Madduri, Ravi; Pillai, Ajay; Ma'ayan, Avi

    2017-07-01

    The volume and diversity of data in biomedical research have been rapidly increasing in recent years. While such data hold significant promise for accelerating discovery, their use entails many challenges including: the need for adequate computational infrastructure, secure processes for data sharing and access, tools that allow researchers to find and integrate diverse datasets, and standardized methods of analysis. These are just some elements of a complex ecosystem that needs to be built to support the rapid accumulation of these data. The NIH Big Data to Knowledge (BD2K) initiative aims to facilitate digitally enabled biomedical research. Within the BD2K framework, the Commons initiative is intended to establish a virtual environment that will facilitate the use, interoperability, and discoverability of shared digital objects used for research. The BD2K Commons Framework Pilots Working Group (CFPWG) was established to clarify goals and work on pilot projects that address existing gaps toward realizing the vision of the BD2K Commons. This report reviews highlights from a two-day meeting involving the BD2K CFPWG to provide insights on trends and considerations in advancing Big Data science for biomedical research in the United States. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Computer simulation of hopper flow

    International Nuclear Information System (INIS)

    Potapov, A.V.; Campbell, C.S.

    1996-01-01

    This paper describes two-dimensional computer simulations of granular flow in plane hoppers. The simulations can reproduce an experimentally observed asymmetric unsteadiness for monodispersed particle sizes, but also could eliminate it by adding a small amount of polydispersity. This appears to be a result of the strong packings that may be formed by monodispersed particles and is thus a noncontinuum effect. The internal stress state was also sampled, which among other things, allows an evaluation of common assumptions made in granular material models. These showed that the internal friction coefficient is far from a constant, which is in contradiction to common models based on plasticity theory which assume that the material is always at the point of imminent yield. Furthermore, it is demonstrated that rapid granular flow theory, another common modeling technique, is inapplicable to this problem even near the exit where the flow is moving its fastest. copyright 1996 American Institute of Physics

  7. Computer systems and networks status and perspectives

    CERN Document Server

    Zacharov, V

    1981-01-01

    The properties of computers are discussed, both as separate units and in inter-coupled systems. The main elements of modern processor technology are reviewed and the associated peripheral components are discussed in the light of the prevailing rapid pace of developments. Particular emphasis is given to the impact of very large scale integrated circuitry in these developments. Computer networks are considered in some detail, including common-carrier and local-area networks, and the problem of inter-working is included in the discussion. Components of network systems and the associated technology are also among the topics treated.

  8. Parallel computing for data science with examples in R, C++ and CUDA

    CERN Document Server

    Matloff, Norman

    2015-01-01

    Parallel Computing for Data Science: With Examples in R, C++ and CUDA is one of the first parallel computing books to concentrate exclusively on parallel data structures, algorithms, software tools, and applications in data science. It includes examples not only from the classic "n observations, p variables" matrix format but also from time series, network graph models, and numerous other structures common in data science. The examples illustrate the range of issues encountered in parallel programming. With the main focus on computation, the book shows how to compute on three types of platfor
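
    The book's examples are in R, C++ and CUDA; purely for illustration here, the Python sketch below shows the kind of data-parallel, per-observation computation it discusses, with an invented row statistic and an arbitrary matrix size.

      # Distribute a per-row computation over worker processes.
      import numpy as np
      from multiprocessing import Pool

      def row_stat(row):
          # any per-observation computation; here, mean absolute deviation
          return float(np.mean(np.abs(row - row.mean())))

      if __name__ == "__main__":
          data = np.random.default_rng(1).standard_normal((10000, 50))
          with Pool(processes=4) as pool:
              stats = pool.map(row_stat, data)      # rows are farmed out to workers
          print(len(stats), max(stats))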

  9. Supporting Undergraduate Computer Architecture Students Using a Visual MIPS64 CPU Simulator

    Science.gov (United States)

    Patti, D.; Spadaccini, A.; Palesi, M.; Fazzino, F.; Catania, V.

    2012-01-01

    The topics of computer architecture are always taught using an Assembly dialect as an example. The most commonly used textbooks in this field use the MIPS64 Instruction Set Architecture (ISA) to help students in learning the fundamentals of computer architecture because of its orthogonality and its suitability for real-world applications. This…

  10. Computing for magnetic fusion energy research: An updated vision

    International Nuclear Information System (INIS)

    Henline, P.; Giarrusso, J.; Davis, S.; Casper, T.

    1993-01-01

    This Fusion Computing Council perspective is written to present the primary concerns of the fusion computing community at the time of publication of the report, necessarily as a summary of the information contained in the individual sections. These concerns reflect FCC discussions during final review of contributions from the various working groups and portray our latest information. This report itself should be considered dynamic, requiring periodic updating in an attempt to track the rapid evolution of the computer industry relevant to the requirements of magnetic fusion research. The most significant common concern among the Fusion Computing Council working groups is networking capability. All groups see an increasing need for network services due to the use of workstations, distributed computing environments, increased use of graphic services, X-window usage, remote experimental collaborations, remote data access for specific projects and other collaborations. Other areas of concern include support for workstations, enhanced infrastructure to support collaborations, the User Service Centers, NERSC and future massively parallel computers, and FCC sponsored workshops.

  11. Indication for dental computed tomography. Case reports

    International Nuclear Information System (INIS)

    Schom, C.; Engelke, W.; Kopka, L.; Fischer, U.; Grabbe, E.

    1996-01-01

    Based on case reports, common indications for dental computed tomography are demonstrated and typical findings are analysed. From a group of 110 patients who had a reformatted computed tomography of the maxilla and mandibula, 10 typical cases were chosen as examples and are presented with a detailed description of the findings. The most important indication was the analysis of the morphology of the alveolar ridge needed in presurgical planning for osseointegrated implants as well as in special cases of postsurgical control. Apart from implantology, the method could be used in cases of mandibular cysts and bony destructions. In conclusion, dental computed tomography has become established mainly in implantology. It can provide valuable results in cases where a demonstration of the bone in all dimensions and free of overlappings and distortions is needed. (orig.) [de

  12. Computer Aided Drug Design: Success and Limitations.

    Science.gov (United States)

    Baig, Mohammad Hassan; Ahmad, Khurshid; Roy, Sudeep; Ashraf, Jalaluddin Mohammad; Adil, Mohd; Siddiqui, Mohammad Haris; Khan, Saif; Kamal, Mohammad Amjad; Provazník, Ivo; Choi, Inho

    2016-01-01

    Over the last few decades, computer-aided drug design has emerged as a powerful technique playing a crucial role in the development of new drug molecules. Structure-based drug design and ligand-based drug design are two methods commonly used in computer-aided drug design. In this article, we discuss the theory behind both methods, as well as their successful applications and limitations. To accomplish this, we reviewed structure-based and ligand-based virtual screening processes. Molecular dynamics simulation, which has become one of the most influential tools for predicting the conformation of small molecules and changes in their conformation within the biological target, has also been taken into account. Finally, we discuss the principles and concepts of molecular docking, pharmacophores and other methods used in computer-aided drug design.
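
    As a hedged, minimal example of the ligand-based virtual screening step reviewed above, the sketch below ranks a tiny invented library by fingerprint similarity to a query molecule using the open-source RDKit toolkit; the review itself is not tied to any particular package.

      # Ligand-based screening: rank candidates by Tanimoto similarity of
      # Morgan fingerprints to a known reference (aspirin, used only as an example).
      from rdkit import Chem, DataStructs
      from rdkit.Chem import AllChem

      def fp(smiles):
          return AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smiles), 2, nBits=2048)

      query = fp("CC(=O)Oc1ccccc1C(=O)O")
      library = {
          "salicylic acid": "O=C(O)c1ccccc1O",
          "caffeine": "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
          "ibuprofen": "CC(C)Cc1ccc(cc1)C(C)C(=O)O",
      }
      scores = {name: DataStructs.TanimotoSimilarity(query, fp(smi))
                for name, smi in library.items()}
      for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{name:15s} {score:.2f}")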

  13. Computed tomography: ocular manifestations in acute head injury ...

    African Journals Online (AJOL)

    Background: Acute head injuries are common in the population. Associated ocular injuries are occasionally encountered and these are of varying nature and outcome. Methods: We reviewed 98 brain computed tomographic results retrospectively. These are cases that were done between Jan. 2013- Jan. 2014. Statistical ...

  14. Consolidation of cloud computing in ATLAS

    Science.gov (United States)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  15. Normalization as a canonical neural computation

    Science.gov (United States)

    Carandini, Matteo; Heeger, David J.

    2012-01-01

    There is increasing evidence that the brain relies on a set of canonical neural computations, repeating them across brain regions and modalities to apply similar operations to different problems. A promising candidate for such a computation is normalization, in which the responses of neurons are divided by a common factor that typically includes the summed activity of a pool of neurons. Normalization was developed to explain responses in the primary visual cortex and is now thought to operate throughout the visual system, and in many other sensory modalities and brain regions. Normalization may underlie operations such as the representation of odours, the modulatory effects of visual attention, the encoding of value and the integration of multisensory information. Its presence in such a diversity of neural systems in multiple species, from invertebrates to mammals, suggests that it serves as a canonical neural computation. PMID:22108672
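
    A compact sketch of the divisive normalization computation described above follows; the exponent, semi-saturation constant and pool of inputs are illustrative values, not taken from the paper.

      # R_i = gamma * d_i**n / (sigma**n + sum_j d_j**n), applied over a pool.
      import numpy as np

      def normalize(drives, sigma=1.0, n=2.0, gamma=1.0):
          d = np.asarray(drives, dtype=float) ** n
          return gamma * d / (sigma ** n + d.sum())

      pool = [0.5, 1.0, 2.0, 4.0]
      print(normalize(pool))    # strong inputs in the pool suppress weak responses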

  16. Reports on the utilization of the grant-in-aid for computational programs (the fiscal year 1990)

    International Nuclear Information System (INIS)

    1991-06-01

    At the Research Center for Nuclear Physics, Osaka University, a grant-in-aid for common-use computation has been provided as part of the common utilization activities, supporting mainly theoretical numerical computation in the field of nuclear physics since 1976. Applications for subjects of more than 50,000 yen are invited every year, and the subjects are adopted on the basis of referee opinions and the deliberations of an ad hoc committee. Once the grant has been used, a short report in a fixed format and a more detailed report on the contents of the computation must be submitted. The former covers the period and cost of the computation, publication of the results, and the main results; the latter includes an abstract in English, an explanation of the computed results and their physical content, new developments, difficulties encountered in the computational techniques and how they were resolved, and the subroutines, functions and block diagrams used for the computation. This book collects the latter, detailed reports for 27 subjects. (K.I.)

  17. Risk in Enterprise Cloud Computing: Re-Evaluated

    Science.gov (United States)

    Funmilayo, Bolonduro, R.

    2016-01-01

    A quantitative study was conducted to get the perspectives of IT experts about risks in enterprise cloud computing. In businesses, these IT experts are often not in positions to prioritize business needs. The business experts commonly known as business managers mostly determine an organization's business needs. Even if an IT expert classified a…

  18. Artificial Intelligence, Computational Thinking, and Mathematics Education

    Science.gov (United States)

    Gadanidis, George

    2017-01-01

    Purpose: The purpose of this paper is to examine the intersection of artificial intelligence (AI), computational thinking (CT), and mathematics education (ME) for young students (K-8). Specifically, it focuses on three key elements that are common to AI, CT and ME: agency, modeling of phenomena and abstracting concepts beyond specific instances.…

  19. The common component architecture for particle accelerator simulations

    International Nuclear Information System (INIS)

    Dechow, D.R.; Norris, B.; Amundson, J.

    2007-01-01

    Synergia2 is a beam dynamics modeling and simulation application for high-energy accelerators such as the Tevatron at Fermilab and the International Linear Collider, which is now under planning and development. Synergia2 is a hybrid, multilanguage software package comprised of two separate accelerator physics packages (Synergia and MaryLie/Impact) and one high-performance computer science package (PETSc). We describe our approach to producing a set of beam dynamics-specific software components based on the Common Component Architecture specification. Among other topics, we describe particular experiences with the following tasks: using Python steering to guide the creation of interfaces and to prototype components; working with legacy Fortran codes; and an example component-based, beam dynamics simulation.

  20. CAT: a computer code for the automated construction of fault trees

    International Nuclear Information System (INIS)

    Apostolakis, G.E.; Salem, S.L.; Wu, J.S.

    1978-03-01

    A computer code, CAT (Computer Automated Tree), is presented which applies decision-table methods to model the behavior of components for the systematic construction of fault trees. Decision tables for some commonly encountered mechanical and electrical components are developed; two nuclear subsystems, a Containment Spray Recirculation System and a Consequence Limiting Control System, are analyzed to demonstrate the application of the CAT code.
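
    The decision-table idea can be pictured with a toy component model. The sketch below is only an illustration of the general technique, not the CAT code itself; the valve states and table entries are invented for the example.

        # Toy decision table for a valve, in the spirit of the decision-table
        # component models described above (not the CAT code itself).
        # Each row maps (command input, valve condition) -> flow output state.
        DECISION_TABLE = {
            ("open_cmd",  "normal"):       "flow",
            ("open_cmd",  "stuck_closed"): "no_flow",
            ("close_cmd", "normal"):       "no_flow",
            ("close_cmd", "stuck_open"):   "flow",
        }

        def output_state(command, condition):
            """Look up the component output for a given input and internal failure mode."""
            return DECISION_TABLE.get((command, condition), "no_flow")

        def causes_of(undesired_output):
            """Invert the table: which (input, failure mode) pairs produce the top event?
            This inversion is the step a fault-tree constructor automates."""
            return [pair for pair, out in DECISION_TABLE.items() if out == undesired_output]

        print(causes_of("no_flow"))  # candidate causes feeding a 'no flow' fault-tree gate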

  1. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  2. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  3. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    Science.gov (United States)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
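
    As a concrete illustration of the metamodel idea, the sketch below fits a second-order polynomial response surface to a handful of runs of a stand-in "expensive" analysis function; it is a generic example, not taken from the survey, and the function and sample points are arbitrary.

        import numpy as np

        # Stand-in for an expensive analysis code (one design variable for brevity).
        def expensive_analysis(x):
            return np.sin(3.0 * x) + 0.5 * x**2

        # Sample a small design of experiments, then fit a quadratic response surface.
        x_doe = np.linspace(-1.0, 1.0, 7)
        y_doe = expensive_analysis(x_doe)
        coeffs = np.polyfit(x_doe, y_doe, deg=2)   # least-squares fit of the metamodel
        metamodel = np.poly1d(coeffs)

        # The metamodel now replaces the expensive code inside an optimization loop.
        x_new = 0.37
        print(metamodel(x_new), expensive_analysis(x_new))  # approximation vs. truth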

  4. Hardware architecture design of image restoration based on time-frequency domain computation

    Science.gov (United States)

    Wen, Bo; Zhang, Jing; Jiao, Zipeng

    2013-10-01

    Image restoration algorithms based on time-frequency domain computation (TFDC) are highly mature and widely applied in engineering. To enable high-speed implementation of these algorithms, a TFDC hardware architecture is proposed. First, the main module is designed by analyzing the processing steps and numerical calculations common to these algorithms. Then, to improve generality, an iteration control module is planned for iterative algorithms. In addition, to reduce computational cost and memory requirements, optimizations are suggested for the time-consuming modules, which include the two-dimensional FFT/IFFT and complex-number arithmetic. Finally, the TFDC hardware architecture is adopted for the hardware design of a real-time image restoration system. The results show that the TFDC hardware architecture and its optimizations can be applied to TFDC-based image restoration algorithms with good algorithmic generality, hardware realizability and high efficiency.
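
    A purely software sketch of the kind of frequency-domain computation such an architecture accelerates is shown below: Wiener-style deconvolution built from a forward 2-D FFT, element-wise complex arithmetic, and an inverse FFT. It is an algorithmic illustration only, not the proposed hardware design, and the noise constant k and the synthetic image are arbitrary choices.

        import numpy as np

        def wiener_restore(blurred, psf, k=0.01):
            """Restore an image degraded by a known point-spread function (PSF).
            All work happens in the frequency domain: forward 2-D FFTs, an
            element-wise complex filter, then an inverse 2-D FFT."""
            H = np.fft.fft2(psf, s=blurred.shape)   # PSF spectrum, zero-padded
            G = np.fft.fft2(blurred)                # degraded image spectrum
            W = np.conj(H) / (np.abs(H) ** 2 + k)   # Wiener filter (k ~ noise-to-signal level)
            return np.real(np.fft.ifft2(G * W))

        # Tiny demo: blur a synthetic image with a 3x3 box PSF, then restore it.
        img = np.zeros((64, 64)); img[28:36, 28:36] = 1.0
        psf = np.ones((3, 3)) / 9.0
        blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))
        print(np.abs(wiener_restore(blurred, psf) - img).mean())  # mean restoration error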

  5. Integrated evolutionary computation neural network quality controller for automated systems

    Energy Technology Data Exchange (ETDEWEB)

    Patro, S.; Kolarik, W.J. [Texas Tech Univ., Lubbock, TX (United States). Dept. of Industrial Engineering

    1999-06-01

    With increasing competition in the global market, ever more stringent quality standards and specifications are being demanded at lower costs. Manufacturing applications of computing power are becoming more common. The application of neural networks to the identification and control of dynamic processes is discussed. The limitations of using neural networks for control purposes are pointed out, and a different technique, evolutionary computation, is discussed. Results of identifying and controlling an unstable dynamic process using evolutionary computation methods are presented. A framework for an integrated system, using both neural networks and evolutionary computation, is proposed to identify the process and then control product quality in a dynamic, multivariable system in real time.
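
    The evolutionary-computation idea can be sketched with a minimal (1+1) evolution strategy that tunes a single controller gain against a toy cost function. This is a generic illustration, not the authors' integrated neural-network/evolutionary controller, and the process model inside cost() is invented for the example.

        import random

        def cost(gain, setpoint=1.0):
            """Toy closed-loop cost: squared tracking error of a one-step process model."""
            output = gain * setpoint / (1.0 + gain)              # hypothetical steady-state response
            return (setpoint - output) ** 2 + 0.001 * gain ** 2  # penalize very large gains

        def evolve(generations=200, sigma=0.5, seed=0):
            """(1+1) evolution strategy: mutate the parent, keep the better of the two."""
            rng = random.Random(seed)
            parent, parent_cost = 1.0, cost(1.0)
            for _ in range(generations):
                child = parent + rng.gauss(0.0, sigma)   # mutation
                child_cost = cost(child)
                if child_cost < parent_cost:             # selection
                    parent, parent_cost = child, child_cost
            return parent, parent_cost

        print(evolve())  # tuned gain and its cost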

  6. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    Science.gov (United States)

    Meyer, Jörg; Quadt, Arnulf; Weber, Pavel; ATLAS Collaboration

    2011-12-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center is presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup the challenge of the common operation of the cluster is detailed. The benefits are an efficient use of computer and personpower resources.

  7. Reviews on Security Issues and Challenges in Cloud Computing

    Science.gov (United States)

    An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.

    2016-11-01

    Cloud computing is an Internet-based computing service, provided by third parties, that allows the sharing of resources and data among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes how the Information Technology (IT) of an organization is organized and managed. It provides many benefits, such as simplicity and lower costs, almost unlimited storage, minimal maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment and a lower barrier to entry. While the use of cloud computing services is increasing in this new era, the security of cloud computing becomes a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first outlines the architecture of cloud computing, then discusses the most common security issues of using the cloud and some solutions to them, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.

  8. Computational chemistry reviews of current trends v.4

    CERN Document Server

    1999-01-01

    This volume presents a balanced blend of methodological and applied contributions. It supplements well the first three volumes of the series, revealing results of current research in computational chemistry. It also reviews the topographical features of several molecular scalar fields. A brief discussion of topographical concepts is followed by examples of their application to several branches of chemistry.The size of a basis set applied in a calculation determines the amount of computer resources necessary for a particular task. The details of a common strategy - the ab initio model potential

  9. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  10. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  11. Network survivability performance (computer diskette)

    Science.gov (United States)

    1993-11-01

    File characteristics: Data file; 1 file. Physical description: 1 computer diskette; 3 1/2 in.; high density; 2.0MB. System requirements: Mac; Word. This technical report has been developed to address the survivability of telecommunications networks including services. It responds to the need for a common understanding of, and assessment techniques for network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunication networks to user expectations for network survivability.

  12. FAST: framework for heterogeneous medical image computing and visualization.

    Science.gov (United States)

    Smistad, Erik; Bozorgi, Mohammadmehdi; Lindseth, Frank

    2015-11-01

    Computer systems are becoming increasingly heterogeneous in the sense that they consist of different processors, such as multi-core CPUs and graphic processing units. As the amount of medical image data increases, it is crucial to exploit the computational power of these processors. However, this is currently difficult due to several factors, such as driver errors, processor differences, and the need for low-level memory handling. This paper presents a novel FrAmework for heterogeneouS medical image compuTing and visualization (FAST). The framework aims to make it easier to simultaneously process and visualize medical images efficiently on heterogeneous systems. FAST uses common image processing programming paradigms and hides the details of memory handling from the user, while enabling the use of all processors and cores on a system. The framework is open-source, cross-platform and available online. Code examples and performance measurements are presented to show the simplicity and efficiency of FAST. The results are compared to the insight toolkit (ITK) and the visualization toolkit (VTK) and show that the presented framework is faster with up to 20 times speedup on several common medical imaging algorithms. FAST enables efficient medical image computing and visualization on heterogeneous systems. Code examples and performance evaluations have demonstrated that the toolkit is both easy to use and performs better than existing frameworks, such as ITK and VTK.

  13. Computed tomography of chest wall abscess

    International Nuclear Information System (INIS)

    Ikezoe, Junpei; Morimoto, Shizuo; Akira, Masanori

    1986-01-01

    Inflammatory lesions of the chest wall have become less common because of improvements in antibiotics and chemotherapeutic agents. Over a 5-year period, 7 patients with chest wall inflammatory diseases underwent chest computed tomography. These comprised 2 tuberculous pericostal abscesses, 2 cases of empyema necessitatis, 1 case of spinal caries, and 2 bacterial chest wall abscesses (organisms unknown). Computed tomography (CT) helped in demonstrating the density, border, site, and extent of the lesions. CT images also demonstrated accompanying abnormalities, which included bone changes, pleural calcification, and old tuberculous changes of the lung. CT was very effective in demonstrating the portions communicating from the inside to the outside of the bony thorax in the 2 cases of empyema necessitatis. (author)

  14. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  15. "Classical Blalock-Taussig shunt" gone wrong: Confusing the right common carotid with right subclavian artery

    Directory of Open Access Journals (Sweden)

    A Mohammed Idhrees

    2015-01-01

    Full Text Available A 14-year-old girl underwent classical Blalock-Taussig shunt at 5 months of age. Computed tomography evaluation showed "Adachi type H" pattern of aortic arch vessels with the right common carotid artery being anastomosed to the right pulmonary artery mistaking it for the right subclavian artery.

  16. Computed tomography after radical pancreaticoduodenectomy (Whipple's procedure)

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.L. [Department of Radiology, Ipswich Hospital, Ipswich IP4 5PD (United Kingdom)], E-mail: simon.smith@ipswichhospital.nhs.uk; Hampson, F. [Department of Radiology, Addenbrooke' s Hospital NHS Trust, Cambridge (United Kingdom); Duxbury, M. [Department of Surgery, Royal Infirmary of Edinburgh, Little France, Edinburgh EH16 4SU (United Kingdom); Rae, D.M.; Sinclair, M.T. [Department of Pancreaticobiliary surgery, Ipswich Hospital, Ipswich IP4 5PD (United Kingdom)

    2008-08-15

    Whipple's procedure (radical pancreaticoduodenectomy) is currently the only curative option for patients with periampullary malignancy. The surgery is highly complex and involves multiple anastomoses. Complications are common and can lead to significant postoperative morbidity. Early detection and treatment of complications is vital, and high-quality multidetector computed tomography (MDCT) is currently the best method of investigation. This review outlines the surgical technique and illustrates the range of normal postoperative appearances together with the common complications.

  17. Developing Cancer Informatics Applications and Tools Using the NCI Genomic Data Commons API.

    Science.gov (United States)

    Wilson, Shane; Fitzsimons, Michael; Ferguson, Martin; Heath, Allison; Jensen, Mark; Miller, Josh; Murphy, Mark W; Porter, James; Sahni, Himanso; Staudt, Louis; Tang, Yajing; Wang, Zhining; Yu, Christine; Zhang, Junjun; Ferretti, Vincent; Grossman, Robert L

    2017-11-01

    The NCI Genomic Data Commons (GDC) was launched in 2016 and makes available over 4 petabytes (PB) of cancer genomic and associated clinical data to the research community. This dataset continues to grow and currently includes over 14,500 patients. The GDC is an example of a biomedical data commons, which collocates biomedical data with storage and computing infrastructure and commonly used web services, software applications, and tools to create a secure, interoperable, and extensible resource for researchers. The GDC is (i) a data repository for downloading data that have been submitted to it, and also a system that (ii) applies a common set of bioinformatics pipelines to submitted data; (iii) reanalyzes existing data when new pipelines are developed; and (iv) allows users to build their own applications and systems that interoperate with the GDC using the GDC Application Programming Interface (API). We describe the GDC API and how it has been used both by the GDC itself and by third parties. Cancer Res; 77(21); e15-18. ©2017 AACR . ©2017 American Association for Cancer Research.
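
    A minimal third-party query against the GDC API might look like the sketch below. The endpoint, parameter names and response layout follow the public GDC documentation as commonly described, but they are stated here from memory and should be verified against the current API reference before use.

        import requests

        # Hypothetical example query; verify endpoint and field names against the
        # current GDC API documentation before relying on them.
        GDC_PROJECTS = "https://api.gdc.cancer.gov/projects"

        params = {
            "fields": "project_id,name,primary_site",
            "size": "5",
            "format": "JSON",
        }
        response = requests.get(GDC_PROJECTS, params=params, timeout=30)
        response.raise_for_status()
        for hit in response.json()["data"]["hits"]:
            print(hit.get("project_id"), "-", hit.get("name"))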

  18. Status of integration of small computers into NDE systems

    International Nuclear Information System (INIS)

    Dau, G.J.; Behravesh, M.M.

    1988-01-01

    The introduction of computers into nondestructive evaluation (NDE) has enabled data acquisition devices to provide more thorough and complete coverage in the scanning process, and has aided human inspectors in their data analysis and decision making efforts. The price and size/weight of small computers, coupled with recent increases in processing and storage capacity, have made small personal computers (PCs) the most viable platform for NDE equipment. Several NDE systems using minicomputers, and newer PC-based systems capable of automatic data acquisition and knowledge-based analysis of the test data, have been field tested in the nuclear power plant environment and are currently available through commercial sources. While computers have come into common use for several NDE methods during the last few years, their greatest impact has been on ultrasonic testing. This paper discusses the evolution of small computers and their integration into the ultrasonic testing process.

  19. Computing aggregate properties of preimages for 2D cellular automata.

    Science.gov (United States)

    Beer, Randall D

    2017-11-01

    Computing properties of the set of precursors of a given configuration is a common problem underlying many important questions about cellular automata. Unfortunately, such computations quickly become intractable in dimension greater than one. This paper presents an algorithm, incremental aggregation, that can compute aggregate properties of the set of precursors exponentially faster than naïve approaches. The incremental aggregation algorithm is demonstrated on two problems from the two-dimensional binary Game of Life cellular automaton: precursor count distributions and higher-order mean field theory coefficients. In both cases, incremental aggregation allows us to obtain new results that were previously beyond reach.
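
    The naive baseline that incremental aggregation improves upon can be sketched directly: enumerate every candidate predecessor on a one-cell-padded patch (all cells outside that patch assumed dead, which is an added boundary assumption), apply one Game of Life step, and count the grids that map onto the target. Even this toy version grows as 2 to the number of cells, which is exactly why such approaches become intractable; it is a generic illustration, not the paper's algorithm.

        from itertools import product

        def life_step(grid):
            """One synchronous Game of Life update; cells outside the grid count as dead."""
            h, w = len(grid), len(grid[0])
            def neighbours(r, c):
                return sum(grid[r + dr][c + dc]
                           for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                           if (dr or dc) and 0 <= r + dr < h and 0 <= c + dc < w)
            return tuple(tuple(1 if (grid[r][c] and neighbours(r, c) in (2, 3))
                               or (not grid[r][c] and neighbours(r, c) == 3) else 0
                               for c in range(w)) for r in range(h))

        def count_precursors(target):
            """Naive precursor count over a one-cell-padded patch (exponential in cell count)."""
            h, w = len(target), len(target[0])
            H, W = h + 2, w + 2                      # predecessors live on a padded patch
            wanted = tuple(tuple(row) for row in target)
            count = 0
            for bits in product((0, 1), repeat=H * W):
                grid = tuple(tuple(bits[r * W + c] for c in range(W)) for r in range(H))
                inner = tuple(row[1:1 + w] for row in life_step(grid)[1:1 + h])
                count += (inner == wanted)
            return count

        print(count_precursors([[1, 1], [1, 1]]))    # 2x2 block target: 65536 candidate grids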

  20. BIOSAY: a computer program to store and report bioassay data

    International Nuclear Information System (INIS)

    Williams, G.E.; Parlagreco, J.R.

    1978-12-01

    BIOSAY is a computer program designed to manipulate bioassay data. Using BIOSAY, the Dosimetry Group can generate a report suitable for an individual's dosimetry record. A second copy of this report may be mailed to the individual who provided the sample or the area health physicist for review. BIOSAY also contains a data sorting option which allows all the results for particular individuals, or groups of individuals with common attributes, to be separated from the data base. The computer code is written in a conversational tone with aids which make it usable by even casual users of the computer system

  1. Operating System Concepts for Reconfigurable Computing: Review and Survey

    Directory of Open Access Journals (Sweden)

    Marcel Eckert

    2016-01-01

    Full Text Available One of the key future challenges for reconfigurable computing is to enable higher design productivity and an easier way for users who are unfamiliar with the underlying concepts to use reconfigurable computing systems. One way of doing this is to provide standardization and abstraction, usually supported and enforced by an operating system. This article gives a historical review and a summary of ideas and key concepts for including reconfigurable computing aspects in operating systems. The article also presents an overview of published and available operating systems targeting the area of reconfigurable computing. The purpose of this article is to identify and summarize common patterns among those systems that can be seen as a de facto standard. Furthermore, open problems not covered by these already available systems are identified.

  2. Data-flow oriented visual programming libraries for scientific computing

    NARCIS (Netherlands)

    Maubach, J.M.L.; Drenth, W.D.; Sloot, P.M.A.

    2002-01-01

    The growing release of scientific computational software does not seem to aid the implementation of complex numerical algorithms. Released libraries lack a common standard interface with regard to for instance finite element, difference or volume discretizations. And, libraries written in standard

  3. Common signs and symptoms in hypothyroidism in central part of iran

    International Nuclear Information System (INIS)

    Jabbari, A.; Besharat, S.; Razavianzadeh, N.; Moetabar, M.

    2008-01-01

    This study was designed to evaluate the common signs and symptoms of hypothyroidism in persons with a clinical diagnosis of hypothyroidism confirmed by laboratory tests. This descriptive cross-sectional study was carried out over 13 months in medical centers of Shahrood city, in the central part of Iran. All cases referred to health care services with a probable diagnosis of hypothyroidism based on signs and symptoms were included in the study. Radioimmunoassay tests and thyroid hormone evaluations were performed. Demographic data and signs were recorded through interviews. Data were entered into the computer and analyzed with SPSS software. Patients who completed the questionnaires (n=50) were interviewed three times during this period. The female/male ratio was 6/1. The most common signs were cold intolerance (95%), weight gain and menorrhagia; the most common symptoms were edema (80%) and pallor (60%). Severe disease was seen in 4%; the mild type was the most common presentation of hypothyroidism (60%). The most common signs and symptoms of hypothyroidism in the central part of Iran (Shahrood city), one of the iodine-deficient areas of Iran, differed from those reported in other studies. Socio-demographic and nutritional status, illiteracy level and personal self-care are among the probable causes. Unfortunately, the concomitance of some of these signs and symptoms is not diagnostic for hypothyroidism. It seems that strong clinical suspicion and laboratory confirmation are the only reliable methods for diagnosing hypothyroidism. (author)

  4. Cloud computing patterns fundamentals to design, build, and manage cloud applications

    CERN Document Server

    Fehling, Christoph; Retter, Ralph; Schupeck, Walter; Arbitter, Peter

    2014-01-01

    The current work provides CIOs, software architects, project managers, developers, and cloud strategy initiatives with a set of architectural patterns that offer nuggets of advice on how to achieve common cloud computing-related goals. The cloud computing patterns capture knowledge and experience in an abstract format that is independent of concrete vendor products. Readers are provided with a toolbox to structure cloud computing strategies and design cloud application architectures. By using this book cloud-native applications can be implemented and best suited cloud vendors and tooling for i

  5. Interferences and events on epistemic shifts in physics through computer simulations

    CERN Document Server

    Warnke, Martin

    2017-01-01

    Computer simulations are omnipresent media in today's knowledge production. For scientific endeavors such as the detection of gravitational waves and the exploration of subatomic worlds, simulations are essential; however, the epistemic status of computer simulations is rather controversial as they are neither just theory nor just experiment. Therefore, computer simulations have challenged well-established insights and common scientific practices as well as our very understanding of knowledge. This volume contributes to the ongoing discussion on the epistemic position of computer simulations in a variety of physical disciplines, such as quantum optics, quantum mechanics, and computational physics. Originating from an interdisciplinary event, it shows that accounts of contemporary physics can constructively interfere with media theory, philosophy, and the history of science.

  6. CT applications of medical computer graphics

    International Nuclear Information System (INIS)

    Rhodes, M.L.

    1985-01-01

    Few applications of computer graphics show as much promise and early success as that for CT. Unlike electron microscopy, ultrasound, business, military, and animation applications, CT image data are inherently digital. CT pictures can be processed directly by programs well established in the fields of computer graphics and digital image processing. Methods for reformatting digital pictures, enhancing structure shape, reducing image noise, and rendering three-dimensional (3D) scenes of anatomic structures have all become routine at many CT centers. In this chapter, the authors provide a brief introduction to computer graphics terms and techniques commonly applied to CT pictures and, when appropriate, to those showing promise for magnetic resonance images. Topics discussed here are image-processing options that are applied to digital images already constructed. In the final portion of this chapter, techniques for "slicing" CT image data are presented, and geometric principles that describe the specification of oblique and curved images are outlined. Clinical examples are included

  7. Energy Efficiency in Computing (1/2)

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    As manufacturers improve the silicon process, truly low energy computing is becoming a reality - both in servers and in the consumer space. This series of lectures covers a broad spectrum of aspects related to energy efficient computing - from circuits to datacentres. We will discuss common trade-offs and basic components, such as processors, memory and accelerators. We will also touch on the fundamentals of modern datacenter design and operation. Lecturer's short bio: Andrzej Nowak has 10 years of experience in computing technologies, primarily from CERN openlab and Intel. At CERN, he managed a research lab collaborating with Intel and was part of the openlab Chief Technology Office. Andrzej also worked closely and initiated projects with the private sector (e.g. HP and Google), as well as international research institutes, such as EPFL. Currently, Andrzej acts as a consultant on technology and innovation with TIK Services (http://tik.services), and runs a peer-to-peer lending start-up. NB! All Academic L...

  8. Computational discovery of extremal microstructure families

    Science.gov (United States)

    Chen, Desai; Skouras, Mélina; Zhu, Bo; Matusik, Wojciech

    2018-01-01

    Modern fabrication techniques, such as additive manufacturing, can be used to create materials with complex custom internal structures. These engineered materials exhibit a much broader range of bulk properties than their base materials and are typically referred to as metamaterials or microstructures. Although metamaterials with extraordinary properties have many applications, designing them is very difficult and is generally done by hand. We propose a computational approach to discover families of microstructures with extremal macroscale properties automatically. Using efficient simulation and sampling techniques, we compute the space of mechanical properties covered by physically realizable microstructures. Our system then clusters microstructures with common topologies into families. Parameterized templates are eventually extracted from families to generate new microstructure designs. We demonstrate these capabilities on the computational design of mechanical metamaterials and present five auxetic microstructure families with extremal elastic material properties. Our study opens the way for the completely automated discovery of extremal microstructures across multiple domains of physics, including applications reliant on thermal, electrical, and magnetic properties. PMID:29376124

  9. Monte Carlo strategies in scientific computing

    CERN Document Server

    Liu, Jun S

    2008-01-01

    This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for masters' or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at the Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...
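
    For readers new to the topic, the plain Monte Carlo estimator that the book's more advanced techniques refine can be stated in a few lines: approximate an integral by the sample mean of the integrand at random draws, and report the associated standard error. The example below is a generic illustration, not taken from the book.

        import numpy as np

        def mc_integral(f, n_samples=100_000, seed=42):
            """Estimate the integral of f over [0, 1] by plain Monte Carlo, with a standard error."""
            rng = np.random.default_rng(seed)
            x = rng.random(n_samples)          # uniform draws on [0, 1)
            values = f(x)
            estimate = values.mean()
            stderr = values.std(ddof=1) / np.sqrt(n_samples)
            return estimate, stderr

        # Example: integral of exp(x) over [0, 1]; exact value is e - 1 ~= 1.71828.
        print(mc_integral(np.exp))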

  10. The “Common Solutions" Strategy of the Experiment Support group at CERN for the LHC Experiments

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments’ computing as well as WLCG deployment and operations need to evolve. As part of the activities of the Experiment Support group in CERN’s IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management...

  11. Computational intelligence, medicine and biology selected links

    CERN Document Server

    Zaitseva, Elena

    2015-01-01

    This book contains an interesting and state-of-the-art collection of chapters presenting several examples of attempts to develop modern tools utilizing computational intelligence in different real-life problems encountered by humans. Reasoning, prediction, modeling, optimization, decision making, etc. need modern, soft and intelligent algorithms, methods and methodologies to solve, in efficient ways, problems appearing in human activity. The contents of the book are divided into two parts. Part I, consisting of four chapters, is devoted to selected links of computational intelligence, medicine, health care and biomechanics. Several problems are considered: estimation of healthcare system reliability, classification of ultrasound thyroid images, application of fuzzy logic to measure weight status and central fatness, and deriving kinematics directly from video records. Part II, also consisting of four chapters, is devoted to selected links of computational intelligence and biology. The common denominato...

  12. State-of-the-art in Heterogeneous Computing

    Directory of Open Access Journals (Sweden)

    Andre R. Brodtkorb

    2010-01-01

    Full Text Available Node-level heterogeneous architectures have become attractive during the last decade for several reasons: compared to traditional symmetric CPUs, they offer high peak performance and are energy- and/or cost-efficient. With the increase of fine-grained parallelism in high-performance computing, as well as the introduction of parallelism in workstations, there is an acute need for a good overview and understanding of these architectures. We give an overview of the state of the art in heterogeneous computing, focusing on three commonly found architectures: the Cell Broadband Engine Architecture, graphics processing units (GPUs), and field-programmable gate arrays (FPGAs). We present a review of hardware, available software tools, and an overview of state-of-the-art techniques and algorithms. Furthermore, we present a qualitative and quantitative comparison of the architectures, and give our view on the future of heterogeneous computing.

  13. Interior spatial layout with soft objectives using evolutionary computation

    NARCIS (Netherlands)

    Chatzikonstantinou, I.; Bengisu, E.

    2016-01-01

    This paper presents the design problem of furniture arrangement in a residential interior living space, and addresses it by means of evolutionary computation. Interior arrangement is an important and interesting problem that occurs commonly when designing living spaces. It entails determining the

  14. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    Science.gov (United States)

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

    Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel, potentially active compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, specifications and advantages are compared between experimental and computational FBDD, and limitations and future prospects are discussed and emphasized.

  15. [INVITED] Computational intelligence for smart laser materials processing

    Science.gov (United States)

    Casalino, Giuseppe

    2018-03-01

    Computational intelligence (CI) involves using computer algorithms to capture hidden knowledge from data and to use it to train an "intelligent machine" to make complex decisions without human intervention. As simulation becomes more prevalent from design and planning to manufacturing and operations, laser material processing can also benefit from computer-generated knowledge through soft computing. This work reviews the state of the art in the methodology and applications of CI in laser materials processing (LMP), which is now receiving increasing interest from world-class manufacturers and Industry 4.0. The focus is on methods that have proven effective and robust in solving several problems in welding, cutting, drilling, surface treatment and additive manufacturing using the laser beam. After a basic description of the computational intelligence techniques most commonly employed in manufacturing, four sections, covering laser joining, machining, surface treatment, and additive manufacturing, review the most recent applications in the already extensive literature on CI in LMP. Finally, emerging trends and future challenges are identified and discussed.

  16. Investigation into the effect of common factors on rolling resistance of belt conveyor

    Directory of Open Access Journals (Sweden)

    Lu Yan

    2015-08-01

    Full Text Available Since indentation rolling resistance accounts for the major part of the total resistance of a belt conveyor, it is important to compute it with a proper method during the design and application study of the belt conveyor. First, an approximate formula for computing the indentation rolling resistance is offered. In this formula, a one-dimensional Winkler foundation and a three-parameter viscoelastic Maxwell solid model of the belt backing material are used to determine the resistance to motion of a conveyor belt over idlers. With the help of this formula, the authors analyze the effect of common factors on the rolling resistance. Finally, experiments are carried out under certain conditions and compared with the theoretical analysis. A reasonable correlation exists between the experimental results and the theoretical formulae.

  17. Computed tomography features of basal ganglia and periventricular

    African Journals Online (AJOL)

    HIV is probably the most common cause of basal ganglia and periventricular calcification today. Non-enhanced computed tomography (NECT) shows diffuse cerebral atrophy in 90% of cases. Bilateral, symmetrical basal ganglia calcification is seen in 30% of cases, but virtually never before 1 year of age.1 CMV (FIG.2).

  18. Cloud Computing and Its Applications in GIS

    Science.gov (United States)

    Kang, Cao

    2011-12-01

    of cloud computing. This paper presents a parallel Euclidean distance algorithm that works seamlessly with the distributed nature of cloud computing infrastructures. The mechanism of this algorithm is to subdivide a raster image into sub-images and wrap them with a one pixel deep edge layer of individually computed distance information. Each sub-image is then processed by a separate node, after which the resulting sub-images are reassembled into the final output. It is shown that while any rectangular sub-image shape can be used, those approximating squares are computationally optimal. This study also serves as a demonstration of this subdivide and layer-wrap strategy, which would enable the migration of many truly spatial GIS algorithms to cloud computing infrastructures. However, this research also indicates that certain spatial GIS algorithms such as cost distance cannot be migrated by adopting this mechanism, which presents significant challenges for the development of cloud-based GIS systems. The third article is entitled "A Distributed Storage Schema for Cloud Computing based Raster GIS Systems". This paper proposes a NoSQL Database Management System (NDDBMS) based raster GIS data storage schema. NDDBMS has good scalability and is able to use distributed commodity computers, which make it superior to Relational Database Management Systems (RDBMS) in a cloud computing environment. In order to provide optimized data service performance, the proposed storage schema analyzes the nature of commonly used raster GIS data sets. It discriminates two categories of commonly used data sets, and then designs corresponding data storage models for both categories. As a result, the proposed storage schema is capable of hosting and serving enormous volumes of raster GIS data speedily and efficiently on cloud computing infrastructures. In addition, the scheme also takes advantage of the data compression characteristics of Quadtrees, thus promoting efficient data storage. Through

  19. Making the Common Good Common

    Science.gov (United States)

    Chase, Barbara

    2011-01-01

    How are independent schools to be useful to the wider world? Beyond their common commitment to educate their students for meaningful lives in service of the greater good, can they educate a broader constituency and, thus, share their resources and skills more broadly? Their answers to this question will be shaped by their independence. Any…

  20. Genetic Patterns of Common-Bean Seed Acquisition and Early-Stage Adoption Among Farmer Groups in Western Uganda

    Science.gov (United States)

    Wilkus, Erin L.; Berny Mier y Teran, Jorge C.; Mukankusi, Clare M.; Gepts, Paul

    2018-01-01

    Widespread adoption of new varieties can be valuable, especially where improved agricultural production technologies are hard to access. However, as farmers adopt new varieties, in situ population structure and genetic diversity of their seed holdings can change drastically. Consequences of adoption are still poorly understood due to a lack of crop genetic diversity assessments and detailed surveys of farmers' seed management practices. Common bean (Phaseolus vulgaris) is an excellent model for these types of studies, as it has a long history of cultivation among smallholder farmers, exhibits eco-geographic patterns of diversity (e.g., Andean vs. Mesoamerican gene-pools), and has been subjected to post-Columbian dispersal and recent introduction of improved cultivars. The Hoima district of western Uganda additionally provides an excellent social setting for evaluating consequences of adoption because access to improved varieties has varied across farmer groups in this production region. This study establishes a baseline understanding of the common bean diversity found among household producers in Uganda and compares the crop population structure, diversity and consequences of adoption of household producers with different adoption practices. Molecular diversity analysis, based on 4,955 single nucleotide polymorphism (SNP) markers, evaluated a total of 1,156 seed samples that included 196 household samples collected from household producers in the Hoima district, 19 breeder-selected varieties used in participatory breeding activities that had taken place prior to the study in the region, and a global bean germplasm collection. Households that had participated in regional participatory breeding efforts were more likely to adopt new varieties and, consequently, diversify their seed stocks than those that had not participated. Of the three farmer groups that participated in breeding efforts, households from the farmer group with the longest history of bean production

  1. Computer Anxiety, Academic Stress, and Academic Procrastination on College Students

    Directory of Open Access Journals (Sweden)

    Wahyu Rahardjo

    2013-01-01

    Full Text Available Academic procrastination is fairly common among college students. A lack of understanding of how to make the best use of computer technology may lead to anxiety about operating a computer and hence to postponement of course assignments that involve computer use. On the other hand, failure to achieve the academic targets expected by parents and/or the students themselves also makes students less focused and leads to a tendency to postpone the completion of many course assignments. The aim of this research is to investigate the contribution of computer anxiety and academic stress to procrastination among students. A total of 65 students majoring in psychology participated in this study. The results showed that computer anxiety and academic stress play a significant role in academic procrastination among social sciences students. In terms of academic procrastination tendencies, computer anxiety and academic stress, male students show higher percentages than female students.

  2. Context-aware distributed cloud computing using CloudScheduler

    Science.gov (United States)

    Seuster, R.; Leavett-Brown, CR; Casteels, K.; Driemel, C.; Paterson, M.; Ring, D.; Sobie, RJ; Taylor, RP; Weldon, J.

    2017-10-01

    The distributed cloud using the CloudScheduler VM provisioning service is one of the longest running systems for HEP workloads. It has run millions of jobs for ATLAS and Belle II over the past few years using private and commercial clouds around the world. Our goal is to scale the distributed cloud to the 10,000-core level, with the ability to run any type of application (low I/O, high I/O and high memory) on any cloud. To achieve this goal, we have been implementing changes that utilize context-aware computing designs that are currently employed in the mobile communication industry. Context-awareness makes use of real-time and archived data to respond to user or system requirements. In our distributed cloud, we have many opportunistic clouds with no local HEP services, software or storage repositories. A context-aware design significantly improves the reliability and performance of our system by locating the nearest location of the required services. We describe how we are collecting and managing contextual information from our workload management systems, the clouds, the virtual machines and our services. This information is used not only to monitor the system but also to carry out automated corrective actions. We are incrementally adding new alerting and response services to our distributed cloud. This will enable us to scale the number of clouds and virtual machines. Further, a context-aware design will enable us to run analysis or high I/O application on opportunistic clouds. We envisage an open-source HTTP data federation (for example, the DynaFed system at CERN) as a service that would provide us access to existing storage elements used by the HEP experiments.

  3. Age group athletes in inline skating: decrease in overall and increase in master athlete participation in the longest inline skating race in Europe – the Inline One-Eleven

    Science.gov (United States)

    Teutsch, Uwe; Knechtle, Beat; Rüst, Christoph Alexander; Rosemann, Thomas; Lepers, Romuald

    2013-01-01

    Background Participation and performance trends in age group athletes have been investigated in endurance and ultraendurance races in swimming, cycling, running, and triathlon, but not in long-distance inline skating. The aim of this study was to investigate trends in participation, age, and performance in the longest inline race in Europe, the Inline One-Eleven over 111 km, held between 1998 and 2009. Methods The total number, age distribution, age at the time of the competition, and race times of male and female finishers at the Inline One-Eleven were analyzed. Results Overall participation increased until 2003 but decreased thereafter. During the 12-year period, the relative participation in skaters younger than 40 years old decreased while relative participation increased for skaters older than 40 years. The mean top ten skating time was 199 ± 9 minutes (range: 189–220 minutes) for men and 234 ± 17 minutes (range: 211–271 minutes) for women, respectively. The gender difference in performance remained stable at 17% ± 5% across years. Conclusion To summarize, although the participation of master long-distance inline skaters increased, the overall participation decreased across years in the Inline One-Eleven. The race times of the best female and male skaters stabilized across years with a gender difference in performance of 17% ± 5%. Further studies should focus on the participation in the international World Inline Cup races. PMID:23690697

  4. Basic definitions for discrete modeling of computer worms epidemics

    Directory of Open Access Journals (Sweden)

    Pedro Guevara López

    2015-01-01

    Full Text Available Information technologies have evolved in such a way that communication between computers or hosts has become common, so much so that organizations worldwide (governments and corporations) depend on it; the consequences of these computers stopping work for a long time would be catastrophic. Unfortunately, networks are attacked by malware such as viruses and worms that could collapse the system. This has served as motivation for the formal study of computer worms and epidemics in order to develop strategies for prevention and protection; this is why in this paper, before analyzing epidemiological models, a set of formal definitions based on set theory and functions is proposed for describing 21 concepts used in the study of worms. These definitions provide a basis for future qualitative research on the behavior of computer worms, and quantitative research on their epidemiological models.
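
    A compact discrete-time SIR-style iteration is the kind of epidemiological model alluded to here. The sketch below is a generic illustration with arbitrary parameters, not the formal definitions proposed in the paper.

        def worm_epidemic(hosts=10_000, infected0=10, beta=0.3, gamma=0.1, steps=100):
            """Discrete-time SIR dynamics: susceptible, infected, removed host counts per step."""
            s, i, r = hosts - infected0, infected0, 0
            history = [(s, i, r)]
            for _ in range(steps):
                new_infections = beta * s * i / hosts   # random-mixing contact rate
                new_removals = gamma * i                # hosts patched or taken offline
                s -= new_infections
                i += new_infections - new_removals
                r += new_removals
                history.append((round(s), round(i), round(r)))
            return history

        peak_step, peak = max(enumerate(h[1] for h in worm_epidemic()), key=lambda t: t[1])
        print(f"epidemic peaks at step {peak_step} with ~{round(peak)} infected hosts")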

  5. Partially annotated bibliography for computer protection and related topics

    Energy Technology Data Exchange (ETDEWEB)

    Huskamp, J.C.

    1976-07-20

    References for the commonly cited technical papers in the area of computer protection are given. Great care is taken to exclude papers with no technical content or merit. For the purposes of this bibliography, computer protection is broadly defined to encompass all facets of the protection problem. The papers cover, but are not limited to, the topics of protection features in operating systems (e.g., MULTICS and HYDRA), hardware implementations of protection facilities (e.g., Honeywell 6180, System 250, BCC 5000, B6500), data base protection controls, confinement and protection models. Since computer protection is related to many other areas in computer science and electrical engineering, a bibliography of related areas is included after the protection bibliography. These sections also include articles of general interest in the named areas which are not necessarily related to protection.

  6. Computed tomography of the gastro-intestinal tract : its value

    Energy Technology Data Exchange (ETDEWEB)

    Reeders, J W.A.J. [Academic Medical Center, Amsterdam (Netherlands)

    1996-12-31

    The subjects discussed include indications and accuracy of CT - computed tomography, technical considerations, common pitfalls in CT interpretation, parameters for CT evaluation, benign lesions, double halo and target signs, hyperattenuated, inflammatory bowel disease, intestinal ischaemia, primary adenocarcinoma of the GIT, lymphoma and leiomyosarcoma (3 refs.).

  7. Computed tomography of the gastro-intestinal tract : its value

    International Nuclear Information System (INIS)

    Reeders, J.W.A.J.

    1995-01-01

    The subjects discussed include indications and accuracy of CT - computed tomography, technical considerations, common pitfalls in CT interpretation, parameters for CT evaluation, benign lesions, double halo and target signs, hyperattenuated, inflammatory bowel disease, intestinal ischaemia, primary adenocarcinoma of the GIT, lymphoma and leiomyosarcoma (3 refs.)

  8. Functional tests of a prototype for the CMS-ATLAS common non-event data handling framework

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00366910; The ATLAS collaboration; Formica, Andrea

    2017-01-01

    Since 2014 the ATLAS and CMS experiments have shared a common vision of the database infrastructure for the handling of non-event data in forthcoming LHC runs. The wide commonality in the use cases has allowed the two experiments to agree on a common overall design solution that meets the requirements of both. A first prototype was completed in 2016 and has been made available to both experiments. The prototype is based on a web service implementing a REST API with a set of functions for the management of conditions data. In this contribution, we describe the prototype architecture and the tests that have been performed within the CMS computing infrastructure, with the aim of validating the support of the main use cases and of suggesting future improvements.
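
    As an illustration of how a client might interact with such a REST-based conditions service, the sketch below issues a simple HTTP GET for a conditions payload. The base URL, resource name, and query parameters are hypothetical placeholders and do not describe the prototype's actual API.

```python
import requests

# Hypothetical base URL and resource names; the prototype's real endpoints
# are not specified in this record.
BASE_URL = "https://conditions.example.cern.ch/api"

def get_iov_payload(tag, run_number):
    """Fetch the conditions payload valid for a given tag and run (illustrative only)."""
    resp = requests.get(
        f"{BASE_URL}/iovs",
        params={"tag": tag, "since": run_number},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(get_iov_payload("ExampleAlignmentTag-v1", 300000))
```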

  9. Reliability of the NINDS common data elements cranial tomography (CT) rating variables for traumatic brain injury (TBI)

    NARCIS (Netherlands)

    Harburg, Leah; McCormack, Erin; Kenney, Kimbra; Moore, Carol; Yang, Kelly; Vos, Pieter; Jacobs, Bram; Madden, Christopher J; Diaz-Arrastia, Ramon R; Bogoslovsky, Tanya

    2017-01-01

    Background: Non-contrast head computer tomography (CT) is widely used to evaluate eligibility of patients after acute traumatic brain injury (TBI) for clinical trials. The NINDS Common Data Elements (CDEs) TBI were developed to standardize collection of CT variables. The objectives of this study

  10. Polyphony: A Workflow Orchestration Framework for Cloud Computing

    Science.gov (United States)

    Shams, Khawaja S.; Powell, Mark W.; Crockett, Tom M.; Norris, Jeffrey S.; Rossi, Ryan; Soderstrom, Tom

    2010-01-01

    Cloud Computing has delivered unprecedented compute capacity to NASA missions at affordable rates. Missions like the Mars Exploration Rovers (MER) and Mars Science Lab (MSL) are enjoying the elasticity that enables them to leverage hundreds, if not thousands, of machines for short durations without making any hardware procurements. In this paper, we describe Polyphony, a resilient, scalable, and modular framework that efficiently leverages a large set of computing resources to perform parallel computations. Polyphony can employ resources on the cloud, excess capacity on local machines, as well as spare resources at the supercomputing center, and it enables these resources to work in concert to accomplish a common goal. Polyphony is resilient to node failures, even if they occur in the middle of a transaction. We conclude with an evaluation of a production-ready application built on top of Polyphony to perform image-processing operations on images from around the solar system, including Mars, Saturn, and Titan.

  11. Cloud computing for protein-ligand binding site comparison.

    Science.gov (United States)

    Hung, Che-Lun; Hua, Guan-Jie

    2013-01-01

    The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross reactivity and toxicity. The well-known and commonly used software, SMAP, has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high availability, high performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computer-intensive questions in biology and drug discovery.
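
    To make the MapReduce formulation concrete, the sketch below mimics the pattern on toy data: a map phase scores individual binding-site pairs and a reduce phase groups the scores per site. The site records and similarity function are placeholders; the real Cloud-PLBS service delegates the pairwise comparison to SMAP on a Hadoop cluster.

```python
from itertools import combinations
from collections import defaultdict

# Toy stand-ins for binding-site records; the real system compares 3D sites with SMAP.
sites = {"siteA": "ALA GLY SER HIS", "siteB": "ALA GLY THR HIS", "siteC": "TRP PHE TYR LEU"}

def similarity(a, b):
    """Placeholder score: Jaccard overlap of residue types (SMAP uses 3D alignment)."""
    ra, rb = set(a.split()), set(b.split())
    return len(ra & rb) / len(ra | rb)

def map_phase(pairs):
    # Map step: score one site pair per task, emitting (site, (partner, score)) records.
    for a, b in pairs:
        score = similarity(sites[a], sites[b])
        yield a, (b, score)
        yield b, (a, score)

def reduce_phase(mapped):
    # Reduce step: collect all scored partners for each site, best matches first.
    grouped = defaultdict(list)
    for key, value in mapped:
        grouped[key].append(value)
    return {k: sorted(v, key=lambda x: -x[1]) for k, v in grouped.items()}

print(reduce_phase(map_phase(combinations(sites, 2))))
```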

  12. Computed tomography features of small bowel volvulus

    International Nuclear Information System (INIS)

    Loh, Y.H.; Dunn, G.D.

    2000-01-01

    Small bowel volvulus is a cause of acute abdomen and commonly occurs in neonates and young infants. Although it is rare in adults in the Western world, it is a relatively common surgical emergency in the Middle East, India and Central Africa. It is associated with a mortality rate of 10-67% and, hence, it is important to make an early diagnosis to expedite surgical intervention. Computed tomography has become an important imaging modality in diagnosis and a number of signs have been recognized in a handful of documented case reports. We describe a case of small bowel volvulus that illustrates these important CT signs. Copyright (1999) Blackwell Science Pty Ltd

  13. Exploiting graphics processing units for computational biology and bioinformatics.

    Science.gov (United States)

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
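
    For readers unfamiliar with the running example, the sketch below computes the all-pairs Euclidean distance matrix on the CPU with NumPy; it is a plain reference implementation for comparison purposes, not the article's CUDA kernel.

```python
import numpy as np

def all_pairs_distances(X):
    """Euclidean distance between every pair of rows in X (n_instances x n_features)."""
    # ||x_i - x_j||^2 = ||x_i||^2 + ||x_j||^2 - 2 x_i . x_j
    sq = np.sum(X * X, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    np.maximum(d2, 0.0, out=d2)   # guard against tiny negative values from round-off
    return np.sqrt(d2)

X = np.random.rand(1000, 32)
D = all_pairs_distances(X)
print(D.shape)   # (1000, 1000)
```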

  14. A survey of current trends in computational drug repositioning.

    Science.gov (United States)

    Li, Jiao; Zheng, Si; Chen, Bin; Butte, Atul J; Swamidass, S Joshua; Lu, Zhiyong

    2016-01-01

    Computational drug repositioning or repurposing is a promising and efficient tool for discovering new uses for existing drugs and holds great potential for precision medicine in the age of big data. The explosive growth of large-scale genomic and phenotypic data, as well as data of small molecular compounds with granted regulatory approval, is enabling new developments for computational repositioning. To achieve the shortest path toward new drug indications, advanced data processing and analysis strategies are critical for making sense of these heterogeneous molecular measurements. In this review, we show recent advancements in the critical areas of computational drug repositioning from multiple aspects. First, we summarize available data sources and the corresponding computational repositioning strategies. Second, we characterize the commonly used computational techniques. Third, we discuss validation strategies for repositioning studies, including both computational and experimental methods. Finally, we highlight potential opportunities and use-cases, including a few target areas such as cancers. We conclude with a brief discussion of the remaining challenges in computational drug repositioning. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  15. Computed tomographic findings in penetrating peptic ulcer

    International Nuclear Information System (INIS)

    Madrazo, B.L.; Halpert, R.D.; Sandler, M.A.; Pearlberg, J.L.

    1984-01-01

    Four cases of peptic ulcer penetrating the head of the pancreas were diagnosed by computed tomography (CT). Findings common to 3 cases included (a) an ulcer crater, (b) a sinus tract, and (c) enlargement of the head of the pancreas. Unlike other modalities, the inherent spatial resolution of CT allows a convenient diagnosis of this important complication of peptic ulcer disease

  16. Computed tomography of obstructive jaundice

    International Nuclear Information System (INIS)

    Suh, Jung Hek; Lee, Joong Suk; Chun, Beung He; Suh, Soo Jhi

    1982-01-01

    It is well known that computed tomography (CT) is very useful in the evaluation of obstructive jaundice. We studied 55 cases of obstructive jaundice with a whole-body scanner from Jun. 1980 to Jun. 1981. The results were as follows: 1. The sex distribution was 36 males and 19 females, and 40 cases of obstructive jaundice were seen in the fifth, sixth, and seventh decades. 2. Causes of obstructive jaundice were 25 cases of pancreas cancer, 8 cases of common duct cancer, 4 cases of gallbladder cancer, 4 cases of ampulla of Vater cancer, 12 cases of common duct stone, and 2 cases of common duct stricture. 3. Levels of obstruction were 8 cases at the hepatic portion, 15 cases at the suprapancreatic portion, 28 cases at the pancreatic portion, and 4 cases at the ampullary portion. 4. In tumorous conditions, CT demonstrated metastasis to other organs: 9 cases in the liver, 1 case in the lung, 3 cases in the pancreas, 3 cases in the common bile duct, 1 case in the stomach, and 12 cases in adjacent lymph nodes. 5. Associated diseases were 12 cases of intrahepatic stone, 4 cases of clonorchiasis, 2 cases of pancreas pseudocyst, 1 case of hydronephrosis, and 1 case of renal cyst.

  17. Airborne Cloud Computing Environment (ACCE)

    Science.gov (United States)

    Hardman, Sean; Freeborn, Dana; Crichton, Dan; Law, Emily; Kay-Im, Liz

    2011-01-01

    Airborne Cloud Computing Environment (ACCE) is JPL's internal investment to improve the return on airborne missions by improving the development performance of the data system and the return on the captured science data. The investment is to develop a common science data system capability for airborne instruments that encompasses the end-to-end lifecycle covering planning, provisioning of data system capabilities, and support for scientific analysis, in order to improve the quality, cost effectiveness, and capabilities that enable new scientific discovery and research in earth observation.

  18. The Development of Educational and/or Training Computer Games for Students with Disabilities

    Science.gov (United States)

    Kwon, Jungmin

    2012-01-01

    Computer and video games have much in common with the strategies used in special education. Free resources for game development are becoming more widely available, so lay computer users, such as teachers and other practitioners, now have the capacity to develop games using a low budget and a little self-teaching. This article provides a guideline…

  19. Fluid Centrality: A Social Network Analysis of Social-Technical Relations in Computer-Mediated Communication

    Science.gov (United States)

    Enriquez, Judith Guevarra

    2010-01-01

    In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…

  20. Common patterns in 558 diagnostic radiology errors.

    Science.gov (United States)

    Donald, Jennifer J; Barnard, Stuart A

    2012-04-01

    As a Quality Improvement initiative our department has held regular discrepancy meetings since 2003. We performed a retrospective analysis of the cases presented and identified the most common pattern of error. A total of 558 cases were referred for discussion over 92 months, and errors were classified as perceptual or interpretative. The most common patterns of error for each imaging modality were analysed, and the misses were scored by consensus as subtle or non-subtle. Of 558 diagnostic errors, 447 (80%) were perceptual and 111 (20%) were interpretative errors. Plain radiography and computed tomography (CT) scans were the most frequent imaging modalities accounting for 246 (44%) and 241 (43%) of the total number of errors, respectively. In the plain radiography group 120 (49%) of the errors occurred in chest X-ray reports with perceptual miss of a lung nodule occurring in 40% of this subgroup. In the axial and appendicular skeleton missed fractures occurred most frequently, and metastatic bone disease was overlooked in 12 of 50 plain X-rays of the pelvis or spine. The majority of errors within the CT group were in reports of body scans with the commonest perceptual errors identified including 16 missed significant bone lesions, 14 cases of thromboembolic disease and 14 gastrointestinal tumours. Of the 558 errors, 312 (56%) were considered subtle and 246 (44%) non-subtle. Diagnostic errors are not uncommon and are most frequently perceptual in nature. Identification of the most common patterns of error has the potential to improve the quality of reporting by improving the search behaviour of radiologists. © 2012 The Authors. Journal of Medical Imaging and Radiation Oncology © 2012 The Royal Australian and New Zealand College of Radiologists.

  1. Zero-Slack, Noncritical Paths

    Science.gov (United States)

    Simons, Jacob V., Jr.

    2017-01-01

    The critical path method/program evaluation and review technique (CPM/PERT) approach to project scheduling is based on the importance of managing a project's critical path(s). Although a critical path is the longest path through a network, its location in large projects is facilitated by the computation of activity slack. However, logical fallacies in…
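
    To illustrate the slack computation the abstract refers to, the sketch below runs a forward and backward pass over a small, made-up activity network; the activities and durations are illustrative only, not taken from the article.

```python
# Toy activity network: activity -> (duration, list of predecessors). Illustrative data only.
activities = {
    "A": (3, []), "B": (2, ["A"]), "C": (4, ["A"]),
    "D": (2, ["B", "C"]), "E": (1, ["D"]),
}

# Forward pass: earliest start/finish (dict is already in topological order here).
ES, EF = {}, {}
for act, (dur, preds) in activities.items():
    ES[act] = max((EF[p] for p in preds), default=0)
    EF[act] = ES[act] + dur

project_length = max(EF.values())

# Backward pass: latest finish/start.
LF, LS = {}, {}
for act in reversed(list(activities)):
    successors = [s for s, (_, preds) in activities.items() if act in preds]
    LF[act] = min((LS[s] for s in successors), default=project_length)
    LS[act] = LF[act] - activities[act][0]

# Slack (total float); zero-slack activities lie on the critical path.
for act in activities:
    print(act, "slack =", LS[act] - ES[act])
```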

  2. Analytical and laser scanning techniques to determine shape properties of aggregates used in pavements

    CSIR Research Space (South Africa)

    Komba, Julius J

    2013-06-01

    Full Text Available and volume of an aggregate particle, the sphericity computed by using orthogonal dimensions of an aggregate particle, and the flat and elongated ratio computed by using longest and smallest dimensions of an aggregate particle. The second approach employed… Further validation of the laser-based technique was achieved by correlating the laser-based aggregate form indices with the results from two current standard tests: the flakiness index and the flat and elongated particles ratio tests. The laser…
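
    As a concrete illustration of the two indices named above, the sketch below computes a Krumbein-type sphericity and a flat-and-elongated ratio from three orthogonal particle dimensions. The formulas and example dimensions are common textbook definitions assumed here; they are not necessarily the exact indices used in the study.

```python
def sphericity(longest, intermediate, smallest):
    """Krumbein-type sphericity from the three orthogonal particle dimensions
    (one common definition; the paper's exact index may differ)."""
    return ((intermediate * smallest) / (longest ** 2)) ** (1.0 / 3.0)

def flat_elongated_ratio(longest, smallest):
    """Ratio of the longest to the smallest orthogonal dimension."""
    return longest / smallest

# Example particle dimensions in mm (illustrative values).
L, I, S = 25.0, 14.0, 7.0
print(f"sphericity = {sphericity(L, I, S):.2f}")                      # ~0.54
print(f"flat & elongated ratio = {flat_elongated_ratio(L, S):.1f}")   # ~3.6
```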

  3. Common features of microRNA target prediction tools

    Directory of Open Access Journals (Sweden)

    Sarah M. Peterson

    2014-02-01

    Full Text Available The human genome encodes for over 1800 microRNAs, which are short noncoding RNA molecules that function to regulate gene expression post-transcriptionally. Due to the potential for one microRNA to target multiple gene transcripts, microRNAs are recognized as a major mechanism to regulate gene expression and mRNA translation. Computational prediction of microRNA targets is a critical initial step in identifying microRNA:mRNA target interactions for experimental validation. The available tools for microRNA target prediction encompass a range of different computational approaches, from the modeling of physical interactions to the incorporation of machine learning. This review provides an overview of the major computational approaches to microRNA target prediction. Our discussion highlights three tools for their ease of use, reliance on relatively updated versions of miRBase, and range of capabilities, and these are DIANA-microT-CDS, miRanda-mirSVR, and TargetScan. In comparison across all microRNA target prediction tools, four main aspects of the microRNA:mRNA target interaction emerge as common features on which most target prediction is based: seed match, conservation, free energy, and site accessibility. This review explains these features and identifies how they are incorporated into currently available target prediction tools. MicroRNA target prediction is a dynamic field with increasing attention on development of new analysis tools. This review attempts to provide a comprehensive assessment of these tools in a manner that is accessible across disciplines. Understanding the basis of these prediction methodologies will aid in user selection of the appropriate tools and interpretation of the tool output.
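
    Of the four common features listed above, the seed match is the simplest to illustrate. The sketch below checks a 6mer seed match (miRNA positions 2-7, Watson-Crick complement) against a 3'UTR; the sequences are illustrative, and real tools additionally score conservation, free energy, and site accessibility.

```python
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def seed_sites(mirna, utr, seed_start=1, seed_end=7):
    """Return 0-based UTR positions matching the reverse complement of the
    miRNA seed (positions 2-7, i.e. indices 1..6). Simplified 6mer match only."""
    seed = mirna[seed_start:seed_end]
    target = "".join(COMPLEMENT[b] for b in reversed(seed))
    return [i for i in range(len(utr) - len(target) + 1) if utr[i:i + len(target)] == target]

# Illustrative RNA sequences, not real transcripts.
mirna = "UGAGGUAGUAGGUUGUAUAGUU"    # let-7-like sequence
utr = "AAACUACCUCAGGGAAUACCUCAAA"
print(seed_sites(mirna, utr))       # two candidate seed-match sites
```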

  4. Computation of rainfall erosivity from daily precipitation amounts.

    Science.gov (United States)

    Beguería, Santiago; Serrano-Notivoli, Roberto; Tomas-Burguera, Miquel

    2018-10-01

    Rainfall erosivity is an important parameter in many erosion models, and the EI30 defined by the Universal Soil Loss Equation is one of the best known erosivity indices. One issue with this and other erosivity indices is that they require continuous breakpoint, or high frequency time interval, precipitation data. These data are rare, in comparison to more common medium-frequency data, such as daily precipitation data commonly recorded by many national and regional weather services. Devising methods for computing estimates of rainfall erosivity from daily precipitation data that are comparable to those obtained by using high-frequency data is, therefore, highly desired. Here we present a method for producing such estimates, based on optimal regression tools such as the Gamma Generalised Linear Model and universal kriging. Unlike other methods, this approach produces unbiased and very close to observed EI30, especially when these are aggregated at the annual level. We illustrate the method with a case study comprising more than 1500 high-frequency precipitation records across Spain. Although the original records have a short span (the mean length is around 10 years), computation of spatially-distributed upscaling parameters offers the possibility to compute high-resolution climatologies of the EI30 index based on currently available, long-span, daily precipitation databases. Copyright © 2018 Elsevier B.V. All rights reserved.
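
    As a rough illustration of the modelling idea (not the paper's actual specification, which also involves universal kriging of spatially distributed upscaling parameters), the sketch below fits a Gamma generalised linear model with a log link relating a synthetic erosivity index to daily precipitation; it assumes the statsmodels package.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic daily data: precipitation (mm) and a synthetic EI30-like index for rainy days.
precip = rng.gamma(shape=2.0, scale=10.0, size=500)
ei30 = 0.3 * precip ** 1.5 * rng.gamma(shape=4.0, scale=0.25, size=500)

# Gamma GLM with log link: log E[EI30] = b0 + b1 * log(precip)
X = sm.add_constant(np.log(precip))
model = sm.GLM(ei30, X, family=sm.families.Gamma(link=sm.families.links.Log()))
fit = model.fit()
print(fit.params)    # roughly recovers intercept ~ log(0.3) and slope ~ 1.5
print(fit.predict(sm.add_constant(np.log([5.0, 20.0, 50.0]))))   # predicted EI30
```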

  5. Desiderata for computable representations of electronic health records-driven phenotype algorithms.

    Science.gov (United States)

    Mo, Huan; Thompson, William K; Rasmussen, Luke V; Pacheco, Jennifer A; Jiang, Guoqian; Kiefer, Richard; Zhu, Qian; Xu, Jie; Montague, Enid; Carrell, David S; Lingren, Todd; Mentch, Frank D; Ni, Yizhao; Wehbe, Firas H; Peissig, Peggy L; Tromp, Gerard; Larson, Eric B; Chute, Christopher G; Pathak, Jyotishman; Denny, Joshua C; Speltz, Peter; Kho, Abel N; Jarvik, Gail P; Bejan, Cosmin A; Williams, Marc S; Borthwick, Kenneth; Kitchner, Terrie E; Roden, Dan M; Harris, Paul A

    2015-11-01

    Electronic health records (EHRs) are increasingly used for clinical and translational research through the creation of phenotype algorithms. Currently, phenotype algorithms are most commonly represented as noncomputable descriptive documents and knowledge artifacts that detail the protocols for querying diagnoses, symptoms, procedures, medications, and/or text-driven medical concepts, and are primarily meant for human comprehension. We present desiderata for developing a computable phenotype representation model (PheRM). A team of clinicians and informaticians reviewed common features for multisite phenotype algorithms published in PheKB.org and existing phenotype representation platforms. We also evaluated well-known diagnostic criteria and clinical decision-making guidelines to encompass a broader category of algorithms. We propose 10 desired characteristics for a flexible, computable PheRM: (1) structure clinical data into queryable forms; (2) recommend use of a common data model, but also support customization for the variability and availability of EHR data among sites; (3) support both human-readable and computable representations of phenotype algorithms; (4) implement set operations and relational algebra for modeling phenotype algorithms; (5) represent phenotype criteria with structured rules; (6) support defining temporal relations between events; (7) use standardized terminologies and ontologies, and facilitate reuse of value sets; (8) define representations for text searching and natural language processing; (9) provide interfaces for external software algorithms; and (10) maintain backward compatibility. A computable PheRM is needed for true phenotype portability and reliability across different EHR products and healthcare systems. These desiderata are a guide to inform the establishment and evolution of EHR phenotype algorithm authoring platforms and languages. © The Author 2015. Published by Oxford University Press on behalf of the American Medical
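
    The sketch below illustrates desiderata (4)-(6) with a toy, hypothetical rule structure: two criteria combined with AND logic plus a temporal constraint, evaluated by a minimal interpreter. The field names, codes, and evaluation semantics are invented for illustration and do not correspond to an actual PheRM or PheKB format.

```python
from datetime import date, timedelta

# Hypothetical structured phenotype rule: "type 2 diabetes case" defined as a
# diagnosis code AND a medication, with a temporal constraint between the two.
phenotype = {
    "name": "T2DM_case_example",
    "logic": "AND",
    "criteria": [
        {"domain": "diagnosis", "value_set": ["E11.0", "E11.9"]},   # ICD-10 codes
        {"domain": "medication", "value_set": ["metformin"]},
    ],
    "temporal": {"relation": "within", "window": timedelta(days=365)},
}

def evaluate(patient, rule):
    """Tiny interpreter for the AND-rule above; real platforms use richer semantics."""
    first_dates = []
    for crit in rule["criteria"]:
        events = [e for e in patient[crit["domain"]] if e["code"] in crit["value_set"]]
        if not events:                         # AND logic: every criterion must be met
            return False
        first_dates.append(min(e["date"] for e in events))
    return abs(first_dates[0] - first_dates[1]) <= rule["temporal"]["window"]

patient = {
    "diagnosis": [{"code": "E11.9", "date": date(2014, 3, 1)}],
    "medication": [{"code": "metformin", "date": date(2014, 6, 15)}],
}
print(evaluate(patient, phenotype))   # True: both criteria met within 365 days
```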

  6. Common Graphics Library (CGL). Volume 1: LEZ user's guide

    Science.gov (United States)

    Taylor, Nancy L.; Hammond, Dana P.; Hofler, Alicia S.; Miner, David L.

    1988-01-01

    Users are introduced to and instructed in the use of the Langley Easy (LEZ) routines of the Common Graphics Library (CGL). The LEZ routines form an application independent graphics package which enables the user community to view data quickly and easily, while providing a means of generating scientific charts conforming to the publication and/or viewgraph process. A distinct advantage for using the LEZ routines is that the underlying graphics package may be replaced or modified without requiring the users to change their application programs. The library is written in ANSI FORTRAN 77, and currently uses a CORE-based underlying graphics package, and is therefore machine independent, providing support for centralized and/or distributed computer systems.

  7. Health Records and the Cloud Computing Paradigm from a Privacy Perspective

    OpenAIRE

    Stingl, Christian; Slamanig, Daniel

    2011-01-01

    With the advent of cloud computing, the realization of highly available electronic health records providing location-independent access seems to be very promising. However, cloud computing raises major security issues that need to be addressed particularly within the health care domain. The protection of the privacy of individuals often seems to be left on the sidelines. For instance, common protection against malicious insiders, i.e., non-disclosure agreements, is purely organizational. Clea...

  8. Computation system for nuclear reactor core analysis

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system and the aspects common to the computational modules are discussed, but the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  9. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    Science.gov (United States)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  10. Ruling the Commons. Introducing a new methodology for the analysis of historical commons

    Directory of Open Access Journals (Sweden)

    Tine de Moor

    2016-10-01

    Full Text Available Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area within commons studies. During the last years an international team of historians have worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge on the dynamics of institutions for collective action – in particular commons. This project aims to contribute to the current debate on commons on three different fronts. Theoretically, it explicitly draws our attention to issues of change and adaptation in the commons – contrasting with more static analyses. Empirically, it highlights the value of historical records as a rich source of information for longitudinal analysis of the functioning of commons. Methodologically, it develops a systematic way of analyzing and comparing commons’ regulations across regions and time, setting a number of variables that have been defined on the basis of the “most common denominators” in commons regulation across countries and time periods. In this paper we introduce the project, describe our sources and methodology, and present the preliminary results of our analysis.

  11. The DIII-D Computing Environment: Characteristics and Recent Changes

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1999-01-01

    The DIII-D tokamak national fusion research facility along with its predecessor Doublet III has been operating for over 21 years. The DIII-D computing environment consists of real-time systems controlling the tokamak, heating systems, and diagnostics, and systems acquiring experimental data from instrumentation; major data analysis server nodes performing short term and long term data access and data analysis; and systems providing mechanisms for remote collaboration and the dissemination of information over the world wide web. Computer systems for the facility have undergone incredible changes over the course of time as the computer industry has changed dramatically. Yet there are certain valuable characteristics of the DIII-D computing environment that have been developed over time and have been maintained to this day. Some of these characteristics include: continuous computer infrastructure improvements, distributed data and data access, computing platform integration, and remote collaborations. These characteristics are being carried forward as well as new characteristics resulting from recent changes which have included: a dedicated storage system and a hierarchical storage management system for raw shot data, various further infrastructure improvements including deployment of Fast Ethernet, the introduction of MDSplus, LSF and common IDL based tools, and improvements to remote collaboration capabilities. This paper will describe this computing environment, important characteristics that over the years have contributed to the success of DIII-D computing systems, and recent changes to computer systems

  12. Towards a global monitoring system for CMS computing operations

    CERN Multimedia

    CERN. Geneva; Bauerdick, Lothar A.T.

    2012-01-01

    The operation of the CMS computing system requires a complex monitoring system to cover all its aspects: central services, databases, the distributed computing infrastructure, production and analysis workflows, the global overview of the CMS computing activities and the related historical information. Several tools are available to provide this information, developed both inside and outside of the collaboration and often used in common with other experiments. Despite the fact that the current monitoring allowed CMS to successfully perform its computing operations, an evolution of the system is clearly required, to adapt to the recent changes in the data and workload management tools and models and to address some shortcomings that make its usage less than optimal. Therefore, a recent and ongoing coordinated effort was started in CMS, aiming at improving the entire monitoring system by identifying its weaknesses and the new requirements from the stakeholders, rationalise and streamline existing components and ...

  13. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to the modeling of tree crown development. These approaches are experimental (i.e. regressive), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The common assumption of all three is that a tree can be regarded as a fractal object, i.e. a collection of self-similar parts that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and of light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model against experimental data. In the paper the different stages of the above-mentioned approaches are described. The experimental data for spruce, the description of the computer system for modeling and a variant of the computer model are presented. (author). 9 refs, 4 figs

  14. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or model’s general inadequacy to its target should be blamed in the case of the model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling and validation, which shares the common epistemology with experimentation, to simulation. To explain reasons of their intermittent entanglement I propose a weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation generally applicable in practice and based on differences in epistemic strategies and scopes

  15. Is tube repair of aortic aneurysm followed by aneurysmal change in the common iliac arteries?

    Science.gov (United States)

    Provan, J L; Fialkov, J; Ameli, F M; St Louis, E L

    1990-10-01

    To address the concern that tube repair of an abdominal aortic aneurysm might be followed by aneurysmal change in the common iliac arteries, 23 patients who had undergone the operation were re-examined 3 to 5 years later. Although 9 had had minimal ectasia of these arteries preoperatively, in none of the 23 was there symptomatic or radiologic evidence of aneurysmal change on follow-up. Measurements of the maximum intraluminal diameters were made by computed tomography; they indicated no significant differences between the preoperative and follow-up sizes of the common iliac arteries. The variation in time to follow-up also showed no significant correlation with change in artery diameter.

  16. Gender differences in computer and information literacy: An exploration of the performances of girls and boys in ICILS 2013

    NARCIS (Netherlands)

    Punter, Renate Annemiek; Meelissen, Martina R.M.; Glas, Cornelis A.W.

    2017-01-01

    IEA’s International Computer and Information Literacy Study (ICILS) 2013 showed that in the majority of the participating countries, 14-year-old girls outperformed boys in computer and information literacy (CIL): results that seem to contrast with the common view of boys having better computer skills.

  17. Resilient and Robust High Performance Computing Platforms for Scientific Computing Integrity

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Yier [Univ. of Central Florida, Orlando, FL (United States)

    2017-07-14

    As technology advances, computer systems are subject to increasingly sophisticated cyber-attacks that compromise both their security and integrity. High performance computing platforms used in commercial and scientific applications involving sensitive, or even classified data, are frequently targeted by powerful adversaries. This situation is made worse by a lack of fundamental security solutions that both perform efficiently and are effective at preventing threats. Current security solutions fail to address the threat landscape and ensure the integrity of sensitive data. As challenges rise, both the private and public sectors will require robust technologies to protect their computing infrastructure. The research outcomes from this project aim to address these challenges. For example, we present LAZARUS, a novel technique to harden kernel Address Space Layout Randomization (KASLR) against paging-based side-channel attacks. In particular, our scheme allows for fine-grained protection of the virtual memory mappings that implement the randomization. We demonstrate the effectiveness of our approach by hardening a recent Linux kernel with LAZARUS, mitigating all of the previously presented side-channel attacks on KASLR. Our extensive evaluation shows that LAZARUS incurs only 0.943% overhead for standard benchmarks, and is therefore highly practical. We also introduce HA2lloc, a hardware-assisted allocator that is capable of leveraging an extended memory management unit to detect memory errors in the heap. We test HA2lloc in a simulation environment and find that the approach is capable of preventing common memory vulnerabilities.

  18. Activity report of Computing Research Center

    Energy Technology Data Exchange (ETDEWEB)

    1997-07-01

    In April 1997, the National Laboratory for High Energy Physics (KEK), the Institute for Nuclear Study, University of Tokyo (INS), and the Meson Science Laboratory, Faculty of Science, University of Tokyo were reorganized into the High Energy Accelerator Research Organization, with the aim of further developing the wide field of accelerator science based on high-energy accelerators. Within this Research Organization, the Applied Research Laboratory is composed of four Centers, formed by integrating the previous four centers and the related sections in Tanashi, to support research activities common to the whole Organization and to carry out the related research and development (R and D). This support is expected to cover not only general assistance but also the preparation and R and D of the systems required for the promotion and future plans of the research. Computer technology is essential to the development of this research and can be shared across the various research programmes of the Organization. In response to these expectations, the new Computing Research Center is required to carry out its duties in cooperation with researchers, ranging from R and D on data analysis for various experiments to computational physics driven by powerful computing capacity such as supercomputers. The first chapter of this report describes the work and present state of the Data Processing Center of KEK, the second chapter covers the computer room of INS, and future issues for the Computing Research Center are then discussed. (G.K.)

  19. Action and perception in literacy: A common-code for spelling and reading.

    Science.gov (United States)

    Houghton, George

    2018-01-01

    There is strong evidence that reading and spelling in alphabetical scripts depend on a shared representation (common-coding). However, computational models usually treat the two skills separately, producing a wide variety of proposals as to how the identity and position of letters is represented. This article treats reading and spelling in terms of the common-coding hypothesis for perception-action coupling. Empirical evidence for common representations in spelling-reading is reviewed. A novel version of the Start-End Competitive Queuing (SE-CQ) spelling model is introduced, and tested against the distribution of positional errors in Letter Position Dysgraphia, data from intralist intrusion errors in spelling to dictation, and dysgraphia because of nonperipheral neglect. It is argued that no other current model is equally capable of explaining this range of data. To pursue the common-coding hypothesis, the representation used in SE-CQ is applied, without modification, to the coding of letter identity and position for reading and lexical access, and a lexical matching rule for the representation is proposed (Start End Position Code model, SE-PC). Simulations show the model's compatibility with benchmark findings from form priming, its ability to account for positional effects in letter identification priming and the positional distribution of perseverative intrusion errors. The model supports the view that spelling and reading use a common orthographic description, providing a well-defined account of the major features of this representation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  20. Crowdsourcing cyber security: a property rights view of exclusion and theft on the information commons

    Directory of Open Access Journals (Sweden)

    Gary Shiffman

    2013-02-01

    Full Text Available Individuals increasingly rely upon the internet for basic economic interaction. Current cyber security mechanisms are unable to stop adversaries and hackers from gaining access to sensitive information stored on government, business, and public computers. Experts propose implementing attribution and audit frameworks in cyberspace to deter, prevent, and prosecute cyber criminals and attackers. However, this method faces significant policy and resource constraints. Social science research, specifically in law and economics, concerning common-pool resources suggests an organic approach to cyber security may yield an appropriate solution. This cyber commons method involves treating the internet as a commons and encouraging individuals and institutions to voluntarily implement innovative and adaptive monitoring mechanisms. Such mechanisms are already in use and in many cases have proven more effective than attribution mechanisms in resisting and tracing the source of cyber attacks.

  1. The common good

    OpenAIRE

    Argandoña, Antonio

    2011-01-01

    The concept of the common good occupied a relevant place in classical social, political and economic philosophy. After losing ground in the Modern age, it has recently reappeared, although with different and sometimes confusing meanings. This paper is the draft of a chapter of a Handbook; it explains the meaning of common good in the Aristotelian-Thomistic philosophy and in the Social Doctrine of the Catholic Church; why the common good is relevant; and how it is different from the other uses...

  2. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    Science.gov (United States)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    The 21st century geoscience faces challenges of Big Data, spike computing requirements (e.g., when natural disaster happens), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate the scientific research and discoveries. This presentation reports using GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing, integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies as existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform as a service (PaaS) packages. Based on these developed common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information. This

  3. Implicit computational complexity and compilers

    DEFF Research Database (Denmark)

    Rubiano, Thomas

    Complexity theory helps us predict and control the resources, usually time and space, consumed by programs. Static analysis based on specific syntactic criteria allows us to categorize some programs. A common approach is to observe the behavior of the program’s data. For instance, the detection of non… evolution and a lot of research came from this theory. Until now, these implicit complexity theories have essentially been applied to more or less toy languages. This thesis applies implicit computational complexity methods to “real life” programs by manipulating intermediate representation languages...

  4. Computation of peak discharge at culverts

    Science.gov (United States)

    Carter, Rolland William

    1957-01-01

    Methods for computing peak flood flow through culverts on the basis of a field survey of highwater marks and culvert geometry are presented. These methods are derived from investigations of culvert flow as reported in the literature and on extensive laboratory studies of culvert flow. For convenience in computation, culvert flow has been classified into six types, according to the location of the control section and the relative heights of the head-water and tail-water levels. The type of flow which occurred at any site can be determined from the field data and the criteria given in this report. A discharge equation has been developed for each flow type by combining the energy and continuity equations for the distance between an approach section upstream from the culvert and a terminal section within the culvert barrel. The discharge coefficient applicable to each flow type is listed for the more common entrance geometries. Procedures for computing peak discharge through culverts are outlined in detail for each of the six flow types.
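
    As a simplified example of the energy-based approach described above, the sketch below evaluates a discharge of the form Q = Cd·A·sqrt(2g·Δh) for a culvert flowing full with submerged inlet and outlet, neglecting barrel friction losses. The coefficient value and dimensions are illustrative assumptions and do not reproduce the report's tabulated coefficients or its six flow-type equations.

```python
from math import pi, sqrt

G = 9.81   # gravitational acceleration, m/s^2

def culvert_discharge_full_flow(area, headwater, tailwater, cd=0.85):
    """Discharge for a culvert flowing full with submerged inlet and outlet,
    using the simplified energy form Q = Cd * A * sqrt(2 g (h1 - h4)) and
    neglecting barrel friction losses. Cd = 0.85 is an assumed illustrative
    value; the report tabulates coefficients by flow type and entrance geometry."""
    head_difference = headwater - tailwater
    if head_difference <= 0:
        raise ValueError("headwater elevation must exceed tailwater elevation")
    return cd * area * sqrt(2.0 * G * head_difference)

# Example: 1.2 m diameter circular barrel with 0.9 m of head across the culvert.
area = pi * (1.2 / 2.0) ** 2
print(f"Q = {culvert_discharge_full_flow(area, headwater=3.4, tailwater=2.5):.2f} m^3/s")
```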

  5. Computed tomography in basilar artery occlusion in childhood

    International Nuclear Information System (INIS)

    Mori, Koreaki; Miwa, Soichi; Handa, Hajime

    1978-01-01

    A case of basilar artery occlusion in a 13-year-old boy is presented. Eighteen other cases of such occlusion in childhood in the relevant literature were analyzed, and then all nineteen cases were compared to adult cases. In comparison with adult cases, the following points were clear: (1) In children as well as in adults, basilar artery occlusion is more common in males. (2) As is well known, arteriosclerosis is the commonest cause in adults. In children, however, idiopathic and/or congenital occlusions are more common causes. (3) The main clinical manifestations in childhood as well as in maturity are consciousness disturbance, hemiplegia or quadriplegia, and pupillary abnormalities. (4) An occlusion of the proximal third of the basilar artery is common in adults, whereas an occlusion of the middle third is common in childhood. (5) Diagnosis is based on clinical manifestations, cerebral angiography, and computed tomography. (6) In contrast to the poor prognosis in adults, the prognosis is fairly good in children. (author)

  6. Common-image gathers in the offset domain from reverse-time migration

    KAUST Repository

    Zhan, Ge

    2014-04-01

    Kirchhoff migration is flexible to output common-image gathers (CIGs) in the offset domain by imaging data with different offsets separately. These CIGs supply important information for velocity model updates and amplitude-variation-with-offset (AVO) analysis. Reverse-time migration (RTM) offers more insights into complex geology than Kirchhoff migration by accurately describing wave propagation using the two-way wave equation. But, it has difficulty to produce offset domain CIGs like Kirchhoff migration. In this paper, we develop a method for obtaining offset domain CIGs from RTM. The method first computes the RTM operator of an offset gather, followed by a dot product of the operator and the offset data to form a common-offset RTM image. The offset domain CIGs are then achieved after separately migrating data with different offsets. We generate offset domain CIGs on both the Marmousi synthetic data and 2D Gulf of Mexico real data using this approach. © 2014.
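
    The procedure can be summarised in linear-algebra terms: if L_h denotes the linearized modeling operator for offset h, the common-offset image is the adjoint (migration) operator applied to that offset's data, m_h = L_h^T d_h, and stacking the m_h over offsets yields the offset-domain CIG. The sketch below uses small random matrices as placeholders for operators that, in practice, are applied matrix-free by reverse-time migration; sizes and offsets are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

n_model, n_data, offsets = 200, 150, [100.0, 300.0, 500.0]   # toy sizes and offsets (m)

# Placeholder linearized modeling operators L_h and recorded data d_h per offset.
operators = {h: rng.standard_normal((n_data, n_model)) for h in offsets}
data = {h: rng.standard_normal(n_data) for h in offsets}

# Common-offset images: adjoint (migration) operator applied to each offset's data.
images = {h: operators[h].T @ data[h] for h in offsets}

# Offset-domain common-image gather: stack the per-offset images along an offset axis.
cig = np.stack([images[h] for h in offsets], axis=1)   # shape (n_model, n_offsets)
print(cig.shape)
```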

  7. Perceptual Categories Derived from Reid’s “Common Sense” Philosophy

    Science.gov (United States)

    Reeves, Adam; Dresp-Langley, Birgitta

    2017-01-01

    The 18th-century Scottish ‘common sense’ philosopher Thomas Reid argued that perception can be distinguished on several dimensions from other categories of experience, such as sensation, illusion, hallucination, mental images, and what he called ‘fancy.’ We extend his approach to eleven mental categories, and discuss how these distinctions, often ignored in the empirical literature, bear on current research. We also score each category on five properties (ones abstracted from Reid) to form a 5 × 11 matrix, and thus can generate statistical measures of their mutual dependencies, a procedure that may have general interest as illustrating what we can call ‘computational philosophy.’ PMID:28634457
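
    As a small illustration of the kind of 'computational philosophy' described above, the sketch below builds a 5 × 11 matrix of property scores (rows = properties, columns = categories) and derives category-category dependencies from it. The scores are random placeholders, and the category names beyond those listed in the abstract are invented; neither reproduces the authors' actual ratings.

```python
import numpy as np

# Six categories named in the abstract plus five hypothetical ones to reach eleven.
categories = ["perception", "sensation", "illusion", "hallucination", "mental image",
              "fancy", "memory", "dream", "afterimage", "belief", "imagination"]

# Hypothetical 5 x 11 matrix of property scores (rows = properties, columns = categories).
rng = np.random.default_rng(0)
scores = rng.random((5, len(categories)))

# Pairwise dependency between categories: correlation of their 5-element property profiles.
corr = np.corrcoef(scores, rowvar=False)          # 11 x 11 category-category correlations
i, j = np.unravel_index(np.argmax(corr - np.eye(len(categories))), corr.shape)
print("most similar pair:", categories[i], "~", categories[j])
```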

  8. Statistical methods and computing for big data

    Science.gov (United States)

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with focuses on the open source R and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay. PMID:27695593
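
    As a loose illustration of the online-updating idea for stream data (not the authors' R implementation), the sketch below updates a logistic-regression fit block by block with scikit-learn's partial_fit, using an L1 penalty as a stand-in for variable selection; scikit-learn is an assumed dependency here.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
true_w = np.array([1.5, -2.0, 0.0, 0.5])          # third coefficient is truly zero

# Online logistic regression; L1 penalty acts as a simple variable-selection proxy.
clf = SGDClassifier(loss="log_loss", penalty="l1", alpha=1e-3, random_state=0)
# (use loss="log" on scikit-learn versions older than 1.1)

# Process the stream in blocks, updating the fit without revisiting earlier blocks.
for _ in range(50):
    X = rng.standard_normal((200, 4))
    p = 1.0 / (1.0 + np.exp(-(X @ true_w)))
    y = rng.binomial(1, p)
    clf.partial_fit(X, y, classes=[0, 1])

print(np.round(clf.coef_, 2))   # the irrelevant coefficient is shrunk toward zero
```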

  9. Statistical methods and computing for big data.

    Science.gov (United States)

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing; Yan, Jun

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with focuses on the open source R and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay.

  10. Computer-aided diagnosis of pneumoconiosis abnormalities extracted from chest radiographs scanned with a CCD scanner

    International Nuclear Information System (INIS)

    Abe, Koji; Minami, Masahide; Nakamura, Munehiro

    2011-01-01

    This paper presents a computer-aided diagnosis for pneumoconiosis radiographs obtained with a common charge-coupled device (CCD) scanner. Since current computer-aided diagnosis systems for pneumoconiosis are not practical for medical doctors, owing to the high cost of the special scanner they require, we propose a novel system which measures abnormalities of pneumoconiosis from lung images obtained with a common CCD scanner. Experimental results of discriminations between normal and abnormal cases for 56 right-lung images, including 6 standard pneumoconiosis images, show that the proposed abnormalities are well extracted according to the standards of pneumoconiosis categories. (author)

  11. Backfilling the Grid with Containerized BOINC in the ATLAS computing

    CERN Document Server

    Wu, Wenjing; The ATLAS collaboration

    2018-01-01

    Virtualization is a commonly used solution for utilizing opportunistic computing resources in the HEP field, as it provides a unified software and OS layer that HEP computing tasks require over heterogeneous opportunistic computing resources. However, there is always a performance penalty with virtualization; for short jobs, which are typical of volunteer computing tasks, the virtualization overhead becomes a large portion of the wall time and leads to low CPU efficiency. With the wide usage of containers in HEP computing, we explore the possibility of adopting container technology in the ATLAS BOINC project, and have implemented a Native version in BOINC which uses the Singularity container, or direct use of the target OS, to replace VirtualBox. In this paper, we discuss 1) the implementation and workflow of the Native version in ATLAS BOINC; 2) the performance of the Native version compared to the previous virtualization version; 3)...

  12. 2-D and 3-D computations of curved accelerator magnets

    International Nuclear Information System (INIS)

    Turner, L.R.

    1991-01-01

    In order to save computer memory, a long accelerator magnet may be computed by treating the long central region and the end regions separately. The dipole magnets for the injector synchrotron of the Advanced Photon Source (APS), now under construction at Argonne National Laboratory (ANL), employ magnet iron consisting of parallel laminations, stacked with a uniform radius of curvature of 33.379 m. Laplace's equation for the magnetic scalar potential has a different form for a straight magnet (x-y coordinates), a magnet with surfaces curved about a common center (r-θ coordinates), and a magnet with parallel laminations like the APS injector dipole. Yet pseudo 2-D computations for the three geometries give basically identical results, even for a much more strongly curved magnet. Hence 2-D (x-y) computations of the central region and 3-D computations of the end regions can be combined to determine the overall magnetic behavior of the magnets. 1 ref., 6 figs
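
    For reference, the scalar-potential form of Laplace's equation in the two 2-D geometries mentioned here is written out below (standard coordinate forms, not quoted from the paper); the parallel-lamination case of the APS injector dipole is not reproduced.

```latex
% Laplace's equation for the magnetic scalar potential \phi in the two 2-D geometries:
% straight magnet in x-y coordinates, magnet curved about a common center in r-\theta.
\begin{align}
  \frac{\partial^2 \phi}{\partial x^2} + \frac{\partial^2 \phi}{\partial y^2} &= 0
  & \text{(straight magnet, $x$--$y$)} \\
  \frac{1}{r}\frac{\partial}{\partial r}\!\left(r\,\frac{\partial \phi}{\partial r}\right)
  + \frac{1}{r^2}\frac{\partial^2 \phi}{\partial \theta^2} &= 0
  & \text{(curved magnet, $r$--$\theta$)}
\end{align}
```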

  13. Practical methods to improve the development of computational software

    International Nuclear Information System (INIS)

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-01-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960's, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  14. Hybrid Single Photon Emission Computed Tomography/Computed Tomography Sulphur Colloid Scintigraphy in Focal Nodular Hyperplasia

    International Nuclear Information System (INIS)

    Bhoil, Amit; Gayana, Shankramurthy; Sood, Ashwani; Bhattacharya, Anish; Mittal, Bhagwant Rai

    2013-01-01

    It is important to differentiate focal nodular hyperplasia (FNH), a benign condition of the liver most commonly affecting women, from other neoplasms such as hepatic adenoma and metastasis. The functional reticuloendothelial features of FNH can be demonstrated by scintigraphy. We present the case of a breast cancer patient in whom fluorodeoxyglucose positron emission tomography/computed tomography (CT) showed a homogenous hyperdense lesion in the liver, which on Tc99m sulfur colloid single-photon emission computed tomography/CT was found to have increased focal tracer uptake, suggestive of FNH.

  15. Effect of correlated decay on fault-tolerant quantum computation

    Science.gov (United States)

    Lemberger, B.; Yavuz, D. D.

    2017-12-01

    We analyze noise in the circuit model of quantum computers when the qubits are coupled to a common bosonic bath and discuss the possible failure of scalability of quantum computation. Specifically, we investigate correlated (super-radiant) decay between the qubit energy levels from a two- or three-dimensional array of qubits without imposing any restrictions on the size of the sample. We first show that regardless of how the spacing between the qubits compares with the emission wavelength, correlated decay produces errors outside the applicability of the threshold theorem. This is because the sum of the norms of the two-body interaction Hamiltonians (which can be viewed as the upper bound on the single-qubit error) that decoheres each qubit scales with the total number of qubits and is unbounded. We then discuss two related results: (1) We show that the actual error (instead of the upper bound) on each qubit scales with the number of qubits. As a result, in the limit of large number of qubits in the computer, N →∞ , correlated decay causes each qubit in the computer to decohere in ever shorter time scales. (2) We find the complete eigenvalue spectrum of the exchange Hamiltonian that causes correlated decay in the same limit. We show that the spread of the eigenvalue distribution grows faster with N compared to the spectrum of the unperturbed system Hamiltonian. As a result, as N →∞ , quantum evolution becomes completely dominated by the noise due to correlated decay. These results argue that scalable quantum computing may not be possible in the circuit model in a two- or three- dimensional geometry when the qubits are coupled to a common bosonic bath.
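
    The scaling argument can be sketched as follows (the notation here, $\eta_k$ for the single-qubit error, $\bar h$ for a typical coupling norm and $t$ for a gate time, is ours and only schematic): if qubit $k$ is coupled to every other qubit through a correlated-decay term $H_{kj}$ whose norm does not fall off with separation, then

      \[
      \eta_k \;\lesssim\; t \sum_{j \neq k} \lVert H_{kj} \rVert \;\sim\; (N-1)\,\bar h \;\longrightarrow\; \infty
      \quad\text{as } N \to \infty,
      \]

    which is why the bound falls outside the assumptions of the threshold theorem.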

  16. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  17. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  18. The Common Good

    DEFF Research Database (Denmark)

    Feldt, Liv Egholm

    At present voluntary and philanthropic organisations are experiencing significant public attention and academic discussions about their role in society. Central to the debate is on one side the question of how they contribute to “the common good”, and on the other the question of how they can avoid...... and concepts continuously over time have blurred the different sectors and “polluted” contemporary definitions of the “common good”. The analysis shows that “the common good” is not an autonomous concept owned or developed by specific spheres of society. The analysis stresses that historically, “the common...... good” has always been a contested concept. It is established through messy and blurred heterogeneity of knowledge, purposes and goal achievements originating from a multitude of scientific, religious, political and civil society spheres contested not only in terms of words and definitions but also...

  19. Optimizing the Usability of Brain-Computer Interfaces.

    Science.gov (United States)

    Zhang, Yin; Chase, Steve M

    2018-03-22

    Brain-computer interfaces are in the process of moving from the laboratory to the clinic. These devices act by reading neural activity and using it to directly control a device, such as a cursor on a computer screen. An open question in the field is how to map neural activity to device movement in order to achieve the most proficient control. This question is complicated by the fact that learning, especially the long-term skill learning that accompanies weeks of practice, can allow subjects to improve performance over time. Typical approaches to this problem attempt to maximize the biomimetic properties of the device in order to limit the need for extensive training. However, it is unclear if this approach would ultimately be superior to performance that might be achieved with a nonbiomimetic device once the subject has engaged in extended practice and learned how to use it. Here we approach this problem using ideas from optimal control theory. Under the assumption that the brain acts as an optimal controller, we present a formal definition of the usability of a device and show that the optimal postlearning mapping can be written as the solution of a constrained optimization problem. We then derive the optimal mappings for particular cases common to most brain-computer interfaces. Our results suggest that the common approach of creating biomimetic interfaces may not be optimal when learning is taken into account. More broadly, our method provides a blueprint for optimal device design in general control-theoretic contexts.
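
    Schematically (all symbols here are ours, not the authors' notation), the construction described above amounts to defining the usability of a mapping $M$ as the best expected task cost an optimal controller can achieve through it, and then optimizing over the admissible mappings:

      \[
      U(M) \;=\; -\,\min_{\pi}\; \mathbb{E}\!\left[\,\sum_{t} c(x_t, u_t)\;\middle|\; \text{decoder } M,\ \text{policy } \pi\right],
      \qquad
      M^{\ast} \;=\; \arg\max_{M \in \mathcal{M}} U(M),
      \]

    where $c$ is a task cost, $\pi$ the neural control policy learned with practice, and $\mathcal{M}$ the constraint set of admissible decoders.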

  20. Principles for the wise use of computers by children.

    Science.gov (United States)

    Straker, L; Pollock, C; Maslen, B

    2009-11-01

    Computer use by children at home and school is now common in many countries. Child computer exposure varies with the type of computer technology available and the child's age, gender and social group. This paper reviews the current exposure data and the evidence for positive and negative effects of computer use by children. Potential positive effects of computer use by children include enhanced cognitive development and school achievement, reduced barriers to social interaction, enhanced fine motor skills and visual processing and effective rehabilitation. Potential negative effects include threats to child safety, inappropriate content, exposure to violence, bullying, Internet 'addiction', displacement of moderate/vigorous physical activity, exposure to junk food advertising, sleep displacement, vision problems and musculoskeletal problems. The case for child specific evidence-based guidelines for wise use of computers is presented based on children using computers differently to adults, being physically, cognitively and socially different to adults, being in a state of change and development and the potential to impact on later adult risk. Progress towards child-specific guidelines is reported. Finally, a set of guideline principles is presented as the basis for more detailed guidelines on the physical, cognitive and social impact of computer use by children. The principles cover computer literacy, technology safety, child safety and privacy and appropriate social, cognitive and physical development. The majority of children in affluent communities now have substantial exposure to computers. This is likely to have significant effects on child physical, cognitive and social development. Ergonomics can provide and promote guidelines for wise use of computers by children and by doing so promote the positive effects and reduce the negative effects of computer-child, and subsequent computer-adult, interaction.

  1. [Advancements of computer chemistry in separation of Chinese medicine].

    Science.gov (United States)

    Li, Lingjuan; Hong, Hong; Xu, Xuesong; Guo, Liwei

    2011-12-01

    Separation techniques for Chinese medicine are not only key to research and development in the field, but also a significant step in the modernization of Chinese medicinal preparations. Computer chemistry can build models and extract regularities from the complicated data of Chinese medicine systems. This paper analyses the applicability, key technologies, basic modes and common algorithms of computer chemistry as applied to the separation of Chinese medicine, introduces mathematical models and parameter-setting methods for extraction kinetics, examines several problems in traditional Chinese medicine membrane processing, and discusses the application prospects.

  2. Absent right common carotid artery associated with aberrant right subclavian artery.

    Science.gov (United States)

    Uchino, Akira; Uwabe, Kazuhiko; Osawa, Iichiro

    2018-06-01

    Rarely, the external and internal carotid arteries arise separately from the brachiocephalic trunk and right subclavian artery (SA) or the aortic arch and reflect the absence of a common carotid artery (CCA). We report a 45-year-old man with absent right CCA associated with aberrant right SA, an extremely rare combination, diagnosed by computed tomography (CT) angiography during follow-up for postoperative aortic dissection. Retrospective careful observation of preoperative postcontrast CT revealed the absent right CCA. Previously reported arch variations associated with absent CCA include cervical aortic arch, double aortic arch, and right aortic arch.

  3. An Integer Programming Formulation of the Minimum Common String Partition Problem.

    Directory of Open Access Journals (Sweden)

    S M Ferdous

    Full Text Available We consider the problem of finding a minimum common string partition (MCSP) of two strings, which is an NP-hard problem. The MCSP problem is closely related to genome comparison and rearrangement, an important field in Computational Biology. In this paper, we map the MCSP problem into a graph by applying a prior technique, and using this graph we develop an Integer Linear Programming (ILP) formulation for the problem. We implement the ILP formulation and compare the results with the state-of-the-art algorithms from the literature. The experimental results are found to be promising.
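
    To make the ILP idea concrete, here is a small sketch of a standard formulation of MCSP using the PuLP modelling library (this is an illustration of the general approach, not necessarily the authors' graph-based formulation; the function name and variable layout are ours):

      # ILP sketch for minimum common string partition: binary variable
      # x[i, j, l] selects the common block s1[i:i+l] == s2[j:j+l]; every
      # position of each string must be covered exactly once, and the
      # number of selected blocks is minimised.
      import pulp

      def mcsp_ilp(s1, s2):
          n = len(s1)
          assert sorted(s1) == sorted(s2), "MCSP requires the strings to be anagrams"
          blocks = [(i, j, l)
                    for i in range(n) for j in range(n)
                    for l in range(1, n - max(i, j) + 1)
                    if s1[i:i + l] == s2[j:j + l]]
          prob = pulp.LpProblem("MCSP", pulp.LpMinimize)
          x = pulp.LpVariable.dicts("x", blocks, cat="Binary")
          prob += pulp.lpSum(x[b] for b in blocks)          # minimise the partition size
          for p in range(n):
              prob += pulp.lpSum(x[(i, j, l)] for (i, j, l) in blocks if i <= p < i + l) == 1
              prob += pulp.lpSum(x[(i, j, l)] for (i, j, l) in blocks if j <= p < j + l) == 1
          prob.solve(pulp.PULP_CBC_CMD(msg=False))
          return [b for b in blocks if x[b].value() > 0.5]

      # e.g. mcsp_ilp("ababcab", "abcabab") returns one minimum common partition
      # as a list of (position in s1, position in s2, length) triples.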

  4. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    OpenAIRE

    Dang Hung; Dinh Tien Tuan Anh; Chang Ee-Chien; Ooi Beng Chin

    2017-01-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using generic solution such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation effi...

  5. The Next Generation ARC Middleware and ATLAS Computing Model

    CERN Document Server

    Filipcic, A; The ATLAS collaboration; Smirnova, O; Konstantinov, A; Karpenko, D

    2012-01-01

    The distributed NDGF Tier-1 and associated Nordugrid clusters are well integrated into the ATLAS computing model but follow a slightly different paradigm than other ATLAS resources. The current strategy does not divide the sites as in the commonly used hierarchical model, but rather treats them as a single storage endpoint and a pool of distributed computing nodes. The next generation ARC middleware with its several new technologies provides new possibilities in development of the ATLAS computing model, such as pilot jobs with pre-cached input files, automatic job migration between the sites, integration of remote sites without connected storage elements, and automatic brokering for jobs with non-standard resource requirements. ARC's data transfer model provides an automatic way for the computing sites to participate in ATLAS' global task management system without requiring centralised brokering or data transfer services. The powerful API combined with Python and Java bindings can easily be used to build new ...

  6. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated during the search process of the model. Moreover, four types of probabilistic convergence for the solution-set updating sequences are given and their relations discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  7. Science commons

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    SCP: Creative Commons licensing for open access publishing, Open Access Law journal-author agreements for converting journals to open access, and the Scholar's Copyright Addendum Engine for retaining rights to self-archive in meaningful formats and locations for future re-use. More than 250 science and technology journals already publish under Creative Commons licensing while 35 law journals utilize the Open Access Law agreements. The Addendum Engine is a new tool created in partnership with SPARC and U.S. universities.

  8. A rare and lethal case of right common carotid pseudoaneurysm following whiplash trauma.

    Science.gov (United States)

    Pomara, Cristoforo; Bello, Stefania; Serinelli, Serenella; Fineschi, Vittorio

    2015-03-01

    Whiplash trauma from a car crash is one of the most common causes of neck injury, resulting in pain and dysfunction. We report on an unusual case of post-whiplash pseudoaneurysm of the right common carotid artery, which led to acute massive hemorrhage and death days after the initial trauma. A post-mortem computed tomography angiography showed rupture of the pseudoaneurysm of the right common carotid artery with the contrast agent leaking out into the mouth. The subsequent autopsy confirmed a large hemorrhagic clot extending to the right side of the neck and mediastinum. A rupture of the right wall of the oropharynx was identified with massive bronchial hemoaspiration. The case demonstrates a rare but lethal clinical entity, and is important in providing a better understanding of the potentially fatal consequences of minor trauma, such as whiplash injury, and its physiopathological mechanisms. Thus, changing symptoms after a whiplash injury should be carefully evaluated since they can be related to the underlying severe consequences of a rapid hyperextension-hyperflexion of the neck, as in the reported case.

  9. Computer aid in solar architecture

    Energy Technology Data Exchange (ETDEWEB)

    Rosendahl, E W

    1982-02-01

    Architects are debating to what extent new buildings can be designed to use energy more economically by architectural means. Solar houses in the USA are often taken as a model. It is as yet unclear how such measures affect heat demand in the central European climate and with domestic building materials. A computer simulation program is introduced with which these questions can be answered as early as the planning stage. The program runs on a common microcomputer system.

  10. Computing the partial volume of pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Wiencke, Bent [Nestle USA, Corporate Engineering, 800 N. Brand Blvd, Glendale, CA 91203 (United States)

    2010-06-15

    The computation of the partial and total volume of pressure vessels with various types of head profiles requires detailed knowledge of the head profile geometry. Depending on the type of head profile, the derivation of the equations can become very complex and the calculation process cumbersome. Certain head profiles require numerical methods to obtain the partial volume, which is impractical for most applications. This paper suggests a unique method that simplifies the calculation procedure for the various types of head profiles by using one common set of equations, without the need for numerical or complex computation methods. For ease of use, all equations presented in this paper are summarized in a single table format for horizontal and vertical vessels. (author)
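
    As a small illustration of the kind of quantity involved, the partial volume of the cylindrical shell of a horizontal vessel (heads excluded) follows from the circular-segment area; the sketch below is ours and does not reproduce the paper's unified equation set for the various head profiles:

      # Liquid volume in the cylindrical shell of a horizontal vessel of a given
      # diameter and shell length, filled to liquid depth h measured from the
      # bottom. Heads are excluded; the paper's equations additionally cover them.
      import math

      def horizontal_shell_partial_volume(diameter, length, h):
          r = diameter / 2.0
          if not 0.0 <= h <= diameter:
              raise ValueError("liquid depth must lie between 0 and the diameter")
          # area of the circular segment below the liquid level
          segment_area = r * r * math.acos((r - h) / r) - (r - h) * math.sqrt(2.0 * r * h - h * h)
          return segment_area * length

      # e.g. a 2 m diameter, 6 m long shell filled to a depth of 0.5 m holds
      # horizontal_shell_partial_volume(2.0, 6.0, 0.5) ~= 3.68 m^3 of liquid.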

  11. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.

  12. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  13. Isolated submucosal lipomatosis of appendix mimicking acute appendicitis: computed tomography findings

    Directory of Open Access Journals (Sweden)

    Şükrü Şanlı

    2014-03-01

    Full Text Available Acute appendicitis is one of the more common surgical emergencies and one of the most common causes of acute abdominal pain. Intestinal lipomatosis is a rare condition, and the isolated form of lipomatosis of the appendix in particular may mimic or present as acute appendicitis, frequently requiring surgical exploration. In this paper, we report the computed tomography findings of a case with an isolated form of submucosal lipomatosis of the appendix.

  14. High resolution computed tomography (HRCT) findings of a solitary pulmonary nodule: differential diagnosis of cancer and tuberculosis

    International Nuclear Information System (INIS)

    Kim, Hee Soo; Choe, Kyu Ok

    1996-01-01

    To evaluate the role of HRCT in the differentiation of pulmonary tuberculosis and lung cancer when the disease manifests as a solitary pulmonary nodule (SPN). Forty-eight SPNs were included: 29 cancers proven by surgery (n=10), bronchoscopic biopsy (n=7) and fine needle aspiration biopsy (n=12), and 19 tuberculous nodules proven by surgery (n=4), bronchoscopic biopsy (n=4), fine needle aspiration biopsy (n=5), a positive AFB culture without evidence of malignant cells (n=3), or a decrease in size on serial plain chest radiographs despite negative AFB culture (n=3). Scanning parameters for HRCT were 140 kVp, 170 mA, 1.5 mm collimation, 3 sec scanning time, and a high spatial frequency algorithm was used. With regard to the marginal features of the nodules, the findings more commonly observed in malignant nodules were a greater average length of the longest spicule (5.35 ± 3.19 mm versus 2.75 ± 1.56 mm) and more spiculated nodules greater than 3 cm in diameter, 16 (55%) versus 2 (10.5%) (p<0.05). Regarding the internal characteristics of the nodules and perinodular parenchymal changes, the findings more commonly observed in cases of cancer were air bronchograms within the nodule (14; 48.3%) and interlobar fissure puckering (6; 20.7%), whereas in tuberculosis the most common findings were low density of the nodule (16; 84.2%), cavitation (12; 63.1%) and perinodular focal lung hypodensity (5; 26.3%) (p<0.05). No statistically significant difference was observed between the incidence of satellite lesions of tuberculous (73.7%) and malignant nodules (34.5%). However, perilobular nodules or bronchovascular bundle thickenings were more commonly observed in the satellite lesions of malignant nodules (9; 90%), whereas centrilobular nodules or lobular consolidation were more commonly observed in those of tuberculous nodules (12; 85.7%) (p<0.05). HRCT provides detailed information concerning perinodular parenchymal changes and characteristics of

  15. Dynamics of a Novel Highly Repetitive CACTA Family in Common Bean (Phaseolus vulgaris)

    Directory of Open Access Journals (Sweden)

    Dongying Gao

    2016-07-01

    Full Text Available Transposons are ubiquitous genomic components that play pivotal roles in plant gene and genome evolution. We analyzed two genome sequences of common bean (Phaseolus vulgaris) and identified a new CACTA transposon family named pvCACTA1. The family is extremely abundant, as more than 12,000 pvCACTA1 elements were found. To our knowledge, this is the most abundant CACTA family reported thus far. The computational and fluorescence in situ hybridization (FISH) analyses indicated that the pvCACTA1 elements were concentrated in terminal regions of chromosomes and frequently generated AT-rich 3 bp target site duplications (TSD; WWW, where W is A or T). Comparative analysis of the common bean genomes from two domesticated genetic pools revealed that new insertions or excisions of pvCACTA1 elements occurred after the divergence of the two common beans, and some of the polymorphic elements likely resulted in variation in gene sequences. pvCACTA1 elements were detected in related species but not outside the Phaseolus genus. We calculated the molecular evolutionary rate of pvCACTA1 transposons using orthologous elements, which indicated that most transposition events likely occurred before the divergence of the two gene pools. These results reveal unique features and evolution of this new transposon family in the common bean genome.

  16. Evaluation of computer tomography in cerebro-vascular disease (Strokes)

    International Nuclear Information System (INIS)

    Lee, Young Sik; Baek, Seung Yon; Rhee, Chung Sik; Kim, Hee Seup

    1984-01-01

    Most cerebrovascular diseases involve vascular occlusive changes or hemorrhage. Nowadays, computed tomography is the best way to evaluate cerebrovascular disease, including detection of the nature and location of lesions and the associated changes. This study evaluates the computed tomography of 70 patients with cerebrovascular disease over the 10-month period from April 1983 to February 1984 in the Department of Radiology, Ewha Womans University Hospital. The results were as follows: 1. The age distribution of the 70 patients was broad, ranging from 25 to 79 years; 78.6% of patients were over the age of 50. The male-to-female ratio was 1.4:1. 2. Four of the 70 patients were normal and 66 patients showed abnormal CT findings: intracranial hemorrhage (28 patients), cerebral infarction (34 patients) and brain atrophy (4 patients). 3. In cases of cerebral infarction, the cerebral hemisphere was the most common site of the lesion (28 cases), followed by the basal ganglia (2 cases). Most of the infarcts in the cerebral hemisphere were located in the parietal and temporal lobes. 4. In cases of intracranial hemorrhage, the basal ganglia was the most common site (15 cases), followed by the cerebral hemisphere (9 cases). Six patients with intracranial hemorrhage had associated intraventricular hemorrhage. The right-to-left ratio was 2:3. 5. In patients with motor weakness or hemiparesis, the more common CT finding was cerebral infarction; in patients with hemiplegia, the more common CT finding was intracerebral hemorrhage. 6. Of the 40 cases initially thought to be cerebral infarction on clinical findings and spinal tap, 8 cases (20.0%) proved to be cerebral hemorrhage on CT. However, of the 22 cases initially thought to be cerebral hemorrhage, only two cases (9.0%) were cerebral infarction

  17. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  18. Consolidation of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Cordeiro, Cristovao; Di Girolamo, Alessandro; Hover, John; Kouba, Tomas; Love, Peter; Mcnab, Andrew; Schovancova, Jaroslava; Sobie, Randall

    2016-01-01

    Throughout the first year of LHC Run 2, ATLAS Cloud Computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS Cloud Computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vac resources, streamlined usage of the High Level Trigger cloud for simulation and reconstruction, extreme scaling on Amazon EC2, and procurement of commercial cloud capacity in Europe. Building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems. ...

  19. Computer language evaluation for MFTF SCDS

    International Nuclear Information System (INIS)

    Anderson, R.E.; McGoldrick, P.R.; Wyman, R.H.

    1979-01-01

    The computer languages available for systems and application implementation on the Supervisory Control and Diagnostics System (SCDS) for the Mirror Fusion Test Facility (MFTF) were surveyed and evaluated. Four language processors, CAL (Common Assembly Language), Extended FORTRAN, CORAL 66, and Sequential Pascal (SPASCAL, a subset of Concurrent Pascal [CPASCAL]) are commercially available for the Interdata 7/32 and 8/32 computers that constitute the SCDS. Of these, the Sequential Pascal available from Kansas State University appears best for the job in terms of minimizing the implementation time, debugging time, and maintenance time. This improvement in programming productivity is due to the availability of a high-level, block-structured language that includes many compile-time and run-time checks to detect errors. In addition, the advanced data types in the language allow easy description of the program variables. 1 table

  20. Parallel PDE-Based Simulations Using the Common Component Architecture

    International Nuclear Information System (INIS)

    McInnes, Lois C.; Allan, Benjamin A.; Armstrong, Robert; Benson, Steven J.; Bernholdt, David E.; Dahlgren, Tamara L.; Diachin, Lori; Krishnan, Manoj Kumar; Kohl, James A.; Larson, J. Walter; Lefantzi, Sophia; Nieplocha, Jarek; Norris, Boyana; Parker, Steven G.; Ray, Jaideep; Zhou, Shujia

    2006-01-01

    The complexity of parallel PDE-based simulations continues to increase as multimodel, multiphysics, and multi-institutional projects become widespread. A goal of component based software engineering in such large-scale simulations is to help manage this complexity by enabling better interoperability among various codes that have been independently developed by different groups. The Common Component Architecture (CCA) Forum is defining a component architecture specification to address the challenges of high-performance scientific computing. In addition, several execution frameworks, supporting infrastructure, and general purpose components are being developed. Furthermore, this group is collaborating with others in the high-performance computing community to design suites of domain-specific component interface specifications and underlying implementations. This chapter discusses recent work on leveraging these CCA efforts in parallel PDE-based simulations involving accelerator design, climate modeling, combustion, and accidental fires and explosions. We explain how component technology helps to address the different challenges posed by each of these applications, and we highlight how component interfaces built on existing parallel toolkits facilitate the reuse of software for parallel mesh manipulation, discretization, linear algebra, integration, optimization, and parallel data redistribution. We also present performance data to demonstrate the suitability of this approach, and we discuss strategies for applying component technologies to both new and existing applications

  1. CRITICAL ISSUES IN HIGH END COMPUTING - FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Corones, James [Krell Institute

    2013-09-23

    High-End computing (HEC) has been a driver for advances in science and engineering for the past four decades. Increasingly HEC has become a significant element in the national security, economic vitality, and competitiveness of the United States. Advances in HEC provide results that cut across traditional disciplinary and organizational boundaries. This program provides opportunities to share information about HEC systems and computational techniques across multiple disciplines and organizations through conferences and exhibitions of HEC advances held in Washington DC so that mission agency staff, scientists, and industry can come together with White House, Congressional and Legislative staff in an environment conducive to the sharing of technical information, accomplishments, goals, and plans. A common thread across this series of conferences is the understanding of computational science and applied mathematics techniques across a diverse set of application areas of interest to the Nation. The specific objectives of this program are: Program Objective 1. To provide opportunities to share information about advances in high-end computing systems and computational techniques between mission critical agencies, agency laboratories, academics, and industry. Program Objective 2. To gather pertinent data, address specific topics of wide interest to mission critical agencies. Program Objective 3. To promote a continuing discussion of critical issues in high-end computing. Program Objective 4. To provide a venue where a multidisciplinary scientific audience can discuss the difficulties applying computational science techniques to specific problems and can specify future research that, if successful, will eliminate these problems.

  2. Computers, the Internet and medical education in Africa.

    Science.gov (United States)

    Williams, Christopher D; Pitchforth, Emma L; O'Callaghan, Christopher

    2010-05-01

    OBJECTIVES This study aimed to explore the use of information and communications technology (ICT) in undergraduate medical education in developing countries. METHODS Educators (deans and heads of medical education) in English-speaking countries across Africa were sent a questionnaire to establish the current state of ICT at medical schools. Non-respondents were contacted firstly by e-mail, subsequently by two postal mailings at 3-month intervals, and finally by telephone. Main outcome measures included cross-sectional data about the availability of computers, specifications, Internet connection speeds, use of ICT by students, and teaching of ICT and computerised research skills, presented by country or region. RESULTS The mean computer : student ratio was 0.123. Internet speeds were rated as 'slow' or 'very slow' on a 5-point Likert scale by 25.0% of respondents overall, but by 58.3% in East Africa and 33.3% in West Africa (including Cameroon). Mean estimates showed that campus computers more commonly supported CD-ROM (91.4%) and sound (87.3%) than DVD-ROM (48.1%) and Internet (72.5%). The teaching of ICT and computerised research skills, and the use of computers by medical students for research, assignments and personal projects were common. CONCLUSIONS It is clear that ICT infrastructure in Africa lags behind that in other regions. Poor download speeds limit the potential of Internet resources (especially videos, sound and other large downloads) to benefit students, particularly in East and West (including Cameroon) Africa. CD-ROM capability is more widely available, but has not yet gained momentum as a means of distributing materials. Despite infrastructure limitations, ICT is already being used and there is enthusiasm for developing this further. Priority should be given to developing partnerships to improve ICT infrastructure and maximise the potential of existing technology.

  3. Ceftriaxone-associated pancreatitis captured on serial computed tomography scans

    Directory of Open Access Journals (Sweden)

    Nozomu Nakagawa, MD

    2018-02-01

    Full Text Available A 74-year-old man was treated with ceftriaxone for 5 days and subsequently experienced epigastric pain. Computed tomography (CT) was performed 7 and 3 days before epigastralgia. Although the first CT image revealed no radiographic signs in his biliary system, the second CT image revealed dense radiopaque material in the gallbladder lumen. The third CT image, taken at symptom onset, showed high density in the common bile duct and enlargement of the pancreatic head. This is a very rare case of pseudolithiasis involving the common bile duct, as captured on a series of CT images.

  4. Ceftriaxone-associated pancreatitis captured on serial computed tomography scans.

    Science.gov (United States)

    Nakagawa, Nozomu; Ochi, Nobuaki; Yamane, Hiromichi; Honda, Yoshihiro; Nagasaki, Yasunari; Urata, Noriyo; Nakanishi, Hidekazu; Kawamoto, Hirofumi; Takigawa, Nagio

    2018-02-01

    A 74-year-old man was treated with ceftriaxone for 5 days and subsequently experienced epigastric pain. Computed tomography (CT) was performed 7 and 3 days before epigastralgia. Although the first CT image revealed no radiographic signs in his biliary system, the second CT image revealed dense radiopaque material in the gallbladder lumen. The third CT image, taken at symptom onset, showed high density in the common bile duct and enlargement of the pancreatic head. This is a very rare case of pseudolithiasis involving the common bile duct, as captured on a series of CT images.

  5. The role of soft computing in intelligent machines.

    Science.gov (United States)

    de Silva, Clarence W

    2003-08-15

    An intelligent machine relies on computational intelligence in generating its intelligent behaviour. This requires a knowledge system in which representation and processing of knowledge are central functions. Approximation is a 'soft' concept, and the capability to approximate for the purposes of comparison, pattern recognition, reasoning, and decision making is a manifestation of intelligence. This paper examines the use of soft computing in intelligent machines. Soft computing is an important branch of computational intelligence, where fuzzy logic, probability theory, neural networks, and genetic algorithms are synergistically used to mimic the reasoning and decision making of a human. This paper explores several important characteristics and capabilities of machines that exhibit intelligent behaviour. Approaches that are useful in the development of an intelligent machine are introduced. The paper presents a general structure for an intelligent machine, giving particular emphasis to its primary components, such as sensors, actuators, controllers, and the communication backbone, and their interaction. The role of soft computing within the overall system is discussed. Common techniques and approaches that will be useful in the development of an intelligent machine are introduced, and the main steps in the development of an intelligent machine for practical use are given. An industrial machine, which employs the concepts of soft computing in its operation, is presented, and one aspect of intelligent tuning, which is incorporated into the machine, is illustrated.
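
    As a toy illustration of the 'soft' reasoning referred to above (this example and its thresholds are ours, not the industrial machine described in the paper), a triangular fuzzy membership function and a two-rule weighted-average inference step can be written as:

      # Minimal fuzzy-inference sketch: fuzzify a temperature reading and blend
      # two rules ("cold -> heater high", "hot -> heater low") by their activation.
      def triangular(x, a, b, c):
          """Triangular membership function peaking at b, zero outside [a, c]."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def heater_output(temperature):
          cold = triangular(temperature, -10.0, 0.0, 15.0)
          hot = triangular(temperature, 10.0, 25.0, 40.0)
          weights, outputs = [cold, hot], [100.0, 0.0]   # rule activations and crisp consequents
          total = sum(weights)
          return sum(w * o for w, o in zip(weights, outputs)) / total if total else 50.0

      # e.g. heater_output(5.0) -> 100.0 (fully "cold"), heater_output(12.0) -> 60.0 (a blend).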

  6. Three dimensional reconstruction of computed tomographic images by computer graphics method

    International Nuclear Information System (INIS)

    Kashiwagi, Toru; Kimura, Kazufumi.

    1986-01-01

    A three-dimensional computer reconstruction system for CT images has been developed on a commonly used radionuclide data processing system using a computer graphics technique. The three-dimensional model was constructed from the organ surface information of CT images (slice thickness: 5 or 10 mm). Surface contours of the organs were extracted manually from a set of parallel transverse CT slices in serial order and stored in computer memory. Interpolation between the extracted contours was performed with cubic spline functions, and three-dimensional models were then reconstructed. The three-dimensional images were displayed as wire-frame and/or solid models on a color CRT. Solid model images were obtained as follows: the organ surface constructed from the contours was divided into many triangular patches, and the light intensity of each patch was calculated from the direction of the incident light, the eye position and the normal to the triangular patch. The system was first applied to a liver phantom; the reconstructed images coincided with the actual object. It has also been applied to various human organs such as the brain, lung and liver. The anatomical organ surface could be viewed realistically from any direction, and the images made the location and configuration of organs in vivo easier to understand than the original CT images. Furthermore, the spatial relationship among organs and/or lesions was clearly shown by superimposing wire-frame and/or differently colored solid models. This system is therefore expected to be clinically useful for evaluating patho-morphological changes in a broad perspective. (author)
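
    The shading step described above is essentially Lambertian flat shading of each triangular patch; a minimal sketch (ours, with a generic ambient term, not the authors' exact code) is:

      # Flat shading of a triangular surface patch: intensity from the angle
      # between the patch normal and the direction of the incident light.
      import numpy as np

      def patch_intensity(v0, v1, v2, light_dir, ambient=0.1):
          """Return a shading intensity in [0, 1] for the triangle (v0, v1, v2)."""
          v0, v1, v2 = (np.asarray(v, dtype=float) for v in (v0, v1, v2))
          normal = np.cross(v1 - v0, v2 - v0)
          normal = normal / np.linalg.norm(normal)
          light = np.asarray(light_dir, dtype=float)
          light = light / np.linalg.norm(light)
          diffuse = max(0.0, float(np.dot(normal, light)))
          return min(1.0, ambient + (1.0 - ambient) * diffuse)

      # e.g. patch_intensity((0, 0, 0), (1, 0, 0), (0, 1, 0), light_dir=(0, 0, 1)) -> 1.0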

  7. [Nutritional habits in children and adolescents practicing fencing. Part II. Characteristics of eating between meals].

    Science.gov (United States)

    Chalcarz, Wojciech; Radzimirska-Graczyk, Monika

    2010-01-01

    The aim of this study was to assess the longest interval between meals, eating until the feeling of satiety, and eating between meals in children and adolescents who attended sports schools. The questionnaires were filled in by 141 children and adolescents who practised fencing and attended sports classes in primary and secondary schools. The days with training and the days free of training were analysed separately. The influence of gender and age on the longest interval between meals, eating until the feeling of satiety, and eating between meals on the days with training and the days free of training was analysed by means of the SPSS 12.0 PL for Windows computer programme. Gender and age had a statistically significant influence on the longest interval between meals, eating until the feeling of satiety, and eating vegetables, cured meat, sweets and energy drinks between meals. Eating between main meals was prevalent in the studied population. A higher percentage of girls ate fruit and vegetables between main meals, while a higher percentage of boys ate sandwiches, irrespective of the type of day (with training or free of training).

  8. A boundary integral method for numerical computation of radar cross section of 3D targets using hybrid BEM/FEM with edge elements

    Science.gov (United States)

    Dodig, H.

    2017-11-01

    This contribution presents a boundary integral formulation for the numerical computation of the time-harmonic radar cross section of 3D targets. The method relies on a hybrid edge-element BEM/FEM to compute near-field edge-element coefficients associated with the near electric and magnetic fields at the boundary of the computational domain. A special boundary integral formulation is presented that computes the radar cross section directly from these edge-element coefficients. Consequently, there is no need for the near-to-far field transformation (NTFFT), which is a common step in RCS computations. At the end of the paper it is demonstrated that the formulation yields accurate results for canonical models such as spheres, cubes, cones and pyramids. The method remains accurate even for a dielectrically coated PEC sphere at an interior resonance frequency, which is a common problem for computational electromagnetics codes.
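
    For orientation, the quantity being computed has the standard far-field definition (this is the textbook definition of the radar cross section, not the paper's edge-element boundary integral expression):

      \[
      \sigma(\theta,\varphi) \;=\; \lim_{r \to \infty} 4\pi r^{2}\,
      \frac{\lvert \mathbf{E}^{\mathrm{s}}(r,\theta,\varphi) \rvert^{2}}{\lvert \mathbf{E}^{\mathrm{i}} \rvert^{2}},
      \]

    where E^s is the scattered and E^i the incident electric field; the formulation above evaluates this directly from the boundary edge-element coefficients, which is what removes the usual near-to-far field transformation step.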

  9. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  10. Gender Differences in Computer and Information Literacy: An Exploration of the Performances of Girls and Boys in ICILS 2013

    Science.gov (United States)

    Punter, R. Annemiek; Meelissen, Martina R. M.; Glas, Cees A. W.

    2017-01-01

    IEA's International Computer and Information Literacy Study (ICILS) 2013 showed that in the majority of the participating countries, 14-year-old girls outperformed boys in computer and information literacy (CIL): results that seem to contrast with the common view of boys having better computer skills. This study used the ICILS data to explore…

  11. An evaluation of an improved method for computing histograms in dynamic tracer studies using positron-emission tomography

    International Nuclear Information System (INIS)

    Ollinger, J.M.; Snyder, D.L.

    1986-01-01

    A method for computing approximate minimum-mean-square-error estimates of histograms from list-mode data for use in dynamic tracer studies is evaluated. Parameters estimated from these histograms are significantly more accurate than those estimated from histograms computed by a commonly used method

  12. Building a High Performance Computing Infrastructure for Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Belov, S; Kaplin, V; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2011-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of Russian Academy of Sciences including Budker Institute of Nuclear Physics (BINP), Institute of Computational Technologies (ICT), and Institute of Computational Mathematics and Mathematical Geophysics (ICM and MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, several computing facilities are currently hosted by NSC institutes, each optimized for a particular set of tasks; the largest are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM and MG), and the Grid Computing Facility of BINP. Recently a dedicated optical network with an initial bandwidth of 10 Gbps connecting these three facilities was built to make it possible to share computing resources among the research communities of the participating institutes, thus providing a common platform for building the computing infrastructure for various scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technologies based on the XEN and KVM platforms. The implemented solution was tested thoroughly within the computing environment of the KEDR detector experiment being carried out at BINP, and is foreseen to be applied to the use cases of other HEP experiments in the near future.

  13. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
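
    The computational independence of the iterations is what makes such analyses embarrassingly parallel. A minimal sketch (ours, not the paper's microarray workflow; function names are illustrative) of a two-sample permutation test with the permutations spread over local worker processes, the same pattern that maps onto cluster or cloud nodes:

      # Two-sample permutation test for a difference in means, parallelised over
      # batches of permutations with the standard library's process pool.
      import numpy as np
      from concurrent.futures import ProcessPoolExecutor
      from functools import partial

      def _perm_batch(seed, pooled, n1, n_perm):
          rng = np.random.default_rng(seed)
          stats = np.empty(n_perm)
          for k in range(n_perm):
              perm = rng.permutation(pooled)
              stats[k] = perm[:n1].mean() - perm[n1:].mean()
          return stats

      def permutation_pvalue(x, y, n_perm=10000, workers=4):
          pooled = np.concatenate([x, y])
          observed = x.mean() - y.mean()
          batch = partial(_perm_batch, pooled=pooled, n1=len(x), n_perm=n_perm // workers)
          with ProcessPoolExecutor(max_workers=workers) as pool:
              null = np.concatenate(list(pool.map(batch, range(workers))))
          return (np.sum(np.abs(null) >= abs(observed)) + 1) / (len(null) + 1)

      # e.g. permutation_pvalue(np.random.normal(0, 1, 50), np.random.normal(0.5, 1, 50))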

  14. Review of Trusted Computer System Evaluation Criteria of Some Countries (USA, Canada, EC

    Directory of Open Access Journals (Sweden)

    Natalia Mikhailovna Nikiforova

    2013-02-01

    Full Text Available The paper gives a brief review of the evolution of trusted computer system evaluation criteria, from specific national criteria to common international standards.

  15. Potential implementation of reservoir computing models based on magnetic skyrmions

    Science.gov (United States)

    Bourianoff, George; Pinna, Daniele; Sitte, Matthias; Everschor-Sitte, Karin

    2018-05-01

    Reservoir Computing is a recurrent-neural-network approach commonly used for recognizing and predicting spatio-temporal events, relying on a complex hierarchy of nested feedback loops to provide memory functionality. The Reservoir Computing paradigm does not require any knowledge of the reservoir topology or node weights for training purposes and can therefore utilize naturally existing networks formed by a wide variety of physical processes. Most prior efforts to implement reservoir computing have focused on memristor techniques for realizing recurrent neural networks. This paper examines the potential of magnetic skyrmion fabrics, and the complex current patterns which form in them, as an attractive physical instantiation for Reservoir Computing. We argue that their nonlinear dynamical interplay resulting from anisotropic magnetoresistance and spin-torque effects allows for effective and energy-efficient nonlinear processing of spatio-temporal events with the aim of event recognition and prediction.
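
    The training shortcut that makes such physical reservoirs attractive, namely keeping the reservoir fixed and fitting only a linear readout, can be sketched in a few lines. The example below is a generic software echo state network of ours (arbitrary parameters, no skyrmion physics), shown only to illustrate the paradigm:

      # Minimal reservoir-computing sketch: a fixed random recurrent reservoir
      # driven by the input, with only the linear readout fitted by ridge regression.
      import numpy as np

      rng = np.random.default_rng(0)
      n_in, n_res, leak, ridge = 1, 200, 0.3, 1e-6
      W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
      W = rng.uniform(-0.5, 0.5, (n_res, n_res))
      W *= 0.9 / max(abs(np.linalg.eigvals(W)))        # keep the spectral radius below 1

      def run_reservoir(u):
          """Collect reservoir states for an input sequence u of shape (T, n_in)."""
          x, states = np.zeros(n_res), []
          for u_t in u:
              x = (1 - leak) * x + leak * np.tanh(W_in @ u_t + W @ x)
              states.append(x.copy())
          return np.array(states)

      # Example task: one-step-ahead prediction of a sine wave.
      t = np.arange(2000) * 0.05
      u, y = np.sin(t)[:-1, None], np.sin(t)[1:, None]
      X = run_reservoir(u)
      W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)   # ridge readout
      prediction = X @ W_out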

  16. Securing Cloud Computing from Different Attacks Using Intrusion Detection Systems

    Directory of Open Access Journals (Sweden)

    Omar Achbarou

    2017-03-01

    Full Text Available Cloud computing integrates a set of older technologies into a new paradigm that gives users on-demand access to shared, configurable resources over the Internet. The system shares many characteristics with distributed systems and relies heavily on networking, so security is its biggest issue, because cloud services are based on sharing. A cloud computing environment therefore requires intrusion detection systems (IDSs) to protect each machine against attacks. The aim of this work is to present a classification of attacks threatening the availability, confidentiality and integrity of cloud resources and services. Furthermore, we provide a literature review of attacks related to the identified categories. Finally, the paper introduces related intrusion detection models to identify and prevent these types of attacks.

  17. Identification of risk factors of computer information technologies in education

    Directory of Open Access Journals (Sweden)

    Hrebniak M.P.

    2014-03-01

    Full Text Available A basic direction in the development of secondary schooling and vocational training is computer training of schoolchildren and students, including distance education and widespread use of global information systems. The purpose of this work is to determine the risk factors for schoolchildren and students when using modern information and computer technologies. The results made it possible to establish the dynamics of skill formation in using computer information technologies (CIT) in education and the characteristics of mental capacity among schoolchildren and students during their training. Common risk factors when working with CIT are: intensification and formalization of intellectual activity, adverse ergonomic parameters, unfavorable working posture, and exceedance of hygiene standards for chemical and physical factors. The priority preventive directions in applying computer information technology in education are: optimization of the visual parameters of the activity, rationalization of ergonomic parameters, minimization of the adverse effects of chemical and physical conditions, and rationalization of the work-rest schedule.

  18. [The efficacy of endoscopic endosonography in diagnosis of benign and malignant stenoses of common bile duct].

    Science.gov (United States)

    Solodinina, E N; Starkov, Iu G; Shumkina, L V

    2016-01-01

    To define criteria for, and estimate the diagnostic significance of, endosonography in the differential diagnosis of benign and malignant stenoses of the common bile duct. We present the results of the examination and treatment of 57 patients with benign and malignant stenoses of the common bile duct. The technique of endosonography is described, and the major criteria for the differential diagnosis of tumoral and non-tumoral lesions of the extrahepatic bile ducts are formulated. A comparative analysis of endosonography, ultrasound, computed tomography and magnetic resonance cholangiopancreatography was performed. The sensitivity, specificity and accuracy of endosonography in diagnosing the cause of stenosis are 97.7%, 100% and 98.2%, respectively, exceeding the efficacy of the other diagnostic imaging methods. In the modern surgical clinic, endosonography should be mandatory: it is necessary for the final diagnosis of the cause of common bile duct stenosis, especially when the stenosis is located low.
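
    For readers less familiar with the reported figures, sensitivity, specificity and accuracy follow directly from the 2x2 confusion matrix; the sketch below uses hypothetical counts of ours chosen only to roughly reproduce the quoted percentages, not the study's raw data:

      # Diagnostic test metrics from true/false positive and negative counts.
      def diagnostic_metrics(tp, fp, tn, fn):
          sensitivity = tp / (tp + fn)            # diseased correctly detected
          specificity = tn / (tn + fp)            # healthy correctly excluded
          accuracy = (tp + tn) / (tp + fp + tn + fn)
          return sensitivity, specificity, accuracy

      # Hypothetical counts for 57 patients consistent with ~97.7% / 100% / 98.2%:
      # diagnostic_metrics(tp=42, fp=0, tn=14, fn=1) -> (0.977, 1.0, 0.982)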

  19. Direct linear measurement of root dentin thickness and dentin volume changes with post space preparation: A cone-beam computed tomography study

    Directory of Open Access Journals (Sweden)

    Shoeb Yakub Shaikh

    2018-01-01

    Full Text Available Aim: The purpose of the present study was direct linear measurement of dentin thickness and of dentin volume changes with post space preparation using cone-beam computed tomography (CBCT). Materials and Methods: Ten maxillary central incisors were scanned, before and after root canal and post space preparation, with an Orthophos XG three-dimensional hybrid unit. Thirteen axial section scans of each tooth from orifice to apex were taken, and the buccal, lingual, mesial and distal dentin thicknesses were measured using the proprietary measuring tool and subjected to statistical analysis. Dentin volume was evaluated using the ITK-SNAP software. Results: There was a statistically significant difference in dentin thickness between pre- and post-instrumentation (paired t-test) and between the different groups (one-way ANOVA). For the shortest post length of 4.5 mm, post space preparation resulted in a 2.17% loss of hard tissue volume, whereas for the longest post length of 11 mm it resulted in a >40% loss. Conclusion: CBCT axial section scans for direct measurement of root dentin thickness can guide the selection of drill length and diameter before and after post space preparation.

  20. The gputools package enables GPU computing in R.

    Science.gov (United States)

    Buckner, Joshua; Wilson, Justin; Seligman, Mark; Athey, Brian; Watson, Stanley; Meng, Fan

    2010-01-01

    By default, the R statistical environment does not make use of parallelism. Researchers may resort to expensive solutions such as cluster hardware for large analysis tasks. Graphics processing units (GPUs) provide an inexpensive and computationally powerful alternative. Using R and the CUDA toolkit from Nvidia, we have implemented several functions commonly used in microarray gene expression analysis for GPU-equipped computers. R users can take advantage of the better performance provided by an Nvidia GPU. The package is available from CRAN, the R project's repository of packages, at http://cran.r-project.org/web/packages/gputools More information about our gputools R package is available at http://brainarray.mbni.med.umich.edu/brainarray/Rgpgpu