WorldWideScience

Sample records for computing longest common

  1. Faster Algorithms for Computing Longest Common Increasing Subsequences

    DEFF Research Database (Denmark)

    Kutz, Martin; Brodal, Gerth Stølting; Kaligosi, Kanela

    2011-01-01

    We present algorithms for finding a longest common increasing subsequence of two or more input sequences. For two sequences of lengths n and m, where m⩾n, we present an algorithm with an output-dependent expected running time of [formula omitted] and O(m) space, where ℓ is the length of an LCIS, σ is the size of the alphabet, and Sort is the time to sort each input sequence. For k⩾3 length-n sequences we present an algorithm which improves the previous best bound by more than a factor k for many inputs. In both cases, our algorithms are conceptually quite simple but rely on existing sophisticated data structures. Finally, we introduce the problem of longest common weakly-increasing (or non-decreasing) subsequences (LCWIS), for which we present an [formula omitted]-time algorithm for the 3-letter alphabet case. For the extensively studied longest common subsequence problem, comparable speedups have not been achieved for small alphabets.
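    For readers who want to try the problem itself, the following is a minimal O(n·m) dynamic-programming sketch of LCIS for two sequences. It is only the textbook baseline, not the output-dependent algorithm of the record above, and the example data are made up.

```python
# Textbook O(n*m) dynamic program for the longest common increasing
# subsequence (LCIS) of two sequences; example data are made up.
def lcis_length(a, b):
    """Length of a longest common increasing subsequence of a and b."""
    best = [0] * len(b)          # best[j]: LCIS length ending exactly at b[j]
    for x in a:
        current = 0              # best LCIS length ending with a value < x
        for j, y in enumerate(b):
            if y == x and current + 1 > best[j]:
                best[j] = current + 1
            elif y < x and best[j] > current:
                current = best[j]
    return max(best, default=0)

print(lcis_length([3, 1, 4, 1, 5, 9, 2, 6], [1, 4, 5, 9, 6]))  # -> 4 (e.g. 1, 4, 5, 9)
```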

  2. FACC: A Novel Finite Automaton Based on Cloud Computing for the Multiple Longest Common Subsequences Search

    Directory of Open Access Journals (Sweden)

    Yanni Li

    2012-01-01

    Searching for the multiple longest common subsequences (MLCS) has significant applications in areas such as bioinformatics, information processing, and data mining. Although a few parallel MLCS algorithms have been proposed, their efficiency and effectiveness are not satisfactory given the increasing complexity and size of biological data. To overcome the shortcomings of existing MLCS algorithms, and considering that the MapReduce parallel framework of cloud computing is a promising technology for cost-effective high-performance parallel computing, a novel finite automaton (FA) based on cloud computing, called FACC, is proposed under the MapReduce parallel framework in order to obtain a more efficient and effective general parallel MLCS algorithm. FACC adopts the ideas of matched pairs and finite automata: it preprocesses the sequences, constructs successor tables, and builds a common-subsequence finite automaton to search for the MLCS. Simulation experiments on a set of benchmarks from both real DNA and amino acid sequences have been conducted, and the results show that the proposed FACC algorithm outperforms the current leading parallel MLCS algorithm FAST-MLCS.
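    The record above mentions preprocessing each sequence into a successor table before building the common-subsequence automaton. The sketch below shows what such a table can look like for a single sequence; it is a generic illustration (1-based positions assumed), not the FACC MapReduce implementation.

```python
# Generic successor-table preprocessing for one sequence s over an alphabet:
# succ[c][i] is the smallest position j > i (1-based) with s[j-1] == c,
# or None if c does not occur after position i. Not the FACC implementation.
def successor_table(s, alphabet):
    n = len(s)
    succ = {c: [None] * (n + 1) for c in alphabet}
    for c in alphabet:
        nxt = None
        for i in range(n, -1, -1):        # fill from the right
            succ[c][i] = nxt
            if i > 0 and s[i - 1] == c:
                nxt = i
    return succ

tab = successor_table("ACTAGC", "ACGT")
print(tab["A"])  # [1, 4, 4, 4, None, None, None]
```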

  3. Longest common extensions in trees

    DEFF Research Database (Denmark)

    Bille, Philip; Gawrychowski, Pawel; Gørtz, Inge Li

    2016-01-01

    We generalize the longest common extension (LCE) problem to trees and suggest a few applications of LCE in trees to tries and XML databases. Given a labeled and rooted tree T of size n, the goal is to preprocess T into a compact data structure that supports the following LCE queries between subpaths and subtrees in T. Let v1, v2, w1, and w2 be nodes of T such that w1 and w2 are descendants of v1 and v2, respectively. LCEPP(v1, w1, v2, w2) (path-path LCE) returns the longest common prefix of the paths v1 ~→ w1 and v2 ~→ w2; LCEPT(v1, w1, v2) (path-tree LCE) returns the maximal path-path LCE of the path v1 ~→ w1 and any path from v2 to a descendant leaf; LCETT(v1, v2) (tree-tree LCE) returns a maximal path-path LCE of any pair of paths from v1 and v2 to descendant leaves. We present the first non-trivial bounds for supporting these queries. For LCEPP queries, we present a linear-space solution with O(log* n) query time. For LCEPT queries, we present...

  4. Longest Common Extensions in Trees

    DEFF Research Database (Denmark)

    Bille, Philip; Gawrychowski, Pawel; Gørtz, Inge Li

    2015-01-01

    We generalize the longest common extension (LCE) problem to trees and suggest a few applications of LCE in trees to tries and XML databases. Given a labeled and rooted tree T of size n, the goal is to preprocess T into a compact data structure that supports the following LCE queries between subpaths and subtrees in T. Let v1, v2, w1, and w2 be nodes of T such that w1 and w2 are descendants of v1 and v2, respectively. LCEPP(v1, w1, v2, w2) (path-path LCE) returns the longest common prefix of the paths v1 ~→ w1 and v2 ~→ w2; LCEPT(v1, w1, v2) (path-tree LCE) returns the maximal path-path LCE of the path v1 ~→ w1 and any path from v2 to a descendant leaf; LCETT(v1, v2) (tree-tree LCE) returns a maximal path-path LCE of any pair of paths from v1 and v2 to descendant leaves. We present the first non-trivial bounds for supporting these queries. For LCEPP queries, we present a linear-space solution with O(log* n) query time. For LCEPT queries, we present...

  5. Longest Common Extensions via Fingerprinting

    DEFF Research Database (Denmark)

    Bille, Philip; Gørtz, Inge Li; Kristensen, Jesper

    2012-01-01

    The longest common extension (LCE) problem can be solved in linear space with constant query time and a preprocessing of sorting complexity. There are two known approaches achieving these bounds, which use nearest common ancestors and range minimum queries, respectively. However, in practice a much simpler approach with linear query time, no extra space and no preprocessing achieves significantly better average case performance. We show a new algorithm, Fingerprint_k, which for a parameter k, 1 ≤ k ≤ [log n], on a string of length n and alphabet size σ, gives O(k n^(1/k)) query time using O(k n) space and O(k n + sort(n,σ)) preprocessing time, where sort(n,σ) is the time it takes to sort n numbers from σ. Though this solution is asymptotically strictly worse than the asymptotically best previously known algorithms, it outperforms them in practice in the average case and is almost as fast as the simple linear time algorithm. On worst...
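    The "simple linear time algorithm" the record refers to is the naive scan that compares the two suffixes character by character until they differ; a minimal sketch follows (the string and indices are arbitrary examples).

```python
# The no-preprocessing baseline: scan the two suffixes until they differ.
def lce_naive(s, i, j):
    """Length of the longest common prefix of s[i:] and s[j:]."""
    k, n = 0, len(s)
    while i + k < n and j + k < n and s[i + k] == s[j + k]:
        k += 1
    return k

s = "abracadabra"
print(lce_naive(s, 0, 7))  # 4, since both suffixes start with "abra"
```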

  6. Sublinear Space Algorithms for the Longest Common Substring Problem

    DEFF Research Database (Denmark)

    Kociumaka, Tomasz; Starikovskaya, Tatiana; Vildhøj, Hjalte Wedel

    2014-01-01

    Given m documents of total length n, we consider the problem of finding a longest string common to at least d ≥ 2 of the documents. This problem is known as the longest common substring (LCS) problem and has a classic O(n) space and O(n) time solution (Weiner [FOCS'73], Hui [CPM'92]). However...
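    As a point of reference for the problem statement, the following is the classic quadratic-time dynamic program for the longest common substring of two strings. It is not the sublinear-space technique of the record above, only a compact illustration of what is being computed.

```python
# Classic O(|x|*|y|)-time dynamic program for the longest common substring
# of two strings (kept in O(|y|) rolling space); illustration only.
def longest_common_substring(x, y):
    best_len, best_end = 0, 0
    prev = [0] * (len(y) + 1)
    for i in range(1, len(x) + 1):
        cur = [0] * (len(y) + 1)
        for j in range(1, len(y) + 1):
            if x[i - 1] == y[j - 1]:
                cur[j] = prev[j - 1] + 1
                if cur[j] > best_len:
                    best_len, best_end = cur[j], i
        prev = cur
    return x[best_end - best_len:best_end]

print(longest_common_substring("supercomputation", "computationally"))  # "computation"
```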

  7. Time-Space Trade-Offs for the Longest Common Substring Problem

    DEFF Research Database (Denmark)

    Starikovskaya, Tatiana; Vildhøj, Hjalte Wedel

    2013-01-01

    The Longest Common Substring problem is to compute the longest substring which occurs in at least d ≥ 2 of m strings of total length n. In this paper we ask whether this problem allows a deterministic time-space trade-off using O(n^(1+ε)) time and O(n^(1-ε)) space for 0 ≤ ε ≤ 1. We give a positive answer in the case of two strings (d = m = 2) and 0 ≤ ε ..., and show that the problem can be solved in O(n^(1-ε)) space and O(n^(1+ε) log²n (d log²n + d²)) time for any 0 ≤ ε ...

  8. Comparative Visualization of Vector Field Ensembles Based on Longest Common Subsequence

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Richen; Guo, Hanqi; Zhang, Jiang; Yuan, Xiaoru

    2016-04-19

    We propose a longest common subsequence (LCS) based approach to compute the distance among vector field ensembles. By measuring how many common data blocks the ensemble pathlines pass through, the LCS distance defines the similarity among vector field ensembles by counting the number of shared domain data blocks. Compared to traditional methods (e.g., point-wise Euclidean distance or dynamic time warping distance), the proposed approach is robust to outliers, missing data, and the sampling rate of pathline timesteps. Taking advantage of smaller and reusable intermediate output, visualization based on the proposed LCS approach reveals temporal trends in the data at low storage cost and avoids tracing pathlines repeatedly. Finally, we evaluate our method on both synthetic data and simulation data; the results demonstrate the robustness of the proposed approach.
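    A minimal sketch of the underlying idea: encode each pathline as the sequence of domain data-block IDs it passes through and use an LCS length as the similarity. The normalization used below (1 − LCS/longer length) is an assumption for illustration, not necessarily the paper's exact definition.

```python
# Pathlines encoded as sequences of domain data-block IDs; LCS length used
# as similarity. The normalization below is an illustrative assumption.
def lcs_length(a, b):
    prev = [0] * (len(b) + 1)
    for x in a:
        cur = [0]
        for j, y in enumerate(b, 1):
            cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[j - 1]))
        prev = cur
    return prev[-1]

def block_sequence_distance(p, q):
    return 1.0 - lcs_length(p, q) / max(len(p), len(q), 1)

p = [3, 3, 7, 8, 12, 12, 13]      # pathline passing through blocks 3, 3, 7, ...
q = [3, 7, 99, 8, 12, 13]         # similar pathline with one outlier block
print(round(block_sequence_distance(p, q), 2))  # 0.29
```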

  9. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

    Computer forensic analysts are in charge of investigation and evidence tracking. In certain cases, a file needed as digital evidence has been deleted. Such a file is difficult to reconstruct, because it has often lost its header and cannot be identified while being restored. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose a Longest Common Subsequence based method, consisting of three steps, namely training, testing and validation, to identify the file type from file fragments. From all testing results we conclude that our proposed method works well and achieves 92.91% accuracy in identifying the file type of file fragments for three data types.

  10. A Novel Efficient Graph Model for the Multiple Longest Common Subsequences (MLCS) Problem

    Directory of Open Access Journals (Sweden)

    Zhan Peng

    2017-08-01

    Searching for the Multiple Longest Common Subsequences (MLCS) of multiple sequences is a classical NP-hard problem which has been used in many applications. One of the most effective exact approaches for the MLCS problem is based on the dominant-point graph, which is a kind of directed acyclic graph (DAG). However, the time and space efficiency of the leading dominant-point graph based approaches is still unsatisfactory: constructing the dominant-point graph used by these approaches requires a huge amount of time and space, which hinders the application of these approaches to large-scale and long sequences. To address this issue, in this paper we propose a new time- and space-efficient graph model called the Leveled-DAG for the MLCS problem. The Leveled-DAG promptly eliminates all nodes in the graph that cannot contribute to the construction of the MLCS during construction. At any moment, only the current level and some previously generated nodes need to be kept in memory, which greatly reduces memory consumption. Also, the final graph contains only one node, in which all of the wanted MLCS are saved; thus, no additional operations for searching the MLCS are needed. Experiments are conducted on real biological sequences with different numbers and lengths of sequences, and the proposed algorithm is compared with three state-of-the-art algorithms. The experimental results show that the time and space needed for the Leveled-DAG approach are smaller than those for the compared algorithms, especially on large-scale and long sequences.
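    Dominant-point approaches such as the one discussed above expand match points (one coordinate per sequence) level by level and prune the dominated ones. The sketch below shows only that pruning step on hypothetical match points; the Leveled-DAG additionally organizes the surviving points into a DAG and discards exhausted levels, which is not reproduced here.

```python
# Pruning step of dominant-point methods for k sequences: keep only the
# match points that are not dominated componentwise. Illustration only.
def dominates(p, q):
    """p dominates q if p <= q componentwise and p != q."""
    return p != q and all(a <= b for a, b in zip(p, q))

def prune_dominated(points):
    points = list(set(points))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# hypothetical match points for 3 sequences (one index per sequence)
pts = [(2, 5, 3), (2, 6, 4), (4, 5, 2), (3, 7, 9)]
print(sorted(prune_dominated(pts)))  # [(2, 5, 3), (4, 5, 2)]
```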

  11. Chords in longest cycles

    DEFF Research Database (Denmark)

    Thomassen, Carsten

    2017-01-01

    If a graph G is 3-connected and has minimum degree at least 4, then some longest cycle in G has a chord. If G is 2-connected and cubic, then every longest cycle in G has a chord.

  12. The symmetric longest queue system

    NARCIS (Netherlands)

    van Houtum, Geert-Jan; Adan, Ivo; van der Wal, Jan

    1997-01-01

    We derive the performance of the exponential symmetric longest queue system from two variants: a longest queue system with Threshold Rejection of jobs and one with Threshold Addition of jobs. It is shown that these two systems provide lower and upper bounds for the performance of the longest queue system.

  13. Longest cable-stayed bridge TATARA; Longest shachokyo Tatara Ohashi

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, K. [Hiroshima University, Hiroshima (Japan). Faculty of Engineering

    1998-06-15

    The world's longest cable-stayed bridge, Tatara, with a central span of 890 m, had both ends closed in August 1997, linking Namakuchi Island and Omishima Island. Final finishing work is continuing for the opening of the West Seto Expressway in the spring of 1999. A cable-stayed bridge supports its girders through the perpendicular components of the tensile force of the obliquely stayed cables. On the other hand, there is a concern that the horizontal components of the cable tensile force generate axial compression in the girders, which can cause buckling. Therefore, in order to suspend the girders efficiently by increasing the perpendicular components of the cable force, and moreover to suppress the axial compression force on the girders, it is advantageous to make the bridge towers high; hence the towers of this bridge are the highest among the bridges on the Shimanami Ocean Road. Because the long girders of this bridge are stayed with 21-stage multi-cables, buckling of the steel girders near the towers due to the horizontal force components was a key design problem. It was studied using load-carrying capacity experiments on a 1/50-scale model of the whole bridge, buckling experiments on full-size reinforcing plate models, and load-carrying capacity analysis using a tower model. A number of other technical issues were also examined, and the world's longest cable-stayed bridge was completed. 9 figs., 1 tab.

  14. A Note on Longest Paths in Circular Arc Graphs

    Directory of Open Access Journals (Sweden)

    Joos Felix

    2015-08-01

    As observed by Rautenbach and Sereni [SIAM J. Discrete Math. 28 (2014) 335-341], there is a gap in the proof of the theorem of Balister et al. [Combin. Probab. Comput. 13 (2004) 311-317], which states that the intersection of all longest paths in a connected circular arc graph is nonempty. In this paper we close this gap.

  15. On the number of longest and almost longest cycles in cubic graphs

    DEFF Research Database (Denmark)

    Chia, Gek Ling; Thomassen, Carsten

    2012-01-01

    We consider the questions: How many longest cycles must a cubic graph have, and how many may it have? For each k >= 2 there are infinitely many p such that there is a cubic graph with p vertices and precisely one longest cycle of length p-k. On the other hand, if G is a graph with p vertices, all...

  16. Test Anxiety, Computer-Adaptive Testing and the Common Core

    Science.gov (United States)

    Colwell, Nicole Makas

    2013-01-01

    This paper highlights the current findings and issues regarding the role of computer-adaptive testing in test anxiety. The computer-adaptive test (CAT) proposed by one of the Common Core consortia brings these issues to the forefront. Research has long indicated that test anxiety impairs student performance. More recent research indicates that…

  17. Destroying longest cycles in graphs and digraphs

    DEFF Research Database (Denmark)

    Van Aardt, Susan A.; Burger, Alewyn P.; Dunbar, Jean E.

    2015-01-01

    In 1978, C. Thomassen proved that in any graph one can destroy all the longest cycles by deleting at most one third of the vertices. We show that for graphs with circumference k ≤ 8 it suffices to remove at most 1/k of the vertices. The Petersen graph demonstrates that this result cannot be extended to include k = 9, but we show that in every graph with circumference nine we can destroy all 9-cycles by removing 1/5 of the vertices. We consider the analogous problem for digraphs and show that for digraphs with circumference k = 2, 3, it suffices to remove 1/k of the vertices. However, this does not hold for k ≥ 4.

  18. A Computer-Based Instrument That Identifies Common Science Misconceptions

    Science.gov (United States)

    Larrabee, Timothy G.; Stein, Mary; Barman, Charles

    2006-01-01

    This article describes the rationale for and development of a computer-based instrument that helps identify commonly held science misconceptions. The instrument, known as the Science Beliefs Test, is a 47-item instrument that targets topics in chemistry, physics, biology, earth science, and astronomy. The use of an online data collection system…

  19. Common accounting system for monitoring the ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Karavakis, E; Andreeva, J; Campana, S; Saiz, P; Gayazov, S; Jezequel, S; Sargsyan, L; Schovancova, J; Ueda, I

    2014-01-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources, either generic or ATLAS-specific. This set of tools provides high-quality, scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  20. WMO World Record Lightning Extremes: Longest Reported Flash Distance and Longest Reported Flash Duration.

    Science.gov (United States)

    Lang, Timothy J; Pédeboy, Stéphane; Rison, William; Cerveny, Randall S; Montanyà, Joan; Chauzy, Serge; MacGorman, Donald R; Holle, Ronald L; Ávila, Eldo E; Zhang, Yijun; Carbin, Gregory; Mansell, Edward R; Kuleshov, Yuriy; Peterson, Thomas C; Brunet, Manola; Driouech, Fatima; Krahenbuhl, Daniel S

    2017-06-01

    A World Meteorological Organization weather and climate extremes committee has judged that the world's longest reported distance for a single lightning flash occurred with a horizontal distance of 321 km (199.5 mi) over Oklahoma in 2007, while the world's longest reported duration for a single lightning flash is an event that lasted continuously for 7.74 seconds over southern France in 2012. In addition, the committee has unanimously recommended amendment of the AMS Glossary of Meteorology definition of lightning discharge as a "series of electrical processes taking place within 1 second" by removing the phrase "within one second" and replacing with "continuously." Validation of these new world extremes (a) demonstrates the recent and on-going dramatic augmentations and improvements to regional lightning detection and measurement networks, (b) provides reinforcement regarding the dangers of lightning, and (c) provides new information for lightning engineering concerns.

  1. Maximum likelihood as a common computational framework in tomotherapy

    International Nuclear Information System (INIS)

    Olivera, G.H.; Shepard, D.M.; Reckwerdt, P.J.; Ruchala, K.; Zachman, J.; Fitchard, E.E.; Mackie, T.R.

    1998-01-01

    Tomotherapy is a dose delivery technique using helical or axial intensity modulated beams. One of the strengths of the tomotherapy concept is that it can incorporate a number of processes into a single piece of equipment. These processes include treatment optimization planning, dose reconstruction and kilovoltage/megavoltage image reconstruction. A common computational technique that could be used for all of these processes would be very appealing. The maximum likelihood estimator, originally developed for emission tomography, can serve as a useful tool in imaging and radiotherapy. We believe that this approach can play an important role in the processes of optimization planning, dose reconstruction and kilovoltage and/or megavoltage image reconstruction. These processes involve computations that require comparable physical methods. They are also based on equivalent assumptions, and they have similar mathematical solutions. As a result, the maximum likelihood approach is able to provide a common framework for all three of these computational problems. We will demonstrate how maximum likelihood methods can be applied to optimization planning, dose reconstruction and megavoltage image reconstruction in tomotherapy. Results for planning optimization, dose reconstruction and megavoltage image reconstruction will be presented. Strengths and weaknesses of the methodology are analysed. Future directions for this work are also suggested. (author)
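    The maximum likelihood machinery referred to above is, at its core, the MLEM iteration from emission tomography. The toy sketch below applies that update to a made-up 3x3 system matrix; it is only meant to illustrate the shared computational kernel, not the tomotherapy implementation.

```python
# Toy MLEM (maximum likelihood expectation maximization) iteration of the
# kind used in emission tomography; system matrix and data are made up.
import numpy as np

def mlem(A, y, n_iter=200):
    """Iterate x <- x / (A^T 1) * A^T (y / (A x)), starting from a flat image."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])           # sensitivity image A^T 1
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)   # measured / predicted counts
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

A = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0],
              [0.5, 1.0, 0.5]])
x_true = np.array([2.0, 1.0, 3.0])
y = A @ x_true                                  # noise-free "measurements"
print(np.round(mlem(A, y), 2))                  # should approach [2. 1. 3.]
```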

  2. Neurobiological roots of language in primate audition: common computational properties.

    Science.gov (United States)

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias; Small, Steven L; Rauschecker, Josef P

    2015-03-01

    Here, we present a new perspective on an old question: how does the neurobiology of human language relate to brain systems in nonhuman primates? We argue that higher-order language combinatorics, including sentence and discourse processing, can be situated in a unified, cross-species dorsal-ventral streams architecture for higher auditory processing, and that the functions of the dorsal and ventral streams in higher-order language processing can be grounded in their respective computational properties in primate audition. This view challenges an assumption, common in the cognitive sciences, that a nonhuman primate model forms an inherently inadequate basis for modeling higher-level language functions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. ONE OF THE LONGEST APPENDIX: A RARE CASE REPORT

    Directory of Open Access Journals (Sweden)

    Venkat Rao

    2015-03-01

    The vermiform appendix is an organ that can have variable size. We are prompted to report here one of the longest appendices removed, measuring about 16 cm in length. INTRODUCTION: The vermiform appendix can vary in size, site, and presence, as well as in other clinical and functional aspects. We describe here one of the longest appendices removed, measuring about 16 cm in length, in a case of acute appendicitis.

  4. Common Agency and Computational Complexity : Theory and Experimental Evidence

    NARCIS (Netherlands)

    Kirchsteiger, G.; Prat, A.

    1999-01-01

    In a common agency game, several principals try to influence the behavior of an agent. Common agency games typically have multiple equilibria. One class of equilibria, called truthful, has been identified by Bernheim and Whinston and has found widespread use in the political economy literature. In

  5. Overview of Parallel Platforms for Common High Performance Computing

    Directory of Open Access Journals (Sweden)

    T. Fryza

    2012-04-01

    The paper deals with various parallel platforms used for high performance computing in the signal processing domain. More precisely, methods exploiting multicore central processing units, such as the message passing interface and OpenMP, are taken into account. The properties of these programming methods are experimentally verified on a fast Fourier transform and a discrete cosine transform, and they are compared with MATLAB's built-in functions and with Texas Instruments digital signal processors with very long instruction word architectures. New FFT and DCT implementations were proposed and tested. The implementation phase was compared with CPU-based computing methods and with the possibilities of the Texas Instruments digital signal processing library on C6747 floating-point DSPs. An optimal combination of computing methods in the signal processing domain and the implementation of new, fast routines are proposed as well.

  6. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  7. On the Core of Multiple Longest Traveling Salesman Games

    NARCIS (Netherlands)

    Estevez Fernandez, M.A.; Borm, P.E.M.; Hamers, H.J.M.

    2003-01-01

    In this paper we introduce multiple longest traveling salesman (MLTS) games. An MLTS game arises from a network in which a salesman has to visit each node (player) precisely once, except its home location, in an order that maximizes the total reward. First it is shown that the value of a coalition of

  8. Missile signal processing common computer architecture for rapid technology upgrade

    Science.gov (United States)

    Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul

    2004-10-01

    Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain is comprised of two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific and required custom software development. They were developed using non-integrated toolsets, and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's Law-driven improvement in computational throughput; a newly introduced vector computing capability in general-purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. This application...

  9. Logical and physical resource management in the common node of a distributed function laboratory computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-01-01

    A scheme for managing resources required for transaction processing in the common node of a distributed function computer system has been given. The scheme has been found to be satisfactory for all common node services provided so far

  10. Common findings and pseudolesions at computed tomography colonography: pictorial essay

    International Nuclear Information System (INIS)

    Atzingen, Augusto Castelli von; Tiferes, Dario Ariel; Matsumoto, Carlos Alberto; Nunes, Thiago Franchi; Maia, Marcos Vinicius Alvim Soares; D'Ippolito, Giuseppe

    2012-01-01

    Computed tomography colonography is a minimally invasive method for screening for polyps and colorectal cancer, with extremely rare complications, and is increasingly used in clinical practice. In the last decade, developments in bowel preparation, imaging, and the training of investigators have led to a significant increase in the method's sensitivity. Image interpretation is accomplished through a combined analysis of two-dimensional source images and several types of three-dimensional renderings, with sensitivity around 96% in the detection of lesions equal to or greater than 10 mm in size when analyzed by experienced radiologists. The present pictorial essay includes examples of diseases and pseudolesions most frequently observed in this type of imaging study. The authors present examples of flat and polypoid lesions, benign and malignant lesions, and diverticular disease of the colon, among other conditions, as well as pseudolesions, including those related to inappropriate bowel preparation and misinterpretation. (author)

  11. Common findings and pseudolesions at computed tomography colonography: pictorial essay

    Energy Technology Data Exchange (ETDEWEB)

    Atzingen, Augusto Castelli von [Clinical Radiology, Universidade Federal de Sao Paulo (UNIFESP), Sao Paulo, SP (Brazil); Tiferes, Dario Ariel; Matsumoto, Carlos Alberto; Nunes, Thiago Franchi; Maia, Marcos Vinicius Alvim Soares [Abdominal Imaging Section, Department of Imaging Diagnosis - Universidade Federal de Sao Paulo (UNIFESP), Sao Paulo, SP (Brazil); D' Ippolito, Giuseppe, E-mail: giuseppe_dr@uol.com.br [Department of Imaging Diagnosis, Universidade Federal de Sao Paulo (UNIFESP), Sao Paulo, SP (Brazil)

    2012-05-15

    Computed tomography colonography is a minimally invasive method for screening for polyps and colorectal cancer, with extremely rare complications, and is increasingly used in clinical practice. In the last decade, developments in bowel preparation, imaging, and the training of investigators have led to a significant increase in the method's sensitivity. Image interpretation is accomplished through a combined analysis of two-dimensional source images and several types of three-dimensional renderings, with sensitivity around 96% in the detection of lesions equal to or greater than 10 mm in size when analyzed by experienced radiologists. The present pictorial essay includes examples of diseases and pseudolesions most frequently observed in this type of imaging study. The authors present examples of flat and polypoid lesions, benign and malignant lesions, and diverticular disease of the colon, among other conditions, as well as pseudolesions, including those related to inappropriate bowel preparation and misinterpretation. (author)

  12. Second longest conveyor belt in UK installed and fully operational

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    A conveyor belt (which after the completion of the Selby complex will be the second longest conveyor belt in the UK) has been installed at the Prince Charles Drift Mine, Prince of Wales Colliery, United Kingdom. The 1706 m conveyor is the sole underground-to-surface conveyor at the Drift Mine, and is powered by a single 2240 kW, 3000 hp drive unit.

  13. World's longest underwater line part of new dc transmission link

    Energy Technology Data Exchange (ETDEWEB)

    1967-04-01

    The world's seventh dc transmission system, including the world's longest underwater power cable, is now operative. The system, linking the Italian mainland with Sardinia, was designed and engineered by the English Electric Co. Ltd. It will ensure a constant power supply for Sardinia and allow export of 200 MW of power to the Tuscany area in Italy. Proving tests began on the link in December and will continue until full demand is made on it from Italy.

  14. Collection of reports on use of computation fund utilized in common in 1988

    International Nuclear Information System (INIS)

    1989-05-01

    The Nuclear Physics Research Center, Osaka University, has provided a computation fund for common use since 1976 to support computation related to the activities of the Center. When this fund is used, a short report in a fixed format (printed in RCNP-Z together with the report of the committee on the commonly used computation fund) and a detailed report on the contents of the computation are to be submitted after the work is finished. The latter report includes an English abstract, an explanation of the results obtained by the computation and their physical content, new developments, difficulties encountered in the computational techniques and how they were solved, the subroutines and functions used together with their roles and block diagrams, and so on. This book is the collection of the latter reports on the use of the commonly used computation fund in fiscal year 1988. The invitation to apply for the fund is announced in RCNP-Z in December every year. (K.I.)

  15. Common data buffer system. [communication with computational equipment utilized in spacecraft operations

    Science.gov (United States)

    Byrne, F. (Inventor)

    1981-01-01

    A high speed common data buffer system is described for providing an interface and communications medium between a plurality of computers utilized in a distributed computer complex forming part of a checkout, command and control system for space vehicles and associated ground support equipment. The system includes the capability for temporarily storing data to be transferred between computers, for transferring a plurality of interrupts between computers, for monitoring and recording these transfers, and for correcting errors incurred in these transfers. Validity checks are made on each transfer and appropriate error notification is given to the computer associated with that transfer.

  16. On Longest Cycles in Essentially 4-Connected Planar Graphs

    Directory of Open Access Journals (Sweden)

    Fabrici Igor

    2016-08-01

    A planar 3-connected graph G is essentially 4-connected if, for any 3-separator S of G, one component of the graph obtained from G by removing S is a single vertex. Jackson and Wormald proved that an essentially 4-connected planar graph on n vertices contains a cycle C such that … For a cubic essentially 4-connected planar graph G, Grünbaum and Malkevitch, and Zhang showed that G has a cycle on at least ¾ n vertices. In the present paper the result of Jackson and Wormald is improved. Moreover, new lower bounds on the length of a longest cycle of G are presented if G is an essentially 4-connected planar graph of maximum degree 4 or G is an essentially 4-connected maximal planar graph.

  17. Transaction processing in the common node of a distributed function laboratory computer system

    International Nuclear Information System (INIS)

    Stubblefield, F.W.; Dimmler, D.G.

    1975-01-01

    A computer network architecture consisting of a common node processor for managing peripherals and files and a number of private node processors for laboratory experiment control is briefly reviewed. Central to the problem of private node-common node communication is the concept of a transaction. The collection of procedures and the data structure associated with a transaction are described. The common node properties assigned to a transaction and procedures required for its complete processing are discussed. (U.S.)

  18. System of common usage on the base of external memory devices and the SM-3 computer

    International Nuclear Information System (INIS)

    Baluka, G.; Vasin, A.Yu.; Ermakov, V.A.; Zhukov, G.P.; Zimin, G.N.; Namsraj, Yu.; Ostrovnoj, A.I.; Savvateev, A.S.; Salamatin, I.M.; Yanovskij, G.Ya.

    1980-01-01

    An easily modified common-use system based on external memory devices and an SM-3 minicomputer, which replaces some pulse analysers, is described. The system has the merits of pulse analysers and is more advantageous with regard to effective use of the equipment, the possibility of changing configuration and functions, protection of data against losses due to user errors and some failures, the price per registration channel, and the space occupied. The common-use system is intended for the IBR-2 pulsed reactor computing centre. It is designed using the SANPO system facilities for the SM-3 computer [ru

  19. A Study on the Radiographic Diagnosis of Common Periapical Lesions by Using Computer

    International Nuclear Information System (INIS)

    Kim, Jae Duck; Kim, Seung Kug

    1990-01-01

    The purpose of this study was to assess the usefulness of computer-aided radiographic diagnosis of common periapical lesions. The author used a domestic personal computer with an appropriately adapted application program, RF (Rapid File), and entered as basic data the clinical and radiological features of common periapical lesions obtained through collection, analysis and classification. The 256 cases (cyst 91, periapical granuloma 74, periapical abscess 91) were obtained from the chart recordings and radiographs of patients diagnosed or treated for common periapical lesions during the past 8 years (1983-1990) at the infirmary of the Dental School, Chosun University. The clinical and radiographic features of the 256 cases were then applied to the RF program for diagnosis, and the computer diagnosis was compared with the blinded final diagnosis established by clinical and histopathological examination. The results were as follows: 1. In cases of cysts, diagnosis through the computer program showed somewhat lower accuracy (80.22%) compared with the accuracy (90.1%) of the radiologists. 2. In cases of granulomas, diagnosis through the computer program showed somewhat higher accuracy (75.7%) compared with the accuracy (70.3%) of the radiologists. 3. In cases of periapical abscesses, the diagnostic accuracy was 88% for both. 4. The average diagnostic accuracy of the 256 cases through the computer program was somewhat lower (81.2%) compared with the accuracy (82.8%) of the radiologists. 5. The basic data applied for computer-aided radiographic diagnosis of common periapical lesions were judged to be usable.

  20. A Study on the Radiographic Diagnosis of Common Periapical Lesions by Using Computer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Duck; Kim, Seung Kug [Dept. of Oral Radiology, College of Dentistry, Chosun University, Kwangju (Korea, Republic of)

    1990-08-15

    The purpose of this study was to assess the usefulness of computer-aided radiographic diagnosis of common periapical lesions. The author used a domestic personal computer with an appropriately adapted application program, RF (Rapid File), and entered as basic data the clinical and radiological features of common periapical lesions obtained through collection, analysis and classification. The 256 cases (cyst 91, periapical granuloma 74, periapical abscess 91) were obtained from the chart recordings and radiographs of patients diagnosed or treated for common periapical lesions during the past 8 years (1983-1990) at the infirmary of the Dental School, Chosun University. The clinical and radiographic features of the 256 cases were then applied to the RF program for diagnosis, and the computer diagnosis was compared with the blinded final diagnosis established by clinical and histopathological examination. The results were as follows: 1. In cases of cysts, diagnosis through the computer program showed somewhat lower accuracy (80.22%) compared with the accuracy (90.1%) of the radiologists. 2. In cases of granulomas, diagnosis through the computer program showed somewhat higher accuracy (75.7%) compared with the accuracy (70.3%) of the radiologists. 3. In cases of periapical abscesses, the diagnostic accuracy was 88% for both. 4. The average diagnostic accuracy of the 256 cases through the computer program was somewhat lower (81.2%) compared with the accuracy (82.8%) of the radiologists. 5. The basic data applied for computer-aided radiographic diagnosis of common periapical lesions were judged to be usable.

  1. A common currency for the computation of motivational values in the human striatum

    NARCIS (Netherlands)

    Sescousse, G.T.; Li, Y.; Dreher, J.C.

    2015-01-01

    Reward comparison in the brain is thought to be achieved through the use of a 'common currency', implying that reward value representations are computed on a unique scale in the same brain regions regardless of the reward type. Although such a mechanism has been identified in the ventro-medial

  2. A common currency for the computation of motivational values in the human striatum

    NARCIS (Netherlands)

    Sescousse, G.T.; Li, Y.; Dreher, J.C.

    2014-01-01

    Reward comparison in the brain is thought to be achieved through the use of a ‘common currency’, implying that reward value representations are computed on a unique scale in the same brain regions regardless of the reward type. Although such a mechanism has been identified in the ventro-medial

  3. MONTHLY VARIATION IN SPERM MOTILITY IN COMMON CARP ASSESSED USING COMPUTER-ASSISTED SPERM ANALYSIS (CASA)

    Science.gov (United States)

    Sperm motility variables from the milt of the common carp Cyprinus carpio were assessed using a computer-assisted sperm analysis (CASA) system across several months (March-August 1992) known to encompass the natural spawning period. Two-year-old pond-raised males obtained each mo...

  4. Cultural Commonalities and Differences in Spatial Problem-Solving: A Computational Analysis

    Science.gov (United States)

    Lovett, Andrew; Forbus, Kenneth

    2011-01-01

    A fundamental question in human cognition is how people reason about space. We use a computational model to explore cross-cultural commonalities and differences in spatial cognition. Our model is based upon two hypotheses: (1) the structure-mapping model of analogy can explain the visual comparisons used in spatial reasoning; and (2) qualitative,…

  5. Nature's longest threads new frontiers in the mathematics and physics of information in biology

    CERN Document Server

    Sreekantan, B V

    2014-01-01

    Organisms endowed with life show a sense of awareness, interacting with and learning from the universe in and around them. Each level of interaction involves transfer of information of various kinds, and at different levels. Each thread of information is interlinked with the other, and woven together, these constitute the universe — both the internal self and the external world — as we perceive it. They are, figuratively speaking, Nature's longest threads. This volume reports inter-disciplinary research and views on information and its transfer at different levels of organization by reputed scientists working on the frontier areas of science. It is a frontier where physics, mathematics and biology merge seamlessly, binding together specialized streams such as quantum mechanics, dynamical systems theory, and mathematics. The topics would interest a broad cross-section of researchers in life sciences, physics, cognition, neuroscience, mathematics and computer science, as well as interested amateurs, familia...

  6. Fine tuning of work practices of common radiological investigations performed using computed radiography system

    International Nuclear Information System (INIS)

    Livingstone, Roshan S.; Timothy Peace, B.S.; Sunny, S.; Victor Raj, D.

    2007-01-01

    Introduction: The advent of computed radiography (CR) has brought about remarkable changes in the field of diagnostic radiology. A relatively large cross-section of the human population is exposed to ionizing radiation on account of common radiological investigations. This study is intended to audit the radiation doses imparted to patients during common radiological investigations performed using CR systems. Method: The entrance surface doses (ESD) were measured using thermoluminescent dosimeters (TLD) for various radiological investigations performed using CR systems. Optimization of radiographic techniques and radiation doses was done by fine-tuning the work practices. Results and conclusion: Dose reductions as high as 47% were achieved in certain investigations with the use of optimized exposure factors and fine-tuned work practices

  7. Computer vision syndrome-A common cause of unexplained visual symptoms in the modern era.

    Science.gov (United States)

    Munshi, Sunil; Varghese, Ashley; Dhar-Munshi, Sushma

    2017-07-01

    The aim of this study was to assess the evidence and available literature on the clinical, pathogenetic, prognostic and therapeutic aspects of computer vision syndrome. Information was collected from Medline, Embase and the National Library of Medicine over the 30 years up to March 2016. The bibliographies of relevant articles were searched for additional references. Patients with computer vision syndrome present to a variety of different specialists, including general practitioners, neurologists, stroke physicians and ophthalmologists. While the condition is common, awareness of it is poor among the public and health professionals. Recognising this condition in the clinic or in emergency settings such as the TIA clinic is crucial. The implications are potentially huge in view of the extensive and widespread use of computers and visual display units. Greater public awareness of computer vision syndrome and education of health professionals are vital. Preventive strategies should routinely form part of workplace ergonomics. Prompt and correct recognition is important to allow management and avoid unnecessary treatments. © 2017 John Wiley & Sons Ltd.

  8. SSVEP recognition using common feature analysis in brain-computer interface.

    Science.gov (United States)

    Zhang, Yu; Zhou, Guoxu; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej

    2015-04-15

    Canonical correlation analysis (CCA) has been successfully applied to steady-state visual evoked potential (SSVEP) recognition for brain-computer interface (BCI) applications. Although the CCA method outperforms traditional power spectral density analysis through multi-channel detection, it additionally requires pre-constructed reference signals of sine-cosine waves. It is likely to encounter overfitting when using a short time window, since the reference signals include no features from the training data. We consider that a group of electroencephalogram (EEG) data trials recorded at a certain stimulus frequency on the same subject should share some common features that bear the real SSVEP characteristics. This study therefore proposes a common feature analysis (CFA)-based method to exploit the latent common features as natural reference signals when using correlation analysis for SSVEP recognition. Good performance of the CFA method for SSVEP recognition is validated with EEG data recorded from ten healthy subjects, in contrast to CCA and a multiway extension of CCA (MCCA). Experimental results indicate that the CFA method significantly outperformed the CCA and MCCA methods for SSVEP recognition using a short time window (i.e., less than 1 s). The superiority of the proposed CFA method suggests it is promising for the development of a real-time SSVEP-based BCI. Copyright © 2014 Elsevier B.V. All rights reserved.
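    For context, the standard CCA baseline that CFA is compared against detects the SSVEP frequency by correlating the multi-channel EEG with sine-cosine reference signals at each candidate frequency. The sketch below runs that baseline on synthetic data; the sampling rate, channel count and noise level are arbitrary choices.

```python
# Standard CCA baseline for SSVEP frequency detection on synthetic data;
# sampling rate, channels and noise level are arbitrary choices.
import numpy as np
from sklearn.cross_decomposition import CCA

def cca_corr(X, Y):
    """First canonical correlation between X and Y (samples x features)."""
    u, v = CCA(n_components=1).fit_transform(X, Y)
    return np.corrcoef(u[:, 0], v[:, 0])[0, 1]

def detect_frequency(eeg, fs, candidates, n_harmonics=2):
    t = np.arange(eeg.shape[0]) / fs
    scores = {f: cca_corr(eeg, np.column_stack(
                  [fn(2 * np.pi * (h + 1) * f * t)
                   for h in range(n_harmonics) for fn in (np.sin, np.cos)]))
              for f in candidates}
    return max(scores, key=scores.get)

fs = 250
t = np.arange(fs) / fs                       # 1 second of data
rng = np.random.default_rng(0)
eeg = np.column_stack([np.sin(2 * np.pi * 12 * t + p) for p in (0.0, 0.4, 0.9)])
eeg += 0.5 * rng.standard_normal(eeg.shape)  # 3-channel 12 Hz response + noise
print(detect_frequency(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # expected: 12.0
```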

  9. NEGOTIATING COMMON GROUND IN COMPUTER-MEDIATED VERSUS FACE-TO-FACE DISCUSSIONS

    Directory of Open Access Journals (Sweden)

    Ilona Vandergriff

    2006-01-01

    To explore the impact of the communication medium on building common ground, this article presents research comparing learner use of reception strategies in traditional face-to-face (FTF) and in synchronous computer-mediated communication (CMC). Reception strategies, such as reprises, hypothesis testing and forward inferencing, provide evidence of comprehension and thus serve to establish common ground among participants. A number of factors, including communicative purpose or medium, are hypothesized to affect the use of such strategies (Clark & Brennan, 1991). In the data analysis, I (1) identify specific types of reception strategies, (2) compare their relative frequencies by communication medium, by task, and by learner, and (3) describe how these reception strategies function in the discussions. The findings of the quantitative analysis show that the medium alone seems to have little impact on grounding as indicated by use of reception strategies. The qualitative analysis provides evidence that participants adapted the strategies to the goals of the communicative interaction, as they used them primarily to negotiate and update common ground on their collaborative activity rather than to compensate for L2 deficiencies.

  10. A common currency for the computation of motivational values in the human striatum

    Science.gov (United States)

    Li, Yansong; Dreher, Jean-Claude

    2015-01-01

    Reward comparison in the brain is thought to be achieved through the use of a ‘common currency’, implying that reward value representations are computed on a unique scale in the same brain regions regardless of the reward type. Although such a mechanism has been identified in the ventro-medial prefrontal cortex and ventral striatum in the context of decision-making, it is less clear whether it similarly applies to non-choice situations. To answer this question, we scanned 38 participants with fMRI while they were presented with single cues predicting either monetary or erotic rewards, without the need to make a decision. The ventral striatum was the main brain structure to respond to both cues while showing increasing activity with increasing expected reward intensity. Most importantly, the relative response of the striatum to monetary vs erotic cues was correlated with the relative motivational value of these rewards as inferred from reaction times. Similar correlations were observed in a fronto-parietal network known to be involved in attentional focus and motor readiness. Together, our results suggest that striatal reward value signals not only obey a common currency mechanism in the absence of choice but may also serve as an input to adjust motivated behaviour accordingly. PMID:24837478

  11. Transfer Kernel Common Spatial Patterns for Motor Imagery Brain-Computer Interface Classification

    Science.gov (United States)

    Dai, Mengxi; Liu, Shucong; Zhang, Pengju

    2018-01-01

    Motor-imagery-based brain-computer interfaces (BCIs) commonly use the common spatial pattern (CSP) method as a preprocessing step before classification. CSP is a supervised algorithm and therefore needs a large amount of time-consuming training data to build a model. To address this issue, one promising approach is transfer learning, in which a learning model can extract discriminative information from other subjects for the target classification task. To this end, we propose a transfer kernel CSP (TKCSP) approach that learns a domain-invariant kernel by directly matching the distributions of source subjects and target subjects. Dataset IVa of BCI Competition III is used to demonstrate the validity of the proposed method. In the experiment, we compare the classification performance of TKCSP against CSP, CSP for subject-to-subject transfer (CSP SJ-to-SJ), regularized CSP (RCSP), stationary subspace CSP (ssCSP), multitask CSP (mtCSP), and the combined mtCSP and ssCSP (ss + mtCSP) method. The results indicate that TKCSP achieves a superior mean classification performance of 81.14%, especially in the case of source subjects with a small number of training samples. Comprehensive experimental evidence on the dataset verifies the effectiveness and efficiency of the proposed TKCSP approach over several state-of-the-art methods. PMID:29743934
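    The CSP step that TKCSP builds on can be written as a generalized eigenvalue problem on the two class covariance matrices. The sketch below shows plain CSP plus the usual log-variance features on synthetic trials; it is not the TKCSP transfer-learning method itself, and the array shapes and class structure are made up.

```python
# Plain CSP as a generalized eigenvalue problem on the two class covariances,
# plus the usual log-variance features; synthetic trials, not TKCSP itself.
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_filters=2):
    """trials_*: arrays of shape (n_trials, n_channels, n_samples)."""
    Ca = np.mean([np.cov(tr) for tr in trials_a], axis=0)
    Cb = np.mean([np.cov(tr) for tr in trials_b], axis=0)
    _, vecs = eigh(Ca, Ca + Cb)              # eigenvalues sorted ascending
    half = n_filters // 2                    # extreme eigenvectors discriminate best
    return vecs[:, list(range(half)) + list(range(-half, 0))].T

def log_variance_features(trial, W):
    return np.log(np.var(W @ trial, axis=1))

rng = np.random.default_rng(1)
scale_a = np.array([3.0, 1.0, 1.0, 1.0])[:, None]   # class A: channel 0 strong
scale_b = np.array([1.0, 1.0, 1.0, 3.0])[:, None]   # class B: channel 3 strong
trials_a = rng.standard_normal((20, 4, 200)) * scale_a
trials_b = rng.standard_normal((20, 4, 200)) * scale_b
W = csp_filters(trials_a, trials_b)
print(log_variance_features(trials_a[0], W).round(2))  # features separate the classes
print(log_variance_features(trials_b[0], W).round(2))
```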

  12. Theoretical restrictions on longest implicit time scales in Markov state models of biomolecular dynamics

    Science.gov (United States)

    Sinitskiy, Anton V.; Pande, Vijay S.

    2018-01-01

    Markov state models (MSMs) have been widely used to analyze computer simulations of various biomolecular systems. They can capture conformational transitions much slower than an average or maximal length of a single molecular dynamics (MD) trajectory from the set of trajectories used to build the MSM. A rule of thumb claiming that the slowest implicit time scale captured by an MSM should be comparable by the order of magnitude to the aggregate duration of all MD trajectories used to build this MSM has been known in the field. However, this rule has never been formally proved. In this work, we present analytical results for the slowest time scale in several types of MSMs, supporting the above rule. We conclude that the slowest implicit time scale equals the product of the aggregate sampling and four factors that quantify: (1) how much statistics on the conformational transitions corresponding to the longest implicit time scale is available, (2) how good the sampling of the destination Markov state is, (3) the gain in statistics from using a sliding window for counting transitions between Markov states, and (4) a bias in the estimate of the implicit time scale arising from finite sampling of the conformational transitions. We demonstrate that in many practically important cases all these four factors are on the order of unity, and we analyze possible scenarios that could lead to their significant deviation from unity. Overall, we provide for the first time analytical results on the slowest time scales captured by MSMs. These results can guide further practical applications of MSMs to biomolecular dynamics and allow for higher computational efficiency of simulations.
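    The implied (implicit) time scales discussed above follow directly from the eigenvalues of the MSM transition matrix at lag time τ, via t_i = −τ/ln λ_i. A minimal sketch for a toy 3-state model follows; the paper's analytical bounds on how large these time scales can be given finite sampling are not reproduced here.

```python
# Implied time scales from the eigenvalues of an MSM transition matrix:
# t_i = -tau / ln(lambda_i). Toy 3-state model; lag time tau is arbitrary.
import numpy as np

def implied_timescales(T, tau):
    """Implied time scales of a row-stochastic transition matrix T at lag tau."""
    lams = np.sort(np.linalg.eigvals(T).real)[::-1]
    return np.array([-tau / np.log(l) for l in lams[1:] if 0.0 < l < 1.0])

T = np.array([[0.98, 0.02, 0.00],
              [0.05, 0.90, 0.05],
              [0.00, 0.02, 0.98]])
print(implied_timescales(T, tau=1.0))  # slowest process lives ~50 lag times
```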

  13. Isolated and unexplained dilation of the common bile duct on computed tomography scans

    Directory of Open Access Journals (Sweden)

    Naveen B. Krishna

    2012-07-01

    Isolated dilation of the common bile duct (CBD), with a normal-sized pancreatic duct and without identifiable stones or mass lesions (unexplained), is frequently encountered on computed tomography/magnetic resonance imaging. We studied the final diagnoses in these patients and tried to elucidate factors that can predict a malignant etiology. This is a retrospective analysis of a prospective database from a university-based clinical practice (2002-2008). We included 107 consecutive patients who underwent endoscopic ultrasound (EUS) for evaluation of isolated and unexplained CBD dilation noted on contrast computed tomography scans. The EUS examination was performed using a radial echoendoscope, followed by a linear echoendoscope if a focal mass lesion was identified. Fine-needle aspirates were assessed immediately by an attending cytopathologist. Main outcome measurements included (i) the prevalence of neoplasms, CBD stones and chronic pancreatitis and (ii) the performance characteristics of EUS/EUS-guided fine-needle aspiration (EUS-FNA). A malignant neoplasm was found in 16 patients (14.9% of the study subjects), all with obstructive jaundice (ObJ). Six patients had CBD stones: three with ObJ and three with abnormal liver function tests. EUS findings suggestive of chronic pancreatitis were identified in 27 patients. EUS-FNA had 97.3% accuracy (94.1% in the subset with ObJ) with a sensitivity of 81.2% and specificity of 100% for diagnosing malignancy. Presence of ObJ and older patient age were the only significant predictors of malignancy in our cohort. Among patients with isolated and unexplained dilation of the CBD, the risk of malignancy is significantly higher in older patients presenting with ObJ. EUS-FNA can diagnose malignancy in these patients with high accuracy, besides identifying other potential etiologies including missed CBD stones and chronic pancreatitis.

  14. The longest telomeres: a general signature of adult stem cell compartments

    Science.gov (United States)

    Flores, Ignacio; Canela, Andres; Vera, Elsa; Tejera, Agueda; Cotsarelis, George; Blasco, María A.

    2008-01-01

    Identification of adult stem cells and their location (niches) is of great relevance for regenerative medicine. However, stem cell niches are still poorly defined in most adult tissues. Here, we show that the longest telomeres are a general feature of adult stem cell compartments. Using confocal telomere quantitative fluorescence in situ hybridization (telomapping), we find gradients of telomere length within tissues, with the longest telomeres mapping to the known stem cell compartments. In mouse hair follicles, we show that cells with the longest telomeres map to the known stem cell compartments, colocalize with stem cell markers, and behave as stem cells upon treatment with mitogenic stimuli. Using K15-EGFP reporter mice, which mark hair follicle stem cells, we show that GFP-positive cells have the longest telomeres. The stem cell compartments in small intestine, testis, cornea, and brain of the mouse are also enriched in cells with the longest telomeres. This constitutes the description of a novel general property of adult stem cell compartments. Finally, we make the novel finding that telomeres shorten with age in different mouse stem cell compartments, which parallels a decline in stem cell functionality, suggesting that telomere loss may contribute to stem cell dysfunction with age. PMID:18283121

  15. Computational approaches for discovery of common immunomodulators in fungal infections: towards broad-spectrum immunotherapeutic interventions.

    Science.gov (United States)

    Kidane, Yared H; Lawrence, Christopher; Murali, T M

    2013-10-07

    Fungi are the second most abundant type of human pathogens. Invasive fungal pathogens are leading causes of life-threatening infections in clinical settings. Toxicity to the host and drug-resistance are two major deleterious issues associated with existing antifungal agents. Increasing a host's tolerance and/or immunity to fungal pathogens has potential to alleviate these problems. A host's tolerance may be improved by modulating the immune system such that it responds more rapidly and robustly in all facets, ranging from the recognition of pathogens to their clearance from the host. An understanding of biological processes and genes that are perturbed during attempted fungal exposure, colonization, and/or invasion will help guide the identification of endogenous immunomodulators and/or small molecules that activate host-immune responses such as specialized adjuvants. In this study, we present computational techniques and approaches using publicly available transcriptional data sets, to predict immunomodulators that may act against multiple fungal pathogens. Our study analyzed data sets derived from host cells exposed to five fungal pathogens, namely, Alternaria alternata, Aspergillus fumigatus, Candida albicans, Pneumocystis jirovecii, and Stachybotrys chartarum. We observed statistically significant associations between host responses to A. fumigatus and C. albicans. Our analysis identified biological processes that were consistently perturbed by these two pathogens. These processes contained both immune response-inducing genes such as MALT1, SERPINE1, ICAM1, and IL8, and immune response-repressing genes such as DUSP8, DUSP6, and SPRED2. We hypothesize that these genes belong to a pool of common immunomodulators that can potentially be activated or suppressed (agonized or antagonized) in order to render the host more tolerant to infections caused by A. fumigatus and C. albicans. Our computational approaches and methodologies described here can now be applied to

  16. Computational Investigation of a Boundary-Layer Ingesting Propulsion System for the Common Research Model

    Science.gov (United States)

    Blumenthal, Brennan T.; Elmiligui, Alaa; Geiselhart, Karl A.; Campbell, Richard L.; Maughmer, Mark D.; Schmitz, Sven

    2016-01-01

    The present paper examines potential propulsive and aerodynamic benefits of integrating a Boundary-Layer Ingestion (BLI) propulsion system into a typical commercial aircraft using the Common Research Model (CRM) geometry and the NASA Tetrahedral Unstructured Software System (TetrUSS). The Numerical Propulsion System Simulation (NPSS) environment is used to generate engine conditions for CFD analysis. Improvements to the BLI geometry are made using the Constrained Direct Iterative Surface Curvature (CDISC) design method. Previous studies have shown reductions of up to 25% in terms of propulsive power required for cruise for other axisymmetric geometries using the BLI concept. An analysis of engine power requirements, drag, and lift coefficients using the baseline and BLI geometries coupled with the NPSS model are shown. Potential benefits of the BLI system relating to cruise propulsive power are quantified using a power balance method, and a comparison to the baseline case is made. Iterations of the BLI geometric design are shown and any improvements between subsequent BLI designs presented. Simulations are conducted for a cruise flight condition of Mach 0.85 at an altitude of 38,500 feet and an angle of attack of 2 deg for all geometries. A comparison between available wind tunnel data, previous computational results, and the original CRM model is presented for model verification purposes along with full results for BLI power savings. Results indicate a 14.4% reduction in engine power requirements at cruise for the BLI configuration over the baseline geometry. Minor shaping of the aft portion of the fuselage using CDISC has been shown to increase the benefit from Boundary-Layer Ingestion further, resulting in a 15.6% reduction in power requirements for cruise as well as a drag reduction of eighteen counts over the baseline geometry.

  17. Computational Investigation of a Boundary-Layer Ingestion Propulsion System for the Common Research Model

    Science.gov (United States)

    Blumenthal, Brennan

    2016-01-01

    This thesis will examine potential propulsive and aerodynamic benefits of integrating a boundary-layer ingestion (BLI) propulsion system with a typical commercial aircraft using the Common Research Model geometry and the NASA Tetrahedral Unstructured Software System (TetrUSS). The Numerical Propulsion System Simulation (NPSS) environment will be used to generate engine conditions for CFD analysis. Improvements to the BLI geometry will be made using the Constrained Direct Iterative Surface Curvature (CDISC) design method. Previous studies have shown reductions of up to 25% in terms of propulsive power required for cruise for other axisymmetric geometries using the BLI concept. An analysis of engine power requirements, drag, and lift coefficients using the baseline and BLI geometries coupled with the NPSS model are shown. Potential benefits of the BLI system relating to cruise propulsive power are quantified using a power balance method and a comparison to the baseline case is made. Iterations of the BLI geometric design are shown and any improvements between subsequent BLI designs presented. Simulations are conducted for a cruise flight condition of Mach 0.85 at an altitude of 38,500 feet and an angle of attack of 2deg for all geometries. A comparison between available wind tunnel data, previous computational results, and the original CRM model is presented for model verification purposes along with full results for BLI power savings. Results indicate a 14.3% reduction in engine power requirements at cruise for the BLI configuration over the baseline geometry. Minor shaping of the aft portion of the fuselage using CDISC has been shown to increase the benefit from boundary-layer ingestion further, resulting in a 15.6% reduction in power requirements for cruise as well as a drag reduction of eighteen counts over the baseline geometry.

  18. A survey of common habits of computer users as indicators of ...

    African Journals Online (AJOL)

    Yomi

    2012-01-31

    Hygiene has been recognized as an infection control strategy and the extent of the problems of environmental contamination largely depends on personal hygiene. With the development of several computer applications in recent times, the uses of computer systems have greatly expanded. And with ...

  19. Common Sense Planning for a Computer, or, What's It Worth to You?

    Science.gov (United States)

    Crawford, Walt

    1984-01-01

    Suggests factors to be considered in planning for the purchase of a microcomputer, including budgets, benefits, costs, and decisions. Major uses of a personal computer are described--word processing, financial analysis, file and database management, programming and computer literacy, education, entertainment, and thrill of high technology. (EJS)

  20. Consumer attitudes towards computer-assisted self-care of the common cold.

    Science.gov (United States)

    Reis, J; Wrestler, F

    1994-04-01

    Knowledge of colds and flu and attitudes towards use of computers for self-care are compared for 260 young adult users and 194 young adult non-users of computer-assisted self-care for colds and flu. Participants completed a knowledge questionnaire on colds and flu, used a computer program designed to enhance self-care for colds and flu, and then completed a questionnaire on their attitudes towards using a computer for self-care for colds and flu, the perceived importance of physician interactions, physician expertise, and patient-physician communication. Compared with users, non-users preferred personal contact with their physicians and felt that computerized health assessments would be limited in vocabulary and range of current medical information. Non-users were also more likely to agree that people could not be trusted to do an accurate computerized health assessment and that the average person was too computer illiterate to use computers for self-care.

  1. Computational Fluid Dynamics (CFD) Computations With Zonal Navier-Stokes Flow Solver (ZNSFLOW) Common High Performance Computing Scalable Software Initiative (CHSSI) Software

    National Research Council Canada - National Science Library

    Edge, Harris

    1999-01-01

    ...), computational fluid dynamics (CFD) 6 project. Under the project, a proven zonal Navier-Stokes solver was rewritten for scalable parallel performance on both shared memory and distributed memory high performance computers...

  2. Diversity Programme | Tasneem Zahra Husain presents her book “Only the Longest Threads” | 4 October

    CERN Multimedia

    Diversity Programme

    2016-01-01

    “Only the Longest Threads”, by Tasneem Zehra Husain. Tuesday 4 October 2016 - 15:30, Room Georges Charpak (Room F / 60-6-015) *Coffee will be served after the event* Tasneem Zehra Husain is a string theorist and the first Pakistani woman to obtain a PhD in this field. Husain’s first novel, “Only the Longest Threads”, reimagines the stories of great breakthroughs and discoveries in physics, from Newton’s classical mechanics to the Higgs boson, from the viewpoint of fictional characters. These tales promise to be great reads both for lay audiences and for those with a more advanced understanding of physics. Registration is now open. Please register using the following link: https://indico.cern.ch/event/562079/.

  3. Monitoring the progressive increase of the longest episode of spontaneous movements in Guinea pig fetus

    Directory of Open Access Journals (Sweden)

    Sekulić S.

    2013-01-01

    Full Text Available The aim of this work was to determine the changes in the duration of spontaneous movements in the guinea pig fetus after the appearance of its first movements. Every day from the 25th to the 35th gestation day, one fetus from each of twenty pregnant animals was examined by ultrasound. Fetal movements were observed for 5 min. The episode with the longest period of movement was taken into consideration and was recorded as: 3 s. Days 25 and 26 were characterized by episodes lasting 3 s (χ2 = 140.51, p < 0.05). Tracking the dynamics of progressive increases in the longest episode of spontaneous movement could be a useful factor in estimating the maturity and condition of a fetus. [Projekat Ministarstva nauke Republike Srbije, br. 175006/2011]

  4. Using a Cloud-Based Computing Environment to Support Teacher Training on Common Core Implementation

    Science.gov (United States)

    Robertson, Cory

    2013-01-01

    A cloud-based computing environment, Google Apps for Education (GAFE), has provided the Anaheim City School District (ACSD) a comprehensive and collaborative avenue for creating, sharing, and editing documents, calendars, and social networking communities. With this environment, teachers and district staff at ACSD are able to utilize the deep…

  5. A survey of common habits of computer users as indicators of ...

    African Journals Online (AJOL)

    Other unhealthy practices found among computer users included eating (52.1%), drinking (56%), coughing, sneezing and scratching of the head (48.2%). Since microorganisms can be transferred through contact, droplets or airborne routes, it follows that these habits exhibited by users may act as sources of bacteria on keyboards ...

  6. KEPLER-1647B: THE LARGEST AND LONGEST-PERIOD KEPLER TRANSITING CIRCUMBINARY PLANET

    Energy Technology Data Exchange (ETDEWEB)

    Kostov, Veselin B. [NASA Goddard Space Flight Center, Mail Code 665, Greenbelt, MD 20771 (United States); Orosz, Jerome A.; Welsh, William F.; Short, Donald R. [Department of Astronomy, San Diego State University, 5500 Campanile Drive, San Diego, CA 92182 (United States); Doyle, Laurance R. [SETI Institute, 189 Bernardo Avenue, Mountain View, CA 94043 (United States); Principia College, IMoP, One Maybeck Place, Elsah, IL 62028 (United States); Fabrycky, Daniel C. [Department of Astronomy and Astrophysics, University of Chicago, 5640 South Ellis Avenue, Chicago, IL 60637 (United States); Haghighipour, Nader [Institute for Astronomy, University of Hawaii-Manoa, Honolulu, HI 96822 (United States); Quarles, Billy [Department of Physics and Physical Science, The University of Nebraska at Kearney, Kearney, NE 68849 (United States); Cochran, William D.; Endl, Michael [McDonald Observatory, The University of Texas at Austin, Austin, TX 78712-0259 (United States); Ford, Eric B. [Department of Astronomy and Astrophysics, The Pennsylvania State University, 428A Davey Lab, University Park, PA 16802 (United States); Gregorio, Joao [Atalaia Group and Crow-Observatory, Portalegre (Portugal); Hinse, Tobias C. [Korea Astronomy and Space Science Institute (KASI), Advanced Astronomy and Space Science Division, Daejeon 305-348 (Korea, Republic of); Isaacson, Howard [Department of Astronomy, University of California Berkeley, 501 Campbell Hall, Berkeley, CA 94720 (United States); Jenkins, Jon M. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); Jensen, Eric L. N. [Department of Physics and Astronomy, Swarthmore College, Swarthmore, PA 19081 (United States); Kane, Stephen [Department of Physics and Astronomy, San Francisco State University, 1600 Holloway Avenue, San Francisco, CA 94132 (United States); Kull, Ilya, E-mail: veselin.b.kostov@nasa.gov [Department of Astronomy and Astrophysics, Tel Aviv University, 69978 Tel Aviv (Israel); and others

    2016-08-10

    We report the discovery of a new Kepler transiting circumbinary planet (CBP). This latest addition to the still-small family of CBPs defies the current trend of known short-period planets orbiting near the stability limit of binary stars. Unlike the previous discoveries, the planet revolving around the eclipsing binary system Kepler-1647 has a very long orbital period (∼1100 days) and was at conjunction only twice during the Kepler mission lifetime. Due to the singular configuration of the system, Kepler-1647b is not only the longest-period transiting CBP at the time of writing, but also one of the longest-period transiting planets. With a radius of 1.06 ± 0.01 R_Jup, it is also the largest CBP to date. The planet produced three transits in the light curve of Kepler-1647 (one of them during an eclipse, creating a syzygy) and measurably perturbed the times of the stellar eclipses, allowing us to measure its mass, 1.52 ± 0.65 M_Jup. The planet revolves around an 11-day period eclipsing binary consisting of two solar-mass stars on a slightly inclined, mildly eccentric (e_bin = 0.16), spin-synchronized orbit. Despite having an orbital period three times longer than Earth’s, Kepler-1647b is in the conservative habitable zone of the binary star throughout its orbit.

  7. Empirical scaling of the length of the longest increasing subsequences of random walks

    Science.gov (United States)

    Mendonça, J. Ricardo G.

    2017-02-01

    We provide Monte Carlo estimates of the scaling of the length L_n of the longest increasing subsequences of n-step random walks for several different distributions of step lengths, short and heavy-tailed. Our simulations indicate that, barring possible logarithmic corrections, L_n ∼ n^θ with the leading scaling exponent 0.60 ≲ θ ≲ 0.69 for the heavy-tailed distributions of step lengths examined, with values increasing as the distribution becomes more heavy-tailed, and θ ≃ 0.57 for distributions of finite variance, irrespective of the particular distribution. The results are consistent with existing rigorous bounds for θ, although in a somewhat surprising manner. For random walks with step lengths of finite variance, we conjecture that the correct asymptotic behavior of L_n is given by √n ln n, and we also propose a form for the subleading asymptotics. The distribution of L_n was found to follow a simple scaling form with scaling functions that vary with θ; accordingly, when the step lengths are of finite variance, the scaling functions seem to be universal. The nature of this scaling remains unclear, since we lack a working model, microscopic or hydrodynamic, for the behavior of the length of the longest increasing subsequences of random walks.
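
    To make the quantity being measured concrete, here is a small Python sketch (not the paper's code) that estimates the mean LIS length of Gaussian-step random walks by Monte Carlo, using O(n log n) patience sorting, and extracts a crude effective scaling exponent from a log-log slope. The sample sizes and the choice of a finite-variance step distribution are arbitrary illustrations.

```python
# Monte Carlo estimate of the mean length L_n of the longest increasing
# subsequence (LIS) of an n-step random walk with Gaussian step lengths.
import bisect
import math
import random

def lis_length(seq):
    """Length of the longest strictly increasing subsequence (patience sorting)."""
    tails = []  # tails[k] = smallest possible tail of an increasing subsequence of length k+1
    for x in seq:
        i = bisect.bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)
        else:
            tails[i] = x
    return len(tails)

def random_walk(n):
    """Positions of an n-step random walk with unit-variance Gaussian steps."""
    pos, walk = 0.0, []
    for _ in range(n):
        pos += random.gauss(0.0, 1.0)
        walk.append(pos)
    return walk

if __name__ == "__main__":
    random.seed(1)
    sizes, means = [200, 400, 800, 1600], []
    for n in sizes:
        samples = [lis_length(random_walk(n)) for _ in range(200)]
        means.append(sum(samples) / len(samples))
        print(f"n={n:5d}  mean L_n ~ {means[-1]:.1f}")
    # Crude effective exponent from the two largest sizes (log-log slope).
    theta = math.log(means[-1] / means[-2]) / math.log(sizes[-1] / sizes[-2])
    print(f"effective scaling exponent ~ {theta:.2f}")
```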

  8. KEPLER-1647B: THE LARGEST AND LONGEST-PERIOD KEPLER TRANSITING CIRCUMBINARY PLANET

    International Nuclear Information System (INIS)

    Kostov, Veselin B.; Orosz, Jerome A.; Welsh, William F.; Short, Donald R.; Doyle, Laurance R.; Fabrycky, Daniel C.; Haghighipour, Nader; Quarles, Billy; Cochran, William D.; Endl, Michael; Ford, Eric B.; Gregorio, Joao; Hinse, Tobias C.; Isaacson, Howard; Jenkins, Jon M.; Jensen, Eric L. N.; Kane, Stephen; Kull, Ilya

    2016-01-01

    We report the discovery of a new Kepler transiting circumbinary planet (CBP). This latest addition to the still-small family of CBPs defies the current trend of known short-period planets orbiting near the stability limit of binary stars. Unlike the previous discoveries, the planet revolving around the eclipsing binary system Kepler-1647 has a very long orbital period (∼1100 days) and was at conjunction only twice during the Kepler mission lifetime. Due to the singular configuration of the system, Kepler-1647b is not only the longest-period transiting CBP at the time of writing, but also one of the longest-period transiting planets. With a radius of 1.06 ± 0.01 R Jup , it is also the largest CBP to date. The planet produced three transits in the light curve of Kepler-1647 (one of them during an eclipse, creating a syzygy) and measurably perturbed the times of the stellar eclipses, allowing us to measure its mass, 1.52 ± 0.65 M Jup . The planet revolves around an 11-day period eclipsing binary consisting of two solar-mass stars on a slightly inclined, mildly eccentric ( e bin = 0.16), spin-synchronized orbit. Despite having an orbital period three times longer than Earth’s, Kepler-1647b is in the conservative habitable zone of the binary star throughout its orbit.

  9. Single-trial detection of visual evoked potentials by common spatial patterns and wavelet filtering for brain-computer interface.

    Science.gov (United States)

    Tu, Yiheng; Huang, Gan; Hung, Yeung Sam; Hu, Li; Hu, Yong; Zhang, Zhiguo

    2013-01-01

    Event-related potentials (ERPs) are widely used in brain-computer interface (BCI) systems as input signals conveying a subject's intention. A fast and reliable single-trial ERP detection method can be used to develop a BCI system with both high speed and high accuracy. However, most single-trial ERP detection methods are developed for offline EEG analysis and thus have a high computational complexity and require manual operations. Therefore, they are not applicable to practical BCI systems, which require a low-complexity and automatic ERP detection method. This work presents a joint spatial-time-frequency filter that combines common spatial patterns (CSP) and wavelet filtering (WF) to improve the signal-to-noise ratio (SNR) of visual evoked potentials (VEPs), which can lead to a single-trial ERP-based BCI.

  10. Detection of common bile duct stones: comparison between endoscopic ultrasonography, magnetic resonance cholangiography, and helical-computed-tomographic cholangiography

    International Nuclear Information System (INIS)

    Kondo, Shintaro; Isayama, Hiroyuki; Akahane, Masaaki; Toda, Nobuo; Sasahira, Naoki; Nakai, Yosuke; Yamamoto, Natsuyo; Hirano, Kenji; Komatsu, Yutaka; Tada, Minoru; Yoshida, Haruhiko; Kawabe, Takao; Ohtomo, Kuni; Omata, Masao

    2005-01-01

    Objectives: New modalities, namely endoscopic ultrasonography (EUS), magnetic resonance cholangiopancreatography (MRCP), and helical computed-tomographic cholangiography (HCT-C), have been introduced recently for the detection of common bile duct (CBD) stones and have shown improved detectability compared to conventional ultrasound or computed tomography. We conducted this study to compare the diagnostic ability of EUS, MRCP, and HCT-C in patients with suspected choledocholithiasis. Methods: Twenty-eight patients clinically suspected of having CBD stones were enrolled, excluding those with cholangitis or a definite history of choledocholithiasis. Each patient underwent EUS, MRCP, and HCT-C prior to endoscopic retrograde cholangiopancreatography (ERCP), the result of which served as the diagnostic gold standard. Results: CBD stones were detected in 24 (86%) of 28 patients by ERCP/IDUS. The sensitivity of EUS, MRCP, and HCT-C was 100%, 88%, and 88%, respectively. False-negative cases for MRCP and HCT-C had a CBD stone smaller than 5 mm in diameter. No serious complications occurred, although one patient complained of itching of the eyelids after infusion of the contrast agent for HCT-C. Conclusions: When the examination can be scheduled, MRCP or HCT-C will be the first choice because they are less invasive than EUS. MRCP and HCT-C had similar detectability, but the former may be preferable considering the possibility of an allergic reaction with the latter. When MRCP is negative, EUS is recommended to check for small CBD stones.

  11. Diagnostic reference levels for common computed tomography (CT) examinations: results from the first Nigerian nationwide dose survey.

    Science.gov (United States)

    Ekpo, Ernest U; Adejoh, Thomas; Akwo, Judith D; Emeka, Owujekwe C; Modu, Ali A; Abba, Mohammed; Adesina, Kudirat A; Omiyi, David O; Chiegwu, Uche H

    2018-01-29

    To explore doses from common adult computed tomography (CT) examinations and propose national diagnostic reference levels (nDRLs) for Nigeria. This retrospective study was approved by the Nnamdi Azikiwe University and University Teaching Hospital Institutional Review Boards (IRB: NAUTH/CS/66/Vol8/84) and involved dose surveys of adult CT examinations across the six geographical regions of Nigeria and Abuja from January 2016 to August 2017. Dose data of adult head, chest and abdomen/pelvis CT examinations were extracted from patient folders. The median, 75th and 25th percentile CT dose index volume (CTDIvol) and dose-length product (DLP) were computed for each of these procedures. Effective doses (E) for these examinations were estimated using the k conversion factor as described in ICRP Publication 103 (E = k × DLP). The proposed 75th percentile CTDIvol values for head, chest, and abdomen/pelvis are 61 mGy, 17 mGy, and 20 mGy, respectively. The corresponding DLPs are 1310 mGy.cm, 735 mGy.cm, and 1486 mGy.cm, respectively. The effective doses were 2.75 mSv (head), 10.29 mSv (chest), and 22.29 mSv (abdomen/pelvis). Findings demonstrate wide dose variations within and across centres in Nigeria. The results also show CTDIvol values comparable to international standards, but considerably higher DLP and effective doses.
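
    The effective-dose estimates above are simple products of the DLP and a body-region conversion coefficient. A minimal Python sketch of that calculation follows; the k-factors are the commonly tabulated adult coefficients that reproduce the doses quoted in the abstract, but they are my assumption rather than values stated in the record.

```python
# Effective dose from dose-length product: E = k * DLP,
# with region-specific conversion coefficients k in mSv per mGy.cm.
# The coefficients below are assumed standard adult values.

K_FACTORS = {
    "head": 0.0021,
    "chest": 0.014,
    "abdomen_pelvis": 0.015,
}

def effective_dose(region: str, dlp_mgy_cm: float) -> float:
    """Effective dose in mSv for a given region and DLP (mGy.cm)."""
    return K_FACTORS[region] * dlp_mgy_cm

if __name__ == "__main__":
    # 75th percentile DLPs proposed as national reference levels in the abstract.
    for region, dlp in [("head", 1310), ("chest", 735), ("abdomen_pelvis", 1486)]:
        print(f"{region:15s} DLP={dlp:5d} mGy.cm -> E = {effective_dose(region, dlp):.2f} mSv")
```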

  12. Osteoid osteomas in common and in technically challenging locations treated with computed tomography-guided percutaneous radiofrequency ablation

    International Nuclear Information System (INIS)

    Mylona, Sophia; Patsoura, Sofia; Karapostolakis, Georgios; Galani, Panagiota; Pomoni, Anastasia; Thanos, Loukas

    2010-01-01

    To evaluate the efficacy of computed tomography (CT)-guided radiofrequency (RF) ablation for the treatment of osteoid osteomas in common and in technically challenging locations. Twenty-three patients with osteoid osteomas in common (nine cases) and technically challenging [14 cases: intra-articular (n = 7), spinal (n = 5), metaphyseal (n = 2)] positions were treated with CT-guided RF ablation. Therapy was performed under conscious sedation with a seven-array expandable RF electrode for 8-10 min at 80-110 °C and a power of 90-110 W. The patients were sent home with instructions. A brief pain inventory (BPI) score was calculated before and after (1 day, 4 weeks, 6 months and 1 year) treatment. All procedures were technically successful. Primary clinical success was 91.3% (21 of 23 patients), despite the lesions' locations. The BPI score was dramatically reduced after the procedure, and the decrease in BPI score was significant (P < 0.001, paired t-test; n - 1 = 22) for all periods during follow-up. Two patients had persistent pain after 1 month and were treated successfully with a second procedure (secondary success rate 100%). No immediate or delayed complications were observed. CT-guided RF ablation is safe and highly effective for the treatment of osteoid osteomas, even in technically difficult positions. (orig.)

  13. QCI Common

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

    There are many common software patterns and utilities for the ORNL Quantum Computing Institute that can and should be shared across projects. Otherwise we find duplication of code, which adds unwanted complexity. This software product seeks to alleviate this by providing common utilities such as object factories, graph data structures, parameter input mechanisms, etc., for other software products within the ORNL Quantum Computing Institute. This work enables pure basic research, has no export-controlled utilities, and has no real commercial value.

  14. Using the longest significance run to estimate region-specific p-values in genetic association mapping studies

    Directory of Open Access Journals (Sweden)

    Yang Hsin-Chou

    2008-05-01

    Full Text Available Abstract Background Association testing is a powerful tool for identifying disease susceptibility genes underlying complex diseases. Technological advances have yielded a dramatic increase in the density of available genetic markers, necessitating an increase in the number of association tests required for the analysis of disease susceptibility genes. As such, multiple-test corrections have become a critical issue. However, the conventional statistical corrections on locus-specific multiple tests usually result in lower power as the number of markers increases. Alternatively, we propose here the application of the longest significant run (LSR) method to estimate a region-specific p-value and provide an index for the most likely candidate region. Results An advantage of the LSR method relative to procedures based on genotypic data is that only p-value data are needed, and hence it can be applied extensively to different study designs. In this study the proposed LSR method was compared with commonly used methods such as Bonferroni's method and the FDR-controlling method. We found that while all methods provide good control of the false positive rate, the LSR method has much better power and a lower false discovery rate. In the analysis of real psoriasis and asthma disease data, the LSR method successfully identified important candidate regions and replicated the results of previous association studies. Conclusion The proposed LSR method provides an efficient exploratory tool for the analysis of sequences of dense genetic markers. Our results show that the LSR method has better power and a lower false discovery rate compared with the locus-specific multiple tests.
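
    The core of the LSR idea — scanning ordered marker p-values for the longest run of consecutive significant results and judging that run against a null reference — can be sketched in Python as below. This is not the authors' implementation; the significance threshold, the permutation-based null and the toy data are assumptions for illustration only.

```python
# Toy sketch of a longest-significant-run scan over ordered marker p-values.
import random

def longest_run(pvalues, alpha=0.05):
    """Return (length, start_index) of the longest run of p-values below alpha."""
    best_len, best_start, run_len = 0, -1, 0
    for i, p in enumerate(pvalues):
        run_len = run_len + 1 if p < alpha else 0
        if run_len > best_len:
            best_len, best_start = run_len, i - run_len + 1
    return best_len, best_start

def region_pvalue(pvalues, alpha=0.05, n_perm=2000, seed=0):
    """Permutation-based p-value for the observed longest significant run."""
    rng = random.Random(seed)
    observed, _ = longest_run(pvalues, alpha)
    shuffled = list(pvalues)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if longest_run(shuffled, alpha)[0] >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

if __name__ == "__main__":
    # Toy example: a block of associated markers embedded in noise.
    rng = random.Random(42)
    pvals = ([rng.random() for _ in range(80)]
             + [rng.random() * 0.01 for _ in range(8)]
             + [rng.random() for _ in range(80)])
    length, start = longest_run(pvals)
    print(f"longest significant run: length {length} starting at marker {start}")
    print(f"region-specific permutation p-value: {region_pvalue(pvals):.4f}")
```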

  15. The transcriptome of the bowhead whale Balaena mysticetus reveals adaptations of the longest-lived mammal

    Science.gov (United States)

    Seim, Inge; Ma, Siming; Zhou, Xuming; Gerashchenko, Maxim V.; Lee, Sang-Goo; Suydam, Robert; George, John C.; Bickham, John W.; Gladyshev, Vadim N.

    2014-01-01

    Mammals vary dramatically in lifespan, by at least two orders of magnitude, but the molecular basis for this difference remains largely unknown. The bowhead whale Balaena mysticetus is the longest-lived mammal known, with an estimated maximal lifespan in excess of two hundred years. It is also one of the two largest animals and the most cold-adapted baleen whale species. Here, we report the first genome-wide gene expression analyses of the bowhead whale, based on the de novo assembly of its transcriptome. Bowhead whale- or cetacean-specific changes in gene expression were identified in the liver, kidney and heart, and complemented with analyses of positively selected genes. Changes associated with altered insulin signaling and other gene expression patterns could help explain the remarkable longevity of bowhead whales as well as their adaptation to a lipid-rich diet. The data also reveal parallels in candidate longevity adaptations of the bowhead whale, naked mole rat and Brandt's bat. The bowhead whale transcriptome is a valuable resource for the study of this remarkable animal, including the evolution of longevity and its important correlates such as resistance to cancer and other diseases. PMID:25411232

  16. Lithospheric controls on magma composition along Earth's longest continental hotspot track.

    Science.gov (United States)

    Davies, D R; Rawlinson, N; Iaffaldano, G; Campbell, I H

    2015-09-24

    Hotspots are anomalous regions of volcanism at Earth's surface that show no obvious association with tectonic plate boundaries. Classic examples include the Hawaiian-Emperor chain and the Yellowstone-Snake River Plain province. The majority are believed to form as Earth's tectonic plates move over long-lived mantle plumes: buoyant upwellings that bring hot material from Earth's deep mantle to its surface. It has long been recognized that lithospheric thickness limits the rise height of plumes and, thereby, their minimum melting pressure. It should, therefore, have a controlling influence on the geochemistry of plume-related magmas, although unambiguous evidence of this has, so far, been lacking. Here we integrate observational constraints from surface geology, geochronology, plate-motion reconstructions, geochemistry and seismology to ascertain plume melting depths beneath Earth's longest continental hotspot track, a 2,000-kilometre-long track in eastern Australia that displays a record of volcanic activity between 33 and 9 million years ago, which we call the Cosgrove track. Our analyses highlight a strong correlation between lithospheric thickness and magma composition along this track, with: (1) standard basaltic compositions in regions where lithospheric thickness is less than 110 kilometres; (2) volcanic gaps in regions where lithospheric thickness exceeds 150 kilometres; and (3) low-volume, leucitite-bearing volcanism in regions of intermediate lithospheric thickness. Trace-element concentrations from samples along this track support the notion that these compositional variations result from different degrees of partial melting, which is controlled by the thickness of overlying lithosphere. Our results place the first observational constraints on the sub-continental melting depth of mantle plumes and provide direct evidence that lithospheric thickness has a dominant influence on the volume and chemical composition of plume-derived magmas.

  17. Emphysema Is Common in Lungs of Cystic Fibrosis Lung Transplantation Patients: A Histopathological and Computed Tomography Study.

    Directory of Open Access Journals (Sweden)

    Onno M Mets

    Full Text Available Lung disease in cystic fibrosis (CF) involves excessive inflammation, repetitive infections and development of bronchiectasis. Recently, literature on emphysema in CF has emerged, which might become an increasingly important disease component due to the increased life expectancy. The purpose of this study was to assess the presence and extent of emphysema in end-stage CF lungs. In explanted lungs of 20 CF patients, emphysema was semi-quantitatively assessed on histology specimens. Also, emphysema was automatically quantified on pre-transplantation computed tomography (CT) using the percentage of voxels below -950 Hounsfield units and was visually scored on CT. The relation between emphysema extent, pre-transplantation lung function and age was determined. All CF patients showed emphysema on histological examination: 3/20 (15%) showed mild, 15/20 (75%) moderate and 2/20 (10%) severe emphysema, defined as 0-20%, 20-50% and >50% emphysema in residual lung tissue, respectively. Visually, upper lobe bullous emphysema was identified in 13/20 and more diffuse non-bullous emphysema in 18/20. Histology showed a significant correlation to quantified CT emphysema (p = 0.03) and the visual emphysema score (p = 0.001). CT and visual emphysema extent were positively correlated with age (p = 0.045 and p = 0.04, respectively). In conclusion, this study both pathologically and radiologically confirms that emphysema is common in end-stage CF lungs, and is age related. Emphysema might become an increasingly important disease component in the aging CF population.
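
    The automated CT quantification described above reduces to counting the fraction of lung voxels below a fixed attenuation threshold. A minimal Python sketch of that emphysema index is shown here, applied to a synthetic volume; a real pipeline would first segment the lungs from the scan, and the numbers are purely illustrative.

```python
# Emphysema index: percentage of lung voxels below -950 Hounsfield units.
import numpy as np

def emphysema_index(hu_volume: np.ndarray, lung_mask: np.ndarray,
                    threshold: float = -950.0) -> float:
    """Percentage of masked (lung) voxels with attenuation below the threshold."""
    lung_voxels = hu_volume[lung_mask]
    return 100.0 * np.mean(lung_voxels < threshold)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "lung" attenuation values centred around -870 HU.
    volume = rng.normal(-870, 60, size=(64, 64, 64))
    mask = np.ones_like(volume, dtype=bool)   # stand-in for a lung segmentation
    print(f"emphysema index: {emphysema_index(volume, mask):.1f}% of voxels < -950 HU")
```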

  18. Common-mask guided image reconstruction (c-MGIR) for enhanced 4D cone-beam computed tomography

    International Nuclear Information System (INIS)

    Park, Justin C; Li, Jonathan G; Liu, Chihray; Lu, Bo; Zhang, Hao; Chen, Yunmei; Fan, Qiyong

    2015-01-01

    Compared to 3D cone beam computed tomography (3D CBCT), the image quality of commercially available four-dimensional (4D) CBCT is severely impaired due to the insufficient amount of projection data available for each phase. Since the traditional Feldkamp-Davis-Kress (FDK)-based algorithm is infeasible for reconstructing high quality 4D CBCT images with limited projections, investigators had developed several compress-sensing (CS) based algorithms to improve image quality. The aim of this study is to develop a novel algorithm which can provide better image quality than the FDK and other CS based algorithms with limited projections. We named this algorithm ‘the common mask guided image reconstruction’ (c-MGIR).In c-MGIR, the unknown CBCT volume is mathematically modeled as a combination of phase-specific motion vectors and phase-independent static vectors. The common-mask matrix, which is the key concept behind the c-MGIR algorithm, separates the common static part across all phase images from the possible moving part in each phase image. The moving part and the static part of the volumes were then alternatively updated by solving two sub-minimization problems iteratively. As the novel mathematical transformation allows the static volume and moving volumes to be updated (during each iteration) with global projections and ‘well’ solved static volume respectively, the algorithm was able to reduce the noise and under-sampling artifact (an issue faced by other algorithms) to the maximum extent. To evaluate the performance of our proposed c-MGIR, we utilized imaging data from both numerical phantoms and a lung cancer patient. The qualities of the images reconstructed with c-MGIR were compared with (1) standard FDK algorithm, (2) conventional total variation (CTV) based algorithm, (3) prior image constrained compressed sensing (PICCS) algorithm, and (4) motion-map constrained image reconstruction (MCIR) algorithm, respectively. To improve the efficiency of the
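
    As a toy illustration of the common-mask decomposition that c-MGIR is built around — a phase-independent static component plus sparse phase-specific motion, updated alternately — the following Python sketch works in 1D with direct (identity) measurements. It is only a conceptual illustration under those simplifying assumptions; the actual algorithm solves its two sub-minimizations against limited cone-beam projection data.

```python
# Toy alternating decomposition of phase "volumes" into a shared static part
# and sparse per-phase motion. Not the c-MGIR reconstruction itself.
import numpy as np

def decompose(phases, n_iter=50, sparsity=0.1):
    """Alternately estimate a static component and sparse per-phase motion."""
    static = np.mean(phases, axis=0)
    motion = np.zeros_like(phases)
    for _ in range(n_iter):
        # Update the static part from all phases jointly ("global" information).
        static = np.mean(phases - motion, axis=0)
        # Update each phase's moving part, soft-thresholded to keep it sparse.
        residual = phases - static
        motion = np.sign(residual) * np.maximum(np.abs(residual) - sparsity, 0.0)
    return static, motion

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    truth_static = np.linspace(0, 1, 200)
    phases = []
    for p in range(4):
        moving = np.zeros(200)
        moving[40 + 10 * p: 60 + 10 * p] = 0.8   # a feature that shifts per phase
        phases.append(truth_static + moving + rng.normal(0, 0.02, 200))
    static, motion = decompose(np.array(phases))
    print("static RMSE:", float(np.sqrt(np.mean((static - truth_static) ** 2))))
```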

  19. Common-mask guided image reconstruction (c-MGIR) for enhanced 4D cone-beam computed tomography.

    Science.gov (United States)

    Park, Justin C; Zhang, Hao; Chen, Yunmei; Fan, Qiyong; Li, Jonathan G; Liu, Chihray; Lu, Bo

    2015-12-07

    Compared to 3D cone beam computed tomography (3D CBCT), the image quality of commercially available four-dimensional (4D) CBCT is severely impaired due to the insufficient amount of projection data available for each phase. Since the traditional Feldkamp-Davis-Kress (FDK)-based algorithm is infeasible for reconstructing high quality 4D CBCT images with limited projections, investigators had developed several compress-sensing (CS) based algorithms to improve image quality. The aim of this study is to develop a novel algorithm which can provide better image quality than the FDK and other CS based algorithms with limited projections. We named this algorithm 'the common mask guided image reconstruction' (c-MGIR).In c-MGIR, the unknown CBCT volume is mathematically modeled as a combination of phase-specific motion vectors and phase-independent static vectors. The common-mask matrix, which is the key concept behind the c-MGIR algorithm, separates the common static part across all phase images from the possible moving part in each phase image. The moving part and the static part of the volumes were then alternatively updated by solving two sub-minimization problems iteratively. As the novel mathematical transformation allows the static volume and moving volumes to be updated (during each iteration) with global projections and 'well' solved static volume respectively, the algorithm was able to reduce the noise and under-sampling artifact (an issue faced by other algorithms) to the maximum extent. To evaluate the performance of our proposed c-MGIR, we utilized imaging data from both numerical phantoms and a lung cancer patient. The qualities of the images reconstructed with c-MGIR were compared with (1) standard FDK algorithm, (2) conventional total variation (CTV) based algorithm, (3) prior image constrained compressed sensing (PICCS) algorithm, and (4) motion-map constrained image reconstruction (MCIR) algorithm, respectively. To improve the efficiency of the algorithm

  20. KIC 4552982: outbursts and pulsations in the longest-ever pseudo-continuous light curve of a ZZ Ceti

    Directory of Open Access Journals (Sweden)

    Bell K. J.

    2015-01-01

    Full Text Available KIC 4552982 was the first ZZ Ceti (hydrogen-atmosphere pulsating white dwarf) identified to lie in the Kepler field, resulting in the longest pseudo-continuous light curve ever obtained for this type of variable star. In addition to the pulsations, this light curve exhibits stochastic episodes of brightness enhancement unlike any previously studied white dwarf phenomenon. We briefly highlight the basic outburst and pulsation properties in these proceedings.

  1. Longest interval between zeros of the tied-down random walk, the Brownian bridge and related renewal processes

    Science.gov (United States)

    Godrèche, Claude

    2017-05-01

    The probability distribution of the longest interval between two zeros of a simple random walk starting and ending at the origin, and of its continuum limit, the Brownian bridge, was analysed in the past by Rosén and Wendel, then extended by the latter to stable processes. We recover and extend these results using simple concepts of renewal theory, which allows to revisit past and recent works of the physics literature.

  2. Longest interval between zeros of the tied-down random walk, the Brownian bridge and related renewal processes

    International Nuclear Information System (INIS)

    Godrèche, Claude

    2017-01-01

    The probability distribution of the longest interval between two zeros of a simple random walk starting and ending at the origin, and of its continuum limit, the Brownian bridge, was analysed in the past by Rosén and Wendel, then extended by the latter to stable processes. We recover and extend these results using simple concepts of renewal theory, which allows to revisit past and recent works of the physics literature. (paper)

  3. An R package to compute commonality coefficients in the multiple regression case: an introduction to the package and a practical example.

    Science.gov (United States)

    Nimon, Kim; Lewis, Mitzi; Kane, Richard; Haynes, R Michael

    2008-05-01

    Multiple regression is a widely used technique for data analysis in social and behavioral research. The complexity of interpreting such results increases when correlated predictor variables are involved. Commonality analysis provides a method of determining the variance accounted for by respective predictor variables and is especially useful in the presence of correlated predictors. However, computing commonality coefficients is laborious. To make commonality analysis accessible to more researchers, a program was developed to automate the calculation of unique and common elements in commonality analysis, using the statistical package R. The program is described, and a heuristic example using data from the Holzinger and Swineford (1939) study, readily available in the MBESS R package, is presented.
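
    The program described in this record is written in R; as a language-agnostic illustration of what commonality analysis computes, here is a small Python sketch for the two-predictor case, decomposing the full-model R² into components unique to each predictor and a component common to both. The synthetic data and variable names are my own, not taken from the record.

```python
# Commonality analysis for two correlated predictors via R-squared differences.
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

def commonality_two_predictors(x1, x2, y):
    r2_full = r_squared(np.column_stack([x1, x2]), y)
    r2_x1 = r_squared(x1[:, None], y)
    r2_x2 = r_squared(x2[:, None], y)
    unique1 = r2_full - r2_x2                 # variance only x1 explains
    unique2 = r2_full - r2_x1                 # variance only x2 explains
    common = r2_full - unique1 - unique2      # variance shared by x1 and x2
    return {"unique_x1": unique1, "unique_x2": unique2,
            "common": common, "total": r2_full}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x1 = rng.normal(size=500)
    x2 = 0.6 * x1 + 0.8 * rng.normal(size=500)   # correlated predictors
    y = 1.0 * x1 + 0.5 * x2 + rng.normal(size=500)
    print(commonality_two_predictors(x1, x2, y))
```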

  4. Women in computer science: An interpretative phenomenological analysis exploring common factors contributing to women's selection and persistence in computer science as an academic major

    Science.gov (United States)

    Thackeray, Lynn Roy

    The purpose of this study is to understand the meaning that women make of the social and cultural factors that influence their reasons for entering and remaining in study of computer science. The twenty-first century presents many new challenges in career development and workforce choices for both men and women. Information technology has become the driving force behind many areas of the economy. As this trend continues, it has become essential that U.S. citizens need to pursue a career in technologies, including the computing sciences. Although computer science is a very lucrative profession, many Americans, especially women, are not choosing it as a profession. Recent studies have shown no significant differences in math, technical and science competency between men and women. Therefore, other factors, such as social, cultural, and environmental influences seem to affect women's decisions in choosing an area of study and career choices. A phenomenological method of qualitative research was used in this study, based on interviews of seven female students who are currently enrolled in a post-secondary computer science program. Their narratives provided meaning into the social and cultural environments that contribute to their persistence in their technical studies, as well as identifying barriers and challenges that are faced by female students who choose to study computer science. It is hoped that the data collected from this study may provide recommendations for the recruiting, retention and support for women in computer science departments of U.S. colleges and universities, and thereby increase the numbers of women computer scientists in industry. Keywords: gender access, self-efficacy, culture, stereotypes, computer education, diversity.

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  8. Personal best marathon time and longest training run, not anthropometry, predict performance in recreational 24-hour ultrarunners.

    Science.gov (United States)

    Knechtle, Beat; Knechtle, Patrizia; Rosemann, Thomas; Lepers, Romuald

    2011-08-01

    In recent studies, a relationship between running performance and both low body fat and low thicknesses of selected skinfolds has been demonstrated for distances from 100 m to the marathon, but not for ultramarathons. We investigated the association of anthropometric and training characteristics with race performance in 63 male recreational ultrarunners in a 24-hour run using bi- and multivariate analysis. The athletes achieved an average distance of 146.1 (43.1) km. In the bivariate analysis, body mass (r = -0.25), the sum of 9 skinfolds (r = -0.32), the sum of upper body skinfolds (r = -0.34), body fat percentage (r = -0.32), weekly kilometers run (r = 0.31), longest training session before the 24-hour run (r = 0.56), and personal best marathon time (r = -0.58) were related to race performance. Stepwise multiple regression showed that both the longest training session before the 24-hour run (p = 0.0013) and the personal best marathon time (p = 0.0015) had the best correlation with race performance. Performance in these 24-hour runners may be predicted (r2 = 0.46) by the following equation: (Performance in a 24-hour run, km) = 234.7 + 0.481 (longest training session before the 24-hour run, km) - 0.594 (personal best marathon time, minutes). For practical applications, training variables such as volume and intensity were associated with performance but not anthropometric variables. To achieve maximum kilometers in a 24-hour run, recreational ultrarunners should have a personal best marathon time of ∼3 hours 20 minutes and complete a long training run of ∼60 km before the race, whereas anthropometric characteristics such as low body fat or low skinfold thicknesses showed no association with performance.
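
    For convenience, the reported regression equation can be written directly as a small Python function; the coefficients are exactly those quoted in the abstract, and the example inputs follow the practical recommendation given there.

```python
# Prediction equation reported in the abstract (distances in km,
# marathon personal best in minutes).
def predicted_24h_distance_km(longest_training_run_km: float,
                              marathon_pb_minutes: float) -> float:
    return 234.7 + 0.481 * longest_training_run_km - 0.594 * marathon_pb_minutes

if __name__ == "__main__":
    # Example: a ~60 km long run and a ~3 h 20 min (200 min) marathon best.
    print(f"predicted 24-hour distance: {predicted_24h_distance_km(60, 200):.0f} km")
```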

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  10. Comparative evaluation of the cadaveric, radiographic and computed tomographic anatomy of the heads of green iguana (Iguana iguana) , common tegu ( Tupinambis merianae) and bearded dragon ( Pogona vitticeps)

    OpenAIRE

    Banzato, Tommaso; Selleri, Paolo; Veladiano, Irene A; Martin, Andrea; Zanetti, Emanuele; Zotti, Alessandro

    2012-01-01

    Abstract Background Radiology and computed tomography are the most commonly available diagnostic tools for the diagnosis of pathologies affecting the head and skull in veterinary practice. Nevertheless, accurate interpretation of radiographic and CT studies requires a thorough knowledge of the gross and the cross-sectional anatomy. Despite the increasing success of reptiles as pets, only a few reports over their normal imaging features are currently available. The aim of this study is to desc...

  11. A comparison of the accuracy of ultrasound and computed tomography in common diagnoses causing acute abdominal pain

    Energy Technology Data Exchange (ETDEWEB)

    Randen, Adrienne van; Stoker, Jaap [Academic Medical Centre, Department of Radiology (suite G1-227), Amsterdam (Netherlands); Lameris, Wytze; Boermeester, Marja A. [Academic Medical Center, Department of Surgery, Amsterdam (Netherlands); Es, H.W. van; Heesewijk, Hans P.M. van [St Antonius Hospital, Department of Radiology, Nieuwegein (Netherlands); Ramshorst, Bert van [St Antonius Hospital, Department of Surgery, Nieuwegein (Netherlands); Hove, Wim ten [Gelre Hospitals, Department of Radiology, Apeldoorn (Netherlands); Bouma, Willem H. [Gelre Hospitals, Department of Surgery, Apeldoorn (Netherlands); Leeuwen, Maarten S. van [University Medical Centre, Department of Radiology, Utrecht (Netherlands); Keulen, Esteban M. van [Tergooi Hospitals, Department of Radiology, Hilversum (Netherlands); Bossuyt, Patrick M. [Academic Medical Center, Department of Clinical Epidemiology, Biostatistics, and Bioinformatics, Amsterdam (Netherlands)

    2011-07-15

    Head-to-head comparison of ultrasound and CT accuracy in common diagnoses causing acute abdominal pain. Consecutive patients with abdominal pain for >2 h and <5 days referred for imaging underwent both US and CT by different radiologists/radiological residents. An expert panel assigned a final diagnosis. Ultrasound and CT sensitivity and predictive values were calculated for frequent final diagnoses. Effect of patient characteristics and observer experience on ultrasound sensitivity was studied. Frequent final diagnoses in the 1,021 patients (mean age 47; 55% female) were appendicitis (284; 28%), diverticulitis (118; 12%) and cholecystitis (52; 5%). The sensitivity of CT in detecting appendicitis and diverticulitis was significantly higher than that of ultrasound: 94% versus 76% (p < 0.01) and 81% versus 61% (p = 0.048), respectively. For cholecystitis, the sensitivity of both was 73% (p = 1.00). Positive predictive values did not differ significantly between ultrasound and CT for these conditions. Ultrasound sensitivity in detecting appendicitis and diverticulitis was not significantly negatively affected by patient characteristics or reader experience. CT misses fewer cases than ultrasound, but both ultrasound and CT can reliably detect common diagnoses causing acute abdominal pain. Ultrasound sensitivity was largely not influenced by patient characteristics and reader experience. (orig.)

  12. Chichibu Park Bridge, Japan's longest PC cable-stayed bridge, designed with emphasis on its scenic setting. Keikan wo jushishita Nippon saidai no PC shachokyo 'Chichibu koenkyo'

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    This paper introduces the features of Chichibu Park Bridge, Japan's longest PC cable-stayed bridge, designed with emphasis on its scenic setting. The maximum effective span of Chichibu Park Bridge, a two-span continuous PC cable-stayed bridge, measures 195 m, which means the center span length is equivalent to about 400 m if converted to a three-span structure. With respect to the scenery-conscious design, relief engravings in a stone-carving style, with the Chichibu Night Festival as their motif, are arranged around the main tower; the main tower is lit up so that it can be seen from far away; and a balcony is built at the center of the bridge. The bridge axis is skewed 45° from the river flow direction to reduce water resistance. The tensile force generated at the corbel section by the main tower reaction is handled with reinforced concrete rather than prestressed concrete. The main tower adopts a two-chamber girder cross section from the viewpoints of rigidity and scenic effect. For construction control, microcomputers are used to correct progressive changes in the deflection of the main girder due to temperature changes and cable tension changes. 6 figs., 4 tabs.

  13. Optimization of Dose and Image Quality in Full-Field and Computed Radiography Systems for Common Digital Radiographic Examinations

    Directory of Open Access Journals (Sweden)

    Soo-Foon Moey

    2018-01-01

    Full Text Available Introduction: A fine balance of image quality and radiation dose can be achieved by optimization to minimize stochastic and deterministic effects. This study aimed at ensuring that images of acceptable quality for common radiographic examinations in digital imaging were produced without causing harmful effects. Materials and Methods: The study was conducted in three phases. The pre-optimization phase involved ninety physically able patients aged between 20 and 60 years and weighing between 60 and 80 kilograms, for four common digital radiographic examinations. A Kerma X_plus DAP meter was utilized to measure the entrance surface dose (ESD), while the effective dose (ED) was estimated using CALDose_X 5.0 Monte Carlo software. The second phase, an experimental study, utilized an anthropomorphic phantom (PBU-50) and the Leeds test object TOR CDR for relative comparison of image quality. In the optimization phase, the imaging parameters with acceptable image quality and the lowest ESD from the experimental study were related to patient body thickness. Image quality was evaluated by two radiologists using modified evaluation criteria score lists. Results: Significant differences in image quality were found for all examinations. However, significant differences in ESD were found for PA chest and AP abdomen only. The ESD for three of the examinations was lower than all published data. Additionally, the ESD and ED obtained for all examinations were lower than those recommended by radiation regulatory bodies. Conclusion: Optimization of image quality and dose was achieved by utilizing an appropriate tube potential, a calibrated automatic exposure control and additional filtration of 0.2 mm copper.

  14. An Experimental and Computational Study of the Gas-Phase Acidities of the Common Amino Acid Amides.

    Science.gov (United States)

    Plummer, Chelsea E; Stover, Michele L; Bokatzian, Samantha S; Davis, John T M; Dixon, David A; Cassady, Carolyn J

    2015-07-30

    Using proton-transfer reactions in a Fourier transform ion cyclotron resonance mass spectrometer and correlated molecular orbital theory at the G3(MP2) level, gas-phase acidities (GAs) and the associated structures for amides corresponding to the common amino acids have been determined for the first time. These values are important because amino acid amides are models for residues in peptides and proteins. For compounds whose most acidic site is the C-terminal amide nitrogen, two ions populations were observed experimentally with GAs that differ by 4-7 kcal/mol. The lower energy, more acidic structure accounts for the majority of the ions formed by electrospray ionization. G3(MP2) calculations predict that the lowest energy anionic conformer has a cis-like orientation of the [-C(═O)NH](-) group whereas the higher energy, less acidic conformer has a trans-like orientation of this group. These two distinct conformers were predicted for compounds with aliphatic, amide, basic, hydroxyl, and thioether side chains. For the most acidic amino acid amides (tyrosine, cysteine, tryptophan, histidine, aspartic acid, and glutamic acid amides) only one conformer was observed experimentally, and its experimental GA correlates with the theoretical GA related to side chain deprotonation.

  15. Computational study of the fibril organization of polyglutamine repeats reveals a common motif identified in beta-helices.

    Science.gov (United States)

    Zanuy, David; Gunasekaran, Kannan; Lesk, Arthur M; Nussinov, Ruth

    2006-04-21

    The formation of fibril aggregates by long polyglutamine sequences is assumed to play a major role in neurodegenerative diseases such as Huntington's disease. Here, we model peptides rich in glutamine through a series of molecular dynamics simulations. Starting from a rigid nanotube-like conformation, we have obtained a new conformational template that shares structural features of a tubular helix and of a beta-helix conformational organization. Our new model can be described as a super-helical arrangement of flat beta-sheet segments linked by planar turns or bends. Interestingly, our comprehensive analysis of the Protein Data Bank reveals that this is a common motif in beta-helices (termed beta-bend), although it has not previously been identified as such. The motif is based on the alternation of beta-sheet and helical conformation as the protein sequence is followed from the N to the C terminus (beta-alpha(R)-beta-polyPro-beta). We further identify this motif in the ssNMR structure of the protofibril of the amyloidogenic peptide Abeta(1-40). The recurrence of the beta-bend suggests a general mode of connecting long parallel beta-sheet segments that would allow the growth of partially ordered fibril structures. The design allows the peptide backbone to change direction with a minimal loss of main chain hydrogen bonds. The identification of a coherent organization beyond that of the beta-sheet segments in different folds rich in parallel beta-sheets suggests a higher degree of ordered structure in protein fibrils, in agreement with their low solubility and dense molecular packing.

  16. THE LONGEST TIMESCALE X-RAY VARIABILITY REVEALS EVIDENCE FOR ACTIVE GALACTIC NUCLEI IN THE HIGH ACCRETION STATE

    International Nuclear Information System (INIS)

    Zhang Youhong

    2011-01-01

    The All Sky Monitor (ASM) on board the Rossi X-ray Timing Explorer has continuously monitored a number of active galactic nuclei (AGNs) with similar sampling rates for 14 years, from 1996 January to 2009 December. Utilizing the archival ASM data of 27 AGNs, we calculate the normalized excess variances of the 300-day binned X-ray light curves on the longest timescale (between 300 days and 14 years) explored so far. The observed variance appears to be independent of AGN black-hole mass and bolometric luminosity. According to the scaling relation of black-hole mass (and bolometric luminosity) from galactic black hole X-ray binaries (GBHs) to AGNs, the break timescales that correspond to the break frequencies detected in the power spectral density (PSD) of our AGNs are larger than the binsize (300 days) of the ASM light curves. As a result, the singly broken power-law (soft-state) PSD predicts the variance to be independent of mass and luminosity. Nevertheless, the doubly broken power-law (hard-state) PSD predicts, with the widely accepted ratio of the two break frequencies, that the variance increases with increasing mass and decreases with increasing luminosity. Therefore, the independence of the observed variance on mass and luminosity suggests that AGNs should have soft-state PSDs. Taking into account the scaling of the break timescale with mass and luminosity synchronously, the observed variances are also more consistent with the soft-state than the hard-state PSD predictions. With the averaged variance of AGNs and the soft-state PSD assumption, we obtain a universal PSD amplitude of 0.030 ± 0.022. By analogy with the GBH PSDs in the high/soft state, the longest timescale variability supports the standpoint that AGNs are scaled-up GBHs in the high accretion state, as already implied by the direct PSD analysis.
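
    The normalized excess variance used in this record is a standard estimator of intrinsic variability in a binned light curve. As a minimal sketch (not the author's code; array names are assumptions), it can be computed from binned fluxes and their measurement errors as follows:

        import numpy as np

        def normalized_excess_variance(flux, flux_err):
            """Normalized excess variance of a binned light curve.

            flux     : array of binned fluxes or count rates
            flux_err : array of 1-sigma measurement errors per bin
            Returns (S^2 - <err^2>) / <flux>^2, where S^2 is the sample variance;
            the value can be negative when measurement noise dominates.
            """
            flux = np.asarray(flux, dtype=float)
            flux_err = np.asarray(flux_err, dtype=float)
            sample_var = flux.var(ddof=1)           # S^2
            mean_err_sq = np.mean(flux_err ** 2)    # <sigma_err^2>
            return (sample_var - mean_err_sq) / flux.mean() ** 2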

  17. Digital dissection - using contrast-enhanced computed tomography scanning to elucidate hard- and soft-tissue anatomy in the Common Buzzard Buteo buteo.

    Science.gov (United States)

    Lautenschlager, Stephan; Bright, Jen A; Rayfield, Emily J

    2014-04-01

    Gross dissection has a long history as a tool for the study of human or animal soft- and hard-tissue anatomy. However, apart from being a time-consuming and invasive method, dissection is often unsuitable for very small specimens and often cannot capture the spatial relationships of the individual soft-tissue structures. The handful of comprehensive studies on avian anatomy using traditional dissection techniques focus nearly exclusively on domestic birds, whereas raptorial birds, and in particular their cranial soft tissues, are essentially absent from the literature. Here, we digitally dissect, identify, and document the soft-tissue anatomy of the Common Buzzard (Buteo buteo) in detail, using the new approach of contrast-enhanced computed tomography with Lugol's iodine. The architecture of different muscle systems (adductor, depressor, ocular, hyoid, neck musculature), neurovascular, and other soft-tissue structures is three-dimensionally visualised and described in unprecedented detail. The three-dimensional model is further presented as an interactive PDF to facilitate the dissemination and accessibility of anatomical data. Due to the digital nature of the data derived from the computed tomography scanning and segmentation processes, these methods hold the potential for further computational analyses beyond descriptive and illustrative purposes. © 2013 The Authors. Journal of Anatomy published by John Wiley & Sons Ltd on behalf of Anatomical Society.

  18. Cross-sectional anatomy, computed tomography and magnetic resonance imaging of the head of common dolphin (Delphinus delphis) and striped dolphin (Stenella coeruleoalba).

    Science.gov (United States)

    Alonso-Farré, J M; Gonzalo-Orden, M; Barreiro-Vázquez, J D; Barreiro-Lois, A; André, M; Morell, M; Llarena-Reino, M; Monreal-Pawlowsky, T; Degollada, E

    2015-02-01

    Computed tomography (CT) and low-field magnetic resonance imaging (MRI) were used to scan seven by-caught dolphin cadavers, belonging to two species: four common dolphins (Delphinus delphis) and three striped dolphins (Stenella coeruleoalba). CT and MRI were obtained with the animals in ventral recumbency. After the imaging procedures, six dolphins were frozen at -20°C and sliced in the same position they were examined. Not only CT and MRI scans, but also cross sections of the heads were obtained in three body planes: transverse (slices of 1 cm thickness) in three dolphins, sagittal (5 cm thickness) in two dolphins and dorsal (5 cm thickness) in two dolphins. Relevant anatomical structures were identified and labelled on each cross section, obtaining a comprehensive bi-dimensional topographical anatomy guide of the main features of the common and the striped dolphin head. Furthermore, the anatomical cross sections were compared with their corresponding CT and MRI images, allowing an imaging identification of most of the anatomical features. CT scans produced an excellent definition of the bony and air-filled structures, while MRI allowed us to successfully identify most of the soft tissue structures in the dolphin's head. This paper provides a detailed anatomical description of the head structures of common and striped dolphins and compares anatomical cross sections with CT and MRI scans, becoming a reference guide for the interpretation of imaging studies. © 2014 Blackwell Verlag GmbH.

  19. Automated quantification of pulmonary emphysema from computed tomography scans: comparison of variation and correlation of common measures in a large cohort

    Science.gov (United States)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.

    2010-03-01

    The purpose of this work was to retrospectively investigate the variation of standard indices of pulmonary emphysema from helical computed tomographic (CT) scans as related to inspiration differences over a 1-year interval, and to determine the strength of the relationship between these measures in a large cohort. 626 patients who had 2 scans taken at an interval of 9 months to 15 months (μ: 381 days, σ: 31 days) were selected for this work. All scans were acquired at a 1.25 mm slice thickness using a low-dose protocol. For each scan, the emphysema index (EI), fractal dimension (FD), mean lung density (MLD), and 15th percentile of the histogram (HIST) were computed. The absolute and relative changes for each measure were computed and the empirical 95% confidence interval was reported on both non-normalized and normalized scales. Spearman correlation coefficients were computed between the relative change in each measure and the relative change in inspiration between each scan pair, as well as between each pair-wise combination of the four measures. EI varied over a range of -10.5 to 10.5 on the non-normalized scale and -15 to 15 on the normalized scale, with FD and MLD showing slightly larger but comparable spreads, and HIST having a much larger variation. MLD was found to show the strongest correlation to inspiration change (r = 0.85). Overall, the emphysema index and fractal dimension were found to have the least variability of the commonly used measures of emphysema and to offer the most unique quantification of emphysema relative to each other.
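
    The densitometric measures named above (EI, MLD and the histogram percentile) follow standard definitions; a rough sketch of how they might be computed from the Hounsfield units of segmented lung voxels (the -950 HU cutoff and the function name are illustrative assumptions, not taken from this study) is:

        import numpy as np

        def emphysema_indices(lung_hu, threshold_hu=-950.0, percentile=15):
            """Common densitometric emphysema measures from lung-voxel HU values.

            lung_hu      : 1-D array of Hounsfield units inside the lung mask
            threshold_hu : HU cutoff for the emphysema index
            percentile   : histogram percentile to report (often the 15th)
            """
            lung_hu = np.asarray(lung_hu, dtype=float)
            emphysema_index = 100.0 * np.mean(lung_hu < threshold_hu)  # % voxels below cutoff
            mean_lung_density = lung_hu.mean()                         # MLD, in HU
            hist_percentile = np.percentile(lung_hu, percentile)       # HIST, in HU
            return emphysema_index, mean_lung_density, hist_percentile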

  20. Comparative evaluation of the cadaveric, radiographic and computed tomographic anatomy of the heads of green iguana (Iguana iguana), common tegu (Tupinambis merianae) and bearded dragon (Pogona vitticeps)

    Science.gov (United States)

    2012-01-01

    Background: Radiology and computed tomography are the most commonly available diagnostic tools for the diagnosis of pathologies affecting the head and skull in veterinary practice. Nevertheless, accurate interpretation of radiographic and CT studies requires a thorough knowledge of the gross and the cross-sectional anatomy. Despite the increasing popularity of reptiles as pets, only a few reports on their normal imaging features are currently available. The aim of this study is to describe the normal cadaveric, radiographic and computed tomographic features of the heads of the green iguana, tegu and bearded dragon. Results: 6 adult green iguanas, 4 tegus, 3 bearded dragons, and the adult cadavers of 4 green iguanas, 4 tegus and 4 bearded dragons were included in the study. For each species, 2 cadavers were dissected following a stratigraphic approach and 2 cadavers were cross-sectioned. These latter specimens were stored in a freezer (−20°C) until completely frozen. Transversal sections at 5 mm intervals were obtained by means of an electric band-saw. Each section was cleaned and photographed on both sides. Radiographs of the head of each subject were obtained. Pre- and post-contrast computed tomographic studies of the head were performed on all the live animals. CT images were displayed in both bone and soft tissue windows. Individual anatomic structures were first recognised and labelled on the anatomic images and then matched on radiographs and CT images. Radiographic and CT images of the skull provided good detail of the bony structures in all species. In CT, contrast medium injection enabled good detail of the soft tissues to be obtained in the iguana, whereas only the eye was clearly distinguishable from the remaining soft tissues in both the tegu and the bearded dragon. Conclusions: The results provide an atlas of the normal anatomical and in vivo radiographic and computed tomographic features of the heads of lizards, and this may be useful in interpreting any

  1. Comparative evaluation of the cadaveric, radiographic and computed tomographic anatomy of the heads of green iguana (Iguana iguana), common tegu (Tupinambis merianae) and bearded dragon (Pogona vitticeps)

    Directory of Open Access Journals (Sweden)

    Banzato Tommaso

    2012-05-01

    Full Text Available Background: Radiology and computed tomography are the most commonly available diagnostic tools for the diagnosis of pathologies affecting the head and skull in veterinary practice. Nevertheless, accurate interpretation of radiographic and CT studies requires a thorough knowledge of the gross and the cross-sectional anatomy. Despite the increasing popularity of reptiles as pets, only a few reports on their normal imaging features are currently available. The aim of this study is to describe the normal cadaveric, radiographic and computed tomographic features of the heads of the green iguana, tegu and bearded dragon. Results: 6 adult green iguanas, 4 tegus, 3 bearded dragons, and the adult cadavers of 4 green iguanas, 4 tegus and 4 bearded dragons were included in the study. For each species, 2 cadavers were dissected following a stratigraphic approach and 2 cadavers were cross-sectioned. These latter specimens were stored in a freezer (−20°C) until completely frozen. Transversal sections at 5 mm intervals were obtained by means of an electric band-saw. Each section was cleaned and photographed on both sides. Radiographs of the head of each subject were obtained. Pre- and post-contrast computed tomographic studies of the head were performed on all the live animals. CT images were displayed in both bone and soft tissue windows. Individual anatomic structures were first recognised and labelled on the anatomic images and then matched on radiographs and CT images. Radiographic and CT images of the skull provided good detail of the bony structures in all species. In CT, contrast medium injection enabled good detail of the soft tissues to be obtained in the iguana, whereas only the eye was clearly distinguishable from the remaining soft tissues in both the tegu and the bearded dragon. Conclusions: The results provide an atlas of the normal anatomical and in vivo radiographic and computed tomographic features of the heads of lizards, and this may be

  2. Comparative evaluation of the cadaveric, radiographic and computed tomographic anatomy of the heads of green iguana (Iguana iguana), common tegu (Tupinambis merianae) and bearded dragon (Pogona vitticeps).

    Science.gov (United States)

    Banzato, Tommaso; Selleri, Paolo; Veladiano, Irene A; Martin, Andrea; Zanetti, Emanuele; Zotti, Alessandro

    2012-05-11

    Radiology and computed tomography are the most commonly available diagnostic tools for the diagnosis of pathologies affecting the head and skull in veterinary practice. Nevertheless, accurate interpretation of radiographic and CT studies requires a thorough knowledge of the gross and the cross-sectional anatomy. Despite the increasing success of reptiles as pets, only a few reports over their normal imaging features are currently available. The aim of this study is to describe the normal cadaveric, radiographic and computed tomographic features of the heads of the green iguana, tegu and bearded dragon. 6 adult green iguanas, 4 tegus, 3 bearded dragons, and, the adult cadavers of: 4 green iguana, 4 tegu, 4 bearded dragon were included in the study. 2 cadavers were dissected following a stratigraphic approach and 2 cadavers were cross-sectioned for each species. These latter specimens were stored in a freezer (-20°C) until completely frozen. Transversal sections at 5 mm intervals were obtained by means of an electric band-saw. Each section was cleaned and photographed on both sides. Radiographs of the head of each subject were obtained. Pre- and post- contrast computed tomographic studies of the head were performed on all the live animals. CT images were displayed in both bone and soft tissue windows. Individual anatomic structures were first recognised and labelled on the anatomic images and then matched on radiographs and CT images. Radiographic and CT images of the skull provided good detail of the bony structures in all species. In CT contrast medium injection enabled good detail of the soft tissues to be obtained in the iguana whereas only the eye was clearly distinguishable from the remaining soft tissues in both the tegu and the bearded dragon. The results provide an atlas of the normal anatomical and in vivo radiographic and computed tomographic features of the heads of lizards, and this may be useful in interpreting any imaging modality involving these

  3. The worst case scenario: Locomotor and collision demands of the longest periods of gameplay in professional rugby union

    Science.gov (United States)

    Reardon, Cillian; Tobin, Daniel P.; Tierney, Peter; Delahunt, Eamonn

    2017-01-01

    A number of studies have used global positioning systems (GPS) to report on positional differences in the physical game demands of rugby union both on an average and singular bout basis. However, the ability of these studies to report quantitative data is limited by a lack of validation of certain aspects of measurement by GPS micro-technology. Furthermore, no study has analyzed the positional physical demands of the longest bouts of ball-in-play time in rugby union. The aim of the present study is to compare the demands of the single longest period of ball-in-play, termed “worst case scenario” (WCS), between positional groups, which have previously been reported to have distinguishable game demands. The results of this study indicate that WCS periods follow a similar sporadic pattern as average demands but are played at a far higher pace than previously reported for average game demands, with an average of 116.8 m covered per minute. The positional differences in running and collision activity previously reported are perpetuated within WCS periods. Backs covered greater total distances than forwards (318 m vs 289 m), carried out more high-speed running (11.1 m·min-1 vs 5.5 m·min-1) and achieved higher maximum velocities (MaxVel). Outside Backs achieved the highest MaxVel values (6.84 m·sec-1). Tight Five and Back Row forwards underwent significantly more collisions than Inside Backs and Outside Backs (0.73 & 0.89 collisions·min-1 vs 0.28 & 0.41 collisions·min-1, respectively). The results of the present study provide information on the positional physical requirements of performance in prolonged periods involving multiple high intensity bursts of effort. Although the current state of GPS micro-technology as a measurement tool does not permit reporting of collision intensity or acceleration data, the combined use of video and GPS provides valuable information to the practitioner. This can be used to match and replicate game demands in training. PMID:28510582

  4. The worst case scenario: Locomotor and collision demands of the longest periods of gameplay in professional rugby union.

    Directory of Open Access Journals (Sweden)

    Cillian Reardon

    Full Text Available A number of studies have used global positioning systems (GPS) to report on positional differences in the physical game demands of rugby union both on an average and singular bout basis. However, the ability of these studies to report quantitative data is limited by a lack of validation of certain aspects of measurement by GPS micro-technology. Furthermore, no study has analyzed the positional physical demands of the longest bouts of ball-in-play time in rugby union. The aim of the present study is to compare the demands of the single longest period of ball-in-play, termed "worst case scenario" (WCS), between positional groups, which have previously been reported to have distinguishable game demands. The results of this study indicate that WCS periods follow a similar sporadic pattern as average demands but are played at a far higher pace than previously reported for average game demands, with an average of 116.8 m covered per minute. The positional differences in running and collision activity previously reported are perpetuated within WCS periods. Backs covered greater total distances than forwards (318 m vs 289 m), carried out more high-speed running (11.1 m·min-1 vs 5.5 m·min-1) and achieved higher maximum velocities (MaxVel). Outside Backs achieved the highest MaxVel values (6.84 m·sec-1). Tight Five and Back Row forwards underwent significantly more collisions than Inside Backs and Outside Backs (0.73 & 0.89 collisions·min-1 vs 0.28 & 0.41 collisions·min-1, respectively). The results of the present study provide information on the positional physical requirements of performance in prolonged periods involving multiple high intensity bursts of effort. Although the current state of GPS micro-technology as a measurement tool does not permit reporting of collision intensity or acceleration data, the combined use of video and GPS provides valuable information to the practitioner. This can be used to match and replicate game demands in training.

  5. Extended postnatal brain development in the longest-lived rodent: prolonged maintenance of neotenous traits in the naked mole-rat brain

    Directory of Open Access Journals (Sweden)

    Miranda E. Orr

    2016-11-01

    Full Text Available The naked mole-rat (NMR) is the longest-lived rodent with a maximum lifespan >31 years. Intriguingly, fully-grown naked mole-rats (NMRs) exhibit many traits typical of neonatal rodents. However, little is known about NMR growth and maturation, and we question whether sustained neotenous features when compared to mice, reflect an extended developmental period, commensurate with their exceptionally long life. We tracked development from birth to three years of age in the slowest maturing organ, the brain, by measuring mass, neural stem cell proliferation, axonal and dendritic maturation, synaptogenesis and myelination. NMR brain maturation was compared to data from similar sized rodents, mice, and to that of long-lived mammals, humans and non-human primates. We found that at birth, NMR brains are significantly more developed than mice, and rather are more similar to those of newborn primates, with clearly laminated hippocampi and myelinated white matter tracts. Despite this more mature brain at birth than mice, postnatal NMR brain maturation occurs at a far slower rate than mice, taking four-times longer than required for mice to fully complete brain development. At four months of age, NMR brains reach 90% of adult size with stable neuronal cytostructural protein expression whereas myelin protein expression does not plateau until nine months of age in NMRs, and synaptic protein expression continues to change throughout the first three years of life. Intriguingly, NMR axonal composition is more similar to humans than mice whereby NMRs maintain expression of three-repeat (3R) tau even after brain growth is complete; mice experience an abrupt downregulation of 3R tau by postnatal day 8 which continues to diminish through six weeks of age. We have identified key ages in NMR cerebral development and suggest that the long-lived NMR may provide neurobiologists an exceptional model to study brain developmental processes that are compressed in common short

  6. Extended Postnatal Brain Development in the Longest-Lived Rodent: Prolonged Maintenance of Neotenous Traits in the Naked Mole-Rat Brain.

    Science.gov (United States)

    Orr, Miranda E; Garbarino, Valentina R; Salinas, Angelica; Buffenstein, Rochelle

    2016-01-01

    The naked mole-rat (NMR) is the longest-lived rodent with a maximum lifespan >31 years. Intriguingly, fully-grown naked mole-rats (NMRs) exhibit many traits typical of neonatal rodents. However, little is known about NMR growth and maturation, and we question whether sustained neotenous features when compared to mice, reflect an extended developmental period, commensurate with their exceptionally long life. We tracked development from birth to 3 years of age in the slowest maturing organ, the brain, by measuring mass, neural stem cell proliferation, axonal, and dendritic maturation, synaptogenesis and myelination. NMR brain maturation was compared to data from similar sized rodents, mice, and to that of long-lived mammals, humans, and non-human primates. We found that at birth, NMR brains are significantly more developed than mice, and rather are more similar to those of newborn primates, with clearly laminated hippocampi and myelinated white matter tracts. Despite this more mature brain at birth than mice, postnatal NMR brain maturation occurs at a far slower rate than mice, taking four-times longer than required for mice to fully complete brain development. At 4 months of age, NMR brains reach 90% of adult size with stable neuronal cytostructural protein expression whereas myelin protein expression does not plateau until 9 months of age in NMRs, and synaptic protein expression continues to change throughout the first 3 years of life. Intriguingly, NMR axonal composition is more similar to humans than mice whereby NMRs maintain expression of three-repeat (3R) tau even after brain growth is complete; mice experience an abrupt downregulation of 3R tau by postnatal day 8 which continues to diminish through 6 weeks of age. We have identified key ages in NMR cerebral development and suggest that the long-lived NMR may provide neurobiologists an exceptional model to study brain developmental processes that are compressed in common short-lived laboratory animal models.

  7. Deep-sea octopus (Graneledone boreopacifica) conducts the longest-known egg-brooding period of any animal.

    Science.gov (United States)

    Robison, Bruce; Seibel, Brad; Drazen, Jeffrey

    2014-01-01

    Octopuses typically have a single reproductive period and then they die (semelparity). Once a clutch of fertilized eggs has been produced, the female protects and tends them until they hatch. In most shallow-water species this period of parental care can last from 1 to 3 months, but very little is known about the brooding of deep-living species. In the cold, dark waters of the deep ocean, metabolic processes are often slower than their counterparts at shallower depths. Extrapolations from data on shallow-water octopus species suggest that lower temperatures would prolong embryonic development periods. Likewise, laboratory studies have linked lower temperatures to longer brooding periods in cephalopods, but direct evidence has not been available. We found an opportunity to directly measure the brooding period of the deep-sea octopus Graneledone boreopacifica, in its natural habitat. At 53 months, it is by far the longest egg-brooding period ever reported for any animal species. These surprising results emphasize the selective value of prolonged embryonic development in order to produce competitive hatchlings. They also extend the known boundaries of physiological adaptations for life in the deep sea.

  8. Negligible senescence in the longest living rodent, the naked mole-rat: insights from a successfully aging species.

    Science.gov (United States)

    Buffenstein, Rochelle

    2008-05-01

    Aging refers to a gradual deterioration in function that, over time, leads to increased mortality risk, and declining fertility. This pervasive process occurs in almost all organisms, although some long-lived trees and cold water inhabitants reportedly show insignificant aging. Negligible senescence is characterized by attenuated age-related change in reproductive and physiological functions, as well as no observable age-related gradual increase in mortality rate. It was questioned whether the longest living rodent, the naked mole-rat, met these three strict criteria. Naked mole-rats live in captivity for more than 28.3 years, approximately 9 times longer than similar-sized mice. They maintain body composition from 2 to 24 years, and show only slight age-related changes in all physiological and morphological characteristics studied to date. Surprisingly breeding females show no decline in fertility even when well into their third decade of life. Moreover, these animals have never been observed to develop any spontaneous neoplasm. As such they do not show the typical age-associated acceleration in mortality risk that characterizes every other known mammalian species and may therefore be the first reported mammal showing negligible senescence over the majority of their long lifespan. Clearly physiological and biochemical processes in this species have evolved to dramatically extend healthy lifespan. The challenge that lies ahead is to understand what these mechanisms are.

  9. Walking the oxidative stress tightrope: a perspective from the naked mole-rat, the longest-living rodent.

    Science.gov (United States)

    Rodriguez, Karl A; Wywial, Ewa; Perez, Viviana I; Lambert, Adriant J; Edrey, Yael H; Lewis, Kaitlyn N; Grimes, Kelly; Lindsey, Merry L; Brand, Martin D; Buffenstein, Rochelle

    2011-01-01

    Reactive oxygen species (ROS), by-products of aerobic metabolism, cause oxidative damage to cells and tissue, and not surprisingly many theories have arisen to link ROS-induced oxidative stress to aging and health. While studies clearly link ROS to a plethora of divergent diseases, their role in aging is still debatable. Genetic knock-down manipulations of antioxidants alter the levels of accrued oxidative damage; however, the resultant effects of increased oxidative stress on lifespan are equivocal. Similarly, the impact of elevating antioxidant levels through transgenic manipulations yields inconsistent effects on longevity. Furthermore, comparative data from a wide range of endotherms with disparate longevity remain inconclusive. Many long-living species such as birds, bats and mole-rats exhibit high levels of oxidative damage, evident already at young ages. Clearly, neither the amount of ROS per se nor the sensitivity in neutralizing ROS is as important as whether or not the accrued oxidative stress leads to oxidative-damage-linked age-associated diseases. In this review we examine the literature on ROS, its relation to disease, and the lessons gleaned from a comparative approach based upon species with widely divergent responses. We specifically focus on the longest-lived rodent, the naked mole-rat, which maintains good health and provides novel insights into the paradox of maintaining both an extended healthspan and lifespan despite high oxidative stress from a young age.

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  11. Research on the Classification Teaching of Common Elective Computer Courses (计算机公共选修课分类教学研究)

    Institute of Scientific and Technical Information of China (English)

    曾显峰; 张钰莎

    2012-01-01

    Through case-based analysis of the current model of common elective computer courses at university, this paper proposes classification teaching oriented to students' majors. The introductory computer course is divided into content modules, from which different majors selectively choose the modules relevant to them; follow-up elective courses are organized into major-oriented course groups; and flexible teaching and assessment methods are established, so that the common computer courses better serve each major.

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  13. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  15. Common spatial pattern combined with kernel linear discriminate and generalized radial basis function for motor imagery-based brain computer interface applications

    Science.gov (United States)

    Hekmatmanesh, Amin; Jamaloo, Fatemeh; Wu, Huapeng; Handroos, Heikki; Kilpeläinen, Asko

    2018-04-01

    Brain-computer interfaces (BCIs) can be challenging to develop for robotic, prosthetic and human-controlled systems. This work focuses on the implementation of a common spatial pattern (CSP) based algorithm to detect event-related desynchronization patterns. Following well-known previous work in this area, features are extracted with the filter bank common spatial pattern (FBCSP) method and then weighted by a sensitive learning vector quantization (SLVQ) algorithm. In the current work, applying the radial basis function (RBF) as the mapping kernel of the kernel linear discriminant analysis (KLDA) method to the weighted features allows the data to be transferred into a higher dimension, where it is more cleanly separated by the RBF kernel. Afterwards, a support vector machine (SVM) with a generalized radial basis function (GRBF) kernel is employed to improve the efficiency and robustness of the classification. On average, 89.60% accuracy and 74.19% robustness are achieved. BCI Competition III dataset IVa is used to evaluate the algorithm for detecting right-hand and foot imagery movement patterns. Results show that the combination of KLDA with the SVM-GRBF classifier yields improvements of 8.9% in accuracy and 14.19% in robustness. For all the subjects, it is concluded that mapping the CSP features into a higher dimension by the RBF and using the GRBF as the kernel of the SVM improve the accuracy and reliability of the proposed method.
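
    For orientation, the CSP step referred to in this record amounts to a generalized eigenvalue problem on class-wise covariance matrices of the band-passed signals. The following minimal sketch (illustrative only; the trial arrays, filter count and helper names are assumptions, not the study's code) shows the idea for two classes, followed by the log-variance features commonly fed to a classifier:

        import numpy as np
        from scipy.linalg import eigh

        def csp_filters(trials_a, trials_b, n_pairs=3):
            """CSP spatial filters for two classes of band-passed trials.

            trials_a, trials_b : arrays of shape (n_trials, n_channels, n_samples)
            Returns a (2 * n_pairs, n_channels) matrix of spatial filters.
            """
            def mean_cov(trials):
                return np.mean([np.cov(t) for t in trials], axis=0)

            ca, cb = mean_cov(trials_a), mean_cov(trials_b)
            # Generalized eigenproblem: ca w = lambda (ca + cb) w
            eigvals, eigvecs = eigh(ca, ca + cb)
            order = np.argsort(eigvals)
            # Filters at both ends of the eigenvalue spectrum are the most discriminative
            keep = np.concatenate([order[:n_pairs], order[-n_pairs:]])
            return eigvecs[:, keep].T

        def log_variance_features(trial, filters):
            """Log of normalized variance of the spatially filtered signals."""
            filtered = filters @ trial                  # (n_filters, n_samples)
            var = filtered.var(axis=1)
            return np.log(var / var.sum())

    In FBCSP-style pipelines such filters are computed per frequency band, and the resulting features are then weighted and passed to the downstream classifiers (here SLVQ, KLDA and SVM-GRBF).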

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  19. Sensitivity analysis on the effect of software-induced common cause failure probability in the computer-based reactor trip system unavailability

    International Nuclear Information System (INIS)

    Kamyab, Shahabeddin; Nematollahi, Mohammadreza; Shafiee, Golnoush

    2013-01-01

    Highlights: ► Importance and sensitivity analyses have been performed for a digitized reactor trip system. ► The results show acceptable trip unavailability for software failure probabilities below 1E-4. ► However, the value of the Fussell–Vesely importance indicates that software common cause failure is still risk significant. ► Diversity and effective testing are found to be beneficial in reducing the software contribution. - Abstract: The reactor trip system has been digitized in advanced nuclear power plants, since the programmable nature of computer-based systems has a number of advantages over non-programmable systems. However, software is still vulnerable to common cause failure (CCF). Residual software faults represent a CCF concern, which threatens the achieved improvements. This study attempts to assess the effectiveness of so-called defensive strategies against software CCF with respect to reliability. Sensitivity analysis has been performed by re-quantifying the models upon changing the software failure probability. Importance measures have then been estimated in order to reveal the specific contribution of software CCF to the trip failure probability. The results reveal the importance and effectiveness of signal and software diversity as applicable strategies to ameliorate inefficiencies due to software CCF in the reactor trip system (RTS). No significant change has been observed in the RTS failure probability for basic software CCF probabilities greater than 1 × 10⁻⁴. However, the related Fussell–Vesely importance has been greater than 0.005 for the lower values. The study concludes that consideration of the risk associated with software-based systems is a multi-variable problem which requires compromising among the variables in more precise and comprehensive studies
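
    For context, the Fussell–Vesely importance cited above is the fraction of the top-event probability contributed by minimal cut sets containing a given basic event. A generic sketch under the rare-event approximation (the cut sets and numbers below are purely illustrative, not the study's fault-tree model) is:

        def fussell_vesely(cut_sets, probabilities, event):
            """Approximate Fussell-Vesely importance of one basic event.

            cut_sets      : list of minimal cut sets (sets of basic-event names)
            probabilities : dict of basic-event name -> failure probability
            event         : basic event whose importance is evaluated
            Uses the rare-event approximation: P(top) ~ sum of cut-set probabilities.
            """
            def cut_set_prob(cs):
                p = 1.0
                for e in cs:
                    p *= probabilities[e]
                return p

            top = sum(cut_set_prob(cs) for cs in cut_sets)
            with_event = sum(cut_set_prob(cs) for cs in cut_sets if event in cs)
            return with_event / top if top > 0 else 0.0

        # Illustrative example: a software CCF cut set alongside a hardware pair
        cut_sets = [{"sw_ccf"}, {"hw_a", "hw_b"}]
        probs = {"sw_ccf": 1e-4, "hw_a": 1e-3, "hw_b": 1e-3}
        print(fussell_vesely(cut_sets, probs, "sw_ccf"))   # close to 0.99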

  20. EVALUATION OF BONE MINERALIZATION BY COMPUTED TOMOGRAPHY IN WILD AND CAPTIVE EUROPEAN COMMON SPADEFOOTS (PELOBATES FUSCUS), IN RELATION TO EXPOSURE TO ULTRAVIOLET B RADIATION AND DIETARY SUPPLEMENTS.

    Science.gov (United States)

    van Zijll Langhout, Martine; Struijk, Richard P J H; Könning, Tessa; van Zuilen, Dick; Horvath, Katalin; van Bolhuis, Hester; Maarschalkerweerd, Roelof; Verstappen, Frank

    2017-09-01

    Captive rearing programs have been initiated to save the European common spadefoot (Pelobates fuscus), a toad species in the family of Pelobatidae, from extinction in The Netherlands. Evaluating whether this species needs ultraviolet B (UVB) radiation and/or dietary supplementation for healthy bone development is crucial for its captive management and related conservation efforts. The bone mineralization in the femurs and the thickest part of the parietal bone of the skulls of European common spadefoots (n = 51) was measured in Hounsfield units (HUs) by computed tomography. One group, containing adults (n = 8) and juveniles (n = 13), was reared at ARTIS Amsterdam Royal Zoo without UVB exposure. During their terrestrial lifetime, these specimens received a vitamin-mineral supplement. Another group, containing adults (n = 8) and juveniles (n = 10), was reared and kept in an outdoor breeding facility in Münster, Germany, with permanent access to natural UVB light, without vitamin-mineral supplementation. The HUs in the ARTIS and Münster specimens were compared with those in wild specimens (n = 12). No significant difference was found between the HUs in the femurs of both ARTIS and Münster adults and wild adults (P = 0.537; P = 0.181). The HUs in the skulls of both captive-adult groups were significantly higher than in the skulls of wild specimens (P = 0.020; P = 0.005). The HUs in the femurs of the adult ARTIS animals were significantly higher than the HUs in the femurs of the adult Münster animals (P = 0.007). The absence of UVB radiation did not seem to have a negative effect on the bone development in the terrestrial stage. This suggests that this nocturnal, subterrestrial amphibian was able to extract sufficient vitamin D 3 from its diet and did not rely heavily on photobiosynthesis through UVB exposure.

  1. Correlation of the Deccan and Rajahmundry Trap lavas: Are these the longest and largest lava flows on Earth?

    Science.gov (United States)

    Self, S.; Jay, A. E.; Widdowson, M.; Keszthelyi, L. P.

    2008-05-01

    We propose that the Rajahmundry Trap lavas, found near the east coast of peninsular India, are remnants of the longest lava flows yet recognized on Earth (˜1000 km long). These outlying Deccan-like lavas are shown to belong to the main Deccan Traps. Several previous studies have already suggested this correlation, but have not demonstrated it categorically. The exposed Rajahmundry lavas are interpreted to be the distal parts of two very-large-volume pāhoehoe flow fields, one each from the Ambenali and Mahabaleshwar Formations of the Wai Sub-group in the Deccan Basalt Group. Eruptive conditions required to emplace such long flows are met by plausible values for cooling and eruption rates, and this is shown by applying a model for the formation of inflated pāhoehoe sheet flow lobes. The model predicts flow lobe thicknesses similar to those observed in the Rajahmundry lavas. For the last 400 km of flow, the lava flows were confined to the pre-existing Krishna valley drainage system that existed in the basement beyond the edge of the gradually expanding Deccan lava field, allowing the flows to extend across the subcontinent to the eastern margin where they were emplaced into a littoral and/or shallow marine environment. These lavas and other individual flow fields in the Wai Sub-group may exceed eruptive volumes of 5000 km³, which would place them amongst the largest magnitude effusive eruptive units yet known. We suggest that the length of flood basalt lava flows on Earth is restricted mainly by the size of land masses and topography. In the case of the Rajahmundry lavas, the flows reached estuaries and the sea, where their advance was perhaps effectively terminated by cooling and/or disruption. However, it is only during large igneous province basaltic volcanism that such huge volumes of lava are erupted in single events, and when the magma supply rate is sufficiently high and maintained to allow the formation of very long lava flows. The Rajahmundry lava

  2. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  6. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  7. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  8. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  9. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  12. Application of a common spatial pattern-based algorithm for an fNIRS-based motor imagery brain-computer interface.

    Science.gov (United States)

    Zhang, Shen; Zheng, Yanchun; Wang, Daifa; Wang, Ling; Ma, Jianai; Zhang, Jing; Xu, Weihao; Li, Deyu; Zhang, Dan

    2017-08-10

    Motor imagery is one of the most investigated paradigms in the field of brain-computer interfaces (BCIs). The present study explored the feasibility of applying a common spatial pattern (CSP)-based algorithm for a functional near-infrared spectroscopy (fNIRS)-based motor imagery BCI. Ten participants performed kinesthetic imagery of their left- and right-hand movements while 20-channel fNIRS signals were recorded over the motor cortex. The CSP method was implemented to obtain the spatial filters specific for both imagery tasks. The mean, slope, and variance of the CSP filtered signals were taken as features for BCI classification. Results showed that the CSP-based algorithm outperformed two representative channel-wise methods for classifying the two imagery statuses using either data from all channels or averaged data from imagery responsive channels only (oxygenated hemoglobin: CSP-based: 75.3±13.1%; all-channel: 52.3±5.3%; averaged: 64.8±13.2%; deoxygenated hemoglobin: CSP-based: 72.3±13.0%; all-channel: 48.8±8.2%; averaged: 63.3±13.3%). Furthermore, the effectiveness of the CSP method was also observed for the motor execution data to a lesser extent. A partial correlation analysis revealed significant independent contributions from all three types of features, including the often-ignored variance feature. To our knowledge, this is the first study demonstrating the effectiveness of the CSP method for fNIRS-based motor imagery BCIs. Copyright © 2017 Elsevier B.V. All rights reserved.
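
    As an illustration of the CSP method summarized above, here is a minimal, hypothetical sketch: spatial filters are obtained from the two classes' trial-averaged covariance matrices via a generalized eigenvalue problem, and the mean, slope, and variance of the filtered time courses are taken as features. This is not the authors' exact pipeline; the array shapes, the number of retained filter pairs, and the linear-fit slope estimate are assumptions.

```python
# Minimal CSP sketch for two-class (left- vs right-hand imagery) multichannel signals.
# Hypothetical shapes: trials_left/trials_right are (n_trials, n_channels, n_samples) arrays.
import numpy as np
from scipy.linalg import eigh

def class_covariance(trials):
    """Trial-averaged, trace-normalized spatial covariance."""
    covs = []
    for x in trials:                      # x: (n_channels, n_samples)
        c = x @ x.T
        covs.append(c / np.trace(c))
    return np.mean(covs, axis=0)

def csp_filters(trials_a, trials_b, n_pairs=2):
    """Spatial filters that maximize variance for class A relative to class B."""
    ca, cb = class_covariance(trials_a), class_covariance(trials_b)
    # Generalized eigenvalue problem: Ca w = lambda (Ca + Cb) w.
    eigvals, eigvecs = eigh(ca, ca + cb)
    order = np.argsort(eigvals)
    idx = np.r_[order[:n_pairs], order[-n_pairs:]]   # both ends are most discriminative
    return eigvecs[:, idx].T              # (2 * n_pairs, n_channels)

def features(trial, filters):
    """Mean, slope, and variance of each CSP-filtered time course (one trial)."""
    filtered = filters @ trial            # (n_filters, n_samples)
    t = np.arange(filtered.shape[1])
    slopes = [np.polyfit(t, row, 1)[0] for row in filtered]
    return np.concatenate([filtered.mean(axis=1), slopes, filtered.var(axis=1)])
```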

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  14. [Hyp-Au-Sn9(Hyp)3-Au-Sn9(Hyp)3-Au-Hyp]-: the longest intermetalloid chain compound of tin.

    Science.gov (United States)

    Binder, Mareike; Schrenk, Claudio; Block, Theresa; Pöttgen, Rainer; Schnepf, Andreas

    2017-10-12

    The reaction of the metalloid tin cluster [Sn10(Hyp)4]2- with (Ph3P)Au-SHyp (Hyp = Si(SiMe3)3) gave an intermetalloid cluster [Au3Sn18(Hyp)8]- (1), which is the longest intermetalloid chain compound of tin to date. 1 shows a structural resemblance to binary AuSn phases, which is expected for intermetalloid clusters.

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  16. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  17. Common Courses for Common Purposes:

    DEFF Research Database (Denmark)

    Schaub Jr, Gary John

    2014-01-01

    (PME)? I suggest three alternative paths that increased cooperation in PME at the level of the command and staff course could take: a Nordic Defence College, standardized national command and staff courses, and a core curriculum of common courses for common purposes. I conclude with a discussion of how...

  18. Meta-heuristic algorithms for parallel identical machines scheduling problem with weighted late work criterion and common due date.

    Science.gov (United States)

    Xu, Zhenzhen; Zou, Yongxing; Kong, Xiangjie

    2015-01-01

    To our knowledge, this paper investigates the first application of meta-heuristic algorithms to tackle the parallel machines scheduling problem with weighted late work criterion and common due date ([Formula: see text]). Late work criterion is one of the performance measures of scheduling problems which considers the length of the late parts of particular jobs when evaluating the quality of scheduling. Since this problem is known to be NP-hard, three meta-heuristic algorithms, namely ant colony system, genetic algorithm, and simulated annealing, are designed and implemented, respectively. We also propose a novel algorithm named LDF (largest density first), which is improved from LPT (longest processing time first). The computational experiments compared these meta-heuristic algorithms with LDF, LPT and LS (list scheduling), and the experimental results show that SA performs the best in most cases. However, LDF is better than SA in some conditions; moreover, the running time of LDF is much shorter than that of SA.
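
    The record above contrasts LPT with the proposed LDF rule. Below is a hedged sketch of priority-based list scheduling on parallel identical machines with a common due date and the weighted late work objective. The LPT key (longest processing time first) is standard; the "density" used for LDF is assumed here to be weight divided by processing time, since the paper's exact definition is not reproduced in this record.

```python
# Priority-based list scheduling on m parallel identical machines, then evaluation of
# weighted late work against a common due date. The LDF key below is an assumption.
import heapq

def schedule(jobs, m, key):
    """jobs: list of (processing_time, weight). Returns start time per job index."""
    order = sorted(range(len(jobs)), key=lambda j: key(jobs[j]), reverse=True)
    machines = [(0.0, i) for i in range(m)]        # (current load, machine id)
    heapq.heapify(machines)
    starts = {}
    for j in order:
        load, mid = heapq.heappop(machines)        # always take the least-loaded machine
        starts[j] = load
        heapq.heappush(machines, (load + jobs[j][0], mid))
    return starts

def weighted_late_work(jobs, starts, due_date):
    """Sum of weight * (portion of the job processed after the common due date)."""
    total = 0.0
    for j, (p, w) in enumerate(jobs):
        completion = starts[j] + p
        total += w * min(p, max(0.0, completion - due_date))
    return total

jobs = [(4, 2), (3, 5), (6, 1), (2, 4), (5, 3)]    # (processing time, weight)
lpt = schedule(jobs, m=2, key=lambda job: job[0])              # LPT: longest time first
ldf = schedule(jobs, m=2, key=lambda job: job[1] / job[0])     # assumed LDF density
print(weighted_late_work(jobs, lpt, due_date=7),
      weighted_late_work(jobs, ldf, due_date=7))
```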

  19. Creative Commons

    DEFF Research Database (Denmark)

    Jensen, Lone

    2006-01-01

    A Creative Commons licence allows an author to offer his or her work under an alternative licensing arrangement that sits at one of several steps on a scale between the extremes "All rights reserved" and "No rights reserved". The result is the licence "Some rights reserved"...

  20. Science commons

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    SCP: Creative Commons licensing for open access publishing, Open Access Law journal-author agreements for converting journals to open access, and the Scholar's Copyright Addendum Engine for retaining rights to self-archive in meaningful formats and locations for future re-use. More than 250 science and technology journals already publish under Creative Commons licensing while 35 law journals utilize the Open Access Law agreements. The Addendum Engine is a new tool created in partnership with SPARC and U.S. universities. View John Wilbanks's biography

  1. Common approach to common interests

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-06-01

    In referring to issues confronting the energy field in this region and options to be exercised in the future, I would like to mention the fundamental condition of the utmost importance. That can be summed up as follows: no subject in the energy area can be solved by one country alone, given the geographical and geopolitical characteristics intrinsically possessed by energy. So, a regional approach is needed and it is especially necessary for the main players in the region to jointly address problems common to them. Though it may be a matter to be pursued in the distant future, I am personally dreaming of a 'Common Energy Market for Northeast Asia,' in which member countries' interests are adjusted so that the market can be integrated and the region can become a most economically efficient market, thus forming an effective power to face the outside. It should be noted that Europe needed forty years to integrate its market as the unified common market. It is necessary for us to follow a number of steps over the period to eventually materialize our common market concept, too. Now is the time for us to take a first step to lay the foundation for our descendants to enjoy prosperity from such a common market.

  2. Common envelope evolution

    NARCIS (Netherlands)

    Taam, Ronald E.; Ricker, Paul M.

    2010-01-01

    The common envelope phase of binary star evolution plays a central role in many evolutionary pathways leading to the formation of compact objects in short period systems. Using three dimensional hydrodynamical computations, we review the major features of this evolutionary phase, focusing on the

  3. Making the Common Good Common

    Science.gov (United States)

    Chase, Barbara

    2011-01-01

    How are independent schools to be useful to the wider world? Beyond their common commitment to educate their students for meaningful lives in service of the greater good, can they educate a broader constituency and, thus, share their resources and skills more broadly? Their answers to this question will be shaped by their independence. Any…

  4. Computed tomography dosimeter utilizing a radiochromic film and an optical common-mode rejection: characterization and calibration of the GafChromic XRCT film

    International Nuclear Information System (INIS)

    Ohuchi, H.; Abe, M.

    2008-01-01

    Gafchromic XRCT radiochromic film is a self-developing, high-sensitivity radiochromic film product that can be used for assessing delivered radiation doses in applications such as computed tomography (CT) dosimetry. The film automatically changes colour upon irradiation, from amber to dark greenish-black, depending on the level of exposure. The absorption spectra of Gafchromic XRCT film, measured with reflectance spectrophotometry, have been investigated to analyse the dosimetric characteristics of the film. The results show two main absorption peaks produced by irradiation, located at around 630 nm and 580 nm. We employed a commercially available optical flatbed scanner for digitization of the film, together with image analysis software, to determine the response of the XRCT films to ionizing radiation. Two dose-response curves were obtained as a function of delivered dose, ranging from 1.069 to 119.7 mGy, for tube voltages of 80, 100, and 120 kV, from films scanned 24 hours after exposure. One represents the net optical density obtained with the conventional analysis using only the red component; the other shows the net reduced OD obtained with the optical CMR scheme we developed, which uses the red and green components. The ODs measured with the optical CMR scheme show good consistency among four samples, and all values follow a second-order polynomial fit down to below 1 mGy, whereas those from the conventional analysis exhibit a large discrepancy among the four samples and do not follow a second-order polynomial fit below 1 mGy. This result, combined with the film's energy independence over the 80 kV to 120 kV X-ray range, provides a unique enhancement in dosimetric measurement capability, such as the acquisition of high-spatial-resolution, calibrated radiation dose profiles, over currently available dosimetry films for CT applications. (author)
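
    To illustrate the arithmetic behind the two response curves, the sketch below computes a net optical density per colour channel and a CMR-style value formed from the red and green channels, then fits the second-order polynomial calibration mentioned above. The channel combination (red OD minus green OD) and all numbers are assumptions for illustration, not the authors' exact scheme or data.

```python
# Illustrative net-optical-density arithmetic for a flatbed-scanned radiochromic film.
# OD = log10(I_unexposed / I_exposed) per colour channel; the CMR-style value below
# subtracts the green-channel OD from the red-channel OD to cancel common-mode
# variations -- an assumed form, not necessarily the published scheme.
import numpy as np

def net_od(pixel_unexposed, pixel_exposed):
    return np.log10(pixel_unexposed / pixel_exposed)

def cmr_od(red_unexp, red_exp, green_unexp, green_exp):
    return net_od(red_unexp, red_exp) - net_od(green_unexp, green_exp)

# Second-order polynomial calibration dose(OD) = a*OD^2 + b*OD + c,
# fitted from films irradiated with known doses (all values invented).
known_doses = np.array([1.1, 5.0, 20.0, 60.0, 119.7])       # mGy
measured_od = np.array([0.01, 0.04, 0.15, 0.42, 0.80])      # CMR net OD
a, b, c = np.polyfit(measured_od, known_doses, deg=2)
print(np.polyval([a, b, c], 0.25))                           # estimated dose at OD = 0.25
```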

  5. Algorithms for computing parsimonious evolutionary scenarios for genome evolution, the last universal common ancestor and dominance of horizontal gene transfer in the evolution of prokaryotes

    Directory of Open Access Journals (Sweden)

    Galperin Michael Y

    2003-01-01

    Full Text Available Abstract Background Comparative analysis of sequenced genomes reveals numerous instances of apparent horizontal gene transfer (HGT), at least in prokaryotes, and indicates that lineage-specific gene loss might have been even more common in evolution. This complicates the notion of a species tree, which needs to be re-interpreted as a prevailing evolutionary trend, rather than the full depiction of evolution, and makes reconstruction of ancestral genomes a non-trivial task. Results We addressed the problem of constructing parsimonious scenarios for individual sets of orthologous genes given a species tree. The orthologous sets were taken from the database of Clusters of Orthologous Groups of proteins (COGs). We show that the phyletic patterns (patterns of presence-absence in completely sequenced genomes) of almost 90% of the COGs are inconsistent with the hypothetical species tree. Algorithms were developed to reconcile the phyletic patterns with the species tree by postulating gene loss, COG emergence and HGT (the latter two classes of events were collectively treated as gene gains). We prove that each of these algorithms produces a parsimonious evolutionary scenario, which can be represented as mapping of loss and gain events on the species tree. The distribution of the evolutionary events among the tree nodes substantially depends on the underlying assumptions of the reconciliation algorithm, e.g. whether or not independent gene gains (gain after loss after gain) are permitted. Biological considerations suggest that, on average, gene loss might be a more likely event than gene gain. Therefore different gain penalties were used and the resulting series of reconstructed gene sets for the last universal common ancestor (LUCA) of the extant life forms were analysed. The number of genes in the reconstructed LUCA gene sets grows as the gain penalty increases. However, qualitative examination of the LUCA versions reconstructed with different gain penalties
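
    As a hedged, Sankoff-style illustration of the kind of reconciliation described above, the sketch below finds, for a single phyletic (presence/absence) pattern and a given species tree, the minimum total cost of ancestral state assignments when a gain costs a configurable penalty and a loss costs 1. The tree encoding, cost convention, and treatment of the root are assumptions and do not reproduce the published algorithms.

```python
# Weighted parsimony for one presence/absence pattern on a rooted species tree.
# State 1 = gene present, 0 = absent; a gain (0 -> 1 on an edge) costs gain_penalty,
# a loss (1 -> 0) costs 1. Illustrative sketch only.

def min_cost(tree, leaf_states, gain_penalty=2.0, root="root"):
    """tree: dict node -> list of children (leaves absent or mapped to []).
    leaf_states: dict leaf -> 0/1. Returns the minimal total event cost."""
    INF = float("inf")

    def edge_cost(parent_state, child_state):
        if parent_state == child_state:
            return 0.0
        return gain_penalty if child_state == 1 else 1.0   # gain vs loss

    def cost(node):
        children = tree.get(node, [])
        if not children:                        # leaf: only the observed state is allowed
            s = leaf_states[node]
            return {s: 0.0, 1 - s: INF}
        c = {0: 0.0, 1: 0.0}
        for child in children:
            child_costs = cost(child)
            for s in (0, 1):                    # best child state for each parental state
                c[s] += min(child_costs[t] + edge_cost(s, t) for t in (0, 1))
        return c

    root_costs = cost(root)
    return min(root_costs[0], root_costs[1])    # root may be in either state

tree = {"root": ["A", "anc1"], "anc1": ["B", "C"]}
print(min_cost(tree, {"A": 1, "B": 0, "C": 1}, gain_penalty=2.0))   # -> 1.0 (one loss in B)
```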

  6. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Computed Tomography (CT) - Sinuses. Computed tomography (CT) of the sinuses ... CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  7. Satellite tagging of rehabilitated green sea turtles Chelonia mydas from the United Arab Emirates, including the longest tracked journey for the species.

    Science.gov (United States)

    Robinson, David P; Jabado, Rima W; Rohner, Christoph A; Pierce, Simon J; Hyland, Kevin P; Baverstock, Warren R

    2017-01-01

    We collected movement data for eight rehabilitated and satellite-tagged green sea turtles Chelonia mydas released off the United Arab Emirates between 2005 and 2013. Rehabilitation periods ranged from 96 to 1353 days (mean = 437 ± 399 days). Seven of the eight tagged turtles survived after release; one turtle was killed by what is thought to be a post-release spear gun wound. The majority of turtles (63%) used shallow-water core habitats and established home ranges between Dubai and Abu Dhabi, the same area in which they had originally washed ashore prior to rescue. Four turtles made movements across international boundaries, highlighting that regional cooperation is necessary for the management of the species. One turtle swam from Fujairah to the Andaman Sea, a total distance of 8283 km, which is the longest published track of a green turtle. This study demonstrates that sea turtles can be successfully reintroduced into the wild after sustaining serious injury and undergoing prolonged periods of intense rehabilitation.

  8. Amyloid beta and the longest-lived rodent: the naked mole-rat as a model for natural protection from Alzheimer's disease.

    Science.gov (United States)

    Edrey, Yael H; Medina, David X; Gaczynska, Maria; Osmulski, Pawel A; Oddo, Salvatore; Caccamo, Antonella; Buffenstein, Rochelle

    2013-10-01

    Amyloid beta (Aβ) is implicated in Alzheimer's disease (AD) as an integral component of both neural toxicity and plaque formation. Brains of the longest-lived rodents, naked mole-rats (NMRs) approximately 32 years of age, had levels of Aβ similar to those of the 3xTg-AD mouse model of AD. Interestingly, there was no evidence of extracellular plaques, nor was there an age-related increase in Aβ levels in the individuals examined (2-20+ years). The NMR Aβ peptide showed greater homology to the human sequence than to the mouse sequence, differing by only 1 amino acid from the former. This subtle difference led to interspecies differences in aggregation propensity but not neurotoxicity; NMR Aβ was less prone to aggregation than human Aβ. Nevertheless, both NMR and human Aβ were equally toxic to mouse hippocampal neurons, suggesting that Aβ neurotoxicity and aggregation properties were not coupled. Understanding how NMRs acquire and tolerate high levels of Aβ with no plaque formation could provide useful insights into AD, and may elucidate protective mechanisms that delay AD progression. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Age group athletes in inline skating: decrease in overall and increase in master athlete participation in the longest inline skating race in Europe - the Inline One-Eleven.

    Science.gov (United States)

    Teutsch, Uwe; Knechtle, Beat; Rüst, Christoph Alexander; Rosemann, Thomas; Lepers, Romuald

    2013-01-01

    Participation and performance trends in age group athletes have been investigated in endurance and ultraendurance races in swimming, cycling, running, and triathlon, but not in long-distance inline skating. The aim of this study was to investigate trends in participation, age, and performance in the longest inline race in Europe, the Inline One-Eleven over 111 km, held between 1998 and 2009. The total number, age distribution, age at the time of the competition, and race times of male and female finishers at the Inline One-Eleven were analyzed. Overall participation increased until 2003 but decreased thereafter. During the 12-year period, the relative participation in skaters younger than 40 years old decreased while relative participation increased for skaters older than 40 years. The mean top ten skating time was 199 ± 9 minutes (range: 189-220 minutes) for men and 234 ± 17 minutes (range: 211-271 minutes) for women, respectively. The gender difference in performance remained stable at 17% ± 5% across years. To summarize, although the participation of master long-distance inline skaters increased, the overall participation decreased across years in the Inline One-Eleven. The race times of the best female and male skaters stabilized across years with a gender difference in performance of 17% ± 5%. Further studies should focus on the participation in the international World Inline Cup races.

  10. The Tragedy of the Commons

    Science.gov (United States)

    Short, Daniel

    2016-01-01

    The tragedy of the commons is one of the principal tenets of ecology. Recent developments in experiential computer-based simulation of the tragedy of the commons are described. A virtual learning environment is developed using the popular video game "Minecraft". The virtual learning environment is used to experience first-hand depletion…

  11. Free Boomerang-shaped Extended Rectus Abdominis Myocutaneous flap: The longest possible skin/myocutaneous free flap for soft tissue reconstruction of extremities

    Directory of Open Access Journals (Sweden)

    Ashok R Koul

    2011-01-01

    Full Text Available Background: A soft tissue defect requiring flap cover which is longer than that provided by the conventional "long" free flaps like latissimus dorsi (LD) and anterolateral thigh (ALT) flap is a challenging problem. Often, in such a situation, a combination of flaps is required. Over the last 3 years, we have managed nine such defects successfully with a free "Boomerang-shaped" Extended Rectus Abdominis Myocutaneous (BERAM) flap. This flap is the slightly modified and "free" version of a similar flap described by Ian Taylor in 1983. Materials and Methods: This is a retrospective study of patients who underwent free BERAM flap reconstruction of soft tissue defects of extremity over the last 3 years. We also did a clinical study on 30 volunteers to compare the length of flap available using our design of BERAM flap with the maximum available flap length of LD and ALT flaps, using standard markings. Results: Our clinical experience of nine cases combined with the results of our clinical study has confirmed that our design of BERAM flap consistently provides a flap length which is 32.6% longer than the standard LD flap and 42.2% longer than the standard ALT flap in adults. The difference is even more marked in children. The BERAM flap is consistently reliable as long as the distal end is not extended beyond the mid-axillary line. Conclusion: BERAM flap is simple in design, easy to harvest, reliable and provides the longest possible free skin/myocutaneous flap in the body. It is a useful new alternative for covering long soft tissue defects in the limbs.

  12. Free Boomerang-shaped Extended Rectus Abdominis Myocutaneous flap: The longest possible skin/myocutaneous free flap for soft tissue reconstruction of extremities.

    Science.gov (United States)

    Koul, Ashok R; Nahar, Sushil; Prabhu, Jagdish; Kale, Subhash M; Kumar, Praveen H P

    2011-09-01

    A soft tissue defect requiring flap cover which is longer than that provided by the conventional "long" free flaps like latissimus dorsi (LD) and anterolateral thigh (ALT) flap is a challenging problem. Often, in such a situation, a combination of flaps is required. Over the last 3 years, we have managed nine such defects successfully with a free "Boomerang-shaped" Extended Rectus Abdominis Myocutaneous (BERAM) flap. This flap is the slightly modified and "free" version of a similar flap described by Ian Taylor in 1983. This is a retrospective study of patients who underwent free BERAM flap reconstruction of soft tissue defects of extremity over the last 3 years. We also did a clinical study on 30 volunteers to compare the length of flap available using our design of BERAM flap with the maximum available flap length of LD and ALT flaps, using standard markings. Our clinical experience of nine cases combined with the results of our clinical study has confirmed that our design of BERAM flap consistently provides a flap length which is 32.6% longer than the standard LD flap and 42.2% longer than the standard ALT flap in adults. The difference is even more marked in children. The BERAM flap is consistently reliable as long as the distal end is not extended beyond the mid-axillary line. BERAM flap is simple in design, easy to harvest, reliable and provides the longest possible free skin/myocutaneous flap in the body. It is a useful new alternative for covering long soft tissue defects in the limbs.

  13. And the beat goes on: maintained cardiovascular function during aging in the longest-lived rodent, the naked mole-rat.

    Science.gov (United States)

    Grimes, Kelly M; Reddy, Anilkumar K; Lindsey, Merry L; Buffenstein, Rochelle

    2014-08-01

    The naked mole-rat (NMR) is the longest-lived rodent known, with a maximum lifespan potential (MLSP) of >31 years. Despite such extreme longevity, these animals display attenuation of many age-associated diseases and functional changes until the last quartile of their MLSP. We questioned if such abilities would extend to cardiovascular function and structure in this species. To test this, we assessed cardiac functional reserve, ventricular morphology, and arterial stiffening in NMRs ranging from 2 to 24 years of age. Dobutamine echocardiography (3 μg/g ip) revealed no age-associated changes in left ventricular (LV) function either at baseline or with exercise-like stress. Baseline and dobutamine-induced LV pressure parameters also did not change. Thus the NMR, unlike other mammals, maintains cardiac reserve with age. NMRs showed no cardiac hypertrophy, evidenced by no increase in cardiomyocyte cross-sectional area or LV dimensions with age. Age-associated arterial stiffening does not occur since there are no changes in aortic blood pressures or pulse-wave velocity. Only LV interstitial collagen deposition increased 2.5-fold from young to old NMRs (P < 0.01). However, its effect on LV diastolic function is likely minor since NMRs experience attenuated age-related increases in diastolic dysfunction in comparison with other species. Overall, these findings conform to the negligible senescence phenotype, as NMRs largely stave off cardiovascular changes for at least 75% of their MLSP. This suggests that using a comparative strategy to find factors that change with age in other mammals but not NMRs could provide novel targets to slow or prevent cardiovascular aging in humans.

  14. The common ancestry of life

    Directory of Open Access Journals (Sweden)

    Wolf Yuri I

    2010-11-01

    Full Text Available Abstract Background It is a common belief that all cellular life forms on earth have a common origin. This view is supported by the universality of the genetic code and the universal conservation of multiple genes, particularly those that encode key components of the translation system. A remarkable recent study claims to provide a formal, homology-independent test of the Universal Common Ancestry hypothesis by comparing the ability of a common-ancestry model and a multiple-ancestry model to predict sequences of universally conserved proteins. Results We devised a computational experiment on a concatenated alignment of universally conserved proteins which shows that the purported demonstration of the universal common ancestry is a trivial consequence of significant sequence similarity between the analyzed proteins. The nature and origin of this similarity are irrelevant for the prediction of "common ancestry" by the model-comparison approach. Thus, homology (common origin) of the compared proteins remains an inference from sequence similarity rather than an independent property demonstrated by the likelihood analysis. Conclusion A formal demonstration of the Universal Common Ancestry hypothesis has not been achieved and is unlikely to be feasible in principle. Nevertheless, the evidence in support of this hypothesis provided by comparative genomics is overwhelming. Reviewers: this article was reviewed by William Martin, Ivan Iossifov (nominated by Andrey Rzhetsky), and Arcady Mushegian. For the complete reviews, see the Reviewers' Report section.

  15. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  16. Common Readout System in ALICE

    CERN Document Server

    Jubin, Mitra

    2016-01-01

    The ALICE experiment at the CERN Large Hadron Collider is preparing for a major physics upgrade in 2018. This upgrade is necessary for getting the high statistics and high precision measurements needed for probing into rare physics channels and understanding the dynamics of the condensed phase of QCD. The high interaction rate and the large event size in the upgraded detectors will result in an experimental data flow traffic of about 1 TB/s from the detectors to the on-line computing system. A dedicated Common Readout Unit (CRU) is proposed for data concentration, multiplexing, and trigger distribution. The CRU, as a common interface unit, handles timing, data and control signals between the on-detector systems and the online-offline computing system. An overview of the CRU architecture is presented in this manuscript.

  17. Common Readout System in ALICE

    CERN Document Server

    Jubin, Mitra

    2017-01-01

    The ALICE experiment at the CERN Large Hadron Collider is preparing for a major physics upgrade in 2018. This upgrade is necessary for getting the high statistics and high precision measurements needed for probing into rare physics channels and understanding the dynamics of the condensed phase of QCD. The high interaction rate and the large event size in the upgraded detectors will result in an experimental data flow traffic of about 1 TB/s from the detectors to the on-line computing system. A dedicated Common Readout Unit (CRU) is proposed for data concentration, multiplexing, and trigger distribution. The CRU, as a common interface unit, handles timing, data and control signals between the on-detector systems and the online-offline computing system. An overview of the CRU architecture is presented in this manuscript.

  18. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... are the limitations of CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed ... nasal cavity by small openings. top of page What are some common uses of the procedure? CT ...

  19. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to keep updated. Computer people do not

  20. Common Misconceptions about Cholesterol

    Science.gov (United States)

    ... Common Misconceptions about Cholesterol. How much do you ... are some common misconceptions — and the truth. High cholesterol isn’t a concern for children. High cholesterol ...

  1. How Common Is PTSD?

    Science.gov (United States)

    ... How Common Is PTSD? This section is for Veterans, General Public, ...

  2. Children's (Pediatric) CT (Computed Tomography)

    Medline Plus

    Full Text Available ... Z Children's (Pediatric) CT (Computed Tomography) Pediatric computed tomography (CT) is a fast, painless exam that uses special ... the limitations of Children's CT? What is Children's CT? Computed tomography, more commonly known as a CT or CAT ...

  3. Children's (Pediatric) CT (Computed Tomography)

    Medline Plus

    Full Text Available ... News Physician Resources Professions Site Index A-Z Children's (Pediatric) CT (Computed Tomography) Pediatric computed tomography (CT) ... are the limitations of Children's CT? What is Children's CT? Computed tomography, more commonly known as a ...

  4. Common cause failure analysis methodology for complex systems

    International Nuclear Information System (INIS)

    Wagner, D.P.; Cate, C.L.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complex system reliability analysis. This paper extends existing methods of computer aided common cause failure analysis by allowing analysis of the complex systems often encountered in practice. The methods presented here aid in identifying potential common cause failures and also address quantitative common cause failure analysis
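
    One step commonly found in computer-aided common cause failure analysis is screening the minimal cut sets of a fault tree for components that share a susceptibility (for example the same location or manufacturer), since such a cut set could be defeated by a single shared cause. The sketch below illustrates only that screening idea; the attribute names and data are invented and it is not the method of the paper.

```python
# Illustrative screening of minimal cut sets for common-cause candidates:
# a multi-component cut set whose components all share a susceptibility value
# (same location, same manufacturer, ...) is flagged. Field names are assumptions.
components = {
    "PUMP_A":  {"location": "room1", "maker": "X"},
    "PUMP_B":  {"location": "room1", "maker": "Y"},
    "VALVE_1": {"location": "room2", "maker": "X"},
}
minimal_cut_sets = [{"PUMP_A", "PUMP_B"}, {"PUMP_A", "VALVE_1"}]

def common_cause_candidates(cut_sets, attrs, susceptibilities=("location", "maker")):
    """Return (components, attribute, shared value) for every cut set whose
    components all share the same value of some susceptibility attribute."""
    hits = []
    for cs in cut_sets:
        if len(cs) < 2:
            continue
        for attr in susceptibilities:
            values = {attrs[c][attr] for c in cs}
            if len(values) == 1:
                hits.append((sorted(cs), attr, values.pop()))
    return hits

print(common_cause_candidates(minimal_cut_sets, components))
# e.g. [(['PUMP_A', 'PUMP_B'], 'location', 'room1'), (['PUMP_A', 'VALVE_1'], 'maker', 'X')]
```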

  5. Common Law and Un-common Sense

    OpenAIRE

    Ballard, Roger

    2000-01-01

    This paper examines the practical and conceptual differences which arise when juries are invited to apply their common sense in assessing reasonable behaviour in the midst of an ethnically plural society. The author explores the conundrums which the increasing salience of ethnic pluralism has now begun to pose in legal terms, most especially with respect to the organisation of a system for the equitable administration and delivery of justice in the context of an increasingly heterogeneous society. ...

  6. The common good

    OpenAIRE

    Argandoña, Antonio

    2011-01-01

    The concept of the common good occupied a relevant place in classical social, political and economic philosophy. After losing ground in the Modern age, it has recently reappeared, although with different and sometimes confusing meanings. This paper is the draft of a chapter of a Handbook; it explains the meaning of common good in the Aristotelian-Thomistic philosophy and in the Social Doctrine of the Catholic Church; why the common good is relevant; and how it is different from the other uses...

  7. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Head Computed tomography (CT) of the head uses special x-ray ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT or CAT ...

  8. Efektivitas Instagram Common Grounds

    OpenAIRE

    Wifalin, Michelle

    2016-01-01

    The effectiveness of the Common Grounds Instagram account is the research problem addressed in this study. Instagram effectiveness is measured using the Customer Response Index (CRI), in which respondents are assessed at several levels, from awareness, comprehension, and interest to intentions and action. These levels of response are used to measure the effectiveness of the Common Grounds Instagram account. The theories used to support this study are marketing Public Relations theory, advertising theory, effecti...

  9. Common Variable Immunodeficiency (CVID)

    Science.gov (United States)


  10. Genomic Data Commons launches

    Science.gov (United States)

    The Genomic Data Commons (GDC), a unified data system that promotes sharing of genomic and clinical data between researchers, launched today with a visit from Vice President Joe Biden to the operations center at the University of Chicago.

  11. Common Mental Health Issues

    Science.gov (United States)

    Stock, Susan R.; Levine, Heidi

    2016-01-01

    This chapter provides an overview of common student mental health issues and approaches for student affairs practitioners who are working with students with mental illness, and ways to support the overall mental health of students on campus.

  12. Five Common Glaucoma Tests

    Science.gov (United States)

    ... Five Common Glaucoma Tests ... year or two after age 35. A Comprehensive Glaucoma Exam. To be safe and accurate, five factors ...

  13. Common symptoms during pregnancy

    Science.gov (United States)

    ... keep your gums healthy Swelling, Varicose Veins, and Hemorrhoids Swelling in your legs is common. You may ... In your rectum, veins that swell are called hemorrhoids. To reduce swelling: Raise your legs and rest ...

  14. The Common Good

    DEFF Research Database (Denmark)

    Feldt, Liv Egholm

    At present voluntary and philanthropic organisations are experiencing significant public attention and academic discussions about their role in society. Central to the debate is on one side the question of how they contribute to “the common good”, and on the other the question of how they can avoid...... and concepts continuously over time have blurred the different sectors and “polluted” contemporary definitions of the “common good”. The analysis shows that “the common good” is not an autonomous concept owned or developed by specific spheres of society. The analysis stresses that historically, “the common...... good” has always been a contested concept. It is established through messy and blurred heterogeneity of knowledge, purposes and goal achievements originating from a multitude of scientific, religious, political and civil society spheres contested not only in terms of words and definitions but also...

  15. Childhood Obesity: Common Misconceptions

    Science.gov (United States)

    ... Childhood Obesity: Common Misconceptions. Everyone, it ... for less than 1% of the cases of childhood obesity. Yes, hypothyroidism (a deficit in thyroid secretion) and ...

  16. Common Childhood Orthopedic Conditions

    Science.gov (United States)

    ... pain. Toe Walking. Toe walking is common among toddlers as they learn to walk, especially during the ...

  17. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics

  18. Right-sided duplication of the inferior vena cava and the common iliac vein: hidden hinds in spiral-computed tomography; Rechtsseitige Dopplung der Vena cava inferior und Vena iliaca communis: Bildgebung mit der Spiral-Computertomographie

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, D.R.; Friedrich, M. [Krankenhaus am Urban (Germany). Abt. fuer Roentgendiagnostik und Nuklearmedizin; Andresen, R. [Staedtisches Krankenhaus Zehlendorf, Behring (Germany). Abt. fuer Roentgendiagnostik und Nuklearmedizin

    1998-05-01

    Duplications of the inferior vena cava (IVC) are rare variants of the abdominal vessels and are normally located on both sides of the abdominal aorta. The rare case of a right-sided infrarenal duplication of the IVC with involvement of the common iliac vein is reported. Details of the embryology are presented for the understanding of this IVC variant. Spiral CT with multiplanar reconstructions makes it possible to define the vascular morphology and to differentiate it from lymphoma. (orig.) [Deutsch] Duplications of the inferior vena cava (IVC) are rare abdominal vascular variants, usually lying on both sides of the abdominal aorta. The unusual case of a right-sided infrarenal duplication of the IVC with involvement of the common iliac vein is presented. The embryology is discussed as far as is necessary for understanding the present IVC variant. Spiral CT with multiplanar reconstructions allows a morphological description of the vascular situation and the differentiation from lymphoma. (orig.)

  19. Computer Lexis and Terminology

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2011-04-01

    Full Text Available The computer has become a widely used tool in everyday work and at home. Every computer user sees texts on the screen containing many words that name new concepts. Those words come from the terminology used by specialists. A common vocabulary shared by computer terminology and the lexis of everyday language thus comes into existence. The article deals with the part of computer terminology that passes into everyday usage and with the influence of ordinary language on computer terminology. The relation between English and Lithuanian computer terminology and the construction and pronunciation of acronyms are discussed as well.

  20. Algorithms for solving common fixed point problems

    CERN Document Server

    Zaslavski, Alexander J

    2018-01-01

    This book details approximate solutions to common fixed point problems and convex feasibility problems in the presence of perturbations. Convex feasibility problems search for a common point of a finite collection of subsets in a Hilbert space; common fixed point problems pursue a common fixed point of a finite collection of self-mappings in a Hilbert space. A variety of algorithms are considered in this book for solving both types of problems, the study of which has fueled a rapidly growing area of research. This monograph is timely and highlights the numerous applications to engineering, computed tomography, and radiation therapy planning. Totaling eight chapters, this book begins with an introduction to foundational material and moves on to examine iterative methods in metric spaces. The dynamic string-averaging methods for common fixed point problems in normed space are analyzed in Chapter 3. Dynamic string methods, for common fixed point problems in a metric space are introduced and discussed in Chapter ...
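
    As a small illustration of the class of iterative methods the book studies, the sketch below runs a string-averaging projection scheme on a toy convex feasibility problem in the plane (two halfspaces and a disc): each string applies its projections in sequence and the string endpoints are averaged. The sets, strings, and parameters are invented for illustration and are not taken from the book.

```python
# String-averaging projections for a toy convex feasibility problem in R^2:
# find a point in the intersection of {x <= 1}, {y <= 1} and a disc of radius 1.5.
import numpy as np

def project_halfspace(x, a, b):              # projection onto {y : a.y <= b}
    viol = a @ x - b
    return x if viol <= 0 else x - viol * a / (a @ a)

def project_disc(x, center, r):              # projection onto a closed disc
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= r else center + r * d / n

projections = [
    lambda x: project_halfspace(x, np.array([1.0, 0.0]), 1.0),
    lambda x: project_halfspace(x, np.array([0.0, 1.0]), 1.0),
    lambda x: project_disc(x, np.array([0.0, 0.0]), 1.5),
]
strings = [[0, 2], [1, 2]]                   # two strings of projection indices

x = np.array([3.0, 4.0])                     # infeasible starting point
for _ in range(50):
    endpoints = []
    for s in strings:
        y = x
        for i in s:                          # apply the string's projections in order
            y = projections[i](y)
        endpoints.append(y)
    x = np.mean(endpoints, axis=0)           # average the string endpoints
print(x)                                     # approaches a common (feasible) point
```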

  1. Age group athletes in inline skating: decrease in overall and increase in master athlete participation in the longest inline skating race in Europe – the Inline One-Eleven

    Science.gov (United States)

    Teutsch, Uwe; Knechtle, Beat; Rüst, Christoph Alexander; Rosemann, Thomas; Lepers, Romuald

    2013-01-01

    Background Participation and performance trends in age group athletes have been investigated in endurance and ultraendurance races in swimming, cycling, running, and triathlon, but not in long-distance inline skating. The aim of this study was to investigate trends in participation, age, and performance in the longest inline race in Europe, the Inline One-Eleven over 111 km, held between 1998 and 2009. Methods The total number, age distribution, age at the time of the competition, and race times of male and female finishers at the Inline One-Eleven were analyzed. Results Overall participation increased until 2003 but decreased thereafter. During the 12-year period, the relative participation in skaters younger than 40 years old decreased while relative participation increased for skaters older than 40 years. The mean top ten skating time was 199 ± 9 minutes (range: 189–220 minutes) for men and 234 ± 17 minutes (range: 211–271 minutes) for women, respectively. The gender difference in performance remained stable at 17% ± 5% across years. Conclusion To summarize, although the participation of master long-distance inline skaters increased, the overall participation decreased across years in the Inline One-Eleven. The race times of the best female and male skaters stabilized across years with a gender difference in performance of 17% ± 5%. Further studies should focus on the participation in the international World Inline Cup races. PMID:23690697

  2. Seroprevalence of HCV and HIV infection among clients of the nation's longest-standing statewide syringe exchange program: A cross-sectional study of Community Health Outreach Work to Prevent AIDS (CHOW).

    Science.gov (United States)

    Salek, Thomas P; Katz, Alan R; Lenze, Stacy M; Lusk, Heather M; Li, Dongmei; Des Jarlais, Don C

    2017-10-01

    The Community Health Outreach Work to Prevent AIDS (CHOW) Project is the first and longest-standing statewide integrated and funded needle and syringe exchange program (SEP) in the US. Initiated on O'ahu in 1990, CHOW expanded statewide in 1993. The purpose of this study is to estimate the prevalences of hepatitis C virus (HCV) and human immunodeficiency virus (HIV) infection, and to characterize risk behaviors associated with infection among clients of a long-standing SEP through the analysis of the 2012 CHOW evaluation data. A cross-sectional sample of 130 CHOW Project clients was selected from January 1, 2012 through December 31, 2012. Questionnaires captured self-reported exposure information. HIV and HCV antibodies were detected via rapid, point-of-care FDA-approved tests. Log-binomial regressions were used to estimate prevalence proportion ratios (PPRs). A piecewise linear log-binomial regression model containing 1 spline knot was used to fit the age-HCV relationship. The estimated seroprevalence of HCV was 67.7% (95% confidence interval [CI]=59.5-75.8%). HIV seroprevalence was 2.3% (95% CI=0-4.9%). Anti-HCV prevalence demonstrated age-specific patterns, ranging from 31.6% through 90.9% in people who inject drugs (PWID). The much lower HIV prevalence compared with HCV prevalence reflects differences in transmissibility of these 2 blood-borne pathogens and suggests much greater efficacy of SEP for HIV prevention. Copyright © 2017 Elsevier B.V. All rights reserved.
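
    The effect measure reported above is the prevalence proportion ratio (PPR), which the study estimated with log-binomial regression. The sketch below only computes the crude two-group version of that ratio, with invented counts, to show the arithmetic of the measure rather than the regression model itself.

```python
# Crude prevalence proportion ratio (PPR) between an exposed and an unexposed group.
# Counts are invented for illustration; the study itself used log-binomial regression.
def prevalence(cases, n):
    return cases / n

def prevalence_proportion_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    return prevalence(cases_exposed, n_exposed) / prevalence(cases_unexposed, n_unexposed)

# Hypothetical example: anti-HCV positivity among older vs younger PWID.
print(prevalence_proportion_ratio(cases_exposed=45, n_exposed=50,
                                  cases_unexposed=19, n_unexposed=60))   # ~2.84
```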

  3. Common Ground and Delegation

    DEFF Research Database (Denmark)

    Dobrajska, Magdalena; Foss, Nicolai Juul; Lyngsie, Jacob

    preconditions of increasing delegation. We argue that key HR practices, namely hiring, training, and job rotation, are associated with delegation of decision-making authority. These practices assist in the creation of shared knowledge conditions between managers and employees. In turn, such a "common ground" influences the confidence with which managers delegate decision authority to employees, as managers improve their knowledge of the educational background, firm-specific knowledge, and perhaps even the possible actions of those to whom they delegate such authority. To test these ideas, we match a large-scale questionnaire survey with unique population-wide employer-employee data. We find evidence of a direct and positive influence of hiring decisions (proxied by common educational background), and the training and job rotation of employees on delegation. Moreover, we find a positive interaction between common...

  4. Towards common technical standards

    International Nuclear Information System (INIS)

    Rahmat, H.; Suardi, A.R.

    1993-01-01

    In 1989, PETRONAS launched its Total Quality Management (TQM) program. In the same year the decision was taken by the PETRONAS Management to introduce common technical standards group wide. These standards apply to the design, construction, operation and maintenance of all PETRONAS installations in the upstream, downstream and petrochemical sectors. The introduction of common company standards is seen as part of an overall technical management system, which is an integral part of Total Quality Management. The Engineering and Safety Unit in the PETRONAS Central Office in Kuala Lumpur has been charged with the task of putting in place a set of technical standards throughout PETRONAS and its operating units

  5. COMMON FISCAL POLICY

    Directory of Open Access Journals (Sweden)

    Gabriel Mursa

    2014-08-01

    Full Text Available The purpose of this article is to demonstrate that a common fiscal policy, designed to support the euro currency, has some significant drawbacks. The greatest danger is the possibility of leveling the tax burden in all countries. This leveling of the tax burden works to the disadvantage of the countries of Eastern Europe, which are, in principle, countries poorly endowed with capital that use a lax fiscal policy (Romania, Bulgaria, etc.) to attract foreign investment from the rich countries of the European Union. In addition, a common fiscal policy can lead to a higher degree of centralization of budgetary expenditures in the European Union.

  6. Common Privacy Myths

    Science.gov (United States)

    ... the common myths: Health information cannot be faxed – FALSE Your information may be shared between healthcare providers by faxing ... E-mail cannot be used to transmit health information – FALSE E-mail can be used to transmit information, ...

  7. Common Breastfeeding Challenges

    Science.gov (United States)

    ... Common breastfeeding challenges. Breastfeeding can be ...

  8. Common mistakes of investors

    Directory of Open Access Journals (Sweden)

    Yuen Wai Pong Raymond

    2012-09-01

    Full Text Available Behavioral finance is an actively discussed topic in academic and investment circles. The main reason is that behavioral finance challenges the validity of a cornerstone of modern financial theory: the rationality of investors. In this paper, the common irrational behaviors of investors are discussed

  9. Common tester platform concept.

    Energy Technology Data Exchange (ETDEWEB)

    Hurst, Michael James

    2008-05-01

    This report summarizes the results of a case study on the doctrine of a common tester platform, a concept of a standardized platform that is applicable across the broad spectrum of testing requirements throughout the various stages of a weapons program, as well as across the various weapons programs. The common tester concept strives to define an affordable, next-generation design that will meet testing requirements with the flexibility to grow and expand; supporting the initial development stages of a weapons program through to the final production and surveillance stages. This report discusses a concept that leverages key technologies and operational concepts, combined with prototype tester-development experiences and practical lessons learned from past weapons programs.

  10. Common-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest, M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CROSSER, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556) are used independently of one another. The point of equality between the reliability of the system and the common reliability of its components is found. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
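
    The "point of equality" that CROSSER reports can be illustrated as follows: for a k-out-of-n system whose components share a common reliability p, the cumulative binomial gives the system reliability, and one searches for the p at which the system reliability equals p. The original program is written in C; the sketch below is an independent Python illustration that assumes a k-out-of-n system model and a simple bisection search.

```python
# Crossover between system reliability and common component reliability for a
# k-out-of-n system (assumed model; not the original CROSSER code, which is in C).
from math import comb

def system_reliability(p, k, n):
    """Cumulative binomial: probability that at least k of n components work."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def crossover(k, n, tol=1e-10):
    """Bisection search for p in (0, 1) with system_reliability(p, k, n) == p."""
    lo, hi = 1e-6, 1 - 1e-6
    f = lambda p: system_reliability(p, k, n) - p
    for _ in range(200):
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

print(crossover(2, 3))   # 0.5 for a 2-out-of-3 system
```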

  11. Common anorectal disorders.

    Science.gov (United States)

    Foxx-Orenstein, Amy E; Umar, Sarah B; Crowell, Michael D

    2014-05-01

    Anorectal disorders result in many visits to healthcare specialists. These disorders include benign conditions such as hemorrhoids to more serious conditions such as malignancy; thus, it is important for the clinician to be familiar with these disorders as well as know how to conduct an appropriate history and physical examination. This article reviews the most common anorectal disorders, including hemorrhoids, anal fissures, fecal incontinence, proctalgia fugax, excessive perineal descent, and pruritus ani, and provides guidelines on comprehensive evaluation and management.

  12. Common sense codified

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    At CERN, people of more than a hundred different nationalities and hundreds of different professions work together towards a common goal. The new Code of Conduct is a tool that has been designed to help us keep our workplace pleasant and productive through common standards of behaviour. Its basic principle is mutual respect and common sense. This is only natural, but not trivial…  The Director-General announced it in his speech at the beginning of the year, and the Bulletin wrote about it immediately afterwards. "It" is the new Code of Conduct, the document that lists our Organization's values and describes the basic standards of behaviour that we should both adopt and expect from others. "The Code of Conduct is not going to establish new rights or new obligations," explains Anne-Sylvie Catherin, Head of the Human Resources Department (HR). But what it will do is provide a framework for our existing rights and obligations." The aim of a co...

  13. Common primary headaches in pregnancy

    Directory of Open Access Journals (Sweden)

    Anuradha Mitra

    2015-01-01

    Full Text Available Headache is a very common problem in pregnancy. Evaluation of a complaint of headache requires categorizing it as primary or secondary. Migrainous headaches are known to be influenced by fluctuations of estrogen levels, with high levels improving and low levels worsening the symptoms. Tension-type Headaches (TTHs) are the most common and usually less severe types of headache, with a female-to-male ratio of 3:1. Women known to have primary headache before conception who present with a headache that is different from their usual headache, or women not known to have primary headache before conception who present with new-onset headache during pregnancy, need neurologic assessment for a potential secondary cause of their headache. In addition to proper history and physical examination, both non-contrast computed tomography (CT) and Magnetic Resonance Imaging (MRI) are considered safe to be performed in pregnant women when indicated. Treatment, both abortive and prophylactic, should include non-pharmacologic tools and the judicious use of drugs that are safe for mother and fetus.

  14. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... the limitations of CT Scanning of the Head? What is CT Scanning of the Head? Computed tomography, ... than regular radiographs (x-rays). top of page What are some common uses of the procedure? CT ...

  15. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... of the Head? Computed tomography, more commonly known as a CT or CAT scan, is a diagnostic ... white on the x-ray; soft tissue, such as organs like the heart or liver, shows up ...

  16. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... of the Sinuses? Computed tomography, more commonly known as a CT or CAT scan, is a diagnostic ... white on the x-ray; soft tissue, such as organs like the heart or liver, shows up ...

  17. A heart that beats for 500 years: age-related changes in cardiac proteasome activity, oxidative protein damage and expression of heat shock proteins, inflammatory factors, and mitochondrial complexes in Arctica islandica, the longest-living noncolonial animal.

    Science.gov (United States)

    Sosnowska, Danuta; Richardson, Chris; Sonntag, William E; Csiszar, Anna; Ungvari, Zoltan; Ridgway, Iain

    2014-12-01

    Study of negligibly senescent animals may provide clues that lead to better understanding of the cardiac aging process. To elucidate mechanisms of successful cardiac aging, we investigated age-related changes in proteasome activity, oxidative protein damage and expression of heat shock proteins, inflammatory factors, and mitochondrial complexes in the heart of the ocean quahog Arctica islandica, the longest-lived noncolonial animal (maximum life span potential: 508 years). We found that in the heart of A. islandica the level of oxidatively damaged proteins did not change significantly up to 120 years of age. No significant aging-induced changes were observed in caspase-like and trypsin-like proteasome activity. Chymotrypsin-like proteasome activity showed a significant early-life decline, then it remained stable for up to 182 years. No significant relationship was observed between the extent of protein ubiquitination and age. In the heart of A. islandica, an early-life decline in expression of HSP90 and five mitochondrial electron transport chain complexes was observed. We found significant age-related increases in the expression of three cytokine-like mediators (interleukin-6, interleukin-1β, and tumor necrosis factor-α) in the heart of A. islandica. Collectively, in extremely long-lived molluscs, maintenance of protein homeostasis likely contributes to the preservation of cardiac function. Our data also support the concept that low-grade chronic inflammation in the cardiovascular system is a universal feature of the aging process, which is also manifest in invertebrates. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Age group athletes in inline skating: decrease in overall and increase in master athlete participation in the longest inline skating race in Europe – the Inline One-Eleven

    Directory of Open Access Journals (Sweden)

    Teutsch U

    2013-05-01

    Full Text Available Uwe Teutsch,1 Beat Knechtle,1,2 Christoph Alexander Rüst,1 Thomas Rosemann,1 Romuald Lepers3; 1Institute of General Practice and Health Services Research, University of Zurich, Zurich, Switzerland; 2Gesundheitszentrum St Gallen, St Gallen, Switzerland; 3INSERM U1093, Faculty of Sport Sciences, University of Burgundy, Dijon, France. Background: Participation and performance trends in age group athletes have been investigated in endurance and ultraendurance races in swimming, cycling, running, and triathlon, but not in long-distance inline skating. The aim of this study was to investigate trends in participation, age, and performance in the longest inline race in Europe, the Inline One-Eleven over 111 km, held between 1998 and 2009. Methods: The total number, age distribution, age at the time of the competition, and race times of male and female finishers at the Inline One-Eleven were analyzed. Results: Overall participation increased until 2003 but decreased thereafter. During the 12-year period, the relative participation in skaters younger than 40 years old decreased while relative participation increased for skaters older than 40 years. The mean top ten skating time was 199 ± 9 minutes (range: 189–220 minutes) for men and 234 ± 17 minutes (range: 211–271 minutes) for women, respectively. The gender difference in performance remained stable at 17% ± 5% across years. Conclusion: To summarize, although the participation of master long-distance inline skaters increased, the overall participation decreased across years in the Inline One-Eleven. The race times of the best female and male skaters stabilized across years with a gender difference in performance of 17% ± 5%. Further studies should focus on the participation in the international World Inline Cup races. Keywords: endurance, men, women, gender

  19. Common Vestibular Disorders

    Directory of Open Access Journals (Sweden)

    Dimitrios G. Balatsouras

    2017-01-01

    Full Text Available The three most common vestibular diseases, benign paroxysmal positional vertigo (BPPV), Meniere's disease (MD) and vestibular neuritis (VN), are presented in this paper. BPPV, which is the most common peripheral vestibular disorder, can be defined as transient vertigo induced by a rapid head position change, associated with a characteristic paroxysmal positional nystagmus. Canalolithiasis of the posterior semicircular canal is considered the most convincing theory of its pathogenesis, and the development of appropriate therapeutic maneuvers resulted in its effective treatment. However, involvement of the horizontal or the anterior canal has been found at a significant rate, and the recognition and treatment of these variants completed the clinical picture of the disease. MD is a chronic condition characterized by episodic attacks of vertigo, fluctuating hearing loss, tinnitus, aural pressure and a progressive loss of audiovestibular functions. Presence of endolymphatic hydrops on postmortem examination is its pathologic correlate. MD continues to be a diagnostic and therapeutic challenge. Patients with the disease range from minimally symptomatic, highly functional individuals to severely affected, disabled patients. Current management strategies are designed to control the acute and recurrent vestibulopathy but offer minimal remedy for the progressive cochlear dysfunction. VN is the most common cause of acute spontaneous vertigo, attributed to acute unilateral loss of vestibular function. Key signs and symptoms are an acute onset of spinning vertigo, postural imbalance and nausea, as well as a horizontal rotatory nystagmus beating towards the non-affected side, a pathological head-impulse test and no evidence of central vestibular or ocular motor dysfunction. Vestibular neuritis preferentially involves the superior vestibular labyrinth and its afferents. Symptomatic medication is indicated only during the acute phase to relieve the vertigo and nausea

  20. Common Influence Join

    DEFF Research Database (Denmark)

    Yiu, Man Lung; Mamoulis, Nikos; Karras, Panagiotis

    2008-01-01

    We identify and formalize a novel join operator for two spatial pointsets P and Q. The common influence join (CIJ) returns the pairs of points (p,q), p ∈ P, q ∈ Q, such that there exists a location in space, being closer to p than to any other point in P and at the same time closer to q than ......-demand, is very efficient in practice, incurring only slightly higher I/O cost than the theoretical lower bound cost for the problem....

  1. English for common entrance

    CERN Document Server

    Kossuth, Kornel

    2013-01-01

    Succeed in the exam with this revision guide, designed specifically for the brand new Common Entrance English syllabus. It breaks down the content into manageable and straightforward chunks with easy-to-use, step-by-step instructions that should take away the fear of CE and guide you through all aspects of the exam. - Gives you step-by-step guidance on how to recognise various types of comprehension questions and answer them. - Shows you how to write creatively as well as for a purpose for the section B questions. - Reinforces and consolidates learning with tips, guidance and exercises through

  2. Building the common

    DEFF Research Database (Denmark)

    Agustin, Oscar Garcia

    document, A Common Immigration Policy for Europe: Principles, actions and tools (2008) as a part of the Hague Programme (2004) on actions against terrorism, organised crime and migration and asylum management and influenced by the renewed Lisbon Strategy (2005-2010) for growth and jobs. My aim is to explore...... policy in the European Union is constructed and the categories and themes that are discussed. I will also look at the discourse strategies to show the linguistic representations of the social actors, who are excluded from or included in such representations. I will analyse a European Commission’s policy...

  3. Managing common marital stresses.

    Science.gov (United States)

    Martin, A C; Starling, B P

    1989-10-01

    Marital conflict and divorce are problems of great magnitude in our society, and nurse practitioners are frequently asked by patients to address marital problems in clinical practice. "Family life cycle theory" provides a framework for understanding the common stresses of marital life and for developing nursing strategies to improve marital satisfaction. If unaddressed, marital difficulties have serious adverse consequences for a couple's health, leading to greater dysfunction and a decline in overall wellness. This article focuses on identifying couples in crisis, assisting them to achieve pre-crisis equilibrium or an even higher level of functioning, and providing appropriate referral if complex relationship problems exist.

  4. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...

  5. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  6. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  7. Common Sense Biblical Hermeneutics

    Directory of Open Access Journals (Sweden)

    Michael B. Mangini

    2014-12-01

    Full Text Available Since the noetics of moderate realism provide a firm foundation upon which to build a hermeneutic of common sense, in the first part of his paper the author adopts Thomas Howe’s argument that the noetical aspect of moderate realism is a necessary condition for correct, universally valid biblical interpretation, but he adds, “insofar as it gives us hope in discovering the true meaning of a given passage.” In the second part, the author relies on John Deely’s work to show how semiotics may help interpreters go beyond meaning and seek the significance of the persons, places, events, ideas, etc., of which the meaning of the text has presented as objects to be interpreted. It is in significance that the unity of Scripture is found. The chief aim is what every passage of the Bible signifies. Considered as a genus, Scripture is composed of many parts/species that are ordered to a chief aim. This is the structure of common sense hermeneutics; therefore in the third part the author restates Peter Redpath’s exposition of Aristotle and St. Thomas’s ontology of the one and the many and analogously applies it to the question of how an exegete can discern the proper significance and faithfully interpret the word of God.

  8. True and common balsams

    Directory of Open Access Journals (Sweden)

    Dayana L. Custódio

    2012-08-01

    Full Text Available Balsams have been used since ancient times, due to their therapeutic and healing properties; in the perfume industry, they are used as fixatives, and in the cosmetics industry and in cookery, they are used as preservatives and aromatizers. They are generally defined as vegetable material with highly aromatic properties that supposedly have the ability to heal diseases, not only of the body, but also of the soul. When viewed according to this concept, many substances can be considered balsams. A more modern concept is based on its chemical composition and origin: a secretion or exudate of plants that contain cinnamic and benzoic acids, and their derivatives, in their composition. The most common naturally-occurring balsams (i.e. true balsams) are the Benzoins, Liquid Storaque and the Balsams of Tolu and Peru. Many other aromatic exudates, such as Copaiba Oil and Canada Balsam, are wrongly called balsam. These usually belong to other classes of natural products, such as essential oils, resins and oleoresins. Despite the understanding of some plants, many plants are still called balsams. This article presents a chemical and pharmacological review of the most common balsams.

  9. Computer information systems framework

    International Nuclear Information System (INIS)

    Shahabuddin, S.

    1989-01-01

    Management information systems (MIS) is a commonly used term in the computer profession. The new information technology has caused management to expect more from computers. The process of supplying information follows a well defined procedure. MIS should be capable of providing usable information to the various areas and levels of an organization. MIS is different from data processing. MIS and the business hierarchy provide a good framework for many organizations which are using computers. (A.B.)

  10. Optimal design of wind barriers using 3D computational fluid dynamics simulations

    Science.gov (United States)

    Fang, H.; Wu, X.; Yang, X.

    2017-12-01

    Desertification is a significant global environmental and ecological problem that requires human-regulated control and management. Wind barriers are commonly used to reduce wind velocity or trap drifting sand in arid or semi-arid areas. Therefore, optimal design of wind barriers becomes critical in Aeolian engineering. In the current study, we perform 3D computational fluid dynamics (CFD) simulations for flow passing through wind barriers with different structural parameters. To validate the simulation results, we first inter-compare the simulated flow field results with those from both wind-tunnel experiments and field measurements. Quantitative analyses of the shelter effect are then conducted based on a series of simulations with different structural parameters (such as wind barrier porosity, row numbers, inter-row spacing and belt schemes). The results show that wind barriers with porosity of 0.35 could provide the longest shelter distance (i.e., where the wind velocity reduction is more than 50%) and thus are recommended in engineering designs. To determine the optimal row number and belt scheme, we introduce a cost function that takes both wind-velocity reduction effects and economic expense into account. The calculated cost function shows that a 3-row-belt scheme with inter-row spacing of 6h (h as the height of wind barriers) and inter-belt spacing of 12h is the most effective.
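
    The abstract above mentions a cost function that combines wind-velocity reduction and economic expense but does not give its form. The sketch below (Python) is a hypothetical illustration of such a trade-off, not the authors' formula; the weighting, per-row cost and example numbers are assumptions.

      # Hypothetical cost function for comparing wind-barrier schemes.
      # Lower scores are better; the weighting is assumed, not taken from the paper.
      def barrier_cost(shelter_distance_m: float, n_rows: int, n_belts: int,
                       cost_per_row: float = 1.0, weight: float = 0.5) -> float:
          """Combine shelter benefit and construction expense into one score.

          shelter_distance_m: distance over which wind velocity is reduced by >50%.
          n_rows, n_belts:    structural parameters of the barrier scheme.
          cost_per_row:       relative economic cost of one row of barriers (assumed).
          weight:             trade-off between expense and benefit (assumed).
          """
          expense = cost_per_row * n_rows * n_belts
          benefit = shelter_distance_m
          return weight * expense - (1.0 - weight) * benefit

      # Compare two hypothetical schemes: fewer rows with a shorter shelter distance
      # versus more rows with a slightly longer one.
      print(barrier_cost(shelter_distance_m=120.0, n_rows=3, n_belts=2))
      print(barrier_cost(shelter_distance_m=130.0, n_rows=5, n_belts=2))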

  11. Disscusion on the common

    Directory of Open Access Journals (Sweden)

    Antonio Negri

    2011-01-01

    Full Text Available In this interview taken shortly after the launch of the Italian translation of the Commonwealth, Antonio Negri, besides discussing details of his collaboration with Michael Hardt, addresses the most important topics of the book, which could remain unclear for the readers. He gives a wide range of answers for the questions on, for example, importance of revision and revitalization of seventeenth century’s categories, what does it mean to be a communist today, elaboration of the thesis of real subsumption. He also stresses the significance of the struggle over the common and processes of its institutionalization for contemporary revolutionary politics and faces criticism of the conception of immaterial and biopolitical labour.

  12. CPL: Common Pipeline Library

    Science.gov (United States)

    ESO CPL Development Team

    2014-02-01

    The Common Pipeline Library (CPL) is a set of ISO-C libraries that provide a comprehensive, efficient and robust software toolkit to create automated astronomical data reduction pipelines. Though initially developed as a standardized way to build VLT instrument pipelines, the CPL may be more generally applied to any similar application. The code also provides a variety of general purpose image- and signal-processing functions, making it an excellent framework for the creation of more generic data handling packages. The CPL handles low-level data types (images, tables, matrices, strings, property lists, etc.) and medium-level data access methods (a simple data abstraction layer for FITS files). It also provides table organization and manipulation, keyword/value handling and management, and support for dynamic loading of recipe modules using programs such as EsoRex (ascl:1504.003).

  13. Common Superficial Bursitis.

    Science.gov (United States)

    Khodaee, Morteza

    2017-02-15

    Superficial bursitis most often occurs in the olecranon and prepatellar bursae. Less common locations are the superficial infrapatellar and subcutaneous (superficial) calcaneal bursae. Chronic microtrauma (e.g., kneeling on the prepatellar bursa) is the most common cause of superficial bursitis. Other causes include acute trauma/hemorrhage, inflammatory disorders such as gout or rheumatoid arthritis, and infection (septic bursitis). Diagnosis is usually based on clinical presentation, with a particular focus on signs of septic bursitis. Ultrasonography can help distinguish bursitis from cellulitis. Blood testing (white blood cell count, inflammatory markers) and magnetic resonance imaging can help distinguish infectious from noninfectious causes. If infection is suspected, bursal aspiration should be performed and fluid examined using Gram stain, crystal analysis, glucose measurement, blood cell count, and culture. Management depends on the type of bursitis. Acute traumatic/hemorrhagic bursitis is treated conservatively with ice, elevation, rest, and analgesics; aspiration may shorten the duration of symptoms. Chronic microtraumatic bursitis should be treated conservatively, and the underlying cause addressed. Bursal aspiration of microtraumatic bursitis is generally not recommended because of the risk of iatrogenic septic bursitis. Although intrabursal corticosteroid injections are sometimes used to treat microtraumatic bursitis, high-quality evidence demonstrating any benefit is unavailable. Chronic inflammatory bursitis (e.g., gout, rheumatoid arthritis) is treated by addressing the underlying condition, and intrabursal corticosteroid injections are often used. For septic bursitis, antibiotics effective against Staphylococcus aureus are generally the initial treatment, with surgery reserved for bursitis not responsive to antibiotics or for recurrent cases. Outpatient antibiotics may be considered in those who are not acutely ill; patients who are acutely ill

  14. Common Lisp a gentle introduction to symbolic computation

    CERN Document Server

    Touretzky, David S

    2013-01-01

    This highly accessible introduction to Lisp is suitable both for novices approaching their first programming language and experienced programmers interested in exploring a key tool for artificial intelligence research. The text offers clear, reader-friendly explanations of such essential concepts as cons cell structures, evaluation rules, programs as data, and recursive and applicative programming styles. The treatment incorporates several innovative instructional devices, such as the use of function boxes in the first two chapters to visually distinguish functions from data, use of evaltrace

  15. CERN's common Unix and X terminal environment

    International Nuclear Information System (INIS)

    Cass, Tony

    1996-01-01

    The Desktop Infrastructure Group of CERN's Computing and Networks Division has developed a Common Unix and X Terminal Environment to ease the migration to Unix based interactive computing. The CUTE architecture relies on a distributed filesystem - currently Transarc's AFS - to enable essentially interchangeable client workstations to access both home directory and program files transparently. Additionally, we provide a suite of programs to configure workstations for CUTE and to ensure continued compatibility. This paper describes the different components and the development of the CUTE architecture. (author)

  16. Computational force, mass, and energy

    International Nuclear Information System (INIS)

    Numrich, R.W.

    1997-01-01

    This paper describes a correspondence between computational quantities commonly used to report computer performance measurements and mechanical quantities from classical Newtonian mechanics. It defines a set of three fundamental computational quantities that are sufficient to establish a system of computational measurement. From these quantities, it defines derived computational quantities that have analogous physical counterparts. These computational quantities obey three laws of motion in computational space. The solutions to the equations of motion, with appropriate boundary conditions, determine the computational mass of the computer. Computational forces, with magnitudes specific to each instruction and to each computer, overcome the inertia represented by this mass. The paper suggests normalizing the computational mass scale by picking the mass of a register on the CRAY-1 as the standard unit of mass

  17. APME launches common method

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    A common approach for carrying out ecological balances for commodity thermoplastics is due to be launched by the Association of Plastics Manufacturers in Europe (APME; Brussels) and its affiliate, The European Centre for Plastics in the Environment (PWMI) this week. The methodology report is the latest stage of a program started in 1990 that aims to describe all operations up to the production of polymer powder or granules at the plant gate. Information gathered will be made freely available to companies considering the use of polymers. An industry task force, headed by PWMI executive director Vince Matthews, has gathered information on the plastics production processes from oil to granule, and an independent panel of specialists, chaired by Ian Boustead of the U.K.'s Open University, devised the methodology and analysis. The methodology report stresses the need to define the system being analyzed and discusses how complex chemical processes can be analyzed in terms of consumption of fuels, energy, and raw materials, as well as solid, liquid, and gaseous emissions

  18. Reformulating the commons

    Directory of Open Access Journals (Sweden)

    Ostrom Elinor

    2002-01-01

    Full Text Available The western hemisphere is richly endowed with a diversity of natural resource systems that are governed by complex local and national institutional arrangements that have not, until recently, been well understood. While many local communities that possess a high degree of autonomy to govern local resources have been highly successful over long periods of time, others fail to take action to prevent overuse and degradation of forests, inshore fisheries, and other natural resources. The conventional theory used to predict and explain how local users will relate to resources that they share makes a uniform prediction that users themselves will be unable to extricate themselves from the tragedy of the commons. Using this theoretical view of the world, there is no variance in the performance of self-organized groups. In theory, there are no self-organized groups. Empirical evidence tells us, however, that considerable variance in performance exists and many more local users self-organize and are more successful than is consistent with the conventional theory. Parts of a new theory are presented here.

  19. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.

  20. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem...... conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined...... by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  1. Threads of common knowledge.

    Science.gov (United States)

    Icamina, P

    1993-04-01

    Indigenous knowledge is examined as it is affected by development and scientific exploration. The indigenous culture of shamanism, which originated in northern and southeast Asia, is a "political and religious technique for managing societies through rituals, myths, and world views." There is respect for the natural environment and community life as a social common good. This world view is still practiced by many in Latin America and in Colombia specifically. Colombian shamanism has an environmental accounting system, but the Brazilian government has established its own system of land tenure and political representation which does not adequately represent shamanism. In 1992 a conference was held in the Philippines by the International Institute for Rural Reconstruction and IDRC on sustainable development and indigenous knowledge. The link between the two is necessary. Unfortunately, there are already examples in the Philippines of loss of traditional crop diversity after the introduction of modern farming techniques and new crop varieties. An attempt was made to collect species, but without proper identification. Opposition was expressed to the preservation of wilderness preserves; the desire was to allow indigenous people to maintain their homeland and use their time-tested sustainable resource management strategies. Property rights were also discussed during the conference. Of particular concern was the protection of knowledge rights about biological diversity or pharmaceutical properties of indigenous plant species. The original owners and keepers of the knowledge must retain access and control. The research gaps were identified and found to be expansive. Reference was made to a study of Mexican Indian children who knew 138 plant species while non-Indian children knew only 37. Sometimes there is conflict of interest where foresters prefer timber forests and farmers desire fuelwood supplies and fodder and grazing land, which is provided by shrubland. Information

  2. Children's (Pediatric) CT (Computed Tomography)

    Medline Plus

    Full Text Available ... Children's (Pediatric) CT (Computed Tomography) Pediatric computed tomography (CT) is a fast, painless exam that uses ... of Children's CT? What is Children's CT? Computed tomography, more commonly known as a CT or CAT ...

  3. Experts' views on digital competence: commonalities and differences

    NARCIS (Netherlands)

    Janssen, José; Stoyanov, Slavi; Ferrari, Anusca; Punie, Yves; Pannekeet, Kees; Sloep, Peter

    2013-01-01

    Janssen, J., Stoyanov, S., Ferrari, A., Punie, Y., Pannekeet, K., & Sloep, P. B. (2013). Experts’ views on digital competence: commonalities and differences. Computers & Education, 68, 473–481. doi:10.1016/j.compedu.2013.06.008

  4. Structures for common-cause failure analysis

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1981-01-01

    Common-cause failure methodology and terminology have been reviewed and structured to provide a systematical basis for addressing and developing models and methods for quantification. The structure is based on (1) a specific set of definitions, (2) categories based on the way faults are attributable to a common cause, and (3) classes based on the time of entry and the time of elimination of the faults. The failure events are then characterized by their likelihood or frequency and the average residence time. The structure provides a basis for selecting computational models, collecting and evaluating data and assessing the importance of various failure types, and for developing effective defences against common-cause failure. The relationships of this and several other structures are described
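
    As a rough illustration of such a structure, the sketch below (Python) records a failure event by category, class, frequency, and average residence time, and estimates its contribution to unavailability as frequency times residence time. The field names and example numbers are assumptions, not taken from the cited report; the unavailability relation is standard reliability bookkeeping, valid only while the product is much smaller than one.

      from dataclasses import dataclass

      @dataclass
      class FailureEvent:
          category: str                 # how the fault is attributable to a common cause
          entry_class: str              # when the fault enters (e.g. design, operation)
          frequency_per_hour: float     # likelihood or frequency of the fault
          residence_time_hours: float   # average time the fault remains present

          def mean_unavailability(self) -> float:
              # Approximate unavailability contribution of a detectable, repairable fault.
              return self.frequency_per_hour * self.residence_time_hours

      # Hypothetical example: a maintenance-induced fault detected by a monthly test.
      event = FailureEvent("maintenance error", "operation", 1e-5, 720.0)
      print(f"{event.mean_unavailability():.1e}")  # ~7.2e-03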

  5. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM - connections to GSI and IPP, preparation for Datex-P). (orig.)

  6. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  7. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  8. Design and simulation of virtual telephone keypad control based on brain computer interface (BCI with very high transfer rates

    Directory of Open Access Journals (Sweden)

    Rehab B. Ashari

    2011-03-01

    Full Text Available Brain Computer Interface (BCI) is a communication and control mechanism which does not rely on any kind of muscular response to send a message to the external world. This technique is used to help paralyzed people with spinal cord injury communicate with the external world. In this paper we focus on increasing the BCI system bit rate for controlling a virtual telephone keypad. To realize the proposed algorithm, a simulated virtual telephone keypad based on a Steady State Visual Evoked Potential (SSVEP) BCI system is developed. A dynamic programming technique with a specifically modified Longest Common Subsequence (LCS) algorithm is used. By comparing the paralyzed user's selection first with the most recent, and then with the rest, of the stored records in the telephone file, the user can be spared the remainder of the selections needed to control the keypad, thereby improving the overall performance of the BCI system. This approach, similar to that used in web search to speed up retrieval, is especially needed for paralyzed users rather than ordinary users.
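
    A minimal sketch (Python) of the longest-common-subsequence matching idea described in this record: the user's partial selection is compared against stored phone-book entries and the best-matching entry is suggested, sparing further selections. The scoring and the phone-book data are illustrative assumptions, not the paper's specific modified-LCS algorithm.

      def lcs_length(a: str, b: str) -> int:
          """Classic dynamic-programming LCS length in O(len(a) * len(b)) time."""
          dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
          for i, ca in enumerate(a, 1):
              for j, cb in enumerate(b, 1):
                  dp[i][j] = dp[i - 1][j - 1] + 1 if ca == cb else max(dp[i - 1][j], dp[i][j - 1])
          return dp[len(a)][len(b)]

      def best_match(partial: str, stored: list[str]) -> str:
          """Return the stored entry sharing the longest common subsequence with the input."""
          return max(stored, key=lambda entry: lcs_length(partial, entry))

      phone_book = ["0712345678", "0798765432", "0711223344"]  # hypothetical entries
      print(best_match("07123", phone_book))  # -> "0712345678"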

  9. Common Sleep Problems (For Teens)

    Science.gov (United States)

    ... Common Sleep Problems What's ... have emotional problems, like depression. What Happens During Sleep? You don't notice it, of course, but ...

  10. 6 Common Cancers - Skin Cancer

    Science.gov (United States)

    ... 6 Common Cancers - Skin Cancer, Spring 2007 ... Skin cancer is the most common form of cancer ...

  11. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  12. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  13. Reliability model for common mode failures in redundant safety systems

    International Nuclear Information System (INIS)

    Fleming, K.N.

    1974-12-01

    A method is presented for computing the reliability of redundant safety systems, considering both independent and common mode type failures. The model developed for the computation is a simple extension of classical reliability theory. The feasibility of the method is demonstrated with the use of an example. The probability of failure of a typical diesel-generator emergency power system is computed based on data obtained from U. S. diesel-generator operating experience. The results are compared with reliability predictions based on the assumption that all failures are independent. The comparison shows a significant increase in the probability of redundant system failure, when common failure modes are considered. (U.S.)
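
    A minimal sketch (Python) of the idea of splitting failures into independent and common-mode parts, in the style of a beta-factor model. This is a generic illustration of the comparison described in the abstract, not the report's specific model or its diesel-generator data; the numbers are assumed.

      def redundant_failure_prob(p_total: float, beta: float, n: int) -> float:
          """Probability that all n redundant trains fail on demand.

          p_total: per-train failure probability on demand.
          beta:    assumed fraction of failures that are common mode (fail all trains).
          n:       number of redundant trains.
          """
          p_independent = (1.0 - beta) * p_total
          p_common = beta * p_total
          # Either every train fails independently, or one common-mode event fails them all.
          return p_independent ** n + p_common

      # Two redundant trains with an assumed 1e-2 per-demand failure probability
      # and 10% of failures common mode, versus the independence-only estimate.
      print(redundant_failure_prob(p_total=1e-2, beta=0.1, n=2))  # ~1.1e-03
      print((1e-2) ** 2)                                          # 1e-04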

  14. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  15. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  16. Indirection and computer security.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Michael J.

    2011-09-01

    The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.

  17. Otwarty model licencjonowania Creative Commons

    OpenAIRE

    Tarkowski, Alek

    2007-01-01

    The paper presents the family of Creative Commons licenses (which nowadays form one of the basic legal tools used in the Open Access movement), as well as the genesis of the licenses – inspired by Open Software Licenses and the concept of the commons. Then legal tools such as the individual Creative Commons licenses are discussed, as well as how to use them, with a special emphasis on practical applications in science and education. The author also discusses his research results on scientific publishers a...

  18. Five Theses on the Common

    Directory of Open Access Journals (Sweden)

    Gigi Roggero

    2011-01-01

    Full Text Available I present five theses on the common within the context of the transformations of capitalist social relations as well as their contemporary global crisis. My framework involves "cognitive capitalism," new processes of class composition, and the production of living knowledge and subjectivity. The commons is often discussed today in reference to the privatization and commodification of "common goods." This suggests a naturalistic and conservative image of the common, unhooked from the relations of production. I distinguish between commons and the common: the first model is related to Karl Polanyi, the second to Karl Marx. As elaborated in the postoperaista debate, the common assumes an antagonistic double status: it is both the plane of the autonomy of living labor and it is subjected to capitalist "capture." Consequently, what is at stake is not the conservation of "commons," but rather the production of the common and its organization into new institutions that would take us beyond the exhausted dialectic between public and private.

  19. Open Data as a New Commons

    DEFF Research Database (Denmark)

    Morelli, Nicola; Mulder, Ingrid; Concilio, Grazia

    2017-01-01

    and environmental opportunities around them and government choices. Developing spaces and means for enabling citizens to harness the opportunities coming from the use of this new resource thus offers a substantial promise of social innovation. This means that open data is (still) virtually a new resource that could...... become a new commons with the engagement of interested and active communities. The condition for open data becoming a new common is that citizens become aware of the potential of this resource, that they use it for creating new services, and that new practices and infrastructures are defined, that would......An increasing computing capability is raising the opportunities to use a large amount of publicly available data for creating new applications and a new generation of public services. But while it is easy to find some early examples of services concerning control systems (e.g. traffic, meteo...

  20. Motivating Contributions for Home Computer Security

    Science.gov (United States)

    Wash, Richard L.

    2009-01-01

    Recently, malicious computer users have been compromising computers en masse and combining them to form coordinated botnets. The rise of botnets has brought the problem of home computers to the forefront of security. Home computer users commonly have insecure systems; these users do not have the knowledge, experience, and skills necessary to…

  1. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours...

  2. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  3. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  4. Facts about the Common Cold

    Science.gov (United States)

    ... different viruses. Rhinovirus is the most common cause, accounting for 10 to 40 percent of colds. Other common cold viruses include coronavirus and ...

  5. Wave-equation Migration Velocity Analysis Using Plane-wave Common Image Gathers

    KAUST Repository

    Guo, Bowen; Schuster, Gerard T.

    2017-01-01

    Wave-equation migration velocity analysis (WEMVA) based on subsurface-offset, angle domain or time-lag common image gathers (CIGs) requires significant computational and memory resources because it computes higher dimensional migration images

  6. Children's (Pediatric) CT (Computed Tomography)

    Medline Plus

    Full Text Available ... is Children's CT? Computed tomography, more commonly known as a CT or CAT scan, is a diagnostic ... is used to evaluate: complications from infections such as pneumonia a tumor that arises in the lung ...

  7. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  8. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  9. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  10. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  11. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  12. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  13. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  14. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into computers in the future in order to give their components functionality. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  15. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  16. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  17. Governing of common cause failures

    International Nuclear Information System (INIS)

    Bock, H.W.

    1998-01-01

    The agreed strategy is to govern common cause failures by applying diversity, to assure that the overall plant safety objectives are met even when a common cause failure of a system with all its redundant trains is assumed. The presented strategy aims at applying functional diversity without implementing equipment diversity. The focus is on the design criteria that have to be met when designing independent systems so that a time-correlated failure of these independent systems due to a common cause can be deterministically excluded. (author)

  18. Physical computation and cognitive science

    CERN Document Server

    Fresco, Nir

    2014-01-01

    This book presents a study of digital computation in contemporary cognitive science. Digital computation is a highly ambiguous concept, as there is no common core definition for it in cognitive science. Since this concept plays a central role in cognitive theory, an adequate cognitive explanation requires an explicit account of digital computation. More specifically, it requires an account of how digital computation is implemented in physical systems. The main challenge is to deliver an account encompassing the multiple types of existing models of computation without ending up in pancomputationalism, that is, the view that every physical system is a digital computing system. This book shows that only two accounts, among the ones examined by the author, are adequate for explaining physical computation. One of them is the instructional information processing account, which is developed here for the first time.   “This book provides a thorough and timely analysis of differing accounts of computation while adv...

  19. Computer systems a programmer's perspective

    CERN Document Server

    Bryant, Randal E

    2016-01-01

Computer systems: A Programmer’s Perspective explains the underlying elements common among all computer systems and how they affect general application performance. Written from the programmer’s perspective, this book strives to teach readers how understanding basic elements of computer systems and executing real practice can lead them to create better programs. Spanning across computer science themes such as hardware architecture, the operating system, and systems software, the Third Edition serves as a comprehensive introduction to programming. This book strives to create programmers who understand all elements of computer systems and will be able to engage in any application of the field--from fixing faulty software, to writing more capable programs, to avoiding common flaws. It lays the groundwork for readers to delve into more intensive topics such as computer architecture, embedded systems, and cybersecurity. This book focuses on systems that execute x86-64 machine code, and recommends th...

  20. Introduction to computer networking

    CERN Document Server

    Robertazzi, Thomas G

    2017-01-01

This book gives a broad look at both fundamental networking technology and new areas that support it and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology such as wired and wireless networks, enabling technologies such as data centers, software defined networking, cloud and grid computing and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment make this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides a non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  1. Computed Tomographic Measurement of Splenic Size in

    International Nuclear Information System (INIS)

    Sung, Nak Kwan; Woo, Seong Ku; Ko, Young Tae; Kim, Soon Young

    2010-01-01

The authors analyzed 72 abdominal computed tomography studies of Korean adults who had no medical reason to suspect an abnormal spleen. The following were measured on multiple transverse scans covering the entire length of the spleen: its dimensions (height, breadth, thickness) and its relationship to a fixed midline structure, the spine (the shortest distance from the midline to the medial edge of the spleen, and the longest distance from the anterior margin of the vertebral body to the anterior tip of the spleen). The results were as follows: 1. The average size in adult males was 8.0±1.5 cm in height, 8.6±1.2 cm in breadth and 3.4±0.6 cm in thickness; in adult females, 7.8±1.1 cm, 8.4±1.0 cm and 3.4±0.6 cm, respectively; the overall averages were 7.9±1.3 cm, 8.5±1.1 cm and 3.4±0.6 cm, respectively. No remarkable difference was noted between the sexes or between age groups. 2. The shortest distance from the midline to the medial edge of the spleen was 4.1±1.1 cm in males and 3.6±1.0 cm in females, with an overall average of 3.9±1.1 cm. There was a remarkable difference between the sexes (P<0.005) but not between age groups. 3. The longest distance from the anterior margin of the vertebral body to the anterior edge of the spleen was 2.3±1.7 cm in males and 2.0±1.4 cm in females, with an overall average of 2.2±1.6 cm. No remarkable difference was seen between the sexes or between age groups.

  2. The illusion of common ground

    DEFF Research Database (Denmark)

    Cowley, Stephen; Harvey, Matthew

    2016-01-01

    When people talk about “common ground”, they invoke shared experiences, convictions, and emotions. In the language sciences, however, ‘common ground’ also has a technical sense. Many taking a representational view of language and cognition seek to explain that everyday feeling in terms of how...... isolated individuals “use” language to communicate. Autonomous cognitive agents are said to use words to communicate inner thoughts and experiences; in such a framework, ‘common ground’ describes a body of information that people allegedly share, hold common, and use to reason about how intentions have......, together with concerted bodily (and vocal) activity, serve to organize, regulate and coordinate both attention and the verbal and non-verbal activity that it gives rise to. Since wordings are normative, they can be used to develop skills for making cultural sense of environments and other peoples’ doings...

  3. NIH Common Data Elements Repository

    Data.gov (United States)

    U.S. Department of Health & Human Services — The NIH Common Data Elements (CDE) Repository has been designed to provide access to structured human and machine-readable definitions of data elements that have...

  4. 6 Common Cancers - Colorectal Cancer

    Science.gov (United States)

Colorectal Cancer: cancer of the colon (large intestine) or rectum ( ...

  5. 6 Common Cancers - Lung Cancer

    Science.gov (United States)

Lung Cancer: lung cancer causes more deaths than the next three ...

  6. Common High Blood Pressure Myths

    Science.gov (United States)

Common High Blood Pressure Myths (updated May 4, 2018). Knowing the facts ... This content was last reviewed October 2016.

  7. Common mode and coupled failure

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1975-10-01

Based on examples and data from Abnormal Occurrence Reports for nuclear reactors, a classification of common mode or coupled failures is given, and some simple statistical models are investigated. (author)

  8. Common Systems Integration Lab (CSIL)

    Data.gov (United States)

Federal Laboratory Consortium — The Common Systems Integration Lab (CSIL) supports the PMA-209 Air Combat Electronics Program Office. CSIL also supports development, test, integration and life cycle...

  9. 6 Common Cancers - Breast Cancer

    Science.gov (United States)

Breast Cancer: breast cancer is a malignant (cancerous) growth that ...

  10. Communication, timing, and common learning

    Czech Academy of Sciences Publication Activity Database

    Steiner, Jakub; Stewart, C.

    2011-01-01

    Roč. 146, č. 1 (2011), s. 230-247 ISSN 0022-0531 Institutional research plan: CEZ:AV0Z70850503 Keywords : common knowledge * learning * communication Subject RIV: AH - Economics Impact factor: 1.235, year: 2011

  11. Computational Design of Urban Layouts

    KAUST Repository

    Wonka, Peter

    2015-10-07

A fundamental challenge in computational design is to compute layouts by arranging a set of shapes. In this talk I will present recent urban modeling projects with applications in computer graphics, urban planning, and architecture. The talk will look at different scales of urban modeling (streets, floorplans, parcels). A common challenge in all these modeling problems is the set of functional and aesthetic constraints that should be respected. The talk also highlights interesting links to geometry processing problems, such as field design and quad meshing.

  12. Philosophy vs the common sense

    OpenAIRE

    V. V. Chernyshov

    2017-01-01

The paper deals with the antinomy of philosophy and the common sense. Philosophy emerges as a way of specifically human knowledge, which aims at an analytics of the reality of subjective experience. The study reveals that in order to alienate philosophy from the common sense it was essential to revise the understanding of wisdom. The new, philosophical interpretation of wisdom – offered by Pythagoras – has laid the foundation of any future philosophy. Thus, philosophy emerges, alienating itself...

  13. Sustainability of common pool resources

    OpenAIRE

    Timilsina, Raja Rajendra; Kotani, Koji; Kamijo, Yoshio

    2017-01-01

Sustainability has become a key issue in managing natural resources together with growing concerns for capitalism, environmental and resource problems. We hypothesize that the ongoing modernization of competitive societies, which we refer to as "capitalism," affects human nature for utilizing common pool resources, thus compromising sustainability. To test this hypothesis, we design and implement a set of dynamic common pool resource games and experiments in the following two types of Nepalese areas: (i) rural (non-capitalistic) and (ii) urban (capitalistic) areas.

  14. Whose commons are mobilities spaces?

    DEFF Research Database (Denmark)

    Freudendal-Pedersen, Malene

    2015-01-01

    for cyclists and cycling to be given greater consideration in broader societal understandings of the common good. I argue that this is in fact not the case. Rather the specific project identities that are nurtured by Copenhagen’s cycling community inhibit it from advocating publicly or aggressively...... for a vision of the common good that gives cyclists greater and more protected access to the city’s mobility spaces...

  15. Casuistry as common law morality.

    Science.gov (United States)

    Paulo, Norbert

    2015-12-01

    This article elaborates on the relation between ethical casuistry and common law reasoning. Despite the frequent talk of casuistry as common law morality, remarks on this issue largely remain at the purely metaphorical level. The article outlines and scrutinizes Albert Jonsen and Stephen Toulmin's version of casuistry and its basic elements. Drawing lessons for casuistry from common law reasoning, it is argued that one generally has to be faithful to ethical paradigms. There are, however, limitations for the binding force of paradigms. The most important limitations--the possibilities of overruling and distinguishing paradigm norms--are similar in common law and in casuistry, or so it is argued. These limitations explain why casuistry is not necessarily overly conservative and conventional, which is one line of criticism to which casuists can now better respond. Another line of criticism has it that the very reasoning from case to case is extremely unclear in casuistry. I suggest a certain model of analogical reasoning to address this critique. All my suggestions to understand and to enhance casuistry make use of common law reasoning whilst remaining faithful to Jonsen and Toulmin's main ideas and commitments. Further developed along these lines, casuistry can appropriately be called "common law morality."

  16. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  17. A Case for Data Commons: Toward Data Science as a Service.

    Science.gov (United States)

    Grossman, Robert L; Heath, Allison; Murphy, Mark; Patterson, Maria; Wells, Walt

    2016-01-01

    Data commons collocate data, storage, and computing infrastructure with core services and commonly used tools and applications for managing, analyzing, and sharing data to create an interoperable resource for the research community. An architecture for data commons is described, as well as some lessons learned from operating several large-scale data commons.

  18. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    Science.gov (United States)

    Anderson, Delia Marie Castro

Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent t-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in
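As a side note for readers unfamiliar with the dependent (paired) t-test mentioned in this record, the snippet below is a minimal, hypothetical Python/SciPy illustration of comparing pretest and posttest attitude scores for the same group of students; the scores, sample size, and variable names are invented for the example and are not data from the study.

    # Minimal sketch of a dependent (paired) t-test on pretest vs. posttest scores.
    # The numbers below are made up for illustration; they are not the study's data.
    from scipy import stats

    pretest  = [62, 55, 70, 48, 66, 59, 73, 51]   # hypothetical attitude scores, week 1
    posttest = [68, 60, 72, 55, 70, 63, 75, 58]   # hypothetical scores, week 16

    t_stat, p_value = stats.ttest_rel(posttest, pretest)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # a small p-value suggests a reliable change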

  19. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease with which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  20. Efficient heuristics for maximum common substructure search.

    Science.gov (United States)

    Englert, Péter; Kovács, Péter

    2015-05-26

    Maximum common substructure search is a computationally hard optimization problem with diverse applications in the field of cheminformatics, including similarity search, lead optimization, molecule alignment, and clustering. Most of these applications have strict constraints on running time, so heuristic methods are often preferred. However, the development of an algorithm that is both fast enough and accurate enough for most practical purposes is still a challenge. Moreover, in some applications, the quality of a common substructure depends not only on its size but also on various topological features of the one-to-one atom correspondence it defines. Two state-of-the-art heuristic algorithms for finding maximum common substructures have been implemented at ChemAxon Ltd., and effective heuristics have been developed to improve both their efficiency and the relevance of the atom mappings they provide. The implementations have been thoroughly evaluated and compared with existing solutions (KCOMBU and Indigo). The heuristics have been found to greatly improve the performance and applicability of the algorithms. The purpose of this paper is to introduce the applied methods and present the experimental results.
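As a rough illustration of the greedy, seed-and-grow style of heuristic this record alludes to, the sketch below grows an atom-to-atom mapping outward from each pair of label-compatible seed atoms and keeps the largest mapping found. It is a toy Python example under simplifying assumptions (molecules as plain labeled adjacency structures, no bond typing, no topological scoring) and is not the ChemAxon, KCOMBU, or Indigo implementation; like any greedy heuristic it is not guaranteed to find the true maximum common substructure.

    # Toy greedy common-substructure heuristic (illustrative only).
    # A molecule is modeled as {atom_index: (atom_label, set_of_neighbour_indices)}.

    def grow_mapping(mol_a, mol_b, seed_a, seed_b):
        """Greedily extend an atom-to-atom mapping outward from one matched seed pair."""
        mapping = {seed_a: seed_b}
        frontier = [(seed_a, seed_b)]
        while frontier:
            ua, ub = frontier.pop()
            for na in mol_a[ua][1]:                    # neighbours of ua in molecule A
                if na in mapping:
                    continue
                for nb in mol_b[ub][1]:                # candidate partners in molecule B
                    if nb in mapping.values():
                        continue
                    if mol_a[na][0] == mol_b[nb][0]:   # atom labels must agree
                        mapping[na] = nb
                        frontier.append((na, nb))
                        break
        return mapping

    def greedy_mcs(mol_a, mol_b):
        """Try every label-compatible seed pair and keep the largest mapping found."""
        best = {}
        for ia, (label_a, _) in mol_a.items():
            for ib, (label_b, _) in mol_b.items():
                if label_a == label_b:
                    candidate = grow_mapping(mol_a, mol_b, ia, ib)
                    if len(candidate) > len(best):
                        best = candidate
        return best

    if __name__ == "__main__":
        ethanol  = {0: ("C", {1}), 1: ("C", {0, 2}), 2: ("O", {1})}
        propanol = {0: ("C", {1}), 1: ("C", {0, 2}), 2: ("C", {1, 3}), 3: ("O", {2})}
        print(greedy_mcs(ethanol, propanol))  # e.g. maps ethanol's C-C-O onto propanol's C-C-O tail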

  1. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  2. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  3. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  4. Computer science II essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science II includes organization of a computer, memory and input/output, coding, data structures, and program development. Also included is an overview of the most commonly

  5. Philosophy vs the common sense

    Directory of Open Access Journals (Sweden)

    V. V. Chernyshov

    2017-01-01

Full Text Available The paper deals with the antinomy of philosophy and the common sense. Philosophy emerges as a way of specifically human knowledge, which aims at an analytics of the reality of subjective experience. The study reveals that in order to alienate philosophy from the common sense it was essential to revise the understanding of wisdom. The new, philosophical interpretation of wisdom – offered by Pythagoras – has laid the foundation of any future philosophy. Thus, philosophy emerges by alienating itself from the common sense, which refers to the common or collective experience. Moreover, the study examines the role that emotions, conformity and conventionality play with respect to the common sense. Next, the author focuses on the role of philosophical intuition, guided by the principles of rationality, nonconformity and scepticism, which the author regards as the foundation stones of any sound philosophy. The common sense, described as deeply rooted in the world of human emotions, aims at empathy, whereas the purpose of philosophy is to provide rational means of knowledge. Therefore, philosophy uses thinking, making permanent efforts to check and recheck the data of its own experience. Thus, the first task of philosophical thinking appears to be overcoming the suggestion of the common sense, which aims at social empathy, whereas philosophical intuition aims at independent thinking, the analytics of subjective experience. The study describes the fundamental principles of the common sense, on the one hand, and those of philosophy, on the other. The author arrives at the conclusion that the common sense is unable to exceed the limits of sensual experience. Even where it apparently rises to a form of «spiritual unity», it cannot avoid referring to the data of commonly shared sensual experience; philosophy, meanwhile, goes beyond sensuality, creating a discourse able to alienate itself from it, and to make its rational

  6. [Common household traditional Chinese medicines].

    Science.gov (United States)

    Zhang, Shu-Yuan; Li, Mei; Fu, Dan; Liu, Yang; Wang, Hui; Tan, Wei

    2016-02-01

With the enhancement in the awareness of self-diagnosis among residents, it is very common for each family to keep common medicines on hand for unexpected needs. Meanwhile, with the popularization of traditional Chinese medicine knowledge, the proportion of common traditional Chinese medicines kept in residents' homes has been growing higher than that of western medicines year by year. To clarify this, both a pre-survey and a closed questionnaire survey were conducted among residents of Chaoyang District, Beijing, excluding residents with a medical background. Based on the resulting data, an analysis was made to define the role of household traditional Chinese medicines and their influence on residents' quality of life, and to give suggestions for relevant departments to improve traditional Chinese medicine popularization and promote the traditional Chinese medicine market. Copyright© by the Chinese Pharmaceutical Association.

  7. Governing for the Common Good.

    Science.gov (United States)

    Ruger, Jennifer Prah

    2015-12-01

    The proper object of global health governance (GHG) should be the common good, ensuring that all people have the opportunity to flourish. A well-organized global society that promotes the common good is to everyone's advantage. Enabling people to flourish includes enabling their ability to be healthy. Thus, we must assess health governance by its effectiveness in enhancing health capabilities. Current GHG fails to support human flourishing, diminishes health capabilities and thus does not serve the common good. The provincial globalism theory of health governance proposes a Global Health Constitution and an accompanying Global Institute of Health and Medicine that together propose to transform health governance. Multiple lines of empirical research suggest that these institutions would be effective, offering the most promising path to a healthier, more just world.

  8. The Messiness of Common Good

    DEFF Research Database (Denmark)

    Feldt, Liv Egholm

    Civil society and its philanthropic and voluntary organisations are currently experiencing public and political attention and demands to safeguard society’s ‘common good’ through social cohesion and as providers of welfare services. This has raised the question by both practitioners and researchers...... that a distinction between the non-civil and the civil is more fruitful, if we want to understand the past, present and future messiness in place in defining the common good. Based on an ethnographic case analysis of a Danish corporate foundation between 1920 and 2014 the paper shows how philanthropic gift......-giving concepts, practices and operational forms throughout history have played a significant role in defining the common good and its future avenues. Through an analytical attitude based on microhistory, conceptual history and the sociology of translation it shows that civil society’s institutional logic always...

  9. UMTS Common Channel Sensitivity Analysis

    DEFF Research Database (Denmark)

    Pratas, Nuno; Rodrigues, António; Santos, Frederico

    2006-01-01

    and as such it is necessary that both channels be available across the cell radius. This requirement makes the choice of the transmission parameters a fundamental one. This paper presents a sensitivity analysis regarding the transmission parameters of two UMTS common channels: RACH and FACH. Optimization of these channels...... is performed and values for the key transmission parameters in both common channels are obtained. On RACH these parameters are the message to preamble offset, the initial SIR target and the preamble power step while on FACH it is the transmission power offset....

  10. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.

  11. Quantum Computing

    Indian Academy of Sciences (India)

Resonance – Journal of Science Education; Volume 5; Issue 9. Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay, Vishal Gupta. General Article, Volume 5, Issue 9, September 2000, pp 69-81.

  12. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  13. Quantum Computing

    Indian Academy of Sciences (India)

In the first part of this article, we looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  14. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  15. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  16. Recent trends in grid computing

    International Nuclear Information System (INIS)

    Miura, Kenichi

    2004-01-01

Grid computing is a technology which allows uniform and transparent access to geographically dispersed computational resources, such as computers, databases, experimental and observational equipment etc., via high-speed, high-bandwidth networking. The commonly used analogy is that of the electrical power grid, whereby household electricity is made available from outlets on the wall, and little thought needs to be given to where the electricity is generated and how it is transmitted. The usage of grids also includes distributed parallel computing, high-throughput computing, data-intensive computing (data grid) and collaborative computing. This paper reviews the historical background, software structure, current status and on-going grid projects, including applications of grid technology to nuclear fusion research. (author)

  17. A SURVEY ON UBIQUITOUS COMPUTING

    Directory of Open Access Journals (Sweden)

    Vishal Meshram

    2016-01-01

Full Text Available This work presents a survey of ubiquitous computing research, the emerging domain that integrates communication technologies into day-to-day life activities. This research paper provides a classification of the research areas in the ubiquitous computing paradigm. In this paper, we present common architecture principles of ubiquitous systems and analyze important aspects of context-aware ubiquitous systems. In addition, this research work presents a novel architecture for a ubiquitous computing system and a survey of sensors needed for applications in ubiquitous computing. The goals of this research work are three-fold: (i) to serve as a guideline for researchers who are new to ubiquitous computing and want to contribute to this research area, (ii) to provide a novel system architecture for ubiquitous computing systems, and (iii) to provide further research directions on quality-of-service assurance in ubiquitous computing.

  18. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  19. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  20. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path.

    Science.gov (United States)

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the problem of task scheduling in the cloud computing system, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). This algorithm applies a predecessor-task layer priority strategy to solve the problem of constraint relations among task nodes. The strategy assigns different priority values to every task node based on the scheduling order of task node as affected by the constraint relations among task nodes, and the task node list is generated by the different priority value. To address the scheduling order problem in which task nodes have the same priority value, the dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation cost and communication cost of task node in the scheduling process. The task node that has the longest dynamic essential path is scheduled first as the completion time of task graph is indirectly influenced by the finishing time of task nodes in the longest dynamic essential path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the task Makespan in most cases and meet a high quality performance objective.
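The record above is, at heart, a list-scheduling idea: order task nodes by their precedence layer, and break ties in favour of the task heading the longest remaining (critical-path style) path. The sketch below is a minimal, hypothetical Python illustration of that general idea only; it is not the authors' DDEP algorithm, and the task names, the deps/cost structures, and the use of a single per-task cost (ignoring separate communication costs) are assumptions made for the example.

    # Hypothetical sketch: layer-based list scheduling with a longest-path tie-break.
    from collections import defaultdict

    def layers(tasks, deps):
        """Layer of a task = 1 + max layer of its predecessors (0 for entry tasks)."""
        layer = {}
        def visit(t):
            if t not in layer:
                preds = deps.get(t, [])
                layer[t] = 0 if not preds else 1 + max(visit(p) for p in preds)
            return layer[t]
        for t in tasks:
            visit(t)
        return layer

    def longest_paths(tasks, succs, cost):
        """Longest remaining path (sum of task costs) from each task to an exit task."""
        memo = {}
        def visit(t):
            if t not in memo:
                nxt = succs.get(t, [])
                memo[t] = cost[t] + (max(visit(s) for s in nxt) if nxt else 0)
            return memo[t]
        for t in tasks:
            visit(t)
        return memo

    def schedule_order(tasks, deps, cost):
        succs = defaultdict(list)
        for t, preds in deps.items():
            for p in preds:
                succs[p].append(t)
        lay = layers(tasks, deps)
        lp = longest_paths(tasks, succs, cost)
        # Earlier layers first; within a layer, the task heading the longest path first.
        return sorted(tasks, key=lambda t: (lay[t], -lp[t]))

    if __name__ == "__main__":
        tasks = ["A", "B", "C", "D"]
        deps = {"C": ["A"], "D": ["A", "B"]}   # C depends on A; D depends on A and B
        cost = {"A": 2, "B": 1, "C": 5, "D": 3}
        print(schedule_order(tasks, deps, cost))  # -> ['A', 'B', 'C', 'D']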

  1. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  2. Five Common Cancers in Iran

    NARCIS (Netherlands)

    Kolandoozan, Shadi; Sadjadi, Alireza; Radmard, Amir Reza; Khademi, Hooman

Iran, as a developing nation, is in epidemiological transition from communicable to non-communicable diseases. Although cancer is the third leading cause of death in Iran, its mortality has been on the rise during recent decades. This mini-review was carried out to provide a general viewpoint on common cancers

  3. Experiments on common property management

    NARCIS (Netherlands)

    van Soest, D.P.; Shogren, J.F.

    2013-01-01

    Common property resources are (renewable) natural resources where current excessive extraction reduces future resource availability, and the use of which is de facto restricted to a specific set of agents, such as inhabitants of a village or members of a community; think of community-owned forests,

  4. The Parody of the Commons

    Directory of Open Access Journals (Sweden)

    Vasilis Kostakis

    2013-08-01

Full Text Available This essay builds on the idea that Commons-based peer production is a social advancement within capitalism but with various post-capitalistic aspects, in need of protection, enforcement, stimulation and connection with progressive social movements. We use theory and examples to claim that peer-to-peer economic relations can be undermined in the long run, distorted by the extra-economic means of a political context designed to keep profit-driven relations of production in power. This subversion can arguably become a state policy, and the subsequent outcome is the full absorption of the Commons as well as of the underpinning peer-to-peer relations into the dominant mode of production. To tackle this threat, we argue in favour of a certain working agenda for Commons-based communities. Such an agenda should aim at the enforcement of the circulation of the Commons. Therefore, any useful social transformation will be meaningful if the people themselves decide and apply policies for their own benefit, optimally with the support of a sovereign partner state. If peer production is to become dominant, it has to control capital accumulation with the aim of marginalising and eventually transcending capitalism.

  5. Parents' common pitfalls of discipline.

    Science.gov (United States)

    Witoonchart, Chatree; Fangsa-ard, Thitiporn; Chaoaree, Supamit; Ketumarn, Panom; Kaewpornsawan, Titawee; Phatthrayuttawat, Sucheera

    2005-11-01

Problems of discipline are common among parents. These may be the result of the parents' pitfalls in disciplining their children. The aim was to find out common pitfalls of parents in disciplining their children. Parents of students aged between 60 and 72 months in Bangkok-Noi district, Bangkok, were selected by random sampling. A total of 1,947 children aged between 60 and 72 months were recruited. Parents of these children were interviewed with a questionnaire designed to probe into problems in child rearing. Three hundred and fifty questionnaires were used for data analyses. Parents had high concerns about problems in disciplining their children and needed support from professional personnel. They had limited knowledge and held many wrong attitudes towards discipline. Common pitfalls were problems in (1) limit setting, (2) rewarding and punishment, and (3) supervision of children's TV watching and bedtime routines. Parents of children aged 60-72 months in Bangkok-Noi district, Bangkok, had several common pitfalls in disciplining their children, involving attitude, knowledge and practice.

  6. The Common Vision. Reviews: Books.

    Science.gov (United States)

    Chattin-McNichols, John

    1998-01-01

    Reviews Marshak's book describing the work of educators Maria Montessori, Rudolf Steiner, Aurobindo Ghose, and Inayat Khan. Maintains that the book gives clear, concise information on each educator and presents a common vision for children and their education; also maintains that it gives theoretical and practical information and discusses…

  7. Twenty-First Century Diseases: Commonly Rare and Rarely Common?

    Science.gov (United States)

    Daunert, Sylvia; Sittampalam, Gurusingham Sitta; Goldschmidt-Clermont, Pascal J

    2017-09-20

Alzheimer's drugs are failing at a rate of 99.6%, and the success rate for drugs designed to help patients with this form of dementia is 47 times lower than for drugs designed to help patients with cancers ( www.scientificamerican.com/article/why-alzheimer-s-drugs-keep-failing/2014 ). How can it be so difficult to produce a valuable drug for Alzheimer's disease? Each human has a unique genetic and epigenetic makeup, thus endowing individuals with a highly unique complement of genes, polymorphisms, mutations, RNAs, proteins, lipids, and complex sugars, resulting in distinct genome, proteome, metabolome, and also microbiome identity. This editorial takes into account the uniqueness of each individual and their surrounding environment, and stresses the point that a more accurate definition of a "common" disorder could be simply the amalgamation of a myriad of "rare" diseases. These rare diseases are being grouped together because they share a rather constant complement of common features and, indeed, generally respond to empirically developed treatments, leading to a positive outcome consistently. We make the case that it is highly unlikely that such treatments, despite their statistical success measured with large cohorts using standardized clinical research, will be effective on all patients until we increase the depth and fidelity of our understanding of the individual "rare" diseases that are grouped together in the "buckets" of common illnesses. Antioxid. Redox Signal. 27, 511-516.

  8. Common sense and the common morality in theory and practice.

    Science.gov (United States)

    Daly, Patrick

    2014-06-01

    The unfinished nature of Beauchamp and Childress's account of the common morality after 34 years and seven editions raises questions about what is lacking, specifically in the way they carry out their project, more generally in the presuppositions of the classical liberal tradition on which they rely. Their wide-ranging review of ethical theories has not provided a method by which to move beyond a hypothetical approach to justification or, on a practical level regarding values conflict, beyond a questionable appeal to consensus. My major purpose in this paper is to introduce the thought of Bernard Lonergan as offering a way toward such a methodological breakthrough. In the first section, I consider Beauchamp and Childress's defense of their theory of the common morality. In the second, I relate a persisting vacillation in their argument regarding the relative importance of reason and experience to a similar tension in classical liberal theory. In the third, I consider aspects of Lonergan's generalized empirical method as a way to address problems that surface in the first two sections of the paper: (1) the structural relation of reason and experience in human action; and (2) the importance of theory for practice in terms of what Lonergan calls "common sense" and "general bias."

  9. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to affirmation in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved alongside the development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described. This process is connected with the evolution of computer and information technologies as well as the increase in the number of tasks for the effective solution of which computational thinking is required. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts which are used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  10. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both develoments is the optimization of the cost-benefit ratio for training and equipment.

  11. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.) [de

  12. Universal computer interfaces

    CERN Document Server

    Dheere, RFBM

    1988-01-01

    Presents a survey of the latest developments in the field of the universal computer interface, resulting from a study of the world patent literature. Illustrating the state of the art today, the book ranges from basic interface structure, through parameters and common characteristics, to the most important industrial bus realizations. Recent technical enhancements are also included, with special emphasis devoted to the universal interface adapter circuit. Comprehensively indexed.

  13. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  14. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  15. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  16. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes' structure and use, data preparation and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part presents the most important computational techniques: finite element formulation and boundary element formulation, and presents the solutions of viscoelastic problems with Abaqus.

  17. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  18. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  19. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order

  20. Security in Computer Applications

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development. The last part of the lecture covers some miscellaneous issues like the use of cryptography, rules for networking applications, and social engineering threats. This lecture was first given on Thursd...

  1. Common Nearly Best Linear Estimates of Location and Scale ...

    African Journals Online (AJOL)

    Common nearly best linear estimates of location and scale parameters of normal and logistic distributions, which are based on complete samples, are considered. Here, the population from which the samples are drawn is either normal or logistic population or a fusion of both distributions and the estimates are computed ...

  2. Sustainability of common pool resources.

    Science.gov (United States)

    Timilsina, Raja Rajendra; Kotani, Koji; Kamijo, Yoshio

    2017-01-01

    Sustainability has become a key issue in managing natural resources together with growing concerns for capitalism, environmental and resource problems. We hypothesize that the ongoing modernization of competitive societies, which we refer to as "capitalism," affects human nature for utilizing common pool resources, thus compromising sustainability. To test this hypothesis, we design and implement a set of dynamic common pool resource games and experiments in the following two types of Nepalese areas: (i) rural (non-capitalistic) and (ii) urban (capitalistic) areas. We find that a proportion of prosocial individuals in urban areas is lower than that in rural areas, and urban residents deplete resources more quickly than rural residents. The composition of proself and prosocial individuals in a group and the degree of capitalism are crucial in that an increase in prosocial members in a group and the rural dummy positively affect resource sustainability by 65% and 63%, respectively. Overall, this paper shows that when societies move toward more capitalistic environments, the sustainability of common pool resources tends to decrease with the changes in individual preferences, social norms, customs and views to others through human interactions. This result implies that individuals may be losing their coordination abilities for social dilemmas of resource sustainability in capitalistic societies.

  3. Common morality and moral reform.

    Science.gov (United States)

    Wallace, K A

    2009-01-01

    The idea of moral reform requires that morality be more than a description of what people do value, for there has to be some measure against which to assess progress. Otherwise, any change is not reform, but simply difference. Therefore, I discuss moral reform in relation to two prescriptive approaches to common morality, which I distinguish as the foundational and the pragmatic. A foundational approach to common morality (e.g., Bernard Gert's) suggests that there is no reform of morality, but of beliefs, values, customs, and practices so as to conform with an unchanging, foundational morality. If, however, there were revision in its foundation (e.g., in rationality), then reform in morality itself would be possible. On a pragmatic view, on the other hand, common morality is relative to human flourishing, and its justification consists in its effectiveness in promoting flourishing. Morality is dependent on what in fact does promote human flourishing and therefore, could be reformed. However, a pragmatic approach, which appears more open to the possibility of moral reform, would need a more robust account of norms by which reform is measured.

  4. George Combe and common sense.

    Science.gov (United States)

    Dyde, Sean

    2015-06-01

    This article examines the history of two fields of enquiry in late eighteenth- and early nineteenth-century Scotland: the rise and fall of the common sense school of philosophy and phrenology as presented in the works of George Combe. Although many previous historians have construed these histories as separate, indeed sometimes incommensurate, I propose that their paths were intertwined to a greater extent than has previously been given credit. The philosophy of common sense was a response to problems raised by Enlightenment thinkers, particularly David Hume, and spurred a theory of the mind and its mode of study. In order to succeed, or even to be considered a rival of these established understandings, phrenologists adapted their arguments for the sake of engaging in philosophical dispute. I argue that this debate contributed to the relative success of these groups: phrenology as a well-known historical subject, common sense now largely forgotten. Moreover, this history seeks to question the place of phrenology within the sciences of mind in nineteenth-century Britain.

  5. Common Ground Between Three Cultures

    Directory of Open Access Journals (Sweden)

    Gloria Dunnivan

    2009-12-01

    Full Text Available The Triwizard program with Israel brought together students from three different communities: an Israeli Arab school, an Israeli Jewish school, and an American public school with few Jews and even fewer Muslims. The two Israeli groups met in Israel to find common ground and overcome their differences through dialogue and understanding. They communicated with the American school via technology such as video-conferencing, Skype, and emails. The program culminated with a visit to the U.S. The goal of the program was to embark upon a process that would bring about intercultural awareness and acceptance at the subjective level, guiding all involved to develop empathy and an insider's view of the other's culture. It was an attempt to have a group of Israeli high school students and a group of Arab Israeli students who had a fearful, distrustful perception of each other find common ground and become friends. TriWizard was designed to have participants begin a dialogue about issues, beliefs, and emotions based on the premise that cross-cultural training strategies that are effective in changing knowledge are those that engage the emotions, and actively develop empathy and an insider's views of another culture focused on what they have in common. Participants learned that they could become friends despite their cultural differences.

  6. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword. Preface. Computing Paradigms: Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading. Cloud Computing Fundamentals: Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact

  7. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  8. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  9. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project we also seek to introduce to scholars of religion some...

  10. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  11. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an .... Scientific and engineering applications (e.g., Tera grid secure gate way). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  12. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations and developments in transmission and emission computer tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of computer emission tomography (ECT), unknown in Poland, is described. The evaluation of two methods of ECT, namely positron and single photon emission tomography, is made. (author)

  13. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  14. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  15. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  16. Frustration: A common user experience

    DEFF Research Database (Denmark)

    Hertzum, Morten

    2010-01-01

    In the present study, 21 users self-reported their frustrating experiences during an average of 1.72 hours of computer use. As in the previous studies, the amount of time lost due to frustrating experiences was disturbing. The users spent 16% of their time trying to fix encountered problems and another 11% of their time redoing lost work. Thus, the frustrating experiences accounted for a total of 27% of the time. This main finding is exacerbated by several supplementary findings. For example, the users were unable to fix 26% of the experienced problems, and they rated that the problems recurred with a median...

  17. Common patterns in 558 diagnostic radiology errors.

    Science.gov (United States)

    Donald, Jennifer J; Barnard, Stuart A

    2012-04-01

    As a Quality Improvement initiative our department has held regular discrepancy meetings since 2003. We performed a retrospective analysis of the cases presented and identified the most common pattern of error. A total of 558 cases were referred for discussion over 92 months, and errors were classified as perceptual or interpretative. The most common patterns of error for each imaging modality were analysed, and the misses were scored by consensus as subtle or non-subtle. Of 558 diagnostic errors, 447 (80%) were perceptual and 111 (20%) were interpretative errors. Plain radiography and computed tomography (CT) scans were the most frequent imaging modalities accounting for 246 (44%) and 241 (43%) of the total number of errors, respectively. In the plain radiography group 120 (49%) of the errors occurred in chest X-ray reports with perceptual miss of a lung nodule occurring in 40% of this subgroup. In the axial and appendicular skeleton missed fractures occurred most frequently, and metastatic bone disease was overlooked in 12 of 50 plain X-rays of the pelvis or spine. The majority of errors within the CT group were in reports of body scans with the commonest perceptual errors identified including 16 missed significant bone lesions, 14 cases of thromboembolic disease and 14 gastrointestinal tumours. Of the 558 errors, 312 (56%) were considered subtle and 246 (44%) non-subtle. Diagnostic errors are not uncommon and are most frequently perceptual in nature. Identification of the most common patterns of error has the potential to improve the quality of reporting by improving the search behaviour of radiologists. © 2012 The Authors. Journal of Medical Imaging and Radiation Oncology © 2012 The Royal Australian and New Zealand College of Radiologists.

  18. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  19. COMMON APPROACH ON WASTE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    ANDREESCU Nicoleta Alina

    2017-05-01

    Full Text Available The world population has doubled since the 1960s, now reaching 7 billion, and it is estimated that it will continue growing. While in more advanced economies the population is starting to grow old and even shrink, in less developed countries population numbers are registering fast growth. Across the world, ecosystems are exposed to critical levels of pollution in more and more complex combinations. Human activities, population growth and shifting patterns of consumption are the main factors at the base of this ever-growing burden on our environment. Globalization means that the consumption and production patterns of a country or a region contribute to pressures on the environment in totally different parts of the world. With the rise of environmental problems, the search for solutions also began: methods and actions aimed at protecting the environment and leading to a better correlation between economic growth and the environment. The common goal of these endeavors by the participating states was to come up with medium- and long-term regulations that would lead to successfully solving environmental issues. In this paper, we have analyzed the way in which countries started collaborating in the 1970s at an international level in order to come up with a common policy that would have a positive impact on the environment. The European Union has come up with its own common policy, a policy that each member state must implement. In this context, Romania has developed its National Strategy for Waste Management, a program that Romania wishes to use to reduce the quantity of waste and better dispose of it.

  20. Modeling Common-Sense Decisions

    Science.gov (United States)

    Zak, Michail

    This paper presents a methodology for efficient synthesis of a dynamical model simulating a common-sense decision making process. The approach is based upon an extension of physics' First Principles to include the behavior of living systems. The new architecture consists of motor dynamics simulating the actual behavior of the object, and mental dynamics representing the evolution of the corresponding knowledge base and incorporating it in the form of information flows into the motor dynamics. The autonomy of the decision making process is achieved by a feedback from mental to motor dynamics. This feedback replaces unavailable external information with an internal knowledge base stored in the mental model in the form of probability distributions.

  1. The common European flexicurity principles

    DEFF Research Database (Denmark)

    Mailand, Mikkel

    2010-01-01

    This article analyses the decision-making process underlying the adoption of common EU flexicurity principles. Supporters of the initiative succeeded in convincing the sceptics one by one; the change of government in France and the last-minute support of the European social partner organizations...... were instrumental in this regard. However, the critics succeeded in weakening the initially strong focus on the transition from job security to employment security and the divisions between insiders and outsiders in the labour market. In contrast to some decision-making on the European Employment...

  2. Common blocks for ASQS(12)

    Directory of Open Access Journals (Sweden)

    Lorenzo Milazzo

    1997-05-01

    Full Text Available An ASQS(v) is a particular Steiner system featuring a set X of v vertices and two separate families of blocks, B and G, whose elements have a respective cardinality of 4 and 6. It has the property that any three vertices of X belong either to a B-block or to a G-block. The parameter cb is the number of common blocks in two separate ASQSs, both defined on the same set of vertices X. In this paper it is shown that cb ≤ 29 for any pair of ASQSs(12).

  3. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework

  4. Urban ambiances as common ground?

    Directory of Open Access Journals (Sweden)

    Jean-Paul Thibaud

    2014-07-01

    Full Text Available The aim of this paper is to point out various arguments which question ambiance as a common ground of everyday urban experience. Such a project involves four major points. First, we have to move beyond the exclusive practical aspects of everyday life and bring the sensory to the forefront. Under such conditions, sensory cultures emerge where feeling and acting come together. Second, we must put common experience into perspective by initiating a dual dynamics of socialising the sensory and sensitising social life. Ambiances involve a complex web comprised of an ‘existential’ dimension (empathy with the ambient world), a ‘contextual’ dimension (degree of presence in the situation), and an ‘interactional’ dimension (forms of sociability expressed in the tonality). Third, we have to initiate a political ecology of ambiances in order to better understand how ambiances deal with fundamental design and planning issues. Far from being neutral, the notion of ambiance appears to be bound up with the socio-aesthetic strategies underpinning changes to the sensory urban environment of the future. Fourth, we have to question what in situ experience is all about. Three major research pointers enable us to address this issue: the embodiment of situated experiences, the porous nature of sensory spaces, and the sensory efficiency of the built environment. Ambiances sensitize urban design as well as social life forms.

  5. Common questions about infectious mononucleosis.

    Science.gov (United States)

    Womack, Jason; Jimenez, Marissa

    2015-03-15

    Epstein-Barr is a ubiquitous virus that infects 95% of the world population at some point in life. Although Epstein-Barr virus (EBV) infections are often asymptomatic, some patients present with the clinical syndrome of infectious mononucleosis (IM). The syndrome most commonly occurs between 15 and 24 years of age. It should be suspected in patients presenting with sore throat, fever, tonsillar enlargement, fatigue, lymphadenopathy, pharyngeal inflammation, and palatal petechiae. A heterophile antibody test is the best initial test for diagnosis of EBV infection, with 71% to 90% accuracy for diagnosing IM. However, the test has a 25% false-negative rate in the first week of illness. IM is unlikely if the lymphocyte count is less than 4,000 mm3. The presence of EBV-specific immunoglobulin M antibodies confirms infection, but the test is more costly and results take longer than the heterophile antibody test. Symptomatic relief is the mainstay of treatment. Glucocorticoids and antivirals do not reduce the length or severity of illness. Splenic rupture is an uncommon complication of IM. Because physical activity within the first three weeks of illness may increase the risk of splenic rupture, athletic participation is not recommended during this time. Children are at the highest risk of airway obstruction, which is the most common cause of hospitalization from IM. Patients with immunosuppression are more likely to have fulminant EBV infection.

  6. DNA/SNLA commonality program

    International Nuclear Information System (INIS)

    Keller, D.V.; Watts, A.J.; Rice, D.A.; Powe, J.; Beezhold, W.

    1980-01-01

    The purpose of the Commonality program, initiated by DNA in 1978, was to evaluate e-beam material testing procedures and techniques by comparing material stress and spall data from various US and UK e-beam facilities and experimenters. As part of this joint DNA/SNL/UK Commonality effort, Sandia and Ktech used four different electron-beam machines to investigate various aspects of e-beam energy deposition in three materials. The deposition duration and the deposition profiles were varied, and the resulting stresses were measured. The materials studied were: (1) a low-Z material (A1), (2) a high-Z material (Ta), and (3) a typical porous material, a cermet. Aluminium and tantalum were irradiated using the DNA Blackjack 3 accelerator (60 ns pulse width), the DNA Blackjack 3' accelerator (30 ns pulse width), and the SNLA REHYD accelerator (100 ns pulse width). Propagating stresses were measured using x-cut quartz gauges, carbon gauges, and laser interferometry techniques. Data to determine the influence of deposition duration were obtained over a wide range of energy loadings. The cermet material was studied using the SNLA REHYD and HERMES II accelerators. The e-beam from REHYD generated propagating stresses which were monitored with quartz gauges as a function of sample thickness and energy loadings. The HERMES II accelerator was used to uniformly heat the cermet to determine the Grueneisen parameter and identify the incipient spall condition. Results of these experiments are presented

  7. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study a problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has an unbounded computing power. The thesis is based on two...... to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show...... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority

  8. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  9. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  10. Joint Service Common Operating Environment (COE) Common Geographic Information System functional requirements

    Energy Technology Data Exchange (ETDEWEB)

    Meitzler, W.D.

    1992-06-01

    In the context of this document and the COE, Geographic Information Systems (GIS) are decision support systems involving the integration of spatially referenced data in a problem solving environment. They are digital computer systems for capturing, processing, managing, displaying, modeling, and analyzing geographically referenced spatial data which are described by attribute data and location. The ability to perform spatial analysis and the ability to combine two or more data sets to create new spatial information differentiates a GIS from other computer mapping systems. While the CCGIS allows for data editing and input, its primary purpose is not to prepare data, but rather to manipulate, analyze, and clarify it. The CCGIS defined herein provides GIS services and resources including the spatial and map-related functionality common to all subsystems contained within the COE suite of C4I systems. The CCGIS, which is an integral component of the COE concept, relies on the other COE standard components to provide the definition for other support computing services required.
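
    As a rough illustration (not taken from the requirements document itself), the following minimal Python sketch shows the kind of operation that distinguishes a GIS: combining two spatially referenced data sets to derive new information. All names and coordinates here are hypothetical.

        # Minimal sketch: a spatial join of two hypothetical data sets, the kind of
        # overlay a GIS performs to create new spatial information.
        def point_in_bbox(point, bbox):
            """Return True if an (x, y) point lies inside an axis-aligned bounding box."""
            x, y = point
            xmin, ymin, xmax, ymax = bbox
            return xmin <= x <= xmax and ymin <= y <= ymax

        # Data set 1: sensor sites with location (id -> (x, y)).
        sites = {"S1": (2.0, 3.0), "S2": (7.5, 1.0), "S3": (4.2, 8.8)}

        # Data set 2: named zones described by bounding boxes (xmin, ymin, xmax, ymax).
        zones = {"ZoneA": (0.0, 0.0, 5.0, 5.0), "ZoneB": (5.0, 0.0, 10.0, 10.0)}

        # Spatial join: attach a zone attribute to every site, producing information
        # that exists in neither input data set on its own.
        site_zone = {
            sid: next((zid for zid, bbox in zones.items() if point_in_bbox(p, bbox)), None)
            for sid, p in sites.items()
        }
        print(site_zone)  # e.g. {'S1': 'ZoneA', 'S2': 'ZoneB', 'S3': None}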

  11. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature...... of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’....

  12. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  13. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  14. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  15. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  16. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  17. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helpin

  18. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  19. Common β-Thalassaemia Mutations in

    Directory of Open Access Journals (Sweden)

    P Azarfam

    2005-01-01

    Full Text Available Introduction: β-Thalassaemia was first explained by Thomas Cooly as Cooly’s anaemia in 1925. The β-thalassaemias are hereditary autosomal disorders with decreased or absent β-globin chain synthesis. The most common genetic defects in β-thalassaemias are caused by point mutations, micro deletions or insertions within the β-globin gene. Material and Methods: In this research, 142 blood samples (64 from the children's hospital of Tabriz, 15 from Shahid Gazi hospital of Tabriz, 18 from Urumia and 45 from Aliasghar hospital of Ardebil) were taken from thalassaemic patients (who were previously diagnosed). Then 117 non-familial samples were selected. The DNA of the lymphocytes of the blood samples was extracted by boiling and the Proteinase K-SDS procedure, and mutations were detected by ARMS-PCR methods. Results: From the results obtained, the eleven most common mutations, most of which were Mediterranean mutations, were screened for as follows: IVS-I-110(G-A), IVS-I-1(G-A), IVS-I-5(G-C), Frameshift Codon 44(-C), Codon 5(-CT), IVS-I-6(T-C), IVS-I-25(-25bp del), Frameshift 8.9(+G), IVS-II-1(G-A), Codon 39(C-T) and Codon 30(G-C), and the mutations of the samples were defined. The results showed that Frameshift 8.9(+G), IVS-I-110(G-A), IVS-II-1(G-A), IVS-I-5(G-C), IVS-I-1(G-A), Frameshift Codon 44(-C), Codon 5(-CT), IVS-I-6(T-C) and IVS-I-25(-25bp del), with frequencies of 29.9%, 25.47%, 17.83%, 7.00%, 6.36%, 6.63%, 3.8%, 2.5% and 0.63% respectively, represented the most common mutations in North-west Iran. No mutations in Codon 39(C-T) or Codon 30(G-C) were detected. Conclusion: The frequency of the same mutations in patients from North-west Iran seems to be different as compared to other regions like Turkey, Pakistan, Lebanon and the Fars province of Iran. The pattern of mutations in this region is more or less the same as in the Mediterranean region, but different from South-west Asia and East Asia.

  20. Research on cloud computing solutions

    Directory of Open Access Journals (Sweden)

    Liudvikas Kaklauskas

    2015-07-01

    Full Text Available Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, hybrid cloud and community cloud. The most common and well-known deployment model is the Public Cloud. A Private Cloud is suited for sensitive data, where the customer is dependent on a certain degree of security. According to the different types of services offered, cloud computing can be considered to consist of three layers (service models): IaaS (infrastructure as a service), PaaS (platform as a service) and SaaS (software as a service). The main cloud computing solutions are: web applications, data hosting, virtualization, database clusters and terminal services. The advantage of cloud computing is the ability to virtualize and share resources among different applications with the objective of better server utilization; without a clustering solution, a service may fail at the moment the server crashes. DOI: 10.15181/csat.v2i2.914

  1. 3N Cave: Longest salt cave in the world

    Czech Academy of Sciences Publication Activity Database

    Bruthans, J.; Filippi, Michal; Zare, M.; Asadi, N.; Vilhelm, Z.

    2006-01-01

    Roč. 64, č. 9 (2006), s. 10-18 ISSN 0027-7010 R&D Projects: GA AV ČR(CZ) KJB301110501 Institutional research plan: CEZ:AV0Z30130516 Keywords : salt cave * salt karst * Iran * expedition Namak Subject RIV: DB - Geology ; Mineralogy

  2. Towards the longest run in CERN’s history

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    Following the decisions taken at the Chamonix meeting, the teams are preparing the LHC to run at collision energies of 7 TeV during the coming 18-24 months. The consolidation of the nQPS connectors has been successfully completed and the magnet powering at high current has begun.   All the new electrical connectors of the nQPS have been succesfully tested. Since the last issue of the Bulletin, all of the approximately 4000 high-voltage electrical connectors of the new Quench Protection System (nQPS) have been replaced in record time. Teams are now upgrading the software on some of the electronic circuit boards. This work is already complete in half the machine. At the same time, the Hardware Commissioning group has started to power up the magnets in all the sectors. This is a lengthy process that involves gradually increasing the current to reach the 6 kA needed to steer beams at an energy of 3.5 TeV/beam. Currently, all the sectors have passed the tests at low current, and Sectors 1-2 has bee...

  3. Repair systems with exchangeable items and the longest queue mechanism

    NARCIS (Netherlands)

    Ravid, R.; Boxma, O.J.; Perry, D.

    2013-01-01

    We consider a repair facility consisting of one repairman and two arrival streams of failed items, from bases 1 and 2. The arrival processes are independent Poisson processes, and the repair times are independent and identically exponentially distributed. The item types are exchangeable, and a

  4. Repair systems with exchangeable items and the longest queue mechanism

    NARCIS (Netherlands)

    Ravid, R.; Boxma, O.J.; Perry, D.

    2011-01-01

    We consider a repair facility consisting of one repairman and two arrival streams of failed items, from bases 1 and 2. The arrival processes are independent Poisson processes, and the repair times are independent and identically exponentially distributed. The item types are exchangeable, and a

  5. The longest illness. Effects of nuclear war in children

    International Nuclear Information System (INIS)

    Kappy, M.S.

    1984-01-01

    The destruction of civilization that would follow a nuclear war would render any disaster ever recorded insignificant. Millions of people would perish during the first few hours, and many more would die in the months to come. Survival would exist only in the strictest sense of the word, since societal disorganization, famine, drought, darkness, and nuclear winter would envelop the earth. The comparative frailty of children and their dependence on adults would render them most susceptible to the acute effects of a nuclear holocaust. Furthermore, studies of the Hiroshima and Nagasaki, Japan bombings showed a disproportionate propensity for children to experience leukemias and other cancers years after the bombings. There were also great increases in perinatal deaths and cases of microcephaly and retardation in children exposed in utero to the bombs. In the event that there are future generations after a nuclear war, the issue of heritable genetic effects will become important. Suggestions of permanent genetic damage are emerging from the Hiroshima and Nagasaki studies. By comparison, the genetic effects of modern weaponry will be incalculable

  6. America’s Longest War – the War on Drugs

    OpenAIRE

    Wyrwisz, Anna

    2015-01-01

    The problem of using illicit drugs in the United States, which is the largest drug consumer in the world, is an important and controversial subject. Prohibition, which aimed to eliminate alcohol from American society, ended in failure. In the case of federal drug legislation, the first acts appeared exactly one hundred years ago. The next, intense phase began in 1970 during the presidency of Richard Nixon, when the war on drugs was declared. Until this day, the number of acts a...

  7. Harvesting NASA's Common Metadata Repository

    Science.gov (United States)

    Shum, D.; Mitchell, A. E.; Durbin, C.; Norton, J.

    2017-12-01

    As part of NASA's Earth Observing System Data and Information System (EOSDIS), the Common Metadata Repository (CMR) stores metadata for over 30,000 datasets from both NASA and international providers along with over 300M granules. This metadata enables sub-second discovery and facilitates data access. While the CMR offers a robust temporal, spatial and keyword search functionality to the general public and international community, it is sometimes more desirable for international partners to harvest the CMR metadata and merge the CMR metadata into a partner's existing metadata repository. This poster will focus on best practices to follow when harvesting CMR metadata to ensure that any changes made to the CMR can also be updated in a partner's own repository. Additionally, since each partner has distinct metadata formats they are able to consume, the best practices will also include guidance on retrieving the metadata in the desired metadata format using CMR's Unified Metadata Model translation software.
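
    As a hedged illustration of the harvesting workflow described above, the Python sketch below pages through collection metadata using the CMR's public search endpoint. The URL, parameter names (keyword, page_size, page_num) and JSON layout are assumptions based on the publicly documented CMR search API and should be verified against the current documentation before use.

        # Hedged sketch of harvesting collection metadata from NASA's Common Metadata
        # Repository (CMR). Endpoint and parameters are assumptions to verify.
        import requests

        CMR_URL = "https://cmr.earthdata.nasa.gov/search/collections.json"

        def harvest_collections(keyword, max_pages=3, page_size=50):
            """Page through CMR collection search results for a keyword."""
            results = []
            for page in range(1, max_pages + 1):
                resp = requests.get(
                    CMR_URL,
                    params={"keyword": keyword, "page_size": page_size, "page_num": page},
                    timeout=30,
                )
                resp.raise_for_status()
                entries = resp.json().get("feed", {}).get("entry", [])
                if not entries:          # stop when a page comes back empty
                    break
                results.extend(entries)
            return results

        # Keep only the fields a partner repository might merge (hypothetical choice).
        for rec in harvest_collections("sea surface temperature")[:5]:
            print(rec.get("id"), "-", rec.get("title"))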

  8. Why is migraine so common?

    Science.gov (United States)

    Edmeads, J

    1998-08-01

    Migraine is clearly a very common biological disorder, but this knowledge has not been sufficient as yet to ensure completely effective treatment strategies. There appears to be a discrepancy between what migraine patients desire as the outcome of consultations and what doctors think patients want. Patients seem, from Packard's selective study (11), to want explanation and reassurance before they get pain relief, whereas doctors view pain relief as the most important aim of management. It is possible that doctors still have underlying assumptions about psychological elements of migraine which color their perceptions of their patients. Communicating the relevance of scientific progress in migraine to neurologists and PCPs is an important challenge, as is calling attention to the patient's expectations from treatment. To be effective in improving education in this area, perhaps we should first ascertain the level of knowledge about the biology and treatment of headache among general neurologists.

  9. Practical advantages of evolutionary computation

    Science.gov (United States)

    Fogel, David B.

    1997-10-01

    Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.
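
    To make the self-adaptation mentioned above concrete, here is a minimal, self-contained Python sketch of a (1+1) evolution strategy whose mutation step size adapts during the search. The objective function and all parameter values are illustrative assumptions, not taken from the paper.

        # Minimal sketch of an evolutionary algorithm with an on-the-fly adapted
        # mutation step size, minimizing a toy objective (the sphere function).
        import random

        def sphere(x):
            return sum(v * v for v in x)

        def one_plus_one_es(dim=5, generations=500, seed=1):
            rng = random.Random(seed)
            parent = [rng.uniform(-5, 5) for _ in range(dim)]
            best = sphere(parent)
            sigma = 1.0                      # mutation step size, self-adapted below
            for _ in range(generations):
                child = [v + rng.gauss(0, sigma) for v in parent]
                fit = sphere(child)
                if fit <= best:              # keep improvements
                    parent, best = child, fit
                    sigma *= 1.1             # success: widen the search
                else:
                    sigma *= 0.95            # failure: narrow the search
            return parent, best

        _, fitness = one_plus_one_es()
        print(f"best fitness found: {fitness:.6f}")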

  10. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours).   ·         Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics ·         Emphasis on algorithmic advances that will allow re-application in other...
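
    For a concrete flavour of the geometric computations listed above, the sketch below evaluates the affine-invariant geodesic distance between two symmetric positive-definite (SPD) matrices, the kind of object used for diffusion tensors and covariance descriptors. The matrices are toy inputs and the snippet is illustrative only, not code from the book.

        # Hedged sketch: affine-invariant geodesic distance on the SPD manifold,
        # d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F, with toy 2x2 inputs.
        import numpy as np
        from scipy.linalg import sqrtm, logm

        def spd_geodesic_distance(A, B):
            A_inv_sqrt = np.linalg.inv(np.real(sqrtm(A)))
            M = A_inv_sqrt @ B @ A_inv_sqrt
            return np.linalg.norm(np.real(logm(M)), "fro")

        A = np.array([[2.0, 0.3], [0.3, 1.0]])
        B = np.array([[1.5, -0.2], [-0.2, 2.5]])
        print(spd_geodesic_distance(A, B))   # a single non-negative number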

  11. From computer to brain foundations of computational neuroscience

    CERN Document Server

    Lytton, William W

    2002-01-01

    Biology undergraduates, medical students and life-science graduate students often have limited mathematical skills. Similarly, physics, math and engineering students have little patience for the detailed facts that make up much of biological knowledge. Teaching computational neuroscience as an integrated discipline requires that both groups be brought forward onto common ground. This book does this by making ancillary material available in an appendix and providing basic explanations without becoming bogged down in unnecessary details. The book will be suitable for undergraduates and beginning graduate students taking a computational neuroscience course and also to anyone with an interest in the uses of the computer in modeling the nervous system.

  12. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  13. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  14. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  15. Quantum Computation

    Indian Academy of Sciences (India)

    Quantum Computation – Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article, Resonance – Journal of Science Education, Volume 16, Issue 9, September 2011, pp. 821-835.

  16. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  17. Computer Recreations.

    Science.gov (United States)

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)
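
    As a rough, hypothetical sketch of the kind of model described (not the original BOUNCE program, and using fixed time steps rather than the critical-event technique), the Python snippet below lets a heavy piston settle over a set of balls that bounce elastically off the floor and the piston. All masses, counts and step sizes are illustrative assumptions.

        # Rough sketch: weighted piston over gas-like bouncing balls, fixed time steps.
        import random

        G, DT = 9.81, 1e-3                    # gravity on the piston; time step (s)
        BALL_MASS, PISTON_MASS = 1.0, 10.0

        random.seed(0)
        balls = [[random.uniform(0.1, 0.9), random.uniform(-5.0, 5.0)]
                 for _ in range(50)]          # each ball: [height, velocity]
        piston_y, piston_v = 1.0, 0.0

        for _ in range(20000):
            piston_v -= G * DT
            piston_y += piston_v * DT
            for b in balls:
                b[0] += b[1] * DT
                if b[0] <= 0.0:               # elastic bounce off the floor
                    b[0], b[1] = 0.0, abs(b[1])
                if b[0] >= piston_y:          # elastic 1-D collision with the piston
                    v1, v2, m1, m2 = b[1], piston_v, BALL_MASS, PISTON_MASS
                    b[1] = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
                    piston_v = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
                    b[0] = piston_y

        print(f"piston height after the run: {piston_y:.2f}")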

  18. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  19. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in todays post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  20. Optical Computing

    Indian Academy of Sciences (India)

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  1. Data mining in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ruxandra-Ştefania PETRE

    2012-10-01

    Full Text Available This paper describes how data mining is used in cloud computing. Data mining is used for extracting potentially useful information from raw data. The integration of data mining techniques into normal day-to-day activities has become commonplace. Every day people are confronted with targeted advertising, and data mining techniques help businesses become more efficient by reducing costs. Data mining techniques and applications are very much needed in the cloud computing paradigm. The implementation of data mining techniques through cloud computing allows users to retrieve meaningful information from a virtually integrated data warehouse that reduces the costs of infrastructure and storage.
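
    As a small, self-contained illustration of the kind of pattern extraction described above (hypothetical transaction data, not from the paper), the sketch below counts frequently co-purchased item pairs, the sort of signal behind targeted advertising.

        # Minimal sketch: mining frequent item pairs from raw transaction data.
        from collections import Counter
        from itertools import combinations

        transactions = [
            {"bread", "milk", "butter"},
            {"bread", "milk"},
            {"milk", "diapers", "beer"},
            {"bread", "butter"},
            {"bread", "milk", "diapers"},
        ]

        pair_counts = Counter()
        for basket in transactions:
            for pair in combinations(sorted(basket), 2):
                pair_counts[pair] += 1

        min_support = 2                       # hypothetical support threshold
        for pair, count in pair_counts.most_common():
            if count >= min_support:
                print(pair, count)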

  2. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.

  3. Sustainable models of audiovisual commons

    Directory of Open Access Journals (Sweden)

    Mayo Fuster Morell

    2013-03-01

    Full Text Available This paper addresses an emerging phenomenon characterized by continuous change and experimentation: the collaborative creation of audiovisual commons online. The analysis focuses on models of sustainability for collaborative online creation, paying particular attention to the use of different forms of advertising. The article is an excerpt from a larger investigation whose unit of analysis is Online Creation Communities whose central node of activity lies in the Catalan territory. For the 22 selected cases, the methodology combines quantitative analysis, through a questionnaire delivered to all cases, with qualitative analysis through face-to-face interviews conducted in 8 of the cases studied. The research, whose conclusions are summarized in this article, leads us to conclude that the sustainability of a project depends largely on relationships of trust and interdependence between the various voluntary agents, on non-monetary contributions and rewards, and on freely usable resources and infrastructure. Altogether, this leads us to conclude that this is and will remain a very important area for the future of audiovisual content and its sustainability, and that it will imply changes in the policies that govern it.

  4. Common bus multinode sensor system

    International Nuclear Information System (INIS)

    Kelly, T.F.; Naviasky, E.H.; Evans, W.P.; Jefferies, D.W.; Smith, J.R.

    1988-01-01

    This patent describes a nuclear power plant including a common bus multinode sensor system for sensors in the nuclear power plant, each sensor producing a sensor signal. The system consists of: a power supply providing power; a communication cable coupled to the power supply; plural remote sensor units coupled between the cable and one or more sensors, and comprising: a direct current power supply, connected to the cable and converting the power on the cable into direct current; an analog-to-digital converter connected to the direct current power supply; an oscillator reference; a filter; and an integrated circuit sensor interface connected to the direct current power supply, the analog-to-digital converter, the oscillator crystal and the filter, the interface comprising: a counter receiving a frequency designation word from external to the interface; a phase-frequency comparator connected to the counter; an oscillator connected to the oscillator reference; a timing counter connected to the oscillator, the phase/frequency comparator and the analog-to-digital converter; an analog multiplexer connectable to the sensors and the analog-to-digital converter, and connected to the timing counter; a shift register operatively connected to the timing counter and the analog-to-digital converter; an encoder connected to the shift register and connectable to the filter; and a voltage controlled oscillator connected to the filter and the cable

  5. Common hyperspectral image database design

    Science.gov (United States)

    Tian, Lixun; Liao, Ningfang; Chai, Ali

    2009-11-01

    This paper introduces a common hyperspectral image database built with a demand-oriented database design method (CHIDB), which brings ground-based spectra, standardized hyperspectral cubes, and spectral analysis together to serve a range of applications. The paper presents an integrated approach to retrieving spectral and spatial patterns from remotely sensed imagery using state-of-the-art data mining and advanced database technologies; several data mining ideas and functions were incorporated into CHIDB to make it more suitable for agricultural, geological and environmental applications. A broad range of data from multiple regions of the electromagnetic spectrum is supported, including ultraviolet, visible, near-infrared, thermal infrared, and fluorescence. CHIDB is based on the .NET Framework and designed with an MVC architecture comprising five main functional modules: Data importer/exporter, Image/spectrum Viewer, Data Processor, Parameter Extractor, and On-line Analyzer. The original data are stored in SQL Server 2008 for efficient search, query and update, and advanced spectral image processing techniques such as parallel processing in C# are used. Finally, an application case in agricultural disease detection is presented.

  6. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction must be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  7. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general-purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next, despite which there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of the capabilities of the principal available systems and to highlight one or two trends. A reference to the most recent full survey of computer algebra in relativity is given, together with brief descriptions of Maple, REDUCE, SHEEP and other applications. (author)

  8. [Computer program "PANCREAS"].

    Science.gov (United States)

    Jakubowicz, J; Jankowski, M; Szomański, B; Switka, S; Zagórowicz, E; Pertkiewicz, M; Szczygieł, B

    1998-01-01

    Contemporary computer technology allows precise and fast analysis of large databases. Widespread and common use depends on appropriate, user-friendly software, which is usually lacking in specialized medical applications. The aim of this work was to develop an integrated system designed to store, explore and analyze data on patients treated for pancreatic cancer. For that purpose the database administration system MS Visual FoxPro 3.0 was used, and a special application conforming to the ISO 9000 series was developed. The system works under MS Windows 95, with easy adaptation to MS Windows 3.11 or MS Windows NT through its graphical user interface. The system stores personal data, laboratory results, imaging and histological findings, and information on treatment course and complications. It archives these data and enables the preparation of reports according to individual and statistical needs. Help and security settings also allow users unfamiliar with computers to work with the system.

  9. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  10. Computed tomography

    International Nuclear Information System (INIS)

    Andre, M.; Resnick, D.

    1988-01-01

    Computed tomography (CT) has matured into a reliable and prominent tool for study of the musculoskeletal system. When it was introduced in 1973, it was unique in many ways and posed a challenge to interpretation. It is in these unique features, however, that its advantages lie in comparison with conventional techniques. These advantages will be described in a spectrum of important applications in orthopedics and rheumatology

  11. Computed radiography

    International Nuclear Information System (INIS)

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  12. Computational universes

    International Nuclear Information System (INIS)

    Svozil, Karl

    2005-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing 'in the mind' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view

  13. Common Questions About Chronic Prostatitis.

    Science.gov (United States)

    Holt, James D; Garrett, W Allan; McCurry, Tyler K; Teichman, Joel M H

    2016-02-15

    Chronic prostatitis is relatively common, with a lifetime prevalence of 1.8% to 8.2%. Risk factors include conditions that facilitate introduction of bacteria into the urethra and prostate (which also predispose the patient to urinary tract infections) and conditions that can lead to chronic neuropathic pain. Chronic prostatitis must be differentiated from other causes of chronic pelvic pain, such as interstitial cystitis/bladder pain syndrome and pelvic floor dysfunction; prostate and bladder cancers; benign prostatic hyperplasia; urolithiasis; and other causes of dysuria, urinary frequency, and nocturia. The National Institutes of Health divides prostatitis into four syndromes: acute bacterial prostatitis, chronic bacterial prostatitis (CBP), chronic nonbacterial prostatitis (CNP)/chronic pelvic pain syndrome (CPPS), and asymptomatic inflammatory prostatitis. CBP and CNP/CPPS both lead to pelvic pain and lower urinary tract symptoms. CBP presents as recurrent urinary tract infections with the same organism identified on repeated cultures; it responds to a prolonged course of an antibiotic that adequately penetrates the prostate, if the urine culture suggests sensitivity. If four to six weeks of antibiotic therapy is effective but symptoms recur, another course may be prescribed, perhaps in combination with alpha blockers or nonopioid analgesics. CNP/CPPS, accounting for more than 90% of chronic prostatitis cases, presents as prostatic pain lasting at least three months without consistent culture results. Weak evidence supports the use of alpha blockers, pain medications, and a four- to six-week course of antibiotics for the treatment of CNP/CPPS. Patients may also be referred to a psychologist experienced in managing chronic pain. Experts on this condition recommend a combination of treatments tailored to the patient's phenotypic presentation. Urology referral should be considered when appropriate treatment is ineffective. Additional treatments include pelvic

  14. Coordinating towards a Common Good

    Science.gov (United States)

    Santos, Francisco C.; Pacheco, Jorge M.

    2010-09-01

    Throughout their lives, humans often engage in collective endeavors ranging from family-related issues to global warming. In all cases, the tragedy of the commons threatens the possibility of reaching the optimal solution associated with global cooperation, a scenario predicted by theory and demonstrated by many experiments. Using the toolbox of evolutionary game theory, I will address two important aspects of evolutionary dynamics that have been neglected so far in the context of public goods games and evolution of cooperation. On one hand, the fact that often there is a threshold above which a public good is reached [1, 2]. On the other hand, the fact that individuals often participate in several games, related to their social context and pattern of social ties, defined by a social network [3, 4, 5]. In the first case, the existence of a threshold above which collective action is materialized dictates a rich pattern of evolutionary dynamics where the direction of natural selection can be inverted compared to standard expectations. Scenarios of defector dominance, pure coordination or coexistence may arise simultaneously. Both finite and infinite population models are analyzed. In networked games, cooperation blooms whenever the act of contributing is more important than the effort contributed. In particular, the heterogeneous nature of social networks naturally induces a symmetry breaking of the dilemmas of cooperation, as contributions made by cooperators may become contingent on the social context in which the individual is embedded. This diversity in context provides an advantage to cooperators, which is particularly strong when both wealth and social ties follow a power-law distribution, providing clues on the self-organization of social communities. Finally, in both situations, it can be shown that individuals no longer play a defection dominance dilemma, but effectively engage in a general N-person coordination game. Even if locally defection may seem
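    To make the threshold structure described above concrete, the following minimal sketch implements the payoff rule of an N-person threshold public goods game; the threshold, cost and benefit values are illustrative assumptions, not parameters from the talk.

      # N-person threshold public goods game (editorial sketch).  The benefit is
      # delivered to everyone only if at least `threshold` players cooperate;
      # each cooperator pays a cost regardless of the outcome.

      def payoff(is_cooperator: bool, n_cooperators: int,
                 threshold: int = 5, cost: float = 1.0, benefit: float = 5.0) -> float:
          gain = benefit if n_cooperators >= threshold else 0.0
          return gain - (cost if is_cooperator else 0.0)

      # Below the threshold cooperators simply lose their contribution; at or
      # above it everyone gains, which turns defection dominance into coordination.
      print(payoff(True, 4), payoff(False, 4))   # -1.0 0.0
      print(payoff(True, 5), payoff(False, 5))   #  4.0 5.0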

  15. Designing the Microbial Research Commons

    Energy Technology Data Exchange (ETDEWEB)

    Uhlir, Paul F. [Board on Research Data and Information Policy and Global Affairs, Washington, DC (United States)

    2011-10-01

    Recent decades have witnessed an ever-increasing range and volume of digital data. All elements of the pillars of science--whether observation, experiment, or theory and modeling--are being transformed by the continuous cycle of generation, dissemination, and use of factual information. This is even more so in terms of the re-using and re-purposing of digital scientific data beyond the original intent of the data collectors, often with dramatic results. We all know about the potential benefits and impacts of digital data, but we are also aware of the barriers, the challenges in maximizing the access, and use of such data. There is thus a need to think about how a data infrastructure can enhance capabilities for finding, using, and integrating information to accelerate discovery and innovation. How can we best implement an accessible, interoperable digital environment so that the data can be repeatedly used by a wide variety of users in different settings and with different applications? With this objective: to use the microbial communities and microbial data, literature, and the research materials themselves as a test case, the Board on Research Data and Information held an International Symposium on Designing the Microbial Research Commons at the National Academy of Sciences in Washington, DC on 8-9 October 2009. The symposium addressed topics such as models to lower the transaction costs and support access to and use of microbiological materials and digital resources from the perspective of publicly funded research, public-private interactions, and developing country concerns. The overall goal of the symposium was to stimulate more research and implementation of improved legal and institutional models for publicly funded research in microbiology.

  16. SEAL: Common Core Libraries and Services for LHC Applications

    CERN Document Server

    Generowicz, J; Moneta, L; Roiser, S; Marino, M; Tuura, L A

    2003-01-01

    The CERN LHC experiments began the LHC Computing Grid project in 2001. One of the project's aims is to develop common software infrastructure based on a development vision shared by the participating experiments. The SEAL project will provide common foundation libraries, services and utilities identified by the project's architecture blueprint report. This requires a broad range of functionality that no individual package suitably covers. SEAL thus selects external and experiment-developed packages, integrates them in a coherent whole, develops new code for missing functionality, and provides support to the experiments. We describe the set of basic components identified by the LHC Computing Grid project and thought to be sufficient for development of higher level framework components and specializations. Examples of such components are a plug-in manager, an object dictionary, object whiteboards, an incident or event manager. We present the design and implementation of some of these components and the und...
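    As a purely illustrative aside (this is an editorial sketch, not SEAL's actual interfaces), the idea of a plug-in manager of the kind listed above reduces to a small registry that creates named components on demand:

      # Editorial sketch of a minimal plug-in manager: factories register under a
      # name and components are instantiated on request.  All names are hypothetical.

      class PluginManager:
          def __init__(self):
              self._factories = {}

          def register(self, name, factory):
              self._factories[name] = factory

          def create(self, name, *args, **kwargs):
              if name not in self._factories:
                  raise KeyError(f"no plug-in registered under {name!r}")
              return self._factories[name](*args, **kwargs)

      manager = PluginManager()
      manager.register("histogram_service", lambda nbins=10: {"kind": "hist", "nbins": nbins})
      print(manager.create("histogram_service", nbins=50))  # {'kind': 'hist', 'nbins': 50}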

  17. Proposals for common definitions of reference points in gynecological brachytherapy

    International Nuclear Information System (INIS)

    Chassagne, D.; Horiot, J.C.

    1977-01-01

    In May 1975 the report of the European Curietherapy Group made the following recommendations for computer dosimetry in gynecology. Use of reference points: a lymphatic trapezoid figure with 6 points and pelvic wall points, all referred to bony structures. Use of critical organ reference points: maximum rectal dose, bladder dose, mean rectal dose. Use of the 6,000 rad reference isodose, described by its height, width, and thickness dimensions. These proposals are the basis of a common language in gynecological brachytherapy [fr

  18. Development of a common data model for scientific simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, J. [Los Alamos National Lab., NM (United States); Butler, D.M. [Limit Point Systems, Inc. (United States); Matarazzo, C.; Miller, M. [Lawrence Livermore National Lab., CA (United States); Schoof, L. [Sandia National Lab., Albuquerque, NM (United States)

    1999-06-01

    The problem of sharing data among scientific simulation models is a difficult and persistent one. Computational scientists employ an enormous variety of discrete approximations in modeling physical processes on computers. Problems occur when models based on different representations are required to exchange data with one another, or with some other software package. Within the DOE's Accelerated Strategic Computing Initiative (ASCI), a cross-disciplinary group called the Data Models and Formats (DMF) group has been working to develop a common data model. The current model is comprised of several layers of increasing semantic complexity. One of these layers is an abstract model based on set theory and topology called the fiber bundle kernel (FBK). This layer provides the flexibility needed to describe a wide range of mesh-approximated functions as well as other entities. This paper briefly describes the ASCI common data model, its mathematical basis, and ASCI prototype development. These prototypes include an object-oriented data management library developed at Los Alamos called the Common Data Model Library or CDMlib, the Vector Bundle API from the Lawrence Livermore Laboratory, and the DMF API from Sandia National Laboratory.

  19. Common modelling approaches for training simulators for nuclear power plants

    International Nuclear Information System (INIS)

    1990-02-01

    Training simulators for nuclear power plant operating staff have gained increasing importance over the last twenty years. One of the recommendations of the 1983 IAEA Specialists' Meeting on Nuclear Power Plant Training Simulators in Helsinki was to organize a Co-ordinated Research Programme (CRP) on some aspects of training simulators. The goal statement was: ''To establish and maintain a common approach to modelling for nuclear training simulators based on defined training requirements''. Before adopting this goal statement, the participants considered many alternatives for defining the common aspects of training simulator models, such as the programming language used, the nature of the simulator computer system, the size of the simulation computers, and the scope of simulation. The participants agreed that it was the training requirements that defined the need for a simulator, the scope of models and hence the type of computer complex that was required, and the criteria for fidelity and verification, and that these requirements were therefore the most appropriate basis for the commonality of modelling approaches. It should be noted that the Co-ordinated Research Programme was restricted, for a variety of reasons, to considering only a few aspects of training simulators. This report reflects these limitations, and covers only the topics considered within the scope of the programme. The information in this document is intended as an aid for operating organizations to identify possible modelling approaches for training simulators for nuclear power plants. 33 refs

  20. Approximate solutions of common fixed-point problems

    CERN Document Server

    Zaslavski, Alexander J

    2016-01-01

    This book presents results on the convergence behavior of algorithms which are known as vital tools for solving convex feasibility problems and common fixed point problems. The main goal for us in dealing with a known computational error is to find what approximate solution can be obtained and how many iterates one needs to find it. According to known results, these algorithms should converge to a solution. In this exposition, these algorithms are studied, taking into account computational errors which are always present in practice. In this case the convergence to a solution does not take place. We show that our algorithms generate a good approximate solution if computational errors are bounded from above by a small positive constant. Beginning with an introduction, this monograph moves on to study: · dynamic string-averaging methods for common fixed point problems in a Hilbert space · dynamic string methods for common fixed point problems in a metric space · dynamic string-averaging version of the proximal...
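    The flavour of the iterations studied in the book can be conveyed with a small sketch: projections onto two convex sets (here, two overlapping intervals on the real line) are applied along "strings" and the end points averaged, with a small bounded perturbation standing in for the computational error; the sets, the error bound and the iteration count are illustrative assumptions.

      # Sketch of a dynamic string-averaging iteration for a common fixed point of
      # two interval projections; any point of [1, 2] is a common fixed point.
      # A bounded random perturbation mimics the computational errors analysed in
      # the book (illustrative values only).

      import random

      def proj(lo, hi):
          return lambda x: min(max(x, lo), hi)

      P1, P2 = proj(0.0, 2.0), proj(1.0, 3.0)

      def string_average(x, strings, eps=1e-3):
          ends = []
          for string in strings:               # apply each string of operators
              y = x
              for op in string:
                  y = op(y) + random.uniform(-eps, eps)
              ends.append(y)
          return sum(ends) / len(ends)         # average the string end points

      x = 10.0
      for _ in range(30):
          x = string_average(x, [[P1, P2], [P2, P1]])
      print(round(x, 3))                       # near the common region [1, 2], up to O(eps)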

  1. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  2. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang; Germain, Cécile; Sebag, Michèle

    2010-01-01

    Detecting changes is a common issue in many application fields due to the non-stationary distribution of the application data, e.g., sensor network signals, web logs and grid running logs. Toward Autonomic Grid Computing, adaptively detecting

  3. An Electrical Analog Computer for Poets

    Science.gov (United States)

    Bruels, Mark C.

    1972-01-01

    Nonphysics majors are presented with a direct current experiment beyond Ohm's law and the series and parallel laws. This involves construction of an analog computer from common rheostats and student-assembled voltmeters. (Author/TS)

  4. EU grid computing effort takes on malaria

    CERN Multimedia

    Lawrence, Stacy

    2006-01-01

    Malaria is the world's most common parasitic infection, affecting more than 500 million people annually and killing more than 1 million. In order to help combat malaria, CERN has launched a grid computing effort (1 page)

  5. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs
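    For orientation, the "simplest transmission CT modality" rests on the standard attenuation relation and its projection (line-integral) form; the notation below is the conventional textbook one rather than the article's own.

      \[
        I \;=\; I_0 \exp\!\Big(-\int_{L} \mu(x,y)\,\mathrm{d}s\Big)
        \qquad\Longrightarrow\qquad
        p_L \;=\; -\ln\frac{I}{I_0} \;=\; \int_{L} \mu(x,y)\,\mathrm{d}s ,
      \]

      where $I_0$ and $I$ are the incident and transmitted intensities along ray $L$ and $\mu$ is the linear attenuation coefficient; collecting the projections $p_L$ over many rays and view angles gives the data that the reconstruction algorithm inverts to recover the cross-sectional image $\mu(x,y)$.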

  6. Computing Services and Assured Computing

    Science.gov (United States)

    2006-05-01

    [Scanned briefing excerpts] "...fighters' ability to execute the mission." Computing Services: we run IT systems that provide medical care, pay the warfighters, and manage maintenance; the environment spans ... users; 1,400 applications; 18 facilities; 180 software vendors; 18,000+ copies of executive software products; virtually every type of mainframe and ...

  7. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  8. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  9. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. Also we propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
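    As a concrete reference for the policing function described in the abstract, here is a minimal token-bucket sketch; the rate, bucket depth and packet sizes are illustrative assumptions, and the paper's dynamic model is considerably more elaborate.

      # Minimal token-bucket policer (illustrative values).  Tokens accumulate at
      # `rate` up to `depth`; a packet conforms if enough tokens are available,
      # otherwise it is dropped or marked by the policing function.

      class TokenBucket:
          def __init__(self, rate: float, depth: float):
              self.rate, self.depth = rate, depth
              self.tokens, self.last = depth, 0.0

          def conforms(self, packet_size: float, now: float) -> bool:
              self.tokens = min(self.depth, self.tokens + (now - self.last) * self.rate)
              self.last = now
              if packet_size <= self.tokens:
                  self.tokens -= packet_size
                  return True
              return False

      tb = TokenBucket(rate=1000.0, depth=1500.0)            # bytes/s, bytes
      for t, size in [(0.0, 1000), (0.1, 1000), (2.0, 1000)]:
          print(t, tb.conforms(size, t))                     # True, False, True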

  10. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since the lesson draws on the creative function of basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process covering the basic knowledge, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered a good example of planning for any subject, owing to the unexpected convergence of visual and technical abilities with linguistic abilities.

  11. Science for common entrance physics : answers

    CERN Document Server

    Pickering, W R

    2015-01-01

    This book contains answers to all exercises featured in the accompanying textbook Science for Common Entrance: Physics, which covers every Level 1 and 2 topic in the ISEB 13+ Physics Common Entrance exam syllabus. - Clean, clear layout for easy marking. - Includes examples of high-scoring answers with diagrams and workings. - Suitable for ISEB 13+ Science Common Entrance exams taken from Autumn 2017 onwards. Also available to purchase from the Galore Park website www.galorepark.co.uk: - Science for Common Entrance: Physics. - Science for Common Entrance: Biology. - Science for Common En

  12. Linking computers for science

    CERN Multimedia

    2005-01-01

    After the success of SETI@home, many other scientists have found computer power donated by the public to be a valuable resource - and sometimes the only possibility to achieve their goals. In July, representatives of several “public resource computing” projects came to CERN to discuss technical issues and R&D activities on the common computing platform they are using, BOINC. This photograph shows the LHC@home screen-saver which uses the BOINC platform: the dots represent protons and the position of the status bar indicates the progress of the calculations. This summer, CERN hosted the first “pangalactic workshop” on BOINC (Berkeley Open Infrastructure for Network Computing). BOINC is modelled on SETI@home, which millions of people have downloaded to help search for signs of extraterrestrial intelligence in radio-astronomical data. BOINC provides a general-purpose framework for scientists to adapt their software to, so that the public can install and run it. An important part of BOINC is managing the...

  13. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justifies the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media is used in imaging to demonstrate scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region of the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions) are useful tools
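    The region-of-interest quantitation mentioned above reduces to simple statistics over the pixels inside the outlined region; the sketch below uses synthetic data and an assumed pixel spacing purely for illustration.

      # Region-of-interest (ROI) quantitation on a synthetic CT slice: mean CT
      # number and area of the outlined region.  Data and pixel spacing are
      # editorial assumptions, not values from the article.

      import numpy as np

      slice_hu = np.full((256, 256), -1000.0)              # air background
      yy, xx = np.ogrid[:256, :256]
      lesion = (yy - 128) ** 2 + (xx - 128) ** 2 <= 20 ** 2
      slice_hu[lesion] = 40.0                               # soft-tissue-like values

      roi_mask = lesion                                     # the outlined region
      pixel_spacing_mm = 0.5                                # assumed, per side

      mean_hu = slice_hu[roi_mask].mean()
      area_mm2 = roi_mask.sum() * pixel_spacing_mm ** 2
      print(f"mean CT number: {mean_hu:.1f} HU, area: {area_mm2:.1f} mm^2")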

  14. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to have the computing assigned to a great number of distributed computers, rather than the local computer ...

  15. Virtualization and cloud computing in dentistry.

    Science.gov (United States)

    Chow, Frank; Muftu, Ali; Shorter, Richard

    2014-01-01

    The use of virtualization and cloud computing has changed the way we use computers. Virtualization is a method of placing software called a hypervisor on the hardware of a computer or a host operating system. It allows a guest operating system to run on top of the physical computer with a virtual machine (i.e., virtual computer). Virtualization allows multiple virtual computers to run on top of one physical computer and to share its hardware resources, such as printers, scanners, and modems. This increases the efficient use of the computer by decreasing costs (e.g., hardware, electricity, administration, and management) since only one physical computer is needed and running. This virtualization platform is the basis for cloud computing. It has expanded into areas of server and storage virtualization. One of the commonly used dental storage systems is cloud storage. Patient information is encrypted as required by the Health Insurance Portability and Accountability Act (HIPAA) and stored on off-site private cloud services for a monthly service fee. As computer costs continue to increase, so too will the need for more storage and processing power. Virtual and cloud computing will be a method for dentists to minimize costs and maximize computer efficiency in the near future. This article will provide some useful information on current uses of cloud computing.

  16. Engineering applications of soft computing

    CERN Document Server

    Díaz-Cortés, Margarita-Arimatea; Rojas, Raúl

    2017-01-01

    This book bridges the gap between Soft Computing techniques and their applications to complex engineering problems. In each chapter we endeavor to explain the basic ideas behind the proposed applications in an accessible format for readers who may not possess a background in some of the fields. Therefore, engineers or practitioners who are not familiar with Soft Computing methods will appreciate that the techniques discussed go beyond simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. At the same time, the book will show members of the Soft Computing community how engineering problems are now being solved and handled with the help of intelligent approaches. Highlighting new applications and implementations of Soft Computing approaches in various engineering contexts, the book is divided into 12 chapters. Further, it has been structured so that each chapter can be read independently of the others.

  17. Preschool Cookbook of Computer Programming Topics

    Science.gov (United States)

    Morgado, Leonel; Cruz, Maria; Kahn, Ken

    2010-01-01

    A common problem in computer programming use for education in general, not simply as a technical skill, is that children and teachers find themselves constrained by what is possible through limited expertise in computer programming techniques. This is particularly noticeable at the preliterate level, where constructs tend to be limited to…

  18. Mathematics and Computer Science: The Interplay

    OpenAIRE

    Madhavan, Veni CE

    2005-01-01

    Mathematics has been an important intellectual preoccupation of man for a long time. Computer science as a formal discipline is about seven decades young. However, one thing in common between all users and producers of mathematical thought is the almost involuntary use of computing. In this article, we bring to fore the many close connections and parallels between the two sciences of mathematics and computing. We show that, unlike in the other branches of human inquiry where mathematics is me...

  19. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common form of simulators of computer systems are software-based running on standard computers. One promising approach to improve simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches of using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  20. Common Core: Teaching Optimum Topic Exploration (TOTE)

    Science.gov (United States)

    Karge, Belinda Dunnick; Moore, Roxane Kushner

    2015-01-01

    The Common Core has become a household term and yet many educators do not understand what it means. This article explains the historical perspectives of the Common Core and gives guidance to teachers in application of Teaching Optimum Topic Exploration (TOTE) necessary for full implementation of the Common Core State Standards. An effective…

  1. A School for the Common Good

    Science.gov (United States)

    Baines, Lawrence; Foster, Hal

    2006-01-01

    This article examines the history and the concept of the common school from the Common School Movement reformers of the 1850s to the present. These reformers envisioned schools that were to be tuition free and open to everyone, places where rich and poor met and learned together on equal terms. Central to the concept of the common school is its…

  2. 49 CFR 1185.5 - Common control.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Common control. 1185.5 Section 1185.5... OF TRANSPORTATION RULES OF PRACTICE INTERLOCKING OFFICERS § 1185.5 Common control. It shall not be... carriers if such carriers are operated under common control or management either: (a) Pursuant to approval...

  3. Simplifying the ELA Common Core; Demystifying Curriculum

    Science.gov (United States)

    Schmoker, Mike; Jago, Carol

    2013-01-01

    The English Language Arts (ELA) Common Core State Standards ([CCSS], 2010) could have a transformational effect on American education. Though the process seems daunting, one can begin immediately integrating the essence of the ELA Common Core in every subject area. This article shows how one could implement the Common Core and create coherent,…

  4. Common Frame of Reference and social justice

    NARCIS (Netherlands)

    Hesselink, M.W.; Satyanarayana, R.

    2009-01-01

    The article "Common Frame of Reference and Social Justice" by Martijn W. Hesselink evaluates the Draft Common Frame of Reference (DCFR) of social justice. It discusses the important areas, namely a common frame of Reference in a broad sense, social justice and contract law, private law and

  5. Learning Commons in Academic Libraries: Discussing Themes in the Literature from 2001 to the Present

    Science.gov (United States)

    Blummer, Barbara; Kenton, Jeffrey M.

    2017-01-01

    Although the term lacks a standard definition, learning commons represent academic library spaces that provide computer and library resources as well as a range of academic services that support learners and learning. Learning commons have been equated to a laboratory for creating knowledge and staffed with librarians that serve as facilitators of…

  6. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for the Wolsong and Qinshan called the I A and to use a new hardware platform in order to ensure successful operation for the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures.
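    The triple modular redundant arrangement mentioned for the new trip-computer platform is, at its heart, 2-out-of-3 voting; the sketch below shows only that generic voting logic and is not based on the vendor's actual design.

      # Generic 2-out-of-3 (triple modular redundant) voter: the output follows
      # the majority of three independent channels, so no single channel failure
      # can change the result.  Editorial sketch only.

      def tmr_vote(a: bool, b: bool, c: bool) -> bool:
          return (a and b) or (a and c) or (b and c)

      print(tmr_vote(True, True, True))    # True  - all channels agree
      print(tmr_vote(True, False, True))   # True  - one channel failed low
      print(tmr_vote(False, False, True))  # False - majority sees no trip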

  7. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; atlas of computed tomography of the normal adult; clinical application of computed tomography; and radiotherapy planning and computed tomography

  8. Space Use in the Commons: Evaluating a Flexible Library Environment

    Directory of Open Access Journals (Sweden)

    Andrew D. Asher

    2017-06-01

    Full Text Available Abstract Objective – This article evaluates the usage and user experience of the Herman B Wells Library’s Learning Commons, a newly renovated technology and learning centre that provides services and spaces tailored to undergraduates’ academic needs at Indiana University Bloomington (IUB. Methods – A mixed-method research protocol combining time-lapse photography, unobtrusive observation, and random-sample surveys was employed to construct and visualize a representative usage and activity profile for the Learning Commons space. Results – Usage of the Learning Commons by particular student groups varied considerably from expectations based on student enrollments. In particular, business, first and second year students, and international students used the Learning Commons to a higher degree than expected, while humanities students used it to a much lower degree. While users were satisfied with the services provided and the overall atmosphere of the space, they also experienced the negative effects of insufficient space and facilities due to the space often operating at or near its capacity. Demand for collaboration rooms and computer workstations was particularly high, while additional evidence suggests that the Learning Commons furniture mix may not adequately match users’ needs. Conclusions – This study presents a unique approach to space use evaluation that enables researchers to collect and visualize representative observational data. This study demonstrates a model for quickly and reliably assessing space use for open-plan and learning-centred academic environments and for evaluating how well these learning spaces fulfill their institutional mission.

  9. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  10. Identification and authentication. Common biometric methods review

    OpenAIRE

    Lysak, A.

    2012-01-01

    Major biometric methods used for identification and authentication purposes in modern computing systems are considered in the article. Basic classification, application areas and key differences are given.

  11. Common cause failures of reactor pressure components

    International Nuclear Information System (INIS)

    Mankamo, T.

    1978-01-01

    The common cause failure is defined as a multiple failure event due to a common cause. The existence of common failure causes may ruin the potential advantages of applying redundancy for reliability improvement. Examples relevant to large mechanical components are presented. Preventive measures against common cause failures, such as physical separation, equipment diversity, quality assurance, and feedback from experience are discussed. Despite the large number of potential interdependencies, the analysis of common cause failures can be done within the framework of conventional reliability analysis, utilizing, for example, the method of deriving minimal cut sets from a system fault tree. Tools for the description and evaluation of dependencies between components are discussed: these include the model of conditional failure causes that are common to many components, and evaluation of the reliability of redundant components subjected to a common load. (author)
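    The point that common cause failures fit within ordinary minimal-cut-set analysis can be illustrated with a small numerical sketch using the familiar beta-factor idea (splitting each component's failure probability into an independent part and a shared part); the probabilities and the beta value below are illustrative assumptions only.

      # Redundant pair whose failure is described by two minimal cut sets:
      # {A fails independently, B fails independently} and {common cause event}.
      # Beta-factor split: independent part (1-beta)*q, common cause part beta*q.
      # Numbers are editorial assumptions chosen only to show the effect.

      q, beta = 1e-3, 0.1
      q_ind, q_cc = (1 - beta) * q, beta * q

      p_double_independent = q_ind * q_ind
      p_common_cause = q_cc
      p_system = p_double_independent + p_common_cause - p_double_independent * p_common_cause

      print(f"independent double failure: {p_double_independent:.2e}")   # ~8.1e-07
      print(f"common cause contribution : {p_common_cause:.2e}")         # 1.0e-04
      print(f"system failure (approx.)  : {p_system:.2e}")

    Even with a modest beta, the common cause term dominates the system failure probability, which is exactly why common failure causes can ruin the potential advantages of redundancy.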

  12. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  13. Computer games and software engineering

    CERN Document Server

    Cooper, Kendra M L

    2015-01-01

    Computer games represent a significant software application domain for innovative research in software engineering techniques and technologies. Game developers, whether focusing on entertainment-market opportunities or game-based applications in non-entertainment domains, thus share a common interest with software engineers and developers on how to best engineer game software.Featuring contributions from leading experts in software engineering, the book provides a comprehensive introduction to computer game software development that includes its history as well as emerging research on the inte

  14. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  15. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  16. Computed tomographic findings of intracranial pyogenic abscess

    International Nuclear Information System (INIS)

    Kim, S. J.; Suh, J. H.; Park, C. Y.; Lee, K. C.; Chung, S. S.

    1982-01-01

    The early diagnosis and effective treatment of brain abscess pose a difficult clinical problem. With the advent of computed tomography, however, it appears that mortality due to intracranial abscess has significantly diminished. 54 cases of intracranial pyogenic abscess are presented. Etiologic factors and computed tomographic findings are analyzed, and the following results are obtained. 1. The common etiologic factors are otitis media, previous operation, and head trauma, in order of frequency. 2. The most common initial computed tomographic finding of brain abscess is ring contrast enhancement with surrounding brain edema. 3. The most characteristic pattern of ring contrast enhancement is a smooth, thin-walled ring. 4. Most thick, irregular ring enhancements are abscesses associated with cyanotic heart disease or a poor surgical result. 5. The most common finding of epidural and subdural empyema is a crescentic radiolucent area with thin-walled contrast enhancement, without surrounding brain edema, over the convexity of the brain

  17. From Computer Forensics to Forensic Computing: Investigators Investigate, Scientists Associate

    OpenAIRE

    Dewald, Andreas; Freiling, Felix C.

    2014-01-01

    This paper draws a comparison of fundamental theories in traditional forensic science and the state of the art in current computer forensics, thereby identifying a certain disproportion between the perception of central aspects in common theory and the digital forensics reality. We propose a separation of what is currently demanded of practitioners in digital forensics into a rigorous scientific part on the one hand, and a more general methodology of searching and seizing digital evidence an...

  18. AMRITA -- A computational facility

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, J.E. [California Inst. of Tech., CA (US); Quirk, J.J.

    1998-02-23

    Amrita is a software system for automating numerical investigations. The system is driven using its own powerful scripting language, Amrita, which facilitates both the composition and archiving of complete numerical investigations, as distinct from isolated computations. Once archived, an Amrita investigation can later be reproduced by any interested party, and not just the original investigator, for no cost other than the raw CPU time needed to parse the archived script. In fact, this entire lecture can be reconstructed in such a fashion. To do this, the script: constructs a number of shock-capturing schemes; runs a series of test problems; generates the plots shown; outputs the LaTeX to typeset the notes; performs a myriad of behind-the-scenes tasks to glue everything together. Thus Amrita has all the characteristics of an operating system and should not be mistaken for a common-or-garden code.

  19. Common-image gathers using the excitation amplitude imaging condition

    KAUST Repository

    Kalita, Mahesh

    2016-06-06

    Common-image gathers (CIGs) are extensively used in migration velocity analysis. Any defocused events in the subsurface offset domain, or equivalently nonflat events in angle-domain CIGs, are accounted for by revising the migration velocities. However, CIGs from wave-equation methods such as reverse time migration are often expensive to compute, especially in 3D. Using the excitation amplitude imaging condition that simplifies the forward-propagated source wavefield, we have managed to extract extended images for space and time lags in conjunction with prestack reverse time migration. The extended images tend to be cleaner, and the memory cost/disk storage is extensively reduced because we do not need to store the source wavefield. In addition, by avoiding the crosscorrelation calculation, we reduce the computational cost. These features are demonstrated on a linear v(z) model, a two-layer velocity model, and the Marmousi model.
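    For readers unfamiliar with the terminology, the contrast drawn in the abstract is between the usual crosscorrelation imaging condition and an excitation-amplitude imaging condition; the expressions below are generic textbook-style statements of the two ideas, not the authors' exact definitions.

      \[
        I_{\mathrm{xcorr}}(\mathbf{x}) \;=\; \sum_{t} S(\mathbf{x},t)\,R(\mathbf{x},t),
        \qquad
        I_{\mathrm{exc}}(\mathbf{x}) \;=\; \frac{R\big(\mathbf{x},\tau(\mathbf{x})\big)}{S\big(\mathbf{x},\tau(\mathbf{x})\big)},
        \qquad
        \tau(\mathbf{x}) \;=\; \arg\max_{t}\,\lvert S(\mathbf{x},t)\rvert,
      \]

      where $S$ and $R$ are the forward-propagated source and backward-propagated receiver wavefields. The excitation form needs only the excitation time $\tau(\mathbf{x})$ and the source amplitude at that time, so the full source wavefield need not be stored and the time summation of the crosscorrelation is avoided, consistent with the memory and cost savings described above.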

  20. [Common physicochemical characteristics of endogenous hormones-- liberins and statins].

    Science.gov (United States)

    Zamiatnin, A A; Voronina, O L

    1998-01-01

    The common chemical features of oligopeptide releasing hormones and release-inhibiting hormones were investigated with the aid of computer methods. 339 regulatory molecules of this type were extracted from the data in the EROP-Moscow computer bank. They contain from 2 to 47 amino acid residues, and their sequences include short sites which apparently play a decisive role in interactions with the receptors. The analysis of chemical radicals shows that all liberins and statins contain a positively charged group and either a cyclic radical of certain amino acids or a hydrophobic group. The results of this study indicate that most chemical radicals of the hormones are open for interaction with potential receptors of target cells. The mechanism of binding between hormone ligands and receptors, and the conceivable role of amino acid and neurotransmitter radicals in the hormonal properties of liberins and statins, are discussed.
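    As an editorial illustration of the kind of sequence screening described above (not the authors' actual procedure), one can scan a peptide sequence for positively charged residues and for cyclic or hydrophobic ones; the residue sets and the example sequence are simplifying assumptions.

      # Flag the chemical groups mentioned in the abstract within a peptide
      # sequence: positively charged side chains (K, R, H) and cyclic (aromatic,
      # proline) or hydrophobic ones.  Residue sets and the example tripeptide
      # are assumptions for illustration only.

      POSITIVE = set("KRH")
      CYCLIC_OR_HYDROPHOBIC = set("FWYHP" "AVLIM")

      def profile(seq: str) -> dict:
          return {
              "positive": [aa for aa in seq if aa in POSITIVE],
              "cyclic_or_hydrophobic": [aa for aa in seq if aa in CYCLIC_OR_HYDROPHOBIC],
          }

      print(profile("QHP"))   # {'positive': ['H'], 'cyclic_or_hydrophobic': ['H', 'P']}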

  1. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, a cost that is high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  2. Mathematics for common entrance three (extension) answers

    CERN Document Server

    Alexander, Serena

    2015-01-01

    This book contains answers to all exercises featured in the accompanying textbook Mathematics for Common Entrance Three (Extension) , which provides essential preparation for Level 3 of the ISEB 13+ Mathematics exam, as well as for CASE and other scholarship exams. - Clean, clear layout for easy marking. - Includes examples of high-scoring answers with diagrams and workings. Also available to purchase from the Galore Park website www.galorepark.co.uk :. - Mathematics for Common Entrance Three (Extension). - Mathematics for Common Entrance One. - Mathematics for Common Entrance One Answers. - M

  3. Common Property Resource Management, Institutional Change and ...

    African Journals Online (AJOL)

    Common Property Resource Management, Institutional Change and ... Most contemporary discussions on African development since independence forty ... theories on CPR Resource Management in a specific ecological and political setting.

  4. Evaluation of Chromosomal Abnormalities and Common ...

    African Journals Online (AJOL)

    Evaluation of Chromosomal Abnormalities and Common Trombophilic Mutations in Cases with Recurrent Miscarriage. Ahmet Karatas, Recep Eroz, Mustafa Albayrak, Tulay Ozlu, Bulent Cakmak, Fatih Keskin ...

  5. Justifying group-specific common morality.

    Science.gov (United States)

    Strong, Carson

    2008-01-01

    Some defenders of the view that there is a common morality have conceived such morality as being universal, in the sense of extending across all cultures and times. Those who deny the existence of such a common morality often argue that the universality claim is implausible. Defense of common morality must take account of the distinction between descriptive and normative claims that there is a common morality. This essay considers these claims separately and identifies the nature of the arguments for each claim. It argues that the claim that there is a universal common morality in the descriptive sense has not been successfully defended to date. It maintains that the claim that there is a common morality in the normative sense need not be understood as universalist. This paper advocates the concept of group-specific common morality, including country-specific versions. It suggests that both the descriptive and the normative claims that there are country-specific common moralities are plausible, and that a country-specific normative common morality could provide the basis for a country's bioethics.

  6. Network survivability performance (computer diskette)

    Science.gov (United States)

    1993-11-01

    File characteristics: Data file; 1 file. Physical description: 1 computer diskette; 3 1/2 in.; high density; 2.0MB. System requirements: Mac; Word. This technical report has been developed to address the survivability of telecommunications networks, including services. It responds to the need for a common understanding of, and assessment techniques for, network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunication networks to meet user expectations for network survivability.

  7. Neuroscience-inspired computational systems for speech recognition under noisy conditions

    Science.gov (United States)

    Schafer, Phillip B.

    advantage of the neural representation's invariance in noise. The scheme centers on a speech similarity measure based on the longest common subsequence between spike sequences. The combined encoding and decoding scheme outperforms a benchmark system in extremely noisy acoustic conditions. Finally, I consider methods for decoding spike representations of continuous speech. To help guide the alignment of templates to words, I design a syllable detection scheme that robustly marks the locations of syllabic nuclei. The scheme combines SVM-based training with a peak selection algorithm designed to improve noise tolerance. By incorporating syllable information into the ASR system, I obtain strong recognition results in noisy conditions, although the performance in noiseless conditions is below the state of the art. The work presented here constitutes a novel approach to the problem of ASR that can be applied in the many challenging acoustic environments in which we use computer technologies today. The proposed spike-based processing methods can potentially be exploited in efficient hardware implementations and could significantly reduce the computational costs of ASR. The work also provides a framework for understanding the advantages of spike-based acoustic coding in the human brain.
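
    The similarity measure named above reduces to the classic longest-common-subsequence computation applied to symbol sequences. The sketch below is a generic LCS-based similarity in Python; the normalisation by the longer sequence is a choice made for illustration, and this is not the dissertation's actual spike-matching code.

    ```python
    def lcs_length(a, b):
        """Length of the longest common subsequence via standard DP, O(len(a)*len(b))."""
        prev = [0] * (len(b) + 1)
        for x in a:
            cur = [0]
            for j, y in enumerate(b, 1):
                cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[j - 1]))
            prev = cur
        return prev[-1]

    def lcs_similarity(a, b):
        """Similarity in [0, 1]: LCS length normalised by the longer sequence."""
        return lcs_length(a, b) / max(len(a), len(b), 1)

    # e.g. comparing two discretised spike label sequences
    print(lcs_similarity("ABCBDAB", "BDCABA"))  # 4 / 7 ≈ 0.571
    ```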

  8. Functional programming for computer vision

    Science.gov (United States)

    Breuel, Thomas M.

    1992-04-01

    Functional programming is a style of programming that avoids the use of side effects (like assignment) and uses functions as first class data objects. Compared with imperative programs, functional programs can be parallelized better, and provide better encapsulation, type checking, and abstractions. This is important for building and integrating large vision software systems. In the past, efficiency has been an obstacle to the application of functional programming techniques in computationally intensive areas such as computer vision. We discuss and evaluate several 'functional' data structures for representing efficiently data structures and objects common in computer vision. In particular, we will address: automatic storage allocation and reclamation issues; abstraction of control structures; efficient sequential update of large data structures; representing images as functions; and object-oriented programming. Our experience suggests that functional techniques are feasible for high- performance vision systems, and that a functional approach simplifies the implementation and integration of vision systems greatly. Examples in C++ and SML are given.
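
    One of the ideas listed above, representing images as functions, is easy to illustrate: an image becomes a mapping from coordinates to intensity, and transformations become pure functions that return new images instead of mutating pixel buffers. The sketch below is a Python stand-in for that style (the original work used C++ and SML); all names are invented for the example.

    ```python
    from typing import Callable

    Image = Callable[[int, int], float]   # an image as a function (x, y) -> intensity

    def constant(value: float) -> Image:
        return lambda x, y: value

    def shift(img: Image, dx: int, dy: int) -> Image:
        """Pure transformation: returns a new image, the input is untouched."""
        return lambda x, y: img(x - dx, y - dy)

    def threshold(img: Image, t: float) -> Image:
        return lambda x, y: 1.0 if img(x, y) >= t else 0.0

    # Compose transformations without mutating any pixel buffer.
    base = constant(0.3)
    pipeline = threshold(shift(base, 5, 5), 0.5)
    print(pipeline(10, 10))   # 0.0
    ```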

  9. Distributed computing system with dual independent communications paths between computers and employing split tokens

    Science.gov (United States)

    Rasmussen, Robert D. (Inventor); Manning, Robert M. (Inventor); Lewis, Blair F. (Inventor); Bolotin, Gary S. (Inventor); Ward, Richard S. (Inventor)

    1990-01-01

    This is a distributed computing system providing flexible fault tolerance; ease of software design and concurrency specification; and dynamic load balancing. The system comprises a plurality of computers, each having a first input/output interface and a second input/output interface for interfacing to communications networks, each second input/output interface including a bypass for bypassing the associated computer. A global communications network interconnects the first input/output interfaces, providing each computer the ability to broadcast messages simultaneously to the remainder of the computers. A meshwork communications network interconnects the second input/output interfaces, providing each computer with the ability to establish a communications link with another of the computers while bypassing the remainder of the computers. Each computer is controlled by a resident copy of a common operating system. Communications between respective computers are by means of split tokens, each having a moving first portion which is sent from computer to computer and a resident second portion which is disposed in the memory of at least one of the computers, the location of the second portion being part of the first portion. The split tokens represent both functions to be executed by the computers and data to be employed in the execution of the functions. The first input/output interfaces each include logic for detecting a collision between messages and for terminating the broadcasting of a message, whereby collisions between messages are detected and avoided.
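
    The split-token idea -- a moving portion that travels between computers and carries the location of a resident portion left in one computer's memory -- can be sketched as a pair of records. The Python below is a hypothetical illustration; the field names and the execute helper are invented for the example and are not taken from the patent.

    ```python
    from dataclasses import dataclass

    @dataclass
    class ResidentPortion:
        """Second portion: stays put in one computer's memory."""
        data: bytes

    @dataclass
    class MovingPortion:
        """First portion: passed from computer to computer; knows where the rest lives."""
        function: str        # the function the token represents
        home_node: int       # which computer holds the resident portion
        resident_key: int    # slot in that computer's memory

    def execute(token, node_memories):
        """Run the token's function using the data stored at its resident portion."""
        resident = node_memories[token.home_node][token.resident_key]
        return f"{token.function} applied to {len(resident.data)} bytes from node {token.home_node}"

    memories = {2: {7: ResidentPortion(b"payload")}}
    print(execute(MovingPortion("checksum", home_node=2, resident_key=7), memories))
    ```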

  10. Probabilistic analysis of "common mode failures"

    International Nuclear Information System (INIS)

    Easterling, R.G.

    1978-01-01

    Common mode failure is a topic of considerable interest in reliability and safety analyses of nuclear reactors. Common mode failures are often discussed in terms of examples: two systems fail simultaneously due to an external event such as an earthquake; two components in redundant channels fail because of a common manufacturing defect; two systems fail because a component common to both fails; the failure of one system increases the stress on other systems and they fail. The common thread running through these is a dependence of some sort--statistical or physical--among multiple failure events. However, the nature of the dependence is not the same in all these examples. An attempt is made to model situations, such as the above examples, which have been termed "common mode failures." In doing so, it is found that standard probability concepts and terms, such as statistically dependent and independent events, and conditional and unconditional probabilities, suffice. Thus, it is proposed that the term "common mode failures" be dropped, at least from technical discussions of these problems. A corollary is that the complementary term, "random failures," should also be dropped. The mathematical model presented may not cover all situations which have been termed "common mode failures," but provides insight into the difficulty of obtaining estimates of the probabilities of these events.
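
    A tiny worked example makes the dependence point concrete: conditioning on a shared cause is all that standard probability needs. The numbers below are invented purely for illustration and are not drawn from the paper.

    ```python
    # Two redundant channels each fail independently with probability 1e-3,
    # but a shared cause (probability 1e-4) fails both at once.
    p_ind, p_shared = 1e-3, 1e-4

    # Without the shared cause, simultaneous failure is the product of the parts.
    p_both_independent = p_ind ** 2                        # 1e-6

    # With the shared cause, condition on whether it occurs.
    p_both = p_shared * 1.0 + (1 - p_shared) * p_ind ** 2  # ~1.01e-4

    print(p_both / p_both_independent)  # dependence raises the joint probability ~100x
    ```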

  11. The gastro-oesophageal common cavity revisited

    NARCIS (Netherlands)

    Aanen, M. C.; Bredenoord, A. J.; Samsom, M.; Smout, A. J. P. M.

    2006-01-01

    The manometric common cavity phenomenon has been used as an indicator of gastro-oesophageal reflux of liquid or gaseous substances. Using combined pH and impedance recording as the reference standard, the value of a common cavity as an indicator of gastro-oesophageal reflux was tested. Ten healthy male

  12. Common Core in the Real World

    Science.gov (United States)

    Hess, Frederick M.; McShane, Michael Q.

    2013-01-01

    There are at least four key places where the Common Core intersects with current efforts to improve education in the United States--testing, professional development, expectations, and accountability. Understanding them can help educators, parents, and policymakers maximize the chance that the Common Core is helpful to these efforts and, perhaps…

  13. Tragedy of the commons in Melipona bees.

    Science.gov (United States)

    Wenseleers, Tom; Ratnieks, Francis L W

    2004-01-01

    In human society selfish use of common resources can lead to disaster, a situation known as the 'tragedy of the commons' (TOC). Although a TOC is usually prevented by coercion, theory predicts that close kinship ties can also favour reduced exploitation. We test this prediction using data on a TOC occurring in Melipona bee societies. PMID:15504003

  14. Tragedy of the commons in Melipona bees

    OpenAIRE

    Wenseleers, Tom; Ratnieks, Francis L.W.

    2004-01-01

    In human society selfish use of common resources can lead to disaster, a situation known as the 'tragedy of the commons' (TOC). Although a TOC is usually prevented by coercion, theory predicts that close kinship ties can also favour reduced exploitation. We test this prediction using data on a TOC occurring in Melipona bee societies.

  15. Characteristics of common infections in Nicaragua

    NARCIS (Netherlands)

    Matute Moreno, A.J.

    2006-01-01

    The main purpose of the studies outlined in this thesis was to gain empirical epidemiological and therapeutic knowledge of some common infectious diseases in Nicaragua. So far, relatively little was known about the incidence, etiology, management and antibiotic resistance patterns of common

  16. Common carotid artery disease in Takayasu's arteritis

    International Nuclear Information System (INIS)

    Hamdan, Nabil; Calderon, Luis I; Castro, Pablo and others

    2004-01-01

    Takayasu's arteritis is a disease of unknown etiology with main involvement of the common carotid artery and its branches. We report the case of a 69-year-old female patient with Takayasu's arteritis with bilateral involvement of the common carotid arteries, treated with percutaneous angioplasty and stent implantation.

  17. common problems affecting supranational attempts in africa

    African Journals Online (AJOL)

    user

    Politico-legal Framework for Integration in Africa: Exploring the Attainability of a ... laws, the common international trade policy, the common fisheries policy and the .... among the member states according to the annual imports, production and ...... Fredland R (eds) Integration and Disintegration in East Africa (University.

  18. Common micronutrient deficiencies among food aid beneficiaries ...

    African Journals Online (AJOL)

    admin

    Abstract. Background: Ethiopia is amongst the African countries that have received significant food aid. Nonetheless, the common micronutrient deficiencies among food aid beneficiaries are not well documented. Objective: To find out the common micronutrient deficiencies among food aid beneficiaries in the country based ...

  19. Laboratories of commons: experimentation, recursivity and activism

    Directory of Open Access Journals (Sweden)

    Adolfo Estalella Fernández

    2013-03-01

    Full Text Available Urban public space, digital creations, and the air have traditionally been thought of within the dichotomous logic of public and private property, but in the last decade they have started to be considered common resources. Commons is an old concept that has been recovered with intensity in the last decade; it refers to collective resources and goods that are governed collectively and whose property regime differs from both the public and the private. This article introduces the contributions to a monograph devoted to the topic of ‘Laboratories of commons’. Contributors discuss the diverse modalities of commons in different social domains such as art, activism, and the rural and urban domains. This introduction contextualizes these contributions and identifies some of the issues that cut across the different articles. In this exercise we introduce a tentative argument according to which commons and commons research take on an exceptional configuration in Spain. Very briefly: commons are brought into existence as an epistemic object, an experimental domain quite different from the conventional conceptualizations that conceive of them as a property regime or a type of good. This peculiar configuration gives commons in Spain a distinctive condition, different from other geographies; this is evidenced in a double shift: the emergence of new objects that are thought of as commons, and the location of their research in the domain of cultural and creative production.

  20. Clinical chemistry of common apolipoprotein E isoforms

    NARCIS (Netherlands)

    Brouwer, DAJ; vanDoormaal, JJ; Muskiet, FAJ

    1996-01-01

    Apolipoprotein E plays a central role in clearance of lipoprotein remnants by serving as a ligand for low-density lipoprotein and apolipoprotein E receptors. Three common alleles (apolipoprotein E(2), E(3) and E(4)) give rise to six phenotypes. Apolipoprotein E(3) is the ancestral form. Common

  1. Young Children's Understanding of Cultural Common Ground

    Science.gov (United States)

    Liebal, Kristin; Carpenter, Malinda; Tomasello, Michael

    2013-01-01

    Human social interaction depends on individuals identifying the common ground they have with others, based both on personally shared experiences and on cultural common ground that all members of the group share. We introduced 3- and 5-year-old children to a culturally well-known object and a novel object. An experimenter then entered and asked,…

  2. Data needs for common cause failure analysis

    International Nuclear Information System (INIS)

    Parry, G.W.; Paula, H.M.; Rasmuson, D.; Whitehead, D.

    1990-01-01

    The procedures guide for common cause failure analysis published jointly by USNRC and EPRI requires a detailed historical event analysis. Recent work on the further development of the cause-defense picture of common cause failures introduced in that guide identified the information that is necessary to perform the detailed analysis in an objective manner. This paper summarizes these information needs

  3. Common Frame of Reference & social justice

    NARCIS (Netherlands)

    Hesselink, M.

    2008-01-01

    This paper evaluates the draft Common Frame of Reference (DCFR) in terms of social justice. It concludes the DCFR has all the characteristics of a typical European compromise. Ideological and esthetical purists will certainly be disappointed. In this respect, it has much in common with the

  4. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; NA NA NA Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  5. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  6. Clinical diagnosis and computer analysis of headache symptoms.

    OpenAIRE

    Drummond, P D; Lance, J W

    1984-01-01

    The headache histories obtained from clinical interviews of 600 patients were analysed by computer to see whether patients could be separated systematically into clinical categories and to see whether sets of symptoms commonly reported together differed in distribution among the categories. The computer classification procedure assigned 537 patients to the same category as their clinical diagnosis, the majority of discrepancies between clinical and computer classifications involving common mi...

  7. Common-cause analysis using sets

    International Nuclear Information System (INIS)

    Worrell, R.B.; Stack, D.W.

    1977-12-01

    Common-cause analysis was developed at the Aerojet Nuclear Company for studying the behavior of a system that is affected by special conditions and secondary causes. Common-cause analysis is related to fault tree analysis. Common-cause candidates are minimal cut sets whose primary events are closely linked by a special condition or are susceptible to the same secondary cause. It is shown that common-cause candidates can be identified using the Set Equation Transformation System (SETS). A Boolean equation is used to establish the special conditions and secondary cause susceptibilities for each primary event in the fault tree. A transformation of variables (substituting equals for equals), executed on a minimal cut set equation, results in replacing each primary event by the right side of its special condition/secondary cause equation and leads to the identification of the common-cause candidates
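
    The substitution step described above -- replace each primary event in a minimal cut set with its special-condition/secondary-cause expression and look for a shared term -- can be mimicked with plain set operations. The Python below is an illustrative sketch with invented event names; it is not SETS syntax or the original Boolean machinery.

    ```python
    # Each primary event maps to the special conditions / secondary causes it is
    # susceptible to (its "susceptibility equation", reduced here to a set of terms).
    susceptibility = {
        "PUMP_A": {"earthquake", "vendor_X"},
        "PUMP_B": {"earthquake", "vendor_Y"},
        "VALVE_C": {"vendor_X"},
    }

    minimal_cut_sets = [{"PUMP_A", "PUMP_B"}, {"PUMP_A", "VALVE_C"}]

    def common_cause_candidates(cut_sets, susceptibility):
        """Cut sets whose primary events all share at least one cause are candidates."""
        candidates = []
        for cs in cut_sets:
            shared = set.intersection(*(susceptibility[e] for e in cs))
            if shared:
                candidates.append((cs, shared))
        return candidates

    for cs, causes in common_cause_candidates(minimal_cut_sets, susceptibility):
        print(sorted(cs), "linked by", sorted(causes))
    # ['PUMP_A', 'PUMP_B'] linked by ['earthquake']
    # ['PUMP_A', 'VALVE_C'] linked by ['vendor_X']
    ```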

  8. Common Injuries of Collegiate Tennis Players

    Directory of Open Access Journals (Sweden)

    Christian Wisdom Magtajas Valleser

    2017-09-01

    Full Text Available The purpose of this study is to determine the common injuries of Filipino collegiate tennis players; 110 varsity tennis players with a mean age of 20 years (SD ± 1.7) and an average playing experience of 12 years participated in the study. There was a 100% occurrence of at least one injury, with an average rate of 5.98 injuries per person. The authors observed that the most commonly injured anatomical region was the lower extremity; ankles were recorded as the most commonly injured part. Other commonly injured areas included the shoulders and lower back. Furthermore, the most common injury types were tendinitis, sprains, and strains. The recorded injuries were mostly associated with overuse, and the findings were similar to those of most other studies on tennis injuries. A larger sample size may provide more conclusive findings on tennis injuries, particularly at different levels of competition, such as recreational or professional athletes.

  9. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  10. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  11. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  12. Computer simulation of hopper flow

    International Nuclear Information System (INIS)

    Potapov, A.V.; Campbell, C.S.

    1996-01-01

    This paper describes two-dimensional computer simulations of granular flow in plane hoppers. The simulations can reproduce an experimentally observed asymmetric unsteadiness for monodispersed particle sizes, but could also eliminate it by adding a small amount of polydispersity. This appears to be a result of the strong packings that may be formed by monodispersed particles and is thus a noncontinuum effect. The internal stress state was also sampled, which, among other things, allows an evaluation of common assumptions made in granular material models. These showed that the internal friction coefficient is far from constant, in contradiction to common models based on plasticity theory, which assume that the material is always at the point of imminent yield. Furthermore, it is demonstrated that rapid granular flow theory, another common modeling technique, is inapplicable to this problem even near the exit where the flow is moving its fastest. copyright 1996 American Institute of Physics

  13. Validity of two methods to assess computer use: Self-report by questionnaire and computer use software

    NARCIS (Netherlands)

    Douwes, M.; Kraker, H.de; Blatter, B.M.

    2007-01-01

    A long duration of computer use is known to be positively associated with Work Related Upper Extremity Disorders (WRUED). Self-report by questionnaire is commonly used to assess a worker's duration of computer use. The aim of the present study was to assess the validity of self-report and computer

  14. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi

    2016-01-01

    This book explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real-world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented, including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically-intractable problems. It is intended both for the expert/researcher in the field of Intelligent Computing Systems and for the general reader in t...

  15. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. Subsequent years brought new areas of interest concerning technical informatics related to soft computing and more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  16. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  17. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  18. Computers: Instruments of Change.

    Science.gov (United States)

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  19. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  20. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  1. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    NREL uses computational modeling to increase the ... cell walls and are the source of biofuels and biomaterials; our modeling investigates their properties. Quantum Mechanical Models: NREL studies chemical and electronic properties and processes to reduce barriers ...

  2. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  3. Fast DRR splat rendering using common consumer graphics hardware

    International Nuclear Information System (INIS)

    Spoerk, Jakob; Bergmann, Helmar; Wanschitz, Felix; Dong, Shuo; Birkfellner, Wolfgang

    2007-01-01

    Digitally rendered radiographs (DRR) are a vital part of various medical image processing applications such as 2D/3D registration for patient pose determination in image-guided radiotherapy procedures. This paper presents a technique to accelerate DRR creation by using conventional graphics hardware for the rendering process. DRR computation itself is done by an efficient volume rendering method named wobbled splatting. For programming the graphics hardware, NVIDIA's C for Graphics (Cg) is used. The description of an algorithm used for rendering DRRs on the graphics hardware is presented, together with a benchmark comparing this technique to a CPU-based wobbled splatting program. Results show a reduction of rendering time by about 70%-90% depending on the amount of data. For instance, rendering a volume of 2×10⁶ voxels is feasible at an update rate of 38 Hz compared to 6 Hz on a common Intel-based PC using the graphics processing unit (GPU) of a conventional graphics adapter. In addition, wobbled splatting using graphics hardware for DRR computation provides higher resolution DRRs with comparable image quality due to special processing characteristics of the GPU. We conclude that DRR generation on common graphics hardware using the freely available Cg environment is a major step toward 2D/3D registration in clinical routine.
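
    As a point of reference for what splat-based DRR generation does, the sketch below accumulates voxel densities onto a detector using a trivial nearest-neighbour footprint and a parallel projection. It is a deliberately simplified CPU illustration in Python/NumPy; the array shapes and the splat_drr name are invented here, and it is not the paper's wobbled-splatting or Cg implementation.

    ```python
    import numpy as np

    def splat_drr(volume, spacing=1.0):
        """Parallel-projection DRR by splatting: each nonzero voxel deposits its
        density into the detector pixel it projects onto (nearest-neighbour footprint).
        A real wobbled-splatting renderer jitters a wider footprint and blends on the GPU."""
        nz, ny, nx = volume.shape
        drr = np.zeros((ny, nx))                    # detector aligned with the volume
        for z in range(nz):                         # march along the projection axis
            ys, xs = np.nonzero(volume[z])
            np.add.at(drr, (ys, xs), volume[z, ys, xs] * spacing)
        return drr

    vol = np.zeros((32, 128, 128))
    vol[10:20, 40:80, 40:80] = 1.0                  # a simple box phantom
    print(splat_drr(vol).max())                     # 10.0: ten unit-density voxels per ray
    ```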

  4. Suehiro Jurisprudence and John R. Commons

    DEFF Research Database (Denmark)

    Tackney, Charles T.

    This is a comparative history study at the interface of industrial/employment relations and stakeholder theory. The focus concerns decades of post-World War II Japanese and U.S. path-dependent national divergence from common labor legislation enactments separated by only 15 years: 1933... or Suehiro hōgaku) document a dramatic, fascinating historical parting of two nations due to the Japanese deep appreciation of the labor law and institutional economics research legacy of John R. Commons, the father of U.S. industrial relations. Understanding this common, shared source opens industrial relations...

  5. Accounting and marketing: searching a common denominator

    Directory of Open Access Journals (Sweden)

    David S. Murphy

    2012-06-01

    Full Text Available Accounting and marketing are very different disciplines. The analysis of customer profitability is one concept that can unite accounting and marketing as a common denominator. In this article I search for common ground between accounting and marketing in the analysis of customer profitability to determine if a common denominator really exists between the two. This analysis focuses on accounting profitability, customer lifetime value, and customer equity. The article ends with a summary of what accountants can do to move the analysis of customer value forward, as an analytical tool, within companies.

  6. Ecology and the Tragedy of the Commons

    Directory of Open Access Journals (Sweden)

    Peter Roopnarine

    2013-02-01

    Full Text Available This paper develops mathematical models of the tragedy of the commons analogous to ecological models of resource consumption. Tragedies differ fundamentally from predator–prey relationships in nature because human consumers of a resource are rarely controlled solely by that resource. Tragedies do occur, however, at the level of the ecosystem, where multiple species interactions are involved. Human resource systems are converging rapidly toward ecosystem-type systems as the number of exploited resources increases, raising the probability of system-wide tragedies in the human world. Nevertheless, common interests exclusive of exploited commons provide feasible options for avoiding tragedy in a converged world.
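
    To make the analogy with ecological resource-consumption models concrete, the toy model below couples a logistically regenerating resource to a fixed harvesting effort that does not respond to scarcity. It is an illustrative sketch with made-up parameter values, not the system of equations developed in the paper.

    ```python
    def simulate(harvest, r=0.5, K=1.0, years=200, dt=0.1):
        """Logistic resource R harvested at a fixed per-capita effort.

        dR/dt = r*R*(1 - R/K) - harvest*R : when the harvest rate exceeds the
        intrinsic growth rate r, the common resource collapses toward zero."""
        R = K
        for _ in range(int(years / dt)):
            dR = r * R * (1 - R / K) - harvest * R
            R = max(R + dt * dR, 0.0)
        return R

    # Moderate exploitation leaves a stable stock; over-exploitation destroys it.
    print(simulate(harvest=0.2))   # ~0.6, the equilibrium K*(1 - harvest/r)
    print(simulate(harvest=0.6))   # ~0.0, tragedy: harvest exceeds regrowth
    ```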

  7. The common polymorphism of apolipoprotein E

    DEFF Research Database (Denmark)

    Gerdes, Ulrik

    2003-01-01

    Apolipoprotein E (apoE) has important functions in systemic and local lipid transport, but also has other functions. The gene (APOE) shows a common polymorphism with three alleles--APOE*2, APOE*3, and APOE*4. Their frequencies vary substantially around the world, but APOE*3 is the most common ... from only 10-15% in southern Europe to 40-50% in the north. The gradient may be a trace of the demic expansion of agriculture that began about 10,000 years ago, but it may also reflect the possibility that APOE*4 carriers are less likely to develop vitamin D deficiency. The common APOE polymorphism ...

  8. Computer Virus and Trends

    OpenAIRE

    Tutut Handayani; Soenarto Usna,Drs.MMSI

    2004-01-01

    Since its first appearance in the mid-1980s, the computer virus has invited various controversies that last to this day. Along with the development of computer systems technology, computer viruses find new ways to spread themselves through a variety of existing communications media. This paper discusses some topics related to computer viruses, namely: the definition and history of computer viruses; the basics of computer viruses; the state of computer viruses at this time; and ...

  9. The influence of age, gender and other information technology use on young people's computer use at school and home.

    Science.gov (United States)

    Harris, C; Straker, L; Pollock, C

    2013-01-01

    Young people are exposed to a range of information technologies (IT) in different environments, including home and school; however, the factors influencing IT use at home and school are poorly understood. The aim of this study was to investigate young people's computer exposure patterns at home and school, and related factors such as age, gender and the types of IT used. 1351 children in Years 1, 6, 9 and 11 from 10 schools in metropolitan Western Australia were surveyed. Most children had access to computers at home and school, with computer exposures comparable to TV, reading and writing. Total computer exposure was greater at home than at school and increased with age. Computer activities varied with age and gender and became more social with increased age, while at the same time parental involvement decreased. Bedroom computer use was found to result in higher exposure patterns. High levels of home and school computer use were associated with each other. Associations varied depending on the type of IT exposure measure (frequency, mean weekly hours, usual and longest duration).

  10. Understanding failures in petascale computers

    International Nuclear Information System (INIS)

    Schroeder, Bianca; Gibson, Garth A

    2007-01-01

    With petascale computers only a year or two away, there is a pressing need to anticipate and compensate for a probable increase in failure and application interruption rates. Researchers, designers and integrators have available to them far too little detailed information on the failures and interruptions that even smaller terascale computers experience. The information that is available suggests that application interruptions will become far more common in the coming decade, and the largest applications may surrender large fractions of the computer's resources to taking checkpoints and restarting from a checkpoint after an interruption. This paper reviews sources of failure information for compute clusters and storage systems, projects failure rates and the corresponding decrease in application effectiveness, and discusses coping strategies such as application-level checkpoint compression and system-level process-pairs fault tolerance for supercomputing. The need for a public repository of detailed failure and interruption records is particularly pressing, as projections from one architectural family of machines to another are widely disputed. To this end, this paper introduces the Computer Failure Data Repository and issues a call for failure history data to publish in it.
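
    The scale of the checkpoint overhead alluded to above can be estimated with Young's classic first-order approximation for the optimal checkpoint interval. The numbers in the snippet are invented for illustration and are not taken from the paper.

    ```python
    from math import sqrt

    def young_interval(checkpoint_cost_s, mtbf_s):
        """Young's approximation for the optimal interval between checkpoints."""
        return sqrt(2 * checkpoint_cost_s * mtbf_s)

    # e.g. 10-minute checkpoints on a machine with a 6-hour system MTBF
    tau = young_interval(600, 6 * 3600)
    overhead = 600 / tau          # fraction of wall time spent writing checkpoints
    print(f"checkpoint every {tau/60:.0f} min, ~{overhead:.0%} of time checkpointing")
    ```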

  11. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  12. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing was, and will be, a new way of providing Internet services and computing. This approach builds on many existing services, such as the Internet, grid computing, and Web services. Cloud computing as a system aims to provide on-demand services that are more acceptable in terms of price and infrastructure. It is exactly the transition from the computer to a service offered to consumers as a product delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics they offer. It is a theoretical paper. Keywords: cloud computing, QoS, quality of cloud computing

  13. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
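
    The routing idea -- identify the defective link in the first network and carry the affected traffic over the second, independent network -- can be sketched with a small graph model. The Python below assumes the third-party networkx package and invents all names (route, defective_link); it is an illustration of the general idea, not the patented mechanism.

    ```python
    import networkx as nx   # assumes the networkx package is available

    def route(primary: nx.Graph, secondary: nx.Graph, src, dst, defective_link):
        """Prefer the primary network; if its only path uses the defective link,
        fall back to the secondary network."""
        healthy = primary.copy()
        if healthy.has_edge(*defective_link):
            healthy.remove_edge(*defective_link)
        try:
            return "primary", nx.shortest_path(healthy, src, dst)
        except nx.NetworkXNoPath:
            return "secondary", nx.shortest_path(secondary, src, dst)

    primary = nx.path_graph(4)                       # nodes 0-1-2-3 in a line
    secondary = nx.cycle_graph(4)                    # redundant ring
    print(route(primary, secondary, 0, 3, (1, 2)))   # ('secondary', [0, 3])
    ```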

  14. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  15. Computers in nuclear medicine

    International Nuclear Information System (INIS)

    Giannone, Carlos A.

    1999-01-01

    This chapter covers the capture and observation of images on computers; the hardware and software used; and personal computers, networks and workstations. The use of special filters determines image quality.

  16. An Evaluation of Windows-Based Computer Forensics Application Software Running on a Macintosh

    OpenAIRE

    Gregory H. Carlton

    2008-01-01

    The two most common computer forensics applications run exclusively on Microsoft Windows operating systems, yet contemporary computer forensics examinations frequently encounter one or more of the three most common operating system environments, namely Windows, OS X, or some form of UNIX or Linux. Additionally, government and private computer forensics laboratories frequently encounter budget constraints that limit their access to computer hardware. Currently, Macintosh computer systems a...

  17. Decoding Dyslexia, a Common Learning Disability

    Science.gov (United States)

    ... if they continue to struggle.

  18. SmallSat Common Electronics Board (SCEB)

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to design a low-power general-purpose SmallSat Common Electronics Board (SCEB).  The SCEB design will be based on input received from a group of...

  19. Phenotyping common beans for adaptation to drought

    Science.gov (United States)

    Beebe, Stephen E.; Rao, Idupulapati M.; Blair, Matthew W.; Acosta-Gallegos, Jorge A.

    2013-01-01

    Common beans (Phaseolus vulgaris L.) originated in the New World and are the grain legume of greatest production for direct human consumption. Common bean production is subject to frequent droughts in highland Mexico, in the Pacific coast of Central America, in northeast Brazil, and in eastern and southern Africa from Ethiopia to South Africa. This article reviews efforts to improve common bean for drought tolerance, referring to genetic diversity for drought response, the physiology of drought tolerance mechanisms, and breeding strategies. Different races of common bean respond differently to drought, with race Durango of highland Mexico being a major source of genes. Sister species of P. vulgaris likewise have unique traits, especially P. acutifolius which is well adapted to dryland conditions. Diverse sources of tolerance may have different mechanisms of plant response, implying the need for different methods of phenotyping to recognize the relevant traits. Practical considerations of field management are discussed including: trial planning; water management; and field preparation. PMID:23507928

  20. Common Cold in Babies: Symptoms and Causes

    Science.gov (United States)

    ... clear at first but might thicken and turn yellow or green. Other signs and symptoms of a common cold in a baby may include: fever, sneezing, coughing, decreased appetite, irritability, difficulty sleeping, trouble ...