WorldWideScience

Sample records for computer components

  1. Computers as components: principles of embedded computing system design

    CERN Document Server

    Wolf, Marilyn

    2012-01-01

    Computers as Components: Principles of Embedded Computing System Design, 3e, presents essential knowledge on embedded systems technology and techniques. Updated for today's embedded systems design methods, this edition features new examples including digital signal processing, multimedia, and cyber-physical systems. Author Marilyn Wolf covers the latest processors from Texas Instruments, ARM, and Microchip Technology plus software, operating systems, networks, consumer devices, and more. Like the previous editions, this textbook: Uses real processors to demonstrate both technology and tec

  2. Dynamic leaching test of personal computer components.

    Science.gov (United States)

    Li, Yadong; Richardson, Jay B; Niu, Xiaojun; Jackson, Ollie J; Laster, Jeremy D; Walker, Aaron K

    2009-11-15

    A dynamic leaching test (DLT) was developed and used to evaluate the leaching of toxic substances from electronic waste in the environment. The major components in personal computers (PCs), including motherboards, hard disc drives, floppy disc drives, and compact disc drives, were tested. The tests lasted 2 years for motherboards and 1.5 years for the disc drives. The extraction fluids for the standard toxicity characteristic leaching procedure (TCLP) and synthetic precipitation leaching procedure (SPLP) were used as the DLT leaching solutions. A total of 18 elements including Ag, Al, As, Au, Ba, Be, Cd, Cr, Cu, Fe, Ga, Ni, Pd, Pb, Sb, Se, Sn, and Zn were analyzed in the DLT leachates. Only Al, Cu, Fe, Ni, Pb, and Zn were commonly found in the DLT leachates of the PC components. Their leaching levels were much higher in TCLP extraction fluid than in SPLP extraction fluid. The toxic heavy metal Pb was found to leach continuously out of the components over the entire test period. The cumulative amount of Pb leached from the motherboards in TCLP extraction fluid reached 2.0 g per motherboard over the 2-year test period, while the amount in SPLP extraction fluid was 75-90% less. The leaching rates and levels of Pb were largely affected by the content of galvanized steel in the PC components: the higher the steel content, the lower the Pb leaching rate. The findings suggest that obsolete PCs disposed of in landfills or discarded in the environment continuously release Pb for years when subjected to landfill leachate or rain.

  3. Computational needs for modelling accelerator components

    International Nuclear Information System (INIS)

    Hanerfeld, H.

    1985-06-01

    The particle-in-cell code MASK is being used to model several different electron accelerator components. These studies are being used both to design new devices and to understand particle behavior within existing structures. Studies include the injector for the Stanford Linear Collider and the 50 megawatt klystron currently being built at SLAC. MASK is a 2D electromagnetic code which is used by SLAC both on our own IBM 3081 and on the CRAY X-MP at the NMFECC. Our experience with running MASK illustrates the need for supercomputers to continue work of the kind described. 3 refs., 2 figs

  4. 77 FR 20047 - Certain Computer and Computer Peripheral Devices and Components Thereof and Products Containing...

    Science.gov (United States)

    2012-04-03

    ... INTERNATIONAL TRADE COMMISSION [DN 2889] Certain Computer and Computer Peripheral Devices and... Certain Computer and Computer Peripheral Devices and Components Thereof and Products Containing the Same... importation, and the sale within the United States after importation of certain computer and computer...

  5. Gamma dose effects evaluation on microcomputing components

    International Nuclear Information System (INIS)

    Joffre, F.

    1995-01-01

    Robotics in hostile environments raises the problem of the resistance of microcomputing components to cumulated gamma radiation dose. The current aim is to reach a dose of 3000 grays with industrial components. A methodology and an instrumentation adapted to testing this type of component have been developed. The aim of this work is to present the advantages and disadvantages associated with the use of industrial components in the presence of gamma radiation. After an analysis of the criteria used to justify the technological choices, the different steps which characterize the selection and assessment methodology are explained. The irradiation and measurement facilities now operational are described. Moreover, the supply aspects of the components chosen for the design of an industrialized system are taken into account. This selection and assessment work contributes to the development and design of computers for civil nuclear robotics. (O.M.)

  6. 77 FR 26041 - Certain Computers and Computer Peripheral Devices and Components Thereof and Products Containing...

    Science.gov (United States)

    2012-05-02

    ... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-841] Certain Computers and Computer Peripheral... after importation of certain computers and computer peripheral devices and components thereof and... industry in the United States exists as required by subsection (a)(2) of section 337. The complainant...

  7. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  8. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  9. Computer compensation for NMR quantitative analysis of trace components

    International Nuclear Information System (INIS)

    Nakayama, T.; Fujiwara, Y.

    1981-01-01

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as the theoretical lineshape of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA
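    A minimal sketch of this kind of Lorentzian least-squares decomposition, using SciPy; the peak positions, widths, noise level, and moving-average window below are hypothetical stand-ins, not values from the paper:

```python
# Sketch: least-squares fit of two overlapping Lorentzian lines, with a
# moving average smoothing the random error, as the abstract describes.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, x0, hwhm):
    """Single Lorentzian line: amp * hwhm^2 / ((x - x0)^2 + hwhm^2)."""
    return amp * hwhm**2 / ((x - x0)**2 + hwhm**2)

def two_peaks(x, a1, c1, w1, a2, c2, w2):
    """Sum of two overlapping Lorentzian components (major + trace)."""
    return lorentzian(x, a1, c1, w1) + lorentzian(x, a2, c2, w2)

x = np.linspace(0.0, 10.0, 2000)                      # chemical-shift axis (a.u.)
truth = two_peaks(x, 1.0, 4.8, 0.3, 0.05, 5.6, 0.3)   # hypothetical spectrum
noisy = truth + np.random.normal(0.0, 0.002, x.size)  # random error
smoothed = np.convolve(noisy, np.ones(9) / 9, mode="same")  # moving average

popt, _ = curve_fit(two_peaks, x, smoothed,
                    p0=[1.0, 4.7, 0.2, 0.1, 5.5, 0.2])
print("fitted trace-component amplitude:", popt[3])
```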

  10. Computation of X-ray powder diffractograms of cement components ...

    Indian Academy of Sciences (India)

    Computation of X-ray powder diffractograms of cement components and its application to phase analysis and hydration performance of OPC cement. Rohan Jadhav; N C Debnath. Volume 34 Issue 5 August 2011 pp 1137- ... Keywords: Portland cement; X-ray diffraction; crystal structure; characterization; Rietveld method.

  11. Improved computation method in residual life estimation of structural components

    Directory of Open Access Journals (Sweden)

    Maksimović Stevan M.

    2013-01-01

    Full Text Available This work considers numerical computation methods and procedures for predicting fatigue crack growth in cracked, notched structural components. The computation method is based on fatigue life prediction using the strain energy density approach. Based on the strain energy density (SED) theory, a fatigue crack growth model is developed to predict the lifetime of fatigue crack growth for single- or mixed-mode cracks. The model is based on an equation expressed in terms of low-cycle fatigue parameters. Attention is focused on crack growth analysis of structural components under variable amplitude loads. Crack growth is largely influenced by the effect of the plastic zone at the front of the crack. To obtain an efficient computation model, the plasticity-induced crack closure phenomenon is considered during fatigue crack growth. The use of the strain energy density method is efficient for fatigue crack growth prediction under cyclic loading in damaged structural components. The strain energy density method is convenient for engineering applications since it does not require any additional determination of fatigue crack propagation parameters (those would need to be separately determined for the fatigue crack propagation phase); low-cycle fatigue parameters are used instead. Accurate determination of fatigue crack closure has been a complex task for years. The influence of this phenomenon can be considered by means of experimental and numerical methods, and both of these approaches are considered here. Finite element analysis (FEA) has been shown to be a powerful and useful tool [1,6] to analyze crack growth and crack closure effects. Computation results are compared with available experimental results. [Projekat Ministarstva nauke Republike Srbije, br. OI 174001]
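    For illustration, a cycle-by-cycle crack-growth integration with a plasticity-induced crack-closure factor can be sketched as below. This is a generic Paris-type stand-in, not the authors' strain-energy-density formulation, and every constant is hypothetical:

```python
# Sketch: cycle-by-cycle fatigue crack growth with a crack-closure correction
# (effective stress-intensity range dK_eff = U * dK). All values hypothetical.
import math

C, m = 1.0e-10, 3.0        # crack-growth coefficients (m/cycle, MPa*sqrt(m))
U = 0.75                   # plasticity-induced crack-closure factor
Y = 1.12                   # geometry factor for an edge crack
d_sigma = 120.0            # applied stress range, MPa

a, a_final = 0.001, 0.020  # initial / final crack length, m
cycles = 0
while a < a_final:
    dK = Y * d_sigma * math.sqrt(math.pi * a)   # stress-intensity range
    a += C * (U * dK) ** m                      # growth increment this cycle
    cycles += 1
print(f"predicted life: {cycles} cycles")
```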

  12. Computational/experimental studies of isolated, single component droplet combustion

    Science.gov (United States)

    Dryer, Frederick L.

    1993-01-01

    Isolated droplet combustion processes have been the subject of extensive experimental and theoretical investigations for nearly 40 years. The gross features of droplet burning are qualitatively embodied by simple theories and are relatively well understood. However, there remain significant aspects of droplet burning, particularly its dynamics, for which additional basic knowledge is needed for thorough interpretations and quantitative explanations of transient phenomena. Spherically-symmetric droplet combustion, which can only be approximated under conditions of both low Reynolds and Grashof numbers, represents the simplest geometrical configuration in which to study the coupled chemical/transport processes inherent within non-premixed flames. The research summarized here concerns recent results on isolated, single component, droplet combustion under microgravity conditions, a program pursued jointly with F.A. Williams of the University of California, San Diego. The overall program involves developing and applying experimental methods to study the burning of isolated, single component droplets, in various atmospheres, primarily at atmospheric pressure and below, in both drop towers and aboard space-based platforms such as the Space Shuttle or Space Station. Both computational methods and asymptotic methods, the latter pursued mainly at UCSD, are used in developing the experimental test matrix, in analyzing results, and for extending theoretical understanding. Methanol and the normal alkanes n-heptane and n-decane have been selected as test fuels to study time-dependent droplet burning phenomena. The following sections summarize the Princeton efforts on this program, describe work in progress, and briefly delineate future research directions.

  13. Computation of the mechanical behaviour of nuclear reactor components

    International Nuclear Information System (INIS)

    Brosi, S.; Niffenegger, M.; Roesel, R.; Reichlin, K.; Duijvestijn, A.

    1994-01-01

    A possible limiting factor of the service life of a reactor is the mechanical load carrying margin, i.e. the excess of the load carrying capacity over the actual loading, of the central, heavy section components. This margin decreases during service but, for safety reasons, may not fall below a critical value. Therefore, it is essential to check and control continuously the factors which cause the decrease. The reasons for the decrease are shown at length and in detail in an example relating to a test, taken almost to failure, of a pipe emanating from a reactor pressure vessel, weakened by an artificial crack and undergoing a water-hammer loading. The latter was caused by a sudden valve closure assumed to follow a break far downstream. The computational and experimental difficulties associated with the simultaneous occurrence of an extreme weakening and an extreme loading in an already rather complicated geometry are explained. It is concluded that available computational tools and present know-how are sufficient to simulate the behaviour under such conditions as would prevail in normal service, and even to analyse departures from them, as long as not all difficulties arise simultaneously. (author) figs., tabs., refs

  14. Software Components and Formal Methods from a Computational Viewpoint

    OpenAIRE

    Lambertz, Christian

    2012-01-01

    Software components and the methodology of component-based development offer a promising approach to master the design complexity of huge software products because they separate the concerns of software architecture from individual component behavior and allow for reusability of components. In combination with formal methods, the specification of a formal component model of the later software product or system allows for establishing and verifying important system properties in an automatic a...

  15. Teaching Computer Security with a Hands-On Component

    OpenAIRE

    Murthy , Narayan

    2011-01-01

    Part 2: WISE 7; International audience; To address national needs for computer security education, many universities have incorporated computer and security courses into their undergraduate and graduate curricula. Our department has introduced computer security courses at both the undergraduate and the graduate level. This paper describes our approach, our experiences, and lessons learned in teaching a Computer Security Overview course. There are two key elements in the course: Studying comput...

  16. Information Sharing for Computing Trust Metrics on COTS Electronic Components

    National Research Council Canada - National Science Library

    McMillon, William J

    2008-01-01

    .... It is challenging for the DoD to determine whether and how much to trust in COTS components, given uncertainty and incomplete information about the developers and suppliers of COTS components as well...

  17. FAILPROB: A Computer Program to Compute the Probability of Failure of a Brittle Component

    International Nuclear Information System (INIS)

    WELLMAN, GERALD W.

    2002-01-01

    FAILPROB is a computer program that applies the Weibull statistics characteristic of brittle failure of a material, along with the stress field resulting from a finite element analysis, to determine the probability of failure of a component. FAILPROB uses the statistical techniques for fast fracture prediction (but not the coding) from the NASA CARES/Life ceramic reliability package. FAILPROB provides the analyst at Sandia with a more convenient tool than CARES/Life because it is designed to behave in the tradition of structural analysis post-processing software such as ALGEBRA, in which the standard finite element database format EXODUS II is both read and written. This maintains compatibility with the entire SEACAS suite of post-processing software. A new technique to deal with the high local stresses computed for structures with singularities such as glass-to-metal seals and ceramic-to-metal braze joints is proposed and implemented. This technique provides failure probability computation that is insensitive to the finite element mesh employed in the underlying stress analysis. Included in this report are a brief discussion of the computational algorithms employed, user instructions, and example problems that both demonstrate the operation of FAILPROB and provide a starting point for verification and validation
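    A minimal sketch of the weakest-link Weibull calculation such a program performs on finite element output; this is not the FAILPROB code, and the Weibull parameters and element data below are hypothetical:

```python
# Sketch: two-parameter Weibull, volume-flaw failure probability from
# per-element FEA stresses. Material parameters and data are hypothetical.
import numpy as np

m = 10.0          # Weibull modulus of the brittle material
sigma_0 = 300.0   # characteristic strength, MPa (per unit volume)

# Per-element maximum principal stress (MPa) and volume (mm^3) from an FEA.
stress = np.array([150.0, 220.0, 180.0, 90.0, 260.0])
volume = np.array([2.0, 1.5, 3.0, 4.0, 0.8])

# Weakest-link model: only tensile stresses contribute to the risk integral.
risk = np.sum(volume * np.where(stress > 0, (stress / sigma_0) ** m, 0.0))
p_fail = 1.0 - np.exp(-risk)
print(f"probability of failure: {p_fail:.4f}")
```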

  18. IPython: components for interactive and parallel computing across disciplines. (Invited)

    Science.gov (United States)

    Perez, F.; Bussonnier, M.; Frederic, J. D.; Froehle, B. M.; Granger, B. E.; Ivanov, P.; Kluyver, T.; Patterson, E.; Ragan-Kelley, B.; Sailer, Z.

    2013-12-01

    Scientific computing is an inherently exploratory activity that requires constantly cycling between code, data and results, each time adjusting the computations as new insights and questions arise. To support such a workflow, good interactive environments are critical. The IPython project (http://ipython.org) provides a rich architecture for interactive computing with: 1. Terminal-based and graphical interactive consoles. 2. A web-based Notebook system with support for code, text, mathematical expressions, inline plots and other rich media. 3. Easy to use, high performance tools for parallel computing. Despite its roots in Python, the IPython architecture is designed in a language-agnostic way to facilitate interactive computing in any language. This allows users to mix Python with Julia, R, Octave, Ruby, Perl, Bash and more, as well as to develop native clients in other languages that reuse the IPython clients. In this talk, I will show how IPython supports all stages in the lifecycle of a scientific idea: 1. Individual exploration. 2. Collaborative development. 3. Production runs with parallel resources. 4. Publication. 5. Education. In particular, the IPython Notebook provides an environment for "literate computing" with a tight integration of narrative and computation (including parallel computing). These Notebooks are stored in a JSON-based document format that provides an "executable paper": notebooks can be version controlled, exported to HTML or PDF for publication, and used for teaching.
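    The JSON-based notebook document format mentioned above can be generated programmatically; a minimal sketch using the nbformat package (the cell contents are arbitrary placeholders):

```python
# Sketch: build an "executable paper" in the JSON notebook format and write
# it to disk, where it can be version controlled or exported to HTML/PDF.
import nbformat
from nbformat.v4 import new_notebook, new_code_cell, new_markdown_cell

nb = new_notebook()
nb.cells.append(new_markdown_cell("## Narrative: what this computation does"))
nb.cells.append(new_code_cell("result = sum(range(10))\nresult"))

with open("executable_paper.ipynb", "w") as f:
    nbformat.write(nb, f)   # plain JSON on disk
```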

  19. Software, component, and service deployment in computational Grids

    International Nuclear Information System (INIS)

    von Laszewski, G.; Blau, E.; Bletzinger, M.; Gawor, J.; Lane, P.; Martin, S.; Russell, M.

    2002-01-01

    Grids comprise an infrastructure that enables scientists to use a diverse set of distributed remote services and resources as part of complex scientific problem-solving processes. We analyze some of the challenges involved in deploying software and components transparently in Grids. We report on three practical solutions used by the Globus Project. Lessons learned from this experience lead us to believe that it is necessary to support a variety of software and component deployment strategies. These strategies are based on the hosting environment

  20. Computational Support for the Selection of Energy Saving Building Components

    NARCIS (Netherlands)

    De Wilde, P.J.C.J.

    2004-01-01

    Buildings use energy for heating, cooling and lighting, contributing to the problems of exhaustion of fossil fuel supplies and environmental pollution. In order to make buildings more energy-efficient an extensive set of 'energy saving building components' has been developed that contributes to

  1. Computation of X-ray powder diffractograms of cement components ...

    Indian Academy of Sciences (India)

    are very important to understand and predict the performance of cement and the resulting ..... modulus given by kR/Di and k the first order surface rate constant for the reaction ... components of interest are listed in table 1. The other input.

  2. Electron Gun for Computer-controlled Welding of Small Components

    Czech Academy of Sciences Publication Activity Database

    Dupák, Jan; Vlček, Ivan; Zobač, Martin

    2001-01-01

    Roč. 62, 2-3 (2001), s. 159-164 ISSN 0042-207X R&D Projects: GA AV ČR IBS2065015 Institutional research plan: CEZ:AV0Z2065902 Keywords: Electron beam-welding machine * Electron gun * Computer-controlled beam Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 0.541, year: 2001

  3. 77 FR 27078 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-05-08

    ... Phones and Tablet Computers, and Components Thereof; Notice of Receipt of Complaint; Solicitation of... entitled Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof... the United States after importation of certain electronic devices, including mobile phones and tablet...

  4. 77 FR 34063 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2012-06-08

    ... Phones and Tablet Computers, and Components Thereof Institution of Investigation AGENCY: U.S... the United States after importation of certain electronic devices, including mobile phones and tablet... mobile phones and tablet computers, and components thereof that infringe one or more of claims 1-3 and 5...

  5. 75 FR 8399 - In the Matter of Certain Mobile Communications and Computer Devices and Components Thereof...

    Science.gov (United States)

    2010-02-24

    ... States after importation of certain mobile communications and computer devices and components thereof by... importation of certain mobile communications or computer devices or components thereof that infringe one or... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-704] In the Matter of Certain Mobile...

  6. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    Full Text Available This paper presents instrumental research on a powerful multivariate data analysis method that researchers can use to obtain valuable information for decision makers who need to solve the marketing problems a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis, and the ability of Principal Component Analysis (PCA) to reduce a set of variables that may be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents step by step the process of applying PCA in marketing research when a large number of naturally collinear variables is used.
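    A minimal sketch of the described workflow with scikit-learn, using synthetic stand-ins for collinear survey items:

```python
# Sketch: standardize a set of collinear variables, then extract a few
# uncorrelated principal components. Data are synthetic stand-ins for
# marketing-survey items, not the paper's dataset.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
base = rng.normal(size=(200, 2))                  # two latent factors
items = np.column_stack([base @ rng.normal(size=2) + 0.1 * rng.normal(size=200)
                         for _ in range(8)])      # eight collinear items

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(items))
print("component scores shape:", scores.shape)    # 200 respondents x 2 PCs
```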

  7. Report on the thermal-hydraulics computational component

    International Nuclear Information System (INIS)

    Laughton, T.; Jones, B.G.

    1996-01-01

    The nodal methods computer code utilizing hexagonal geometry, which is being developed as part of this DOE contract, is called THMZ. The computational objective of the code is to calculate the steady-state thermal-hydraulic conditions in a hexagonal geometry reactor core given the appropriate initial conditions and the axial neutron flux profile. The latter is given by a companion nodal neutronics code which was developed in an earlier part of the contract. The joining of these two codes to provide a coupled analysis tool for hexagonal lattice cores is the ultimate objective of the contract and its follow-on work. The remaining part of this report presents the current status of the development and the results which have been obtained to date. These will appear in the MS thesis of Mr. Terrill Laughton in the Department of Nuclear Engineering, which is currently in preparation

  8. COMPUTER AIDED THREE DIMENSIONAL DESIGN OF MOLD COMPONENTS

    Directory of Open Access Journals (Sweden)

    Kerim ÇETİNKAYA

    2000-02-01

    Full Text Available Sheet metal mold design with classical methods involves very long calculation and drafting times. In mold design, the selection and drafting of most of the components takes a very long time because of similar, repetitive processes. In this study, a mold design program has been developed using AutoLISP, adapted to the AutoCAD package. With this program, the design, dimensioning, and assembly drafting of sheet metal molds have been realized.

  9. Development and application of computer codes for multidimensional thermalhydraulic analyses of nuclear reactor components

    International Nuclear Information System (INIS)

    Carver, M.B.

    1983-01-01

    Components of reactor systems and related equipment are identified in which multidimensional computational thermal hydraulics can be used to advantage to assess and improve design. Models of single- and two-phase flow are reviewed, and the governing equations for multidimensional analysis are discussed. Suitable computational algorithms are introduced, and sample results from the application of particular multidimensional computer codes are given

  10. A virtual component method in numerical computation of cascades for isotope separation

    International Nuclear Information System (INIS)

    Zeng Shi; Cheng Lu

    2014-01-01

    The analysis, optimization, design and operation of cascades for isotope separation involve computations of cascades. In the analytical analysis of cascades, the use of virtual components is a very useful method. For complicated cascades, numerical analysis has to be employed. However, bound to the conventional idea that the concentration of a virtual component should be vanishingly small, the virtual component method has not yet been applied to numerical computations. Here a method for introducing virtual components into numerical computations is elucidated, and its application to a few types of cascades is explained and tested by means of numerical experiments. The results show that the concentration of a virtual component is not restrained at all by the 'vanishingly small' idea. For the same requirements on cascades, the cascades obtained do not depend on the concentrations of the virtual components. (authors)

  11. Fuzzy cluster quantitative computations of component mass transfer in rocks or minerals

    International Nuclear Information System (INIS)

    Liu Dezheng

    2000-01-01

    The author advances a new quantitative computation method for component mass transfer, based on the closure property of the mass percentages of components in rocks or minerals. Using fuzzy dynamic cluster analysis, calculating the restored closure difference, determining the type of difference, and assisted by relevant diagnostic parameters, the method gradually screens out the true constant components. Then, the true mass percentages and mass transfer quantities of components of metasomatic rocks or minerals are calculated by applying the true constant component fixed coefficient. This method is called the true constant component fixed (TCF) method

  12. Independent component analysis of dynamic contrast-enhanced computed tomography images

    Energy Technology Data Exchange (ETDEWEB)

    Koh, T S [School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Ave, Singapore 639798 (Singapore); Yang, X [School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Ave, Singapore 639798 (Singapore); Bisdas, S [Department of Diagnostic and Interventional Radiology, Johann Wolfgang Goethe University Hospital, Theodor-Stern-Kai 7, D-60590 Frankfurt (Germany); Lim, C C T [Department of Neuroradiology, National Neuroscience Institute, 11 Jalan Tan Tock Seng, Singapore 308433 (Singapore)

    2006-10-07

    Independent component analysis (ICA) was applied to dynamic contrast-enhanced computed tomography images of cerebral tumours to extract spatial component maps of the underlying vascular structures, which correspond to different haemodynamic phases as depicted by the passage of the contrast medium. The locations of arteries, veins and tumours can be separately identified on these spatial component maps. As the contrast enhancement behaviour of the cerebral tumour differs from that of normal tissues, ICA yields a tumour component map that reveals the location and extent of the tumour. Tumour outlines can be generated using the tumour component maps, with relatively simple segmentation methods. (note)
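    A minimal sketch of this kind of spatial ICA with scikit-learn's FastICA; the image series below is a random stand-in for a real dynamic contrast-enhanced scan:

```python
# Sketch: unmix a dynamic image series into spatial component maps by
# treating each time frame as a mixture of underlying sources.
import numpy as np
from sklearn.decomposition import FastICA

n_frames, n_pixels = 40, 64 * 64
frames = np.random.rand(n_frames, n_pixels)     # stand-in for the CT series

ica = FastICA(n_components=3, random_state=0)
sources = ica.fit_transform(frames.T)           # (n_pixels, 3) spatial maps
maps = sources.T.reshape(3, 64, 64)             # e.g. arterial/venous/tumour
print("spatial component maps:", maps.shape)
```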

  13. 76 FR 41523 - In the Matter of Certain Mobile Communications and Computer Devices and Components Thereof...

    Science.gov (United States)

    2011-07-14

    ... in its entirety Inv. No. 337-TA-704, Certain Mobile Communications and Computer Devices and... importation of certain mobile communications and computer devices and components thereof by reason of... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-704] In the Matter of Certain Mobile...

  14. Computer navigation experience in hip resurfacing improves femoral component alignment using a conventional jig.

    Science.gov (United States)

    Morison, Zachary; Mehra, Akshay; Olsen, Michael; Donnelly, Michael; Schemitsch, Emil

    2013-11-01

    The use of computer navigation has been shown to improve the accuracy of femoral component placement compared to conventional instrumentation in hip resurfacing. Whether exposure to computer navigation improves accuracy when the procedure is subsequently performed with conventional instrumentation without navigation has not been explored. We examined whether femoral component alignment utilizing a conventional jig improves following experience with the use of imageless computer navigation for hip resurfacing. Between December 2004 and December 2008, 213 consecutive hip resurfacings were performed by a single surgeon. The first 17 (Cohort 1) and the last 9 (Cohort 2) hip resurfacings were performed using a conventional guidewire alignment jig. In 187 cases, the femoral component was implanted using the imageless computer navigation. Cohorts 1 and 2 were compared for femoral component alignment accuracy. All components in Cohort 2 achieved the position determined by the preoperative plan. The mean deviation of the stem-shaft angle (SSA) from the preoperatively planned target position was 2.2° in Cohort 2 and 5.6° in Cohort 1 (P = 0.01). Four implants in Cohort 1 were positioned at least 10° varus compared to the target SSA position and another four were retroverted. Femoral component placement utilizing conventional instrumentation may be more accurate following experience using imageless computer navigation.

  15. Computer navigation experience in hip resurfacing improves femoral component alignment using a conventional jig

    Directory of Open Access Journals (Sweden)

    Zachary Morison

    2013-01-01

    Full Text Available Background: The use of computer navigation has been shown to improve the accuracy of femoral component placement compared to conventional instrumentation in hip resurfacing. Whether exposure to computer navigation improves accuracy when the procedure is subsequently performed with conventional instrumentation without navigation has not been explored. We examined whether femoral component alignment utilizing a conventional jig improves following experience with the use of imageless computer navigation for hip resurfacing. Materials and Methods: Between December 2004 and December 2008, 213 consecutive hip resurfacings were performed by a single surgeon. The first 17 (Cohort 1) and the last 9 (Cohort 2) hip resurfacings were performed using a conventional guidewire alignment jig. In 187 cases, the femoral component was implanted using the imageless computer navigation. Cohorts 1 and 2 were compared for femoral component alignment accuracy. Results: All components in Cohort 2 achieved the position determined by the preoperative plan. The mean deviation of the stem-shaft angle (SSA) from the preoperatively planned target position was 2.2° in Cohort 2 and 5.6° in Cohort 1 (P = 0.01). Four implants in Cohort 1 were positioned at least 10° varus compared to the target SSA position and another four were retroverted. Conclusions: Femoral component placement utilizing conventional instrumentation may be more accurate following experience using imageless computer navigation.

  16. Computed tomography (CT) as a nondestructive test method used for composite helicopter components

    Science.gov (United States)

    Oster, Reinhold

    1991-09-01

    The first components of primary helicopter structures to be made of glass fiber reinforced plastics were the main and tail rotor blades of the Bo105 and BK 117 helicopters. These blades are now successfully produced in series. New developments in rotor components, e.g., the rotor blade technology of the Bo108 and PAH2 programs, make use of very complex fiber reinforced structures to achieve simplicity and strength. Computed tomography was found to be an outstanding nondestructive test method for examining the internal structure of components. A CT scanner generates x-ray attenuation measurements which are used to produce computer-reconstructed images of any desired part of an object. The system images a range of flaws in composites in a number of views and planes. Several CT investigations and their results are reported, taking composite helicopter components as an example.

  17. A computer-controlled electronic system for the ultrasonic NDT of components for nuclear power stations

    International Nuclear Information System (INIS)

    Rehrmann, M.; Harbecke, D.

    1987-01-01

    The paper describes an automatic ultrasonic testing system combined with a computer-controlled electronics system, called IMPULS I, for the non-destructive testing of components of nuclear reactors. The system can be used both for in-service inspection and for inspection during the manufacturing process. IMPULS I has more functions and fewer components than conventional ultrasonic systems, gives well-reproducible test results, and is easy to operate. (U.K.)

  18. A computer-based feedback only intervention with and without a moderation skills component

    OpenAIRE

    Weaver, Cameron C.; Leffingwell, Thad R.; Lombardi, Nathaniel J.; Claborn, Kasey R.; Miller, Mary E.; Martens, Matthew P.

    2013-01-01

    Research on the efficacy of computer-delivered feedback-only interventions (FOIs) for college alcohol misuse has been mixed. Limitations to these FOIs include participant engagement and variation in the use of a moderation skills component. The current investigation sought to address these limitations using a novel computer-delivered FOI, the Drinkers Assessment and Feedback Tool for College Students (DrAFT-CS). Heavy drinking college students (N = 176) were randomly assigned to DrAFT-CS, DrA...

  19. 78 FR 63492 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Science.gov (United States)

    2013-10-24

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-847] Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof; Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is...

  20. Separation of electron ion ring components (computational simulation and experimental results)

    International Nuclear Information System (INIS)

    Aleksandrov, V.S.; Dolbilov, G.V.; Kazarinov, N.Yu.; Mironov, V.I.; Novikov, V.G.; Perel'shtejn, Eh.A.; Sarantsev, V.P.; Shevtsov, V.F.

    1978-01-01

    The problems of the attainable polarization of electron-ion rings in the acceleration regime and of the separation of ring components at the final stage of acceleration are studied. The results of computational simulation using the macroparticle method and of experiments on ring acceleration and separation are given. A comparison of the calculated results with experiment is presented

  1. Decreasing Transition Times in Elementary School Classrooms: Using Computer-Assisted Instruction to Automate Intervention Components

    Science.gov (United States)

    Hine, Jeffrey F.; Ardoin, Scott P.; Foster, Tori E.

    2015-01-01

    Research suggests that students spend a substantial amount of time transitioning between classroom activities, which may reduce time spent academically engaged. This study used an ABAB design to evaluate the effects of a computer-assisted intervention that automated intervention components previously shown to decrease transition times. We examined…

  2. Experimental investigation of surface determination process on multi-material components for dimensional computed tomography

    DEFF Research Database (Denmark)

    Borges de Oliveira, Fabrício; Stolfi, Alessandro; Bartscher, Markus

    2016-01-01

    The possibility of measuring multi-material components, while assessing inner and outer features simultaneously makes X-ray computed tomography (CT) the latest evolution in the field of coordinate measurement systems (CMSs). However, the difficulty in selecting suitable scanning parameters and su...

  3. 78 FR 75942 - Certain Mobile Phones and Tablet Computers, and Components Thereof; Commission Determination To...

    Science.gov (United States)

    2013-12-13

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-847] Certain Mobile Phones and Tablet Computers, and Components Thereof; Commission Determination To Review in Part a Final Initial Determination... Qualcomm Magellan and Odyssey transceiver chips have become a de facto standard in the mobile devices...

  4. Development of computational methods of design by analysis for pressure vessel components

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan; Wu Honglin

    2005-01-01

    Stress classification is not only one of the key steps when a pressure vessel component is designed by analysis, but also a difficulty which has long puzzled engineers and designers. At present, for calculating and categorizing the stress field of pressure vessel components, several computational methods of design by analysis have been developed and applied, such as Stress Equivalent Linearization, the Two-Step Approach, the Primary Structure method, the Elastic Compensation method, and the GLOSS R-Node method. Moreover, the ASME code also gives an inelastic method of design by analysis, for limiting gross plastic deformation only. When pressure vessel components are designed by analysis, there are sometimes huge differences between the results calculated using the different analysis methods mentioned above. This is the main reason that hinders wide application of the design-by-analysis approach. Recently, a new approach, presented in the new proposal of a European Standard, CEN's unfired pressure vessel standard EN 13445-3, tries to avoid the problems of stress classification by analyzing the various failure mechanisms of the pressure vessel structure directly, based on elastic-plastic theory. In this paper, some of the stress classification methods mentioned above are described briefly, and the computational methods cited in the European pressure vessel standard, such as the Deviatoric Map and nonlinear analysis methods (plastic analysis and limit analysis), are outlined. Furthermore, the characteristics of the computational methods of design by analysis are summarized to aid selection of the proper computational method when designing a pressure vessel component by analysis. (authors)

  5. Can a dual-energy computed tomography predict unsuitable stone components for extracorporeal shock wave lithotripsy?

    Science.gov (United States)

    Ahn, Sung Hoon; Oh, Tae Hoon; Seo, Ill Young

    2015-09-01

    To assess the potential of dual-energy computed tomography (DECT) to identify urinary stone components, particularly uric acid and calcium oxalate monohydrate, which are unsuitable for extracorporeal shock wave lithotripsy (ESWL). This clinical study included 246 patients who underwent removal of urinary stones and an analysis of stone components between November 2009 and August 2013. All patients received preoperative DECT using two energy values (80 kVp and 140 kVp). Hounsfield units (HU) were measured and matched to the stone component. Significant differences in HU values were observed between uric acid and nonuric acid stones at both the 80 and 140 kVp energy values (p<0.001). DECT improved the characterization of urinary stone components and was a useful method for identifying uric acid and calcium oxalate monohydrate stones, which are unsuitable for ESWL.
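    As an illustration of the kind of rule such HU measurements support, a sketch follows; the threshold values are illustrative assumptions, not the study's fitted cut-offs:

```python
# Sketch: flag ESWL-unsuitable stones from dual-energy attenuation.
# Both cut-off values below are hypothetical, for illustration only.
def classify_stone(hu_80: float, hu_140: float) -> str:
    ratio = hu_80 / hu_140          # dual-energy attenuation ratio
    if ratio < 1.1:                 # low ratio: consistent with uric acid
        return "uric acid (ESWL-unsuitable)"
    if hu_80 > 1200:                # very dense calcium stone (assumed cut-off)
        return "calcium oxalate monohydrate (ESWL-unsuitable)"
    return "other (potentially ESWL-suitable)"

print(classify_stone(hu_80=520.0, hu_140=510.0))
```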

  6. Computer software program for monitoring the availability of systems and components of electric power generating systems

    International Nuclear Information System (INIS)

    Petersen, T.A.; Hilsmeier, T.A.; Kapinus, D.M.

    1994-01-01

    As the availability of electric power generating station systems and components becomes more and more important from financial, personnel safety, and regulatory standpoints, it is evident that a comprehensive, yet simple and user-friendly program for system and component tracking and monitoring is needed to assist in effectively managing the large volume of systems and components with their large numbers of associated maintenance/availability records. A user-friendly computer software program for system and component availability monitoring has been developed that calculates, displays and monitors selected component and system availabilities. This is a Windows™-based graphical user interface program that utilizes a system flow diagram as the data input screen, which also provides a visual representation of availability values and limits for the individual components and associated systems. The program can be customized to the user's plant-specific system and component selections and configurations. As discussed herein, this software program is well suited for availability monitoring and ultimately provides valuable information for improving plant performance and reducing operating costs
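    A minimal sketch of the availability bookkeeping such a program performs; the component names, outage records, and alarm limit below are hypothetical:

```python
# Sketch: per-component availability = (period - outage time) / period,
# rolled up to a series system. All records here are hypothetical.
PERIOD_H = 24.0 * 365.0   # one-year monitoring window, hours

outage_hours = {          # summed maintenance/failure records per component
    "pump_A": 120.0,
    "pump_B": 40.0,
    "valve_7": 6.5,
}

avail = {c: (PERIOD_H - h) / PERIOD_H for c, h in outage_hours.items()}
for comp, a in avail.items():
    flag = "  <-- below limit" if a < 0.99 else ""
    print(f"{comp}: {a:.4f}{flag}")

# Series system: every component is needed, so availabilities multiply.
system = 1.0
for a in avail.values():
    system *= a
print(f"system availability: {system:.4f}")
```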

  7. Femoral Component External Rotation Affects Knee Biomechanics: A Computational Model of Posterior-stabilized TKA.

    Science.gov (United States)

    Kia, Mohammad; Wright, Timothy M; Cross, Michael B; Mayman, David J; Pearle, Andrew D; Sculco, Peter K; Westrich, Geoffrey H; Imhauser, Carl W

    2018-01-01

    The correct amount of external rotation of the femoral component during TKA is controversial because the resulting changes in biomechanical knee function associated with varying degrees of femoral component rotation are not well understood. We addressed this question using a computational model, which allowed us to isolate the biomechanical impact of geometric factors including bony shapes, location of ligament insertions, and implant size across three different knees after posterior-stabilized (PS) TKA. Using a computational model of the tibiofemoral joint, we asked: (1) Does external rotation unload the medial collateral ligament (MCL) and what is the effect on lateral collateral ligament tension? (2) How does external rotation alter tibiofemoral contact loads and kinematics? (3) Does 3° external rotation relative to the posterior condylar axis align the component to the surgical transepicondylar axis (sTEA) and what anatomic factors of the femoral condyle explain variations in maximum MCL tension among knees? We incorporated a PS TKA into a previously developed computational knee model applied to three neutrally aligned, nonarthritic, male cadaveric knees. The computational knee model was previously shown to corroborate coupled motions and ligament loading patterns of the native knee through a range of flexion. Implant geometries were virtually installed using hip-to-ankle CT scans through measured resection and anterior referencing surgical techniques. Collateral ligament properties were standardized across each knee model by defining stiffness and slack lengths based on the healthy population. The femoral component was externally rotated from 0° to 9° relative to the posterior condylar axis in 3° increments. At each increment, the knee was flexed under 500 N compression from 0° to 90° simulating an intraoperative examination. The computational model predicted collateral ligament forces, compartmental contact forces, and tibiofemoral internal/external and

  8. Development of high performance scientific components for interoperability of computing packages

    Energy Technology Data Exchange (ETDEWEB)

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of each of these packages. A chemistry algorithm is hard and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  9. TITAN: a computer program for accident occurrence frequency analyses by component Monte Carlo simulation

    International Nuclear Information System (INIS)

    Nomura, Yasushi; Tamaki, Hitoshi; Kanai, Shigeru

    2000-04-01

    In a plant system consisting of complex equipment and components, such as a reprocessing facility, there may be grace time between an initiating event and a resultant serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. A component Monte Carlo simulation computer program, TITAN, has been developed to analyze such a complex reliability model, including the grace time, and obtain an accident occurrence frequency without difficulty. Firstly, basic methods for the component Monte Carlo simulation are introduced for obtaining an accident occurrence frequency, and then basic performance characteristics such as precision, convergence, and parallelization of the calculation are shown through calculation of a prototype accident sequence model. As an example illustrating applicability to a real-scale plant model, a red oil explosion in a German reprocessing plant model is simulated to show that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented to show another aspect of its performance, and a proposal is made for introducing a new input-data format adapted to the component Monte Carlo simulation. The present paper describes the calculational method, performance, applicability to a real scale, and the new proposal for the TITAN code. In the Appendixes, a conventional analytical method is presented that avoids the complex and laborious calculation needed to obtain a strict solution of the accident occurrence frequency, for comparison with the Monte Carlo method. The user's manual and the list/structure of the program are also contained in the Appendixes to facilitate use of the TITAN computer program. (author)
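    A minimal sketch of a component Monte Carlo estimate with a grace time, in the spirit of (but far simpler than) TITAN; all rates and times below are hypothetical:

```python
# Sketch: an initiating event becomes a serious accident only when the
# remedial action is not completed within the grace time. Rates hypothetical.
import random

LAMBDA = 0.5      # initiating-event rate, events per year
GRACE_H = 8.0     # grace time between initiator and serious accident, hours
MTTR_H = 4.0      # mean time for operators to complete remedial action, hours
TRIALS = 200_000  # simulated plant-years

accidents = 0
for _ in range(TRIALS):
    t = random.expovariate(LAMBDA)                 # first initiator this year
    while t < 1.0:
        repair_h = random.expovariate(1.0 / MTTR_H)  # sampled remedial time
        if repair_h > GRACE_H:                     # action too late -> accident
            accidents += 1
        t += random.expovariate(LAMBDA)            # next initiating event

print(f"estimated accident occurrence frequency: {accidents / TRIALS:.2e} /yr")
```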

  10. GRAPHIC COMPETENCE AS A COMPONENT OF TRAINING FUTURE ENGINEERING TEACHERS OF COMPUTER PROFILE

    Directory of Open Access Journals (Sweden)

    Yuliya Kozak

    2016-06-01

    Full Text Available The article analyses the system of professional training of future engineering teachers of computer profile at pedagogical universities, including preparation in graphical content. It is established that modernizing this system of training is extremely important because of increasing demands for universal graphics education, which, given mass communication, the need to compress significant amounts of information, and the opportunities provided by new information technologies, is becoming as important as a second literacy. The article reveals the essential characteristics of the concept of graphic competence as an important component of the modernization of the education system, and attempts to find promising ways to effectively solve the issue of forming the graphic competence of engineering teachers of computer profile.

  11. EXAFS Phase Retrieval Solution Tracking for Complex Multi-Component System: Synthesized Topological Inverse Computation

    International Nuclear Information System (INIS)

    Lee, Jay Min; Yang, Dong-Seok; Bunker, Grant B

    2013-01-01

    Using the FEFF kernel A(k,r), we describe the inverse computation from χ(k)-data to the g(r)-solution in terms of a singularity regularization method based on a complete Bayesian statistics process. In this work, we topologically decompose the system-matched invariant projection operators into two distinct types, (A⁺AA⁺A) and (AA⁺AA⁺), and achieve Synthesized Topological Inversion Computation (STIC) by employing a 12-operator closed-loop emulator of the symplectic transformation. This leads to a numerically self-consistent solution as the optimal near-singular regularization parameters are sought, dramatically suppressing the instability problems connected with finite precision arithmetic in ill-posed systems. By statistically correlating a pair of measured data, it was feasible to compute an optimal EXAFS phase retrieval solution expressed in terms of the complex-valued χ(k), and this approach was successfully used to determine the optimal g(r) for a complex multi-component system.
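    A minimal sketch of the two projection-operator types named above, built from a generic kernel and its Moore-Penrose pseudoinverse rather than a real FEFF A(k,r) kernel:

```python
# Sketch: by the Penrose identities, A+A and AA+ are idempotent projectors,
# so (A+AA+A) = A+A and (AA+AA+) = AA+. The kernel here is a random stand-in.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(40, 25))        # discretized kernel, chi(k) = A @ g(r)
A_pinv = np.linalg.pinv(A)           # Moore-Penrose pseudoinverse

P_model = A_pinv @ A                 # projector onto the model (r) space
P_data = A @ A_pinv                  # projector onto the data (k) space

print(np.allclose(P_model @ P_model, P_model))   # True: idempotent
print(np.allclose(P_data @ P_data, P_data))      # True: idempotent
```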

  12. Computational models for residual creep life prediction of power plant components

    International Nuclear Information System (INIS)

    Grewal, G.S.; Singh, A.K.; Ramamoortry, M.

    2006-01-01

    All high-temperature, high-pressure power plant components are prone to irreversible visco-plastic deformation by the phenomenon of creep. The steady-state creep response as well as the total creep life of a material is related to the operational component temperature through, respectively, exponential and inverse exponential relationships. Minor increases in the component temperature can thus have serious consequences as far as the creep life and dimensional stability of a plant component are concerned. In high temperature steam tubing in power plants, one mechanism by which a significant temperature rise can occur is the growth of a thermally insulating oxide film on the steam side surface. In the present paper, an elegantly simple and computationally efficient technique is presented for predicting the residual creep life of steel components subjected to continual steam side oxide film growth. Similarly, fabrication of high temperature power plant components involves extensive use of welding as the fabrication process of choice. Naturally, issues related to the creep life of weldments have to be seriously addressed for safe and continual operation of the welded plant component. Unfortunately, a typical weldment in an engineering structure is a zone of complex microstructural gradation comprising a number of distinct sub-zones with distinct meso-scale and micro-scale morphology of the phases and (even) chemistry, and its creep life prediction presents considerable challenges. The present paper presents a stochastic algorithm which can be used for developing experimental creep-cavitation intensity versus residual life correlations for welded structures. Apart from estimates of the residual life in a mean field sense, the model can be used for predicting the reliability of the plant component in a rigorous probabilistic setting. (author)
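    A minimal sketch of the inverse exponential temperature sensitivity noted above, assuming an Arrhenius-type creep life t_f ∝ exp(Q/RT); the activation energy is a typical assumed value, not one from the paper:

```python
# Sketch: a modest metal-temperature rise (e.g. from steam-side oxide growth)
# cuts creep life sharply when life scales as exp(Q / (R * T)).
import math

R = 8.314          # gas constant, J/(mol*K)
Q = 300_000.0      # apparent creep activation energy, J/mol (assumed)

def life_ratio(t_design_k: float, t_actual_k: float) -> float:
    """Remaining-life fraction relative to the design temperature."""
    return math.exp(Q / R * (1.0 / t_actual_k - 1.0 / t_design_k))

# A 15 K rise on an 823 K (550 C) tube wall roughly halves the creep life:
print(f"life fraction: {life_ratio(823.0, 838.0):.2f}")   # ~0.46
```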

  13. Advanced computational simulation for design and manufacturing of lightweight material components for automotive applications

    Energy Technology Data Exchange (ETDEWEB)

    Simunovic, S.; Aramayo, G.A.; Zacharia, T. [Oak Ridge National Lab., TN (United States); Toridis, T.G. [George Washington Univ., Washington, DC (United States); Bandak, F.; Ragland, C.L. [Dept. of Transportation, Washington, DC (United States)

    1997-04-01

    Computational vehicle models for the analysis of lightweight material performance in automobiles have been developed through collaboration between Oak Ridge National Laboratory, the National Highway Transportation Safety Administration, and George Washington University. The vehicle models have been verified against experimental data obtained from vehicle collisions. The crashed vehicles were analyzed, and the main impact energy dissipation mechanisms were identified and characterized. Important structural parts were extracted and digitized and directly compared with simulation results. High-performance computing played a key role in the model development because it allowed for rapid computational simulations and model modifications. The deformation of the computational model shows a very good agreement with the experiments. This report documents the modifications made to the computational model and relates them to the observations and findings on the test vehicle. Procedural guidelines are also provided that the authors believe need to be followed to create realistic models of passenger vehicles that could be used to evaluate the performance of lightweight materials in automotive structural components.

  14. Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy

    We present an analysis of a communicability methodology for graphics and animation components in interface design, called CAN (Communicability, Acceptability and Novelty). This methodology was developed between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results on iconography and layout design in blogs and websites from the following countries: Spain, Italy, Portugal and France.

  15. Incompressible viscous flow computations for the pump components and the artificial heart

    Science.gov (United States)

    Kiris, Cetin

    1992-01-01

    A finite difference, three-dimensional incompressible Navier-Stokes formulation to calculate the flow through turbopump components is utilized. The solution method is based on the pseudo-compressibility approach and uses an implicit upwind differencing scheme together with the Gauss-Seidel line relaxation method. Both steady and unsteady flow calculations can be performed using the current algorithm. Here, the equations are solved in steadily rotating reference frames by using the steady-state formulation in order to simulate the flow through a turbopump inducer. Eddy viscosity is computed by using an algebraic mixing-length turbulence model. Numerical results are compared with experimental measurements and a good agreement is found between the two.

  16. TITAN: a computer program for accident occurrence frequency analyses by component Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Yasushi [Department of Fuel Cycle Safety Research, Nuclear Safety Research Center, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Tamaki, Hitoshi [Department of Safety Research Technical Support, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Kanai, Shigeru [Fuji Research Institute Corporation, Tokyo (Japan)

    2000-04-01

    In a plant system consisting of complex equipment and components, such as a reprocessing facility, there might be a grace time between an initiating event and the resultant serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. A component Monte Carlo simulation computer program, TITAN, has been developed to analyze such complex reliability models, including the grace time, without difficulty, and to obtain an accident occurrence frequency. Firstly, the basic methods of component Monte Carlo simulation for obtaining an accident occurrence frequency are introduced, and then the basic performance, such as precision, convergence, and parallelization of calculation, is shown through calculation of a prototype accident sequence model. As an example to illustrate applicability to a real-scale plant model, a red oil explosion in a German reprocessing plant model is simulated, showing that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented to show further performance, and a new input-data format suited to component Monte Carlo simulation is proposed. The present paper describes the calculation method, the performance, the applicability to a real-scale plant, and the new proposal for the TITAN code. In the appendixes, a conventional analytical method that avoids complex and laborious calculation is shown for obtaining a strict solution of the accident occurrence frequency, for comparison with the Monte Carlo method. The user's manual and the program list/structure are also contained in the appendixes to facilitate use of the TITAN computer program. (author)
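
    TITAN's input format and algorithms are not given in the abstract; the following is a minimal sketch of the general idea of component Monte Carlo with a grace time, under an assumed toy model (Poisson initiating events, exponentially distributed remedial-action durations). All rates, times, and names below are illustrative, not TITAN's.

```python
import random

LAMBDA = 0.1       # initiating-event rate, 1/year (assumed)
MEAN_REPAIR = 4.0  # mean remedial-action duration, hours (assumed)
GRACE = 8.0        # grace time before the accident becomes unavoidable, hours
YEARS = 10**6      # simulated plant-years

random.seed(1)
accidents = 0
for _ in range(YEARS):
    t = random.expovariate(LAMBDA)            # time to first initiating event
    while t < 1.0:                            # all events within this plant-year
        # The sequence develops into an accident only if the remedial
        # action fails to complete within the grace time.
        if random.expovariate(1.0 / MEAN_REPAIR) > GRACE:
            accidents += 1
        t += random.expovariate(LAMBDA)

print("estimated accident frequency:", accidents / YEARS, "per year")
# analytic check for this toy model: LAMBDA * exp(-GRACE/MEAN_REPAIR) ~ 0.0135
```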

  17. A computer-based feedback only intervention with and without a moderation skills component.

    Science.gov (United States)

    Weaver, Cameron C; Leffingwell, Thad R; Lombardi, Nathaniel J; Claborn, Kasey R; Miller, Mary E; Martens, Matthew P

    2014-01-01

    Research on the efficacy of computer-delivered feedback-only interventions (FOIs) for college alcohol misuse has been mixed. Limitations of these FOIs include participant engagement and variation in the use of a moderation skills component. The current investigation sought to address these limitations using a novel computer-delivered FOI, the Drinkers Assessment and Feedback Tool for College Students (DrAFT-CS). Heavy drinking college students (N=176) were randomly assigned to a DrAFT-CS, DrAFT-CS plus moderation skills (DrAFT-CS+), moderation skills only (MSO), or assessment only (AO) group, and were assessed at 1-month follow-up (N=157). Participants in the DrAFT-CS and DrAFT-CS+ groups reported significantly lower estimated blood alcohol concentrations (eBACs) on their typical heaviest drinking day than participants in the AO group. The data also supported the incorporation of a moderation skills component within FOIs, such that participants in the DrAFT-CS+ group reported significantly fewer drinks per week and drinks per heaviest drinking occasion than participants in the AO group. © 2013.
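
    The study's outcome measure is estimated blood alcohol concentration (eBAC). A common way to compute such estimates is a Widmark-type formula; the study's exact parameterization is not stated in the abstract, so the constants below (14 g of ethanol per US standard drink, r = 0.68/0.55, 0.015 g/dL/h elimination) are one conventional choice, shown for illustration only.

```python
def estimated_bac(standard_drinks, weight_kg, sex, hours):
    """Widmark-style estimated blood alcohol concentration in g/dL.

    Assumed constants (one conventional parameterization, not necessarily
    the study's): r = 0.68 for males, 0.55 for females; ~14 g ethanol per
    US standard drink; elimination rate ~0.015 g/dL per hour.
    """
    r = 0.68 if sex == "male" else 0.55
    grams_alcohol = 14.0 * standard_drinks
    ebac = grams_alcohol / (r * weight_kg * 1000.0) * 100.0 - 0.015 * hours
    return max(ebac, 0.0)

# e.g. 6 drinks over 3 hours for a 70 kg male -> ~0.13 g/dL
print(f"{estimated_bac(6, 70, 'male', 3):.3f} g/dL")
```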

  18. A Computational Model of Torque Generation: Neural, Contractile, Metabolic and Musculoskeletal Components

    Science.gov (United States)

    Callahan, Damien M.; Umberger, Brian R.; Kent-Braun, Jane A.

    2013-01-01

    The pathway of voluntary joint torque production includes motor neuron recruitment and rate-coding, sarcolemmal depolarization and calcium release by the sarcoplasmic reticulum, force generation by motor proteins within skeletal muscle, and force transmission by tendon across the joint. The direct source of energetic support for this process is ATP hydrolysis. It is possible to examine portions of this physiologic pathway using various in vivo and in vitro techniques, but an integrated view of the multiple processes that ultimately impact joint torque remains elusive. To address this gap, we present a comprehensive computational model of the combined neuromuscular and musculoskeletal systems that includes novel components related to intracellular bioenergetics function. Components representing excitatory drive, muscle activation, force generation, metabolic perturbations, and torque production during voluntary human ankle dorsiflexion were constructed, using a combination of experimentally derived data and literature values. Simulation results were validated by comparison with torque and metabolic data obtained in vivo. The model successfully predicted peak and submaximal voluntary and electrically-elicited torque output, and accurately simulated the metabolic perturbations associated with voluntary contractions. This novel, comprehensive model could be used to better understand the impact of global effectors such as age and disease on various components of the neuromuscular system and, ultimately, voluntary torque output. PMID:23405245

  19. Computing the influences of different Intraocular Pressures on the human eye components using computational fluid-structure interaction model.

    Science.gov (United States)

    Karimi, Alireza; Razaghi, Reza; Navidbakhsh, Mahdi; Sera, Toshihiro; Kudo, Susumu

    2017-01-01

    Intraocular Pressure (IOP) is defined as the pressure of the aqueous humor in the eye. It has been reported that the normal range of IOP is 10-20 mmHg, with an average of 15.50 mmHg, among ophthalmologists. Keratoconus is a non-inflammatory eye disorder in which the weakened cornea is unable to preserve its normal structure against the IOP in the eye. Consequently, the cornea bulges outward and assumes a conical shape, followed by distorted vision. In addition, it is known that any alteration in the structure and composition of the lens and cornea would induce a change in the shape of the eyeball as well as in the mechanical and optical properties of the eye. Understanding the precise alteration of the stresses and deformations of the eye components due to different IOPs could help elucidate etiology and pathogenesis and develop treatments, not only for keratoconus but also for other diseases of the eye. In this study, at three different IOPs (10, 20, and 30 mmHg), the stresses and deformations of the human eye components were quantified using a three-dimensional (3D) computational fluid-structure interaction (FSI) model of the human eye. The results revealed the highest von Mises stress, 245 kPa, in the bulged region of the cornea at an IOP of 30 mmHg. The lens also showed a von Mises stress of 19.38 kPa at an IOP of 30 mmHg. In addition, by increasing the IOP from 10 to 30 mmHg, the radius of curvature of the cornea and lens increased accordingly. In contrast, the sclera showed its highest stress at an IOP of 10 mmHg due to an overpressure phenomenon. The variation of IOP had little influence on the amount of stress and the resultant displacement of the optic nerve. These results can be used for understanding the stresses and deformations in the human eye components due to different IOPs, as well as for clarifying the significant role of IOP in the radius of curvature of the cornea and the lens.

  20. Predicting adenocarcinoma recurrence using computational texture models of nodule components in lung CT

    International Nuclear Information System (INIS)

    Depeursinge, Adrien; Yanagawa, Masahiro; Leung, Ann N.; Rubin, Daniel L.

    2015-01-01

    Purpose: To investigate the importance of presurgical computed tomography (CT) intensity and texture information from ground-glass opacities (GGO) and solid nodule components for the prediction of adenocarcinoma recurrence. Methods: For this study, 101 patients with surgically resected stage I adenocarcinoma were selected. During the follow-up period, 17 patients had disease recurrence with six associated cancer-related deaths. GGO and solid tumor components were delineated on presurgical CT scans by a radiologist. Computational texture models of GGO and solid regions were built using linear combinations of steerable Riesz wavelets learned with linear support vector machines (SVMs). Unlike other traditional texture attributes, the proposed texture models are designed to encode local image scales and directions that are specific to GGO and solid tissue. The responses of the locally steered models were used as texture attributes and compared to the responses of unaligned Riesz wavelets. The texture attributes were combined with CT intensities to predict tumor recurrence and patient hazard according to disease-free survival (DFS) time. Two families of predictive models were compared: LASSO and SVMs, and their survival counterparts: Cox-LASSO and survival SVMs. Results: The best-performing predictive model of patient hazard was associated with a concordance index (C-index) of 0.81 ± 0.02 and was based on the combination of the steered models and CT intensities with survival SVMs. The same feature group and the LASSO model yielded the highest area under the receiver operating characteristic curve (AUC) of 0.8 ± 0.01 for predicting tumor recurrence, although no statistically significant difference was found when compared to using intensity features solely. For all models, the performance was found to be significantly higher when image attributes were based on the solid components solely versus using the entire tumors (p < 3.08 × 10⁻⁵). Conclusions: This study

  1. Computer-aided stress analysis system for nuclear plant primary components

    International Nuclear Information System (INIS)

    Murai, Tsutomu; Tokumaru, Yoshio; Yamazaki, Junko.

    1980-06-01

    Generally, a vast quantity of calculation is needed to produce the stress analysis reports for nuclear plant primary components. In Japan especially, stress analysis reports are obligatory for each plant. At Mitsubishi Heavy Industries, Ltd., we have been making great efforts over the past ten years to rationalize the analysis process. As a result of this rationalization, a computer-aided stress analysis system using a graphic display, graphic tablet, data files, etc. has been completed, requiring only minimal hand work. In addition, we developed a fracture safety analysis system, and we are going to develop an input generator system for 3-dimensional FEM analysis using graphics terminals in the near future. We expect that, when the above-mentioned input generator system is completed, it will be possible for us to solve any problem instantly. (author)

  2. 75 FR 8400 - In the Matter of Certain Notebook Computer Products and Components Thereof; Notice of Investigation

    Science.gov (United States)

    2010-02-24

    ... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-705] In the Matter of Certain Notebook Computer... United States after importation of certain notebook computer products and components thereof by reason of... an industry in the United States exists as required by subsection (a)(2) of section 337. The...

  3. Annotating the structure and components of a nanoparticle formulation using computable string expressions.

    Science.gov (United States)

    Thomas, Dennis G; Chikkagoudar, Satish; Chappell, Alan R; Baker, Nathan A

    2012-12-31

    Nanoparticle formulations that are being developed and tested for various medical applications are typically multi-component systems that vary in their structure, chemical composition, and function. It is difficult to compare and understand the differences between the structural and chemical descriptions of hundreds and thousands of nanoparticle formulations found in text documents. We have developed a string nomenclature to create computable string expressions that identify and enumerate the different high-level types of material parts of a nanoparticle formulation and represent the spatial order of their connectivity to each other. The string expressions are intended to be used as IDs, along with terms that describe a nanoparticle formulation and its material parts, in data sharing documents and nanomaterial research databases. The strings can be parsed and represented as a directed acyclic graph. The nodes of the graph can be used to display the string ID, name and other text descriptions of the nanoparticle formulation or its material part, while the edges represent the connectivity between the material parts with respect to the whole nanoparticle formulation. The different patterns in the string expressions can be searched for and used to compare the structure and chemical components of different nanoparticle formulations. The proposed string nomenclature is extensible and can be applied along with ontology terms to annotate the complete description of nanoparticles formulations.
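
    The published string nomenclature itself is not reproduced in the abstract, so the sketch below invents a simplified, hypothetical syntax (parenthesized part lists) purely to illustrate how such an expression can be parsed into the nodes and parent-to-part edges of a directed graph; the part names are likewise hypothetical.

```python
import re

def parse(expr):
    """Parse a nested part expression into (nodes, edges) of a directed graph.

    Hypothetical syntax (the published nomenclature differs): each material
    part is a token, and parentheses enumerate the parts connected to it.
    """
    tokens = re.findall(r"[\w\-]+|[(),]", expr)
    nodes, edges, stack = [], [], []
    for tok in tokens:
        if tok == "(":
            stack.append(nodes[-1])             # last node opens a child scope
        elif tok == ")":
            stack.pop()
        elif tok != ",":
            nodes.append(tok)
            if stack:
                edges.append((stack[-1], tok))  # parent -> part connectivity
    return nodes, edges

nodes, edges = parse("DendrimerCore(PEG(FolicAcid),Gd-chelate)")
print(nodes)  # ['DendrimerCore', 'PEG', 'FolicAcid', 'Gd-chelate']
print(edges)  # [('DendrimerCore', 'PEG'), ('PEG', 'FolicAcid'), ('DendrimerCore', 'Gd-chelate')]
```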

  4. Computer Simulation of Robotic Device Components in 3D Printer Manufacturing

    Directory of Open Access Journals (Sweden)

    M. A. Kiselev

    2016-01-01

    The paper considers the relevant problem of computer simulation of robotic device components manufactured on a 3D printer, and highlights the problem of computer simulation based on cognitive programming technology. The subject is urgent because computer simulation of the force-torque and accuracy characteristics of robot components, in terms of the properties and conditions of their manufacture from polymeric and metallic materials, is of paramount importance for programming and manufacturing on 3D printers. Two types of additive manufacturing technologies were used:
    1. FDM (fused deposition modeling): layered growth of products from molten plastic strands;
    2. SLM (selective laser melting): selective laser sintering of metal powders.
    These technologies, in turn, create:
    • conditions for reducing the use of expensive equipment;
    • reduced weight and increased strength through optimization of the lattice structures when using a bionic design;
    • a capability to implement mathematical modeling of individual components of robotic and other devices in terms of appropriate characteristics;
    • a 3D printing capability to create unique items which cannot be made by other known methods.
    The aim of the paper was to confirm the possibility of ensuring the strength and accuracy characteristics of cases when printing from polymeric and metallic materials on a 3D printer. The investigation emphasis is on mathematical modeling based on the cognitive programming technology using additive technologies, since it is generally impossible to make the obtained optimized structures on modern CNC machines. The latter allows us to create program code that is clear to other developers without extra cost or additional time for development, adaptation and implementation. Year by year, Russian companies increasingly use 3D printing systems in mechanical engineering, the aerospace industry, and for scientific purposes. Machines for the additive

  5. Modelling and computer simulation for the manufacture by powder HIPing of Blanket Shield components for ITER

    International Nuclear Information System (INIS)

    Gillia, O.; Bucci, Ph.; Vidotto, F.; Leibold, J.-M.; Boireau, B.; Boudot, C.; Cottin, A.; Lorenzetto, P.; Jacquinot, F.

    2006-01-01

    In components of blanket modules for ITER, intricate cooling networks are needed in order to evacuate all the heat coming from the plasma. Hot Isostatic Pressing (HIPing) technology is a very convenient method to produce near-net-shape components with complex cooling networks through massive stainless steel parts, by bonding together tubes inserted in grooves machined in bulk stainless steel. Powder is often included in the process so as to relieve difficulties arising with gap closure between tube and solid part or between several solid parts. At the same time, it relaxes the machining precision needed on the parts to be assembled before HIP. However, inserting powder in the assembly means densification, i.e. volume change of the powder during the HIP cycle. This leads to global and local shape changes of HIPed parts. In order to control the deformations, modelling and computer simulation are used. This modelling and computer simulation work has been done in support of the fabrication of a shield prototype for the ITER blanket. Problems such as global bending of the whole part and deformation of the tubes in their powder bed are addressed. It is important that the part does not bend too much. It is important as well to have a circular tube shape after HIP, firstly in order to avoid tube rupture during HIP, but also because non-destructive ultrasonic examination is needed to check the quality of the densification and of the bonding between tube and powder or solid parts; the insertion of a probe in the tubes requires a minimally circular tube shape. For simulation purposes, the behaviour of the different materials has to be modelled. Although the modelling of the massive stainless steel behaviour is not neglected, the most critical modelling is that of the powder. For this study, a thorough investigation of the powder behaviour has been performed with some in-situ HIP dilatometry experiments and some interrupted HIP cycles on trial parts. These experiments have allowed the identification of a

  6. Development of software for computing forming information using a component based approach

    Directory of Open Access Journals (Sweden)

    Kwang Hee Ko

    2009-12-01

    In the shipbuilding industry, manufacturing technology has advanced at an unprecedented pace over the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process, and accordingly productivity has increased drastically. Despite such improvement in manufacturing technology, however, the development of an automatic system for fabricating curved hull plates remains at the beginning stage, since the hardware and software for the automation of the curved hull fabrication process must be developed differently depending on the dimensions of the plates and on the forming methods and manufacturing processes of each shipyard. To deal with this problem, it is necessary to create a “plug-in” framework which can adopt various kinds of hardware and software to construct a fully automatic fabrication system. In this paper, a framework for the automatic fabrication of curved hull plates is proposed, which consists of four components and related software. In particular, the software module for computing fabrication information is developed by using the ooCBD development methodology, which can interface with other hardware and software with minimum effort. Examples of the proposed framework applied to medium and large shipyards are presented.

  7. Surgical planning of total hip arthroplasty: accuracy of computer-assisted EndoMap software in predicting component size

    International Nuclear Information System (INIS)

    Davila, Jesse A.; Kransdorf, Mark J.; Duffy, Gavan P.

    2006-01-01

    The purpose of our study was to assess the accuracy of computer-assisted templating in the surgical planning of patients undergoing total hip arthroplasty, utilizing EndoMap software (Siemens AG, Medical Solutions, Erlangen, Germany). EndoMap software is an electronic program that uses DICOM images to analyze standard anteroposterior radiographs for determination of optimal prosthesis component size. We retrospectively reviewed the preoperative radiographs of 36 patients undergoing uncomplicated primary total hip arthroplasty, utilizing EndoMap software, Version VA20. DICOM anteroposterior radiographs were analyzed using standard manufacturer-supplied electronic templates to determine acetabular and femoral component sizes. No additional clinical information was reviewed. Acetabular and femoral component sizes were assessed by an orthopedic surgeon and two radiologists. The mean estimated component size was compared with the component size documented in operative reports. The mean estimated acetabular component size was 53 mm (range 48-60 mm), 1 mm larger than the mean implanted size of 52 mm (range 48-62 mm). Thirty-one of 36 acetabular component sizes (86%) were accurate within one size. The mean calculated femoral component size was 4 (range 2-7), 1 size smaller than the actual mean component size of 5 (range 2-9). Twenty-six of 36 femoral component sizes (72%) were accurate within one size, and accurate within two sizes in all but four cases (94%). EndoMap software predicted femoral component size well, with 72% within one component size of that used, and 94% within two sizes. Acetabular component size was predicted slightly better, with 86% within one component size and 94% within two component sizes. (orig.)

  8. A framework for the computer-aided planning and optimisation of manufacturing processes for components with functional graded properties

    Science.gov (United States)

    Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.

    2014-05-01

    In this contribution a framework for the computer-aided planning and optimisation of functional graded components is presented. The framework is divided into three modules - the "Component Description", the "Expert System" for the synthetisation of several process chains and the "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model by a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating-, cooling-, and forming processes. The dependencies between the component and the applied manufacturing processes as well as between the processes themselves need to be considered. The Expert System utilises an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information of the knowledge base via relations. The third module performs the evaluation of the generated process chains. To accomplish this, the parameters of each process are optimised with respect to the component specification, whereby the result of the best parameterisation is used as the representative value. Finally, the process chain which is capable of manufacturing a functionally graded component in an optimal way with regard to the property distributions of the component description is presented by means of a dedicated specification technique.

  9. Bridge between control science and technology. Volume 5 Manufacturing man-machine systems, computers, components, traffic control, space applications

    Energy Technology Data Exchange (ETDEWEB)

    Rembold, U; Kempf, K G; Towill, D R; Johannsen, G; Paul, M

    1985-01-01

    Among the topics discussed are: robotics; CAD/CAM applications; and man-machine systems. Consideration is also given to: tools and software for system design and integration; communication systems for real-time computer control; fail-safe design of real-time computer systems; and microcomputer-based control systems. Additional topics discussed include: programmable and intelligent components and instruments in automatic control; transportation systems; and space applications of automatic control systems.

  10. Knowledge-based geographic information systems on the Macintosh computer: a component of the GypsES project

    Science.gov (United States)

    Gregory Elmes; Thomas Millette; Charles B. Yuill

    1991-01-01

    GypsES, a decision-support and expert system for the management of Gypsy Moth, addresses five related research problems in a modular, computer-based project. The modules are hazard rating, monitoring, prediction, treatment decision and treatment implementation. One common component is a geographic information system designed to function intelligently. We refer to this...

  11. Computer-Aided College Algebra: Learning Components that Students Find Beneficial

    Science.gov (United States)

    Aichele, Douglas B.; Francisco, Cynthia; Utley, Juliana; Wescoatt, Benjamin

    2011-01-01

    A mixed-method study was conducted during the Fall 2008 semester to better understand the experiences of students participating in computer-aided instruction of College Algebra using the software MyMathLab. The learning environment included a computer learning system for the majority of the instruction, a support system via focus groups (weekly…

  12. Computer-aided process planning in prismatic shape die components based on Standard for the Exchange of Product model data

    Directory of Open Access Journals (Sweden)

    Awais Ahmad Khan

    2015-11-01

    In the past few years, insufficient technologies made good integration between the die components in design, process planning, and manufacturing impossible. Nowadays, advanced technologies based on the Standard for the Exchange of Product model data are making it possible. This article discusses the three main steps for achieving complete process planning for prismatic parts of die components. These three steps are data extraction, feature recognition, and process planning. The proposed computer-aided process planning system works as part of an integrated system to cover the process planning of any prismatic die component. The system is built using Visual Basic with the EWDraw system for visualizing the Standard for the Exchange of Product model data file. The system works successfully and can cover any type of sheet metal die component. The case study discussed in this article is taken from a large design of a progressive die.

  13. Using Principal Component Analysis (PCA) to Speed up Radiative Transfer (RT) Computations

    Science.gov (United States)

    Natraj, Vijay

    2012-01-01

    Multiple-scattering RT calculations are time-consuming, and a speed improvement of about 1000x is needed (for OCO). The solution is to make use of redundancies in the spectra, as in correlated-k methods (Lacis and Wang; Lacis and Oinas; Goody et al.; Fu and Liou), whose problem is the assumption that the spectral variation of atmospheric optical properties is spatially correlated at all points along the optical path. High-accuracy (HI) and two-stream (2S) calculations have high correlation, while single-scattering (SS) computations are highly scenario-dependent but not time-consuming. The approach is therefore to perform SS and 2S calculations at every wavelength, perform only a small number of HI computations, and compute a correction factor B at every wavelength.
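
    A schematic sketch of this PCA speed-up idea: expensive HI calls are made only at the mean optical state and at one-sigma perturbations along a few leading principal components, and the correction factor B = ln(HI/2S) is expanded to second order in PC-score space and applied to the cheap 2S result at every wavelength. Both RT models below (rt_hi, rt_2s) are mock stand-ins, and the synthetic optical-property matrix is an assumption; the operational method works on real layer optical properties.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mock optical-property spectra driven by a few latent variables, so that a
# handful of principal components captures most of the spectral variability.
nwav, nprop = 5000, 8
X = rng.normal(size=(nwav, 3)) @ rng.normal(size=(3, nprop)) \
    + 0.05 * rng.normal(size=(nwav, nprop))

w = 0.3 * rng.normal(size=nprop)
def rt_hi(x):   # stand-in for the expensive high-accuracy (HI) RT model
    return np.exp(0.30 * np.tanh(x @ w) + 0.02 * (x @ w) ** 2)
def rt_2s(x):   # stand-in for the cheap two-stream (2S) RT model
    return np.exp(0.25 * np.tanh(x @ w))

# PCA (EOF decomposition) of the centered optical-property matrix.
mu = X.mean(axis=0)
_, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
K = 3
sig = s[:K] / np.sqrt(nwav)          # one-sigma amplitude of each retained EOF
t = (X - mu) @ Vt[:K].T / sig        # standardized PC scores per wavelength

# Expensive HI calls only at the mean state and +/- one sigma along each EOF.
logB = lambda x: np.log(rt_hi(x) / rt_2s(x))
B0 = logB(mu)
Bp = np.array([logB(mu + sig[k] * Vt[k]) for k in range(K)])
Bm = np.array([logB(mu - sig[k] * Vt[k]) for k in range(K)])

# Second-order (central-difference) expansion of B in PC-score space,
# applied as a multiplicative correction to the cheap 2S result.
Bhat = B0 + t @ ((Bp - Bm) / 2.0) + (t ** 2) @ ((Bp - 2.0 * B0 + Bm) / 2.0)
R_corr = rt_2s(X) * np.exp(Bhat)

print("raw 2S rms rel. error:      ", np.sqrt(np.mean((rt_2s(X) / rt_hi(X) - 1) ** 2)))
print("corrected 2S rms rel. error:", np.sqrt(np.mean((R_corr / rt_hi(X) - 1) ** 2)))
print("expensive HI calls:", 2 * K + 1, "instead of", nwav)
```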

  14. A hybrid finite element analysis and evolutionary computation method for the design of lightweight lattice components with optimized strut diameter

    DEFF Research Database (Denmark)

    Salonitis, Konstantinos; Chantzis, Dimitrios; Kappatos, Vasileios

    2017-01-01

    approaches or with the use of topology optimization methodologies. An optimization approach utilizing multipurpose optimization algorithms has not been proposed yet. This paper presents a novel user-friendly method for the design optimization of lattice components towards weight minimization, which combines...... finite element analysis and evolutionary computation. The proposed method utilizes the cell homogenization technique in order to reduce the computational cost of the finite element analysis and a genetic algorithm in order to search for the most lightweight lattice configuration. A bracket consisting...
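
    A minimal sketch of the evolutionary half of such a method: a genetic algorithm searching strut diameters for minimum weight under a feasibility constraint. The fitness function below is a toy surrogate standing in for the homogenized finite-element model, and all bounds, penalties, and GA settings are assumptions for illustration.

```python
import random

random.seed(7)
N_STRUTS, D_MIN, D_MAX = 12, 0.5, 4.0   # strut count and diameter bounds, mm (assumed)

def fitness(diams):
    # Toy surrogate in place of the homogenized FE model: weight grows with
    # d^2, while the stress check fails below an assumed minimum diameter.
    weight = sum(d * d for d in diams)
    stress_ok = all(d >= 1.2 for d in diams)
    return weight + (0.0 if stress_ok else 1e3)   # penalize infeasible designs

def ga(pop_size=40, gens=200, pm=0.2):
    pop = [[random.uniform(D_MIN, D_MAX) for _ in range(N_STRUTS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]              # keep the lightest feasible half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_STRUTS)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < pm:              # Gaussian mutation of one gene
                i = random.randrange(N_STRUTS)
                child[i] = min(max(child[i] + random.gauss(0, 0.2), D_MIN), D_MAX)
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = ga()
print("best strut diameters:", [round(d, 2) for d in best])  # converges toward ~1.2 mm
```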

  15. Computer-controlled ultrasonic equipment for automatic inspection of nuclear reactor components after manufacturing

    International Nuclear Information System (INIS)

    Moeller, P.; Roehrich, H.

    1983-01-01

    After the foundation of the working team ''Automated US Manufacture Testing'' in 1976, the realization of an ultrasonic test facility for nuclear reactor components after manufacturing was started. Over a period of about 5 years, an automated prototype facility was developed, fabricated and successfully tested. The function of this facility is to replace the manual ultrasonic tests, which are carried out autonomously at different stages of the manufacturing process, and to fulfil the test specification under improved economic conditions. This prototype facility has been designed so that it can be transported to the components to be tested at low expense. Hereby the reproducibility of a test is entirely guaranteed. (orig.)

  16. Automatic procedures for computing complete configuration geometry from individual component descriptions

    Science.gov (United States)

    Barger, Raymond L.; Adams, Mary S.

    1994-01-01

    Procedures are derived for developing a complete airplane surface geometry starting from component descriptions. The procedures involve locating the intersection lines of adjacent components and omitting any regions for which part of one surface lies within the other. The geometry files utilize the wave-drag (Harris) format, and output files are written in Hess format. Two algorithms are used: one, if both intersecting surfaces have airfoil cross sections; the other, if one of the surfaces has circular cross sections. Some sample results in graphical form are included.

  17. Computational neuroanatomy: ontology-based representation of neural components and connectivity.

    Science.gov (United States)

    Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron

    2009-02-05

    A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling development of automated computer reasoning applications. Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future.

  18. Computer Simulation of Material Flow in Warm-forming Bimetallic Components

    Science.gov (United States)

    Kong, T. F.; Chan, L. C.; Lee, T. C.

    2007-05-01

    Bimetallic components take advantage of two different metals or alloys so that their performance, weight and cost can be optimized. However, since each material has its own flow properties and mechanical behaviour, heterogeneous material flows occur during the bimetal forming process. The control of the process parameters is therefore relatively more complicated than when forming single metals. Most previous studies of bimetal forming have focused mainly on cold forming, and less relevant information has been provided about warm forming. Indeed, changes of temperature and heat transfer between the two materials are significant factors which can highly influence the success of the process. Therefore, this paper presents a study of the material flow in warm-forming bimetallic components using finite-element (FE) simulation, in order to determine suitable process parameters for attaining complete die filling. A watch-case-like component made of stainless steel (AISI-316L) and aluminium alloy (AL-6063) was used as the example. The warm-forming processes were simulated with punch speeds V of 40, 80, and 120 mm/s and initial temperatures of the stainless steel, TiSS, of 625, 675, 725, 775, 825, 875, 925, 975, and 1025 °C. The results showed that the AL-6063 flowed faster than the AISI-316L, and so incomplete die filling was found only in the AISI-316L region. A higher TiSS was recommended to avoid incomplete die filling. A reduction of V is also suggested, because this can save forming energy and prevent damage to the tooling. Eventually, with experimental verification, the results from the simulation were found to be in agreement with those of the experiments. On the basis of the results of this study, engineers can gain a better understanding of the material flow in warm-forming bimetallic components, and be able to determine more efficiently the punch speed and initial material temperature for the process.

  19. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  20. Computers for Manned Space Applications Based on Commercial Off-the-Shelf Components

    Science.gov (United States)

    Vogel, T.; Gronowski, M.

    2009-05-01

    Similar to the consumer markets, there has been an ever-increasing demand in processing power, signal processing capabilities and memory space for computers used for science data processing in space. An important driver of this development has been the payload developers for the International Space Station, requesting high-speed data acquisition and fast control loops in increasingly complex systems. Current experiments now even perform video processing and compression with their payload controllers. Nowadays the requirements for a space-qualified computer are often far beyond the capabilities of, for example, the classic SPARC architecture that is found in ERC32 or LEON CPUs. An increase in performance usually demands costly and power-consuming application-specific solutions. Continuous developments over the last few years have now led to an alternative approach that is based on complete electronics modules manufactured for commercial and industrial customers. Computer modules used in industrial environments with a high demand for reliability under harsh environmental conditions, such as chemical reactors, electrical power plants or manufacturing lines, are entered into a selection procedure. Promising candidates then undergo a detailed characterisation process developed by Astrium Space Transportation. After thorough analysis and some modifications, these modules can replace fully qualified custom-built electronics in specific, although not safety-critical, applications in manned space. This paper focuses on the benefits of COTS-based electronics modules and the necessary analyses and modifications for their utilisation in manned space applications on the ISS. Some considerations regarding overall systems architecture will also be included. Furthermore, this paper will also pinpoint issues that render such modules unsuitable for specific tasks, and justify the reasons. Finally, the conclusion of this paper will advocate the implementation of COTS based

  1. IMPLEMENTATION OF CLOUD COMPUTING AS A COMPONENT OF THE UNIVERSITY IT INFRASTRUCTURE

    Directory of Open Access Journals (Sweden)

    Vasyl P. Oleksyuk

    2014-05-01

    The article investigates the concept of the IT infrastructure of a higher educational institution and describes models of deploying cloud technologies in IT infrastructure. The hybrid model is the most suitable for a higher educational institution. Unified authentication is an important component of IT infrastructure. The author suggests public (Google Apps, Office 365) and private (CloudStack, Eucalyptus, OpenStack) cloud platforms for deployment in the IT infrastructure of a higher educational institution. Open-source platforms for organizing enterprise clouds are analyzed by the author. The article describes the experience of deploying an enterprise cloud in the IT infrastructure of the Department of Physics and Mathematics of Ternopil V. Hnatyuk National Pedagogical University.

  2. Clustering composite SaaS components in Cloud computing using a Grouping Genetic Algorithm

    OpenAIRE

    Yusoh, Zeratul Izzah Mohd; Tang, Maolin

    2012-01-01

    Recently, Software as a Service (SaaS) in Cloud computing, has become more and more significant among software users and providers. To offer a SaaS with flexible functions at a low cost, SaaS providers have focused on the decomposition of the SaaS functionalities, or known as composite SaaS. This approach has introduced new challenges in SaaS resource management in data centres. One of the challenges is managing the resources allocated to the composite SaaS. Due to the dynamic environment of ...

  3. Computational Design of Multi-component Bio-Inspired Bilayer Membranes

    Directory of Open Access Journals (Sweden)

    Evan Koufos

    2014-04-01

    Our investigation is motivated by the need to design bilayer membranes with tunable interfacial and mechanical properties for use in a range of applications, such as targeted drug delivery, sensing and imaging. We draw inspiration from biological cell membranes and focus on their principal constituents. In this paper, we present our results on the role of molecular architecture on the interfacial, structural and dynamical properties of bio-inspired membranes. We focus on four lipid architectures with variations in the head group shape and the hydrocarbon tail length. Each lipid species is composed of a hydrophilic head group and two hydrophobic tails. In addition, we study a model of the Cholesterol molecule to understand the interfacial properties of a bilayer membrane composed of rigid, single-tail molecular species. We demonstrate the properties of the bilayer membranes to be determined by the molecular architecture and rigidity of the constituent species. Finally, we demonstrate the formation of a stable mixed bilayer membrane composed of Cholesterol and one of the phospholipid species. Our approach can be adopted to design multi-component bilayer membranes with tunable interfacial and mechanical properties. We use a Molecular Dynamics-based mesoscopic simulation technique called Dissipative Particle Dynamics that resolves the molecular details of the components through soft-sphere coarse-grained models and reproduces the hydrodynamic behavior of the system over extended time scales.

  4. Rotational and Translational Components of Motion Parallax: Observers' Sensitivity and Implications for Three-Dimensional Computer Graphics

    Science.gov (United States)

    Kaiser, Mary K.; Montegut, Michael J.; Proffitt, Dennis R.

    1995-01-01

    The motion of objects during motion parallax can be decomposed into 2 observer-relative components: translation and rotation. The depth ratio of objects in the visual field is specified by the inverse ratio of their angular displacement (from translation) or equivalently by the inverse ratio of their rotations. Despite the equal mathematical status of these 2 information sources, it was predicted that observers would be far more sensitive to the translational than rotational component. Such a differential sensitivity is implicitly assumed by the computer graphics technique billboarding, in which 3-dimensional (3-D) objects are drawn as planar forms (i.e., billboards) maintained normal to the line of sight. In 3 experiments, observers were found to be consistently less sensitive to rotational anomalies. The implications of these findings for kinetic depth effect displays and billboarding techniques are discussed.

  5. Feasibility Study of Cryogenic Cutting Technology by Using a Computer Simulation and Manufacture of Main Components for Cryogenic Cutting System

    International Nuclear Information System (INIS)

    Kim, Sung Kyun; Lee, Dong Gyu; Lee, Kune Woo; Song, Oh Seop

    2009-01-01

    Cryogenic cutting technology is one of the most suitable technologies for dismantling nuclear facilities, because no secondary waste is generated during the cutting process. In this paper, the feasibility of cryogenic cutting technology was investigated by using a computer simulation. In the computer simulation, a hybrid method combining the SPH (smoothed particle hydrodynamics) method and the FE (finite element) method was used. A penetration depth equation was used for the design of the cryogenic cutting system, and the design variables and operating conditions needed to cut steel 10 mm in thickness were determined. Finally, the main components of the cryogenic cutting system were manufactured on the basis of the obtained design variables and operating conditions.

  6. Contour integral computations for multi-component material systems subjected to creep

    International Nuclear Information System (INIS)

    Chen, J.-J.; Tu, S.-T.; Xuan, F.-Z.; Wang, Z.-D.

    2006-01-01

    In the present paper the crack behavior of multi-component material systems is investigated under extensive creep conditions. The validity of the creep fracture parameters C* and C(t) is first examined at the microscale level. It is found that the C* value is no longer path-independent when mismatched inclusions are embedded in the matrix. To characterize the crack fields in the inhomogeneous material, the integral value defined at the crack tip, C*_tip, is introduced to reflect the influence of the inclusion. The interaction effects between microcrack and inclusion are systematically calculated with respect to different mismatch factors, various inclusion locations and inclusion numbers. The analysis results show that the C*_tip value is not only influenced by the inclusion properties but also depends on the microstructure near the crack tip.

  7. Development of a personal computer based facility-level SSAC component and inspector support system

    International Nuclear Information System (INIS)

    Markov, A.

    1989-08-01

    Research Contract No. 4658/RB was conducted between the IAEA and the Bulgarian Committee on Use of Atomic Energy for Peaceful Purposes. The contract required the Committee to develop and program a personal computer based software package to be used as a facility-level computerized State System of Accounting and Control (SSAC) at an off-load power reactor. The software delivered, called the National Safeguards System (NSS), keeps track of all fuel assembly activity at a power reactor and generates all ledgers, MBA material balances and any required reports to national or international authorities. The NSS is designed to operate on a PC/AT or compatible equipment with a hard disk of 20 MB, a color graphics monitor or adapter, and at least one 360 KB floppy disk drive. The programs are written in BASIC (compiler 2.0) and are executed under MS DOS 3.1 or later.

  8. A novel method to measure femoral component migration by computed tomography: a cadaver study.

    Science.gov (United States)

    Boettner, Friedrich; Sculco, Peter; Lipman, Joseph; Renner, Lisa; Faschingbauer, Martin

    2016-06-01

    Radiostereometric analysis (RSA) is the most accurate technique for measuring implant migration. However, it requires special equipment, technical expertise and analysis software, and has not gained wide acceptance. The current paper analyzes a novel method to measure implant migration utilizing widely available computed tomography (CT). Three uncemented total hip replacements were performed in three human cadavers, and six tantalum beads were inserted into the femoral bone, similar to RSA. Six different 28 mm heads (-3, 0, 2.5, 5.0, 7.5 and 10 mm) were added to simulate five reproducible translations (maximum total point migration) of the center of the head. Implant migration was measured in a 3-D analysis software package (Geomagic Studio 7). Repeated manual reconstructions of the center of the head were performed by two investigators to determine repeatability and accuracy. The accuracy of measurements between the centers of two head sizes was 0.11 mm, with a 95% CI of 0.22 mm. The intra-observer repeatability was 0.13 mm (95% CI 0.25 mm). The inter-rater reliability was 0.943. CT-based measurements of head displacement in a cadaver model were highly accurate and reproducible.
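
    The paper's measurement workflow used a commercial 3-D analysis package; one plausible ingredient of such a measurement, sketched below under assumed geometry and noise, is a least-squares sphere fit that localizes the head center from CT surface points, after which migration is the distance between the centers recovered from two scans (in practice expressed in the bead-defined bone coordinate frame).

```python
import numpy as np

def sphere_center(points):
    """Least-squares sphere fit: |p|^2 = 2 p.c + (r^2 - |c|^2) is linear in c."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]

rng = np.random.default_rng(3)

def head(center):
    # Mock CT surface samples of a 14 mm radius head with 0.2 mm
    # segmentation noise (real use: points segmented from the CT scan).
    d = rng.normal(size=(500, 3))
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    return center + 14.0 * d + rng.normal(0, 0.2, size=(500, 3))

# Two scans with head centers an assumed 2.5 mm apart.
c0 = sphere_center(head(np.array([0.0, 0.0, 0.0])))
c1 = sphere_center(head(np.array([0.0, 0.0, 2.5])))
print(f"measured migration: {np.linalg.norm(c1 - c0):.2f} mm")  # ~2.5 mm
```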

  9. Use of computed tomography slices 3D-reconstruction as a powerful tool to improve manufacturing processes on aeroengine components

    International Nuclear Information System (INIS)

    Castellan, C.; Dastarac, D.

    2000-01-01

    TURBOMECA has been using computed tomography for several years as a powerful inner-health analysis tool for engine components. From 2D slices of the examined part, detailed information about voids or inclusions could easily be extracted. But measurements of internal features were soon required, because no other NDT method was able to provide them. CT has thus logically become a powerful 2D dimensional measuring tool. Recently, with new software and the latest computers able to deal with huge files, CT has become a powerful 3D digitization tool, and TOMO ADOUR can now offer a complete solution for the reverse engineering of complex parts. Several months ago, TURBOMECA introduced CT into many development, validation and industrialization processes and has demonstrated how to take corrective actions against process deviations on their aeroengine components by:
    - extracting the non-existing CAD model of a part;
    - generating CAD-compatible data to check dimensional conformity and eventually correct design misfits or manufacturing drifts;
    - highlighting the metallurgical health of first-article parts;
    - making the decision to repair and defining the appropriate method;
    - generating a file (.STL) to build a rapid prototype, or a file to pilot tooling for machining;
    - calculating physical properties, such as behavior or flow analysis, on a 'real' model.
    The image also allows a drawing to be made of a part that was originally produced by a supplier or competitor. This paper is illustrated with a large number of examples.

  10. A study on the optimal replacement periods of digital control computer's components of Wolsung nuclear power plant unit 1

    International Nuclear Information System (INIS)

    Mok, Jin Il; Seong, Poong Hyun

    1993-01-01

    Due to failures of the instrumentation and control devices of nuclear power plants caused by aging, nuclear power plants occasionally trip. Even a trip of a single nuclear power plant (NPP) causes an extravagant economic loss and deteriorates public acceptance of nuclear power plants. Therefore, replacement of the instrumentation and control devices with proper consideration of the aging effect is necessary in order to prevent inadvertent trips. In this paper we investigate the optimal replacement periods of the components of the digital control computer of Wolsung nuclear power plant Unit 1. We first derive mathematical models of the optimal replacement periods for the digital control computer's components of Wolsung NPP Unit 1 and calculate the optimal replacement periods analytically. We compare these periods with the replacement periods currently used at Wolsung NPP Unit 1. The periods used at Wolsung are based not on mathematical analysis but on empirical knowledge. As a consequence, the optimal replacement periods obtained analytically and those used in the field show some difference. (Author)
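
    The paper's mathematical models are not given in the abstract; a classical age-replacement formulation is a reasonable stand-in for illustrating how an optimal replacement period arises from balancing planned-replacement cost against in-service failure cost. The Weibull parameters and cost ratio below are assumptions.

```python
import numpy as np

# Classical age-replacement model (an illustrative formulation, not
# necessarily the paper's): minimize the long-run cost rate
#   C(T) = (c_p * R(T) + c_f * (1 - R(T))) / integral_0^T R(t) dt
# for an aging (Weibull, shape k > 1) component.

k, L = 2.5, 8.0        # assumed Weibull shape and scale, years
c_p, c_f = 1.0, 12.0   # assumed planned-replacement vs in-service failure cost

t = np.linspace(0.01, 20.0, 2000)
R = np.exp(-(t / L) ** k)                    # reliability (survival) function
mean_cycle = np.cumsum(R) * (t[1] - t[0])    # numerical integral of R up to T
cost_rate = (c_p * R + c_f * (1.0 - R)) / mean_cycle

T_opt = t[np.argmin(cost_rate)]
print(f"optimal replacement period: {T_opt:.2f} years")
```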

  11. [Can we do therapy without a therapist? Active components of computer-based CBT for depression].

    Science.gov (United States)

    Iakimova, G; Dimitrova, S; Burté, T

    2017-12-01

    Computer-delivered cognitive behavioral therapies (C-CBT) are emerging as therapeutic techniques which help to overcome the barriers to health care access in adult populations with depression. C-CBTs provide CBT techniques in a highly structured format comprising a number of educational lessons, homework, multimedia illustrations and supplementary materials delivered via interactive computer interfaces. Programs are often administered with minimal or regular support provided by a clinician or a technician via email, telephone, online forums, or during face-to-face consultations. However, much C-CBT is provided without any therapeutic support. Several reports have shown that C-CBTs, whether guided or unguided by a therapist, may be reliable and effective for patients with depression, and their use has been recommended as part of the first step of clinical care. The aim of the present qualitative review is to describe the operational format and functioning of five of the most cited unguided C-CBT programs for depression, to analyze their characteristics according to CBT principles, and to discuss the results of the randomized controlled trials (RCTs) conducted to evaluate their effectiveness, adherence and user experience. We analyzed five C-CBTs (Beating The Blues (BTB), MoodGYM, Sadness, Deprexis and Overcoming Depression on the Internet (ODIN)) and 22 randomized controlled studies according to 5 dimensions: general characteristics; methodology, structure and organization; specific modules, themes and techniques; clinical indications, recruitment mode and type of users with depression; and type and mode of therapist support, overall therapeutic effects, adherence and user experience. The C-CBTs have secured free or pay-to-use access in different languages (English, German, Dutch, and Chinese) but not in French. The programs may be accessed at a medical center or at home via a CD-ROM or an Internet connection. Some C-CBTs are very close to textual self

  12. Computation of geometric representation of novel spectrophotometric methods used for the analysis of minor components in pharmaceutical preparations.

    Science.gov (United States)

    Lotfy, Hayam M; Saleh, Sarah S; Hassan, Nagiba Y; Salem, Hesham

    2015-01-01

    Novel spectrophotometric methods were applied for the determination of the minor component tetryzoline HCl (TZH) in its ternary mixture with ofloxacin (OFX) and prednisolone acetate (PA) in a ratio of 1:5:7.5, and in its binary mixture with sodium cromoglicate (SCG) in a ratio of 1:80. The novel spectrophotometric methods successfully determined the minor component (TZH) in the two selected mixtures by computing the geometrical relationship of either standard addition or subtraction. The novel spectrophotometric methods are: geometrical amplitude modulation (GAM), geometrical induced amplitude modulation (GIAM), ratio H-point standard addition method (RHPSAM) and compensated area under the curve (CAUC). The proposed methods were successfully applied for the determination of the minor component TZH below its concentration range. The methods were validated as per ICH guidelines, where accuracy, repeatability, inter-day precision and robustness were found to be within the acceptable limits. The results obtained from the proposed methods were statistically compared with those of official methods, and no significant difference was observed. No difference was observed when the results were compared with those of a reported HPLC method, which proved that the developed methods could be an alternative to HPLC techniques in quality control laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. FY05 LDRD Final Report A Computational Design Tool for Microdevices and Components in Pathogen Detection Systems

    Energy Technology Data Exchange (ETDEWEB)

    Trebotich, D

    2006-02-07

    We have developed new algorithms to model complex biological flows in integrated biodetection microdevice components. The proposed work is important because the design strategy for the next-generation Autonomous Pathogen Detection System at LLNL is the microfluidic-based Biobriefcase, being developed under the Chemical and Biological Countermeasures Program in the Homeland Security Organization. This miniaturization strategy introduces a new flow regime to systems where biological flow is already complex and not well understood. Also, design and fabrication of MEMS devices is time-consuming and costly due to the current trial-and-error approach. Furthermore, existing devices, in general, are not optimized. There are several MEMS CAD capabilities currently available, but their computational fluid dynamics modeling capabilities are rudimentary at best. Therefore, we proposed a collaboration to develop computational tools at LLNL which will (1) provide critical understanding of the fundamental flow physics involved in bioMEMS devices, (2) shorten the design and fabrication process, and thus reduce costs, (3) optimize current prototypes and (4) provide a prediction capability for the design of new, more advanced microfluidic systems. Computational expertise was provided by Comp-CASC and UC Davis-DAS. The simulation work was supported by key experiments for guidance and validation at UC Berkeley-BioE.

  14. Implementation of the Principal Component Analysis onto High-Performance Computer Facilities for Hyperspectral Dimensionality Reduction: Results and Comparisons

    Directory of Open Access Journals (Sweden)

    Ernestina Martel

    2018-06-01

    Dimensionality reduction represents a critical preprocessing step for increasing the efficiency and the performance of many hyperspectral imaging algorithms. However, dimensionality reduction algorithms such as Principal Component Analysis (PCA) suffer from their computationally demanding nature, which makes their implementation on high-performance computer architectures advisable for applications under strict latency constraints. This work presents the implementation of the PCA algorithm on two different high-performance devices, namely an NVIDIA Graphics Processing Unit (GPU) and a Kalray manycore, uncovering a highly valuable set of tips and tricks for taking full advantage of the inherent parallelism of these high-performance computing platforms and, hence, reducing the time required to process a given hyperspectral image. Moreover, the results achieved with different hyperspectral images have been compared with the ones obtained with a recently published field programmable gate array (FPGA)-based implementation of the PCA algorithm, providing, for the first time in the literature, a comprehensive analysis that highlights the pros and cons of each option.

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. GlideInWMS and its components are now also deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  16. Parsimonious classification of binary lacunarity data computed from food surface images using kernel principal component analysis and artificial neural networks.

    Science.gov (United States)

    Iqbal, Abdullah; Valous, Nektarios A; Sun, Da-Wen; Allen, Paul

    2011-02-01

    Lacunarity quantifies the degree of spatial heterogeneity in the visual texture of imagery through the identification of the relationships between patterns and their spatial configurations in a two-dimensional setting. The computed lacunarity data can designate a mathematical index of spatial heterogeneity; therefore the corresponding feature vectors should possess the necessary inter-class statistical properties that would enable them to be used for pattern recognition purposes. The objective of this study is to construct a supervised parsimonious classification model of binary lacunarity data, computed by Valous et al. (2009) from pork ham slice surface images, with the aid of kernel principal component analysis (KPCA) and artificial neural networks (ANNs), using a portion of informative salient features. At first, the dimension of the initial space (510 features) was reduced by 90% in order to avoid any noise effects in the subsequent classification. Then, using KPCA, the first nineteen kernel principal components (99.04% of total variance) were extracted from the reduced feature space and were used as input to the ANN. An adaptive feedforward multilayer perceptron (MLP) classifier was employed to obtain a suitable mapping from the input dataset. The correct classification percentages for the training, test and validation sets were 86.7%, 86.7%, and 85.0%, respectively. The results confirm that the classification performance was satisfactory. The binary lacunarity spatial metric captured relevant information that provided a good level of differentiation among pork ham slice images. Copyright © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.
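
    A minimal sketch of the KPCA-plus-MLP pipeline described above, using scikit-learn stand-ins on synthetic data; the RBF kernel, network size and the random features are assumptions, not the authors' settings.

```python
# Sketch of a KPCA -> ANN classification pipeline on placeholder data.
# The paper used 510 binary-lacunarity features per ham-slice image; X and y
# below are synthetic stand-ins, and the RBF kernel is an assumption.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X = np.random.rand(180, 510)          # 180 images x 510 lacunarity features
y = np.random.randint(0, 3, size=180) # three hypothetical quality classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0)

model = make_pipeline(
    KernelPCA(n_components=19, kernel="rbf"),   # 19 kernel PCs, as in the paper
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```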

  17. Evaluation of gamma dose effects on microcomputing components; Evaluation des effets de la dose gamma sur les composants micro-informatiques

    Energy Technology Data Exchange (ETDEWEB)

    Joffre, F

    1996-12-31

    Robotics in hostile environments raises the problem of the resistance of microcomputing components to cumulative gamma radiation dose. The current aim is to reach a dose of 3000 grays with industrial components. A methodology and an instrumentation adapted to testing this type of component have been developed. The aim of this work is to present the advantages and disadvantages associated with the use of industrial components in the presence of gamma radiation. After an analysis of the criteria used to justify the technological choices, the different steps that characterize the selection and assessment methodology are explained. The irradiation and measurement facilities now operational are described. Moreover, the supply aspects of the components chosen for the design of an industrialized system are taken into account. This selection and assessment work contributes to the development and design of computers for civil nuclear robotics. (O.M.). 7 refs.

  18. An efficient rhythmic component expression and weighting synthesis strategy for classifying motor imagery EEG in a brain computer interface

    Science.gov (United States)

    Wang, Tao; He, Bin

    2004-03-01

    The recognition of mental states during motor imagery tasks is crucial for EEG-based brain computer interface research. We have developed a new algorithm by means of a frequency decomposition and weighting synthesis strategy for recognizing imagined right- and left-hand movements. A frequency range from 5 to 25 Hz was divided into 20 band bins for each trial, and the corresponding envelopes of filtered EEG signals for each trial were extracted as a measure of instantaneous power at each frequency band. The dimensionality of the feature space was reduced from 200 (corresponding to 2 s) to 3 by down-sampling of the envelopes of the feature signals and subsequently applying principal component analysis. The linear discriminant analysis algorithm was then used to classify the features, due to its generalization capability. Each frequency band bin was weighted by a function determined according to the classification accuracy during the training process. The present classification algorithm was applied to a dataset of nine human subjects, and achieved a success rate of classification of 90% in training and 77% in testing. These promising results suggest that the present classification algorithm can be used in initiating general-purpose mental state recognition based on motor imagery tasks.
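
    The following sketch traces the same processing chain on synthetic data: band-pass filtering, envelope extraction, down-sampling, PCA to three dimensions, and LDA. It covers a single frequency bin for brevity; the filter order, sampling rate and data are assumptions, not the paper's settings.

```python
# Band-envelope feature pipeline sketch: band-pass filter, Hilbert envelope,
# down-sample, PCA to 3 dims, then LDA. Synthetic single-channel EEG trials.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

fs = 100                                   # sampling rate (Hz), assumed
trials = np.random.randn(60, 2 * fs)       # 60 trials x 2 s of one channel
labels = np.random.randint(0, 2, size=60)  # imagined left vs right hand

def band_envelope(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.abs(hilbert(filtfilt(b, a, x)))   # instantaneous power proxy

# One 1-Hz-wide bin of the 5-25 Hz range used in the paper (e.g. 10-11 Hz).
env = np.array([band_envelope(t, 10.0, 11.0, fs) for t in trials])
env = env[:, ::20]                          # down-sample the envelope

features = PCA(n_components=3).fit_transform(env)
clf = LinearDiscriminantAnalysis().fit(features, labels)
print("training accuracy:", clf.score(features, labels))
```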

  19. Application of kernel principal component analysis and computational machine learning to exploration of metabolites strongly associated with diet.

    Science.gov (United States)

    Shiokawa, Yuka; Date, Yasuhiro; Kikuchi, Jun

    2018-02-21

    Computer-based technological innovation provides advancements in sophisticated and diverse analytical instruments, enabling massive amounts of data collection with relative ease. This is accompanied by a fast-growing demand for technological progress in data mining methods for analysis of big data derived from chemical and biological systems. From this perspective, use of a general "linear" multivariate analysis alone limits interpretations due to "non-linear" variations in metabolic data from living organisms. Here we describe a kernel principal component analysis (KPCA)-incorporated analytical approach for extracting useful information from metabolic profiling data. To overcome the limitation of important variable (metabolite) determinations, we incorporated a random forest conditional variable importance measure into our KPCA-based analytical approach to demonstrate the relative importance of metabolites. Using a market basket analysis, hippurate, the most important variable detected in the importance measure, was associated with high levels of some vitamins and minerals present in foods eaten the previous day, suggesting a relationship between increased hippurate and intake of a wide variety of vegetables and fruits. Therefore, the KPCA-incorporated analytical approach described herein enabled us to capture input-output responses, and should be useful not only for metabolic profiling but also for profiling in other areas of biological and environmental systems.
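
    A hedged sketch of the combination described above: KPCA scores computed from a (synthetic) metabolite matrix, then a random-forest importance ranking of the input variables. scikit-learn's permutation importance stands in for the conditional variable importance measure used in the paper.

```python
# KPCA plus random-forest variable-importance sketch on synthetic data.
# Permutation importance is a simpler stand-in for the paper's conditional
# importance measure; all data and parameters here are illustrative.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

X = np.random.rand(120, 40)            # 120 samples x 40 metabolite signals
scores = KernelPCA(n_components=2, kernel="rbf").fit_transform(X)

# Rank metabolites by how well they predict the first kernel PC.
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, scores[:, 0])
imp = permutation_importance(rf, X, scores[:, 0], n_repeats=10, random_state=0)
top = np.argsort(imp.importances_mean)[::-1][:5]
print("most influential metabolite indices:", top)
```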

  20. Experimental and computational correlation of fracture parameters KIc, JIc, and GIc for unimodular and bimodular graphite components

    Science.gov (United States)

    Bhushan, Awani; Panda, S. K.

    2018-05-01

    The influence of bimodularity (different stress-strain behaviour in tension and compression) on the fracture behaviour of graphite specimens has been studied with fracture toughness (KIc), critical J-integral (JIc) and critical strain energy release rate (GIc) as the characterizing parameters. The bimodularity index (ratio of tensile Young's modulus to compressive Young's modulus) of the graphite specimens has been obtained from the normalized test data of tensile and compression experimentation. Single edge notch bend (SENB) testing of pre-cracked specimens from the same lot has been carried out as per ASTM standard D7779-11 to determine the peak load and the critical fracture parameters KIc, GIc and JIc, using digital image correlation technology for crack opening displacements. Weibull weakest-link theory has been used to evaluate the mean peak load, Weibull modulus and goodness of fit, employing the two-parameter least-squares method (LIN2) and the biased (MLE2-B) and unbiased (MLE2-U) maximum likelihood estimators. The stress-dependent elasticity problem of three-dimensional crack progression behaviour for the bimodular graphite components has been solved as an iterative finite element procedure. The crack-characterizing parameters, critical stress intensity factor and critical strain energy release rate, have been estimated with the help of the Weibull distribution plot of peak load versus cumulative probability of failure. Experimental and computational fracture parameters have been compared qualitatively to describe the significance of bimodularity. The bimodular influence on the fracture behaviour of SENB graphite has been reflected in the experimental evaluation of GIc values only, which have been found to be different from the calculated JIc values. Numerical evaluation of the bimodular 3D J-integral value is found to be close to the GIc value, whereas the unimodular 3D J-value is nearer to the JIc value. The significant difference between the unimodular JIc and bimodular GIc indicates that
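
    For the least-squares (LIN2-style) step, a two-parameter Weibull fit of peak loads by median-rank regression can be sketched as follows; the load values are synthetic, not the paper's data.

```python
# Two-parameter Weibull fit of SENB peak loads by median-rank linear
# regression (a LIN2-style estimate); the loads below are synthetic.
import numpy as np

loads = np.sort(np.array([1.92, 2.05, 2.11, 2.20, 2.26, 2.31, 2.40, 2.52]))  # kN
n = loads.size
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)       # median-rank failure prob.

# Linearized Weibull CDF: ln(-ln(1-F)) = m*ln(load) - m*ln(eta)
x = np.log(loads)
y = np.log(-np.log(1.0 - F))
m, c = np.polyfit(x, y, 1)                         # slope = Weibull modulus
eta = np.exp(-c / m)                               # scale (characteristic load)
print(f"Weibull modulus m = {m:.2f}, characteristic load = {eta:.2f} kN")
```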

  1. REVIEW OF THE GOVERNING EQUATIONS, COMPUTATIONAL ALGORITHMS, AND OTHER COMPONENTS OF THE MODELS-3 COMMUNITY MULTISCALE AIR QUALITY (CMAQ) MODELING SYSTEM

    Science.gov (United States)

    This article describes the governing equations, computational algorithms, and other components entering into the Community Multiscale Air Quality (CMAQ) modeling system. This system has been designed to approach air quality as a whole by including state-of-the-science capabiliti...

  2. Fabrication of optical fiber micro(and nano)-optical and photonic devices and components, using computer controlled spark thermo-pulling system

    International Nuclear Information System (INIS)

    Fatemi, H.; Mosleh, A.; Pashmkar, M.; Khaksar Kalati, A.

    2007-01-01

    The fabrication of optical fiber micro- (and nano-) optical components and devices, as well as those applicable for photonic purposes, is described, demonstrating the practical capabilities and characterization of the previously reported computer-controlled spark thermo-pulling fabrication system.

  3. Users guide for WoodCite, a product cost quotation tool for wood component manufacturers [computer program]

    Science.gov (United States)

    Jeff Palmer; Adrienn Andersch; Jan Wiedenbeck; Urs. Buehlmann

    2014-01-01

    WoodCite is a Microsoft® Access-based application that allows wood component manufacturers to develop product price quotations for their current and potential customers. The application was developed by the U.S. Forest Service and Virginia Polytechnic Institute and State University, in cooperation with the Wood Components Manufacturers Association.

  4. Methodology for the computational simulation of the components in photovoltaic systems; Desarrollo de herramientas para la prediccion del comportamiento de sistemas fotovoltaicos

    Energy Technology Data Exchange (ETDEWEB)

    Galimberti, P.; Arcuri, G.; Manno, R.; Fasulo, A. J.

    2004-07-01

    This work presents a methodology for the computational simulation of the components that comprise photovoltaic systems, in order to study the behavior of each component and its relevance to the operation of the whole system, which would allow decisions to be made in the selection process of these components and their improvement. As a result of the simulation, files with values of the different variables that characterize the behaviour of the components are obtained. Different kinds of plots can be drawn, which show the information in summarized form. Finally, the results are discussed by comparison with actual data for the city of Rio Cuarto in Argentina (33.1° south latitude), and some advantages of the proposed method are mentioned. (Author)
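
    A toy example of the kind of component model such a simulation chains together: module output power as a function of irradiance and cell temperature. All coefficients below are illustrative assumptions, not values from the paper.

```python
# Toy PV component model: module output from irradiance and cell temperature,
# using the standard NOCT cell-temperature estimate. Coefficients invented.
import numpy as np

P_STC = 250.0      # module rating at standard test conditions (W), assumed
GAMMA = -0.004     # power temperature coefficient (1/degC), assumed
NOCT = 45.0        # nominal operating cell temperature (degC), assumed

def module_power(g, t_amb):
    """Power (W) for irradiance g (W/m^2) and ambient temperature (degC)."""
    t_cell = t_amb + (NOCT - 20.0) * g / 800.0          # standard NOCT model
    return P_STC * (g / 1000.0) * (1.0 + GAMMA * (t_cell - 25.0))

# Hour-by-hour output for a clear-sky-like irradiance profile.
hours = np.arange(6, 19)
g = 1000.0 * np.sin(np.pi * (hours - 6) / 12.0) ** 2    # synthetic profile
for h, gi in zip(hours, g):
    print(f"{h:02d}:00  {module_power(gi, 25.0):7.1f} W")
```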

  5. ACDOS1: a computer code to calculate dose rates from neutron activation of neutral beamlines and other fusion-reactor components

    International Nuclear Information System (INIS)

    Keney, G.S.

    1981-08-01

    A computer code has been written to calculate neutron induced activation of neutral-beam injector components and the corresponding dose rates as a function of geometry, component composition, and time after shutdown. The code, ACDOS1, was written in FORTRAN IV to calculate both activity and dose rates for up to 30 target nuclides and 50 neutron groups. Sufficient versatility has also been incorporated into the code to make it applicable to a variety of general activation problems due to neutrons of energy less than 20 MeV
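
    The core activation-and-decay estimate that a code like ACDOS1 automates over many nuclides and neutron groups can be sketched for a single nuclide as follows; the inputs are illustrative, not from the code's data libraries.

```python
# Single-nuclide activation-and-decay estimate: saturation activity under
# constant flux, then decay after shutdown. Inputs are illustrative only.
import math

def activity_bq(n_atoms, sigma_cm2, flux, t_irr_s, t_cool_s, half_life_s):
    """A = N*sigma*phi*(1 - exp(-lambda*t_irr)) * exp(-lambda*t_cool)."""
    lam = math.log(2.0) / half_life_s
    saturation = n_atoms * sigma_cm2 * flux
    return saturation * (1.0 - math.exp(-lam * t_irr_s)) * math.exp(-lam * t_cool_s)

# Example: 1e22 target atoms, 1 barn cross-section, 1e10 n/cm^2/s flux,
# 30 days irradiation, 1 day cooling, 5-year half-life.
A = activity_bq(1e22, 1e-24, 1e10, 30 * 86400, 86400, 5 * 365.25 * 86400)
print(f"activity after cooling: {A:.3e} Bq")
```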

  6. Use of EPICS and Python technology for the development of a computational toolkit for high heat flux testing of plasma facing components

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, Ritesh, E-mail: ritesh@ipr.res.in; Swamy, Rajamannar, E-mail: rajamannar@ipr.res.in; Khirwadkar, Samir, E-mail: sameer@ipr.res.in

    2016-11-15

    Highlights: • An integrated approach to software development for computational processing and experimental control. • Use of open-source, cross-platform, robust and advanced tools for computational code development. • Prediction of optimized process parameters for the critical heat flux model. • Virtual experimentation for high heat flux testing of plasma facing components. - Abstract: The high heat flux testing and characterization of the divertor and first wall components of a tokamak pose a challenging engineering problem. These components are subject to steady-state and transient heat loads of high magnitude. Therefore, the accurate prediction and control of the cooling parameters is crucial to prevent burnout. The prediction of the cooling parameters is based on the numerical solution of the critical heat flux (CHF) model. In a test facility for high heat flux testing of plasma facing components (PFCs), the integration of computation and experimental control is an essential requirement. The Experimental Physics and Industrial Control System (EPICS) provides powerful tools for steering controls, data simulation, hardware interfacing and wider usability. Python provides an open-source alternative for numerical computation and scripting. We have integrated these two open-source technologies to develop graphical software for a typical high heat flux experiment. The implementation uses EPICS-based tools, namely the IOC (I/O controller) server and Control System Studio (CSS), and Python-based tools, namely NumPy, SciPy, Matplotlib and NOSE. EPICS and Python are integrated using the PyEpics library. This toolkit is currently in operation at the high heat flux test facility at the Institute for Plasma Research (IPR) and is also useful for experimental labs working in similar research areas. The paper reports the software architectural design, the implementation tools and the rationale for their selection, and test and validation.

  7. Use of EPICS and Python technology for the development of a computational toolkit for high heat flux testing of plasma facing components

    International Nuclear Information System (INIS)

    Sugandhi, Ritesh; Swamy, Rajamannar; Khirwadkar, Samir

    2016-01-01

    Highlights: • An integrated approach to software development for computational processing and experimental control. • Use of open-source, cross-platform, robust and advanced tools for computational code development. • Prediction of optimized process parameters for the critical heat flux model. • Virtual experimentation for high heat flux testing of plasma facing components. - Abstract: The high heat flux testing and characterization of the divertor and first wall components of a tokamak pose a challenging engineering problem. These components are subject to steady-state and transient heat loads of high magnitude. Therefore, the accurate prediction and control of the cooling parameters is crucial to prevent burnout. The prediction of the cooling parameters is based on the numerical solution of the critical heat flux (CHF) model. In a test facility for high heat flux testing of plasma facing components (PFCs), the integration of computation and experimental control is an essential requirement. The Experimental Physics and Industrial Control System (EPICS) provides powerful tools for steering controls, data simulation, hardware interfacing and wider usability. Python provides an open-source alternative for numerical computation and scripting. We have integrated these two open-source technologies to develop graphical software for a typical high heat flux experiment. The implementation uses EPICS-based tools, namely the IOC (I/O controller) server and Control System Studio (CSS), and Python-based tools, namely NumPy, SciPy, Matplotlib and NOSE. EPICS and Python are integrated using the PyEpics library. This toolkit is currently in operation at the high heat flux test facility at the Institute for Plasma Research (IPR) and is also useful for experimental labs working in similar research areas. The paper reports the software architectural design, the implementation tools and the rationale for their selection, and test and validation.
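
    The EPICS/Python coupling that both records describe rests on the PyEpics PV interface; a minimal sketch follows, with hypothetical process-variable names rather than IPR's actual ones.

```python
# Minimal PyEpics sketch of computation-plus-control coupling.
# PV names below are hypothetical, not the facility's actual channels.
from epics import PV

flow_sp   = PV("HHF:COOL:FLOW_SP")    # hypothetical coolant-flow set-point PV
surface_t = PV("HHF:PFC:SURF_TEMP")   # hypothetical surface-temperature PV

def raise_flow_if_hot(temp_limit_c: float) -> None:
    """Nudge the coolant flow up whenever the monitored temperature is high."""
    temp, flow = surface_t.get(), flow_sp.get()   # caget over Channel Access
    if temp is not None and flow is not None and temp > temp_limit_c:
        flow_sp.put(flow * 1.1)                   # caput a 10% higher set-point

raise_flow_if_hot(450.0)
```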

  8. Strength and Reliability of Wood for the Components of Low-cost Wind Turbines: Computational and Experimental Analysis and Applications

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan

    2009-01-01

    This paper reports the latest results of the comprehensive program of experimental and computational analysis of strength and reliability of wooden parts of low-cost wind turbines. The possibilities of prediction of strength and reliability of different types of wood are studied in a series of experiments and computational investigations. Low-cost testing machines have been designed and employed for the systematic analysis of different sorts of Nepali wood to be used for wind turbine construction. At the same time, computational micromechanical models of deformation and strength of wood are developed, which should provide the basis for microstructure-based correlation of observable and service properties of wood. Some correlations between microstructure, strength and service properties of wood have been established.

  9. An exploratory study of the Work Ability Index (WAI) and its components in a group of computer workers.

    Science.gov (United States)

    Costa, Ana Filipa; Puga-Leal, Rogério; Nunes, Isabel L

    2011-01-01

    The objective of this paper is to present a study assessing the work ability of a group of aging computer workers. The study was developed with the goal of creating a decision-making framework oriented towards maintaining the health and working ability of aging workers. Fifty computer workers participated in this study; they were administrative secretaries and computer technicians working mainly with office computers. The method used to assess work ability was the Work Ability Index (WAI). 78% of the participants had good or excellent work ability and only 2% a poor one. The average WAI score was 40.5 (SD=5.761; min=27; max=49). This study confirms the decrease in work ability of workers with aging. The group's overall work ability was slightly higher than the reference values developed by the Finnish Institute of Occupational Health. The assessment of work ability is fundamental to making age-friendly workplaces, and the WAI is one tool designed to perform such an assessment. The results obtained could assist the early identification of situations where employees are struggling with their work ability, thus helping to prioritize ergonomic interventions devoted to improving working conditions and allowing the continued employment of aging workers in their current jobs.

  10. Association between diabetes and different components of coronary atherosclerotic plaque burden as measured by coronary multidetector computed tomography.

    Science.gov (United States)

    Yun, Chun-Ho; Schlett, Christopher L; Rogers, Ian S; Truong, Quynh A; Toepker, Michael; Donnelly, Patrick; Brady, Thomas J; Hoffmann, Udo; Bamberg, Fabian

    2009-08-01

    The aim of the study was to assess differences in the presence, extent, and composition of coronary atherosclerotic plaque burden as detected by coronary multidetector computed tomography (MDCT) between patients with and without diabetes mellitus. We compared coronary atherosclerotic plaques (any plaque, calcified [CAP], non-calcified [NCAP], and mixed plaque [MCAP])...

  11. Visibility and accessibility of a component-based approach for ubiquitous computing applications: the e-gadgets case

    NARCIS (Netherlands)

    Mavrommati, I.; Kameas, A.; Markopoulos, P.

    2003-01-01

    The paper firstly presents the concepts and infrastructure developed within the extrovert-Gadgets research project, which enable end-users to realize Ubiquitous Computing applications. Then, it discusses user-acceptance considerations of the proposed concepts based on the outcome of an early

  12. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services.

    Science.gov (United States)

    Castaño-Díez, Daniel

    2017-06-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and long-term software maintenance.

  13. Development of GPU Based Parallel Computing Module for Solving Pressure Equation in the CUPID Component Thermo-Fluid Analysis Code

    International Nuclear Information System (INIS)

    Lee, Jin Pyo; Joo, Han Gyu

    2010-01-01

    In the thermo-fluid analysis code named CUPID, a linear system of pressure equations must be solved at each iteration step. The time spent repeatedly solving the linear system can be quite significant because large sparse matrices of rank greater than 50,000 are involved and the diagonal dominance of the system hardly holds. Therefore, parallelization of the linear system solver is essential to reduce the computing time. Meanwhile, Graphics Processing Units (GPUs) have been developed as highly parallel, multi-core processors to meet the global demand for high-quality 3D graphics. If a suitable interface is provided, GPU parallelization becomes available to engineering computing. NVIDIA provides a Software Development Kit (SDK) named CUDA (Compute Unified Device Architecture) to code developers so that they can manage GPUs for parallelization using the C language. In this research, we implement parallel routines for the linear system solver using CUDA and examine the performance of the parallelization. In the next section, we describe the method of CUDA parallelization for the CUPID code, and then the performance of the CUDA parallelization is discussed.
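
    As a CPU-side sketch of the kind of pressure solve being parallelized, the following assembles a sparse 2D Poisson-type system of rank 62,500 and solves it with a Jacobi-preconditioned conjugate-gradient iteration in SciPy; a CUDA implementation would replace these calls with GPU kernels or a GPU array library. The matrix is a synthetic stand-in for CUPID's pressure system.

```python
# Jacobi-preconditioned CG solve of a sparse 2D Poisson-type system (SciPy).
# Synthetic stand-in for the pressure system; a GPU port would swap backends.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg, LinearOperator

n = 250                                       # 250x250 grid -> rank 62,500
I = sp.identity(n)
T = sp.diags([-1, 4, -1], [-1, 0, 1], shape=(n, n))
S = sp.diags([-1, -1], [-1, 1], shape=(n, n))
A = (sp.kron(I, T) + sp.kron(S, I)).tocsr()   # 5-point Laplacian stencil
b = np.random.rand(n * n)

inv_diag = 1.0 / A.diagonal()                 # Jacobi preconditioner
M = LinearOperator(A.shape, matvec=lambda x: inv_diag * x)

x, info = cg(A, b, M=M, maxiter=1000)
print("converged" if info == 0 else f"cg info = {info}",
      "| residual:", np.linalg.norm(b - A @ x))
```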

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  17. Computer analysis of an adiabatic Stirling cryocooler using a two-phase two-component working fluid

    International Nuclear Information System (INIS)

    Renfroe, D.A.; Cheung, C.M.

    1992-01-01

    This paper describes the performance and behavior of a Stirling cryocooler incorporating a working fluid composed of helium and nitrogen. At the operating temperature of the cryocooler (80 K), the nitrogen component will condense in the freezer section. It is shown that the phase change in the working fluid increased both the heat lifted for a given size and weight of machine and the coefficient of performance. The magnitude of these effects was dependent on the mass ratio of nitrogen to helium, the phase angle between the compression and expansion processes, and the ratio of the compression space volume to the expansion space volume. The optimum heat-lifted performance was obtained for a mass ratio of four parts of nitrogen to one part of helium, a phase angle of approximately 100 degrees, and a volume ratio of two, which resulted in a 75% increase in heat lifted over the single-phase, 90-degree phase angle configuration. The coefficient of performance showed a 20% improvement.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which was accessible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M.-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M.-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  1. Impact of reduced-radiation dual-energy protocols using 320-detector row computed tomography for analyzing urinary calculus components: initial in vitro evaluation.

    Science.gov (United States)

    Cai, Xiangran; Zhou, Qingchun; Yu, Juan; Xian, Zhaohui; Feng, Youzhen; Yang, Wencai; Mo, Xukai

    2014-10-01

    To evaluate the impact of reduced-radiation dual-energy (DE) protocols using 320-detector row computed tomography on the differentiation of urinary calculus components. A total of 58 urinary calculi were placed into the same phantom and underwent DE scanning with 320-detector row computed tomography. Each calculus was scanned 4 times with the DE protocols using 135 kV and 80 kV tube voltage and different tube current combinations, including 100 mA and 570 mA (group A), 50 mA and 290 mA (group B), 30 mA and 170 mA (group C), and 10 mA and 60 mA (group D). The acquisition data of all 4 groups were then analyzed by stone DE analysis software, and the results were compared with x-ray diffraction analysis. Noise, contrast-to-noise ratio, and radiation dose were compared. Calculi were correctly identified in 56 of 58 stones (96.6%) using group A and B protocols. However, only 35 stones (60.3%) and 16 stones (27.6%) were correctly diagnosed using group C and D protocols, respectively. Mean noise increased significantly and mean contrast-to-noise ratio decreased significantly from groups A to D. The group B protocol thus permitted accurate calculus component analysis while reducing patient radiation exposure to 1.81 mSv. Further reduction of tube currents may compromise diagnostic accuracy. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Acetabular component positioning in total hip arthroplasty with and without a computer-assisted system: a prospective, randomized and controlled study.

    Science.gov (United States)

    Gurgel, Henrique M C; Croci, Alberto T; Cabrita, Henrique A B A; Vicente, José Ricardo N; Leonhardt, Marcos C; Rodrigues, João Carlos

    2014-01-01

    In a study of the acetabular component in total hip arthroplasty, 20 hips were operated on using imageless navigation and 20 hips were operated on using the conventional method. The position of the acetabular component was evaluated with computed tomography, measuring the operative anteversion and the operative inclination and determining the cases inside Lewinnek's safe zone. The results were similar in all the analyses: a mean anteversion of 17.4° in the navigated group and 14.5° in the control group (P=.215); a mean inclination of 41.7° and 42.2° (P=.633); a mean deviation from the desired anteversion (15°) of 5.5° and 6.6° (P=.429); a mean deviation from the desired inclination of 3° and 3.2° (P=.783); and location inside the safe zone of 90% and 80% (P=.661). Tomographic analyses of the acetabular component position were thus similar whether imageless navigation or the conventional technique was used. Published by Elsevier Inc.

  3. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  10. Mitigating component performance variation

    Science.gov (United States)

    Gara, Alan G.; Sylvester, Steve S.; Eastep, Jonathan M.; Nagappan, Ramkumar; Cantalupo, Christopher M.

    2018-01-09

    Apparatus and methods may provide for characterizing a plurality of similar components of a distributed computing system based on a maximum safe operation level associated with each component, storing the characterization data in a database, and allocating non-uniform power to each similar component, based at least in part on the characterization data in the database, to substantially equalize performance of the components.
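
    A toy sketch of the allocation idea in this abstract, under invented numbers: power is distributed inversely to each component's measured performance-per-watt, capped at its maximum safe level, so that nominally identical parts deliver roughly equal performance.

```python
# Toy non-uniform power allocation to equalize performance across nominally
# identical components. Efficiency figures and limits are invented.

parts = {                      # perf-per-watt characterization, max safe watts
    "node0": {"perf_per_watt": 1.00, "max_w": 300.0},
    "node1": {"perf_per_watt": 0.92, "max_w": 300.0},
    "node2": {"perf_per_watt": 1.07, "max_w": 300.0},
}
BUDGET_W = 840.0

# Equal performance => power inversely proportional to perf-per-watt,
# capped at each part's maximum safe operation level.
weights = {k: 1.0 / v["perf_per_watt"] for k, v in parts.items()}
total = sum(weights.values())
alloc = {k: min(BUDGET_W * w / total, parts[k]["max_w"]) for k, w in weights.items()}

for k, w in alloc.items():
    print(f"{k}: {w:6.1f} W -> perf {w * parts[k]['perf_per_watt']:6.1f}")
```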

  11. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  12. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  14. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  15. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  18. Design of an EEG-based brain-computer interface (BCI) from standard components running in real-time under Windows.

    Science.gov (United States)

    Guger, C; Schlögl, A; Walterspacher, D; Pfurtscheller, G

    1999-01-01

    An EEG-based brain-computer interface (BCI) is a direct connection between the human brain and the computer. Such a communication system is needed by patients with severe motor impairments (e.g. late stage of Amyotrophic Lateral Sclerosis) and has to operate in real-time. This paper describes the selection of the appropriate components to construct such a BCI and focuses also on the selection of a suitable programming language and operating system. The multichannel system runs under Windows 95, equipped with a real-time Kernel expansion to obtain reasonable real-time operations on a standard PC. Matlab controls the data acquisition and the presentation of the experimental paradigm, while Simulink is used to calculate the recursive least square (RLS) algorithm that describes the current state of the EEG in real-time. First results of the new low-cost BCI show that the accuracy of differentiating imagination of left and right hand movement is around 95%.
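
    A minimal recursive-least-squares (RLS) tracker of the kind the paper computes in Simulink, here in plain NumPy on a synthetic signal; the AR-model framing, order and forgetting factor are assumptions rather than the authors' settings.

```python
# Minimal recursive-least-squares (RLS) estimator tracking adaptive AR
# coefficients sample by sample. Pure NumPy; parameters are illustrative.
import numpy as np

def rls_track(x, order=4, lam=0.99, delta=100.0):
    """Track AR coefficients of signal x with forgetting factor lam."""
    w = np.zeros(order)                    # current AR coefficient estimate
    P = delta * np.eye(order)              # inverse correlation matrix
    coeffs = np.zeros((len(x), order))
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]           # most recent samples, newest first
        k = P @ u / (lam + u @ P @ u)      # gain vector
        e = x[n] - w @ u                   # a-priori prediction error
        w = w + k * e                      # coefficient update
        P = (P - np.outer(k, u @ P)) / lam # covariance update
        coeffs[n] = w
    return coeffs

coeffs = rls_track(np.sin(0.2 * np.arange(500)) + 0.05 * np.random.randn(500))
print(coeffs[-1])                          # settled AR(4) coefficients
```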

  19. Incidence of traumatic carotid and vertebral artery dissections: results of cervical vessel computed tomography angiogram as a mandatory scan component in severely injured patients

    Directory of Open Access Journals (Sweden)

    Schicho A

    2018-01-01

    Full Text Available Andreas Schicho,1 Lukas Luerken,1 Ramona Meier,1 Antonio Ernstberger,2 Christian Stroszczynski,1 Andreas Schreyer,1 Lena-Marie Dendl,1 Stephan Schleder1 1Department of Radiology, 2Department of Trauma Surgery, University Medical Center, Regensburg, Germany Purpose: The aim of this study was to evaluate the true incidence of cervical artery dissections (CeADs) in trauma patients with an Injury Severity Score (ISS) of ≥16, since a head-and-neck computed tomography angiogram (CTA) is not a compulsory component of whole-body trauma computed tomography (CT) protocols. Patients and methods: A total of 230 consecutive trauma patients with an ISS of ≥16 admitted to our Level I trauma center during a 24-month period were prospectively included. Standardized whole-body CT in a 256-detector row scanner included a head-and-neck CTA. Incidence, mortality, patient and trauma characteristics, and concomitant injuries were recorded and analyzed retrospectively in patients with carotid artery dissection (CAD) and vertebral artery dissection (VAD). Results: Of the 230 patients included, 6.5% had a CeAD, 5.2% had a CAD, and 1.7% had a VAD. One patient had both CAD and VAD. For both CAD and VAD, mortality was 25%. One death was caused by fatal cerebral ischemia due to high-grade CAD. A total of 41.6% of the patients with traumatic CAD and 25% of the patients with VAD had neurological sequelae. Conclusion: Mandatory head-and-neck CTA yields a higher CeAD incidence than reported before. We highly recommend the compulsory inclusion of a head-and-neck CTA in whole-body CT routines for severely injured patients. Keywords: polytrauma, carotid artery, vertebral artery, dissection, blunt trauma, computed tomography angiogram

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4-times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  1. Computing the stresses and deformations of the human eye components due to a high explosive detonation using fluid-structure interaction model.

    Science.gov (United States)

    Karimi, Alireza; Razaghi, Reza; Navidbakhsh, Mahdi; Sera, Toshihiro; Kudo, Susumu

    2016-05-01

    Despite the fact that the eye comprises only a very small part of the human body's surface area, eye injuries due to detonation have recently increased dramatically. Although many efforts have been devoted to measuring injury of the globe, there is still a lack of knowledge about the injury mechanism due to the Primary Blast Wave (PBW). The goal of this study was to determine the stresses and deformations of the human eye components, including the cornea, aqueous, iris, ciliary body, lens, vitreous, retina, sclera, optic nerve, and muscles, attributed to a PBW induced by a trinitrotoluene (TNT) explosion, via a Lagrangian-Eulerian computational coupling model. Magnetic Resonance Imaging (MRI) was employed to establish a Finite Element (FE) model of the human eye according to a normal human eye. The solid components of the eye were modelled as a Lagrangian mesh, while the explosive TNT, air domain, and aqueous were modelled using an Arbitrary Lagrangian-Eulerian (ALE) mesh. Nonlinear dynamic FE simulations were accomplished using the explicit FE code LS-DYNA. In order to simulate the blast wave generation, propagation, and interaction with the eye, the ALE formulation with the Jones-Wilkins-Lee (JWL) equation defining the explosive material was employed. The results revealed a peak stress of 135.70 kPa brought about by the detonation surge on the cornea at a distance of 25 cm. The highest von Mises stress was observed in the sclera (267.3 kPa), whereas the lowest was seen in the vitreous body (0.002 kPa). The results also showed a relatively high resultant displacement for the macula as well as a high variation in the radius of curvature of the cornea and lens, which can result in macular holes, optic nerve damage and, consequently, vision loss. These results may have implications not only for understanding the magnitude of stresses and strains in the human eye components but also for giving an outlook on the process by which a PBW triggers damage to the eye. Copyright © 2016 Elsevier Ltd

  2. A three-field model of transient 3D multiphase, three-component flow for the computer code IV A3. Pt. 1

    International Nuclear Information System (INIS)

    Kolev, N.I.

    1991-12-01

    This work contains a description of the physical and mathematical basis on which the IVA3 computer code relies. After describing the state of the art of 3D modeling for transient multiphase flows, the model assumptions and the modeling technique used in IVA3 are described. Starting with the principles of conservation of mass, momentum, and energy, the non-averaged conservation equations are derived for each of the velocity fields, which consist of different isothermal components. Thereafter averaging is applied and the working form of the system of 21 partial differential equations is derived. Special attention is paid to the strict consistency of the modeling technique used in IVA3 with the second law of thermodynamics. The entropy concept used is derived starting with the unaveraged conservation equations and subsequent averaging. The source terms of the entropy production are carefully defined and the final form of the averaged entropy equation is given, ready for direct practical application. The idea of strong analytical thermodynamic coupling between the pressure field and changes of the other thermodynamic properties, used for the first time in 3D multi-fluid modeling, is presented in detail. After obtaining the working form of the conservation equations, the discretization procedure and the reduction to algebraic problems are presented. The mathematical solution method, together with some information about the architecture of IVA3 including the local momentum decoupling and accuracy control, is presented as well. (orig./GL) [de
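
    As a sketch of the kind of averaged balance law described here (the generic multi-fluid form below is standard and is not quoted from IVA3 itself), the averaged mass conservation equation for velocity field l reads

        \frac{\partial}{\partial t}(\alpha_l \rho_l) + \nabla \cdot (\alpha_l \rho_l \mathbf{v}_l) = \mu_l,

    where \alpha_l, \rho_l, and \mathbf{v}_l are the volume fraction, density, and velocity of field l, and \mu_l collects the mass sources from phase change and entrainment; the analogous averaged momentum and entropy equations for the three fields, plus component concentration equations, build up a system of the size quoted above.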

  3. Ginger components as new leads for the design and development of novel multi-targeted anti-Alzheimer’s drugs: a computational investigation

    Directory of Open Access Journals (Sweden)

    Azam F

    2014-10-01

    Full Text Available Faizul Azam,1,2 Abdualrahman M Amer,1 Abdullah R Abulifa,1 Mustafa M Elzwawi1 1Faculty of Pharmacy, Misurata University, Misurata, Libya; 2Department of Pharmaceutical Chemistry, Nims Institute of Pharmacy, Nims University, Jaipur, Rajasthan, India Abstract: Ginger (Zingiber officinale), despite being a common dietary adjunct that contributes to the taste and flavor of foods, is well known to contain a number of potentially bioactive phytochemicals having valuable medicinal properties. Although recent studies have emphasized their benefits in Alzheimer’s disease, limited information is available on the possible mechanism by which they render anti-Alzheimer activity. Therefore, the present study employs molecular docking studies to investigate the binding interactions between active ginger components and various anti-Alzheimer drug targets. The Lamarckian genetic algorithm methodology was employed for docking of 12 ligands with 13 different target proteins using the AutoDock 4.2 program. The docking protocol was validated by re-docking of all native co-crystallized ligands into their original binding cavities, exhibiting a strong correlation coefficient (r2=0.931) between experimentally reported and docking-predicted activities. This value suggests that the approach could be a promising computational tool to aid optimization of lead compounds obtained from ginger. Analysis of binding energy, predicted inhibition constant, and hydrophobic/hydrophilic interactions of ligands with target receptors revealed acetylcholinesterase as the most promising target, while c-Jun N-terminal kinase was recognized as the least favorable anti-Alzheimer’s drug target. Common structural requirements include a hydrogen bond donor/acceptor area, a hydrophobic domain, a carbon spacer, and a distal hydrophobic domain flanked by hydrogen bond donor/acceptor moieties. In addition, drug-likeness score and molecular properties responsible for a good pharmacokinetic profile were calculated

  4. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into the design of future computers in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  5. Other components

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    This chapter includes descriptions of electronic and mechanical components which do not merit a chapter to themselves. Other hardware requires mention because of particularly high tolerance or intolerance of exposure to radiation. A more systematic analysis of radiation responses of structures which are definable by material was given in section 3.8. The components discussed here are field effect transistors, transducers, temperature sensors, magnetic components, superconductors, mechanical sensors, and miscellaneous electronic components

  6. Electronic components

    CERN Document Server

    Colwell, Morris A

    1976-01-01

    Electronic Components provides a basic grounding in the practical aspects of using and selecting electronics components. The book describes the basic requirements needed to start practical work on electronic equipment, resistors and potentiometers, capacitance, and inductors and transformers. The text discusses semiconductor devices such as diodes, thyristors and triacs, transistors and heat sinks, logic and linear integrated circuits (I.C.s) and electromechanical devices. Common abbreviations applied to components are provided. Constructors and electronics engineers will find the book useful

  7. Impact test of components

    International Nuclear Information System (INIS)

    Borsoi, L.; Buland, P.; Labbe, P.

    1987-01-01

    Stops with gaps are currently used to support components and piping: it is simple, low cost, efficient and permits free thermal expansion. In order to keep the nonlinear nature of stops, such design is often modeled by beam elements (for the component) and nonlinear springs (for the stops). This paper deals with the validity and the limits of these models through the comparison of computational and experimental results. The experimental results come from impact laboratory tests on a simplified mockup. (orig.)

  8. Analysis of Computing Technology on Principal Components Method in SPSS

    Institute of Scientific and Technical Information of China (English)

    王春枝

    2011-01-01

    In view of the many misunderstandings found in current textbooks and published articles that apply SPSS software to principal component analysis, this paper analyzes the basic principles and the mathematical process of the method and, on that basis, uses a worked example to demonstrate how principal component analysis is carried out with SPSS.
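
    A minimal NumPy sketch of the computation such an article walks through (illustrative only, not SPSS syntax; the last step reflects a frequently discussed pitfall, namely that SPSS's FACTOR procedure reports loadings rather than the eigenvector weights needed for component scores):

import numpy as np

def principal_components(X):
    # Standardize, then eigendecompose the correlation matrix.
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    R = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]            # sort by descending variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    loadings = eigvecs * np.sqrt(eigvals)        # SPSS-style "component matrix"
    weights = loadings / np.sqrt(eigvals)        # recover the eigenvector weights
    scores = Z @ weights                         # principal component scores
    return eigvals, loadings, scores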

  9. Electronic computer prediction of the type of the crystallization reaction of binary compounds on the base of electronic structure of components

    International Nuclear Information System (INIS)

    Kutolin, S.A.; Kotyukov, V.I.

    1979-01-01

    Criteria are presented that permit prediction of dystectic and peritectic crystallization reactions in narrow and wide homogeneity regions of binary compounds AnBm. The criteria were found by identifying the crystallization reaction type for one hundred known binary compounds. The arguments of the predicting functions are Chebyshev coefficients whose polynomials describe the energy distribution of the s-, p-, and d-bands of the corresponding compound components in the condensed state, together with the Fermi energy of the compound components and the composition. The relative error in predicting a dystectic (peritectic) crystallization reaction is 6%; for a wide homogeneity region it is not more than 12 rel.% (for a narrow one, up to 1 at.%). Prediction results are presented for most A3B and AB3 binary compounds as well as for rare earth compounds of various compositions

  10. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  11. Direct FEM-computation of load carrying capacity of highly loaded passive components; Direkte FEM - Berechnung der Tragfaehigkeit hochbeanspruchter passiver Komponenten

    Energy Technology Data Exchange (ETDEWEB)

    Staat, M; Heitzer, M [Forschungszentrum Juelich GmbH (Germany). Inst. fuer Sicherheitsforschung und Reaktortechnik

    1998-11-01

    Detailed, inelastic FEM analyses yield accurate information about the stresses and deformations in passive components. The local loading conditions, however, cannot be directly compared with a load carrying limit in the structural mechanics sense. Concentrating on the load carrying capacity simplifies the analysis. Based on plasticity theory, limit and shakedown analyses calculate the carrying loads directly and exactly. The paper explains the implementation of the limit and shakedown theorems in a general FEM program and the direct calculation of the load carrying capacities of passive components. The concepts used are explained with respect to common structural analysis. Examples involving high local stresses illustrate the application of FEM-based limit and shakedown analyses. The calculated interaction diagrams give good insight into the admissible operational loads of individual passive components. The load carrying analysis also opens up a structure-mechanics-based approach to assessing the load-to-collapse of cracked components made of highly ductile, fracture-resistant material. (orig./CB)

  12. Euler principal component analysis

    NARCIS (Netherlands)

    Liwicki, Stephan; Tzimiropoulos, Georgios; Zafeiriou, Stefanos; Pantic, Maja

    Principal Component Analysis (PCA) is perhaps the most prominent learning tool for dimensionality reduction in pattern recognition and computer vision. However, the ℓ2-norm employed by standard PCA is not robust to outliers. In this paper, we propose a kernel PCA method for fast and robust PCA,

  13. Isolation, characterization, spectroscopic properties and quantum chemical computations of an important phytoalexin resveratrol as antioxidant component from Vitis labrusca L. and their chemical compositions

    Science.gov (United States)

    Güder, Aytaç; Korkmaz, Halil; Gökce, Halil; Alpaslan, Yelda Bingöl; Alpaslan, Gökhan

    2014-12-01

    In this study, isolation and characterization of trans-resveratrol (RES) as an antioxidant compound were carried out from VLE, VLG and VLS. Furthermore, antioxidant activities were evaluated by using six different methods. Finally, total phenolic, flavonoid, ascorbic acid, anthocyanin, lycopene, β-carotene and vitamin E contents were determined. In addition, the FT-IR, 13C and 1H NMR chemical shifts and UV-vis spectra of trans-resveratrol were experimentally recorded. Quantum chemical computations such as the molecular geometry, vibrational frequencies, UV-vis spectroscopic parameters, HOMO-LUMO energies, molecular electrostatic potential (MEP), natural bond orbitals (NBO) and nonlinear optics (NLO) properties of the title molecule have been calculated by using the DFT/B3PW91 method with the 6-311++G(d,p) basis set in the ground state for the first time. The obtained results show that the calculated spectroscopic data are in good agreement with the experimental data.

  14. The effect of macromolecular crowding on the electrostatic component of barnase-barstar binding: a computational, implicit solvent-based study.

    Directory of Open Access Journals (Sweden)

    Helena W Qi

    Full Text Available Macromolecular crowding within the cell can impact both protein folding and binding. Earlier models of cellular crowding focused on the excluded volume, entropic effect of crowding agents, which generally favors compact protein states. Recently, other effects of crowding have been explored, including enthalpically-related crowder-protein interactions and changes in solvation properties. In this work, we explore the effects of macromolecular crowding on the electrostatic desolvation and solvent-screened interaction components of protein-protein binding. Our simple model enables us to focus exclusively on the electrostatic effects of water depletion on protein binding due to crowding, providing us with the ability to systematically analyze and quantify these potentially intuitive effects. We use the barnase-barstar complex as a model system and randomly placed, uncharged spheres within implicit solvent to model crowding in an aqueous environment. On average, we find that the desolvation free energy penalties incurred by partners upon binding are lowered in a crowded environment and solvent-screened interactions are amplified. At a constant crowder density (fraction of total available volume occupied by crowders), this effect generally increases as the radius of model crowders decreases, but the strength and nature of this trend can depend on the water probe radius used to generate the molecular surface in the continuum model. In general, there is huge variation in desolvation penalties as a function of the random crowder positions. Results with explicit model crowders can be qualitatively similar to those using a lowered "effective" solvent dielectric to account for crowding, although the "best" effective dielectric constant will likely depend on multiple system properties. Taken together, this work systematically demonstrates, quantifies, and analyzes qualitative intuition-based insights into the effects of water depletion due to crowding on the

  15. Smart time-pulse coding photoconverters as basic components 2D-array logic devices for advanced neural networks and optical computers

    Science.gov (United States)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Michalnichenko, Nikolay N.

    2004-04-01

    The article deals with a concept for building arithmetic-logic devices (ALD) with a 2D structure and optical 2D-array inputs-outputs as advanced high-productivity parallel basic operational training modules for realization of basic operations of continuous, neuro-fuzzy, multilevel, threshold and other logics and vector-matrix, vector-tensor procedures in neural networks, which consists in the use of a time-pulse coding (TPC) architecture and 2D-array smart optoelectronic pulse-width (or pulse-phase) modulators (PWM or PPM) for transformation of input pictures. The input grayscale image is transformed into a group of corresponding short optical pulses or time positions of an optical two-level signal swing. We consider optoelectronic implementations of universal (quasi-universal) picture elements of two-valued ALD, multi-valued ALD, analog-to-digital converters, and multilevel threshold discriminators, and we show that 2D-array time-pulse photoconverters are the base elements for these devices. We show simulation results for the time-pulse photoconverters as base components. The devices considered have the following technical parameters: input optical signal power of 200 nW to 200 μW (for a photodiode responsivity of 0.5 A/W), conversion time from tens of microseconds to a millisecond, supply voltage of 1.5-15 V, power consumption from tens of microwatts to a milliwatt, and conversion nonlinearity of less than 1%. One cell consists of 2-3 photodiodes and about ten CMOS transistors. This simplicity of the cells allows their integration in arrays of 32x32, 64x64 elements and more.
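
    A toy Python sketch of the pulse-width coding idea (illustration of the encoding principle only, not the authors' optoelectronic circuit; t_max and the 8-bit level count are assumed parameters):

import numpy as np

def pulse_width_encode(image, t_max=1.0, levels=255):
    # Map each pixel's gray level linearly to a pulse duration in [0, t_max].
    return t_max * np.asarray(image, dtype=float) / levels

def pulse_width_decode(widths, t_max=1.0, levels=255):
    # Inverse mapping from pulse duration back to gray level.
    return np.rint(levels * np.asarray(widths) / t_max).astype(int)

img = np.array([[0, 128], [200, 255]])
assert (pulse_width_decode(pulse_width_encode(img)) == img).all()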

  16. A comparative study of the morphometry of sperm head components in cattle, sheep, and pigs with a computer-assisted fluorescence method

    Directory of Open Access Journals (Sweden)

    Jesús L Yániz

    2016-01-01

    Full Text Available The aim of this study was to compare the sperm nuclear and acrosomal morphometry of three species of domestic artiodactyls: cattle (Bos taurus), sheep (Ovis aries), and pigs (Sus scrofa). Semen smears of twenty ejaculates from each species were fixed and labeled with a propidium iodide-Pisum sativum agglutinin (PI/PSA) combination. Digital images of the sperm nucleus, acrosome, and whole sperm head were captured and analyzed. The use of the PI/PSA combination and CASA-Morph fluorescence-based method allowed the capture, morphometric analysis, and differentiation of most sperm nuclei, acrosomes and whole heads, and the assessment of acrosomal integrity with a high precision in the three species studied. For the size of the head and nuclear area, the relationship between the three species may be summarized as bull > ram > boar. However, for the other morphometric parameters (length, width, and perimeter), there were differences in the relationships between species for sperm nuclei and whole sperm heads. Bull sperm acrosomes were clearly smaller than those in the other species studied and covered a smaller proportion of the sperm head. The acrosomal morphology, small in the bull, large and broad in the sheep, and large, long, and with a pronounced equatorial segment curve in the boar, was species-characteristic. It was concluded that there are clear variations in the size and shape of the sperm head components between the three species studied, the acrosome being the structure showing the most variability, allowing a clear distinction of the spermatozoa of each species.

  17. Principal components

    NARCIS (Netherlands)

    Hallin, M.; Hörmann, S.; Piegorsch, W.; El Shaarawi, A.

    2012-01-01

    Principal Components are probably the best known and most widely used of all multivariate analysis techniques. The essential idea consists in performing a linear transformation of the observed k-dimensional variables in such a way that the new variables are vectors of k mutually orthogonal

  18. Estimation of the binding modes with important human cytochrome P450 enzymes, drug interaction potential, pharmacokinetics, and hepatotoxicity of ginger components using molecular docking, computational, and pharmacokinetic modeling studies.

    Science.gov (United States)

    Qiu, Jia-Xuan; Zhou, Zhi-Wei; He, Zhi-Xu; Zhang, Xueji; Zhou, Shu-Feng; Zhu, Shengrong

    2015-01-01

    Ginger is one of the most commonly used herbal medicines for the treatment of numerous ailments and improvement of body functions. It may be used in combination with prescribed drugs. The coadministration of ginger with therapeutic drugs raises a concern of potential deleterious drug interactions via the modulation of the expression and/or activity of drug-metabolizing enzymes and drug transporters, resulting in unfavorable therapeutic outcomes. This study aimed to determine the molecular interactions between 12 main active ginger components (6-gingerol, 8-gingerol, 10-gingerol, 6-shogaol, 8-shogaol, 10-shogaol, ar-curcumene, β-bisabolene, β-sesquiphelandrene, 6-gingerdione, (-)-zingiberene, and methyl-6-isogingerol) and human cytochrome P450 (CYP) 1A2, 2C9, 2C19, 2D6, and 3A4 and to predict the absorption, distribution, metabolism, excretion, and toxicity (ADMET) of the 12 ginger components using computational approaches and a comprehensive literature search. Docking studies showed that ginger components interacted with a panel of amino acids in the active sites of CYP1A2, 2C9, 2C19, 2D6, and 3A4, mainly through hydrogen bond formation and, to a lesser extent, via π-π stacking. The pharmacokinetic simulation studies showed that the [I]/[Ki] value for CYP2C9, 2C19, and 3A4 ranged from 0.0002 to 19.6 and the R value ranged from 1.0002 to 20.6, and that ginger might exhibit a high risk of drug interaction via inhibition of the activity of human CYP2C9 and CYP3A4, but a low risk of drug interaction toward CYP2C19-mediated drug metabolism. Furthermore, the 12 ginger components were evaluated to possess favorable ADMET profiles with regard to solubility, absorption, permeability across the blood-brain barrier, interactions with CYP2D6, hepatotoxicity, and plasma protein binding. The validation results showed that there was no remarkable effect of ginger on the metabolism of warfarin in humans, whereas concurrent use of ginger and nifedipine exhibited a
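
    The quoted ranges are consistent with the standard static prediction for reversible inhibition, R = 1 + [I]/[Ki] (an inference from the numbers above, not an equation quoted from the paper); a one-line check in Python:

def predicted_auc_ratio(i_over_ki):
    # Basic static model for reversible CYP inhibition: R = 1 + [I]/Ki.
    return 1.0 + i_over_ki

# Endpoints quoted in the abstract: [I]/Ki of 0.0002 and 19.6
print(predicted_auc_ratio(0.0002), predicted_auc_ratio(19.6))  # 1.0002 20.6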

  19. Magnetic compatibility of standard components for electrical installations: Computation of the background field and consequences on the design of the electrical distribution boards and control boards for the ITER Tokamak building

    International Nuclear Information System (INIS)

    Benfatto, I.; Bettini, P.; Cavinato, M.; Lorenzi, A. De; Hourtoule, J.; Serra, E.

    2005-01-01

    Inside the proposed Tokamak building, the ITER poloidal field magnet system would produce a stray magnetic field of up to 70 mT. This is a very unusual environmental condition for electrical installation equipment, and limited information is available on the magnetic compatibility of standard components for electrical distribution boards and control boards. Because this information is a necessary input for the design of the electrical installation inside the proposed ITER Tokamak building, specific investigations have been carried out by the ITER European Participant Team. The paper reports on the computation of the background magnetic field map inside the ITER Tokamak building and the consequences for the design of the electrical installations of this building. The effects of the steel inside the building structure and the feasibility of magnetic shields for electrical distribution boards and control boards are also reported in the paper. The results of the test campaigns on the magnetic field compatibility of standard components for electrical distribution boards and control boards are reported in companion papers published in these proceedings

  20. Multipass welding of nuclear reactor components - computations

    International Nuclear Information System (INIS)

    Hedblom, E.

    2002-01-01

    The finite element method is used to compare different welding procedures. The simulations are compared with measurements. Two different geometries and two different welding procedures are evaluated. It is found that a narrow gap weld gives smaller tensile residual axial stresses on the inside of the pipe. This is believed to reduce the risk for intergranular stress corrosion cracking

  1. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  2. Estimation of the binding modes with important human cytochrome P450 enzymes, drug interaction potential, pharmacokinetics, and hepatotoxicity of ginger components using molecular docking, computational, and pharmacokinetic modeling studies

    Directory of Open Access Journals (Sweden)

    Qiu JX

    2015-02-01

    Full Text Available Jia-Xuan Qiu,1,2 Zhi-Wei Zhou,3 Zhi-Xu He,4 Xueji Zhang,5 Shu-Feng Zhou,3 Shengrong Zhu1 1Department of Stomatology, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, Hubei, People’s Republic of China; 2Department of Oral and Maxillofacial Surgery, The First Affiliated Hospital of Nanchang University, Nanchang, Jiangxi, People’s Republic of China; 3Department of Pharmaceutical Sciences, College of Pharmacy, University of South Florida, Tampa, FL, USA; 4Guizhou Provincial Key Laboratory for Regenerative Medicine, Stem Cell and Tissue Engineering Research Center and Sino-US Joint Laboratory for Medical Sciences, Guiyang Medical University, Guiyang, Guizhou, People’s Republic of China; 5Research Center for Bioengineering and Sensing Technology, University of Science and Technology Beijing, Beijing, People’s Republic of China Abstract: Ginger is one of the most commonly used herbal medicines for the treatment of numerous ailments and improvement of body functions. It may be used in combination with prescribed drugs. The coadministration of ginger with therapeutic drugs raises a concern of potential deleterious drug interactions via the modulation of the expression and/or activity of drug-metabolizing enzymes and drug transporters, resulting in unfavorable therapeutic outcomes. This study aimed to determine the molecular interactions between 12 main active ginger components (6-gingerol, 8-gingerol, 10-gingerol, 6-shogaol, 8-shogaol, 10-shogaol, ar-curcumene, β-bisabolene, β-sesquiphelandrene, 6-gingerdione, (-)-zingiberene, and methyl-6-isogingerol) and human cytochrome P450 (CYP) 1A2, 2C9, 2C19, 2D6, and 3A4 and to predict the absorption, distribution, metabolism, excretion, and toxicity (ADMET) of the 12 ginger components using computational approaches and a comprehensive literature search. Docking studies showed that ginger components interacted with a panel of amino acids in the active sites of CYP1A

  3. Computers as Components Principles of Embedded Computing System Design

    CERN Document Server

    Wolf, Wayne

    2008-01-01

    This book was the first to bring essential knowledge on embedded systems technology and techniques under a single cover. This second edition has been updated to the state-of-the-art by reworking and expanding performance analysis with more examples and exercises, and coverage of electronic systems now focuses on the latest applications. Researchers, students, and savvy professionals schooled in hardware or software design, will value Wayne Wolf's integrated engineering design approach.The second edition gives a more comprehensive view of multiprocessors including VLIW and superscalar archite

  4. Passive Microwave Components and Antennas

    DEFF Research Database (Denmark)

    State-of-the-art microwave systems always require higher performance and lower cost microwave components. Constantly growing demands and performance requirements of industrial and scientific applications often make employing traditionally designed components impractical. For that reason, the design and development process remains a great challenge today. This problem motivated intensive research efforts in microwave design and technology, which is responsible for a great number of recently appeared alternative approaches to analysis and design of microwave components and antennas. This book highlights ... techniques. Modelling and computations in electromagnetics is a quite fast-growing research area. The recent interest in this field is caused by the increased demand for designing complex microwave components, modeling electromagnetic materials, and rapid increase in computational power for calculation...

  5. Developing a Model Component

    Science.gov (United States)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) was a Space Shuttle Orbiter support piece of the Ground Servicing Equipment (GSE). The initial purpose of the UCTS was to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The UCTS is designed with the capability of servicing future space vehicles, including all Space Station requirements necessary for the MPLM Modules. The Simulation uses GSE Models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at Kennedy Space Center (KSC), my assignment was to develop a model component for the UCTS. I was given a fluid component (dryer) to model in Simulink. I completed training for UNIX and Simulink. The dryer is a Catch All replaceable-core-type filter-dryer. The filter-dryer provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system. The filter-dryer also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. The filter-dryer was modeled by determining the effects it has on the pressure and velocity of the system. I used Bernoulli's Equation to calculate the pressure and velocity differential through the dryer. I created my filter-dryer model in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements. I participated in Simulation meetings and was involved in the subsystem design process and team collaborations. I gained valuable work experience and insight into a career path as an engineer.
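
    A minimal sketch of the Bernoulli-plus-continuity calculation described above (illustrative only; the density, diameters, and states below are hypothetical, and a real filter-dryer model would add an empirical pressure-loss term):

import math

RHO = 1050.0  # kg/m^3, assumed coolant density

def downstream_state(p1, v1, d1, d2, dz=0.0, g=9.81):
    # Continuity: A1*v1 = A2*v2 gives the downstream velocity.
    a1, a2 = math.pi * d1**2 / 4, math.pi * d2**2 / 4
    v2 = v1 * a1 / a2
    # Bernoulli (ideal, lossless): p1 + rho*v1^2/2 = p2 + rho*v2^2/2 + rho*g*dz
    p2 = p1 + 0.5 * RHO * (v1**2 - v2**2) - RHO * g * dz
    return p2, v2

print(downstream_state(p1=3.0e5, v1=2.0, d1=0.05, d2=0.04))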

  6. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  7. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  8. Component reliability for electronic systems

    CERN Document Server

    Bajenescu, Titu-Marius I

    2010-01-01

    The main reason for the premature breakdown of today's electronic products (computers, cars, tools, appliances, etc.) is the failure of the components used to build these products. Today professionals are looking for effective ways to minimize the degradation of electronic components to help ensure longer-lasting, more technically sound products and systems. This practical book offers engineers specific guidance on how to design more reliable components and build more reliable electronic systems. Professionals learn how to optimize a virtual component prototype, accurately monitor product reliability during the entire production process, and add the burn-in and selection procedures that are the most appropriate for the intended applications. Moreover, the book helps system designers ensure that all components are correctly applied, margins are adequate, wear-out failure modes are prevented during the expected duration of life, and system interfaces cannot lead to failure.

  9. Embedded 100 Gbps Photonic Components:

    Energy Technology Data Exchange (ETDEWEB)

    Kuznia, Charlie

    2018-04-26

    This innovation to fiber optic component technology increases the performance, reduces the size and reduces the power consumption of optical communications within dense network systems, such as advanced distributed computing systems and data centers. VCSEL technology is enabling short-reach (< 100 m) and >100 Gbps optical interconnections over multi-mode fiber in commercial applications.

  10. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general purpose computers, a cost that is high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  11. Component failure data handbook

    International Nuclear Information System (INIS)

    Gentillon, C.D.

    1991-04-01

    This report presents generic component failure rates that are used in reliability and risk studies of commercial nuclear power plants. The rates are computed using plant-specific data from published probabilistic risk assessments supplemented by selected other sources. Each data source is described. For rates with four or more separate estimates among the sources, plots show the data that are combined. The method for combining data from different sources is presented. The resulting aggregated rates are listed with upper bounds that reflect the variability observed in each rate across the nuclear power plant industry. Thus, the rates are generic. Both per hour and per demand rates are included. They may be used for screening in risk assessments or for forming distributions to be updated with plant-specific data
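
    A minimal sketch of the kind of aggregation described (hypothetical numbers; the handbook's actual combination method is the one documented in the report itself):

import statistics

def aggregate_rate(estimates, upper_q=0.95):
    # Pool per-hour (or per-demand) rate estimates from several sources:
    # a central value plus an upper bound reflecting the observed
    # across-industry variability.
    central = statistics.median(estimates)
    upper = statistics.quantiles(estimates, n=100)[int(upper_q * 100) - 1]
    return central, upper

print(aggregate_rate([1.2e-6, 3.0e-6, 8.5e-7, 5.6e-6, 2.1e-6]))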

  12. Model reduction by weighted Component Cost Analysis

    Science.gov (United States)

    Kim, Jae H.; Skelton, Robert E.

    1990-01-01

    Component Cost Analysis considers any given system driven by a white noise process as an interconnection of different components, and assigns a metric called 'component cost' to each component. These component costs measure the contribution of each component to a predefined quadratic cost function. A reduced-order model of the given system may be obtained by deleting those components that have the smallest component costs. The theory of Component Cost Analysis is extended to include finite-bandwidth colored noises. The results also apply when actuators have dynamics of their own. Closed-form analytical expressions of component costs are also derived for a mechanical system described by its modal data. This is very useful to compute the modal costs of very high order systems. A numerical example for MINIMAST system is presented.
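
    A sketch of the basic (unweighted, white-noise) component cost computation for a stable linear system x' = Ax + Bw, y = Cx with cost V = E[y'Qy] (standard CCA setup; the paper's weighted, colored-noise extension is not reproduced here):

import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def component_costs(A, B, C, Q):
    # Steady-state state covariance X solves A X + X A' + B B' = 0.
    X = solve_continuous_lyapunov(A, -B @ B.T)
    M = X @ C.T @ Q @ C
    costs = np.diag(M)                            # cost of each state component
    assert np.isclose(costs.sum(), np.trace(M))   # costs sum to the total cost V
    return costs

A = np.array([[-1.0, 0.2], [0.0, -3.0]])
B = C = Q = np.eye(2)
print(component_costs(A, B, C, Q))  # delete the states with the smallest costs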

  13. Research Advances: DNA Computing Targets West Nile Virus, Other Deadly Diseases, and Tic-Tac-Toe; Marijuana Component May Offer Hope for Alzheimer's Disease Treatment; New Wound Dressing May Lead to Maggot Therapy--Without the Maggots

    Science.gov (United States)

    King, Angela G.

    2007-01-01

    This article presents three reports of research advances. The first report describes a deoxyribonucleic acid (DNA)-based computer that could lead to faster, more accurate tests for diagnosing West Nile Virus and bird flu. Representing the first "medium-scale integrated molecular circuit," it is the most powerful computing device of its type to…

  14. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. ""Quantum computing"" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  15. Behavior Protocols for Software Components

    Czech Academy of Sciences Publication Activity Database

    Plášil, František; Višňovský, Stanislav

    2002-01-01

    Roč. 28, č. 11 (2002), s. 1056-1076 ISSN 0098-5589 R&D Projects: GA AV ČR IAA2030902; GA ČR GA201/99/0244 Grant - others:Eureka(XE) Pepita project no.2033 Institutional research plan: AV0Z1030915 Keywords : behavior protocols * component-based programming * software architecture Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.170, year: 2002

  16. Principal component regression analysis with SPSS.

    Science.gov (United States)

    Liu, R X; Kuang, J; Gong, Q; Hou, X L

    2003-06-01

    The paper introduces the indices used for multicollinearity diagnosis, the basic principle of principal component regression, and the method for determining the 'best' equation. The paper uses an example to describe how to do principal component regression analysis with SPSS 10.0, including all calculation steps of the principal component regression and all operations of the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity. A simplified, faster, and accurate statistical analysis is achieved through principal component regression with SPSS.
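
    A minimal NumPy sketch of the procedure the paper walks through (plain Python rather than SPSS syntax; the number of retained components k is a user choice):

import numpy as np

def pcr(X, y, k):
    # Standardize X and take the top-k principal components.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    eigvals, eigvecs = np.linalg.eigh(np.corrcoef(X, rowvar=False))
    V = eigvecs[:, np.argsort(eigvals)[::-1][:k]]
    T = Xs @ V                                   # component scores
    # Ordinary least squares of y on the scores (with intercept).
    G = np.column_stack([np.ones(len(y)), T])
    gamma, *_ = np.linalg.lstsq(G, y, rcond=None)
    # Map the coefficients back to the standardized original variables.
    return gamma[0], V @ gamma[1:]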

  17. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  18. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future

  19. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  20. Runtime Concepts of Hierarchical Software Components

    Czech Academy of Sciences Publication Activity Database

    Bureš, Tomáš; Hnětynka, P.; Plášil, František

    2007-01-01

    Roč. 8, special (2007), s. 454-463 ISSN 1525-9293 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : component-based development * hierarchical components * connectors * controllers * runtime environment Subject RIV: JC - Computer Hardware ; Software

  1. Component effects in mixture experiments

    International Nuclear Information System (INIS)

    Piepel, G.F.

    1980-01-01

    In a mixture experiment, the response to a mixture of q components is a function of the proportions x1, x2, ..., xq of components in the mixture. Experimental regions for mixture experiments are often defined by constraints on the proportions of the components forming the mixture. The usual (orthogonal direction) definition of a factor effect does not apply because of the dependence imposed by the mixture restriction, x1 + x2 + ... + xq = 1. A direction within the experimental region in which to compute a mixture component effect is presented and compared to previously suggested directions. This new direction has none of the inadequacies or errors of previous suggestions while having a more meaningful interpretation. The distinction between partial and total effects is made. The uses of partial and total effects (computed using the new direction) in modification and interpretation of mixture response prediction equations are considered. The suggestions of the paper are illustrated in an example from a glass development study in a waste vitrification program. 5 figures, 3 tables
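
    To illustrate the constraint itself (not the paper's own adjusted direction, which differs), the sketch below moves along a Cox-style direction: component i is raised and the remaining proportions are shrunk proportionally so the mixture still sums to 1:

import numpy as np

def cox_trace(x, i, deltas):
    # Each returned row is a valid mixture: the proportions sum to 1.
    x = np.asarray(x, dtype=float)
    points = []
    for d in deltas:
        xi = x[i] + d
        xnew = x * (1 - xi) / (1 - x[i])  # shrink the other components
        xnew[i] = xi
        points.append(xnew)
    return np.array(points)

pts = cox_trace([0.5, 0.3, 0.2], i=0, deltas=[-0.1, 0.0, 0.1])
print(pts, pts.sum(axis=1))  # rows sum to 1; evaluate a fitted model on pts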

  2. Initial Ada components evaluation

    Science.gov (United States)

    Moebes, Travis

    1989-01-01

    SAIC has the responsibility for independent test and validation of the SSE. They have been using a mathematical functions library package implemented in Ada to test the SSE IV and V process. The library package consists of elementary mathematical functions and is both machine and accuracy independent. The SSE Ada components evaluation includes code complexity metrics based on Halstead's software science metrics and McCabe's measure of cyclomatic complexity. Halstead's metrics are based on the number of operators and operands in a logical unit of code and are compiled from the number of distinct operators, distinct operands, and the total number of occurrences of operators and operands. These metrics give an indication of the physical size of a program in terms of operators and operands and are used diagnostically to point to potential problems. McCabe's Cyclomatic Complexity Metrics (CCM) are compiled from flow charts transformed to equivalent directed graphs. The CCM is a measure of the total number of linearly independent paths through the code's control structure. These metrics were computed for the Ada mathematical functions library using Software Automated Verification and Validation (SAVVAS), the SSE IV and V tool. A table with selected results was shown, indicating that most of these routines are of good quality. Thresholds for the Halstead measures indicate poor quality if the length metric exceeds 260 or difficulty is greater than 190. The McCabe CCM indicated a high quality of software products.
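
    A small Python sketch of the textbook formulas behind both families of metrics (illustrative; extracting the operator/operand tokens from Ada source is assumed to have been done elsewhere):

import math

def halstead(operators, operands):
    # n1/n2: distinct operators/operands; N1/N2: total occurrences.
    n1, n2 = len(set(operators)), len(set(operands))
    N1, N2 = len(operators), len(operands)
    length = N1 + N2                       # flag if > 260 (threshold per abstract)
    volume = length * math.log2(n1 + n2)
    difficulty = (n1 / 2) * (N2 / n2)      # flag if > 190 (threshold per abstract)
    return length, volume, difficulty

def cyclomatic(edges, nodes, components=1):
    # McCabe: V(G) = E - N + 2P for the control-flow graph.
    return edges - nodes + 2 * components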

  3. Research regarding reverse engineering for aircraft components

    Directory of Open Access Journals (Sweden)

    Udroiu Razvan

    2017-01-01

    Full Text Available Reverse engineering is a useful technique in the manufacturing and design process of new components. In the aerospace industry, new components can be developed based on existing components without technical Computer Aided Design (CAD) data, in order to reduce the development cycle of new products. This paper proposes a methodology whereby the CAD model of a turbine blade can be built using a computer-aided reverse engineering technique utilising a 5-axis Coordinate Measuring Machine (CMM). The proposed methodology uses a feature-based scanning strategy, followed by a design methodology for 3D modelling of complex shapes.

  4. Reusable Component Services

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Reusable Component Services (RCS) is a super-catalog of components, services, solutions and technologies that facilitates search, discovery and collaboration in...

  5. SOFA 2 Component Framework and Its Ecosystem

    Czech Academy of Sciences Publication Activity Database

    Malohlava, M.; Hnětynka, P.; Bureš, Tomáš

    2013-01-01

    Roč. 295, 9 May (2013), s. 101-106 ISSN 1571-0661. [FESCA 2012. International Workshop on Formal Engineering approaches to Software Components and Architectures /9./. Tallinn, 31.03.2012] R&D Projects: GA ČR GD201/09/H057 Grant - others:GA AV ČR(CZ) GAP202/11/0312; UK(CZ) SVV-2012-265312 Keywords : CBSE * component system * component model * component * sofa * ecosystem * development tool Subject RIV: JC - Computer Hardware ; Software

  6. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  7. Reactor component automatic grapple

    International Nuclear Information System (INIS)

    Greenaway, P.R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment. (author)

  8. Repurposing learning object components

    NARCIS (Netherlands)

    Verbert, K.; Jovanovic, J.; Gasevic, D.; Duval, E.; Meersman, R.

    2005-01-01

    This paper presents an ontology-based framework for repurposing learning object components. Unlike the usual practice where learning object components are assembled manually, the proposed framework enables on-the-fly access and repurposing of learning object components. The framework supports two

  9. Supply chain components

    OpenAIRE

    Vieraşu, T.; Bălăşescu, M.

    2011-01-01

    In this article I will go through three main logistics components, which are represented by: transportation, inventory and facilities, and the three secondary logistical components: information, production location, price and how they determine performance of any supply chain. I will discuss then how these components are used in the design, planning and operation of a supply chain. I will also talk about some obstacles a supply chain manager may encounter.

  10. Supply chain components

    Directory of Open Access Journals (Sweden)

    Vieraşu, T.

    2011-01-01

    Full Text Available In this article I will go through three main logistics components, which are represented by: transportation, inventory and facilities, and the three secondary logistical components: information, production location, price and how they determine performance of any supply chain. I will discuss then how these components are used in the design, planning and operation of a supply chain. I will also talk about some obstacles a supply chain manager may encounter.

  11. Control component retainer

    International Nuclear Information System (INIS)

    Walton, L.A.; King, R.A.

    1983-01-01

    An apparatus is described for retaining an undriven control component assembly disposed in a fuel assembly in a nuclear reactor of the type having a core grid plate. The first part of the mechanism involves a housing for the control component and the second part is a brace with a number of arms that reach under the grid plate. The brace and the housing are coupled together to firmly hold the control components in place even under strong flows of the coolant

  12. T-stresses for internally cracked components

    International Nuclear Information System (INIS)

    Fett, T.

    1997-12-01

    The failure of cracked components is governed by the stresses in the vicinity of the crack tip. The singular stress contribution is characterised by the stress intensity factor K; the first regular stress term is represented by the so-called T-stress. T-stress solutions for components containing an internal crack were computed by application of the Boundary Collocation Method (BCM). The results are compiled in the form of tables or approximate relations. In addition, a Green's function for the T-stress is proposed for internal cracks, which enables computation of T-stress terms for any given stress distribution in the uncracked body. (orig.) [de
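
    Schematically, such a Green's function t(x, a) is used by weighting the crack-line stress of the uncracked body and integrating over the crack, T = ∫ σ(x) t(x, a) dx (a sketch of the usage only; the actual kernels are the tables and approximations in the report, and the kernel below is a placeholder). For this placeholder, a uniform tension σ gives T = -σ, the known result for a Griffith crack under remote tension:

import numpy as np

def t_stress(sigma, green, a, n=200):
    # Integrate sigma(x) * t(x, a) over the internal crack (-a, a).
    x = np.linspace(-a, a, n)
    vals = sigma(x) * green(x, a)
    dx = x[1] - x[0]
    return float(np.sum((vals[:-1] + vals[1:]) / 2) * dx)  # trapezoid rule

# hypothetical inputs for illustration only
T = t_stress(sigma=lambda x: 100.0 * np.ones_like(x),
             green=lambda x, a: -np.ones_like(x) / (2 * a),
             a=0.01)
print(T)  # ≈ -100.0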

  13. Component design for LMFBR's

    International Nuclear Information System (INIS)

    Fillnow, R.H.; France, L.L.; Zerinvary, M.C.; Fox, R.O.

    1975-01-01

    Just as FFTF has prototype components to confirm their design, FFTF is serving as a prototype for the design of the commercial LMFBR's. Design and manufacture of critical components for the FFTF system have been accomplished primarily using vendors with little or no previous experience in supplying components for high temperature sodium systems. The exposure of these suppliers, and through them a multitude of subcontractors, to the requirements of this program has been a necessary and significant step in preparing American industry for the task of supplying the large mechanical components required for commercial LMFBR's

  14. Hot gas path component

    Science.gov (United States)

    Lacy, Benjamin Paul; Kottilingam, Srikanth Chandrudu; Porter, Christopher Donald; Schick, David Edward

    2017-09-12

    Various embodiments of the disclosure include a turbomachine component, and methods of forming such a component. Some embodiments include a turbomachine component including: a first portion including at least one of a stainless steel or an alloy steel; and a second portion joined with the first portion, the second portion including a nickel alloy including an arced cooling feature extending therethrough, the second portion having a thermal expansion coefficient substantially similar to a thermal expansion coefficient of the first portion, wherein the arced cooling feature is located within the second portion to direct a portion of a coolant to a leakage area of the turbomachine component.

  15. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  16. Tumor Size Evaluation according to the T Component of the Seventh Edition of the International Association for the Study of Lung Cancer's TNM Classification: Interobserver Agreement between Radiologists and Computer-Aided Diagnosis System in Patients with Lung Cancer

    International Nuclear Information System (INIS)

    Kim, Jin Kyoung; Chong, Se Min; Seo, Jae Seung; Lee, Sun Jin; Han, Heon

    2011-01-01

    To assess the interobserver agreement for tumor size evaluation between radiologists and the computer-aided diagnosis (CAD) system based on the 7th edition of the TNM classification by the International Association for the Study of Lung Cancer in patients with lung cancer. We evaluated 20 patients who underwent a lobectomy or pneumonectomy for primary lung cancer. The maximum diameter of each primary tumor was measured by two radiologists and a CAD system on CT, and was staged based on the 7th edition of the TNM classification. The CT size and T-staging of the primary tumors was compared with the pathologic size and staging and the variability in the sizes and T stages of primary tumors was statistically analyzed between each radiologist's measurement or CAD estimation and the pathologic results. There was no statistically significant interobserver difference for the CT size among the two radiologists, between pathologic and CT size estimated by the radiologists, and between pathologic and CT staging by the radiologists and CAD system. However, there was a statistically significant interobserver difference between pathologic size and the CT size estimated by the CAD system (p = 0.003). No significant differences were found in the measurement of tumor size among radiologists or in the assessment of T-staging by radiologists and the CAD system.

  17. Components of Sexual Identity

    Science.gov (United States)

    Shively, Michael G.; DeCecco, John P.

    1977-01-01

    This paper examines the four components of sexual identity: biological sex, gender identity, social sex-role, and sexual orientation. Theories about the development of each component and how they combine and conflict to form the individual's sexual identity are discussed. (Author)

  18. Towards Cognitive Component Analysis

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan

    2005-01-01

    Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent components analysis is relevant for representing...

  19. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and performance characteristics of such a hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics, using the identified objectives of computing, which can be used on any platform, for any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  20. The Component-Based Application for GAMESS

    Energy Technology Data Exchange (ETDEWEB)

    Peng, Fang [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    GAMESS, a quantum chemistry program for electronic structure calculations, has been freely shared by high-performance application scientists for over twenty years. It provides a rich set of functionalities and can be run on a variety of parallel platforms through a distributed data interface. While a chemistry computation is sophisticated and hard to develop, resource sharing among different chemistry packages will accelerate the development of new computations and encourage the cooperation of scientists from universities and laboratories. The Common Component Architecture (CCA) offers an environment that allows scientific packages to dynamically interact with each other through components, which enables dynamic coupling of GAMESS with other chemistry packages, such as MPQC and NWChem. Conceptually, a computation can be constructed with "plug-and-play" components from scientific packages, but this requires more than componentizing the functions/subroutines of interest, especially for large-scale scientific packages with a long development history. In this research, we present our efforts to construct components for GAMESS that conform to the CCA specification. The goal is to enable fine-grained interoperability between three quantum chemistry programs, GAMESS, MPQC and NWChem, via components. We focus on one of the three packages, GAMESS; we delineate the structure of GAMESS computations, followed by our approaches to its component development. Then we use GAMESS as the driver to interoperate integral components from the other two packages, and show the solutions for interoperability problems along with preliminary results. To justify the versatility of the design, the Tuning and Analysis Utility (TAU) components have been coupled with GAMESS and its components, so that the performance of GAMESS and its components may be analyzed for a wide range of system parameters.

  1. Coherent systems with multistate components

    International Nuclear Information System (INIS)

    Caldarola, L.

    1980-01-01

    The basic rules of Boolean algebra with restrictions on variables are briefly recalled. This special type of Boolean algebra allows one to handle fault trees of systems made of multistate (two or more than two states) components. Coherent systems are defined in the case of multistate components. This definition is consistent with that originally suggested by Barlow in the case of binary (two-state) components. The basic properties of coherence are described and discussed. Coherent Boolean functions are also defined. It is shown that these functions are irredundant, that is, they have only one base, which is at the same time complete and irredundant. However, irredundant functions are not necessarily coherent. Finally, a simplified algorithm for the calculation of the base of a coherent function is described. In the case that the function is not coherent, the algorithm can be used to reduce the size of the normal disjunctive form of the function. This in turn eases the application of the Nelson algorithm to calculate the complete base of the function. The simplified algorithm has been built into the computer program MUSTAFA-1. In a sample case the use of this algorithm reduced the CPU time by a factor of about 20. (orig.)
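
    For a coherent (monotone) function, the unique complete and irredundant base coincides with the family of minimal cut sets, so the base can be obtained by absorption alone. A minimal sketch of that absorption step follows; it is not the MUSTAFA-1 implementation, whose details are in the report.

```python
# Minimal sketch: compute the base of a coherent fault-tree function given as
# a disjunctive normal form (a list of cut sets) by deleting every cut set
# that properly contains another one (absorption).
def base_of_coherent_function(cut_sets):
    sets = list({frozenset(c) for c in cut_sets})   # deduplicate first
    # keep a cut set only if no other cut set is a proper subset of it
    return [s for s in sets if not any(o < s for o in sets)]

# Hypothetical 3-component example: {A, B, C} is absorbed by {A, B} and {C}.
print(base_of_coherent_function([{"A", "B"}, {"C"}, {"A", "B", "C"}]))
# -> e.g. [frozenset({'A', 'B'}), frozenset({'C'})]
```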

  2. Fusion-component lifetime analysis

    International Nuclear Information System (INIS)

    Mattas, R.F.

    1982-09-01

    A one-dimensional computer code has been developed to examine the lifetime of first-wall and impurity-control components. The code incorporates the operating and design parameters, the material characteristics, and the appropriate failure criteria for the individual components. The major emphasis of the modeling effort has been to calculate the temperature-stress-strain-radiation effects history of a component so that the synergistic effects between sputtering erosion, swelling, creep, fatigue, and crack growth can be examined. The general forms of the property equations are the same for all materials in order to provide the greatest flexibility for materials selection in the code. The individual coefficients within the equations are different for each material. The code is capable of determining the behavior of a plate, composed of either a single or dual material structure, that is either totally constrained or constrained from bending but not from expansion. The code has been utilized to analyze the first walls for FED/INTOR and DEMO and to analyze the limiter for FED/INTOR.

  3. Generating Multimedia Components for M-Learning

    Directory of Open Access Journals (Sweden)

    Adriana REVEIU

    2009-01-01

    The paper proposes a solution for generating template-based multimedia components for instruction and learning, available both for computer-based applications and for mobile devices. The field of research lies at the intersection of computer science, mobile tools and e-learning, and is generically named mobile learning, or M-learning. The research goal is to provide access to computer-based training resources from any location and to adapt the training content to the specific features of mobile devices, the communication environment, and users' preferences and knowledge. To become important tools in the field of education, the proposed technical solutions exploit the potential of mobile devices.

  4. GCS component development cycle

    Science.gov (United States)

    Rodríguez, Jose A.; Macias, Rosa; Molgo, Jordi; Guerra, Dailos; Pi, Marti

    2012-09-01

    The GTC is an optical-infrared 10-meter segmented mirror telescope at the ORM observatory in the Canary Islands (Spain). First light was on 13/07/2007 and since then it has been in the operation phase. The GTC control system (GCS) is a distributed object- and component-oriented system based on RT-CORBA, and it is responsible for the management and operation of the telescope, including its instrumentation. GCS has used the Rational Unified Process (RUP) in its development. RUP is an iterative software development process framework. After analysing (use cases) and designing (UML) any of the GCS subsystems, an initial component description of its interface is obtained, and from that information a component specification is written. In order to improve code productivity, GCS has adopted code generation to transform this component specification into the skeleton of component classes based on a software framework, called the Device Component Framework. Using the GCS development tools, based on javadoc and gcc, in only one step the component is generated, compiled and deployed to be tested for the first time through our GUI inspector. The main advantages of this approach are the following: it reduces the learning curve of new developers and the development error rate, allows a systematic use of design patterns in the development and software reuse, speeds up delivery of the software product, markedly improves timescales, design consistency and design quality, and eliminates the future refactoring process otherwise required for the code.
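
    The specification-to-skeleton step can be pictured with a toy generator. The sketch below is purely illustrative: the spec format, the `DeviceComponent` base class and the target language are invented stand-ins for the real Device Component Framework tooling described above.

```python
# Illustrative sketch only: generate a component class skeleton from a small
# interface specification, in the spirit of the GCS code-generation step.
# The spec format and DeviceComponent base class are hypothetical.
SPEC = {"name": "FocuserComponent",
        "methods": ["init", "moveTo", "halt"]}

def generate_skeleton(spec):
    lines = [f"class {spec['name']}(DeviceComponent):  # base class assumed"]
    for m in spec["methods"]:
        lines += [f"    def {m}(self):",
                  f"        raise NotImplementedError('{m}: to be filled in')"]
    return "\n".join(lines)

print(generate_skeleton(SPEC))   # skeleton ready for compile-deploy-test
```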

  5. 2-component heating systems

    Energy Technology Data Exchange (ETDEWEB)

    Radtke, W

    1987-03-01

    The knowledge accumulated only recently about damage to buildings and the hazards of formaldehyde, radon and hydrocarbons has led to louder calls for ventilation, which in turn explains why increasing importance is being attached to the controlled ventilation of buildings. Two-component heating systems provide fresh air and thermal comfort in one. While the first component blows fresh air directly and controllably into the rooms, the second component is similar to the Roman hypocaust heating systems, meaning that heated outer air circulates under the floor, thus providing warm surfaces and thermal comfort. Details concerning the two-component heating system are presented along with system diagrams, diagrams of the heating system and tables identifying the respective costs. Descriptions are given of the two system components, the fast heat-up, the two-component mode, the change of air, heat recovery and control systems. Comparative evaluations determine the differences between two-component heating systems and other heating systems. Concluding remarks are dedicated to energy conservation and comparative evaluations of costs. (HWJ).

  6. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer groups has been reorganized to take charge for the general purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM - connections to GSI and IPP, preparation for Datex-P). (orig.)

  7. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  8. All-optical reservoir computing.

    Science.gov (United States)

    Duport, François; Schneider, Bendix; Smerieri, Anteo; Haelterman, Marc; Massar, Serge

    2012-09-24

    Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.
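
    The defining trait of the paradigm, training only a linear readout on top of a fixed nonlinear recurrent system, is easy to show in a digital echo-state sketch. The paper's reservoir is an optical fiber loop with a semiconductor optical amplifier; the network below is a generic software analogue on an invented delayed-recall task.

```python
# Minimal echo-state-network sketch of the Reservoir Computing paradigm.
# The reservoir, task and all parameters are generic assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 1000                       # reservoir size, sequence length
u = rng.uniform(-1, 1, T)              # scalar input sequence
target = np.roll(u, 3)                 # task: recall the input 3 steps back

W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

x, states = np.zeros(N), np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])   # fixed nonlinear recurrent update
    states[t] = x

# Only the linear readout is trained (ridge regression), as in the paradigm.
washout = 100
A, y = states[washout:], target[washout:]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ y)
print("train MSE:", np.mean((A @ w_out - y) ** 2))
```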

  9. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  10. Verification of Software Components: Addressing Unbounded Parallelism

    Czech Academy of Sciences Publication Activity Database

    Adámek, Jiří

    2007-01-01

    Roč. 8, č. 2 (2007), s. 300-309 ISSN 1525-9293 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : software components * formal verification * unbounded parallelism Subject RIV: JC - Computer Hardware ; Software

  11. A refinement driven component-based design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2007-01-01

    the work on the Common Component Modelling Example (CoCoME). This gives evidence that the formal techniques developed in rCOS can be integrated into a model-driven development process and shows where it may be integrated in computer-aided software engineering (CASE) tools for adding formally supported...

  12. Pump Component Model in SPACE Code

    International Nuclear Information System (INIS)

    Kim, Byoung Jae; Kim, Kyoung Doo

    2010-08-01

    This technical report describes the pump component model in the SPACE code. A literature survey was made of pump models in existing system codes. The models embedded in the SPACE code were examined to check for conflict with intellectual property rights. Design specifications, computer coding implementation, and test results are included in this report.

  13. Replaceable LMFBR core components

    International Nuclear Information System (INIS)

    Evans, E.A.; Cunningham, G.W.

    1976-01-01

    Much progress has been made in understanding material and component performance in the high temperature, fast neutron environment of the LMFBR. Current data have provided strong assurance that the initial core component lifetime objectives of FFTF and CRBR can be met. At the same time, this knowledge translates directly into the need for improved core designs that utilize improved materials and advanced fuels required to meet objectives of low doubling times and extended core component lifetimes. An industrial base for the manufacture of quality core components has been developed in the US, and all procurements for the first two core equivalents for FFTF will be completed this year. However, the problem of fabricating recycled plutonium while dramatically reducing fabrication costs, minimizing personnel exposure, and protecting public health and safety must be addressed

  14. Explosive Components Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The 98,000 square foot Explosive Components Facility (ECF) is a state-of-the-art facility that provides a full range of chemical, material, and performance analysis...

  15. Component fragility research program

    International Nuclear Information System (INIS)

    Tsai, N.C.; Mochizuki, G.L.; Holman, G.S.

    1989-11-01

    To demonstrate how "high-level" qualification test data can be used to estimate the ultimate seismic capacity of nuclear power plant equipment, we assessed in detail various electrical components tested by the Pacific Gas & Electric Company for its Diablo Canyon plant. As part of our Phase I Component Fragility Research Program, we evaluated seismic fragility for five Diablo Canyon components: medium-voltage (4 kV) switchgear; safeguard relay board; emergency light battery pack; potential transformer; and station battery and racks. This report discusses our Phase II fragility evaluation of a single Westinghouse Type W motor control center column, a fan cooler motor controller, and three local starters at the Diablo Canyon nuclear power plant. These components were seismically qualified by means of biaxial random motion tests on a shaker table, and the test response spectra formed the basis for the estimate of the seismic capacity of the components. The seismic capacity of each component is referenced to the zero period acceleration (ZPA) and, in our Phase II study only, to the average spectral acceleration (ASA) of the motion at its base. For the motor control center, the seismic capacity was compared to the capacity of a Westinghouse Five-Star MCC subjected to actual fragility tests by LLNL during the Phase I Component Fragility Research Program, and to generic capacities developed by the Brookhaven National Laboratory for motor control centers. Except for the medium-voltage switchgear, all of the components considered in both our Phase I and Phase II evaluations were qualified in their standard commercial configurations or with only relatively minor modifications such as top bracing of cabinets. 8 refs., 67 figs., 7 tabs
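
    The two capacity measures named above can be illustrated numerically. In the sketch below the spectrum values, the high-frequency cutoff used for the ZPA and the averaging band used for the ASA are all assumptions for illustration; the report defines the measures actually used.

```python
# Numeric sketch of the two capacity measures: zero period acceleration (ZPA)
# and average spectral acceleration (ASA) of a test response spectrum.
# Spectrum values, cutoff and averaging band are invented.
import numpy as np

freq_hz = np.array([1, 2, 5, 10, 20, 33, 50])             # spectral frequencies
sa_g    = np.array([0.8, 1.6, 2.4, 1.9, 1.1, 0.6, 0.6])   # spectral accel. [g]

zpa = sa_g[freq_hz >= 33].min()                  # high-frequency asymptote
asa = sa_g[(freq_hz >= 2) & (freq_hz <= 20)].mean()  # assumed averaging band
print(f"ZPA ~ {zpa:.2f} g, ASA ~ {asa:.2f} g")
```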

  16. Refractory alloy component fabrication

    International Nuclear Information System (INIS)

    Young, W.R.

    1984-01-01

    Purpose of this report is to describe joining procedures, primarily welding techniques, which were developed to construct reliable refractory alloy components and systems for advanced space power systems. Two systems, the Nb-1Zr Brayton Cycle Heat Receiver and the T-111 Alloy Potassium Boiler Development Program, are used to illustrate typical systems and components. Particular emphasis is given to specific problems which were eliminated during the development efforts. Finally, some thoughts on application of more recent joining technology are presented. 78 figures

  17. Attacks on computer systems

    Directory of Open Access Journals (Sweden)

    Dejan V. Vuletić

    2012-01-01

    Computer systems are a critical component of human society in the 21st century. The economic sector, defense, security, energy, telecommunications, industrial production, finance and other vital infrastructure depend on computer systems that operate at local, national or global scales. A particular problem is that, due to the rapid development of ICT and the unstoppable growth of its application in all spheres of human society, their vulnerability and exposure to very serious potential dangers increase. This paper analyzes some typical attacks on computer systems.

  18. Structural analysis of nuclear components

    International Nuclear Information System (INIS)

    Ikonen, K.; Hyppoenen, P.; Mikkola, T.; Noro, H.; Raiko, H.; Salminen, P.; Talja, H.

    1983-05-01

    The report describes the activities accomplished in the project 'Structural Analysis Project of Nuclear Power Plant Components' during the years 1974-1982 in the Nuclear Engineering Laboratory at the Technical Research Centre of Finland. The objective of the project has been to develop Finnish expertise in structural mechanics related to nuclear engineering. The report describes the starting point of the research work, the organization of the project and the research activities in various subareas. Further, the work done with computer codes is described, as well as the problems to which the developed expertise has been applied. Finally, the diploma works, publications and work reports, which are mainly in Finnish, are listed to give a view of the content of the project. (author)

  19. Multiscale principal component analysis

    International Nuclear Information System (INIS)

    Akinduko, A A; Gorban, A N

    2014-01-01

    Principal component analysis (PCA) is an important tool in exploring data. The conventional approach to PCA leads to a solution which favours structures with large variances. This is sensitive to outliers and could obfuscate interesting underlying structures. One of the equivalent definitions of PCA is that it seeks the subspaces that maximize the sum of squared pairwise distances between data projections. This definition opens up more flexibility in the analysis of principal components, which is useful in enhancing PCA. In this paper we introduce scales into PCA by maximizing only the sum of squared pairwise distances between projections for pairs of data points with distances within a chosen interval of values [l,u]. The resulting principal component decompositions in Multiscale PCA depend on the point (l,u) in the plane, and for each point we define projectors onto principal components. Cluster analysis of these projectors reveals the structures in the data at various scales. Each structure is described by the eigenvectors at the medoid point of the cluster which represents the structure. We also use the distortion of projections as a criterion for choosing an appropriate scale, especially for data with outliers. This method was tested on both artificial distributions of data and real data. For data with multiscale structures, the method was able to reveal the different structures of the data and also to reduce the effect of outliers in the principal component analysis.
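
    A rough sketch of the construction, using the pairwise-distance equivalence stated above: accumulate the scatter of only those pairs whose distance falls in [l,u] and take the leading eigenvectors. The cluster analysis of the resulting projectors, which is the paper's actual scale analysis, is omitted here.

```python
# Sketch of Multiscale PCA's core step: restrict the pairwise scatter to
# pairs whose distance lies in [l, u]; with [0, inf) this recovers ordinary
# PCA, since the full pairwise scatter is proportional to the covariance.
import numpy as np

def multiscale_pca(X, l, u, n_components=2):
    n, d = X.shape
    S = np.zeros((d, d))
    for i in range(n):
        for j in range(i + 1, n):
            diff = X[i] - X[j]
            if l <= np.linalg.norm(diff) <= u:   # pair selected by scale window
                S += np.outer(diff, diff)        # accumulate pairwise scatter
    vals, vecs = np.linalg.eigh(S)               # ascending eigenvalues
    return vecs[:, ::-1][:, :n_components]       # leading eigenvectors = PCs

X = np.random.default_rng(1).normal(size=(200, 5))
P = multiscale_pca(X, l=0.0, u=np.inf)           # classical-PCA limit
print(P.shape)                                   # (5, 2)
```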

  20. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  1. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  2. Optical CDMA components requirements

    Science.gov (United States)

    Chan, James K.

    1998-08-01

    Optical CDMA is a complementary multiple access technology to WDMA. Optical CDMA potentially provides a large number of virtual optical channels for IXC, LEC and CLEC or supports a large number of high-speed users in LAN. In a network, it provides asynchronous, multi-rate, multi-user communication with network scalability, re-configurability (bandwidth on demand), and network security (provided by inherent CDMA coding). However, optical CDMA technology is less mature in comparison to WDMA. The components requirements are also different from WDMA. We have demonstrated a video transport/switching system over a distance of 40 Km using discrete optical components in our laboratory. We are currently pursuing PIC implementation. In this paper, we will describe the optical CDMA concept/features, the demonstration system, and the requirements of some critical optical components such as broadband optical source, broadband optical amplifier, spectral spreading/de- spreading, and fixed/programmable mask.

  3. Solid state lighting component

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, Thomas; Keller, Bernd; Tarsa, Eric; Ibbetson, James; Morgan, Frederick; Dowling, Kevin; Lys, Ihor

    2017-10-17

    An LED component according to the present invention comprises an array of LED chips mounted on a submount, with the LED chips capable of emitting light in response to an electrical signal. The array can comprise LED chips emitting at two colors of light, wherein the LED component emits light comprising the combination of the two colors of light. A single lens is included over the array of LED chips. The LED chip array can emit light of greater than 800 lumens with a drive current of less than 150 milliamperes. The LED component can also operate at color temperatures of less than 3000 K. In one embodiment, the LED array is in a substantially circular pattern on the submount.

  4. An integrated magnetics component

    DEFF Research Database (Denmark)

    2013-01-01

    The present invention relates to an integrated magnetics component comprising a magnetically permeable core comprising a base member extending in a horizontal plane and first, second, third and fourth legs protruding substantially perpendicularly from the base member. First, second, third...... and fourth output inductor windings are wound around the first, second, third and fourth legs, respectively. A first input conductor of the integrated magnetics component has a first conductor axis and extends in-between the first, second, third and fourth legs to induce a first magnetic flux through a first...... flux path of the magnetically permeable core. A second input conductor of the integrated magnetics component has a second coil axis extending substantially perpendicularly to the first conductor axis to induce a second magnetic flux through a second flux path of the magnetically permeable core...

  5. Cognitive Component Analysis

    DEFF Research Database (Denmark)

    Feng, Ling

    2008-01-01

    This dissertation concerns the investigation of the consistency of statistical regularities in a signaling ecology and human cognition, while inferring appropriate actions for a speech-based perceptual task. It is based on unsupervised Independent Component Analysis providing a rich spectrum of audio contexts, along with pattern recognition methods to map components to known contexts. It also involves looking for the right representations for auditory inputs, i.e. the data analytic processing pipelines invoked by human brains. The main ideas refer to Cognitive Component Analysis, defined as the process of unsupervised grouping of generic data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. Its hypothesis runs ecologically: features which are essentially independent in a context defined ensemble, can be efficiently coded as sparse...
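
    The unsupervised ICA step underlying this line of work can be sketched on synthetic mixtures; mapping the recovered components to known contexts, the actual Cognitive Component Analysis contribution, would follow as a second stage. `sklearn.decomposition.FastICA` is one standard implementation; the dissertation's own pipeline may differ.

```python
# Hedged sketch of the ICA step: separate two linearly mixed signals into
# independent components (synthetic data; not the dissertation's audio data).
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * t), np.sign(np.cos(3 * t))]   # two independent sources
X = S @ np.array([[1.0, 0.5], [0.4, 1.2]]).T       # observed linear mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)                       # recovered components
print(S_hat.shape)   # (2000, 2); grouping/labelling would come next
```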

  6. Generalized principal component analysis

    CERN Document Server

    Vidal, René; Sastry, S S

    2016-01-01

    This book provides a comprehensive introduction to the latest advances in the mathematical theory and computational tools for modeling high-dimensional data drawn from one or multiple low-dimensional subspaces (or manifolds) and potentially corrupted by noise, gross errors, or outliers. This challenging task requires the development of new algebraic, geometric, statistical, and computational methods for efficient and robust estimation and segmentation of one or multiple subspaces. The book also presents interesting real-world applications of these new methods in image processing, image and video segmentation, face recognition and clustering, and hybrid system identification etc. This book is intended to serve as a textbook for graduate students and beginning researchers in data science, machine learning, computer vision, image and signal processing, and systems theory. It contains ample illustrations, examples, and exercises and is made largely self-contained with three Appendices which survey basic concepts ...

  7. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  8. Electronic components and systems

    CERN Document Server

    Dennis, W H

    2013-01-01

    Electronic Components and Systems focuses on the principles and processes in the field of electronics and the integrated circuit. Covered in the book are basic aspects and physical fundamentals; different types of materials involved in the field; and passive and active electronic components such as capacitors, inductors, diodes, and transistors. Also covered in the book are topics such as the fabrication of semiconductors and integrated circuits; analog circuitry; digital logic technology; and microprocessors. The monograph is recommended for beginning electrical engineers who would like to kn

  9. Interpretable functional principal component analysis.

    Science.gov (United States)

    Lin, Zhenhua; Wang, Liangliang; Cao, Jiguo

    2016-09-01

    Functional principal component analysis (FPCA) is a popular approach to explore major sources of variation in a sample of random curves. These major sources of variation are represented by functional principal components (FPCs). The intervals where the values of FPCs are significant are interpreted as where sample curves have major variations. However, these intervals are often hard for naïve users to identify, because of the vague definition of "significant values". In this article, we develop a novel penalty-based method to derive FPCs that are only nonzero precisely in the intervals where the values of FPCs are significant, whence the derived FPCs possess better interpretability than the FPCs derived from existing methods. To compute the proposed FPCs, we devise an efficient algorithm based on projection deflation techniques. We show that the proposed interpretable FPCs are strongly consistent and asymptotically normal under mild conditions. Simulation studies confirm that with a competitive performance in explaining variations of sample curves, the proposed FPCs are more interpretable than the traditional counterparts. This advantage is demonstrated by analyzing two real datasets, namely, electroencephalography data and Canadian weather data. © 2015, The International Biometric Society.
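
    On a discretized grid, the flavor of such localized components can be approximated with a soft-thresholded power iteration plus projection deflation, which yields loading vectors that are exactly zero outside a subinterval. This is a generic sparse-PCA sketch under assumed parameters, not the authors' penalty or algorithm.

```python
# Generic sparse-component sketch (soft-thresholded power iterations with
# projection deflation) on discretized curves; illustrative only.
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_fpc(X, n_components=2, penalty=5.0, n_iter=200):
    Xd = X - X.mean(axis=0)
    comps = []
    for _ in range(n_components):
        v = np.linalg.svd(Xd, full_matrices=False)[2][0]   # warm start
        for _ in range(n_iter):
            u = Xd @ v
            v = soft(Xd.T @ u, penalty)          # zero out weak grid points
            v /= np.linalg.norm(v) + 1e-12
        comps.append(v)
        Xd = Xd - np.outer(Xd @ v, v)            # projection deflation
    return np.array(comps)

curves = np.random.default_rng(4).normal(size=(50, 100))  # 50 curves, 100 pts
V = sparse_fpc(curves)
print((np.abs(V) > 0).mean(axis=1))   # fraction of grid where each FPC is nonzero
```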

  10. Computer network defense system

    Science.gov (United States)

    Urias, Vincent; Stout, William M. S.; Loverro, Caleb

    2017-08-22

    A method and apparatus for protecting virtual machines. A computer system creates a copy of a group of the virtual machines in an operating network in a deception network to form a group of cloned virtual machines in the deception network when the group of the virtual machines is accessed by an adversary. The computer system creates an emulation of components from the operating network in the deception network. The components are accessible by the group of the cloned virtual machines as if the group of the cloned virtual machines was in the operating network. The computer system moves network connections for the group of the virtual machines in the operating network used by the adversary from the group of the virtual machines in the operating network to the group of the cloned virtual machines, enabling protecting the group of the virtual machines from actions performed by the adversary.

  11. Validating Timed Component Contracts

    DEFF Research Database (Denmark)

    Le Guilly, Thibaut; Liu, Shaoying; Olsen, Petur

    2015-01-01

    This paper presents a technique for testing software components with contracts that specify functional behavior, synchronization, as well as timing behavior. The approach combines elements from unit testing with model-based testing techniques for timed automata. The technique is implemented in an online testing tool, and we demonstrate its use on a concrete use case.

  12. Hybrid wars’ information component

    Directory of Open Access Journals (Sweden)

    T. A. Nevskaya

    2015-01-01

    The war of the new generation is hybrid war, whose information component is directed not so much at the direct destruction of the enemy as at achieving the goals without warfare. Fighting in the information field is no less important than immediate military action.

  13. ITER plasma facing components

    International Nuclear Information System (INIS)

    Kuroda, T.; Vieider, G.; Akiba, M.

    1991-01-01

    This document summarizes results of the Conceptual Design Activities (1988-1990) for the International Thermonuclear Experimental Reactor (ITER) project, namely those that pertain to the plasma facing components of the reactor vessel, of which the main components are the first wall and the divertor plates. After an introduction and an executive summary, the principal functions of the plasma-facing components are delineated, i.e., (i) define the low-impurity region within which the plasma is produced, (ii) absorb the electromagnetic radiation and charged-particle flux from the plasma, and (iii) protect the blanket/shield components from the plasma. A list of critical design issues for the divertor plates and the first wall is given, followed by discussions of the divertor plate design (including the issues of material selection, erosion lifetime, design concepts, thermal and mechanical analysis, operating limits and overall lifetime, tritium inventory, baking and conditioning, safety analysis, manufacture and testing, and advanced divertor concepts) and the first wall design (armor material and design, erosion lifetime, overall design concepts, thermal and mechanical analysis, lifetime and operating limits, tritium inventory, baking and conditioning, safety analysis, manufacture and testing, an alternative first wall design, and the limiters used instead of the divertor plates during start-up). Refs, figs and tabs

  14. Spain's nuclear components industry

    International Nuclear Information System (INIS)

    Kaibel, E.

    1985-01-01

    Spanish industrial participation in supply of components for nuclear power plants has grown steadily over the last fifteen years. The share of Spanish companies in work for the five second generation nuclear power plants increased to 50% of total capital investments. The necessity to maintain Spanish technology and production in the nuclear field is emphasized

  15. The market for components

    International Nuclear Information System (INIS)

    Simon, M.; Stoelzl, D.

    1986-01-01

    The offering of the German nuclear components industry is illustrated by the example of some masterpieces of engineering and the industry's delivery capacities. The successes achieved so far with exports are then reviewed. The forecast covers demand, the conditions of technical competition, and the pricing and financing situation. (UA) [de]

  16. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...

  17. Component-oriented programming

    NARCIS (Netherlands)

    Bosch, J; Szyperski, C; Weck, W; Buschmann, F; Buchmann, AP; Cilia, MA

    2003-01-01

    This report covers the eighth Workshop on Component-Oriented Programming (WCOP). WCOP has been affiliated with ECOOP since its inception in 1996. The report summarizes the contributions made by authors of accepted position papers as well as those made by all attendees of the workshop sessions.

  18. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host to the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  20. 78 FR 1247 - Certain Electronic Devices, Including Wireless Communication Devices, Tablet Computers, Media...

    Science.gov (United States)

    2013-01-08

    ... Wireless Communication Devices, Tablet Computers, Media Players, and Televisions, and Components Thereof... devices, including wireless communication devices, tablet computers, media players, and televisions, and... wireless communication devices, tablet computers, media players, and televisions, and components thereof...

  1. 77 FR 51571 - Certain Wireless Communication Devices, Portable Music and Data Processing Devices, Computers...

    Science.gov (United States)

    2012-08-24

    ... Music and Data Processing Devices, Computers, and Components Thereof; Notice of Receipt of Complaint... complaint entitled Wireless Communication Devices, Portable Music and Data Processing Devices, Computers..., portable music and data processing devices, computers, and components thereof. The complaint names as...

  2. Computer Simulation of Bound Component Washing To Minimize Processing Costs

    Directory of Open Access Journals (Sweden)

    Dagmar Janáčová

    2011-11-01

    In this paper we focus on the optimization of washing processes, since many technological processes are characterized above all by large consumption of water, electrical energy and auxiliary chemicals. For this reason it is very important to deal with them. The washing process can be optimized by indirect modeling, that is, by setting up mathematical models derived from a study of the physical mechanism of the operation. The process is of a diffusion character: it is characterized by the value of the effective diffusion coefficient and by the so-called binding strength of the component being removed to the solid phase. The mentioned parameters belong to the input data required for automatic control of the washing process.
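
    The diffusion character of the process can be illustrated with a one-dimensional washing model. The sketch below is an assumed textbook setup (explicit finite differences, invented diffusivity and geometry), not the authors' model.

```python
# Illustrative sketch: washing-out of a soluble component from a plate by 1D
# diffusion with an effective coefficient D; the washed surface is held at
# zero concentration by the wash liquor. All parameters are assumptions.
import numpy as np

D, L = 1e-9, 2e-3                 # assumed diffusivity [m^2/s], half-thickness [m]
nx, dt, steps = 51, 0.5, 20000    # grid points, time step [s], number of steps
dx = L / (nx - 1)
assert D * dt / dx**2 < 0.5       # explicit-scheme stability condition

c = np.ones(nx)                   # initial dimensionless concentration
c[-1] = 0.0                       # washed surface
for _ in range(steps):
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0] = c[1]                   # symmetry plane (no flux)
    c[-1] = 0.0                   # surface kept clean by the wash liquor

print(f"residual component after {steps * dt / 3600:.1f} h: {c.mean():.2%}")
```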

  3. Effect of Model Selection on Computed Water Balance Components

    NARCIS (Netherlands)

    Jhorar, R.K.; Smit, A.A.M.F.R.; Roest, C.W.J.

    2009-01-01

    Soil water flow modelling approaches as used in four selected on-farm water management models, namely CROPWAT, FAIDS, CERES and SWAP, are compared through numerical experiments. The soil water simulation approaches used in the first three models are reformulated to incorporate all evapotranspiration

  4. Reconfigurable Computing for Embedded Systems, FPGA Devices and Software Components

    National Research Council Canada - National Science Library

    Bardouleau, Graham; Kulp, James

    2005-01-01

    In recent years the size and capabilities of field-programmable gate array (FPGA) devices have increased to a point where they can be deployed as adjunct processing elements within a multicomputer environment...

  5. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  6. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  7. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way they can. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  8. Adaptable component frameworks

    DEFF Research Database (Denmark)

    Katajainen, Jyrki; Simonsen, Bo

    2009-01-01

    The CPH STL is a special edition of the STL, the containers and algorithms part of the C++ standard library. The specification of the generic components of the STL is given in the C++ standard. Any implementation of the STL, e.g. the one that ships with your standard-compliant C++ compiler, should ... for vector, which is undoubtedly the most used container of the C++ standard library. In particular, we specify the details of a vector implementation that is safe with respect to referential integrity and strong exception safety. Additionally, we report the experiences and lessons learnt from the development of component frameworks, which we hope to be of benefit to persons engaged in the design and implementation of generic software libraries.

  9. Spatio temporal media components for neurofeedback

    DEFF Research Database (Denmark)

    Jensen, Camilla Birgitte Falk; Petersen, Michael Kai; Larsen, Jakob Eg

    2013-01-01

    A class of Brain Computer Interfaces (BCI) involves interfaces for neurofeedback training, where a user can learn to self-regulate brain activity based on real-time feedback. These particular interfaces are constructed from audio-visual components and temporal settings, which appear to have a strong influence on the ability to control brain activity. Therefore, identifying the different interface components and exploring their individual effects might be key for constructing new interfaces that support more efficient neurofeedback training. We discuss experiments involving two different...

  10. Components of laboratory accreditation.

    Science.gov (United States)

    Royal, P D

    1995-12-01

    Accreditation or certification is a recognition given to an operation or product that has been evaluated against a standard; be it regulatory or voluntary. The purpose of accreditation is to provide the consumer with a level of confidence in the quality of operation (process) and the product of an organization. Environmental Protection Agency/OCM has proposed the development of an accreditation program under National Environmental Laboratory Accreditation Program for Good Laboratory Practice (GLP) laboratories as a supplement to the current program. This proposal was the result of the Inspector General Office reports that identified weaknesses in the current operation. Several accreditation programs can be evaluated and common components identified when proposing a structure for accrediting a GLP system. An understanding of these components is useful in building that structure. Internationally accepted accreditation programs provide a template for building a U.S. GLP accreditation program. This presentation will discuss the traditional structure of accreditation as presented in the Organization of Economic Cooperative Development/GLP program, ISO-9000 Accreditation and ISO/IEC Guide 25 Standard, and the Canadian Association for Environmental Analytical Laboratories, which has a biological component. Most accreditation programs are managed by a recognized third party, either privately or with government oversight. Common components often include a formal review of required credentials to evaluate organizational structure, a site visit to evaluate the facility, and a performance evaluation to assess technical competence. Laboratory performance is measured against written standards and scored. A formal report is then sent to the laboratory indicating accreditation status. Usually, there is a scheduled reevaluation built into the program. Fee structures vary considerably and will need to be examined closely when building a GLP program.

  11. Fabricating nuclear components

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    Activities of the Nuclear Engineering Division of Vickers Ltd., particularly fabrication of long slim tubular components for power reactors and the construction of irradiation loops and rigs, are outlined. The processes include hydraulic forming for fabrication of various types of tubes and outer cases of fuel transfer buckets, various specialised welding operations including some applications of the TIG process, and induction brazing of specialised assemblies. (U.K.)

  12. Components of the environment

    International Nuclear Information System (INIS)

    Klinda, J.; Lieskovska, Z.

    1998-01-01

    This report of the Ministry of the Environment of the Slovak Republic deals with the components of the environment. The results of monitoring of air (emission situation), ambient air quality, atmospheric precipitation, tropospheric ozone, water (surface water, groundwater resources, waste water and drinking water), geological factors (geothermal energy, fuel deposits, ore deposits, non-metallic ore deposits), soil (area statistics, soil contamination, soil reaction and active extractable aluminium, soil erosion), and flora and fauna (national strategy of biodiversity protection) are presented.

  13. Ionitriding of Weapon Components

    Science.gov (United States)

    1974-01-01

    and documented the production sequences required for the case-hardening of AISI 4140 and Nitralloy 135M steels. Determination of processing...depths were established experimentally for Nitralloy 135M and for AISI 4140 steels. These steels are commonly used for the manufacture of nitrided weapons components. A temperature of 050F, upper limit for ionitriding, was selected for the Nitralloy 135M to keep treatment times short. Since AISI 4140

  14. Components of QCD

    International Nuclear Information System (INIS)

    Sivers, D.

    1979-10-01

    Some aspects of a simple strategy for testing the validity of QCD perturbation theory are examined. The importance of explicit evaluation of higher-order contributions is illustrated by considering Z0 decays. The recent progress toward understanding exclusive processes in QCD is discussed and some simple examples are given of how to isolate and test the separate components of the perturbation expansion in a hypothetical series of jet experiments.
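
    As an illustration of the kind of series being tested, the hadronic Z0 width has the schematic textbook perturbative form below; the coefficients are scheme-dependent and are assumed notation here, not values taken from this report.

```latex
% Schematic perturbative QCD series for the hadronic Z^0 width; c_2, c_3 are
% scheme-dependent coefficients, shown only to illustrate the expansion.
\Gamma(Z^0 \to \mathrm{hadrons}) \;=\; \Gamma_0
  \left[ 1 + \frac{\alpha_s}{\pi}
           + c_2 \left(\frac{\alpha_s}{\pi}\right)^{2}
           + c_3 \left(\frac{\alpha_s}{\pi}\right)^{3}
           + \cdots \right]
```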

  15. Optimized Kernel Entropy Components.

    Science.gov (United States)

    Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau

    2017-06-01

    This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of the importance of kernel eigenvectors by entropy instead of variance, as in kernel principal component analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by means of compacting the information in very few features (often in just one or two). The proposed method produces features which have higher expressive power. In particular, it is based on the independent component analysis framework, and introduces an extra rotation to the eigendecomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. The results of both methods are illustrated on different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, that the most successful rule for estimating the kernel parameter is based on maximum likelihood, and that OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.
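
    The baseline KECA ranking that the brief starts from is compact enough to sketch: eigenpairs of an RBF kernel matrix are sorted by their contribution to the Renyi entropy estimate rather than by eigenvalue. OKECA's extra rotation and the kernel-width selection rules discussed above are omitted from this sketch.

```python
# Sketch of the basic KECA step: rank kernel eigenpairs by entropy
# contribution, lambda_i * (1^T e_i)^2, instead of by variance.
import numpy as np
from scipy.spatial.distance import cdist

def keca(X, n_components=2, sigma=1.0):
    K = np.exp(-cdist(X, X, 'sqeuclidean') / (2 * sigma**2))  # RBF kernel
    lam, E = np.linalg.eigh(K)
    score = lam * E.sum(axis=0) ** 2                 # entropy contribution
    idx = np.argsort(score)[::-1][:n_components]     # top entropy, not top variance
    return E[:, idx] * np.sqrt(np.clip(lam[idx], 0, None))  # projected features

X = np.random.default_rng(3).normal(size=(100, 4))
print(keca(X).shape)   # (100, 2)
```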

  16. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  17. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  18. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  19. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  20. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Spatial Computation and today's microprocessors are compared with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break a relatively monolithic architecture into individual lightweight pieces, well specialized for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  1. Residual life estimation of cracked aircraft structural components

    OpenAIRE

    Maksimović, Mirko S.; Vasović, Ivana V.; Maksimović, Katarina S.; Trišović, Nataša; Maksimović, Stevan M.

    2018-01-01

    The subject of this investigation is developing a computational procedure for the strength analysis of damaged aircraft structural components with respect to fatigue and fracture mechanics. For that purpose, computational procedures are defined for residual life estimation of aircraft structural components, such as wing skins and attachment lugs, under cyclic loads of constant amplitude and under a load spectrum. A special aspect of this investigation is based on using the Strain Energy Den...

  2. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn
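
    One of the topics listed above, single tridiagonal linear systems, reduces in its serial form to the O(n) Thomas algorithm; a small sketch follows. Parallel variants such as cyclic reduction, the subject of the book, reorganize exactly this recurrence.

```python
# Thomas algorithm for a tridiagonal system Ax = d, the serial building block
# that parallel tridiagonal solvers restructure. Test matrix is assumed.
import numpy as np

def thomas(a, b, c, d):
    """a: sub-diagonal (a[0] unused), b: diagonal, c: super-diagonal (c[-1] unused)."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                    # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = dp.copy()
    for i in range(n - 2, -1, -1):           # back substitution
        x[i] -= cp[i] * x[i + 1]
    return x

n = 5
a = np.full(n, -1.0); a[0] = 0.0
c = np.full(n, -1.0); c[-1] = 0.0
b = np.full(n, 2.0)
print(np.round(thomas(a, b, c, np.ones(n)), 3))  # matches a dense solve
```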

  3. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  4. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  5. Multi-component optical solitary waves

    DEFF Research Database (Denmark)

    Kivshar, Y. S.; Sukhorukov, A. A.; Ostrovskaya, E. A.

    2000-01-01

    We discuss several novel types of multi-component (temporal and spatial) envelope solitary waves that appear in fiber and waveguide nonlinear optics. In particular, we describe multi-channel solitary waves in bit-parallel-wavelength fiber transmission systems for high-performance computer networks, multi-color parametric spatial solitary waves due to cascaded nonlinearities of quadratic materials, and quasiperiodic envelope solitons due to quasi-phase-matching in Fibonacci optical superlattices. (C) 2000 Elsevier Science B.V. All rights reserved.

  6. Modeling accelerator structures and RF components

    International Nuclear Information System (INIS)

    Ko, K.; Ng, C.K.; Herrmannsfeldt, W.B.

    1993-03-01

    Computer modeling has become an integral part of the design and analysis of accelerator structures and RF components. Sophisticated 3D codes, powerful workstations and timely theory support have all contributed to this development. We describe our modeling experience with these resources and discuss their impact on ongoing work at SLAC. Specific examples from R&D on a future linear collider and a proposed e+e- storage ring are included

  7. Reinforced seal component

    International Nuclear Information System (INIS)

    Jeanson, G.M.; Odent, R.P.

    1980-01-01

    The invention concerns a seal component of the kind comprising a soft sheath and a flexible reinforcement housed throughout the entire length of the sheath. The invention enables O-ring seals to be made capable of providing a radial seal, that is to say between two sides or flat collars of two cylindrical mechanical parts, or an axial seal, that is to say between two co-axial axisymmetrical areas. The seal so ensured is relative, but it remains adequate for many uses, for instance, to ensure the separation of two successive fixed blading compartments of axial compressors used in gas diffusion isotope concentration facilities [fr]

  8. Electronic components and technology

    CERN Document Server

    Sangwine, Stephen

    2007-01-01

    Most introductory textbooks in electronics focus on the theory while leaving the practical aspects to be covered in laboratory courses. However, the sooner such matters are introduced, the better able students will be to include such important concerns as parasitic effects and reliability at the very earliest stages of design. This philosophy has kept Electronic Components and Technology thriving for two decades, and this completely updated third edition continues the approach with a more international outlook. Not only does this textbook introduce the properties, behavior, fabrication, and use

  9. Autonomous component carrier selection

    DEFF Research Database (Denmark)

    Garcia, Luis Guilherme Uzeda; Pedersen, Klaus; Mogensen, Preben

    2009-01-01

    ...management and efficient system operation. Due to the expected large number of user-deployed cells, centralized network planning becomes impractical and new scalable alternatives must be sought. In this article, we propose a fully distributed and scalable solution to the interference management problem in local areas, basing our case study on LTE-Advanced. We present extensive network simulation results to demonstrate that a simple and robust interference management scheme, called autonomous component carrier selection, allows each cell to select the most attractive frequency configuration, improving the experience of all users and not just the few best ones, while overall cell capacity is not compromised.

  10. Impedance of accelerator components

    International Nuclear Information System (INIS)

    Corlett, J.N.

    1996-05-01

    As demands for high luminosity and low emittance particle beams increase, an understanding of the electromagnetic interaction of these beams with their vacuum chamber environment becomes more important in order to maintain the quality of the beam. This interaction is described in terms of the wake field in time domain, and the beam impedance in frequency domain. These concepts are introduced, and related quantities such as the loss factor are presented. The broadband Q = 1 resonator impedance model is discussed. Perturbation and coaxial wire methods of measurement of real components are reviewed

  11. Reliability parameters of distribution networks components

    Energy Technology Data Exchange (ETDEWEB)

    Gono, R.; Kratky, M.; Rusek, S.; Kral, V. [Technical Univ. of Ostrava (Czech Republic)

    2009-03-11

    This paper presented a framework for the retrieval of parameters from various heterogeneous power system databases. The framework was designed to transform the heterogeneous outage data into a common relational scheme. The framework was used to retrieve outage data parameters from the Czech and Slovak republics in order to demonstrate the scalability of the framework. A reliability computation of the system was performed in two phases, representing the retrieval of component reliability parameters and the reliability computation. Reliability rates were determined using component reliability and global reliability indices. Input data for the reliability computation were retrieved from data on equipment operating under similar conditions, while the probability of failure-free operation was evaluated by determining component status. Anomalies in distribution outage data were described as scheme, attribute, and term differences. Input types consisted of input relations, transformation programs, codebooks, and translation tables. The system was used to successfully retrieve data from 7 distributors in the Czech Republic and Slovak Republic between 2000 and 2007. The database included 301,555 records. Data were queried using SQL. 29 refs., 2 tabs., 2 figs.
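
    As a rough illustration of the two-phase idea (not the authors' code: the record layout, component names, and figures below are invented), component failure rates and mean repair times can be estimated from pooled outage records roughly as follows:

        # Hedged sketch: estimating per-component failure rates and mean
        # time to repair from pooled outage records. All data are invented.
        from collections import defaultdict

        outages = [  # (component_type, repair_hours)
            ("line_110kV", 4.5), ("line_110kV", 2.0), ("transformer", 12.0),
        ]
        exposure_years = {"line_110kV": 520.0, "transformer": 310.0}  # component-years observed

        counts = defaultdict(int)
        repair_hours = defaultdict(float)
        for ctype, hours in outages:
            counts[ctype] += 1
            repair_hours[ctype] += hours

        for ctype, years in exposure_years.items():
            lam = counts[ctype] / years                       # failures per component-year
            mttr = repair_hours[ctype] / counts[ctype]        # mean time to repair, hours
            unavail = lam * mttr / 8760.0                     # approximate annual unavailability
            print(f"{ctype}: lambda={lam:.4f}/yr, MTTR={mttr:.1f} h, U={unavail:.2e}")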

  12. Spatial Variation of Magnetotelluric Field Components in simple 2D ...

    African Journals Online (AJOL)

    The E-polarization mode electromagnetic field components were computed for different aspect ratios of the inhomogeneity and for different frequencies of the incident waves. The results show that as aspect ratio of the inhomogeneity is reduced the spatial variation of the electric field component Ex is reduced and that of the ...

  13. Interactions between photodegradation components

    Directory of Open Access Journals (Sweden)

    Abdollahi Yadollah

    2012-09-01

    Background: The interactions of p-cresol photocatalytic degradation components were studied by response surface methodology. The study was designed by central composite design using the irradiation time, pH, the amount of photocatalyst and the p-cresol concentration as variables. The design was performed to obtain photodegradation % as the actual response. The actual responses were fitted with linear, two-factor interaction, cubic and quadratic models to select an appropriate model. The selected model was validated by analysis of variance, which provided evidence such as a high F-value (845.09), a very low P-value, R-squared (R2 = 0.999), adjusted R-squared (R2adj = 0.998), predicted R-squared (R2pred = 0.994) and adequate precision (95.94). Results: The validated model demonstrated that the components interacted with irradiation time below 180 min, while the interaction with pH occurred above pH 9. Moreover, the photocatalyst and p-cresol interacted at minimal amounts of photocatalyst. Conclusion: These variables are interdependent and should be simultaneously considered during the photodegradation process, which is one of the advantages of the response surface methodology over the traditional laboratory method.
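
    The core of such a response-surface study, fitting a full quadratic model to coded factors by least squares, can be sketched as follows (a minimal illustration with synthetic data, not the study's own code or coefficients):

        # Hedged sketch: quadratic response-surface fit on two coded factors,
        # as used in central composite designs. Data are synthetic.
        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, size=(30, 2))          # two coded factors, e.g. time and pH
        y = 60 + 8*X[:, 0] + 5*X[:, 1] - 6*X[:, 0]**2 + 3*X[:, 0]*X[:, 1] \
            + rng.normal(0, 0.5, 30)                  # synthetic photodegradation %

        def quad_design(X):
            """Columns: 1, x1, x2, x1^2, x2^2, x1*x2 (full quadratic model)."""
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1*x2])

        A = quad_design(X)
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares coefficients
        resid = y - A @ beta
        r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
        print("coefficients:", np.round(beta, 2), " R^2 =", round(r2, 3))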

  14. Prognostics for Microgrid Components

    Science.gov (United States)

    Saxena, Abhinav

    2012-01-01

    Prognostics is the science of predicting future performance and potential failures based on targeted condition monitoring. Moving away from the traditional reliability centric view, prognostics aims at detecting and quantifying the time to impending failures. This advance warning provides the opportunity to take actions that can preserve uptime, reduce cost of damage, or extend the life of the component. The talk will focus on the concepts and basics of prognostics from the viewpoint of condition-based systems health management. Differences with other techniques used in systems health management and philosophies of prognostics used in other domains will be shown. Examples relevant to microgrid systems and subsystems will be used to illustrate various types of prediction scenarios and the resources it takes to set up a desired prognostic system. Specifically, the implementation results for power storage and power semiconductor components will demonstrate specific solution approaches of prognostics. The role of constituent elements of prognostics, such as model, prediction algorithms, failure threshold, run-to-failure data, requirements and specifications, and post-prognostic reasoning will be explained. A discussion on performance evaluation and performance metrics will conclude the technical discussion followed by general comments on open research problems and challenges in prognostics.

  15. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  16. Lifetime analysis of fusion-reactor components

    International Nuclear Information System (INIS)

    Mattas, R.F.

    1983-01-01

    A one-dimensional computer code has been developed to examine the lifetime of first-wall and impurity-control components. The code incorporates the operating and design parameters, the material characteristics, and the appropriate failure criteria for the individual components. The major emphasis of the modelling effort has been to calculate the temperature-stress-strain-radiation effects history of a component so that the synergistic effects between sputtering erosion, swelling, creep, fatigue, and crack growth can be examined. The general forms of the property equations are the same for all materials in order to provide the greatest flexibility for materials selection in the code. The code is capable of determining the behavior of a plate, composed of either a single or dual material structure, that is either totally constrained or constrained from bending but not from expansion. The code has been utilized to analyze the first walls for FED/INTOR and DEMO

  17. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  18. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  19. Food Components and Supplements

    DEFF Research Database (Denmark)

    Parlesak, Alexandr

    2012-01-01

    The major part of food consists of chemical compounds that can be used for energy production, biological synthesis, or maintenance of metabolic processes by the host. These components are defined as nutrients, and can be categorized into macronutrients (proteins, carbohydrates, triglycerides, and alcohol), minerals, and micronutrients. The latter category comprises 13 vitamins and a handful of trace elements. Many micronutrients are used as food supplements and are ingested at doses exceeding the amounts that can be consumed along with food by a factor of 10–100. Both macro- and micronutrients can interact with enzyme systems related to xenobiotic metabolism either by regulation of their expression or direct interference with their enzymatic activity. During food consumption, we consume a wide range of xenobiotics along with the consumable food, either as an original part of the food (e...

  20. Sprayed skin turbine component

    Science.gov (United States)

    Allen, David B

    2013-06-04

    Fabricating a turbine component (50) by casting a core structure (30), forming an array of pits (24) in an outer surface (32) of the core structure, depositing a transient liquid phase (TLP) material (40) on the outer surface of the core structure, the TLP containing a melting-point depressant, depositing a skin (42) on the outer surface of the core structure over the TLP material, and heating the assembly, thus forming both a diffusion bond and a mechanical interlock between the skin and the core structure. The heating diffuses the melting-point depressant away from the interface. Subsurface cooling channels (35) may be formed by forming grooves (34) in the outer surface of the core structure, filling the grooves with a fugitive filler (36), depositing and bonding the skin (42), then removing the fugitive material.

  1. Food Components and Supplements

    DEFF Research Database (Denmark)

    Parlesak, Alexandr

    2012-01-01

    ...acting as carcinogens) to health-protective effects (e.g., flavonoids ameliorating detrimental effects of mitochondrial oxidative stress). In particular, secondary plant metabolites along with vitamins, specific types of macronutrients and live bacteria (probiotics), as well as substances promoting... (e.g., secondary plant metabolites such as flavonoids), or as contaminants that enter the food chain at different stages or during the food production process. For these components, a wide spectrum of biological effects was observed that ranges from health-threatening impacts (e.g., polycyclic aromatic amines... The supplements and contaminants can compete directly with drug oxidation, induce or suppress the expression of xenobiotic-metabolizing enzymes, change the bioavailability of drugs, and, in the case of live bacteria, bring in their own xenobiotic metabolism, including cytochrome P450 (CYP) activity. In numerous...

  2. High thermal load component

    International Nuclear Information System (INIS)

    Fuse, Toshiaki; Tachikawa, Nobuo.

    1996-01-01

    A cooling tube made of pure copper is connected to the inner portion of an armour (heat resistant member) made of an anisotropic carbon/carbon composite (CFC) material. The CFC material has a high heat conductivity in the longitudinal direction of the fibers and a low conductivity perpendicular to it. Fibers extending in the armour from the heat receiving surface just above the cooling tube are connected directly to the cooling tube. A portion of the fibers extending from regions of the heat receiving surface not directly above the cooling tube is also bonded directly to the cooling tube. The remaining fibers are disposed so as to surround the cooling tube. The armour and the cooling tube are soldered using an active metal flux. With such procedures, high thermal load components for use in a thermonuclear reactor are formed, which are excellent in heat removal and hardly cause defects such as cracking and peeling. (I.N.)

  3. Impedance and component heating

    CERN Document Server

    Métral, E; Mounet, N; Pieloni, T; Salvant, B

    2015-01-01

    The impedance is a complex function of frequency, which represents, for the plane under consideration (longitudinal, horizontal or vertical), the force integrated over the length of an element, from a “source” to a “test” wave, normalized by their charges. In general, the impedance in a given plane is a nonlinear function of the test and source transverse coordinates, but it is most of the time sufficient to consider only the first few linear terms. Impedances can influence the motion of trailing particles, in the longitudinal and in one or both transverse directions, leading to energy loss, beam instabilities, or producing undesirable secondary effects such as excessive heating of sensitive components at or near the chamber wall, called beam-induced RF heating. The LHC performance limitations linked to impedances encountered during the 2010-2012 run are reviewed and the currently expected situation during the HL-LHC era is discussed.
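
    In one common convention (normalizations vary between references, so this is only an illustrative form of the quantities named above, not the authors' notation), the longitudinal impedance is the Fourier transform of the wake function, and the loss factor follows from its real part weighted by the bunch power spectrum:

        Z_\parallel(\omega) = \int_{-\infty}^{\infty} w_\parallel(t)\, e^{-i\omega t}\,\mathrm{d}t,
        \qquad
        k_\mathrm{loss} = \frac{1}{\pi}\int_{0}^{\infty} \mathrm{Re}\,Z_\parallel(\omega)\,|\tilde{\lambda}(\omega)|^{2}\,\mathrm{d}\omega

    where \tilde{\lambda}(\omega) is the normalized bunch spectrum; beam-induced RF heating scales with the overlap between Re Z and that spectrum.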

  4. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    ...to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic; that as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what... (Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  5. Computer-Aided Facilities Management Systems (CAFM).

    Science.gov (United States)

    Cyros, Kreon L.

    Computer-aided facilities management (CAFM) refers to a collection of software used with increasing frequency by facilities managers. The six major CAFM components are discussed with respect to their usefulness and popularity in facilities management applications: (1) computer-aided design; (2) computer-aided engineering; (3) decision support…

  6. Cloud computing: An innovative tool for library services

    OpenAIRE

    Sahu, R.

    2015-01-01

    Cloud computing is a new information and communication technology with potential benefits such as reduced cost, anywhere/anytime accessibility, elasticity and flexibility. This paper defines cloud computing, outlines its essential characteristics, models and components, discusses its advantages and drawbacks, and describes cloud computing in libraries.

  7. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Since the first idea of using GPUs for general purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We present the benefits of the CUDA programming model. We also compare the two main approaches, CUDA and AMD APP (Stream), and the new framework, OpenCL, which tries to unify the GPGPU computing models.

  8. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing – Building Blocks of a Quantum Computer. C.S. Vijay and Vishal Gupta. General Article, Volume 5, Issue 9, September 2000, pp. 69-81.

  9. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  10. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and at some practical difficulties in building such a device.

  11. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  12. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context: We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective: To define the scope and needs of computational pathology. Data Sources: A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions: The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  13. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    ...with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production... for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  14. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  15. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  16. Radiation damage in components

    International Nuclear Information System (INIS)

    Takano, Tomehachi

    1977-01-01

    The performance change of typical capacitors and resistors in electronic components under Co-60 γ-irradiation from the 1320 Ci source was examined in the range of 10^5 to 10^8 R. Specifically, the characteristic change during irradiation and the recovery after irradiation were continuously observed. The capacity change is +2.4% at maximum in ceramic and metallized paper capacitors, and -2.4% at maximum in mylar and paper capacitors. It is also ±0.4% at maximum in mica and polystyrene capacitors. Some of these capacitors showed recovery of the capacity change, but the others did not. Dielectric loss varied by 15% at larger doses in some capacitors, and no recovery was observed. Meanwhile, the insulation resistance of resistors of 10^15 Ω or more lowered to 10^13 Ω or less after 10 to 30 sec of irradiation, but recovered soon to nearly the initial values after irradiation was interrupted. The resistance change of carbon film resistors is about 0.2 to 2%, and recovered to the initial values within 100 hours after irradiation. The resistance change of composition resistors is large, over the range of -13 to +35%, and no sign of recovery was seen. In carbon film resistors, the surface insulated type indicated far better results, which are assumed to be caused by the selection of element materials and the forming of coating materials. (Wakatsuki, Y.)

  17. Robust Spacecraft Component Detection in Point Clouds

    Directory of Open Access Journals (Sweden)

    Quanmao Wei

    2018-03-01

    Automatic component detection of spacecraft can assist in on-orbit operation and space situational awareness. Spacecraft are generally composed of solar panels and cuboidal or cylindrical modules. These components can be simply represented by geometric primitives like plane, cuboid and cylinder. Based on this prior, we propose a robust automatic detection scheme to automatically detect such basic components of spacecraft in three-dimensional (3D) point clouds. In the proposed scheme, cylinders are first detected in the iteration of the energy-based geometric model fitting and cylinder parameter estimation. Then, planes are detected by Hough transform and further described as bounded patches with their minimum bounding rectangles. Finally, the cuboids are detected with pair-wise geometry relations from the detected patches. After successive detection of cylinders, planar patches and cuboids, a mid-level geometry representation of the spacecraft can be delivered. We tested the proposed component detection scheme on spacecraft 3D point clouds synthesized by computer-aided design (CAD) models and those recovered by image-based reconstruction, respectively. Experimental results illustrate that the proposed scheme can detect the basic geometric components effectively and has fine robustness against noise and point distribution density.

  18. Robust Spacecraft Component Detection in Point Clouds.

    Science.gov (United States)

    Wei, Quanmao; Jiang, Zhiguo; Zhang, Haopeng

    2018-03-21

    Automatic component detection of spacecraft can assist in on-orbit operation and space situational awareness. Spacecraft are generally composed of solar panels and cuboidal or cylindrical modules. These components can be simply represented by geometric primitives like plane, cuboid and cylinder. Based on this prior, we propose a robust automatic detection scheme to automatically detect such basic components of spacecraft in three-dimensional (3D) point clouds. In the proposed scheme, cylinders are first detected in the iteration of the energy-based geometric model fitting and cylinder parameter estimation. Then, planes are detected by Hough transform and further described as bounded patches with their minimum bounding rectangles. Finally, the cuboids are detected with pair-wise geometry relations from the detected patches. After successive detection of cylinders, planar patches and cuboids, a mid-level geometry representation of the spacecraft can be delivered. We tested the proposed component detection scheme on spacecraft 3D point clouds synthesized by computer-aided design (CAD) models and those recovered by image-based reconstruction, respectively. Experimental results illustrate that the proposed scheme can detect the basic geometric components effectively and has fine robustness against noise and point distribution density.
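
    The plane-detection stage of such a scheme can be sketched with a RANSAC-style fit (the paper uses a Hough transform; RANSAC is shown here only as a common stand-in for primitive detection, and the synthetic cloud and thresholds are invented):

        # Hedged sketch: RANSAC-style plane detection in a 3D point cloud,
        # illustrating primitive fitting; not the paper's Hough-based code.
        import numpy as np

        def ransac_plane(points, iters=500, tol=0.02, seed=1):
            """Return a boolean inlier mask for the best-supported plane."""
            rng = np.random.default_rng(seed)
            best = np.zeros(len(points), dtype=bool)
            for _ in range(iters):
                p = points[rng.choice(len(points), 3, replace=False)]
                n = np.cross(p[1] - p[0], p[2] - p[0])
                norm = np.linalg.norm(n)
                if norm < 1e-12:            # degenerate (collinear) sample, skip
                    continue
                n /= norm
                d = -n @ p[0]               # plane: n.x + d = 0
                inliers = np.abs(points @ n + d) < tol
                if inliers.sum() > best.sum():
                    best = inliers
            return best

        # Synthetic "solar panel": a noisy planar patch plus background clutter.
        rng = np.random.default_rng(1)
        panel = np.column_stack([rng.uniform(0, 2, 400), rng.uniform(0, 1, 400),
                                 rng.normal(0, 0.005, 400)])
        clutter = rng.uniform(-1, 2, size=(100, 3))
        cloud = np.vstack([panel, clutter])
        mask = ransac_plane(cloud)
        print("plane inliers:", mask.sum(), "of", len(cloud))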

  19. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to affirmation in Russian education. Methods: The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results: The concept «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty: The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described, a process connected with the evolution of computer and information technologies and the increasing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance: New metasubject result of education associated with

  20. Component-based development process and component lifecycle

    NARCIS (Netherlands)

    Crnkovic, I.; Chaudron, M.R.V.; Larsson, S.

    2006-01-01

    The process of component- and component-based system development differs in many significant ways from the "classical" development process of software systems. The main difference is in the separation of the development process of components from the development process of systems. This fact has a

  1. Center for Technology for Advanced Scientific Component Software (TASCS)

    Energy Technology Data Exchange (ETDEWEB)

    Damevski, Kostadin [Virginia State Univ., Petersburg, VA (United States)

    2009-03-30

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit the unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  2. GOATS - Orbitology Component

    Science.gov (United States)

    Haber, Benjamin M.; Green, Joseph J.

    2010-01-01

    The GOATS Orbitology Component software was developed to specifically address the concerns presented by orbit analysis tools that are often written as stand-alone applications. These applications do not easily interface with standard JPL first-principles analysis tools, and have a steep learning curve due to their complicated nature. This toolset is written as a series of MATLAB functions, allowing seamless integration into existing JPL optical systems engineering modeling and analysis modules. The functions are completely open, and allow for advanced users to delve into and modify the underlying physics being modeled. Additionally, this software module fills an analysis gap, allowing for quick, high-level mission analysis trades without the need for detailed and complicated orbit analysis using commercial stand-alone tools. This software consists of a series of MATLAB functions to provide for geometric orbit-related analysis. This includes propagation of orbits to varying levels of generalization. In the simplest case, geosynchronous orbits can be modeled by specifying a subset of three orbit elements. The next case is a circular orbit, which can be specified by a subset of four orbit elements. The most general case is an arbitrary elliptical orbit specified by all six orbit elements. These orbits are all solved geometrically, under the basic problem of an object in circular (or elliptical) orbit around a rotating spheroid. The orbit functions output time series ground tracks, which serve as the basis for more detailed orbit analysis. This software module also includes functions to track the positions of the Sun, Moon, and arbitrary celestial bodies specified by right ascension and declination. Also included are functions to calculate line-of-sight geometries to ground-based targets, angular rotations and decompositions, and other line-of-sight calculations. The toolset allows for the rapid execution of orbit trade studies at the level of detail required for the
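
    A first-order version of the ground-track calculation described above can be sketched in Python (the GOATS toolset itself is MATLAB, and this is not its code; the function below assumes a circular orbit, a spherical rotating Earth, and zero initial node longitude):

        # Hedged sketch: subsatellite ground track of a circular orbit around
        # a rotating spherical Earth. Orbit parameters below are illustrative.
        import math

        MU = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2
        W_EARTH = 7.2921159e-5     # Earth rotation rate, rad/s

        def ground_track(alt_km, incl_deg, minutes, step_s=60):
            """Yield (lat_deg, lon_deg) of the subsatellite point over time."""
            a = 6378.137e3 + alt_km * 1e3          # orbit radius, m
            n = math.sqrt(MU / a**3)               # mean motion, rad/s
            i = math.radians(incl_deg)
            for k in range(0, minutes * 60, step_s):
                u = n * k                          # argument of latitude from the node
                lat = math.asin(math.sin(i) * math.sin(u))
                lon = math.atan2(math.cos(i) * math.sin(u), math.cos(u)) - W_EARTH * k
                yield (math.degrees(lat),
                       math.degrees((lon + math.pi) % (2 * math.pi) - math.pi))

        for lat, lon in list(ground_track(700, 98.0, 100))[:5]:
            print(f"lat {lat:7.2f}  lon {lon:8.2f}")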

  3. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  4. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  5. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  6. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes’ structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part presents the most important computational techniques: finite element formulation, boundary element formulation, and the solution of viscoelastic problems with Abaqus.

  7. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  8. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  9. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) its basic concept of natural computing has neither been defined theoretically nor implemented practically; (2) it cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary; (3) philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  10. Component mode synthesis in structural dynamics

    International Nuclear Information System (INIS)

    Reddy, G.R.; Vaze, K.K.; Kushwaha, H.S.

    1993-01-01

    In the seismic analysis of nuclear reactor structures and equipment, the eigensolution requires large computer time. Component mode synthesis is an efficient technique with which one can evaluate the dynamic characteristics of a large structure with minimum computer time. For this reason it is possible to perform a coupled analysis of structure and equipment which takes the interaction effects into account. In this method, a large structure is divided into small substructures and the dynamic characteristics of each substructure are determined. The dynamic characteristics of the entire structure are then evaluated by synthesising the individual substructure characteristics. Component mode synthesis has been applied in this paper to the analysis of a tall heavy water upgrading tower. The use of fixed-interface normal modes, constraint modes and attachment modes in component mode synthesis, using the energy principle and Ritz vectors, is discussed. The validity of the method is established by solving a fixed-fixed beam and comparing the results obtained by the conventional and classical methods. The eigenvalue problem has been solved using the simultaneous iteration method. (author)
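
    A minimal numerical sketch of the fixed-interface variant (a Craig-Bampton-style reduction; the toy spring-mass chain and retained mode count below are invented, and this is not the paper's code):

        # Hedged sketch: fixed-interface component mode synthesis. The
        # substructure keeps its boundary DOFs plus a few interior modes.
        import numpy as np
        from scipy.linalg import eigh

        def craig_bampton(M, K, boundary, n_modes):
            """Reduce (M, K) to boundary DOFs plus n_modes fixed-interface modes."""
            all_dof = np.arange(M.shape[0])
            b = np.asarray(boundary)
            i = np.setdiff1d(all_dof, b)
            Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
            Mii = M[np.ix_(i, i)]
            # Fixed-interface normal modes: interior eigenproblem, boundary clamped.
            _, Phi = eigh(Kii, Mii)
            Phi = Phi[:, :n_modes]
            # Constraint modes: static response to unit boundary displacements.
            Psi = -np.linalg.solve(Kii, Kib)
            # Transformation x = T @ [modal coords; boundary DOFs].
            T = np.zeros((M.shape[0], n_modes + b.size))
            T[np.ix_(i, np.arange(n_modes))] = Phi
            T[np.ix_(i, n_modes + np.arange(b.size))] = Psi
            T[np.ix_(b, n_modes + np.arange(b.size))] = np.eye(b.size)
            return T.T @ M @ T, T.T @ K @ T

        # Toy substructure: spring-mass chain, end DOFs kept as the interface.
        n = 10
        K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
        M = np.eye(n)
        Mr, Kr = craig_bampton(M, K, boundary=[0, n - 1], n_modes=3)
        print("reduced system size:", Kr.shape)   # (5, 5): 3 modes + 2 boundary DOFs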

  11. A study on gingival component of smile

    Directory of Open Access Journals (Sweden)

    Goutam Chakroborty

    2015-01-01

    Background: Esthetic enhancement of a smile requires prior quantification of the gingival component of the smile. Hence, a study was designed on randomly selected volunteers whose posed frontal smiling photographs were taken and analyzed with the computer-aided ImageJ software. Aim: To determine the role of the gingival component in designing a smile. Settings and Design: The present observational study includes one frontal photograph from each of 212 subjects who were attending the Department of Periodontics (examined during the study period) and then divided into three age groups (18-30, 31-40, and 41-50 years). Materials and Methods: Standardized frontal photographs with a posed smile were taken from 212 volunteers irrespective of age and sex, and the images were analyzed on a computer using the ImageJ software. Statistical analysis used: Mean and standard deviation of intercommissural width (ICW), interlabial gap (ILG), and smile index (SI) during posed smiling were calculated for each sex. Comparisons between the male and female groups were made by the Mann-Whitney U test, and P-values were calculated for ICW, ILG, and SI. Spearman's rank correlation coefficients (rho) were calculated for SI and different components of the central zone of the smile. Results: The male group exhibited greater ICW and ILG than the female group, and there was fair to good correlation between lip dynamics and different factors of the smile. Conclusion: The present study indicates that different factors of the central zone of the smile have fair to good correlation with lip dynamics as assessed by SI.

  12. Intelligent Component Monitoring for Nuclear Power Plants

    International Nuclear Information System (INIS)

    Tsoukalas, Lefteri

    2010-01-01

    Reliability and economy are two major concerns for a nuclear power generation system. Next generation nuclear power reactors are being developed to be more reliable and economic. An effective and efficient surveillance system can contribute greatly toward this goal. Recent progress in computer systems and computational tools has made it necessary and possible to upgrade the current surveillance/monitoring strategy for better performance. For example, intelligent computing techniques can be applied to develop algorithms that help people better understand the information collected from sensors and thus reduce human error to a new low level. Incidents incurred from human error in the nuclear industry are not rare and have proven costly. The goal of this project is to develop and test an intelligent prognostics methodology for predicting aging effects impacting the long-term performance of nuclear components and systems. The approach is particularly suitable for predicting the performance of nuclear reactor systems which have low failure probabilities (e.g., less than 10^-6 per year). Such components and systems are often perceived as peripheral to the reactor and are left somewhat unattended. That is, even when inspected, if they are not perceived to be causing some immediate problem, they may not be paid due attention. Attention to such systems normally involves long-term monitoring and possibly reasoning with multiple features and evidence, requirements that are not best suited for humans.

  13. Towards Prognostics for Electronics Components

    Data.gov (United States)

    National Aeronautics and Space Administration — Electronics components have an increasingly critical role in avionics systems and in the development of future aircraft systems. Prognostics of such components is...

  14. Algorithmic fault tree construction by component-based system modeling

    International Nuclear Information System (INIS)

    Majdara, Aref; Wakabayashi, Toshio

    2008-01-01

    Computer-aided fault tree generation can be easier, faster and less vulnerable to errors than conventional manual fault tree construction. In this paper, a new approach for algorithmic fault tree generation is presented. The method mainly consists of a component-based system modeling procedure and a trace-back algorithm for fault tree synthesis. Components, as the building blocks of systems, are modeled using function tables and state transition tables. The proposed method can be used for a wide range of systems with various kinds of components, provided an inclusive component database is developed. (author)
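
    The trace-back idea can be sketched as follows (a toy illustration, not the authors' algorithm: the component tables, event names, and two-component system are invented):

        # Hedged sketch: components carry small "function tables" mapping an
        # output deviation to its causes; a cause prefixed "upstream:" is
        # traced back through the connection to the feeding component.
        import json

        pump_model = {"no_flow": ["pump_failed", "no_power"]}
        valve_model = {"no_flow": ["valve_stuck_closed", "upstream:no_flow"]}

        system = {  # component name -> (function table, upstream component or None)
            "valve1": (valve_model, "pump1"),
            "pump1": (pump_model, None),
        }

        def trace_back(component, event):
            """Build a nested OR-tree of basic events explaining `event` at `component`."""
            model, upstream = system[component]
            branches = []
            for cause in model.get(event, []):
                if cause.startswith("upstream:") and upstream is not None:
                    branches.append(trace_back(upstream, cause.split(":", 1)[1]))
                else:
                    branches.append(cause)  # basic event, leaf of the tree
            return {f"{component}.{event} (OR)": branches}

        print(json.dumps(trace_back("valve1", "no_flow"), indent=2))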

  15. Verifying Embedded Systems using Component-based Runtime Observers

    DEFF Research Database (Denmark)

    Guan, Wei; Marian, Nicolae; Angelov, Christo K.

    ...against formally specified properties. This paper presents a component-based design method for runtime observers, which are configured from instances of prefabricated reusable components---Predicate Evaluator (PE) and Temporal Evaluator (TE). The PE computes atomic propositions for the TE; the latter is a reconfigurable component processing a data structure representing the state transition diagram of a non-deterministic state machine, i.e. a Buchi automaton derived from a system property specified in Linear Temporal Logic (LTL). Observer components have been implemented using design models and design patterns...
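
    A drastically simplified sketch of the PE/TE split (the property, samples, and threshold are invented, and a real Temporal Evaluator would step a Buchi automaton derived from an LTL formula rather than this hand-written latch):

        # Hedged sketch: a Predicate Evaluator turns raw samples into atomic
        # propositions; a Temporal Evaluator steps a small state machine that
        # latches a violation of a made-up safety property.
        def predicate_evaluator(sample):
            return {"heater_on": sample["heater"], "over_limit": sample["temp"] > 90.0}

        def make_temporal_evaluator():
            state = {"armed": False, "violated": False}
            def step(props):
                if props["heater_on"]:
                    state["armed"] = True
                if state["armed"] and props["over_limit"]:
                    state["violated"] = True    # property violation, latched
                return not state["violated"]
            return step

        te = make_temporal_evaluator()
        trace = [{"heater": False, "temp": 40}, {"heater": True, "temp": 70},
                 {"heater": True, "temp": 95}]
        for sample in trace:
            ok = te(predicate_evaluator(sample))
            print(sample, "->", "OK" if ok else "VIOLATION")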

  16. Independent component analysis: recent advances

    OpenAIRE

    Hyvärinen, Aapo

    2013-01-01

    Independent component analysis is a probabilistic method for learning a linear transform of a random vector. The goal is to find components that are maximally independent and non-Gaussian (non-normal). Its fundamental difference to classical multi-variate statistical methods is in the assumption of non-Gaussianity, which enables the identification of original, underlying components, in contrast to classical methods. The basic theory of independent component analysis was mainly developed in th...
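
    The basic idea can be demonstrated with the FastICA implementation in scikit-learn (a self-contained toy example; the sources and mixing matrix are synthetic):

        # Hedged sketch: unmixing a linear mixture of two non-Gaussian sources
        # with FastICA; recovery is up to permutation, scale, and sign.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 8, 2000)
        S = np.column_stack([np.sign(np.sin(3 * t)),       # square wave (non-Gaussian)
                             rng.laplace(size=t.size)])    # heavy-tailed source
        A = np.array([[1.0, 0.5], [0.4, 1.0]])             # unknown mixing matrix
        X = S @ A.T                                        # observed mixtures

        ica = FastICA(n_components=2, random_state=0)
        S_hat = ica.fit_transform(X)                       # recovered sources
        print("estimated mixing matrix:\n", np.round(ica.mixing_, 2))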

  17. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword. Preface. Computing Paradigms: Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading. Cloud Computing Fundamentals: Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact...

  18. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  19. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  20. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1) an intermediary step between any theoretical construct and its targeted empirical space and 2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  1. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  2. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an … Scientific and engineering applications (e.g., TeraGrid secure gateway). Collaborative … Encryption, privacy, protection from malicious software. Physical Layer.

  3. Statistics of Shared Components in Complex Component Systems

    Science.gov (United States)

    Mazzolini, Andrea; Gherardi, Marco; Caselle, Michele; Cosentino Lagomarsino, Marco; Osella, Matteo

    2018-04-01

    Many complex systems are modular. Such systems can be represented as "component systems," i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf's law. Such "laws" affect in a general and nontrivial way the component statistics, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can positively explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the "core" genome in bacteria.
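
    To make the null model concrete, here is a minimal, illustrative sketch (not the authors' code): system realizations are random draws from a component universe with Zipf-like abundances, and the occurrence distribution of shared components is tallied. All names and parameter values below are assumptions.

    ```python
    import random
    from collections import Counter

    # Minimal sketch of the null model described above (illustrative only):
    # each "realization" draws distinct components at random from a universe
    # whose abundances follow a Zipf-like law.

    random.seed(0)
    U = 1000                                            # size of the component universe
    weights = [1.0 / (rank + 1) for rank in range(U)]   # Zipf-like abundances

    def draw_realization(size):
        """Draw a set of distinct components, biased by abundance."""
        comps = set()
        while len(comps) < size:
            comps.add(random.choices(range(U), weights=weights, k=1)[0])
        return comps

    realizations = [draw_realization(50) for _ in range(200)]

    # Occurrence = number of realizations sharing a given component.
    occurrence = Counter(c for r in realizations for c in r)
    shared_by_k = Counter(occurrence.values())
    for k in sorted(shared_by_k):
        print(f"components shared by {k:3d} realizations: {shared_by_k[k]}")
    ```

    Deviations of real occurrence data from such a baseline are what the paper reads as signatures of functional constraints.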

  4. Statistics of Shared Components in Complex Component Systems

    Directory of Open Access Journals (Sweden)

    Andrea Mazzolini

    2018-04-01

    Full Text Available Many complex systems are modular. Such systems can be represented as “component systems,” i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf’s law. Such “laws” affect in a general and nontrivial way the component statistics, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can positively explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the “core” genome in bacteria.

  5. Unblockable Compositions of Software Components

    DEFF Research Database (Denmark)

    Dong, Ruzhen; Faber, Johannes; Liu, Zhiming

    2012-01-01

    We present a new automata-based interface model describing the interaction behavior of software components. Contrary to earlier component- or interface-based approaches, the interface model we propose specifies all the non-blockable interaction behaviors of a component with any environment… composition of interface models preserves unblockable sequences of provided services.

  6. Probabilistic Structural Analysis Methods for select space propulsion system components (PSAM). Volume 2: Literature surveys of critical Space Shuttle main engine components

    Science.gov (United States)

    Rajagopal, K. R.

    1992-01-01

    The technical effort and computer code development are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis. Volume 2 is a summary of critical SSME components.

  7. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations of, and developments in, transmission and emission computer tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of computer emission tomography (ECT), unknown in Poland, is described. An evaluation of two methods of ECT, namely positron and single-photon emission tomography, is made. (author)

  8. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of state-of-the-art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  9. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields face large computing challenges. In recent years, PCs have achieved performance comparable to high-end UNIX workstations at a small fraction of the price. We review the development and broad application of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing.

  10. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and of modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and in a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry also has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  11. Component Reification in Systems Modelling

    DEFF Research Database (Denmark)

    Bendisposto, Jens; Hallerstede, Stefan

    When modelling concurrent or distributed systems in Event-B, we often obtain models where the structure of the connected components is specified by constants. Their behaviour is specified by the non-deterministic choice of event parameters for events that operate on shared variables. From a certain… These components may still refer to shared variables. Events of these components should not refer to the constants specifying the structure. The non-deterministic choice between these components should not be via parameters. We say the components are reified. We need to address how the reified components get reflected into the original model. This reflection should indicate the constraints on how to connect the components.

  12. Component Composition Using Feature Models

    DEFF Research Database (Denmark)

    Eichberg, Michael; Klose, Karl; Mitschke, Ralf

    2010-01-01

    …interface description languages. If this variability is relevant when selecting a matching component, then human interaction is required to decide which components can be bound. We propose to use feature models for making this variability explicit and (re-)enabling automatic component binding. In our approach, feature models are one part of service specifications. This enables one to declaratively specify which service variant is provided by a component. By referring to a service's variation points, a component that requires a specific service can list the requirements on the desired variant. Using these specifications, a component environment can then determine if a binding of the components exists that satisfies all requirements. The prototypical environment Columbus demonstrates the feasibility of the approach.
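
    The declarative matching described in the abstract can be illustrated with a small sketch (not the Columbus implementation); the service names and feature sets below are invented for illustration.

    ```python
    # Illustrative sketch: treat a service's variation points as feature sets,
    # declare which variant each component provides, and check whether a
    # requirer's constraints can be satisfied automatically.

    provided = {
        "StorageService": {"persistent", "transactional"},
        "LogService": {"buffered"},
    }

    required = {
        "StorageService": {"persistent"},   # requirer insists on persistence
    }

    def binding_exists(provided, required):
        """A binding exists if every required variant is a subset of the
        features the matching provided service actually offers."""
        return all(
            service in provided and features <= provided[service]
            for service, features in required.items()
        )

    print(binding_exists(provided, required))   # True
    ```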

  13. Bayesian component separation: The Planck experience

    Science.gov (United States)

    Wehus, Ingunn Kathrine; Eriksen, Hans Kristian

    2018-05-01

    Bayesian component separation techniques have played a central role in the data reduction process of Planck. The most important strength of this approach is its global nature, in which a parametric and physical model is fitted to the data. Such physical modeling allows the user to constrain very general data models, and jointly probe cosmological, astrophysical and instrumental parameters. This approach also supports statistically robust goodness-of-fit tests in terms of data-minus-model residual maps, which are essential for identifying residual systematic effects in the data. The main challenges are high code complexity and computational cost. Whether or not these costs are justified for a given experiment depends on its final uncertainty budget. We therefore predict that the importance of Bayesian component separation techniques is likely to increase with time for intensity mapping experiments, similar to what has happened in the CMB field, as observational techniques mature, and their overall sensitivity improves.
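
    A toy, purely linear version of the idea can illustrate the fit-and-residual workflow; the frequency channels, spectral scalings, and noise model below are assumptions for the sketch, not Planck's actual model.

    ```python
    import numpy as np

    # Toy linear component separation (illustrative only): model the
    # multifrequency data as d = A s + n, fit the component amplitudes s,
    # and inspect the data-minus-model residual used as a goodness-of-fit
    # diagnostic.

    rng = np.random.default_rng(1)
    freqs = np.array([30.0, 100.0, 353.0])        # GHz, assumed channels
    A = np.column_stack([np.ones(3),              # CMB: flat in these units
                         (freqs / 30.0) ** -3.0]) # toy synchrotron scaling

    s_true = np.array([50.0, 20.0])               # assumed amplitudes (uK)
    d = A @ s_true + rng.normal(0.0, 1.0, size=3) # data with white noise

    # Maximum-likelihood amplitudes (uniform priors, white noise):
    s_hat, *_ = np.linalg.lstsq(A, d, rcond=None)
    residual = d - A @ s_hat                      # data-minus-model map

    print("estimated amplitudes:", s_hat)
    print("data-minus-model residual:", residual)
    ```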

  14. Translator for Optimizing Fluid-Handling Components

    Science.gov (United States)

    Landon, Mark; Perry, Ernest

    2007-01-01

    A software interface has been devised to facilitate optimization of the shapes of valves, elbows, fittings, and other components used to handle fluids under extreme conditions. This software interface translates data files generated by PLOT3D (a NASA grid-based plotting-and- data-display program) and by computational fluid dynamics (CFD) software into a format in which the files can be read by Sculptor, which is a shape-deformation- and-optimization program. Sculptor enables the user to interactively, smoothly, and arbitrarily deform the surfaces and volumes in two- and three-dimensional CFD models. Sculptor also includes design-optimization algorithms that can be used in conjunction with the arbitrary-shape-deformation components to perform automatic shape optimization. In the optimization process, the output of the CFD software is used as feedback while the optimizer strives to satisfy design criteria that could include, for example, improved values of pressure loss, velocity, flow quality, mass flow, etc.

  15. The numerical simulation of accelerator components

    International Nuclear Information System (INIS)

    Herrmannsfeldt, W.B.; Hanerfeld, H.

    1987-05-01

    The techniques of the numerical simulation of plasmas can be readily applied to problems in accelerator physics. Because the problems usually involve a single component ''plasma,'' and times that are at most, a few plasma oscillation periods, it is frequently possible to make very good simulations with relatively modest computation resources. We will discuss the methods and illustrate them with several examples. One of the more powerful techniques of understanding the motion of charged particles is to view computer-generated motion pictures. We will show several little movie strips to illustrate the discussions. The examples will be drawn from the application areas of Heavy Ion Fusion, electron-positron linear colliders and injectors for free-electron lasers. 13 refs., 10 figs., 2 tabs

  16. Conductivity of two-component systems

    Energy Technology Data Exchange (ETDEWEB)

    Kuijper, A. de; Hofman, J.P.; Waal, J.A. de [Shell Research BV, Rijswijk (Netherlands). Koninklijke/Shell Exploratie en Productie Lab.; Sandor, R.K.J. [Shell International Petroleum Maatschappij, The Hague (Netherlands)

    1996-01-01

    The authors present measurements and computer simulation results on the electrical conductivity of nonconducting grains embedded in a conductive brine host. The shapes of the grains ranged from prolate-ellipsoidal (with an axis ratio of 5:1) through spherical to oblate-ellipsoidal (with an axis ratio of 1:5). The conductivity was studied as a function of porosity and packing, and Archie's cementation exponent was found to depend on porosity. They used spatially regular and random configurations with aligned and nonaligned packings. The experimental results agree well with the computer simulation data. This data set will enable extensive tests of models for calculating the anisotropic conductivity of two-component systems.
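
    The Archie relation behind the cementation exponent mentioned above is sigma = sigma_brine * phi**m; a minimal numerical illustration follows (all values are made up):

    ```python
    import math

    # Archie's relation: effective conductivity of a brine-saturated rock,
    # sigma = sigma_brine * phi**m, with m the cementation exponent.

    sigma_brine = 5.0     # S/m, conductivity of the brine host (assumed)
    phi = 0.25            # porosity (assumed)
    m = 1.8               # cementation exponent (assumed)

    sigma_rock = sigma_brine * phi ** m
    print(f"effective conductivity: {sigma_rock:.3f} S/m")

    # Inverting the relation to recover m from a measured conductivity:
    m_est = math.log(sigma_rock / sigma_brine) / math.log(phi)
    print(f"recovered cementation exponent: {m_est:.2f}")
    ```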

  17. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  18. Fault tree analysis with multistate components

    International Nuclear Information System (INIS)

    Caldarola, L.

    1979-02-01

    A general analytical theory has been developed which allows one to calculate the occurrence probability of the top event of a fault tree with multistate (more than two states) components. It is shown that, in order to correctly describe a system with multistate components, a special type of Boolean algebra is required. This is called 'Boolean algebra with restrictions on variables' and its basic rules are the same as those of the traditional Boolean algebra, with some additional restrictions on the variables. These restrictions are extensively discussed in the paper. Important features of the method are the identification of the complete base and of the smallest irredundant base of a Boolean function which does not necessarily need to be coherent. It is shown that the identification of the complete base of a Boolean function requires the application of some algorithms which are not used in today's computer programmes for fault tree analysis. The problem of statistical dependence among primary components is discussed. The paper includes a small demonstrative example to illustrate the method. The example also includes statistically dependent components. (orig.)
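
    For intuition, here is a brute-force sketch of a top-event probability with multistate components (the paper's analytical method is far more sophisticated; the components, states, and probabilities below are invented):

    ```python
    from itertools import product

    # Each component is in exactly one of its states, so its probabilities
    # sum to 1 -- this exclusivity is the "restriction on variables".

    states = {
        "pump":  {"ok": 0.90, "degraded": 0.08, "failed": 0.02},
        "valve": {"ok": 0.95, "degraded": 0.04, "failed": 0.01},
    }

    def top_event(assignment):
        """Top event: pump failed, or pump degraded while valve is not ok."""
        return (assignment["pump"] == "failed"
                or (assignment["pump"] == "degraded"
                    and assignment["valve"] != "ok"))

    names = list(states)
    p_top = 0.0
    for combo in product(*(states[n] for n in names)):   # all state tuples
        assignment = dict(zip(names, combo))
        p = 1.0
        for n in names:
            p *= states[n][assignment[n]]
        if top_event(assignment):
            p_top += p

    print(f"P(top event) = {p_top:.4f}")   # 0.02 + 0.08 * 0.05 = 0.024
    ```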

  19. Methods of measuring residual stresses in components

    International Nuclear Information System (INIS)

    Rossini, N.S.; Dassisti, M.; Benyounis, K.Y.; Olabi, A.G.

    2012-01-01

    Highlights: ► Defines the different methods of measuring residual stresses in manufactured components. ► Comprehensive study of hole drilling, neutron diffraction and other techniques. ► Evaluates the advantages and disadvantages of each method. ► Advises the reader on the appropriate method to use. -- Abstract: Residual stresses occur in many manufactured structures and components. A large number of investigations have been carried out to study this phenomenon and its effect on the mechanical characteristics of these components. Over the years, different methods have been developed to measure residual stress in different types of components in order to obtain reliable assessments. The various specific methods have evolved over several decades and their practical applications have greatly benefited from the development of complementary technologies, notably in material cutting, full-field deformation measurement techniques, numerical methods and computing power. These complementary technologies have stimulated advances not only in measurement accuracy and reliability, but also in range of application; much greater detail in residual stress measurement is now available. This paper aims to classify the different residual stress measurement methods and to provide an overview of some of the recent advances in this area, to help researchers select a technique from among the destructive, semi-destructive and non-destructive options, depending on the application and on the availability of those techniques. For each method, the scope, physical limitations, advantages and disadvantages are summarized. Finally, the paper indicates some promising directions for future developments.

  20. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
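
    The solution-vector idea can be sketched as follows; this is an illustration of the described ordering-and-update scheme under assumed toy physics, not PCTAP itself:

    ```python
    # Order components by inlet dependency, then update each per time step.

    class Component:
        def __init__(self, name, upstream=None):
            self.name = name
            self.upstream = upstream      # component feeding this one's inlet
            self.outlet_temp = 20.0       # degrees C, arbitrary initial state

        def step(self, dt):
            """Outlet function: relax toward the upstream outlet temperature."""
            inlet = self.upstream.outlet_temp if self.upstream else 60.0
            self.outlet_temp += (inlet - self.outlet_temp) * 0.5 * dt

    def build_solution_vector(components):
        """List components in order of inlet dependency (simple topo sort)."""
        ordered, seen = [], set()
        def visit(c):
            if c.name in seen:
                return
            if c.upstream is not None:
                visit(c.upstream)
            seen.add(c.name)
            ordered.append(c)
        for c in components:
            visit(c)
        return ordered

    tube = Component("tube")
    cold_plate = Component("cold_plate", upstream=tube)
    heat_exchanger = Component("heat_exchanger", upstream=cold_plate)

    solution_vector = build_solution_vector([heat_exchanger, tube, cold_plate])
    for _ in range(100):                      # 100 time steps of 0.1 s
        for comp in solution_vector:
            comp.step(0.1)

    print([f"{c.name}: {c.outlet_temp:.1f} C" for c in solution_vector])
    ```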

  1. System for Cooling of Electronic Components

    Science.gov (United States)

    Vasil'ev, L. L.; Grakovich, L. P.; Dragun, L. A.; Zhuravlev, A. S.; Olekhnovich, V. A.; Rabetskii, M. I.

    2017-01-01

    Results of computational and experimental investigations of heat pipes having a predetermined thermal resistance, and of a system based on these pipes for air cooling of electronic components and diode assemblies of lasers, are presented. An efficient compact cooling system comprising heat pipes with an evaporator having a capillary coating of caked copper powder and a condenser having developed outer finning has been devised. This system makes it possible to remove, to the ambient air, a heat flow of more than 300 W at a temperature of 40-50°C.

  2. Thermochemical modelling of multi-component systems

    International Nuclear Information System (INIS)

    Sundman, B.; Gueneau, C.

    2015-01-01

    Computational thermodynamics, also known as the Calphad method, is a standard tool in industry for the development of materials and the improvement of processes, and there is intense scientific development of new models and databases. The calculations are based on thermodynamic models of the Gibbs energy of each phase as a function of temperature, pressure and constitution. Model parameters are stored in databases that are developed in an international scientific collaboration. In this way, consistent and reliable data for many properties like heat capacity, chemical potentials, solubilities, etc. can be obtained for multi-component systems. A brief introduction to this technique is given here and references to more extensive documentation are provided. (authors)

  3. Lifetime evaluation of Bohunice NPP components

    International Nuclear Information System (INIS)

    Kupca, L.

    2001-01-01

    The paper discusses some aspects of the lifetime evaluation program for the main primary components at Bohunice NPP, which is performed by the Nuclear Power Plant Research Institute (NPPRI) Trnava in cooperation with Bohunice and the other organizations involved. The facts presented here are based on the NPPRI research report which is regularly issued after each reactor fuel campaign, under the conditions of the project resulting from the contract between NPPRI and Bohunice NPP. For the calculations, computer codes adapted (or written) by NPPRI have been used, and only brief, concluding results are presented here in tables (figures). (authors)

  4. Parallel Computing in SCALE

    International Nuclear Information System (INIS)

    DeHart, Mark D.; Williams, Mark L.; Bowman, Stephen M.

    2010-01-01

    …activities has been developed to provide an integrated framework for future methods development. Some of the major components of the SCALE parallel computing development plan are parallelization and multithreading of computationally intensive modules and redesign of the fundamental SCALE computational architecture.

  5. 78 FR 64977 - Certain Computer and Computer Peripheral Devices, and Components Thereof, and Products Containing...

    Science.gov (United States)

    2013-10-30

    ... obvious in view of, three pieces of prior art. The '623 respondents also challenge the ALJ's finding that... or obvious in view of the prior art. They also make additional non-infringement arguments for the... Telephone Lines, Inv. No. 337-TA-360, USITC Pub. No. 2843, Comm'n Op. (December 1994). If the Commission...

  6. 78 FR 78382 - Certain Computers and Computer Peripheral Devices, and Components Thereof, and Products...

    Science.gov (United States)

    2013-12-26

    ... 152-55. The ALJ rejected TPL's domestic-industry showing based upon OnSpec Electronic, Inc.'s research... topics, and briefing from the parties and written submissions on remedy, the public interest, and bonding... adopts the respondents' proposed construction of ``accessible in parallel.'' The Commission therefore...

  7. Modern computer hardware and the role of central computing facilities in particle physics

    International Nuclear Information System (INIS)

    Zacharov, V.

    1981-01-01

    Important recent changes in the hardware technology of computer system components are reviewed, and the impact of these changes assessed on the present and future pattern of computing in particle physics. The place of central computing facilities is particularly examined, to answer the important question as to what, if anything, should be their future role. Parallelism in computing system components is considered to be an important property that can be exploited with advantage. The paper includes a short discussion of the position of communications and network technology in modern computer systems. (orig.)

  8. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  9. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from classical information theory.

  10. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from classical information theory.

  11. A Survey on PageRank Computing

    OpenAIRE

    Berkhin, Pavel

    2005-01-01

    This survey reviews the research related to PageRank computing. Components of a PageRank vector serve as authority weights for web pages independent of their textual content, solely based on the hyperlink structure of the web. PageRank is typically used as a web search ranking component. This defines the importance of the model and the data structures that underlie PageRank processing. Computing even a single PageRank is a difficult computational task. Computing many PageRanks is a much more…
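
    For reference, the textbook power-iteration formulation of PageRank looks like this (a generic sketch on a tiny link graph, not any particular algorithm from the survey):

    ```python
    import numpy as np

    links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # page -> outlinks
    n, d = 4, 0.85                                 # pages, damping factor

    # Build the column-stochastic transition matrix of the random surfer.
    M = np.zeros((n, n))
    for src, outs in links.items():
        for dst in outs:
            M[dst, src] = 1.0 / len(outs)

    r = np.full(n, 1.0 / n)                        # uniform starting vector
    for _ in range(100):
        r_next = (1 - d) / n + d * M @ r           # one power iteration
        if np.abs(r_next - r).sum() < 1e-12:       # L1 convergence test
            break
        r = r_next

    print("PageRank vector:", r.round(4))
    ```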

  12. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1991-01-01

    Reliability data are essential in probabilistic safety assessment, with component reliability parameters being particularly important. Plant-specific component failure data would be most appropriate, but such data are rather limited. However, similar components are used in different designs, so generic data, that is, all data that are not specific to the plant being analyzed but relate to components more generally, are important. The International Atomic Energy Agency has compiled the Generic Component Reliability Data Base from data available in the open literature. It is part of the IAEA computer code package for fault/event tree analysis. The Data Base contains 1010 different records, including most of the components used in probabilistic safety analyses of nuclear power plants. The data base input was quality controlled and the data sources noted. The data compilation procedure and the problems associated with using generic data are explained. (UK)

  13. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two… to a polynomial-time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show… here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority…

  14. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  15. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. Achieving this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  16. Electromagnetic computations for fusion devices

    International Nuclear Information System (INIS)

    Turner, L.R.

    1989-09-01

    Among the difficulties in making nuclear fusion a useful energy source, two important ones are producing the magnetic fields needed to drive and confine the plasma, and controlling the eddy currents induced in electrically conducting components by changing fields. All over the world, researchers are developing electromagnetic codes and employing them to compute electromagnetic effects. Ferromagnetic components of a fusion reactor introduce field distortions. Eddy currents are induced in the vacuum vessel, blanket and other torus components of a tokamak when the plasma current disrupts. These eddy currents lead to large forces, and 3-D codes are being developed to study the currents and forces. 35 refs., 6 figs

  17. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of 'computational artifact'.

  18. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security. Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing security…

  19. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services…

  20. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  2. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  3. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important, both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helping…

  4. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a computation…

  5. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g., in modeling camera motion), stick figures (e.g., for activity recognition), subspace comparisons (e.g., in face recognition), symmetric positive-definite matrices (e.g., in diffusion tensor imaging), and function-spaces (e.g., in studying shapes of closed contours). The book illustrates Riemannian computing theory with applications in computer vision, machine learning, and robotics, with emphasis on algorithmic advances that will allow re-application in other…

  6. Imprecise system reliability and component importance based on survival signature

    International Nuclear Information System (INIS)

    Feng, Geng; Patelli, Edoardo; Beer, Michael; Coolen, Frank P.A.

    2016-01-01

    The concept of the survival signature has recently attracted increasing attention for performing reliability analysis on systems with multiple types of components. It opens a new pathway to a structured approach with high computational efficiency, based on a complete probabilistic description of the system. In practical applications, however, some of the parameters of the system might not be defined completely due to limited data, which implies the need to take imprecision in the component specifications into account. This paper presents a methodology that includes this imprecision explicitly, leading to upper and lower bounds on the survival function of the system. In addition, the approach introduces novel and efficient component importance measures. By implementing a relative importance index for each component, without or with imprecision, the most critical component in the system can be identified as a function of the service time of the system. A simulation method based on the survival signature is introduced to deal with imprecision within components; it is precise and efficient. A numerical example is presented to show the applicability of the approach. - Highlights: • The survival signature is a novel way to address system reliability and component importance. • High computational efficiency based on a complete description of the system. • Imprecision is included explicitly, leading to bounds on the survival function. • A novel relative importance index is proposed as an importance measure. • Critical components can be identified depending on the service time of the system.
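
    For a system with a single component type, the survival-signature formula reduces to a simple sum, and an interval-valued component reliability yields bounds on the survival function; a minimal sketch with made-up numbers (a toy 2-out-of-3 system, not the paper's examples):

    ```python
    from math import comb

    # Single component type:
    #   P(system works) = sum_l Phi(l) * C(m, l) * R**l * (1 - R)**(m - l)
    # Because Phi is non-decreasing in l here, evaluating at the endpoints
    # of an imprecise component reliability [R_lo, R_hi] gives bounds.

    m = 3
    Phi = {0: 0.0, 1: 0.0, 2: 1.0, 3: 1.0}   # 2-out-of-3 survival signature

    def survival(R):
        return sum(Phi[l] * comb(m, l) * R**l * (1 - R)**(m - l)
                   for l in range(m + 1))

    R_lo, R_hi = 0.85, 0.95                   # imprecise component reliability
    print(f"lower bound: {survival(R_lo):.4f}")
    print(f"upper bound: {survival(R_hi):.4f}")
    ```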

  7. Parallel PDE-Based Simulations Using the Common Component Architecture

    International Nuclear Information System (INIS)

    McInnes, Lois C.; Allan, Benjamin A.; Armstrong, Robert; Benson, Steven J.; Bernholdt, David E.; Dahlgren, Tamara L.; Diachin, Lori; Krishnan, Manoj Kumar; Kohl, James A.; Larson, J. Walter; Lefantzi, Sophia; Nieplocha, Jarek; Norris, Boyana; Parker, Steven G.; Ray, Jaideep; Zhou, Shujia

    2006-01-01

    The complexity of parallel PDE-based simulations continues to increase as multimodel, multiphysics, and multi-institutional projects become widespread. A goal of component-based software engineering in such large-scale simulations is to help manage this complexity by enabling better interoperability among various codes that have been independently developed by different groups. The Common Component Architecture (CCA) Forum is defining a component architecture specification to address the challenges of high-performance scientific computing. In addition, several execution frameworks, supporting infrastructure, and general-purpose components are being developed. Furthermore, this group is collaborating with others in the high-performance computing community to design suites of domain-specific component interface specifications and underlying implementations. This chapter discusses recent work on leveraging these CCA efforts in parallel PDE-based simulations involving accelerator design, climate modeling, combustion, and accidental fires and explosions. We explain how component technology helps to address the different challenges posed by each of these applications, and we highlight how component interfaces built on existing parallel toolkits facilitate the reuse of software for parallel mesh manipulation, discretization, linear algebra, integration, optimization, and parallel data redistribution. We also present performance data to demonstrate the suitability of this approach, and we discuss strategies for applying component technologies to both new and existing applications.

  8. SBA Network Components & Software Inventory

    Data.gov (United States)

    Small Business Administration — SBA’s Network Components & Software Inventory contains a complete inventory of all devices connected to SBA’s network including workstations, servers, routers,...

  9. Optical computer switching network

    Science.gov (United States)

    Clymer, B.; Collins, S. A., Jr.

    1985-01-01

    The design for an optical switching system for minicomputers that uses an optical spatial light modulator such as a Hughes liquid crystal light valve is presented. The switching system is designed to connect 80 minicomputers coupled to the switching system by optical fibers. The system has two major parts: the connection system that connects the data lines by which the computers communicate via a two-dimensional optical matrix array and the control system that controls which computers are connected. The basic system, the matrix-based connecting system, and some of the optical components to be used are described. Finally, the details of the control system are given and illustrated with a discussion of timing.

  10. Computer applications in thermochemistry

    International Nuclear Information System (INIS)

    Vana Varamban, S.

    1996-01-01

    Knowledge of equilibrium is needed in many practical situations. Simple stoichiometric calculations can be performed with hand calculators, but multi-component, multi-phase gas-solid chemical equilibrium calculations are far beyond conventional devices and methods, and iterative techniques have to be resorted to. Such problems are most elegantly handled by modern computers. This report demonstrates the possible use of computers for chemical equilibrium calculations in the fields of thermochemistry and chemical metallurgy. Four modules are explained: fitting experimental Cp data to generate the thermal functions, performing equilibrium calculations under defined conditions, preparing the elaborate input to the equilibrium calculation, and analysing the calculated results graphically. The principles of thermochemical calculations are briefly described. An extensive input guide is given. Several illustrations are included to aid understanding and usage. (author)
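
    As a much-reduced illustration of the kind of iterative equilibrium calculation described (a single gas-phase reaction rather than a multi-component, multi-phase system; the equilibrium constant below is an assumed value, not tabulated data):

    ```python
    # N2 + 3 H2 <=> 2 NH3 at 1 bar total pressure: solve for the extent of
    # reaction x by bisection, starting from a stoichiometric 1:3 feed.

    Kp = 6.0e-2   # assumed mole-fraction equilibrium constant

    def reaction_quotient(x):
        """Mole-fraction quotient for extent x in (0, 1)."""
        n_n2, n_h2, n_nh3 = 1 - x, 3 * (1 - x), 2 * x
        n_tot = n_n2 + n_h2 + n_nh3
        y_n2, y_h2, y_nh3 = n_n2 / n_tot, n_h2 / n_tot, n_nh3 / n_tot
        return y_nh3**2 / (y_n2 * y_h2**3)

    lo, hi = 1e-9, 1 - 1e-9
    for _ in range(100):            # Q(x) grows monotonically with x,
        mid = 0.5 * (lo + hi)       # so bisection brackets the root
        if reaction_quotient(mid) < Kp:
            lo = mid
        else:
            hi = mid

    print(f"equilibrium extent of reaction: {0.5 * (lo + hi):.4f}")
    ```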

  11. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. … which captain gets the option to decide whether to field first or bat first … may of course not be fair, in the sense that the team which wins … describes two methods of drawing a random number between 0 and 1.

  12. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann's early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved to…

  13. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...
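
    A rough order-of-magnitude check on the figures quoted above (the required capacity is an assumed round number within the stated "millions of SI95" range):

    ```python
    # "Millions of SpecInt95" of CPU power, delivered by ~100-SI95 PCs (2005).
    required_si95 = 2_000_000      # assumed 2 million SI95 for one experiment
    pc_si95_2005 = 100             # projected per-PC capacity by 2005
    print(f"PCs needed: {required_si95 / pc_si95_2005:,.0f}")   # ~20,000
    ```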

  14. Quantum Computation

    Indian Academy of Sciences (India)

    Quantum Computation - Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article, Resonance - Journal of Science Education, Volume 16, Issue 9, September 2011, pp. 821-835.

  15. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  16. Computer Recreations.

    Science.gov (United States)

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)

  17. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  18. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these sensitivities have been determined using bump-and-revalue, but the increasing magnitude of these computations does…

  19. Optical Computing

    Indian Academy of Sciences (India)

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  20. Comparison of Component Frameworks for Real-Time Embedded Systems

    Czech Academy of Sciences Publication Activity Database

    Pop, T.; Hnětynka, P.; Hošek, P.; Malohlava, M.; Bureš, Tomáš

    2014-01-01

    Roč. 40, č. 1 (2014), s. 127-170 ISSN 0219-1377 Grant - others:GA AV ČR(CZ) GAP202/11/0312; GA UK(CZ) Project 378111; UK(CZ) SVV-2013- 267312 Keywords : component-based development * component frameworks * real-time and embedded systems Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.782, year: 2014

  1. Outline of a novel architecture for cortical computation

    OpenAIRE

    Majumdar, Kaushik

    2007-01-01

    In this paper a novel architecture for cortical computation has been proposed. This architecture is composed of computing paths consisting of neurons and synapses only. These paths have been decomposed into lateral, longitudinal and vertical components. Cortical computation has then been decomposed into lateral computation (LaC), longitudinal computation (LoC) and vertical computation (VeC). It has been shown that various loop structures in the cortical circuit play important roles in cortica...

  2. Reconditioning of Computers

    DEFF Research Database (Denmark)

    Bundgaard, Anja Marie

    2017-01-01

    Fast technological development and short innovation cycles have resulted in shorter life spans for certain consumer electronics. Reconditioning is proposed as one of the strategies to close the loops in the circular economy and increase the lifespan of products and components. The paper therefore...... qualitative research interviews. The case study of the contextual barriers indicated that trust between the buyer and the seller of the used computers was important for a viable business. If trust was not in place, it could be a potential barrier. Furthermore, economy obsolescence and a lack of influence...

  3. Components selection for ageing management

    International Nuclear Information System (INIS)

    Mingiuc, C.; Vidican, D.

    2002-01-01

    Full text: The paper presents a synthesis of the methods and activities used for the selection of critical components to assure plant safety and availability (as an electricity supplier). The main criteria for selection and the screening process are presented. Different categories of maintenance (condition-oriented, scheduled or corrective) are then applied to the resulting categories of components, as a function of their importance and of the financial effort necessary to fulfil the task. 1. Systems and components screening for plant safety assurance. For the selection of systems from the safety point of view, it was first necessary to define the systems that are dangerous in case of failure (mainly by rupture/release of radioactivity) and the safety systems that have to mitigate the effects. This is done on the basis of the accident analyses (from the Safety Report). The four basic safety principles were also taken into account: 'reactor shutdown; residual heat removal; confinement of radioactive products; monitoring of NPP status in normal and accident conditions'. The following step is to establish the safety support systems that must act to assure the operation of the main safety systems. This can be done based on engineering judgement or on a PSA Level I analysis. Finally, chains of the support systems that have to work, down to the primary systems, are established. For the selection of critical components, a Failure Mode and Effect Analysis (FMEA) was carried out, considering the effects of component failures on the system safety function. 2. Systems and components screening for plant availability assurance. The work was carried out in two steps: systems screening and components screening. The systems screening included: analysis of the plant systems list and definition of those which clearly have to run continuously to assure nominal power; preparation of a complex diagram defining the interdependences between the systems (e.g. PHT and auxiliaries, moderator and auxiliaries, plant electrical diagram); fill of special

  4. Incorporation of passive components aging into PRAs

    International Nuclear Information System (INIS)

    Phillips, J.H.; Roesener, W.S.; Magleby, H.L.; Geidl, V.

    1993-01-01

    The probabilistic risk assessments being developed at most nuclear power plants to calculate the risk of core damage generally focus on the possible failure of active components. The possible failure of passive components is given little consideration. We are developing a method for selecting risk-significant passive components and including them in probabilistic risk assessments. We demonstrated the method by selecting a weld in the auxiliary feedwater system. The selection of this component was based on expert judgement of the likelihood of failure and on an estimate of the consequence of component failure to plant safety. We then used the PRAISE computer code to perform a probabilistic structural analysis to calculate the probability that crack growth due to aging would cause the weld to fail. The calculation included the effects of mechanical loads and thermal transients considered in the design and the effects of thermal cycling caused by a leaking check valve. We modified an existing probabilistic risk assessment (NUREG-1150 plant) to include the possible failure of the auxiliary feedwater weld, and then we used the weld failure probability as input to the modified probabilistic risk assessment to calculate the change in plant risk with time. The results showed that if the failure probability of the selected weld is high, the effect on plant risk is significant. However, this particular calculation showed a very low weld failure probability and no change in plant risk for the 48 years of service analyzed. The success of this demonstration shows that this method could be applied to nuclear power plants. (orig.)

  5. On several computer-oriented studies

    International Nuclear Information System (INIS)

    Takahashi, Ryoichi

    1982-01-01

    To utilize fully digital techniques for solving various difficult problems, nuclear engineers have recourse to computer-oriented approaches. The current trends in such fields as optimization theory, control system theory and computational fluid dynamics reflect the ability to use computers to obtain numerical solutions to complex problems. Special-purpose computers will be used as integral parts of problem-solving systems to process large amounts of data, to implement control laws and even to produce decisions. Many problem-solving systems designed in the future will incorporate special-purpose computers as system components. The optimum use of computer systems is discussed: why energy models, energy data bases and a big computer are used; why the economic process-computer will be allocated to nuclear plants in the future; why the super-computer should be demonstrated at once. (Mori, K.)

  6. APS beamline standard components handbook

    International Nuclear Information System (INIS)

    Kuzay, T.M.

    1992-01-01

    It is clear that most Advanced Photon Source (APS) Collaborative Access Team (CAT) members would like to concentrate on designing specialized equipment related to their scientific programs rather than on routine or standard beamline components. Thus, an effort is in progress at the APS to identify standard and modular components of APS beamlines. Identifying standard components is a nontrivial task because these components should support diverse beamline objectives. To assist with this effort, the APS has obtained advice and help from a Beamline Standardization and Modularization Committee consisting of experts in beamline design, construction, and operation. The staff of the Experimental Facilities Division identified various components thought to be standard items for beamlines, regardless of the specific scientific objective of a particular beamline. A generic beamline layout formed the basis for this identification. This layout is based on a double-crystal monochromator as the first optical element, with the possibility of other elements to follow. Pre-engineering designs were then made of the identified standard components. The Beamline Standardization and Modularization Committee has reviewed these designs and provided very useful input regarding the specifications of these components. We realize that there will be other configurations that may require special or modified components. This Handbook in its current version (1.1) contains descriptions, specifications, and pre-engineering design drawings of these standard components. In the future, the APS plans to add engineering drawings of identified standard beamline components. Use of standard components should result in major cost reductions for CATs in the areas of beamline design and construction

  7. Efficient transfer of sensitivity information in multi-component models

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Rabiti, Cristian

    2011-01-01

    In support of adjoint-based sensitivity analysis, this manuscript presents a new method to efficiently transfer adjoint information between components in a multi-component model, in which the output of one component is passed as input to the next component. Often, one is interested in evaluating the sensitivities of the responses calculated by the last component to the inputs of the first component in the overall model. The presented method has two advantages over existing methods, which may be classified into two broad categories: brute force-type methods and amalgamated-type methods. First, the presented method determines the minimum number of adjoint evaluations for each component, as opposed to the brute force-type methods, which require full evaluation of all sensitivities for all responses calculated by each component in the overall model and prove computationally prohibitive for realistic problems. Second, the new method treats each component as a black box, as opposed to amalgamated-type methods, which require explicit knowledge of the system of equations associated with each component in order to reach the minimum number of adjoint evaluations. (author)
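
    For orientation, the chain-rule identity behind such methods can be written as follows (the notation here is ours, not the manuscript's): for a chain y_0 = x, y_k = f_k(y_{k-1}) with response R = y_n, the end-to-end sensitivity is a product of component Jacobians, and the adjoint recursion evaluates that product backwards, one backward sweep per response rather than per input.

    ```latex
    \frac{\mathrm{d}R}{\mathrm{d}x}
      = \frac{\partial f_n}{\partial y_{n-1}}\,
        \frac{\partial f_{n-1}}{\partial y_{n-2}} \cdots
        \frac{\partial f_1}{\partial y_0},
    \qquad
    \lambda_{k-1} = \left(\frac{\partial f_k}{\partial y_{k-1}}\right)^{\!\top} \lambda_k,
    \quad k = n, \dots, 1,
    ```

    with the adjoint vector λ_n seeded by the response of interest; the number of backward sweeps is set by the number of responses, which is the economy the abstract describes.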

  8. Secure coupling of hardware components

    NARCIS (Netherlands)

    Hoepman, J.H.; Joosten, H.J.M.; Knobbe, J.W.

    2011-01-01

    A method and a system for securing communication between at least a first and a second hardware components of a mobile device is described. The method includes establishing a first shared secret between the first and the second hardware components during an initialization of the mobile device and,

  9. Generic component failure data base

    International Nuclear Information System (INIS)

    Eide, S.A.; Calley, M.B.

    1992-01-01

    This report discusses a comprehensive generic component failure data base which has been developed for light water reactor probabilistic risk assessments. The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) was used to generate component failure rates. Using this approach, most of the failure rates are based on actual plant data rather than existing estimates

  10. Active components in food supplements

    NARCIS (Netherlands)

    Siemelink M; Jansen EHJM; Piersma AH; Opperhuizen A; LEO

    2000-01-01

    The growing food supplement market, where supplements are both more diverse and more easily available (e.g. through the Internet), formed the backdrop to this inventory of the active components in food supplements. The safety of an increased intake of food components via supplements was also at issue

  11. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.

  12. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next, despite which there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of the capabilities of the principal available systems and to highlight one or two trends. A reference to the most recent full survey of computer algebra in relativity is given, together with brief descriptions of Maple, REDUCE, SHEEP and other applications. (author)

  13. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  14. Computed tomography

    International Nuclear Information System (INIS)

    Andre, M.; Resnick, D.

    1988-01-01

    Computed tomography (CT) has matured into a reliable and prominent tool for study of the musculoskeletal system. When it was introduced in 1973, it was unique in many ways and posed a challenge to interpretation. It is in these unique features, however, that its advantages lie in comparison with conventional techniques. These advantages will be described in a spectrum of important applications in orthopedics and rheumatology

  15. Computed radiography

    International Nuclear Information System (INIS)

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  16. Computational universes

    International Nuclear Information System (INIS)

    Svozil, Karl

    2005-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing 'in the mind' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view

  17. Principal Components as a Data Reduction and Noise Reduction Technique

    Science.gov (United States)

    Imhoff, M. L.; Campbell, W. J.

    1982-01-01

    The potential of principal components as a pipeline data reduction technique for thematic mapper data was assessed and principal components analysis and its transformation as a noise reduction technique was examined. Two primary factors were considered: (1) how might data reduction and noise reduction using the principal components transformation affect the extraction of accurate spectral classifications; and (2) what are the real savings in terms of computer processing and storage costs of using reduced data over the full 7-band TM complement. An area in central Pennsylvania was chosen for a study area. The image data for the project were collected using the Earth Resources Laboratory's thematic mapper simulator (TMS) instrument.
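
    A minimal sketch of the transformation in question, assuming generic multiband pixel data (the shapes, band count and values below are illustrative, not from the study):

    ```python
    import numpy as np

    def principal_components(pixels, n_keep):
        """Project multiband pixels (n_pixels x n_bands) onto the
        leading principal components of the band covariance matrix."""
        mean = pixels.mean(axis=0)
        centered = pixels - mean
        cov = np.cov(centered, rowvar=False)          # band covariance
        eigvals, eigvecs = np.linalg.eigh(cov)
        order = np.argsort(eigvals)[::-1]             # largest variance first
        basis = eigvecs[:, order[:n_keep]]
        scores = centered @ basis                     # reduced representation
        reconstruction = scores @ basis.T + mean      # noise-reduced bands
        return scores, reconstruction

    # Illustrative 7-band image flattened to pixels x bands
    rng = np.random.default_rng(0)
    pixels = rng.normal(size=(10_000, 7))
    scores, denoised = principal_components(pixels, n_keep=3)
    ```

    Keeping only the leading components gives the data reduction; reconstructing from them discards the low-variance directions where sensor noise tends to concentrate.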

  18. System reliability with correlated components: Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, A.C.W.M.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing

  19. System reliability with correlated components : Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, T.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing
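
    The Equivalent Planes method itself is not reproduced in these records; as a point of reference, a brute-force Monte Carlo estimate of a series system's failure probability with correlated standard-normal safety margins might look like the sketch below (the reliability indices and correlation matrix are invented for illustration).

    ```python
    import numpy as np

    def series_failure_probability(betas, corr, n_samples=1_000_000, seed=1):
        """P(any component fails) for margins Z ~ N(0, corr),
        component i failing when Z_i < -beta_i."""
        rng = np.random.default_rng(seed)
        z = rng.multivariate_normal(np.zeros(len(betas)), corr, size=n_samples)
        failures = (z < -np.asarray(betas)).any(axis=1)
        return failures.mean()

    betas = [3.0, 3.2, 2.8]                    # illustrative reliability indices
    corr = np.array([[1.0, 0.6, 0.3],
                     [0.6, 1.0, 0.6],
                     [0.3, 0.6, 1.0]])         # illustrative correlations
    print(series_failure_probability(betas, corr))
    ```

    For large systems of many correlated components this sampling becomes expensive, which is exactly the cost the abstract says motivates a cheaper approximation.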

  20. Component Fragility Research Program: Phase 1 component prioritization

    International Nuclear Information System (INIS)

    Holman, G.S.; Chou, C.K.

    1987-06-01

    Current probabilistic risk assessment (PRA) methods for nuclear power plants utilize seismic ''fragilities'' - probabilities of failure conditioned on the severity of seismic input motion - that are based largely on limited test data and on engineering judgment. Under the NRC Component Fragility Research Program (CFRP), the Lawrence Livermore National Laboratory (LLNL) has developed and demonstrated procedures for using test data to derive probabilistic fragility descriptions for mechanical and electrical components. As part of its CFRP activities, LLNL systematically identified and categorized components influencing plant safety in order to identify ''candidate'' components for future NRC testing. Plant systems relevant to safety were first identified; within each system components were then ranked according to their importance to overall system function and their anticipated seismic capacity. Highest priority for future testing was assigned to those ''very important'' components having ''low'' seismic capacity. This report describes the LLNL prioritization effort, which also included application of ''high-level'' qualification data as an alternate means of developing probabilistic fragility descriptions for PRA applications

  1. Conducting Computer Security Assessments at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    Computer security is increasingly recognized as a key component in nuclear security. As technology advances, it is anticipated that computer and computing systems will be used to an even greater degree in all aspects of plant operations including safety and security systems. A rigorous and comprehensive assessment process can assist in strengthening the effectiveness of the computer security programme. This publication outlines a methodology for conducting computer security assessments at nuclear facilities. The methodology can likewise be easily adapted to provide assessments at facilities with other radioactive materials

  2. Formalization in Component Based Development

    DEFF Research Database (Denmark)

    Holmegaard, Jens Peter; Knudsen, John; Makowski, Piotr

    2006-01-01

    We present a unifying conceptual framework for components, component interfaces, contracts and composition of components by focusing on the collection of properties or qualities that they must share. A specific property, such as signature, functionality, behaviour or timing, is an aspect. Each aspect...... may be specified in a formal language convenient for its purpose and, in principle, unrelated to languages for other aspects. Each aspect forms its own semantic domain, although a semantic domain may be parameterized by values derived from other aspects. The proposed conceptual framework is introduced...

  3. Computer networks in future accelerator control systems

    International Nuclear Information System (INIS)

    Dimmler, D.G.

    1977-03-01

    Some findings of a study concerning a computer based control and monitoring system for the proposed ISABELLE Intersecting Storage Accelerator are presented. Requirements for development and implementation of such a system are discussed. An architecture is proposed where the system components are partitioned along functional lines. Implementation of some conceptually significant components is reviewed

  4. Computability and Representations of the Zero Set

    NARCIS (Netherlands)

    P.J. Collins (Pieter)

    2008-01-01

    In this note we give a new representation for closed sets under which the robust zero set of a function is computable. We call this representation the component cover representation. The computation of the zero set is based on topological index theory, the most powerful tool for finding

  5. Introducing Cloud Computing Topics in Curricula

    Science.gov (United States)

    Chen, Ling; Liu, Yang; Gallagher, Marcus; Pailthorpe, Bernard; Sadiq, Shazia; Shen, Heng Tao; Li, Xue

    2012-01-01

    The demand for graduates with exposure in Cloud Computing is on the rise. For many educational institutions, the challenge is to decide on how to incorporate appropriate cloud-based technologies into their curricula. In this paper, we describe our design and experiences of integrating Cloud Computing components into seven third/fourth-year…

  6. How Many Separable Sources? Model Selection In Independent Components Analysis

    Science.gov (United States)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
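
    The full mixed ICA/PCA algorithm is not given here; the sketch below illustrates only the cross-validation idea for choosing a dimensionality, using scikit-learn's probabilistic-PCA log-likelihood score on synthetic data (all names and data are illustrative assumptions).

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Synthetic data: 3 informative directions plus isotropic noise
    latent = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 10))
    data = latent + 0.5 * rng.normal(size=(500, 10))

    # Cross-validated average log-likelihood per candidate dimensionality;
    # the maximizer is the selected model
    for n_components in range(1, 8):
        model = PCA(n_components=n_components)
        score = cross_val_score(model, data, cv=5).mean()
        print(n_components, round(score, 2))
    ```

    This mirrors the abstract's point: an information criterion is a closed-form shortcut, while cross-validation pays in computation for validity when the criterion's assumptions fail.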

  7. Multi-component bi-Hamiltonian Dirac integrable equations

    Energy Technology Data Exchange (ETDEWEB)

    Ma Wenxiu [Department of Mathematics and Statistics, University of South Florida, Tampa, FL 33620-5700 (United States)], E-mail: mawx@math.usf.edu

    2009-01-15

    A specific matrix iso-spectral problem of arbitrary order is introduced and an associated hierarchy of multi-component Dirac integrable equations is constructed within the framework of zero curvature equations. The bi-Hamiltonian structure of the obtained Dirac hierarchy is presented by means of the variational trace identity. Two examples in the cases of lower order are computed.
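
    For readers unfamiliar with the framework: given a spectral problem φ_x = U(u, λ)φ, φ_t = V(u, λ)φ, the zero-curvature (compatibility) condition takes the standard form below; this is the textbook statement, not a formula quoted from the paper.

    ```latex
    U_t - V_x + [U, V] = 0, \qquad [U, V] := UV - VU,
    ```

    which must hold identically in the spectral parameter λ and is equivalent to the nonlinear evolution equations of the hierarchy.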

  8. Optimal Component Lumping: problem formulation and solution techniques

    DEFF Research Database (Denmark)

    Lin, Bao; Leibovici, Claude F.; Jørgensen, Sten Bay

    2008-01-01

    This paper presents a systematic method for optimal lumping of a large number of components in order to minimize the loss of information. In principle, a rigorous composition-based model is preferable to describe a system accurately. However, computational intensity and numerical issues restrict ...

  9. Incremental principal component pursuit for video background modeling

    Science.gov (United States)

    Rodriquez-Valderrama, Paul A.; Wohlberg, Brendt

    2017-03-14

    An incremental Principal Component Pursuit (PCP) algorithm for video background modeling that is able to process one frame at a time while adapting to changes in background, with a computational complexity that allows for real-time processing, a low memory footprint, and robustness to translational and rotational jitter.
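
    As background (this is the standard batch formulation from the PCP literature, not text from this record), Principal Component Pursuit decomposes a matrix M of stacked video frames into a low-rank background L and a sparse foreground S:

    ```latex
    \min_{L,\,S}\; \|L\|_{*} + \lambda \|S\|_{1}
    \quad \text{subject to} \quad L + S = M,
    ```

    where the nuclear norm promotes a low-rank L and the entrywise l1 norm a sparse S; the incremental variant updates this decomposition one frame at a time instead of storing and factoring the full matrix.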

  10. Structured automated code checking through structural components and systems engineering

    NARCIS (Netherlands)

    Coenders, J.L.; Rolvink, A.

    2014-01-01

    This paper presents a proposal to employ the design computing methodology proposed as StructuralComponents (Rolvink et al [6] and van de Weerd et al [7]) as a method to perform a digital verification process to fulfil the requirements related to structural design and engineering as part of a

  11. Transactions in Software Components: Container-Interposed Transactions

    Czech Academy of Sciences Publication Activity Database

    Procházka, M.; Plášil, František

    2002-01-01

    Roč. 3, č. 2 (2002), s. - ISSN 1525-9293 R&D Projects: GA ČR GA201/99/0244; GA AV ČR IAA2030902 Institutional research plan: AV0Z1030915 Keywords: transactions * component-based software architectures * transaction propagation policy * transaction attributes * container-interposed transactions Subject RIV: JC - Computer Hardware; Software

  12. The benefit of enterprise ontology in identifying business components

    OpenAIRE

    Albani, Antonia

    2006-01-01

    The benefit of enterprise ontology in identifying business components / A. Albani, J. Dietz. - In: Artificial intelligence in theory and practice : IFIP 19th World Computer Congress ; TC 12: IFIP AI 2006 Stream, August 21-24, 2006, Santiago, Chile / ed. by Max Bramer. - New York : Springer, 2006. - S. 1-12. - (IFIP ; 217)

  13. Three-component homeostasis control

    Science.gov (United States)

    Xu, Jin; Hong, Hyunsuk; Jo, Junghyo

    2014-03-01

    Two reciprocal components seem to be sufficient to maintain a control variable constant. However, pancreatic islets use three components to control glucose homeostasis: α (glucagon-secreting), β (insulin), and δ (somatostatin) cells. Glucagon and insulin are the reciprocal hormones for increasing and decreasing blood glucose levels, while the role of somatostatin is unknown. However, it is known how each hormone affects the other cell types. Based on the pulsatile hormone secretion and the cellular interactions, this system can be described as coupled oscillators. In particular, we used the Landau-Stuart model to consider both amplitudes and phases of hormone oscillations. We found that the presence of the third component, the δ cell, was effective in resisting glucose perturbations and in quickly returning to the normal glucose level once perturbed. Our analysis suggested that three components are necessary for advanced homeostasis control.
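
    A minimal sketch of coupled Landau-Stuart oscillators of the kind the abstract refers to; the parameter values and the mean-field coupling below are our illustrative assumptions, not the study's calibration.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Three oscillators standing in for the alpha, beta and delta populations
    omega = np.array([1.0, 1.1, 0.9])   # illustrative natural frequencies
    mu, coupling = 1.0, 0.2             # illustrative growth rate and coupling

    def landau_stuart(t, state):
        z = state[:3] + 1j * state[3:]
        mean_field = z.mean()
        dz = (mu + 1j * omega - np.abs(z) ** 2) * z + coupling * (mean_field - z)
        return np.concatenate([dz.real, dz.imag])

    z0 = np.array([1.0, 0.5, -0.5, 0.0, 0.5, -0.5])   # real parts, imag parts
    sol = solve_ivp(landau_stuart, (0.0, 100.0), z0, max_step=0.05)
    amplitudes = np.hypot(sol.y[:3], sol.y[3:])        # |z_j|(t) per oscillator
    ```

    Each complex state z_j carries both an amplitude and a phase, which is why this model can represent the amplitudes and phases of pulsatile hormone secretion simultaneously.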

  14. Component Processes in Analogical Reasoning

    Science.gov (United States)

    Sternberg, Robert J.

    1977-01-01

    Describes alternative theoretical positions regarding (a) the component information processes used in analogical reasoning and (b) strategies for combining these processes. Also presents results from three experiments on analogical reasoning. (Author/RK)

  15. Metallurgical Laboratory and Components Testing

    Data.gov (United States)

    Federal Laboratory Consortium — In the field of metallurgy, TTC is equipped to run laboratory tests on track and rolling stock components and materials. The testing lab contains scanning-electron,...

  16. Metal binding by food components

    DEFF Research Database (Denmark)

    Tang, Ning

    for zinc binding by the investigated amino acids, peptides and proteins. The thiol group or imidazole group containing amino acids, peptides and proteins which exhibited strong zinc binding ability were further selected for interacting with zinc salts in relation to zinc absorption. The interactions...... between the above selected food components and zinc citrate or zinc phytate will lead to the enhanced solubility of zinc citrate or zinc phytate. The main driving force for this observed solubility enhancement is the complex formation between zinc and investigated food components as revealed by isothermal...... titration calorimetry and quantum mechanical calculations. This is due to the zinc binding affinity of the relatively softer ligands (investigated food components) will become much stronger than citrate or phytate when they present together in aqueous solution. This mechanism indicates these food components...

  17. How to Build a Quantum Computer

    Science.gov (United States)

    Sanders, Barry C.

    2017-11-01

    Quantum computer technology is progressing rapidly with dozens of qubits and hundreds of quantum logic gates now possible. Although current quantum computer technology is distant from being able to solve computational problems beyond the reach of non-quantum computers, experiments have progressed well beyond simply demonstrating the requisite components. We can now operate small quantum logic processors with connected networks of qubits and quantum logic gates, which is a great stride towards functioning quantum computers. This book aims to be accessible to a broad audience with basic knowledge of computers, electronics and physics. The goal is to convey key notions relevant to building quantum computers and to present state-of-the-art quantum-computer research in various media such as trapped ions, superconducting circuits, photonics and beyond.

  18. ATLAS Distributed Computing in LHC Run2

    CERN Document Server

    Campana, Simone; The ATLAS collaboration

    2015-01-01

    The ATLAS Distributed Computing infrastructure has evolved after the first period of LHC data taking in order to cope with the challenges of the upcoming LHC Run2. An increased data rate and computing demands of the Monte-Carlo simulation, as well as new approaches to ATLAS analysis, dictated a more dynamic workload management system (ProdSys2) and data management system (Rucio), overcoming the boundaries imposed by the design of the old computing model. In particular, the commissioning of new central computing system components was the core part of the migration toward the flexible computing model. The flexible computing utilization exploring the opportunistic resources such as HPC, cloud, and volunteer computing is embedded in the new computing model, the data access mechanisms have been enhanced with the remote access, and the network topology and performance is deeply integrated into the core of the system. Moreover a new data management strategy, based on defined lifetime for each dataset, has been defin...

  19. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  20. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  1. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  2. Radiation effects in optical components

    International Nuclear Information System (INIS)

    Friebele, E.J.

    1987-01-01

    This report discusses how components of high-performance optical devices may be exposed to high-energy radiation environments during their lifetime. The effect of these adverse environments depends upon a large number of parameters associated with the radiation (nature, energy, dose, dose rate, etc.) or the system (temperature, optical performance requirements, optical wavelength, optical power, path length, etc.), as well as the intrinsic susceptibility of the optical component itself to degradation

  3. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which...... we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects....

  4. ROLE OF VIRTUALIZATION IN CLOUD COMPUTING

    OpenAIRE

    Avneet kaur; Dr. Gaurav Gupta; Dr. Gurjit Singh Bhathal

    2017-01-01

    Cloud computing is the fundamental change happening in the field of Information Technology. Virtualization is the key component of cloud computing. With the use of virtualization, cloud computing brings about not only convenience and efficiency benefits, but also great challenges in the field of data security and privacy protection. In this paper, we discuss virtualization, the architecture of virtualization technology, and the Virtual Machine Monitor (VMM). Further discussing ab...

  5. Integrated Computer System of Management in Logistics

    Science.gov (United States)

    Chwesiuk, Krzysztof

    2011-06-01

    This paper aims at presenting a concept of an integrated computer system of management in logistics, particularly in supply and distribution chains. Consequently, the paper includes the basic idea of the concept of computer-based management in logistics and components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates computer-aided management systems is that of electronic data interchange.

  6. Distributed computing for macromolecular crystallography.

    Science.gov (United States)

    Krissinel, Evgeny; Uski, Ville; Lebedev, Andrey; Winn, Martyn; Ballard, Charles

    2018-02-01

    Modern crystallographic computing is characterized by the growing role of automated structure-solution pipelines, which represent complex expert systems utilizing a number of program components, decision makers and databases. They also require considerable computational resources and regular database maintenance, which is increasingly more difficult to provide at the level of individual desktop-based CCP4 setups. On the other hand, there is a significant growth in data processed in the field, which brings up the issue of centralized facilities for keeping both the data collected and structure-solution projects. The paradigm of distributed computing and data management offers a convenient approach to tackling these problems, which has become more attractive in recent years owing to the popularity of mobile devices such as tablets and ultra-portable laptops. In this article, an overview is given of developments by CCP4 aimed at bringing distributed crystallographic computations to a wide crystallographic community.

  7. Principal component approach in variance component estimation for international sire evaluation

    Directory of Open Access Journals (Sweden)

    Jakobsen Jette

    2011-05-01

    bias, but increased standard errors of the estimates and notably the computing time. Conclusions In terms of estimation's accuracy, both principal component approaches performed equally well and permitted the use of more parsimonious models through random regression MACE. The advantage of the bottom-up PC approach is that it does not need any previous knowledge on the rank. However, with a predetermined rank, the direct PC approach needs less computing time than the bottom-up PC.

  8. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs
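
    The mathematical fundamentals referred to rest on the Radon transform; in standard textbook notation (ours, not quoted from the review), a parallel-beam projection of the attenuation map f(x, y) at angle θ and detector offset s is

    ```latex
    p(s, \theta) = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty}
      f(x, y)\,\delta(x\cos\theta + y\sin\theta - s)\,\mathrm{d}x\,\mathrm{d}y,
    ```

    and filtered backprojection inverts this by ramp-filtering each projection and smearing it back across the image plane.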

  9. MULTIMETAL - Structural performance of multimetal component

    International Nuclear Information System (INIS)

    Keim, Elisabeth; Blasset, Sebastien; Tiete, Ralf; Gilles, Philippe; Karjalainen-Roikonen, Paeivi

    2012-01-01

    The main objectives of the project are: - Develop a standard for fracture resistance testing in multi-metal specimens; - Develop harmonized procedures for dissimilar metal welds brittle and ductile integrity assessment. The underlying aim of the project is to provide recommendations for a good practice approach for the integrity assessment (including testing) of dissimilar metal welds as part of overall integrity analyses and leak-before-break (LBB) procedures. In a nuclear power plant (NPP) a single metallic component may be fabricated from different materials. For example, reactor pressure vessel (RPV) components are mainly made of ferritic steel, whereas some of the connecting pipelines are fabricated from austenitic stainless steel. As a consequence, components made of different kind of steels need to be connected. Their connecting welds are called dissimilar metal welds (DMW). Despite extensive research in the past within the EURATOM Framework, e.g. the projects BIMET and ADIMEW, further work is needed to quantify the structural performance of DMWs. The first step of the project is to gather relevant information from field experience. Typical locations of DMWs in Western as well as Eastern type light water reactors (LWRs) will be identified together with their physical and metallurgical characteristics, as well as applicable structural integrity assessment methods. The collection of relevant field information including findings position (flaw) will be followed by computational structural integrity assessment analyses of DMWs for dedicated test configurations and real cases. These analyses will involve simple engineering methods and numerical analyses. The latter also involves the use of innovative micro-mechanical modelling approaches for ductile failure processes in order to augment existing numerical methods for structural integrity assessment of DMWs. Ageing related phenomena and realistic stress distributions in the weld area will be considered. The

  10. Verification of the component accuracy prediction obtained by physical modelling and the elastic simulation of the die/component interaction

    DEFF Research Database (Denmark)

    Ravn, Bjarne Gottlieb; Andersen, Claus Bo; Wanheim, Tarras

    2001-01-01

    There are three demands on a component that must undergo a die-cavity elasticity analysis. The demands on the product are specified as: (i) to be able to measure the loading profile which results in elastic die-cavity deflections; (ii) to be able to compute the elastic deflections using FE; (iii...

  11. Architectures for single-chip image computing

    Science.gov (United States)

    Gove, Robert J.

    1992-04-01

    This paper will focus on the architectures of VLSI programmable processing components for image computing applications. TI, the maker of industry-leading RISC, DSP, and graphics components, has developed an architecture for a new generation of image processors capable of implementing a plurality of image, graphics, video, and audio computing functions. We will show that the use of a single-chip heterogeneous MIMD parallel architecture best suits this class of processors--those which will dominate the desktop multimedia, document imaging, computer graphics, and visualization systems of this decade.

  12. Automatic capability to store and retrieve component data and to calculate structural integrity of these components

    International Nuclear Information System (INIS)

    McKinnis, C.J.; Toor, P.M.

    1985-01-01

    In structural analysis, assimilation of material, geometry, and service history input parameters is very cumbersome. Quite often with changing service history and revised material properties and geometry, an analysis has to be repeated. To overcome the above mentioned difficulties, a computer program was developed to provide the capability to establish a computerized library of all material, geometry, and service history parameters for components. The program also has the capability to calculate the structural integrity based on the Arrhenius type equations, including the probability calculations. This unique combination of computerized input information storage and automated analysis procedure assures consistency, efficiency, and accuracy when the hardware integrity has to be reassessed
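
    The Arrhenius-type rate law such programs rely on has the standard form below (the symbols are the textbook ones, not extracted from the program's documentation):

    ```latex
    k(T) = A \exp\!\left(-\frac{E_a}{R\,T}\right),
    ```

    where k is the degradation rate, A a pre-exponential factor, E_a the activation energy, R the gas constant and T the absolute temperature; stored component temperatures and service times then feed directly into the integrity and probability calculations.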

  13. Design and implementation of component reliability database management system for NPP

    International Nuclear Information System (INIS)

    Kim, S. H.; Jung, J. K.; Choi, S. Y.; Lee, Y. H.; Han, S. H.

    1999-01-01

    KAERI is constructing a component reliability database for Korean nuclear power plants. This paper describes the development of the data management tool that runs on the component reliability database. The tool runs in an intranet environment and is used to analyze failure modes and failure severities in order to compute component failure rates. Additional modules are now being developed to manage operation and test histories, together with algorithms for the calculation of component failure history and reliability

  14. Computing Services and Assured Computing

    Science.gov (United States)

    2006-05-01

    [Garbled presentation extract; recoverable fragments only:] "...fighters' ability to execute the mission." We run IT systems that provide medical care, pay the warfighters, and manage maintenance: ... users; 1,400 applications; 18 facilities; 180 software vendors; 18,000+ copies of executive software products; virtually every type of mainframe and ...

  15. Integrating Data Transformation in Principal Components Analysis

    KAUST Repository

    Maadooliat, Mehdi

    2015-01-02

    Principal component analysis (PCA) is a popular dimension reduction method to reduce the complexity and obtain the informative aspects of high-dimensional datasets. When the data distribution is skewed, data transformation is commonly used prior to applying PCA. Such transformation is usually obtained from previous studies, prior knowledge, or trial-and-error. In this work, we develop a model-based method that integrates data transformation in PCA and finds an appropriate data transformation using the maximum profile likelihood. Extensions of the method to handle functional data and missing values are also developed. Several numerical algorithms are provided for efficient computation. The proposed method is illustrated using simulated and real-world data examples.
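
    The paper's profile-likelihood machinery is not reproduced in this record; a rough illustration of the general idea, using a Box-Cox transformation chosen by maximum likelihood before PCA (standard scipy and scikit-learn calls; the data are invented), might look like:

    ```python
    import numpy as np
    from scipy import stats
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    skewed = rng.lognormal(mean=0.0, sigma=1.0, size=(300, 4))  # skewed data

    # Transform each column with its own ML-estimated Box-Cox parameter
    transformed = np.column_stack(
        [stats.boxcox(skewed[:, j])[0] for j in range(skewed.shape[1])]
    )

    scores = PCA(n_components=2).fit_transform(transformed)
    print(scores.shape)  # (300, 2)
    ```

    The paper's contribution, as described, is to estimate the transformation and the PCA jointly via the maximum profile likelihood rather than in two separate steps like this sketch.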

  16. Neutron induced activity in fuel element components

    International Nuclear Information System (INIS)

    Kjellbert, N.

    1978-03-01

    A thorough investigation of the importance of various nuclides in neutron-induced radioactivity from fuel element construction materials has been carried out for both BWR and PWR fuel assemblies. The calculations were performed with the ORIGEN computer code. The investigation was directed towards the final storage of the assembly components, and special emphasis was put on the examination of the sources of carbon-14, cobalt-60, nickel-59, nickel-63 and zirconium-93/niobium-93m. It is demonstrated that the nuclides nickel-59, in Inconel and stainless steel, and zirconium-93/niobium-93m, in Zircaloy, are the ones which constitute the very long term radiotoxic hazard of the irradiated materials. (author)
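
    For orientation, the buildup and decay of a single activation product under constant flux follows the standard relation (the textbook formula, not a quotation from the report):

    ```latex
    A(t_c) = N\,\sigma\,\phi\left(1 - e^{-\lambda t_i}\right) e^{-\lambda t_c},
    ```

    where N is the number of target atoms, σ the activation cross section, φ the neutron flux, λ the decay constant, t_i the irradiation time and t_c the cooling time; for long-lived products such as nickel-59, A stays essentially constant over storage timescales, which is why they dominate the very long term hazard.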

  17. Component protection based automatic control

    International Nuclear Information System (INIS)

    Otaduy, P.J.

    1992-01-01

    Control and safety systems as well as operation procedures are designed on the basis of limits on critical process parameters. The expectation is that short and long term mechanical damage and process failures will be avoided by operating the plant within the specified constraint envelopes. In this paper, one of the Advanced Liquid Metal Reactor (ALMR) design duty cycle events is discussed to corroborate that the time has come to explicitly make component protection part of the control system. Component stress assessment and aging data should be an integral part of the control system. Transient trajectory planning and operating limits could then be aimed at minimizing component-specific and overall plant component damage cost functions. The impact of transients on critical components could then be managed according to plant lifetime design goals. The need to develop methodologies for online transient trajectory planning and assessment of operating limits, in order to facilitate the explicit incorporation of damage assessment capabilities into the plant control and protection systems, is discussed. 12 refs

  18. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  19. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  20. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. Also we propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
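
    The paper's dynamic model is not reproduced here; for readers unfamiliar with the mechanism being modelled, a minimal token bucket policer (rate and depth values are arbitrary) can be sketched as:

    ```python
    import time

    class TokenBucket:
        """Conforming packets consume tokens; tokens refill at a fixed rate."""
        def __init__(self, rate_tokens_per_s, bucket_depth):
            self.rate = rate_tokens_per_s
            self.depth = bucket_depth
            self.tokens = bucket_depth
            self.last_refill = time.monotonic()

        def allow(self, packet_cost=1.0):
            now = time.monotonic()
            # Refill proportionally to elapsed time, capped at bucket depth
            self.tokens = min(self.depth,
                              self.tokens + (now - self.last_refill) * self.rate)
            self.last_refill = now
            if self.tokens >= packet_cost:
                self.tokens -= packet_cost
                return True          # packet conforms
            return False             # packet is policed (dropped or marked)

    bucket = TokenBucket(rate_tokens_per_s=100.0, bucket_depth=20.0)
    print(bucket.allow())
    ```

    The bucket depth bounds the burst size admitted at line rate, while the refill rate bounds the long-run average; these are the two knobs the traffic sizing and dimensioning questions turn on.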

  1. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    It is crucial that gifted and talented students be supported by educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since the lesson caters for the creative function of the basic linguistic skills. The teaching technique is applied at the 9-11 age level. The lesson introduces an evaluation process covering the basic information, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered a good sample of planning for any subject, for its unpredicted convergence of visual and technical abilities with linguistic abilities.

  2. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justify the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media is used in imaging to demonstrate ... scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions), are useful tools.

  3. Addressing Cloud Computing in Enterprise Architecture: Issues and Challenges

    OpenAIRE

    Khan, Khaled; Gangavarapu, Narendra

    2009-01-01

    This article discusses how the characteristics of cloud computing affect the enterprise architecture in four domains: business, data, application and technology. The ownership and control of architectural components are shifted from organisational perimeters to cloud providers. It argues that although cloud computing promises numerous benefits to enterprises, the shifting control from enterprises to cloud providers on architectural components introduces several architectural challenges. The d...

  4. Seismic behaviour of gas cooled reactor components

    International Nuclear Information System (INIS)

    1990-08-01

    On the invitation of the French Government, the Specialists' Meeting on the Seismic Behaviour of Gas-Cooled Reactor Components was held at Gif-sur-Yvette, 14-16 November 1989. This was the second Specialists' Meeting on the general subject of gas-cooled reactor seismic design. The 27 participants from France, the Federal Republic of Germany, Israel, Japan, Spain, Switzerland, the United Kingdom, the Soviet Union, the United States, the CEC and the IAEA took the opportunity to present and discuss a total of 16 papers reflecting the state of the art and the experience gained in seismic qualification approaches, seismic analysis methods, and the capabilities of the various facilities used to qualify components and verify analytical methods. Since the first meeting, the sophistication and expanded capabilities of both the seismic analytical methods and the test facilities have become apparent. The two main methods for seismic analysis, the impedance method and the finite element method, have been implemented in computer programs in several countries, with the capability of each code dependent on the available computer capability. The correlations between calculations and tests depend on input assumptions such as boundary conditions, soil parameters and the various interactions between the soil, the buildings and the contained equipment. The ability to adjust these parameters and match experimental results with calculations was displayed in several of the papers. The expanded capability of some of the new test facilities was graphically displayed by the descriptions of the SAMSON vibration test facility at Juelich, FRG, capable of dynamically testing specimens weighing up to 25 tonnes, and the TAMARIS facility at the CEA laboratories in Gif-sur-Yvette, where the largest table is capable of testing specimens weighing up to 100 tonnes. The proceedings of this meeting contain all 16 presented papers. A separate abstract was prepared for each of these papers. Refs, figs and tabs

  5. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to distribute computation across a large number of distributed computers, rather than local computers ...

  6. Risk-ranking IST components into two categories

    International Nuclear Information System (INIS)

    Rowley, C.W.

    1996-01-01

    The ASME has utilized several schemes for identifying the appropriate scope of components for inservice testing (IST). The initial scope was ASME Code Class 1/2/3, with all components treated equally. Later the ASME Operations and Maintenance (O&M) Committee decided to use safe shutdown and accident mitigation as the scoping criteria, but continued to treat all components equally inside that scope. Recently the ASME O&M Committee decided to recognize the service condition of the component, hence the comprehensive pump test. Although probabilistic risk assessments (PRAs) are highly complex plant models that are computer hardware- and software-intensive, they are a tool that can be utilized by many plant engineering organizations to analyze plant system and component applications. In 1992 the ASME O&M Committee became interested in using the PRA as a tool to categorize its pumps and valves. In 1994 the ASME O&M Committee commissioned the ASME Center for Research and Technology Development (CRTD) to develop a process that adapted PRA technology to IST. In late 1995 that process was presented to the ASME O&M Committee. The process had three distinct portions: (1) risk-rank the IST components; (2) develop a more effective testing strategy for More Safety Significant Components; and (3) develop a more economic testing strategy for Less Safety Significant Components

  7. Risk-ranking IST components into two categories

    Energy Technology Data Exchange (ETDEWEB)

    Rowley, C.W.

    1996-12-01

    The ASME has utilized several schemes for identifying the appropriate scope of components for inservice testing (IST). The initial scope was ASME Code Class 1/2/3, with all components treated equally. Later the ASME Operations and Maintenance (O&M) Committee decided to use safe shutdown and accident mitigation as the scoping criteria, but continued to treat all components equally inside that scope. Recently the ASME O&M Committee decided to recognize the service condition of the component, hence the comprehensive pump test. Although probabilistic risk assessments (PRAs) are highly complex plant models that are computer hardware- and software-intensive, they are a tool that can be utilized by many plant engineering organizations to analyze plant system and component applications. In 1992 the ASME O&M Committee became interested in using the PRA as a tool to categorize its pumps and valves. In 1994 the ASME O&M Committee commissioned the ASME Center for Research and Technology Development (CRTD) to develop a process that adapted PRA technology to IST. In late 1995 that process was presented to the ASME O&M Committee. The process had three distinct portions: (1) risk-rank the IST components; (2) develop a more effective testing strategy for More Safety Significant Components; and (3) develop a more economic testing strategy for Less Safety Significant Components.

  8. Computational statistics handbook with Matlab

    CERN Document Server

    Martinez, Wendy L

    2007-01-01

    Prefaces. Introduction: What Is Computational Statistics?; An Overview of the Book. Probability Concepts: Introduction; Probability; Conditional Probability and Independence; Expectation; Common Distributions. Sampling Concepts: Introduction; Sampling Terminology and Concepts; Sampling Distributions; Parameter Estimation; Empirical Distribution Function. Generating Random Variables: Introduction; General Techniques for Generating Random Variables; Generating Continuous Random Variables; Generating Discrete Random Variables. Exploratory Data Analysis: Introduction; Exploring Univariate Data; Exploring Bivariate and Trivariate Data; Exploring Multidimensional Data. Finding Structure: Introduction; Projecting Data; Principal Component Analysis; Projection Pursuit EDA; Independent Component Analysis; Grand Tour; Nonlinear Dimensionality Reduction. Monte Carlo Methods for Inferential Statistics: Introduction; Classical Inferential Statistics; Monte Carlo Methods for Inferential Statist...

  9. Supporting collaborative computing and interaction

    International Nuclear Information System (INIS)

    Agarwal, Deborah; McParland, Charles; Perry, Marcia

    2002-01-01

    To enable collaboration on the daily tasks involved in scientific research, collaborative frameworks should provide lightweight and ubiquitous components that support a wide variety of interaction modes. We envision a collaborative environment as one that provides a persistent space within which participants can locate each other, exchange synchronous and asynchronous messages, share documents and applications, share workflow, and hold videoconferences. We are developing the Pervasive Collaborative Computing Environment (PCCE) as such an environment. The PCCE will provide integrated tools to support shared computing and task control and monitoring. This paper describes the PCCE and the rationale for its design

  10. Bacterial computing with engineered populations.

    Science.gov (United States)

    Amos, Martyn; Axmann, Ilka Maria; Blüthgen, Nils; de la Cruz, Fernando; Jaramillo, Alfonso; Rodriguez-Paton, Alfonso; Simmel, Friedrich

    2015-07-28

    We describe strategies for the construction of bacterial computing platforms, drawing on a number of results from the recently completed bacterial computing with engineered populations project. In general, the implementation of such systems requires a framework containing various components such as intracellular circuits, single cell input/output and cell-cell interfacing, as well as extensive analysis. In this overview paper, we describe our approach to each of these, and suggest possible areas for future research. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  11. Mechanical design of machine components

    CERN Document Server

    Ugural, Ansel C

    2015-01-01

    Mechanical Design of Machine Components, Second Edition strikes a balance between theory and application, and prepares students for more advanced study or professional practice. It outlines the basic concepts in the design and analysis of machine elements using traditional methods, based on the principles of mechanics of materials. The text combines the theory needed to gain insight into mechanics with numerical methods in design. It presents real-world engineering applications, and reveals the link between basic mechanics and the specific design of machine components and machines. Divided into three parts, this revised text presents basic background topics, deals with failure prevention in a variety of machine elements and covers applications in design of machine components as well as entire machines. Optional sections treating special and advanced topics are also included. Key Features of the Second Edition: Incorporates material that has been completely updated with new chapters, problems, practical examples...

  12. Towards Prognostics for Electronics Components

    Science.gov (United States)

    Saha, Bhaskar; Celaya, Jose R.; Wysocki, Philip F.; Goebel, Kai F.

    2013-01-01

    Electronics components have an increasingly critical role in avionics systems and in the development of future aircraft systems. Prognostics of such components is becoming a very important research field as a result of the need to provide aircraft systems with system-level health management information. This paper focuses on a prognostics application for electronics components within avionics systems, and in particular its application to an Insulated Gate Bipolar Transistor (IGBT). This application utilizes remaining-useful-life prediction, accomplished by employing the particle filter framework and leveraging data from accelerated aging tests on IGBTs. These tests induced thermal-electrical overstresses by applying thermal cycling to the IGBT devices. In-situ state-monitoring measurements, including steady-state voltages and currents, electrical transients, and thermal transients, are recorded and used as potential precursors of failure.
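
    The particle-filter framework mentioned above is generic enough to sketch. The toy example below is not the authors' implementation: it assumes a hypothetical exponential degradation model with a fixed failure threshold, tracks the state from synthetic noisy measurements, and then extrapolates each particle to the threshold to obtain a remaining-useful-life (RUL) distribution:

        import numpy as np

        rng = np.random.default_rng(0)
        N = 1000                    # particles
        threshold = 1.0             # hypothetical failure threshold

        # Each particle carries a degradation level x and an unknown growth
        # rate b: x_{k+1} = x_k * exp(b) + process noise (toy model).
        x = rng.normal(0.10, 0.02, N)
        b = rng.normal(0.05, 0.01, N)

        # Synthetic noisy measurements of a "true" degrading device.
        true_x, measurements = 0.10, []
        for _ in range(30):
            true_x *= np.exp(0.06)
            measurements.append(true_x + rng.normal(0.0, 0.01))

        for z in measurements:
            x = x * np.exp(b) + rng.normal(0.0, 0.005, N)        # propagate
            w = np.exp(-0.5 * ((z - x) / 0.01) ** 2) + 1e-300    # likelihood
            idx = rng.choice(N, N, p=w / w.sum())                # resample
            x, b = x[idx], b[idx]

        # Extrapolate every particle until it crosses the failure threshold.
        rul, xf, alive = np.zeros(N), x.copy(), x < threshold
        for k in range(1, 500):
            if not alive.any():
                break
            xf[alive] *= np.exp(b[alive])
            rul[alive & (xf >= threshold)] = k
            alive &= xf < threshold

        print(f"median RUL: {np.median(rul):.0f} steps")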

  13. Automated cleaning of electronic components

    International Nuclear Information System (INIS)

    Drotning, W.; Meirans, L.; Wapman, W.; Hwang, Y.; Koenig, L.; Petterson, B.

    1994-01-01

    Environmental and operator safety concerns are leading to the elimination of trichloroethylene and chlorofluorocarbon solvents in cleaning processes that remove rosin flux, organic and inorganic contamination, and particulates from electronic components. Present processes depend heavily on these solvents for manual spray cleaning of small components and subassemblies. Use of alternative solvent systems can lead to longer processing times and reduced quality. Automated spray cleaning can improve the quality of the cleaning process, thus enabling the productive use of environmentally conscious materials, while minimizing personnel exposure to hazardous materials. We describe the development of a prototype robotic system for cleaning electronic components in a spray cleaning workcell. An important feature of the prototype system is the capability to generate the robot paths and motions automatically from the CAD models of the part to be cleaned, and to embed cleaning process knowledge into the automatically programmed operations

  14. Two component systems: physiological effect of a third component.

    Directory of Open Access Journals (Sweden)

    Baldiri Salvado

    Signal transduction systems mediate the response and adaptation of organisms to environmental changes. In prokaryotes, this signal transduction is often done through Two Component Systems (TCS). These TCS are phosphotransfer protein cascades, and in their prototypical form they are composed of a kinase that senses the environmental signals (SK) and a response regulator (RR) that regulates the cellular response. This basic motif can be modified by the addition of a third protein that interacts either with the SK or the RR in a way that could change the dynamic response of the TCS module. In this work we aim at understanding the effect of such an additional protein (which we call "third component") on the functional properties of a prototypical TCS. To do so we build mathematical models of TCS with alternative designs for their interaction with that third component. These mathematical models are analyzed in order to identify the differences in dynamic behavior inherent to each design, with respect to functionally relevant properties such as sensitivity to changes in either the parameter values or the molecular concentrations, temporal responsiveness, possibility of multiple steady states, or stochastic fluctuations in the system. The differences are then correlated to the physiological requirements that impinge on the functioning of the TCS. This analysis sheds light on both the dynamic behavior of synthetically designed TCS and the conditions under which natural selection might favor each of the designs. We find that a third component that modulates SK activity increases the parameter space where a bistable response of the TCS module to signals is possible if the SK is monofunctional, but decreases it when the SK is bifunctional. The presence of a third component that modulates RR activity decreases the parameter space where a bistable response of the TCS module to signals is possible.
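
    A prototypical TCS of the kind modeled above can be written as a small ODE system. The sketch below is generic, not the authors' model; the rate constants, normalized protein totals, and signal strength are illustrative assumptions:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Minimal two-component system: signal-dependent autophosphorylation
        # of the sensor kinase (SK), phosphotransfer to the response
        # regulator (RR), and RR~P dephosphorylation. Totals normalized to 1.
        def tcs(t, y, signal, k_auto=1.0, k_transfer=5.0, k_dephos=0.5):
            skp, rrp = y
            dskp = k_auto * signal * (1.0 - skp) - k_transfer * skp * (1.0 - rrp)
            drrp = k_transfer * skp * (1.0 - rrp) - k_dephos * rrp
            return [dskp, drrp]

        sol = solve_ivp(tcs, (0.0, 50.0), [0.0, 0.0], args=(0.8,))
        print(f"steady-state RR~P fraction: {sol.y[1, -1]:.3f}")

    A third component would enter such a model as an extra state variable whose binding modulates k_auto (SK-acting designs) or k_dephos (RR-acting designs).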

  15. Explanation components as interactive tools

    Energy Technology Data Exchange (ETDEWEB)

    Wahlster, W.

    1983-01-01

    The ability to explain itself is probably the most important criterion of the user-friendliness of interactive systems. Explanation aids in the form of simple help functions do not meet this criterion. The reasons for this are outlined. More promising is an explanation component which can give the user intelligible and context-oriented explanations. The essential requirement for this is the development of knowledge-based interactive systems using artificial intelligence methods and techniques. The authors report on experiences with the development of explanation components, in particular a number of examples from the HAM-ANS project. 12 references.

  16. Radioactive resistance of EEPROM components

    International Nuclear Information System (INIS)

    Loncar, B.; Novakovic, D.; Stankovic, S.; Osmokrovic, P.

    1999-01-01

    The aim of this paper is to examine the resistance of EEPROM components under the influence of gamma radiation. This topic is significant for the military industry and space technology; therefore, the degradation mechanisms of these components, as well as the possibilities to increase their radiation resistance, have been considered by many authors. Total dose results are presented for the 28C64C EEPROM; the first failure appeared at a total dose level of 1000 Gy. The obtained results are analyzed and explained theoretically via the interaction of gamma radiation with the oxide layer. (author)

  17. Space storable propulsion components development

    Science.gov (United States)

    Hagler, R., Jr.

    1982-01-01

    The current development status of components to control the flow of propellants (liquid fluorine and hydrazine) in a demonstration space storable propulsion system is discussed. The criteria which determined the designs for the pressure regulator, explosive-actuated valves, propellant shutoff valve, latching solenoid-actuated valve and propellant filter are presented. The test philosophy that was followed during component development is outlined. The results from compatibility demonstrations for reusable connectors, flange seals, and CRES/Ti-6Al4V transition tubes and the evaluations of processes for welding (hand-held TIG, automated TIG, and EB), cleaning for fluorine service, and decontamination after fluorine exposure are described.

  18. Radiation effects on eye components

    Science.gov (United States)

    Durchschlag, H.; Fochler, C.; Abraham, K.; Kulawik, B.

    1999-08-01

    The most important water-soluble components of the vertebrate eye (lens proteins, aqueous humor, vitreous, hyaluronic acid, ascorbic acid) have been investigated in aqueous solution after preceding X- or UV-irradiation. Spectroscopic, chromatographic, electrophoretic, hydrodynamic and analytic techniques have been applied to monitor several types of radiation damage, such as destruction of aromatic and sulfur-containing amino acids, aggregation, crosslinking, dissociation, fragmentation, and partial unfolding. Various substances were found that were able to protect eye components effectively against radiation, some of them being also of medical relevance.

  19. Latent semantics as cognitive components

    DEFF Research Database (Denmark)

    Petersen, Michael Kai; Mørup, Morten; Hansen, Lars Kai

    2010-01-01

    Cognitive component analysis, defined as an unsupervised learning of features resembling human comprehension, suggests that the sensory structures we perceive might often be modeled by reducing dimensionality and treating objects in space and time as linear mixtures incorporating sparsity ... emotional responses can be encoded in words, we propose a simplified cognitive approach to model how we perceive media. Representing song lyrics in a vector space of reduced dimensionality using LSA, we combine bottom-up defined term distances with affective adjectives that top-down constrain the latent ..., which we suggest might function as cognitive components for perceiving the underlying structure in lyrics...

  20. Creep buckling problems in fast reactor components

    International Nuclear Information System (INIS)

    Ramesh, R.; Damodaran, S.P.; Chellapandi, P.; Chetal, S.C.; Bhoje, S.B.

    1995-01-01

    Creep buckling analyses for two important components of the 500 MWe Prototype Fast Breeder Reactor (PFBR), viz. the Intermediate Heat Exchanger (IHX) and the Inner Vessel (IV), are reported. The INCA code of the CASTEM system is used for the large-displacement elasto-plastic-creep analysis of the IHX shell. As a first step, INCA is validated for a typical benchmark problem dealing with the creep buckling of a tube under external pressure. The prediction of INCA is also compared with the results obtained using Hoff's theory. For the IV, considering the prohibitively high computational cost of the full analysis, a simplified analysis involving only large-displacement elastoplastic buckling is performed using the isochronous stress-strain curve approach. From both of these analyses, it has been inferred that the creep buckling failure mode is not of great concern in the design of PFBR components. It has also been concluded from the analysis that the Creep Cross Over Curve given in RCC-MR is applicable to the creep buckling failure mode as well. (author). 8 refs., 9 figs., 1 tab

  1. Cutting of metal components by intergranular cracking

    International Nuclear Information System (INIS)

    Chavand, J.; Gauthier, A.; Lopez, J.J.; Tanis, G.

    1985-01-01

    The objective of this contract was to study a new steel-sheet cutting technique for dismantling nuclear installations without, in principle, producing secondary waste. This technique is based on intergranular cracking of steel induced by the combined action of penetration of molten metal into the steel and application of a mechanical load. Cutting has been achieved for stainless-steel sheets with thicknesses ranging from a few mm to 50 mm and for carbon-steel plates with thicknesses between 20 and 60 mm. For carbon steel it seems possible that components as thick as 100 mm can be cut. The tests have permitted selection of the heating methods and determination of the cracking parameters for the materials and range of thicknesses studied. In the case of thin sheets, results were obtained for cutting in varied positions suited to the techniques of dismantling in hot cells. A temperature-measuring system using an infrared camera has been developed to determine the variation of the temperature field established in the component. In association with the three-dimensional computation code COCO developed by the CEA, this system permits prediction of the changes in stresses in the cracked zone when the cutting parameters are modified. 34 figs

  2. So, you are buying your first computer.

    Science.gov (United States)

    Ferrara-Love, R

    1999-06-01

    Buying your first computer need not be that complicated. The first thing needed is an understanding of what you want and need the computer for. By making a list of the various essentials, you will be on your way to purchasing that computer. Once that is completed, you will need an understanding of what each of the components of the computer is, how it works, and what options you have. This way, you will be better able to discuss your needs with the salesperson. The focus of this article is limited to personal computers or PCs (i.e., IBMs [Armonk, NY], IBM clones, Compaq [Houston, TX], Gateway [North Sioux City, SD], and so on). I am not including Macintosh or Apple [Cupertino, CA] in this discussion; most software is made exclusively for personal computers, or is at least on the market for personal computers before becoming available in a Macintosh version.

  3. Localization experience and future plan of NSSS primary components

    International Nuclear Information System (INIS)

    Kim, Haesoo

    1992-01-01

    Korea Heavy Industries and Construction Company (KHIC) is planning to attain technical self-reliance in the component design, manufacturing and installation of the NSSS primary components, up to the target of 87% by 1995, as indicated in the technical self-reliance plan issued by the Korea Electric Power Company in 1988. In order to achieve this target, KHIC has been involved in the component design, manufacturing and project management of the NSSS components from the early stage of the Yonggwang 3 and 4 project. In parallel, KHIC has increased its self-reliance in the various fields, taking full advantage of the technical documents, computer codes, training and consultation supplied under the technology transfer agreement. This paper presents a re-evaluation of the current status of technical self-reliance as well as the make-up plan to be implemented during the Ulchin 3 and 4 project for the areas identified as weaknesses

  4. The common component architecture for particle accelerator simulations

    International Nuclear Information System (INIS)

    Dechow, D.R.; Norris, B.; Amundson, J.

    2007-01-01

    Synergia2 is a beam dynamics modeling and simulation application for high-energy accelerators such as the Tevatron at Fermilab and the International Linear Collider, which is now under planning and development. Synergia2 is a hybrid, multilanguage software package comprised of two separate accelerator physics packages (Synergia and MaryLie/Impact) and one high-performance computer science package (PETSc). We describe our approach to producing a set of beam dynamics-specific software components based on the Common Component Architecture specification. Among other topics, we describe particular experiences with the following tasks: using Python steering to guide the creation of interfaces and to prototype components; working with legacy Fortran codes; and an example component-based, beam dynamics simulation.

  5. Fault Localization for Synchrophasor Data using Kernel Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    CHEN, R.

    2017-11-01

    In this paper, a nonlinear method for fault location in complex power systems is proposed, based on Kernel Principal Component Analysis (KPCA) of Phasor Measurement Unit (PMU) data. Using the scaling factor, the derivative of a polynomial kernel is obtained. Then, the contribution of each variable to the T2 statistic is derived to determine whether a bus is the fault component. Compared to previous Principal Component Analysis (PCA) based methods, the new version can cope with strong nonlinearity and provide precise identification of the fault location. Computer simulations are conducted to demonstrate the improved performance in recognizing the fault component and evaluating its propagation across the system based on the proposed method.
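
    The analytic polynomial-kernel derivative is specific to the paper, but the overall pipeline can be sketched: fit kernel PCA, form a Hotelling-style T2 statistic from the scores, and attribute it to the input variables. The sketch below substitutes a numerical finite-difference sensitivity for the paper's analytic derivative, and the data are synthetic stand-ins for PMU measurements:

        import numpy as np
        from sklearn.decomposition import KernelPCA

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 6))        # stand-in for 6 monitored buses
        X[:, 2] += 0.8 * X[:, 0] ** 2        # inject a nonlinear relationship

        kpca = KernelPCA(n_components=3, kernel="poly", degree=2).fit(X)
        var = kpca.transform(X).var(axis=0)

        def t2(x):
            s = kpca.transform(x.reshape(1, -1))[0]
            return float(np.sum(s ** 2 / var))   # Hotelling-style T2

        x_new = X[0].copy()
        x_new[2] += 3.0                      # simulated fault on "bus" 2
        base, eps = t2(x_new), 1e-4
        contrib = [(t2(x_new + eps * np.eye(6)[j]) - base) / eps
                   for j in range(6)]
        print("suspected fault variable:", int(np.argmax(np.abs(contrib))))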

  6. Environmental Forces - Some Esthetic Components.

    Science.gov (United States)

    Severino, D. Alexander

    Although our system of mass culture has raised our civilization to an extremely high level of material success and affluence, the fact remains that this system has the inherent flaw of not fully recognizing the esthetic needs of man. To overcome this weakness we need to re-introduce into the system a sizable component of first-hand experience…

  7. Components in the interstellar medium

    International Nuclear Information System (INIS)

    Martin, E.R.

    1981-01-01

    An analysis is made of the lines of sight toward 32 stars with a procedure that gives velocity components for various interstellar ions. The column densities found for species expected to be relatively undepleted are used to estimate the column density of neutral hydrogen in each component. Whenever possible, the molecular hydrogen excitation temperature, abundances (relative to S II), electron density, and hydrogen volume density are calculated for each component. The results for each star are combined to give the total HI column density as a function of (LSR) velocity. The derived velocities correspond well with those found in optical studies. The mean electron density is found to be approximately constant with velocity, but the mean hydrogen volume density is found to vary. The data presented here are consistent with the assumption that some of the velocity components are due to circumstellar material. The total HI column density toward a given star is generally in agreement with Lyman alpha measurements, but ionization and abundance effects are important toward some stars. The total HI column density is found to vary exponentially with velocity (for N(HI) > 10^17 cm^-2), with an indication that the velocity dispersion at low column densities (N(HI) < 10^17 cm^-2) is approximately constant. An estimate is made of the kinetic energy density due to cloud motion, which depends only on the total HI column density as a function of velocity. The value of 9 x 10^42 erg/pc^3 is in good agreement with a theoretical prediction

  8. Inspection of disposal canisters components

    International Nuclear Information System (INIS)

    Pitkaenen, J.

    2013-12-01

    This report presents the inspection techniques for disposal canister components. Manufacturing methods and a description of the defects related to the different manufacturing methods are described briefly. The defect types form a basis for the design of non-destructive testing, because the defect types that occur in the inspected components affect the choice of inspection methods. The canister components are the nodular cast iron insert, steel lid, lid screw, metal gasket, copper tube with integrated or separate bottom, and copper lid. The inspection of the copper material is challenging due to the anisotropic properties of the material and local changes in its grain size. The cast iron insert shows some variation in acoustic material properties (attenuation, velocity changes, scattering properties), which makes the ultrasonic inspection demanding from a calibration point of view. Mainly three different methods are used for inspection: ultrasonic testing for the inspection of the volume, and eddy current testing (for copper components only) and visual testing for the inspection of the surface and near-surface area

  9. Modelling Livestock Component in FSSIM

    NARCIS (Netherlands)

    Thorne, P.J.; Hengsdijk, H.; Janssen, S.J.C.; Louhichi, K.; Keulen, van H.; Thornton, P.K.

    2009-01-01

    This document summarises the development of a ruminant livestock component for the Farm System Simulator (FSSIM). This includes treatments of energy and protein transactions in ruminant livestock that have been used as a basis for the biophysical simulations that will generate the input production

  10. A forgotten component of biodiversity

    Indian Academy of Sciences (India)

    2011-07-08

    Clipboard: Helminth richness in Arunachal Pradesh fishes: A forgotten component of biodiversity. Amit Tripathi. Journal of Biosciences, Volume 36, Issue 4, September 2011, pp 559-561.

  11. Detecting decay in wood components

    Science.gov (United States)

    R.J. Ross; X. Wang; B.K. Brashaw

    2005-01-01

    This chapter presents a summary of the Wood and Timber Condition Assessment Manual. It focuses on current inspection techniques for decay detection and provides guidelines on the use of various non-destructive evaluation (NDE) methods in locating and defining areas of deterioration in timber bridge components and other civil structures.

  12. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of the connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for Wolsong and Qinshan, called the IA, and to use a new hardware platform in order to ensure successful operation over the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures

  13. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Computed tomography (CT) of the sinuses ... What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  14. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; an atlas of computed tomography of the normal adult; clinical applications of computed tomography; and radiotherapy planning and computed tomography.

  15. Cold component flow in a two-component mirror machine

    International Nuclear Information System (INIS)

    Rognlien, T.D.

    1975-12-01

    Steady-state solutions are given for the flow characteristics along the magnetic field of the cold plasma component in a two-component mirror machine. The hot plasma component is represented by a fixed density profile. The fluid equations are used to describe the cold plasma, which is assumed to be generated in a localized region at one end of the machine. The ion flow speed, v_i, is required to satisfy the Bohm sheath condition at the end walls, i.e., v_i >= c_s, where c_s is the ion-acoustic speed. For the case when the cold plasma density, n_c, is much less than the hot plasma density, n_h, the cold plasma is stagnant and does not penetrate through the machine in the zero-temperature case. The effect of a finite temperature is to allow the penetration of a small amount of cold plasma through the machine. For the density range n_c ~ n_h, the flow solutions are asymmetric about the midplane and have v_i = c_s near the midplane. Finally, for n_c much greater than n_h, the solutions become symmetric about the midplane and approach the Lee-McNamara type solutions with v_i = c_s near the mirror throats
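
    For reference, the ion-acoustic speed appearing in the Bohm condition above has the standard cold-ion textbook form (a general plasma-physics relation, not something derived in this record); in LaTeX notation:

        v_i \ge c_s, \qquad c_s = \sqrt{\frac{k_B T_e}{m_i}},

    where T_e is the electron temperature and m_i the ion mass.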

  16. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of the computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and the use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be solved on computers. Flow diagrams, methods of ampl...

  17. Diamond High Assurance Security Program: Trusted Computing Exemplar

    Science.gov (United States)

    2002-09-01

    computing component, the Embedded MicroKernel Prototype. A third-party evaluation of the component will be initiated during development (e.g., once ... target technologies and larger projects is a topic for future research. Trusted Computing Reference Component - The Embedded MicroKernel Prototype. We ... Kernel. The primary security function of the Embedded MicroKernel will be to enforce process and data-domain separation, while providing primitive ...

  18. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  19. Computer control of fuel handling activities at FFTF

    International Nuclear Information System (INIS)

    Romrell, D.M.

    1985-03-01

    The Fast Flux Test Facility near Richland, Washington, utilizes computer control for reactor refueling and other related core component handling and processing tasks. The computer controlled tasks described in this paper include core component transfers within the reactor vessel, core component transfers into and out of the reactor vessel, remote duct measurements of irradiated core components, remote duct cutting, and finally, transferring irradiated components out of the reactor containment building for off-site shipments or to long term storage. 3 refs., 16 figs

  20. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than 'conventional' quantum computers.

  1. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala...

  2. Research on development model of nuclear component based on life cycle management

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan

    2005-01-01

    At present, the development process of a nuclear component, and even the nuclear component itself, is more and more supported by computer technology. This increasing utilization of computers and software has led to faster development of nuclear technology on one hand, but has also brought new problems on the other. In particular, the combination of hardware, software and humans has increased nuclear component system complexity to an unprecedented level. To solve this problem, Life Cycle Management technology is adopted for nuclear component systems, and an intensive discussion of the development process of a nuclear component is proposed. According to the characteristics of nuclear component development, such as the complexity and strict safety requirements of nuclear components, the long design period, changeable design specifications and requirements, high capital investment, and the need to satisfy engineering codes/standards, a development life-cycle model of the nuclear component is presented. The development life-cycle model is classified into three levels, namely, the component-level development life-cycle, the sub-component development life-cycle and the component-level verification/certification life-cycle. The purposes and outcomes of the development processes are stated in detail. A process framework for nuclear components based on systems engineering and a development environment for nuclear components are discussed for future research work. (authors)

  3. Determination of the optimal number of components in independent components analysis.

    Science.gov (United States)

    Kassouf, Amine; Jouan-Rimbaud Bouveresse, Delphine; Rutledge, Douglas N

    2018-03-01

    Independent components analysis (ICA) may be considered as one of the most established blind source separation techniques for the treatment of complex data sets in analytical chemistry. Like other similar methods, the determination of the optimal number of latent variables, in this case, independent components (ICs), is a crucial step before any modeling. Therefore, validation methods are required in order to decide about the optimal number of ICs to be used in the computation of the final model. In this paper, three new validation methods are formally presented. The first one, called Random_ICA, is a generalization of the ICA_by_blocks method. Its specificity resides in the random way of splitting the initial data matrix into two blocks, and then repeating this procedure several times, giving a broader perspective for the selection of the optimal number of ICs. The second method, called KMO_ICA_Residuals is based on the computation of the Kaiser-Meyer-Olkin (KMO) index of the transposed residual matrices obtained after progressive extraction of ICs. The third method, called ICA_corr_y, helps to select the optimal number of ICs by computing the correlations between calculated proportions and known physico-chemical information about samples, generally concentrations, or between a source signal known to be present in the mixture and the signals extracted by ICA. These three methods were tested using varied simulated and experimental data sets and compared, when necessary, to ICA_by_blocks. Results were relevant and in line with expected ones, proving the reliability of the three proposed methods. Copyright © 2017 Elsevier B.V. All rights reserved.
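
    Of the three validation methods, ICA_corr_y is the simplest to sketch: extract an increasing number of ICs and correlate the extracted signals with a source known to be present in the mixture. The sketch below follows that description using scikit-learn's FastICA on synthetic mixtures; it is not the authors' code, and the data are made up:

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(2)
        t = np.linspace(0.0, 8.0, 500)
        sources = np.c_[np.sin(2 * t), np.sign(np.cos(3 * t)),
                        rng.normal(size=500)]
        X = sources @ rng.normal(size=(3, 5))   # 5 observed "sensor" mixtures
        known = sources[:, 0]                   # source known to be present

        for n in range(1, 5):
            ics = FastICA(n_components=n, random_state=0).fit_transform(X)
            best = max(abs(np.corrcoef(known, ics[:, j])[0, 1])
                       for j in range(n))
            print(f"{n} ICs: best |corr| with known source = {best:.3f}")

    A plateau in this correlation as further ICs are added is one indication that enough components have been extracted.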

  4. Language interoperability for high-performance parallel scientific components

    International Nuclear Information System (INIS)

    Elliot, N; Kohn, S; Smolinski, B

    1999-01-01

    With the increasing complexity and interdisciplinary nature of scientific applications, code reuse is becoming increasingly important in scientific computing. One method for facilitating code reuse is the use of component technologies, which have been used widely in industry. However, components have only recently worked their way into scientific computing. Language interoperability is an important underlying technology for these component architectures. In this paper, we present an approach to language interoperability for a high-performance parallel component architecture being developed by the Common Component Architecture (CCA) group. Our approach is based on Interface Definition Language (IDL) techniques. We have developed a Scientific Interface Definition Language (SIDL), as well as bindings to C and Fortran. We have also developed a SIDL compiler and run-time library support for reference counting, reflection, object management, and exception handling (Babel). Results from using Babel to call a standard numerical solver library (written in C) from C and Fortran show that the cost of using Babel is minimal, whereas the savings in development time and the benefits of object-oriented development support for C and Fortran far outweigh the costs

  5. NET-COMPUTER: Internet Computer Architecture and its Application in E-Commerce

    OpenAIRE

    P. O. Umenne; M. O. Odhiambo

    2012-01-01

    Research in Intelligent Agents has yielded interesting results, some of which have been translated into commercial ventures. Intelligent Agents are executable software components that represent the user, perform tasks on behalf of the user and, when the task terminates, send the result to the user. Intelligent Agents are best suited for the Internet: a collection of computers connected together in a world-wide computer network. Swarm and HYDRA computer architectures for Agents' ex...

  6. Interoperability and Security Support for Heterogeneous COTS/GOTS/Legacy Component-Based Architecture

    National Research Council Canada - National Science Library

    Tran, Tam

    2000-01-01

    There is a need for Commercial-off-the-shelf (COTS), Government-off-the-shelf (GOTS) and legacy components to interoperate in a secure distributed computing environment in order to facilitate the development of evolving applications...

  7. CLOUD COMPUTING SECURITY

    Directory of Open Access Journals (Sweden)

    Ştefan IOVAN

    2016-05-01

    Cloud computing represents software applications offered as a service online, as well as the software and hardware components in the data center. In the case of services offered widely to any type of client, we are dealing with a public cloud. In the other case, in which a cloud is exclusively available to one organization and not to the general public, it is considered a private cloud [1]. There is also a third type, called hybrid, in which a user or an organization might use services available in both the public and the private cloud. One of the main challenges of cloud computing is to build trust and to ensure information privacy in every aspect of the services offered by cloud computing. The variety of existing standards, like the lack of clarity in sustainability certification, is no real help in building trust. Question marks also arise regarding the efficiency of traditional security means applied in the cloud domain. Besides the economic and technological advantages offered by the cloud, there are also advantages in the security area if information is migrated to the cloud. Shared resources available in the cloud include monitoring, the use of "best practices" and technology for an advanced security level, beyond the solutions available to the majority of medium and small businesses, big companies and even some governmental organizations [2].

  8. Cylinder components properties, applications, materials

    CERN Document Server

    2016-01-01

    Owing to the ever-increasing requirements to be met by gasoline and diesel engines in terms of CO2 reduction, emission behavior, weight, and service life, a comprehensive understanding of combustion engine components is essential today. It is no longer possible for professionals in automotive engineering to manage without the corresponding expertise, whether they work in the field of design, development, testing, or maintenance. This technical book provides in-depth answers to questions about the design, production, and machining of cylinder components. In this second edition, every section has been revised and expanded to include the latest developments in the combustion engine. Content: Piston rings; Piston pins and piston pin circlips; Bearings; Connecting rods; Crankcase and cylinder liners. Target audience: Engineers in the field of engine development and maintenance; Lecturers and students in the areas of mechanical engineering, engine technology, and vehicle construction; Anyone interested in technology. Publisher: MAH...

  9. Tritium in fusion reactor components

    International Nuclear Information System (INIS)

    Watson, J.S.; Fisher, P.W.; Talbot, J.B.

    1980-01-01

    When tritium is used in a fusion energy experiment or reactor, several implications affect and usually restrict the design and operation of the system and involve questions of containment, inventory, and radiation damage. Containment is expected to be particularly important both for high-temperature components and for those components that are prone to require frequent maintenance. Inventory is currently of major significance in cases where safety and environmental considerations limit the experiments to very low levels of tritium. Fewer inventory restrictions are expected as fusion experiments are placed in more-remote locations and as the fusion community gains experience with the use of tritium. However, the advent of power-producing experiments with high-duty cycle will again lead to serious difficulties based principally on tritium availability; cyclic operations with significant regeneration times are the principal problems

  10. Hg ion thruster component testing

    Science.gov (United States)

    Mantenieks, M. A.

    1979-01-01

    Cathodes, isolators, and vaporizers are critical components in determining the performance and lifetime of mercury ion thrusters. The results of life tests of several of these components are reported. A 30-cm thruster cathode-isolator-vaporizer (CIV) test in a bell jar has successfully accumulated over 26,000 hours. The cathode has undergone 65 restarts during the life test without requiring any appreciable increase in starting power. Recently, all restarts have been achieved with only the 44-volt keeper supply, with no change required in the starting power. Another ongoing 30-cm Hg thruster cathode test has successfully passed the 10,000-hour mark. A solid-insert, 8-cm thruster cathode has accumulated over 4,000 hours of thruster operation. All starts have been achieved without the use of a high-voltage ignitor. The results of this test indicate that the solid impregnated insert is a viable neutralizer cathode for the 8-cm thruster.

  11. Component processes underlying future thinking.

    Science.gov (United States)

    D'Argembeau, Arnaud; Ortoleva, Claudia; Jumentier, Sabrina; Van der Linden, Martial

    2010-09-01

    This study sought to investigate the component processes underlying the ability to imagine future events, using an individual-differences approach. Participants completed several tasks assessing different aspects of future thinking (i.e., fluency, specificity, amount of episodic details, phenomenology) and were also assessed with tasks and questionnaires measuring various component processes that have been hypothesized to support future thinking (i.e., executive processes, visual-spatial processing, relational memory processing, self-consciousness, and time perspective). The main results showed that executive processes were correlated with various measures of future thinking, whereas visual-spatial processing abilities and time perspective were specifically related to the number of sensory descriptions reported when specific future events were imagined. Furthermore, individual differences in self-consciousness predicted the subjective feeling of experiencing the imagined future events. These results suggest that future thinking involves a collection of processes that are related to different facets of future-event representation.

  12. Nuclear power plant component protection

    International Nuclear Information System (INIS)

    Michel, E.; Ruf, R.; Dorner, H.

    1976-01-01

    Described is a nuclear power plant installation which includes a concrete biological shield forming a pit in which a reactor pressure vessel is positioned. A steam generator on the outside of the shield is connected with the pressure vessel via coolant pipe lines which extend through the shield, the coolant circulation being provided by a coolant pump which is also outside the shield. To protect these components outside the shield, which are of mainly or substantially cylindrical shape, semicylindrical concrete segments are interfitted around them to form complete outer cylinders; these are retained against radial outward separation from the components by rings of high-tensile steel, which may be spaced so closely that they provide, in effect, an outer steel cylinder. The invention is particularly applicable to pressurized-water coolant reactor installations

  13. Computational Methods for Biomolecular Electrostatics

    Science.gov (United States)

    Dong, Feng; Olsen, Brett; Baker, Nathan A.

    2008-01-01

    An understanding of intermolecular interactions is essential for insight into how cells develop, operate, communicate and control their activities. Such interactions include several components: contributions from linear, angular, and torsional forces in covalent bonds, van der Waals forces, as well as electrostatics. Among the various components of molecular interactions, electrostatics are of special importance because of their long range and their influence on polar or charged molecules, including water, aqueous ions, and amino or nucleic acids, which are some of the primary components of living systems. Electrostatics, therefore, play important roles in determining the structure, motion and function of a wide range of biological molecules. This chapter presents a brief overview of electrostatic interactions in cellular systems with a particular focus on how computational tools can be used to investigate these types of interactions. PMID:17964951
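
    As a concrete instance of the electrostatic component, the pairwise Coulomb energy of point charges in a uniform dielectric is the simplest computable model. This is a minimal sketch only; serious biomolecular work uses Poisson-Boltzmann or similar solvers, and the coordinates and charges below are made up:

        import numpy as np

        COULOMB = 332.06   # approx. conversion factor, kcal*A/(mol*e^2)

        def coulomb_energy(coords, charges, dielectric=78.5):
            """Pairwise Coulomb energy of point charges (kcal/mol)."""
            diff = coords[:, None, :] - coords[None, :, :]
            r = np.sqrt((diff ** 2).sum(-1))
            i, j = np.triu_indices(len(charges), k=1)   # unique pairs only
            return COULOMB * np.sum(charges[i] * charges[j]
                                    / (dielectric * r[i, j]))

        coords = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0], [0.0, 4.0, 0.0]])
        charges = np.array([-1.0, 1.0, -0.5])
        print(f"Coulomb energy: {coulomb_energy(coords, charges):.2f} kcal/mol")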

  14. Detection of incorrect manufacturer labelling of hip components

    Energy Technology Data Exchange (ETDEWEB)

    Durand-Hill, Matthieu; Henckel, Johann; Skinner, John; Hart, Alister [University College London, Institute of Orthopaedics, London (United Kingdom); Burwell, Matthew [Royal United Hospital, Bath (United Kingdom)

    2017-01-15

    We describe the case of a 53-year-old man who underwent a left metal-on-metal hip resurfacing in 2015. Component size mismatch (CSM) was suspected because of the patient's immediate post-operative mechanical symptoms and high metal ion levels. Surgical notes indicated the appropriate combinations of implants were used. However, we detected a mismatch using computed tomography. Revision was performed and subsequent measurements of explanted components confirmed the mismatch. To our knowledge, this case is the first report of a CT method being used in a patient to pre-operatively identify CSM. (orig.)

  15. Holistic and component plant phenotyping using temporal image sequence.

    Science.gov (United States)

    Das Choudhury, Sruti; Bashyam, Srinidhi; Qiu, Yumou; Samal, Ashok; Awada, Tala

    2018-01-01

    Image-based plant phenotyping facilitates the extraction of traits noninvasively by analyzing a large number of plants in a relatively short period of time. It has the potential to compute advanced phenotypes by considering the whole plant as a single object (holistic phenotypes) or as individual components, i.e., leaves and the stem (component phenotypes), to investigate the biophysical characteristics of the plants. The emergence timing, the total number of leaves present at any point of time, and the growth of individual leaves during the vegetative-stage life cycle of maize plants are significant phenotypic expressions that best contribute to assessing plant vigor. However, an image-based automated solution to this novel problem is yet to be explored. A set of new holistic and component phenotypes are introduced in this paper. To compute the component phenotypes, it is essential to detect the individual leaves and the stem. Thus, the paper introduces a novel method to reliably detect the leaves and the stem of maize plants by analyzing 2-dimensional visible-light image sequences captured from the side, using a graph-based approach. The total number of leaves is counted and the length of each leaf is measured for all images in the sequence to monitor leaf growth. To evaluate the performance of the proposed algorithm, we introduce the University of Nebraska-Lincoln Component Plant Phenotyping Dataset (UNL-CPPD) and provide ground truth to facilitate new algorithm development and uniform comparison. The temporal variation of the component phenotypes regulated by genotypes and environment (i.e., greenhouse) is experimentally demonstrated for maize plants on UNL-CPPD. Statistical models are applied to analyze the greenhouse environment impact and demonstrate the genetic regulation of the temporal variation of the holistic phenotypes on the public dataset called Panicoid Phenomap-1. The central contribution of the paper is a novel computer-vision-based algorithm for ...
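
    The graph-based detection itself is beyond a short sketch, but the leaf-counting idea can be illustrated: skeletonize a binary plant mask and treat skeleton endpoints (pixels with exactly one skeleton neighbor) as candidate leaf tips. This is a rough stand-in, not the authors' algorithm, and the toy mask below replaces a real segmented side-view image:

        import numpy as np
        from scipy.ndimage import convolve
        from skimage.morphology import skeletonize

        def count_leaf_tips(plant_mask):
            """Count skeleton endpoints as candidate leaf tips."""
            skel = skeletonize(plant_mask.astype(bool))
            # Number of 8-connected skeleton neighbors at each skeleton pixel:
            nbrs = convolve(skel.astype(int), np.ones((3, 3), int),
                            mode="constant") - skel
            return int(np.sum(skel & (nbrs == 1)))

        mask = np.zeros((60, 60), dtype=bool)   # toy plant: stem plus two leaves
        mask[10:55, 30] = True                  # stem
        mask[20, 30:45] = True                  # leaf 1
        mask[35, 15:31] = True                  # leaf 2
        print("candidate tips:", count_leaf_tips(mask))

    One endpoint corresponds to the stem base, so a real pipeline would still need to separate the stem from the leaves, which is what the paper's graph analysis provides.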

  16. Sparse Principal Component Analysis in Medical Shape Modeling

    DEFF Research Database (Denmark)

    Sjöstrand, Karl; Stegmann, Mikkel Bille; Larsen, Rasmus

    2006-01-01

    Principal component analysis (PCA) is a widely used tool in medical image analysis for data reduction, model building, and data understanding and exploration. While PCA is a holistic approach where each new variable is a linear combination of all original variables, sparse PCA (SPCA) aims ... analysis in medicine. Results for three different data sets are given in relation to standard PCA and sparse PCA by simple thresholding of sufficiently small loadings. Focus is on a recent algorithm for computing sparse principal components, but a review of other approaches is supplied as well. The SPCA...
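
    The simple-thresholding baseline mentioned above is easy to sketch: compute ordinary PCA loadings, zero out sufficiently small entries, and re-normalize. This is the baseline the record compares against, not the SPCA algorithm itself; the data and the threshold are illustrative:

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)
        X = rng.normal(size=(100, 20))      # stand-in for shape-model variables

        loadings = PCA(n_components=5).fit(X).components_
        tau = 0.15                          # illustrative threshold
        sparse = np.where(np.abs(loadings) >= tau, loadings, 0.0)

        # Re-normalize each component so retained loadings keep unit length.
        norms = np.linalg.norm(sparse, axis=1, keepdims=True)
        sparse = np.divide(sparse, norms, out=np.zeros_like(sparse),
                           where=norms > 0)
        print("nonzero loadings per component:", (sparse != 0).sum(axis=1))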

  17. Open Component Portability Infrastructure (OPENCPI)

    Science.gov (United States)

    2013-03-01

    declaration of the authored worker that must be implemented. If there is an existing legacy VHDL entity architecture, then it is wrapped or modified to ... that the software implementations were written to. Since all of the original code was VHDL, the HDL Authoring Model for VHDL was enhanced to meet ... engineering process. This application was completed for the execution of all the components, the software implementations, and the VHDL skeletons for the ...

  18. Chemical decontamination of reactor components

    International Nuclear Information System (INIS)

    Riess, R.; Berthold, H.O.

    1977-08-01

    A solution for the decontamination of reactor components of the primary system was developed. This solution is a modification of the APAC (Alkaline Permanganate Ammonium Citrate) system described in the literature. The most important advantage of the present solution over the APAC method is that it does not induce any selective corrosion attack on materials like stainless steel (austenitic), Inconel 600 and Incoloy 800. (orig.) [de]

  19. The risk components of liquidity

    OpenAIRE

    Chollete, Lorán; Næs, Randi; Skjeltorp, Johannes A.

    2008-01-01

    Does liquidity risk differ depending on our choice of liquidity proxy? Unlike literature that considers common liquidity variation, we focus on identifying different components of liquidity, statistically and economically, using more than a decade of US transaction data. We identify three main statistical liquidity factors which are utilized in a linear asset pricing framework. We motivate a correspondence of the statistical factors to traditional dimensions of liquidity as well as the notion...

  20. Reformulating Component Identification as Document Analysis Problem

    NARCIS (Netherlands)

    Gross, H.G.; Lormans, M.; Zhou, J.

    2007-01-01

    One of the first steps of component procurement is the identification of required component features in large repositories of existing components. On the highest level of abstraction, component requirements as well as component descriptions are usually written in natural language. Therefore, we can ...