WorldWideScience

Sample records for large biomolecular systems

  1. Scalable Molecular Dynamics for Large Biomolecular Systems

    Directory of Open Access Journals (Sweden)

    Robert K. Brunner

    2000-01-01

    Full Text Available We present an optimized parallelization scheme for molecular dynamics simulations of large biomolecular systems, implemented in the production-quality molecular dynamics program NAMD. With an object-based hybrid force and spatial decomposition scheme, and an aggressive measurement-based predictive load balancing framework, we have attained speeds and speedups much higher than any reported in the literature so far. The paper first summarizes the broad methodology we are pursuing and the basic parallelization scheme we used. It then describes the optimizations that were instrumental in increasing performance, and presents performance results on benchmark simulations.
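
A rough illustration of the measurement-based load-balancing idea (not NAMD's actual algorithm, whose details are in the paper): a greedy balancer assigns measured per-object compute loads to the currently least-loaded processor. The object loads and processor count below are invented.

```python
import heapq

def balance(object_loads, n_procs):
    """Greedily assign measured per-object compute loads to processors:
    largest load first, always onto the currently least-loaded processor
    (the classic longest-processing-time heuristic)."""
    heap = [(0.0, p, []) for p in range(n_procs)]
    heapq.heapify(heap)
    for load in sorted(object_loads, reverse=True):
        total, p, objs = heapq.heappop(heap)
        objs.append(load)
        heapq.heappush(heap, (total + load, p, objs))
    return {p: (total, objs) for total, p, objs in heap}

# Invented loads: these six objects split evenly across two processors.
plan = balance([5.0, 3.0, 3.0, 2.0, 2.0, 1.0], n_procs=2)
```

In a real measurement-based framework the loads would be timings gathered during earlier timesteps, and the balancer would also weigh communication locality, which this sketch ignores.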

  2. Positrons in biomolecular systems. II

    International Nuclear Information System (INIS)

    Glass, J.C.; Graf, G.; Costabal, H.; Ewert, D.H.; English, L.

    1982-01-01

    Pickoff-annihilation parameters, as related to the free volume model, are shown to be indicators of structural fluctuations in membranes and membrane-bound proteins. Nitrous oxide anesthetic induces lateral rigidity in a membrane, and an anesthetic mechanism is suggested. Conformational changes of (Na+,K+)-ATPase in natural membrane are observed with ATP and Mg-ion binding. New positron applications to active transport and photosynthetic systems are suggested. (Auth.)

  3. A multiscale modeling approach for biomolecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Bowling, Alan, E-mail: bowling@uta.edu; Haghshenas-Jaryani, Mahdi, E-mail: mahdi.haghshenasjaryani@mavs.uta.edu [The University of Texas at Arlington, Department of Mechanical and Aerospace Engineering (United States)

    2015-04-15

    This paper presents a new multiscale molecular dynamics model for investigating the effects of external interactions, such as contact and impact, during stepping and docking of motor proteins and other biomolecular systems. The model retains the mass properties, ensuring that the result satisfies Newton’s second law. This idea is presented using a simple particle model to facilitate discussion of the rigid body model; the particle model nonetheless provides insights into particle dynamics at the nanoscale. The resulting three-dimensional model predicts a significant decrease in the effect of the random forces associated with Brownian motion. This conclusion runs contrary to the widely accepted notion that the motor protein’s movements are primarily the result of thermal effects. This work focuses on the mechanical aspects of protein locomotion; the effect of ATP hydrolysis is estimated as internal forces acting on the mechanical model. In addition, the proposed model can be numerically integrated in a reasonable amount of time. Herein, the differences between the motions predicted by the old and new modeling approaches are compared using a simplified model of myosin V.
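
The contrast the authors draw, between retaining the mass (inertial) term and dropping it, can be sketched with a one-dimensional Langevin integrator. The parameter values are illustrative and dimensionless, not taken from the motor-protein model.

```python
import math, random

def langevin_step(x, v, m, gamma, kT, dt, force=lambda x: 0.0):
    """One Euler-Maruyama step of underdamped Langevin dynamics.
    Keeping the inertial term m*dv/dt (Newton's second law) is the point
    of contrast with the common overdamped, massless approximation."""
    noise = math.sqrt(2.0 * gamma * kT / dt) * random.gauss(0.0, 1.0)
    a = (force(x) - gamma * v + noise) / m   # acceleration from F = m*a
    v = v + a * dt
    x = x + v * dt
    return x, v

# Illustrative run: free particle buffeted by thermal noise.
x, v = 0.0, 0.0
for _ in range(100):
    x, v = langevin_step(x, v, m=1.0, gamma=1.0, kT=1.0, dt=0.001)
```

With kT set to zero the noise vanishes and the velocity simply relaxes at the rate gamma/m, which makes the integrator easy to sanity-check.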

  4. Ion induced fragmentation of biomolecular systems at low collision energies

    International Nuclear Information System (INIS)

    Bernigaud, V; Adoui, L; Chesnel, J Y; Rangama, J; Huber, B A; Manil, B; Alvarado, F; Bari, S; Hoekstra, R; Postma, J; Schlathoelter, T

    2009-01-01

    In this paper, we present results of different collision experiments between multiply charged ions at low collision energies (in the keV region) and biomolecular systems. This kind of interaction makes it possible to remove electrons from the biomolecule without transferring a large amount of vibrational excitation energy. Nevertheless, following ionization of the target, fragmentation of biomolecular species may occur. The main objective of this work is to study the physical processes involved in the dissociation of highly electronically excited systems. In order to elucidate the intrinsic properties of certain biomolecules (porphyrins and amino acids), we have performed experiments in the gas phase with isolated systems. The results obtained demonstrate the high stability of porphyrins after electron removal. Furthermore, a dependence of the fragmentation pattern produced by multiply charged ions on the isomeric structure of the alanine molecule has been shown. By considering the presence of other surrounding biomolecules (clusters of nucleobases), a strong influence of the biomolecule's environment on the fragmentation channels and their modification has been clearly demonstrated. This result is explained, in the case of thymine and uracil, by the formation of hydrogen bonds between O and H atoms, which is known to favor planar cluster geometries.

  5. DNA-assisted swarm control in a biomolecular motor system.

    Science.gov (United States)

    Keya, Jakia Jannat; Suzuki, Ryuhei; Kabir, Arif Md Rashedul; Inoue, Daisuke; Asanuma, Hiroyuki; Sada, Kazuki; Hess, Henry; Kuzuya, Akinori; Kakugo, Akira

    2018-01-31

    In nature, swarming behavior has evolved repeatedly among motile organisms because it confers a variety of beneficial emergent properties. These include improved information gathering, protection from predators, and resource utilization. Some organisms, e.g., locusts, switch between solitary and swarm behavior in response to external stimuli. Aspects of swarming behavior have been demonstrated for motile supramolecular systems composed of biomolecular motors and cytoskeletal filaments, where cross-linkers induce large scale organization. The capabilities of such supramolecular systems may be further extended if the swarming behavior can be programmed and controlled. Here, we demonstrate that the swarming of DNA-functionalized microtubules (MTs) propelled by surface-adhered kinesin motors can be programmed and reversibly regulated by DNA signals. Emergent swarm behavior, such as translational and circular motion, can be selected by tuning the MT stiffness. Photoresponsive DNA containing azobenzene groups enables switching between solitary and swarm behavior in response to stimulation with visible or ultraviolet light.

  6. PREFACE: Radiation Damage in Biomolecular Systems (RADAM07)

    Science.gov (United States)

    McGuigan, Kevin G.

    2008-03-01

    , which include: theoretical, experimental, medical and computational physicists, radiation chemists, radiation biologists and microbiologists, among others. An important aspect of the previous three conferences in this series was to remove barriers between the different working groups and to encourage a more interdisciplinary approach to research collaborations. During RADAM07 we could observe the success of these efforts. A large number of presentations were based on new collaborations, many funded under the COST STSM programme, and all presentations led to lively discussions. It is clear from the discussions following many of the presentations and at the poster sessions that Radiation Damage in Biomolecular Systems remains a topic of increasing interest, relevance and importance. The success of this conference, as well as of the whole RADAM conference series, reflects the growing international interest in the area of interactions of ionizing radiation with biomolecules. Despite the scheduled conclusion in September 2007 of COST Action P9, which part-funded this and previous RADAM meetings, the cross-disciplinary interactions and opportunities for collaborative research were deemed so successful and valuable by the assembled delegates that it was agreed that another such meeting should be held in Debrecen, Hungary, from 13-15 June 2008 (http://www.isa.au.dk/meetings/radam2008/programme.html). Additional information about the RADAM07 programme is available on the conference web page (http://www.isa.au.dk/networks/cost/radam07/index.html). The Organizing Committee would like to thank all speakers, contributors, session chairs, referees and meeting staff for their efforts in making RADAM07 successful. The local Organizing Committee would like to thank Lorraine Monard and Margaret Nolan for their invaluable expertise in conference management.
We also gratefully acknowledge the financial support of our sponsors - The Royal College of Surgeons in Ireland (RCSI

  7. From dynamics to structure and function of model biomolecular systems

    NARCIS (Netherlands)

    Fontaine-Vive-Curtaz, F.

    2007-01-01

    The purpose of this thesis was to extend recent work on the structure and dynamics of hydrogen-bonded crystals to model biomolecular systems and biological processes. The tools that we have used are neutron scattering (NS), and density functional theory (DFT) and force field (FF) based simulation

  8. Optimal number of coarse-grained sites in different components of large biomolecular complexes.

    Science.gov (United States)

    Sinitskiy, Anton V; Saunders, Marissa G; Voth, Gregory A

    2012-07-26

    The computational study of large biomolecular complexes (molecular machines, cytoskeletal filaments, etc.) is a formidable challenge facing computational biophysics and biology. To achieve biologically relevant length and time scales, coarse-grained (CG) models of such complexes usually must be built and employed. One of the important early stages in this approach is to determine an optimal number of CG sites in different constituents of a complex. This work presents a systematic approach to this problem. First, a universal scaling law is derived and numerically corroborated for the intensity of the intrasite (intradomain) thermal fluctuations as a function of the number of CG sites. Second, this result is used for derivation of the criterion for the optimal number of CG sites in different parts of a large multibiomolecule complex. In the zeroth-order approximation, this approach validates the empirical rule of taking one CG site per fixed number of atoms or residues in each biomolecule, previously widely used for smaller systems (e.g., individual biomolecules). The first-order corrections to this rule are derived and numerically checked by the case studies of the Escherichia coli ribosome and Arp2/3 actin filament junction. In different ribosomal proteins, the optimal number of amino acids per CG site is shown to differ by a factor of 3.5, and an even wider spread may exist in other large biomolecular complexes. Therefore, the method proposed in this paper is valuable for the optimal construction of CG models of such complexes.
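
The zeroth-order rule mentioned in the abstract, one CG site per fixed number of residues, reduces to a one-line calculation. The residues-per-site ratio and domain sizes below are invented for illustration; the paper's contribution is the first-order corrections that make this ratio vary between components.

```python
import math

def cg_sites(n_residues, residues_per_site=4):
    """Zeroth-order rule: one CG site per fixed number of residues.
    The default of 4 residues/site is an illustrative assumption,
    not a value derived in the paper."""
    return max(1, math.ceil(n_residues / residues_per_site))

# Hypothetical domain sizes, to show the spread the rule produces.
plan = {name: cg_sites(n)
        for name, n in {"domain_A": 120, "domain_B": 35, "domain_C": 400}.items()}
```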

  9. Frequency-scanning MALDI linear ion trap mass spectrometer for large biomolecular ion detection.

    Science.gov (United States)

    Lu, I-Chung; Lin, Jung Lee; Lai, Szu-Hsueh; Chen, Chung-Hsuan

    2011-11-01

    This study presents the first report on the development of a matrix-assisted laser desorption ionization (MALDI) linear ion trap mass spectrometer for large biomolecular ion detection by frequency scanning. We designed, installed, and tested this radio frequency (RF) scan linear ion trap mass spectrometer and its associated electronics to dramatically extend the detectable mass range. The RF frequency can be adjusted from 300 kHz to 10 kHz with a set of operational amplifiers. To trap the ions produced by MALDI, helium buffer gas at high pressure was employed to quench the excess kinetic energy of the heavy ions produced by MALDI. The successful detection of singly charged secretory immunoglobulin A ions indicates that the detectable mass-to-charge ratio (m/z) of this system can reach ~385 000 or beyond.
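
The frequency-scan principle can be sketched from the standard Mathieu stability parameter of a quadrupole trap, q = 4eV/(mΩ²r₀²): at fixed RF amplitude, the ejected m/z scales as 1/Ω², which is why scanning the frequency down sweeps the mass range up. The amplitude, field radius, and ejection q below are illustrative assumptions, not the instrument's actual parameters.

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
AMU = 1.66053906660e-27      # atomic mass unit, kg

def ejection_mz(f_rf_hz, v_rf_volts, r0_m, q_eject=0.908):
    """m/z (Da per charge) ejected at RF drive frequency f_rf, from the
    Mathieu parameter q = 4eV / (m Omega^2 r0^2) of a quadrupole trap.
    At fixed amplitude V, the ejected m/z grows as 1/Omega^2."""
    omega = 2.0 * math.pi * f_rf_hz
    m_per_charge = 4.0 * E_CHARGE * v_rf_volts / (q_eject * omega**2 * r0_m**2)
    return m_per_charge / AMU   # convert kg per charge to Da per charge

# Invented operating point: 500 V amplitude, 4 mm field radius.
mz_high_f = ejection_mz(300e3, 500.0, 0.004)  # at 300 kHz
mz_low_f = ejection_mz(10e3, 500.0, 0.004)    # at 10 kHz
```

Dropping the frequency by a factor of 30 raises the accessible m/z by a factor of 900, consistent with the abstract's motivation for frequency scanning over voltage scanning.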

  10. Biomolecular System Design: Architecture, Synthesis, and Simulation

    OpenAIRE

    Chiang , Katherine

    2015-01-01

    The advancements in systems and synthetic biology have been broadening the range of realizable systems with increasing complexity both in vitro and in vivo. Systems for digital logic operations, signal processing, analog computation, program flow control, as well as those composed of different functions – for example an on-site diagnostic system based on multiple biomarker measurements and signal processing – have been realized successfully. However, the efforts to date tend to tackle each de...

  11. Electron-correlated fragment-molecular-orbital calculations for biomolecular and nano systems.

    Science.gov (United States)

    Tanaka, Shigenori; Mochizuki, Yuji; Komeiji, Yuto; Okiyama, Yoshio; Fukuzawa, Kaori

    2014-06-14

    Recent developments in the fragment molecular orbital (FMO) method for theoretical formulation, implementation, and application to nano and biomolecular systems are reviewed. The FMO method has enabled ab initio quantum-mechanical calculations for large molecular systems such as protein-ligand complexes at a reasonable computational cost in a parallelized way. There have been a wealth of application outcomes from the FMO method in the fields of biochemistry, medicinal chemistry and nanotechnology, in which the electron correlation effects play vital roles. With the aid of the advances in high-performance computing, the FMO method promises larger, faster, and more accurate simulations of biomolecular and related systems, including the descriptions of dynamical behaviors in solvent environments. The current status and future prospects of the FMO scheme are addressed in these contexts.
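
The core FMO bookkeeping, in its two-body (FMO2) form, is a sum of monomer energies plus pairwise corrections; the fragment energies below are made up purely to exercise the formula.

```python
def fmo2_energy(monomer_e, dimer_e):
    """Two-body FMO (FMO2) total energy:
        E = sum_I E_I + sum_{I<J} (E_IJ - E_I - E_J)
    where monomer_e maps fragment -> E_I and dimer_e maps (I, J) -> E_IJ."""
    total = sum(monomer_e.values())
    for (i, j), e_ij in dimer_e.items():
        total += e_ij - monomer_e[i] - monomer_e[j]
    return total

# Made-up fragment energies (hartree), not from any real calculation.
mono = {"A": -10.0, "B": -20.0, "C": -30.0}
dim = {("A", "B"): -30.5, ("A", "C"): -40.2, ("B", "C"): -50.1}
e_total = fmo2_energy(mono, dim)
```

The parallelism the abstract mentions comes from the fact that each monomer and dimer energy is an independent quantum-chemical calculation, so the expensive terms can be farmed out across nodes.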

  12. Role of biomolecular logic systems in biosensors and bioactuators

    Science.gov (United States)

    Mailloux, Shay; Katz, Evgeny

    2014-09-01

    An overview of recent advances in biosensors and bioactuators based on biocomputing systems is presented. Biosensors digitally process multiple biochemical signals through Boolean logic networks of coupled biomolecular reactions and produce an output in the form of a YES/NO response. Compared to traditional single-analyte sensing devices, the biocomputing approach enables high-fidelity multianalyte biosensing, which is particularly beneficial for biomedical applications. Multisignal digital biosensors thus promise advances in rapid diagnosis and treatment of diseases by processing complex patterns of physiological biomarkers. Specifically, they can provide timely detection and alert medical personnel of medical emergencies together with immediate therapeutic intervention. Application of the biocomputing concept has been successfully demonstrated for systems performing logic analysis of biomarkers corresponding to different injuries, particularly as exemplified for liver injury. Wide-ranging applications of multianalyte digital biosensors in medicine, environmental monitoring, and homeland security are anticipated. "Smart" bioactuators, for signal-triggered drug release, for example, were designed by interfacing switchable electrodes with biocomputing systems. Integration of biosensing and bioactuating systems with biomolecular information processing systems advances the potential for further scientific innovations and various practical applications.
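
A minimal sketch of the digital-biosensing idea: analog biomarker signals are thresholded to logic values and combined through a Boolean gate into a YES/NO output. ALT and LDH are plausible liver-injury inputs of the kind the abstract mentions, but the cutoffs here are invented, not clinical values.

```python
def digitize(signal, threshold):
    """Threshold an analog biochemical signal to a logic 0/1 value."""
    return 1 if signal >= threshold else 0

def liver_injury_alert(alt, ldh, alt_cut=2.0, ldh_cut=1.5):
    """Two-input AND gate over biomarker levels (fold over baseline).
    Cutoff values are illustrative assumptions only."""
    return digitize(alt, alt_cut) and digitize(ldh, ldh_cut)

alert = liver_injury_alert(alt=3.1, ldh=2.0)  # both elevated
```

The same thresholding pattern extends to larger networks of AND/OR gates, which is what "Boolean logic networks of coupled biomolecular reactions" implements chemically.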

  13. Biomolecular logic systems: applications to biosensors and bioactuators

    Science.gov (United States)

    Katz, Evgeny

    2014-05-01

    The paper presents an overview of recent advances in biosensors and bioactuators based on the biocomputing concept. Novel biosensors digitally process multiple biochemical signals through Boolean logic networks of coupled biomolecular reactions and produce an output in the form of a YES/NO response. Compared to traditional single-analyte sensing devices, the biocomputing approach enables high-fidelity multi-analyte biosensing, which is particularly beneficial for biomedical applications. Multi-signal digital biosensors thus promise advances in rapid diagnosis and treatment of diseases by processing complex patterns of physiological biomarkers. Specifically, they can provide timely detection of and alerts to medical emergencies, along with immediate therapeutic intervention. Application of the biocomputing concept has been successfully demonstrated for systems performing logic analysis of biomarkers corresponding to different injuries, particularly exemplified for liver injury. Wide-ranging applications of multi-analyte digital biosensors in medicine, environmental monitoring and homeland security are anticipated. "Smart" bioactuators, for example for signal-triggered drug release, were designed by interfacing switchable electrodes with biocomputing systems. Integration of novel biosensing and bioactuating systems with biomolecular information processing systems holds promise for further scientific advances and numerous practical applications.

  14. Multiresolution persistent homology for excessively large biomolecular datasets

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Kelin; Zhao, Zhixiong [Department of Mathematics, Michigan State University, East Lansing, Michigan 48824 (United States); Wei, Guo-Wei, E-mail: wei@math.msu.edu [Department of Mathematics, Michigan State University, East Lansing, Michigan 48824 (United States); Department of Electrical and Computer Engineering, Michigan State University, East Lansing, Michigan 48824 (United States); Department of Biochemistry and Molecular Biology, Michigan State University, East Lansing, Michigan 48824 (United States)

    2015-10-07

    Although persistent homology has emerged as a promising tool for the topological simplification of complex data, it is computationally intractable for large datasets. We introduce multiresolution persistent homology to handle excessively large datasets, matching the resolution with the scale of interest so as to represent large-scale datasets at the appropriate resolution. We utilize the flexibility-rigidity index to assess the topological connectivity of the data set and define a rigidity density for the filtration analysis. By appropriately tuning the resolution of the rigidity density, we are able to focus the topological lens on the scale of interest. The proposed multiresolution topological analysis is validated on a hexagonal fractal image which has three distinct scales. We further demonstrate the proposed method by extracting topological fingerprints from DNA molecules. In particular, the topological persistence of a virus capsid with 273 780 atoms is successfully analyzed; this system would otherwise be inaccessible to the normal point cloud method and unreliable with coarse-grained multiscale persistent homology. The proposed method has also been successfully applied to protein domain classification, which, to our knowledge, is the first practical use of persistent homology for protein domain analysis. The proposed multiresolution topological method has potential applications to arbitrary data sets, such as social networks, biological networks, and graphs.
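
The flexibility-rigidity-index construction behind the rigidity density can be sketched as a sum of decaying kernels centered on the atoms, with the kernel width η acting as the resolution knob the abstract describes. Atom positions and parameters here are illustrative, and a Gaussian-type kernel is assumed.

```python
import math

def rigidity_density(r, atoms, eta=1.0, kappa=2.0):
    """FRI-style rigidity density at point r:
        mu(r) = sum_j exp(-(|r - r_j| / eta)**kappa)
    Larger eta blurs atomic detail (coarse resolution); smaller eta
    resolves it (fine resolution)."""
    return sum(math.exp(-(math.dist(r, rj) / eta) ** kappa) for rj in atoms)

# Three invented atom positions (angstroms).
atoms = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 2.0, 0.0)]
fine = rigidity_density((0.0, 0.0, 0.0), atoms, eta=0.5)
coarse = rigidity_density((0.0, 0.0, 0.0), atoms, eta=4.0)
```

The multiresolution filtration then sweeps a level-set threshold over this density at the chosen η, rather than building a simplicial complex on the raw point cloud.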

  15. Single-molecule imaging and manipulation of biomolecular machines and systems.

    Science.gov (United States)

    Iino, Ryota; Iida, Tatsuya; Nakamura, Akihiko; Saita, Ei-Ichiro; You, Huijuan; Sako, Yasushi

    2018-02-01

    Biological molecular machines support various activities and behaviors of cells, such as energy production, signal transduction, growth, differentiation, and migration. We provide an overview of single-molecule imaging methods involving both small and large probes used to monitor the dynamic motions of molecular machines in vitro (purified proteins) and in living cells, and single-molecule manipulation methods used to measure the forces, mechanical properties and responses of biomolecules. We also introduce several examples of single-molecule analysis, focusing primarily on motor proteins and signal transduction systems. Single-molecule analysis is a powerful approach to unveil the operational mechanisms both of individual molecular machines and of systems consisting of many molecular machines. Quantitative, high-resolution single-molecule analyses of biomolecular systems at the various hierarchies of life will help to answer our fundamental question: "What is life?" This article is part of a Special Issue entitled "Biophysical Exploration of Dynamical Ordering of Biomolecular Systems" edited by Dr. Koichi Kato. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Evolution of biomolecular loadings along a major river system

    Science.gov (United States)

    Freymond, Chantal V.; Kündig, Nicole; Stark, Courcelle; Peterse, Francien; Buggle, Björn; Lupker, Maarten; Plötze, Michael; Blattmann, Thomas M.; Filip, Florin; Giosan, Liviu; Eglinton, Timothy I.

    2018-02-01

    Understanding the transport history and fate of organic carbon (OC) within river systems is crucial in order to constrain the dynamics and significance of land-ocean interactions as a component of the global carbon cycle. Fluvial export and burial of terrestrial OC in marine sediments influences atmospheric CO2 over a range of timescales, while river-dominated sedimentary sequences can provide valuable archives of paleoenvironmental information. While there is abundant evidence that the association of organic matter (OM) with minerals exerts an important influence on its stability as well as hydrodynamic behavior in aquatic systems, there is a paucity of information on where such associations form and how they evolve during fluvial transport. Here, we track total organic carbon (TOC) and terrestrial biomarker concentrations (plant wax-derived long-chain fatty acids (FA), branched glycerol dialkyl glycerol tetraethers (brGDGTs) and lignin-derived phenols) in sediments collected along the entire course of the Danube River system in the context of sedimentological parameters. Mineral-specific surface area-normalized biomarker and TOC concentrations show a systematic decrease from the upper to the lower Danube basin. Changes in OM loading of the available mineral phase correspond to a net decrease of 70-80% of different biomolecular components. Ranges for biomarker loadings on Danube River sediments, corresponding to 0.4-1.5 μg FA/m2 for long-chain (n-C24-32) fatty acids and 17-71 ng brGDGT/m2 for brGDGTs, are proposed as a benchmark for comparison with other systems. We propose that normalizing TOC as well as biomarker concentrations to mineral surface area provides valuable quantitative constraints on OM dynamics and organo-mineral interactions during fluvial transport from terrigenous source to oceanic sink.
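
The normalization the authors propose is a simple ratio of OC content to mineral-specific surface area. The numbers below are invented (not Danube data) but are chosen to reproduce a net decrease in the 70-80% range reported in the abstract.

```python
def oc_loading(toc_wt_percent, mineral_sa_m2_per_g):
    """Organic-carbon loading normalized to mineral-specific surface area,
    in mg OC per m^2 of mineral surface. 1 wt% OC = 10 mg OC per g sediment."""
    return toc_wt_percent * 10.0 / mineral_sa_m2_per_g

# Invented upstream/downstream values for illustration only.
upper_basin = oc_loading(toc_wt_percent=1.2, mineral_sa_m2_per_g=15.0)
lower_basin = oc_loading(toc_wt_percent=0.4, mineral_sa_m2_per_g=20.0)
net_decrease = 1.0 - lower_basin / upper_basin   # fraction of loading lost
```

Normalizing by surface area rather than bulk mass is what makes upstream and downstream sediments comparable despite differing grain size and mineralogy.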

  17. Engineering intracellular active transport systems as in vivo biomolecular tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bachand, George David; Carroll-Portillo, Amanda

    2006-11-01

    Active transport systems provide essential functions in terms of cell physiology and metastasis. These systems, however, are also co-opted by invading viruses, enabling directed transport of the virus to and from the cell's nucleus (i.e., the site of virus replication). Based on this concept, fundamentally new approaches for interrogating and manipulating the inner workings of living cells may be achievable by co-opting Nature's active transport systems as an in vivo biomolecular tool. The overall goal of this project was to investigate the ability to engineer kinesin-based transport systems for in vivo applications, specifically the collection of effector proteins (e.g., transcriptional regulators) within single cells. In the first part of this project, a chimeric fusion protein consisting of kinesin and a single chain variable fragment (scFv) of an antibody was successfully produced through a recombinant expression system. The kinesin-scFv retained both catalytic and antigenic functionality, enabling selective capture and transport of target antigens. The incorporation of a rabbit IgG-specific scFv into the kinesin established a generalized system for functionalizing kinesin with a wide range of target-selective antibodies raised in rabbits. The second objective was to develop methods of isolating the intact microtubule network from live cells as a platform for evaluating kinesin-based transport within the cytoskeletal architecture of a cell. Successful isolation of intact microtubule networks from two distinct cell types was demonstrated using glutaraldehyde and methanol fixation methods. This work provides a platform for inferring the ability of kinesin-scFv to function in vivo, and may also serve as a three-dimensional scaffold for evaluating and exploiting kinesin-based transport for nanotechnological applications. Overall, the technology developed in this project represents a first step in engineering active transport systems for in vivo

  18. New Distributed Multipole Methods for Accurate Electrostatics for Large-Scale Biomolecular Simulations

    Science.gov (United States)

    Sagui, Celeste

    2006-03-01

    An accurate and numerically efficient treatment of electrostatics is essential for biomolecular simulations, as this stabilizes much of the delicate 3-d structure associated with biomolecules. Currently, force fields such as AMBER and CHARMM assign ``partial charges'' to every atom in a simulation in order to model the interatomic electrostatic forces, so that the calculation of the electrostatics rapidly becomes the computational bottleneck in large-scale simulations. There are two main issues associated with the current treatment of classical electrostatics: (i) how does one eliminate, in a physically meaningful way, the artifacts associated with the point charges used in the force fields (e.g., the underdetermined nature of the current RESP fitting procedure for large, flexible molecules)? (ii) how does one efficiently simulate the very costly long-range electrostatic interactions? Recently, we have dealt with both of these challenges as follows. In order to improve the description of the molecular electrostatic potentials (MEPs), a new distributed multipole analysis based on localized functions -- Wannier, Boys, and Edmiston-Ruedenberg -- was introduced, which allows for a first-principles calculation of the partial charges and multipoles. Through a suitable generalization of the particle mesh Ewald (PME) and multigrid methods, one can treat electrostatic multipoles all the way to hexadecapoles without prohibitive extra cost. The importance of these methods for large-scale simulations will be discussed and exemplified by simulations of polarizable DNA models.

  19. Bookshelf: a simple curation system for the storage of biomolecular simulation data.

    Science.gov (United States)

    Vohra, Shabana; Hall, Benjamin A; Holdbrook, Daniel A; Khalid, Syma; Biggin, Philip C

    2010-01-01

    Molecular dynamics simulations can now routinely generate data sets several hundreds of gigabytes in size. Generating this data has become easier over recent years, and the rate of data production is likely to increase rapidly in the near future. One major problem associated with this vast amount of data is how to store it so that it can be easily retrieved at a later date. The obvious answer is a database. However, a key issue in the development and maintenance of such a database is its sustainability, which in turn depends on the ease of the deposition and retrieval process. Encouraging users to care about metadata is difficult, and thus the success of any storage system will ultimately depend on how readily end-users adopt it. In this respect we suggest that even a minimal amount of metadata, if stored in a sensible fashion, is useful, if only at the level of individual research groups. We discuss here a simple database system, which we call 'Bookshelf', that uses Python in conjunction with a MySQL database to provide an extremely simple system for curating and keeping track of molecular simulation data. It provides a user-friendly, scriptable solution to a problem common among biomolecular simulation laboratories: the storage, logging and subsequent retrieval of large numbers of simulations. Download URL: http://sbcb.bioch.ox.ac.uk/bookshelf/
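
A minimal curation store in the spirit of Bookshelf might look as follows. The original uses Python with a MySQL backend; sqlite3 is substituted here only so the sketch is self-contained, and the schema and field names are invented rather than Bookshelf's actual ones.

```python
import sqlite3

def open_store(path=":memory:"):
    """Open (or create) a tiny simulation-metadata store."""
    db = sqlite3.connect(path)
    db.execute(
        """CREATE TABLE IF NOT EXISTS simulations (
               id INTEGER PRIMARY KEY,
               name TEXT NOT NULL,
               software TEXT,
               length_ns REAL,
               data_path TEXT
           )"""
    )
    return db

def deposit(db, name, software, length_ns, data_path):
    """Record one simulation's minimal metadata."""
    db.execute(
        "INSERT INTO simulations (name, software, length_ns, data_path) "
        "VALUES (?, ?, ?, ?)",
        (name, software, length_ns, data_path),
    )
    db.commit()

def lookup(db, software):
    """Retrieve simulations run with a given package."""
    cur = db.execute(
        "SELECT name, length_ns FROM simulations WHERE software = ?",
        (software,),
    )
    return cur.fetchall()

db = open_store()
deposit(db, "lipid_bilayer_run1", "GROMACS", 500.0, "/data/run1")
hits = lookup(db, "GROMACS")
```

Even this minimal schema captures the point the authors make: a little well-structured metadata is enough to make hundreds of trajectories findable later.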

  20. Introduction to a Protein Interaction System Used for Quantitative Evaluation of Biomolecular Interactions

    OpenAIRE

    Yamniuk, Aaron

    2013-01-01

    A central goal of molecular biology is the determination of biomolecular function. This comes largely from a knowledge of the non-covalent interactions that biological small molecules and macromolecules experience. The fundamental mission of the Molecular Interactions Research Group (MIRG) of the ABRF is to show how solution biophysical tools are used to quantitatively characterize molecular interactions, and to educate the ABRF members and scientific community on the utility and limitations of core t...

  1. Rapid prototyping of nanofluidic systems using size-reduced electrospun nanofibers for biomolecular analysis.

    Science.gov (United States)

    Park, Seung-Min; Huh, Yun Suk; Szeto, Kylan; Joe, Daniel J; Kameoka, Jun; Coates, Geoffrey W; Edel, Joshua B; Erickson, David; Craighead, Harold G

    2010-11-05

    Biomolecular transport in nanofluidic confinement offers various means to investigate the behavior of biomolecules in their native aqueous environments, and to develop tools for diverse single-molecule manipulations. Recently, a number of simple nanofluidic fabrication techniques have been demonstrated that utilize electrospun nanofibers as a backbone structure. These techniques are limited by the arbitrary dimensions of the resulting nanochannels, due to the random nature of electrospinning. Here, a new method for fabricating nanofluidic systems from size-reduced electrospun nanofibers is reported and demonstrated. This method uses the scanned electrospinning technique to generate oriented sacrificial nanofibers and exposes these nanofibers to harsh but isotropic etching/heating environments to reduce their cross-sectional dimension. The creation of various nanofluidic systems as small as 20 nm is demonstrated, and practical examples of single-biomolecule handling, such as DNA elongation in nanochannels and fluorescence correlation spectroscopic analysis of biomolecules passing through nanochannels, are provided.

  2. Biomolecular Science (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2012-04-01

    A brief fact sheet about NREL Photobiology and Biomolecular Science. The research goal of NREL's Biomolecular Science is to enable cost-competitive advanced lignocellulosic biofuels production by understanding the science critical for overcoming biomass recalcitrance and developing new product and product intermediate pathways. NREL's Photobiology focuses on understanding the capture of solar energy in photosynthetic systems and its use in converting carbon dioxide and water directly into hydrogen and advanced biofuels.

  3. Nonlocal Dynamics in Nonlinear Biomolecular and Optical Systems

    DEFF Research Database (Denmark)

    Larsen, Peter Ulrik Vingaard

    2006-01-01

    The concept of nonlocality is attracting ever greater interest in the modeling of physical systems, and with good reason. That a model is nonlocal means that, in order to describe its physical properties correctly at a given point, it is not sufficient merely to consider the conditions at precisely...

  4. A compact imaging spectroscopic system for biomolecular detections on plasmonic chips.

    Science.gov (United States)

    Lo, Shu-Cheng; Lin, En-Hung; Wei, Pei-Kuen; Tsai, Wan-Shao

    2016-10-17

    In this study, we demonstrate a compact imaging spectroscopic system for high-throughput detection of biomolecular interactions on plasmonic chips, based on a curved grating as the key element for light diffraction and focusing. Both the curved grating and the plasmonic chips are fabricated on flexible plastic substrates using a gas-assisted thermal-embossing method. A fiber-coupled broadband light source and a camera are included in the system. Spectral resolution within 1 nm is achieved in sensing environmental index solutions and protein binding. The detected sensitivities of the plasmonic chip are comparable with those of a commercial spectrometer. An additional one-dimensional scanning stage enables high-throughput detection of protein binding on a designed plasmonic chip consisting of several nanoslit arrays with different periods. The detected resonance wavelengths match well with the grating equation under an air environment. Wavelength shifts between 1 and 9 nm are detected for antigens of various concentrations binding with antibodies. A simple, mass-producible and cost-effective method has thus been demonstrated for real-time, label-free, highly sensitive and high-throughput screening of biomolecular interactions.
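
The grating equation against which the resonances are matched, mλ = d(sin θᵢ + sin θₘ), is straightforward to evaluate; the period and wavelengths below are illustrative, not the chip's actual parameters.

```python
import math

def diffraction_angle_deg(wavelength_nm, period_nm, incidence_deg=0.0, order=1):
    """Diffraction angle from the grating equation
        m * wavelength = d * (sin(theta_i) + sin(theta_m)),
    solved for theta_m. Raises if the requested order is evanescent."""
    s = order * wavelength_nm / period_nm - math.sin(math.radians(incidence_deg))
    if abs(s) > 1.0:
        raise ValueError("diffraction order is evanescent at this wavelength")
    return math.degrees(math.asin(s))

# Invented 1200 nm period: longer wavelengths diffract to larger angles,
# which is how the curved grating spatially disperses the spectrum.
red = diffraction_angle_deg(650.0, period_nm=1200.0)
green = diffraction_angle_deg(532.0, period_nm=1200.0)
```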

  5. An optics-based variable-temperature assay system for characterizing thermodynamics of biomolecular reactions on solid support

    Energy Technology Data Exchange (ETDEWEB)

    Fei, Yiyan; Landry, James P.; Zhu, X. D., E-mail: xdzhu@physics.ucdavis.edu [Department of Physics, University of California, One Shields Avenue, Davis, California 95616 (United States); Li, Yanhong; Yu, Hai; Lau, Kam; Huang, Shengshu; Chokhawala, Harshal A.; Chen, Xi [Department of Chemistry, University of California, One Shields Avenue, Davis, California 95616 (United States)

    2013-11-15

    A biological state is an equilibrium of multiple concurrent biomolecular reactions. The relative importance of these reactions depends on physiological temperature, typically between 10 °C and 50 °C. Experimentally, the temperature dependence of binding reaction constants reveals the thermodynamics, and thus the details, of these biomolecular processes. We developed a variable-temperature opto-fluidic system for real-time measurement of multiple (400–10 000) biomolecular binding reactions on solid supports from 10 °C to 60 °C within ±0.1 °C. We illustrate the performance of this system with an investigation of the binding reactions of plant lectins (carbohydrate-binding proteins) with 24 synthetic glycans (i.e., carbohydrates). We found that lectin-glycan reactions in general can be enthalpy-driven, entropy-driven, or both, and that water molecules play critical roles in the thermodynamics of these reactions.
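Extracting enthalpy- versus entropy-driven character from temperature-dependent binding constants, as described above, is typically done with a van't Hoff fit, ln K = -ΔH/(RT) + ΔS/R. A minimal sketch with synthetic data (the ΔH and ΔS values are made up for illustration):

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

def vant_hoff_fit(temps_K, K_assoc):
    """Linear van't Hoff fit: ln K = -dH/(R*T) + dS/R.
    Returns (dH, dS) in J/mol and J/(mol*K)."""
    x = 1.0 / np.asarray(temps_K)
    y = np.log(np.asarray(K_assoc))
    slope, intercept = np.polyfit(x, y, 1)
    return -slope * R, intercept * R

# Synthetic enthalpy-driven binding: dH = -40 kJ/mol, dS = -50 J/(mol*K)
T = np.array([283.15, 298.15, 313.15, 333.15])
K = np.exp(-(-40000.0) / (R * T) + (-50.0) / R)
dH, dS = vant_hoff_fit(T, K)
print(dH, dS)  # recovers roughly -40000 and -50
```

With real data the fit quality (and curvature, indicating a heat-capacity change) is what distinguishes the enthalpy- and entropy-driven regimes mentioned in the abstract.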

  6. Hybrid Quantum Mechanics/Molecular Mechanics/Coarse Grained Modeling: A Triple-Resolution Approach for Biomolecular Systems.

    Science.gov (United States)

    Sokkar, Pandian; Boulanger, Eliot; Thiel, Walter; Sanchez-Garcia, Elsa

    2015-04-14

    We present a hybrid quantum mechanics/molecular mechanics/coarse-grained (QM/MM/CG) multiresolution approach for solvated biomolecular systems. The chemically important active-site region is treated at the QM level. The biomolecular environment is described by an atomistic MM force field, and the solvent is modeled with the CG Martini force field using standard or polarizable (pol-CG) water. Interactions within the QM, MM, and CG regions, and between the QM and MM regions, are treated in the usual manner, whereas the CG-MM and CG-QM interactions are evaluated using the virtual sites approach. The accuracy and efficiency of our implementation are tested for two enzymes, chorismate mutase (CM) and p-hydroxybenzoate hydroxylase (PHBH). In CM, the QM/MM/CG potential energy scans along the reaction coordinate yield reaction energies that are too large, both for the standard and polarizable Martini CG water models, which can be attributed to adverse effects of using large CG water beads. The inclusion of an atomistic MM water layer (10 Å for uncharged CG water and 5 Å for polarizable CG water) around the QM region improves the energy profiles compared to the reference QM/MM calculations. In analogous QM/MM/CG calculations on PHBH, the use of the pol-CG description for the outer water does not affect the stabilization of the highly charged FADHOOH-pOHB transition state compared to the fully atomistic QM/MM calculations. Detailed performance analysis in a glycine-water model system indicates that computation times for QM energy and gradient evaluations at the density functional level are typically reduced by 40-70% for QM/MM/CG relative to fully atomistic QM/MM calculations.

  7. Prediction of Biomolecular Complexes

    KAUST Repository

    Vangone, Anna

    2017-04-12

    Almost all processes in living organisms occur through specific interactions between biomolecules. Any dysfunction of those interactions can lead to pathological events. Understanding such interactions is therefore a crucial step in the investigation of biological systems and a starting point for drug design. In recent years, experimental studies have been devoted to unravel the principles of biomolecular interactions; however, due to experimental difficulties in solving the three-dimensional (3D) structure of biomolecular complexes, the number of available, high-resolution experimental 3D structures does not fulfill the current needs. Therefore, complementary computational approaches to model such interactions are necessary to assist experimentalists since a full understanding of how biomolecules interact (and consequently how they perform their function) only comes from 3D structures which provide crucial atomic details about binding and recognition processes. In this chapter we review approaches to predict biomolecular complexes, introducing the concept of molecular docking, a technique which uses a combination of geometric, steric and energetics considerations to predict the 3D structure of a biological complex starting from the individual structures of its constituent parts. We provide a mini-guide about docking concepts, its potential and challenges, along with post-docking analysis and a list of related software.

  8. Prediction of Biomolecular Complexes

    KAUST Repository

    Vangone, Anna; Oliva, Romina; Cavallo, Luigi; Bonvin, Alexandre M. J. J.

    2017-01-01

    Almost all processes in living organisms occur through specific interactions between biomolecules. Any dysfunction of those interactions can lead to pathological events. Understanding such interactions is therefore a crucial step in the investigation of biological systems and a starting point for drug design. In recent years, experimental studies have been devoted to unravel the principles of biomolecular interactions; however, due to experimental difficulties in solving the three-dimensional (3D) structure of biomolecular complexes, the number of available, high-resolution experimental 3D structures does not fulfill the current needs. Therefore, complementary computational approaches to model such interactions are necessary to assist experimentalists since a full understanding of how biomolecules interact (and consequently how they perform their function) only comes from 3D structures which provide crucial atomic details about binding and recognition processes. In this chapter we review approaches to predict biomolecular complexes, introducing the concept of molecular docking, a technique which uses a combination of geometric, steric and energetics considerations to predict the 3D structure of a biological complex starting from the individual structures of its constituent parts. We provide a mini-guide about docking concepts, its potential and challenges, along with post-docking analysis and a list of related software.

  9. REVIEW ARTICLE: How do biomolecular systems speed up and regulate rates?

    Science.gov (United States)

    Zhou, Huan-Xiang

    2005-09-01

    The viability of a biological system depends upon careful regulation of the rates of various processes. These rates have limits imposed by intrinsic chemical or physical steps (e.g., diffusion). These limits can be expanded by interactions and dynamics of the biomolecules. For example, (a) a chemical reaction is catalyzed when its transition state is preferentially bound to an enzyme; (b) the folding of a protein molecule is speeded up by specific interactions within the transition-state ensemble and may be assisted by molecular chaperones; (c) the rate of specific binding of a protein molecule to a cellular target can be enhanced by mechanisms such as long-range electrostatic interactions, nonspecific binding and folding upon binding; (d) directional movement of motor proteins is generated by capturing favorable Brownian motion through intermolecular binding energy; and (e) conduction and selectivity of ions through membrane channels are controlled by interactions and the dynamics of channel proteins. Simple physical models are presented here to illustrate these processes and provide a unifying framework for understanding speed attainment and regulation in biomolecular systems.
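Mechanism (c), enhancement of diffusion-limited association by long-range electrostatics, can be illustrated with the Smoluchowski rate and a Boltzmann-type enhancement factor in the spirit of Zhou's analysis. The diffusion constant, capture radius, and interaction energy below are typical order-of-magnitude values, not numbers from the review:

```python
import math

N_A = 6.02214076e23  # Avogadro's number

def smoluchowski_rate(D_m2s, R_m):
    """Diffusion-limited association rate k = 4*pi*D*R, converted to M^-1 s^-1."""
    k_m3s = 4.0 * math.pi * D_m2s * R_m   # per molecule pair, m^3/s
    return k_m3s * N_A * 1000.0           # m^3 -> L, per mole

def electrostatic_enhancement(k_basal, U_over_kT):
    """Zhou-style enhancement k = k_basal * exp(-<U>/kBT), where <U> is the
    average interaction free energy in the transient encounter complex."""
    return k_basal * math.exp(-U_over_kT)

k0 = smoluchowski_rate(1e-9, 1e-9)        # ~1e9-1e10 M^-1 s^-1 basal limit
k = electrostatic_enhancement(k0, -3.0)   # a -3 kT attraction gives ~20x speedup
print(f"{k0:.2e} {k:.2e}")
```

This reproduces the familiar result that favorable electrostatic steering can push association rates one to two orders of magnitude above the basal diffusion limit.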

  10. PREFACE: 1st Nano-IBCT Conference 2011 - Radiation Damage of Biomolecular Systems: Nanoscale Insights into Ion Beam Cancer Therapy

    Science.gov (United States)

    Huber, Bernd A.; Malot, Christiane; Domaracka, Alicja; Solov'yov, Andrey V.

    2012-07-01

    The 1st Nano-IBCT Conference entitled 'Radiation Damage in Biomolecular Systems: Nanoscale Insights into Ion Beam Cancer Therapy' was held in Caen, France, in October 2011. The Meeting was organised in the framework of the COST Action MP1002 (Nano-IBCT) which was launched in December 2010 (http://fias.uni-frankfurt.de/nano-ibct). This action aims to promote the understanding of mechanisms and processes underlying the radiation damage of biomolecular systems at the molecular and nanoscopic level and to use the findings to improve the strategy of Ion Beam Cancer Therapy. In the hope of achieving this, participants from different disciplines were invited to represent the fields of physics, biology, medicine and chemistry, and also included those from industry and the operators of hadron therapy centres. Ion beam therapy offers the possibility of excellent dose localization for treatment of malignant tumours, minimizing radiation damage in normal healthy tissue, while maximizing cell killing within the tumour. Several ion beam cancer therapy clinical centres are now operating in Europe and elsewhere. However, the full potential of such therapy can only be exploited by better understanding the physical, chemical and biological mechanisms that lead to cell death under ion irradiation. Considering a range of spatio-temporal scales, the proposed action therefore aims to combine the unique experimental and theoretical expertise available within Europe to acquire greater insight at the nanoscopic and molecular level into radiation damage induced by ion impact. Success in this endeavour will be both an important scientific breakthrough and give great impetus to the practical improvement of this innovative therapeutic technique. 
Ion therapy potentially provides an important advance in cancer therapy and the COST action MP1002 will be very significant in ensuring Europe's leadership in this field, providing the scientific background, required data and mechanistic insight which

  11. Applications of atomic force microscopy to the studies of biomaterials in biomolecular systems

    Science.gov (United States)

    Ma, Xiang

    Atomic force microscopy (AFM) is a unique tool for the studies of nanoscale structures and interactions. In this dissertation, I applied AFM to study transitions among multiple states of biomaterials in three different microscopic biomolecular systems: MukB-dependent DNA condensation, holdfast adhesion, and virus elasticity. To elucidate the mechanism of MukB-dependent DNA condensation, I have studied the conformational changes of MukB proteins as indicators for the strength of interactions between MukB, DNA and other molecular factors, such as magnesium and ParC proteins, using high-resolution AFM imaging. To determine the physical origins of holdfast adhesion, I have investigated the dynamics of adhesive force development of the holdfast, employing AFM force spectroscopy. By measuring rupture forces between the holdfast and the substrate, I showed that holdfast adhesion is strongly time-dependent and involves transformations at multiple time scales. Understanding the mechanisms of adhesion force development of the holdfast will be critical for future engineering of holdfast properties for various applications. Finally, I have examined the elasticity of self-assembled hepatitis B virus-like particles (HBV VLPs) and brome mosaic virus (BMV) in response to changes of pH and salinity, using AFM nanoindentation. The distributions of elasticity were mapped on a single particle level and compared between empty, RNA- and gold-filled HBV VLPs. I found that a single HBV VLP showed a heterogeneous distribution of elasticity and a two-step buckling transition, suggesting a discrete property of HBV capsids. For BMV, I have shown that viruses containing different RNA molecules can be distinguished by mechanical measurements, while they are indistinguishable by morphology. I also studied the effect of pH on the elastic behaviors of three-particle BMV and R3/4 BMV. 
This study can yield insights into RNA presentation/release mechanisms, and could help us to design novel drug
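AFM nanoindentation data of the kind described above are commonly analyzed with a Hertzian contact model. A minimal sketch; the modulus, Poisson ratio, tip radius, and indentation depth below are hypothetical illustration values, not results from the dissertation:

```python
import math

def hertz_force(E_Pa, nu, R_m, delta_m):
    """Hertz contact model for a spherical AFM tip indenting an elastic particle:
    F = (4/3) * E/(1 - nu^2) * sqrt(R) * delta^(3/2)."""
    return (4.0 / 3.0) * E_Pa / (1.0 - nu**2) * math.sqrt(R_m) * delta_m**1.5

# Hypothetical capsid: E = 0.25 GPa, nu = 0.4, 20 nm tip radius, 2 nm indentation
F = hertz_force(0.25e9, 0.4, 20e-9, 2e-9)
print(f"{F * 1e9:.2f} nN")  # a force in the nanonewton range
```

Fitting measured force-indentation curves with this relation yields the effective elastic modulus that is then mapped across a single particle.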

  12. Aligning Biomolecular Networks Using Modular Graph Kernels

    Science.gov (United States)

    Towfic, Fadi; Greenlee, M. Heather West; Honavar, Vasant

    Comparative analysis of biomolecular networks constructed using measurements from different conditions, tissues, and organisms offers a powerful approach to understanding the structure, function, dynamics, and evolution of complex biological systems. We explore a class of algorithms for aligning large biomolecular networks by breaking down such networks into subgraphs and computing the alignment of the networks based on the alignment of their subgraphs. The resulting subnetworks are compared using graph kernels as scoring functions. We provide implementations of the resulting algorithms as part of BiNA, an open source biomolecular network alignment toolkit. Our experiments using Drosophila melanogaster, Saccharomyces cerevisiae, Mus musculus and Homo sapiens protein-protein interaction networks extracted from the DIP repository of protein-protein interaction data demonstrate that the performance of the proposed algorithms (as measured by % GO term enrichment of subnetworks identified by the alignment) is competitive with some of the state-of-the-art algorithms for pair-wise alignment of large protein-protein interaction networks. Our results also show that the inter-species similarity scores computed based on graph kernels can be used to cluster the species into a species tree that is consistent with the known phylogenetic relationships among the species.
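The idea of scoring subgraph comparisons with a graph kernel can be illustrated with a toy walk-count kernel. This is a stand-in for exposition only, not one of the kernels actually implemented in BiNA:

```python
import numpy as np

def walk_kernel(A1, A2, length=3):
    """Toy graph kernel: compare two graphs by the total number of walks of
    each length up to `length`, scored with cosine similarity."""
    def walk_counts(A):
        A = np.asarray(A, dtype=float)
        return np.array([np.linalg.matrix_power(A, k).sum()
                         for k in range(1, length + 1)])
    c1, c2 = walk_counts(A1), walk_counts(A2)
    return float(c1 @ c2 / (np.linalg.norm(c1) * np.linalg.norm(c2)))

# A triangle and a 3-node path: identical graphs score 1, different ones less
tri = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
path = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
print(walk_kernel(tri, tri), walk_kernel(tri, path))
```

In an alignment setting such kernel scores rank candidate subgraph pairings; the real kernels additionally account for node labels (protein identities).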

  13. A model system for targeted drug release triggered by biomolecular signals logically processed through enzyme logic networks.

    Science.gov (United States)

    Mailloux, Shay; Halámek, Jan; Katz, Evgeny

    2014-03-07

    A new Sense-and-Act system was realized by the integration of a biocomputing system, performing analytical processes, with a signal-responsive electrode. A drug-mimicking release process was triggered by biomolecular signals processed by different logic networks, including three concatenated AND logic gates or a 3-input OR logic gate. Biocatalytically produced NADH, controlled by various combinations of input signals, was used to activate the electrochemical system. A biocatalytic electrode associated with signal-processing "biocomputing" systems was electrically connected to another electrode coated with a polymer film, which was dissolved upon the formation of negative potential releasing entrapped drug-mimicking species, an enzyme-antibody conjugate, operating as a model for targeted immune-delivery and consequent "prodrug" activation. The system offers great versatility for future applications in controlled drug release and personalized medicine.
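The logic networks described above, three concatenated AND gates (which take four inputs overall) and a 3-input OR gate, can be written down directly as Boolean functions. A trivial sketch of the signal-processing layer only, not of the biochemistry:

```python
def and3_cascade(a, b, c, d):
    """Three concatenated 2-input AND gates: ((a AND b) AND c) AND d.
    Output 1 models NADH production that triggers release."""
    return ((a and b) and c) and d

def or3(a, b, c):
    """A single 3-input OR gate."""
    return a or b or c

# Release is triggered only for the input patterns the network accepts
print(and3_cascade(1, 1, 1, 1), and3_cascade(1, 1, 0, 1))  # 1 0
print(or3(0, 0, 1), or3(0, 0, 0))                          # 1 0
```

In the paper the gate outputs are enzymatic (NADH concentration) rather than digital, but the accepted input patterns are exactly these truth tables.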

  14. An effective hierarchical model for the biomolecular covalent bond: an approach integrating artificial chemistry and an actual terrestrial life system.

    Science.gov (United States)

    Oohashi, Tsutomu; Ueno, Osamu; Maekawa, Tadao; Kawai, Norie; Nishina, Emi; Honda, Manabu

    2009-01-01

    Under the AChem paradigm and the programmed self-decomposition (PSD) model, we propose a hierarchical model for the biomolecular covalent bond (HBCB model). This model assumes that terrestrial organisms arrange their biomolecules in a hierarchical structure according to the energy strength of their covalent bonds. It also assumes that they have evolutionarily selected the PSD mechanism of turning biological polymers (BPs) into biological monomers (BMs) as an efficient biomolecular recycling strategy. We have examined the validity and effectiveness of the HBCB model by coordinating two complementary approaches: biological experiments using existent terrestrial life, and simulation experiments using an AChem system. Biological experiments have shown that terrestrial life possesses a PSD mechanism as an endergonic, genetically regulated process and that hydrolysis, which decomposes a BP into BMs, is one of the main processes of such a mechanism. In simulation experiments, we compared different virtual self-decomposition processes. The virtual species in which the self-decomposition process mainly involved covalent bond cleavage from a BP to BMs showed evolutionary superiority over other species in which the self-decomposition process involved cleavage from BP to classes lower than BM. These converging findings strongly support the existence of PSD and the validity and effectiveness of the HBCB model.

  15. Coupling switches and oscillators as a means to shape cellular signals in biomolecular systems

    International Nuclear Information System (INIS)

    Zhou, Peipei; Cai, Shuiming; Liu, Zengrong; Chen, Luonan; Wang, Ruiqi

    2013-01-01

    To understand how a complex biomolecular network functions, a decomposition or a reconstruction process of the network is often needed so as to provide new insights into the regulatory mechanisms underlying various dynamical behaviors and also to gain qualitative knowledge of the network. Unfortunately, it seems that there are still no general rules on how to decompose a complex network into simple modules. An alternative resolution is to decompose a complex network into small modules or subsystems with specified functions such as switches and oscillators and then integrate them by analyzing the interactions between them. The main idea of this approach can be illustrated by considering a bidirectionally coupled network in this paper, i.e., coupled Toggle switch and Repressilator, and analyzing the occurrence of various dynamics, although the theoretical principle may hold for a general class of networks. We show that various biomolecular signals can be shaped by regulating the coupling between the subsystems. The approach presented here can be expected to simplify and analyze even more complex biological networks.
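One of the two subsystems named above, the Repressilator, can be sketched with a few lines of numerical integration. The version below is the symmetric protein-only form dp_i/dt = β/(1 + p_{i-1}^n) − p_i with illustrative parameters (β = 50, n = 3) chosen so the fixed point is unstable, not values taken from the paper:

```python
def repressilator(beta=50.0, n=3.0, dt=0.01, steps=20000, init=(1.0, 1.5, 2.0)):
    """Protein-only repressilator: each protein represses the next in a ring,
    dp_i/dt = beta/(1 + p_{i-1}^n) - p_i, integrated with forward Euler.
    Returns the trajectory of p_0."""
    p = list(init)
    traj = []
    for _ in range(steps):
        rates = [beta / (1.0 + p[(i - 1) % 3] ** n) - p[i] for i in range(3)]
        p = [p[i] + dt * rates[i] for i in range(3)]
        traj.append(p[0])
    return traj

traj = repressilator()
tail = traj[len(traj) // 2:]
# Sustained oscillation: the second half of the run still swings over a
# large range instead of settling to the symmetric fixed point.
print(max(tail) - min(tail) > 0.5)
```

Coupling such an oscillator to a bistable toggle switch, as the paper does, then amounts to adding cross-regulation terms between the two sets of equations.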

  16. Coupling switches and oscillators as a means to shape cellular signals in biomolecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Peipei [Institute of Systems Biology, Shanghai University, Shanghai 200444 (China); Faculty of Science, Jiangsu University, Zhenjiang, Jiangsu 212013 (China); Cai, Shuiming [Faculty of Science, Jiangsu University, Zhenjiang, Jiangsu 212013 (China); Liu, Zengrong [Institute of Systems Biology, Shanghai University, Shanghai 200444 (China); Chen, Luonan [Key Laboratory of Systems Biology, SIBS-Novo Nordisk Translational Research Center for PreDiabetes, Shanghai Institutes for Biological Sciences, Chinese Academy of Sciences, Shanghai 200031 (China); Collaborative Research Center for Innovative Mathematical Modeling, Institute of Industrial Science, University of Tokyo, Tokyo 153-8505 (Japan); Wang, Ruiqi [Institute of Systems Biology, Shanghai University, Shanghai 200444 (China)

    2013-05-15

    To understand how a complex biomolecular network functions, a decomposition or a reconstruction process of the network is often needed so as to provide new insights into the regulatory mechanisms underlying various dynamical behaviors and also to gain qualitative knowledge of the network. Unfortunately, it seems that there are still no general rules on how to decompose a complex network into simple modules. An alternative resolution is to decompose a complex network into small modules or subsystems with specified functions such as switches and oscillators and then integrate them by analyzing the interactions between them. The main idea of this approach can be illustrated by considering a bidirectionally coupled network in this paper, i.e., coupled Toggle switch and Repressilator, and analyzing the occurrence of various dynamics, although the theoretical principle may hold for a general class of networks. We show that various biomolecular signals can be shaped by regulating the coupling between the subsystems. The approach presented here can be expected to simplify and analyze even more complex biological networks.

  17. Biomolecular EPR spectroscopy

    CERN Document Server

    Hagen, Wilfred Raymond

    2008-01-01

    Comprehensive, up-to-date coverage of spectroscopy theory and its applications to biological systems. Although a multitude of books have been published about spectroscopy, most of them only occasionally refer to biological systems and the specific problems of biomolecular EPR (bioEPR). Biomolecular EPR Spectroscopy provides a practical introduction to bioEPR and demonstrates how this remarkable tool allows researchers to delve into the structural, functional, and analytical analysis of paramagnetic molecules found in the biochemistry of all species on the planet. A must-have reference in an intrinsically multidisciplinary field, this authoritative work seamlessly covers all important bioEPR applications, including low-spin and high-spin metalloproteins, spin traps and spin labels, interaction between active sites, and redox systems. It is loaded with practical tricks as well as do's and don'ts that are based on the author's 30 years of experience in the field. The book also comes with an unprecedented set of...

  18. Studies of the charge instabilities in the complex nano-objects: clusters and bio-molecular systems

    International Nuclear Information System (INIS)

    Manil, B.

    2007-11-01

    For the last 6 years, my main research work has focused on i) the Coulomb instabilities and fragmentation processes of fullerenes and clusters of fullerenes, and ii) the stability and reactivity of complex bio-molecular systems. Concerning the clusters of fullerenes, which are van der Waals type clusters, we have shown that the multiply charged species, obtained in collisions with slow highly charged ions, keep their structural properties but become very good electric conductors. On the other hand, with the aim of understanding the role of the biological environment at the molecular scale in the irradiation damage of complex biomolecules, we have studied the charge stabilities of clusters of small biomolecules and the dissociation processes of larger nano-hydrated biomolecules. These studies have shown, first, that specific molecular recognition mechanisms continue to exist in the gas phase and, second, that a small and very simple biochemical environment is enough to change the dynamics of the instabilities. (author)

  19. Converting biomolecular modelling data based on an XML representation.

    Science.gov (United States)

    Sun, Yudong; McKeever, Steve

    2008-08-25

    Biomolecular modelling has provided computational simulation based methods for investigating biological processes from the quantum chemical to the cellular level. Modelling such microscopic processes requires an atomic description of a biological system and is conducted in fine timesteps. Consequently the simulations are extremely computationally demanding. To tackle this limitation, different biomolecular models have to be integrated in order to achieve high-performance simulations. The integration of diverse biomolecular models needs to convert molecular data between the different data representations of the different models. This data conversion is often non-trivial, requires extensive human input and is inevitably error prone. In this paper we present an automated data conversion method for biomolecular simulations between molecular dynamics and quantum mechanics/molecular mechanics models. Our approach is developed around an XML data representation called BioSimML (Biomolecular Simulation Markup Language). BioSimML provides a domain-specific data representation for biomolecular modelling which can efficiently support data interoperability between different biomolecular simulation models and data formats.
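The general strategy, serializing molecular data into an XML intermediate so different simulation codes can exchange it, can be sketched with Python's standard library. The tag names below are illustrative only, not the actual BioSimML schema:

```python
import xml.etree.ElementTree as ET

def atoms_to_xml(atoms):
    """Serialize a list of atom records into a small XML document
    (a toy stand-in for a BioSimML-style interchange format)."""
    root = ET.Element("molecule")
    for a in atoms:
        atom = ET.SubElement(root, "atom", id=str(a["id"]), element=a["element"])
        ET.SubElement(atom, "position",
                      x=str(a["x"]), y=str(a["y"]), z=str(a["z"]))
    return ET.tostring(root, encoding="unicode")

xml = atoms_to_xml([{"id": 1, "element": "O", "x": 0.0, "y": 0.0, "z": 0.117}])
print(xml)
```

A converter for a target format (e.g. an MD topology file) would then parse this intermediate back with `ET.fromstring` and emit the target representation, which is what makes the conversion automatable.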

  20. Improvements to the APBS biomolecular solvation software suite.

    Science.gov (United States)

    Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A

    2018-01-01

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package, including a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics. © 2017 The Protein Society.

  1. AIM for Allostery: Using the Ising Model to Understand Information Processing and Transmission in Allosteric Biomolecular Systems.

    Science.gov (United States)

    LeVine, Michael V; Weinstein, Harel

    2015-05-01

    In performing their biological functions, molecular machines must process and transmit information with high fidelity. Information transmission requires dynamic coupling between the conformations of discrete structural components within the protein positioned far from one another on the molecular scale. This type of biomolecular "action at a distance" is termed allostery. Although allostery is ubiquitous in biological regulation and signal transduction, its treatment in theoretical models has mostly eschewed quantitative descriptions involving the system's underlying structural components and their interactions. Here, we show how Ising models can be used to formulate an approach to allostery in a structural context of interactions between the constitutive components by building simple allosteric constructs we termed Allosteric Ising Models (AIMs). We introduce the use of AIMs in analytical and numerical calculations that relate thermodynamic descriptions of allostery to the structural context, and then show that many fundamental properties of allostery, such as the multiplicative property of parallel allosteric channels, are revealed from the analysis of such models. The power of exploring mechanistic structural models of allosteric function in more complex systems by using AIMs is demonstrated by building a model of allosteric signaling for an experimentally well-characterized asymmetric homodimer of the dopamine D2 receptor.
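The smallest possible construct in this spirit, two coupled two-state components, can be enumerated exactly. The sketch below is a generic two-spin Ising model with illustrative parameters, not one of the specific AIMs from the paper; it shows the defining signature of allostery, that biasing one site shifts the state of the distant site only when the coupling is nonzero:

```python
import math

def two_site_aim(h1, h2, J, beta=1.0):
    """Two-spin toy allosteric Ising model: s_i = +-1 are the active/inactive
    conformations of two components, E = -h1*s1 - h2*s2 - J*s1*s2.
    Returns the equilibrium probability that site 2 is active (s2 = +1)."""
    Z = 0.0
    p_active = 0.0
    for s1 in (-1, 1):
        for s2 in (-1, 1):
            w = math.exp(-beta * (-h1 * s1 - h2 * s2 - J * s1 * s2))
            Z += w
            if s2 == 1:
                p_active += w
    return p_active / Z

# Biasing site 1 (e.g. ligand binding shifting h1) is felt at site 2
# only through the coupling J:
p_uncoupled = two_site_aim(h1=1.0, h2=0.0, J=0.0)   # stays at 1/2
p_coupled = two_site_aim(h1=1.0, h2=0.0, J=0.5)     # shifted above 1/2
print(p_uncoupled, p_coupled)
```

Chaining or paralleling such two-site units is how the multiplicative property of parallel allosteric channels mentioned above can be examined numerically.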

  2. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems

    Science.gov (United States)

    Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.

    2013-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887

  3. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems.

    Science.gov (United States)

    Chylek, Lily A; Harris, Leonard A; Tung, Chang-Shung; Faeder, James R; Lopez, Carlos F; Hlavacek, William S

    2014-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and posttranslational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). © 2013 Wiley Periodicals, Inc.
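The central point, that a single rule implicitly specifies many reactions which all inherit its rate law, can be made concrete: a rule "ligand binds receptor regardless of the receptor's phosphorylation state" applied to a receptor with n independent sites implies 2^n binding reactions. The BNGL-like strings below are illustrative, not output of an actual rule-based tool:

```python
from itertools import product

def implied_reactions(n_sites, rate):
    """Expand one binding rule over all 2**n_sites receptor modification
    states; every implied reaction inherits the rule's rate constant."""
    reactions = []
    for state in product("uP", repeat=n_sites):  # u = unmodified, P = phospho
        receptor = "R(" + ",".join(f"s{i}~{m}" for i, m in enumerate(state)) + ")"
        reactions.append((f"L + {receptor} -> L.{receptor}", rate))
    return reactions

rxns = implied_reactions(3, rate=0.1)
print(len(rxns))  # 8 reactions generated from a single rule
```

This is the coarse graining the abstract describes: the modeler writes one rule and one rate law; the exponentially large network is generated, or simulated on the fly, by the tool.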

  4. Development of an informatics infrastructure for data exchange of biomolecular simulations: Architecture, data models and ontology.

    Science.gov (United States)

    Thibault, J C; Roe, D R; Eilbeck, K; Cheatham, T E; Facelli, J C

    2015-01-01

    Biomolecular simulations aim to simulate structure, dynamics, interactions, and energetics of complex biomolecular systems. With the recent advances in hardware, it is now possible not only to use more complex and accurate models, but also to reach time scales that are biologically significant. Molecular simulations have become a standard tool for toxicology and pharmacology research, but organizing and sharing data - both within the same organization and among different ones - remains a substantial challenge. In this paper we review our recent work leading to the development of a comprehensive informatics infrastructure to facilitate the organization and exchange of biomolecular simulation data. Our efforts include the design of data models and dictionary tools that allow the standardization of the metadata used to describe biomolecular simulations, the development of a thesaurus and an ontology for computational reasoning when searching for biomolecular simulations in distributed environments, and the development of systems based on these models to manage and share the data at large scale (iBIOMES) and within smaller groups of researchers at the laboratory scale (iBIOMES Lite), which take advantage of the standardized metadata used to describe biomolecular simulations.

  5. Open-ended response theory with polarizable embedding: multiphoton absorption in biomolecular systems.

    Science.gov (United States)

    Steindal, Arnfinn Hykkerud; Beerepoot, Maarten T P; Ringholm, Magnus; List, Nanna Holmgaard; Ruud, Kenneth; Kongsted, Jacob; Olsen, Jógvan Magnus Haugaard

    2016-10-12

    We present the theory and implementation of an open-ended framework for electric response properties at the level of Hartree-Fock and Kohn-Sham density functional theory that includes effects from the molecular environment modeled by the polarizable embedding (PE) model. With this new state-of-the-art multiscale functionality, electric response properties to any order can be calculated for molecules embedded in polarizable atomistic molecular environments ranging from solvents to complex heterogeneous macromolecules such as proteins. In addition, environmental effects on multiphoton absorption (MPA) properties can be studied by evaluating single residues of the response functions. The PE approach includes mutual polarization effects between the quantum and classical parts of the system through induced dipoles that are determined self-consistently with respect to the electronic density. The applicability of our approach is demonstrated by calculating MPA strengths up to four-photon absorption for the green fluorescent protein. We show how the size of the quantum region, as well as the treatment of the border between the quantum and classical regions, is crucial in order to obtain reliable MPA predictions.

  6. Biomolecular simulation: historical picture and future perspectives.

    Science.gov (United States)

    van Gunsteren, Wilfred F; Dolenc, Jozica

    2008-02-01

    Over the last 30 years, computation based on molecular models has come to play an increasingly important role in biology, biological chemistry and biophysics. Since only a very limited number of properties of biomolecular systems are actually accessible to measurement by experimental means, computer simulation complements experiments by providing not only averages, but also distributions and time series of any definable, observable or non-observable, quantity. Biomolecular simulation may be used (i) to interpret experimental data, (ii) to provoke new experiments, (iii) to replace experiments and (iv) to protect intellectual property. Progress over the last 30 years is sketched and perspectives are outlined for the future.

  7. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  8. Biomolecular Sciences: uniting Biology and Chemistry

    NARCIS (Netherlands)

    Vrieling, Engel

    2017-01-01

    Biomolecular Sciences: uniting Biology and Chemistry www.rug.nl/research/gbb The scientific discoveries in biomolecular sciences have benefitted enormously from technological innovations. At the Groningen Biomolecular Science and Biotechnology Institute (GBB) we now sequence a genome in days,

  9. Biomolecular modelling and simulations

    CERN Document Server

    Karabencheva-Christova, Tatyana

    2014-01-01

    Published continuously since 1944, the Advances in Protein Chemistry and Structural Biology series is the essential resource for protein chemists. Each volume brings forth new information about protocols and analysis of proteins. Each thematically organized volume is guest edited by leading experts in a broad range of protein-related topics. The series describes advances in biomolecular modelling and simulations; chapters are written by authorities in their field; and the volumes are targeted to a wide audience of researchers, specialists, and students. The information provided in the volume is well supported by a number of high quality illustrations, figures, and tables.

  10. Large Superconducting Magnet Systems

    CERN Document Server

    Védrine, P.

    2014-07-17

    The increase of energy in accelerators over the past decades has led to the design of superconducting magnets for both accelerators and the associated detectors. The use of Nb−Ti superconducting materials allows an increase in the dipole field by up to 10 T compared with the maximum field of 2 T in a conventional magnet. The field bending of the particles in the detectors and generated by the magnets can also be increased. New materials, such as Nb3Sn and high temperature superconductor (HTS) conductors, can open the way to higher fields, in the range 13–20 T. The latest generations of fusion machines producing hot plasma also use large superconducting magnet systems.

  11. Large Superconducting Magnet Systems

    Energy Technology Data Exchange (ETDEWEB)

    Védrine, P [Saclay (France)

    2014-07-01

    The increase of energy in accelerators over the past decades has led to the design of superconducting magnets for both accelerators and the associated detectors. The use of Nb−Ti superconducting materials allows an increase in the dipole field by up to 10 T compared with the maximum field of 2 T in a conventional magnet. The field bending of the particles in the detectors and generated by the magnets can also be increased. New materials, such as Nb3Sn and high temperature superconductor (HTS) conductors, can open the way to higher fields, in the range 13–20 T. The latest generations of fusion machines producing hot plasma also use large superconducting magnet systems.

  12. Modeling, Analysis, Simulation, and Synthesis of Biomolecular Networks

    National Research Council Canada - National Science Library

    Ruben, Harvey; Kumar, Vijay; Sokolsky, Oleg

    2006-01-01

    ...) a first example of reachability analysis applied to a biomolecular system (lactose induction), 4) a model of tetracycline resistance that discriminates between two possible mechanisms for tetracycline diffusion through the cell membrane, and 5...

  13. Converting Biomolecular Modelling Data Based on an XML Representation

    Directory of Open Access Journals (Sweden)

    Sun Yudong

    2008-06-01

    Biomolecular modelling has provided computational simulation-based methods for investigating biological processes from quantum chemical to cellular levels. Modelling such microscopic processes requires an atomic description of a biological system and is conducted in fine timesteps. Consequently the simulations are extremely computationally demanding. To tackle this limitation, different biomolecular models have to be integrated in order to achieve high-performance simulations. The integration of diverse biomolecular models needs to convert molecular data between the different data representations of different models. This data conversion is often non-trivial, requires extensive human input and is inevitably error prone. In this paper we present an automated data conversion method for biomolecular simulations between molecular dynamics and quantum mechanics/molecular mechanics models. Our approach is developed around an XML data representation called BioSimML (Biomolecular Simulation Markup Language). BioSimML provides a domain-specific data representation for biomolecular modelling which can efficiently support data interoperability between different biomolecular simulation models and data formats.

  14. Improvements to the APBS biomolecular solvation software suite: Improvements to the APBS Software Suite

    Energy Technology Data Exchange (ETDEWEB)

    Jurrus, Elizabeth [Pacific Northwest National Laboratory, Richland Washington; Engel, Dave [Pacific Northwest National Laboratory, Richland Washington; Star, Keith [Pacific Northwest National Laboratory, Richland Washington; Monson, Kyle [Pacific Northwest National Laboratory, Richland Washington; Brandi, Juan [Pacific Northwest National Laboratory, Richland Washington; Felberg, Lisa E. [University of California, Berkeley California; Brookes, David H. [University of California, Berkeley California; Wilson, Leighton [University of Michigan, Ann Arbor Michigan; Chen, Jiahui [Southern Methodist University, Dallas Texas; Liles, Karina [Pacific Northwest National Laboratory, Richland Washington; Chun, Minju [Pacific Northwest National Laboratory, Richland Washington; Li, Peter [Pacific Northwest National Laboratory, Richland Washington; Gohara, David W. [St. Louis University, St. Louis Missouri; Dolinsky, Todd [FoodLogiQ, Durham North Carolina; Konecny, Robert [University of California San Diego, San Diego California; Koes, David R. [University of Pittsburgh, Pittsburgh Pennsylvania; Nielsen, Jens Erik [Protein Engineering, Novozymes A/S, Copenhagen Denmark; Head-Gordon, Teresa [University of California, Berkeley California; Geng, Weihua [Southern Methodist University, Dallas Texas; Krasny, Robert [University of Michigan, Ann Arbor Michigan; Wei, Guo-Wei [Michigan State University, East Lansing Michigan; Holst, Michael J. [University of California San Diego, San Diego California; McCammon, J. Andrew [University of California San Diego, San Diego California; Baker, Nathan A. [Pacific Northwest National Laboratory, Richland Washington; Brown University, Providence Rhode Island

    2017-10-24

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had broad impact in the study of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advances in computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package, including: an analytical and a semi-analytical Poisson-Boltzmann solver, an optimized boundary element solver, a geometric flow solvation model, a graph-theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.
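
    As a toy illustration of the continuum electrostatics APBS solves (this is not APBS code, and real biomolecular problems are 3D with complex dielectric boundaries), the linearized Poisson-Boltzmann equation in one dimension reduces to d2(phi)/dx2 = kappa^2 * phi, which a few lines of finite differences can solve:

```python
# Hedged sketch: 1D linearized Poisson-Boltzmann (Debye-Hueckel) by finite
# differences, with phi(0) = 1 at the "surface" and phi ~ 0 in bulk.
import numpy as np

kappa = 1.0          # inverse Debye length (arbitrary units)
L, n = 10.0, 201     # domain long enough that phi(L) is effectively 0
x = np.linspace(0.0, L, n)
h = x[1] - x[0]

# Interior rows: (phi[i-1] - 2 phi[i] + phi[i+1]) / h^2 = kappa^2 phi[i]
A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0] = 1.0;  b[0] = 1.0    # boundary: phi(0) = 1
A[-1, -1] = 1.0; b[-1] = 0.0  # boundary: phi(L) = 0
for i in range(1, n - 1):
    A[i, i - 1] = A[i, i + 1] = 1.0 / h**2
    A[i, i] = -2.0 / h**2 - kappa**2

phi = np.linalg.solve(A, b)

# Debye screening: the semi-infinite analytic solution is exp(-kappa * x)
print(abs(phi[20] - np.exp(-kappa * x[20])) < 1e-3)
```

    The exponential screening of potential with distance is the qualitative behavior that makes continuum electrostatics tractable for large assemblages; APBS generalizes this to nonlinear, 3D, heterogeneous-dielectric settings.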

  15. Membrane-based biomolecular smart materials

    International Nuclear Information System (INIS)

    Sarles, Stephen A; Leo, Donald J

    2011-01-01

    Membrane-based biomolecular materials are a new class of smart material that feature networks of artificial lipid bilayers contained within durable synthetic substrates. Bilayers contained within this modular material platform provide an environment that can be tailored to host an enormous diversity of functional biomolecules, where the functionality of the global material system depends on the type(s) and organization(s) of the biomolecules that are chosen. In this paper, we review a series of biomolecular material platforms developed recently within the Leo Group at Virginia Tech and we discuss several novel coupling mechanisms provided by these hybrid material systems. The platforms developed demonstrate that the functions of biomolecules and the properties of synthetic materials can be combined to operate in concert, and the examples provided demonstrate how the formation and properties of a lipid bilayer can respond to a variety of stimuli including mechanical forces and electric fields

  16. NMRbox: A Resource for Biomolecular NMR Computation.

    Science.gov (United States)

    Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C

    2017-04-25

    Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data are quite large, with labs relying on dozens, if not hundreds, of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and to facilitate and enhance data depositions to BioMagResBank, as well as tools for Bayesian inference that enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users.

  17. Biomolecular electrostatics and solvation: a computational perspective.

    Science.gov (United States)

    Ren, Pengyu; Chun, Jaehun; Thomas, Dennis G; Schnieders, Michael J; Marucho, Marcelo; Zhang, Jiajing; Baker, Nathan A

    2012-11-01

    An understanding of molecular interactions is essential for insight into biological systems at the molecular scale. Among the various components of molecular interactions, electrostatics are of special importance because of their long-range nature and their influence on polar or charged molecules, including water, aqueous ions, proteins, nucleic acids, carbohydrates, and membrane lipids. In particular, robust models of electrostatic interactions are essential for understanding the solvation properties of biomolecules and the effects of solvation upon biomolecular folding, binding, enzyme catalysis, and dynamics. Electrostatics, therefore, are of central importance to understanding biomolecular structure and modeling interactions within and among biological molecules. This review discusses the solvation of biomolecules with a computational biophysics view toward describing the phenomenon. While our main focus lies on the computational aspect of the models, we provide an overview of the basic elements of biomolecular solvation (e.g. solvent structure, polarization, ion binding, and non-polar behavior) in order to provide a background to understand the different types of solvation models.

  18. THz time domain spectroscopy of biomolecular conformational modes

    International Nuclear Information System (INIS)

    Markelz, Andrea; Whitmire, Scott; Hillebrecht, Jay; Birge, Robert

    2002-01-01

    We discuss the use of terahertz time domain spectroscopy for studies of conformational flexibility and conformational change in biomolecules. Protein structural dynamics are vital to biological function, with protein flexibility affecting enzymatic reaction rates and sensory transduction cycling times. Conformational mode dynamics occur on the picosecond timescale, with the collective vibrational modes associated with these large-scale structural motions lying in the 1-100 cm^-1 range. We have performed THz time domain spectroscopy (TTDS) of several biomolecular systems to explore the sensitivity of TTDS for distinguishing different molecular species, different mutations within a single species and different conformations of a given biomolecule. We compare the measured absorbances to normal mode calculations and find that the TTDS absorbance reflects the density of normal modes determined by molecular mechanics calculations, and is sensitive to both conformation and mutation. These early studies demonstrate some of the advantages and limitations of using TTDS for the study of biomolecules.
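
    The normal-mode calculations the record compares against amount to diagonalizing a mass-weighted Hessian; the eigenvalue spectrum gives the vibrational frequencies whose density the TTDS absorbance is reported to track. A toy sketch (a 1D bead-spring chain, not a real protein force field):

```python
# Normal modes of a free harmonic chain: omega_i = sqrt(eigenvalues of H)
# for unit masses. A free chain has one zero mode (rigid translation).
import numpy as np

n, k = 8, 1.0                      # 8 beads, uniform springs, unit masses
H = np.zeros((n, n))               # Hessian of the chain potential
for i in range(n - 1):
    H[i, i] += k; H[i + 1, i + 1] += k
    H[i, i + 1] -= k; H[i + 1, i] -= k

eigvals = np.linalg.eigvalsh(H)                 # ascending eigenvalues
freqs = np.sqrt(np.clip(eigvals, 0.0, None))    # clip tiny negative round-off

print(freqs.shape[0])              # 8 modes for 8 one-dimensional beads
print(abs(eigvals[0]) < 1e-10)     # lowest eigenvalue ~ 0: rigid translation
```

    For a real biomolecule the Hessian comes from a molecular mechanics force field and the low-frequency end of this spectrum falls in the 1-100 cm^-1 window probed by TTDS.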

  19. Micro and Nanotechnologies Enhanced Biomolecular Sensing

    Directory of Open Access Journals (Sweden)

    Tza-Huei Wang

    2013-07-01

    This editorial summarizes some of the recent advances in micro- and nanotechnology-based tools and devices for biomolecular detection. These include the incorporation of nanomaterials into a sensor surface or directly interfacing them with molecular probes to enhance target detection via more rapid and sensitive responses, and the use of self-assembled organic/inorganic nanocomposites that exhibit exceptional spectroscopic properties to enable facile homogeneous assays with efficient binding kinetics. Discussions also include some insight into the microfluidic principles behind the development of an integrated sample preparation and biosensor platform toward a miniaturized and fully functional system for point-of-care applications.

  20. Optimization theory for large systems

    CERN Document Server

    Lasdon, Leon S

    2002-01-01

    Important text examines most significant algorithms for optimizing large systems and clarifying relations between optimization procedures. Much data appear as charts and graphs and will be highly valuable to readers in selecting a method and estimating computer time and cost in problem-solving. Initial chapter on linear and nonlinear programming presents all necessary background for subjects covered in rest of book. Second chapter illustrates how large-scale mathematical programs arise from real-world problems. Appendixes. List of Symbols.

  1. Smartphones for cell and biomolecular detection.

    Science.gov (United States)

    Liu, Xiyuan; Lin, Tung-Yi; Lillehoj, Peter B

    2014-11-01

    Recent advances in biomedical science and technology have played a significant role in the development of new sensors and assays for cell and biomolecular detection. Generally, these efforts are aimed at reducing the complexity and costs associated with diagnostic testing so that it can be performed outside of a laboratory or hospital setting, requiring minimal equipment and user involvement. In particular, point-of-care (POC) testing offers immense potential for many important applications including medical diagnosis, environmental monitoring, food safety, and biosecurity. When coupled with smartphones, POC systems can offer portability, ease of use and enhanced functionality while maintaining performance. This review article focuses on recent advancements and developments in smartphone-based POC systems within the last 6 years with an emphasis on cell and biomolecular detection. These devices typically comprise multiple components, such as detectors, sample processors, disposable chips, batteries, and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. Researchers have demonstrated several promising approaches employing various detection schemes and device configurations, and it is expected that further developments in biosensors, battery technology and miniaturized electronics will enable smartphone-based POC technologies to become more mainstream tools in the scientific and biomedical communities.

  2. ORAC: a molecular dynamics simulation program to explore free energy surfaces in biomolecular systems at the atomistic level.

    Science.gov (United States)

    Marsili, Simone; Signorini, Giorgio Federico; Chelli, Riccardo; Marchi, Massimo; Procacci, Piero

    2010-04-15

    We present the new release of the ORAC engine (Procacci et al., Comput Chem 1997, 18, 1834), a FORTRAN suite to simulate complex biosystems at the atomistic level. The previous release of the ORAC code included multiple time-step integration, the smooth particle mesh Ewald method, and constant-pressure and constant-temperature simulations. The present release has been supplemented with the most advanced techniques for enhanced sampling in atomistic systems, including replica exchange with solute tempering, metadynamics and steered molecular dynamics. All these computational technologies have been implemented for parallel architectures using the standard MPI communication protocol. ORAC is an open-source program distributed free of charge under the GNU general public license (GPL) at http://www.chim.unifi.it/orac.
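
    The replica-exchange machinery mentioned here rests on a simple Metropolis swap test between neighboring replicas. A generic parallel-tempering sketch (the textbook criterion, not ORAC's actual implementation):

```python
# Generic replica-exchange acceptance test: replicas at inverse temperatures
# beta_i and beta_j with energies E_i, E_j swap configurations with
# probability min(1, exp((beta_i - beta_j) * (E_i - E_j))).
import math, random

def swap_accepted(beta_i, E_i, beta_j, E_j, rng=random.random):
    """Metropolis criterion for exchanging configurations between replicas."""
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0.0 or rng() < math.exp(delta)

# A swap that hands the higher-energy configuration to the hotter replica
# (smaller beta) is always accepted:
print(swap_accepted(beta_i=1.0, E_i=5.0, beta_j=0.5, E_j=2.0))  # True
```

    Solute tempering refines this by scaling only the solute part of the Hamiltonian, which shrinks the effective energy differences and lets far fewer replicas span the same temperature range.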

  3. Radiation damage in biomolecular systems

    CERN Document Server

    Fuss, Martina Christina

    2012-01-01

    Since the discovery of X-rays and radioactivity, ionizing radiations have been widely applied in medicine both for diagnostic and therapeutic purposes. The risks associated with radiation exposure and handling led to the parallel development of the field of radiation protection. Pioneering experiments done by Sanche and co-workers in 2000 showed that low-energy secondary electrons, which are abundantly generated along radiation tracks, are primarily responsible for radiation damage through successive interactions with the molecular constituents of the medium. Apart from ionizing processes, which are usually related to radiation damage, below the ionization level low-energy electrons can induce molecular fragmentation via dissociative processes such as internal excitation and electron attachment. This prompted collaborative projects between different research groups from European countries together with other specialists from Canada, the USA and Australia. This book summarizes the advances achieved by these...

  4. Biomolecular simulations on petascale: promises and challenges

    International Nuclear Information System (INIS)

    Agarwal, Pratul K; Alam, Sadaf R

    2006-01-01

    Proteins work as highly efficient machines at the molecular level and are responsible for a variety of processes in all living cells. There is wide interest in understanding these machines for implications in biochemical/biotechnology industries as well as in health related fields. Over the last century, investigations of proteins based on a variety of experimental techniques have provided a wealth of information. More recently, theoretical and computational modeling using large scale simulations is providing novel insights into the functioning of these machines. The next generation of supercomputers with petascale computing power holds great promise as well as challenges for biomolecular simulation scientists. We briefly discuss the progress being made in this area.

  5. Systems engineering for very large systems

    Science.gov (United States)

    Lewkowicz, Paul E.

Very large integrated systems have always posed special problems for engineers. Whether they are power generation systems, computer networks or space vehicles, whenever there are multiple interfaces, complex technologies or just demanding customers, the challenges are unique. 'Systems engineering' has evolved as a discipline in order to meet these challenges by providing a structured, top-down design and development methodology for the engineer. This paper attempts to define the general class of problems requiring the complete systems engineering treatment and to show how systems engineering can be utilized to improve customer satisfaction and profitability. Specifically, this work will focus on a design methodology for the largest of systems, not necessarily in terms of physical size, but in terms of complexity and interconnectivity.

  6. A compact hard X-ray source for medical imaging and biomolecular studies

    International Nuclear Information System (INIS)

    Cline, D.B.; Green, M.A.; Kolonko, J.

    1995-01-01

    There are a large number of synchrotron light sources in the world. However, these sources are designed for physics, chemistry, and engineering studies. To our knowledge, none have been optimized for either medical imaging or biomolecular studies. There are special needs for these applications. We present here a preliminary design of a very compact source, small enough for a hospital or a biomolecular laboratory, that is suitable for these applications. (orig.)

  7. iCAVE: an open source tool for visualizing biomolecular networks in 3D, stereoscopic 3D and immersive 3D.

    Science.gov (United States)

    Liluashvili, Vaja; Kalayci, Selim; Fluder, Eugene; Wilson, Manda; Gabow, Aaron; Gümüs, Zeynep H

    2017-08-01

    Visualizations of biomolecular networks assist in systems-level data exploration in many cellular processes. Data generated from high-throughput experiments increasingly inform these networks, yet current tools do not adequately scale with the concomitant increase in their size and complexity. We present an open source software platform, interactome-CAVE (iCAVE), for visualizing large and complex biomolecular interaction networks in 3D. Users can explore networks (i) in 3D using a desktop, (ii) in stereoscopic 3D using 3D-vision glasses and a desktop, or (iii) in immersive 3D within a CAVE environment. iCAVE introduces 3D extensions of known 2D network layout, clustering, and edge-bundling algorithms, as well as new 3D network layout algorithms. Furthermore, users can simultaneously query several built-in databases within iCAVE for network generation or visualize their own networks (e.g., disease, drug, protein, metabolite). iCAVE has a modular structure that allows rapid development by addition of algorithms, datasets, or features without affecting other parts of the code. Overall, iCAVE is the first freely available open source tool that enables 3D (optionally stereoscopic or immersive) visualizations of complex, dense, or multi-layered biomolecular networks. While primarily designed for researchers utilizing biomolecular networks, iCAVE can assist researchers in any field.

  8. Integrative NMR for biomolecular research

    International Nuclear Information System (INIS)

    Lee, Woonghee; Cornilescu, Gabriel; Dashti, Hesam; Eghbalnia, Hamid R.; Tonelli, Marco; Westler, William M.; Butcher, Samuel E.; Henzler-Wildman, Katherine A.; Markley, John L.

    2016-01-01

    NMR spectroscopy is a powerful technique for determining structural and functional features of biomolecules in physiological solution as well as for observing their intermolecular interactions in real-time. However, complex steps associated with its practice have made the approach daunting for non-specialists. We introduce an NMR platform that makes biomolecular NMR spectroscopy much more accessible by integrating tools, databases, web services, and video tutorials that can be launched by simple installation of NMRFAM software packages or using a cross-platform virtual machine that can be run on any standard laptop or desktop computer. The software package can be downloaded freely from the NMRFAM software download page ( http://pine.nmrfam.wisc.edu/download_packages.html ), and detailed instructions are available from the Integrative NMR Video Tutorial page ( http://pine.nmrfam.wisc.edu/integrative.html ).

  9. Integrative NMR for biomolecular research

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Woonghee, E-mail: whlee@nmrfam.wisc.edu; Cornilescu, Gabriel; Dashti, Hesam; Eghbalnia, Hamid R.; Tonelli, Marco; Westler, William M.; Butcher, Samuel E.; Henzler-Wildman, Katherine A.; Markley, John L., E-mail: markley@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison and Biochemistry Department (United States)

    2016-04-15

    NMR spectroscopy is a powerful technique for determining structural and functional features of biomolecules in physiological solution as well as for observing their intermolecular interactions in real-time. However, complex steps associated with its practice have made the approach daunting for non-specialists. We introduce an NMR platform that makes biomolecular NMR spectroscopy much more accessible by integrating tools, databases, web services, and video tutorials that can be launched by simple installation of NMRFAM software packages or using a cross-platform virtual machine that can be run on any standard laptop or desktop computer. The software package can be downloaded freely from the NMRFAM software download page ( http://pine.nmrfam.wisc.edu/download_packages.html ), and detailed instructions are available from the Integrative NMR Video Tutorial page ( http://pine.nmrfam.wisc.edu/integrative.html ).

  10. hPDB – Haskell library for processing atomic biomolecular structures in protein data bank format

    OpenAIRE

    Gajda, Michał Jan

    2013-01-01

    Background: The Protein Data Bank file format is used for the majority of biomolecular data available today. Haskell is a lazy functional language that enjoys a high-level class-based type system, a growing collection of useful libraries and a reputation for efficiency. Findings: I present a fast library for processing biomolecular data in the Protein Data Bank format. I present benchmarks indicating that this library is faster than other frequently used Protein Data Bank parsing programs. The propo...

  11. A fast mollified impulse method for biomolecular atomistic simulations

    Energy Technology Data Exchange (ETDEWEB)

    Fath, L., E-mail: lukas.fath@kit.edu [Institute for App. and Num. Mathematics, Karlsruhe Institute of Technology (Germany); Hochbruck, M., E-mail: marlis.hochbruck@kit.edu [Institute for App. and Num. Mathematics, Karlsruhe Institute of Technology (Germany); Singh, C.V., E-mail: chandraveer.singh@utoronto.ca [Department of Materials Science & Engineering, University of Toronto (Canada)

    2017-03-15

Classical integration methods for molecular dynamics are inherently limited by resonance phenomena that occur at certain time-step sizes. The mollified impulse method can partially avoid this problem by using appropriate filters based on averaging or projection techniques. However, existing filters are computationally expensive and tedious to implement, since they require either analytical Hessians or the solution of nonlinear systems arising from constraints. In this work we follow a different approach, based on corotation, for the construction of a new filter for (flexible) biomolecular simulations. The main advantages of the proposed filter are its excellent stability properties and its ease of implementation in standard software without Hessians or constraint solvers. By simulating multiple realistic examples such as peptide, protein, ice equilibrium and ice–ice friction, the new filter is shown to speed up the computation of long-range interactions by approximately 20%. The proposed filtered integrators allow step sizes as large as 10 fs while keeping the energy drift below 1% over a 50 ps simulation.
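For readers unfamiliar with the impulse-method family, the following is a minimal sketch of a plain (unmollified) impulse/r-RESPA multiple-time-step integrator on a toy one-dimensional system. The corotational filter proposed in the paper is not reproduced here, and the force functions, mass and step sizes are illustrative assumptions.

```python
import numpy as np

def impulse_step(x, v, f_fast, f_slow, dt_outer, n_inner, mass=1.0):
    """One outer step: slow forces enter as impulses (half kicks) around
    n_inner velocity-Verlet sub-steps of the fast forces. A mollified
    variant would instead evaluate f_slow at filtered (averaged) positions."""
    dt = dt_outer / n_inner
    v += 0.5 * dt_outer * f_slow(x) / mass        # slow half kick
    for _ in range(n_inner):                      # fast inner loop
        v += 0.5 * dt * f_fast(x) / mass
        x += dt * v
        v += 0.5 * dt * f_fast(x) / mass
    v += 0.5 * dt_outer * f_slow(x) / mass        # slow half kick
    return x, v

# Toy system: stiff harmonic bond (fast) plus a weak anharmonic tail (slow).
f_fast = lambda x: -100.0 * x
f_slow = lambda x: -0.1 * x ** 3
x, v = 1.0, 0.0
for _ in range(1000):
    x, v = impulse_step(x, v, f_fast, f_slow, dt_outer=0.05, n_inner=10)
print(x, v)
```

The resonance problem the paper addresses appears when the outer step approaches half the fast oscillation period (about 0.31 in these toy units); filtering is what allows production codes to push the outer step toward 10 fs safely.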

  12. Stochastic Simulation of Biomolecular Reaction Networks Using the Biomolecular Network Simulator Software

    National Research Council Canada - National Science Library

    Frazier, John; Chusak, Yaroslav; Foy, Brent

    2008-01-01

    .... The software uses either exact or approximate stochastic simulation algorithms for generating Monte Carlo trajectories that describe the time evolution of the behavior of biomolecular reaction networks...
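Since the record above is truncated, here is a generic illustration of the kind of exact stochastic simulation algorithm it refers to (Gillespie's direct method), applied to a toy degradation reaction X → ∅. The reaction network and rate constant are assumptions for illustration, not taken from the Biomolecular Network Simulator itself.

```python
import random

def gillespie_decay(x0, k, t_end, seed=0):
    """Gillespie direct method for the single channel X -> 0 (propensity k*X):
    draw an exponential waiting time from the total propensity, then fire the
    reaction; with one channel, channel selection is trivial."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    traj = [(t, x)]
    while x > 0 and t < t_end:
        a = k * x                    # total propensity
        t += rng.expovariate(a)      # time to next reaction event
        if t > t_end:
            break
        x -= 1
        traj.append((t, x))
    return traj

traj = gillespie_decay(x0=100, k=1.0, t_end=5.0)
print(len(traj), traj[-1])
```

Each run is one Monte Carlo trajectory; averaging many seeds recovers the mean time evolution of the network.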

  13. Conducting polymer based biomolecular electronic devices

    Indian Academy of Sciences (India)

Conducting polymers; LB films; biosensor microactuators; monolayers. ... have been projected for application in a wide range of biomolecular electronic devices, such as optical, electronic, drug-delivery, memory and biosensing devices.

  14. Application of biomolecular recognition via magnetic nanoparticle in nanobiotechnology

    Science.gov (United States)

    Shen, Wei-Zheng; Cetinel, Sibel; Montemagno, Carlo

    2018-05-01

    The marriage of biomolecular recognition and magnetic nanoparticle creates tremendous opportunities in the development of advanced technology both in academic research and in industrial sectors. In this paper, we review current progress on the magnetic nanoparticle-biomolecule hybrid systems, particularly employing the recognition pairs of DNA-DNA, DNA-protein, protein-protein, and protein-inorganics in several nanobiotechnology application areas, including molecular biology, diagnostics, medical treatment, industrial biocatalysts, and environmental separations.

  15. Tailoring the Variational Implicit Solvent Method for New Challenges: Biomolecular Recognition and Assembly

    Directory of Open Access Journals (Sweden)

    Clarisse Gravina Ricci

    2018-02-01

Predicting solvation free energies and describing the complex water behavior that plays an important role in essentially all biological processes is a major challenge from the computational standpoint. While an atomistic, explicit description of the solvent can turn out to be too expensive in large biomolecular systems, most implicit solvent methods fail to capture “dewetting” effects and heterogeneous hydration by relying on a pre-established (i.e., guessed) solvation interface. Here we focus on the Variational Implicit Solvent Method (VISM), an implicit solvent method that adds water “plasticity” back to the picture by formulating the solvation free energy as a functional of all possible solvation interfaces. We survey VISM's applications to the problem of molecular recognition and report some of the most recent efforts to tailor VISM for more challenging scenarios, with the ultimate goal of including thermal fluctuations into the framework. The advances reported herein pave the way to make VISM a uniquely successful approach to characterize complex solvation properties in the recognition and binding of large-scale biomolecular complexes.

  16. Tailoring the Variational Implicit Solvent Method for New Challenges: Biomolecular Recognition and Assembly

    Science.gov (United States)

    Ricci, Clarisse Gravina; Li, Bo; Cheng, Li-Tien; Dzubiella, Joachim; McCammon, J. Andrew

    2018-01-01

Predicting solvation free energies and describing the complex water behavior that plays an important role in essentially all biological processes is a major challenge from the computational standpoint. While an atomistic, explicit description of the solvent can turn out to be too expensive in large biomolecular systems, most implicit solvent methods fail to capture “dewetting” effects and heterogeneous hydration by relying on a pre-established (i.e., guessed) solvation interface. Here we focus on the Variational Implicit Solvent Method (VISM), an implicit solvent method that adds water “plasticity” back to the picture by formulating the solvation free energy as a functional of all possible solvation interfaces. We survey VISM's applications to the problem of molecular recognition and report some of the most recent efforts to tailor VISM for more challenging scenarios, with the ultimate goal of including thermal fluctuations into the framework. The advances reported herein pave the way to make VISM a uniquely successful approach to characterize complex solvation properties in the recognition and binding of large-scale biomolecular complexes. PMID:29484300

  17. Biomolecular surface construction by PDE transform.

    Science.gov (United States)

    Zheng, Qiong; Yang, Siyang; Wei, Guo-Wei

    2012-03-01

This work proposes a new framework for surface generation based on the partial differential equation (PDE) transform. The PDE transform has recently been introduced as a general approach for the mode decomposition of images, signals, and data. It relies on the use of arbitrarily high-order PDEs to achieve time-frequency localization, control the spectral distribution, and regulate the spatial resolution. The present work provides a new variational derivation of high-order PDE transforms. The fast Fourier transform is utilized to accomplish the PDE transform so as to avoid stringent stability constraints in solving high-order PDEs. As a consequence, the time integration of high-order PDEs can be done efficiently with the fast Fourier transform. The present approach is validated with a variety of test examples in two-dimensional and three-dimensional settings. We explore the impact of the PDE transform parameters, such as the PDE order and propagation time, on the quality of the resulting surfaces. Additionally, we utilize a set of 10 proteins to compare the computational efficiency of the present surface generation method and a standard approach in Cartesian meshes. Moreover, we analyze the present method by examining some benchmark indicators of biomolecular surfaces, that is, surface area, surface-enclosed volume, solvation free energy, and surface electrostatic potential. A test set of 13 protein molecules is used in the present investigation. The electrostatic analysis is carried out via the Poisson-Boltzmann equation model. To further demonstrate the utility of the present PDE transform-based surface method, we solve the Poisson-Nernst-Planck equations with a PDE transform surface of a protein. Second-order convergence is observed for the electrostatic potential and concentrations.
Finally, to test the capability and efficiency of the present PDE transform-based surface generation method, we apply it to the construction of an excessively large biomolecule, a
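The FFT-based time integration described above can be illustrated with a one-dimensional sketch: for a high-order diffusion PDE u_t = −(−Δ)^m u, the propagator is an exact multiplier in Fourier space, so no stiff explicit time stepping is needed. The signal, order and propagation time below are illustrative assumptions, not the paper's surface data.

```python
import numpy as np

def pde_transform_1d(u, t, order=2):
    """Evolve u under u_t = -(-Laplacian)^order u for time t by applying
    the exact spectral propagator exp(-t * |k|^(2*order)) in Fourier space."""
    k = 2.0 * np.pi * np.fft.fftfreq(u.size)
    u_hat = np.fft.fft(u) * np.exp(-t * np.abs(k) ** (2 * order))
    return np.fft.ifft(u_hat).real

# Noisy square wave: high-frequency noise decays fastest, low modes survive,
# which is the low-pass behavior the surface generation relies on.
x = np.linspace(0.0, 1.0, 128, endpoint=False)
u = np.sign(np.sin(2.0 * np.pi * x)) + 0.2 * np.random.default_rng(0).normal(size=128)
u_smooth = pde_transform_1d(u, t=5.0, order=2)
print(u.std(), u_smooth.std())
```

Raising `order` sharpens the cutoff between preserved and suppressed modes, while `t` plays the role of the propagation-time parameter explored in the paper.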

  18. Measurement system for large motions

    International Nuclear Information System (INIS)

    Noyes, R.; Davies, L.; Kalinowski, J.; Stubbs, T.

    1979-05-01

    The system used to measure the response of geologic media to stress waves generated during and after underground tests performed by the Lawrence Livermore Laboratory (LLL) at the Department of Energy's Nevada Test Site (NTS) is described. Included are descriptions of the system transducers and accelerometers, the procedures used in calibrating and packaging the system at the North Las Vegas Facility of EG and G, Inc., the positioning of equipment during fielding activities at NTS, and the procedures used at LLL's facilities in California to reduce and analyze the data recorded on magnetic tape at NTS during an underground nuclear explosion. In summarizing, the authors give the system high marks, attributing its success to good basic design, careful installation, and rigorous calibration and data analysis techniques applied with good judgement on the part of the instrumentation engineers and data analysts. 10 figures

  19. Operation of large cryogenic systems

    International Nuclear Information System (INIS)

    Rode, C.H.; Ferry, B.; Fowler, W.B.; Makara, J.; Peterson, T.; Theilacker, J.; Walker, R.

    1985-06-01

This report is based on the past 12 years of R&D and operating experience with the 27 kW Fermilab Tevatron Cryogenic System. In general, the comments are applicable to all helium plants larger than 1000 W (400 l/hr) and to non-mass-produced nitrogen plants larger than 50 tons per day. 14 refs., 3 figs., 1 tab

  20. Multiscale Persistent Functions for Biomolecular Structure Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Kelin [Nanyang Technological University (Singapore). Division of Mathematical Sciences, School of Physical, Mathematical Sciences and School of Biological Sciences; Li, Zhiming [Central China Normal University, Wuhan (China). Key Laboratory of Quark and Lepton Physics (MOE) and Institute of Particle Physics; Mu, Lin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Computer Science and Mathematics Division

    2017-11-02

In this paper, we introduce multiscale persistent functions for biomolecular structure characterization. The essential idea is to combine our multiscale rigidity functions (MRFs) with persistent homology analysis, so as to construct a series of multiscale persistent functions, particularly multiscale persistent entropies, for structure characterization. To clarify the fundamental idea of our method, the multiscale persistent entropy (MPE) model is discussed in great detail. Mathematically, unlike the previous persistent entropy (Chintakunta et al. in Pattern Recognit 48(2):391–401, 2015; Merelli et al. in Entropy 17(10):6872–6892, 2015; Rucco et al. in: Proceedings of ECCS 2014, Springer, pp 117–128, 2016), a special resolution parameter is incorporated into our model. Various scales can be achieved by tuning its value. Physically, our MPE can be used in conformational entropy evaluation. More specifically, it is found that our method incorporates a natural classification scheme. This is achieved through a density filtration of an MRF built from angular distributions. To further validate our model, a systematic comparison with the traditional entropy evaluation model is done. Additionally, it is found that our model preserves the intrinsic topological features of biomolecular data much better than traditional approaches, particularly for resolutions in the intermediate range. Moreover, by comparing with traditional entropies from various grid sizes, bond angle-based methods and a persistent homology-based support vector machine method (Cang et al. in Mol Based Math Biol 3:140–162, 2015), we find that our MPE method gives the best results in terms of average true positive rate in a classic protein structure classification test. More interestingly, all-alpha and all-beta protein classes can be clearly separated from each other with zero error only in our model. Finally, a special protein structure index (PSI) is proposed, for the first
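For orientation, the single-scale persistent entropy that this work generalizes can be sketched in a few lines: it is the Shannon entropy of the normalized bar lengths of a persistence barcode. The barcode below is a made-up example, and the multiscale resolution parameter of the MPE model is not reproduced here.

```python
import math

def persistent_entropy(bars):
    """Shannon entropy of the normalized bar lengths (death - birth) of a
    persistence barcode; long-lived topological features carry more weight."""
    lengths = [death - birth for birth, death in bars]
    total = sum(lengths)
    probs = [length / total for length in lengths if length > 0]
    return -sum(p * math.log(p) for p in probs)

bars = [(0.0, 1.0), (0.2, 0.9), (0.5, 0.6)]   # hypothetical barcode
h = persistent_entropy(bars)
print(h)
```

The entropy is maximal when all bars have equal length and drops toward zero when one feature dominates, which is what makes it usable as a structure-classification signature.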

  1. Biomolecular Structure Information from High-Speed Quantum Mechanical Electronic Spectra Calculation.

    Science.gov (United States)

    Seibert, Jakob; Bannwarth, Christoph; Grimme, Stefan

    2017-08-30

    A fully quantum mechanical (QM) treatment to calculate electronic absorption (UV-vis) and circular dichroism (CD) spectra of typical biomolecules with thousands of atoms is presented. With our highly efficient sTDA-xTB method, spectra averaged along structures from molecular dynamics (MD) simulations can be computed in a reasonable time frame on standard desktop computers. This way, nonequilibrium structure and conformational, as well as purely quantum mechanical effects like charge-transfer or exciton-coupling, are included. Different from other contemporary approaches, the entire system is treated quantum mechanically and neither fragmentation nor system-specific adjustment is necessary. Among the systems considered are a large DNA fragment, oligopeptides, and even entire proteins in an implicit solvent. We propose the method in tandem with experimental spectroscopy or X-ray studies for the elucidation of complex (bio)molecular structures including metallo-proteins like myoglobin.

  2. Large inflated-antenna system

    Science.gov (United States)

    Hinson, W. F.; Keafer, L. S.

    1984-01-01

    It is proposed that for inflatable antenna systems, technology feasibility can be demonstrated and parametric design and scalability (scale factor 10 to 20) can be validated with an experiment using a 16-m-diameter antenna attached to the Shuttle. The antenna configuration consists of a thin film cone and paraboloid held to proper shape by internal pressure and a self-rigidizing torus. The cone and paraboloid would be made using pie-shaped gores with the paraboloid being coated with aluminum to provide reflectivity. The torus would be constructed using an aluminum polyester composite that when inflated would erect to a smooth shell that can withstand loads without internal pressure.

  3. Integrated Spintronic Platforms for Biomolecular Recognition Detection

    Science.gov (United States)

    Martins, V. C.; Cardoso, F. A.; Loureiro, J.; Mercier, M.; Germano, J.; Cardoso, S.; Ferreira, R.; Fonseca, L. P.; Sousa, L.; Piedade, M. S.; Freitas, P. P.

    2008-06-01

This paper covers recent developments in magnetoresistive biochip platforms fabricated at INESC-MN and their application to the detection and quantification of pathogenic waterborne microorganisms in water samples for human consumption. Such platforms are intended to respond to the increasing concern over microbially contaminated water sources. The presented results concern the development of biologically active DNA chips and protein chips and demonstrate the detection capability of the present platforms. Two platforms are described: one including spintronic sensors only (spin-valve based or magnetic tunnel junction based), and the other a fully scalable platform where each probe site consists of an MTJ in series with a thin film diode (TFD). Two microfluidic systems are described for cell separation and concentration, and finally the integrated readout and control electronics are described, allowing the realization of bioassays with a portable point-of-care unit. The present platforms already allow the detection of complementary biomolecular target recognition at 1 pM concentration.

  4. Large thermal protection system panel

    Science.gov (United States)

    Weinberg, David J. (Inventor); Myers, Franklin K. (Inventor); Tran, Tu T. (Inventor)

    2003-01-01

    A protective panel for a reusable launch vehicle provides enhanced moisture protection, simplified maintenance, and increased temperature resistance. The protective panel includes an outer ceramic matrix composite (CMC) panel, and an insulative bag assembly coupled to the outer CMC panel for isolating the launch vehicle from elevated temperatures and moisture. A standoff attachment system attaches the outer CMC panel and the bag assembly to the primary structure of the launch vehicle. The insulative bag assembly includes a foil bag having a first opening shrink fitted to the outer CMC panel such that the first opening and the outer CMC panel form a water tight seal at temperatures below a desired temperature threshold. Fibrous insulation is contained within the foil bag for protecting the launch vehicle from elevated temperatures. The insulative bag assembly further includes a back panel coupled to a second opening of the foil bag such that the fibrous insulation is encapsulated by the back panel, the foil bag, and the outer CMC panel. The use of a CMC material for the outer panel in conjunction with the insulative bag assembly eliminates the need for waterproofing processes, and ultimately allows for more efficient reentry profiles.

  5. Biomolecular condensates: organizers of cellular biochemistry.

    Science.gov (United States)

    Banani, Salman F; Lee, Hyun O; Hyman, Anthony A; Rosen, Michael K

    2017-05-01

    Biomolecular condensates are micron-scale compartments in eukaryotic cells that lack surrounding membranes but function to concentrate proteins and nucleic acids. These condensates are involved in diverse processes, including RNA metabolism, ribosome biogenesis, the DNA damage response and signal transduction. Recent studies have shown that liquid-liquid phase separation driven by multivalent macromolecular interactions is an important organizing principle for biomolecular condensates. With this physical framework, it is now possible to explain how the assembly, composition, physical properties and biochemical and cellular functions of these important structures are regulated.

  6. Application of Hidden Markov Models in Biomolecular Simulations.

    Science.gov (United States)

    Shukla, Saurabh; Shamsi, Zahra; Moffett, Alexander S; Selvam, Balaji; Shukla, Diwakar

    2017-01-01

Hidden Markov models (HMMs) provide a framework for analyzing large trajectories from biomolecular simulation datasets. HMMs decompose the conformational space of a biological molecule into a finite number of states that interconvert with certain rates. HMMs simplify long-timescale trajectories for human comprehension and allow comparison of simulations with experimental data. In this chapter, we provide an overview of building HMMs for analyzing biomolecular simulation datasets. We demonstrate the procedure by building a Hidden Markov model for a Met-enkephalin peptide simulation dataset and compare the timescales of the process.
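As a simplified illustration of the counting step that underlies such models, the sketch below estimates a transition probability matrix from an already-discretized trajectory (a Markov-state-model-style estimate; a full HMM would additionally fit hidden states and emission distributions, e.g. via Baum–Welch). The toy trajectory and state count are assumptions for illustration.

```python
import numpy as np

def transition_matrix(labels, n_states, lag=1):
    """Count state-to-state transitions at the given lag time and
    row-normalize the counts into transition probabilities."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(labels[:-lag], labels[lag:]):
        counts[i, j] += 1.0
    counts += 1e-12               # guard against states never visited
    return counts / counts.sum(axis=1, keepdims=True)

labels = [0, 0, 1, 1, 1, 0, 2, 2, 1, 0, 0, 1]   # hypothetical discretized trajectory
T = transition_matrix(labels, n_states=3)
print(T)
```

The implied timescales mentioned in the abstract follow from the eigenvalues of T at increasing lag times, which is how simulations are compared against experimental relaxation data.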

  7. Reliability of large and complex systems

    CERN Document Server

    Kolowrocki, Krzysztof

    2014-01-01

    Reliability of Large and Complex Systems, previously titled Reliability of Large Systems, is an innovative guide to the current state and reliability of large and complex systems. In addition to revised and updated content on the complexity and safety of large and complex mechanisms, this new edition looks at the reliability of nanosystems, a key research topic in nanotechnology science. The author discusses the importance of safety investigation of critical infrastructures that have aged or have been exposed to varying operational conditions. This reference provides an asympt

  8. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  9. Biomolecular ions in superfluid helium nanodroplets

    International Nuclear Information System (INIS)

    Gonzalez Florez, Ana Isabel

    2016-01-01

The function of a biological molecule is closely related to its structure. As a result, understanding and predicting biomolecular structure has become the focus of an extensive field of research. However, the investigation of molecular structure can be hampered by two main difficulties: the inherent complications that may arise from studying biological molecules in their native environment, and the potential congestion of the experimental results as a consequence of the large number of degrees of freedom present in these molecules. In this work, a new experimental setup has been developed and established to overcome the aforementioned limitations by combining structure-sensitive gas-phase methods with superfluid helium droplets. First, biological molecules are ionised and brought into the gas phase, often referred to as a clean-room environment, where the species of interest are isolated from their surroundings and intermolecular interactions are therefore absent. The mass-to-charge selected biomolecules are then embedded inside clusters of superfluid helium with an equilibrium temperature of ≈0.37 K. As a result, the internal energy of the molecules is lowered, thereby reducing the number of populated quantum states. Finally, the local hydrogen bonding patterns of the molecules are investigated by probing specific vibrational modes using the Fritz Haber Institute's free electron laser as a source of infrared radiation. Although the structure of a wide variety of molecules has been studied making use of the sub-Kelvin environment provided by superfluid helium droplets, the suitability of this method for the investigation of biological molecular ions was still unclear. The experimental results presented in this thesis, however, demonstrate the applicability of this experimental approach to studying the structure of intact, large biomolecular ions and present the first vibrational spectrum of the protonated pentapeptide leu-enkephalin embedded in helium

  10. Biomolecular ions in superfluid helium nanodroplets

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez Florez, Ana Isabel

    2016-07-01

The function of a biological molecule is closely related to its structure. As a result, understanding and predicting biomolecular structure has become the focus of an extensive field of research. However, the investigation of molecular structure can be hampered by two main difficulties: the inherent complications that may arise from studying biological molecules in their native environment, and the potential congestion of the experimental results as a consequence of the large number of degrees of freedom present in these molecules. In this work, a new experimental setup has been developed and established to overcome the aforementioned limitations by combining structure-sensitive gas-phase methods with superfluid helium droplets. First, biological molecules are ionised and brought into the gas phase, often referred to as a clean-room environment, where the species of interest are isolated from their surroundings and intermolecular interactions are therefore absent. The mass-to-charge selected biomolecules are then embedded inside clusters of superfluid helium with an equilibrium temperature of ≈0.37 K. As a result, the internal energy of the molecules is lowered, thereby reducing the number of populated quantum states. Finally, the local hydrogen bonding patterns of the molecules are investigated by probing specific vibrational modes using the Fritz Haber Institute's free electron laser as a source of infrared radiation. Although the structure of a wide variety of molecules has been studied making use of the sub-Kelvin environment provided by superfluid helium droplets, the suitability of this method for the investigation of biological molecular ions was still unclear. The experimental results presented in this thesis, however, demonstrate the applicability of this experimental approach to studying the structure of intact, large biomolecular ions and present the first vibrational spectrum of the protonated pentapeptide leu-enkephalin embedded in helium

  11. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  12. Thermodynamic properties of water solvating biomolecular surfaces

    Science.gov (United States)

    Heyden, Matthias

    Changes in the potential energy and entropy of water molecules hydrating biomolecular interfaces play a significant role for biomolecular solubility and association. Free energy perturbation and thermodynamic integration methods allow calculations of free energy differences between two states from simulations. However, these methods are computationally demanding and do not provide insights into individual thermodynamic contributions, i.e. changes in the solvent energy or entropy. Here, we employ methods to spatially resolve distributions of hydration water thermodynamic properties in the vicinity of biomolecular surfaces. This allows direct insights into thermodynamic signatures of the hydration of hydrophobic and hydrophilic solvent accessible sites of proteins and small molecules and comparisons to ideal model surfaces. We correlate dynamic properties of hydration water molecules, i.e. translational and rotational mobility, to their thermodynamics. The latter can be used as a guide to extract thermodynamic information from experimental measurements of site-resolved water dynamics. Further, we study energy-entropy compensations of water at different hydration sites of biomolecular surfaces. This work is supported by the Cluster of Excellence RESOLV (EXC 1069) funded by the Deutsche Forschungsgemeinschaft.

  13. Biomolecular engineering for nanobio/bionanotechnology

    Science.gov (United States)

    Nagamune, Teruyuki

    2017-04-01

    Biomolecular engineering can be used to purposefully manipulate biomolecules, such as peptides, proteins, nucleic acids and lipids, within the framework of the relations among their structures, functions and properties, as well as their applicability to such areas as developing novel biomaterials, biosensing, bioimaging, and clinical diagnostics and therapeutics. Nanotechnology can also be used to design and tune the sizes, shapes, properties and functionality of nanomaterials. As such, there are considerable overlaps between nanotechnology and biomolecular engineering, in that both are concerned with the structure and behavior of materials on the nanometer scale or smaller. Therefore, in combination with nanotechnology, biomolecular engineering is expected to open up new fields of nanobio/bionanotechnology and to contribute to the development of novel nanobiomaterials, nanobiodevices and nanobiosystems. This review highlights recent studies using engineered biological molecules (e.g., oligonucleotides, peptides, proteins, enzymes, polysaccharides, lipids, biological cofactors and ligands) combined with functional nanomaterials in nanobio/bionanotechnology applications, including therapeutics, diagnostics, biosensing, bioanalysis and biocatalysts. Furthermore, this review focuses on five areas of recent advances in biomolecular engineering: (a) nucleic acid engineering, (b) gene engineering, (c) protein engineering, (d) chemical and enzymatic conjugation technologies, and (e) linker engineering. Precisely engineered nanobiomaterials, nanobiodevices and nanobiosystems are anticipated to emerge as next-generation platforms for bioelectronics, biosensors, biocatalysts, molecular imaging modalities, biological actuators, and biomedical applications.

  14. Distributed simulation of large computer systems

    International Nuclear Information System (INIS)

    Marzolla, M.

    2001-01-01

Sequential simulation of large complex physical systems is often regarded as a computationally expensive task. To speed up complex discrete-event simulations, the paradigm of Parallel and Distributed Discrete Event Simulation (PDES) was introduced in the late 1970s. The authors analyze the applicability of PDES to the modeling and analysis of large computer systems; such systems are increasingly common in the area of High Energy and Nuclear Physics, because many modern experiments make use of large 'compute farms'. Some feasibility tests have been performed on a prototype distributed simulator

  15. Synthetic Approach to biomolecular science by cyborg supramolecular chemistry.

    Science.gov (United States)

    Kurihara, Kensuke; Matsuo, Muneyuki; Yamaguchi, Takumi; Sato, Sota

    2018-02-01

Imitating the essence of living systems via synthetic chemistry approaches has long been attempted. With the progress of supramolecular chemistry, it has become possible to synthesize molecules of a size and complexity close to those of biomacromolecules. Recently, the combination of precisely designed supramolecules with biomolecules has generated structural platforms for designing and creating unique molecular systems. Bridging synthetic chemistry and biomolecular science is also producing methodologies for the creation of artificial cellular systems. This paper provides an overview of the recently expanding interdisciplinary research that fuses artificial molecules with biomolecules and can thereby deepen our understanding of the dynamical ordering of biomolecules. Bottom-up approaches based on the precise chemical design, synthesis and hybridization of artificial molecules with biological materials have enabled the construction of sophisticated platforms having the fundamental functions of living systems. These effective hybrid, 'molecular cyborg', approaches enable not only the establishment of dynamic systems mimicking nature, and thus well-defined models for biophysical understanding, but also the creation of systems with highly advanced, integrated functions. This article is part of a Special Issue entitled "Biophysical Exploration of Dynamical Ordering of Biomolecular Systems" edited by Dr. Koichi Kato. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  17. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  18. Stability of large scale interconnected dynamical systems

    International Nuclear Information System (INIS)

    Akpan, E.P.

    1993-07-01

    Large scale systems modelled by a system of ordinary differential equations are considered and necessary and sufficient conditions are obtained for the uniform asymptotic connective stability of the systems using the method of cone-valued Lyapunov functions. It is shown that this model significantly improves the existing models. (author). 9 refs
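
    The cone-valued Lyapunov approach in this record generalizes the classical vector-Lyapunov comparison principle. As a hedged illustration (the standard textbook setting, not this paper's specific cone-valued construction), the idea can be written as:

    ```latex
    % Decompose the interconnected system into m subsystems
    %   \dot{x}_i = f_i(x_i) + h_i(t, x_1, \dots, x_m), \quad i = 1, \dots, m,
    % choose a Lyapunov function V_i for each isolated subsystem, and bound
    % the Dini derivatives along trajectories by a linear comparison system:
    \[
      D^{+} V_i(x) \;\le\; \sum_{j=1}^{m} a_{ij}\, V_j(x),
      \qquad \dot{u} = A u, \quad A = (a_{ij}).
    \]
    % If A is quasimonotone nondecreasing (a_{ij} \ge 0 for i \ne j) and
    % Hurwitz (equivalently, -A is an M-matrix), then the comparison system,
    % and with it the interconnected system, is asymptotically stable.
    ```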

  19. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  20. Trust dynamics in a large system implementation

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Rose, Jeremy

    2013-01-01

    A large information systems implementation (such as Enterprise Resource Planning systems) relies on the trust of its stakeholders to succeed. Such projects impact diverse groups of stakeholders, each with their legitimate interests and expectations. Levels of stakeholder trust can be expected... outcomes, but largely ignored the dynamics of trust relations. Giddens, as part of his study of modernity, theorises trust dynamics in relation to abstract social systems, though without focusing on information systems. We use Giddens' concepts to investigate evolving trust relationships in a longitudinal case analysis of a large Integrated Hospital System implementation for the Faroe Islands. Trust relationships suffered a serious breakdown, but the project was able to recover and meet its goals. We develop six theoretical propositions theorising the relationship between trust and project outcomes...

  1. Numerical solution of large sparse linear systems

    International Nuclear Information System (INIS)

    Meurant, Gerard; Golub, Gene.

    1982-02-01

    This note is based on one of the lectures given at the 1980 CEA-EDF-INRIA Numerical Analysis Summer School whose aim is the study of large sparse linear systems. The main topics are solving least squares problems by orthogonal transformation, fast Poisson solvers and solution of sparse linear system by iterative methods with a special emphasis on preconditioned conjuguate gradient method [fr
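
    The preconditioned conjugate gradient method emphasized in this record can be sketched as follows. This is an illustrative toy (dense storage, a Jacobi preconditioner, a made-up tridiagonal test system), not the lecture notes' implementation; production solvers use compressed sparse formats and stronger preconditioners.

    ```python
    # Preconditioned conjugate gradient (PCG) for a symmetric positive-
    # definite system A x = b, with a Jacobi (diagonal) preconditioner.

    def pcg(A, b, tol=1e-10, max_iter=1000):
        n = len(b)
        x = [0.0] * n
        r = b[:]                                   # residual r = b - A x, with x = 0
        M_inv = [1.0 / A[i][i] for i in range(n)]  # Jacobi preconditioner M^-1
        z = [M_inv[i] * r[i] for i in range(n)]    # preconditioned residual
        p = z[:]
        rz = sum(r[i] * z[i] for i in range(n))
        for _ in range(max_iter):
            Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
            alpha = rz / sum(p[i] * Ap[i] for i in range(n))
            x = [x[i] + alpha * p[i] for i in range(n)]
            r = [r[i] - alpha * Ap[i] for i in range(n)]
            if sum(ri * ri for ri in r) ** 0.5 < tol:
                break
            z = [M_inv[i] * r[i] for i in range(n)]
            rz_new = sum(r[i] * z[i] for i in range(n))
            p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
            rz = rz_new
        return x

    # Sparse 1D Poisson-like tridiagonal test system (stored densely here)
    n = 50
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        A[i][i] = 2.0
        if i > 0:
            A[i][i - 1] = -1.0
        if i < n - 1:
            A[i][i + 1] = -1.0
    b = [1.0] * n
    x = pcg(A, b)
    ```
    
    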

  2. Modeling and simulation of large HVDC systems

    Energy Technology Data Exchange (ETDEWEB)

    Jin, H.; Sood, V.K.

    1993-01-01

    This paper addresses the complexity and the amount of work in preparing simulation data and in implementing various converter control schemes and the excessive simulation time involved in modelling and simulation of large HVDC systems. The Power Electronic Circuit Analysis program (PECAN) is used to address these problems and a large HVDC system with two dc links is simulated using PECAN. A benchmark HVDC system is studied to compare the simulation results with those from other packages. The simulation time and results are provided in the paper.

  3. Workflow management in large distributed systems

    International Nuclear Information System (INIS)

    Legrand, I; Newman, H; Voicu, R; Dobre, C; Grigoras, C

    2011-01-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  4. Workflow management in large distributed systems

    Science.gov (United States)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  5. Application of Nanodiamonds in Biomolecular Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Ping Cheng

    2010-03-01

    The combination of nanodiamond (ND) with biomolecular mass spectrometry (MS) makes rapid, sensitive detection of biopolymers from complex biosamples feasible. Due to its chemical inertness, optical transparency and biocompatibility, the advantage of NDs in MS study is unique. Furthermore, functionalization on the surfaces of NDs greatly expands their application in proteomics and genomics for specific requirements. This review presents methods of MS analysis based on solid phase extraction and elution on NDs, with application examples including peptides, proteins, DNA, glycans and others. Owing to the quick development of nanotechnology, surface chemistry, and new MS methods, and the intense interest in proteomics and genomics, a large increase in their applications in biomolecular MS analysis in the near future can be predicted.

  6. Large autonomous spacecraft electrical power system (LASEPS)

    Science.gov (United States)

    Dugal-Whitehead, Norma R.; Johnson, Yvette B.

    1992-01-01

    NASA - Marshall Space Flight Center is creating a large high voltage electrical power system testbed called LASEPS. This testbed is being developed to simulate an end-to-end power system from power generation and source to loads. When the system is completed it will have several power configurations, which will include several battery configurations. These configurations are: two 120 V batteries, one or two 150 V batteries, and one 250 to 270 V battery. This breadboard encompasses varying levels of autonomy from remote power converters to conventional software control to expert system control of the power system elements. In this paper, the construction and provisions of this breadboard are discussed.

  7. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations.

    Science.gov (United States)

    Vergara-Perez, Sandra; Marucho, Marcelo

    2016-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually make several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single processor computers. MPBEC is a Matlab script based on the Adaptive Poisson-Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and setup the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users pre- and post-analysis of structural and electrical properties of biomolecules.
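
    The kind of calculation such PB solvers automate can be illustrated with a deliberately minimal sketch: the linearized Poisson-Boltzmann equation in one dimension, phi'' = kappa^2 * phi, discretized by finite differences and solved with the Thomas algorithm. Real packages solve the full three-dimensional, often nonlinear, equation around a molecular surface; the geometry, units and parameters below are illustrative assumptions only.

    ```python
    import math

    def solve_lpb_1d(kappa=1.0, L=5.0, phi0=1.0, n=499):
        """Solve phi'' = kappa^2 phi on [0, L], phi(0) = phi0, phi(L) = 0,
        on n interior grid points (x_i = (i + 1) * h, h = L / (n + 1))."""
        h = L / (n + 1)
        # Discretization: phi[i-1] - (2 + (kappa*h)^2) phi[i] + phi[i+1] = 0
        diag = -(2.0 + (kappa * h) ** 2)
        rhs = [0.0] * n
        rhs[0] = -phi0          # left boundary value moved to the RHS
        # Thomas algorithm: forward sweep, then back substitution
        c = [0.0] * n           # modified super-diagonal coefficients
        d = [0.0] * n           # modified right-hand side
        c[0] = 1.0 / diag
        d[0] = rhs[0] / diag
        for i in range(1, n):
            m = diag - c[i - 1]
            c[i] = 1.0 / m
            d[i] = (rhs[i] - d[i - 1]) / m
        phi = [0.0] * n
        phi[-1] = d[-1]
        for i in range(n - 2, -1, -1):
            phi[i] = d[i] - c[i] * phi[i + 1]
        return h, phi

    h, phi = solve_lpb_1d()
    # Analytic solution for comparison: phi(x) = phi0 sinh(kappa (L - x)) / sinh(kappa L);
    # with h = 0.01, phi[99] approximates phi at x = 1.0.
    ```
    
    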

  8. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations

    Science.gov (United States)

    Vergara-Perez, Sandra; Marucho, Marcelo

    2016-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually make several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single processor computers. MPBEC is a Matlab script based on the Adaptive Poisson-Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and setup the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users pre- and post-analysis of structural and electrical properties of biomolecules.

  9. Models for large superconducting toroidal magnet systems

    International Nuclear Information System (INIS)

    Arendt, F.; Brechna, H.; Erb, J.; Komarek, P.; Krauth, H.; Maurer, W.

    1976-01-01

    Prior to the design of large GJ toroidal magnet systems it is appropriate to procure small-scale models which can simulate their pertinent properties and allow investigation of the relevant phenomena. The important feature of a model is to show under which circumstances the system performance can be extrapolated to large magnets. A simple method of designing model magnets is presented, based on parameters such as the maximum magnetic field, the current density, and the maximum tolerable magneto-mechanical stresses. It is shown how pertinent design parameters change when the toroidal dimensions are altered. In addition, some conductor cost estimates are given based on reactor power output and wall loading

  10. A statistical nanomechanism of biomolecular patterning actuated by surface potential

    Science.gov (United States)

    Lin, Chih-Ting; Lin, Chih-Hao

    2011-02-01

    Biomolecular patterning on a nanoscale/microscale on chip surfaces is one of the most important techniques used in in vitro biochip technologies. Here, we report upon a stochastic mechanics model we have developed for biomolecular patterning controlled by surface potential. The probabilistic biomolecular surface adsorption behavior can be modeled by considering the potential difference between the binding and nonbinding states. To verify our model, we experimentally implemented a method of electroactivated biomolecular patterning technology, and the resulting fluorescence intensity matched the prediction of the developed model quite well. Based on this result, we also experimentally demonstrated the creation of a bovine serum albumin pattern with a width of 200 nm in a 5 min operation. This submicron noncovalent-binding biomolecular pattern can be maintained for hours after removing the applied electrical voltage. These stochastic understandings and experimental results not only prove the feasibility of submicron biomolecular patterns on chips but also pave the way for nanoscale interfacial-bioelectrical engineering.
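
    A two-state picture of the kind this record describes can be sketched as follows: the occupation of the bound state follows Boltzmann statistics in the free-energy difference between the binding and nonbinding states, which an applied surface potential shifts by roughly q*V for an effective molecular charge q. All symbols and numbers below are illustrative assumptions, not the authors' model.

    ```python
    import math

    KT = 0.0259  # thermal energy at room temperature, in eV (assumed units)

    def adsorption_probability(delta_g0, q_eff, v_surface, kt=KT):
        """P(bound) for a zero-field free-energy gap delta_g0 (eV, bound minus
        nonbinding), shifted by q_eff * v_surface under an applied potential."""
        delta_g = delta_g0 + q_eff * v_surface   # effective gap, in eV
        return 1.0 / (1.0 + math.exp(delta_g / kt))

    # A negative surface potential attracting a positively charged molecule
    # lowers the effective gap and so raises the adsorption probability.
    p_off = adsorption_probability(0.05, 1.0, 0.0)    # no applied potential
    p_on = adsorption_probability(0.05, 1.0, -0.2)    # electroactivated
    ```
    
    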

  11. Perspective: Markov models for long-timescale biomolecular dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Schwantes, C. R.; McGibbon, R. T. [Department of Chemistry, Stanford University, Stanford, California 94305 (United States); Pande, V. S., E-mail: pande@stanford.edu [Department of Chemistry, Stanford University, Stanford, California 94305 (United States); Department of Computer Science, Stanford University, Stanford, California 94305 (United States); Department of Structural Biology, Stanford University, Stanford, California 94305 (United States); Biophysics Program, Stanford University, Stanford, California 94305 (United States)

    2014-09-07

    Molecular dynamics simulations have the potential to provide atomic-level detail and insight to important questions in chemical physics that cannot be observed in typical experiments. However, simply generating a long trajectory is insufficient, as researchers must be able to transform the data in a simulation trajectory into specific scientific insights. Although this analysis step has often been taken for granted, it deserves further attention as large-scale simulations become increasingly routine. In this perspective, we discuss the application of Markov models to the analysis of large-scale biomolecular simulations. We draw attention to recent improvements in the construction of these models as well as several important open issues. In addition, we highlight recent theoretical advances that pave the way for a new generation of models of molecular kinetics.
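
    The Markov-model construction discussed in this record can be sketched in miniature: count transitions in a state-discretized trajectory at a fixed lag time, row-normalize the counts into a transition matrix, and extract the stationary distribution by power iteration. Real MSM pipelines add clustering of conformations, lag-time selection and reversibility constraints; this toy, with a made-up two-state trajectory, is illustrative only.

    ```python
    # Minimal Markov state model: transition counting + stationary distribution.

    def transition_matrix(traj, n_states, lag=1):
        """Row-stochastic transition matrix estimated at the given lag time."""
        counts = [[0] * n_states for _ in range(n_states)]
        for t in range(len(traj) - lag):
            counts[traj[t]][traj[t + lag]] += 1
        T = []
        for row in counts:
            total = sum(row)
            # Unvisited states get a uniform row so T stays stochastic.
            T.append([c / total for c in row] if total else
                     [1.0 / n_states] * n_states)
        return T

    def stationary(T, iters=2000):
        """Stationary distribution of T by left power iteration."""
        n = len(T)
        pi = [1.0 / n] * n
        for _ in range(iters):
            pi = [sum(pi[i] * T[i][j] for i in range(n)) for j in range(n)]
        return pi

    # Toy two-state trajectory: mostly stays in state 0, occasionally hops.
    traj = [0] * 40 + [1] * 10 + [0] * 40 + [1] * 10
    T = transition_matrix(traj, 2)
    pi = stationary(T)
    ```
    
    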

  12. Perspective: Markov models for long-timescale biomolecular dynamics

    International Nuclear Information System (INIS)

    Schwantes, C. R.; McGibbon, R. T.; Pande, V. S.

    2014-01-01

    Molecular dynamics simulations have the potential to provide atomic-level detail and insight to important questions in chemical physics that cannot be observed in typical experiments. However, simply generating a long trajectory is insufficient, as researchers must be able to transform the data in a simulation trajectory into specific scientific insights. Although this analysis step has often been taken for granted, it deserves further attention as large-scale simulations become increasingly routine. In this perspective, we discuss the application of Markov models to the analysis of large-scale biomolecular simulations. We draw attention to recent improvements in the construction of these models as well as several important open issues. In addition, we highlight recent theoretical advances that pave the way for a new generation of models of molecular kinetics.

  13. Data acquisition system issues for large experiments

    International Nuclear Information System (INIS)

    Siskind, E.J.

    2007-01-01

    This talk consists of personal observations on two classes of data acquisition ('DAQ') systems for Silicon trackers in large experiments with which the author has been concerned over the last three or more years. The first half is a classic 'lessons learned' recital based on experience with the high-level debug and configuration of the DAQ system for the GLAST LAT detector. The second half is concerned with a discussion of the promises and pitfalls of using modern (and future) generations of 'system-on-a-chip' ('SOC') or 'platform' field-programmable gate arrays ('FPGAs') in future large DAQ systems. The DAQ system pipeline for the 864k channels of Si tracker in the GLAST LAT consists of five tiers of hardware buffers which ultimately feed into the main memory of the (two-active-node) level-3 trigger processor farm. The data formats and buffer volumes of these tiers are briefly described, as well as the flow control employed between successive tiers. Lessons learned regarding data formats, buffer volumes, and flow control/data discard policy are discussed. The continued development of platform FPGAs containing large amounts of configurable logic fabric, embedded PowerPC hard processor cores, digital signal processing components, large volumes of on-chip buffer memory, and multi-gigabit serial I/O capability permits DAQ system designers to vastly increase the amount of data preprocessing that can be performed in parallel within the DAQ pipeline for detector systems in large experiments. The capabilities of some currently available FPGA families are reviewed, along with the prospects for next-generation families of announced, but not yet available, platform FPGAs. Some experience with an actual implementation is presented, and reconciliation between advertised and achievable specifications is attempted. The prospects for applying these components to space-borne Si tracker detectors are briefly discussed

  14. Siemens: Smart Technologies for Large Control Systems

    CERN Multimedia

    CERN. Geneva; BAKANY, Elisabeth

    2015-01-01

    The CERN Large Hadron Collider (LHC) is known to be one of the most complex scientific machines ever built by mankind. Its correct functioning relies on the integration of a multitude of interdependent industrial control systems, which provide different and essential services to run and protect the accelerators and experiments. These systems have to deal with several millions of data points (e.g. sensors, actuators, configuration parameters, etc.) which need to be acquired, processed, archived and analysed. For more than 20 years, CERN and Siemens have developed a strong collaboration to deal with the challenges of these large systems. The presentation will cover the current work on the SCADA (Supervisory Control and Data Acquisition) systems and Data Analytics Frameworks.

  15. Geothermal ORC Systems Using Large Screw Expanders

    OpenAIRE

    Biederman, Tim R.; Brasz, Joost J.

    2014-01-01

    This paper describes a low-temperature Organic Rankine Cycle power recovery system that uses a screw expander, a derivative of Kaishan's line of screw compressors, as its power unit. The screw expander design is a modified version of an existing refrigeration compressor used on water-cooled chillers. Starting the ORC development program with existing refrigeration screw compre...

  16. Quality Function Deployment for Large Systems

    Science.gov (United States)

    Dean, Edwin B.

    1992-01-01

    Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large scale systems. It links QFD to the system engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high quality, low cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.

  17. Advanced manipulator system for large hot cells

    International Nuclear Information System (INIS)

    Vertut, J.; Moreau, C.; Brossard, J.P.

    1981-01-01

    Large hot cells can be approached as extrapolations of smaller ones, wider, higher or longer in size, with the same concept of using mechanical master-slave manipulators and high-density windows. This concept leads to a large number of working places and corresponding equipment, with many penetrations through the biological protection. When the large cell does not need permanent operation of a number of work places, as in particular to serve PIE machines and maintain the facility, the use of servo manipulators with a large supporting unit and extensive use of television appears optimal. The advances on the MA 23 and its supports will be described, including the extra facilities related to manipulator introduction and maintenance. The possibility of combining a powered manipulator and an MA 23 (single or pair) on the same boom-crane system will be described. An advanced control system that minimizes the dead time in controlling support movement, associated with master-slave arm operation, is under development. The general television system includes overview cameras, associated with the limited number of windows, and manipulator cameras. A special new system that provides automatic control of the manipulator cameras, reducing operator load and dead time, will be described. Full-scale tests with the MA 23 and its support will be discussed. (author)

  18. Sequential decoders for large MIMO systems

    KAUST Repository

    Ali, Konpal S.; Abediseid, Walid; Alouini, Mohamed-Slim

    2014-01-01

    the Sequential Decoder using the Fano Algorithm for large MIMO systems. A parameter called the bias is varied to attain different performance-complexity trade-offs. Low values of the bias result in excellent performance but at the expense of high complexity

  19. Beyond Multiplexing Gain in Large MIMO Systems

    DEFF Research Database (Denmark)

    Cakmak, Burak; Müller, Ralf R.; Fleury, Bernard Henri

    growth (multiplexing gain). Even when the channel entries are i.i.d. the deviation from the linear growth is significant. We also find an additive property of the deviation for a concatenated MIMO system. Finally, we quantify the deviation of the large SNR capacity from the exact capacity and find...

  20. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  1. Entry control system for large populations

    International Nuclear Information System (INIS)

    Merillat, P.D.

    1982-01-01

    An Entry Control System has been developed which is appropriate for use at an installation with a large population requiring access over a large area. This is accomplished by centralizing the data base management and enrollment functions and decentralizing the guard-assisted, positive personnel identification and access functions. Current information pertaining to all enrollees is maintained through user-friendly enrollment stations. These stations may be used to enroll individuals, alter their area access authorizations, change expiration dates, and perform other similar functions. An audit trail of data base alterations is provided to the System Manager. Decentralized systems exist at each area to which access is controlled. The central system provides these systems with the necessary entry control information to allow them to operate microprocessor-driven entry control devices. The system is comprised of commercially available entry control components and is structured such that it will be able to incorporate improved devices as technology progresses. Currently, access is granted to individuals who possess a valid credential, have current access authorization, can supply a memorized personal identification number, and whose physical hand dimensions match their profile obtained during enrollment. The entry control devices report misuses as security violations to a Guard Alarm Display and Assessment System

  2. Large-scale Intelligent Transporation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  3. Computational Methods for Biomolecular Electrostatics

    Science.gov (United States)

    Dong, Feng; Olsen, Brett; Baker, Nathan A.

    2008-01-01

    An understanding of intermolecular interactions is essential for insight into how cells develop, operate, communicate and control their activities. Such interactions include several components: contributions from linear, angular, and torsional forces in covalent bonds, van der Waals forces, as well as electrostatics. Among the various components of molecular interactions, electrostatics are of special importance because of their long range and their influence on polar or charged molecules, including water, aqueous ions, and amino or nucleic acids, which are some of the primary components of living systems. Electrostatics, therefore, play important roles in determining the structure, motion and function of a wide range of biological molecules. This chapter presents a brief overview of electrostatic interactions in cellular systems with a particular focus on how computational tools can be used to investigate these types of interactions. PMID:17964951

  4. Techniques of biomolecular quantification through AMS detection of radiocarbon

    International Nuclear Information System (INIS)

    Vogel, S.J.; Turteltaub, K.W.; Frantz, C.; Felton, J.S.; Gledhill, B.L.

    1992-01-01

    Accelerator mass spectrometry offers a large gain in sensitivity over scintillation counting for detecting radiocarbon in biomolecular tracing. Application of this sensitivity requires new considerations of procedures to extract or isolate the carbon fraction to be quantified, to inventory all carbon in the sample, to prepare graphite from the sample for use in the spectrometer, and to derive a meaningful quantification from the measured isotope ratio. These procedures need to be accomplished without contaminating the sample with radiocarbon, which may be ubiquitous in laboratories and on equipment previously used for higher-dose, scintillation experiments. Disposable equipment, materials and surfaces are used to control these contaminations. Quantification of attomole amounts of labeled substances is possible through these techniques

  5. Database management system for large container inspection system

    International Nuclear Information System (INIS)

    Gao Wenhuan; Li Zheng; Kang Kejun; Song Binshan; Liu Fang

    1998-01-01

    Large Container Inspection System (LCIS) based on radiation imaging technology is a powerful tool for the Customs to check the contents inside a large container without opening it. The author has discussed a database application system, as a part of Signal and Image System (SIS), for the LCIS. The basic requirements analysis was done first. Then the selections of computer hardware, operating system, and database management system were made according to the technology and market products circumstance. Based on the above considerations, a database application system with central management and distributed operation features has been implemented

  6. Detector correction in large container inspection systems

    CERN Document Server

    Kang Ke Jun; Chen Zhi Qiang

    2002-01-01

    In large container inspection systems, the image is constructed by parallel scanning with a one-dimensional detector array with a linac used as the X-ray source. The linear nonuniformity and nonlinearity of multiple detectors and the nonuniform intensity distribution of the X-ray sector beam result in horizontal striations in the scan image. This greatly impairs the image quality, so the image needs to be corrected. The correction parameters are determined experimentally by scaling the detector responses at multiple points with logarithm interpolation of the results. The horizontal striations are eliminated by modifying the original image data with the correction parameters. This method has proven to be effective and applicable in large container inspection systems
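
    The calibration-based correction this record describes can be sketched in miniature: each detector's response is measured at several known attenuation steps, and raw readings are then mapped onto a common scale by interpolating between the bracketing calibration points in the logarithmic domain (X-ray transmission falls off exponentially with material thickness). The function name and the synthetic calibration values below are illustrative assumptions, not the system's actual procedure.

    ```python
    import math

    def log_interp_correct(raw, cal_readings, cal_values):
        """Map a raw detector reading onto the calibrated transmission scale
        by logarithmic interpolation between bracketing calibration points."""
        # cal_readings: detector outputs at the calibration steps (ascending)
        # cal_values:   true transmissions at those same steps (ascending)
        r = min(max(raw, cal_readings[0]), cal_readings[-1])  # clamp to range
        for k in range(len(cal_readings) - 1):
            lo, hi = cal_readings[k], cal_readings[k + 1]
            if lo <= r <= hi:
                # fractional position of the reading in log space
                f = (math.log(r) - math.log(lo)) / (math.log(hi) - math.log(lo))
                return math.exp(math.log(cal_values[k]) +
                                f * (math.log(cal_values[k + 1]) -
                                     math.log(cal_values[k])))
        return cal_values[-1]

    # Synthetic nonlinear detector: reading = transmission ** 0.9
    cal_values = [0.01, 0.1, 0.5, 1.0]
    cal_readings = [v ** 0.9 for v in cal_values]
    corrected = log_interp_correct(0.1 ** 0.9, cal_readings, cal_values)
    ```
    
    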

  7. [Large vessels vasculopathy in systemic sclerosis].

    Science.gov (United States)

    Tejera Segura, Beatriz; Ferraz-Amaro, Iván

    2015-12-07

    Vasculopathy in systemic sclerosis is a severe, in many cases irreversible, manifestation that can lead to amputation. While the classical clinical manifestations of the disease involve the microcirculation, the proximal vessels of the upper and lower limbs can also be affected. This involvement of large vessels may be related to systemic sclerosis, vasculitis or atherosclerosis, and the differential diagnosis is not easy. A proper and early diagnosis is essential to starting prompt, appropriate treatment. In this review, we examine the involvement of large vessels in scleroderma, an understudied manifestation with important prognostic and therapeutic implications. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.

  8. Energy cascading in large district heating systems

    International Nuclear Information System (INIS)

    Mayer, F.W.

    1978-01-01

    District heat transfer is the most economical utilization of the waste heat of power plants. Optimum utilization and heat transfer over large distances are possible because of a new energy distribution system, the ''energy cascading system,'' in which heat is transferred to several consumer regions at different temperature ranges. It is made more profitable by the use of heat pumps. The optimum flow-line temperature is 368 K, and the optimum return-line temperature is 288 K, resulting in an approximately 50% reduction of electric power loss at the power plant.

  9. Large Efficient Intelligent Heating Relay Station System

    Science.gov (United States)

    Wu, C. Z.; Wei, X. G.; Wu, M. Q.

    2017-12-01

    The design of a large, efficient, intelligent heating relay station system aims to remedy shortcomings of the existing heating systems in our country: low heating efficiency, wasted energy, serious pollution, and continued dependence on manual control. In this design, we first improve the existing plate heat exchanger. Secondly, the ATM89C51 is used to control the whole system and realize intelligent control. The detection part uses a PT100 temperature sensor, a pressure sensor, and a turbine flowmeter to measure the heating temperature, user-end liquid flow, and hydraulic pressure in real time; the feedback signals allow the microcontroller to adjust the heating for users, making the whole system more efficient, intelligent, and energy-saving.

  10. Large Coil Program magnetic system design study

    International Nuclear Information System (INIS)

    Moses, S.D.; Johnson, N.E.

    1977-01-01

    The primary objective of the Large Coil Program (LCP) is to demonstrate the reliable operation of large superconducting coils to provide a basis for the design principles, materials, and fabrication techniques proposed for the toroidal magnets for The Next Step (TNS) and other future tokamak devices. This paper documents a design study of the Large Coil Test Facility (LCTF) in which the structural response of the Toroidal Field (TF) Coils and the supporting structure was evaluated under simulated reactor conditions. The LCP test facility structural system consists of six TF Coils, twelve coil-to-coil torsional restraining beams (torque rings), a central bucking post with base, and a Pulse Coil system. The NASTRAN finite element structural analysis computer code was utilized to determine the distribution of deflections, forces, and stresses for each of the TF Coils, torque rings, and the central bucking post. Eleven load conditions were selected to represent probable test operations. Pulse Coils suspended in the bore of the test coil were energized to simulate the pulsed field environment characteristic of the TNS reactor system. The TORMAC computer code was utilized to develop the magnetic forces in the TF Coils for each of the eleven loading conditions examined, with or without the Pulse Coils energized. The TORMAC output forces were used directly as input load conditions for the NASTRAN analyses. Results are presented which demonstrate the reliability of the LCTF under simulated reactor operating conditions.

  11. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's, an interest in testing and operating RF cavities at 1.8 K motivated the development and construction of four large (300 W) 1.8 K refrigeration systems. In the past decade, the development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved niobium-titanium superconductors have once again created interest in large-scale 1.8 K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 W 1.8 K system which incorporates a new technology, cold compressors, to obtain the low vapor pressure for low-temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0 K. Magnetic refrigerators of 10 W capacity or higher at 1.8 K are now being developed. The state of the art of large-scale refrigeration in the range under 4 K is reviewed. 28 refs., 4 figs., 7 tabs.

  12. A QM-MD simulation approach to the analysis of FRET processes in (bio)molecular systems. A case study: complexes of E. coli purine nucleoside phosphorylase and its mutants with formycin A.

    Science.gov (United States)

    Sobieraj, M; Krzyśko, K A; Jarmuła, A; Kalinowski, M W; Lesyng, B; Prokopowicz, M; Cieśla, J; Gojdź, A; Kierdaszuk, B

    2015-04-01

    Predicting FRET pathways in proteins using computer simulation techniques is very important for reliable interpretation of experimental data. A novel and relatively simple methodology has been developed and applied to purine nucleoside phosphorylase (PNP) complexed with a fluorescent ligand, formycin A (FA). FRET occurs between an excited Tyr residue (D*) and FA (A). This study aims to interpret experimental data that, among others, suggest the absence of FRET for the PNPF159A mutant in complex with FA, based on the novel theoretical methodology. MD simulations for the protein molecule containing D*, and complexed with A, are carried out. Interactions of D* with its molecular environment are accounted for by including the changes of the ESP charges in S1 relative to S0, computed at the SCF-CI level. The FRET probability W_F depends on the inverse sixth power of the D*-A distance, R_DA. The orientational factor 0 < κ² < 4 between D* and A is computed and included in the analysis. Finally, W_F is time-averaged over the MD trajectories, resulting in its mean value. The red shift of the tyrosinate anion emission, and thus the lack of spectral overlap integral and the thermal dissipation of the energy, are the reasons for the absence of FRET in the studied mutants at pH 7 and above. The presence of the tyrosinate anion results in a competitive energy dissipation channel and red-shifted emission, and thus in consequence in the absence of FRET. These studies also indicate an important role of the phenyl ring of Phe159 for FRET in the wild-type PNP, which does not exist in the Ala159 mutant, and for the effective association of PNP with FA. In a more general context, our observations point out very interesting and biologically important properties of the tyrosine residue in its excited state, which may undergo spontaneous deprotonation in biomolecular systems, resulting further in unexpected physical and/or biological phenomena. Until now, this observation has not been widely discussed in the literature.
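    The two geometric quantities named in the abstract, the orientation factor and the inverse-sixth-power distance dependence, can be illustrated with a short sketch. These are the standard Förster-theory formulas; the function names, example dipole vectors, and Förster radius are assumptions, and the paper's full treatment additionally time-averages W_F over MD frames with excited-state charge corrections:

```python
import numpy as np

def kappa_squared(mu_d, mu_a, r_vec):
    """Orientation factor kappa^2 between donor and acceptor transition
    dipoles (given as unit vectors); it lies between 0 and 4."""
    r_hat = r_vec / np.linalg.norm(r_vec)
    k = mu_d @ mu_a - 3.0 * (mu_d @ r_hat) * (mu_a @ r_hat)
    return k * k

def fret_efficiency(r, r0):
    """Foerster transfer efficiency with its inverse-sixth-power
    dependence on the donor-acceptor distance r (r0: Foerster radius)."""
    return 1.0 / (1.0 + (r / r0) ** 6)

mu = np.array([1.0, 0.0, 0.0])
print(kappa_squared(mu, mu, np.array([2.0, 0.0, 0.0])))  # collinear: 4.0
print(fret_efficiency(3.0, 3.0))                         # r = r0: 0.5
```

    In an MD-based analysis, both quantities would be evaluated frame by frame and averaged, which is the procedure the abstract describes.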

  13. Construction of a large laser fusion system

    International Nuclear Information System (INIS)

    Hurley, C.A.

    1977-01-01

    Construction of a large laser fusion machine is nearing completion at the Lawrence Livermore Laboratory (LLL). Shiva, a 20-terawatt neodymium-doped glass system, will be complete in early 1978. This system will have the high power needed to demonstrate significant thermonuclear burn. Shiva will irradiate a microscopic D-T pellet with 20 separate laser beams arriving simultaneously at the target. This requires precise alignment and stability to maintain alignment. Hardware for the 20 laser chains is composed of 140 amplifiers, 100 spatial filters, 80 isolation stages, 40 large turning mirrors, and a front-end splitter system of over 100 parts. These are mounted on a high-stability, three-dimensional spaceframe which serves as an optical bench. The mechanical design effort, spanning approximately 3 years, followed a classic engineering evolution. The conceptual design phase led directly to system optimization through cost and technical tradeoffs. Additional manpower was then required for detailed design and specification of hardware and fabrication. Design of long-lead items was started early in order to initiate fabrication and assembly while the rest of the design was completed. All components were ready for assembly and construction as fiscal priorities and schedules permitted.

  14. Quantifying the topography of the intrinsic energy landscape of flexible biomolecular recognition

    Science.gov (United States)

    Chu, Xiakun; Gan, Linfeng; Wang, Erkang; Wang, Jin

    2013-01-01

    Biomolecular functions are determined by their interactions with other molecules. Biomolecular recognition is often flexible and associated with large conformational changes involving both binding and folding. However, the global and physical understanding for the process is still challenging. Here, we quantified the intrinsic energy landscapes of flexible biomolecular recognition in terms of binding–folding dynamics for 15 homodimers by exploring the underlying density of states, using a structure-based model both with and without considering energetic roughness. By quantifying three individual effective intrinsic energy landscapes (one for interfacial binding, two for monomeric folding), the association mechanisms for flexible recognition of 15 homodimers can be classified into two-state cooperative “coupled binding–folding” and three-state noncooperative “folding prior to binding” scenarios. We found that the association mechanism of flexible biomolecular recognition relies on the interplay between the underlying effective intrinsic binding and folding energy landscapes. By quantifying the whole global intrinsic binding–folding energy landscapes, we found strong correlations between the landscape topography measure Λ (dimensionless ratio of energy gap versus roughness modulated by the configurational entropy) and the ratio of the thermodynamic stable temperature versus trapping temperature, as well as between Λ and binding kinetics. Therefore, the global energy landscape topography determines the binding–folding thermodynamics and kinetics, crucial for the feasibility and efficiency of realizing biomolecular function. We also found “U-shape” temperature-dependent kinetic behavior and a dynamical cross-over temperature for dividing exponential and nonexponential kinetics for two-state homodimers. Our study provides a unique way to bridge the gap between theory and experiments. PMID:23754431
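    As a rough illustration of the topography measure Λ, the sketch below uses one commonly quoted form of the ratio (energy gap over roughness, modulated by the configurational entropy); the paper's exact normalisation may differ, and all numbers are invented:

```python
import math

def landscape_topography(energy_gap, roughness, config_entropy):
    """Dimensionless topography measure: energy gap over roughness,
    modulated by configurational entropy. This is one commonly used
    form; the paper's exact normalisation may differ."""
    return energy_gap / (roughness * math.sqrt(2.0 * config_entropy))

# A larger gap-to-roughness ratio means a more funnelled landscape,
# hence faster and more reliable binding-folding (invented numbers).
funnelled = landscape_topography(10.0, 1.0, 50.0)
rugged = landscape_topography(5.0, 2.0, 50.0)
print(funnelled > rugged)  # True
```

    The abstract's correlations (Λ versus the stable-to-trapping temperature ratio, and Λ versus binding kinetics) are statements about how this single number tracks thermodynamics and kinetics across the 15 homodimers.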

  15. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept: reasoning principles for integrating traditional engineering problem solving with systems theory, management science, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity through a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in organizing engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  16. Pulsed rf systems for large storage rings

    International Nuclear Information System (INIS)

    Wilson, P.B.

    1979-03-01

    The possibility is considered that by using a pulsed rf system a substantial reduction can be made in the rf power requirement for the next generation of large storage rings. For a ring with a sufficiently large circumference, the time between bunch passages, T_b, can exceed the cavity filling time, T_f. As the ratio T_b/T_f increases, it is clear that at some point the average power requirement can be reduced by pulsing the rf to the cavities. In this mode of operation, the rf power is turned on a filling time or so before the arrival of a bunch and is switched off again at the time of bunch passage. There is no rf energy in the accelerating structure, and hence no power dissipation, for most of the period between bunches.
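    The power saving claimed above follows from a simple duty-cycle argument: if the rf is on for roughly one filling time per bunch period, the average power scales as T_f/T_b. A minimal sketch (the function name and numbers are illustrative, and the model ignores the detailed field build-up):

```python
def average_rf_power(peak_power, fill_time, bunch_period):
    """Average rf power in pulsed operation: the rf is on for roughly
    one cavity filling time T_f per bunch period T_b, so the average
    power is the peak power times the duty cycle T_f/T_b (capped at 1).
    The shape of the field build-up is ignored in this sketch."""
    duty = min(fill_time / bunch_period, 1.0)
    return peak_power * duty

# A bunch period of 8 filling times cuts the average power by 87.5%
print(average_rf_power(1.0e6, 1.0, 8.0))  # 125000.0
```

    When T_b <= T_f the cap makes the formula reduce to continuous-wave operation, which is the regime of smaller rings.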

  17. Radiofrequency and microwave interactions between biomolecular systems

    Czech Academy of Sciences Publication Activity Database

    Kučera, Ondřej; Cifra, Michal

    2016-01-01

    Roč. 42, č. 1 (2016), s. 1-8 ISSN 0092-0606 R&D Projects: GA ČR(CZ) GA15-17102S Institutional support: RVO:67985882 Keywords : Cell signaling * Radiofrequency * Bioelectrodynamics Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering Impact factor: 1.241, year: 2016

  18. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module, where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  19. Review of MEMS differential scanning calorimetry for biomolecular study

    Science.gov (United States)

    Yu, Shifeng; Wang, Shuyu; Lu, Ming; Zuo, Lei

    2017-12-01

    Differential scanning calorimetry (DSC) is one of the few techniques that allow direct determination of enthalpy values for binding reactions and conformational transitions in biomolecules. It provides thermodynamic information about biomolecules (Gibbs free energy, enthalpy, and entropy) in a straightforward manner, enabling a deep understanding of structure-function relationships in biomolecules, such as the folding/unfolding of proteins and DNA, and ligand binding. This review provides an up-to-date overview of the applications of DSC in biomolecular studies, such as bovine serum albumin denaturation and the relationship between the melting point of lysozyme and the scanning rate. We also introduce recent advances in the development of micro-electro-mechanical-system (MEMS) based DSCs.
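    The three thermodynamic quantities mentioned are linked by the Gibbs relation ΔG = ΔH - TΔS, which is how DSC-measured enthalpies translate into stability curves. A trivial sketch with invented numbers for a hypothetical unfolding transition:

```python
def gibbs_free_energy(delta_h, temperature, delta_s):
    """Gibbs relation dG = dH - T * dS connecting the three quantities a
    DSC scan gives access to (units: J/mol, K, J/(mol*K))."""
    return delta_h - temperature * delta_s

# Hypothetical unfolding transition: dH = 300 kJ/mol, dS = 1 kJ/(mol K),
# so dG vanishes at the melting temperature Tm = dH/dS = 300 K.
print(gibbs_free_energy(300e3, 300.0, 1e3))        # 0.0
print(gibbs_free_energy(300e3, 290.0, 1e3) > 0.0)  # folded below Tm
```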

  20. Modular pump limiter systems for large tokamaks

    International Nuclear Information System (INIS)

    Uckan, T.; Klepper, C.C.; Mioduszewski, P.K.; McGrath, R.T.

    1987-09-01

    Long-pulse (>10-s) operation of large tokamaks with high-power (>10-MW) heating and extensive external fueling will require correspondingly efficient particle exhaust for density control. A pump limiter can provide the needed exhaust capability by removing a small percentage of the particles that would otherwise be recycled. Single pump limiter modules have been operated successfully on ISX-B, PDX, TEXTOR, and PLT. An axisymmetric pump limiter is now being installed and will be studied in TEXTOR. A third type of pump limiter is a system that consists of several modules and exhibits performance different from that of a single module. To take advantage of the flexibility of a modular pump limiter system in a high-power, long-pulse device, the power load must be distributed among a number of modules. Because each added module changes the performance of all the others, a set of design criteria must be defined for the overall limiter system. The design parameters for the modules are then determined from the system requirements for particle and power removal. Design criteria and parameters are presented, and the impact of the state of the art in engineering technology on module design is discussed. The relationships between modules are considered from the standpoint of flux coverage and shadowing effects. The results are applied to the Tore Supra tokamak. A preliminary conceptual design for the Tore Supra pump limiter system is discussed, and the design parameters of the limiter modules are presented. 21 refs., 12 figs.

  1. Large-scale modelling of neuronal systems

    International Nuclear Information System (INIS)

    Castellani, G.; Verondini, E.; Giampieri, E.; Bersani, F.; Remondini, D.; Milanesi, L.; Zironi, I.

    2009-01-01

    The brain is, without any doubt, the most complex system of the human body. Its complexity is also due to the extremely high number of neurons, as well as the huge number of synapses connecting them. Each neuron is capable of performing complex tasks, like learning and memorizing a large class of patterns. The simulation of large neuronal systems is challenging for both technological and computational reasons, and can open new perspectives for the comprehension of brain functioning. A well-known and widely accepted model of bidirectional synaptic plasticity, the BCM model, is stated by a differential equation approach based on bistability and selectivity properties. We have modified the BCM model, extending it from a single-neuron to a whole-network model. This new model is capable of generating interesting network topologies starting from a small number of local parameters describing the interaction between incoming and outgoing links from each neuron. We have characterized this model in terms of complex network theory, showing how this learning rule can support network generation.
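    The single-neuron BCM rule referred to above can be sketched in a few lines. This is a standard textbook form with a sliding modification threshold; the time constants, Euler integration, and input pattern are illustrative assumptions, and the paper's network extension is not reproduced here:

```python
import numpy as np

def bcm_step(w, x, theta, dt=0.01, tau_theta=0.1):
    """One Euler step of a standard single-neuron BCM rule:
    dw/dt = x * y * (y - theta), with a sliding modification threshold
    dtheta/dt = (y**2 - theta) / tau_theta. Activity above the threshold
    potentiates a synapse, activity below it depresses it, and the
    sliding threshold provides the bistability/selectivity properties."""
    y = float(w @ x)                       # postsynaptic activity
    w_new = w + dt * x * y * (y - theta)
    theta_new = theta + dt * (y * y - theta) / tau_theta
    return w_new, theta_new

# Repeatedly presenting one pattern: the active weight and the
# threshold settle near the selective fixed point y = theta = 1.
w, theta = np.array([0.5, 0.5]), 0.0
x = np.array([1.0, 0.0])
for _ in range(5000):
    w, theta = bcm_step(w, x, theta)
print(theta > 0.5, w[1] == 0.5)  # inactive synapse is untouched
```

    The network model in the abstract couples many such units, so that each neuron's incoming and outgoing links co-evolve under the same rule.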

  2. Integration of biomolecular logic gates with field-effect transducers

    Energy Technology Data Exchange (ETDEWEB)

    Poghossian, A., E-mail: a.poghossian@fz-juelich.de [Institute of Nano- and Biotechnologies, Aachen University of Applied Sciences, Campus Juelich, Heinrich-Mussmann-Str. 1, D-52428 Juelich (Germany); Institute of Bio- and Nanosystems, Research Centre Juelich GmbH, D-52425 Juelich (Germany); Malzahn, K. [Institute of Nano- and Biotechnologies, Aachen University of Applied Sciences, Campus Juelich, Heinrich-Mussmann-Str. 1, D-52428 Juelich (Germany); Abouzar, M.H. [Institute of Nano- and Biotechnologies, Aachen University of Applied Sciences, Campus Juelich, Heinrich-Mussmann-Str. 1, D-52428 Juelich (Germany); Institute of Bio- and Nanosystems, Research Centre Juelich GmbH, D-52425 Juelich (Germany); Mehndiratta, P. [Institute of Nano- and Biotechnologies, Aachen University of Applied Sciences, Campus Juelich, Heinrich-Mussmann-Str. 1, D-52428 Juelich (Germany); Katz, E. [Department of Chemistry and Biomolecular Science, NanoBio Laboratory (NABLAB), Clarkson University, Potsdam, NY 13699-5810 (United States); Schoening, M.J. [Institute of Nano- and Biotechnologies, Aachen University of Applied Sciences, Campus Juelich, Heinrich-Mussmann-Str. 1, D-52428 Juelich (Germany); Institute of Bio- and Nanosystems, Research Centre Juelich GmbH, D-52425 Juelich (Germany)

    2011-11-01

    Highlights: > Enzyme-based AND/OR logic gates are integrated with a capacitive field-effect sensor. > The AND/OR logic gates are composed of multi-enzyme systems immobilised on the sensor surface. > Logic gates were activated by different combinations of chemical inputs (analytes). > The logic output (pH change) produced by the enzymes was read out by the sensor. - Abstract: The integration of biomolecular logic gates with field-effect devices - the basic element of conventional electronic logic gates and computing - is one of the most attractive and promising approaches for the transformation of biomolecular logic principles into macroscopically useable electrical output signals. In this work, capacitive field-effect EIS (electrolyte-insulator-semiconductor) sensors based on a p-Si-SiO₂-Ta₂O₅ structure modified with a multi-enzyme membrane have been used for electronic transduction of biochemical signals processed by enzyme-based OR and AND logic gates. The realised OR logic gate is composed of two enzymes (glucose oxidase and esterase) and was activated by ethyl butyrate or/and glucose. The AND logic gate is composed of three enzymes (invertase, mutarotase and glucose oxidase) and was activated by two chemical input signals: sucrose and dissolved oxygen. The developed integrated enzyme logic gates produce local pH changes at the EIS sensor surface as a result of biochemical reactions activated by different combinations of chemical input signals, while the pH value of the bulk solution remains unchanged. The pH-induced charge changes at the gate-insulator (Ta₂O₅) surface of the EIS transducer result in an electronic signal corresponding to the logic output produced by the immobilised enzymes. The logic output signals have been read out by means of a constant-capacitance method.
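    At the logical level, the enzyme cascades described in the abstract implement ordinary Boolean gates. A minimal truth-table abstraction (ignoring kinetics, thresholds, and the analog pH signal, all of which matter in the real device) looks like this:

```python
def enzyme_or(ethyl_butyrate, glucose):
    """Boolean abstraction of the two-enzyme OR gate: esterase acts on
    ethyl butyrate and glucose oxidase on glucose, and either reaction
    acidifies the sensor surface (output 1 = local pH change)."""
    return int(bool(ethyl_butyrate) or bool(glucose))

def enzyme_and(sucrose, oxygen):
    """Boolean abstraction of the three-enzyme AND gate: invertase and
    mutarotase need sucrose to produce glucose, and glucose oxidase then
    needs dissolved oxygen, so both inputs are required for an output."""
    return int(bool(sucrose) and bool(oxygen))

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
print([enzyme_or(a, b) for a, b in inputs])   # [0, 1, 1, 1]
print([enzyme_and(a, b) for a, b in inputs])  # [0, 0, 0, 1]
```

    The EIS transducer's role is to convert the chemical output bit (the local pH change) into a measurable electronic signal.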

  3. Integration of biomolecular logic gates with field-effect transducers

    International Nuclear Information System (INIS)

    Poghossian, A.; Malzahn, K.; Abouzar, M.H.; Mehndiratta, P.; Katz, E.; Schoening, M.J.

    2011-01-01

    Highlights: → Enzyme-based AND/OR logic gates are integrated with a capacitive field-effect sensor. → The AND/OR logic gates are composed of multi-enzyme systems immobilised on the sensor surface. → Logic gates were activated by different combinations of chemical inputs (analytes). → The logic output (pH change) produced by the enzymes was read out by the sensor. - Abstract: The integration of biomolecular logic gates with field-effect devices - the basic element of conventional electronic logic gates and computing - is one of the most attractive and promising approaches for the transformation of biomolecular logic principles into macroscopically useable electrical output signals. In this work, capacitive field-effect EIS (electrolyte-insulator-semiconductor) sensors based on a p-Si-SiO₂-Ta₂O₅ structure modified with a multi-enzyme membrane have been used for electronic transduction of biochemical signals processed by enzyme-based OR and AND logic gates. The realised OR logic gate is composed of two enzymes (glucose oxidase and esterase) and was activated by ethyl butyrate or/and glucose. The AND logic gate is composed of three enzymes (invertase, mutarotase and glucose oxidase) and was activated by two chemical input signals: sucrose and dissolved oxygen. The developed integrated enzyme logic gates produce local pH changes at the EIS sensor surface as a result of biochemical reactions activated by different combinations of chemical input signals, while the pH value of the bulk solution remains unchanged. The pH-induced charge changes at the gate-insulator (Ta₂O₅) surface of the EIS transducer result in an electronic signal corresponding to the logic output produced by the immobilised enzymes. The logic output signals have been read out by means of a constant-capacitance method.

  4. Sequential decoders for large MIMO systems

    KAUST Repository

    Ali, Konpal S.

    2014-05-01

    Due to their ability to provide high data rates, multiple-input multiple-output (MIMO) systems have become increasingly popular. Decoding these systems with acceptable error performance is computationally very demanding. In this paper, we employ the sequential decoder using the Fano algorithm for large MIMO systems. A parameter called the bias is varied to attain different performance-complexity trade-offs. Low values of the bias result in excellent performance, but at the expense of high complexity, and vice versa for higher bias values. Numerical results show that moderate bias values yield a decent performance-complexity trade-off. We also attempt to bound the error by bounding the bias, using the minimum distance of a lattice. The variation of complexity with SNR exhibits an interesting trend that shows room for considerable improvement. Our work is compared against linear decoders (LDs) aided with element-based lattice reduction (ELR) and complex Lenstra-Lenstra-Lovasz (CLLL) reduction. © 2014 IFIP.
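    The decoder above can be illustrated with a simplified stack-based sequential search, a close relative of the Fano algorithm (a true Fano decoder revisits nodes with threshold tightening, which is omitted here). The QR preprocessing, BPSK alphabet, and matrix values are assumptions for the sketch; the bias term rewards deeper nodes, so larger values cut complexity at some cost in error performance:

```python
import heapq
import numpy as np

def stack_sequential_decode(y, H, symbols=(-1.0, 1.0), bias=0.0):
    """Simplified stack-based sequential decoder for y = H x + n with a
    real BPSK alphabet. After QR decomposition the symbol tree is
    searched best-first; the bias is subtracted per expanded level, so
    larger bias favours deeper (cheaper) paths over exhaustive search."""
    Q, R = np.linalg.qr(H)
    z = Q.T @ y
    n = H.shape[1]
    # Heap entries: (metric, depth, symbols decided so far, last layer first)
    heap = [(0.0, 0, ())]
    while heap:
        metric, depth, partial = heapq.heappop(heap)
        if depth == n:
            return np.array(partial[::-1])   # reorder to x[0..n-1]
        row = n - 1 - depth                  # next row of the triangular system
        for s in symbols:
            cand = partial + (s,)
            decided = np.array(cand[::-1])   # x[row], ..., x[n-1]
            resid = z[row] - R[row, row:] @ decided
            heapq.heappush(heap, (metric + resid**2 - bias, depth + 1, cand))

# Noiseless 2x2 example: the transmitted vector is recovered exactly
H = np.array([[1.0, 0.2], [0.1, 0.9]])
x = np.array([1.0, -1.0])
print(stack_sequential_decode(H @ x, H))
```

    With bias = 0 this behaves like a best-first tree search; raising the bias makes the search greedier, which is the performance-complexity knob the abstract studies.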

  5. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog to digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. These are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining an adequate signal-to-noise ratio. Noise in both amplitude and time-jitter senses is held sufficiently low that conversions with 10-bit charge resolution and 12-bit time resolution are achieved.

  6. Biomolecular Markers in Cancer of the Tongue

    Directory of Open Access Journals (Sweden)

    Daris Ferrari

    2009-01-01

    The incidence of tongue cancer is increasing worldwide, and its aggressiveness remains high regardless of treatment. Genetic changes and the expression of abnormal proteins have been frequently reported in the case of head and neck cancers, but the little information that has been published concerning tongue tumours is often contradictory. This review will concentrate on the immunohistochemical expression of biomolecular markers and their relationships with clinical behaviour and prognosis. Most of these proteins are associated with nodal stage, tumour progression and metastases, but there is still controversy concerning their impact on disease-free and overall survival, and treatment response. More extensive clinical studies are needed to identify the patterns of molecular alterations and the most reliable predictors in order to develop tailored anti-tumour strategies based on the targeting of hypoxia markers, vascular and lymphangiogenic factors, epidermal growth factor receptors, intracytoplasmatic signalling and apoptosis.

  7. Micro- and nanodevices integrated with biomolecular probes.

    Science.gov (United States)

    Alapan, Yunus; Icoz, Kutay; Gurkan, Umut A

    2015-12-01

    Understanding how biomolecules, proteins and cells interact with their surroundings and other biological entities has become the fundamental design criterion for most biomedical micro- and nanodevices. Advances in biology, medicine, and nanofabrication technologies complement each other and allow us to engineer new tools based on biomolecules utilized as probes. Engineered micro/nanosystems and biomolecules in nature have remarkably robust compatibility in terms of function, size, and physical properties. This article presents the state of the art in micro- and nanoscale devices designed and fabricated with biomolecular probes as their vital constituents. General design and fabrication concepts are presented and three major platform technologies are highlighted: microcantilevers, micro/nanopillars, and microfluidics. Overview of each technology, typical fabrication details, and application areas are presented by emphasizing significant achievements, current challenges, and future opportunities. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Poisson-Nernst-Planck Equations for Simulating Biomolecular Diffusion-Reaction Processes I: Finite Element Solutions.

    Science.gov (United States)

    Lu, Benzhuo; Holst, Michael J; McCammon, J Andrew; Zhou, Y C

    2010-09-20

    In this paper we developed accurate finite element methods for solving 3-D Poisson-Nernst-Planck (PNP) equations with singular permanent charges for electrodiffusion in solvated biomolecular systems. The electrostatic Poisson equation was defined in the biomolecules and in the solvent, while the Nernst-Planck equation was defined only in the solvent. We applied a stable regularization scheme to remove the singular component of the electrostatic potential induced by the permanent charges inside biomolecules, and formulated regular, well-posed PNP equations. An inexact-Newton method was used to solve the coupled nonlinear elliptic equations for the steady problems, while an Adams-Bashforth-Crank-Nicolson method was devised for time integration of the unsteady electrodiffusion. We numerically investigated the conditioning of the stiffness matrices for the finite element approximations of the two formulations of the Nernst-Planck equation, and theoretically proved that the transformed formulation is always associated with an ill-conditioned stiffness matrix. We also studied the electroneutrality of the solution and its relation to the boundary conditions on the molecular surface, and concluded that a large net charge concentration is always present near the molecular surface due to the presence of multiple species of charged particles in the solution. The numerical methods are shown to be accurate and stable on various test problems, and are applicable to real large-scale biophysical electrodiffusion problems.
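    A 3-D finite-element PNP solver is beyond a short example, but the electroneutrality observation in the abstract, a large net charge concentration near a charged surface, can be reproduced with a 1-D nonlinear Poisson-Boltzmann analogue solved by Newton iteration on a finite-difference grid. Everything here, including the dimensionless units and boundary values, is a simplified stand-in for the paper's method:

```python
import numpy as np

def poisson_boltzmann_1d(phi0, n=200, L=10.0, newton_iters=50):
    """Dimensionless 1-D Poisson-Boltzmann equation phi'' = sinh(phi)
    for a 1:1 electrolyte, with phi(0) = phi0 at a charged surface and
    phi(L) = 0 in the bulk, solved by Newton iteration on a
    finite-difference grid (a simplified stand-in for 3-D FEM PNP)."""
    h = L / (n - 1)
    phi = np.linspace(phi0, 0.0, n)          # initial guess
    for _ in range(newton_iters):
        F = np.zeros(n)
        J = np.zeros((n, n))
        F[0], J[0, 0] = phi[0] - phi0, 1.0   # Dirichlet at the surface
        F[-1], J[-1, -1] = phi[-1], 1.0      # Dirichlet in the bulk
        for i in range(1, n - 1):
            F[i] = (phi[i-1] - 2.0*phi[i] + phi[i+1]) / h**2 - np.sinh(phi[i])
            J[i, i-1] = J[i, i+1] = 1.0 / h**2
            J[i, i] = -2.0 / h**2 - np.cosh(phi[i])
        step = np.linalg.solve(J, -F)
        phi = phi + step
        if np.max(np.abs(step)) < 1e-10:
            break
    return phi, -2.0 * np.sinh(phi)          # net charge density rho(x)

phi, rho = poisson_boltzmann_1d(2.0)
# Electroneutrality fails near the surface but holds in the bulk
print(abs(rho[1]) > 1.0, abs(rho[-2]) < 1e-3)
```

    The counterion excess decays over roughly one (dimensionless) Debye length, which mirrors the surface charge layer the abstract describes at the molecular boundary.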

  9. Large-scale stochasticity in Hamiltonian systems

    International Nuclear Information System (INIS)

    Escande, D.F.

    1982-01-01

    Large scale stochasticity (L.S.S.) in Hamiltonian systems is defined on the paradigm Hamiltonian H(v,x,t) = v²/2 - M cos x - P cos k(x-t), which describes the motion of one particle in two electrostatic waves. A renormalization transformation T_r is described which acts as a microscope that focusses on a given KAM (Kolmogorov-Arnold-Moser) torus in phase space. Though approximate, T_r yields the threshold of L.S.S. in H with an error of 5-10%. The universal behaviour of KAM tori is predicted: for instance, the scale invariance of KAM tori and the critical exponent of the Lyapunov exponent of cantori. The Fourier expansion of KAM tori is computed and several conjectures by L. Kadanoff and S. Shenker are proved. Chirikov's standard mapping for stochastic layers is derived in a simpler way and the width of the layers is computed. A simpler renormalization scheme for these layers is defined. A Mathieu equation for describing the stability of a discrete family of cycles is derived. When combined with T_r, it allows one to prove the link between KAM tori and nearby cycles, conjectured by J. Greene, and, in particular, to compute the mean residue of a torus. The fractal diagrams defined by G. Schmidt are computed. A sketch of a methodology for computing the L.S.S. threshold in any two-degree-of-freedom Hamiltonian system is given. (Auth.)

  10. SLAP, Large Sparse Linear System Solution Package

    International Nuclear Information System (INIS)

    Greenbaum, A.

    1987-01-01

    1 - Description of program or function: SLAP is a set of routines for solving large sparse systems of linear equations. One need not store the entire matrix - only the nonzero elements and their row and column numbers. Any nonzero structure is acceptable, so the linear system solver need not be modified when the structure of the matrix changes. Auxiliary storage space is acquired and released within the routines themselves by use of the LRLTRAN POINTER statement. 2 - Method of solution: SLAP contains one direct solver, a band matrix factorization and solution routine, BAND, and several iterative solvers. The iterative routines are as follows: JACOBI, Jacobi iteration; GS, Gauss-Seidel iteration; ILUIR, incomplete LU decomposition with iterative refinement; DSCG and ICCG, diagonal scaling and incomplete Cholesky decomposition with conjugate gradient iteration (for symmetric positive definite matrices only); DSCGN and ILUGGN, diagonal scaling and incomplete LU decomposition with conjugate gradient iteration on the normal equations; DSBCG and ILUBCG, diagonal scaling and incomplete LU decomposition with bi-conjugate gradient iteration; and DSOMN and ILUOMN, diagonal scaling and incomplete LU decomposition with ORTHOMIN iteration.
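    The storage scheme described, keeping only the nonzero elements with their row and column numbers, is coordinate (COO) format, and the simplest of the listed iterative routines, Jacobi iteration, can be sketched directly on it. This is a Python illustration of the idea, not SLAP's FORTRAN code, and the matrix values are invented:

```python
import numpy as np

def jacobi_coo(rows, cols, vals, b, iters=100):
    """Jacobi iteration on a sparse matrix stored in coordinate (COO)
    form: only the nonzero values and their row/column indices are
    kept, as in SLAP's storage scheme."""
    n = len(b)
    diag = np.zeros(n)
    for r, c, v in zip(rows, cols, vals):
        if r == c:
            diag[r] = v
    x = np.zeros(n)
    for _ in range(iters):
        acc = np.array(b, dtype=float)
        for r, c, v in zip(rows, cols, vals):
            if r != c:
                acc[r] -= v * x[c]       # subtract off-diagonal terms
        x = acc / diag                   # divide by the diagonal
    return x

# Diagonally dominant 3x3 system with solution [1, 2, 1]
rows = [0, 0, 1, 1, 1, 2, 2]
cols = [0, 1, 0, 1, 2, 1, 2]
vals = [4.0, 1.0, 1.0, 4.0, 1.0, 1.0, 4.0]
b = [6.0, 10.0, 6.0]
x = jacobi_coo(rows, cols, vals, b)
print(np.round(x, 6))  # [1. 2. 1.]
```

    Because the iteration only ever touches the stored nonzeros, changing the sparsity pattern requires no change to the solver, which is the point the package description makes.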

  11. Viewing Systems for Large Underground Storage Tanks

    International Nuclear Information System (INIS)

    Heckendorn, F.M.; Robinson, C.W.; Anderson, E.K.; Pardini, A.F.

    1996-01-01

    Specialized remote video systems have been successfully developed and deployed in a number of large radiological Underground Storage Tanks (USTs) that tolerate the hostile tank interior while providing high-resolution video to a remotely located operator. The systems are deployed through 100 mm (4 in) tank openings, while incorporating full video functions of the camera, lights, and zoom lens. The use of remote video minimizes the potential for personnel exposure to radiological and hazardous conditions, and maximizes the quality of the visual data used to assess the interior conditions of both tank and contents. The robustness of this type of remote system has a direct effect on the potential for radiological exposure that personnel may encounter. The USTs typical of the Savannah River and Hanford Department of Energy (DOE) sites are 4.5 million liter (1.2 million gal) units under earth or concrete overburden with limited openings to the surface. The interior is both highly contaminated and radioactive, with a wide variety of nuclear processing waste material. Some of the tanks are flammable-rated to Class 1, Division 1, and personnel presence at or near the openings should be minimized. The interior of these USTs must be assessed periodically as part of the ongoing management of the tanks and as a step towards tank remediation. The systems are unique in their deployment technology, which virtually eliminates the potential for entrapment in a tank, and in their ability to withstand flammable environments. A multiplicity of components used within a common packaging allows for cost-effective and appropriate levels of technology, with radiation-hardened components on some units and lesser requirements on others. All units are completely self-contained for video, zoom lens, lighting, and deployment, as well as being self-purging and modular in construction

  12. DNA algorithms of implementing biomolecular databases on a biological computer.

    Science.gov (United States)

    Chang, Weng-Long; Vasilakos, Athanasios V

    2015-01-01

    In this paper, DNA algorithms are proposed to perform eight operations of relational algebra (calculus), which include Cartesian product, union, set difference, selection, projection, intersection, join, and division, on biomolecular relational databases.
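
    The eight operations named are standard relational algebra; the paper's contribution is implementing them with biomolecular (DNA) operations. Purely to fix the semantics those DNA algorithms compute, here is an in-silico sketch on ordinary Python sets of tuples:

```python
# In-silico semantics only (the paper implements these with DNA strands).
# Relations are sets of tuples.

def select(r, pred):     return {t for t in r if pred(t)}
def project(r, cols):    return {tuple(t[c] for c in cols) for t in r}
def union(r, s):         return r | s
def difference(r, s):    return r - s
def intersection(r, s):  return r & s
def product(r, s):       return {t + u for t in r for u in s}
def join(r, s, i, j):    # equijoin on r's column i and s's column j
    return {t + u for t in r for u in s if t[i] == u[j]}

def division(r, s):
    # Tuples h such that h + u is in r for every u in s
    # (s supplies the trailing columns of r).
    width = len(next(iter(s)))
    heads = {t[:-width] for t in r}
    return {h for h in heads if all(h + u in r for u in s)}

R = {("a", 1), ("a", 2), ("b", 1)}
S = {(1,), (2,)}
assert division(R, S) == {("a",)}
assert project(select(R, lambda t: t[1] == 1), (0,)) == {("a",), ("b",)}
```

    Note that this toy `join` keeps both copies of the joined column; a real implementation would project one away.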

  13. Unique temporal and spatial biomolecular emission profile on individual zinc oxide nanorods

    Science.gov (United States)

    Singh, Manpreet; Song, Sheng; Hahm, Jong-In

    2013-12-01

    Zinc oxide nanorods (ZnO NRs) have emerged in recent years as extremely useful, optical signal-enhancing platforms in DNA and protein detection. Although the use of ZnO NRs in biodetection has been demonstrated so far in systems involving many ZnO NRs per detection element, their future applications will likely take place in a miniaturized setting while exploiting single ZnO NRs in a low-volume, high-throughput bioanalysis. In this paper, we investigate temporal and spatial characteristics of the biomolecular fluorescence on individual ZnO NR systems. Quantitative and qualitative examinations of the biomolecular intensity and photostability are carried out as a function of two important criteria, the time and position along the long axis (length) of NRs. Photostability profiles are also measured with respect to the position on NRs and compared to those characteristics of biomolecules on polymeric control platforms. Unlike the uniformly distributed signal observed on the control platforms, both the fluorescence intensity and photostability are position-dependent on individual ZnO NRs. We have identified a unique phenomenon of highly localized, fluorescence intensification on the nanorod ends (FINE) of well-characterized, individual ZnO nanostructures. When compared to the polymeric controls, the biomolecular fluorescence intensity and photostability are determined to be higher on individual ZnO NRs regardless of the position on NRs. We have also carried out finite-difference time-domain simulations, the results of which are in good agreement with the observed FINE. The outcomes of our investigation will offer a much needed basis for signal interpretation for biodetection devices and platforms consisting of single ZnO NRs and, at the same time, contribute significantly to provide insight in understanding the biomolecular fluorescence observed from ZnO NR ensemble-based systems.

  14. Environmental effects and large space systems

    Science.gov (United States)

    Garrett, H. B.

    1981-01-01

    When planning large-scale operations in space, environmental impact must be considered in addition to radiation, spacecraft charging, contamination, high power and size. Pollution of the atmosphere and space is caused by rocket effluents and by photoelectrons generated by sunlight falling on satellite surfaces; even light pollution may result (the SPS may reflect so much light as to be a nuisance to astronomers). Large (100 km²) structures also will absorb the high-energy particles that impinge on them. Altogether, these effects may drastically alter the Earth's magnetosphere. It is not clear whether these alterations will in any way affect the Earth's surface climate. Large structures will also generate large plasma wakes and waves which may cause interference with communications to the vehicle. A high-energy microwave beam from the SPS will cause ionospheric turbulence, affecting UHF and VHF communications. Although none of these effects may ultimately prove critical, they must be considered in the design of large structures.

  15. HPDB-Haskell library for processing atomic biomolecular structures in Protein Data Bank format.

    Science.gov (United States)

    Gajda, Michał Jan

    2013-11-23

    The Protein Data Bank file format is used for the majority of biomolecular data available today. Haskell is a lazy functional language that enjoys a high-level class-based type system, a growing collection of useful libraries and a reputation for efficiency. I present a fast library for processing biomolecular data in the Protein Data Bank format. I present benchmarks indicating that this library is faster than other frequently used Protein Data Bank parsing programs. The proposed library also features a convenient iterator mechanism, and a simple API modeled after BioPython. I set a new standard for convenience and efficiency of Protein Data Bank processing in a Haskell library, and release it to open source.

  16. Cytoscape: a software environment for integrated models of biomolecular interaction networks.

    Science.gov (United States)

    Shannon, Paul; Markiel, Andrew; Ozier, Owen; Baliga, Nitin S; Wang, Jonathan T; Ramage, Daniel; Amin, Nada; Schwikowski, Benno; Ideker, Trey

    2003-11-01

    Cytoscape is an open source software project for integrating biomolecular interaction networks with high-throughput expression data and other molecular states into a unified conceptual framework. Although applicable to any system of molecular components and interactions, Cytoscape is most powerful when used in conjunction with large databases of protein-protein, protein-DNA, and genetic interactions that are increasingly available for humans and model organisms. Cytoscape's software Core provides basic functionality to layout and query the network; to visually integrate the network with expression profiles, phenotypes, and other molecular states; and to link the network to databases of functional annotations. The Core is extensible through a straightforward plug-in architecture, allowing rapid development of additional computational analyses and features. Several case studies of Cytoscape plug-ins are surveyed, including a search for interaction pathways correlating with changes in gene expression, a study of protein complexes involved in cellular recovery to DNA damage, inference of a combined physical/functional interaction network for Halobacterium, and an interface to detailed stochastic/kinetic gene regulatory models.

  17. Developing the Biomolecular Screening Facility at the EPFL into the Chemical Biology Screening Platform for Switzerland.

    Science.gov (United States)

    Turcatti, Gerardo

    2014-05-01

    The Biomolecular Screening Facility (BSF) is a multidisciplinary laboratory created in 2006 at the Ecole Polytechnique Federale de Lausanne (EPFL) to perform medium and high throughput screening in life sciences-related projects. The BSF was conceived and developed to meet the needs of a wide range of researchers, without privileging a particular biological discipline or therapeutic area. The facility has the necessary infrastructure, multidisciplinary expertise and flexibility to perform large screening programs using small interfering RNAs (siRNAs) and chemical collections in the areas of chemical biology, systems biology and drug discovery. In the framework of the National Centres of Competence in Research (NCCR) Chemical Biology, the BSF is hosting 'ACCESS', the Academic Chemical Screening Platform of Switzerland that provides the scientific community with chemical diversity, screening facilities and know-how in chemical genetics. In addition, the BSF started its own applied research axes that are driven by innovation in thematic areas related to preclinical drug discovery and discovery of bioactive probes.

  18. Large solar energy systems within IEA task 14

    NARCIS (Netherlands)

    Geus, A.C. de; Isakson, P.; Bokhoven, T.P.; Vanoli, K.; Tepe, R.

    1996-01-01

    Within IEA Task 14 (Advanced Solar Systems) a working group was established dealing with large advanced solar energy systems (the Large Systems Working group). The goal of this working group was to generate a common base of experiences for the design and construction of advanced large solar systems.

  19. Electrochemical sensor for multiplex screening of genetically modified DNA: identification of biotech crops by logic-based biomolecular analysis.

    Science.gov (United States)

    Liao, Wei-Ching; Chuang, Min-Chieh; Ho, Ja-An Annie

    2013-12-15

    The genetically modified (GM) technique, one of the modern biomolecular engineering technologies, has been deemed a profitable strategy in the fight against global starvation. Yet rapid and reliable analytical methods to evaluate the quality and potential risk of the resulting GM products are lacking. We herein present a biomolecular analytical system constructed with distinct biochemical activities to expedite the computational detection of genetically modified organisms (GMOs). The computational mechanism provides an alternative to the complex procedures commonly involved in the screening of GMOs. Given that the bioanalytical system is capable of processing promoter, coding and species genes, affirmative interpretations succeed in identifying a specified GM event in both electrochemical and optical fashions. The biomolecular computational assay exhibits a detection capability for genetically modified DNA below the sub-nanomolar level and is found to be interference-free under abundant coexistence of non-GM DNA. This bioanalytical system, furthermore, operates in an array fashion for multiplex screening against variable GM events. Such a biomolecular computational assay and biosensor holds great promise for rapid, cost-effective, and high-fidelity screening of GMOs. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Optimization of Large-Scale Structural Systems

    DEFF Research Database (Denmark)

    Jensen, F. M.

    solutions to small problems with one or two variables to the optimization of large structures such as bridges, ships and offshore structures. The methods used for solving these problems have evolved from being classical differential calculus and calculus of variations to very advanced numerical techniques...

  1. 75 FR 21455 - Large Trader Reporting System

    Science.gov (United States)

    2010-04-23

    ... essence, a ``large trader'' would be defined as a person whose transactions in NMS securities equal or... directly or indirectly effect securities transactions.\\14\\ \\12\\ Section 13(h) of the Exchange Act defines a... term ``identifying activity level'' is defined in Section 13(h) as ``transactions in publicly traded...

  2. Microfluidic Devices for Studying Biomolecular Interactions

    Science.gov (United States)

    Wilson, Wilbur W.; Garcia, Carlos d.; Henry, Charles S.

    2006-01-01

    Microfluidic devices for monitoring biomolecular interactions have been invented. These devices are basically highly miniaturized liquid-chromatography columns. They are intended to be prototypes of miniature analytical devices of the laboratory on a chip type that could be fabricated rapidly and inexpensively and that, because of their small sizes, would yield analytical results from very small amounts of expensive analytes (typically, proteins). Other advantages to be gained by this scaling down of liquid-chromatography columns may include increases in resolution and speed, decreases in the consumption of reagents, and the possibility of performing multiple simultaneous and highly integrated analyses by use of multiple devices of this type, each possibly containing multiple parallel analytical microchannels. The principle of operation is the same as that of a macroscopic liquid-chromatography column: The column is a channel packed with particles, upon which are immobilized molecules of the protein of interest (or one of the proteins of interest if there are more than one). Starting at a known time, a solution or suspension containing molecules of the protein or other substance of interest is pumped into the channel at its inlet. The liquid emerging from the outlet of the channel is monitored to detect the molecules of the dissolved or suspended substance(s). The time that it takes these molecules to flow from the inlet to the outlet is a measure of the degree of interaction between the immobilized and the dissolved or suspended molecules. Depending on the precise natures of the molecules, this measure can be used for diverse purposes: examples include screening for solution conditions that favor crystallization of proteins, screening for interactions between drugs and proteins, and determining the functions of biomolecules.

  3. Fire extinguishing system in large underground garages

    Directory of Open Access Journals (Sweden)

    Ivan Antonov

    2017-04-01

    Full Text Available This work considers a practically acceptable structural scheme for fire extinguishing in underground garages. The garage space is divided into quadrants, each covering, for example, two cars. In case of ignition in one of them, a sprinkler nozzle system is triggered by the effect of the vertical convective jet. A protective curtain preventing the spread of fire to adjacent vehicles is realized. The solution is based on an integrated method which allows the extinguishing time of the fire extinguishing system to be calculated from a hydrodynamic point of view.

  4. The Design of Large Technological Systems

    DEFF Research Database (Denmark)

    Pineda, Andres Felipe Valderrama

    implies a reconfiguration of the designing team, the supporting actors and the diverse user groups. By tracing material scripts, the author accounts for the unfolding of visions, politics and materialities that constitute the system. The analysis contributes to understanding the complex sociotechnical...

  5. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: the manager carries a heavy workload, and much time must be spent on the management and maintenance of the system. The nodes in a large-scale cluster system are very easy to confuse; with thousands of nodes placed in big rooms, managers can easily mix up machines. How can accurate management be carried out effectively on a large-scale cluster system? This article introduces ELFms for large-scale cluster systems and, furthermore, proposes how to realize automatic management of such systems. (authors)

  6. Iterative solution of large linear systems

    CERN Document Server

    Young, David Matheson

    1971-01-01

    This self-contained treatment offers a systematic development of the theory of iterative methods. Its focal point resides in an analysis of the convergence properties of the successive overrelaxation (SOR) method, as applied to a linear system with a consistently ordered matrix. The text explores the convergence properties of the SOR method and related techniques in terms of the spectral radii of the associated matrices as well as in terms of certain matrix norms. Contents include a review of matrix theory and general properties of iterative methods; the SOR method and stationary modified SOR methods...
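
    The SOR method analysed in the book augments a Gauss-Seidel sweep with a relaxation factor ω (ω = 1 recovers plain Gauss-Seidel; convergence for symmetric positive definite systems requires 0 < ω < 2). A minimal sketch, with an arbitrarily chosen ω and test matrix:

```python
# Minimal SOR sketch: each in-place update is a blend of the old value
# and the Gauss-Seidel value, weighted by the relaxation factor omega.

def sor(A, b, omega=1.25, iters=200):
    """omega in (0, 2); omega = 1 reduces to Gauss-Seidel."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            gs = (b[i] - s) / A[i][i]          # Gauss-Seidel candidate
            x[i] = (1.0 - omega) * x[i] + omega * gs
    return x

A = [[4.0, 1.0], [1.0, 3.0]]                    # SPD test system
b = [9.0, 7.0]
x = sor(A, b)                                   # approaches [20/11, 19/11]
```

    The book's central question is precisely how the spectral radius of the iteration matrix, and hence the convergence rate, depends on ω.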

  7. Laser photodissociation and spectroscopy of mass-separated biomolecular ions

    CERN Document Server

    Polfer, Nicolas C

    2014-01-01

    This lecture notes book presents how enhanced structural information of biomolecular ions can be obtained from interaction with photons of specific frequency - laser light. The methods described in the book "Laser photodissociation and spectroscopy of mass-separated biomolecular ions" make use of the fact that the discrete energy and fast time scale of photoexcitation can provide more control in ion activation. This activation is the crucial process producing structure-informative product ions that cannot be generated with more conventional heating methods, such as collisional activation. Th

  8. Large computer systems and new architectures

    International Nuclear Information System (INIS)

    Bloch, T.

    1978-01-01

    The super-computers of today are becoming quite specialized and one can no longer expect to get all the state-of-the-art software and hardware facilities in one package. In order to achieve faster and faster computing it is necessary to experiment with new architectures, and the cost of developing each experimental architecture into a general-purpose computer system is too high when one considers the relatively small market for these computers. The result is that such computers are becoming 'back-ends' either to special systems (BSP, DAP) or to anything (CRAY-1). Architecturally the CRAY-1 is the most attractive today since it guarantees a speed gain of a factor of two over a CDC 7600 thus allowing us to regard any speed up resulting from vectorization as a bonus. It looks, however, as if it will be very difficult to make substantially faster computers using only pipe-lining techniques and that it will be necessary to explore multiple processors working on the same problem. The experience which will be gained with the BSP and the DAP over the next few years will certainly be most valuable in this respect. (Auth.)

  9. Performance regression manager for large scale systems

    Science.gov (United States)

    Faraj, Daniel A.

    2017-08-01

    System and computer program product to perform an operation comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.
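
    The claimed operation amounts to writing each run's metrics in a predefined format and comparing the two files metric by metric. A toy sketch of that comparison step; the "name value" line format and the 5% regression threshold are both invented here, as the patent abstract specifies neither:

```python
# Toy performance-regression check. Assumes every metric is
# higher-is-better; real tools track direction per metric.

def parse_metrics(text):
    """Parse the assumed predefined format: one 'name value' per line."""
    return {name: float(value) for name, value in
            (line.split() for line in text.strip().splitlines())}

def compare_runs(first, second, tolerance=0.05):
    """Flag metrics in `first` more than `tolerance` below `second`."""
    a, b = parse_metrics(first), parse_metrics(second)
    return {name: ("REGRESSION" if (a[name] - b[name]) / b[name] < -tolerance
                   else "ok")
            for name in sorted(a.keys() & b.keys())}

run_new = "throughput 900\nlatency_ms 12\n"     # first execution instance
run_old = "throughput 1000\nlatency_ms 12\n"    # second execution instance
report = compare_runs(run_new, run_old)
# throughput fell 10% -> "REGRESSION"; latency_ms unchanged -> "ok"
```

    The shared, predefined format is what lets outputs of different command executions be compared mechanically.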

  10. Magmatic systems of large continental igneous provinces

    Directory of Open Access Journals (Sweden)

    E. Sharkov

    2017-07-01

    Full Text Available Large igneous provinces (LIPs) formed by mantle superplume events have irreversibly changed their composition in the geological evolution of the Earth from high-Mg melts (during the Archean and early Paleoproterozoic) to Phanerozoic-type geochemically enriched Fe-Ti basalts and picrites at 2.3 Ga. We propose that this upheaval could be related to the change in the source and nature of the mantle superplumes of different generations. The first generation plumes were derived from the depleted mantle, whereas the second generation (thermochemical) originated from the core-mantle boundary (CMB). This study mainly focuses on the second (Phanerozoic) type of LIPs, as exemplified by the mid-Paleoproterozoic Jatulian–Ludicovian LIP in the Fennoscandian Shield, the Permian–Triassic Siberian LIP, and the late Cenozoic flood basalts of Syria. The latter LIP contains mantle xenoliths represented by green and black series. These xenoliths are fragments of cooled upper margins of the mantle plume heads, above zones of adiabatic melting, and provide information about the composition of the plume material and processes in the plume head. Based on previous studies on the composition of mantle xenoliths in within-plate basalts around the world, it is inferred that the heads of the mantle (thermochemical) plumes are made up of moderately depleted spinel peridotites (mainly lherzolites) and geochemically enriched intergranular fluid/melt. Further, it is presumed that the plume heads intrude the mafic lower crust and reach up to the bottom of the upper crust at depths of ∼20 km. The generation of two major types of mantle-derived magmas (alkali and tholeiitic basalts) was previously attributed to processes related to different PT-parameters in the adiabatic melting zone, whereas this study relates it to the fluid regime in the plume heads. It is also suggested that a newly-formed melt can occur on different sides of a critical plane of silica undersaturation and can

  11. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from the intermittent wind in large proportion may impact on the control of power system balance and thus deviations in the power system... frequency in small or islanded power systems or tie line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns the secure and stable grid operation. To ensure the stable power system operation, the evolving power system has... to be analysed with improved analytical tools and techniques. This paper proposes techniques for the active power balance control in future power systems with the large scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...

  12. Sop-GPU: accelerating biomolecular simulations in the centisecond timescale using graphics processors.

    Science.gov (United States)

    Zhmurov, A; Dima, R I; Kholodov, Y; Barsegov, V

    2010-11-01

    Theoretical exploration of fundamental biological processes involving the forced unraveling of multimeric proteins, the sliding motion in protein fibers and the mechanical deformation of biomolecular assemblies under physiological force loads is challenging even for distributed computing systems. Using a Cα-based coarse-grained self-organized polymer (SOP) model, we implemented the Langevin simulations of proteins on graphics processing units (SOP-GPU program). We assessed the computational performance of an end-to-end application of the program, where all the steps of the algorithm are running on a GPU, by profiling the simulation time and memory usage for a number of test systems. The ∼90-fold computational speedup on a GPU, compared with an optimized central processing unit program, enabled us to follow the dynamics in the centisecond timescale, and to obtain the force-extension profiles using experimental pulling speeds (v_f = 1-10 μm/s) employed in atomic force microscopy and in optical tweezers-based dynamic force spectroscopy. We found that the mechanical molecular response critically depends on the conditions of force application and that the kinetics and pathways for unfolding change drastically even upon a modest 10-fold increase in v_f. This implies that, to resolve accurately the free energy landscape and to relate the results of single-molecule experiments in vitro and in silico, molecular simulations should be carried out under the experimentally relevant force loads. This can be accomplished in reasonable wall-clock time for biomolecules of size as large as 10^5 residues using the SOP-GPU package. © 2010 Wiley-Liss, Inc.
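
    Coarse-grained models of this kind evolve beads under Langevin dynamics. A toy, CPU-only sketch of one overdamped (Brownian dynamics) step for a 1-D harmonic bead-spring chain follows; it is a stand-in illustration of the general scheme, not the SOP force field or the SOP-GPU code, and all parameter values are arbitrary:

```python
# Toy Brownian dynamics: deterministic spring forces plus a Gaussian
# thermal kick of amplitude sqrt(2 kT dt / gamma) per step.
import math
import random

def bd_step(x, k=10.0, r0=1.0, dt=1e-3, gamma=1.0, kT=1.0):
    """One overdamped step for beads x[0..n-1] joined by springs (k, r0)."""
    n = len(x)
    f = [0.0] * n
    for i in range(n - 1):                     # spring between beads i, i+1
        stretch = (x[i + 1] - x[i]) - r0
        f[i] += k * stretch
        f[i + 1] -= k * stretch
    sigma = math.sqrt(2.0 * kT * dt / gamma)   # thermal noise amplitude
    return [xi + fi * dt / gamma + sigma * random.gauss(0.0, 1.0)
            for xi, fi in zip(x, f)]

random.seed(0)
x = [float(i) for i in range(5)]               # 5 beads at rest spacing
for _ in range(2000):
    x = bd_step(x)
bonds = [b - a for a, b in zip(x, x[1:])]      # fluctuate around r0 = 1
```

    The GPU advantage reported above comes from evaluating the force loop and the per-bead updates for many beads (and many trajectories) in parallel.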

  13. An advanced dispatching technology for large container inspection system

    International Nuclear Information System (INIS)

    Chen Zhiqiang; Zhang Li; Kang Kejun; Gao Wenhuan

    2001-01-01

    The author describes the transmitting and dispatching technology of a large container inspection system. The structure of the double-buffer graded pipelining used in the system is introduced, and strategies of the queue mechanism and the waiting-dispatch policy are illustrated

  14. Estimating the state of large spatio-temporally chaotic systems

    International Nuclear Information System (INIS)

    Ott, E.; Hunt, B.R.; Szunyogh, I.; Zimin, A.V.; Kostelich, E.J.; Corazza, M.; Kalnay, E.; Patil, D.J.; Yorke, J.A.

    2004-01-01

    We consider the estimation of the state of a large spatio-temporally chaotic system from noisy observations and knowledge of a system model. Standard state estimation techniques using the Kalman filter approach are not computationally feasible for systems with very many effective degrees of freedom. We present and test a new technique (called a Local Ensemble Kalman Filter), generally applicable to large spatio-temporally chaotic systems for which correlations between system variables evaluated at different points become small at large separation between the points
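
    The ensemble approach replaces the full Kalman covariance, intractable for very many degrees of freedom, with a sample covariance over a modest ensemble; the "local" variant then applies such updates independently in small spatial neighbourhoods. A minimal global, single-observation sketch (illustrative only, not the authors' localized algorithm; the ensemble size and noise levels are arbitrary):

```python
# Stochastic ensemble Kalman filter update with one observed variable.
import random

def enkf_update(ensemble, y, obs_var, h):
    """Update `ensemble` (list of state vectors) toward observation y
    of state variable index `h` with observation variance obs_var."""
    m, n = len(ensemble), len(ensemble[0])
    mean = [sum(e[i] for e in ensemble) / m for i in range(n)]
    # Sample (co)variances estimated from the ensemble itself.
    var_h = sum((e[h] - mean[h]) ** 2 for e in ensemble) / (m - 1)
    cov = [sum((e[i] - mean[i]) * (e[h] - mean[h]) for e in ensemble)
           / (m - 1) for i in range(n)]
    gain = [c / (var_h + obs_var) for c in cov]        # Kalman gain column
    updated = []
    for e in ensemble:                                  # perturbed observations
        innov = y + random.gauss(0.0, obs_var ** 0.5) - e[h]
        updated.append([e[i] + gain[i] * innov for i in range(n)])
    return updated

random.seed(1)
ens = [[random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)] for _ in range(100)]
ens = enkf_update(ens, y=2.0, obs_var=0.25, h=0)
mean0 = sum(e[0] for e in ens) / len(ens)               # pulled toward y = 2
```

    Restricting each update to a local patch of variables is what keeps the cost feasible when the state has very many effective degrees of freedom.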

  15. GROMOS++Software for the Analysis of Biomolecular Simulation Trajectories

    NARCIS (Netherlands)

    Eichenberger, A.P.; Allison, J.R.; Dolenc, J.; Geerke, D.P.; Horta, B.A.C.; Meier, K; Oostenbrink, B.C.; Schmid, N.; Steiner, D; Wang, D.; van Gunsteren, W.F.

    2011-01-01

    GROMOS++ is a set of C++ programs for pre- and postprocessing of molecular dynamics simulation trajectories and as such is part of the GROningen MOlecular Simulation software for (bio)molecular simulation. It contains more than 70 programs that can be used to prepare data for the production of

  16. The HADDOCK web server for data-driven biomolecular docking

    NARCIS (Netherlands)

    de Vries, S.J.|info:eu-repo/dai/nl/304837717; van Dijk, M.|info:eu-repo/dai/nl/325811113; Bonvin, A.M.J.J.|info:eu-repo/dai/nl/113691238

    2010-01-01

    Computational docking is the prediction or modeling of the three-dimensional structure of a biomolecular complex, starting from the structures of the individual molecules in their free, unbound form. HADDOCK is a popular docking program that takes a data-driven approach to docking, with support for

  17. Biomolecular strategies for cell surface engineering

    Science.gov (United States)

    Wilson, John Tanner

    Islet transplantation has emerged as a promising cell-based therapy for the treatment of diabetes, but its clinical efficacy remains limited by deleterious host responses that underlie islet destruction. In this dissertation, we describe the assembly of ultrathin conformal coatings that confer molecular-level control over the composition and biophysicochemical properties of the islet surface with implications for improving islet engraftment. Significantly, this work provides novel biomolecular strategies for cell surface engineering with broad biomedical and biotechnological applications in cell-based therapeutics and beyond. Encapsulation of cells and tissue offers a rational approach for attenuating deleterious host responses towards transplanted cells, but a need exists to develop cell encapsulation strategies that minimize transplant volume. Towards this end, we endeavored to generate nanothin films of diverse architecture with tunable properties on the extracellular surface of individual pancreatic islets through a process of layer-by-layer (LbL) self assembly. We first describe the formation of poly(ethylene glycol) (PEG)-rich conformal coatings on islets via LbL self assembly of poly(L-lysine)-g-PEG(biotin) and streptavidin. Multilayer thin films conformed to the geometrically and chemically heterogeneous islet surface, and could be assembled without loss of islet viability or function. Significantly, coated islets performed comparably to untreated controls in a murine model of allogeneic intraportal islet transplantation, and, to our knowledge, this is the first study to report in vivo survival and function of nanoencapsulated cells or cell aggregates. Based on these findings, we next postulated that structurally similar PLL-g-PEG copolymers comprised of shorter PEG grafts might be used to initiate and propagate the assembly of polyelectrolyte multilayer (PEM) films on pancreatic islets, while simultaneously preserving islet viability. Through control of PLL

  18. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  19. Power quality load management for large spacecraft electrical power systems

    Science.gov (United States)

    Lollar, Louis F.

    1988-01-01

    In December, 1986, a Center Director's Discretionary Fund (CDDF) proposal was granted to study power system control techniques in large space electrical power systems. Presented are the accomplishments in the area of power system control by power quality load management. In addition, information concerning the distortion problems in a 20 kHz ac power system is presented.

  20. The use of gold nanoparticle aggregation for DNA computing and logic-based biomolecular detection

    International Nuclear Information System (INIS)

    Lee, In-Hee; Yang, Kyung-Ae; Zhang, Byoung-Tak; Lee, Ji-Hoon; Park, Ji-Yoon; Chai, Young Gyu; Lee, Jae-Hoon

    2008-01-01

    The use of DNA molecules as a physical computational material has attracted much interest, especially in the area of DNA computing. DNAs are also useful for logical control and analysis of biological systems if efficient visualization methods are available. Here we present a quick and simple visualization technique that displays the results of the DNA computing process based on a colorimetric change induced by gold nanoparticle aggregation, and we apply it to the logic-based detection of biomolecules. Our results demonstrate its effectiveness in both DNA-based logical computation and logic-based biomolecular detection

  1. Design techniques for large scale linear measurement systems

    International Nuclear Information System (INIS)

    Candy, J.V.

    1979-03-01

    Techniques to design measurement schemes for systems modeled by large scale linear time invariant systems, i.e., physical systems modeled by a large number (> 5) of ordinary differential equations, are described. The techniques are based on transforming the physical system model to a coordinate system facilitating the design and then transforming back to the original coordinates. An example of a three-stage, four-species, extraction column used in the reprocessing of spent nuclear fuel elements is presented. The basic ideas are briefly discussed in the case of noisy measurements. An example using a plutonium nitrate storage vessel (reprocessing) with measurement uncertainty is also presented

  2. Reduction of Large Dynamical Systems by Minimization of Evolution Rate

    Science.gov (United States)

    Girimaji, Sharath S.

    1999-01-01

    Reduction of a large system of equations to a lower-dimensional system of similar dynamics is investigated. For dynamical systems with disparate timescales, a criterion for determining redundant dimensions and a general reduction method based on the minimization of evolution rate are proposed.
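The timescale-separation idea can be illustrated with a minimal linear example. This is our own sketch, not the paper's evolution-rate minimization criterion: modes whose eigenvalues have large negative real parts decay quickly and can be treated as redundant dimensions after a short transient.

```python
import numpy as np

# Illustrative sketch (not the paper's specific criterion): for a linear
# system dx/dt = A x with disparate timescales, fast-decaying modes are
# redundant once the initial transient has died out.
A = np.array([[-1.0, 0.5],
              [0.0, -100.0]])   # slow mode ~ -1, fast mode ~ -100

eigvals, V = np.linalg.eig(A)
slow = np.abs(eigvals.real) < 10.0          # keep only the slow mode(s)

def propagate(x0, t):
    """Exact solution x(t) = V exp(Lambda t) V^-1 x0."""
    c = np.linalg.solve(V, x0)
    return (V * np.exp(eigvals * t)).dot(c).real

def propagate_reduced(x0, t):
    """Same, but with the fast modes projected out."""
    c = np.linalg.solve(V, x0)
    c = np.where(slow, c, 0.0)              # discard fast-mode amplitudes
    return (V * np.exp(eigvals * t)).dot(c).real

x0 = np.array([1.0, 1.0])
t = 1.0   # well after the fast transient
print(np.allclose(propagate(x0, t), propagate_reduced(x0, t), atol=1e-3))
```

At `t = 1.0` the fast mode has decayed by a factor of about `e^-100`, so the reduced model reproduces the full trajectory to within the stated tolerance.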

  3. Large deviations for noninteracting infinite-particle systems

    International Nuclear Information System (INIS)

    Donsker, M.D.; Varadhan, S.R.S.

    1987-01-01

    A large deviation property is established for noninteracting infinite particle systems. Previous large deviation results obtained by the authors involved a single I-function because the cases treated always involved a unique invariant measure for the process. In the context of this paper there is an infinite family of invariant measures and a corresponding infinite family of I-functions governing the large deviations

  4. The interplay of intrinsic and extrinsic bounded noises in biomolecular networks.

    Directory of Open Access Journals (Sweden)

    Giulio Caravagna

Full Text Available After being considered a nuisance to be filtered out, it recently became clear that biochemical noise plays a complex, often fully functional role for a biomolecular network. The influence of intrinsic and extrinsic noises on biomolecular networks has been intensively investigated in the last ten years, though contributions on the co-presence of both are sparse. Extrinsic noise is usually modeled as an unbounded white or colored Gaussian stochastic process, even though realistic stochastic perturbations are clearly bounded. In this paper we consider Gillespie-like stochastic models of nonlinear networks, i.e. the intrinsic noise, where the model jump rates are affected by colored bounded extrinsic noises synthesized by a suitable biochemical state-dependent Langevin system. These systems are described by a master equation, and a simulation algorithm to analyze them is derived. This new modeling paradigm should enlarge the class of systems amenable to modeling. We investigated the influence of both the amplitude and the autocorrelation time of an extrinsic Sine-Wiener noise on: (i) the Michaelis-Menten approximation of noisy enzymatic reactions, which we show to be applicable also in the co-presence of both intrinsic and extrinsic noise, (ii) a model of enzymatic futile cycle and (iii) a genetic toggle switch. In (ii) and (iii) we show that the presence of a bounded extrinsic noise induces qualitative modifications in the probability densities of the involved chemicals, where new modes emerge, thus suggesting the possible functional role of bounded noises.
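A minimal sketch of the modeling setup described above, assuming a simple birth-death network with illustrative parameters (the bounded Sine-Wiener form is standard in the bounded-noise literature; the network and all numbers here are our own simplification, not the paper's models):

```python
import math, random

# Gillespie-like simulation of a birth-death process X -> X+1 (rate k_b) and
# X -> X-1 (rate k_d * X), where k_b is modulated by a bounded Sine-Wiener
# extrinsic noise  eta(t) = B * sin(sqrt(2/tau) * W(t)),  |eta| <= B,
# with W(t) a Wiener process and tau the noise autocorrelation time.
# Propensities are held constant over windows of at most dt_noise, which keeps
# the exponential waiting-time draws exact for piecewise-constant rates.

def simulate(k_b=10.0, k_d=1.0, B=0.5, tau=1.0, t_end=50.0, seed=1):
    rng = random.Random(seed)
    t, x, w = 0.0, 0, 0.0          # time, copy number, Wiener state
    dt_noise = 0.01                # extrinsic-noise update step
    while t < t_end:
        eta = B * math.sin(math.sqrt(2.0 / tau) * w)
        a1 = k_b * (1.0 + eta)     # birth propensity, bounded noise on the rate
        a2 = k_d * x               # death propensity (intrinsic noise via SSA)
        a0 = a1 + a2
        dt = rng.expovariate(a0)   # candidate waiting time at current rates
        if dt > dt_noise:          # no reaction in this window: advance noise
            w += rng.gauss(0.0, math.sqrt(dt_noise))
            t += dt_noise
            continue
        x += 1 if rng.random() < a1 / a0 else -1
        w += rng.gauss(0.0, math.sqrt(dt))
        t += dt
    return x

samples = [simulate(seed=s) for s in range(200)]
mean = sum(samples) / len(samples)
print(8.0 < mean < 12.0)   # stationary mean stays near k_b / k_d = 10
```

Because `B = 0.5`, the perturbed birth rate `k_b * (1 + eta)` stays strictly positive, which is exactly the property that unbounded Gaussian extrinsic noise cannot guarantee.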

  5. The interplay of intrinsic and extrinsic bounded noises in biomolecular networks.

    Science.gov (United States)

    Caravagna, Giulio; Mauri, Giancarlo; d'Onofrio, Alberto

    2013-01-01

After being considered a nuisance to be filtered out, it recently became clear that biochemical noise plays a complex, often fully functional role for a biomolecular network. The influence of intrinsic and extrinsic noises on biomolecular networks has been intensively investigated in the last ten years, though contributions on the co-presence of both are sparse. Extrinsic noise is usually modeled as an unbounded white or colored Gaussian stochastic process, even though realistic stochastic perturbations are clearly bounded. In this paper we consider Gillespie-like stochastic models of nonlinear networks, i.e. the intrinsic noise, where the model jump rates are affected by colored bounded extrinsic noises synthesized by a suitable biochemical state-dependent Langevin system. These systems are described by a master equation, and a simulation algorithm to analyze them is derived. This new modeling paradigm should enlarge the class of systems amenable to modeling. We investigated the influence of both the amplitude and the autocorrelation time of an extrinsic Sine-Wiener noise on: (i) the Michaelis-Menten approximation of noisy enzymatic reactions, which we show to be applicable also in the co-presence of both intrinsic and extrinsic noise, (ii) a model of enzymatic futile cycle and (iii) a genetic toggle switch. In (ii) and (iii) we show that the presence of a bounded extrinsic noise induces qualitative modifications in the probability densities of the involved chemicals, where new modes emerge, thus suggesting the possible functional role of bounded noises.

  6. Optimization of MIMO Systems Capacity Using Large Random Matrix Methods

    Directory of Open Access Journals (Sweden)

    Philippe Loubaton

    2012-11-01

    Full Text Available This paper provides a comprehensive introduction of large random matrix methods for input covariance matrix optimization of mutual information of MIMO systems. It is first recalled informally how large system approximations of mutual information can be derived. Then, the optimization of the approximations is discussed, and important methodological points that are not necessarily covered by the existing literature are addressed, including the strict concavity of the approximation, the structure of the argument of its maximum, the accuracy of the large system approach with regard to the number of antennas, or the justification of iterative water-filling optimization algorithms. While the existing papers have developed methods adapted to a specific model, this contribution tries to provide a unified view of the large system approximation approach.
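For context on the water-filling algorithms the paper justifies, here is the classical single-user water-filling allocation for parallel channels, a hedged sketch with our own parameter names; the iterative algorithms discussed above generalize this idea to the MIMO covariance setting:

```python
# Allocate a total power budget P across parallel sub-channels with gains g_i
# to maximize  sum_i log(1 + g_i * p_i).  The optimum has the water-filling
# form  p_i = max(0, mu - 1/g_i),  with the water level mu chosen by bisection
# so that the powers sum to P.  Names and numbers are illustrative.

def water_filling(gains, P, tol=1e-10):
    lo, hi = 0.0, P + max(1.0 / g for g in gains)
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        used = sum(max(0.0, mu - 1.0 / g) for g in gains)
        if used > P:
            hi = mu            # water level too high
        else:
            lo = mu            # water level too low
    return [max(0.0, mu - 1.0 / g) for g in gains]

p = water_filling([2.0, 1.0, 0.25], P=3.0)
print(round(sum(p), 6))            # -> 3.0 (budget exhausted)
print(p[0] > p[1] > p[2] >= 0.0)   # -> True (better channels get more power)
```

With these gains the weakest channel (`1/g = 4`) sits above the water level and receives zero power, the characteristic thresholding behavior of the solution.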

  7. Improved control system power unit for large parachutes

    Science.gov (United States)

    Chandler, J. A.; Grubbs, T. M.

    1968-01-01

    Improved control system power unit drives the control surfaces of very large controllable parachutes. The design features subassemblies for determining control surface position and cable loading, and protection of the load sensor against the possibility of damage during manipulation.

  8. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    Science.gov (United States)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

The aim of this paper is to use Agent-Based Models (ABM) to optimize large scale network handling capabilities for large system inventories and to implement strategies for the purpose of reducing capital expenses. The models used in this paper rely on computational algorithms or procedure implementations developed in Matlab to simulate agent-based models, using clusters that serve as a high-performance computing platform to run the program in parallel. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.

  9. Biomolecular structure refinement using the GROMOS simulation software

    International Nuclear Information System (INIS)

    Schmid, Nathan; Allison, Jane R.; Dolenc, Jožica; Eichenberger, Andreas P.; Kunz, Anna-Pitschna E.; Gunsteren, Wilfred F. van

    2011-01-01

To understand cellular processes, the molecular structure of biomolecules has to be accurately determined. Initial models can be significantly improved by structure refinement techniques. Here, we present the refinement methods and analysis techniques implemented in the GROMOS software for biomolecular simulation. The methodology and some implementation details of the computation of NMR NOE data, 3J-couplings and residual dipolar couplings, X-ray scattering intensities from crystals and solutions, and neutron scattering intensities used in GROMOS are described, and refinement strategies and concepts are discussed using example applications. The GROMOS software allows structure refinement combining different types of experimental data with different types of restraining functions, while using a variety of methods to enhance conformational searching and sampling and the thermodynamically calibrated GROMOS force field for biomolecular simulation.
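As a flavor of how experimental data enter such a refinement as restraining functions, here is a generic flat-bottom distance restraint of the kind used with NOE upper bounds; the functional form and the force constant are illustrative, not GROMOS's exact implementation:

```python
# A flat-bottom, half-harmonic upper-bound restraint: no penalty while the
# model distance r is below the experimental upper bound r0, a harmonic
# penalty above it.  This term is added to the physical force-field energy
# during refinement.  Force constant k is an illustrative value.

def noe_restraint_energy(r, r0, k=1000.0):
    """V = 0.5 * k * (r - r0)^2 for r > r0, else 0."""
    dr = r - r0
    return 0.5 * k * dr * dr if dr > 0.0 else 0.0

# (model distance, experimental upper bound) pairs, in nm
restraints = [(0.31, 0.30), (0.28, 0.30), (0.55, 0.50)]
V = sum(noe_restraint_energy(r, r0) for r, r0 in restraints)
print(round(V, 4))   # only the violated restraints (1st and 3rd) contribute
```

The flat bottom matters: an NOE gives only an upper distance bound, so a satisfied restraint should exert no force on the structure.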

  10. Biomolecular structure refinement using the GROMOS simulation software

    Energy Technology Data Exchange (ETDEWEB)

    Schmid, Nathan; Allison, Jane R.; Dolenc, Jozica; Eichenberger, Andreas P.; Kunz, Anna-Pitschna E.; Gunsteren, Wilfred F. van, E-mail: wfvgn@igc.phys.chem.ethz.ch [Swiss Federal Institute of Technology ETH, Laboratory of Physical Chemistry (Switzerland)

    2011-11-15

To understand cellular processes, the molecular structure of biomolecules has to be accurately determined. Initial models can be significantly improved by structure refinement techniques. Here, we present the refinement methods and analysis techniques implemented in the GROMOS software for biomolecular simulation. The methodology and some implementation details of the computation of NMR NOE data, {sup 3}J-couplings and residual dipolar couplings, X-ray scattering intensities from crystals and solutions, and neutron scattering intensities used in GROMOS are described, and refinement strategies and concepts are discussed using example applications. The GROMOS software allows structure refinement combining different types of experimental data with different types of restraining functions, while using a variety of methods to enhance conformational searching and sampling and the thermodynamically calibrated GROMOS force field for biomolecular simulation.

  11. Physics at the biomolecular interface fundamentals for molecular targeted therapy

    CERN Document Server

    Fernández, Ariel

    2016-01-01

    This book focuses primarily on the role of interfacial forces in understanding biological phenomena at the molecular scale. By providing a suitable statistical mechanical apparatus to handle the biomolecular interface, the book becomes uniquely positioned to address core problems in molecular biophysics. It highlights the importance of interfacial tension in delineating a solution to the protein folding problem, in unravelling the physico-chemical basis of enzyme catalysis and protein associations, and in rationally designing molecular targeted therapies. Thus grounded in fundamental science, the book develops a powerful technological platform for drug discovery, while it is set to inspire scientists at any level in their careers determined to address the major challenges in molecular biophysics. The acknowledgment of how exquisitely the structure and dynamics of proteins and their aqueous environment are related attests to the overdue recognition that biomolecular phenomena cannot be effectively understood w...

  12. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost...... and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous...

  13. A Data Analysis Expert System For Large Established Distributed Databases

    Science.gov (United States)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-05-01

    The purpose of this work is to analyze the applicability of artificial intelligence techniques for developing a user-friendly, parallel interface to large isolated, incompatible NASA databases for the purpose of assisting the management decision process. To carry out this work, a survey was conducted to establish the data access requirements of several key NASA user groups. In addition, current NASA database access methods were evaluated. The results of this work are presented in the form of a design for a natural language database interface system, called the Deductively Augmented NASA Management Decision Support System (DANMDS). This design is feasible principally because of recently announced commercial hardware and software product developments which allow cross-vendor compatibility. The goal of the DANMDS system is commensurate with the central dilemma confronting most large companies and institutions in America, the retrieval of information from large, established, incompatible database systems. The DANMDS system implementation would represent a significant first step toward this problem's resolution.

  14. Tools for the Automation of Large Distributed Control Systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit - SMI++, combining two approaches: finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large scale, high complexity applications.
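The combination of hierarchical finite state machines and rule-based recovery can be sketched in a few lines. This is a toy illustration of the concept only, not the SMI++ API; all class and state names are our own:

```python
# Each control object is a finite state machine; commands propagate down the
# hierarchy, states propagate up, and a rule on the parent reacts to the
# states of its children (automated recovery from error conditions).

class Device:
    def __init__(self, name):
        self.name, self.state = name, "OFF"
    def command(self, cmd):
        self.state = {"SWITCH_ON": "ON", "SWITCH_OFF": "OFF"}.get(cmd, self.state)

class SubSystem:
    def __init__(self, children):
        self.children = children
    @property
    def state(self):
        # state propagates up: ON only if every child is ON
        return "ON" if all(c.state == "ON" for c in self.children) else "DEGRADED"
    def command(self, cmd):
        for c in self.children:          # commands propagate down
            c.command(cmd)
    def apply_rules(self):
        # rule-based automation: try to recover any child that is not ON
        for c in self.children:
            if c.state != "ON":
                c.command("SWITCH_ON")

sub = SubSystem([Device("hv_a"), Device("hv_b")])
sub.command("SWITCH_ON")
sub.children[0].state = "ERROR"          # simulate a fault
print(sub.state)                         # -> DEGRADED
sub.apply_rules()                        # automatic recovery
print(sub.state)                         # -> ON
```

In a real system each level runs as a decentralized deciding entity reacting to state changes in real time, rather than being polled as in this sketch.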

  15. Tools for the automation of large control systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit – SMI++, combining two approaches: finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large scale, high complexity applications.

  16. Large-Scale Systems Control Design via LMI Optimization

    Czech Academy of Sciences Publication Activity Database

    Rehák, Branislav

    2015-01-01

    Roč. 44, č. 3 (2015), s. 247-253 ISSN 1392-124X Institutional support: RVO:67985556 Keywords : Combinatorial linear matrix inequalities * large-scale system * decentralized control Subject RIV: BC - Control Systems Theory Impact factor: 0.633, year: 2015

  17. Large superconducting magnet systems for plasma and fusion applications

    International Nuclear Information System (INIS)

    Heinz, W.

    1976-05-01

    Work on superconducting magnet systems and state of the art of superconducting magnet technology are described. Conceptual design consideration and problems of large magnet systems (stability, magnetic forces, cooling modes, safety) are discussed. Recent results of experimental work at Karlsruhe are reported. An outline of American and European programs is given. (orig.) [de

  18. A Chain Perspective on Large-scale Number Systems

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2012-01-01

    As large-scale number systems gain significance in social and economic life (electronic communication, remote electronic authentication), the correct functioning and the integrity of public number systems take on crucial importance. They are needed to uniquely indicate people, objects or phenomena

  19. Disruptions in large value payment systems: an experimental approach

    NARCIS (Netherlands)

    Abbink, K.; Bosman, R.; Heijmans, R.; van Winden, F.

    2010-01-01

    This experimental study investigates the behaviour of banks in a large value payment system. More specifically,we look at 1) the reactions of banks to disruptions in the payment system, 2) the way in which the history of disruptions affects the behaviour of banks (path dependency) and 3) the effect

  20. Disruptions in large value payment systems: An experimental approach

    NARCIS (Netherlands)

    Abbink, K.; Bosman, R.; Heijmans, R.; van Winden, F.; Hellqvist, M.; Laine, T.

    2012-01-01

    This experimental study investigates the behaviour of banks in a large value payment system. More specifically, we look at 1) the reactions of banks to disruptions in the payment system, 2) the way in which the history of disruptions affects the behaviour of banks (path dependency) and 3) the effect

  1. Large amplitude forced vibration analysis of cross-beam system ...

    African Journals Online (AJOL)

    Large amplitude forced vibration behaviour of cross-beam system under harmonic excitation is studied, incorporating the effect of geometric non-linearity. The forced vibration analysis is carried out in an indirect way, in which the dynamic system is assumed to satisfy the force equilibrium condition at peak load value, thus ...

  2. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    Science.gov (United States)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run on highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by the high degree of volatility of their components and the need to provide efficient service management and to handle efficiently large amounts of data. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  3. XML-based approaches for the integration of heterogeneous bio-molecular data.

    Science.gov (United States)

    Mesiti, Marco; Jiménez-Ruiz, Ernesto; Sanz, Ismael; Berlanga-Llavori, Rafael; Perlasca, Paolo; Valentini, Giorgio; Manset, David

    2009-10-15

Today's public database infrastructure spans a very large collection of heterogeneous biological data, opening new opportunities for molecular biology, bio-medical and bioinformatics research, but also raising new problems for their integration and computational processing. In this paper we survey the most interesting and novel approaches for the representation, integration and management of different kinds of biological data by exploiting XML and the related recommendations and approaches. Moreover, we present new and interesting cutting-edge approaches for the appropriate management of heterogeneous biological data represented through XML. XML has succeeded in the integration of heterogeneous biomolecular information, and has established itself as the syntactic glue for biological data sources. Nevertheless, a large variety of XML-based data formats have been proposed, which makes effective integration of bioinformatics data schemes difficult. The adoption of a few semantically rich standard formats is urgently needed to achieve a seamless integration of the current biological resources.

  4. Software Reliability Issues Concerning Large and Safety Critical Software Systems

    Science.gov (United States)

    Kamel, Khaled; Brown, Barbara

    1996-01-01

This research was undertaken to provide NASA with a survey of state-of-the-art techniques used in industry and academia to provide safe, reliable, and maintainable software to drive large systems. Such systems must match the complexity and strict safety requirements of NASA's shuttle system. In particular, the Launch Processing System (LPS) is being considered for replacement. The LPS is responsible for monitoring and commanding the shuttle during test, repair, and launch phases. NASA built this system in the 1970s using mostly hardware techniques to provide for increased reliability, but it did so often using custom-built equipment, which has not been able to keep up with current technologies. This report surveys the major techniques used in industry and academia to ensure reliability in large and critical computer systems.

  5. Accuracy of the photogrametric measuring system for large size elements

    Directory of Open Access Journals (Sweden)

    M. Grzelka

    2011-04-01

Full Text Available The aim of this paper is to present methods of estimating, and guidelines for verifying, the accuracy of optical photogrammetric measuring systems used for the measurement of large size elements. Measuring systems applied to measure workpieces of a large size, which often reach more than 10000 mm, require the use of appropriate standards. The standards provided by the manufacturer of photogrammetric systems are certified and are inspected annually. To make sure that these systems work properly, a special standard, VDI/VDE 2634 "Optical 3D measuring systems. Imaging systems with point-by-point probing", was developed. Following the recommendations described in this standard, research on the accuracy of the photogrammetric measuring system was conducted using class K gauge blocks dedicated to calibrating and testing the accuracy of classic CMMs. The paper presents the results of research estimating the actual error of indication for size measurement MPEE for the photogrammetric coordinate measuring system TRITOP.
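The acceptance test described above amounts to comparing the error of indication for size measurement against a length-dependent limit. A sketch of that check, with an assumed length-dependent MPE expression and illustrative numbers (not TRITOP's actual specification):

```python
# VDI/VDE 2634-style check: the error of indication E is the measured length
# minus the calibrated length of a reference artefact (e.g. gauge blocks);
# the system passes if |E| <= MPE_E for every test length.  The form
# MPE_E = a + b*L is a typical length-dependent limit; a, b are assumptions.

def mpe_e(length_mm, a=0.02, b=0.01):
    """Length-dependent maximum permissible error, in mm."""
    return a + b * length_mm / 1000.0

measurements = [            # (calibrated length, measured length) in mm
    (100.000, 100.012),
    (500.000, 500.020),
    (1000.000, 999.975),
]
for ref, meas in measurements:
    E = meas - ref
    verdict = "pass" if abs(E) <= mpe_e(ref) else "FAIL"
    print(f"L={ref:.0f} mm  E={E:+.3f} mm  MPE_E={mpe_e(ref):.3f} mm  {verdict}")
```

Testing several lengths spread across the measuring volume matters because both the systematic error and the permissible limit grow with length.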

  6. Large scale gas chromatographic demonstration system for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Cheh, C.H.

    1988-01-01

A large scale demonstration system was designed for a throughput of 3 mol/day of an equimolar mixture of H, D, and T. The demonstration system was assembled and an experimental program carried out. This project was funded by Kernforschungszentrum Karlsruhe, Canadian Fusion Fuel Technology Projects and Ontario Hydro Research Division. Several major design innovations were successfully implemented in the demonstration system and are discussed in detail. Many experiments were carried out in the demonstration system to study the performance of the system in separating hydrogen isotopes at high throughput. Various temperature programming schemes were tested, heart-cutting operation was evaluated, and very large (up to 138 NL/injection) samples were separated in the system. The results of the experiments showed that the specially designed column performed well as a chromatographic column and good separation could be achieved even when a 138 NL sample was injected

  7. Engineering large-scale agent-based systems with consensus

    Science.gov (United States)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  8. Development of automated analytical systems for large throughput

    International Nuclear Information System (INIS)

    Ernst, P.C.; Hoffman, E.L.

    1982-01-01

    The need to be able to handle a large throughput of samples for neutron activation analysis has led to the development of automated counting and sample handling systems. These are coupled with available computer-assisted INAA techniques to perform a wide range of analytical services on a commercial basis. A fully automated delayed neutron counting system and a computer controlled pneumatic transfer for INAA use are described, as is a multi-detector gamma-spectroscopy system. (author)

  9. Challenges in parameter identification of large structural dynamic systems

    International Nuclear Information System (INIS)

    Koh, C.G.

    2001-01-01

In theory, it is possible to determine the parameters of a structural or mechanical system by subjecting it to some dynamic excitation and measuring the response. Considerable research has been carried out in this subject area, known as system identification, over the past two decades. Nevertheless, the challenges associated with numerical convergence are still formidable when the system is large in terms of the number of degrees of freedom and the number of unknowns. While many methods work for small systems, convergence becomes difficult, if not impossible, for large systems. In this keynote lecture, both classical and non-classical system identification methods for dynamic testing and vibration-based inspection are discussed. For classical methods, the extended Kalman filter (EKF) approach is used. On this basis, a substructural identification method has been developed as a strategy to deal with large structural systems. This is achieved by reducing the problem size, thereby significantly improving numerical convergence and efficiency. Two versions of this method are presented, each with its own merits. A numerical example of a frame structure with 20 unknown parameters is illustrated. For non-classical methods, the Genetic Algorithm (GA) is shown to be applicable with relative ease due to its 'forward analysis' nature. The computational time is, however, still enormous for large structural systems due to the combinatorial explosion problem. A model GA method has been developed to address this problem and tested with considerable success on a relatively large system of 50 degrees of freedom, accounting for input and output noise effects. An advantage of this GA-based identification method is that the objective function can be defined in terms of the measured response. Numerical studies show that the method is relatively robust, as it does not require a good initial guess and the
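The 'forward analysis' nature of GA-based identification can be illustrated with a toy example. This is our own sketch, not the lecture's substructural formulation: candidate parameters are evaluated only by simulating the forward response and scoring the misfit, so no gradients or inverse formulations are needed.

```python
import math, random

# Recover stiffness k and damping c of a 1-DOF oscillator from samples of its
# free-vibration response by minimizing the misfit between measured and
# simulated responses with a small GA.  All parameter values are illustrative.

def response(k, c, m=1.0, t_end=2.0, n=50):
    wn = math.sqrt(k / m)
    zeta = c / (2.0 * math.sqrt(k * m))
    wd = wn * math.sqrt(max(1e-9, 1.0 - zeta ** 2))  # guard underdamped assumption
    return [math.exp(-zeta * wn * t) * math.cos(wd * t)
            for t in [i * t_end / n for i in range(n)]]

true_k, true_c = 40.0, 0.8
measured = response(true_k, true_c)

def fitness(ind):
    sim = response(*ind)
    return -sum((a - b) ** 2 for a, b in zip(sim, measured))  # higher is better

rng = random.Random(0)
pop = [(rng.uniform(10, 100), rng.uniform(0.1, 2.0)) for _ in range(40)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                       # keep the best candidates
    pop = elite + [(max(10.0, k + rng.gauss(0, 2.0)),   # mutate elite parents
                    max(0.05, c + rng.gauss(0, 0.1)))
                   for k, c in (rng.choice(elite) for _ in range(30))]
best = max(pop, key=fitness)
print(f"identified k={best[0]:.2f}, c={best[1]:.2f}")   # true: k=40.0, c=0.8
```

Each fitness evaluation is one forward simulation, which is what makes the approach easy to apply but expensive for systems with many degrees of freedom.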

  10. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

This report details the progress made on the ASCR funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  11. Large Scale GW Calculations on the Cori System

    Science.gov (United States)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  12. Dynamic and label-free high-throughput detection of biomolecular interactions based on phase-shift interferometry

    Science.gov (United States)

    Li, Qiang; Huang, Guoliang; Gan, Wupeng; Chen, Shengyi

    2009-08-01

Biomolecular interactions can be detected by many established technologies such as fluorescence imaging, surface plasmon resonance (SPR)[1-4], interferometry and radioactive labeling of the analyte. In this study, we have designed and constructed a label-free, real-time sensing platform and its operating imaging instrument that detects interactions using optical phase differences from the accumulation of biological material on solid substrates. This system allows us to monitor biomolecular interactions in real time and quantify concentration changes during micro-mixing processes by measuring changes in the optical path difference (OPD). This simple interferometric technology monitors the optical phase difference resulting from accumulated biomolecular mass. A label-free protein chip that forms a 4×4 probe array was designed and fabricated using a commercial microarray robot spotter on solid substrates. Two positive control probe lines of BSA (Bovine Serum Albumin) and two experimental lines of human IgG and goat IgG were used. The binding of multiple protein targets was performed and continuously detected using this label-free, real-time sensing platform.
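The phase-to-OPD relationship underlying such a readout is simple to state; a sketch with illustrative values (the probe wavelength, layer thickness, and refractive-index contrast below are assumptions, not the paper's numbers):

```python
import math

# Accumulated biomolecular mass adds optical path, and interferometry reads
# this out as a phase shift:  delta_phi = 2*pi * OPD / lambda.

wavelength_nm = 633.0            # assumed HeNe probe wavelength

def phase_shift(opd_nm):
    return 2.0 * math.pi * opd_nm / wavelength_nm

def opd_from_phase(dphi):
    return dphi * wavelength_nm / (2.0 * math.pi)

# a ~2 nm protein adlayer with refractive-index contrast ~0.15 over buffer
layer_thickness_nm, dn = 2.0, 0.15
opd = dn * layer_thickness_nm            # extra optical path, in nm
dphi = phase_shift(opd)
print(round(math.degrees(dphi), 3))      # resulting phase change, in degrees
print(round(opd_from_phase(dphi), 6))    # inverting recovers the OPD
```

The sub-degree phase change for a monolayer-scale adlayer is why phase-shift interferometry needs careful referencing, but it also makes the response linear in accumulated mass over this range.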

  13. Recovering from trust breakdowns in large system implementations

    DEFF Research Database (Denmark)

    Rerup Schlichter, Bjarne Rerup; Andersen, Povl Erik Rostgård

    2011-01-01

    On the basis of experiences from the Faroese large-scale implementation of integrated healthcare information systems and insights into dynamic aspects of trust, we offer the following lessons learned for the successful management and recovery of trust (breakdowns) in large system implementations......: restore relations by turning towards face-to-face events and procedures, assure a well-functioning and available support organization, demonstrate trust in actors to enhance their own self-confidence and celebrate successes, even the smallest or ones injected by yourself. The propositions are based on a 6...

  14. Study of grounding system of large tokamak device JT-60

    International Nuclear Information System (INIS)

    Arakawa, Kiyotsugu; Shimada, Ryuichi; Kishimoto, Hiroshi; Yabuno, Kohei; Ishigaki, Yukio.

    1982-01-01

    In the critical plasma testing facility JT-60 constructed by the Japan Atomic Energy Research Institute, high voltage and large current are required instantaneously. Accordingly, for the protection of personnel and equipment, and for realizing stable operation of the complex, precise control and measurement system, a large-scale grounding system is required. In the JT-60 experimental facility, equipment with different functions in separate buildings is interconnected; it is therefore important to avoid high potential differences between buildings. In the grounding system for the JT-60, a reticulate grounding electrode is laid for each building, and these electrodes are connected by a low-impedance metallic duct called the grounding trunk line. The power supply cables for the various magnetic field coils, together with the control and measurement lines, are laid in the duct. Quantitatively grasping the effect of the grounding trunk line by analysis is a major problem. The authors analyzed the phenomenon in which large current flows into the grounding system due to a lightning strike or a ground fault. The fundamental construction of the grounding system for the JT-60, the conditions for the analysis and the results of the simulation are reported. (Kako, I.)

  15. A document preparation system in a large network environment

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, M.; Bouchier, S.; Sanders, C.; Sydoriak, S.; Wheeler, K.

    1988-01-01

    At Los Alamos National Laboratory, we have developed an integrated document preparation system that produces publication-quality documents. This system combines text formatters and computer graphics capabilities that have been adapted to meet the needs of users in a large scientific research laboratory. This paper describes the integration of document processing technology to develop a system architecture, based on a page description language, to provide network-wide capabilities in a distributed computing environment. We describe the Laboratory requirements, the integration and implementation issues, and the challenges we faced developing this system.

  16. From a Proven Correct Microkernel to Trustworthy Large Systems

    Science.gov (United States)

    Andronick, June

    The seL4 microkernel was the world's first general-purpose operating system kernel with a formal, machine-checked proof of correctness. The next big step in the challenge of building truly trustworthy systems is to provide a framework for developing secure systems on top of seL4. This paper first gives an overview of seL4's correctness proof, together with its main implications and assumptions, and then describes our approach to provide formal security guarantees for large, complex systems.

  17. Turbomolecular pump vacuum system for the Princeton Large Torus

    International Nuclear Information System (INIS)

    Dylla, H.F.

    1977-10-01

    A turbomolecular pump vacuum system has been designed and installed on the Princeton Large Torus (PLT). Four vertical-shaft, oil-bearing, 1500 l/s turbomolecular pumps have been interfaced to the 6400 liter PLT vacuum vessel to provide a net pumping speed of 3000 l/s for H2. The particular requirements and problems of tokamak vacuum systems are enumerated. A vacuum control system is described which protects the vacuum vessel from contamination and protects the turbomolecular pumps from damage under a variety of possible failure modes. The performance of the vacuum system is presented in terms of pumping speed measurements and residual gas behavior.
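Four 1500 l/s pumps total 6000 l/s installed speed, yet the net speed at the vessel is 3000 l/s, because each pump acts in series with the conductance of its connecting duct: 1/S_eff = 1/S_pump + 1/C. A back-of-the-envelope sketch; the per-port conductance of 1500 l/s is an assumed value chosen to reproduce the reported net speed, not a figure from the report:

```python
def effective_speed(pump_speed_ls, conductance_ls):
    """Series combination of pump speed and duct conductance: 1/S = 1/Sp + 1/C."""
    return 1.0 / (1.0 / pump_speed_ls + 1.0 / conductance_ls)

per_pump = effective_speed(1500.0, 1500.0)  # 750 l/s delivered at the vessel
net = 4 * per_pump                          # four pumps in parallel -> 3000 l/s
```

This series/parallel combination is the standard first-order estimate for conductance-limited vacuum systems.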

  18. Hydrothermal processes above the Yellowstone magma chamber: Large hydrothermal systems and large hydrothermal explosions

    Science.gov (United States)

    Morgan, L.A.; Shanks, W.C. Pat; Pierce, K.L.

    2009-01-01

    and vein-filling; and (5) areal dimensions of many large hydrothermal explosion craters in Yellowstone are similar to those of its active geyser basins and thermal areas. For Yellowstone, our knowledge of hydrothermal craters and ejecta is generally limited to after the Yellowstone Plateau emerged from beneath a late Pleistocene icecap that was roughly a kilometer thick. Large hydrothermal explosions may have occurred earlier as indicated by multiple episodes of cementation and brecciation commonly observed in hydrothermal ejecta clasts. Critical components for large, explosive hydrothermal systems include a water-saturated system at or near boiling temperatures and an interconnected system of well-developed joints and fractures along which hydrothermal fluids flow. Active deformation of the Yellowstone caldera, active faulting and moderate local seismicity, high heat flow, rapid changes in climate, and regional stresses are factors that have strong influences on the type of hydrothermal system developed. Ascending hydrothermal fluids flow along fractures that have developed in response to active caldera deformation and along edges of low-permeability rhyolitic lava flows. Alteration of the area affected, self-sealing leading to development of a caprock for the hydrothermal system, and dissolution of silica-rich rocks are additional factors that may constrain the distribution and development of hydrothermal fields. A partial low-permeability layer that acts as a cap to the hydrothermal system may produce some over-pressurization, thought to be small in most systems. Any abrupt drop in pressure initiates steam flashing and is rapidly transmitted through interconnected fractures that result in a series of multiple large-scale explosions contributing to the excavation of a larger explosion crater.
Similarities between the size and dimensions of large hydrothermal explosion craters and thermal fields in Yellowstone may indicate that catastrophic events which result in l

  19. Overcoming the solubility limit with solubility-enhancement tags: successful applications in biomolecular NMR studies

    International Nuclear Information System (INIS)

    Zhou Pei; Wagner, Gerhard

    2010-01-01

    Although the rapid progress of NMR technology has significantly expanded the range of NMR-trackable systems, preparation of NMR-suitable samples that are highly soluble and stable remains a bottleneck for studies of many biological systems. The application of solubility-enhancement tags (SETs) has been highly effective in overcoming solubility and sample stability issues and has enabled structural studies of important biological systems previously deemed unapproachable by solution NMR techniques. In this review, we provide a brief survey of the development and successful applications of the SET strategy in biomolecular NMR. We also comment on the criteria for choosing optimal SETs, such as for differently charged target proteins, and on recent developments of NMR-invisible SETs.

  20. Value of flexibility in systems with large wind penetration

    OpenAIRE

    Silva , Vera

    2010-01-01

    The focus of this thesis is the quantification of the value of operational flexibility in systems with large penetration of wind generation. This begins with quantifying the impact of wind generation (WG) uncertainty on the system's needs for frequency regulation and reserve. This is done by combining the stochastic behaviour of wind generation, demand uncertainty and generation outages. Two different approaches are compared to assess the implications of using normal distribution approx...

  1. Collaboration and Virtualization in Large Information Systems Projects

    Directory of Open Access Journals (Sweden)

    Stefan Ioan NITCHI

    2009-01-01

    Full Text Available A project evolves through different phases, from idea and conception to experiments, implementation and maintenance. The globalization, the Internet, the Web and mobile computing have changed many human activities, among them the realization of Information System (IS) projects. Projects are growing, teams are geographically distributed, and users are heterogeneous. Consequently, the realization of large Information Technology (IT) projects requires the use of collaborative technologies. The distribution of the team, the users' heterogeneity and the project complexity drive virtualization. This paper is an overview of these aspects for large IT projects. It briefly presents a general framework developed by the authors for collaborative systems in general, adapted to collaborative project management. The general considerations are illustrated by the case of a large IT project in which the authors were involved.

  2. Highly uniform parallel microfabrication using a large numerical aperture system

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Zi-Yu; Su, Ya-Hui, E-mail: ustcsyh@ahu.edu.cn, E-mail: dongwu@ustc.edu.cn [School of Electrical Engineering and Automation, Anhui University, Hefei 230601 (China); Zhang, Chen-Chu; Hu, Yan-Lei; Wang, Chao-Wei; Li, Jia-Wen; Chu, Jia-Ru; Wu, Dong, E-mail: ustcsyh@ahu.edu.cn, E-mail: dongwu@ustc.edu.cn [CAS Key Laboratory of Mechanical Behavior and Design of Materials, Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China, Hefei 230026 (China)

    2016-07-11

    In this letter, we report an improved algorithm to produce accurate phase patterns for generating highly uniform diffraction-limited multifocal arrays in a large numerical aperture objective system. It is shown that based on the original diffraction integral, the uniformity of the diffraction-limited focal arrays can be improved from ∼75% to >97%, owing to the critical consideration of the aperture function and apodization effect associated with a large numerical aperture objective. The experimental results, e.g., 3 × 3 arrays of squares and triangles and seven microlens arrays with high uniformity, further verify the advantage of the improved algorithm. This algorithm enables the laser parallel processing technology to realize uniform microstructures and functional devices in the microfabrication system with a large numerical aperture objective.
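The uniformity figures quoted above (∼75% → >97%) are consistent with the contrast-style metric commonly used for multifocal arrays, U = 1 − (I_max − I_min)/(I_max + I_min) over the focal-spot intensities. A small sketch of that figure of merit; the definition is one common convention and the intensity values are made-up examples, not data from the letter:

```python
def uniformity(intensities):
    """U = 1 - (Imax - Imin)/(Imax + Imin); U = 1 for perfectly equal spots."""
    hi, lo = max(intensities), min(intensities)
    return 1.0 - (hi - lo) / (hi + lo)

before = uniformity([1.0, 0.75, 1.25, 0.8])   # poorly balanced array -> 0.75
after = uniformity([1.00, 0.99, 1.01, 1.00])  # after reweighting the phase pattern
```

In iterative phase-pattern algorithms, this metric is typically what the per-spot weights are adjusted against between iterations.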

  3. Local decoherence-resistant quantum states of large systems

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Utkarsh; Sen, Aditi; Sen, Ujjwal, E-mail: ujjwal@hri.res.in

    2015-02-06

    We identify an effectively decoherence-free class of quantum states, each of which consists of a “minuscule” and a “large” sector, against local noise. In particular, the content of entanglement and other quantum correlations in the minuscule to large partition is independent of the number of particles in their large sectors, when all the particles suffer passage through local amplitude and phase damping channels. The states of the large sectors are distinct in terms of markedly different amounts of violation of Bell inequality. In case the large sector is macroscopic, such states are akin to the Schrödinger cat. - Highlights: • We identify an effectively decoherence-free class of quantum states of large systems. • We work with local noise models. • Decay of entanglement as well as information-theoretic quantum correlations considered. • The states are of the form of the Schrödinger cats, with minuscule and large sectors. • The states of the large sector are distinguishable by their violation of Bell inequality.
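The local amplitude-damping channel mentioned above acts on a single qubit through the Kraus operators K0 = diag(1, √(1−γ)) and K1 = √γ·|0⟩⟨1|. A minimal sketch of one qubit passing through that channel; the damping strength γ is an arbitrary illustrative value:

```python
import numpy as np

def amplitude_damping(rho, gamma):
    """Apply the single-qubit amplitude-damping channel: rho -> sum_i K_i rho K_i^dag."""
    k0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - gamma)]])
    k1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])
    return k0 @ rho @ k0.conj().T + k1 @ rho @ k1.conj().T

excited = np.array([[0.0, 0.0], [0.0, 1.0]])   # |1><1|
rho_out = amplitude_damping(excited, gamma=0.3)
# Excited-state population decays from 1 to 1 - gamma; trace stays 1.
```

The paper's claim concerns how correlations across a minuscule/large partition behave when every particle passes through such a channel; this sketch only shows the elementary single-qubit building block.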

  4. Steiner systems and large non-Hamiltonian hypergraphs

    Directory of Open Access Journals (Sweden)

    Zsolt Tuza

    2006-10-01

    Full Text Available From Steiner systems S(k − 2, 2k − 3, v), we construct k-uniform hypergraphs of large size without Hamiltonian cycles. This improves previous estimates due to G. Y. Katona and H. Kierstead [J. Graph Theory 30 (1999), pp. 205–212].

  5. Participatory Design and the Challenges of Large-Scale Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    With its 10th biannual anniversary conference, Participatory Design (PD) is leaving its teens and must now be considered ready to join the adult world. In this article we encourage the PD community to think big: PD should engage in large-scale information-systems development and opt for a PD...

  6. Economic viability of large-scale fusion systems

    Energy Technology Data Exchange (ETDEWEB)

    Helsley, Charles E., E-mail: cehelsley@fusionpowercorporation.com; Burke, Robert J.

    2014-01-01

    A typical modern power generation facility has a capacity of about 1 GWe (Gigawatt electric) per unit. This works well for fossil fuel plants and for most fission facilities for it is large enough to support the sophisticated generation infrastructure but still small enough to be accommodated by most utility grid systems. The size of potential fusion power systems may demand a different viewpoint. The compression and heating of the fusion fuel for ignition requires a large driver, even if it is necessary for only a few microseconds or nanoseconds per energy pulse. The economics of large systems, that can effectively use more of the driver capacity, need to be examined. The assumptions used in this model are specific for the Fusion Power Corporation (FPC) SPRFD process but could be generalized for any system. We assume that the accelerator is the most expensive element of the facility and estimate its cost to be $20 billion. Ignition chambers and fuel handling facilities are projected to cost $1.5 billion each with up to 10 to be serviced by one accelerator. At first this seems expensive but that impression has to be tempered by the energy output that is equal to 35 conventional nuclear plants. This means the cost per kWh is actually low. Using the above assumptions and industry data for generators and heat exchange systems, we conclude that a fully utilized fusion system will produce marketable energy at roughly one half the cost of our current means of generating an equivalent amount of energy from conventional fossil fuel and/or fission systems. Even fractionally utilized systems, i.e. systems used at 25% of capacity, can be cost effective in many cases. In conclusion, SPRFD systems can be scaled to a size and configuration that can be economically viable and very competitive in today's energy market. Electricity will be a significant element in the product mix but synthetic fuels and water may also need to be incorporated to make the large system
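The cost claim can be sanity-checked with simple amortization arithmetic: $20 billion for the accelerator plus ten $1.5 billion chambers, against output equivalent to 35 one-GWe plants. The lifetime and capacity factor below are illustrative assumptions, not figures from the paper, and the result covers capital only (no fuel, O&M, or financing):

```python
capital_usd = 20e9 + 10 * 1.5e9   # accelerator + ten ignition chambers = $35e9
power_kw = 35 * 1e6               # 35 GWe expressed in kW
lifetime_years = 30               # assumed amortization period
capacity_factor = 0.9             # assumed utilization

lifetime_kwh = power_kw * 8760 * lifetime_years * capacity_factor
capital_cost_per_kwh = capital_usd / lifetime_kwh   # a fraction of a cent per kWh
```

Under these assumptions the capital charge is less than half a cent per kWh, which illustrates why the large up-front cost can still yield cheap energy when the full driver capacity is utilized.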

  7. Economic viability of large-scale fusion systems

    International Nuclear Information System (INIS)

    Helsley, Charles E.; Burke, Robert J.

    2014-01-01

    A typical modern power generation facility has a capacity of about 1 GWe (Gigawatt electric) per unit. This works well for fossil fuel plants and for most fission facilities for it is large enough to support the sophisticated generation infrastructure but still small enough to be accommodated by most utility grid systems. The size of potential fusion power systems may demand a different viewpoint. The compression and heating of the fusion fuel for ignition requires a large driver, even if it is necessary for only a few microseconds or nanoseconds per energy pulse. The economics of large systems, that can effectively use more of the driver capacity, need to be examined. The assumptions used in this model are specific for the Fusion Power Corporation (FPC) SPRFD process but could be generalized for any system. We assume that the accelerator is the most expensive element of the facility and estimate its cost to be $20 billion. Ignition chambers and fuel handling facilities are projected to cost $1.5 billion each with up to 10 to be serviced by one accelerator. At first this seems expensive but that impression has to be tempered by the energy output that is equal to 35 conventional nuclear plants. This means the cost per kWh is actually low. Using the above assumptions and industry data for generators and heat exchange systems, we conclude that a fully utilized fusion system will produce marketable energy at roughly one half the cost of our current means of generating an equivalent amount of energy from conventional fossil fuel and/or fission systems. Even fractionally utilized systems, i.e. systems used at 25% of capacity, can be cost effective in many cases. In conclusion, SPRFD systems can be scaled to a size and configuration that can be economically viable and very competitive in today's energy market. Electricity will be a significant element in the product mix but synthetic fuels and water may also need to be incorporated to make the large system economically

  8. Hydraulic System Design of Hydraulic Actuators for Large Butterfly Valves

    Directory of Open Access Journals (Sweden)

    Ye HUANG

    2014-09-01

    Full Text Available Hydraulic control systems of butterfly valves are presently valve-controlled and pump-controlled. Valve-controlled hydraulic systems have serious power loss and generate much heat during throttling. Pump-controlled hydraulic systems have no overflow or throttling losses but are limited in the speed adjustment of the variable-displacement pump, generate much noise, pollute the environment, and have motor power that does not match load requirements, resulting in low efficiency under light loads and wearing of the variable-displacement pump. To overcome these shortcomings, this article designs a closed hydraulic control system in which an AC servo motor drives a quantitative pump that controls a spiral swinging hydraulic cylinder, and analyzes and calculates the structure and parameters of a spiral swinging hydraulic cylinder. The hydraulic system adjusts the servo motor’s speed according to the requirements of the control system, and the motor power matches the power provided to components, thus eliminating the throttling loss of hydraulic circuits. The system is compact, produces a large output force, provides stable transmission, has a quick response, and is suitable as a hydraulic control system of a large butterfly valve.

  9. Biomolecular transport and separation in nanotubular networks.

    Energy Technology Data Exchange (ETDEWEB)

    Stachowiak, Jeanne C.; Stevens, Mark Jackson (Sandia National Laboratories, Albuquerque, NM); Robinson, David B.; Branda, Steven S.; Zendejas, Frank; Meagher, Robert J.; Sasaki, Darryl Yoshio; Bachand, George David (Sandia National Laboratories, Albuquerque, NM); Hayden, Carl C.; Sinha, Anupama; Abate, Elisa; Wang, Julia; Carroll-Portillo, Amanda (Sandia National Laboratories, Albuquerque, NM); Liu, Haiqing (Sandia National Laboratories, Albuquerque, NM)

    2010-09-01

    Cell membranes are dynamic substrates that achieve a diverse array of functions through multi-scale reconfigurations. We explore the morphological changes that occur upon protein interaction with model membrane systems, inducing deformation of their planar structure to yield nanotube assemblies. In the two examples shown in this report, we describe the use of membrane adhesion and particle trajectory to form lipid nanotubes via mechanical stretching, and protein adsorption onto domains and the induction of membrane curvature through steric pressure. Through this work, the relationship between membrane bending rigidity, protein affinity, and line tension of phase-separated structures was examined, and its role in biological membranes explored.

  10. Scanning probe and optical tweezer investigations of biomolecular interactions

    International Nuclear Information System (INIS)

    Rigby-Singleton, Shellie

    2002-01-01

    A complex array of intermolecular forces controls the interactions between and within biological molecules. The desire to empirically explore these fundamental forces has led to the development of several biophysical techniques. Of these, the atomic force microscope (AFM) and optical tweezers have been employed throughout this thesis to monitor the intermolecular forces involved in biomolecular interactions. The AFM is a well-established force-sensing technique capable of measuring biomolecular interactions at the single-molecule level. However, its versatility had not previously been extended to the investigation of a drug-enzyme complex. The energy landscape for the force-induced dissociation of the DHFR-methotrexate complex was studied, revealing an energy barrier to dissociation located ∼0.3 nm from the bound state. Unfortunately, the AFM has a limited range of accessible loading rates, and in order to profile the complete energy landscape alternative force-sensing instrumentation should be considered, for example the BFP and optical tweezers. Thus, this thesis outlines the development and construction of an optical trap capable of measuring intermolecular forces between biomolecules at the single-molecule level. To demonstrate the force-sensing abilities of the optical setup, proof-of-principle measurements were performed investigating the interactions between proteins and polymer surfaces subjected to varying degrees of argon plasma treatment. Complementary data were gained from measurements performed independently with the AFM. Changes in polymer resistance to proteins in response to changes in polymer surface chemistry were detected utilising both AFM and optical tweezers measurements. Finally, the AFM and optical tweezers were employed as ultrasensitive biosensors. Single-molecule investigations of the antibody-antigen interaction between the cardiac troponin I marker and its complementary antibody reveal the impact therapeutic concentrations of heparin have

  11. Nonterrestrial material processing and manufacturing of large space systems

    Science.gov (United States)

    Von Tiesenhausen, G.

    1979-01-01

    Nonterrestrial processing of materials and manufacturing of large space system components from preprocessed lunar materials at a manufacturing site in space is described. Lunar materials mined and preprocessed at the lunar resource complex will be flown to the space manufacturing facility (SMF), where, together with supplementary terrestrial materials, they will undergo final processing and fabrication into space communication systems, solar cell blankets, radio frequency generators, and electrical equipment. Satellite Power System (SPS) material requirements and lunar material availability and utilization are detailed, and the SMF processing, refining, and fabricating facilities, material flow, and manpower requirements are described.

  12. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  13. Senior Management Use of Management Control Systems in Large Companies

    DEFF Research Database (Denmark)

    Willert, Jeanette; Israelsen, Poul; Rohde, Carsten

    2017-01-01

    The use of management control systems in large companies remains relatively unexplored. Indeed, only a few studies of senior managers’ use of management control systems consider multiple controls in companies. This paper explores data from a comprehensive survey of the use of management control...... systems in 120 strategic business units at some of the largest companies in Denmark. The paper identifies how senior management guides and controls their subordinates to meet their companies’ objectives. The presentation and discussion of the results, including citations from executive managers, use...

  14. Large Time Behavior of the Vlasov-Poisson-Boltzmann System

    Directory of Open Access Journals (Sweden)

    Li Li

    2013-01-01

    Full Text Available The motion of dilute charged particles can be modeled by the Vlasov-Poisson-Boltzmann (VPB) system. We study the large-time stability of the VPB system. To be precise, we prove that as time goes to infinity, the solution of the VPB system tends to the global Maxwellian state at a rate O(t−∞), by using a method developed for the Boltzmann equation without force in the work of Desvillettes and Villani (2005). The improvement of the present paper is the removal of a condition on the parameter λ imposed in the work of Li (2008).
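The rate O(t−∞) is shorthand for almost-exponential decay: convergence faster than any inverse power of t. Schematically (the choice of norm here is illustrative; the paper's precise functional setting is not reproduced in this record):

```latex
% Convergence of the solution f to the global Maxwellian \mu
% faster than any polynomial rate, i.e. at rate O(t^{-\infty}):
\forall k \in \mathbb{N} \;\; \exists C_k > 0 : \qquad
\bigl\| f(t,\cdot,\cdot) - \mu \bigr\|_{L^1} \;\le\; C_k \, t^{-k},
\qquad t \to \infty .
```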

  15. A novel reliability evaluation method for large engineering systems

    Directory of Open Access Journals (Sweden)

    Reda Farag

    2016-06-01

    Full Text Available A novel reliability evaluation method for large nonlinear engineering systems excited by dynamic loading applied in the time domain is presented. For this class of problems, the performance functions are expected to be functions of time and implicit in nature. Available first- or second-order reliability methods (FORM/SORM) will be challenging to apply when estimating the reliability of such systems. Because of its inefficiency, the classical Monte Carlo simulation (MCS) method also cannot be used for large nonlinear dynamic systems. In the proposed approach, only tens instead of hundreds or thousands of deterministic evaluations at intelligently selected points are used to extract the reliability information. A hybrid approach is proposed, consisting of the stochastic finite element method (SFEM) developed by the author and his research team using FORM, the response surface method (RSM), an interpolation scheme, and advanced factorial schemes. The method is clarified with the help of several numerical examples.
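The core idea, a response surface fitted to a handful of deterministic runs and then sampled cheaply, can be sketched in one dimension. Everything below (the limit-state function, the quadratic surrogate, the sampling counts) is an illustrative stand-in, not the authors' SFEM formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def limit_state(x):
    """Stand-in for an expensive deterministic analysis; failure when g(x) < 0."""
    return 3.0 - x**2

# Tens, not thousands, of deterministic evaluations at selected points:
xs = np.linspace(-4.0, 4.0, 9)
coeffs = np.polyfit(xs, limit_state(xs), deg=2)  # quadratic response surface
surrogate = np.poly1d(coeffs)

# Monte Carlo is then run on the cheap surrogate instead of the real model:
samples = rng.standard_normal(200_000)
pf = np.mean(surrogate(samples) < 0.0)           # estimated failure probability
```

Because the stand-in limit state is itself quadratic, the surrogate is exact here; in practice the fit quality near the design point governs the accuracy of the estimate.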

  16. Understanding large social-ecological systems: introducing the SESMAD project

    Directory of Open Access Journals (Sweden)

    Michael Cox

    2014-08-01

    Full Text Available This article introduces the Social-Ecological Systems Meta-Analysis Database (SESMAD) project, the project behind the case studies and synthetic articles contained in this special issue of the International Journal of the Commons. SESMAD is an internationally collaborative meta-analysis project that builds on previous seminal synthetic work on small-scale common-pool resource systems conducted at the Workshop in Political Theory and Policy Analysis at Indiana University. The project is guided by the following research question: can the variables found to be important in explaining outcomes in small-scale systems be scaled up to explain outcomes in large-scale environmental governance? In this special issue we report on our findings thus far through a set of case studies of large-scale environmental governance, a paper that describes our conceptual advances, and a paper that compares these five case studies to further examine our central research question.

  17. Synergy of Two Highly Specific Biomolecular Recognition Events

    DEFF Research Database (Denmark)

    Ejlersen, Maria; Christensen, Niels Johan; Sørensen, Kasper K

    2018-01-01

    Two highly specific biomolecular recognition events, nucleic acid duplex hybridization and DNA-peptide recognition in the minor groove, were coalesced in a miniature ensemble for the first time by covalently attaching a natural AT-hook peptide motif to nucleic acid duplexes via a 2'-amino-LNA scaffold. A combination of molecular dynamics simulations and ultraviolet thermal denaturation studies revealed high sequence-specific affinity of the peptide-oligonucleotide conjugates (POCs) when binding to complementary DNA strands, leveraging the bioinformation encrypted in the minor groove of DNA

  18. Instrumental biosensors: new perspectives for the analysis of biomolecular interactions.

    Science.gov (United States)

    Nice, E C; Catimel, B

    1999-04-01

    The use of instrumental biosensors in basic research to measure biomolecular interactions in real time is increasing exponentially. Applications include protein-protein, protein-peptide, DNA-protein, DNA-DNA, and lipid-protein interactions. Such techniques have been applied to, for example, antibody-antigen, receptor-ligand, signal transduction, and nuclear receptor studies. This review outlines the principles of two of the most commonly used instruments and highlights specific operating parameters that will assist in optimising experimental design, data generation, and analysis.

  19. Supramolecular photochemistry of drugs in biomolecular environments.

    Science.gov (United States)

    Monti, Sandra; Manet, Ilse

    2014-06-21

    In this tutorial review we illustrate how the interaction of photoactive drugs/potential drugs with proteins or DNA in supramolecular complexes can determine the course of the reactions initiated by the drug absorbed photons, evidencing the mechanistic differences with respect to the solution conditions. We focus on photoprocesses, independent of oxygen, that lead to chemical modification of the biomolecules, with formation of new covalent bonds or cleavage of existing bonds. Representative systems are mainly selected from the literature of the last decade. The photoreactivity of some aryl propionic acids, (fluoro)quinolones, furocoumarins, metal coordination complexes, quinine-like compounds, naphthaleneimides and pyrenyl-peptides with proteins or DNA is discussed. The use of light for biomolecule photomodification, historically relevant to biological photosensitization processes and some forms of photochemotherapy, is nowadays becoming more and more important in the development of innovative methods in nanomedicine and biotechnology.

  20. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
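The case-count arithmetic behind n-factor combinatorial variation is easy to illustrate: with k parameters of v levels each, exhaustive testing needs v^k runs, whereas a 2-factor (pairwise) suite only has to cover C(k,2)·v² distinct parameter-pair settings. A small sketch with assumed sizes; the parameter counts are made up, not taken from the Trick simulations:

```python
from math import comb

k, v = 10, 3                          # 10 parameters, 3 levels each (assumed)

full_factorial = v ** k               # every combination: 59,049 runs
pair_settings = comb(k, 2) * v * v    # distinct pairs to cover: 45 * 9 = 405

# Each single run covers C(k, 2) = 45 pairs at once, so a pairwise covering
# suite needs at least v*v = 9 runs; practical suites are somewhat larger,
# but still orders of magnitude below the full factorial.
lower_bound = pair_settings // comb(k, 2)
```

This gap is what lets the tool systematically explore parameter interactions without enumerating the entire operational envelope.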

  1. Hydrogen Production from Water by Photosynthesis System I for Use as Fuel in Energy Conversion Devices (a.k.a. Understanding Photosystem I as a Biomolecular Reactor for Energy Conversion)

    Science.gov (United States)

    2014-04-01

    Army Research Laboratory, Adelphi, MD 20783-1197; report ARL-TR-6904, April 2014; dates covered 10/1/2010 to 10/1/2013. (Only report-form metadata is recoverable for this record; no abstract text survives.)

  2. Jump phenomena. [large amplitude responses of nonlinear systems

    Science.gov (United States)

    Reiss, E. L.

    1980-01-01

    The paper considers jump phenomena composed of large amplitude responses of nonlinear systems caused by small amplitude disturbances. Physical problems where large jumps in the solution amplitude are important features of the response are described, including snap buckling of elastic shells, chemical reactions leading to combustion and explosion, and long-term climatic changes of the earth's atmosphere. A new method of rational functions is then developed, which consists of representing the solutions of jump problems as rational functions of the small disturbance parameter; this method can solve jump problems explicitly.
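
The rational-function representation can be written schematically (notation ours, not taken from the paper): the response u of the jump problem is expressed as a ratio of polynomials in the small disturbance parameter, in the spirit of a Padé approximant,

```latex
u(\varepsilon) \;\approx\; \frac{P_M(\varepsilon)}{Q_N(\varepsilon)}
  \;=\; \frac{a_0 + a_1\varepsilon + \cdots + a_M\varepsilon^M}
             {1 + b_1\varepsilon + \cdots + b_N\varepsilon^N},
```

so that, unlike a truncated power series, the representation can change by an O(1) amount as the parameter crosses a near-zero of the denominator, which is how a small disturbance can produce a large jump in the response.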

  3. Status and Future Developments in Large Accelerator Control Systems

    International Nuclear Information System (INIS)

    Karen S. White

    2006-01-01

    Over the years, accelerator control systems have evolved from small hardwired systems to complex computer controlled systems with many types of graphical user interfaces and electronic data processing. Today's control systems often include multiple software layers, hundreds of distributed processors, and hundreds of thousands of lines of code. While it is clear that the next generation of accelerators will require much bigger control systems, they will also need better systems. Advances in technology will be needed to ensure the network bandwidth and CPU power can provide reasonable update rates and support the requisite timing systems. Beyond the scaling problem, next generation systems face additional challenges due to growing cyber security threats and the likelihood that some degree of remote development and operation will be required. With a large number of components, the need for high reliability increases and commercial solutions can play a key role towards this goal. Future control systems will operate more complex machines and need to present a well integrated, interoperable set of tools with a high degree of automation. Consistency of data presentation and exception handling will contribute to efficient operations. From the development perspective, engineers will need to provide integrated data management in the beginning of the project and build adaptive software components around a central data repository. This will make the system maintainable and ensure consistency throughout the inevitable changes during the machine lifetime. Additionally, such a large project will require professional project management and disciplined use of well-defined engineering processes. Distributed project teams will make the use of standards, formal requirements and design and configuration control vital. Success in building the control system of the future may hinge on how well we integrate commercial components and learn from best practices used in other industries

  4. Biomolecular tracing using long-lived isotopes

    International Nuclear Information System (INIS)

    Vogel, J.S.; Turteltaub, K.W.; Frantz, C.E.; Keating, G.; Felton, J.S.; Southon, J.R.; Roberts, M.R.; Gledhill, B.L.

    1992-01-01

    Accelerator mass spectrometry (AMS) was developed over the past 15 years as an essential tool for detecting long-lived, cosmogenic radioisotopes in the earth and space sciences. We apply this technology to the measurement of chemical kinetics, primarily in biomedical systems, which had heretofore employed short-lived isotopes and/or long counting times to quantify radioisotopic labels. AMS provides detection efficiencies of ∼1%, 10³ to 10⁶ times better than decay counting. Long-lived isotopes are used and detected with AMS at concentrations which reduce sample size, chemical dose, radiation safety hazards and radiolysis. We measure ³H, ⁷Be, ¹⁰Be, ¹⁴C, ²⁶Al, ³⁶Cl, ⁴¹Ca and ¹²⁹I, but most of our current program uses ¹⁴C. Initial experiments involved research on the genotoxicity of mutagens in cooked foods and reversible binding of compounds to antibodies. Through collaborations, we apply AMS detection to research in carcinogenesis, pharmacokinetics of toxins, elemental metabolism, distribution of topical medications and nutrition.

  5. Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.

    Science.gov (United States)

    Demchak, Barry; Krüger, Ingolf

    2012-07-01

    The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus enabling policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost effectively evolve such applications over a long lifetime.

  6. Universality in few-body systems with large scattering length

    International Nuclear Information System (INIS)

    Hammer, H.-W.

    2005-01-01

    Effective Field Theory (EFT) provides a powerful framework that exploits a separation of scales in physical systems to perform systematically improvable, model-independent calculations. Particularly interesting are few-body systems with short-range interactions and large two-body scattering length. Such systems display remarkable universal features. In systems with more than two particles, a three-body force with limit cycle behavior is required for consistent renormalization already at leading order. We will review this EFT and some of its applications in the physics of cold atoms and nuclear physics. In particular, we will discuss the possibility of an infrared limit cycle in QCD. Recent extensions of the EFT approach to the four-body system and N-boson droplets in two spatial dimensions will also be addressed

  7. Senior Management Use of Management Control Systems in Large Companies

    DEFF Research Database (Denmark)

    Willert, Jeanette; Israelsen, Poul; Rohde, Carsten

    2017-01-01

    The use of management control systems in large companies remains relatively unexplored. Indeed, only a few studies of senior managers’ use of management control systems consider multiple controls in companies. This paper explores data from a comprehensive survey of the use of management control systems in 120 strategic business units at some of the largest companies in Denmark. The paper identifies how senior management guides and controls their subordinates to meet their companies’ objectives. The presentation and discussion of the results, including citations from executive managers, use Ferreira and Otley’s (2009) conceptual and holistic framework for performance management systems, supplemented by elements of contextual factors and organisational culture. Further, selected researchers’ perceptions of the purpose of using management control systems are related to practitioners’ ideas.

  8. A hydrogel-based versatile screening platform for specific biomolecular recognition in a well plate format.

    Science.gov (United States)

    Beer, Meike V; Rech, Claudia; Diederichs, Sylvia; Hahn, Kathrin; Bruellhoff, Kristina; Möller, Martin; Elling, Lothar; Groll, Jürgen

    2012-04-01

    Precise determination of biomolecular interactions in high throughput crucially depends on a surface coating technique that allows immobilization of a variety of interaction partners in a non-interacting environment. We present a one-step hydrogel coating system based on isocyanate functional six-arm poly(ethylene oxide)-based star polymers for commercially available 96-well microtiter plates that combines a straightforward and robust coating application with versatile bio-functionalization. This system generates resistance to unspecific protein adsorption and cell adhesion, as demonstrated with fluorescently labeled bovine serum albumin and primary human dermal fibroblasts (HDF), and high specificity for the assessment of biomolecular recognition processes when ligands are immobilized on this surface. One particular advantage is the wide range of biomolecules that can be immobilized and convert the per se inert coating into a specifically interacting surface. We here demonstrate the immobilization and quantification of a broad range of biochemically important ligands, such as peptide sequences GRGDS and GRGDSK-biotin, the broadly applicable coupler molecule biocytin, the protein fibronectin, and the carbohydrates N-acetylglucosamine and N-acetyllactosamine. A simplified protocol for an enzyme-linked immunosorbent assay was established for the detection and quantification of ligands on the coating surface. Cell adhesion on the peptide and protein-modified surfaces was assessed using HDF. All coatings were applied using a one-step preparation technique, including bioactivation, which makes the system suitable for high-throughput screening in a format that is compatible with the most routinely used testing systems.

  9. Verifying large modular systems using iterative abstraction refinement

    International Nuclear Information System (INIS)

    Lahtinen, Jussi; Kuismin, Tuomas; Heljanko, Keijo

    2015-01-01

    Digital instrumentation and control (I&C) systems are increasingly used in the nuclear engineering domain. The exhaustive verification of these systems is challenging, and the usual verification methods such as testing and simulation are typically insufficient. Model checking is a formal method that is able to exhaustively analyse the behaviour of a model against a formally written specification. If the model checking tool detects a violation of the specification, it will give out a counter-example that demonstrates how the specification is violated in the system. Unfortunately, sometimes real life system designs are too big to be directly analysed by traditional model checking techniques. We have developed an iterative technique for model checking large modular systems. The technique uses abstraction based over-approximations of the model behaviour, combined with iterative refinement. The main contribution of the work is the concrete abstraction refinement technique based on the modular structure of the model, the dependency graph of the model, and a refinement sampling heuristic similar to delta debugging. The technique is geared towards proving properties, and outperforms BDD-based model checking, the k-induction technique, and the property directed reachability algorithm (PDR) in our experiments. - Highlights: • We have developed an iterative technique for model checking large modular systems. • The technique uses BDD-based model checking, k-induction, and PDR in parallel. • We have tested our algorithm by verifying two models with it. • The technique outperforms classical model checking methods in our experiments
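
The abstraction-refinement loop described above can be sketched as follows; the checker, blame extraction, dependency graph and module names are stand-ins for illustration, not the authors' implementation:

```python
class Result:
    """Toy model-checker verdict."""
    def __init__(self, proved=False, real=False, blamed=()):
        self.proved = proved
        self.real_counterexample = real
        self.blamed_modules = blamed

def refine_abstraction(check, depends_on):
    """Iterative abstraction refinement over a modular model, sketched.

    Begin with every module abstracted away (over-approximated). While
    the checker returns a spurious counterexample, re-concretize the
    modules blamed by its trace, pulling in their dependency-graph
    neighbours, and re-check.
    """
    concrete = set()
    while True:
        result = check(concrete)
        if result.proved:
            # Property holds in the over-approximation, hence in the
            # full model as well.
            return True, concrete
        if result.real_counterexample:
            return False, concrete
        # Spurious counterexample: refine along the dependency graph.
        blamed = set(result.blamed_modules)
        for m in list(blamed):
            blamed |= depends_on.get(m, set())
        if blamed <= concrete:
            raise RuntimeError("refinement exhausted without a verdict")
        concrete |= blamed

def toy_check(concrete):
    # Provable once 'core' and its dependency 'bus' are concrete;
    # until then the abstraction admits a spurious trace blaming 'core'.
    if {"core", "bus"} <= concrete:
        return Result(proved=True)
    return Result(blamed=["core"])

ok, used = refine_abstraction(toy_check, {"core": {"bus"}})
```

The point of the sketch is the termination argument: each iteration either returns a verdict or strictly grows the concrete set, so the loop cannot run longer than the number of modules.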

  10. Large area high-speed metrology SPM system

    International Nuclear Information System (INIS)

    Klapetek, P; Valtr, M; Martinek, J; Picco, L; Payton, O D; Miles, M; Yacoot, A

    2015-01-01

    We present a large area high-speed measuring system capable of rapidly generating nanometre resolution scanning probe microscopy data over mm² regions. The system combines a slow moving but accurate large area XYZ scanner with a very fast but less accurate small area XY scanner. This arrangement enables very large areas to be scanned by stitching together the small, rapidly acquired, images from the fast XY scanner while simultaneously moving the slow XYZ scanner across the region of interest. In order to successfully merge the image sequences together two software approaches for calibrating the data from the fast scanner are described. The first utilizes the low uncertainty interferometric sensors of the XYZ scanner while the second implements a genetic algorithm with multiple parameter fitting during the data merging step of the image stitching process. The basic uncertainty components related to these high-speed measurements are also discussed. Both techniques are shown to successfully enable high-resolution, large area images to be generated at least an order of magnitude faster than with a conventional atomic force microscope. (paper)

  11. Large area high-speed metrology SPM system

    Science.gov (United States)

    Klapetek, P.; Valtr, M.; Picco, L.; Payton, O. D.; Martinek, J.; Yacoot, A.; Miles, M.

    2015-02-01

    We present a large area high-speed measuring system capable of rapidly generating nanometre resolution scanning probe microscopy data over mm² regions. The system combines a slow moving but accurate large area XYZ scanner with a very fast but less accurate small area XY scanner. This arrangement enables very large areas to be scanned by stitching together the small, rapidly acquired, images from the fast XY scanner while simultaneously moving the slow XYZ scanner across the region of interest. In order to successfully merge the image sequences together two software approaches for calibrating the data from the fast scanner are described. The first utilizes the low uncertainty interferometric sensors of the XYZ scanner while the second implements a genetic algorithm with multiple parameter fitting during the data merging step of the image stitching process. The basic uncertainty components related to these high-speed measurements are also discussed. Both techniques are shown to successfully enable high-resolution, large area images to be generated at least an order of magnitude faster than with a conventional atomic force microscope.

  12. Bake-Out Mobile Controls for Large Vacuum Systems

    CERN Document Server

    Blanchard, S; Gomes, P; Pereira, H; Kopylov, L; Merker, S; Mikheev, M

    2014-01-01

    Large vacuum systems at CERN (Large Hadron Collider - LHC, Low Energy Ion Rings - LEIR...) require bake-out to achieve ultra-high vacuum specifications. The bake-out cycle is used to decrease the outgassing rate of the vacuum vessel and to activate the Non-Evaporable Getter (NEG) thin film. Bake-out control is a Proportional-Integral-Derivative (PID) regulation with complex recipes, interlocks and troubleshooting management and remote control. It is based on mobile Programmable Logic Controller (PLC) cabinets, fieldbus network and Supervisory Control and Data Acquisition (SCADA) application. The CERN vacuum installations include more than 7 km of baked vessels; using mobile cabinets reduces considerably the cost of the control system. The cabinets are installed close to the vacuum vessels during the time of the bake-out cycle. Mobile cabinets can be used in any of the CERN vacuum facilities. Remote control is provided through a fieldbus network and a SCADA application

  13. Solution methods for large systems of linear equations in BACCHUS

    International Nuclear Information System (INIS)

    Homann, C.; Dorr, B.

    1993-05-01

    The computer programme BACCHUS is used to describe steady state and transient thermal-hydraulic behaviour of a coolant in a fuel element with intact geometry in a fast breeder reactor. In such computer programmes generally large systems of linear equations with sparse matrices of coefficients, resulting from discretization of coolant conservation equations, must be solved thousands of times giving rise to large demands of main storage and CPU time. Direct and iterative solution methods of the systems of linear equations, available in BACCHUS, are described, giving theoretical details and experience with their use in the programme. Besides use of a method of lines, a Runge-Kutta-method, for solution of the partial differential equation is outlined. (orig.) [de
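
The abstract does not detail the iterative methods used; as a generic stand-in for that class of solvers, here is a Gauss-Seidel sweep over a dictionary-of-rows sparse matrix (a textbook method, not the specific BACCHUS solvers):

```python
def gauss_seidel(A, b, tol=1e-10, max_iter=10_000):
    """One classic iterative solver for sparse linear systems A x = b.

    A is a dict mapping row index -> {column index: value}; convergence
    is guaranteed e.g. for strictly diagonally dominant matrices, which
    discretized diffusion-like conservation equations often produce.
    """
    n = len(b)
    x = [0.0] * n
    for _ in range(max_iter):
        delta = 0.0
        for i in range(n):
            row = A[i]
            s = sum(v * x[j] for j, v in row.items() if j != i)
            new = (b[i] - s) / row[i]
            delta = max(delta, abs(new - x[i]))
            x[i] = new  # use updated values immediately (Gauss-Seidel)
        if delta < tol:
            break
    return x

# Small tridiagonal, strictly diagonally dominant test system;
# its exact solution is x = [1, 1, 1].
A = {0: {0: 4.0, 1: -1.0},
     1: {0: -1.0, 1: 4.0, 2: -1.0},
     2: {1: -1.0, 2: 4.0}}
b = [3.0, 2.0, 3.0]
x = gauss_seidel(A, b)
```

Because each sweep touches only the stored nonzeros, memory and CPU cost scale with the number of nonzero coefficients rather than with n², which is what makes iterative methods attractive for the large sparse systems such codes solve thousands of times.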

  14. Quarkonia production in small and large systems measured by ATLAS

    CERN Document Server

    Lopez, Jorge; The ATLAS collaboration

    2018-01-01

    The experimentally observed dissociation and regeneration of bound quarkonium states in heavy-ion collisions provide a powerful tool to probe the dynamics of the hot, dense plasma. These measurements are sensitive to the effects of color screening, color recombination, or other, new suppression mechanisms. In the large-statistics Run 2 lead-lead and proton-lead collision data, these phenomena can be probed with unprecedented precision. Measurements of the ground and excited quarkonia states, as well as their separation into prompt and non-prompt components, provide further opportunities to study the dynamics of heavy parton energy loss in these large systems. In addition, quarkonium production rates, and their excited to ground states ratios, in small, asymmetric systems are an interesting probe of cold nuclear matter effects. In this talk, the latest ATLAS results on quarkonia production will be presented, including new, differential measurements of charmonium suppression and azimuthal modulation in lead-lea...

  15. Dynamics of Large Systems of Nonlinearly Evolving Units

    Science.gov (United States)

    Lu, Zhixin

    The dynamics of large systems of many nonlinearly evolving units is a general research area that has great importance for many areas in science and technology, including biology, computation by artificial neural networks, statistical mechanics, flocking in animal groups, the dynamics of coupled neurons in the brain, and many others. While universal principles and techniques are largely lacking in this broad area of research, there is still one particular phenomenon that seems to be broadly applicable. In particular, this is the idea of emergence, by which is meant macroscopic behaviors that "emerge" from a large system of many "smaller or simpler entities such that...large entities" [i.e., macroscopic behaviors] arise which "exhibit properties the smaller/simpler entities do not exhibit." In this thesis we investigate mechanisms and manifestations of emergence in four dynamical systems consisting many nonlinearly evolving units. These four systems are as follows. (a) We first study the motion of a large ensemble of many noninteracting particles in a slowly changing Hamiltonian system that undergoes a separatrix crossing. In such systems, we find that separatrix-crossing induces a counterintuitive effect. Specifically, numerical simulation of two sets of densely sprinkled initial conditions on two energy curves appears to suggest that the two energy curves, one originally enclosing the other, seemingly interchange their positions. This, however, is topologically forbidden. We resolve this paradox by introducing a numerical simulation method we call "robust" and study its consequences. (b) We next study the collective dynamics of oscillatory pacemaker neurons in Suprachiasmatic Nucleus (SCN), which, through synchrony, govern the circadian rhythm of mammals. We start from a high-dimensional description of the many coupled oscillatory neuronal units within the SCN. This description is based on a forced Kuramoto model. 
We then reduce the system dimensionality by using

  16. Selected topics in solution-phase biomolecular NMR spectroscopy

    Science.gov (United States)

    Kay, Lewis E.; Frydman, Lucio

    2017-05-01

    Solution bio-NMR spectroscopy continues to enjoy a preeminent role as an important tool in elucidating the structure and dynamics of a range of important biomolecules and in relating these to function. Equally impressive is how NMR continues to 'reinvent' itself through the efforts of many brilliant practitioners who ask increasingly demanding and increasingly biologically relevant questions. The ability to manipulate spin Hamiltonians - almost at will - to dissect the information of interest contributes to the success of the endeavor and ensures that the NMR technology will be well poised to contribute to as yet unknown frontiers in the future. As a tribute to the versatility of solution NMR in biomolecular studies and to the continued rapid advances in the field we present a Virtual Special Issue (VSI) that includes over 40 articles on various aspects of solution-state biomolecular NMR that have been published in the Journal of Magnetic Resonance in the past 7 years. These, in total, help celebrate the achievements of this vibrant field.

  17. Photochirogenesis: Photochemical Models on the Origin of Biomolecular Homochirality

    Directory of Open Access Journals (Sweden)

    Cornelia Meinert

    2010-05-01

    Full Text Available Current research focuses on a better understanding of the origin of biomolecular asymmetry by the identification and detection of the possibly first chiral molecules that were involved in the appearance and evolution of life on Earth. We have reasons to assume that these molecules were specific chiral amino acids. Chiral amino acids have been identified in both chondritic meteorites and simulated interstellar ices. Present research reasons that circularly polarized electromagnetic radiation was identified in interstellar environments and an asymmetric interstellar photon-molecule interaction might have triggered biomolecular symmetry breaking. We review on the possible prebiotic interaction of ‘chiral photons’ in the form of circularly polarized light, with early chiral organic molecules. We will highlight recent studies on enantioselective photolysis of racemic amino acids by circularly polarized light and experiments on the asymmetric photochemical synthesis of amino acids from only one C and one N containing molecules by simulating interstellar environments. Both approaches are based on circular dichroic transitions of amino acids that will be presented as well.

  18. An Overview of Biomolecular Event Extraction from Scientific Documents.

    Science.gov (United States)

    Vanegas, Jorge A; Matos, Sérgio; González, Fabio; Oliveira, José L

    2015-01-01

    This paper presents a review of state-of-the-art approaches to automatic extraction of biomolecular events from scientific texts. Events involving biomolecules such as genes, transcription factors, or enzymes, for example, have a central role in biological processes and functions and provide valuable information for describing physiological and pathogenesis mechanisms. Event extraction from biomedical literature has a broad range of applications, including support for information retrieval, knowledge summarization, and information extraction and discovery. However, automatic event extraction is a challenging task due to the ambiguity and diversity of natural language and higher-level linguistic phenomena, such as speculations and negations, which occur in biological texts and can lead to misunderstanding or incorrect interpretation. Many strategies have been proposed in the last decade, originating from different research areas such as natural language processing, machine learning, and statistics. This review summarizes the most representative approaches in biomolecular event extraction and presents an analysis of the current state of the art and of commonly used methods, features, and tools. Finally, current research trends and future perspectives are also discussed.

  19. An Overview of Biomolecular Event Extraction from Scientific Documents

    Directory of Open Access Journals (Sweden)

    Jorge A. Vanegas

    2015-01-01

    Full Text Available This paper presents a review of state-of-the-art approaches to automatic extraction of biomolecular events from scientific texts. Events involving biomolecules such as genes, transcription factors, or enzymes, for example, have a central role in biological processes and functions and provide valuable information for describing physiological and pathogenesis mechanisms. Event extraction from biomedical literature has a broad range of applications, including support for information retrieval, knowledge summarization, and information extraction and discovery. However, automatic event extraction is a challenging task due to the ambiguity and diversity of natural language and higher-level linguistic phenomena, such as speculations and negations, which occur in biological texts and can lead to misunderstanding or incorrect interpretation. Many strategies have been proposed in the last decade, originating from different research areas such as natural language processing, machine learning, and statistics. This review summarizes the most representative approaches in biomolecular event extraction and presents an analysis of the current state of the art and of commonly used methods, features, and tools. Finally, current research trends and future perspectives are also discussed.

  20. A Classification Framework for Large-Scale Face Recognition Systems

    OpenAIRE

    Zhou, Ziheng; Deravi, Farzin

    2009-01-01

    This paper presents a generic classification framework for large-scale face recognition systems. Within the framework, a data sampling strategy is proposed to tackle the data imbalance when image pairs are sampled from thousands of face images for preparing a training dataset. A modified kernel Fisher discriminant classifier is proposed to make it computationally feasible to train the kernel-based classification method using tens of thousands of training samples. The framework is tested in an...
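
The data imbalance the abstract mentions (genuine same-identity pairs are vastly outnumbered by impostor pairs when sampling from thousands of images) can be illustrated with a simple balanced pair sampler; the names and strategy below are illustrative, not the paper's actual method:

```python
import random

def sample_balanced_pairs(images_by_id, n_pairs, seed=0):
    """Sample equal numbers of genuine (same identity) and impostor
    (different identity) image pairs from a face dataset.

    images_by_id maps an identity to its list of images; the label is
    1 for a genuine pair, 0 for an impostor pair.
    """
    rng = random.Random(seed)
    # Only identities with at least two images can form a genuine pair.
    ids = [i for i, imgs in images_by_id.items() if len(imgs) >= 2]
    pairs = []
    for _ in range(n_pairs // 2):
        # Genuine pair: two distinct images of one identity.
        pid = rng.choice(ids)
        a, b = rng.sample(images_by_id[pid], 2)
        pairs.append((a, b, 1))
        # Impostor pair: one image each from two different identities.
        p1, p2 = rng.sample(list(images_by_id), 2)
        pairs.append((rng.choice(images_by_id[p1]),
                      rng.choice(images_by_id[p2]), 0))
    rng.shuffle(pairs)
    return pairs

data = {f"id{k}": [f"id{k}_img{j}" for j in range(3)] for k in range(5)}
pairs = sample_balanced_pairs(data, 20)
```

Without such balancing, a dataset of N identities offers on the order of N² impostor pairs but only N-proportional genuine pairs, so a naive sampler would train the classifier almost entirely on impostors.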

  1. A remote joint system for large vacuum ducts

    International Nuclear Information System (INIS)

    Hagmann, D.B.; Coughlan, J.B.

    1983-01-01

    A large remote vacuum duct joining system has been developed for fusion machines that uses several two-jaw screwdriven clamps. The preferred location for clamp installation is inside the vacuum duct where access space is available for the actuating device. It also decreases space needed for handling operations exterior to the duct. The clamp system is unique in that it is low cost, applies force directly over the seal, permits leak testing to the seal annulus, is highly reliable, can be remotely replaced, and is usable on a variety of other applications

  2. Electron cyclotron beam measurement system in the Large Helical Device

    Energy Technology Data Exchange (ETDEWEB)

    Kamio, S., E-mail: kamio@nifs.ac.jp; Takahashi, H.; Kubo, S.; Shimozuma, T.; Yoshimura, Y.; Igami, H.; Ito, S.; Kobayashi, S.; Mizuno, Y.; Okada, K.; Osakabe, M.; Mutoh, T. [National Institute for Fusion Science, Toki 509-5292 (Japan)

    2014-11-15

    In order to evaluate the electron cyclotron (EC) heating power inside the Large Helical Device vacuum vessel and to investigate the physics of the interaction between the EC beam and the plasma, a direct measurement system for the EC beam transmitted through the plasma column was developed. The system consists of an EC beam target plate, which is made of isotropic graphite and faces against the EC beam through the plasma, and an IR camera for measuring the target plate temperature increase by the transmitted EC beam. This system is applicable at high magnetic field (up to 2.75 T) and plasma density (up to 0.8 × 10¹⁹ m⁻³). This system successfully evaluated the transmitted EC beam profile and the refraction.

  3. Large capacity, high-speed multiparameter multichannel analysis system

    International Nuclear Information System (INIS)

    Hendricks, R.W.; Seeger, P.A.; Scheer, J.W.; Suehiro, S.

    1980-01-01

    A data acquisition system for recording multiparameter digital data into a large memory array at over 2.5 MHz is described. The system consists of a MOSTEK MK8600 2048K x 24-bit memory system, I/O ports to various external devices including the CAMAC dataway, a memory incrementer/adder and a daisy-chain of experiment-specific modules which calculate the memory address which is to be incremented. The design of the daisy-chain permits multiple modules and provides for easy modification as experimental needs change. The system has been designed for use in multiparameter, multichannel analysis of high-speed data gathered by position-sensitive detectors at conventional and synchrotron x-ray sources as well as for fixed energy and time-of-flight diffraction at continuous and pulsed neutron sources

  4. Separate Poles Mode for Large-Capacity HVDC System

    Science.gov (United States)

    Zhu, Lin; Gao, Qin

    2017-05-01

    This paper proposes a novel connection mode, separate poles mode (SPM), for large-capacity HVDC systems. The proposed mode addresses the core issues of HVDC connection in interconnected power grids and principally aims at increasing the effective electric distance between poles, which helps to mitigate the interaction problems between the AC and DC systems. Under this mode, the receiving end of a bipolar HVDC link is divided into different inverter stations, which significantly alleviates difficulties in power transmission and consumption in the receiving-end AC grids. An investigation of the changes in multi-feed short-circuit ratio (MISCR) shows that an HVDC link with SPM has critical impacts on itself and on other HVDC systems with the conventional connection mode, demonstrating that SPM can balance MISCR increase against short-circuit current limits.

  5. Two-level systems driven by large-amplitude fields

    Science.gov (United States)

    Nori, F.; Ashhab, S.; Johansson, J. R.; Zagoskin, A. M.

    2009-03-01

    We analyze the dynamics of a two-level system subject to driving by large-amplitude external fields, focusing on the resonance properties in the case of driving around the region of avoided level crossing. In particular, we consider three main questions that characterize resonance dynamics: (1) the resonance condition, (2) the frequency of the resulting oscillations on resonance, and (3) the width of the resonance. We identify the regions of validity of different approximations. In a large region of the parameter space, we use a geometric picture in order to obtain both a simple understanding of the dynamics and quantitative results. The geometric approach is obtained by dividing the evolution into discrete time steps, with each time step described by either a phase shift on the basis states or a coherent mixing process corresponding to a Landau-Zener crossing. We compare the results of the geometric picture with those of a rotating wave approximation. We also comment briefly on the prospects of employing strong driving as a useful tool to manipulate two-level systems. S. Ashhab, J.R. Johansson, A.M. Zagoskin, F. Nori, Two-level systems driven by large-amplitude fields, Phys. Rev. A 75, 063414 (2007). S. Ashhab et al, unpublished.
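
For orientation, the generic strongly driven two-level Hamiltonian behind such analyses can be sketched as follows (our notation, with ħ = 1; this is the standard textbook form, not a transcription of the paper):

```latex
H(t) = -\frac{\Delta}{2}\,\sigma_x \;-\; \frac{\epsilon_0 + A\cos(\omega t)}{2}\,\sigma_z ,
\qquad
\epsilon_0 \approx n\,\omega \quad (n\text{-photon resonance}),
\qquad
\Omega_n \approx \Delta\, J_n\!\left(\frac{A}{\omega}\right),
```

where, in the strong-driving regime, resonances appear when the static bias matches an integer multiple of the drive frequency, and the on-resonance oscillation frequency is renormalized by the Bessel function J_n of the drive amplitude-to-frequency ratio.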

  6. Design of central control system for large helical device (LHD)

    International Nuclear Information System (INIS)

    Yamazaki, K.; Kaneko, H.; Yamaguchi, S.; Watanabe, K.Y.; Taniguchi, Y.; Motojima, O.

    1993-11-01

    The world's largest superconducting fusion machine, LHD (Large Helical Device), is under construction in Japan, aiming at steady-state operation. Its basic control system consists of UNIX computers, FDDI/Ethernet LANs, VME multiprocessors and the VxWorks real-time OS. For flexible and reliable operation of the LHD machine, a cooperative distributed system of more than 30 pieces of experimental equipment is controlled by the central computer and the main timing system, and is supervised by the main protective interlock system. Intelligent control systems, such as applications of fuzzy logic and neural networks, are planned to be adopted for flexible feedback control of plasma configurations besides the classical PID control scheme. Design studies of the control system and related R and D programs with coil-plasma simulation systems are now being performed. The construction of the LHD Control Building at a new site will begin in 1995, after the construction of the LHD Experimental Building is finished, and the hardware construction of the LHD central control equipment will start in 1996. First plasma production by means of this control system is expected in 1997. (author)

  7. Stability and Control of Large-Scale Dynamical Systems A Vector Dissipative Systems Approach

    CERN Document Server

    Haddad, Wassim M

    2011-01-01

    Modern complex large-scale dynamical systems exist in virtually every aspect of science and engineering, and are associated with a wide variety of physical, technological, environmental, and social phenomena, including aerospace, power, communications, and network systems, to name just a few. This book develops a general stability analysis and control design framework for nonlinear large-scale interconnected dynamical systems, and presents the most complete treatment on vector Lyapunov function methods, vector dissipativity theory, and decentralized control architectures. Large-scale dynami

  8. Solution approach for a large scale personnel transport system for a large company in Latin America

    Energy Technology Data Exchange (ETDEWEB)

    Garzón-Garnica, Eduardo-Arturo; Caballero-Morales, Santiago-Omar; Martínez-Flores, José-Luis

    2017-07-01

    The present paper focuses on the modelling and solution of a large-scale personnel transportation system in Mexico, where many routes and vehicles are currently used to service 525 points. The proposed routing system can be applied to many cities in the Latin-American region. Design/methodology/approach: The system was modelled as a VRP, considering the use of real-world transit times and the fact that routes start at the farthest point from the destination center. Experiments were performed on sets of service points of different sizes. As the size of the instances increased, the performance of the heuristic method was assessed against the results of an exact algorithm, with the results remaining very close. When the instance was full-scale and the exact algorithm took too long to solve the problem, the heuristic algorithm still provided a feasible solution. Supported by the validation with smaller-scale instances, where the difference between the two solutions was close to 6%, the full-scale solution obtained with the heuristic algorithm was considered to be within that same range. Findings: The proposed modelling and solving method provided a solution that would produce significant savings in the daily operation of the routes. Originality/value: The urban layout of cities in Latin America is unlike that of other regions of the world. Large cities in this region typically include a small, usually antique, town center and a somewhat disordered outer region. The lack of vehicle-centered urban planning poses distinct challenges for vehicle routing problems in the region. Using a heuristic VRP combined with the results of an exact VRP allowed an improved routing plan specific to the requirements of the region to be obtained.

  9. Solution approach for a large scale personnel transport system for a large company in Latin America

    International Nuclear Information System (INIS)

    Garzón-Garnica, Eduardo-Arturo; Caballero-Morales, Santiago-Omar; Martínez-Flores, José-Luis

    2017-01-01

    The present paper focuses on the modelling and solution of a large-scale personnel transportation system in Mexico, where many routes and vehicles are currently used to service 525 points. The proposed routing system can be applied to many cities in the Latin-American region. Design/methodology/approach: The system was modelled as a VRP, considering the use of real-world transit times and the fact that routes start at the farthest point from the destination center. Experiments were performed on sets of service points of different sizes. As the size of the instances increased, the performance of the heuristic method was assessed against the results of an exact algorithm, with the results remaining very close. When the instance was full-scale and the exact algorithm took too long to solve the problem, the heuristic algorithm still provided a feasible solution. Supported by the validation with smaller-scale instances, where the difference between the two solutions was close to 6%, the full-scale solution obtained with the heuristic algorithm was considered to be within that same range. Findings: The proposed modelling and solving method provided a solution that would produce significant savings in the daily operation of the routes. Originality/value: The urban layout of cities in Latin America is unlike that of other regions of the world. Large cities in this region typically include a small, usually antique, town center and a somewhat disordered outer region. The lack of vehicle-centered urban planning poses distinct challenges for vehicle routing problems in the region. Using a heuristic VRP combined with the results of an exact VRP allowed an improved routing plan specific to the requirements of the region to be obtained.

  10. Solution approach for a large scale personnel transport system for a large company in Latin America

    Directory of Open Access Journals (Sweden)

    Eduardo-Arturo Garzón-Garnica

    2017-10-01

    Full Text Available Purpose: The present paper focuses on the modelling and solution of a large-scale personnel transportation system in Mexico, where many routes and vehicles are currently used to service 525 points. The proposed routing system can be applied to many cities in the Latin-American region. Design/methodology/approach: The system was modelled as a VRP, considering the use of real-world transit times and the fact that routes start at the farthest point from the destination center. Experiments were performed on sets of service points of different sizes. As the size of the instances increased, the performance of the heuristic method was assessed against the results of an exact algorithm, with the results remaining very close. When the instance was full-scale and the exact algorithm took too long to solve the problem, the heuristic algorithm still provided a feasible solution. Supported by the validation with smaller-scale instances, where the difference between the two solutions was close to 6%, the full-scale solution obtained with the heuristic algorithm was considered to be within that same range. Findings: The proposed modelling and solving method provided a solution that would produce significant savings in the daily operation of the routes. Originality/value: The urban layout of cities in Latin America is unlike that of other regions of the world. Large cities in this region typically include a small, usually antique, town center and a somewhat disordered outer region. The lack of vehicle-centered urban planning poses distinct challenges for vehicle routing problems in the region. Using a heuristic VRP combined with the results of an exact VRP allowed an improved routing plan specific to the requirements of the region to be obtained.
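The abstract does not reproduce the heuristic itself, so the sketch below is only an illustrative stand-in for that class of methods: a capacity-constrained construction heuristic that starts each route at the farthest unserved point from the depot (as the abstract describes) and then proceeds greedily to the nearest unserved neighbor. The function and parameter names are hypothetical.

```python
import math

def build_routes(depot, points, capacity):
    """Illustrative VRP construction heuristic: each route starts at the
    farthest unserved point from the depot, then repeatedly visits the
    nearest unserved point until vehicle capacity is reached.
    `depot` and `points` are (x, y) tuples; each point counts one pickup."""
    unserved = list(points)
    routes = []
    while unserved:
        # start the route at the farthest unserved point from the depot
        current = max(unserved, key=lambda p: math.dist(depot, p))
        unserved.remove(current)
        route = [current]
        while unserved and len(route) < capacity:
            nxt = min(unserved, key=lambda p: math.dist(current, p))
            unserved.remove(nxt)
            route.append(nxt)
            current = nxt
        routes.append(route)
    return routes
```

A real implementation would replace Euclidean `math.dist` with the real-world transit times the paper emphasizes, and compare the result against an exact VRP solution on small instances.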

  11. RPYFMM: Parallel adaptive fast multipole method for Rotne-Prager-Yamakawa tensor in biomolecular hydrodynamics simulations

    Science.gov (United States)

    Guan, W.; Cheng, X.; Huang, J.; Huber, G.; Li, W.; McCammon, J. A.; Zhang, B.

    2018-06-01

    RPYFMM is a software package for the efficient evaluation of the potential field governed by the Rotne-Prager-Yamakawa (RPY) tensor interactions in biomolecular hydrodynamics simulations. In our algorithm, the RPY tensor is decomposed as a linear combination of four Laplace interactions, each of which is evaluated using the adaptive fast multipole method (FMM) (Greengard and Rokhlin, 1997) where the exponential expansions are applied to diagonalize the multipole-to-local translation operators. RPYFMM offers a unified execution on both shared and distributed memory computers by leveraging the DASHMM library (DeBuhr et al., 2016, 2018). Preliminary numerical results show that the interactions for a molecular system of 15 million particles (beads) can be computed within one second on a Cray XC30 cluster using 12,288 cores, while achieving approximately 54% strong-scaling efficiency.
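For orientation, the quantity the FMM accelerates is the dense pairwise sum v_i = Σ_j D_ij F_j over all beads. The sketch below is a direct O(N²) reference evaluation for small N, not the package's algorithm; it uses the standard far-field Rotne-Prager form of the tensor, assumes all pair distances exceed 2a, and lets `mu0` stand in for the physical prefactor kBT/(8πη).

```python
import numpy as np

def rpy_matvec(pos, forces, a=1.0, mu0=1.0):
    """Direct O(N^2) evaluation of RPY velocities v_i = sum_j D_ij F_j.
    Far-field Rotne-Prager form, valid for pair distances r >= 2a:
      D_ij = (mu0/r) [ (1 + 2a^2/(3r^2)) I + (1 - 2a^2/r^2) rhat rhat ]
    with self-mobility D_ii = kBT/(6 pi eta a) I = (4 mu0 / 3a) I."""
    n = len(pos)
    vel = np.zeros_like(forces)
    for i in range(n):
        for j in range(n):
            if i == j:
                vel[i] += (4.0 * mu0 / (3.0 * a)) * forces[j]  # self term
                continue
            rvec = pos[i] - pos[j]
            r = np.linalg.norm(rvec)
            rhat = rvec / r
            c1 = 1.0 + 2.0 * a * a / (3.0 * r * r)
            c2 = 1.0 - 2.0 * a * a / (r * r)
            vel[i] += (mu0 / r) * (c1 * forces[j] + c2 * rhat * np.dot(rhat, forces[j]))
    return vel
```

RPYFMM's decomposition into four Laplace FMM calls evaluates exactly this sum in near-linear time, which is what makes the 15-million-bead benchmark feasible.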

  12. On Lattice Sequential Decoding for Large MIMO Systems

    KAUST Repository

    Ali, Konpal S.

    2014-04-01

    Due to their ability to provide high data rates, Multiple-Input Multiple-Output (MIMO) wireless communication systems have become increasingly popular. Decoding of these systems with acceptable error performance is computationally very demanding. In the case of large overdetermined MIMO systems, we employ the Sequential Decoder using the Fano Algorithm. A parameter called the bias is varied to attain different performance-complexity trade-offs. Low values of the bias result in excellent performance but at the expense of high complexity and vice versa for higher bias values. We attempt to bound the error by bounding the bias, using the minimum distance of a lattice. Also, a particular trend is observed with increasing SNR: a region of low complexity and high error, followed by a region of high complexity and error falling, and finally a region of low complexity and low error. For lower bias values, the stages of the trend are incurred at lower SNR than for higher bias values. This has the important implication that a low enough bias value, at low to moderate SNR, can result in low error and low complexity even for large MIMO systems. Our work is compared against Lattice Reduction (LR) aided Linear Decoders (LDs). Another impressive observation for low bias values that satisfy the error bound is that the Sequential Decoder's error is seen to fall with increasing system size, while it grows for the LR-aided LDs. For the case of large underdetermined MIMO systems, Sequential Decoding with two preprocessing schemes is proposed – 1) Minimum Mean Square Error Generalized Decision Feedback Equalization (MMSE-GDFE) preprocessing 2) MMSE-GDFE preprocessing, followed by Lattice Reduction and Greedy Ordering. Our work is compared against previous work which employs Sphere Decoding preprocessed using MMSE-GDFE, Lattice Reduction and Greedy Ordering. For the case of large systems, this results in high complexity and difficulty in choosing the sphere radius. Our schemes
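For context, the benchmark that sequential and sphere decoders try to approach is exhaustive maximum-likelihood (ML) detection, whose cost grows exponentially with the number of transmit antennas. The sketch below is that brute-force baseline for BPSK symbols; it is an illustration of the complexity problem, not the Fano algorithm of the thesis.

```python
import itertools
import numpy as np

def ml_detect_bpsk(H, y):
    """Exhaustive maximum-likelihood detection of a BPSK vector x in
    y = H x + n: search all 2^n candidates for the one minimizing
    ||y - H x||^2. Sequential (Fano) and sphere decoders exist precisely
    to avoid visiting this exponential candidate set."""
    n = H.shape[1]
    best, best_cost = None, np.inf
    for bits in itertools.product([-1.0, 1.0], repeat=n):
        x = np.array(bits)
        cost = np.linalg.norm(y - H @ x) ** 2
        if cost < best_cost:
            best, best_cost = x, cost
    return best
```

For a 4-antenna system this searches 16 candidates; for the large systems the thesis targets, the same search would be astronomically expensive, which is why the bias-controlled tree search matters.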

  13. Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Kishimoto, Yasuaki; Sugahara, Akihiro; Li, J.Q.

    2008-01-01

    Large-scale simulation using super-computers, which generally requires long CPU times and produces large amounts of data, has been extensively studied as a third pillar of various advanced science fields, in parallel to theory and experiment. Such simulations are expected to lead to new scientific discoveries through the elucidation of complex phenomena that can hardly be identified by conventional theoretical and experimental approaches alone. In order to assist such large simulation studies, in which many collaborators working at geographically different places participate and contribute, we have developed a unique remote collaboration system, referred to as SIMON (simulation monitoring system), which is based on client-server control and introduces the idea of up-date processing, in contrast to the widely used post-processing. As a key ingredient, we have developed a trigger method, which transmits requests for up-date processing from the simulation (client) running on a super-computer to a workstation (server). That is, the simulation running on the super-computer actively controls the timing of the up-date processing. The server, having received requests from the ongoing simulation for data transfer, data analyses, visualizations, etc., starts the corresponding operations during the simulation. The server makes the latest results available to web browsers, so that collaborators can monitor the results at any place and time in the world. By applying the system to a specific simulation project on laser-matter interaction, we have confirmed that the system works well and plays an important role as a collaboration platform on which many collaborators work with one another
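The trigger method can be pictured as an inversion of ordinary post-processing: the running simulation decides when the monitoring side should act. The toy sketch below captures only that control flow; all names are illustrative and none of SIMON's actual interfaces are shown.

```python
def run_simulation(n_steps, trigger_every, on_trigger):
    """Sketch of 'up-date processing': the running simulation (client)
    actively fires triggers at steps of its choosing, and the monitoring
    side (server callback) processes the latest state immediately,
    instead of waiting for the whole run to finish (post-processing)."""
    state = 0.0
    for step in range(1, n_steps + 1):
        state += 1.0                    # stand-in for one simulation step
        if step % trigger_every == 0:
            on_trigger(step, state)     # e.g. transfer, analyze, visualize

    return state

processed = []
final = run_simulation(10, 3, lambda step, s: processed.append((step, s)))
```

In the real system the callback side lives on a separate workstation and publishes results over HTTP, but the client-driven timing is the same.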

  14. Large Scale Emerging Properties from Non Hamiltonian Complex Systems

    Directory of Open Access Journals (Sweden)

    Marco Bianucci

    2017-06-01

    Full Text Available The concept of “large scale” obviously depends on the phenomenon we are interested in. For example, in the field of the foundations of thermodynamics from microscopic dynamics, the large spatial and time scales are of the order of fractions of a millimetre and of microseconds, respectively, or less, and are defined in relation to the spatial and time scales of the microscopic systems. In large-scale oceanography or global climate dynamics problems, the scales of interest are of the order of thousands of kilometres in space and many years in time, and are compared to the local and daily/monthly scales of atmosphere and ocean dynamics. In all these cases a Zwanzig projection approach is, at least in principle, an effective tool to obtain classes of universal smooth “large scale” dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many-degrees-of-freedom) system. The projection approach leads to a very complex calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in foundations-of-thermodynamics problems. However, in geophysical fluid dynamics, biology, and most physical problems, the fundamental building-block equations of motion have a non-Hamiltonian structure. Thus, to continue to apply the useful projection approach in these cases as well, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to deal analytically with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).

  15. Glass badge dosimetry system for large scale personal monitoring

    International Nuclear Information System (INIS)

    Norimichi Juto

    2002-01-01

    The Glass Badge, which uses a silver-activated phosphate glass dosemeter, was specially developed for large-scale personal monitoring, and dosimetry systems such as an automatic reader and a dose-equivalent calculation algorithm were developed at the same time to achieve reasonable personal monitoring. In large-scale personal monitoring, both precision in dosimetry and confidence in the handling of large amounts of personal data become very important. The silver-activated phosphate glass dosemeter has excellent basic characteristics for dosimetry, such as homogeneous and stable sensitivity and negligible fading. The Glass Badge was designed to measure photons in the 10 keV - 10 MeV range, beta particles in the 300 keV - 3 MeV range, and, by means of an included SSNTD, neutrons in the 0.025 eV - 15 MeV range. The developed Glass Badge dosimetry system has not only these basic characteristics but also many features to maintain good precision in dosimetry and data handling. In this presentation, the features of Glass Badge dosimetry systems and examples of practical personal monitoring systems will be presented. (Author)

  16. Buffer provisioning for large-scale data-acquisition systems

    CERN Document Server

    AUTHOR|(SzGeCERN)756497; The ATLAS collaboration; Garcia Garcia, Pedro Javier; Froening, Holger; Vandelli, Wainer

    2018-01-01

    The data acquisition system of the ATLAS experiment, a major experiment at the Large Hadron Collider (LHC) at CERN, will go through a major upgrade in the next decade. The upgrade is driven by experimental physics requirements, calling for increased data rates on the order of 6 TB/s. By contrast, the data rate of the existing system is 160 GB/s. Among the changes in the upgraded system will be a very large buffer with a projected size on the order of 70 PB. The buffer's role will be to decouple data production from on-line data processing, storing data for periods of up to 24 hours until it can be analyzed by the event processing system. The larger buffer will allow a new data recording strategy, providing additional margins to handle variable data rates. At the same time it will provide sensible trade-offs between buffering space and on-line processing capabilities. This compromise between two resources will be possible since the data production cycle includes time periods where the experiment will not produ...
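A back-of-envelope check of these figures is instructive. Assuming decimal units (1 PB = 1000 TB) and, purely for illustration, no concurrent draining, a 70 PB buffer absorbing the full 6 TB/s upgraded rate fills in roughly 3.2 hours; this suggests that the data actually retained for up to 24 hours must arrive at a much lower average rate than the peak.

```python
def hours_to_fill(buffer_pb, net_ingest_tb_s):
    """Hours until a buffer of `buffer_pb` petabytes fills at a net ingest
    rate in TB/s. Decimal units assumed: 1 PB = 1000 TB."""
    if net_ingest_tb_s <= 0:
        return float("inf")  # draining keeps up; the buffer never fills
    return buffer_pb * 1000.0 / net_ingest_tb_s / 3600.0

# 70 PB buffer at the full upgraded rate of 6 TB/s, with no draining:
full_rate_hours = hours_to_fill(70.0, 6.0)   # about 3.2 hours
```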

  17. A convex optimization approach for solving large scale linear systems

    Directory of Open Access Journals (Sweden)

    Debora Cores

    2017-01-01

    Full Text Available The well-known Conjugate Gradient (CG) method minimizes a strictly convex quadratic function for solving large-scale linear systems of equations when the coefficient matrix is symmetric and positive definite. In this work we present and analyze a non-quadratic convex function for solving any large-scale linear system of equations, regardless of the characteristics of the coefficient matrix. To find the global minimizers of this new convex function, any low-cost iterative optimization technique could be applied. In particular, we propose to use the low-cost, globally convergent Spectral Projected Gradient (SPG) method, which allows us to extend this optimization approach to solving consistent square and rectangular linear systems, as well as linear feasibility problems, with and without convex constraints and with and without preconditioning strategies. Our numerical results indicate that the new scheme outperforms state-of-the-art iterative techniques for solving linear systems when the symmetric part of the coefficient matrix is indefinite, and also for solving linear feasibility problems.
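The classical CG baseline the abstract starts from is easy to state. The sketch below implements it for a symmetric positive-definite system; the paper's own contribution (the non-quadratic convex function and the SPG method) is not reproduced here.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Classic CG for symmetric positive-definite A. It minimizes the
    strictly convex quadratic f(x) = 0.5 x^T A x - b^T x, whose unique
    minimizer solves A x = b; each iteration needs one matrix-vector
    product and searches along A-conjugate directions."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual, equals -grad f(x)
    p = r.copy()           # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # next A-conjugate direction
        rs = rs_new
    return x
```

The paper's point is precisely that this construction breaks down when A is not symmetric positive definite, motivating a different convex objective minimized by SPG.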

  18. A Simple Instrumentation System for Large Structure Vibration Monitoring

    Directory of Open Access Journals (Sweden)

    Didik R. Santoso

    2010-12-01

    Full Text Available Traditional instrumentation systems used for monitoring the vibration of large-scale infrastructure such as bridges, railways, and other structures generally have a complex design. Making them simple would be very useful in terms of both low cost and easy maintenance. This paper describes how to develop such an instrumentation system. The system is built on a distributed network with a field-bus topology, using a single-master, multi-slave architecture. The master is a control unit, built on a PC equipped with an RS-485 interface. Each slave is a sensing unit, built by integrating a 3-axis vibration sensor with a microcontroller-based data acquisition system. The vibration sensor is designed around a MEMS accelerometer, while the software is developed for two functions: controlling the system hardware and processing the data. To verify the performance of the developed instrumentation system, several laboratory tests were performed. The results show that the system performs well.
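In a single-master, multi-slave bus like RS-485, every frame must carry the slave's address and an integrity check so the master can poll units one at a time. The frame layout below is hypothetical (the paper does not specify its protocol); it shows one simple way to package 16-bit acceleration samples with an 8-bit two's-complement checksum.

```python
def encode_frame(slave_addr, samples):
    """Build a simple bus frame: [addr][count][sample lo, hi ...][checksum].
    Samples are 16-bit unsigned values, sent little-endian; the checksum
    makes the byte-sum of the whole frame zero modulo 256."""
    payload = bytearray([slave_addr, len(samples)])
    for s in samples:
        v = s & 0xFFFF
        payload += bytes([v & 0xFF, v >> 8])
    checksum = (256 - sum(payload)) & 0xFF
    return bytes(payload) + bytes([checksum])

def decode_frame(frame):
    """Validate the checksum and recover (slave_addr, samples)."""
    if sum(frame) & 0xFF != 0:
        raise ValueError("checksum mismatch")
    addr, count = frame[0], frame[1]
    body = frame[2:-1]
    samples = [body[2 * i] | (body[2 * i + 1] << 8) for i in range(count)]
    return addr, samples
```

The master would write such a request frame onto the RS-485 line, and only the slave whose address matches would answer, which is what keeps the single-master topology collision-free.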

  19. Advanced energy systems (APU) for large commercial aircraft

    Energy Technology Data Exchange (ETDEWEB)

    Westenberger, A.; Bleil, J.; Arendt, M. [Airbus Deutschland GmbH, Hamburg (Germany)

    2013-06-01

    The use of a highly integrated component based on fuel cell technology, installed on board large commercial passenger aircraft to generate onboard power for the systems demand during an entire aircraft mission, has been the subject of several studies. The results of these studies are based on simulation of the whole system in the context of an aircraft system environment. The work began with analyses of different fuel cell technologies and of the aircraft system environment. Today, onboard power is provided on the ground by an APU and in flight by the main engines. In order to compare fuel cell technology with today's usual gas turbine, the operational characteristics of the latter were analysed. A second analysis was devoted to the system demand for typical aircraft categories. The MEA system concept was assumed in all cases. The favoured concept was an aircraft propelled by conventional engines with starter-generator units providing AC electrical power, covering in total approximately half of the power demand, plus a component based on fuel cell technology. This component provided DC electrical power, clean potable water, thermal energy at 180 degrees Celsius, and nitrogen-enriched air for fire suppression and as a fire-extinguishing agent. In contrast to a usual gas-turbine-based APU, this new unit was operated as the primary power system. (orig.)

  20. Exchanging large data object in multi-agent systems

    Science.gov (United States)

    Al-Yaseen, Wathiq Laftah; Othman, Zulaiha Ali; Nazri, Mohd Zakree Ahmad

    2016-08-01

    One of the Business Intelligence solutions currently in use is the Multi-Agent System (MAS). Communication is one of the most important elements of MAS, especially for exchanging large volumes of low-level data between physically distributed agents. The Agent Communication Language in JADE has been offered as a secure method for sending data, whereby the data is defined as an object. However, such an object cannot be used to send data to an agent in a different location. Therefore, the aim of this paper was to propose a method for exchanging large low-level data as an object by creating a proxy agent, known as a Delivery Agent, which temporarily imitates the Receiver Agent. The results showed that the proposed method is able to send large data objects. The experiments were conducted using 16 datasets ranging from 100,000 to 7 million instances. For the proposed method, the RAM and CPU of the Receiver Agent's machine had to be slightly increased, but the latency was not significantly different from that of the Java Socket method (non-agent-based and less secure). With these results, it was concluded that the proposed method can be used to send large data securely between agents.
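A rough in-process model of the Delivery-Agent idea can be sketched with threads and queues standing in for JADE agents and ACL messages. Everything here is illustrative: the paper's actual classes, message formats, and JADE APIs are not shown in the abstract.

```python
import queue
import threading

def deliver_large_object(payload, chunk_size=64 * 1024):
    """Minimal sketch of relaying a large object through an intermediary:
    the sender hands chunks to a 'Delivery Agent', which forwards them to
    the 'Receiver Agent', which reassembles the original payload. A None
    chunk marks end-of-transfer."""
    to_delivery, to_receiver = queue.Queue(), queue.Queue()
    received = bytearray()

    def delivery_agent():
        while True:
            chunk = to_delivery.get()
            to_receiver.put(chunk)       # forward, imitating the receiver's inbox
            if chunk is None:
                return

    def receiver_agent():
        while True:
            chunk = to_receiver.get()
            if chunk is None:
                return
            received.extend(chunk)

    threads = [threading.Thread(target=delivery_agent),
               threading.Thread(target=receiver_agent)]
    for t in threads:
        t.start()
    for i in range(0, len(payload), chunk_size):
        to_delivery.put(payload[i:i + chunk_size])
    to_delivery.put(None)                # end-of-transfer marker
    for t in threads:
        t.join()
    return bytes(received)
```

In the real system the three parties run on different machines, which is why the intermediary must temporarily imitate the Receiver Agent rather than share its memory.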

  1. Distributed system for large-scale remote research

    International Nuclear Information System (INIS)

    Ueshima, Yutaka

    2002-01-01

    In advanced photon research, large-scale simulations and high-resolution observations are powerful tools. In numerical and real experiments, real-time visualization and steering systems are considered a promising method of data analysis. This approach is suited to analyses performed once, or to low-cost experiments and simulations. When researching an unknown problem, however, the output data must be analyzed many times, because a conclusive analysis is difficult to reach in a single pass. Consequently, output data should be filed so that they can be referred to and analyzed at any time. To support such research, automatic functions are needed for transporting data files from the data generator to data storage, analyzing data, tracking the history of data handling, and so on. The supporting system will be a functionally distributed system. (author)

  2. SIMON: Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Sugawara, Akihiro; Kishimoto, Yasuaki

    2003-01-01

    The development of the SIMON (SImulation MONitoring) system is described. SIMON aims to investigate the many physical phenomena of tokamak-type nuclear fusion plasmas by simulation, and to exchange information and carry out joint research with scientists around the world via the internet. The characteristics of SIMON are as follows: 1) a reduced simulation load through the trigger-sending method, 2) visualization of simulation results and a hierarchical structure of analysis, 3) a reduced number of licenses through command-line use of the software, 4) improved support for networked use of simulation data output through HTML (Hyper Text Markup Language), 5) avoidance of complex built-in work on the client side, and 6) small and portable software. The visualization method for large-scale simulation, the HTML-based remote collaboration system, the trigger-sending method, the hierarchical analysis method, the introduction into a three-dimensional electromagnetic transport code, and the technologies of the SIMON system are explained. (S.Y.)

  3. Cryogenic control system of the large COMPASS polarized target

    CERN Document Server

    Gautheron, F; Baum, G; Berglund, P; Doshita, N; Görtz, S; Gustafsson, K K; Horikawa, N; Kisselev, Yu V; Koivuniemi, J H; Kondo, K; Meyer, Werner T; Reicherz, G

    2004-01-01

    The dilution refrigerator used to cool the large COMPASS polarized target is monitored through a PC running LabVIEW™ 6.1 under Windows 2000™. About 60 parameters of the target (temperatures, pressures, flow rates) are continuously plotted and checked. They are periodically recorded in an Oracle™ database and in a data file. An alarm for every parameter can be individually activated and optionally connected to a GSM (Global System for Mobile Communication) message delivery system. A web server receives and publishes the online status of the target, with online tables and graphics, on a dedicated COMPASS polarized target information web site. A Siemens programmable logic controller (PLC) powered by an uninterruptible source keeps the cryogenic system safe and stable during the long beam periods by controlling valves and interlocks. This safety feature protects the dilution refrigerator against potential damage in case of power failure.

  4. Magnetic Properties of Large-Scale Nanostructured Graphene Systems

    DEFF Research Database (Denmark)

    Gregersen, Søren Schou

    The ongoing progress in two-dimensional (2D) materials and nanostructure fabrication motivates the study of altered and combined materials. Graphene, the most studied material of the 2D family, displays unique electronic and spintronic properties. Exceptionally high electron mobilities, which surpass those in conventional materials such as silicon, make graphene a very interesting material for high-speed electronics. Simultaneously, long spin-diffusion lengths and spin lifetimes make graphene an eligible spin-transport channel. In this thesis, we explore fundamental features of nanostructured graphene systems using large-scale modeling techniques. Graphene perforations, or antidots, have received substantial interest in the prospect of opening large band gaps in the otherwise gapless graphene. Motivated by recent improvements of fabrication processes, such as forming graphene antidots and layer...

  5. Efficient network monitoring for large data acquisition systems

    International Nuclear Information System (INIS)

    Savu, D.O.; Martin, B.; Al-Shabibi, A.; Sjoen, R.; Batraneanu, S.M.; Stancu, S.N.

    2012-01-01

    Though constantly evolving and improving, the available network monitoring solutions have limitations when applied to the infrastructure of a high-speed real-time data acquisition (DAQ) system. DAQ networks are particular computer networks where experts have to pay attention to both individual subsections and system-wide traffic flows while monitoring the network. The ATLAS network at the Large Hadron Collider (LHC) has more than 200 switches interconnecting 3500 hosts and totaling 8500 high-speed links. The use of heterogeneous tools for monitoring various infrastructure parameters, in order to assure optimal DAQ system performance, proved to be a tedious and time-consuming task for experts. To alleviate this problem we used our networking and DAQ expertise to build a flexible and scalable monitoring system providing an intuitive user interface with the same look and feel irrespective of the data provider that is used. Our system uses custom-developed components for critical performance monitoring and seamlessly integrates complementary data from auxiliary tools, such as NAGIOS, information services or custom databases. A number of techniques (e.g. normalization, aggregation and data caching) were used in order to improve the user interface response time. The end result is a unified monitoring interface, for fast and uniform access to system statistics, which significantly reduced the time spent by experts on ad-hoc and post-mortem analysis. (authors)

  6. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to

  7. Market penetration of large wind/diesel systems

    International Nuclear Information System (INIS)

    Kronborg, T.

    1992-01-01

    Burmeister & Wain is developing a large size wind/diesel package in collaboration with Micon, the Danish wind turbine manufacturer, and the Danish utility NESA. The package comprises an initial calculation of the technical feasibility and the economic viability of an actual project, installing the optimum number of large wind turbines, and service, operation, and maintenance as needed. The concept should be seen as an addition to existing diesel-based power stations. Wind turbines are especially advantageous in smaller diesel-based electrical systems in the 1-20 MW range because such systems can have high fuel costs and expensive maintenance. Analysis of the market for the wind/diesel concept indicates islands and remote areas with limited population are likely candidates for implementation of wind/diesel systems. An example of an economic analysis of a wind/diesel application on an isolated island is presented, showing the cost savings possible. To obtain practical experience and to demonstrate the wind/diesel concept, a MW-size demonstration plant is being constructed in Denmark.
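    The kind of first-cut economic screening this record mentions amounts to simple fuel-displacement arithmetic. The figures below are hypothetical placeholders, not NESA or Micon data:

```python
# Hypothetical figures for an isolated island diesel grid (all assumed).
diesel_fuel_cost = 0.30      # $ per kWh of diesel generation displaced
wind_capacity_mw = 2.0       # installed wind turbine capacity
capacity_factor = 0.35       # average output as a fraction of rated power
hours_per_year = 8760

wind_energy_kwh = wind_capacity_mw * 1000 * capacity_factor * hours_per_year
annual_fuel_savings = wind_energy_kwh * diesel_fuel_cost
print(f"{wind_energy_kwh:.0f} kWh/yr displaced, ${annual_fuel_savings:,.0f}/yr saved")
```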

  8. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with large number of high capacity nodes and transmission links, and shared by a large number of users...

  9. Reliable pipeline repair system for very large pipe size

    Energy Technology Data Exchange (ETDEWEB)

    Charalambides, John N.; Sousa, Alexandre Barreto de [Oceaneering International, Inc., Houston, TX (United States)

    2004-07-01

    The oil and gas industry worldwide has mainly depended on the long-term reliability of rigid pipelines to ensure the transportation of hydrocarbons, crude oil, gas, fuel, etc. Many other methods are also utilized onshore and offshore (e.g. flexible lines, FPSO's, etc.), but when it comes to the underwater transportation of very high volumes of oil and gas, the industry commonly uses large size rigid pipelines (i.e. steel pipes). Oil and gas operators have learned to depend on the long-lasting integrity of these very large pipelines, and many times they forget or disregard that even steel pipelines degrade over time and, more often than not, are also susceptible to various forms of damage (minor or major, environmental or external, etc.). Over recent years the industry has recognized the need to implement an 'emergency repair plan' to account for such unforeseen events, and oil and gas operators have become 'smarter' by being 'pro-active' in order to ensure 'flow assurance'. For very large diameter steel pipelines such as 42'' and 48'' nominal pipe size (NPS), the industry worldwide does not provide 'ready-made', 'off-the-shelf' repair hardware that can be easily shipped to the offshore location to effect a major repair within acceptable time frames and avoid substantial profit losses due to 'down-time' in production. The typical time required to establish a solid repair system for large pipe diameters can be as long as six or more months (depending on the availability of raw materials). This paper will present in detail the Emergency Pipeline Repair Systems (EPRS) that Oceaneering successfully designed, manufactured, tested and provided to two major oil and gas operators, located in two different continents (Gulf of Mexico, U.S.A. and Arabian Gulf, U.A.E.), for two different very large pipe sizes (42'' and 48'' Nominal Pipe Sizes

  10. Surface Nuclear Magnetic Resonance Imaging of Large Systems

    International Nuclear Information System (INIS)

    Weichman, P.B.; Lavely, E.M.; Ritzwoller, M.H.

    1999-01-01

    The general theory of surface NMR imaging of large electromagnetically active systems is considered, motivated by geophysical applications. A general imaging equation is derived for the NMR voltage response, valid for arbitrary transmitter and receiver loop geometry and arbitrary conductivity structure of the sample. When the conductivity grows to the point where the electromagnetic skin depth becomes comparable to the sample size, significant diffusive retardation effects occur that strongly affect the signal. Accounting for these now allows more accurate imaging than previously possible. It is shown that the time constant T1 may in principle be inferred directly from the diffusive tail of the signal. Copyright 1999 The American Physical Society
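    The record's remark that T1 can be inferred from the tail of the signal can be illustrated with a toy fit: generate an idealized exponential relaxation tail and recover the time constant by linear regression on its logarithm. The amplitude, time grid, and T1 value below are assumed for illustration; the paper's actual imaging equation is more involved:

```python
import numpy as np

T1_true = 0.25                         # s, assumed
t = np.linspace(0.0, 1.0, 200)
signal = 3.0 * np.exp(-t / T1_true)    # idealized noise-free decay tail

# log(signal) is linear in t with slope -1/T1.
slope, _ = np.polyfit(t, np.log(signal), 1)
T1_est = -1.0 / slope
print(T1_est)
```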

  11. Theoretical restrictions on longest implicit time scales in Markov state models of biomolecular dynamics

    Science.gov (United States)

    Sinitskiy, Anton V.; Pande, Vijay S.

    2018-01-01

    Markov state models (MSMs) have been widely used to analyze computer simulations of various biomolecular systems. They can capture conformational transitions much slower than the average or maximal length of a single molecular dynamics (MD) trajectory from the set of trajectories used to build the MSM. A rule of thumb, long known in the field, claims that the slowest implicit time scale captured by an MSM should be comparable in order of magnitude to the aggregate duration of all MD trajectories used to build the MSM. However, this rule has never been formally proved. In this work, we present analytical results for the slowest time scale in several types of MSMs, supporting the above rule. We conclude that the slowest implicit time scale equals the product of the aggregate sampling and four factors that quantify: (1) how much statistics on the conformational transitions corresponding to the longest implicit time scale is available, (2) how good the sampling of the destination Markov state is, (3) the gain in statistics from using a sliding window for counting transitions between Markov states, and (4) a bias in the estimate of the implicit time scale arising from finite sampling of the conformational transitions. We demonstrate that in many practically important cases all four factors are on the order of unity, and we analyze possible scenarios that could lead to their significant deviation from unity. Overall, we provide for the first time analytical results on the slowest time scales captured by MSMs. These results can guide further practical applications of MSMs to biomolecular dynamics and allow for higher computational efficiency of simulations.
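    The implicit time scales this record discusses follow from the eigenvalues of an MSM transition matrix via t_i = -tau / ln(lambda_i). A minimal sketch, with a hypothetical 3-state count matrix and lag time standing in for real MD statistics:

```python
import numpy as np

# Hypothetical transition counts between 3 Markov states at lag time tau
# (entry [i, j] = observed transitions from state i to state j).
counts = np.array([[900.0, 10.0, 2.0],
                   [10.0, 450.0, 5.0],
                   [2.0, 5.0, 120.0]])
tau = 10.0  # lag time, in ns (assumed)

# Row-normalize counts into a transition probability matrix.
T = counts / counts.sum(axis=1, keepdims=True)

# Implied time scales: t_i = -tau / ln(lambda_i) for eigenvalues below 1
# (the stationary eigenvalue lambda = 1 carries no time scale).
eigvals = np.sort(np.linalg.eigvals(T).real)[::-1]
timescales = [-tau / np.log(lam) for lam in eigvals if 1e-12 < lam < 1 - 1e-10]
print(timescales[0])  # slowest implicit time scale
```

    Note that the slowest time scale comes out much longer than the lag time itself, which is what makes MSMs useful for extrapolating beyond individual trajectory lengths.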

  12. The Pathological Spectrum of Systemic Anaplastic Large Cell Lymphoma (ALCL)

    Directory of Open Access Journals (Sweden)

    Ivonne A. Montes-Mojarro

    2018-04-01

    Full Text Available Anaplastic large cell lymphoma (ALCL) represents a group of malignant T-cell lymphoproliferations that share morphological and immunophenotypical features, namely strong CD30 expression and variable loss of T-cell markers, but differ in clinical presentation and prognosis. The recognition of anaplastic lymphoma kinase (ALK) fusion proteins as a result of chromosomal translocations or inversions was the starting point for the distinction of different subgroups of ALCL. According to their distinct clinical settings and molecular findings, the 2016 revised World Health Organization (WHO) classification recognizes four different entities: systemic ALK-positive ALCL (ALK+ ALCL), systemic ALK-negative ALCL (ALK− ALCL), primary cutaneous ALCL (pC-ALCL), and breast implant-associated ALCL (BI-ALCL), the latter included as a provisional entity. ALK is rearranged in approximately 80% of systemic ALCL cases with one of its partner genes, most commonly NPM1, and is associated with favorable prognosis, whereas systemic ALK− ALCL shows heterogeneous clinical, phenotypical, and genetic features, underlining the different oncogenesis between these two entities. Recognition of the pathological spectrum of ALCL is crucial to understand its pathogenesis and its boundaries with other entities. In this review, we will focus on the morphological, immunophenotypical, and molecular features of systemic ALK+ and ALK− ALCL. In addition, BI-ALCL will be discussed.

  13. An integrated system for large scale scanning of nuclear emulsions

    Energy Technology Data Exchange (ETDEWEB)

    Bozza, Cristiano, E-mail: kryss@sa.infn.it [University of Salerno and INFN, via Ponte Don Melillo, Fisciano 84084 (Italy); D’Ambrosio, Nicola [Laboratori Nazionali del Gran Sasso, S.S. 17 BIS km 18.910, Assergi (AQ) 67010 (Italy); De Lellis, Giovanni [University of Napoli and INFN, Complesso Universitario di Monte Sant' Angelo, via Cintia Ed. G, Napoli 80126 (Italy); De Serio, Marilisa [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); Di Capua, Francesco [INFN Napoli, Complesso Universitario di Monte Sant' Angelo, via Cintia Ed. G, Napoli 80126 (Italy); Di Crescenzo, Antonia [University of Napoli and INFN, Complesso Universitario di Monte Sant' Angelo, via Cintia Ed. G, Napoli 80126 (Italy); Di Ferdinando, Donato [INFN Bologna, viale B. Pichat 6/2, Bologna 40127 (Italy); Di Marco, Natalia [Laboratori Nazionali del Gran Sasso, S.S. 17 BIS km 18.910, Assergi (AQ) 67010 (Italy); Esposito, Luigi Salvatore [Laboratori Nazionali del Gran Sasso, now at CERN, Geneva (Switzerland); Fini, Rosa Anna [INFN Bari, via E. Orabona 4, Bari 70125 (Italy); Giacomelli, Giorgio [University of Bologna and INFN, viale B. Pichat 6/2, Bologna 40127 (Italy); Grella, Giuseppe [University of Salerno and INFN, via Ponte Don Melillo, Fisciano 84084 (Italy); Ieva, Michela [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); Kose, Umut [INFN Padova, via Marzolo 8, Padova (PD) 35131 (Italy); Longhin, Andrea; Mauri, Nicoletta [INFN Laboratori Nazionali di Frascati, via E. Fermi 40, Frascati (RM) 00044 (Italy); Medinaceli, Eduardo [University of Padova and INFN, via Marzolo 8, Padova (PD) 35131 (Italy); Monacelli, Piero [University of L' Aquila and INFN, via Vetoio Loc. Coppito, L' Aquila (AQ) 67100 (Italy); Muciaccia, Maria Teresa; Pastore, Alessandra [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); and others

    2013-03-01

    The European Scanning System, developed to analyse nuclear emulsions at high speed, has been completed with the development of a high level software infrastructure to automate and support large-scale emulsion scanning. In one year, an average installation is capable of performing data-taking and online analysis on a total surface ranging from a few m² to tens of m², acquiring many billions of tracks, corresponding to several TB. This paper focuses on the procedures that have been implemented and on their impact on physics measurements. The system proved robust, reliable, fault-tolerant and user-friendly, and seldom needs assistance. A dedicated relational Data Base system is the backbone of the whole infrastructure, storing the data themselves and not merely catalogues of data files, as is common practice, making it a unique case among high-energy physics DAQ systems. The logical organisation of the system is described and a summary is given of the physics measurements that are readily available by automated processing.

  14. Large quantum systems: a mathematical and numerical perspective

    International Nuclear Information System (INIS)

    Lewin, M.

    2009-06-01

    This thesis is devoted to the mathematical study of variational models for large quantum systems. The mathematical methods are those of nonlinear analysis, calculus of variations, partial differential equations, spectral theory, and numerical analysis. The first part contains some results on finite systems. We study several approximations of the N-body Schroedinger equation for electrons in an atom or a molecule, and then the so-called Hartree-Fock-Bogoliubov model for a system of fermions interacting via the gravitational force. In a second part, we propose a new method that allows us to prove the existence of the thermodynamic limit of Coulomb quantum systems. Then, we construct two Hartree-Fock-type models for infinite systems. The first is a relativistic theory deduced from Quantum Electrodynamics, describing the behavior of electrons coupled to that of Dirac's vacuum, which can become polarized. The second model describes a nonrelativistic quantum crystal in the presence of a charged defect. A new numerical method is also proposed. The last part of the thesis is devoted to spectral pollution, a phenomenon which is observed when trying to approximate eigenvalues in a gap of the essential spectrum of a self-adjoint operator, for instance for periodic Schroedinger operators or Dirac operators. (author)

  15. An integrated system for large scale scanning of nuclear emulsions

    International Nuclear Information System (INIS)

    Bozza, Cristiano; D’Ambrosio, Nicola; De Lellis, Giovanni; De Serio, Marilisa; Di Capua, Francesco; Di Crescenzo, Antonia; Di Ferdinando, Donato; Di Marco, Natalia; Esposito, Luigi Salvatore; Fini, Rosa Anna; Giacomelli, Giorgio; Grella, Giuseppe; Ieva, Michela; Kose, Umut; Longhin, Andrea; Mauri, Nicoletta; Medinaceli, Eduardo; Monacelli, Piero; Muciaccia, Maria Teresa; Pastore, Alessandra

    2013-01-01

    The European Scanning System, developed to analyse nuclear emulsions at high speed, has been completed with the development of a high level software infrastructure to automate and support large-scale emulsion scanning. In one year, an average installation is capable of performing data-taking and online analysis on a total surface ranging from a few m² to tens of m², acquiring many billions of tracks, corresponding to several TB. This paper focuses on the procedures that have been implemented and on their impact on physics measurements. The system proved robust, reliable, fault-tolerant and user-friendly, and seldom needs assistance. A dedicated relational Data Base system is the backbone of the whole infrastructure, storing the data themselves and not merely catalogues of data files, as is common practice, making it a unique case among high-energy physics DAQ systems. The logical organisation of the system is described and a summary is given of the physics measurements that are readily available by automated processing.

  16. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends initial design and development into a sustained and ongoing stepwise implementation that constitutes an overall technology-driven organizational change. The process model is presented through a large-scale PD experiment in the Danish healthcare sector. We reflect on our experiences from this experiment...

  17. The Liquid Argon Calorimeter system for the SLC Large Detector

    International Nuclear Information System (INIS)

    Haller, G.M.; Fox, J.D.; Smith, S.R.

    1988-09-01

    In this paper the physical packaging and the logical organization of the Liquid Argon Calorimeter (LAC) electronics system for the Stanford Linear Collider Large Detector (SLD) at SLAC are described. This system processes signals from approximately 44,000 calorimeter towers and is unusual in that most electronic functions are packaged within the detector itself as opposed to an external electronics support rack. The signal path from the towers in the liquid argon through the vacuum to the outside of the detector is explained. The organization of the control logic, analog electronics, power regulation, analog-to-digital conversion circuits, and fiber optic drivers mounted directly on the detector are described. Redundancy considerations for the electronics and cooling issues are discussed. 12 refs., 5 figs

  18. System concept for a moderate cost Large Deployable Reflector (LDR)

    Science.gov (United States)

    Swanson, P. N.; Breckinridge, J. B.; Diner, A.; Freeland, R. E.; Irace, W. R.; Mcelroy, P. M.; Meinel, A. B.; Tolivar, A. F.

    1986-01-01

    A study was carried out at JPL during the first quarter of 1985 to develop a system concept for NASA's LDR. Major features of the concept are a four-mirror, two-stage optical system; a lightweight structural composite segmented primary reflector; and a deployable truss backup structure with integral thermal shield. The two-stage optics uses active figure control at the quaternary reflector located at the primary reflector exit pupil, allowing the large primary to be passive. The lightweight composite reflector panels limit the short-wavelength operation to approximately 30 microns but reduce the total primary reflector weight by a factor of 3 to 4 over competing technologies. On-orbit thermal analysis indicates a primary reflector equilibrium temperature of less than 200 K with a maximum gradient of about 5 C across the 20-m aperture. Weight and volume estimates are consistent with a single Shuttle launch, and are based on Space Station assembly and checkout.

  19. Large linear magnetoresistivity in strongly inhomogeneous planar and layered systems

    International Nuclear Information System (INIS)

    Bulgadaev, S.A.; Kusmartsev, F.V.

    2005-01-01

    Explicit expressions for the magnetoresistance R of planar and layered strongly inhomogeneous two-phase systems are obtained using an exact dual transformation, which connects the effective conductivities of in-plane isotropic two-phase systems with and without a magnetic field. These expressions describe the magnetoresistance of various inhomogeneous media at arbitrary concentrations x and magnetic fields H. All expressions show a large linear magnetoresistance effect with different dependencies on the phase concentrations. Plots of the x- and H-dependencies of R(x,H) are presented for various values of magnetic field and concentration at several values of the inhomogeneity parameter. The results show a remarkable similarity to existing experimental data on linear magnetoresistance in the silver chalcogenides Ag2+δSe, and a possible physical explanation of this similarity is proposed. It is shown that random, stripe-type inhomogeneity structures are the most suitable for fabricating magnetic sensors and for information storage at room temperature.

  20. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show this, we adapted to the visual domain an auditory paradigm developed by Sussman, E., Ritter, W., and Vaughan, H. G. Jr. (1998). Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170, and Sussman, E., and Gumenyuk, V. (2005). Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523, by presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparing the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in the human visual sensory system, revealed that the visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition, supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.
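    The two stimulus sequences of this oddball paradigm are easy to reproduce; the sequence length below is an arbitrary illustrative choice:

```python
import random

random.seed(42)
n_trials = 500  # illustrative sequence length

# Fixed condition: every 5th stimulus is a luminance deviant (SSSSD...).
fixed = ['D' if (i + 1) % 5 == 0 else 'S' for i in range(n_trials)]

# Randomized condition: same 20% deviant rate, positions shuffled.
randomized = fixed.copy()
random.shuffle(randomized)

print(''.join(fixed[:15]), ''.join(randomized[:15]))
```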

  1. ANCA: Anharmonic Conformational Analysis of Biomolecular Simulations.

    Science.gov (United States)

    Parvatikar, Akash; Vacaliuc, Gabriel S; Ramanathan, Arvind; Chennubhotla, S Chakra

    2018-05-08

    Anharmonicity in time-dependent conformational fluctuations is noted to be a key feature of the functional dynamics of biomolecules. Although anharmonic events are rare, long-timescale (μs-ms and beyond) simulations facilitate probing of such events. We have previously developed quasi-anharmonic analysis to resolve higher-order spatial correlations and characterize anharmonicity in biomolecular simulations. In this article, we have extended this toolbox to resolve higher-order temporal correlations and built a scalable Python package called anharmonic conformational analysis (ANCA). ANCA has modules to: 1) measure anharmonicity in the form of higher-order statistics and its variation as a function of time, 2) output a storyboard representation of the simulations to identify key anharmonic conformational events, and 3) identify putative anharmonic conformational substates and visualize transitions between these substates. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
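    One of the simplest higher-order statistics used to flag anharmonicity is the fourth moment: excess kurtosis is near zero for a purely harmonic (Gaussian) fluctuation and deviates for multi-well, anharmonic motion. A self-contained toy comparison (the synthetic "reaction coordinates" below are illustrative, not ANCA output):

```python
import numpy as np

rng = np.random.default_rng(0)
# Harmonic mode: Gaussian fluctuations around a single minimum.
harmonic = rng.normal(0.0, 1.0, 5000)
# Anharmonic mode: hopping between two wells at +/-2 (bimodal).
two_well = np.concatenate([rng.normal(-2.0, 0.5, 2500),
                           rng.normal(2.0, 0.5, 2500)])

def excess_kurtosis(x):
    """Fourth standardized moment minus 3; ~0 for a Gaussian signal."""
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

print(excess_kurtosis(harmonic), excess_kurtosis(two_well))
```

    The strongly negative value for the two-well signal is the kind of deviation from Gaussianity that higher-order analyses use to locate anharmonic conformational events.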

  2. Orientation of biomolecular assemblies in a microfluidic jet

    International Nuclear Information System (INIS)

    Priebe, M; Kalbfleisch, S; Tolkiehn, M; Salditt, T; Koester, S; Abel, B; Davies, R J

    2010-01-01

    We have investigated multilamellar lipid assemblies in a microfluidic jet, operating at high shear rates of the order of 10⁷ s⁻¹. Compared to classical Couette cells or rheometers, the shear rate was increased by at least 2-3 orders of magnitude, and the sample volume was scaled down correspondingly. At the same time, the jet is characterized by high extensional stress due to elongational flow. A focused synchrotron x-ray beam was used to measure the structure and orientation of the lipid assemblies in the jet. The diffraction patterns indicate conventional multilamellar phases, aligned with the membrane normals oriented along the velocity gradient of the jet. The results indicate that the setup may be well suited for coherent diffractive imaging of oriented biomolecular assemblies and macromolecules at the future x-ray free electron laser (XFEL) sources.

  3. Hybrid organic semiconductor lasers for bio-molecular sensing.

    Science.gov (United States)

    Haughey, Anne-Marie; Foucher, Caroline; Guilhabert, Benoit; Kanibolotsky, Alexander L; Skabara, Peter J; Burley, Glenn; Dawson, Martin D; Laurand, Nicolas

    2014-01-01

    Bio-functionalised luminescent organic semiconductors are attractive for biophotonics because they can act as efficient laser materials while simultaneously interacting with molecules. In this paper, we present and discuss a laser biosensor platform that utilises a gain layer made of such an organic semiconductor material. The simple structure of the sensor and its operation principle are described. Nanolayer detection is shown experimentally and analysed theoretically in order to assess the potential and the limits of the biosensor. The advantage conferred by the organic semiconductor is explained, and comparisons to laser sensors using alternative dye-doped materials are made. Specific biomolecular sensing is demonstrated, and routes to functionalisation with nucleic acid probes, and future developments opened up by this achievement, are highlighted. Finally, attractive formats for sensing applications are mentioned, as well as colloidal quantum dots, which in the future could be used in conjunction with organic semiconductors.

  4. Design rules for biomolecular adhesion: lessons from force measurements.

    Science.gov (United States)

    Leckband, Deborah

    2010-01-01

    Cell adhesion to matrix, other cells, or pathogens plays a pivotal role in many processes in biomolecular engineering. Early macroscopic methods of quantifying adhesion led to the development of quantitative models of cell adhesion and migration. The more recent use of sensitive probes to quantify the forces that alter or manipulate adhesion proteins has revealed much greater functional diversity than was apparent from population average measurements of cell adhesion. This review highlights theoretical and experimental methods that identified force-dependent molecular properties that are central to the biological activity of adhesion proteins. Experimental and theoretical methods emphasized in this review include the surface force apparatus, atomic force microscopy, and vesicle-based probes. Specific examples given illustrate how these tools have revealed unique properties of adhesion proteins and their structural origins.

  5. Two-level systems driven by large-amplitude fields

    International Nuclear Information System (INIS)

    Ashhab, S.; Johansson, J. R.; Zagoskin, A. M.; Nori, Franco

    2007-01-01

    We analyze the dynamics of a two-level system subject to driving by large-amplitude external fields, focusing on the resonance properties in the case of driving around the region of avoided level crossing. In particular, we consider three main questions that characterize resonance dynamics: (1) the resonance condition, (2) the frequency of the resulting oscillations on resonance, and (3) the width of the resonance. We identify the regions of validity of different approximations. In a large region of the parameter space, we use a geometric picture in order to obtain both a simple understanding of the dynamics and quantitative results. The geometric approach is obtained by dividing the evolution into discrete time steps, with each time step described by either a phase shift on the basis states or a coherent mixing process corresponding to a Landau-Zener crossing. We compare the results of the geometric picture with those of a rotating wave approximation. We also comment briefly on the prospects of employing strong driving as a useful tool to manipulate two-level systems.
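    The strongly driven two-level problem can be explored numerically by stepping the Schrödinger equation with the exact 2x2 propagator: for a traceless Hermitian H, exp(-iH dt) = cos(E dt) I - i sin(E dt) H/E with E the instantaneous energy scale. This brute-force sketch is not the paper's geometric method, and the parameter values below are assumed for illustration:

```python
import numpy as np

sx = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
sz = np.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def propagator(H, dt):
    """Exact exp(-i H dt) for a traceless Hermitian 2x2 H (hbar = 1)."""
    E = np.sqrt(abs(H[0, 0]) ** 2 + abs(H[0, 1]) ** 2)
    return I2 if E == 0 else np.cos(E * dt) * I2 - 1j * np.sin(E * dt) * H / E

# Assumed parameters in units of the gap: strong driving, A >> Delta.
Delta, A, omega = 1.0, 10.0, 2.0
dt, nsteps = 0.001, 20000

psi = np.array([1.0, 0.0], dtype=complex)  # start in one diabatic state
p_excited = []
for n in range(nsteps):
    H = -0.5 * Delta * sx - 0.5 * A * np.cos(omega * n * dt) * sz
    psi = propagator(H, dt) @ psi
    p_excited.append(abs(psi[1]) ** 2)

print(max(p_excited))
```

    Each drive period sweeps the system twice through the avoided crossing, so the trace of p_excited displays the alternation of phase accumulation and Landau-Zener mixing that the geometric picture formalizes.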

  6. Potential large missions enabled by NASA's space launch system

    Science.gov (United States)

    Stahl, H. Philip; Hopkins, Randall C.; Schnell, Andrew; Smith, David A.; Jackman, Angela; Warfield, Keith R.

    2016-07-01

    Large space telescope missions have always been limited by their launch vehicle's mass and volume capacities. The Hubble Space Telescope (HST) was specifically designed to fit inside the Space Shuttle and the James Webb Space Telescope (JWST) is specifically designed to fit inside an Ariane 5. Astrophysicists desire even larger space telescopes. NASA's "Enduring Quests Daring Visions" report calls for an 8- to 16-m Large UV-Optical-IR (LUVOIR) Surveyor mission to enable ultra-high-contrast spectroscopy and coronagraphy. AURA's "From Cosmic Birth to Living Earth" report calls for a 12-m class High-Definition Space Telescope to pursue transformational scientific discoveries. NASA's "Planning for the 2020 Decadal Survey" calls for a Habitable Exoplanet Imaging (HabEx) and a LUVOIR as well as Far-IR and an X-Ray Surveyor missions. Packaging larger space telescopes into existing launch vehicles is a significant engineering complexity challenge that drives cost and risk. NASA's planned Space Launch System (SLS), with its 8 or 10-m diameter fairings and ability to deliver 35 to 45-mt of payload to Sun-Earth-Lagrange-2, mitigates this challenge by fundamentally changing the design paradigm for large space telescopes. This paper reviews the mass and volume capacities of the planned SLS, discusses potential implications of these capacities for designing large space telescope missions, and gives three specific mission concept implementation examples: a 4-m monolithic off-axis telescope, an 8-m monolithic on-axis telescope and a 12-m segmented on-axis telescope.

  7. Potential Large Decadal Missions Enabled by NASA's Space Launch System

    Science.gov (United States)

    Stahl, H. Philip; Hopkins, Randall C.; Schnell, Andrew; Smith, David Alan; Jackman, Angela; Warfield, Keith R.

    2016-01-01

    Large space telescope missions have always been limited by their launch vehicle's mass and volume capacities. The Hubble Space Telescope (HST) was specifically designed to fit inside the Space Shuttle and the James Webb Space Telescope (JWST) is specifically designed to fit inside an Ariane 5. Astrophysicists desire even larger space telescopes. NASA's "Enduring Quests Daring Visions" report calls for an 8- to 16-m Large UV-Optical-IR (LUVOIR) Surveyor mission to enable ultra-high-contrast spectroscopy and coronagraphy. AURA's "From Cosmic Birth to Living Earth" report calls for a 12-m class High-Definition Space Telescope to pursue transformational scientific discoveries. NASA's "Planning for the 2020 Decadal Survey" calls for a Habitable Exoplanet Imaging (HabEx) and a LUVOIR as well as Far-IR and an X-Ray Surveyor missions. Packaging larger space telescopes into existing launch vehicles is a significant engineering complexity challenge that drives cost and risk. NASA's planned Space Launch System (SLS), with its 8 or 10-m diameter fairings and ability to deliver 35 to 45-mt of payload to Sun-Earth-Lagrange-2, mitigates this challenge by fundamentally changing the design paradigm for large space telescopes. This paper reviews the mass and volume capacities of the planned SLS, discusses potential implications of these capacities for designing large space telescope missions, and gives three specific mission concept implementation examples: a 4-m monolithic off-axis telescope, an 8-m monolithic on-axis telescope and a 12-m segmented on-axis telescope.

  8. High-speed AFM for Studying Dynamic Biomolecular Processes

    Science.gov (United States)

    Ando, Toshio

    2008-03-01

    Biological molecules show their vital activities only in aqueous solutions. It had long been a dream in the biological sciences to directly observe biological macromolecules (proteins, DNA) at work under physiological conditions, because such observation leads directly to an understanding of their dynamic behaviors and functional mechanisms. Optical microscopy lacks sufficient spatial resolution, and electron microscopy is not applicable to in-liquid samples. Atomic force microscopy (AFM) can visualize molecules in liquids at high resolution, but its imaging rate was too low to capture dynamic biological processes. The imaging rate is slow because AFM employs mechanical probes (cantilevers) and mechanical scanners to detect the sample height at each pixel. It is quite difficult to quickly move a mechanical device of macroscopic size with sub-nanometer accuracy without producing unwanted vibrations, and equally difficult to maintain the delicate contact between the probe tip and fragile samples. Two key techniques are required to realize high-speed AFM for biological research: fast feedback control to maintain a weak tip-sample interaction force, and a technique to suppress mechanical vibrations of the scanner. Various efforts have been carried out in the past decade to materialize high-speed AFM. The current high-speed AFM can capture video images at 30-60 frames/s for a scan range of 250 nm and 100 scan lines, without significantly disturbing weak biomolecular interactions. Our recent studies demonstrated that this new microscope can reveal biomolecular processes such as myosin V walking along actin tracks and the association/dissociation dynamics of chaperonin GroEL-GroES, which occur in a negatively cooperative manner. The capacity for nanometer-scale visualization of dynamic processes in liquids will innovate biological research. In addition, it will open a new way to study dynamic chemical/physical processes of various phenomena that occur at liquid-solid interfaces.

  9. A large capacity, high-speed multiparameter multichannel analysis system

    International Nuclear Information System (INIS)

    Hendricks, R.W.; Suehiro, S.; Seeger, P.A.; Scheer, J.W.

    1982-01-01

    A data acquisition system for recording multiparameter digital data into a large memory array at over 2.5 MHz is described. The system consists of a MOSTEK MK 8600 2048 K x 24-bit memory system, I/O ports to various external devices including the CAMAC dataway, a memory incrementer/adder and a daisy-chain of experiment-specific modules which calculate the memory address which is to be incremented. The design of the daisy-chain permits multiple modules and provides for easy modification as experimental needs change. The system has been designed for use in multiparameter, multichannel analysis of high-speed data gathered by position-sensitive detectors at conventional and synchrotron X-ray sources as well as for fixed energy and time-of-flight diffraction at continuous and pulsed neutron sources. Modules which have been developed to date include a buffer for two-dimensional position-sensitive detectors, a mapper for high-speed coordinate transformations, a buffered time-of-flight clock, a time-correlator for synchronized diffraction experiments, and a display unit for data bus diagnostics. (orig.)
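
The address-calculation-and-increment scheme described above can be mimicked in software. The following sketch (bin counts and function names are ours, purely illustrative) shows how a multiparameter event maps to one cell of a flat histogram memory, which is then incremented:

```python
# Software analogue of the hardware histogramming scheme: each event's
# parameters are mapped to a flat memory address whose counter is then
# incremented. Bin counts below are hypothetical, not from the paper.
NX, NY, NT = 64, 64, 128          # position and time-of-flight bins
memory = [0] * (NX * NY * NT)     # the "large memory array"

def address(x_bin, y_bin, t_bin):
    """Flat address for an (x, y, time-of-flight) event, row-major order."""
    return x_bin + NX * (y_bin + NY * t_bin)

def record_event(x_bin, y_bin, t_bin):
    memory[address(x_bin, y_bin, t_bin)] += 1   # the incrementer/adder step

record_event(3, 5, 17)
record_event(3, 5, 17)
print(memory[address(3, 5, 17)])  # -> 2
```

In the real instrument the address computation is performed by the experiment-specific daisy-chain modules so that events can be binned at over 2.5 MHz without processor intervention.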

  10. Staghorn: An Automated Large-Scale Distributed System Analysis Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gabert, Kasimir [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, Ian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Steven [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kallaher, Jenna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vail, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-01

    Conducting experiments on large-scale distributed computing systems is becoming significantly easier with the assistance of emulation. Researchers can now create a model of a distributed computing environment and then generate a virtual, laboratory copy of the entire system composed of potentially thousands of virtual machines, switches, and software. The use of real software, running at clock rate in full virtual machines, allows experiments to produce meaningful results without necessitating a full understanding of all model components. However, the ability to inspect and modify elements within these models is bound by the limitation that such modifications must compete with the model, either running in or alongside it. This inhibits entire classes of analyses from being conducted upon these models. We developed a mechanism to snapshot an entire emulation-based model as it is running. This allows us to "freeze time" and subsequently fork execution, replay execution, modify arbitrary parts of the model, or deeply explore the model. This snapshot includes capturing packets in transit and other input/output state along with the running virtual machines. We were able to build this system in Linux using Open vSwitch and Kernel Virtual Machines on top of Sandia's emulation platform Firewheel. This primitive opens the door to numerous subsequent analyses on models, including state space exploration, debugging distributed systems, performance optimizations, improved training environments, and improved experiment repeatability.

  11. Proceedings of the international advisory committee on 'biomolecular dynamics instrument DNA' and the workshop on 'biomolecular dynamics backscattering spectrometers'

    International Nuclear Information System (INIS)

    Arai, Masatoshi; Aizawa, Kazuya; Nakajima, Kenji; Shibata, Kaoru; Takahashi, Nobuaki

    2008-08-01

    A workshop entitled 'Biomolecular Dynamics Backscattering Spectrometers' was held on February 27th - 29th, 2008 at the J-PARC Center, Japan Atomic Energy Agency. The workshop was planned with the aim of realizing an innovative neutron backscattering instrument, DNA, in the MLF, and four leading scientists in the field of neutron backscattering instruments were accordingly invited to serve as the International Advisory Committee (IAC members: Dr. Dan Neumann (Chair), Prof. Ferenc Mezei, Dr. Hannu Mutka and Dr. Philip Tregenna-Piggott) for DNA, drawn from institutes in the United States, France and Switzerland where backscattering instruments are in service. The meeting therefore took the form of lectures first, followed by the committee session. This report includes the executive summary of the IAC and the materials presented at the IAC meeting and the workshop. (author)

  12. A very high performance stabilization system for large mass bolometer experiments

    Energy Technology Data Exchange (ETDEWEB)

    Arnaboldi, C. [Sezione INFN di Milano Bicocca, Piazza della Scienza 3, I-20126 Milano (Italy); Universita di Milano Bicocca, Piazza della Scienza 3, I-20126 Milano (Italy); Giachero, A., E-mail: Andrea.Giachero@mib.infn.it [Sezione INFN di Milano Bicocca, Piazza della Scienza 3, I-20126 Milano (Italy); Universita di Milano Bicocca, Piazza della Scienza 3, I-20126 Milano (Italy); Gotti, C. [Sezione INFN di Milano Bicocca, Piazza della Scienza 3, I-20126 Milano (Italy); Universita di Firenze, Dipartimento di Elettronica e Telecomunicazioni, Via S. Marta 3, I-50139 Firenze (Italy); Pessina, G. [Sezione INFN di Milano Bicocca, Piazza della Scienza 3, I-20126 Milano (Italy); Universita di Milano Bicocca, Piazza della Scienza 3, I-20126 Milano (Italy)

    2011-10-01

    CUORE is a large-mass bolometric experiment, composed of 988 crystals, under construction in Hall A of the Gran Sasso Underground Laboratories (LNGS). Its main aim is the study of the neutrinoless double beta decay of {sup 130}Te. Each bolometer is a 760 g crystal of tellurium dioxide on which a Nuclear Transmutation Doped Ge thermistor (Ge NTD) is glued with proper thermal contact. Stability of the system is mandatory over many years of data taking. To meet this requirement a heating resistor is glued on each detector, across which a voltage pulse can be injected at will to develop a known, calibrated heating power. We present the design of a pulse generator system for injecting such small, short voltage pulses across the heaters. The system is composed of custom PCB boards, each with multiple independent output channels that are fully remotely programmable in pulse width and amplitude from the acquisition system through an on-board ARM7 microcontroller. Pulse amplitudes must be selectable in order to operate each detector over its full dynamic range. The resolution of the output voltage is 12 bits over a 10 V range, and an additional 4-step programmable voltage attenuator is added at every output. The width of any pulse can range from 100 {mu}s to 25.5 ms. The main features of the final system are stability and precision in pulse generation (at the level of less than a ppm/{sup o}C), low cost (thanks to the use of commercial components) and compact implementation.
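
As a sanity check on the figures quoted above, the pulser's amplitude step follows from 12 bits over 10 V, and the 100 {mu}s to 25.5 ms width range happens to equal 1-255 counts of a 100 {mu}s clock, which suggests (our inference, not stated in the abstract) an 8-bit width register:

```python
# Back-of-the-envelope check of the pulser figures quoted above.
full_scale_v = 10.0
dac_bits = 12
lsb_v = full_scale_v / (2 ** dac_bits)   # smallest amplitude step
print(round(lsb_v * 1e3, 2))             # -> 2.44 (mV per DAC count)

# The width range 100 us .. 25.5 ms equals 1..255 counts of a 100 us
# clock, i.e. an 8-bit width register (our inference, not stated above).
step_us = 100
max_width_ms = 255 * step_us / 1000
print(max_width_ms)                      # -> 25.5
```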

  13. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  14. System design of a large fuel cell hybrid locomotive

    Science.gov (United States)

    Miller, A. R.; Hess, K. S.; Barnes, D. L.; Erickson, T. L.

    Fuel cell power for locomotives combines the environmental benefits of a catenary-electric locomotive with the higher overall energy efficiency and lower infrastructure costs of a diesel-electric. A North American consortium, a public-private partnership, is developing a prototype hydrogen-fueled fuel cell-battery hybrid switcher locomotive for urban and military-base rail applications. Switcher locomotives are used in rail yards for assembling and disassembling trains and moving trains from one point to another. At 127 tonnes (280,000 lb), with continuous power of 250 kW from its proton exchange membrane (PEM) fuel cell prime mover and transient power well in excess of 1 MW, the hybrid locomotive will be the heaviest and most powerful fuel cell land vehicle yet. This fast-paced project calls for completion of the vehicle itself near the end of 2007. Several technical challenges not found in the development of smaller vehicles arise when designing and developing such a large fuel cell vehicle. Weight, center of gravity, packaging, and safety were design factors leading to, among other features, the roof location of the lightweight 350 bar compressed hydrogen storage system. Harsh operating conditions, especially shock loads during coupling to railcars, require component mounting systems capable of absorbing high energy. Vehicle scale-up by increasing mass, density, or power presents new challenges primarily related to issues of system layout, hydrogen storage, heat transfer, and shock loads.

  15. Optical technologies for data communication in large parallel systems

    International Nuclear Information System (INIS)

    Ritter, M B; Vlasov, Y; Kash, J A; Benner, A

    2011-01-01

    Large, parallel systems have greatly aided scientific computation and data collection, but performance scaling now relies on chip and system-level parallelism. This has happened because power density limits have caused processor frequency growth to stagnate, driving the new multi-core architecture paradigm, which would seem to provide generations of performance increases as transistors scale. However, this paradigm will be constrained by electrical I/O bandwidth limits; first off the processor card, then off the processor module itself. We will present best-estimates of these limits, then show how optical technologies can help provide more bandwidth to allow continued system scaling. We will describe the current status of optical transceiver technology which is already being used to exceed off-board electrical bandwidth limits, then present work on silicon nanophotonic transceivers and 3D integration technologies which, taken together, promise to allow further increases in off-module and off-card bandwidth. Finally, we will show estimated limits of nanophotonic links and discuss breakthroughs that are needed for further progress, and will speculate on whether we will reach Exascale-class machine performance at affordable powers.

  16. Characteristics of large thermal energy storage systems in Poland

    Science.gov (United States)

    Zwierzchowski, Ryszard

    2017-11-01

    In District Heating Systems (DHS) there are significant fluctuations in demand for heat by consumers during both the heating and the summer seasons. These variations are considered primarily in the 24-hour time horizon. These problems are aggravated further if the DHS is supplied by a CHP plant, because fluctuations in heat demand adversely affect to a significant degree the stable production of electricity at high overall efficiency. Therefore, introducing Thermal Energy Storage (TES) would be highly recommended on these grounds alone. The characteristics of Large (i.e. over 10 000 m3) TES in operation in Poland are presented. Information is given regarding new projects (currently in design or construction) that apply TES technology in DHS in Poland. The paper looks at the methodology used in Poland to select the TES system for a particular DHS, i.e., procedure for calculating capacity of the TES tank and the system to prevent water stored in the tank from absorbing oxygen from atmospheric air. Implementation of TES in DHS is treated as a recommended technology in the Polish District Heating sector. This technology offers great opportunities to improve the operating conditions of DHS, cutting energy production costs and emissions of pollutants to the atmosphere.
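
The tank-sizing procedure mentioned above reduces, at first order, to a sensible-heat balance, Q = rho * V * cp * dT. A minimal sketch with illustrative numbers (the 40 K temperature spread is our assumption; the 10 000 m3 volume is simply the "large TES" threshold quoted above):

```python
# First-order sensible-heat capacity of a water TES tank: Q = rho*V*cp*dT.
# Tank volume and temperature spread are illustrative assumptions.
rho = 1000.0       # water density, kg/m^3
cp = 4.19e3        # specific heat of water, J/(kg*K)
volume = 10_000.0  # m^3, the "large TES" threshold quoted above
delta_t = 40.0     # K, assumed spread between hot and cold layers

q_joules = rho * volume * cp * delta_t
q_mwh = q_joules / 3.6e9      # 1 MWh = 3.6e9 J
print(round(q_mwh))           # -> 466 (MWh of stored heat)
```

Real sizing must additionally account for thermocline losses, heat losses through the tank walls, and the daily load profile, but this estimate shows why tanks of this scale can absorb a full diurnal swing in heat demand.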

  17. Optimization of large-scale industrial systems : an emerging method

    Energy Technology Data Exchange (ETDEWEB)

    Hammache, A.; Aube, F.; Benali, M.; Cantave, R. [Natural Resources Canada, Varennes, PQ (Canada). CANMET Energy Technology Centre

    2006-07-01

    This paper reviewed optimization methods of large-scale industrial production systems and presented a novel systematic multi-objective and multi-scale optimization methodology. The methodology was based on a combined local optimality search with global optimality determination, and advanced system decomposition and constraint handling. The proposed method focused on the simultaneous optimization of the energy, economy and ecology aspects of industrial systems (E{sup 3}-ISO). The aim of the methodology was to provide guidelines for decision-making strategies. The approach was based on evolutionary algorithms (EA) with specifications including hybridization of global optimality determination with a local optimality search; a self-adaptive algorithm to account for the dynamic changes of operating parameters and design variables occurring during the optimization process; interactive optimization; advanced constraint handling and decomposition strategy; and object-oriented programming and parallelization techniques. Flowcharts of the working principles of the basic EA were presented. It was concluded that the EA uses a novel decomposition and constraint handling technique to enhance the Pareto solution search procedure for multi-objective problems. 6 refs., 9 figs.
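
The Pareto solution search underlying such multi-objective EAs rests on a dominance test: one objective vector dominates another if it is no worse in every objective and strictly better in at least one. A minimal sketch, assuming minimization, with toy energy/economy/ecology-style vectors (function names and numbers are ours, not from the paper):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated points."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Three toy (energy, economy, ecology) objective vectors:
candidates = [(1.0, 5.0, 3.0), (2.0, 4.0, 2.0), (2.0, 6.0, 4.0)]
print(pareto_front(candidates))   # the third point is dominated by the second
```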

  18. Iterative solution of large sparse systems of equations

    CERN Document Server

    Hackbusch, Wolfgang

    2016-01-01

    In the second edition of this classic monograph, complete with four new chapters and updated references, readers will now have access to content describing and analysing classical and modern methods with emphasis on the algebraic structure of linear iteration, which is usually ignored in other literature. The necessary amount of work increases dramatically with the size of systems, so one has to search for algorithms that most efficiently and accurately solve systems of, e.g., several million equations. The choice of algorithms depends on the special properties the matrices in practice have. An important class of large systems arises from the discretization of partial differential equations. In this case, the matrices are sparse (i.e., they contain mostly zeroes) and well-suited to iterative algorithms. The first edition of this book grew out of a series of lectures given by the author at the Christian-Albrecht University of Kiel to students of mathematics. The second edition includes quite novel approaches.
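
A minimal illustration of the linear iterations the book analyses: Jacobi's method on a small, diagonally dominant system of the kind produced by discretizing a 1-D partial differential equation (pure-Python sketch; production solvers exploit sparsity far more aggressively and use faster methods such as conjugate gradients or multigrid):

```python
def jacobi(a, b, iterations=100):
    """Solve A x = b by Jacobi iteration; A should be diagonally dominant."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        x = [(b[i] - sum(a[i][j] * x[j] for j in range(n) if j != i)) / a[i][i]
             for i in range(n)]
    return x

# Small diagonally dominant system mimicking a 1-D discretized PDE stencil.
a = [[ 4.0, -1.0,  0.0],
     [-1.0,  4.0, -1.0],
     [ 0.0, -1.0,  4.0]]
b = [2.0, 4.0, 10.0]
x = jacobi(a, b)
print([round(v, 6) for v in x])   # -> [1.0, 2.0, 3.0]
```

Each sweep touches only the nonzero entries of each row, which is why sparse matrices from PDE discretizations suit iterative methods so well.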

  19. A measurement system for large, complex software programs

    Science.gov (United States)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.

  20. Large resistivity modulation in mixed-phase metallic systems.

    Science.gov (United States)

    Lee, Yeonbae; Liu, Z Q; Heron, J T; Clarkson, J D; Hong, J; Ko, C; Biegalski, M D; Aschauer, U; Hsu, S L; Nowakowski, M E; Wu, J; Christen, H M; Salahuddin, S; Bokor, J B; Spaldin, N A; Schlom, D G; Ramesh, R

    2015-01-07

    In numerous systems, giant physical responses have been discovered when two phases coexist, for example near a phase transition. The intermetallic FeRh system undergoes a first-order antiferromagnetic-to-ferromagnetic transition above room temperature and shows two-phase coexistence near the transition. Here we have investigated the effect of an electric field applied to FeRh/PMN-PT heterostructures and report an 8% change in the electrical resistivity of the FeRh films. Such a 'giant' electroresistance (GER) response is striking in metallic systems, in which external electric fields are screened and thus only weakly influence the carrier concentrations and mobilities. We show that our FeRh films comprise coexisting ferromagnetic and antiferromagnetic phases with different resistivities, and that the origin of the GER effect is the strain-mediated change in their relative proportions. The observed behaviour is reminiscent of colossal magnetoresistance in perovskite manganites and illustrates the role of mixed-phase coexistence in achieving large changes in physical properties with low-energy external perturbation.

  1. Optical technologies for data communication in large parallel systems

    Energy Technology Data Exchange (ETDEWEB)

    Ritter, M B; Vlasov, Y; Kash, J A [IBM T.J. Watson Research Center, Yorktown Heights, NY (United States); Benner, A, E-mail: mritter@us.ibm.com [IBM Poughkeepsie, Poughkeepsie, NY (United States)

    2011-01-15

    Large, parallel systems have greatly aided scientific computation and data collection, but performance scaling now relies on chip and system-level parallelism. This has happened because power density limits have caused processor frequency growth to stagnate, driving the new multi-core architecture paradigm, which would seem to provide generations of performance increases as transistors scale. However, this paradigm will be constrained by electrical I/O bandwidth limits; first off the processor card, then off the processor module itself. We will present best-estimates of these limits, then show how optical technologies can help provide more bandwidth to allow continued system scaling. We will describe the current status of optical transceiver technology which is already being used to exceed off-board electrical bandwidth limits, then present work on silicon nanophotonic transceivers and 3D integration technologies which, taken together, promise to allow further increases in off-module and off-card bandwidth. Finally, we will show estimated limits of nanophotonic links and discuss breakthroughs that are needed for further progress, and will speculate on whether we will reach Exascale-class machine performance at affordable powers.

  2. Investigation of propulsion system for large LNG ships

    Science.gov (United States)

    Sinha, R. P.; Nik, Wan Mohd Norsani Wan

    2012-09-01

    Requirements to move away from coal for power generation have made LNG the most sought-after fuel source, raising steep demands on its supply and production. Added to this scenario is the gradual depletion of offshore oil and gas fields, which is pushing future exploration and production activities far away into the hostile environment of the deep sea. Production of gas in such an environment has great technical and commercial impacts on the gas business. For instance, laying gas pipes from the deep sea to distant receiving terminals will be technically and economically challenging. The alternative to laying gas pipes is to install re-liquefaction units on board FPSOs to convert the gas into liquid for transportation by sea. But then, because of the increased distance between gas sources and receiving terminals, the current medium-size LNG ships will no longer remain economical to operate. Recognizing this business scenario, shipowners are making huge investments in the acquisition of large LNG ships. As the power needs of large LNG ships are very different from those of the current smaller ones, a variety of propulsion derivatives such as UST, DFDE, 2-Stroke DRL and Combined cycle GT have been proposed by leading engine manufacturers. Since the propulsion system constitutes a major element of the ship's capital and life-cycle cost, which of these options is best suited for large LNG ships is currently a major concern of the shipping industry and must be thoroughly assessed. In this paper the authors investigate the relative merits of these propulsion options against the benchmark performance criteria of BOG disposal, fuel consumption, gas emissions, plant availability and overall life-cycle cost.

  3. Large-Scale Traveling Weather Systems in Mars’ Southern Extratropics

    Science.gov (United States)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2017-10-01

    Between late fall and early spring, Mars’ middle- and high-latitude atmosphere supports strong mean equator-to-pole temperature contrasts and an accompanying mean westerly polar vortex. Observations from both the MGS Thermal Emission Spectrometer (TES) and the MRO Mars Climate Sounder (MCS) indicate that a mean baroclinicity-barotropicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). Such extratropical weather disturbances are critical components of the global circulation as they serve as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of such traveling extratropical synoptic disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively-lifted and radiatively-active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to the northern-hemisphere counterparts, the southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are investigated, in addition to large-scale up-slope/down-slope flows and the diurnal cycle. A southern storm zone develops in late winter and early spring in the western hemisphere via orographic influences from the Tharsis highlands and the Argyre and Hellas impact basins. Geographically localized transient-wave activity diagnostics are constructed that illuminate dynamical differences among the simulations, and these are presented.

  4. Investigation of propulsion system for large LNG ships

    International Nuclear Information System (INIS)

    Sinha, R P; Wan Nik, Wan Mohd Norsani

    2012-01-01

    Requirements to move away from coal for power generation have made LNG the most sought-after fuel source, raising steep demands on its supply and production. Added to this scenario is the gradual depletion of offshore oil and gas fields, which is pushing future exploration and production activities far away into the hostile environment of the deep sea. Production of gas in such an environment has great technical and commercial impacts on the gas business. For instance, laying gas pipes from the deep sea to distant receiving terminals will be technically and economically challenging. The alternative to laying gas pipes is to install re-liquefaction units on board FPSOs to convert the gas into liquid for transportation by sea. But then, because of the increased distance between gas sources and receiving terminals, the current medium-size LNG ships will no longer remain economical to operate. Recognizing this business scenario, shipowners are making huge investments in the acquisition of large LNG ships. As the power needs of large LNG ships are very different from those of the current smaller ones, a variety of propulsion derivatives such as UST, DFDE, 2-Stroke DRL and Combined cycle GT have been proposed by leading engine manufacturers. Since the propulsion system constitutes a major element of the ship's capital and life-cycle cost, which of these options is best suited for large LNG ships is currently a major concern of the shipping industry and must be thoroughly assessed. In this paper the authors investigate the relative merits of these propulsion options against the benchmark performance criteria of BOG disposal, fuel consumption, gas emissions, plant availability and overall life-cycle cost.

  5. Large-Scale Traveling Weather Systems in Mars Southern Extratropics

    Science.gov (United States)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2017-01-01

    Between late fall and early spring, Mars' middle- and high-latitude atmosphere supports strong mean equator-to-pole temperature contrasts and an accompanying mean westerly polar vortex. Observations from both the MGS Thermal Emission Spectrometer (TES) and the MRO Mars Climate Sounder (MCS) indicate that a mean baroclinicity-barotropicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). Such extratropical weather disturbances are critical components of the global circulation as they serve as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of such traveling extratropical synoptic disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively-lifted and radiatively-active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to the northern-hemisphere counterparts, the southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are investigated, in addition to large-scale up-slope/down-slope flows and the diurnal cycle. A southern storm zone develops in late winter and early spring in the western hemisphere via orographic influences from the Tharsis highlands and the Argyre and Hellas impact basins. Geographically localized transient-wave activity diagnostics are constructed that illuminate dynamical differences among the simulations, and these are presented.

  6. Large strip RPCs for the LEPS2 TOF system

    Energy Technology Data Exchange (ETDEWEB)

    Tomida, N., E-mail: natsuki@scphys.kyoto-u.ac.jp [Department of Physics, Kyoto University, Kyoto 606-8502 (Japan); Niiyama, M. [Department of Physics, Kyoto University, Kyoto 606-8502 (Japan); Ohnishi, H. [RIKEN (The Institute of Physical and Chemical Research), Wako, Saitama 351-0198 (Japan); Tran, N. [Research Center for Nuclear Physics (RCNP), Osaka University, Ibaraki, Osaka 567-0047 (Japan); Hsieh, C.-Y.; Chu, M.-L.; Chang, W.-C. [Institute of Physics, Academia Sinica, Nankang, Taipei 11529, Taiwan (China); Chen, J.-Y. [National Synchrotron Radiation Research Center (NSRRC), Hsinchu 30076, Taiwan (China)

    2014-12-01

    High time-resolution resistive plate chambers (RPCs) with large-size readout strips are developed for the time-of-flight (TOF) detector system of the LEPS2 experiment at SPring-8. The experimental requirement is a 50-ps time resolution for a strip size larger than 100 cm{sup 2}/channel. We are able to achieve 50-ps time resolutions with 2.5×100 cm{sup 2} strips by directly connecting the amplifiers to strips. With the same time resolution, the number of front-end electronics (FEE) is also reduced by signal addition. - Highlights: • Find a way to achieve a good time resolution with a large strip RPC. • 2.5 cm narrow strips have better resolutions than 5.0 cm ones. • The 0.5 mm narrow strip interval shows flat time resolutions between strips. • FEEs directly connected to strips make the signal reflection at the strip edge small. • A time resolution of 50 ps was achieved with 2.5 cm×100 cm strips.

  7. Aberrations and focusability in large solid-state-laser systems

    International Nuclear Information System (INIS)

    Simmons, W.W.

    1981-01-01

    Solid state lasers for fusion experiments must reliably deliver maximum power to small (approximately 0.5 mm) targets from stand-off focal distances of 1 m or more. This requirement places stringent limits upon the optical quality of the several major components - amplifiers, Faraday isolators, spatial filters - in each amplifier train. Residual static aberrations in optical components are transferred to the beam as it traverses the optical amplifier chain. Although individual component aberrations are typically less than lambda/20 for clear apertures under 10 cm, and less than lambda/10 for clear apertures under 20 cm, the large number of such components in optical series results in a wavefront error that may exceed one wave for modern solid state lasers. In pulsed operation, the focal spot is additionally broadened by intensity-dependent nonlinearities. Specific examples of the performance of large-aperture components will be presented within the context of the Argus and Shiva laser systems, which are presently operational at Lawrence Livermore National Laboratory. Design requirements upon the larger-aperture Nova laser components, up to 74 cm in clear aperture, will also be discussed; these pose a significant challenge to the optical industry.
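
The accumulation described above follows from the usual assumption that statistically independent component aberrations add in quadrature: N components of lambda/20 rms each contribute a chain error of sqrt(N)*lambda/20, so about 20 such components already approach lambda/4.5. A sketch (the component counts below are illustrative assumptions, not the Argus or Shiva inventories):

```python
import math

def chain_wavefront_error(component_errors_waves):
    """RSS of statistically independent component wavefront errors (in waves)."""
    return math.sqrt(sum(e * e for e in component_errors_waves))

# Assumed chain: 20 small-aperture components at lambda/20 rms each,
# plus 5 large-aperture components at lambda/10 rms each.
errors = [1 / 20] * 20 + [1 / 10] * 5
total = chain_wavefront_error(errors)
print(round(total, 3))   # -> 0.316 (waves, rms)
```

Under this uncorrelated-error assumption even modest per-component tolerances compound to a substantial fraction of a wave, which is why chain-level wavefront budgets, not just per-component specifications, drive the optical quality requirements.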

  8. Large-scale visualization system for grid environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of Japan Atomic Energy Agency (CCSE/JAEA) has been conducting R and Ds of distributed computing (grid computing) environments: Seamless Thinking Aid (STA), Information Technology Based Laboratory (ITBL) and Atomic Energy Grid InfraStructure (AEGIS). In these R and Ds, we have developed the visualization technology suitable for the distributed computing environment. As one of the visualization tools, we have developed the Parallel Support Toolkit (PST) which can execute the visualization process parallely on a computer. Now, we improve PST to be executable simultaneously on multiple heterogeneous computers using Seamless Thinking Aid Message Passing Interface (STAMPI). STAMPI, we have developed in these R and Ds, is the MPI library executable on a heterogeneous computing environment. The improvement realizes the visualization of extremely large-scale data and enables more efficient visualization processes in a distributed computing environment. (author)

  9. Risk-benefit evaluation for large technological systems

    International Nuclear Information System (INIS)

    Okrent, D.

    1979-01-01

    The related topics of risk-benefit analysis, risk analysis, and risk-acceptance criteria (How safe is safe enough) are of growing importance. An interdisciplinary study on various aspects of these topics, including applications to nuclear power, was recently completed at the University of California, Los Angeles (UCLA), with the support of the National Science Foundation. In addition to more than 30 topical reports and various open-literature publications, a final report (UCLA-ENG-7777) to the study, titled ''A Generalized Evaluation Approach to Risk--Benefit for Large Technological Systems and Its Application to Nuclear Power'', was issued in early 1978. This article briefly summarizes portions of the final report dealing with general aspects of risk-benefit methodology, societal knowledge and perception of risk, and risk-acceptance criteria

  10. Program system RALLY - for probabilistic safety analysis of large technical systems

    International Nuclear Information System (INIS)

    Gueldner, W.; Polke, H.; Spindler, H.; Zipf, G.

    1982-03-01

    This report describes the program system RALLY to compute the reliability of large and intermeshed technical systems. In addition to a short explanation of the different programs, the possible applications of the program system RALLY are demonstrated. Finally, the most important studies carried out so far on RALLY are discussed. (orig.) [de

  11. Pool fires in a large scale ventilation system

    International Nuclear Information System (INIS)

    Smith, P.R.; Leslie, I.H.; Gregory, W.S.; White, B.

    1991-01-01

    A series of pool fire experiments was carried out in the Large Scale Flow Facility of the Mechanical Engineering Department at New Mexico State University. The various experiments burned alcohol, hydraulic cutting oil, kerosene, and a mixture of kerosene and tributylphosphate. Gas temperature and wall temperature measurements as a function of time were made throughout the 23.3 m3 burn compartment and the ducts of the ventilation system. The mass of the smoke particulate deposited upon the ventilation system's 0.61 m x 0.61 m high efficiency particulate air filter for the hydraulic oil, kerosene, and kerosene-tributylphosphate mixture fires was measured using an in situ null balance. Significant increases in filter resistance were observed for all three fuels for burning time periods ranging from 10 to 30 minutes. This was found to be highly dependent upon initial ventilation system flow rate, fuel type, and flow configuration. The experimental results were compared to simulated results predicted by the Los Alamos National Laboratory FIRAC computer code. In general, the experimental and the computer results were in reasonable agreement, despite the fact that the fire compartment for the experiments was an insulated steel tank with 0.32 cm walls, while the compartment model FIRIN of FIRAC assumes 0.31 m thick concrete walls. This difference in configuration apparently caused FIRAC to consistently underpredict the measured temperatures in the fire compartment. The predicted deposition of soot proved to be insensitive to ventilation system flow rate, but the measured values showed flow rate dependence. However, predicted soot deposition was of the same order of magnitude as measured soot deposition

  12. Versatile single-molecule multi-color excitation and detection fluorescence setup for studying biomolecular dynamics

    KAUST Repository

    Sobhy, M. A.; Elshenawy, M. M.; Takahashi, Masateru; Whitman, B. H.; Walter, N. G.; Hamdan, S. M.

    2011-01-01

    Single-molecule fluorescence imaging is at the forefront of tools applied to study biomolecular dynamics both in vitro and in vivo. The ability of the single-molecule fluorescence microscope to conduct simultaneous multi-color excitation

  13. Medical isotope identification with large mobile detection systems

    Science.gov (United States)

    Mukhopadhyay, Sanjoy; Maurer, Richard

    2012-10-01

    The Remote Sensing Laboratory (RSL) of National Security Technologies Inc. has built an array of large (5.08 cm x 10.16 cm x 40.6 cm) thallium-doped sodium iodide (NaI:Tl) scintillators to locate and screen gamma-ray emitting radioisotopes that are of interest to radiological emergency responders [1]. These vehicle-mounted detectors provide the operators with rapid, simple, specific information for radiological threat assessment. Applications include large area inspection, customs inspection, border protection, emergency response, and monitoring of radiological facilities. These RSL mobile units are currently being upgraded to meet the Defense Threat Reduction Agency mission requirements for a next-generation system capable of detecting and identifying nuclear threat materials. One of the challenging problems faced by these gamma-ray detectors is the unambiguous identification of medical isotopes like 131I (364.49 keV [81.7%], 636.99 keV [7.17%]), 99mTc (140.51 keV [89.1%]), and 67Ga (184.6 keV [19.7%], 300.2 keV [16.0%], 393.5 keV [4.5%]) that are used in radionuclide therapy and often have overlapping gamma-ray energy regions of interest (ROI). The problem is made worse by the short (about 5 seconds) acquisition time of the spectral data necessary for dynamic mobile detectors. This article describes attempts to identify medical isotopes from data collected from this mobile detection system in a short period of time (not exceeding 5 seconds) and at a large standoff distance (typically 10 meters). The mobile units offer identification capabilities that are based on hardware auto-stabilization of the amplifier gain, in which the 1461 keV gamma-energy line from 40K is tracked. The system uses gamma-ray energy windowing along with the embedded mobile Gamma Detector Response and Analysis Software (GADRAS) [2] to deconvolve any overlapping gamma-energy ROIs. These high sensitivity detectors are capable of resolving complex masking scenarios and exceed all ANSI N42.34 (2006) requirements
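The overlapping-ROI problem described in this abstract can be sketched in a few lines. This is an illustrative reconstruction, not RSL's algorithm: the gamma lines come from the abstract, while the 7% FWHM NaI(Tl) resolution and the ±1.5 FWHM window width are assumed example values.

```python
# Hypothetical sketch of the energy-windowing issue: build regions of interest
# (ROIs) around each gamma line and flag isotope pairs whose ROIs overlap.
# Gamma energies are from the abstract; the resolution model is an assumption.

lines_kev = {
    "I-131": [364.49, 636.99],
    "Tc-99m": [140.51],
    "Ga-67": [184.6, 300.2, 393.5],
}

def roi(energy_kev, fwhm_frac=0.07):
    """Return an ROI (lo, hi) of +/- 1.5 FWHM around a gamma line."""
    half = 1.5 * fwhm_frac * energy_kev
    return (energy_kev - half, energy_kev + half)

def overlapping_pairs(lines):
    """Find isotope pairs with at least one pair of overlapping ROIs."""
    pairs = set()
    names = sorted(lines)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            for ea in lines[a]:
                for eb in lines[b]:
                    lo_a, hi_a = roi(ea)
                    lo_b, hi_b = roi(eb)
                    if lo_a <= hi_b and lo_b <= hi_a:
                        pairs.add((a, b))
    return pairs

print(overlapping_pairs(lines_kev))
```

With these assumed widths, the 393.5 keV line of 67Ga collides with the 364.49 keV line of 131I, which is exactly the kind of ambiguity the deconvolution step must resolve.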

  14. REQUIREMENTS FOR SYSTEMS DEVELOPMENT LIFE CYCLE MODELS FOR LARGE-SCALE DEFENSE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kadir Alpaslan DEMIR

    2015-10-01

    Full Text Available Large-scale defense system projects are strategic for maintaining and increasing the national defense capability. Therefore, governments spend billions of dollars in the acquisition and development of large-scale defense systems. The scale of defense systems is always increasing and the costs to build them are skyrocketing. Today, defense systems are software intensive and they are either a system of systems or a part of one. Historically, the project performances observed in the development of these systems have been significantly poor when compared to other types of projects. It is obvious that the currently used systems development life cycle models are insufficient to address today’s challenges of building these systems. Using a systems development life cycle model that is specifically designed for large-scale defense system developments and is effective in dealing with today’s and near-future challenges will help to improve project performances. The first step in the development of a large-scale defense systems development life cycle model is the identification of requirements for such a model. This paper contributes to the body of literature in the field by providing a set of requirements for systems development life cycle models for large-scale defense systems. Furthermore, a research agenda is proposed.

  15. RF system considerations for large high-duty-factor linacs

    International Nuclear Information System (INIS)

    Lynch, M.T.; Ziomek, C.D.; Tallerico, P.J.; Regan, A.H.; Eaton, L.; Lawrence, G.

    1994-01-01

    RF systems are often a major cost item for linacs, but this is especially true for large high-duty-factor linacs (up to and including CW) such as the Accelerator for Production of Tritium (APT) or the Accelerator for Transmutation of nuclear Waste (ATW). In addition, the high energy and high average beam current of these machines (approximately 1 GeV, 100--200 mA) lead to a need for excellent control of the accelerating fields in order to minimize the possibility of beam loss in the accelerator and the resulting activation. This paper will address the key considerations and limitations in the design of the RF system. These considerations impact the design of both the high power RF components and the RF controls. As might be expected, the two concerns sometimes lead to conflicting design requirements. For example, minimum RF operating costs lead to a desire for operation near saturation of the high power RF generators in order to maximize the operating efficiency. Optimal control of the RF fields leads to a desire for maximum overdrive capability in those same generators in order to respond quickly to disturbances of the accelerator fields

  16. Modeling Resource Utilization of a Large Data Acquisition System

    CERN Document Server

    AUTHOR|(SzGeCERN)756497; The ATLAS collaboration; Garcia Garcia, Pedro Javier; Vandelli, Wainer; Froening, Holger

    2017-01-01

    The ATLAS 'Phase-II' upgrade, scheduled to start in 2024, will significantly change the requirements under which the data-acquisition system operates. The input data rate, currently fixed around 150 GB/s, is anticipated to reach 5 TB/s. In order to deal with the challenging conditions, and exploit the capabilities of newer technologies, a number of architectural changes are under consideration. Of particular interest is a new component, known as the Storage Handler, which will provide a large buffer area decoupling real-time data taking from event filtering. Dynamic operational models of the upgraded system can be used to identify the required resources and to select optimal techniques. In order to achieve a robust and dependable model, the current data-acquisition architecture has been used as a test case. This makes it possible to verify and calibrate the model against real operation data. Such a model can then be evolved toward the future ATLAS Phase-II architecture. In this paper we introduce the current ...
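The Storage Handler's role of decoupling real-time data taking from event filtering can be illustrated with a back-of-envelope buffer-sizing calculation. The 5 TB/s input rate is from the abstract; the assumed filter drain rate and outage window are invented example figures, not parameters of the ATLAS design.

```python
# Illustrative sketch (not from the paper): size the buffer needed to absorb
# the backlog that accumulates while event filtering lags behind data taking.

def buffer_size_tb(input_rate_tb_s, filter_rate_tb_s, outage_s):
    """Data accumulated (TB) while filtering drains slower than the input."""
    backlog_rate = max(input_rate_tb_s - filter_rate_tb_s, 0.0)
    return backlog_rate * outage_s

# Phase-II input of 5 TB/s; suppose filtering drains 4 TB/s and the buffer
# must ride through a 600 s (10 min) slowdown, or a full stall of the farm.
print(buffer_size_tb(5.0, 4.0, 600))   # partial slowdown
print(buffer_size_tb(5.0, 0.0, 600))   # full stall
```

Even this crude model shows why a large decoupling buffer dominates the resource estimate: a ten-minute full stall at the Phase-II rate already amounts to petabytes of storage.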

  17. Modelling Resource Utilization of a Large Data Acquisition System

    CERN Document Server

    Santos, Alejandro; The ATLAS collaboration

    2017-01-01

    The ATLAS 'Phase-II' upgrade, scheduled to start in 2024, will significantly change the requirements under which the data-acquisition system operates. The input data rate, currently fixed around 150 GB/s, is anticipated to reach 5 TB/s. In order to deal with the challenging conditions, and exploit the capabilities of newer technologies, a number of architectural changes are under consideration. Of particular interest is a new component, known as the Storage Handler, which will provide a large buffer area decoupling real-time data taking from event filtering. Dynamic operational models of the upgraded system can be used to identify the required resources and to select optimal techniques. In order to achieve a robust and dependable model, the current data-acquisition architecture has been used as a test case. This makes it possible to verify and calibrate the model against real operation data. Such a model can then be evolved toward the future ATLAS Phase-II architecture. In this paper we introduce the current ...

  18. ACE: A distributed system to manage large data archives

    Science.gov (United States)

    Daily, Mike I.; Allen, Frank W.

    1993-01-01

    Competitive pressures in the oil and gas industry are requiring a much tighter integration of technical data into E and P business processes. The development of new systems to accommodate this business need must comprehend the significant numbers of large, complex data objects which the industry generates. The life cycle of the data objects is a four-phase progression from data acquisition, to data processing, through data interpretation, and ending finally with data archival. In order to implement a cost-effective system which provides an efficient conversion from data to information and allows effective use of this information, an organization must consider the technical data management requirements in all four phases. A set of technical issues which may differ in each phase must be addressed to insure an overall successful development strategy. The technical issues include standardized data formats and media for data acquisition, data management during processing, plus networks, applications software, and GUIs for interpretation of the processed data. Mass storage hardware and software are required to provide cost-effective storage and retrieval during the latter three stages as well as long-term archival. Mobil Oil Corporation's Exploration and Producing Technical Center (MEPTEC) has addressed the technical and cost issues of designing, building, and implementing an Advanced Computing Environment (ACE) to support the petroleum E and P function, which is critical to the corporation's continued success. Mobil views ACE as a cost-effective solution which can give Mobil a competitive edge as well as a viable technical solution.

  19. Blended particle filters for large-dimensional chaotic dynamical systems

    Science.gov (United States)

    Majda, Andrew J.; Qi, Di; Sapsis, Themistoklis P.

    2014-01-01

    A major challenge in contemporary data science is the development of statistically accurate particle filters to capture non-Gaussian features in large-dimensional chaotic dynamical systems. Blended particle filters that capture non-Gaussian features in an adaptively evolving low-dimensional subspace through particles interacting with evolving Gaussian statistics on the remaining portion of phase space are introduced here. These blended particle filters are constructed in this paper through a mathematical formalism involving conditional Gaussian mixtures combined with statistically nonlinear forecast models compatible with this structure developed recently with high skill for uncertainty quantification. Stringent test cases for filtering involving the 40-dimensional Lorenz 96 model with a 5-dimensional adaptive subspace for nonlinear blended filtering in various turbulent regimes with at least nine positive Lyapunov exponents are used here. These cases demonstrate the high skill of the blended particle filter algorithms in capturing both highly non-Gaussian dynamical features as well as crucial nonlinear statistics for accurate filtering in extreme filtering regimes with sparse infrequent high-quality observations. The formalism developed here is also useful for multiscale filtering of turbulent systems and a simple application is sketched below. PMID:24825886
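The stringent test case named in this abstract, the 40-dimensional Lorenz 96 model, is compact enough to sketch directly. This is a generic illustration of the test model, not the blended filter itself; the forcing F = 8 is the standard chaotic regime, assumed here for concreteness.

```python
# Minimal sketch of the 40-dimensional Lorenz 96 model,
#   dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F,
# with cyclic indices, integrated by classical fourth-order Runge-Kutta.

def l96_rhs(x, F=8.0):
    n = len(x)
    return [(x[(i + 1) % n] - x[i - 2]) * x[i - 1] - x[i] + F for i in range(n)]

def rk4_step(x, dt, F=8.0):
    k1 = l96_rhs(x, F)
    k2 = l96_rhs([xi + 0.5 * dt * ki for xi, ki in zip(x, k1)], F)
    k3 = l96_rhs([xi + 0.5 * dt * ki for xi, ki in zip(x, k2)], F)
    k4 = l96_rhs([xi + dt * ki for xi, ki in zip(x, k3)], F)
    return [xi + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

# Start near the unstable fixed point x_i = F and let the chaos develop.
x = [8.0] * 40
x[0] += 0.01                      # small perturbation seeds the instability
for _ in range(500):              # 500 steps of dt = 0.01 (5 model time units)
    x = rk4_step(x, 0.01)
print(len(x))
```

A filter tested on this model must track a state whose small perturbations grow exponentially, which is what makes the turbulent regimes with many positive Lyapunov exponents such a demanding benchmark.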

  20. Development of large scale wind energy conversion system; Ogata furyoku hatsuden system no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Takita, M [New Energy and Industrial Technology Development Organization, Tokyo (Japan)

    1994-12-01

    Described herein are the results of the FY1994 research program for development of large scale wind energy conversion system. The study on technological development of key components evaluates performance of, and confirms reliability and applicability of, hydraulic systems centered by those equipped with variable pitch mechanisms and electrohydraulic servo valves that control them. The study on blades conducts fatigue and crack-propagation tests, which show that the blades developed have high strength. The study on the speed-increasing gear conducts load tests, confirming the effects of reducing vibration and noise by modification of the gear teeth. The study on the nacelle cover conducts vibration tests to confirm its vibration characteristics, and analyzes three-dimensional vibration by the finite element method. Some components for a 500 kW commercial windmill are fabricated, including rotor heads, variable pitch mechanisms, speed-increasing gears, yaw systems, and hydraulic control systems. The others fabricated include a remote supervisory control system for maintenance, a system to integrate the windmill into a power system, and electrical control devices in which site conditions, such as atmospheric temperature and lightning, are taken into consideration.

  1. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  2. Solar System science with the Large Synoptic Survey Telescope

    Science.gov (United States)

    Jones, Lynne; Brown, Mike; Ivezić, Zeljko; Jurić, Mario; Malhotra, Renu; Trilling, David

    2015-11-01

    The Large Synoptic Survey Telescope (LSST; http://lsst.org) will be a large-aperture, wide-field, ground-based telescope that will survey half the sky every few nights in six optical bands from 320 to 1050 nm. It will explore a wide range of astrophysical questions, ranging from performing a census of the Solar System, to examining the nature of dark energy. It is currently in construction, slated for first light in 2019 and full operations by 2022.The LSST will survey over 20,000 square degrees with a rapid observational cadence, to typical limiting magnitudes of r~24.5 in each visit (9.6 square degree field of view). Automated software will link the individual detections into orbits; these orbits, as well as precisely calibrated astrometry (~50mas) and photometry (~0.01-0.02 mag) in multiple bandpasses will be available as LSST data products. The resulting data set will have tremendous potential for planetary astronomy; multi-color catalogs of hundreds of thousands of NEOs and Jupiter Trojans, millions of asteroids, tens of thousands of TNOs, as well as thousands of other objects such as comets and irregular satellites of the major planets.LSST catalogs will increase the sample size of objects with well-known orbits 10-100 times for small body populations throughout the Solar System, enabling a major increase in the completeness level of the inventory of most dynamical classes of small bodies and generating new insights into planetary formation and evolution. Precision multi-color photometry will allow determination of lightcurves and colors, as well as spin state and shape modeling through sparse lightcurve inversion. LSST is currently investigating survey strategies to optimize science return across a broad range of goals. To aid in this investigation, we are making a series of realistic simulated survey pointing histories available together with a Python software package to model and evaluate survey detections for a user-defined input population. 
Preliminary

  3. A Digital Motion Control System for Large Telescopes

    Science.gov (United States)

    Hunter, T. R.; Wilson, R. W.; Kimberk, R.; Leiker, P. S.

    2001-05-01

    We have designed and programmed a digital motion control system for large telescopes, in particular, the 6-meter antennas of the Submillimeter Array on Mauna Kea. The system consists of a single robust, high-reliability microcontroller board which implements a two-axis velocity servo while monitoring and responding to critical safety parameters. Excellent tracking performance has been achieved with this system (0.3 arcsecond RMS at sidereal rate). The 24x24 centimeter four-layer printed circuit board contains a multitude of hardware devices: 40 digital inputs (for limit switches and fault indicators), 32 digital outputs (to enable/disable motor amplifiers and brakes), a quad 22-bit ADC (to read the motor tachometers), four 16-bit DACs (that provide torque signals to the motor amplifiers), a 32-LED status panel, a serial port to the LynxOS PowerPC antenna computer (RS422/460kbps), a serial port to the Palm Vx handpaddle (RS232/115kbps), and serial links to the low-resolution absolute encoders on the azimuth and elevation axes. Each section of the board employs independent ground planes and power supplies, with optical isolation on all I/O channels. The processor is an Intel 80C196KC 16-bit microcontroller running at 20MHz on an 8-bit bus. This processor executes an interrupt-driven, scheduler-based software system written in C and assembled into an EPROM with user-accessible variables stored in NVSRAM. Under normal operation, velocity update requests arrive at 100Hz from the position-loop servo process running independently on the antenna computer. A variety of telescope safety checks are performed at 279Hz including routine servicing of a 6 millisecond watchdog timer. Additional ADCs onboard the microcontroller monitor the winding temperature and current in the brushless three-phase drive motors. The PID servo gains can be dynamically changed in software. 
Calibration factors and software filters can be applied to the tachometer readings prior to the application of
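The 100 Hz velocity-loop update described above can be caricatured in a few lines. This is a generic discrete PI velocity servo driving a toy first-order motor model, not the SMA firmware; the gains, drag coefficient, and commanded velocity are all assumed example values.

```python
# Illustrative discrete PI velocity servo updated at 100 Hz (dt = 0.01 s).
# Plant: unit inertia with viscous drag; all constants are invented examples.

def simulate(v_cmd=15.0, kp=0.8, ki=2.0, dt=0.01, steps=2000):
    v, integ = 0.0, 0.0
    for _ in range(steps):
        err = v_cmd - v
        integ += err * dt
        torque = kp * err + ki * integ    # PI control law
        v += dt * (torque - 0.1 * v)      # motor: inertia 1, drag 0.1
    return v

print(simulate())
```

The integral term is what lets the servo hold the commanded velocity exactly despite the constant drag torque, which is the property that matters for arcsecond-level sidereal tracking.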

  4. BioMagResBank (BMRB) as a partner in the Worldwide Protein Data Bank (wwPDB): new policies affecting biomolecular NMR depositions

    International Nuclear Information System (INIS)

    Markley, John L.; Ulrich, Eldon L.; Berman, Helen M.; Henrick, Kim; Nakamura, Haruki; Akutsu, Hideo

    2008-01-01

    We describe the role of the BioMagResBank (BMRB) within the Worldwide Protein Data Bank (wwPDB) and recent policies affecting the deposition of biomolecular NMR data. All PDB depositions of structures based on NMR data must now be accompanied by experimental restraints. A scheme has been devised that allows depositors to specify a representative structure and to define residues within that structure found experimentally to be largely unstructured. The BMRB now accepts coordinate sets representing three-dimensional structural models based on experimental NMR data of molecules of biological interest that fall outside the guidelines of the Protein Data Bank (i.e., the molecule is a peptide with 23 or fewer residues, a polynucleotide with 3 or fewer residues, a polysaccharide with 3 or fewer sugar residues, or a natural product), provided that the coordinates are accompanied by a representation of the covalent structure of the molecule (atom connectivity), assigned NMR chemical shifts, and the structural restraints used in generating the model. The BMRB now contains an archive of NMR data for metabolites and other small molecules found in biological systems

  5. Biomolecular Modeling in a Process Dynamics and Control Course

    Science.gov (United States)

    Gray, Jeffrey J.

    2006-01-01

    I present modifications to the traditional course entitled, "Process dynamics and control," which I renamed "Modeling, dynamics, and control of chemical and biological processes." Additions include the central dogma of biology, pharmacokinetic systems, population balances, control of gene transcription, and large-scale…

  6. Molecular-dynamics simulations of polymeric surfaces for biomolecular applications

    NARCIS (Netherlands)

    Muntean, S.A.

    2013-01-01

    In-vitro diagnostics plays a very important role in the present healthcare system. It consists of a large variety of medical devices designed to diagnose a medical condition by measuring a target molecule in a sample, such as blood or urine. In vitro is the Latin term for "in glass" and refers here to

  7. Large aperture components for solid state laser fusion systems

    International Nuclear Information System (INIS)

    Simmons, W.W.

    1978-01-01

    Solid state lasers for fusion experiments must reliably deliver maximum power to small (approximately .5 mm) targets from stand-off focal distances of 1 m or more. This requirement places stringent limits upon the optical quality, resistance to damage, and overall performance of the several major components--amplifiers, Faraday isolators, spatial filters--in each amplifier train. Component development centers about achieving (1) highest functional material figure of merit, (2) best optical quality, and (3) maximum resistance to optical damage. Specific examples of the performance of large aperture components will be presented within the context of the Argus and Shiva laser systems, which are presently operational at Lawrence Livermore Laboratory. Shiva comprises twenty amplifiers, each of 20 cm output clear aperture. Terawatt beams from these amplifiers are focused through two opposed, nested clusters of f/6 lenses onto such targets. Design requirements upon the larger aperture Nova laser components, up to 35 cm in clear aperture, will also be discussed; these pose a significant challenge to the optical industry

  8. Financing a large-scale picture archival and communication system.

    Science.gov (United States)

    Goldszal, Alberto F; Bleshman, Michael H; Bryan, R Nick

    2004-01-01

    An attempt to finance a large-scale multi-hospital picture archival and communication system (PACS) solely based on cost savings from current film operations is reported. A modified Request for Proposal described the technical requirements, PACS architecture, and performance targets. The Request for Proposal was complemented by a set of desired financial goals-the main one being the ability to use film savings to pay for the implementation and operation of the PACS. Financing of the enterprise-wide PACS was completed through an operating lease agreement including all PACS equipment, implementation, service, and support for an 8-year term, much like a complete outsourcing. Equipment refreshes, both hardware and software, are included. Our agreement also linked the management of the digital imaging operation (PACS) and the traditional film printing, shifting the operational risks of continued printing and costs related to implementation delays to the PACS vendor. An additional optimization step provided the elimination of the negative film budget variances in the beginning of the project when PACS costs tend to be higher than film and film-related expenses. An enterprise-wide PACS has been adopted to achieve clinical workflow improvements and cost savings. PACS financing was solely based on film savings, which included the entire digital solution (PACS) and any residual film printing. These goals were achieved with simultaneous elimination of any over-budget scenarios providing a non-negative cash flow in each year of an 8-year term.
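The financing constraint this abstract describes, that film savings must cover the lease in every year of the 8-year term with no negative budget variance, reduces to a simple per-year cash-flow check. All dollar figures below are invented for illustration; the article does not disclose its actual numbers.

```python
# Hypothetical illustration of the PACS financing goal: annual film savings
# must exceed the fixed lease payment in each year of the term.

def yearly_cash_flows(film_savings, lease_payment, years=8):
    """Cash flow per year = savings from eliminated film minus the PACS lease."""
    return [film_savings[y] - lease_payment for y in range(years)]

# Savings ramp up as film printing is phased out (made-up figures, $k/year).
savings = [900, 1100, 1200, 1250, 1300, 1300, 1300, 1300]
flows = yearly_cash_flows(savings, lease_payment=850)
print(flows)
print(all(f >= 0 for f in flows))   # the "non-negative cash flow each year" goal
```

Shifting the early-year shortfall risk to the vendor, as the article describes, is what makes the first, lowest-savings years clear this test.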

  9. Dose controlled low energy electron irradiator for biomolecular films.

    Science.gov (United States)

    Kumar, S V K; Tare, Satej T; Upalekar, Yogesh V; Tsering, Thupten

    2016-03-01

    We have developed a multi target, Low Energy Electron (LEE), precise dose controlled irradiator for biomolecular films. Up to seven samples can be irradiated one after another at any preset electron energy and dose under UHV conditions without venting the chamber. In addition, one more sample goes through all the steps except irradiation, which can be used as control for comparison with the irradiated samples. All the samples are protected against stray electron irradiation by biasing them at -20 V during the entire period, except during irradiation. Ethernet based communication electronics hardware, LEE beam control electronics and computer interface were developed in house. The user Graphical User Interface to control the irradiation and dose measurement was developed using National Instruments Lab Windows CVI. The working and reliability of the dose controlled irradiator has been fully tested over the electron energy range of 0.5 to 500 eV by studying LEE induced single strand breaks to ΦX174 RF1 dsDNA.

  10. Spin valve sensor for biomolecular identification: Design, fabrication, and characterization

    Science.gov (United States)

    Li, Guanxiong

    Biomolecular identification, e.g., DNA recognition, has broad applications in biology and medicine such as gene expression analysis, disease diagnosis, and DNA fingerprinting. Therefore, we have been developing a magnetic biodetection technology based on giant magnetoresistive spin valve sensors and magnetic nanoparticle labels. An analytical model has been developed for the magnetic nanoparticle detection, assuming the equivalent average field of magnetic nanoparticles and the coherent rotation of the spin valve free layer magnetization. Micromagnetic simulations have also been performed for the spin valve sensors. The analytical model and micromagnetic simulations are found consistent with each other and are in good agreement with experiments. The prototype spin valve sensors have been fabricated at both micron and submicron scales. We demonstrated the detection of a single 2.8-μm magnetic microbead by micron-sized spin valve sensors. Based on polymer-mediated self-assembly and fine lithography, a bilayer lift-off process was developed to deposit magnetic nanoparticles onto the sensor surface in a controlled manner. With the lift-off deposition method, we have successfully demonstrated the room temperature detection of monodisperse 16-nm Fe3O4 nanoparticles in a quantity from a few tens to several hundreds by submicron spin valve sensors, proving the feasibility of the nanoparticle detection. As desired for quantitative biodetection, a fairly linear dependence of sensor signal on the number of nanoparticles has been confirmed. The initial detection of DNA hybridization events labeled by magnetic nanoparticles further proved the magnetic biodetection concept.

  11. Dose controlled low energy electron irradiator for biomolecular films

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, S. V. K., E-mail: svkk@tifr.res.in; Tare, Satej T.; Upalekar, Yogesh V.; Tsering, Thupten [Tata Institute of Fundamental Research, Homi Bhabha Road, Colaba, Mumbai 400 005 (India)

    2016-03-15

    We have developed a multi target, Low Energy Electron (LEE), precise dose controlled irradiator for biomolecular films. Up to seven samples can be irradiated one after another at any preset electron energy and dose under UHV conditions without venting the chamber. In addition, one more sample goes through all the steps except irradiation, which can be used as control for comparison with the irradiated samples. All the samples are protected against stray electron irradiation by biasing them at −20 V during the entire period, except during irradiation. Ethernet based communication electronics hardware, LEE beam control electronics and computer interface were developed in house. The user Graphical User Interface to control the irradiation and dose measurement was developed using National Instruments Lab Windows CVI. The working and reliability of the dose controlled irradiator has been fully tested over the electron energy range of 0.5 to 500 eV by studying LEE induced single strand breaks to ΦX174 RF1 dsDNA.

  12. A biomolecular proportional integral controller based on feedback regulations of protein level and activity.

    Science.gov (United States)

    Mairet, Francis

    2018-02-01

    Homeostasis is the capacity of living organisms to keep internal conditions regulated at a constant level, despite environmental fluctuations. Integral feedback control is known to play a key role in this behaviour. Here, I show that a feedback system involving transcriptional and post-translational regulations of the same executor protein acts as a proportional integral (PI) controller, leading to enhanced transient performances in comparison with a classical integral loop. Such a biomolecular controller, which I call a level and activity-PI controller (LA-PI), is involved in the regulation of ammonium uptake by Escherichia coli through the transporter AmtB. The PII molecules, which reflect the nitrogen status of the cell, inhibit both the production of AmtB and its activity (via the NtrB-NtrC system and the formation of a complex with GlnK, respectively). Other examples of LA-PI controller include copper and zinc transporters, and the redox regulation in photosynthesis. This scheme has thus emerged through evolution in many biological systems, surely because of the benefits it offers in terms of performances (rapid and perfect adaptation) and economy (protein production according to needs).
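The advantage of the LA-PI scheme over a pure integral loop can be illustrated with a toy simulation. This is an assumed caricature, not the paper's model: the protein level plays the integral role (slow synthesis accumulates past error) and post-translational activity tuning plays the proportional role; the gains, setpoint, and disturbance are invented values.

```python
# Toy comparison of a pure integral loop vs. a level-and-activity PI loop
# regulating a variable y toward a setpoint against a constant disturbance.

def simulate(kp, ki, setpoint=1.0, disturbance=0.5, dt=0.001, t_end=30.0):
    y, level = 0.0, 0.0
    errs = []
    t = 0.0
    while t < t_end:
        e = setpoint - y
        level += ki * e * dt            # protein level: integral action
        u = level + kp * e              # activity tuning: proportional action
        y += (u - disturbance) * dt     # simple integrating plant
        errs.append(abs(e))
        t += dt
    return y, max(errs[len(errs) // 2:])   # final value, late-time error

y_i, tail_i = simulate(kp=0.0, ki=1.0)     # pure integral: oscillates
y_pi, tail_pi = simulate(kp=2.0, ki=1.0)   # LA-PI style: damped
print(tail_pi < tail_i)
```

In this caricature the integral-only loop oscillates indefinitely around the setpoint, while adding the fast proportional (activity) channel damps the transient, echoing the "enhanced transient performances" claimed in the abstract.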

  13. Optimal use of data in parallel tempering simulations for the construction of discrete-state Markov models of biomolecular dynamics.

    Science.gov (United States)

    Prinz, Jan-Hendrik; Chodera, John D; Pande, Vijay S; Swope, William C; Smith, Jeremy C; Noé, Frank

    2011-06-28

    Parallel tempering (PT) molecular dynamics simulations have been extensively investigated as a means of efficient sampling of the configurations of biomolecular systems. Recent work has demonstrated how the short physical trajectories generated in PT simulations of biomolecules can be used to construct the Markov models describing biomolecular dynamics at each simulated temperature. While this approach describes the temperature-dependent kinetics, it does not make optimal use of all available PT data, instead estimating the rates at a given temperature using only data from that temperature. This can be problematic, as some relevant transitions or states may not be sufficiently sampled at the temperature of interest, but might be readily sampled at nearby temperatures. Further, the comparison of temperature-dependent properties can suffer from the false assumption that data collected from different temperatures are uncorrelated. We propose here a strategy in which, by a simple modification of the PT protocol, the harvested trajectories can be reweighted, permitting data from all temperatures to contribute to the estimated kinetic model. The method reduces the statistical uncertainty in the kinetic model relative to the single temperature approach and provides estimates of transition probabilities even for transitions not observed at the temperature of interest. Further, the method allows the kinetics to be estimated at temperatures other than those at which simulations were run. We illustrate this method by applying it to the generation of a Markov model of the conformational dynamics of the solvated terminally blocked alanine peptide.
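The single-temperature estimator that the paper improves upon can be sketched in a few lines: trajectory frames, already discretized into states, are counted into a transition matrix at a fixed lag time and row-normalized. The toy two-state example below is purely illustrative (not the authors' code); the paper's contribution is to let reweighted data from all PT temperatures contribute to these counts.

```python
import numpy as np

# Sketch of the basic discrete-state Markov model estimator (illustrative).
def estimate_transition_matrix(traj, n_states, lag=1):
    counts = np.zeros((n_states, n_states))
    for i, j in zip(traj[:-lag], traj[lag:]):
        counts[i, j] += 1                      # count observed transitions
    rowsums = counts.sum(axis=1, keepdims=True)
    # Row-normalize; states never visited default to self-transitions.
    return np.where(rowsums > 0,
                    counts / np.maximum(rowsums, 1e-12),
                    np.eye(n_states))

# Generate a synthetic two-state trajectory from a known chain.
rng = np.random.default_rng(0)
true_T = np.array([[0.9, 0.1], [0.2, 0.8]])
traj = [0]
for _ in range(50000):
    traj.append(rng.choice(2, p=true_T[traj[-1]]))
T = estimate_transition_matrix(np.array(traj), n_states=2)

# T approaches true_T as the trajectory grows; rarely visited transitions are
# exactly where pooling data across temperatures, as the paper proposes, helps.
```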

  14. Towards a large deviation theory for strongly correlated systems

    International Nuclear Information System (INIS)

    Ruiz, Guiomar; Tsallis, Constantino

    2012-01-01

    A large-deviation connection of statistical mechanics is provided by N independent binary variables, the (N→∞) limit yielding Gaussian distributions. The probability of n≠N/2 out of N throws is governed by e^(−Nr), the rate r being related to the entropy. Large deviations for a strongly correlated model characterized by indices (Q,γ) are studied, the (N→∞) limit yielding Q-Gaussians (Q→1 recovers a Gaussian). Its large deviations are governed by e_q^(−N r_q) (∝ 1/N^(1/(q−1)), q>1), where q = (Q−1)/(γ[3−Q]) + 1. This illustration opens the door towards a large-deviation foundation of nonextensive statistical mechanics. -- Highlights: ► We introduce the formalism of relative entropy for a single random binary variable and its q-generalization. ► We study a model of N strongly correlated binary random variables and their large-deviation probabilities. ► The large-deviation probability of the strongly correlated model exhibits a q-exponential decay whose argument is proportional to N, as extensivity requires. ► Our results point to a q-generalized large deviation theory and suggest a large-deviation foundation of nonextensive statistical mechanics.
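The classical exponential decay for independent throws is easy to check numerically. The sketch below is a generic illustration (not the authors' computation): it estimates the tail probability of a fair-coin binomial and compares the empirical decay rate with the relative-entropy rate r(x).

```python
from math import ceil, comb, log

# For N independent fair-coin throws, P(n/N >= x) decays as e^(-N r(x)),
# where r(x) is the relative entropy of x with respect to p = 1/2.
def tail_prob(N, x):
    k = ceil(x * N)
    return sum(comb(N, j) for j in range(k, N + 1)) / 2**N

def rate(x, p=0.5):
    return x * log(x / p) + (1 - x) * log((1 - x) / (1 - p))

for N in (50, 200, 800):
    print(N, -log(tail_prob(N, 0.7)) / N)   # approaches r(0.7) ≈ 0.0823 from above

# The empirical rate converges to r(x) as N grows; the paper's point is that
# strong correlations replace this exponential law with a q-exponential decay.
```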

  15. Large Scale System Safety Integration for Human Rated Space Vehicles

    Science.gov (United States)

    Massie, Michael J.

    2005-12-01

    Since the 1960s man has searched for ways to establish a human presence in space. Unfortunately, the development and operation of human spaceflight vehicles carry significant safety risks that are not always well understood. As a result, the countries with human space programs have felt the pain of loss of lives in the attempt to develop human space travel systems. Integrated System Safety is a process developed through years of experience (since before Apollo and Soyuz) as a way to assess the risks involved in space travel and prevent such losses. The intent of Integrated System Safety is to look at an entire program and put together all the pieces in such a way that the risks can be identified, understood and dispositioned by program management. This process has many inherent challenges, and they need to be explored, understood and addressed. In order to prepare a truly integrated analysis, safety professionals must gain a level of technical understanding of all of the project's pieces and how they interact. Next, they must find a way to present the analysis so the customer can understand the risks and make decisions about managing them. However, every organization in a large-scale project can have different ideas about what is or is not a hazard, what is or is not an appropriate hazard control, and what is or is not adequate hazard control verification. NASA provides some direction on these topics, but interpretations of those instructions can vary widely. Even more challenging is the fact that every individual and organization involved in a project has a different level of risk tolerance. When the discrete hazard controls of the contracts and agreements cannot be met, additional risk must be accepted. However, once one has left the arena of compliance with the known rules, there can no longer be specific ground rules on which to base a decision as to what is acceptable and what is not. The integrator must find common grounds between all parties to achieve

  16. Cancer genetics meets biomolecular mechanism-bridging an age-old gulf.

    Science.gov (United States)

    González-Sánchez, Juan Carlos; Raimondi, Francesco; Russell, Robert B

    2018-02-01

    Increasingly available genomic sequencing data are exploited to identify genes and variants contributing to diseases, particularly cancer. Traditionally, methods to find such variants have relied heavily on allele frequency and/or familial history, often neglecting to consider any mechanistic understanding of their functional consequences. Thus, while the set of known cancer-related genes has increased, for many, their mechanistic role in the disease is not completely understood. This issue highlights a wide gap between the disciplines of genetics, which largely aims to correlate genetic events with phenotype, and molecular biology, which ultimately aims at a mechanistic understanding of biological processes. Fortunately, new methods and several systematic studies have proved illuminating for many disease genes and variants by integrating sequencing with mechanistic data, including biomolecular structures and interactions. These have provided new interpretations for known mutations and suggested new disease-relevant variants and genes. Here, we review these approaches and discuss particular examples where these have had a profound impact on the understanding of human cancers. © 2018 Federation of European Biochemical Societies.

  17. Compressed sensing and the reconstruction of ultrafast 2D NMR data: Principles and biomolecular applications.

    Science.gov (United States)

    Shrot, Yoav; Frydman, Lucio

    2011-04-01

    A topic of active investigation in 2D NMR relates to the minimum number of scans required for acquiring this kind of spectra, particularly when this number is dictated by sampling rather than by sensitivity considerations. Reductions in this minimum number of scans have been achieved by departing from the regular sampling used to monitor the indirect domain, relying instead on non-uniform sampling and iterative reconstruction algorithms. Alternatively, so-called "ultrafast" methods can compress the minimum number of scans involved in 2D NMR all the way down to one, by spatially encoding the indirect-domain information and subsequently recovering it via oscillating field gradients. Given ultrafast NMR's simultaneous recording of the indirect- and direct-domain data, this experiment couples the spectral constraints of these orthogonal domains, often calling for the use of strong acquisition gradients and large filter widths to fulfill the desired bandwidth and resolution demands along all spectral dimensions. This study discusses a way to alleviate these demands, and thereby enhance the method's performance and applicability, by combining spatial encoding with iterative reconstruction approaches. Examples of these new principles are given based on the compressed-sensing reconstruction of biomolecular 2D HSQC ultrafast NMR data, an approach that we show enables a decrease of up to 80% in the gradient strengths demanded by this type of experiment. Copyright © 2011 Elsevier Inc. All rights reserved.
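The interplay of undersampling and iterative reconstruction can be illustrated with a generic sparse-recovery sketch. Everything below is hypothetical (a synthetic three-peak "spectrum" and arbitrary parameters, not the paper's acquisition or reconstruction scheme): a sparse signal is recovered from a quarter of its Fourier samples by iterative soft thresholding (ISTA) on the lasso objective.

```python
import numpy as np

# Illustrative compressed-sensing sketch: minimize 0.5*||A s - y||^2 + lam*||s||_1.
rng = np.random.default_rng(0)
n, m = 256, 64                                   # spectrum size, measurements
spectrum = np.zeros(n)
spectrum[[20, 75, 180]] = [1.0, 0.6, 0.8]        # three hypothetical peaks

F = np.fft.fft(np.eye(n)) / np.sqrt(n)           # unitary DFT matrix
rows = rng.choice(n, size=m, replace=False)      # random undersampling pattern
A = F[rows]
y = A @ spectrum                                 # the "measured" data

lam, step = 0.01, 1.0                            # step <= 1/||A^H A|| holds here
s = np.zeros(n, dtype=complex)
for _ in range(500):
    z = s - step * (A.conj().T @ (A @ s - y))    # gradient step on the data term
    mag = np.abs(z)
    s = z * np.maximum(1.0 - lam * step / np.maximum(mag, 1e-12), 0.0)  # soft threshold

recovered = np.abs(s)
# The three peaks are recovered (up to a small soft-threshold bias) from
# only 64 of 256 Fourier samples.
```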

  18. An automated system designed for large scale NMR data deposition and annotation: application to over 600 assigned chemical shift data entries to the BioMagResBank from the Riken Structural Genomics/Proteomics Initiative internal database

    International Nuclear Information System (INIS)

    Kobayashi, Naohiro; Harano, Yoko; Tochio, Naoya; Nakatani, Eiichi; Kigawa, Takanori; Yokoyama, Shigeyuki; Mading, Steve; Ulrich, Eldon L.; Markley, John L.; Akutsu, Hideo; Fujiwara, Toshimichi

    2012-01-01

    Biomolecular NMR chemical shift data are key information for the functional analysis of biomolecules and the development of new techniques for NMR studies utilizing chemical shift statistical information. Structural genomics projects are major contributors to the accumulation of protein chemical shift information. The management of the large quantities of NMR data generated by each project in a local database and the transfer of the data to the public databases are still formidable tasks because of the complicated nature of NMR data. Here we report an automated and efficient system developed for the deposition and annotation of a large number of data sets including ¹H, ¹³C and ¹⁵N resonance assignments used for the structure determination of proteins. We have demonstrated the feasibility of our system by applying it to over 600 entries from the internal database generated by the RIKEN Structural Genomics/Proteomics Initiative (RSGI) to the public database, BioMagResBank (BMRB). We have assessed the quality of the deposited chemical shifts by comparing them with those predicted from the PDB coordinate entry for the corresponding protein. The same comparison for other matched BMRB/PDB entries deposited from 2001 to 2011 has been carried out and the results suggest that the RSGI entries greatly improved the quality of the BMRB database. Since the entries include chemical shifts acquired under strikingly similar experimental conditions, these NMR data can be expected to be a promising resource to improve current technologies as well as to develop new NMR methods for protein studies.

  19. A coordination model for ultra-large scale systems of systems

    Directory of Open Access Journals (Sweden)

    Manuela L. Bujorianu

    2013-11-01

    Full Text Available Ultra-large multi-agent systems are becoming increasingly popular due to the quick decay of individual production costs and the potential for speeding up the solving of complex problems. Examples include nano-robots, systems of nano-satellites for dangerous meteorite detection, and cultures of stem cells for organ regeneration or nerve repair. The topics associated with these systems are usually dealt with in the theories of intelligent swarms or biologically inspired computation systems. Stochastic models play an important role and are based on various formulations of statistical mechanics. In these cases, the main assumption is that the swarm elements have a simple behaviour and that some average properties can be deduced for the entire swarm. In contrast, complex systems in areas like aeronautics are formed by elements with sophisticated behaviour, which are even autonomous. In situations like this, a new approach to swarm coordination is necessary. We present a stochastic model where the swarm elements are communicating autonomous systems, the coordination is separated from the components' autonomous activity, and the entire swarm can be abstracted as a piecewise deterministic Markov process, which constitutes one of the most popular models in stochastic control. Keywords: ultra large multi-agent systems, system of systems, autonomous systems, stochastic hybrid systems.

  20. MONITORING OF LARGE INSTABLE AREAS: system reliability and new tools.

    Science.gov (United States)

    Leandro, G.; Mucciarelli, M.; Pellicani, R.; Spilotro, G.

    2009-04-01

    The monitoring of unstable or potentially unstable areas is a necessary operation whenever the conditions of risk cannot be removed and mitigation measures must be applied. In the Italian Apennine regions there are many urban and extra-urban areas affected by instability for which it is impracticable to remove the hazard conditions, because of size and cost problems. The technological evolution exportable to the field of land instability monitoring is particularly lively and allows the use of warning systems unthinkable just a few years ago. However, the monitoring of unstable or potentially unstable areas requires deep knowledge of the specific problems, without which the reliability of the system may be dangerously overestimated. The movement may arise, indeed, in areas not covered by instrumentation, or covered with vegetation that prevents the acquisition of both reflected signals in multi-beam laser techniques and radar signals. Environmental conditions (wind, concentrated sources of light, temperature changes, presence of animals) may also invalidate the accuracy of the measures by introducing modulations or disturbances at a level well above the alarm threshold, consequently forcing the warning thresholds to be raised. The authors have gained long experience with the observation and monitoring of some large landslides in the Southern Apennines (Aliano, Buoninventre, Calciano, Carlantino, etc.) and of unstable areas at regional scale. One of the most important experiences concerns landslides over extensive areas, where unstable and stable zones coexist along the transverse and longitudinal axes. In many of these cases accurate control of the movement at selected points is needed to evaluate the trend of displacement velocity, which can be achieved by means of a single-beam laser. The control of these movements, however, does not provide information on the stress pattern in the stable areas. Among the sensitive precursors, acoustic

  1. Biomolecular detection using a metal semiconductor field effect transistor

    Science.gov (United States)

    Estephan, Elias; Saab, Marie-Belle; Buzatu, Petre; Aulombard, Roger; Cuisinier, Frédéric J. G.; Gergely, Csilla; Cloitre, Thierry

    2010-04-01

    In this work, our attention was drawn towards developing affinity-based electrical biosensors using a MESFET (metal semiconductor field effect transistor). Semiconductor (SC) surfaces must be prepared before incubation with biomolecules. The peptide route was adopted to bypass the limitations of other types of surface modification, namely unwanted nonspecific interactions. Because such peptides specifically recognize materials, controlled functionalization can be achieved. Peptides were produced by phage display technology using a library of M13 bacteriophage. After several rounds of bio-panning, the phages presenting affinity for the GaAs SC were isolated; the DNA of these specific phages was sequenced, and the peptide with the highest affinity was synthesized and biotinylated. To explore the possibility of electrical detection, a MESFET fabricated on the GaAs SC was used to detect streptavidin via the biotinylated peptide in the presence of bovine serum albumin. After each surface modification step, the IDS (current between the drain and the source) of the transistor was measured, and a decrease in intensity was detected. Furthermore, fluorescence microscopy was used to prove the specificity of this peptide and the specific localization of biomolecules. In conclusion, the feasibility of producing an electrical biosensor using a MESFET has been demonstrated. Controlled placement, specific localization and detection of biomolecules on a MESFET transistor were achieved without covering the drain and the source. This method of functionalization and detection can be of great utility for biosensing applications, opening a new way for developing bioFETs (biomolecular field-effect transistors).

  2. Soft Supercharging of Biomolecular Ions in Electrospray Ionization Mass Spectrometry

    Science.gov (United States)

    Chingin, Konstantin; Xu, Ning; Chen, Huanwen

    2014-06-01

    The charge states of biomolecular ions in ESI-MS can be significantly increased by the addition of low-vapor-pressure supercharging (SC) reagents to the spraying solution. Despite the considerable interest from the community, the mechanistic aspects of SC are not well understood and are hotly debated. Arguments that denaturation accounts for the increased charging observed in proteins sprayed from aqueous solutions containing SC reagent have been published widely, but often with incomplete or ambiguous supporting data. In this work, we explored the ESI-MS charging and SC behavior of several biopolymers, including proteins and DNA oligonucleotides. Analytes were ionized from 100 mM ammonium acetate (NH4Ac) aqueous buffer in both positive (ESI+) and negative (ESI-) ion modes. SC was induced either with m-NBA or by elevating the temperature of the ESI capillary. For all the analytes studied, we found striking differences in the ESI-MS response to these two modes of activation. The data suggest that activation with m-NBA results in more extensive analyte charging with a lower degree of denaturation. When working solution with m-NBA was analyzed at elevated temperatures, the SC effect from m-NBA was neutralized; instead, the net SC effect was similar to that achieved by thermal activation only. Overall, our observations indicate that SC reagents enhance ESI charging of biomolecules via a distinctly different mechanism than the traditional approaches based on analyte denaturation. Instead, the data support the hypothesis that the SC phenomenon involves a direct interaction between the biopolymer and the SC reagent occurring in evaporating ESI droplets.

  3. Stability of superconducting cables for use in large magnet systems

    International Nuclear Information System (INIS)

    Tateishi, Hiroshi; Schmidt, C.

    1992-01-01

    The construction of large superconducting magnets requires the development of complicated conductor types, which can fulfill the specific requirements of different types of magnets. A rather hard boundary condition for large magnets is the presence of fast changing magnetic fields. In the Institute of Technical Physics of the Karlsruhe Nuclear Research Center, Germany, a superconducting cable was developed for use in poloidal field coils in Tokamak experiments. This 'POLO'-cable exhibits low losses in a magnetic ac-field and a high stability margin. In the present article the requirements on a superconducting cable are described, as well as the mechanisms of ac-losses and the calculation of the stability limit. Calculated values are compared with experimental data. Some unresolved problems concerning the stability of large magnets are discussed taking the example of the POLO-cable. (author)

  4. Large momentum transfer electron scattering from few-nucleon systems

    International Nuclear Information System (INIS)

    Arnold, R.G.

    1979-08-01

    A review is given of the experimental results from a series of measurements at SLAC of large momentum transfer (Q² > 20 fm⁻²) electron scattering at forward angles from nuclei with A ≤ 4. Theoretical interpretations of these data in terms of traditional nuclear physics models and in terms of quark constituent models are described. Some physics questions for future experiments are explored, and a preview of possible future measurements of magnetic structure functions of light nuclei at large Q² is given

  5. Cryogenic systems for large superconducting accelerators/storage rings

    International Nuclear Information System (INIS)

    Brown, D.P.

    1981-01-01

    Particle accelerators and storage rings which utilize superconducting magnets have presented cryogenic system designers, as well as magnet designers, with many new challenges. When such accelerators were first proposed, little operational experience existed to guide the design. Two superconducting accelerators, complete with cryogenic systems, have been designed and are now under construction. These are the Fermilab Doubler Project and the Brookhaven National Laboratory ISABELLE Project. The cryogenic systems which developed at these two laboratories share many common characteristics, especially as compared to earlier cryogenic systems. Because of this commonality, these characteristics can be reasonably taken as also being representative of future systems. There are other areas in which the two systems are dissimilar. In those areas, it is not possible to state which, if either, will be chosen by future designers. Some of the design parameters for the two systems are given

  6. Toxicity evaluation of PEDOT/biomolecular composites intended for neural communication electrodes

    International Nuclear Information System (INIS)

    Asplund, M; Thaning, E; Von Holst, H; Lundberg, J; Sandberg-Nordqvist, A C; Kostyszyn, B; Inganaes, O

    2009-01-01

    Electrodes coated with the conducting polymer poly(3,4-ethylenedioxythiophene) (PEDOT) possess attractive electrochemical properties for stimulation or recording in the nervous system. Biomolecules, added as counter ions during electropolymerization, could further improve the biomaterial properties, eliminating the need for surfactant counter ions in the process. Such PEDOT/biomolecular composites, using heparin or hyaluronic acid, have previously been investigated electrochemically. In the present study, their biocompatibility is evaluated. An agarose overlay assay using L929 fibroblasts, and elution and direct contact tests on human neuroblastoma SH-SY5Y cells, are applied to investigate cytotoxicity in vitro. PEDOT:heparin was further evaluated in vivo through polymer-coated implants in rodent cortex. No cytotoxic response was seen to any of the PEDOT materials tested. The examination of cortical tissue exposed to polymer-coated implants showed extensive glial scarring irrespective of implant material (Pt:polymer or Pt). However, the quantified immunological response, assessed through distance measurements from the implant site to the closest neuron and counts of ED1+ cell density around the implant, was comparable to that of platinum controls. These results indicate that PEDOT:heparin surfaces were non-cytotoxic and show no marked difference in immunological response in cortical tissue compared to pure platinum controls.

  7. A Starting Point for Fluorescence-Based Single-Molecule Measurements in Biomolecular Research

    Directory of Open Access Journals (Sweden)

    Alexander Gust

    2014-09-01

    Full Text Available Single-molecule fluorescence techniques are ideally suited to provide information about the structure-function-dynamics relationship of a biomolecule, as static and dynamic heterogeneity can be easily detected. However, what type of single-molecule fluorescence technique is suited for which kind of biological question, and what are the obstacles on the way to a successful single-molecule microscopy experiment? In this review, we provide practical insights into fluorescence-based single-molecule experiments, aimed at scientists who wish to take their experiments to the single-molecule level. We especially focus on fluorescence resonance energy transfer (FRET) experiments, as these are a widely employed tool for the investigation of biomolecular mechanisms. We will guide the reader through the most critical steps that determine the success and quality of diffusion-based confocal and immobilization-based total internal reflection fluorescence microscopy. We discuss the specific chemical and photophysical requirements that make fluorescent dyes suitable for single-molecule fluorescence experiments. Most importantly, we review recently emerged photoprotection systems as well as passivation and immobilization strategies that enable the observation of fluorescently labeled molecules under biocompatible conditions. Moreover, we discuss how the optical single-molecule toolkit has been extended in recent years to capture the physiological complexity of a cell, making it even more relevant for biological research.

  8. A Quick-responsive DNA Nanotechnology Device for Bio-molecular Homeostasis Regulation.

    Science.gov (United States)

    Wu, Songlin; Wang, Pei; Xiao, Chen; Li, Zheng; Yang, Bing; Fu, Jieyang; Chen, Jing; Wan, Neng; Ma, Cong; Li, Maoteng; Yang, Xiangliang; Zhan, Yi

    2016-08-10

    Physiological processes such as metabolism, cell apoptosis and immune responses must be strictly regulated to maintain their homeostasis and achieve their normal physiological functions. The speed with which bio-molecular homeostatic regulation occurs directly determines the ability of an organism to adapt to conditional changes. To produce a quick-responsive regulatory system that can be easily utilized for various types of homeostasis, a device called nano-fingers that facilitates the regulation of physiological processes was constructed using DNA origami nanotechnology. This nano-fingers device functioned in linked open and closed phases using two types of DNA tweezers, which were covalently coupled with aptamers that captured specific molecules when the tweezer arms were sufficiently close. Via this specific interaction mechanism, certain physiological processes could be simultaneously regulated from two directions, by capturing one biofactor and releasing the other, to enhance the regulatory capacity of the device. To validate the universal applicability of this device, regulation of the homeostasis of the blood coagulant thrombin was attempted using the nano-fingers device. It was successfully demonstrated that this nano-fingers device achieved coagulation buffering upon the input of fuel DNA. This nano-device could also be utilized to regulate the homeostasis of other types of bio-molecules.

  9. Pre-Clinical Tests of an Integrated CMOS Biomolecular Sensor for Cardiac Diseases Diagnosis.

    Science.gov (United States)

    Lee, Jen-Kuang; Wang, I-Shun; Huang, Chi-Hsien; Chen, Yih-Fan; Huang, Nien-Tsu; Lin, Chih-Ting

    2017-11-26

    Coronary artery disease and its related complications pose great threats to human health. In this work, we aim to clinically evaluate a CMOS field-effect biomolecular sensor for the cardiac biomarkers cardiac-specific troponin-I (cTnI), N-terminal prohormone brain natriuretic peptide (NT-proBNP), and interleukin-6 (IL-6). The CMOS biosensor is implemented via a standard commercialized 0.35 μm CMOS process. In buffer conditions, the developed CMOS biosensor achieves detection limits for IL-6, cTnI, and NT-proBNP of 45 pM, 32 pM, and 32 pM, respectively. Furthermore, in clinical serum conditions, the developed CMOS biosensor shows good correlation with enzyme-linked immunosorbent assay (ELISA) results obtained from a hospital central laboratory. Based on this work, the CMOS field-effect biosensor shows good potential for meeting the needs of a point-of-care testing (POCT) system for heart disease diagnosis.

  10. A variational approach to moment-closure approximations for the kinetics of biomolecular reaction networks

    Science.gov (United States)

    Bronstein, Leo; Koeppl, Heinz

    2018-01-01

    Approximate solutions of the chemical master equation and the chemical Fokker-Planck equation are an important tool in the analysis of biomolecular reaction networks. Previous studies have highlighted a number of problems with the moment-closure approach used to obtain such approximations, calling it an ad hoc method. In this article, we give a new variational derivation of moment-closure equations which provides us with an intuitive understanding of their properties and failure modes and allows us to correct some of these problems. We use mixtures of product-Poisson distributions to obtain a flexible parametric family which solves the commonly observed problem of divergences at low system sizes. We also extend the recently introduced entropic matching approach to arbitrary ansatz distributions and Markov processes, demonstrating that it is a special case of variational moment closure. This provides us with a particularly principled approximation method. Finally, we extend the above approaches to cover the approximation of multi-time joint distributions, resulting in a viable alternative to process-level approximations which are often intractable.
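A minimal example of the moment hierarchy the paper analyzes, closed with the common normal (Gaussian) closure, can be written down for a toy birth-plus-dimerization network. The rates and the Gillespie comparison below are purely illustrative, not taken from the paper.

```python
import numpy as np

# Toy network:  0 -> A (rate k0),  A + A -> 0 (propensity k2*x*(x-1)).
# The mean equation couples to the second moment, the second to the third;
# the hierarchy is truncated with the Gaussian ansatz <x^3> = 3<x><x^2> - 2<x>^3.
k0, k2 = 10.0, 0.05   # hypothetical rate constants

def closure_rhs(M1, M2):
    M3 = 3.0 * M1 * M2 - 2.0 * M1**3                                  # normal closure
    dM1 = k0 - 2.0 * k2 * (M2 - M1)                                   # d<x>/dt
    dM2 = k0 * (2.0 * M1 + 1.0) - 4.0 * k2 * (M3 - 2.0 * M2 + M1)     # d<x^2>/dt
    return dM1, dM2

M1, M2, dt = 0.0, 0.0, 1e-3          # Euler integration to steady state
for _ in range(200_000):
    d1, d2 = closure_rhs(M1, M2)
    M1, M2 = M1 + dt * d1, M2 + dt * d2

# Gillespie simulation of the same system for comparison (time-weighted mean).
rng = np.random.default_rng(1)
x, t, acc, tot = 0, 0.0, 0.0, 0.0
while t < 2000.0:
    a1, a2 = k0, k2 * x * (x - 1)
    tau = rng.exponential(1.0 / (a1 + a2))
    if t > 100.0:                    # discard burn-in transient
        acc += x * tau
        tot += tau
    t += tau
    if rng.random() < a1 / (a1 + a2):
        x += 1                       # birth
    else:
        x -= 2                       # dimerization removes two molecules
mean_ssa = acc / tot

# For this moderately sized system the closed moment equations reproduce the
# simulated mean well; at low system sizes such closures can diverge, which is
# the failure mode the variational treatment addresses.
```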

  11. System seismic analysis of an innovative primary system for a large pool type LMFBR plant

    International Nuclear Information System (INIS)

    Pan, Y.C.; Wu, T.S.; Cha, B.K.; Burelbach, J.; Seidensticker, R.

    1984-01-01

    The system seismic analysis of an innovative primary system for a large pool-type liquid metal fast breeder reactor (LMFBR) plant is presented. In this primary system, the reactor core is supported in a way which differs significantly from that used in previous designs. The analytical model developed for this study is a three-dimensional finite element model including one-half of the primary system cut along the plane of symmetry. The model includes the deck and deck-mounted components, the reactor vessel, the core support structure, the core barrel, the radial neutron shield, the redan, and the conical support skirt. The sodium contained in the primary system is treated as a lumped mass appropriately distributed among various components. The significant seismic behavior as well as the advantages of this primary system design are discussed in detail

  12. Large combined heat and power plants in sustainable energy systems

    DEFF Research Database (Denmark)

    Lund, Rasmus Søgaard; Mathiesen, Brian Vad

    2015-01-01

    It is concluded that the CCGT CHP plant is the most feasible, both from a technical analysis and from a market economic analysis with electricity exchange. It is found that the current economic framework for large CHP plants in Denmark generates a mismatch between socio economy and business economy as well...

  13. Solving large linear systems in an implicit thermohaline ocean model

    NARCIS (Netherlands)

    de Niet, Arie Christiaan

    2007-01-01

    The climate on earth is largely determined by the global ocean circulation. Hence it is important to predict how the flow will react to perturbations caused by, for example, melting icecaps. To answer questions about the stability of the global ocean flow, a computer model has been developed that is able to

  14. Seafloor mapping of large areas using multibeam system - Indian experience

    Digital Repository Service at National Institute of Oceanography (India)

    Kodagali, V.N.; KameshRaju, K.A; Ramprasad, T.

    averaged and merged to produce large-area maps. Maps were generated at scales of 1 million and 1.5 million, covering an area of about 2 million sq. km in a single map. Depth contours were also generated at regular intervals. A computer program was developed to convert the depth data...

  15. System Dynamics Simulation of Large-Scale Generation System for Designing Wind Power Policy in China

    Directory of Open Access Journals (Sweden)

    Linna Hou

    2015-01-01

    Full Text Available This paper focuses on the impacts of renewable energy policy on a large-scale power generation system, including thermal power, hydropower, and wind power generation. As one of the most important clean energy sources, wind energy has been rapidly developed around the world. In recent years, however, serious waste of wind power equipment and investment in China has led to many problems in the industry, from wind power planning to grid integration. One way of overcoming the difficulty is to analyze the influence of wind power policy on the generation system. This paper builds a system dynamics (SD) model of energy generation to simulate the results of wind energy generation policies in a complex system, and scenario analysis is used to compare the effectiveness and efficiency of these policies. The case study shows that the combination of a lower portfolio goal with a higher benchmark price and that of a higher portfolio goal with a lower benchmark price differ greatly in both effectiveness and efficiency. On the other hand, combinations of uniformly lower or higher portfolio goals and benchmark prices have similar efficiency but different effectiveness. Finally, an optimal policy combination can be chosen on the basis of policy analysis in the large-scale power system.
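The stock-and-flow logic of a system dynamics model can be sketched in a few lines. All parameters, variable names, and the investment rule below are hypothetical (not the paper's model): installed wind capacity is a stock whose inflow responds to the gap between a policy portfolio goal and the current wind share, scaled by a price incentive, and different policy mixes are compared by the final share they produce.

```python
# Minimal stock-and-flow sketch of the SD idea (illustrative parameters only).
def wind_share(goal_share, price_incentive, years=20):
    wind, total = 10.0, 1000.0            # GW of wind and of total capacity
    for _ in range(years):
        gap = max(goal_share - wind / total, 0.0)      # distance to the goal
        invest = gap * total * 0.5 * price_incentive   # new wind built this year
        wind += invest
        total += invest + 20.0            # 20 GW/yr of non-wind baseline growth
    return wind / total

# Scenario analysis: compare two hypothetical policy combinations.
low_goal_high_price = wind_share(0.15, 1.5)
high_goal_low_price = wind_share(0.30, 0.5)
# A stronger price incentive pushes the share closer to the portfolio goal,
# mirroring the kind of scenario comparison the paper performs at scale.
```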

  16. Emergent Semantics Interoperability in Large-Scale Decentralized Information Systems

    CERN Document Server

    Cudré-Mauroux, Philippe

    2008-01-01

    Peer-to-peer systems are evolving with new information-system architectures, leading to the idea that the principles of decentralization and self-organization will offer new approaches in informatics, especially for systems that scale with the number of users or for which central authorities do not prevail. This book describes a new way of building global agreements (semantic interoperability) based only on decentralized, self-organizing interactions.

  17. Optimized preventive replacement policy for large cascade systems

    International Nuclear Information System (INIS)

    Kretzen, H.H.

    1986-01-01

    The repair-bottleneck problem as a limiting factor for system reliability can be overcome. Design need only cover the steady state, with wearout-induced accumulations of failures precluded by preventive replacements and subsequent recycling. As a result, a reliable system appears to be feasible on an economic basis, with detailed optimization left to more precise cost-benefit studies. As a reference system, the radio-frequency-generator cascade of a single-cell linear accelerator is considered. (DG)

  18. DYMAC - The first stab at large safeguards systems

    International Nuclear Information System (INIS)

    Augustson, R.H.

    1987-01-01

    This article discusses the evolution of the Los Alamos Safeguards Program that led to the concept of dynamic materials accountability and control (DYMAC). A plant-specific DYMAC system was installed at the Los Alamos Plutonium Facility, TA-55, in 1978. This system was turned over to plant management and is still operational today. Systems like DYMAC/TA-55 provide necessary and realistic data on the workability of near-real-time accountability in nuclear material processing facilities.

  19. Panoramic, large-screen, 3-D flight display system design

    Science.gov (United States)

    Franklin, Henry; Larson, Brent; Johnson, Michael; Droessler, Justin; Reinhart, William F.

    1995-01-01

    The report documents and summarizes the results of the required evaluations specified in the SOW and the design specifications for the selected display system hardware. Also included are the proposed development plan and schedule as well as the estimated rough order of magnitude (ROM) cost to design, fabricate, and demonstrate a flyable prototype research flight display system. The thrust of the effort was development of a complete understanding of the user/system requirements for a panoramic, collimated, 3-D flyable avionic display system and the translation of the requirements into an acceptable system design for fabrication and demonstration of a prototype display in the early 1997 time frame. Eleven display system design concepts were presented to NASA LaRC during the program, one of which was down-selected to a preferred display system concept. A set of preliminary display requirements was formulated. The state of the art in image source technology, 3-D methods, collimation methods, and interaction methods for a panoramic, 3-D flight display system were reviewed in depth and evaluated. Display technology improvements and risk reductions associated with maturity of the technologies for the preferred display system design concept were identified.

  20. A data analysis expert system for large established distributed databases

    Science.gov (United States)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-01-01

    A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision Support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storage of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query language vocabulary and data analysis heuristics; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.

  1. Computational methods to study the structure and dynamics of biomolecules and biomolecular processes from bioinformatics to molecular quantum mechanics

    CERN Document Server

    2014-01-01

    Since the second half of the 20th century machine computations have played a critical role in science and engineering. Computer-based techniques have become especially important in molecular biology, since they often represent the only viable way to gain insights into the behavior of a biological system as a whole. The complexity of biological systems, which usually needs to be analyzed on different time- and size-scales and with different levels of accuracy, requires the application of different approaches, ranging from comparative analysis of sequences and structural databases, to the analysis of networks of interdependence between cell components and processes, through coarse-grained modeling to atomically detailed simulations, and finally to molecular quantum mechanics. This book provides a comprehensive overview of modern computer-based techniques for computing the structure, properties and dynamics of biomolecules and biomolecular processes. The twenty-two chapters, written by scientists from all over t...

  2. Expert system shell to reason on large amounts of data

    Science.gov (United States)

    Giuffrida, Gionanni

    1994-01-01

    Current database management systems (DBMSs) do not provide a sophisticated environment for developing rule-based expert system applications. Some of the newer DBMSs come with some sort of rule mechanism; these are active and deductive database systems. However, neither is full-featured enough to support implementations based entirely on rules. On the other hand, current expert system shells do not provide any link to external databases: all data are kept in the system working memory, which is maintained in main memory. For some applications, the limited size of the available working memory can constrain development; typically these are applications that require reasoning over huge amounts of data that do not fit into main memory. Moreover, in some cases these data are already available in database systems and are continuously updated while the expert system is running. This paper proposes an architecture that employs knowledge discovery techniques to reduce the amount of data to be stored in main memory; in this architecture a standard DBMS is coupled with a rule-based language. The data are stored in the DBMS, and an interface between the two systems is responsible for inducing knowledge from the set of relations. The induced knowledge is then transferred to the rule-based language's working memory.
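The architecture described — bulk data in a DBMS, with an interface layer that induces compact facts for a rule engine's working memory — can be sketched with Python's standard-library `sqlite3` module. The schema, the summarization query, and the alarm rule below are illustrative assumptions, not the paper's actual system.

```python
# Sketch of coupling a standard DBMS with a rule-based working memory.
# Schema, induced facts, and the rule are illustrative assumptions.
import sqlite3

# 1. The bulk data live in the DBMS, not in working memory.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings(sensor TEXT, value REAL)")
db.executemany("INSERT INTO readings VALUES (?, ?)",
               [("s1", 3.2), ("s1", 9.7), ("s2", 1.1), ("s2", 8.4)])

# 2. The interface layer induces a compact summary (one fact per sensor)
#    instead of loading every row into main memory.
working_memory = {
    sensor: {"max": vmax}
    for sensor, vmax in db.execute(
        "SELECT sensor, MAX(value) FROM readings GROUP BY sensor")
}

# 3. A tiny forward-chaining rule runs against the summary facts only.
ALARM_THRESHOLD = 9.0  # hypothetical rule parameter
alarms = [s for s, facts in working_memory.items()
          if facts["max"] > ALARM_THRESHOLD]
print(alarms)
```

The point of the design is in step 2: the rule engine never sees the raw relation, only facts induced from it, so working memory stays small even when the underlying table is huge.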

  3. Mechanical design for a large fusion laser system

    International Nuclear Information System (INIS)

    Hurley, C.A.

    1979-01-01

    The Nova Mechanical Systems Group at LLL is responsible for the design, fabrication, and installation of all laser chain components, for the stable support structure that holds them, and for the beam lines that transport the laser beam to the target system. This paper is an overview of the group's engineering effort, emphasizing new developments

  4. Rapid Transient Fault Insertion in Large Digital Systems

    NARCIS (Netherlands)

    Rohani, A.; Kerkhoff, Hans G.

    This paper presents a technique for rapid transient-fault injection, with respect to CPU time, for performing simulation-based fault injection in complex systems-on-chip (SoCs). The proposed approach can be applied to complex circuits, as it is not required to modify the top-level modules of a

  5. Future hydrogen markets for large-scale hydrogen production systems

    International Nuclear Information System (INIS)

    Forsberg, Charles W.

    2007-01-01

    The cost of delivered hydrogen includes production, storage, and distribution. For equal production costs, large users (>10⁶ m³/day) will favor high-volume centralized hydrogen production technologies to avoid collection costs for hydrogen from widely distributed sources. Potential hydrogen markets were examined to identify and characterize those markets that will favor large-scale hydrogen production technologies. The two high-volume centralized hydrogen production technologies are nuclear energy and fossil energy with carbon dioxide sequestration. The potential markets for these technologies are: (1) production of liquid fuels (gasoline, diesel, and jet), including liquid fuels with no net greenhouse gas emissions, and (2) peak electricity production. The development of high-volume centralized hydrogen production technologies requires an understanding of the markets to (1) define hydrogen production requirements (purity, pressure, volumes, need for co-product oxygen, etc.); (2) define and develop technologies to use the hydrogen; and (3) create the industrial partnerships to commercialize such technologies. (author)

  6. Finding a Roadmap to achieve Large Neuromorphic Hardware Systems

    Directory of Open Access Journals (Sweden)

    Jennifer Hasler

    2013-09-01

    Full Text Available Neuromorphic systems are gaining increasing importance in an era where CMOS digital computing techniques are meeting hard physical limits. These silicon systems mimic extremely energy efficient neural computing structures, potentially both for solving engineering applications as well as understanding neural computation. Towards this end, the authors provide a glimpse at what the technology evolution roadmap looks like for these systems so that Neuromorphic engineers may gain the same benefit of anticipation and foresight that IC designers gained from Moore's law many years ago. Scaling of energy efficiency, performance, and size will be discussed as well as how the implementation and application space of Neuromorphic systems are expected to evolve over time.

  7. Iterative algorithms for large sparse linear systems on parallel computers

    Science.gov (United States)

    Adams, L. M.

    1982-01-01

    Algorithms for assembling in parallel the sparse system of linear equations that result from finite difference or finite element discretizations of elliptic partial differential equations, such as those that arise in structural engineering are developed. Parallel linear stationary iterative algorithms and parallel preconditioned conjugate gradient algorithms are developed for solving these systems. In addition, a model for comparing parallel algorithms on array architectures is developed and results of this model for the algorithms are given.
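The linear stationary iterative methods mentioned above can be illustrated with a serial Jacobi iteration for the tridiagonal system arising from a finite-difference discretization of -u'' = 1 on (0, 1) with u(0) = u(1) = 0. Each component update depends only on the previous iterate, which is exactly what makes the method attractive on parallel array architectures. The grid size and sweep count are illustrative choices, not from the paper.

```python
# Jacobi iteration for the tridiagonal system from -u'' = 1, u(0)=u(1)=0.
# Every component of a sweep reads only the previous iterate, so each
# sweep is fully data-parallel (here written serially for clarity).
n = 19                  # interior grid points
h = 1.0 / (n + 1)       # grid spacing
f = [1.0] * n           # right-hand side
u = [0.0] * n           # initial guess

for _ in range(5000):   # Jacobi sweeps
    u = [(h * h * f[i]
          + (u[i - 1] if i > 0 else 0.0)
          + (u[i + 1] if i < n - 1 else 0.0)) / 2.0
         for i in range(n)]

# The exact solution is u(x) = x(1 - x)/2; the midpoint x = 0.5 is index 9.
print(u[9])
```

Since the discretization is exact for quadratic solutions, the converged iterate matches u(x) = x(1 - x)/2 to iteration tolerance.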

  8. Rated power factor and excitation system of large turbine generator

    International Nuclear Information System (INIS)

    Tokumitsu, Iwao; Watanabe, Takashi; Banjou, Minoru.

    1979-01-01

    As for the rated power factor of turbine generators for thermal power stations, 90% has been adopted since around 1960. On the other hand, power transmission system has entered 500 kV age, and 1,000 kV transmission is expected in the near future. As for the supply of reactive power from thermal and nuclear turbine generators, the necessity of supplying leading reactive power has rather increased. Now, the operating power factor of thermal and nuclear generators becomes 96 to 100% actually. As for the excess stability of turbine generators owing to the strengthening of transmission system and the adoption of super-high voltage, the demand of strict conditions can be dealt with by the adoption of super-fast response excitation system of thyristor shunt winding self exciting type. The adoption of the turbine generators with 90 to 95% power factor and the adoption of the thyristor shunt winding self exciting system were examined and evaluated. The rated power factor of generators, excitation system and economy of adopting these systems are explained. When the power factor of generators is increased from 0.9 to 0.95, about 6% of saving can be obtained in the installation cost. When the thyristor shunt winding self excitation is adopted, it is about 10% more economical than AC excitation. (Kako, I.)

  9. Etoile Project : Social Intelligent ICT-System for very large scale education in complex systems

    Science.gov (United States)

    Bourgine, P.; Johnson, J.

    2009-04-01

    The project will devise new theory and implement new ICT-based methods of delivering high-quality, low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible. It comprises a Socially Intelligent Resource Mining system to gather large volumes of high-quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a 'socially intelligent assessment system'. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex Systems Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) providing new scalable ICT-based methods for very low cost scientific education, (ii) creating new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) providing a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.

  10. Review of DC System Technologies for Large Scale Integration of Wind Energy Systems with Electricity Grids

    Directory of Open Access Journals (Sweden)

    Sheng Jie Shao

    2010-06-01

    Full Text Available The ever-increasing development and availability of power electronic systems is the underpinning technology that enables large-scale integration of wind generation plants with the electricity grid. As the size and power capacity of wind turbines continue to increase, so does the need to place these significantly large structures at offshore locations. DC grids and associated power transmission technologies provide opportunities for cost reduction and minimization of electricity-grid impact, as the bulk power is concentrated at a single point of entry. As a result, planning, optimization, and impact can be studied and carefully controlled, minimizing the risk of the investment as well as power system stability issues. This paper discusses the key technologies associated with DC grids for offshore wind farm applications.

  11. A large-scale cryoelectronic system for biological sample banking

    Science.gov (United States)

    Shirley, Stephen G.; Durst, Christopher H. P.; Fuchs, Christian C.; Zimmermann, Heiko; Ihmig, Frank R.

    2009-11-01

    We describe a polymorphic electronic infrastructure for managing biological samples stored over liquid nitrogen. As part of this system we have developed new cryocontainers and carrier plates attached to Flash memory chips to have a redundant and portable set of data at each sample. Our experimental investigations show that basic Flash operation and endurance is adequate for the application down to liquid nitrogen temperatures. This identification technology can provide the best sample identification, documentation and tracking that brings added value to each sample. The first application of the system is in a worldwide collaborative research towards the production of an AIDS vaccine. The functionality and versatility of the system can lead to an essential optimization of sample and data exchange for global clinical studies.

  12. Power System Operation with Large Scale Wind Power Integration

    DEFF Research Database (Denmark)

    Suwannarat, A.; Bak-Jensen, B.; Chen, Z.

    2007-01-01

    The Danish power system is starting to face problems in integrating thousands of megawatts of wind power, which is produced in a stochastic manner due to natural wind fluctuations. With wind power capacities increasing, the Danish Transmission System Operator (TSO) is faced with new challenges related to the uncertain nature of wind power. In this paper, proposed models of generation and control systems are presented which analyze the deviation of power exchange at the western Danish-German border, taking into account the fluctuating nature of wind power. The performance of the secondary control of the thermal power plants and of the spinning reserve control from the Combined Heat and Power (CHP) units in achieving active power balance with increased wind power penetration is presented.

  13. On the response of large systems to electrostatic fields

    Energy Technology Data Exchange (ETDEWEB)

    Springborg, Michael [Physical and Theoretical Chemistry, University of Saarland, 66123 Saarbrücken (Germany); Kirtman, Bernard [Department of Chemistry and Biochemistry, University of California, Santa Barbara, California 93106 (United States)

    2015-01-22

    By modifying the surfaces of a macroscopic regular system it is possible to modify the dipole moment per unit by an amount equal to a lattice vector times the elementary charge. Alternatively, we may ignore the surfaces and treat the system as being infinite and periodic. In that event the dipole moment per unit is determined only up to an additive term equal to a lattice vector times the elementary charge. Beyond mathematical arguments we show, through model calculations, that the two cases are completely equivalent, even though the origin of the additive term is very different. The response of extended systems to electrostatic fields — including internal structure, piezoelectricity, bulk charge density, and (hyper)polarizabilities — depends upon this term and is, thereby, surface-dependent. The case of piezoelectricity is analyzed in some detail.
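The additive indeterminacy described in this abstract can be stated compactly. The following is a sketch in generic notation (the symbols are assumptions for illustration, not the authors'): for an infinite periodic system, the dipole moment per repeat unit is defined only modulo a lattice vector times the elementary charge,

```latex
% Dipole moment per repeat unit of an infinite periodic system:
% defined only up to a lattice vector times the elementary charge.
\mathbf{p} \;\equiv\; \mathbf{p} + e\,\mathbf{R},
\qquad
\mathbf{R} = n_1\mathbf{a}_1 + n_2\mathbf{a}_2 + n_3\mathbf{a}_3,
\quad n_i \in \mathbb{Z}
```

where the a_i are the lattice vectors. The abstract's point is that field-response properties such as piezoelectricity and (hyper)polarizabilities depend on which branch of this term is realized, and hence on the surfaces.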

  14. Large distributed control system using Ada in fusion research

    International Nuclear Information System (INIS)

    Van Arsdall, P J; Woodruff, J P.

    1998-01-01

    Construction of the National Ignition Facility laser at Lawrence Livermore National Laboratory features a distributed control system that uses object-oriented software engineering techniques. Control of 60,000 devices is effected using a network of some 500 computers. The software is being written in Ada and communicates through CORBA. Software controls are implemented in two layers: individual device controllers and a supervisory layer. The software architecture provides services in the form of frameworks that address issues common to event-driven control systems. Those services are allocated to levels that strictly prescribe their interdependency, so the levels are separately reusable. The project has completed its final design review. The delivery of the first increment takes place in October 1998. Keywords: distributed control system, object-oriented development, CORBA, application frameworks, levels of abstraction.

  15. Concept for a power system controller for large space electrical power systems

    Science.gov (United States)

    Lollar, L. F.; Lanier, J. R., Jr.; Graves, J. R.

    1981-01-01

    The development of technology for a fail-operational power system controller (PSC) utilizing microprocessor technology for managing the distribution and power processor subsystems of a large multi-kW space electrical power system is discussed. The specific functions which must be performed by the PSC, the best microprocessor available to do the job, and the feasibility, cost savings, and applications of a PSC were determined. A limited-function breadboard version of a PSC was developed to demonstrate the concept and potential cost savings.

  16. Testing of valves and associated systems in large scale experiments

    International Nuclear Information System (INIS)

    Becker, M.

    1985-01-01

    The system examples dealt with are selected so that they cover a wide spectrum of technical tasks and limits. The flowing medium therefore varies from pure steam flow, via a mixed flow of steam and water, to pure water flow. The valves concerned include those whose main function is opening and also those whose main function is secure closing. There is a certain limitation in that the examples are taken from Boiling Water Reactor technology. The main procedure in valve and system testing described is, of course, not limited to the selected examples, but applies generally in power station and process technology. (orig./HAG) [de

  17. A logistics model for large space power systems

    Science.gov (United States)

    Koelle, H. H.

    Space Power Systems (SPS) have to overcome two hurdles: (1) finding an attractive design, manufacturing, and assembly concept, and (2) having available a space transportation system that can provide economical logistic support during the construction and operational phases. An initial system feasibility study, some five years ago, was based on a reference system that used terrestrial resources only and was based partially on electric propulsion systems. The conclusion was: it is feasible but not yet economically competitive with other options. This study is based on terrestrial and extraterrestrial resources and on chemical (LH2/LOX) propulsion systems. These engines are available from the Space Shuttle production line and require only small changes. Other so-called advanced propulsion systems investigated did not prove economically superior if lunar LOX is available. We assume that a Shuttle-derived Heavy Lift Launch Vehicle (HLLV) will become available around the turn of the century and that this will be used to establish a research base on the lunar surface. This lunar base has the potential to grow into a lunar factory producing LOX and construction materials, supporting, among other projects, the construction of space power systems in geostationary orbit. A model was developed to simulate the logistics support of such an operation over a 50-year life cycle. After 50 years, 111 SPS units with 5 GW each and an availability of 90% will produce 100 × 5 = 500 GW. The model comprises 60 equations and requires 29 assumptions about the parameters involved. The 60 state variables calculated with these equations are given on an annual basis and as averages for the 50-year life cycle. Recycling of defective parts in geostationary orbit is one of the features of the model. The state of the art with respect to SPS technology is introduced as a variable: Mg of mass per MW of electric power delivered.
If the space manufacturing facility, a maintenance and repair facility
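The capacity figure quoted in this record follows from simple arithmetic: 111 units at 90% availability is roughly 100 effective units, and 100 units of 5 GW each give the stated 500 GW. A quick check:

```python
# Arithmetic check of the capacity figure quoted above: 111 SPS units of
# 5 GW each at 90% availability give roughly 100 effective units,
# i.e. about 500 GW of delivered capacity.
units = 111
unit_power_gw = 5.0
availability = 0.90

effective_units = units * availability        # 111 * 0.9 = 99.9 units
delivered_gw = effective_units * unit_power_gw
print(effective_units, delivered_gw)
```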

  18. The Isis project: Fault-tolerance in large distributed systems

    Science.gov (United States)

    Birman, Kenneth P.; Marzullo, Keith

    1993-01-01

    This final status report covers activities of the Isis project during the first half of 1992. During the report period, the Isis effort has achieved a major milestone in its effort to redesign and reimplement the Isis system using Mach and Chorus as target operating system environments. In addition, we completed a number of publications that address issues raised in our prior work; some of these have recently appeared in print, while others are now being considered for publication in a variety of journals and conferences.

  19. Rf system considerations for a large hadron collider

    International Nuclear Information System (INIS)

    Raka, E.

    1988-01-01

    In this paper, we shall discuss how we arrive at a particular choice of voltage and frequency; the type of acceleration structure that would be suitable for obtaining the required voltage and resonant impedance; static beam loading including a simplified beam stability criterion involving the beam current and total rf system shunt impedance; the basic principle of rf phase and frequency control loops; and the effect of rf noise and its interaction with these loops. Finally, we shall consider the need for and design of rf systems to damp independently coherent oscillations of individual bunches or groups of bunches. 30 refs., 17 figs., 2 tabs

  20. Accelerating Inexact Newton Schemes for Large Systems of Nonlinear Equations

    NARCIS (Netherlands)

    Fokkema, D.R.; Sleijpen, G.L.G.; Vorst, H.A. van der

    Classical iteration methods for linear systems, such as Jacobi iteration, can be accelerated considerably by Krylov subspace methods like GMRES. In this paper, we describe how inexact Newton methods for nonlinear problems can be accelerated in a similar way and how this leads to a general
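The inexact Newton idea in this record — solving the Newton correction equation only approximately with an inner iterative method — can be sketched compactly. Here the inner solver is a fixed number of Jacobi sweeps standing in for a Krylov method such as GMRES; the test problem and iteration counts are illustrative choices, not from the paper.

```python
# Sketch of an inexact Newton method: the correction J(x) d = -F(x) is
# solved only approximately, here by a fixed number of Jacobi sweeps
# (a stand-in for a Krylov inner solver like GMRES).

def inexact_newton(F, J, x0, inner_sweeps=10, tol=1e-10, max_outer=50):
    x = list(x0)
    for _ in range(max_outer):
        Fx = F(x)
        if max(abs(v) for v in Fx) < tol:
            break
        A, b = J(x), [-v for v in Fx]
        d = [0.0] * len(b)
        for _ in range(inner_sweeps):      # inexact inner solve of A d = b
            d = [(b[i] - sum(A[i][j] * d[j]
                             for j in range(len(d)) if j != i)) / A[i][i]
                 for i in range(len(d))]
        x = [x[i] + d[i] for i in range(len(x))]
    return x

# Illustrative nonlinear system: x^2 + y^2 = 4, x*y = 1, started near (2, 0.5).
F = lambda v: [v[0]**2 + v[1]**2 - 4.0, v[0] * v[1] - 1.0]
J = lambda v: [[2 * v[0], 2 * v[1]], [v[1], v[0]]]
root = inexact_newton(F, J, [2.0, 0.5])
print(root)
```

Because the inner solve is inexact, each outer step is cheaper than a full Newton step; accelerating that inner solve with a Krylov subspace method is exactly the theme of the paper above.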

  1. Large vacuum system for experiences in magnetic confined plasmas

    International Nuclear Information System (INIS)

    Honda, R.Y.; Kayama, M.E.; Boeckelmann, H.K.; Aihara, S.

    1984-01-01

    It presents the operation method of a theta-pinch system capable of generating and confining plasmas with high densities and temperatures. Some characteristics of the Tupa theta-pinch, which is operating at UNICAMP, are also presented, with emphasis on the cleaning mode of the vacuum chamber. (M.C.K.) [pt

  2. A novel categorisation system to organise a large photo ...

    African Journals Online (AJOL)

    The white shark Carcharodon carcharias was one of the first elasmobranch species where photo identification was used to identify unique individuals. In this study, we propose guidelines that improve the current photo identification technique for white sharks by presenting a novel categorisation system. Using this method, a ...

  3. High-speed large angle mammography tomosynthesis system

    Science.gov (United States)

    Eberhard, Jeffrey W.; Staudinger, Paul; Smolenski, Joe; Ding, Jason; Schmitz, Andrea; McCoy, Julie; Rumsey, Michael; Al-Khalidy, Abdulrahman; Ross, William; Landberg, Cynthia E.; Claus, Bernhard E. H.; Carson, Paul; Goodsitt, Mitchell; Chan, Heang-Ping; Roubidoux, Marilyn; Thomas, Jerry A.; Osland, Jacqueline

    2006-03-01

    A new mammography tomosynthesis prototype system that acquires 21 projection images over a 60 degree angular range in approximately 8 seconds has been developed and characterized. Fast imaging sequences are facilitated by a high power tube and generator for faster delivery of the x-ray exposure and a high speed detector read-out. An enhanced a-Si/CsI flat panel digital detector provides greater DQE at low exposure, enabling tomo image sequence acquisitions at total patient dose levels between 150% and 200% of the dose of a standard mammographic view. For clinical scenarios where a single MLO tomographic acquisition per breast may replace the standard CC and MLO views, total tomosynthesis breast dose is comparable to or below the dose in standard mammography. The system supports co-registered acquisition of x-ray tomosynthesis and 3-D ultrasound data sets by incorporating an ultrasound transducer scanning system that flips into position above the compression paddle for the ultrasound exam. Initial images acquired with the system are presented.
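The dose claim in this abstract can be sanity-checked with the numbers it gives (the comparison logic below is an assumption made explicit, not stated by the authors): a tomosynthesis acquisition costs 1.5-2.0 times the dose of one standard view, while a standard exam uses two views (CC and MLO), so a single tomo MLO replacing both comes in at 0.75-1.0 of the standard exam dose.

```python
# Plausibility check of the dose claim: tomo dose is 1.5-2.0x ONE standard
# view; a standard exam is TWO views (CC + MLO). If one tomo MLO replaces
# both, the tomo-to-standard ratio is 0.75-1.0: comparable or below.
single_view_dose = 1.0                      # normalized standard-view dose
standard_exam = 2 * single_view_dose        # CC + MLO
tomo_low, tomo_high = 1.5, 2.0              # tomo dose range (in view units)

ratios = (tomo_low / standard_exam, tomo_high / standard_exam)
angular_spacing = 60 / (21 - 1)             # degrees between 21 projections
print(ratios, angular_spacing)
```

The same arithmetic gives the geometry: 21 projections spanning 60 degrees means a 3-degree step between projections.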

  4. Modeling Synergies in Large Human-Machine Networked Systems

    Science.gov (United States)

    2013-09-25

    Agents and Multi-Agent Systems (AAMAS), Valencia, Spain, June 4-8, 2012. Steven Okamoto, Praveen Paruchuri, Yonghong Wang, Katia Sycara, Janusz... Society, Santa Monica, CA: Human Factors and Ergonomics Society. 86. Steven Okamoto, Praveen Paruchuri, Yonghong Wang, Katia Sycara, Janusz

  5. Acquisition of reliable vacuum hardware for large accelerator systems

    International Nuclear Information System (INIS)

    Welch, K.M.

    1995-01-01

    Credible and effective communications prove to be the major challenge in the acquisition of reliable vacuum hardware. Technical competence is necessary but not sufficient. The authors must effectively communicate with management, sponsoring agencies, project organizations, service groups, staff, and vendors. Most of Deming's 14 quality assurance tenets relate to creating an enlightened environment of good communications. All projects progress along six distinct, closely coupled, dynamic phases, all of which are in a state of perpetual change. These phases and their elements are discussed, with emphasis given to the acquisition phase and its related vocabulary. Large projects require great clarity and rigor, as poor communications can be costly. For rigor to be cost-effective, it can't be pedantic. Clarity thrives best in a low-risk, team environment.

  6. Compositional Verification of Interlocking Systems for Large Stations

    DEFF Research Database (Denmark)

    Fantechi, Alessandro; Haxthausen, Anne Elisabeth; Macedo, Hugo Daniel dos Santos

    2017-01-01

    ...for networks of large size, due to the exponential computation time and resources needed. Some recent attempts to address this challenge adopt a compositional approach, targeted at track layouts that are easily decomposable into sub-networks such that a route is almost fully contained in a sub-network: in this way, granting access to a route is essentially a decision local to the sub-network, and the interfaces with the rest of the network easily abstract away less interesting details related to the external world. Following up on previous work, where we defined a compositional verification method for sub-networks that are independent to some degree, we study how the division of a complex network into sub-networks, using stub elements to abstract all the routes that are common between sub-networks, may still guarantee compositionality of verification of safety properties.

  7. Design and modelling of innovative machinery systems for large ships

    DEFF Research Database (Denmark)

    Larsen, Ulrik

    Eighty percent of the growing global merchandise trade is transported by sea. The shipping industry is required to reduce the pollution and increase the energy efficiency of ships in the near future. There is a relatively large potential for approaching these requirements by implementing waste heat recovery systems. An engine model, consisting of a two-zone combustion and NOx emission model, a double Wiebe heat release model, the Redlich-Kwong equation of state, and the Woschni heat loss correlation, is used. A novel methodology is presented and used to determine the optimum organic Rankine cycle process layout, working fluid, and process conditions; candidate configurations are evaluated with regard to the fuel consumption and NOx emissions trade-off. The results of the calibration and validation of the engine model suggest that the main performance parameters can be predicted with adequate accuracy for the overall purpose. The results for the ORC and the Kalina cycle...

  8. Mizan: Optimizing Graph Mining in Large Parallel Systems

    KAUST Repository

    Kalnis, Panos

    2012-03-01

    Extracting information from graphs, from finding shortest paths to complex graph mining, is essential for many applications. Due to the sheer size of modern graphs (e.g., social networks), processing must be done on large parallel computing infrastructures (e.g., the cloud). Earlier approaches relied on the MapReduce framework, which was proved inadequate for graph algorithms. More recently, the message passing model (e.g., Pregel) has emerged. Although the Pregel model has many advantages, it is agnostic to the graph properties and the architecture of the underlying computing infrastructure, leading to suboptimal performance. In this paper, we propose Mizan, a layer between the users' code and the computing infrastructure. Mizan considers the structure of the input graph and the architecture of the infrastructure in order to: (i) decide whether it is beneficial to generate a near-optimal partitioning of the graph in a preprocessing step, and (ii) choose between typical point-to-point message passing and a novel approach that puts computing nodes in a virtual overlay ring. We deployed Mizan on a small local Linux cluster, on the cloud (256 virtual machines in Amazon EC2), and on an IBM Blue Gene/P supercomputer (1024 CPUs). We show that Mizan executes common algorithms on very large graphs 1-2 orders of magnitude faster than MapReduce-based implementations and up to one order of magnitude faster than implementations relying on Pregel-like hash-based graph partitioning.
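The vertex-centric message-passing model this record builds on can be illustrated with a toy single-source shortest-path program organized in Pregel-style supersteps. The graph, weights, and scheduling below are simplified illustrations of the model only, not Mizan itself.

```python
# Toy vertex-centric (Pregel-style) single-source shortest paths.
# One loop iteration = one BSP superstep: vertices consume their inbox,
# update their value, and emit messages that are delivered at the next
# barrier. Computation halts when no messages remain.
INF = float("inf")

graph = {                      # adjacency: vertex -> [(neighbor, weight)]
    "a": [("b", 1.0), ("c", 4.0)],
    "b": [("c", 2.0), ("d", 6.0)],
    "c": [("d", 1.0)],
    "d": [],
}

dist = {v: INF for v in graph}
messages = {"a": [0.0]}        # superstep 0: only the source is active

while messages:
    outbox = {}
    for v, incoming in messages.items():
        best = min(incoming)
        if best < dist[v]:     # vertex improves its value and re-activates
            dist[v] = best
            for nbr, w in graph[v]:
                outbox.setdefault(nbr, []).append(best + w)
    messages = outbox          # barrier: next superstep's inbox
print(dist)
```

In a real Pregel-class system the vertices of each superstep run in parallel across partitions, and it is exactly the cost of routing those inter-partition messages that a partitioning-aware layer like Mizan tries to reduce.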

  9. Single-Cell Biomolecular Analysis of Coral Algal Symbionts Reveals Opposing Metabolic Responses to Heat Stress and Expulsion

    Directory of Open Access Journals (Sweden)

    Katherina Petrou

    2018-03-01

    Full Text Available The success of corals in nutrient poor environments is largely attributed to the symbiosis between the cnidarian host and its intracellular alga. Warm water anomalies have been shown to destabilize this symbiosis, yet detailed analysis of the effect of temperature and expulsion on cell-specific carbon and nutrient allocation in the symbiont is limited. Here, we exposed colonies of the hard coral Acropora millepora to heat stress and using synchrotron-based infrared microspectroscopy measured the biomolecular profiles of individual in hospite and expelled symbiont cells at an acute state of bleaching. Our results showed symbiont metabolic profiles to be remarkably distinct with heat stress and expulsion, where the two effectors elicited opposing metabolic adjustments independent of treatment or cell type. Elevated temperature resulted in biomolecular changes reflecting cellular stress, with relative increases in free amino acids and phosphorylation of molecules and a concomitant decline in protein content, suggesting protein modification and degradation. This contrasted with the metabolic profiles of expelled symbionts, which showed relative decreases in free amino acids and phosphorylated molecules, but increases in proteins and lipids, suggesting expulsion lessens the overall effect of heat stress on the metabolic signature of the algal symbionts. Interestingly, the combined effects of expulsion and thermal stress were additive, reducing the overall shifts in all biomolecules, with the notable exception of the significant accumulation of lipids and saturated fatty acids. This first use of a single-cell metabolomics approach on the coral symbiosis provides novel insight into coral bleaching and emphasizes the importance of a single-cell approach to demark the cell-to-cell variability in the physiology of coral cellular populations.

  10. Risk Management of Large RC Structures within Spatial Information System

    DEFF Research Database (Denmark)

    Qin, Jianjun; Faber, Michael Havbro

    2012-01-01

    Abstract: The present article addresses the development of a spatial information system (SIS), which aims to facilitate risk management of large‐scale concrete structures. The formulation of the SIS is based on ideas developed in the context of indicator‐based risk modeling for concrete structures...... subject to corrosion and geographical information system based risk modeling concerning large‐scale risk management. The term “risk management” here refers in particular to the process of condition assessment and optimization of the inspection and repair activities. The SIS facilitates the storage...... and handling of all relevant information to the risk management. The probabilistic modeling utilized in the condition assessment takes basis in a Bayesian hierarchical modeling philosophy. It facilitates the updating of risks as well as optimizing inspection plans whenever new information about the condition...

  11. Heavy-flavour transport: from large to small systems

    Energy Technology Data Exchange (ETDEWEB)

    Beraudo, A.; De Pace, A.; Monteno, M.; Nardi, M.; Prino, F.

    2016-12-15

    Predictions for heavy-flavour production in relativistic heavy-ion experiments provided by the POWLANG transport setup, including now also an in-medium hadronization model, are displayed. After showing some representative findings for the Au-Au and Pb-Pb cases, a special focus will be devoted to the results obtained in the small systems formed in proton(deuteron)-nucleus collisions, where recent experimental data suggest the possible formation of a medium featuring a collective behaviour.

  12. The chief nurse executive role in large healthcare systems.

    Science.gov (United States)

    Englebright, Jane; Perlin, Jonathan

    2008-01-01

    Community hospitals are most frequently led by nonclinicians. Although some may have employed physician leaders, most often clinical leadership is provided by a chief nurse executive (CNE) or chief nursing officer. Clinical leadership of community hospital and health systems may similarly be provided by a system-level nursing executive or, often, by a council of facility CNEs. The increasingly competitive healthcare environment in which value-based purchasing of healthcare and pay-for-performance programs demand improved clinical performance for financial success has led to reconsideration of whether a council model can provide either the leadership or adequate attention to clinical (and operational) improvement. In turn, community hospitals and health systems look to CNE or chief nursing officer roles at the highest level of the organization as resources that are able to segue between the clinical and operational domains, translating clinical performance demands into operating strategies and tactics. This article explores CNE characteristics required for success in these increasingly responsible and visible roles.

  13. Transparent Fingerprint Sensor System for Large Flat Panel Display

    Directory of Open Access Journals (Sweden)

    Wonkuk Seo

    2018-01-01

    Full Text Available In this paper, we introduce a transparent fingerprint sensing system using a thin film transistor (TFT) sensor panel, based on a self-capacitive sensing scheme. An amorphous indium gallium zinc oxide (a-IGZO) TFT sensor array and associated custom Read-Out IC (ROIC) are implemented for the system. The sensor panel has a 200 × 200 pixel array and each pixel size is as small as 50 μm × 50 μm. The ROIC uses only eight analog front-end (AFE) amplifier stages along with a successive approximation analog-to-digital converter (SAR ADC). To get the fingerprint image data from the sensor array, the ROIC senses a capacitance, which is formed by a cover glass material between a human finger and an electrode of each pixel of the sensor array. Three methods are reviewed for estimating the self-capacitance. The measurement result demonstrates that the transparent fingerprint sensor system has an ability to differentiate a human finger’s ridges and valleys through the fingerprint sensor array.

  14. Transparent Fingerprint Sensor System for Large Flat Panel Display.

    Science.gov (United States)

    Seo, Wonkuk; Pi, Jae-Eun; Cho, Sung Haeung; Kang, Seung-Youl; Ahn, Seong-Deok; Hwang, Chi-Sun; Jeon, Ho-Sik; Kim, Jong-Uk; Lee, Myunghee

    2018-01-19

    In this paper, we introduce a transparent fingerprint sensing system using a thin film transistor (TFT) sensor panel, based on a self-capacitive sensing scheme. An amorphous indium gallium zinc oxide (a-IGZO) TFT sensor array and associated custom Read-Out IC (ROIC) are implemented for the system. The sensor panel has a 200 × 200 pixel array and each pixel size is as small as 50 μm × 50 μm. The ROIC uses only eight analog front-end (AFE) amplifier stages along with a successive approximation analog-to-digital converter (SAR ADC). To get the fingerprint image data from the sensor array, the ROIC senses a capacitance, which is formed by a cover glass material between a human finger and an electrode of each pixel of the sensor array. Three methods are reviewed for estimating the self-capacitance. The measurement result demonstrates that the transparent fingerprint sensor system has an ability to differentiate a human finger's ridges and valleys through the fingerprint sensor array.
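    The magnitude of the per-pixel signal such a ROIC must resolve can be estimated with the textbook parallel-plate relation C = ε₀εᵣA/d. This is our own back-of-envelope sketch, not a calculation from the paper; the cover-glass permittivity (εᵣ ≈ 7) and thickness (0.5 mm) are assumed values chosen only for illustration, while the 50 μm pixel pitch is taken from the abstract.

```python
# Rough parallel-plate estimate of the per-pixel self-capacitance seen
# through a cover glass: C = eps0 * eps_r * A / d.

EPS0 = 8.854e-12        # vacuum permittivity, F/m

def pixel_capacitance(eps_r, thickness_m, pixel_m=50e-6):
    """Capacitance (F) of one square sensor pixel under a dielectric layer."""
    area = pixel_m ** 2
    return EPS0 * eps_r * area / thickness_m

# Assumed glass (eps_r ~ 7), 0.5 mm thick, 50 um x 50 um pixel:
c = pixel_capacitance(7.0, 0.5e-3)
print(f"{c * 1e15:.3f} fF")  # 0.310 fF
```

    Sub-femtofarad pixel capacitances of this order help explain why a custom AFE and careful capacitance-estimation methods are needed.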

  15. The Unified Health System (SUS) as a large technological system: territory, technique and politics

    Directory of Open Access Journals (Sweden)

    Luis Henrique Leandro Ribeiro

    2018-03-01

    Full Text Available The Unified Health System (SUS) constitutes a large technological system within the Brazilian territory, since it aggregates a broad and diverse materiality in organizing and managing its flows. Additionally, it has two other attributes that make it unique: high sensibility to specificities of different places; and technical and political centralization and decentralization of its actions. The macro dimension is the SUS, leading it to be understood not simply as a health system, through its: multidimensionality – elements of other life instances (social, economic, cultural and political); broad and unequal spectrum of actors (state and non-state) who move it and the meanings of its actions; and the trans-scaleness of its concretion in places (local, national and international nexuses). As an infrastructure of everyday life, it is a hegemonic large technological system that acts upon objective (technosphere) and subjective (psychosphere) conditions of existence, a conception that has important implications for health policy and territory integration.

  16. Enabling parallel simulation of large-scale HPC network systems

    International Nuclear Information System (INIS)

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.; Carns, Philip

    2016-01-01

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at a flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.
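    The discrete-event core that frameworks like ROSS parallelize can be illustrated with a minimal serial event loop. The sketch below is our own toy illustration, not CODES/ROSS code: ROSS adds optimistic event scheduling with rollback across many processors, none of which is shown here, and the three-node "ring" model is invented for the example.

```python
import heapq

# Minimal serial discrete-event simulation loop: a priority queue of
# timestamped events, processed in nondecreasing time order.

class Simulator:
    def __init__(self):
        self.now = 0.0
        self._queue = []   # min-heap ordered by event timestamp
        self._seq = 0      # tie-breaker for events at equal times

    def schedule(self, delay, handler, *args):
        heapq.heappush(self._queue, (self.now + delay, self._seq, handler, args))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, handler, args = heapq.heappop(self._queue)
            handler(*args)

# Toy model: a packet hops around a 3-node ring, 1.5 time units per hop.
sim = Simulator()
trace = []

def arrive(node, hops_left):
    trace.append((sim.now, node))
    if hops_left:
        sim.schedule(1.5, arrive, (node + 1) % 3, hops_left - 1)

sim.schedule(0.0, arrive, 0, 3)
sim.run()
print(trace)  # [(0.0, 0), (1.5, 1), (3.0, 2), (4.5, 0)]
```

    A flit-level network model replaces the toy handler with per-router events, and the scalability question becomes how to process such queues concurrently without violating timestamp order, which is what ROSS's optimistic scheduling addresses.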

  17. Energy Constraints for Building Large-Scale Systems

    Science.gov (United States)

    2016-03-17

    although most systems built to date do not consider these issues as primary constraints. Keywords: Neuromorphic Engineering; Cortical Operation ... 2Mbyte, 32bit input data, and 1Mbyte, 32bit output data, results in 3.1mW (Vdd = 2.5V) of power, even though one might find a DSP chip computing at ... 4MMAC(/s)/mW power efficiency [5], close to the power / energy efficiency wall [6]. A memory chip or data source further away requires even higher

  18. Toyota production system beyond large-scale production

    CERN Document Server

    Ohno, Taiichi

    1998-01-01

    In this classic text, Taiichi Ohno--inventor of the Toyota Production System and Lean manufacturing--shares the genius that sets him apart as one of the most disciplined and creative thinkers of our time. Combining his candid insights with a rigorous analysis of Toyota's attempts at Lean production, Ohno's book explains how Lean principles can improve any production endeavor. A historical and philosophical description of just-in-time and Lean manufacturing, this work is a must read for all students of human progress. On a more practical level, it continues to provide inspiration and instruction for those seeking to improve efficiency through the elimination of waste.

  19. Krylov subspace methods for solving large unsymmetric linear systems

    International Nuclear Information System (INIS)

    Saad, Y.

    1981-01-01

    Some algorithms based upon a projection process onto the Krylov subspace K_m = span(r_0, A r_0, ..., A^{m-1} r_0) are developed, generalizing the method of conjugate gradients to unsymmetric systems. These methods are extensions of Arnoldi's algorithm for solving eigenvalue problems. The convergence is analyzed in terms of the distance of the solution to the subspace K_m, and some error bounds are established showing, in particular, a similarity with the conjugate gradient method (for symmetric matrices) when the eigenvalues are real. Several numerical experiments are described and discussed.
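    The projection process onto K_m can be sketched with the Arnoldi iteration, which builds an orthonormal basis Q of the Krylov subspace together with the small upper Hessenberg matrix H satisfying A Q_m = Q_{m+1} H. This is a standard textbook formulation in NumPy, written by us as an illustration rather than taken from the paper.

```python
import numpy as np

# Arnoldi iteration: orthonormal basis of K_m = span(r0, A r0, ..., A^{m-1} r0)
# plus the (m+1) x m upper Hessenberg matrix H with A @ Q[:, :m] == Q @ H.

def arnoldi(A, r0, m):
    n = len(r0)
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    Q[:, 0] = r0 / np.linalg.norm(r0)
    for j in range(m):
        w = A @ Q[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ w
            w -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                # "happy breakdown": K_m is invariant
            return Q[:, : j + 1], H[: j + 2, : j + 1]
        Q[:, j + 1] = w / H[j + 1, j]
    return Q, H

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))      # deliberately unsymmetric
r0 = rng.standard_normal(6)
Q, H = arnoldi(A, r0, 3)
print(np.allclose(A @ Q[:, :3], Q @ H))  # True: the Arnoldi relation holds
```

    Solvers in this family (e.g., the later GMRES method) then extract an approximate solution from the small projected problem defined by H.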

  20. Axiomatic design in large systems complex products, buildings and manufacturing systems

    CERN Document Server

    Suh, Nam

    2016-01-01

    This book provides a synthesis of recent developments in Axiomatic Design theory and its application in large complex systems. Introductory chapters provide concise tutorial materials for graduate students and new practitioners, presenting the fundamentals of Axiomatic Design and relating its key concepts to those of model-based systems engineering. A mathematical exposition of design axioms is also provided. The main body of the book, which represents a concentrated treatment of several applications, is divided into three parts covering work on: complex products; buildings; and manufacturing systems. The book shows how design work in these areas can benefit from the scientific and systematic underpinning provided by Axiomatic Design, and in so doing effectively combines the state of the art in design research with practice. All contributions were written by an international group of leading proponents of Axiomatic Design. The book concludes with a call to action motivating further research into the engineeri...

  1. Computer Programming and Biomolecular Structure Studies: A Step beyond Internet Bioinformatics

    Science.gov (United States)

    Likic, Vladimir A.

    2006-01-01

    This article describes the experience of teaching structural bioinformatics to third year undergraduate students in a subject titled "Biomolecular Structure and Bioinformatics." Students were introduced to computer programming and used this knowledge in a practical application as an alternative to the well established Internet bioinformatics…

  2. Electrostatics in biomolecular simulations : where are we now and where are we heading?

    NARCIS (Netherlands)

    Karttunen, M.E.J.; Rottler, J.; Vattulainen, I.; Sagui, C.

    2008-01-01

    Chapter 2. In this review, we discuss current methods and developments in the treatment of electrostatic interactions in biomolecular and soft matter simulations. We review the current ‘work horses’, namely, Ewald summation based methods such as the Particle-Mesh Ewald, and others, and also newer

  3. Affinity Capillary Electrophoresis – A Powerful Tool to Investigate Biomolecular Interactions

    Czech Academy of Sciences Publication Activity Database

    Kašička, Václav

    2017-01-01

    Roč. 30, č. 5 (2017), s. 248 ISSN 1471-6577 Institutional support: RVO:61388963 Keywords : capillary affinity electrophoresis * biomolecular interactions * binding constants Subject RIV: CB - Analytical Chemistry, Separation OBOR OECD: Analytical chemistry Impact factor: 0.663, year: 2016

  4. Program planning for large-scale control system upgrades

    International Nuclear Information System (INIS)

    Madani, M.; Giajnorio, J.; Richard, T.; Ho, D.; Volk, W.; Ertel, A.

    2011-01-01

    Bruce Power has been planning to replace the Bruce A Fuel Handling (FH) computer systems, including the Controller and Protective computers, for many years. This is a complex project, requiring an extended FH outage. To minimize operational disruption and fully identify associated project risks, Bruce Power is executing the project in phases, starting with the Protective Computer replacement. GEH-C is collaborating with Bruce Power in a Preliminary Engineering (PE) phase to generate a project plan, including specifications, budgetary cost, schedule and risks, for the Protective computer replacement project. To assist Bruce Power in its evaluation, GEH-C is using 6-Sigma methodologies to identify and rank Critical to Quality (CTQ) requirements in collaboration with Bruce Power Maintenance, Operations, Plant Design and FH Engineering teams. The PE phase established the project scope, hardware and software specifications and material requirements, and finally concluded with a recommended hardware platform and approved controls architecture.

  5. Large Distributed Data Acquisition System at the Z Facility

    International Nuclear Information System (INIS)

    Mills, Jerry A.; Potter, James E.

    1999-01-01

    Experiments at the Z machine generate over four hundred channels of waveform data on each accelerator shot. Most experiments require timing accuracy of better than one nanosecond between multiple distributed recording locations throughout the facility. Experimental diagnostics and high speed data recording equipment are typically located within a few meters of the 200 to 300 terawatt X-ray source produced during Z-pinch experiments. This paper will discuss techniques used to resolve the timing of the several hundred data channels acquired on each shot event, and system features which allow viewing of waveforms within a few minutes after a shot. Methods for acquiring high bandwidth signals in a severe noise environment will also be discussed.

  6. Unsupervised Calculation of Free Energy Barriers in Large Crystalline Systems

    Science.gov (United States)

    Swinburne, Thomas D.; Marinica, Mihai-Cosmin

    2018-03-01

    The calculation of free energy differences for thermally activated mechanisms in the solid state is routinely hindered by the inability to define a set of collective variable functions that accurately describe the mechanism under study. Even when possible, the requirement of descriptors for each mechanism under study prevents implementation of free energy calculations in the growing range of automated material simulation schemes. We provide a solution, deriving a path-based, exact expression for free energy differences in the solid state which does not require a converged reaction pathway, collective variable functions, Gram matrix evaluations, or probability flux-based estimators. The generality and efficiency of our method is demonstrated on a complex transformation of C15 interstitial defects in iron and double kink nucleation on a screw dislocation in tungsten, the latter system consisting of more than 120 000 atoms. Both cases exhibit significant anharmonicity under experimentally relevant temperatures.

  7. Hazards analysis of TNX Large Melter-Off-Gas System

    International Nuclear Information System (INIS)

    Randall, C.T.

    1982-03-01

    Analysis of the potential safety hazards and an evaluation of the engineered safety features and administrative controls indicate that the LMOG System can be operated without undue hazard to employees or the public, or damage to equipment. The safety features provided in the facility design coupled with the planned procedural and administrative controls make the occurrence of serious accidents very improbable. A set of recommendations evolved during this analysis that was judged potentially capable of further reducing the probability of personnel injury or further mitigating the consequences of potential accidents. These recommendations concerned areas such as formic acid vapor hazards, hazard of feeding water to the melter at an uncontrolled rate, prevention of uncontrolled glass pours due to melter pressure excursions and additional interlocks. These specific suggestions were reviewed with operational and technical personnel and are being incorporated into the process. The safeguards provided by these recommendations are discussed in this report

  8. Quantum Entanglement of Matter and Geometry in Large Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hogan, Craig J.

    2014-12-04

    Standard quantum mechanics and gravity are used to estimate the mass and size of idealized gravitating systems where position states of matter and geometry become indeterminate. It is proposed that well-known inconsistencies of standard quantum field theory with general relativity on macroscopic scales can be reconciled by nonstandard, nonlocal entanglement of field states with quantum states of geometry. Wave functions of particle world lines are used to estimate scales of geometrical entanglement and emergent locality. Simple models of entanglement predict coherent fluctuations in position of massive bodies, of Planck scale origin, measurable on a laboratory scale, and may account for the fact that the information density of long lived position states in Standard Model fields, which is determined by the strong interactions, is the same as that determined holographically by the cosmological constant.

  9. Exergy analysis of refrigerators for large scale cooling systems

    Energy Technology Data Exchange (ETDEWEB)

    Loehlein, K [Sulzer Cryogenics, Winterthur (Switzerland); Fukano, T [Nippon Sanso Corp., Kawasaki (Japan)

    1993-01-01

    Facilities with superconducting magnets require cooling capacity at different temperature levels and of different types (refrigeration or liquefaction). The bigger the demand for refrigeration, the more investment for improved efficiency of the refrigeration plant is justified and desired. Refrigeration cycles are built with discrete components like expansion turbines, cold compressors, etc. Therefore the exergetic efficiency for producing refrigeration at a distinct temperature level is significantly dependent on the 'thermodynamic arrangement' of these components. Among a variety of possibilities, limited by the range of applicability of the components, one has to choose the best design for higher efficiency on every level. Some influences are quantified and aspects are given for an optimal integration of the refrigerator into the whole cooling system. (orig.).
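    The exergetic bookkeeping behind such comparisons can be sketched with the generic textbook relation, not the paper's specific analysis: the minimum (reversible) work to absorb heat Q̇ at cold temperature T against ambient T₀ is the Carnot work Q̇(T₀/T − 1), and a plant's exergetic efficiency is that minimum divided by the work it actually consumes. The 4.5 K load and 30 kW input below are our own illustrative numbers.

```python
# Back-of-envelope exergy relations for a refrigeration load.

def carnot_work(q_dot_w, t_cold_k, t_ambient_k=300.0):
    """Reversible (minimum) input power in W for q_dot_w of cooling at t_cold_k."""
    return q_dot_w * (t_ambient_k / t_cold_k - 1.0)

def exergetic_efficiency(q_dot_w, t_cold_k, w_actual_w, t_ambient_k=300.0):
    """Ratio of the reversible work to the work actually consumed."""
    return carnot_work(q_dot_w, t_cold_k, t_ambient_k) / w_actual_w

# 100 W of refrigeration at 4.5 K (liquid-helium level) against 300 K ambient:
w_min = carnot_work(100.0, 4.5)
print(round(w_min))  # 6567 -> even a perfect plant needs ~6.6 kW

# A hypothetical real plant drawing 30 kW for the same load:
print(round(exergetic_efficiency(100.0, 4.5, 30e3), 3))  # 0.219
```

    The steep T₀/T factor at low temperatures is why the arrangement of turbines and cold compressors matters so much for large cryogenic plants.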

  10. Large solar heating system with seasonal storage for bulb drying in Lisse, the Netherlands

    NARCIS (Netherlands)

    Bokhoven, T.P.; Geus, A.C. de

    1996-01-01

    Within IEA Task 14 (Advanced Solar Systems) of the IEA Solar Heating and Cooling Programme a working group was established dealing with large advanced solar energy systems (the Large Systems Working group). The goal of this working group was to generate a common base of experiences for the design

  11. Using the Large Fire Simulator System to map wildland fire potential for the conterminous United States

    Science.gov (United States)

    LaWen Hollingsworth; James Menakis

    2010-01-01

    This project mapped wildland fire potential (WFP) for the conterminous United States by using the large fire simulation system developed for Fire Program Analysis (FPA) System. The large fire simulation system, referred to here as LFSim, consists of modules for weather generation, fire occurrence, fire suppression, and fire growth modeling. Weather was generated with...

  12. Mathematical model for biomolecular quantification using large-area surface-enhanced Raman spectroscopy mapping

    DEFF Research Database (Denmark)

    Palla, Mirkó; Bosco, Filippo; Yang, Jaeyoung

    2015-01-01

    Surface-enhanced Raman spectroscopy (SERS) based on nanostructured platforms is a promising technique for quantitative and highly sensitive detection of biomolecules in the field of analytical biochemistry. Here, we report a mathematical model to predict experimental SERS signal (or hotspot) inte...

  13. ALTERNATIVAS BIOMOLECULARES EN EL TRATAMIENTO DE LA OBESIDAD [Biomolecular Alternatives in the Treatment of Obesity]

    Directory of Open Access Journals (Sweden)

    Fernando Lizcano

    2010-09-01

    Full Text Available

    Abstract

    Obesity has become a public health problem in developed and developing countries alike. In most cases, health policies have not had the desired effect in reducing the prevalence of this pathology, and many of the drugs useful for counteracting obesity could not remain on the market because of serious side effects. Some more aggressive therapeutic alternatives, such as reductive surgery, have shown limited utility; indeed, recent observations have revealed the long-term consequences of this type of intervention.

    In the search for new strategies for treating obesity, research has focused both on the fat cell itself and on genes that could be modified and whose function is to regulate caloric expenditure and adaptive thermogenesis. Some of these genes are modified by transcription factors that can determine the phenotypic characteristics of the fat cell. It has recently been observed that vestiges of brown fat cells, which can expend energy as heat, are detectable in adults, and this modification could be a therapeutic alternative in obesity. Our research group has observed that modifying the function of the retinoblastoma protein (pRb) can increase expression of the genes that stimulate caloric loss in the adipocyte.

    Keywords: Brown Fat, Obesity, Transcription, EID1, Transdifferentiation

    BIOMOLECULAR OPTIONS IN TREATING OBESITY

    Abstract

    Obesity is a public health issue for both developed and third world countries. Although many efforts have been made to reverse the trend of this prevalent pathology, no results have been obtained with public health policies in most cases. Furthermore, many medicines approved for

  14. A visual analytics system for optimizing the performance of large-scale networks in supercomputing systems

    Directory of Open Access Journals (Sweden)

    Takanori Fujiwara

    2018-03-01

    Full Text Available The overall efficiency of an extreme-scale supercomputer largely relies on the performance of its network interconnects. Several of the state of the art supercomputers use networks based on the increasingly popular Dragonfly topology. It is crucial to study the behavior and performance of different parallel applications running on Dragonfly networks in order to make optimal system configurations and design choices, such as job scheduling and routing strategies. However, in order to study this temporal network behavior, we would need a tool to analyze and correlate numerous sets of multivariate time-series data collected from the Dragonfly’s multi-level hierarchies. This paper presents such a tool: a visual analytics system that uses data from the Dragonfly network to investigate the temporal behavior and optimize the communication performance of a supercomputer. We coupled interactive visualization with time-series analysis methods to help reveal hidden patterns in the network behavior with respect to different parallel applications and system configurations. Our system also provides multiple coordinated views for connecting behaviors observed at different levels of the network hierarchies, which effectively helps visual analysis tasks. We demonstrate the effectiveness of the system with a set of case studies. Our system and findings can not only help improve the communication performance of supercomputing applications, but also the network performance of next-generation supercomputers. Keywords: Supercomputing, Parallel communication network, Dragonfly networks, Time-series data, Performance analysis, Visual analytics

  15. Large-scale event extraction from literature with multi-level gene normalization.

    Directory of Open Access Journals (Sweden)

    Sofie Van Landeghem

    Full Text Available Text mining for the life sciences aims to aid database curation, knowledge summarization and information retrieval through the automated processing of biomedical texts. To provide comprehensive coverage and enable full integration with existing biomolecular database records, it is crucial that text mining tools scale up to millions of articles and that their analyses can be unambiguously linked to information recorded in resources such as UniProt, KEGG, BioGRID and NCBI databases. In this study, we investigate how fully automated text mining of complex biomolecular events can be augmented with a normalization strategy that identifies biological concepts in text, mapping them to identifiers at varying levels of granularity, ranging from canonicalized symbols to unique genes and proteins and broad gene families. To this end, we have combined two state-of-the-art text mining components, previously evaluated on two community-wide challenges, and have extended and improved upon these methods by exploiting their complementary nature. Using these systems, we perform normalization and event extraction to create a large-scale resource that is publicly available, unique in semantic scope, and covers all 21.9 million PubMed abstracts and 460 thousand PubMed Central open access full-text articles. This dataset contains 40 million biomolecular events involving 76 million gene/protein mentions, linked to 122 thousand distinct genes from 5032 species across the full taxonomic tree. Detailed evaluations and analyses reveal promising results for application of this data in database and pathway curation efforts. The main software components used in this study are released under an open-source license. Further, the resulting dataset is freely accessible through a novel API, providing programmatic and customized access (http://www.evexdb.org/api/v001/). Finally, to allow for large-scale bioinformatic analyses, the entire resource is available for bulk download from

  16. Steady-state analysis of large scale systems : the successive lumping method

    NARCIS (Netherlands)

    Smit, L.C.

    2016-01-01

    The general area of research of this dissertation concerns large systems with random aspects to their behavior that can be modeled and studied in terms of the stationary distribution of Markov chains. As the state spaces of such systems become large, their behavior becomes hard to analyze, either via
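    For context, the baseline computation that lumping methods aim to avoid at scale can be shown directly: the stationary distribution π of a chain with transition matrix P solves πP = π with Σπᵢ = 1, i.e., π is the left eigenvector of P for eigenvalue 1. The sketch below is a generic textbook illustration of that baseline, not the dissertation's successive lumping method, and the 3-state chain is our own example.

```python
import numpy as np

# Direct stationary-distribution solve for a small Markov chain:
# take the left eigenvector of P associated with eigenvalue 1.

def stationary_distribution(P):
    w, V = np.linalg.eig(P.T)                        # left eigenvectors of P
    pi = np.real(V[:, np.argmin(np.abs(w - 1.0))])   # eigenvalue closest to 1
    return pi / pi.sum()                             # normalize (fixes sign too)

P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.0, 0.3, 0.7]])
pi = stationary_distribution(P)
print(np.round(pi, 4))          # [0.6 0.3 0.1]
print(np.allclose(pi @ P, pi))  # True: pi is invariant under P
```

    The dense eigensolve costs O(n³), which is exactly why structured approaches such as successive lumping are needed once the state space grows large.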

  17. The convertible client/server technology in large container inspection system

    International Nuclear Information System (INIS)

    Chen Zhiqiang; Zhang Li; Gao Wenhuan; Kang Kejun

    2001-01-01

    The authors present a new convertible client/server technology for the distributed networking environment of a large container inspection system. The characteristics and advantages of this technology are introduced. The authors illustrate the policy of the technology for developing the networking program, and provide an example of how to program the software in a large container inspection system using the new technology.

  18. Very Large Data Volumes Analysis of Collaborative Systems with Finite Number of States

    Science.gov (United States)

    Ivan, Ion; Ciurea, Cristian; Pavel, Sorin

    2010-01-01

    The collaborative system with finite number of states is defined. A very large database is structured. Operations on large databases are identified. Repetitive procedures for collaborative systems operations are derived. The efficiency of such procedures is analyzed. (Contains 6 tables, 5 footnotes and 3 figures.)

  19. Super-resolution biomolecular crystallography with low-resolution data.

    Science.gov (United States)

    Schröder, Gunnar F; Levitt, Michael; Brunger, Axel T

    2010-04-22

    X-ray diffraction plays a pivotal role in the understanding of biological systems by revealing atomic structures of proteins, nucleic acids and their complexes, with much recent interest in very large assemblies like the ribosome. As crystals of such large assemblies often diffract weakly (resolution worse than 4 Å), we need methods that work at such low resolution. In macromolecular assemblies, some of the components may be known at high resolution, whereas others are unknown: current refinement methods fail as they require a high-resolution starting structure for the entire complex. Determining the structure of such complexes, which are often of key biological importance, should be possible in principle as the number of independent diffraction intensities at a resolution better than 5 Å generally exceeds the number of degrees of freedom. Here we introduce a method that adds specific information from known homologous structures but allows global and local deformations of these homology models. Our approach uses the observation that local protein structure tends to be conserved as sequence and function evolve. Cross-validation with R(free) (the free R-factor) determines the optimum deformation and influence of the homology model. For test cases at 3.5-5 Å resolution with known structures at high resolution, our method gives significant improvements over conventional refinement in the model as monitored by coordinate accuracy, the definition of secondary structure and the quality of electron density maps. For re-refinements of a representative set of 19 low-resolution crystal structures from the Protein Data Bank, we find similar improvements. Thus, a structure derived from low-resolution diffraction data can have quality similar to a high-resolution structure. Our method is applicable to the study of weakly diffracting crystals using X-ray micro-diffraction as well as data from new X-ray light sources. Use of homology information is not restricted to X
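The cross-validation statistic named in the abstract above, R(free), is the standard crystallographic R-factor evaluated on a held-out set of reflections. A minimal sketch with hypothetical amplitudes (the record itself gives no data; all numbers below are invented for illustration):

```python
def r_factor(f_obs, f_calc):
    """Crystallographic R-factor: sum(|Fobs - Fcalc|) / sum(Fobs),
    over structure-factor amplitudes (assumed already on one scale)."""
    return sum(abs(o - c) for o, c in zip(f_obs, f_calc)) / sum(f_obs)

# Hypothetical amplitudes. In practice the reflections are split into a
# large working set (giving R_work) and a small held-out test set whose
# R-factor, R_free, cross-validates the refinement.
f_obs  = [120.0, 85.0, 60.5, 210.0, 33.2]
f_calc = [115.0, 90.0, 58.0, 205.0, 35.0]
r_work = r_factor(f_obs[:4], f_calc[:4])
r_free = r_factor(f_obs[4:], f_calc[4:])
```

Because the held-out reflections never influence the refined model, a rising R_free flags overfitting, which is why the authors use it to tune the influence of the homology model.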

  20. Large-scale heat pumps in sustainable energy systems: System and project perspectives

    Directory of Open Access Journals (Sweden)

    Blarke Morten B.

    2007-01-01

    Full Text Available This paper shows that, in support of its ability to improve the overall economic cost-effectiveness and flexibility of the Danish energy system, the financially feasible integration of large-scale heat pumps (HP) with existing combined heat and power (CHP) plants is critically sensitive to the operational mode of the HP vis-à-vis the operational coefficient of performance, mainly given by the temperature level of the heat source. When using a ground source as the low-temperature heat source, heat production costs increase by about 10%, while partial use of condensed flue gasses as the low-temperature heat source results in an 8% cost reduction. Furthermore, the analysis shows that when a large-scale HP is integrated with an existing CHP plant, the projected spot market situation in The Nordic Power Exchange (Nord Pool) towards 2025, which reflects a growing share of wind power and heat-supply-constrained power generation, further reduces the operational hours of the CHP unit over time, while increasing the operational hours of the HP unit. As a result, an HP unit at half the heat production capacity of the CHP unit, in combination with a heat-only boiler, represents a possibly financially feasible alternative to CHP operation, rather than a supplement to CHP unit operation. While such a revised operational strategy would have impacts on policies to promote co-generation, these results indicate that the integration of large-scale HP may jeopardize efforts to promote co-generation. Policy instruments should be designed to promote the integration of HP with lower than half of the heating capacity of the CHP unit. It is also found that CHP-HP plant designs should allow for the utilization of heat recovered from the CHP unit's flue gasses for both concurrent (CHP unit and HP unit) and independent operation (HP unit only). For independent operation, the recovered heat is required to be stored.

  1. An Axiomatic Analysis Approach for Large-Scale Disaster-Tolerant Systems Modeling

    Directory of Open Access Journals (Sweden)

    Theodore W. Manikas

    2011-02-01

    Full Text Available Disaster tolerance in computing and communications systems refers to the ability to maintain a degree of functionality throughout the occurrence of a disaster. We accomplish the incorporation of disaster tolerance within a system by simulating various threats to the system operation and identifying areas for system redesign. Unfortunately, extremely large systems are not amenable to comprehensive simulation studies due to the large computational complexity requirements. To address this limitation, an axiomatic approach that decomposes a large-scale system into smaller subsystems is developed that allows the subsystems to be independently modeled. This approach is implemented using a data communications network system example. The results indicate that the decomposition approach produces simulation responses that are similar to the full system approach, but with greatly reduced simulation time.

  2. Defining Execution Viewpoints for a Large and Complex Software-Intensive System

    OpenAIRE

    Callo Arias, Trosky B.; America, Pierre; Avgeriou, Paris

    2009-01-01

    An execution view is an important asset for developing large and complex systems. An execution view helps practitioners to describe, analyze, and communicate what a software system does at runtime and how it does it. In this paper, we present an approach to define execution viewpoints for an existing large and complex software-intensive system. This definition approach enables the customization and extension of a set of predefined viewpoints to address the requirements of a specific developme...

  3. Security and VO management capabilities in a large-scale Grid operating system

    OpenAIRE

    Aziz, Benjamin; Sporea, Ioana

    2014-01-01

    This paper presents a number of security and VO management capabilities in a large-scale distributed Grid operating system. The capabilities formed the basis of the design and implementation of a number of security and VO management services in the system. The main aim of the paper is to provide some idea of the various functionality cases that need to be considered when designing similar large-scale systems in the future.

  4. A study of the transferability of influenza case detection systems between two large healthcare systems.

    Science.gov (United States)

    Ye, Ye; Wagner, Michael M; Cooper, Gregory F; Ferraro, Jeffrey P; Su, Howard; Gesteland, Per H; Haug, Peter J; Millett, Nicholas E; Aronis, John M; Nowalk, Andrew J; Ruiz, Victor M; López Pineda, Arturo; Shi, Lingyun; Van Bree, Rudy; Ginter, Thomas; Tsui, Fuchiang

    2017-01-01

    This study evaluates the accuracy and transferability of Bayesian case detection systems (BCD) that use clinical notes from the emergency department (ED) to detect influenza cases. A BCD uses natural language processing (NLP) to infer the presence or absence of clinical findings from ED notes, which are fed into a Bayesian network classifier (BN) to infer patients' diagnoses. We developed BCDs at the University of Pittsburgh Medical Center (BCDUPMC) and Intermountain Healthcare in Utah (BCDIH). At each site, we manually built a rule-based NLP and trained a Bayesian network classifier from over 40,000 ED encounters between Jan. 2008 and May 2010 using feature selection, machine learning, and an expert debiasing approach. Transferability of a BCD in this study may be impacted by seven factors: development (source) institution, development parser, application (target) institution, application parser, NLP transfer, BN transfer, and classification task. We employed an ANOVA analysis to study their impacts on BCD performance. Both BCDs discriminated well between influenza and non-influenza on local test cases (AUCs > 0.92). When tested for transferability using the other institution's cases, BCDUPMC discriminations declined minimally (AUC decreased from 0.95 to 0.94, p<0.01), and BCDIH discriminations declined more (from 0.93 to 0.87, p<0.0001). We attributed the BCDIH decline to the lower recall of the IH parser on UPMC notes. The ANOVA analysis showed five significant factors: development parser, application institution, application parser, BN transfer, and classification task. We demonstrated high influenza case detection performance in two large healthcare systems in two geographically separated regions, providing evidentiary support for the use of automated case detection from routinely collected electronic clinical notes in national influenza surveillance. The transferability could be improved by training the Bayesian network classifier locally and increasing the accuracy of the NLP parser.

  5. NMR paves the way for atomic level descriptions of sparsely populated, transiently formed biomolecular conformers.

    Science.gov (United States)

    Sekhar, Ashok; Kay, Lewis E

    2013-08-06

    The importance of dynamics to biomolecular function is becoming increasingly clear. A description of the structure-function relationship must, therefore, include the role of motion, requiring a shift in paradigm from focus on a single static 3D picture to one where a given biomolecule is considered in terms of an ensemble of interconverting conformers, each with potentially diverse activities. In this Perspective, we describe how recent developments in solution NMR spectroscopy facilitate atomic resolution studies of sparsely populated, transiently formed biomolecular conformations that exchange with the native state. Examples of how this methodology is applied to protein folding and misfolding, ligand binding, and molecular recognition are provided as a means of illustrating both the power of the new techniques and the significant roles that conformationally excited protein states play in biology.

  6. Design of an embedded inverse-feedforward biomolecular tracking controller for enzymatic reaction processes.

    Science.gov (United States)

    Foo, Mathias; Kim, Jongrae; Sawlekar, Rucha; Bates, Declan G

    2017-04-06

    Feedback control is widely used in chemical engineering to improve the performance and robustness of chemical processes. Feedback controllers require a 'subtractor' that is able to compute the error between the process output and the reference signal. In the case of embedded biomolecular control circuits, subtractors designed using standard chemical reaction network theory can only realise one-sided subtraction, rendering standard controller design approaches inadequate. Here, we show how a biomolecular controller that allows tracking of required changes in the outputs of enzymatic reaction processes can be designed and implemented within the framework of chemical reaction network theory. The controller architecture employs an inversion-based feedforward controller that compensates for the limitations of the one-sided subtractor that generates the error signals for a feedback controller. The proposed approach requires significantly fewer chemical reactions to implement than alternative designs, and should have wide applicability throughout the fields of synthetic biology and biological engineering.
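The abstract above notes that subtractors built with standard chemical reaction network theory can only realise one-sided subtraction. A common illustration of this limitation (an assumption here, not necessarily the exact construct used by the authors) is the annihilation motif x + y → ∅, whose surviving species approximates max(x0 − y0, 0); a simple forward-Euler simulation:

```python
def annihilation_subtractor(x0, y0, k=10.0, dt=1e-4, t_end=5.0):
    """Forward-Euler simulation of the annihilation reaction x + y -> 0
    under mass-action kinetics. Because molecular amounts cannot go
    negative, the surviving x approximates max(x0 - y0, 0): a one-sided
    subtraction."""
    x, y = x0, y0
    for _ in range(int(t_end / dt)):
        r = k * x * y * dt
        x -= r
        y -= r
    return x

print(round(annihilation_subtractor(3.0, 1.0), 2))  # ~2.0  (3 - 1)
print(round(annihilation_subtractor(1.0, 3.0), 2))  # ~0.0  (clipped at zero)
```

The clipping at zero is exactly why a feedback loop alone cannot track arbitrary reference changes, motivating the paper's inversion-based feedforward compensation.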

  7. Architecture of transcriptional regulatory circuits is knitted over the topology of bio-molecular interaction networks

    DEFF Research Database (Denmark)

    Soberano de Oliveira, Ana Paula; Patil, Kiran Raosaheb; Nielsen, Jens

    2008-01-01

    Background: Uncovering the operating principles underlying cellular processes by using 'omics' data is often a difficult task due to the high-dimensionality of the solution space that spans all interactions among the bio-molecules under consideration. A rational way to overcome this problem is to use the topology of bio-molecular interaction networks in order to constrain the solution space. Such approaches systematically integrate the existing biological knowledge with the 'omics' data. Results: Here we introduce a hypothesis-driven method that integrates bio-molecular network topology with transcriptome data, thereby allowing the identification of key biological features (Reporter Features) around which transcriptional changes are significantly concentrated. We have combined transcriptome data with different biological networks in order to identify Reporter Gene Ontologies, Reporter Transcription...

  8. Mitigation and enhancement techniques for the Upper Mississippi River system and other large river systems

    Science.gov (United States)

    Schnick, Rosalie A.; Morton, John M.; Mochalski, Jeffrey C.; Beall, Jonathan T.

    1982-01-01

    Extensive information is provided on techniques that can reduce or eliminate the negative impact of man's activities (particularly those related to navigation) on large river systems, with special reference to the Upper Mississippi River. These techniques should help resource managers who are concerned with such river systems to establish sound environmental programs. Discussions of each technique or group of techniques include (1) situation to be mitigated or enhanced; (2) description of technique; (3) impacts on the environment; (4) costs; and (5) evaluation for use on the Upper Mississippi River System. The techniques are divided into four primary categories: Bank Stabilization Techniques, Dredging and Disposal of Dredged Material, Fishery Management Techniques, and Wildlife Management Techniques. Because techniques have been grouped by function, rather than by structure, some structures are discussed in several contexts. For example, gabions are discussed for use in revetments, river training structures, and breakwaters. The measures covered under Bank Stabilization Techniques include the use of riprap revetments, other revetments, bulkheads, river training structures, breakwater structures, chemical soil stabilizers, erosion-control mattings, and filter fabrics; the planting of vegetation; the creation of islands; the creation of berms or enrichment of beaches; and the control of water level and boat traffic. The discussions of Dredging and the Disposal of Dredged Material consider dredges, dredging methods, and disposal of dredged material. The following subjects are considered under Fishery Management Techniques: fish attractors; spawning structures; nursery ponds, coves, and marshes; fish screens and barriers; fish passage; water control structures; management of water levels and flows; wing dam modification; side channel modification; aeration techniques; control of nuisance aquatic plants; and manipulation of fish populations. Wildlife Management

  9. Design of an embedded inverse-feedforward biomolecular tracking controller for enzymatic reaction processes

    OpenAIRE

    Foo, Mathias; Kim, Jongrae; Sawlekar, Rucha; Bates, Declan G.

    2017-01-01

    Feedback control is widely used in chemical engineering to improve the performance and robustness of chemical processes. Feedback controllers require a ‘subtractor’ that is able to compute the error between the process output and the reference signal. In the case of embedded biomolecular control circuits, subtractors designed using standard chemical reaction network theory can only realise one-sided subtraction, rendering standard controller design approaches inadequate. Here, we show how a b...

  10. Parity Violation in Chiral Molecules: From Theory towards Spectroscopic Experiment and the Evolution of Biomolecular Homochirality

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The observation of biomolecular homochirality can be considered as a quasi-fossil of the evolution of life [1], the interpretation of which has been an open question for more than a century, with numerous related hypotheses, but no definitive answers. We shall briefly discuss the current status and the relation to the other two questions. The discovery of parity violation led to important developm...

  11. Customer Relationship Management Systems - Why Many Large Companies Do Not Have Them?

    Science.gov (United States)

    Cunha, Manuela; Varajão, João; Santana, Daniela; Bentes, Isabel

    Today's information technologies are heavily embedded in the reality of organizations. Their role is essential not only at the level of internal processes optimization, but also in the interaction between the company and its environment. In this context, Customer Relationship Management (CRM) systems are powerful competitive tools in many different sectors of activity. Despite the undeniable importance of these systems, there are, in practice, many large companies that do not use them. Supported by the results of a survey carried out in a sample of large enterprises, this paper seeks to answer the research question "why many large companies do not have CRM systems".

  12. Large-scale theoretical calculations in molecular science - design of a large computer system for molecular science and necessary conditions for future computers

    Energy Technology Data Exchange (ETDEWEB)

    Kashiwagi, H [Institute for Molecular Science, Okazaki, Aichi (Japan)

    1982-06-01

    A large computer system was designed and established for molecular science under the leadership of molecular scientists. Features of the computer system are an automated operation system and an open self-service system. Large-scale theoretical calculations have been performed to solve many problems in molecular science, using the computer system. Necessary conditions for future computers are discussed on the basis of this experience.

  13. Large-scale theoretical calculations in molecular science - design of a large computer system for molecular science and necessary conditions for future computers

    International Nuclear Information System (INIS)

    Kashiwagi, H.

    1982-01-01

    A large computer system was designed and established for molecular science under the leadership of molecular scientists. Features of the computer system are an automated operation system and an open self-service system. Large-scale theoretical calculations have been performed to solve many problems in molecular science, using the computer system. Necessary conditions for future computers are discussed on the basis of this experience. (orig.)

  14. Foreword [IJEGMBE 2015: India-Japan expert group meeting on biomolecular electronics and organic nanotechnology for environment preservation, Fukuoka (Japan), 23-26 December 2015

    International Nuclear Information System (INIS)

    2016-01-01

    There is increased interest in organic nanotechnology and biomolecular electronics for environmental preservation, and in their anticipated impact on the economics of both the developing and the developed world. Keeping this in mind, the Department of Biological Functions, Graduate School of Life Sciences and Systems Engineering, Kyushu Institute of Technology (KIT), Kitakyushu, Japan, and the Department of Science and Technology Centre on Biomolecular Electronics (DSTCBE), National Physical Laboratory (NPL) jointly organized the India-Japan Workshop on Biomolecular Electronics and Organic Nanotechnology for Environmental Preservation (IJWBME 2009) at NPL, New Delhi from 17th-19th December 2009, IJWBME 2011 at EGRET Himeji, Himeji, Japan, from 7th-10th December, and IJWBME 2013 at Delhi Technological University, New Delhi, from 13th-15th December. The India-Japan Expert Group Meeting on Biomolecular Electronics and Organic Nanotechnology for Environment Preservation (IJEGMBE) will be held from 22nd-25th December 2015 at Nakamura Centenary Memorial Hall, Kyushu Institute of Technology, Kitakyushu, Japan in association with Delhi Technological University, Delhi, India. Recent years have seen rapid growth in the area of Biomolecular Electronics involving the association and expertise of physicists, biologists, chemists, electronics engineers and information technologists. There is increasing interest in the development of nanotechnology and biomolecular electronic devices for the preservation of our precious environment. In this context, the world of electronics, which developed on Si semiconductors, is going to change drastically. A paradigm shift towards organic or printed electronics is more likely in the future. The field of organic electronics promises exciting new technologies based on inexpensive and mechanically flexible electronic devices, and is now starting to see commercial success. On the sidelines of this increasingly well

  15. Computer code package RALLY for probabilistic safety assessment of large technical systems

    International Nuclear Information System (INIS)

    Gueldner, W.; Polke, H.; Spindler, H.; Zipf, G.

    1981-09-01

    This report describes the program system RALLY to compute the reliability of large and intermeshed technical systems. In addition to a short explanation of the different programs, the possible applications of the program system RALLY are demonstrated. Finally, the most important studies carried out so far on RALLY are discussed. (orig.) [de]

  16. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel, high-quality research approaches that relate the quality of software architecture to system requirements, system architecture and enterprise-architecture, or software testing. Modern software

  17. Very Large Distance Education Systems: The Case of China. ZIFF Papiere 94.

    Science.gov (United States)

    Keegan, Desmond

    One answer to the magnitude of the world education crisis is the provision of very large education systems, capable of enrolling 100,000 students or more. The largest distance system is the Dianda or Chinese Radio and Television University (CRTVU) system. Dianda is best described as a network of one central open university that does not enroll…

  18. Changes in biomolecular profile in a single nucleolus during cell fixation.

    Science.gov (United States)

    Kuzmin, Andrey N; Pliss, Artem; Prasad, Paras N

    2014-11-04

    Fixation of biological sample is an essential technique applied in order to "freeze" in time the intracellular molecular content. However, fixation induces changes of the cellular molecular structure, which mask physiological distribution of biomolecules and bias interpretation of results. Accurate, sensitive, and comprehensive characterization of changes in biomolecular composition, occurring during fixation, is crucial for proper analysis of experimental data. Here we apply biomolecular component analysis for Raman spectra measured in the same nucleoli of HeLa cells before and after fixation by either formaldehyde solution or by chilled ethanol. It is found that fixation in formaldehyde does not strongly affect the Raman spectra of nucleolar biomolecular components, but may significantly decrease the nucleolar RNA concentration. At the same time, ethanol fixation leads to a proportional increase (up to 40%) in concentrations of nucleolar proteins and RNA, most likely due to cell shrinkage occurring in the presence of coagulant fixative. Ethanol fixation also triggers changes in composition of nucleolar proteome, as indicated by an overall reduction of the α-helical structure of proteins and increase in the concentration of proteins containing the β-sheet conformation. We conclude that cross-linking fixation is a more appropriate protocol for mapping of proteins in situ. At the same time, ethanol fixation is preferential for studies of RNA-containing macromolecules. We supplemented our quantitative Raman spectroscopic measurements with mapping of the protein and lipid macromolecular groups in live and fixed cells using coherent anti-Stokes Raman scattering nonlinear optical imaging.
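The biomolecular component analysis used in the abstract above expresses a measured Raman spectrum as a linear combination of known component spectra. A toy least-squares sketch with two hypothetical components (real analyses use many components, measured reference spectra, and nonnegativity constraints):

```python
def fit_two_components(spectrum, comp_a, comp_b):
    """Least-squares coefficients (ca, cb) so that
    spectrum ~= ca*comp_a + cb*comp_b, via the 2x2 normal equations."""
    aa = sum(a * a for a in comp_a)
    bb = sum(b * b for b in comp_b)
    ab = sum(a * b for a, b in zip(comp_a, comp_b))
    sa = sum(s * a for s, a in zip(spectrum, comp_a))
    sb = sum(s * b for s, b in zip(spectrum, comp_b))
    det = aa * bb - ab * ab
    return (sa * bb - sb * ab) / det, (sb * aa - sa * ab) / det

# Hypothetical "protein" and "RNA" component spectra and a 2:1 mixture.
protein = [0.10, 0.80, 0.30, 0.05]
rna     = [0.40, 0.10, 0.60, 0.30]
mixture = [2.0 * p + 1.0 * r for p, r in zip(protein, rna)]
ca, cb = fit_two_components(mixture, protein, rna)
```

Comparing such fitted coefficients before and after fixation is, in outline, how concentration changes of nucleolar proteins and RNA can be quantified.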

  19. Nanogap biosensors for electrical and label-free detection of biomolecular interactions

    International Nuclear Information System (INIS)

    Kyu Kim, Sang; Cho, Hyunmin; Park, Hye-Jung; Kwon, Dohyoung; Min Lee, Jeong; Hyun Chung, Bong

    2009-01-01

    We demonstrate nanogap biosensors for electrical and label-free detection of biomolecular interactions. Parallel fabrication of nanometer distance gaps has been achieved using a silicon anisotropic wet etching technique on a silicon-on-insulator (SOI) wafer with a finely controllable silicon device layer. Since silicon anisotropic wet etching resulted in a trapezoid-shaped structure whose end became narrower during the etching, the nanogap structure was simply fabricated on the device layer of a SOI wafer. The nanogap devices were individually addressable and a gap size of less than 60 nm was obtained. We demonstrate that the nanogap biosensors can electrically detect biomolecular interactions such as biotin/streptavidin and antigen/antibody pairs. The nanogap devices show a current increase when the proteins are bound to the surface. The current increases proportionally depending upon the concentrations of the molecules in the range of 100 fg ml⁻¹ to 100 ng ml⁻¹ at 1 V bias. It is expected that the nanogap developed here could be a highly sensitive biosensor platform for label-free detection of biomolecular interactions.

  20. A study of the transferability of influenza case detection systems between two large healthcare systems.

    Directory of Open Access Journals (Sweden)

    Ye Ye

    Full Text Available This study evaluates the accuracy and transferability of Bayesian case detection systems (BCD) that use clinical notes from the emergency department (ED) to detect influenza cases. A BCD uses natural language processing (NLP) to infer the presence or absence of clinical findings from ED notes, which are fed into a Bayesian network classifier (BN) to infer patients' diagnoses. We developed BCDs at the University of Pittsburgh Medical Center (BCDUPMC) and Intermountain Healthcare in Utah (BCDIH). At each site, we manually built a rule-based NLP and trained a Bayesian network classifier from over 40,000 ED encounters between Jan. 2008 and May 2010 using feature selection, machine learning, and an expert debiasing approach. Transferability of a BCD in this study may be impacted by seven factors: development (source) institution, development parser, application (target) institution, application parser, NLP transfer, BN transfer, and classification task. We employed an ANOVA analysis to study their impacts on BCD performance. Both BCDs discriminated well between influenza and non-influenza on local test cases (AUCs > 0.92). When tested for transferability using the other institution's cases, BCDUPMC discriminations declined minimally (AUC decreased from 0.95 to 0.94, p<0.01), and BCDIH discriminations declined more (from 0.93 to 0.87, p<0.0001). We attributed the BCDIH decline to the lower recall of the IH parser on UPMC notes. The ANOVA analysis showed five significant factors: development parser, application institution, application parser, BN transfer, and classification task. We demonstrated high influenza case detection performance in two large healthcare systems in two geographically separated regions, providing evidentiary support for the use of automated case detection from routinely collected electronic clinical notes in national influenza surveillance. The transferability could be improved by training the Bayesian network classifier locally and increasing the

  1. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    Science.gov (United States)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe attack on southern Taiwan awakened public awareness of large scale landslide disasters, which produce large quantities of sediment that negatively affect the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real time data and numerous archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but are also of great value for further processing. The study tried to define some basic data formats and standards for the various types of data collected about these reservoirs and then provide a management platform based on these formats and standards. Meanwhile, to be practical and convenient, the large scale landslide disaster database system is built with the ability both to provide and to receive information, so that users can access it from different types of devices. IT technology progresses extremely quickly, and even the most modern system might become outdated at any time. In order to provide long term service, the system reserves the possibility of user-defined data formats/standards and a user-defined system structure. The system established by this study is based on the HTML5 standard language and uses responsive web design technology, which makes the large scale landslide disaster database system easy for users to operate and extend.

  2. Development and performance of a calibration system for a large calorimeter array

    International Nuclear Information System (INIS)

    Arenton, M.; Dawson, J.; Ditzler, W.R.

    1982-01-01

    Experiment 609 at Fermilab is a study of the properties of high-p_t collisions using a large segmented hadron calorimeter. The calibration and monitoring of such a large calorimeter array is a difficult undertaking. This paper describes the systems developed by E609 for automatic monitoring of the phototube gains and performance of the associated electronics

  3. A new system of labour management in African large-scale agriculture?

    DEFF Research Database (Denmark)

    Gibbon, Peter; Riisgaard, Lone

    2014-01-01

    This paper applies a convention theory (CT) approach to the analysis of labour management systems in African large-scale farming. The reconstruction of previous analyses of high-value crop production on large-scale farms in Africa in terms of CT suggests that, since 1980–95, labour management has...

  4. An implicit iterative scheme for solving large systems of linear equations

    International Nuclear Information System (INIS)

    Barry, J.M.; Pollard, J.P.

    1986-12-01

    An implicit iterative scheme for the solution of large systems of linear equations arising from neutron diffusion studies is presented. The method is applied to three-dimensional reactor studies and its performance is compared with alternative iterative approaches
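The record above does not spell out the implicit scheme, but the general idea of iteratively solving the large linear systems that arise from neutron diffusion can be sketched with classical Jacobi iteration (an illustrative stand-in, not the authors' method; the matrix below is a hypothetical diffusion-like stencil):

```python
def jacobi(A, b, tol=1e-10, max_iter=10000):
    """Jacobi iteration for A x = b: each sweep updates every unknown
    from the previous iterate only, so sweeps parallelize trivially.
    Converges for strictly diagonally dominant matrices, such as those
    from discretized diffusion operators."""
    n = len(b)
    x = [0.0] * n
    for _ in range(max_iter):
        nxt = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
               for i in range(n)]
        if max(abs(a - c) for a, c in zip(nxt, x)) < tol:
            return nxt
        x = nxt
    return x

# A small diagonally dominant system (1-D diffusion-like stencil).
A = [[ 4.0, -1.0,  0.0],
     [-1.0,  4.0, -1.0],
     [ 0.0, -1.0,  4.0]]
b = [2.0, 4.0, 10.0]
x = jacobi(A, b)
```

Implicit schemes like the one in the record improve on such explicit sweeps by solving coupled sub-blocks at each iteration, which typically accelerates convergence on three-dimensional reactor problems.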

  5. Development of distortion measurement system for large deployable antenna via photogrammetry in vacuum and cryogenic environment

    Science.gov (United States)

    Zhang, Pengsong; Jiang, Shanping; Yang, Linhua; Zhang, Bolun

    2018-01-01

    In order to meet the requirement of high precision thermal distortion measurement for a Φ4.2 m deployable mesh antenna of a satellite in a vacuum and cryogenic environment, a large scale antenna distortion measurement system for vacuum and cryogenic environments is developed in this paper, based on digital close-range photogrammetry and spacecraft space environment test technology. The Antenna Distortion Measurement System (ADMS) is the first domestically independently developed thermal distortion measurement system for large antennas, and solves the problem of non-contact, high precision distortion measurement of large spacecraft structures under vacuum and cryogenic conditions. The measurement accuracy of the ADMS is better than 50 μm/5 m, which reaches the international advanced level. The experimental results show that the measurement system has great advantages for structural measurement of large spacecraft, and also has broad application prospects in space and other related fields.

  6. Development of a New Delivery and Attachment System for Tagging Large Cetaceans

    National Research Council Canada - National Science Library

    Harvey, James

    1997-01-01

    .... We developed a new method of delivering tags for placement on large whales. Sea lions were trained to carry a harness and attached camera system along with carrying a suction-cup tag in a mouth piece...

  7. The linac control system for the large-scale synchrotron radiation facility (SPring-8)

    Energy Technology Data Exchange (ETDEWEB)

    Sakaki, Hironao; Yoshikawa, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Itoh, Yuichi [Atomic Energy General Services Corporation, Tokai, Ibaraki (Japan); Terashima, Yasushi [Information Technology System Co., Ltd. (ITECS), Tokyo (Japan)

    2000-09-01

    The linac for the large-scale synchrotron radiation facility (SPring-8) has been operated since August 1996 and has handled user requests without any major trouble. In this report, the control system development policy, its details, and the operation of the linac are presented, so that this experience can be applied to the control system of the large-scale proton accelerator to be developed in the High Intensity Proton Accelerator Project. (author)

  8. The MARTINI force field : Coarse grained model for biomolecular simulations

    NARCIS (Netherlands)

    Marrink, Siewert J.; Risselada, H. Jelger; Yefimov, Serge; Tieleman, D. Peter; de Vries, Alex H.

    2007-01-01

    We present an improved and extended version of our coarse grained lipid model. The new version, coined the MARTINI force field, is parametrized in a systematic way, based on the reproduction of partitioning free energies between polar and apolar phases of a large number of chemical compounds. To

  9. Transformations of large technical systems : a multi-level analysis of the Dutch highway system (1950-2000)

    NARCIS (Netherlands)

    Geels, F.W.

    2007-01-01

    The transformation of existing systems is an underexposed topic in large technical systems (LTS) research. Most LTS research has focused on the emergence and stabilization of systems, ending with momentum. But how is momentum overcome, and how do transformations come about? This article presents a

  10. Distributed and hierarchical control techniques for large-scale power plant systems

    International Nuclear Information System (INIS)

    Raju, G.V.S.; Kisner, R.A.

    1985-08-01

    In large-scale systems, integrated and coordinated control functions are required to maximize plant availability, to allow maneuverability through various power levels, and to meet externally imposed regulatory limitations. Nuclear power plants are large-scale systems. Prime subsystems are those that contribute directly to the behavior of the plant's ultimate output. The prime subsystems in a nuclear power plant include the reactor, primary and intermediate heat transport, steam generator, turbine generator, and feedwater system. This paper describes and discusses the continuous-variable control system developed to supervise prime plant subsystems for optimal control and coordination.

  11. Interactive computer graphics and its role in control system design of large space structures

    Science.gov (United States)

    Reddy, A. S. S. R.

    1985-01-01

    This paper attempts to show the relevance of interactive computer graphics in the design of control systems to maintain attitude and shape of large space structures to accomplish the required mission objectives. The typical phases of control system design, starting from the physical model such as modeling the dynamics, modal analysis, and control system design methodology are reviewed and the need of the interactive computer graphics is demonstrated. Typical constituent parts of large space structures such as free-free beams and free-free plates are used to demonstrate the complexity of the control system design and the effectiveness of the interactive computer graphics.

  12. Krylov subspace methods for the solution of large systems of ODE's

    DEFF Research Database (Denmark)

    Thomsen, Per Grove; Bjurstrøm, Nils Henrik

    1998-01-01

    In Air Pollution Modelling large systems of ODE's arise. Solving such systems may be done efficiently by Semi Implicit Runge-Kutta methods. The internal stages may be solved using Krylov subspace methods. The efficiency of this approach is investigated and verified.
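The record gives no implementation details; a minimal restart-free GMRES (a Krylov method commonly used for the nonsymmetric stage systems of semi-implicit Runge-Kutta schemes) might look like the following sketch, applied here to an implicit-Euler-style stage matrix I − hJ for a hypothetical 1-D diffusion Jacobian (all sizes and step values are invented for illustration):

```python
import numpy as np

def gmres_lite(A, b, m=20, tol=1e-12):
    """Minimal restart-free GMRES: Arnoldi builds an orthonormal Krylov basis,
    then a small least-squares problem gives the best solution in that subspace."""
    n = len(b)
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta
    for j in range(m):
        v = A @ Q[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt orthogonalization
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < tol:           # happy breakdown: subspace is invariant
            m = j + 1
            break
        Q[:, j + 1] = v / H[j + 1, j]
    e1 = np.zeros(m + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[:m + 1, :m], e1, rcond=None)
    return Q[:, :m] @ y

# Implicit stage system (I - h*J) k = f, with J a 1-D diffusion Jacobian
n, h = 30, 0.01
J = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / 1e-2
A = np.eye(n) - h * J
f = np.ones(n)
k = gmres_lite(A, f, m=n)
```

In practice a production solver would use a preconditioned, restarted GMRES (e.g. from a sparse linear algebra library) rather than this dense sketch; the point is that each implicit stage reduces to one such Krylov solve.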

  13. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    Science.gov (United States)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation, these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system- and subsystem-level design exploration and the generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis, (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  14. Optimization of hybrid power system composed of SMES and flywheel MG for large pulsed load

    International Nuclear Information System (INIS)

    Niiyama, K.; Yagai, T.; Tsuda, M.; Hamajima, T.

    2008-01-01

    A superconducting magnetic energy storage (SMES) system has advantages, such as rapid large-power response and high storage efficiency, that are superior to other energy storage systems. A flywheel motor generator (FWMG) has large-scale capacity and high reliability, and hence is widely used for large pulsed loads, although it has comparatively low storage efficiency owing to its high mechanical losses compared with SMES. A fusion power plant such as the International Thermonuclear Experimental Reactor (ITER) imposes a large, long pulsed load that causes a frequency deviation in the utility power system. In order to keep the frequency within an allowable deviation, we propose a hybrid power system for the pulsed load that combines the SMES and the FWMG with the utility power system. We evaluate the installation cost and frequency control performance of three power systems combined with energy storage devices: (i) SMES with the utility power, (ii) FWMG with the utility power, and (iii) both SMES and FWMG with the utility power. The first power system has excellent frequency control performance but a high installation cost. The second has inferior frequency control performance but the lowest installation cost. The third has good frequency control performance, and its installation cost is kept lower than that of the first by adjusting the ratio between SMES and FWMG.
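As a hedged illustration of why fast-responding storage limits frequency deviation under a pulsed load (this is not the authors' model; the inertia constant, damping, storage gain, and load step below are invented), a toy swing-equation simulation can be sketched as:

```python
import numpy as np

# Toy per-unit swing-equation model of grid frequency deviation df (Hz-equivalent)
H, D = 5.0, 1.0          # hypothetical inertia constant (s) and load damping (pu)
dt, T = 0.01, 10.0       # time step and horizon (s)

def simulate(storage_gain):
    """Integrate d(df)/dt = (-p_load + p_storage - D*df) / (2H) for a pulsed load."""
    df, trace = 0.0, []
    for k in range(int(T / dt)):
        p_load = 0.1 if k * dt > 1.0 else 0.0   # 0.1 pu pulsed load applied at t = 1 s
        p_storage = -storage_gain * df          # droop-like fast storage response
        df += dt * (-p_load + p_storage - D * df) / (2.0 * H)
        trace.append(df)
    return np.array(trace)

dev_plain = np.abs(simulate(0.0)).max()    # no storage support
dev_smes  = np.abs(simulate(20.0)).max()   # fast storage (SMES-like) support
```

The steady-state deviation scales as 1/(D + gain), so the storage-supported case settles roughly twenty times closer to nominal frequency in this toy setup.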

  15. Sequence co-evolutionary information is a natural partner to minimally-frustrated models of biomolecular dynamics [version 1; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Jeffrey K Noel

    2016-01-01

    Full Text Available Experimentally derived structural constraints have been crucial to the implementation of computational models of biomolecular dynamics. For example, not only does crystallography provide essential starting points for molecular simulations but also high-resolution structures permit parameterization of simplified models. Since the energy landscapes for proteins and other biomolecules have been shown to be minimally frustrated and therefore funneled, these structure-based models have played a major role in understanding the mechanisms governing folding and many functions of these systems. Structural information, however, may be limited in many interesting cases. Recently, the statistical analysis of residue co-evolution in families of protein sequences has provided a complementary method of discovering residue-residue contact interactions involved in functional configurations. These functional configurations are often transient and difficult to capture experimentally. Thus, co-evolutionary information can be merged with that available for experimentally characterized low free-energy structures, in order to more fully capture the true underlying biomolecular energy landscape.

  16. Design of General-purpose Industrial signal acquisition system in a large scientific device

    Science.gov (United States)

    Ren, Bin; Yang, Lei

    2018-02-01

    In order to measure industrial signals in experiments on a large scientific device, a general-purpose industrial data acquisition system has been designed. It can collect 4~20 mA current signals and 0~10 V voltage signals. Practical experiments show that the system is flexible, reliable, convenient, and economical, with high resolution and strong anti-interference ability. Thus, the system fully meets the design requirements.
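Scaling a 4~20 mA loop current to an engineering value is a standard linear mapping (4 mA to the range minimum, 20 mA to the maximum); a minimal sketch, with a hypothetical guard band and an invented pressure-transmitter example, is:

```python
def current_to_value(i_ma, lo, hi):
    """Map a 4-20 mA loop current to an engineering value in [lo, hi].
    4 mA maps to lo and 20 mA to hi; currents outside a small guard band
    indicate a broken loop or a saturated transmitter."""
    if not 3.8 <= i_ma <= 20.5:        # guard band values are an assumption
        raise ValueError(f"loop current {i_ma} mA out of measuring range")
    return lo + (i_ma - 4.0) * (hi - lo) / 16.0

# e.g. a hypothetical 0-250 kPa pressure transmitter reading mid-scale
pressure = current_to_value(12.0, 0.0, 250.0)   # -> 125.0 kPa
```

One design advantage of the live-zero (4 mA) convention is that a reading of 0 mA is unambiguously a fault rather than a legitimate minimum value.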

  17. Does reintroducing large wood influence the hydraulic landscape of a lowland river system?

    Science.gov (United States)

    Matheson, Adrian; Thoms, Martin; Reid, Michael

    2017-09-01

    Our understanding of the effectiveness of reintroduced large wood for restoration is largely based on studies from high energy river systems. By contrast, few studies of the effectiveness of reintroducing large wood have been undertaken on large, low energy, lowland river systems: river systems where large wood is a significant physical feature of the in-channel environment. This study investigated the effect of reintroduced large wood on the hydraulic landscape of the Barwon-Darling River, Australia, at low flows. To achieve this, the study compared three hydraulic landscapes of replicated reference (naturally wooded), control (unwooded), and managed (wood reintroduced) treatments in three low flow periods. These time periods were prior to the reintroduction of large wood to managed reaches; several months after the reintroduction of large wood into the managed reaches; and then more than four years after wood reintroduction, following several large flood events. Hydraulic landscapes of reaches were characterised using a range of spatial measures calculated from velocity measurements taken with a boat-mounted Acoustic Doppler Profiler. We hypothesised that reintroduced large wood would increase the diversity of the hydraulic landscape at low flows and that managed reaches would be more similar to the reference reaches. Our results suggest that the reintroduction of large wood did not significantly change the character of the hydraulic landscape at the reach scale after several months (p = 0.16) or several years (p = 0.29). Overall, the character of the hydraulic landscape in the managed reaches was more similar to the hydraulic landscape of the control reaches than the hydraulic landscape of the reference reaches, at low flows. Some variability in the hydraulic landscapes was detected over time, and this may reflect reworking of riverbed sediments and sensitivity to variation in discharge. The lack of a response in the low flow hydraulic landscape to the

  18. Testing, development and demonstration of large scale solar district heating systems

    DEFF Research Database (Denmark)

    Furbo, Simon; Fan, Jianhua; Perers, Bengt

    2015-01-01

    In 2013-2014 the project “Testing, development and demonstration of large scale solar district heating systems” was carried out within the Sino-Danish Renewable Energy Development Programme, the so called RED programme jointly developed by the Chinese and Danish governments. In the project Danish...... know how on solar heating plants and solar heating test technology have been transferred from Denmark to China, large solar heating systems have been promoted in China, test capabilities on solar collectors and large scale solar heating systems have been improved in China and Danish-Chinese cooperation...

  19. Support of an Active Science Project by a Large Information System: Lessons for the EOS Era

    Science.gov (United States)

    Angelici, Gary L.; Skiles, J. W.; Popovici, Lidia Z.

    1993-01-01

    The ability of large information systems to support the changing data requirements of active science projects is being tested in a NASA collaborative study. This paper briefly profiles both the active science project and the large information system involved in this effort and offers some observations about the effectiveness of the project support. This is followed by lessons that are important for those participating in large information systems that need to support active science projects or that make available the valuable data produced by these projects. We learned in this work that it is difficult for a large information system focused on long term data management to satisfy the requirements of an on-going science project. For example, in order to provide the best service, it is important for all information system staff to keep focused on the needs and constraints of the scientists in the development of appropriate services. If the lessons learned in this and other science support experiences are not applied by those involved with large information systems of the EOS (Earth Observing System) era, then the final data products produced by future science projects may not be robust or of high quality, thereby making the conduct of the project science less efficacious and reducing the value of these unique suites of data for future research.

  20. A large scale software system for simulation and design optimization of mechanical systems

    Science.gov (United States)

    Dopker, Bernhard; Haug, Edward J.

    1989-01-01

    The concept of an advanced integrated, networked simulation and design system is outlined. Such an advanced system can be developed utilizing existing codes without compromising the integrity and functionality of the system. An example has been used to demonstrate the applicability of the concept of the integrated system outlined here. The development of an integrated system can be done incrementally. Initial capabilities can be developed and implemented without having a detailed design of the global system. Only a conceptual global system must exist. For a fully integrated, user friendly design system, further research is needed in the areas of engineering data bases, distributed data bases, and advanced user interface design.

  1. Report of the Workshop on Petascale Systems Integration for Large-Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld; not research, not facilities, not procurement, not operations, not user services. Taken together, these issues along with the impact of sub-optimal integration technology mean the time required to deploy, integrate and stabilize large-scale systems may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  2. Defining Execution Viewpoints for a Large and Complex Software-Intensive System

    NARCIS (Netherlands)

    Callo Arias, Trosky B.; America, Pierre; Avgeriou, Paris

    2009-01-01

    An execution view is an important asset for developing large and complex systems. An execution view helps practitioners to describe, analyze, and communicate what a software system does at runtime and how it does it. In this paper, we present an approach to define execution viewpoints for an

  3. Development of A Hydraulic Drive for a novel Diesel-Hydraulic system for Large commercial Vehicles

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Matheson, P.

    2002-01-01

    The objectives and results of the research project Hybrid Diesel-Hydraulic System for Large Commercial Vehicles, e.g. urban freight delivery, buses or garbage trucks, are presented. The paper presents and discusses the research and development of the system, the modelling approach and results from preliminary...... performance tests on a 10 ton vehicle....

  4. Distributed measurement system for long term monitoring of clouding effects on large PV plants

    DEFF Research Database (Denmark)

    Paasch, K. M.; Nymand, M.; Haase, F.

    2013-01-01

    A recording system for the generation of current-voltage characteristics of solar panels is presented. The system is intended for large area PV power plants. The recorded curves are used to optimize the energy output of PV power plants, which are likely to be influenced by passing clouds...

  5. Model of large scale man-machine systems with an application to vessel traffic control

    NARCIS (Netherlands)

    Wewerinke, P.H.; van der Ent, W.I.; ten Hove, D.

    1989-01-01

    Mathematical models are discussed to deal with complex large-scale man-machine systems such as vessel (air, road) traffic and process control systems. Only interrelationships between subsystems are assumed. Each subsystem is controlled by a corresponding human operator (HO). Because of the

  6. Large-scale computer networks and the future of legal knowledge-based systems

    NARCIS (Netherlands)

    Leenes, R.E.; Svensson, Jorgen S.; Hage, J.C.; Bench-Capon, T.J.M.; Cohen, M.J.; van den Herik, H.J.

    1995-01-01

    In this paper we investigate the relation between legal knowledge-based systems and large-scale computer networks such as the Internet. On the one hand, researchers of legal knowledge-based systems have claimed huge possibilities, but despite the efforts over the last twenty years, the number of

  7. Compensating active power imbalances in power system with large-scale wind power penetration

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Altin, Müfit

    2016-01-01

    Large-scale wind power penetration can affect the supply continuity in the power system. This is a matter of high priority to investigate, as more regulating reserves and specified control strategies for generation control are required in the future power system with even more high wind power penetrat...

  8. Integration of large-scale heat pumps in the district heating systems of Greater Copenhagen

    DEFF Research Database (Denmark)

    Bach, Bjarne; Werling, Jesper; Ommen, Torben Schmidt

    2016-01-01

    This study analyses the technical and private economic aspects of integrating a large capacity of electric driven HP (heat pumps) in the Greater Copenhagen DH (district heating) system, which is an example of a state-of-the-art large district heating system with many consumers and suppliers....... The analysis was based on using the energy model Balmorel to determine the optimum dispatch of HPs in the system. The potential heat sources in Copenhagen for use in HPs were determined based on data related to temperatures, flows, and hydrography at different locations, while respecting technical constraints...

  9. Optimal Offering and Operating Strategy for a Large Wind-Storage System as a Price Maker

    DEFF Research Database (Denmark)

    Ding, Huajie; Pinson, Pierre; Hu, Zechun

    2017-01-01

    Wind farms and energy storage systems are playing increasingly more important roles in power systems, which makes their offering non-negligible in some markets. From the perspective of wind farm-energy storage systems (WF-ESS), this paper proposes an integrated strategy of day-ahead offering...... and real-time operation policies to maximize their overall profit. As participants with large capacity in electricity markets can influence cleared prices by strategic offering, a large-scale WF-ESS is assumed to be a price maker in day-ahead markets. Correspondingly, the strategy considers influence...

  10. The development of large-aperture test system of infrared camera and visible CCD camera

    Science.gov (United States)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Infrared camera and CCD camera dual-band imaging systems are widely used in many kinds of equipment and applications. If such a system is tested using the traditional infrared camera test system and visible CCD test system, two rounds of installation and alignment are needed in the test procedure. The large-aperture test system for infrared cameras and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame-grabber, and computer, which reduces the cost and the time of installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator focal position with changing environmental temperature, and the image quality of the wide-field collimator and the test accuracy are also improved. Its performance matches that of foreign counterparts at a much lower cost, so it should find a good market.
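The multiple-frame averaging mentioned above reduces zero-mean random noise by roughly the square root of the number of frames; a minimal sketch with synthetic data (the frame count, image size, and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
truth = np.full((64, 64), 100.0)                      # ideal flat target image
frames = truth + rng.normal(0.0, 5.0, (32, 64, 64))   # 32 noisy frames, sigma = 5
avg = frames.mean(axis=0)                             # multiple-frame average

noise_single = frames[0].std()   # about 5
noise_avg = avg.std()            # about 5 / sqrt(32), i.e. under 1
```

Averaging helps only with uncorrelated noise; fixed-pattern artifacts survive averaging and must be removed by calibration instead.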

  11. Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid

    Science.gov (United States)

    Kuwayama, Akira

    The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation Systems” was completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate adverse impacts of large-scale PV power generation systems connected to the power grid and to develop output control technologies with an integrated battery storage system. This paper describes the outline and results of this project. The results show the effectiveness of the battery storage system and of the proposed output control methods for a large-scale PV system in ensuring stable operation of power grids. NEDO, the New Energy and Industrial Technology Development Organization of Japan, conducted this project, and HEPCO, Hokkaido Electric Power Co., Inc., managed the overall project.

  12. Quench detection system of the EURATOM coil for the Large Coil Task

    International Nuclear Information System (INIS)

    Noether, G.; Gauss, S.; Maurer, W.; Siewerdt, L.; Ulbricht, A.; Wuechner, F.

    1989-01-01

    A special quench detection system has been developed for the EURATOM Large Coil Task (LCT) coil. The system is based on a bridge circuit which uses a special 'two in hand' winding technique for the pancakes of the EURATOM LCT coil. The electronic circuit was designed in a fail safe way to prevent failure of the quench detector due to failure of one of its components. A method for quick balancing of the quench detection system in a large toroidal magnet system was applied. The quench detection system worked very reliably during the experimental phase of the LCT and was within the quench detection level setting of 50 mV, i.e. the system was not sensitive to poloidal field transients at or below this level. Non-electrical methods for quench detection were also investigated. (author)

  13. Integrated fringe projection 3D scanning system for large-scale metrology based on laser tracker

    Science.gov (United States)

    Du, Hui; Chen, Xiaobo; Zhou, Dan; Guo, Gen; Xi, Juntong

    2017-10-01

    Large-scale components are widespread in advanced manufacturing, where 3D profilometry plays a pivotal role in quality control. This paper proposes a flexible, robust large-scale 3D scanning system that integrates a robot with a binocular structured-light scanner and a laser tracker. The measurement principle and construction of the integrated system are introduced, and a mathematical model is established for global data fusion. Subsequently, a flexible and robust method and mechanism are introduced for establishing the end coordinate system. Based on this method, a virtual robot body is constructed for hand-eye calibration, and the transformation matrix between the end coordinate system and the world coordinate system is solved. A validation experiment was implemented to verify the proposed algorithms. First, the hand-eye transformation matrix is solved. Then a car body rear is measured 16 times to verify the global data fusion algorithm, and the 3D shape of the rear is reconstructed successfully.
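Global data fusion in such a system amounts to chaining homogeneous transforms from the scanner frame through the robot to the tracker/world frame. A minimal sketch (the poses below are invented placeholders, not calibration results from the paper):

```python
import numpy as np

def transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

Rz90 = np.array([[0.0, -1.0, 0.0],     # 90-degree rotation about z
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])

# Hypothetical poses: world (tracker) -> robot base -> end frame -> scanner
world_T_base = transform(np.eye(3), [1.0, 0.0, 0.0])
base_T_end   = transform(Rz90,      [0.0, 0.0, 0.5])   # forward kinematics
end_T_cam    = transform(np.eye(3), [0.0, 0.0, 0.1])   # hand-eye calibration result

world_T_cam = world_T_base @ base_T_end @ end_T_cam

p_cam = np.array([0.1, 0.0, 0.0, 1.0])   # a scanned point in the camera frame
p_world = world_T_cam @ p_cam            # the same point in the world frame
```

Every scan taken at a new robot pose reuses the same fixed `end_T_cam` from hand-eye calibration, so all point clouds land in one common world frame for fusion.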

  14. Robust stability analysis of large power systems using the structured singular value theory

    Energy Technology Data Exchange (ETDEWEB)

    Castellanos, R.; Sarmiento, H. [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico); Messina, A.R. [Cinvestav, Graduate Program in Electrical Engineering, Guadalajara, Jalisco (Mexico)

    2005-07-01

    This paper examines the application of structured singular value (SSV) theory to analyse robust stability of complex power systems with respect to a set of structured uncertainties. Based on SSV theory and the frequency sweep method, techniques for robust analysis of large-scale power systems are developed. The main interest is focused on determining robust stability for varying operating conditions and uncertainties in the structure of the power system. The applicability of the proposed techniques is verified through simulation studies on a large-scale power system. In particular, results for the system are considered for a wide range of uncertainties of operating conditions. Specifically, the developed technique is used to estimate the effect of variations in the parameters of a major system inter-tie on the nominal stability of a critical inter-area mode. (Author)

  15. Development and application of automatic frequency and power control systems for large power units

    Energy Technology Data Exchange (ETDEWEB)

    V.A. Bilenko; A.D. Melamed; E.E. Mikushevich; D.Y. Nikol' skii; R.L. Rogachev; N.A. Romanov [ZAO Interavtomatika (Interautomatika AG), Moscow (Russian Federation)

    2008-07-01

    We describe the results of work carried out at ZAO Interavtomatika on the development and putting into use of a system for automatically controlling the frequency and power output of large coal-fired power units involving the retrofitting of the turbine's hydraulic automatic control system. Certificates affirming conformity to the Standard of the System Operator Centralized Dispatching Administration (SO-CDA) have been received for eight power units as an outcome of these efforts.

  16. FY 2000 report on the results of the R and D of the fusion domain. Volume 3. Biomolecular mechanism and design; 2000 nendo yugo ryoiki kenkyu kaihatsu. 3. Biomolecular mechanism and design

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    For the purpose of creating, outside the body, cell/tissue assemblies and molecular machines that substitute for the self-organizing and self-repairing functions of a living body, basic technology research was conducted, and the FY 2000 results are reported. In the study of 3D cell and tissue module engineering, the following were conducted: study of the surface modification and functional expression of biomaterials, study of mechanical stress on cartilaginous cells and their response, development of a production method for biodegradable synthetic polymer porous materials, study of biological hard tissue materials, bone remodeling and cultured bone transplantation, and development of zinc-releasing calcium phosphate ceramics. In the study of biomolecular mechanism and design, the following were conducted: 1D unidirectional movement of microtubules using microlithography technology, structural study of kinesin-family molecular motors by low-temperature microscopy, study of ribozymes and their application to leukemia, basic study on the assessment of chemical substances using human cultured cells, and study of a small-molecule detection system using nucleic acids and peptides. (NEDO)

  17. An iterated cubature unscented Kalman filter for large-DoF systems identification with noisy data

    Science.gov (United States)

    Ghorbani, Esmaeil; Cha, Young-Jin

    2018-04-01

    Structural and mechanical system identification under dynamic loading has been an important research topic over the last three or four decades. Many Kalman-filtering-based approaches have been developed for linear and nonlinear systems. For example, to predict nonlinear systems, an unscented Kalman filter was applied. However, extensive literature reviews show that the unscented Kalman filter still performs weakly on systems with large degrees of freedom. In this research, a modified unscented Kalman filter is proposed by integration of a cubature Kalman filter to improve the system identification performance for systems with large degrees of freedom. The novelty of this work lies in combining the unscented transform with the cubature integration concept to obtain a more accurate transformation of the state vector and its related covariance matrix. To evaluate the proposed method, three different numerical models (i.e., the single degree-of-freedom Bouc-Wen model, the linear 3-degrees-of-freedom system, and the 10-degrees-of-freedom system) are investigated. To evaluate the robustness of the proposed method, high levels of noise in the measured response data are considered. The results show that the proposed method is significantly superior to the traditional UKF for noisy measured data in systems with large degrees of freedom.
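The cubature ingredient referenced above is the third-degree spherical-radial rule: 2n equally weighted sigma points that exactly reproduce a Gaussian's mean and covariance. A minimal sketch of the point generation (the state and covariance values are hypothetical; the full filter's predict/update steps are omitted):

```python
import numpy as np

def cubature_points(x, P):
    """Third-degree spherical-radial cubature: 2n equally weighted points
    (weight 1/(2n) each) that reproduce mean x and covariance P exactly."""
    n = len(x)
    S = np.linalg.cholesky(P)                               # P = S S^T
    xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])    # +/- scaled unit axes
    return x[:, None] + S @ xi                              # one point per column

x = np.array([1.0, 2.0])
P = np.array([[2.0, 0.3],
              [0.3, 1.0]])
pts = cubature_points(x, P)

mean = pts.mean(axis=1)
cov = (pts - mean[:, None]) @ (pts - mean[:, None]).T / pts.shape[1]
```

Unlike the unscented transform, this rule has no tunable spread parameters and its weights are always positive, which is one reason cubature variants tend to stay numerically stable as the state dimension grows.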

  18. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations

    OpenAIRE

    Vergara-Perez, Sandra; Marucho, Marcelo

    2015-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually take several important choices, de...
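In the linearized (Debye-Hückel) limit, the PB equation for a point charge has the closed-form screened Coulomb solution, which is often used as a sanity check for numerical PB solvers such as the one described above. A minimal sketch (the distance, permittivity, and Debye length below are illustrative values, not taken from MPBEC):

```python
import numpy as np

def debye_huckel_potential(z, r, eps_r, kappa):
    """Potential (volts) of a point charge z*e at distance r (m) in a solvent
    of relative permittivity eps_r, from the linearized PB (Debye-Hueckel)
    equation: phi(r) = z*e*exp(-kappa*r) / (4*pi*eps0*eps_r*r)."""
    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
    e = 1.602176634e-19       # elementary charge, C
    return z * e * np.exp(-kappa * r) / (4.0 * np.pi * eps0 * eps_r * r)

r = 1.0e-9                                                # 1 nm from a unit charge
coulomb = debye_huckel_potential(1, r, 78.5, 0.0)         # no ionic screening
screened = debye_huckel_potential(1, r, 78.5, 1.0 / 0.96e-9)  # ~0.1 M 1:1 salt
```

A numerical PB solver run on a single ion in salt solution should reproduce this screened profile away from the ion; deviations indicate grid or boundary-condition problems.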

  19. Electric vehicles to support large wind power penetration in future danish power systems

    DEFF Research Database (Denmark)

    Pillai, Jayakrishnan Radhakrishna; Bak-Jensen, Birgitte; Thøgersen, Paul

    2012-01-01

    Electric vehicles (EVs) could play a major role in future intelligent grids to support a large penetration of renewable energy in Denmark, especially electricity production from wind turbines. Future power systems aim to phase out big conventional fossil-fueled generators with large number...... on low voltage residential networks. A significant number of EVs could be integrated in local distribution grids with the support of intelligent grid and smart charging strategies....

  20. A Short Proof of the Large Time Energy Growth for the Boussinesq System

    Science.gov (United States)

    Brandolese, Lorenzo; Mouzouni, Charafeddine

    2017-10-01

    We give a direct proof of the fact that the L^p-norms of global solutions of the Boussinesq system in R^3 grow large as t → ∞ for 1 < p < 3, via a decomposition of the time-space domain R+ × R^3. In particular, the kinetic energy blows up as ‖u(t)‖_2^2 ~ c t^{1/2} for large time. This contrasts with the case of the Navier-Stokes equations.