WorldWideScience

Sample records for advanced computational approaches

  1. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  2. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  3. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications, and offers a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  4. Advanced approaches to characterize the human intestinal microbiota by computational meta-analysis

    NARCIS (Netherlands)

    Nikkilä, J.; Vos, de W.M.

    2010-01-01

    GOALS: We describe advanced approaches for the computational meta-analysis of a collection of independent studies, including over 1000 phylogenetic array datasets, as a means to characterize the variability of human intestinal microbiota. BACKGROUND: The human intestinal microbiota is a complex micr

  5. Advances in computers

    CERN Document Server

    Memon, Atif

    2012-01-01

    Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. In-depth surveys and tutorials on new computer technology. Well-known authors and researchers in the field. Extensive bibliographies with m...

  6. Recent Advances in Evolutionary Computation

    Institute of Scientific and Technical Information of China (English)

    Xin Yao; Yong Xu

    2006-01-01

    Evolutionary computation has experienced a tremendous growth in the last decade in both theoretical analyses and industrial applications. Its scope has evolved beyond its original meaning of "biological evolution" toward a wide variety of nature inspired computational algorithms and techniques, including evolutionary, neural, ecological, social and economical computation, etc., in a unified framework. Many research topics in evolutionary computation nowadays are not necessarily "evolutionary". This paper provides an overview of some recent advances in evolutionary computation that have been made in CERCIA at the University of Birmingham, UK. It covers a wide range of topics in optimization, learning and design using evolutionary approaches and techniques, and theoretical results in the computational time complexity of evolutionary algorithms. Some issues related to future development of evolutionary computation are also discussed.
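
    As a concrete reference point for the runtime-analysis work mentioned above, the sketch below implements the (1+1) evolutionary algorithm on the OneMax problem, the textbook setting in which the expected optimization time is provably O(n log n). This is an illustrative Python example under standard assumptions, not code from CERCIA.

        import random

        def one_plus_one_ea(n, seed=0):
            """(1+1) EA on OneMax: maximize the number of ones in a bit string.

            Illustrative sketch only; the expected O(n log n) iterations to
            reach the all-ones optimum is a classic runtime-analysis result.
            """
            rng = random.Random(seed)
            x = [rng.randint(0, 1) for _ in range(n)]
            steps = 0
            while sum(x) < n:
                # Mutate: flip each bit independently with probability 1/n.
                y = [bit ^ (rng.random() < 1.0 / n) for bit in x]
                if sum(y) >= sum(x):   # elitist selection: keep offspring if no worse
                    x = y
                steps += 1
            return steps

        print(one_plus_one_ea(50))  # iterations needed to reach the optimum for n = 50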

  7. Advances in Computers

    CERN Document Server

    Zelkowitz, Marvin

    2010-01-01

    This is volume 79 of Advances in Computers. This series, which began publication in 1960, is the oldest continuously published anthology that chronicles the ever-changing information technology field. In these volumes we publish from 5 to 7 chapters, three times per year, that cover the latest changes to the design, development, use and implications of computer technology on society today. Covers the full breadth of innovations in hardware, software, theory, design, and applications. Many of the in-depth reviews have become standard references that continue to be of significant, lasting value in this rapidly expanding field.

  8. Advances in computational dynamics of particles, materials and structures a unified approach

    CERN Document Server

    Har, Jason

    2012-01-01

    Computational methods for the modeling and simulation of the dynamic response and behavior of particles, materials and structural systems have had a profound influence on science, engineering and technology. Complex science and engineering applications dealing with complicated structural geometries and materials that would be very difficult to treat using analytical methods have been successfully simulated using computational tools. With the incorporation of quantum, molecular and biological mechanics into new models, these methods are poised to play an even bigger role in the future. Ad

  9. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

    2012-07-31

    This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) Prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual loop chemical looping process models and dynamic simulation software tools, development and test of several advanced control concepts and applications for Chemical Looping transport control and investigation of several sensor concepts and establishment of two feasible sensor candidates recommended for further prototype development and controls integration. There are three sections in this summary and conclusions. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

  10. Collaborative Learning: Cognitive and Computational Approaches. Advances in Learning and Instruction Series.

    Science.gov (United States)

    Dillenbourg, Pierre, Ed.

    Intended to illustrate the benefits of collaboration between scientists from psychology and computer science, namely machine learning, this book contains the following chapters, most of which are co-authored by scholars from both sides: (1) "Introduction: What Do You Mean by 'Collaborative Learning'?" (Pierre Dillenbourg); (2) "Learning Together:…

  11. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Bobby, R., Ph.D.

    2003-06-27

    OAK - B135 This project final report summarizes modeling research conducted in the U.S. Department of Energy (DOE), Low Dose Radiation Research Program at the Lovelace Respiratory Research Institute from October 1998 through June 2003. The modeling research described involves critically evaluating the validity of the linear nonthreshold (LNT) risk model as it relates to stochastic effects induced in cells by low doses of ionizing radiation and genotoxic chemicals. The LNT model plays a central role in low-dose risk assessment for humans. With the LNT model, any radiation (or genotoxic chemical) exposure is assumed to increase one's risk of cancer. Based on the LNT model, others have predicted tens of thousands of cancer deaths related to environmental exposure to radioactive material from nuclear accidents (e.g., Chernobyl) and fallout from nuclear weapons testing. Our research has focused on developing biologically based models that explain the shape of dose-response curves for low-dose radiation and genotoxic chemical-induced stochastic effects in cells. Understanding the shape of the dose-response curve for radiation and genotoxic chemical-induced stochastic effects in cells helps to better understand the shape of the dose-response curve for cancer induction in humans. We have used a modeling approach that facilitated model revisions over time, allowing for timely incorporation of new knowledge gained related to the biological basis for low-dose-induced stochastic effects in cells. Both deleterious (e.g., genomic instability, mutations, and neoplastic transformation) and protective (e.g., DNA repair and apoptosis) effects have been included in our modeling. Our most advanced model, NEOTRANS2, involves differing levels of genomic instability. Persistent genomic instability is presumed to be associated with nonspecific, nonlethal mutations and to increase both the risk for neoplastic transformation and for cancer occurrence. Our research results, based on
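
    For orientation, the LNT model under evaluation has the standard form below (general radiobiology background, not a formula quoted from this report):

        \[ R(D) \;=\; R_0 + \alpha D, \qquad D \ge 0, \]

    where R_0 is the background risk and alpha a positive slope, so any dose D > 0 is assumed to raise risk. Threshold alternatives instead keep R(D) = R_0 for doses below some threshold D_T, which is why the shape of the low-dose portion of the dose-response curve is the central question in the work described above.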

  12. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  13. International Conference on Advanced Computing for Innovation

    CERN Document Server

    Angelova, Galia; Agre, Gennady

    2016-01-01

    This volume is a selected collection of papers presented and discussed at the International Conference “Advanced Computing for Innovation (AComIn 2015)”. The Conference was held on 10-11 November 2015 in Sofia, Bulgaria, and aimed at providing a forum for international scientific exchange between Central/Eastern Europe and the rest of the world on several fundamental topics of computational intelligence. The papers report innovative approaches and solutions in hot topics of computational intelligence – advanced computing, language and semantic technologies, signal and image processing, as well as optimization and intelligent control.

  14. Advanced intelligence and mechanism approach

    Institute of Scientific and Technical Information of China (English)

    ZHONG Yixin

    2007-01-01

    Advanced intelligence will feature intelligence research in the next 50 years. An understanding of the concept of advanced intelligence, as well as its importance, is provided first, followed by a detailed analysis of the mechanism approach, an approach suitable for advanced intelligence research. The mutual relationship among the mechanism approach, the traditional approaches in artificial intelligence research, and cognitive informatics is then discussed. It is interesting to discover that the mechanism approach is well suited to advanced intelligence research and is a unified form of the existing approaches to artificial intelligence.

  15. Computational Intelligence Paradigms in Advanced Pattern Classification

    CERN Document Server

    Jain, Lakhmi

    2012-01-01

    This monograph presents selected areas of application of pattern recognition and classification approaches, including handwriting recognition, medical image analysis and interpretation, development of cognitive systems for image computer understanding, moving object detection, advanced image filtration, and intelligent multi-object labelling and classification. It is directed to scientists, application engineers, professors, and students, who will find this book useful.

  16. Advanced computations in plasma physics

    International Nuclear Information System (INIS)

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  17. Recent advances in computational optimization

    CERN Document Server

    2013-01-01

    Optimization is part of our everyday life: we try to organize our work in a better way, and optimization occurs in minimizing time and cost or in maximizing profit, quality and efficiency. Many real world problems arising in engineering, economics, medicine and other domains can also be formulated as optimization tasks. This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization and presents recent advances in the field. The volume includes important real world problems like parameter settings for controlling processes in a bioreactor, robot skin wiring, strip packing, project scheduling, tuning of PID controllers and so on. Some of them can be solved by applying traditional numerical methods, but others need a huge amount of computational resources. For those it is shown that it is appropriate to develop algorithms based on metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming etc...

  18. International Conference on Advanced Computing

    CERN Document Server

    Patnaik, Srikanta

    2014-01-01

    This book is composed of the Proceedings of the International Conference on Advanced Computing, Networking, and Informatics (ICACNI 2013), held at Central Institute of Technology, Raipur, Chhattisgarh, India during June 14–16, 2013. The book records current research articles in the domain of computing, networking, and informatics. The book presents original research articles, case-studies, as well as review articles in the said field of study with emphasis on their implementation and practical application. Researchers, academicians, practitioners, and industry policy makers around the globe have contributed towards formation of this book with their valuable research submissions.

  19. Advances in embedded computer vision

    CERN Document Server

    Kisacanin, Branislav

    2014-01-01

    This illuminating collection offers a fresh look at the very latest advances in the field of embedded computer vision. Emerging areas covered by this comprehensive text/reference include the embedded realization of 3D vision technologies for a variety of applications, such as stereo cameras on mobile devices. Recent trends towards the development of small unmanned aerial vehicles (UAVs) with embedded image and video processing algorithms are also examined. The authoritative insights range from historical perspectives to future developments, reviewing embedded implementation, tools, technolog

  20. Advanced Computer Algebra for Determinants

    CERN Document Server

    Koutschan, Christoph

    2011-01-01

    We prove three conjectures concerning the evaluation of determinants, which are related to the counting of plane partitions and rhombus tilings. One of them was posed by George Andrews in 1980; the other two are due to Guoce Xin and Christian Krattenthaler. Our proofs employ computer algebra methods, namely the holonomic ansatz proposed by Doron Zeilberger and variations thereof. These variations make Zeilberger's original approach even more powerful and allow for addressing a wider variety of determinants. Finally we present, as a challenge problem, a conjecture about a closed form evaluation of Andrews's determinant.

  1. Computational approaches to vision

    Science.gov (United States)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  2. Advances in Computer Science and Engineering

    CERN Document Server

    Second International Conference on Advances in Computer Science and Engineering (CES 2012)

    2012-01-01

    This book includes the proceedings of the second International Conference on Advances in Computer Science and Engineering (CES 2012), which was held during January 13-14, 2012 in Sanya, China. The papers in these proceedings of CES 2012 focus on researchers' advanced work in the fields of Computer Science and Engineering, organized mainly into four topics: (1) Software Engineering, (2) Intelligent Computing, (3) Computer Networks, and (4) Artificial Intelligence Software.

  3. Handbook of computational approaches to counterterrorism

    CERN Document Server

    Subrahmanian, VS

    2012-01-01

    Terrorist groups throughout the world have been studied primarily through the use of social science methods. However, major advances in IT during the past decade have led to significant new ways of studying terrorist groups, making forecasts, learning models of their behaviour, and shaping policies about their behaviour. Handbook of Computational Approaches to Counterterrorism provides the first in-depth look at how advanced mathematics and modern computing technology are shaping the study of terrorist groups. This book includes contributions from world experts in the field, and presents extens...

  4. Toward exascale computing through neuromorphic approaches.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.

    2010-09-01

    While individual neurons function at relatively low firing rates, naturally-occurring nervous systems not only surpass manmade systems in computing power, but accomplish this feat using relatively little energy. It is asserted that the next major breakthrough in computing power will be achieved through application of neuromorphic approaches that mimic the mechanisms by which neural systems integrate and store massive quantities of data for real-time decision making. The proposed LDRD provides a conceptual foundation for SNL to make unique advances toward exascale computing. First, a team consisting of experts from the HPC, MESA, cognitive and biological sciences and nanotechnology domains will be coordinated to conduct an exercise with the outcome being a concept for applying neuromorphic computing to achieve exascale computing. It is anticipated that this concept will involve innovative extension and integration of SNL capabilities in MicroFab, material sciences, high-performance computing, and modeling and simulation of neural processes/systems.

  5. Advanced in Computer Science and its Applications

    CERN Document Server

    Yen, Neil; Park, James; CSA 2013

    2014-01-01

    The theme of CSA focuses on the various aspects of computer science and its applications, and the conference provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of computer science and its applications. Accordingly, this book includes various theories and practical applications in computer science and its applications.

  6. Bringing Advanced Computational Techniques to Energy Research

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  7. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...
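
    To make the while-program idea concrete, here is a hedged sketch in Python (the book itself uses a PASCAL subset, so the syntax is an assumption of this illustration): multiplication built from nothing but zeroing, increment, decrement, and while-loops that test against zero, the kind of lean language on which the text bases computability.

        def times(x, y):
            """Compute x * y in while-program style (illustrative, not the
            book's syntax): only +1, -1, 0, and zero-tests are used."""
            result = 0
            while y > 0:              # while-programs loop on a zero-test
                t = x
                while t > 0:
                    result = result + 1
                    t = t - 1
                y = y - 1
            return result

        print(times(6, 7))  # 42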

  8. Soft computing in advanced robotics

    CERN Document Server

    Kobayashi, Ichiro; Kim, Euntai

    2014-01-01

    Intelligent systems and robotics are inevitably bound up: intelligent robots embody system integration by using intelligent systems. Intelligent systems are to cell units as intelligent robots are to body components, and the two technologies have progressed in step. Leveraging robotics and intelligent systems, applications cover a boundless range from our daily life to the space station: manufacturing, healthcare, environment, energy, education, personal assistance, logistics. This book aims at presenting research results relevant to intelligent robotics technology. We propose to researchers and practitioners some methods to advance intelligent systems and apply them to advanced robotics technology. This book consists of 10 contributions that feature mobile robots, robot emotion, electric power steering, multi-agent systems, fuzzy visual navigation, adaptive network-based fuzzy inference systems, swarm EKF localization and inspection robots. Th...

  9. Advanced laptop and small personal computer technology

    Science.gov (United States)

    Johnson, Roger L.

    1991-01-01

    Advanced laptop and small personal computer technology is presented in the form of viewgraphs. The following areas of hand-carried computers and mobile workstation technology are covered: background, applications, high-end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  10. Advanced Biomedical Computing Center (ABCC) | DSITP

    Science.gov (United States)

    The Advanced Biomedical Computing Center (ABCC), located in Frederick Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to provide collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  11. Elliptic curves a computational approach

    CERN Document Server

    Schmitt, Susanne; Pethö, Attila

    2003-01-01

    The basics of the theory of elliptic curves should be known to everybody, be he (or she) a mathematician or a computer scientist. In particular, everybody concerned with cryptography should know the elements of this theory. The purpose of the present textbook is to give an elementary introduction to elliptic curves. Since this branch of number theory is particularly accessible to computer-assisted calculations, the authors make use of it by approaching the theory from a computational point of view. Specifically, the computer-algebra package SIMATH can be applied on several occasions. However, the book can be read also by those not interested in any computations. Of course, the theory of elliptic curves is very comprehensive and becomes correspondingly sophisticated. That is why the authors made a choice of the topics treated. Topics covered include the determination of torsion groups, computations regarding the Mordell-Weil group, height calculations, S-integral points. The contents are kept as elementary as poss...
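
    As a small illustration of this computational viewpoint (an independent sketch, not SIMATH code; the curve and field are chosen for the example), the group law on an elliptic curve y^2 = x^3 + ax + b over a prime field can be written in a few lines; it is the primitive underlying torsion-group and Mordell-Weil computations.

        def ec_add(P, Q, a, p):
            """Add points on y^2 = x^3 + a*x + b over F_p (None = infinity).
            Requires Python 3.8+ for pow(x, -1, p) modular inverses."""
            if P is None:
                return Q
            if Q is None:
                return P
            (x1, y1), (x2, y2) = P, Q
            if x1 == x2 and (y1 + y2) % p == 0:
                return None                                       # P + (-P)
            if P == Q:
                lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
            else:
                lam = (y2 - y1) * pow(x2 - x1, -1, p) % p         # chord slope
            x3 = (lam * lam - x1 - x2) % p
            return (x3, (lam * (x1 - x3) - y1) % p)

        # (3, 6) lies on y^2 = x^3 + 2x + 3 over F_97, since 36 = 27 + 6 + 3.
        print(ec_add((3, 6), (3, 6), a=2, p=97))  # doubling gives (80, 10)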

  12. Advance Trends in Soft Computing

    CERN Document Server

    Kreinovich, Vladik; Kacprzyk, Janusz; WCSC 2013

    2014-01-01

    This book is the proceedings of the 3rd World Conference on Soft Computing (WCSC), which was held in San Antonio, TX, USA, on December 16-18, 2013. It presents state-of-the-art theory and applications of soft computing together with an in-depth discussion of current and future challenges in the field, providing readers with a 360 degree view on soft computing. Topics range from fuzzy sets, to fuzzy logic, fuzzy mathematics, neuro-fuzzy systems, fuzzy control, decision making in fuzzy environments, image processing and many more. The book is dedicated to Lotfi A. Zadeh, a renowned specialist in signal analysis and control systems research who proposed the idea of fuzzy sets, in which an element may have a partial membership, in the early 1960s, followed by the idea of fuzzy logic, in which a statement can be true only to a certain degree, with degrees described by numbers in the interval [0,1]. The performance of fuzzy systems can often be improved with the help of optimization techniques, e.g. evolutionary co...
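
    Because the description leans on Zadeh's two core ideas, partial membership and truth degrees in [0,1], a minimal sketch may help (illustrative values and function names, not taken from the proceedings):

        def triangular(x, a, b, c):
            """Membership degree of x in a triangular fuzzy set (a, b, c)."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        warm = triangular(22.0, 15.0, 25.0, 35.0)  # 22 C is "warm" to degree 0.7
        hot = triangular(22.0, 25.0, 35.0, 45.0)   # ... and "hot" to degree 0.0
        print(min(warm, hot))   # fuzzy AND (Zadeh min operator)
        print(max(warm, hot))   # fuzzy OR  (Zadeh max operator)
        print(1.0 - warm)       # fuzzy NOT (complement)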

  13. Antenna arrays a computational approach

    CERN Document Server

    Haupt, Randy L

    2010-01-01

    This book covers a wide range of antenna array topics that are becoming increasingly important in wireless applications, particularly in design and computer modeling. Signal processing and numerical modeling algorithms are explored, and MATLAB computer codes are provided for many of the design examples. Pictures of antenna arrays and components provided by industry and government sources are presented with explanations of how they work. Antenna Arrays is a valuable reference for practicing engineers and scientists in wireless communications, radar, and remote sensing, and an excellent textbook for advanced antenna courses.
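
    For flavor, the central quantity in such design work is the array factor. The sketch below (Python rather than the book's MATLAB, with an assumed element count and spacing) evaluates the normalized array factor of a uniform linear array.

        import cmath, math

        def array_factor(theta, n_elements=8, d=0.5):
            """Normalized |AF| of a uniform linear array; d is the element
            spacing in wavelengths, theta the angle from the array axis."""
            k_d = 2.0 * math.pi * d                  # phase shift per element spacing
            af = sum(cmath.exp(1j * k_d * n * math.cos(theta))
                     for n in range(n_elements))
            return abs(af) / n_elements

        for deg in (0, 30, 60, 90):
            print(deg, round(array_factor(math.radians(deg)), 3))
        # Peaks at 90 degrees (broadside) for this uniform, unsteered array.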

  14. Recent Advances in Computational Conformal Geometry

    OpenAIRE

    Gu, Xianfeng David; Luo, Feng; Yau, Shing-Tung

    2009-01-01

    Computational conformal geometry focuses on developing computational methodologies on discrete surfaces to discover conformal geometric invariants. In this work, we briefly summarize recent developments in methods and related applications in computational conformal geometry. There are two major approaches, holomorphic differentials and curvature flow. The holomorphic differential method is a linear method, which is more efficient and robust to triangulations with lower qua...

  15. Advances in randomized parallel computing

    CERN Document Server

    Rajasekaran, Sanguthevar

    1999-01-01

    The technique of randomization has been employed to solve numerous problems of computing, both sequentially and in parallel. Examples of randomized algorithms that are asymptotically better than their deterministic counterparts in solving various fundamental problems abound. Randomized algorithms have the advantages of simplicity and better performance, both in theory and often in practice. This book is a collection of articles written by renowned experts in the area of randomized parallel computing. A brief introduction to randomized algorithms: in the analysis of algorithms, at least three different measures of performance can be used: the best case, the worst case, and the average case. Often, the average case run time of an algorithm is much smaller than the worst case. For instance, the worst case run time of Hoare's quicksort is O(n²), whereas its average case run time is only O(n log n). The average case analysis is conducted with an assumption on the input space. The assumption made to arrive at t...
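
    The quicksort example above is the canonical case: choosing the pivot at random turns the adversarial O(n²) worst case into an O(n log n) expected-time bound that holds for every input. A minimal sketch (illustrative, not from the book):

        import random

        def randomized_quicksort(a):
            """Expected O(n log n) on any input: the pivot is random, so no
            fixed input ordering can force quadratic behavior."""
            if len(a) <= 1:
                return a
            pivot = random.choice(a)
            return (randomized_quicksort([x for x in a if x < pivot])
                    + [x for x in a if x == pivot]
                    + randomized_quicksort([x for x in a if x > pivot]))

        print(randomized_quicksort([5, 3, 8, 1, 9, 2, 7]))  # [1, 2, 3, 5, 7, 8, 9]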

  16. Computational Approaches to Nucleic Acid Origami.

    Science.gov (United States)

    Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo

    2015-10-12

    Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. Despite these advantages, the development of RNA origami lags behind DNA origami by a large gap. Furthermore, although computational methods have proven to be effective in designing DNA and RNA origami structures and in their evaluation, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms. PMID:26348196

  17. Computational approaches to energy materials

    CERN Document Server

    Catlow, Richard; Walsh, Aron

    2013-01-01

    The development of materials for clean and efficient energy generation and storage is one of the most rapidly developing, multi-disciplinary areas of contemporary science, driven primarily by concerns over global warming, diminishing fossil-fuel reserves, the need for energy security, and increasing consumer demand for portable electronics. Computational methods are now an integral and indispensable part of the materials characterisation and development process.   Computational Approaches to Energy Materials presents a detailed survey of current computational techniques for the

  18. Advances and trends in computational structures technology

    Science.gov (United States)

    Noor, A. K.; Venneri, S. L.

    1990-01-01

    The major goals of computational structures technology (CST) are outlined, and recent advances in CST are examined. These include computational material modeling, stochastic-based modeling, computational methods for articulated structural dynamics, strategies and numerical algorithms for new computing systems, and multidisciplinary analysis and optimization. The role of CST in the future development of structures technology and the multidisciplinary design of future flight vehicles is addressed, and the future directions of CST research in the prediction of failures of structural components, the solution of large-scale structural problems, and quality assessment and control of numerical simulations are discussed.

  19. Computational electromagnetics recent advances and engineering applications

    CERN Document Server

    2014-01-01

    Emerging Topics in Computational Electromagnetics presents advances in computational electromagnetics. This book is designed to fill the existing gap in current CEM literature, which covers only the conventional numerical techniques for solving traditional EM problems. The book examines new algorithms, and applications of these algorithms for solving problems of current interest that are not readily amenable to efficient treatment by using the existing techniques. The authors discuss solution techniques for problems arising in nanotechnology, bioEM, metamaterials, as well as multiscale problems. They present techniques that utilize recent advances in computer technology, such as parallel architectures, and the increasing need to solve large and complex problems in a time efficient manner by using highly scalable algorithms.
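
    As a taste of the FDTD family mentioned above, here is a hedged one-dimensional Yee-scheme sketch in normalized units (a standard textbook formulation with Courant number 1/2; grid sizes and the source are illustrative choices, not code from this volume):

        import math

        nx, nsteps, src = 200, 180, 100
        ez = [0.0] * nx            # electric field samples
        hy = [0.0] * nx            # magnetic field samples (staggered grid)
        for t in range(nsteps):
            for k in range(1, nx):
                ez[k] += 0.5 * (hy[k - 1] - hy[k])               # E update from curl H
            ez[src] += math.exp(-0.5 * ((t - 30) / 10.0) ** 2)   # Gaussian soft source
            for k in range(nx - 1):
                hy[k] += 0.5 * (ez[k] - ez[k + 1])               # H update from curl E
        print(max(ez))   # a pulse has propagated outward from the source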

  20. Mobile Cloud Computing: A Review on Smartphone Augmentation Approaches

    OpenAIRE

    Abolfazli, Saeid; Sanaei, Zohreh; Gani, Abdullah

    2012-01-01

    Smartphones have recently gained significant popularity in heavy mobile processing while users are increasing their expectations toward rich computing experience. However, resource limitations and current mobile computing advancements hinder this vision. Therefore, resource-intensive application execution remains a challenging task in mobile computing that necessitates device augmentation. In this article, smartphone augmentation approaches are reviewed and classified in two main groups, name...

  1. Managing Security in Advanced Computational Infrastructure

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Proposed by the Education Ministry of China, the Advanced Computational Infrastructure (ACI) aims at sharing geographically distributed high-performance computing and huge-capacity data resources among the universities of China. With the fast development of large-scale applications in ACI, the security requirements become more and more urgent. The special security needs of ACI are first analyzed in this paper, and a security management system based on ACI is presented. Finally, the realization of the security management system is discussed.

  2. Advances and Challenges in Computational Plasma Science

    Energy Technology Data Exchange (ETDEWEB)

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  3. Advanced computational electromagnetic methods and applications

    CERN Document Server

    Li, Wenxing; Elsherbeni, Atef; Rahmat-Samii, Yahya

    2015-01-01

    This new resource covers the latest developments in computational electromagnetic methods, with emphasis on cutting-edge applications. This book is designed to extend existing literature to the latest development in computational electromagnetic methods, which are of interest to readers in both academic and industrial areas. The topics include advanced techniques in MoM, FEM and FDTD, spectral domain method, GPU and Phi hardware acceleration, metamaterials, frequency and time domain integral equations, and statistics methods in bio-electromagnetics.

  4. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    Energy Technology Data Exchange (ETDEWEB)

    None

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  5. Computational Approach for Developing Blood Pump

    Science.gov (United States)

    Kwak, Dochan

    2002-01-01

    This viewgraph presentation provides an overview of the computational approach to developing a ventricular assist device (VAD) which utilizes NASA aerospace technology. The VAD is used as a temporary support for sick ventricles in those who suffer from late stage congestive heart failure (CHF). The need for donor hearts is much greater than their availability, and the VAD is seen as a bridge-to-transplant. The computational issues confronting the design of a more advanced, reliable VAD include the modelling of viscous incompressible flow. A computational approach provides the possibility of quantifying the flow characteristics, which is especially valuable for analyzing a compact design with highly sensitive operating conditions. Computational fluid dynamics (CFD) and rocket engine technology have been applied to modify the design of a VAD which enabled human transplantation. The computing requirement for this project is still large, however, and the unsteady analysis of the entire system from natural heart to aorta involves several hundred revolutions of the impeller. Further study is needed to assess the impact of mechanical VADs on the human body.
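
    For reference, the viscous incompressible flow model mentioned above is governed, in standard form (textbook CFD background, not an equation taken from the presentation), by the incompressible Navier-Stokes system:

        \[ \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
             = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u},
           \qquad \nabla\cdot\mathbf{u} = 0, \]

    with velocity u, pressure p, density rho and kinematic viscosity nu; enforcing the divergence-free constraint over several hundred impeller revolutions is what makes the unsteady analysis so computationally demanding.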

  6. Advances in Monte Carlo computer simulation

    Science.gov (United States)

    Swendsen, Robert H.

    2011-03-01

    Since the invention of the Metropolis method in 1953, Monte Carlo methods have been shown to provide an efficient, practical approach to the calculation of physical properties in a wide variety of systems. In this talk, I will discuss some of the advances in the MC simulation of thermodynamic systems, with an emphasis on optimization to obtain a maximum of useful information.
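
    A minimal Metropolis sketch (illustrative; a Gaussian stands in for a Boltzmann weight, and this is not code from the talk) shows the accept/reject step on which these methods build:

        import math, random

        def metropolis(n_steps, step=1.0, seed=0):
            """Sample p(x) proportional to exp(-x^2/2) via Metropolis moves."""
            rng = random.Random(seed)
            x, samples = 0.0, []
            for _ in range(n_steps):
                x_new = x + rng.uniform(-step, step)     # propose a random move
                # Accept with probability min(1, p(x_new)/p(x)).
                if rng.random() < math.exp((x * x - x_new * x_new) / 2.0):
                    x = x_new                            # accept; else keep x
                samples.append(x)
            return samples

        s = metropolis(100_000)
        print(sum(s) / len(s))                  # near 0 (the mean)
        print(sum(v * v for v in s) / len(s))   # near 1 (the variance)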

  7. Advances in computers improving the web

    CERN Document Server

    Zelkowitz, Marvin

    2010-01-01

    This is volume 78 of Advances in Computers. This series, which began publication in 1960, is the oldest continuously published anthology that chronicles the ever-changing information technology field. In these volumes we publish from 5 to 7 chapters, three times per year, that cover the latest changes to the design, development, use and implications of computer technology on society today. Covers the full breadth of innovations in hardware, software, theory, design, and applications. Many of the in-depth reviews have become standard references that continue to be of significant, lasting value in this rapidly expanding field.

  8. Computational Approaches to Vestibular Research

    Science.gov (United States)

    Ross, Muriel D.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    The Biocomputation Center at NASA Ames Research Center is dedicated to a union between computational, experimental and theoretical approaches to the study of neuroscience and of life sciences in general. The current emphasis is on computer reconstruction and visualization of vestibular macular architecture in three-dimensions (3-D), and on mathematical modeling and computer simulation of neural activity in the functioning system. Our methods are being used to interpret the influence of spaceflight on mammalian vestibular maculas in a model system, that of the adult Sprague-Dawley rat. More than twenty 3-D reconstructions of type I and type II hair cells and their afferents have been completed by digitization of contours traced from serial sections photographed in a transmission electron microscope. This labor-intensive method has now been replaced by a semiautomated method developed in the Biocomputation Center in which conventional photography is eliminated. All viewing, storage and manipulation of original data is done using Silicon Graphics workstations. Recent improvements to the software include a new mesh generation method for connecting contours. This method will permit the investigator to describe any surface, regardless of complexity, including highly branched structures such as are routinely found in neurons. This same mesh can be used for 3-D, finite volume simulation of synapse activation and voltage spread on neuronal surfaces visualized via the reconstruction process. These simulations help the investigator interpret the relationship between neuroarchitecture and physiology, and are of assistance in determining which experiments will best test theoretical interpretations. Data are also used to develop abstract, 3-D models that dynamically display neuronal activity ongoing in the system. Finally, the same data can be used to visualize the neural tissue in a virtual environment. Our exhibit will depict capabilities of our computational approaches and

  9. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big pictur

  10. Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  11. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  12. Foreword: Advanced Science Letters (ASL), Special Issue on Computational Astrophysics

    CERN Document Server

    2009-01-01

    Computational astrophysics has undergone unprecedented development over the last decade, becoming a field of its own. The challenge ahead of us will involve increasingly complex multi-scale simulations. These will bridge the gap between areas of astrophysics such as star and planet formation, or star formation and galaxy formation, that have evolved separately until today. A global knowledge of the physics and modeling techniques of astrophysical simulations is thus an important asset for the next generation of modelers. With the aim at fostering such a global approach, we present the Special Issue on Computational Astrophysics for the Advanced Science Letters (http://www.aspbs.com/science.htm). The Advanced Science Letters (ASL) is a new multi-disciplinary scientific journal which will cover extensively computational astrophysics and cosmology, and will act as a forum for the presentation and discussion of novel work attempting to connect different research areas. This Special Issue collects 9 reviews on 9 k...

  13. Advances in medical decision support systems for diagnosis of acute graft-versus-host disease: molecular and computational intelligence joint approaches

    Institute of Scientific and Technical Information of China (English)

    Maurizio FIASCH(E); Maria CUZZOLA; Giuseppe IRRERA; Pasquale IACOPINO; Francesco Carlo MORABITO

    2011-01-01

    Acute graft-versus-host disease (aGVHD) is a serious systemic complication of allogeneic hematopoietic stem cell transplantation (HSCT) causing considerable morbidity and mortality. Acute GVHD occurs when alloreactive donor-derived T cells recognize host-recipient antigens as foreign. These trigger a complex multiphase process that ultimately results in apoptotic injury in target organs. The early events leading to GVHD seem to occur very soon, presumably within hours from the graft infusion. Therefore, when the first signs of aGVHD clinically manifest, the disease has been ongoing for several days at the cellular level, and the inflammatory cytokine cascade is fully activated. So, it comes as no surprise that progress in treatment based on clinical diagnosis of aGVHD has been limited in the past 30 years. It is likely that a pre-emptive strategy using systemic high-dose corticosteroids as early as possible could improve the outcome of aGVHD. Due to the deleterious effects of such treatment, particularly in terms of the infection risk posed by systemic steroid administration in a population that is already immune-suppressed, it is critical to identify biomarker signatures for approaching this very complex task. Some research groups have begun addressing this issue through molecular and proteomic analyses, combining these approaches with computational intelligence techniques, with the specific aim of facilitating the identification of diagnostic biomarkers in aGVHD. In this review, we focus on the aGVHD scenario and on the more recent state of the art. We also attempt to give an overview of the classical and novel techniques proposed as medical decision support systems for the diagnosis of GVHD.

  14. Computational approach to Riemann surfaces

    CERN Document Server

    Klein, Christian

    2011-01-01

    This volume offers a well-structured overview of existent computational approaches to Riemann surfaces and those currently in development. The authors of the contributions represent the groups providing publicly available numerical codes in this field. Thus this volume illustrates which software tools are available and how they can be used in practice. In addition, examples of solutions to partial differential equations and of applications in surface theory are presented. The intended audience of this book is twofold. It can be used as a textbook for a graduate course in numerics of Riemann surfaces, in which case the standard undergraduate background, i.e., calculus and linear algebra, is required. In particular, no knowledge of the theory of Riemann surfaces is expected; the necessary background in this theory is contained in the Introduction chapter. At the same time, this book is also intended for specialists in geometry and mathematical physics applying the theory of Riemann surfaces in their research. It is the first...

  15. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved to be effective in the solution of several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  16. Computational Design of Advanced Nuclear Fuels

    Energy Technology Data Exchange (ETDEWEB)

    Savrasov, Sergey [Univ. of California, Davis, CA (United States); Kotliar, Gabriel [Rutgers Univ., Piscataway, NJ (United States); Haule, Kristjan [Rutgers Univ., Piscataway, NJ (United States)

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, carbides, as well as some rare earth materials whose 4f electrons provide a simplified framework for understanding complex behavior of the f electrons. We addressed the issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer based simulations and avoid costly experiments.

  17. Computer Forensics Education - the Open Source Approach

    Science.gov (United States)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of the open source software tools in computer forensics education at tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process as opposed to familiarity with one software product, however complex and multi-functional. With the access to all source programs the students become more than just the consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that without exception more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain confidence to use a variety of tools, not just a single product they are familiar with.

  19. International Conference on Computers and Advanced Technology in Education

    CERN Document Server

    Advanced Information Technology in Education

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computers and Advanced Technology in Education. With the development of computers and advanced technology, human social activities are changing fundamentally. Education, and especially education reform in different countries, has benefited greatly from computers and advanced technology. Generally speaking, education is a field that needs ever more information, and computers, advanced technology, and the internet are good information providers. With their aid, education can be made an effective combination of all three. Therefore, computers and advanced technology should be regarded as important media in modern education. The volume Advanced Information Technology in Education provides a forum for researchers, educators, engineers, and government officials involved in the general areas of computers and advanced technology in education to d...

  20. Advanced control room evaluation: General approach and rationale

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.M. (Brookhaven National Lab., Upton, NY (USA)); Wachtel, J. (Nuclear Regulatory Commission, Washington, DC (USA))

    1991-01-01

    Advanced control rooms (ACRs) for future nuclear power plants (NPPs) are being designed utilizing computer-based technologies. The US Nuclear Regulatory Commission reviews the human engineering aspects of such control rooms to ensure that they are designed according to good human factors engineering principles and that operator performance and reliability are appropriately supported in order to protect public health and safety. This paper describes the rationale behind and general approach to the development of a human factors review guideline for ACRs. The factors influencing the guideline development are discussed, including the review environment, the types of advanced technologies being addressed, the human factors issues associated with advanced technology, and the current state of the art of human factors guidelines for advanced human-system interfaces (HSIs). The proposed approach to ACR review would track the design and implementation process through the application of review guidelines reflecting four review modules: planning, design process analysis, human factors engineering review, and dynamic performance evaluation. 21 refs.

  1. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2011-01-01

    The computing world today is in the middle of a revolution: mobile clients and cloud computing have emerged as the dominant paradigms driving programming and hardware innovation today. The Fifth Edition of Computer Architecture focuses on this dramatic shift, exploring the ways in which software and technology in the cloud are accessed by cell phones, tablets, laptops, and other mobile computing devices. Each chapter includes two real-world examples, one mobile and one datacenter, to illustrate this revolutionary change. Updated to cover the mobile computing revolution. Emphasizes the two most im...

  2. Advanced proton imaging in computed tomography

    CERN Document Server

    Mattiazzo, S; Giubilato, P; Pantano, D; Pozzobon, N; Snoeys, W; Wyss, J

    2015-01-01

    In recent years the use of hadrons for cancer radiation treatment has grown in importance, and many facilities are currently operational or under construction worldwide. To fully exploit the therapeutic advantages offered by hadron therapy, precise body imaging for accurate beam delivery is decisive. Proton computed tomography (pCT) scanners, currently in their R&D phase, provide the ultimate 3D imaging for hadron treatment guidance. A key component of a pCT scanner is the detector used to track the protons, which has a great impact on the scanner's performance and ultimately limits its maximum speed. In this article, a novel proton-tracking detector is presented that would have higher scanning speed, better spatial resolution and lower material budget with respect to present state-of-the-art detectors, leading to enhanced performance. This advancement in performance is achieved by employing the very latest developments in monolithic active pixel detectors (to build high granularity, low material budget, ...

  3. Transport modeling and advanced computer techniques

    International Nuclear Information System (INIS)

    A workshop was held at the University of Texas in June 1988 to consider the current state of transport codes and whether improved user interfaces would make the codes more usable and accessible to the fusion community. Also considered was the possibility that a software standard could be devised to ease the exchange of routines between groups. It was noted that two of the major obstacles to exchanging routines now are the variety of geometrical representations and choices of units. While the workshop formulated no standards, it was generally agreed that good software engineering would aid in the exchange of routines, and that a continued exchange of ideas between groups would be worthwhile. It seems that before we begin to discuss software standards we should review the current state of computer technology, both hardware and software, to see what influence recent advances might have on our software goals. This is done in this paper.

  4. Advanced Scientific Computing Research Network Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  5. Advanced Safeguards Approaches for New Reprocessing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Durst, Philip C.; Therios, Ike; Bean, Robert; Dougan, A.; Boyer, Brian; Wallace, Richard; Ehinger, Michael H.; Kovacic, Don N.; Tolk, K.

    2007-06-24

    U.S. efforts to promote the international expansion of nuclear energy through the Global Nuclear Energy Partnership (GNEP) will result in a dramatic expansion of nuclear fuel cycle facilities in the United States. New demonstration facilities, such as the Advanced Fuel Cycle Facility (AFCF), the Advanced Burner Reactor (ABR), and the Consolidated Fuel Treatment Center (CFTC) will use advanced nuclear and chemical process technologies that must incorporate increased proliferation resistance to enhance nuclear safeguards. The ASA-100 Project, “Advanced Safeguards Approaches for New Nuclear Fuel Cycle Facilities,” commissioned by the NA-243 Office of NNSA, has been tasked with reviewing and developing advanced safeguards approaches for these demonstration facilities. Because one goal of GNEP is developing and sharing proliferation-resistant nuclear technology and services with partner nations, the safeguards approaches considered are consistent with international safeguards as currently implemented by the International Atomic Energy Agency (IAEA). This first report reviews possible safeguards approaches for the new fuel reprocessing processes to be deployed at the AFCF and CFTC facilities. Similar analyses addressing the ABR and transuranic (TRU) fuel fabrication lines at AFCF and CFTC will be presented in subsequent reports.

  6. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    Science.gov (United States)

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  7. OPENING REMARKS: Scientific Discovery through Advanced Computing

    Science.gov (United States)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about $70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such

  8. Soft Computing Approaches To Fault Tolerant Systems

    Directory of Open Access Journals (Sweden)

    Neeraj Prakash Srivastava

    2014-05-01

    We present in this paper an introduction to soft computing techniques for fault-tolerant systems, together with the relevant terminology and the different ways of achieving fault tolerance. The paper focuses on the problem of fault tolerance using soft computing techniques. The fundamentals of soft computing approaches and their types are discussed, along with an introduction to fault tolerance. The main objective is to show how to implement soft computing approaches for fault detection, isolation and identification. The paper concludes with an application of a wireless sensor network as a fault-tolerant system.

  9. DNA Reservoir Computing: A Novel Molecular Computing Approach

    CERN Document Server

    Goudarzi, Alireza; Stefanovic, Darko

    2013-01-01

    We propose a novel molecular computing approach based on reservoir computing. In reservoir computing, a dynamical core, called a reservoir, is perturbed with an external input signal while a readout layer maps the reservoir dynamics to a target output. Computation takes place as a transformation from the input space to a high-dimensional spatiotemporal feature space created by the transient dynamics of the reservoir. The readout layer then combines these features to produce the target output. We show that coupled deoxyribozyme oscillators can act as the reservoir, and that despite using only three coupled oscillators, a molecular reservoir computer can achieve 90% accuracy on a benchmark temporal problem.
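
    The reservoir-computing scheme is easiest to see in a conventional, non-molecular sketch. Below is a minimal echo-state-network analogue of the setup described above (reservoir size, scaling, and the toy two-step recall task are assumptions; the paper's reservoir is built from deoxyribozyme oscillators, not tanh units):

        import numpy as np

        rng = np.random.default_rng(0)
        n_res = 100                                  # reservoir size (assumed)

        # fixed random reservoir, scaled so the transient dynamics stay stable
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # spectral radius 0.9
        W_in = rng.uniform(-1.0, 1.0, n_res)

        def run_reservoir(u):
            # perturb the dynamical core with the input signal, collect states
            x = np.zeros(n_res)
            states = []
            for u_t in u:
                x = np.tanh(W @ x + W_in * u_t)
                states.append(x.copy())
            return np.array(states)

        # train only the linear readout (ridge regression) on a toy temporal
        # task: reproduce the input from two steps earlier
        u = rng.uniform(-1.0, 1.0, 500)
        y = np.roll(u, 2)
        X = run_reservoir(u)
        W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
        prediction = X @ W_out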

  10. Ontological Approach toward Cybersecurity in Cloud Computing

    OpenAIRE

    Takahashi, Takeshi; Kadobayashi, Youki; FUJIWARA, HIROYUKI

    2014-01-01

    Widespread deployment of the Internet enabled building of an emerging IT delivery model, i.e., cloud computing. Albeit cloud computing-based services have rapidly developed, their security aspects are still at the initial stage of development. In order to preserve cybersecurity in cloud computing, cybersecurity information that will be exchanged within it needs to be identified and discussed. For this purpose, we propose an ontological approach to cybersecurity in cloud computing. We build an...

  11. GRID COMPUTING AND CHECKPOINT APPROACH

    Directory of Open Access Journals (Sweden)

    Pankaj gupta

    2011-05-01

    Grid computing is a means of allocating the computational power of a large number of computers to complex, difficult computations or problems. It is a distributed computing paradigm that differs from traditional distributed computing in that it is aimed at large-scale systems that even span organizational boundaries. In this paper we investigate the different fault tolerance techniques used in many real-time distributed systems. The main focus is on the types of fault occurring in the system, fault detection techniques, and the recovery techniques used. A fault, whether caused by link failure, resource failure, or any other reason, must be tolerated for the system to keep working smoothly and accurately, and such faults can be detected and recovered from by a range of techniques. An appropriate fault detector can avoid losses due to system crashes, and a reliable fault tolerance technique can save the system from failure. This paper shows how these methods are applied to detect and tolerate faults in various real-time distributed systems. The advantages of utilizing checkpointing functionality are obvious; however, so far the Grid community has not developed a widely accepted standard that would allow the Grid environment to consciously utilize low-level checkpointing packages. Therefore, such a standard, named the Grid Checkpointing Architecture, is being designed. The fault tolerance mechanism used here sets the job checkpoints based on the resource failure rate. If a resource failure occurs, the job is restarted from its last successful state using a checkpoint file from another grid resource. A critical aspect of automatic recovery is the availability of checkpoint files, and a strategy to increase their availability is replication. A grid is a form of distributed computing used mainly to virtualize and utilize geographically distributed idle resources. A grid is a distributed computational and storage environment often composed of...
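
    The rate-based checkpoint policy described above can be rendered as a toy simulation. In the sketch below, the interval rule uses Young's classic approximation as an assumption (the abstract names no formula), and a dictionary stands in for a checkpoint replica held on another grid resource:

        import random

        def checkpoint_interval(failure_rate, checkpoint_cost=10.0):
            # Young's approximation: interval ~ sqrt(2 * cost / failure_rate),
            # so less reliable resources are checkpointed more often
            return max(1, int((2.0 * checkpoint_cost / failure_rate) ** 0.5))

        def run_job(total_steps, failure_rate, store):
            step = store.get("step", 0)          # resume from last checkpoint
            interval = checkpoint_interval(failure_rate)
            while step < total_steps:
                if random.random() < failure_rate:
                    step = store.get("step", 0)  # failure: restart from replica
                    continue
                step += 1                        # one unit of useful work
                if step % interval == 0:
                    store["step"] = step         # replicate the checkpoint file
            return step

        replica = {}                             # stands in for a remote checkpoint
        run_job(total_steps=1000, failure_rate=0.01, store=replica)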

  12. Advanced intelligent computational technologies and decision support systems

    CERN Document Server

    Kountchev, Roumen

    2014-01-01

    This book offers a state of the art collection covering themes related to Advanced Intelligent Computational Technologies and Decision Support Systems which can be applied to fields like healthcare assisting the humans in solving problems. The book brings forward a wealth of ideas, algorithms and case studies in themes like: intelligent predictive diagnosis; intelligent analyzing of medical images; new format for coding of single and sequences of medical images; Medical Decision Support Systems; diagnosis of Down’s syndrome; computational perspectives for electronic fetal monitoring; efficient compression of CT Images; adaptive interpolation and halftoning for medical images; applications of artificial neural networks for real-life problems solving; present and perspectives for Electronic Healthcare Record Systems; adaptive approaches for noise reduction in sequences of CT images etc.

  13. Immune based computer virus detection approaches

    Institute of Scientific and Technical Information of China (English)

    TAN Ying; ZHANG Pengtao

    2013-01-01

    The computer virus is considered one of the most horrifying threats to the security of computer systems worldwide. The rapid development of evasion techniques used in viruses causes signature-based computer virus detection techniques to be ineffective. Many novel computer virus detection approaches have been proposed in the past to cope with this ineffectiveness, mainly classified into three categories: static, dynamic and heuristic techniques. Given the natural similarities between the biological immune system (BIS) and computer security systems (CSS), the artificial immune system (AIS) was developed as a new prototype in the anti-virus research community. The immune mechanisms in the BIS provide opportunities to construct computer virus detection models that are robust and adaptive, with the ability to detect unseen viruses. In this paper, a variety of classic computer virus detection approaches are introduced and reviewed against the background of computer virus history. Next, a variety of immune-based computer virus detection approaches are discussed in detail. Promising experimental results suggest that immune-based computer virus detection approaches are able to detect new variants and unseen viruses at lower false positive rates, which has paved a new way for anti-virus research.
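
    Of the immune mechanisms such reviews cover, negative selection is the simplest to sketch. The toy below (bit-string program profiles, sizes and thresholds are all hypothetical) keeps only random detectors that match no benign "self" sample, so the survivors flag unseen non-self code:

        import random

        def hamming(a, b):
            return sum(x != y for x, y in zip(a, b))

        def generate_detectors(self_set, n_detectors, length, threshold):
            # negative selection: discard random detectors that match any
            # benign ("self") pattern; the survivors recognise non-self
            detectors = []
            while len(detectors) < n_detectors:
                d = [random.randint(0, 1) for _ in range(length)]
                if all(hamming(d, s) > threshold for s in self_set):
                    detectors.append(d)
            return detectors

        def is_anomalous(sample, detectors, threshold):
            return any(hamming(sample, d) <= threshold for d in detectors)

        # toy 16-bit profiles of benign programs (hypothetical data)
        self_set = [[random.randint(0, 1) for _ in range(16)] for _ in range(50)]
        detectors = generate_detectors(self_set, n_detectors=20, length=16, threshold=3)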

  14. COMPUTATIONAL APPROACH TO ORGANIZATIONAL DESIGN

    OpenAIRE

    Alexander Arenas; Roger Guimera; Joan R. Alabart; Hans-Joerg Witt; Albert Diaz-Guilera

    2000-01-01

    The idea of the work is to propose an abstract and simple enough agent-based model of company dynamics, in order to be able to deal computationally, and even analytically, with the problem of organizational design. Nevertheless, the model should be able to reproduce the essential characteristics of real organizations. The natural way of modeling a company is as a network where the nodes represent employees and the links between them represent communication lines. In our model, problems ar...

  15. Mobile Cloud Computing: A Review on Smartphone Augmentation Approaches

    CERN Document Server

    Abolfazli, Saeid; Gani, Abdullah

    2012-01-01

    Smartphones have recently gained significant popularity in heavy mobile processing while users are increasing their expectations toward a rich computing experience. However, resource limitations and current mobile computing advancements hinder this vision. Therefore, resource-intensive application execution remains a challenging task in mobile computing that necessitates device augmentation. In this article, smartphone augmentation approaches are reviewed and classified in two main groups, namely hardware and software. Generating high-end hardware is a subset of hardware augmentation approaches, whereas conserving local resources and reducing resource requirements are grouped under software augmentation methods. Our study advocates that conserving smartphones' native resources, which is mainly done via task offloading, is more appropriate for already-developed applications than new ones, due to the costly re-development process. Cloud computing has recently obtained momentous ground as one of the major co...

  16. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2007-01-01

    The era of seemingly unlimited growth in processor performance is over: single chip architectures can no longer overcome the performance limitations imposed by the power they consume and the heat they generate. Today, Intel and other semiconductor firms are abandoning the single fast processor model in favor of multi-core microprocessors--chips that combine two or more processors in a single package. In the fourth edition of Computer Architecture, the authors focus on this historic shift, increasing their coverage of multiprocessors and exploring the most effective ways of achieving parallelis

  17. Computational approaches for systems metabolomics.

    Science.gov (United States)

    Krumsiek, Jan; Bartel, Jörg; Theis, Fabian J

    2016-06-01

    Systems genetics is defined as the simultaneous assessment and analysis of multi-omics datasets. In the past few years, metabolomics has been established as a robust tool describing an important functional layer in this approach. The metabolome of a biological system represents an integrated state of genetic and environmental factors and has been referred to as a 'link between genotype and phenotype'. In this review, we summarize recent progress in statistical analysis methods for metabolomics data in combination with other omics layers. We put a special focus on complex, multivariate statistical approaches as well as pathway-based and network-based analysis methods. Moreover, we outline current challenges and pitfalls of metabolomics-focused multi-omics analyses and discuss future steps for the field.
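
    One network-based method in this family, the Gaussian graphical model, fits in a few lines: edges are drawn between metabolites whose partial correlation (derived from the inverse covariance matrix) is large. The hard threshold and toy data below are assumptions; in practice edge significance is tested:

        import numpy as np

        def partial_correlation_network(data, threshold=0.2):
            # Gaussian graphical model: partial correlations come from the
            # inverse covariance (precision) matrix; keep edges above threshold
            precision = np.linalg.pinv(np.cov(data, rowvar=False))
            d = np.sqrt(np.diag(precision))
            pcor = -precision / np.outer(d, d)
            np.fill_diagonal(pcor, 1.0)
            n = pcor.shape[0]
            return [(i, j, pcor[i, j]) for i in range(n)
                    for j in range(i + 1, n) if abs(pcor[i, j]) > threshold]

        # toy data: 200 samples x 10 metabolites
        rng = np.random.default_rng(1)
        edges = partial_correlation_network(rng.normal(size=(200, 10)))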

  18. Science based integrated approach to advanced nuclear fuel development - vision, approach, and overview

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory; Pasamehmetoglu, Kemal [IDAHO NATIONAL LAB; Carmack, Jon [IDAHO NATIONAL LAB

    2010-01-01

    Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model nuclear fuel systems is critical. In order to understand specific aspects of nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding of, and predictive capability for simulating, the phase and microstructural behavior of the nuclear fuel system materials and matrices. The purpose of this paper is to identify the modeling and simulation approach needed to deliver predictive tools for advanced fuels development. The coordination between experimental nuclear fuel design and development technical experts and computational fuel modeling and simulation technical experts is a critical aspect of the approach; it naturally leads to an integrated, goal-oriented, science-based R&D approach and strengthens both the experimental and computational efforts. The Advanced Fuels Campaign (AFC) and the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Integrated Performance and Safety Code (IPSC) teams are working together to determine experimental data and modeling needs. The primary objective of the NEAMS Fuels IPSC project is to deliver a coupled, three-dimensional, predictive computational platform for modeling the fabrication and both normal and abnormal operation of nuclear fuel pins and assemblies, applicable to both existing and future reactor fuel designs. The science-based program is pursuing the development of an integrated multi-scale and multi-physics modeling and simulation platform for nuclear fuels. This overview paper discusses the vision, goals, and approaches for developing and implementing this new approach.

  19. Learning and geometry computational approaches

    CERN Document Server

    Smith, Carl

    1996-01-01

    The field of computational learning theory arose out of the desire to formally understand the process of learning. As potential applications to artificial intelligence became apparent, the new field grew rapidly. The learning of geometric objects became a natural area of study. The possibility of using learning techniques to compensate for unsolvability provided an attraction for individuals with an immediate need to solve such difficult problems. Researchers at the Center for Night Vision were interested in solving the problem of interpreting data produced by a variety of sensors. Current vision techniques, which have a strong geometric component, can be used to extract features. However, these techniques fall short of useful recognition of the sensed objects. One potential solution is to incorporate learning techniques into the geometric manipulation of sensor data. As a first step toward realizing such a solution, the Systems Research Center at the University of Maryland, in conjunction with the C...

  20. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at the various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn challenges computer scientists to offer matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, the integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would fit best to manage drug discovery and clinical development data, generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  1. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an

  2. ADVANCES AT A GLANCE IN PARALLEL COMPUTING

    Directory of Open Access Journals (Sweden)

    RAJKUMAR SHARMA

    2014-07-01

    In the history of the computational world, sequential uni-processor computers were exploited for years to solve scientific and business problems. To satisfy the demand of compute- and data-hungry applications, it was observed that better response times can be achieved only through parallelism. Large computational problems were partitioned and solved by using multiple CPUs in parallel. Computing performance was further improved by adopting multi-core architectures, which provide hardware parallelism through the use of multiple cores. Efficient resource utilization of a parallel computing environment, using both software and hardware parallelism, is a major research challenge. Present hardware technologies give algorithm developers the freedom to control and manage resources through software, such as threads-to-cores mapping in recent multi-core processors. In this paper, a survey is presented covering parallel computing from its beginnings up to the use of present state-of-the-art multi-core processors.

  3. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements of big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and its applications in handling real life problems. The applications are mostly undertaken from real life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  4. UNEDF: Advanced Scientific Computing Transforms the Low-Energy Nuclear Many-Body Problem

    CERN Document Server

    Stoitsov, M; Nazarewicz, W; Bulgac, A; Hagen, G; Kortelainen, M; Pei, J C; Roche, K J; Schunck, N; Thompson, I; Vary, J P; Wild, S M

    2011-01-01

    The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper illustrates significant milestones accomplished by UNEDF through integration of the theoretical approaches, advanced numerical algorithms, and leadership class computational resources.

  5. Advanced Technologies, Embedded and Multimedia for Human-Centric Computing

    CERN Document Server

    Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013

    2014-01-01

    The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications, and the conference provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. The theme of EMC (Advances in Embedded and Multimedia Computing) is focused on the various aspects of embedded systems, smart grid, cloud and multimedia computing, and it provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of embedded and multimedia computing. This book therefore includes various theories and practical applications in human-centric computing and embedded and multimedia computing.

  6. The ACP (Advanced Computer Program) multiprocessor system at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Case, G.; Cook, A.; Fischler, M.; Gaines, I.; Hance, R.; Husby, D.

    1986-09-01

    The Advanced Computer Program at Fermilab has developed a multiprocessor system which is easy to use and uniquely cost effective for many high energy physics problems. The system is based on single board computers which cost under $2000 each to build, including 2 Mbytes of on-board memory. These standard VME modules each run experiment reconstruction code in Fortran at speeds approaching that of a VAX 11/780. Two versions have been developed: one uses Motorola's 68020 32-bit microprocessor, the other runs with AT&T's 32100. Both include the corresponding floating point coprocessor chip. The first system, when fully configured, uses 70 each of the two types of processors. A 53-processor system has been operated for several months with essentially no down time by computer operators in the Fermilab Computer Center, performing at nearly the capacity of 6 CDC Cyber 175 mainframe computers. The VME crates in which the processing "nodes" sit are connected via a high speed "Branch Bus" to one or more MicroVAX computers which act as hosts, handling system resource management and all I/O in offline applications. An interface from Fastbus to the Branch Bus has been developed for online use which has been tested error free at 20 Mbytes/sec for 48 hours. ACP hardware modules are now available commercially. A major package of software, including a simulator that runs on any VAX, has been developed. It allows easy migration of existing programs to this multiprocessor environment. This paper describes the ACP Multiprocessor System and early experience with it at Fermilab and elsewhere.

  7. GRID COMPUTING AND FAULT TOLERANCE APPROACH

    Directory of Open Access Journals (Sweden)

    Pankaj Gupta,

    2011-10-01

    Grid computing is a means of allocating the computational power of a large number of computers to complex, difficult computations or problems. It is a distributed computing paradigm that differs from traditional distributed computing in that it is aimed at large-scale systems that even span organizational boundaries. This paper proposes a method to achieve maximum fault tolerance in the Grid environment by considering reliability, using a replication approach and a checkpoint approach. Fault tolerance is an important property for large-scale computational grid systems, where geographically distributed nodes co-operate to execute a task. In order to achieve a high level of reliability and availability, the grid infrastructure should be foolproof and fault tolerant. Since the failure of resources affects job execution fatally, a fault tolerance service is essential to satisfy QoS requirements in grid computing. Commonly utilized techniques for providing fault tolerance are job checkpointing and replication. Both techniques mitigate the amount of work lost due to changing system availability but can introduce significant runtime overhead. The latter largely depends on the length of the checkpointing interval and the chosen number of replicas, respectively. In the case of complex scientific workflows, where tasks can execute in a well-defined order, reliability is another major challenge because of the unreliable nature of grid resources.
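
    Under an independence assumption, the replication side of this trade-off reduces to simple arithmetic: the chance that at least one of r replicas finishes is 1 - (1 - p)^r. The helper below is a hypothetical illustration, not code from the paper:

        def replicas_needed(p_success, target):
            # smallest replica count whose combined reliability meets the target
            r, reliability = 0, 0.0
            while reliability < target:
                r += 1
                reliability = 1.0 - (1.0 - p_success) ** r
            return r, reliability

        # a resource that completes a job 80% of the time needs 5 replicas
        # to exceed 99.9% reliability (1 - 0.2**5 = 0.99968)
        r, rel = replicas_needed(p_success=0.8, target=0.999)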

  8. Transonic wing analysis using advanced computational methods

    Science.gov (United States)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  9. Computational biomechanics for medicine new approaches and new applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2015-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises twelve of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, France, Spain and Switzerland. Some of the interesting topics discussed are: real-time simulations; growth and remodelling of soft tissues; inverse and meshless solutions; medical image analysis; and patient-specific solid mechanics simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  10. Second International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Konar, Amit; Chakraborty, Aruna

    2014-01-01

    Advanced Computing, Networking and Informatics are three distinct and mutually exclusive disciplines of knowledge with no apparent sharing/overlap among them. However, their convergence is observed in many real world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, pervasive computing amidst many others. This two-volume proceedings explore the combined use of Advanced Computing and Informatics in the next generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 148 scholarly papers, which have been accepted for presentation from over 640 submissions in the second International Conference on Advanced Computing, Networking and Informatics, 2014, held in Kolkata, India during June 24-26, 2014. The first volume includes innovative computing techniques and relevant research results in informatics with selective applications in pattern recognition, signal/image process...

  11. Advances in Future Computer and Control Systems v.2

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems(FCCS2012)

    2012-01-01

    FCCS2012 is an integrated conference concentrating its focus on Future Computer and Control Systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, including recent research results on Future Computer and Control Systems from researchers all around the world.

  12. Advances in Future Computer and Control Systems v.1

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems(FCCS2012)

    2012-01-01

    FCCS2012 is an integrated conference concentrating its focus on Future Computer and Control Systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, including recent research results on Future Computer and Control Systems from researchers all around the world.

  13. Advances in computing, and their impact on scientific computing.

    Science.gov (United States)

    Giles, Mike

    2002-01-01

    This paper begins by discussing the developments and trends in computer hardware, starting with the basic components (microprocessors, memory, disks, system interconnect, networking and visualization) before looking at complete systems (death of vector supercomputing, slow demise of large shared-memory systems, rapid growth in very large clusters of PCs). It then considers the software side, the relative maturity of shared-memory (OpenMP) and distributed-memory (MPI) programming environments, and new developments in 'grid computing'. Finally, it touches on the increasing importance of software packages in scientific computing, and the increased importance and difficulty of introducing good software engineering practices into very large academic software development projects. PMID:12539947
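
    The distributed-memory (MPI) model contrasted with OpenMP above is illustrated by the minimal sketch below, written with mpi4py so that all examples here stay in one language (an assumption; the paper discusses MPI generically). Run with, e.g., mpiexec -n 4 python partial_sum.py:

        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        size = comm.Get_size()

        # each rank sums a strided slice of 1..n; partial results are
        # then reduced onto rank 0
        n = 1000
        local = sum(range(rank + 1, n + 1, size))
        total = comm.reduce(local, op=MPI.SUM, root=0)

        if rank == 0:
            print("sum(1..%d) = %d" % (n, total))   # 500500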

  14. Power-efficient computer architectures recent advances

    CERN Document Server

    Själander, Magnus; Kaxiras, Stefanos

    2014-01-01

    As Moore's Law and Dennard scaling trends have slowed, the challenges of building high-performance computer architectures while maintaining acceptable power efficiency levels have heightened. Over the past ten years, architecture techniques for power efficiency have shifted from primarily focusing on module-level efficiencies, toward more holistic design styles based on parallelism and heterogeneity. This work highlights and synthesizes recent techniques and trends in power-efficient computer architecture.Table of Contents: Introduction / Voltage and Frequency Management / Heterogeneity and Sp

  15. Preface: Special issue: ten years of advances in computer entertainment

    NARCIS (Netherlands)

    Katayose, Haruhiro; Reidsma, Dennis; Rauterberg, M

    2014-01-01

    This special issue celebrates the 10th edition of the International Conference on Advances in Computer Entertainment (ACE) by collecting six selected and revised papers from among this year’s accepted contributions.

  16. Fragment-based approaches and computer-aided drug discovery.

    Science.gov (United States)

    Rognan, Didier

    2012-01-01

    Fragment-based design has significantly modified drug discovery strategies and paradigms in the last decade. Besides technological advances and novel therapeutic avenues, one of the most significant changes brought by this new discipline has occurred in the minds of drug designers. Fragment-based approaches have markedly impacted rational computer-aided design both in method development and in applications. The present review illustrates the importance of molecular fragments in many aspects of rational ligand design, and discusses how thinking in "fragment space" has boosted computational biology and chemistry. PMID:21710380

  17. Computational approaches for rational design of proteins with novel functionalities

    DEFF Research Database (Denmark)

    Tiwari, Manish Kumar; Singh, Ranjitha; Singh, Raushan Kumar;

    2012-01-01

    Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has...... exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein designing has recently succeeded at engineering target activities. Here, we review recently reported de novo...... functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes....

  18. 3rd International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Chaki, Nabendu

    2016-01-01

    Advanced Computing, Networking and Informatics are three distinct and mutually exclusive disciplines of knowledge with no apparent sharing/overlap among them. However, their convergence is observed in many real world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, pervasive computing amidst many others. This two volume proceedings explore the combined use of Advanced Computing and Informatics in the next generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 132 scholarly articles, which have been accepted for presentation from over 550 submissions in the Third International Conference on Advanced Computing, Networking and Informatics, 2015, held in Bhubaneswar, India during June 23–25, 2015.

  19. Towards Lagrangian approach to quantum computations

    CERN Document Server

    Vlasov, A Yu

    2003-01-01

    In this work the possibility and actuality of a Lagrangian approach to quantum computations is discussed. The finite-dimensional Hilbert spaces used in this area pose some challenges for such a consideration. The model discussed here can be considered an analogue of the Weyl quantization of field theory via the path integral in L. D. Faddeev's approach. Weyl quantization can also be used in the finite-dimensional case, and some formulas may simply be rewritten by changing integrals to finite sums. On the other hand, there are specific difficulties relevant to the finite case. This work has some connections with the phase-space models of quantum computation developed recently by different authors.
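
    The passage from integrals to finite sums can be made concrete with the standard discrete Weyl pair on a d-dimensional space (a textbook construction shown for illustration, not the paper's own formulas):

        \[
          X\,|k\rangle = |k+1 \bmod d\rangle, \qquad
          Z\,|k\rangle = \omega^{k}|k\rangle, \qquad
          \omega = e^{2\pi i/d}, \qquad ZX = \omega\,XZ .
        \]
        Since the $d^2$ operators $W_{ab} = X^{a}Z^{b}$ satisfy
        $\operatorname{Tr}[W_{ab}^{\dagger}W_{a'b'}] = d\,\delta_{aa'}\delta_{bb'}$,
        any operator $A$ is recovered by a finite sum in place of a
        phase-space integral:
        \[
          A = \frac{1}{d}\sum_{a,b=0}^{d-1}
              \operatorname{Tr}\bigl[W_{ab}^{\dagger}A\bigr]\,W_{ab}.
        \]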

  20. Advances in Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), held in Zhuhai, China, November 19-20, 2011. Topics covered in this volume include signal and image processing, speech and audio processing, video processing and analysis, artificial intelligence, computing and intelligent systems, machine learning, sensor and neural networks, knowledge discovery and data mining, fuzzy mathematics and applications, knowledge-based systems, hybrid systems modeling and design, risk analysis and management, and system modeling and simulation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find it stimulating in the process.

  1. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  2. Statistical Physics An Advanced Approach with Applications

    CERN Document Server

    Honerkamp, Josef

    2012-01-01

    The application of statistical methods to physics is essential. This unique book on statistical physics offers an advanced approach with numerous applications to the modern problems students are confronted with. Therefore the text contains more concepts and methods in statistics than the student would need for statistical mechanics alone. Methods from mathematical statistics and stochastics for the analysis of data are discussed as well. The book is divided into two parts, focusing first on the modeling of statistical systems and then on the analysis of these systems. Problems with hints for solution help the students to deepen their knowledge. The third edition has been updated and enlarged with new sections deepening the knowledge about data analysis. Moreover, a customized set of  problems with solutions is accessible on the Web at extras.springer.com.

  3. Advanced Approach of Multiagent Based Buoy Communication

    Directory of Open Access Journals (Sweden)

    Gediminas Gricius

    2015-01-01

    Usually, a hydrometeorological information system is faced with great data flows, but the data levels are often excessive, depending on the observed region of the water. The paper presents advanced buoy communication technologies based on multiagent interaction and data exchange between several monitoring system nodes. The proposed management of buoy communication is based on a clustering algorithm, which enables the performance of the hydrometeorological information system to be enhanced. The experiment is based on the design and analysis of an inexpensive but reliable Baltic Sea autonomous monitoring network (buoys), which would be able to continuously monitor and collect temperature, waviness, and other required data. The proposed approach of multiagent-based buoy communication enables all the data from the coastal-based station to be monitored with limited transmission speed by setting different tasks for the agent-based buoy system according to the clustering information.
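
    A minimal reading of the clustering-based management is plain k-means over buoy positions, after which each cluster can elect a head buoy to relay its data ashore. Everything below (coordinates, cluster count, the election rule) is hypothetical:

        import random

        def dist2(a, b):
            return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

        def mean(pts):
            return (sum(p[0] for p in pts) / len(pts),
                    sum(p[1] for p in pts) / len(pts))

        def kmeans(points, k, iters=20):
            # group buoys by position; a cluster head (the buoy nearest its
            # centroid) could then aggregate and forward the cluster's data
            centroids = random.sample(points, k)
            for _ in range(iters):
                clusters = [[] for _ in range(k)]
                for p in points:
                    nearest = min(range(k), key=lambda c: dist2(p, centroids[c]))
                    clusters[nearest].append(p)
                centroids = [mean(c) if c else centroids[i]
                             for i, c in enumerate(clusters)]
            return clusters, centroids

        # 30 buoys scattered over a 100 km x 100 km patch of sea (toy data)
        buoys = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(30)]
        clusters, centroids = kmeans(buoys, k=4)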

  4. Advances in Computing and Information Technology : Proceedings of the Second International Conference on Advances in Computing and Information Technology

    CERN Document Server

    Nagamalai, Dhinaharan; Chaki, Nabendu

    2013-01-01

    The International Conference on Advances in Computing and Information Technology (ACITY 2012) provides an excellent international forum for both academics and professionals to share knowledge and results in the theory, methodology and applications of Computer Science and Information Technology. The Second International Conference on Advances in Computing and Information Technology (ACITY 2012), held in Chennai, India, during July 13-15, 2012, covered a number of topics in all major fields of Computer Science and Information Technology, including: networking and communications, network security and applications, web and internet computing, ubiquitous computing, algorithms, bioinformatics, digital image processing and pattern recognition, artificial intelligence, soft computing and applications. Following a rigorous review process, a number of high-quality papers, presenting not only innovative ideas but also well-founded evaluations and strong argumentation, were selected and collected in the present proceedings, ...

  5. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis, in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
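
    The probabilistic alternative to safety factors starts from the failure probability P[g(X) < 0], where g is a limit-state function. The brute-force Monte Carlo baseline is sketched below with toy capacity/demand numbers (assumed); the advanced methods the paper surveys aim to reach the same answer far more cheaply:

        import random

        def mc_failure_probability(limit_state, sampler, n=100_000):
            # direct Monte Carlo estimate of P[g(X) < 0]
            failures = sum(1 for _ in range(n) if limit_state(sampler()) < 0)
            return failures / n

        def sampler():
            # capacity R and demand S, both normal (hypothetical parameters)
            return random.gauss(10.0, 1.0), random.gauss(7.0, 1.5)

        def g(x):
            r, s = x
            return r - s          # failure when demand exceeds capacity

        pf = mc_failure_probability(g, sampler)   # about 0.048 for these numbers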

  6. Extending the horizons advances in computing, optimization, and decision technologies

    CERN Document Server

    Joseph, Anito; Mehrotra, Anuj; Trick, Michael

    2007-01-01

    Computer Science and Operations Research continue to have a synergistic relationship and this book represents the results of cross-fertilization between OR/MS and CS/AI. It is this interface of OR/CS that makes possible advances that could not have been achieved in isolation. Taken collectively, these articles are indicative of the state-of-the-art in the interface between OR/MS and CS/AI and of the high caliber of research being conducted by members of the INFORMS Computing Society. EXTENDING THE HORIZONS: Advances in Computing, Optimization, and Decision Technologies is a volume that presents the latest, leading research in the design and analysis of algorithms, computational optimization, heuristic search and learning, modeling languages, parallel and distributed computing, simulation, computational logic and visualization. This volume also emphasizes a variety of novel applications in the interface of CS, AI, and OR/MS.

  7. Advances in Computer Science and Education

    CERN Document Server

    Huang, Xiong

    2012-01-01

    CSE2011 is an integrated conference concentrating its focus on computer science and education. In the proceedings, readers can learn much about computer science and education from researchers all around the world. The main role of the proceedings is to serve as an exchange platform for researchers working in the mentioned fields. In order to meet the high quality standards of Springer's AISC series, the organization committee made the following efforts. Firstly, poor-quality papers were rejected after review by anonymous referee experts. Secondly, periodic review meetings were held among the reviewers about five times to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  8. Advances in computational fluid dynamics solvers for modern computing environments

    Science.gov (United States)

    Hertenstein, Daniel; Humphrey, John R.; Paolini, Aaron L.; Kelmelis, Eric J.

    2013-05-01

    EM Photonics has been investigating the application of massively multicore processors to a key problem area: Computational Fluid Dynamics (CFD). While the capabilities of CFD solvers have continually increased and improved to support features such as moving bodies and adjoint-based mesh adaptation, the software architecture has often lagged behind. This has led to poor scaling as core counts reach the tens of thousands. In the modern High Performance Computing (HPC) world, clusters with hundreds of thousands of cores are becoming the standard. In addition, accelerator devices such as NVIDIA GPUs and Intel Xeon Phi are being installed in many new systems. It is important for CFD solvers to take advantage of the new hardware as the computations involved are well suited for the massively multicore architecture. In our work, we demonstrate that new features in NVIDIA GPUs are able to empower existing CFD solvers by example using AVUS, a CFD solver developed by the Air Force Research Laboratory (AFRL) and the Volcanic Ash Advisory Center (VAAC). The effort has resulted in increased performance and scalability without sacrificing accuracy. There are many well-known codes in the CFD space that can benefit from this work, such as FUN3D, OVERFLOW, and TetrUSS. Such codes are widely used in the commercial, government, and defense sectors.

  9. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of the hybrid soft computing methodology in general with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis,  (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters enhance the exhibition of the hybrid soft computing paradigm in the field of intelligent computing.

  10. Advanced Safeguards Approaches for New Fast Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Durst, Philip C.; Therios, Ike; Bean, Robert; Dougan, A.; Boyer, Brian; Wallace, Rick L.; Ehinger, Michael H.; Kovacic, Don N.; Tolk, K.

    2007-12-15

    This third report in the series reviews possible safeguards approaches for new fast reactors in general, and the ABR in particular. Fast-neutron spectrum reactors have been used since the early 1960s on an experimental and developmental level, generally with fertile blanket fuels to “breed” nuclear fuel such as plutonium. Whether the reactor is designed to breed plutonium, or transmute and “burn” actinides, depends mainly on the design of the reactor neutron reflector and whether the blanket fuel is “fertile” or suitable for transmutation. However, the safeguards issues are very similar, since they pertain mainly to the receipt, shipment and storage of fresh and spent plutonium and actinide-bearing “TRU” fuel. For these reasons, the design of existing fast reactors and details concerning how they have been safeguarded were studied in developing advanced safeguards approaches for the new fast reactors. In this regard, the design of the Experimental Breeder Reactor-II (EBR-II) at the Idaho National Laboratory (INL) was of interest, because it was designed as a collocated fast reactor with a pyrometallurgical reprocessing and fuel fabrication line – a design option being considered for the ABR. Similarly, the design of the Fast Flux Test Facility (FFTF) on the Hanford Site was studied, because it was a successful prototype fast reactor that ran for two decades to evaluate fuels and the design for commercial-scale fast reactors.

  11. Advances on interdisciplinary approaches to urban carbon

    Science.gov (United States)

    Romero-Lankao, P.

    2015-12-01

    North American urban areas are emerging as climate policy and technology innovators, urbanization process laboratories, founts of carbon-relevant experiments, hubs for grass-roots mobilization, and centers for civil-society experiments to curb carbon emissions and avoid widespread and irreversible climate impacts. Since SOCCR, diverse lines of inquiry on urbanization, urban areas and the carbon cycle have advanced our understanding of some of the societal processes through which energy and land uses affect carbon. This presentation provides an overview of these diverse perspectives. It suggests the need for approaches that complement and combine the plethora of existing insights into interdisciplinary explorations of how different urbanization processes, and the socio-ecological and technological components of urban areas, affect the spatial and temporal patterns of carbon emissions, differentially over time and within and across cities. It also calls for a more holistic approach to examining the carbon implications of urbanization and urban areas as places, based not only on demographics or income, but also on such other interconnected features of urban development pathways as urban form, economic function, economic growth policies and climate policies.

  12. Uncertainty quantification approaches for advanced reactor analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be applied equally well to analyses for high-temperature gas-cooled reactors and liquid metal reactors, and to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
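
    As an illustrative aside (our addition, not part of the report): the 95/95 criterion above is commonly met with nonparametric order statistics, where Wilks' formula gives the minimum number of best-estimate code runs required. A minimal Python sketch, assuming the standard one-sided formulation:

    ```python
    # Our illustration (not from the report): Wilks' nonparametric sample sizes
    # for a one-sided tolerance limit with coverage beta and confidence gamma.
    from math import ceil, comb, log

    def wilks_first_order(beta=0.95, gamma=0.95):
        """Smallest N such that the largest of N runs bounds the beta-quantile
        with confidence gamma: 1 - beta**N >= gamma."""
        return ceil(log(1.0 - gamma) / log(beta))

    def wilks_kth_largest(k, beta=0.95, gamma=0.95):
        """Smallest N such that the k-th largest of N runs is a (beta, gamma)
        upper tolerance limit: P(Bin(N, beta) <= N - k) >= gamma."""
        n = k
        while sum(comb(n, j) * beta**j * (1.0 - beta)**(n - j)
                  for j in range(n - k + 1)) < gamma:
            n += 1
        return n

    print(wilks_first_order())    # 59 runs for the classic first-order 95/95 limit
    print(wilks_kth_largest(2))   # 93 runs if the second-largest value is used
    ```

    The first-order figure of 59 runs is the one usually quoted in LBLOCA uncertainty studies.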

  13. Advances in optimal routing through computer networks

    Science.gov (United States)

    Paz, I. M.

    1977-01-01

    The optimal routing problem is defined. Progress in solving the problem during the previous decade is reviewed, with special emphasis on technical developments made during the last few years. The relationships between the routing, the throughput, and the switching technology used are discussed and their future trends are reviewed. Economic aspects are also briefly considered. Modern technical approaches for handling the routing problems and, more generally, the flow control problems are reviewed.

  14. Advances in computational studies of energy materials.

    Science.gov (United States)

    Catlow, C R A; Guo, Z X; Miskufova, M; Shevlin, S A; Smith, A G H; Sokol, A A; Walsh, A; Wilson, D J; Woodley, S M

    2010-07-28

    We review recent developments and applications of computational modelling techniques in the field of materials for energy technologies including hydrogen production and storage, energy storage and conversion, and light absorption and emission. In addition, we present new work on an Sn2TiO4 photocatalyst containing an Sn(II) lone pair, new interatomic potential models for SrTiO3 and GaN, an exploration of defects in the kesterite/stannite-structured solar cell absorber Cu2ZnSnS4, and report details of the incorporation of hydrogen into Ag2O and Cu2O. Special attention is paid to the modelling of nanostructured systems, including ceria (CeO2, mixed Ce(x)O(y) and Ce2O3) and group 13 sesquioxides. We consider applications based on both interatomic potential and electronic structure methodologies; and we illustrate the increasingly quantitative and predictive nature of modelling in this field. PMID:20566517

  15. Computational neuroscience for advancing artificial intelligence

    Directory of Open Access Journals (Sweden)

    Fernando P. Ponce

    2011-07-01

    Full Text Available Review of the book by Alonso, E. and Mondragón, E. (2011). Hershey, NY: Medical Information Science Reference. Neuroscience as a discipline pursues an understanding of the brain and its relation to the functioning of the mind through analysis of the interaction of diverse physical, chemical, and biological processes (Bassett & Gazzaniga, 2011). At the same time, numerous disciplines have progressively made significant contributions to this endeavor, among them mathematics, psychology, and philosophy. As a product of this effort, complementary disciplines such as cognitive neuroscience, neuropsychology, and computational neuroscience have appeared alongside traditional neuroscience (Bengio, 2007; Dayan & Abbott, 2005). In the context of computational neuroscience as a discipline complementary to traditional neuroscience, Alonso and Mondragón (2011) edited the book Computational Neuroscience for Advancing Artificial Intelligence: Models, Methods and Applications.

  16. Computational Approach To Understanding Autism Spectrum Disorders

    Directory of Open Access Journals (Sweden)

    Włodzisław Duch

    2012-01-01

    Full Text Available Every year the prevalence of Autism Spectrum Disorders (ASD) is rising. Is there a unifying mechanism of various ASD cases at the genetic, molecular, cellular or systems level? The hypothesis advanced in this paper is focused on neural dysfunctions that lead to problems with attention in autistic people. Simulations of attractor neural networks performing cognitive functions help to assess long-term system neurodynamics. The Fuzzy Symbolic Dynamics (FSD) technique is used for the visualization of attractors in the semantic layer of the neural model of reading. Large-scale simulations of brain structures characterized by a high order of complexity require enormous computational power, especially if biologically motivated neuron models are used to investigate the influence of cellular structure dysfunctions on the network dynamics. Such simulations have to be implemented on computer clusters in grid-based architectures.
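
    As a purely illustrative aside (ours, not the paper's model): the textbook attractor neural network of the kind invoked above is the Hopfield network, whose asynchronous dynamics settle into stored patterns. A minimal sketch:

    ```python
    # Minimal illustrative sketch (not the authors' model): a Hopfield network,
    # the textbook attractor neural network, converging to a stored pattern.
    import numpy as np

    rng = np.random.default_rng(0)
    patterns = rng.choice([-1, 1], size=(3, 64))        # three stored +/-1 patterns

    # Hebbian weights; zero diagonal so units do not self-excite.
    W = sum(np.outer(p, p) for p in patterns) / patterns.shape[1]
    np.fill_diagonal(W, 0.0)

    state = patterns[0].copy()
    flip = rng.choice(64, size=16, replace=False)        # corrupt 25% of the bits
    state[flip] *= -1

    for _ in range(10):                                  # asynchronous updates
        for i in rng.permutation(64):
            state[i] = 1 if W[i] @ state >= 0 else -1

    print("overlap with stored pattern:", (state @ patterns[0]) / 64)  # ~1.0 at the attractor
    ```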

  17. Advanced Methods and Applications in Computational Intelligence

    CERN Document Server

    Nikodem, Jan; Jacak, Witold; Chaczko, Zenon; ACASE 2012

    2014-01-01

    This book offers an excellent presentation of intelligent engineering and informatics foundations for researchers in this field as well as many examples with industrial application. It contains extended versions of selected papers presented at the inaugural ACASE 2012 Conference dedicated to the Applications of Systems Engineering. This conference was held from the 6th to the 8th of February 2012, at the University of Technology, Sydney, Australia, organized by the University of Technology, Sydney (Australia), Wroclaw University of Technology (Poland) and the University of Applied Sciences in Hagenberg (Austria). The  book is organized into three main parts. Part I contains papers devoted to the heuristic approaches that are applicable in situations where the problem cannot be solved by exact methods, due to various characteristics or  dimensionality problems. Part II covers essential issues of the network management, presents intelligent models of the next generation of networks and distributed systems ...

  18. Advances in FDTD computational electrodynamics photonics and nanotechnology

    CERN Document Server

    Oskooi, Ardavan; Johnson, Steven G

    2013-01-01

    Advances in photonics and nanotechnology have the potential to revolutionize humanity's ability to communicate and compute. To pursue these advances, it is mandatory to understand and properly model interactions of light with materials such as silicon and gold at the nanoscale, i.e., the span of a few tens of atoms laid side by side. These interactions are governed by the fundamental Maxwell's equations of classical electrodynamics, supplemented by quantum electrodynamics. This book presents the current state-of-the-art in formulating and implementing computational models of these interactions. Maxwell's equations are solved using the finite-difference time-domain (FDTD) technique, pioneered by the senior editor, whose prior Artech books in this area are among the top ten most-cited in the history of engineering. You discover the most important advances in all areas of FDTD and PSTD computational modeling of electromagnetic wave interactions. This cutting-edge resource helps you understand the latest develo...

  19. Reliability of an Interactive Computer Program for Advance Care Planning

    OpenAIRE

    Schubart, Jane R.; Levi, Benjamin H.; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J.

    2012-01-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demon...

  20. 9th International Conference on Advanced Computing & Communication Technologies

    CERN Document Server

    Mandal, Jyotsna; Auluck, Nitin; Nagarajaram, H

    2016-01-01

    This book highlights a collection of high-quality peer-reviewed research papers presented at the Ninth International Conference on Advanced Computing & Communication Technologies (ICACCT-2015) held at Asia Pacific Institute of Information Technology, Panipat, India during 27–29 November 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry present their original work and exchange ideas, information, techniques and applications in the field of Advanced Computing and Communication Technology.

  1. UNEDF: Advanced Scientific Computing Collaboration Transforms the Low-Energy Nuclear Many-Body Problem

    CERN Document Server

    Nam, H; Nazarewicz, W; Bulgac, A; Hagen, G; Kortelainen, M; Maris, P; Pei, J C; Roche, K J; Schunck, N; Thompson, I; Vary, J P; Wild, S M

    2012-01-01

    The demands of cutting-edge science are driving the need for larger and faster computing resources. With the rapidly growing scale of computing systems and the prospect of technologically disruptive architectures to meet these needs, scientists face the challenge of effectively using complex computational resources to advance scientific discovery. Multidisciplinary collaborating networks of researchers with diverse scientific backgrounds are needed to address these complex challenges. The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper describes UNEDF and identifies attributes that classify it as a successful computational collaboration. We illustrate significant milestones accomplished by UNEDF through integrative solutions using the most reliable theoretical approaches, most advanced algorithms, and leadershi...

  2. Multilayer Approach for Advanced Hybrid Lithium Battery

    KAUST Repository

    Ming, Jun

    2016-06-06

    Conventional intercalated rechargeable batteries have shown their capacity limit, and the development of an alternative battery system with higher capacity is strongly needed for sustainable electrical vehicles and hand-held devices. Herein, we introduce a feasible and scalable multilayer approach to fabricate a promising hybrid lithium battery with superior capacity and multivoltage plateaus. A sulfur-rich electrode (90 wt % S) is covered by a dual layer of graphite/Li4Ti5O12, where the active materials S and Li4Ti5O12 can both take part in redox reactions and thus deliver a high capacity of 572 mAh gcathode(-1) (vs the total mass of electrode) or 1866 mAh gs(-1) (vs the mass of sulfur) at 0.1C (with the definition of 1C = 1675 mA gs(-1)). The battery shows unique voltage platforms at 2.35 and 2.1 V, contributed from S, and 1.55 V from Li4Ti5O12. A high rate capability of 566 mAh gcathode(-1) at 0.25C and 376 mAh gcathode(-1) at 1C with durable cycle ability over 100 cycles can be achieved. Operando Raman and electron microscope analysis confirm that the graphite/Li4Ti5O12 layer slows the dissolution/migration of polysulfides, thereby giving rise to a higher sulfur utilization and a slower capacity decay. This advanced hybrid battery with a multilayer concept for marrying different voltage plateaus from various electrode materials opens a way of providing tunable capacity and multiple voltage platforms for energy device applications. © 2016 American Chemical Society.

  3. Multilayer Approach for Advanced Hybrid Lithium Battery.

    Science.gov (United States)

    Ming, Jun; Li, Mengliu; Kumar, Pushpendra; Li, Lain-Jong

    2016-06-28

    Conventional intercalated rechargeable batteries have shown their capacity limit, and the development of an alternative battery system with higher capacity is strongly needed for sustainable electrical vehicles and hand-held devices. Herein, we introduce a feasible and scalable multilayer approach to fabricate a promising hybrid lithium battery with superior capacity and multivoltage plateaus. A sulfur-rich electrode (90 wt % S) is covered by a dual layer of graphite/Li4Ti5O12, where the active materials S and Li4Ti5O12 can both take part in redox reactions and thus deliver a high capacity of 572 mAh gcathode(-1) (vs the total mass of electrode) or 1866 mAh gs(-1) (vs the mass of sulfur) at 0.1C (with the definition of 1C = 1675 mA gs(-1)). The battery shows unique voltage platforms at 2.35 and 2.1 V, contributed from S, and 1.55 V from Li4Ti5O12. A high rate capability of 566 mAh gcathode(-1) at 0.25C and 376 mAh gcathode(-1) at 1C with durable cycle ability over 100 cycles can be achieved. Operando Raman and electron microscope analysis confirm that the graphite/Li4Ti5O12 layer slows the dissolution/migration of polysulfides, thereby giving rise to a higher sulfur utilization and a slower capacity decay. This advanced hybrid battery with a multilayer concept for marrying different voltage plateaus from various electrode materials opens a way of providing tunable capacity and multiple voltage platforms for energy device applications. PMID:27268064

  4. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    Science.gov (United States)

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  5. Innovations and Advances in Computer, Information, Systems Sciences, and Engineering

    CERN Document Server

    Sobh, Tarek

    2013-01-01

    Innovations and Advances in Computer, Information, Systems Sciences, and Engineering includes the proceedings of the International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2011). The contents of this book are a set of rigorously reviewed, world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of  Industrial Electronics, Technology and Automation, Telecommunications and Networking, Systems, Computing Sciences and Software Engineering, Engineering Education, Instructional Technology, Assessment, and E-learning.

  6. Advances in computers dependable and secure systems engineering

    CERN Document Server

    Hurson, Ali

    2012-01-01

    Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. In-depth surveys and tutorials on new computer technology; well-known authors and researchers in the field; extensive bibliographies with m

  7. Introducing Computational Approaches in Intermediate Mechanics

    Science.gov (United States)

    Cook, David M.

    2006-12-01

    In the winter of 2003, we at Lawrence University moved Lagrangian mechanics and rigid body dynamics from a required sophomore course to an elective junior/senior course, freeing 40% of the time for computational approaches to ordinary differential equations (trajectory problems, the large amplitude pendulum, non-linear dynamics); evaluation of integrals (finding centers of mass and moment of inertia tensors, calculating gravitational potentials for various sources); and finding eigenvalues and eigenvectors of matrices (diagonalizing the moment of inertia tensor, finding principal axes), and to generating graphical displays of computed results. Further, students begin to use LaTeX to prepare some of their submitted problem solutions. Placed in the middle of the sophomore year, this course provides the background that permits faculty members, as appropriate, to assign computer-based exercises in subsequent courses. Further, students are encouraged to use our Computational Physics Laboratory on their own initiative whenever that use seems appropriate. (Curricular development supported in part by the W. M. Keck Foundation, the National Science Foundation, and Lawrence University.)
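
    As a sketch of the kind of exercise described (our illustration, not the course's actual materials): the large-amplitude pendulum drops the small-angle approximation, so it must be integrated numerically, for instance:

    ```python
    # Illustrative sketch of a typical exercise: the large-amplitude pendulum,
    # theta'' = -(g/L) sin(theta), integrated without the small-angle approximation.
    import numpy as np
    from scipy.integrate import solve_ivp

    g, L = 9.81, 1.0

    def pendulum(t, y):
        theta, omega = y
        return [omega, -(g / L) * np.sin(theta)]

    # Release from rest at 170 degrees, far outside the small-angle regime.
    sol = solve_ivp(pendulum, (0.0, 10.0), [np.radians(170.0), 0.0],
                    dense_output=True, rtol=1e-8)

    t = np.linspace(0.0, 10.0, 500)
    theta = sol.sol(t)[0]
    print("max |theta| (rad):", np.max(np.abs(theta)))   # energy-conserving swing
    ```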

  8. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  9. Computational approaches to analogical reasoning current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it was studied early in artificial intelligence, and has seen a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  10. [Activities of Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  11. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, James H. [University of North Florida; Cox, Philip [University of North Florida; Harrington, William J [University of North Florida; Campbell, Joseph L [University of North Florida

    2013-09-03

    ABSTRACT Project Title: Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing PROJECT OBJECTIVE The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density and lifetime. These targets were laid out in the DOE’s R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly and integrated system durability and validation testing. This design has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications. PROJECT TASKS The proposed work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: To engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, to further refine them to both miniaturize and integrate their functionality to increase the system power density and energy density. Benefits of UNF’s novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  12. 2014 National Workshop on Advances in Communication and Computing

    CERN Document Server

    Prasanna, S; Sarma, Kandarpa; Saikia, Navajit

    2015-01-01

    The present volume is a compilation of research work in computation, communication, vision sciences, device design, fabrication, upcoming materials and related process design, etc. It is derived from selected manuscripts submitted to the 2014 National Workshop on Advances in Communication and Computing (WACC 2014), Assam Engineering College, Guwahati, Assam, India, which is emerging as a premier platform for discussion and dissemination of know-how in this part of the world. The papers included in the volume are indicative of the recent thrust in computation, communications and emerging technologies. Certain recent advances in ZnO nanostructures for alternate energy generation provide emerging insights into an area that holds promise for the energy sector, including conservation and green technology. Similarly, scholarly contributions have focused on malware detection and related issues. Several contributions have focused on biomedical aspects, including contributions related to cancer detection using act...

  13. Sculpting the band gap: a computational approach.

    Science.gov (United States)

    Prasai, Kiran; Biswas, Parthapratim; Drabold, D A

    2015-01-01

    Materials with optimized band gap are needed in many specialized applications. In this work, we demonstrate that Hellmann-Feynman forces associated with the gap states can be used to find atomic coordinates that yield desired electronic density of states. Using tight-binding models, we show that this approach may be used to arrive at electronically designed models of amorphous silicon and carbon. We provide a simple recipe to include a priori electronic information in the formation of computer models of materials, and prove that this information may have profound structural consequences. The models are validated with plane-wave density functional calculations. PMID:26490203
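
    As a toy illustration of the structure-spectrum coupling that gap sculpting exploits (ours, far simpler than the paper's models): in a one-dimensional tight-binding chain, dimerizing the hoppings opens a gap, so moving atoms directly reshapes the eigenvalue spectrum that the gap-state forces act on:

    ```python
    # Toy sketch (ours, not the paper's models): a dimerized tight-binding chain.
    # Alternating hoppings t1 != t2 open a spectral gap, illustrating how atomic
    # geometry controls the eigenvalue spectrum.
    import numpy as np

    def chain_spectrum(n_sites=100, t1=1.0, t2=0.6):
        H = np.zeros((n_sites, n_sites))
        for i in range(n_sites - 1):
            t = t1 if i % 2 == 0 else t2      # alternate strong/weak bonds
            H[i, i + 1] = H[i + 1, i] = -t
        return np.linalg.eigvalsh(H)

    for t2 in (1.0, 0.8, 0.6):                # increasing dimerization
        e = chain_spectrum(t2=t2)
        gap = e[50] - e[49]                   # gap at half filling
        print(f"t2 = {t2:.1f}  ->  gap = {gap:.3f}")
    ```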

  14. Advanced quantum communications an engineering approach

    CERN Document Server

    Imre, Sandor

    2012-01-01

    The book provides an overview of the most advanced quantum informational geometric techniques, which can help quantum communication theorists analyze quantum channels, such as security or additivity properties. Each section addresses an area of major research of quantum information theory and quantum communication networks. The authors present the fundamental theoretical results of quantum information theory, while also presenting the details of advanced quantum communication protocols with clear mathematical and information theoretical background. This book bridges the gap between quantum ph

  15. Computational approaches for rational design of proteins with novel functionalities

    Directory of Open Access Journals (Sweden)

    Manish Kumar Tiwari

    2012-09-01

    Full Text Available Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein designing has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes.

  16. Computational approaches for rational design of proteins with novel functionalities.

    Science.gov (United States)

    Tiwari, Manish Kumar; Singh, Ranjitha; Singh, Raushan Kumar; Kim, In-Won; Lee, Jung-Kul

    2012-01-01

    Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein designing has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes.

  17. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Forest M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bochev, Pavel B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cameron-Smith, Philip J.. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Easter, Richard C [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Scott M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ghan, Steven J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Xiaohong [Univ. of Wyoming, Laramie, WY (United States); Lowrie, Robert B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lucas, Donald D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ma, Po-lun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sacks, William J. [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Shrivastava, Manish [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Singh, Balwinder [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tautges, Timothy J. [Argonne National Lab. (ANL), Argonne, IL (United States); Taylor, Mark A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Vertenstein, Mariana [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Worley, Patrick H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.
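
    To make "computationally efficient tracer advection" concrete, here is a generic sketch (ours, not the project's actual algorithm) of the simplest stable scheme, first-order upwind transport, applied to a whole batch of tracers at once:

    ```python
    # Generic illustration (not ACES4BGC's scheme): first-order upwind advection
    # of several tracers on a periodic 1-D grid. All tracers share one velocity
    # field, so the whole batch updates in a single vectorized sweep, the kind
    # of structure that efficient many-tracer transport exploits.
    import numpy as np

    nx, n_tracers = 200, 8
    dx = 1.0 / nx
    u = 1.0
    dt = 0.4 * dx / u                                    # CFL number 0.4 < 1

    x = np.linspace(0.0, 1.0, nx, endpoint=False)
    q = np.array([np.exp(-((x - c) / 0.05) ** 2)         # one Gaussian blob per tracer
                  for c in np.linspace(0.2, 0.8, n_tracers)])
    mass0 = q.sum(axis=1)

    for _ in range(500):                                 # upwind update for u > 0;
        q -= (u * dt / dx) * (q - np.roll(q, 1, axis=1)) # np.roll = periodic boundary

    print("mass conserved per tracer:", np.allclose(q.sum(axis=1), mass0))
    ```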

  18. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  19. Activities of the Research Institute for Advanced Computer Science

    Science.gov (United States)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  1. NATO Advanced Study Institute on Methods in Computational Molecular Physics

    CERN Document Server

    Diercksen, Geerd

    1992-01-01

    This volume records the lectures given at a NATO Advanced Study Institute on Methods in Computational Molecular Physics held in Bad Windsheim, Germany, from 22nd July until 2nd August, 1991. This NATO Advanced Study Institute sought to bridge the quite considerable gap which exists between the presentation of molecular electronic structure theory found in contemporary monographs such as, for example, McWeeny's Methods of Molecular Quantum Mechanics (Academic Press, London, 1989) or Wilson's Electron correlation in molecules (Clarendon Press, Oxford, 1984) and the realization of the sophisticated computational algorithms required for their practical application. It sought to underline the relation between the electronic structure problem and the study of nuclear motion. Software for performing molecular electronic structure calculations is now being applied in an increasingly wide range of fields in both the academic and the commercial sectors. Numerous applications are reported in areas as diverse as catalysi...

  2. Recent advances in diagnostic approaches for sub-arachnoid hemorrhage.

    Science.gov (United States)

    Kumar, Ashish; Kato, Yoko; Hayakawa, Motoharu; Junpei, Oda; Watabe, Takeya; Imizu, Shuei; Oguri, Daikichi; Hirose, Yuichi

    2011-07-01

    Sub-arachnoid hemorrhage (SAH) is easily one of the most debilitating neurosurgical entities in terms of stroke-related case mortality and morbidity rates. To date, it has case fatality rates ranging from 32-67%. Advances in the diagnostic accuracy of the available imaging methods have contributed significantly to reducing morbidity associated with this deadly disease. We currently have computed tomography angiography (CTA), magnetic resonance angiography (MRA) and digital subtraction angiography (DSA), including three-dimensional DSA, as the mainstay diagnostic techniques. Non-invasive angiography in the form of CTA and MRA has evolved in the last decade as a rapid, easily available, and economical means of diagnosing the cause of SAH. The role of three-dimensional computed tomography angiography (3D-CTA) in the management of aneurysms has been fairly acknowledged in the past. There have been numerous articles in the literature regarding its potential threat to the conventional "gold standard" DSA. The most recent addition has been the introduction of the fourth dimension to the established 3D-CT angiography (4D-CTA). At many centers, DSA is still treated as the first choice of investigation. Although CT angiography still has some limitations, it can provide an unmatched multi-directional view of the aneurysmal morphology and its surroundings, including relations with the skull base and blood vessels. We review the recent advances in the diagnostic approaches to SAH, with special emphasis on 3D-CTA and 4D-CTA as the upcoming technologies. PMID:22347331

  3. Novel computational approaches characterizing knee physiotherapy

    Directory of Open Access Journals (Sweden)

    Wangdo Kim

    2014-01-01

    Full Text Available A knee joint's longevity depends on the proper integration of structural components in an axial alignment. If just one of the components is abnormally off-axis, the biomechanical system fails, resulting in arthritis. The complexity of various failures in the knee joint has led orthopedic surgeons to select total knee replacement as a primary treatment. In many cases, this means sacrificing much of an otherwise normal joint. Here, we review novel computational approaches to describe knee physiotherapy by introducing a new dimension of foot loading to the knee axis alignment, producing an improved functional status of the patient. New physiotherapeutic applications are then possible by aligning foot loading with the functional axis of the knee joint during the treatment of patients with osteoarthritis.

  4. Understanding Plant Nitrogen Metabolism through Metabolomics and Computational Approaches

    Directory of Open Access Journals (Sweden)

    Perrin H. Beatty

    2016-10-01

    Full Text Available A comprehensive understanding of plant metabolism could provide a direct mechanism for improving nitrogen use efficiency (NUE) in crops. One of the major barriers to achieving this outcome is our poor understanding of the complex metabolic networks, physiological factors, and signaling mechanisms that affect NUE in agricultural settings. However, an exciting collection of computational and experimental approaches has begun to elucidate whole-plant nitrogen usage and provides an avenue for connecting nitrogen-related phenotypes to genes. Herein, we describe how metabolomics, computational models of metabolism, and flux balance analysis have been harnessed to advance our understanding of plant nitrogen metabolism. We introduce a model describing the complex flow of nitrogen through crops in a real-world agricultural setting and describe how experimental metabolomics data, such as isotope labeling rates and analyses of nutrient uptake, can be used to refine these models. In summary, the metabolomics/computational approach offers an exciting mechanism for understanding NUE that may ultimately lead to more effective crop management and engineered plants with higher yields.
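
    Flux balance analysis, mentioned above, reduces to a linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and bounds on each flux. A deliberately tiny sketch (our toy network, not a real nitrogen-metabolism model):

    ```python
    # Toy flux balance analysis sketch (ours): maximize a "biomass" flux v3
    # subject to steady-state mass balance S @ v = 0 and flux bounds.
    import numpy as np
    from scipy.optimize import linprog

    # Metabolites A, B (rows) x reactions (columns):
    #   v0: -> A (uptake),  v1: A -> B,  v2: B -> (export),  v3: B -> biomass
    S = np.array([[ 1, -1,  0,  0],
                  [ 0,  1, -1, -1]])

    bounds = [(0, 10), (0, None), (0, None), (0, None)]  # uptake capped at 10
    c = np.array([0, 0, 0, -1])                          # linprog minimizes, so -v3

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
    print("optimal biomass flux:", res.x[3])             # = 10, limited by uptake
    ```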

  5. Condition Monitoring Through Advanced Sensor and Computational Technology

    International Nuclear Information System (INIS)

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties

  6. Delay Computation Using Fuzzy Logic Approach

    Directory of Open Access Journals (Sweden)

    Ramasesh G. R.

    2012-10-01

    Full Text Available The paper presents a practical application of fuzzy sets and systems theory in predicting, with reasonable accuracy, delays arising from a wide range of factors pertaining to construction projects. In this paper we use fuzzy logic to predict delays on account of delayed supplies and labor shortage. It is observed that project scheduling software uses either deterministic or probabilistic methods for computation of schedule durations, delays, lags and other parameters. In other words, these methods use only quantitative inputs, leaving out the qualitative aspects associated with individual activities of work. A qualitative aspect, viz., the expertise of the mason or the lack of experience, can have a significant impact on the assessed duration. Such qualitative aspects do not find adequate representation in project scheduling software. A realistic project is considered, for which a PERT chart has been prepared showing all the major activities in reasonable detail. This project has been periodically updated until its completion. It is observed that some of the activities were delayed due to extraneous factors, resulting in the overall delay of the project. The software has the capability to calculate the overall delay through CPM (Critical Path Method) when each of the activity delays is reported. We demonstrate that, by using fuzzy logic, these delays could have been predicted well in advance.
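
    A hedged sketch of the kind of inference the paper describes (our simplified membership functions and rules, not the authors' calibration) might look as follows:

    ```python
    # Simplified Mamdani-style sketch (our made-up memberships and rules, not the
    # paper's calibration): estimate activity delay from two qualitative factors.
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function peaking at b on support (a, c)."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    days = np.linspace(0.0, 20.0, 201)                   # output universe: delay in days
    out_low, out_med, out_high = (tri(days, -1, 0, 6), tri(days, 3, 8, 13),
                                  tri(days, 10, 20, 30))

    def predict_delay(supply_delay_days, labor_shortage_pct):
        s_low  = tri(supply_delay_days, -1, 0, 5)        # fuzzify the inputs
        s_high = tri(supply_delay_days, 3, 10, 17)
        l_low  = tri(labor_shortage_pct, -1, 0, 50)
        l_high = tri(labor_shortage_pct, 30, 100, 170)
        # Rules: both mild -> low; exactly one severe -> medium; both severe -> high.
        agg = np.maximum.reduce([
            np.minimum(min(s_low, l_low), out_low),
            np.minimum(max(min(s_high, l_low), min(s_low, l_high)), out_med),
            np.minimum(min(s_high, l_high), out_high),
        ])
        return float((days * agg).sum() / agg.sum())     # centroid defuzzification

    print(f"predicted delay: {predict_delay(6.0, 40.0):.1f} days")
    ```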

  7. Blueprinting Approach in Support of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Willem-Jan van den Heuvel

    2012-03-01

    Full Text Available Current cloud service offerings, i.e., Software-as-a-service (SaaS), Platform-as-a-service (PaaS) and Infrastructure-as-a-service (IaaS) offerings, are often provided as monolithic, one-size-fits-all solutions and give little or no room for customization. This limits the ability of Service-based Application (SBA) developers to configure and syndicate offerings from multiple SaaS, PaaS, and IaaS providers to address their application requirements. Furthermore, combining different independent cloud services necessitates a uniform description format that facilitates the design, customization, and composition. Cloud Blueprinting is a novel approach that allows SBA developers to easily design, configure and deploy virtual SBA payloads on virtual machines and resource pools on the cloud. We propose the Blueprint concept as a uniform abstract description for cloud service offerings that may cross different cloud computing layers, i.e., SaaS, PaaS and IaaS. To support developers with the SBA design and development in the cloud, this paper introduces a formal Blueprint Template for unambiguously describing a blueprint, as well as a Blueprint Lifecycle that guides developers through the manipulation, composition and deployment of different blueprints for an SBA. Finally, the empirical evaluation of the blueprinting approach within an EC FP7 project is reported and an associated blueprint prototype implementation is presented.
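
    To make the idea concrete, a blueprint can be rendered as a declarative record naming the offerings an SBA composes across layers; the sketch below is hypothetical (ours), not the paper's formal Blueprint Template:

    ```python
    # Hypothetical sketch (ours, not the paper's template): a blueprint as a
    # declarative record of the cloud offerings an SBA composes across layers.
    from dataclasses import dataclass, field

    @dataclass
    class Blueprint:
        name: str
        layer: str                      # "SaaS", "PaaS", or "IaaS"
        provider: str
        requires: list = field(default_factory=list)  # blueprints it builds on

    catalog = [
        Blueprint("vm-pool", "IaaS", "provider-a"),
        Blueprint("app-runtime", "PaaS", "provider-b", requires=["vm-pool"]),
        Blueprint("crm-frontend", "SaaS", "provider-c", requires=["app-runtime"]),
    ]

    # One composition step of a lifecycle: check every dependency is satisfied
    # before deployment.
    names = {b.name for b in catalog}
    assert all(r in names for b in catalog for r in b.requires), "unresolved dependency"
    print("blueprint composition resolves:", [b.name for b in catalog])
    ```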

  8. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  9. An Educational Approach to Computationally Modeling Dynamical Systems

    Science.gov (United States)

    Chodroff, Leah; O'Neal, Tim M.; Long, David A.; Hemkin, Sheryl

    2009-01-01

    Chemists have used computational science methodologies for a number of decades and their utility continues to be unabated. For this reason we developed an advanced lab in computational chemistry in which students gain understanding of general strengths and weaknesses of computation-based chemistry by working through a specific research problem.…

  10. Advancing Instructional Communication: Integrating a Biosocial Approach

    Science.gov (United States)

    Horan, Sean M.; Afifi, Tamara D.

    2014-01-01

    Celebrating 100 years of the National Communication Association necessitates that, as we commemorate our past, we also look toward our future. As part of a larger conversation about the future of instructional communication, this essay reinvestigates the importance of integrating biosocial approaches into instructional communication research. In…

  11. Multimodality approach for locally advanced esophageal cancer

    Institute of Scientific and Technical Information of China (English)

    Khaldoun Almhanna; Jonathan R Strosberg

    2012-01-01

    Carcinoma of the esophagus is an aggressive and lethal malignancy with an increasing incidence world-wide. Incidence rates vary internationally, with the highest rates found in Southern and Eastern Africa and Eastern Asia, and the lowest in Western and Middle Africa and Central America. Patients with locally advanced disease face a poor prognosis, with 5-year survival rates ranging from 15%-34%. Recent clinical trials have evaluated different strategies for management of locoregional cancer; however, because of stage migration and changes in disease epidemiology, applying these trials to clinical practice has become a daunting task. We searched Medline and conference abstracts for randomized studies published in the last 3 decades. We restricted our search to articles published in English. Neoadjuvant chemoradiotherapy followed by surgical resection is an accepted standard of care in the United States. Esophagectomy remains an essential component of treatment and can lead to improved overall survival, especially when performed at high volume institutions. The role of adjuvant chemotherapy following curative resection is still unclear. External beam radiation therapy alone is considered palliative and is typically reserved for patients with a poor performance status.

  12. Human Computer Interaction: An intellectual approach

    OpenAIRE

    Mr. Kuntal Saroha; Sheela Sharma; Gurpreet Bhatia

    2011-01-01

    This paper discusses the research that has been done in the field of Human Computer Interaction (HCI) relating to human psychology. Human-computer interaction (HCI) is the study of how people design, implement, and use interactive computer systems and how computers affect individuals, organizations, and society. This encompasses not only ease of use but also new interaction techniques for supporting user tasks, providing better access to information, and creating more powerful forms of communication. ...

  13. A first attempt to bring computational biology into advanced high school biology classrooms.

    Directory of Open Access Journals (Sweden)

    Suzanne Renick Gallagher

    2011-10-01

    Full Text Available Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing a computational biology element on genetic evolution into advanced biology classes in two local high schools. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. This curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his/her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach this alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of this curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors.

  14. Physics and computer science: quantum computation and other approaches

    OpenAIRE

    Salvador E. Venegas-Andraca

    2011-01-01

    This is a position paper written as an introduction to the special volume on quantum algorithms I edited for the journal Mathematical Structures in Computer Science (Volume 20 - Special Issue 06 (Quantum Algorithms), 2010).

  15. A semantic-web approach for modeling computing infrastructures

    NARCIS (Netherlands)

    M. Ghijsen; J. van der Ham; P. Grosso; C. Dumitru; H. Zhu; Z. Zhao; C. de Laat

    2013-01-01

    This paper describes our approach to modeling computing infrastructures. Our main contribution is the Infrastructure and Network Description Language (INDL) ontology. The aim of INDL is to provide technology independent descriptions of computing infrastructures, including the physical resources as w

  16. A computational approach to chemical etiologies of diabetes

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Brunak, Søren; Grandjean, Philippe

    2013-01-01

    Computational meta-analysis can link environmental chemicals to genes and proteins involved in human diseases, thereby elucidating possible etiologies and pathogeneses of non-communicable diseases. We used an integrated computational systems biology approach to examine possible pathogenetic...

  17. Advanced measurement approach with loss distribution in operational risk management

    OpenAIRE

    Atilla ÇİFTER; Chambers, Nurgül

    2007-01-01

    According to the latest proposal by the Basel Committee, commercial banks are allowed to use the advanced measurement approach for operational risk. Since the basic indicator and standardized approaches treat operational risk as a percentage of gross profit, these methodologies are not satisfactory, as real losses or the probability of loss are not taken into consideration. In this article, the loss distribution approach is applied with simulated data. 20 nonparametric loss distributions and mixing internal and externa...

  18. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to this rapid improvement. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes, that fully utilizes the hardware and software capabilities of new computer architectures, that probes the limits of climate predictability, and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  19. Advanced Computational Methods for Thermal Radiative Heat Transfer.

    Energy Technology Data Exchange (ETDEWEB)

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.; Hogan, Roy E.,

    2016-10-01

    Participating media radiation (PMR) in weapon safety calculations for abnormal thermal environments is too costly to do routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.

  1. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    Energy Technology Data Exchange (ETDEWEB)

    Reed, Daniel [University of Iowa]; Berzins, Martin [University of Utah]; Pennington, Robert; Sarkar, Vivek [Rice University]; Taylor, Valerie [Texas A&M University]

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  2. Computer science approach to quantum control

    OpenAIRE

    Janzing, Dominik

    2006-01-01

    This work considers several hypothetical control processes on the nanoscopic level and shows their analogy to computation processes. It shows that measuring certain types of quantum observables is such a complex task that every instrument that is able to perform it would necessarily be an extremely powerful computer.

  3. Recent advances in computational mechanics of the human knee joint.

    Science.gov (United States)

    Kazemi, M; Dabiri, Y; Li, L P

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling.

  4. Recent Advances in Computational Mechanics of the Human Knee Joint

    Directory of Open Access Journals (Sweden)

    M. Kazemi

    2013-01-01

    Full Text Available Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling.

  5. International conference on Advances in Intelligent Control and Innovative Computing

    CERN Document Server

    Castillo, Oscar; Huang, Xu; Intelligent Control and Innovative Computing

    2012-01-01

    In the lightning-fast world of intelligent control and cutting-edge computing, it is vitally important to stay abreast of developments that seem to follow each other without pause. This publication features the very latest and some of the very best current research in the field, with 32 revised and extended research articles written by prominent researchers in the field. Culled from contributions to the key 2011 conference Advances in Intelligent Control and Innovative Computing, held in Hong Kong, the articles deal with a wealth of relevant topics, from the most recent work in artificial intelligence and decision-supporting systems, to automated planning, modelling and simulation, signal processing, and industrial applications. Not only does this work communicate the current state of the art in intelligent control and innovative computing, it is also an illuminating guide to up-to-date topics for researchers and graduate students in the field. The quality of the contents is absolutely assured by the high pro...

  6. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L

    2007-01-01

    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing the computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design, not just the specifications comprising today's systems but how key technologies and p

  7. Mathematics of shape description a morphological approach to image processing and computer graphics

    CERN Document Server

    Ghosh, Pijush K

    2009-01-01

    Image processing problems are often not well defined because real images are contaminated with noise and other uncertain factors. In Mathematics of Shape Description, the authors take a mathematical approach to address these problems using the morphological and set-theoretic approach to image processing and computer graphics by presenting a simple shape model using two basic shape operators called Minkowski addition and decomposition. This book is ideal for professional researchers and engineers in Information Processing, Image Measurement, Shape Description, Shape Representation and Computer Graphics. Post-graduate and advanced undergraduate students in pure and applied mathematics, computer sciences, robotics and engineering will also benefit from this book. Key features: explains the fundamental and advanced relationships between algebraic systems and shape description through the set-theoretic approach; promotes interaction of image processing, geochronology and mathematics in the field of algebraic geometry; P...

  8. Assessing creativity in computer music ensembles: a computational approach

    OpenAIRE

    Comajuncosas, Josep M.

    2016-01-01

    Over the last decade Laptop Orchestras and Mobile Ensembles have proliferated. As a result, a large body of research has arisen on infrastructure, evaluation, design principles and compositional methodologies for Computer Music Ensembles (CME). However, little has been addressed and very little is known about the challenges and opportunities provided by CMEs for creativity in musical performance. Therefore, one of the most common issues CMEs have to deal with is the lack of ...

  9. Advances in Intelligent Control Systems and Computer Science

    CERN Document Server

    2013-01-01

    The conception of real-time control networks taking into account, as an integrating approach, both the specific aspects of information and knowledge processing and the dynamic and energetic particularities of physical processes and of communication networks represents one of the newest scientific and technological challenges. The new paradigm of Cyber-Physical Systems (CPS) reflects this tendency and will certainly change the evolution of the technology, with major social and economic impact. This book presents significant results in the field of process control and advanced information and knowledge processing, with applications in the fields of robotics, biotechnology, environment, energy, transportation, etc. It introduces intelligent control concepts and strategies as well as real-time implementation aspects for complex control approaches. One of the sections is dedicated to the complex problem of designing software systems for distributed information processing networks. Problems as complexity an...

  10. Human Computer Interaction: An intellectual approach

    Directory of Open Access Journals (Sweden)

    Kuntal Saroha

    2011-08-01

    Full Text Available This paper discusses the research that has been done in the field of Human Computer Interaction (HCI) relating to human psychology. Human-computer interaction (HCI) is the study of how people design, implement, and use interactive computer systems and how computers affect individuals, organizations, and society. This encompasses not only ease of use but also new interaction techniques for supporting user tasks, providing better access to information, and creating more powerful forms of communication. It involves input and output devices and the interaction techniques that use them; how information is presented and requested; how the computer's actions are controlled and monitored; all forms of help, documentation, and training; the tools used to design, build, test, and evaluate user interfaces; and the processes that developers follow when creating interfaces.

  11. Computer science approach to quantum control

    Energy Technology Data Exchange (ETDEWEB)

    Janzing, D.

    2006-07-01

    Whereas it is obvious that every computation process is a physical process, it has hardly been recognized that many complex physical processes bear similarities to computation processes. This is in particular true for the control of physical systems on the nanoscopic level: usually the system can only be accessed via a rather limited set of elementary control operations, and for many purposes only a concatenation of a large number of these basic operations will implement the desired process. This concatenation is in many cases quite similar to building complex programs from elementary steps, and principles for designing algorithms may thus be a paradigm for designing control processes. For instance, one can decrease the temperature of one part of a molecule by transferring its heat to the remaining part, where it is then dissipated to the environment. But the implementation of such a process involves a complex sequence of electromagnetic pulses. This work considers several hypothetical control processes on the nanoscopic level and shows their analogy to computation processes. We show that measuring certain types of quantum observables is such a complex task that every instrument that is able to perform it would necessarily be an extremely powerful computer. Likewise, the implementation of a heat engine on the nanoscale requires processing the heat in a way that is similar to information processing, and it can be shown that heat engines with maximal efficiency would be powerful computers, too. In the same way as problems in computer science can be classified by complexity classes, we can also classify control problems according to their complexity. Moreover, we directly relate these complexity classes for control problems to the classes in computer science. Unifying notions of complexity in computer science and physics has therefore two aspects: on the one hand, computer science methods help to analyze the complexity of physical processes. On the other hand, reasonable

  12. Uncertainty in biology: a computational modeling approach

    OpenAIRE

    2015-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation, as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building...

  13. Computational approaches to natural product discovery

    NARCIS (Netherlands)

    Medema, M.H.; Fischbach, M.A.

    2015-01-01

    Starting with the earliest Streptomyces genome sequences, the promise of natural product genome mining has been captivating: genomics and bioinformatics would transform compound discovery from an ad hoc pursuit to a high-throughput endeavor. Until recently, however, genome mining has advanced natura

  14. Computational approaches for urban environments: An editorial

    NARCIS (Netherlands)

    Helbich, M; Jokar Arsanjani, J; Leitner, M

    2015-01-01

    Cities are under continuous pressure due to an increasing urbanization which will have far-reaching consequences for housing, transportation, retail, etc. To cope with these challenges, methodological advances in quantitative modeling coupled with growing amounts of spatial and spatiotemporal data c

  15. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  16. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    International Nuclear Information System (INIS)

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost-effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16-port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256-node, 5-GFlop system is under construction. 10 refs., 7 figs

  17. Computational dynamics for robotics systems using a non-strict computational approach

    Science.gov (United States)

    Orin, David E.; Wong, Ho-Cheung; Sadayappan, P.

    1989-01-01

    A Non-Strict computational approach for real-time robotics control computations is proposed. In contrast to the traditional approach to scheduling such computations, based strictly on task dependence relations, the proposed approach relaxes precedence constraints and scheduling is guided instead by the relative sensitivity of the outputs with respect to the various paths in the task graph. An example of the computation of the Inverse Dynamics of a simple inverted pendulum is used to demonstrate the reduction in effective computational latency through use of the Non-Strict approach. A speedup of 5 has been obtained when the processes of the task graph are scheduled to reduce the latency along the crucial path of the computation. While error is introduced by the relaxation of precedence constraints, the Non-Strict approach has a smaller error than the conventional Strict approach for a wide range of input conditions.
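
    To make the latency/accuracy trade-off concrete, here is a minimal Python sketch (not the authors' task-graph scheduler) of inverted-pendulum inverse dynamics, comparing a strict evaluation against a non-strict one that reads a slightly stale joint angle; the point-mass model and all numbers are illustrative assumptions.

```python
import numpy as np

# Toy inverse dynamics of a point-mass inverted pendulum:
#   tau = m*l^2 * theta_dd + m*g*l * sin(theta)
# A non-strict schedule may read a slightly stale joint angle instead of
# waiting for the freshest sample; the induced torque error is often small
# compared with the latency saved along the critical path.
m, l, g = 1.0, 0.5, 9.81  # assumed mass [kg], length [m], gravity [m/s^2]

def inverse_dynamics(theta, theta_dd):
    return m * l**2 * theta_dd + m * g * l * np.sin(theta)

theta_fresh, theta_stale = 0.100, 0.098  # stale sample lags one tick behind
tau_strict = inverse_dynamics(theta_fresh, 2.0)
tau_nonstrict = inverse_dynamics(theta_stale, 2.0)
print(f"strict={tau_strict:.4f} N*m, non-strict={tau_nonstrict:.4f} N*m, "
      f"error={abs(tau_strict - tau_nonstrict):.4f} N*m")
```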

  18. Computational modeling, optimization and manufacturing simulation of advanced engineering materials

    CERN Document Server

    2016-01-01

    This volume presents recent research work focused on the development of adequate theoretical and numerical formulations to describe the behavior of advanced engineering materials. Particular emphasis is devoted to applications in the fields of biological tissues, phase changing and porous materials, polymers and to micro/nano scale modeling. Sensitivity analysis, gradient and non-gradient based optimization procedures are involved in many of the chapters, aiming at the solution of constitutive inverse problems and parameter identification. All these relevant topics are presented by experienced international and inter-institutional research teams, resulting in a high-level compilation. The book is a valuable research reference for scientists, senior undergraduate and graduate students, as well as for engineers acting in the area of computational material modeling.

  19. Computer-Aided Approaches for Targeting HIVgp41

    Directory of Open Access Journals (Sweden)

    William J. Allen

    2012-08-01

    Full Text Available Virus-cell fusion is the primary means by which the human immunodeficiency virus-1 (HIV-1) delivers its genetic material into the human T-cell host. Fusion is mediated in large part by the viral glycoprotein 41 (gp41), which advances through four distinct conformational states: (i) native, (ii) pre-hairpin intermediate, (iii) fusion active (fusogenic), and (iv) post-fusion. The pre-hairpin intermediate is a particularly attractive step for therapeutic intervention given that the gp41 N-terminal heptad repeat (NHR) and C-terminal heptad repeat (CHR) domains are transiently exposed prior to the formation of a six-helix bundle required for fusion. Most peptide-based inhibitors, including the FDA-approved drug T20, target the intermediate and there are significant efforts to develop small molecule alternatives. Here, we review current approaches to studying interactions of inhibitors with gp41 with an emphasis on atomic-level computer modeling methods including molecular dynamics, free energy analysis, and docking. Atomistic modeling yields a unique level of structural and energetic detail, complementary to experimental approaches, which will be important for the design of improved next generation anti-HIV drugs.

  20. Reliability of an interactive computer program for advance care planning.

    Science.gov (United States)

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson Formula 20 [KR-20]=0.83-0.95, and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830
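
    For readers unfamiliar with the reliability indices cited above, the following Python sketch shows how KR-20 internal consistency and a test-retest Pearson correlation are computed; the 24x10 item matrix and the retest scores are fabricated stand-ins, not the MYWK data.

```python
import numpy as np

# Fabricated item matrix: 24 participants x 10 dichotomous (yes/no) AD items.
rng = np.random.default_rng(0)
items = (rng.random((24, 10)) > 0.4).astype(float)

def kr20(x):
    """Kuder-Richardson Formula 20 for dichotomous items."""
    k = x.shape[1]
    p = x.mean(axis=0)                      # proportion endorsing each item
    total_var = x.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total_var)

# Test-retest reliability: Pearson correlation of time-1 vs. time-2 scores.
t1 = items.sum(axis=1)
t2 = t1 + rng.normal(0, 1, size=t1.shape)   # simulated retest scores
r = np.corrcoef(t1, t2)[0, 1]
print(f"KR-20 = {kr20(items):.2f}, test-retest r = {r:.2f}")
```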

  1. Reliability of an Interactive Computer Program for Advance Care Planning

    Science.gov (United States)

    Levi, Benjamin H.; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-01-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson Formula 20 [KR-20]=0.83–0.95, and 0.86–0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830

  2. Optical design and characterization of an advanced computational imaging system

    Science.gov (United States)

    Shepard, R. Hamilton; Fernandez-Cull, Christy; Raskar, Ramesh; Shi, Boxin; Barsi, Christopher; Zhao, Hang

    2014-09-01

    We describe an advanced computational imaging system with an optical architecture that enables simultaneous and dynamic pupil-plane and image-plane coding accommodating several task-specific applications. We assess the optical requirement trades associated with custom and commercial-off-the-shelf (COTS) optics and converge on the development of two low-cost and robust COTS testbeds. The first is a coded-aperture programmable pixel imager employing a digital micromirror device (DMD) for image plane per-pixel oversampling and spatial super-resolution experiments. The second is a simultaneous pupil-encoded and time-encoded imager employing a DMD for pupil apodization or a deformable mirror for wavefront coding experiments. These two testbeds are built to leverage two MIT Lincoln Laboratory focal plane arrays - an orthogonal transfer CCD with non-uniform pixel sampling and on-chip dithering and a digital readout integrated circuit (DROIC) with advanced on-chip per-pixel processing capabilities. This paper discusses the derivation of optical component requirements, optical design metrics, and performance analyses for the two testbeds built.

  3. Human brain mapping: Experimental and computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Wood, C.C.; George, J.S.; Schmidt, D.M.; Aine, C.J. [Los Alamos National Lab., NM (US); Sanders, J. [Albuquerque VA Medical Center, NM (US); Belliveau, J. [Massachusetts General Hospital, Boston, MA (US)

    1998-11-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project combined Los Alamos' and collaborators' strengths in noninvasive brain imaging and high-performance computing to develop potential contributions to the multi-agency Human Brain Project led by the National Institute of Mental Health. The experimental component of the project emphasized the optimization of spatial and temporal resolution of functional brain imaging by combining: (a) structural MRI measurements of brain anatomy; (b) functional MRI measurements of blood flow and oxygenation; and (c) MEG measurements of time-resolved neuronal population currents. The computational component of the project emphasized development of a high-resolution 3-D volumetric model of the brain based on anatomical MRI, in which structural and functional information from multiple imaging modalities can be integrated into a single computational framework for modeling, visualization, and database representation.

  4. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation, as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...
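
    As a concrete illustration of the parameter-fitting-under-uncertainty step, the sketch below fits a logistic growth model to noisy synthetic data with scipy.optimize.curve_fit and reads parameter standard errors off the estimated covariance; the model and data are illustrative, not taken from the book.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative model: logistic growth with carrying capacity K, rate r,
# and midpoint t0. The data are synthetic measurements with Gaussian noise.
def logistic(t, K, r, t0):
    return K / (1.0 + np.exp(-r * (t - t0)))

t = np.linspace(0, 10, 40)
rng = np.random.default_rng(1)
y = logistic(t, 2.0, 1.2, 5.0) + rng.normal(0, 0.05, t.size)

popt, pcov = curve_fit(logistic, t, y, p0=[1.0, 1.0, 4.0])
perr = np.sqrt(np.diag(pcov))               # 1-sigma parameter uncertainties
for name, val, err in zip(["K", "r", "t0"], popt, perr):
    print(f"{name} = {val:.3f} +/- {err:.3f}")
```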

  5. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  6. Adapting advanced engineering design approaches to building design. Potential benefits

    NARCIS (Netherlands)

    Böhms, M.

    2006-01-01

    A number of industries continuously advance their design approaches in response to changing market constraints. Industries such as car, ship and airplane manufacturing utilize process setups and techniques that differ significantly from the processes and techniques used by the tra

  7. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    Science.gov (United States)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year's motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  8. Advanced and intelligent computations in diagnosis and control

    CERN Document Server

    2016-01-01

    This book is devoted to the demands of research and industrial centers for diagnostics, monitoring and decision making systems that result from the increasing complexity of automation and systems, the need to ensure the highest level of reliability and safety, and continuing research and the development of innovative approaches to fault diagnosis. The contributions combine domains of engineering knowledge for diagnosis, including detection, isolation, localization, identification, reconfiguration and fault-tolerant control. The book is divided into six parts:  (I) Fault Detection and Isolation; (II) Estimation and Identification; (III) Robust and Fault Tolerant Control; (IV) Industrial and Medical Diagnostics; (V) Artificial Intelligence; (VI) Expert and Computer Systems.

  9. Computational Approach To Understanding Autism Spectrum Disorders

    OpenAIRE

    Włodzisław Duch; Wiesław Nowak; Jaroslaw Meller; Grzegorz Osiński; Krzysztof Dobosz; Dariusz Mikołajewski; Grzegorz Marcin Wójcik

    2012-01-01

    Every year the prevalence of Autism Spectrum Disorders (ASD) is rising. Is there a unifying mechanism of various ASD cases at the genetic, molecular, cellular or systems level? The hypothesis advanced in this paper is focused on neural dysfunctions that lead to problems with attention in autistic people. Simulations of attractor neural networks performing cognitive functions help to assess system long-term neurodynamics. The Fuzzy Symbolic Dynamics (FSD) technique is used for the visualiza...

  10. Computational Models of Spreadsheet Development: Basis for Educational Approaches

    CERN Document Server

    Hodnigg, Karin; Mittermeir, Roland T

    2008-01-01

    Among the multiple causes of high error rates in spreadsheets, lack of proper training and of deep understanding of the computational model upon which spreadsheet computations rest might not be the least issue. The paper addresses this problem by presenting a didactical model focussing on cell interaction, thus exceeding the atomicity of cell computations. The approach is motivated by an investigation how different spreadsheet systems handle certain computational issues implied from moving cells, copy-paste operations, or recursion.
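
    The cell-interaction idea at the heart of this didactical model can be illustrated with a toy evaluator: each cell is either a constant or a formula over other cells, and recalculation resolves the dependency chain. The sketch below is a deliberately minimal stand-in for the paper's model; the cell encoding and operators are assumptions.

```python
import math

# Toy spreadsheet: a cell is either a constant or (op, [dependencies]).
cells = {
    "A1": 3,
    "A2": 4,
    "B1": ("sum", ["A1", "A2"]),    # B1 = A1 + A2
    "B2": ("prod", ["A1", "B1"]),   # B2 = A1 * B1 -- a chained dependency
}

def value(name, sheet, seen=()):
    """Recursively evaluate a cell, detecting circular references."""
    if name in seen:
        raise ValueError(f"circular reference at {name}")
    cell = sheet[name]
    if not isinstance(cell, tuple):
        return cell
    op, args = cell
    vals = [value(a, sheet, seen + (name,)) for a in args]
    return sum(vals) if op == "sum" else math.prod(vals)

print(value("B2", cells))  # 21; editing A1 would ripple through B1 and B2
```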

  11. Cluster Computing: A Mobile Code Approach

    Directory of Open Access Journals (Sweden)

    R. B. Patel

    2006-01-01

    Full Text Available Cluster computing harnesses the combined computing power of multiple processors in a parallel configuration. Cluster computing environments built from commodity hardware have provided a cost-effective solution for many scientific and high-performance applications. In this paper we present the design and implementation of a cluster-based framework using mobile code. The cluster implementation involves the design of a server named MCLUSTER, which manages configuring and resetting the cluster. It allows a user to provide the necessary information regarding the application to be executed via a graphical user interface (GUI). The framework handles the generation of application mobile code and its distribution to appropriate client nodes, the efficient handling of the results generated and communicated by the client nodes, and the recording of application execution time. Each client node receives and executes the mobile code that defines the distributed job submitted by the MCLUSTER server and replies with the results. We have also analyzed the performance of the developed system, emphasizing the tradeoff between communication and computation overhead.
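
    The division of labor described above (a server farming work units out to client nodes and timing the run) can be approximated, without any mobile-code machinery, using only the Python standard library; in the sketch below a process pool stands in for MCLUSTER's client nodes and the work unit is a placeholder.

```python
import time
from concurrent.futures import ProcessPoolExecutor

def work_unit(n):
    # Placeholder distributed job; a real system would ship code to nodes.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 8                # eight identical work units
    start = time.perf_counter()
    with ProcessPoolExecutor() as pool:   # workers stand in for client nodes
        results = list(pool.map(work_unit, jobs))
    print(f"{len(results)} results in {time.perf_counter() - start:.2f} s")
```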

  12. Heterogeneous Computing in Economics: A Simplified Approach

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.; Grassi, Stefano

    This paper shows the potential of heterogeneous computing in solving dynamic equilibrium models in economics. We illustrate the power and simplicity of the C++ Accelerated Massive Parallelism recently introduced by Microsoft. Starting from the same exercise as Aldrich et al. (2011) we document...

  13. Computational and mathematical approaches to societal transitions

    NARCIS (Netherlands)

    J.S. Timmermans (Jos); F. Squazzoni (Flaminio); J. de Haan (Hans)

    2008-01-01

    After an introduction of the theoretical framework and concepts of transition studies, this article gives an overview of how structural change in social systems has been studied from various disciplinary perspectives. This overview first leads to the conclusion that computational and mat

  14. Recent advances in diagnostic approaches for sub-arachnoid hemorrhage

    OpenAIRE

    Kumar, Ashish; Kato, Yoko; Hayakawa, Motoharu; Junpei, ODA; Watabe, Takeya; Imizu, Shuei; Oguri, Daikichi; Hirose, Yuichi

    2011-01-01

    Sub-arachnoid hemorrhage (SAH) is easily one of the most debilitating neurosurgical entities as far as stroke-related case mortality and morbidity rates are concerned. To date, it has case fatality rates ranging from 32% to 67%. Advances in the diagnostic accuracy of the available imaging methods have contributed significantly to reducing the morbidity associated with this deadly disease. We currently have computed tomography angiography (CTA), magnetic resonance angiography (MRA) and the digit...

  15. TerraFERMA: Harnessing Advanced Computational Libraries in Earth Science

    Science.gov (United States)

    Wilson, C. R.; Spiegelman, M.; van Keken, P.

    2012-12-01

    Many important problems in Earth sciences can be described by non-linear coupled systems of partial differential equations. These "multi-physics" problems include thermo-chemical convection in Earth and planetary interiors, interactions of fluids and magmas with the Earth's mantle and crust and coupled flow of water and ice. These problems are of interest to a large community of researchers but are complicated to model and understand. Much of this complexity stems from the nature of multi-physics where small changes in the coupling between variables or constitutive relations can lead to radical changes in behavior, which in turn affect critical computational choices such as discretizations, solvers and preconditioners. To make progress in understanding such coupled systems requires a computational framework where multi-physics problems can be described at a high-level while maintaining the flexibility to easily modify the solution algorithm. Fortunately, recent advances in computational science provide a basis for implementing such a framework. Here we present the Transparent Finite Element Rapid Model Assembler (TerraFERMA), which leverages several advanced open-source libraries for core functionality. FEniCS (fenicsproject.org) provides a high level language for describing the weak forms of coupled systems of equations, and an automatic code generator that produces finite element assembly code. PETSc (www.mcs.anl.gov/petsc) provides a wide range of scalable linear and non-linear solvers that can be composed into effective multi-physics preconditioners. SPuD (amcg.ese.ic.ac.uk/Spud) is an application neutral options system that provides both human and machine-readable interfaces based on a single xml schema. Our software integrates these libraries and provides the user with a framework for exploring multi-physics problems. A single options file fully describes the problem, including all equations, coefficients and solver options. Custom compiled applications are
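
    As a flavor of the high-level weak-form description that FEniCS provides (and that TerraFERMA drives from its options file), here is a minimal Poisson solve in the legacy dolfin Python API; this is a generic FEniCS sketch under assumed defaults, not TerraFERMA's own input format.

```python
# Legacy dolfin (FEniCS) sketch: Poisson problem -div(grad(u)) = 1 on the
# unit square with homogeneous Dirichlet boundary conditions.
from dolfin import (Constant, DirichletBC, Function, FunctionSpace,
                    TestFunction, TrialFunction, UnitSquareMesh, dot, dx,
                    grad, solve)

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "Lagrange", 1)
u, v = TrialFunction(V), TestFunction(V)
bc = DirichletBC(V, Constant(0.0), "on_boundary")

a = dot(grad(u), grad(v)) * dx   # bilinear form, assembled automatically
L = Constant(1.0) * v * dx       # linear form for the source term f = 1

uh = Function(V)
solve(a == L, uh, bc)            # PETSc linear solvers run underneath
```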

  16. Advances in computer technology: impact on the practice of medicine.

    Science.gov (United States)

    Groth-Vasselli, B; Singh, K; Farnsworth, P N

    1995-01-01

    Advances in computer technology provide a wide range of applications which are revolutionizing the practice of medicine. The development of new software for the office creates a web of communication among physicians, staff members, health care facilities and associated agencies. This provides the physician with the prospect of a paperless office. At the other end of the spectrum, the development of 3D work stations and software based on computational chemistry permits visualization of protein molecules involved in disease. Computer assisted molecular modeling has been used to construct working 3D models of lens alpha-crystallin. The 3D structure of alpha-crystallin is basic to our understanding of the molecular mechanisms involved in lens fiber cell maturation, stabilization of the inner nuclear region, the maintenance of lens transparency and cataractogenesis. The major component of the high molecular weight aggregates that occur during cataractogenesis is alpha-crystallin subunits. Subunits of alpha-crystallin occur in other tissues of the body. In the central nervous system accumulation of these subunits in the form of dense inclusion bodies occurs in pathological conditions such as Alzheimer's disease, Huntington's disease, multiple sclerosis and toxoplasmosis (Iwaki, Wisniewski et al., 1992), as well as neoplasms of astrocyte origin (Iwaki, Iwaki, et al., 1991). Also cardiac ischemia is associated with an increased alpha B synthesis (Chiesi, Longoni et al., 1990). On a more global level, the molecular structure of alpha-crystallin may provide information pertaining to the function of small heat shock proteins, hsp, in maintaining cell stability under the stress of disease.

  17. Advanced Safeguards Approaches for New TRU Fuel Fabrication Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Durst, Philip C.; Ehinger, Michael H.; Boyer, Brian; Therios, Ike; Bean, Robert; Dougan, A.; Tolk, K.

    2007-12-15

    This second report in a series of three reviews possible safeguards approaches for the new transuranic (TRU) fuel fabrication processes to be deployed at AFCF – specifically, the ceramic TRU (MOX) fuel fabrication line and the metallic (pyroprocessing) line. The most common TRU fuel has been fuel composed of mixed plutonium and uranium dioxide, referred to as “MOX”. However, under the Advanced Fuel Cycle projects custom-made fuels with higher contents of neptunium, americium, and curium may also be produced to evaluate if these “minor actinides” can be effectively burned and transmuted through irradiation in the ABR. A third and final report in this series will evaluate and review the advanced safeguards approach options for the ABR. In reviewing and developing the advanced safeguards approach for the new TRU fuel fabrication processes envisioned for AFCF, the existing international (IAEA) safeguards approach at the Plutonium Fuel Production Facility (PFPF) and the conceptual approach planned for the new J-MOX facility in Japan have been considered as a starting point of reference. The pyro-metallurgical reprocessing and fuel fabrication process at EBR-II near Idaho Falls also provided insight for safeguarding the additional metallic pyroprocessing fuel fabrication line planned for AFCF.

  18. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly 50% of the human genome contains noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well studied category of regulatory elements is the category of enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research, and up to now several approaches have been proposed. However, the current state-of-the-art methods face limitations since, although the function of enhancers is clarified, their mechanism of function is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we survey comprehensively over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate more on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze advantages and disadvantages of existing solutions and we report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell-lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancers' content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational
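
    A drastically simplified version of the sequence-classification task discussed here can be sketched with k-mer counts and a linear model; the thesis itself uses deep learning on ENCODE data, so the GC-biased toy sequences and the 3-mer features below are purely illustrative assumptions.

```python
import numpy as np
from itertools import product
from sklearn.linear_model import LogisticRegression

KMERS = ["".join(p) for p in product("ACGT", repeat=3)]  # all 64 3-mers

def kmer_counts(seq):
    # Non-overlapping counts are a rough toy feature, good enough here.
    return np.array([seq.count(k) for k in KMERS], dtype=float)

rng = np.random.default_rng(3)

def random_seq(gc, n=200):
    p = [(1 - gc) / 2, gc / 2, gc / 2, (1 - gc) / 2]  # A, C, G, T
    return "".join(rng.choice(list("ACGT"), size=n, p=p))

# Toy positives are GC-rich, mimicking one weak signal of regulatory regions.
X = np.array([kmer_counts(random_seq(0.6)) for _ in range(100)] +
             [kmer_counts(random_seq(0.4)) for _ in range(100)])
y = np.array([1] * 100 + [0] * 100)
print("train acc:", LogisticRegression(max_iter=1000).fit(X, y).score(X, y))
```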

  19. Music Genre Classification Systems - A Computational Approach

    OpenAIRE

    Ahrendt, Peter; Hansen, Lars Kai

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular...
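
    The standard pipeline behind such systems (per-song feature summaries followed by a classifier) can be sketched in a few lines; the fabricated 13-dimensional features below stand in for MFCC statistics and are not from the dissertation.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Fabricated per-song features: 50 "jazz" and 50 "rock" songs, 13 numbers
# each, standing in for summarized MFCCs.
rng = np.random.default_rng(42)
jazz = rng.normal(0.0, 1.0, (50, 13))
rock = rng.normal(1.5, 1.0, (50, 13))
X = np.vstack([jazz, rock])
y = np.array([0] * 50 + [1] * 50)   # 0 = jazz, 1 = rock

clf = KNeighborsClassifier(n_neighbors=5)
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```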

  20. Computational and Game-Theoretic Approaches for Modeling Bounded Rationality

    NARCIS (Netherlands)

    L. Waltman (Ludo)

    2011-01-01

    This thesis studies various computational and game-theoretic approaches to economic modeling. Unlike traditional approaches to economic modeling, the approaches studied in this thesis do not rely on the assumption that economic agents behave in a fully rational way. Instead, economic age

  1. Recent advances on hybrid approaches for designing intelligent systems

    CERN Document Server

    Melin, Patricia; Pedrycz, Witold; Kacprzyk, Janusz

    2014-01-01

    This book describes recent advances on hybrid intelligent systems using soft computing techniques for diverse areas of application, such as intelligent control and robotics, pattern recognition, time series prediction, and the optimization of complex problems. Soft Computing (SC) consists of several intelligent computing paradigms, including fuzzy logic, neural networks, and bio-inspired optimization algorithms, which can be used to produce powerful hybrid intelligent systems. The book is organized in five main parts, each containing a group of papers around a similar subject. The first part consists of papers with the main theme of type-2 fuzzy logic, which basically consists of papers that propose new models and applications for type-2 fuzzy systems. The second part contains papers with the main theme of bio-inspired optimization algorithms, which use nature-inspired techniques to solve complex optimization problems in diverse areas of application. The third part contains pape...

  2. SOFT COMPUTING APPROACH FOR NOISY IMAGE RESTORATION

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A genetic learning algorithm based fuzzy neural network was proposed for noisy image restoration, which can adaptively find and extract the fuzzy rules contained in the noise. It can efficiently remove image noise while preserving as much detailed image information as possible. The experimental results show that the proposed approach performs far better than conventional noise removal techniques.

  3. Photonic reservoir computing: a new approach to optical information processing

    OpenAIRE

    Vandoorne, Kristof; Fiers, Martin; Verstraeten, David; Schrauwen, Benjamin; Dambre, Joni; Bienstman, Peter

    2010-01-01

    Despite ever increasing computational power, recognition and classification problems remain challenging to solve. Recently, advances have been made by the introduction of the new concept of reservoir computing. This is a methodology coming from the field of machine learning and neural networks that has been successfully used in several pattern classification problems, like speech and image recognition. The implementations have so far been in software, limiting their speed and power efficiency. ...

  4. Computational approaches to homogeneous gold catalysis.

    Science.gov (United States)

    Faza, Olalla Nieto; López, Carlos Silva

    2015-01-01

    Homogeneous gold catalysis has expanded at an outstanding pace over the last decade. The best described reactivity of Au(I) and Au(III) species is based on gold's properties as a soft Lewis acid, but new reactivity patterns have recently emerged which further expand the range of transformations achievable using gold catalysis, with examples of dual gold activation, hydrogenation reactions, or Au(I)/Au(III) catalytic cycles. In this scenario, to fully develop all these new possibilities, the use of computational tools to understand, at an atomistic level of detail, the complete role of gold as a catalyst is unavoidable. In this work we aim to provide a comprehensive review of the available benchmark works on methodological options for studying homogeneous gold catalysis, in the hope that this effort can help guide the choice of method in future mechanistic studies involving gold complexes. This is relevant because a representative number of current mechanistic studies still use methods which have been reported as inappropriate and dangerously inaccurate for this chemistry. Together with this, we describe a number of recent mechanistic studies where computational chemistry has provided relevant insights into non-conventional reaction paths, unexpected selectivities or novel reactivity, which illustrate the complexity behind gold-mediated organic chemistry.

  5. A complex network approach to cloud computing

    CERN Document Server

    Travieso, Gonzalo; Bruno, Odemir Martinez; Costa, Luciano da Fontoura

    2015-01-01

    Cloud computing has become an important means to speed up computing. One problem that heavily influences the performance of such systems is the choice of nodes as servers responsible for executing the users' tasks. In this article we report how complex networks can be used to model such a problem. More specifically, we investigate the processing performance of cloud systems underlain by Erdos-Renyi (ER) and Barabasi-Albert (BA) topologies containing two servers. Cloud networks involving two communities, not necessarily of the same size, are also considered in our analysis. The performance of each configuration is quantified in terms of two indices: the cost of communication between the user and the nearest server, and the balance of the distribution of tasks between the two servers. Regarding the latter index, the ER topology provides better performance than the BA case for smaller average degrees, and the opposite behavior for larger average degrees. With respect to the cost, smaller values are found in the BA ...
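
    The two indices lend themselves to a compact simulation. The following sketch (Python with networkx) is illustrative only: the random two-server placement, plain hop counts, and graph sizes are assumptions, not the authors' exact experimental protocol.

        # Illustrative sketch: place two servers on comparable ER and BA graphs and
        # measure (i) mean hop distance to the nearest server and (ii) task balance.
        import random
        import networkx as nx

        def evaluate(graph, seed=0):
            random.seed(seed)
            s1, s2 = random.sample(list(graph.nodes), 2)   # random server placement
            d1 = nx.single_source_shortest_path_length(graph, s1)
            d2 = nx.single_source_shortest_path_length(graph, s2)
            users = [u for u in graph.nodes if u not in (s1, s2)]
            big = len(graph)                               # penalty if unreachable
            cost = sum(min(d1.get(u, big), d2.get(u, big)) for u in users) / len(users)
            at_s1 = sum(1 for u in users if d1.get(u, big) <= d2.get(u, big))
            balance = at_s1 / len(users)                   # 0.5 = perfectly balanced
            return cost, balance

        n = 500
        er = nx.erdos_renyi_graph(n, p=8 / n, seed=1)      # mean degree ~ 8
        ba = nx.barabasi_albert_graph(n, m=4, seed=1)      # mean degree ~ 8
        for name, g in [("ER", er), ("BA", ba)]:
            cost, balance = evaluate(g)
            print(f"{name}: mean cost = {cost:.2f}, share at server 1 = {balance:.2f}")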

  6. Novel Computational Approaches to Drug Discovery

    Science.gov (United States)

    Skolnick, Jeffrey; Brylinski, Michal

    2010-01-01

    New approaches to protein functional inference based on protein structure and evolution are described. First, FINDSITE, a threading based approach to protein function prediction, is summarized. Then, the results of large scale benchmarking of ligand binding site prediction, ligand screening, including applications to HIV protease, and GO molecular functional inference are presented. A key advantage of FINDSITE is its ability to use low resolution, predicted structures as well as high resolution experimental structures. Then, an extension of FINDSITE to ligand screening in GPCRs using predicted GPCR structures, FINDSITE/QDOCKX, is presented. This is a particularly difficult case as there are few experimentally solved GPCR structures. Thus, we first train on a subset of known binding ligands for a set of GPCRs; this is then followed by benchmarking against a large ligand library. For the virtual ligand screening of a number of Dopamine receptors, encouraging results are seen, with significant enrichment in identified ligands over those found in the training set. Thus, FINDSITE and its extensions represent a powerful approach to the successful prediction of a variety of molecular functions.

  7. Securing applications in personal computers: the relay race approach.

    OpenAIRE

    Wright, James Michael

    1991-01-01

    Approved for public release; distribution is unlimited. This thesis reviews the increasing need for security in a personal computer (PC) environment and proposes a new approach for securing PC applications at the application layer. The Relay Race Approach extends two standard approaches, data encryption and password access control at the main program level, to the subprogram level by the use of a special parameter, the "Baton". The applicability of this approach is de...

  8. A polyhedral approach to computing border bases

    CERN Document Server

    Braun, Gábor

    2009-01-01

    Border bases can be considered to be the natural extension of Gröbner bases, one that has several advantages. Unfortunately, to date the classical border basis algorithm relies on (degree-compatible) term orderings and implicitly on reduced Gröbner bases. We adapt the classical border basis algorithm to allow for calculating border bases for arbitrary degree-compatible order ideals, which is independent of term orderings. Moreover, the algorithm also supports calculating degree-compatible order ideals with preference on contained elements, even though finding a preferred order ideal is NP-hard. Effectively we retain degree-compatibility only to successively extend our computation degree-by-degree. The adaptation is based on our polyhedral characterization: order ideals that support a border basis correspond one-to-one to integral points of the order ideal polytope. This establishes a crucial connection between the ideal and the combinatorial structure of the associated factor spaces.

  9. Computational Diagnostic: A Novel Approach to View Medical Data.

    Energy Technology Data Exchange (ETDEWEB)

    Mane, K. K. (Ketan Kirtiraj); Börner, K. (Katy)

    2007-01-01

    A transition from traditional paper-based medical records to electronic health records is largely underway. The use of electronic records offers tremendous potential to personalize patient diagnosis and treatment. In this paper, we discuss a computational diagnostic tool that uses digital medical records to help doctors gain better insight about a patient's medical condition. The paper details different interactive features of the tool which offer potential to practice evidence-based medicine and advance patient diagnosis practices. The healthcare industry is a constantly evolving domain. Research from this domain is often translated into better understanding of different medical conditions, and this new knowledge often contributes towards improved diagnosis and treatment solutions for patients. But the healthcare industry has lagged in reaping the benefits of this new knowledge, as it still adheres to the traditional paper-based approach to keeping track of medical records. Recently, however, there has been a drive toward the electronic health record (EHR). An EHR stores patient medical records in digital format and offers the potential to replace paper health records. Earlier attempts at an EHR replicated the paper layout on the screen, represented the medical history of a patient in a graphical time-series format, or provided interactive visualization with 2D/3D images generated from an imaging device. But an EHR can be much more than just an 'electronic view' of the paper record or a collection of images from an imaging device. In this paper, we present an EHR called the 'Computational Diagnostic Tool', which provides a novel computational approach to looking at patient medical data. The developed EHR system is knowledge driven and acts as a clinical decision support tool. The EHR tool provides two visual views of the medical data. Dynamic interaction with the data is supported to help doctors practice evidence-based decisions and make judicious...

  10. Advances in neural networks computational and theoretical issues

    CERN Document Server

    Esposito, Anna; Morabito, Francesco

    2015-01-01

    This book collects research works that exploit neural networks and machine learning techniques from a multidisciplinary perspective. Subjects covered include theoretical, methodological and computational topics which are grouped together into chapters devoted to the discussion of novelties and innovations related to the field of Artificial Neural Networks, as well as the use of neural networks for applications, pattern recognition, signal processing, and special topics such as the detection and recognition of multimodal emotional expressions and daily cognitive functions, and bio-inspired memristor-based networks. Providing insights into the latest research interests of a pool of international experts from different research fields, the volume is valuable to all those with any interest in a holistic approach to implementing believable, autonomous, adaptive, and context-aware Information Communication Technologies.

  11. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    In this paper we discuss three approaches to modeling tree crown development. These approaches are experimental (i.e. regressive), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The common assumption of these is that a tree can be regarded as a fractal object, i.e. a collection of self-similar objects, which combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and of light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model on experimental data. In the paper the different stages of the above-mentioned approaches are described. The experimental data for spruce, the description of the computer system for modeling, and a variant of the computer model are presented. (author). 9 refs, 4 figs

  12. Q-P Wave traveltime computation by an iterative approach

    KAUST Repository

    Ma, Xuxin

    2013-01-01

    In this work, we present a new approach to computing anisotropic traveltimes by successively solving elliptically isotropic traveltime problems. The method shows good accuracy and is very simple to implement.

  13. Computing material fronts with a Lagrange-Projection approach

    CERN Document Server

    Chalons, Christophe

    2010-01-01

    This paper reports investigations on the computation of material fronts in multi-fluid models using a Lagrange-Projection approach. Various forms of the Projection step are considered. Particular attention is paid to minimization of conservation errors.

  14. Deterministic and risk-informed approaches for safety analysis of advanced reactors: Part I, deterministic approaches

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Sang Kyu [Korea Institute of Nuclear Safety, 19 Kusong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of); Kim, Inn Seock, E-mail: innseockkim@gmail.co [ISSA Technology, 21318 Seneca Crossing Drive, Germantown, MD 20876 (United States); Oh, Kyu Myung [Korea Institute of Nuclear Safety, 19 Kusong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of)

    2010-05-15

    The objective of this paper and a companion paper in this issue (part II, risk-informed approaches) is to derive technical insights from a critical review of deterministic and risk-informed safety analysis approaches that have been applied to develop licensing requirements for water-cooled reactors, or proposed for safety verification of the advanced reactor design. To this end, a review was made of a number of safety analysis approaches including those specified in regulatory guides and industry standards, as well as novel methodologies proposed for licensing of advanced reactors. This paper and the companion paper present the review insights on the deterministic and risk-informed safety analysis approaches, respectively. These insights could be used in making a safety case or developing a new licensing review infrastructure for advanced reactors including Generation IV reactors.

  15. Development of advanced nodal diffusion methods for modern computer architectures

    International Nuclear Information System (INIS)

    A family of highly efficient multidimensional multigroup advanced neutron-diffusion nodal methods, ILLICO, were implemented on sequential, vector, and vector-concurrent computers. Three-dimensional realistic benchmark problems can be solved in vectorized mode in less than 0.73 s (33.86 Mflops) on a Cray X-MP/48. Vector-concurrent implementations yield speedups as high as 9.19 on an Alliant FX/8. These results show that the ILLICO method preserves essentially all of its speed advantage over finite-difference methods. A self-consistent higher-order nodal diffusion method was developed and implemented. Nodal methods for global nuclear reactor multigroup diffusion calculations which account explicitly for heterogeneities in the assembly nuclear properties were developed and evaluated. A systematic analysis of the zero-order variable cross section nodal method was conducted. Analyzing the KWU PWR depletion benchmark problem, it is shown that when burnup heterogeneities arise, ordinary nodal methods, which do not explicitly treat the heterogeneities, suffer a significant systematic error that accumulates. A nodal method that treats explicitly the space dependence of diffusion coefficients was developed and implemented. A consistent burnup-correction method for nodal microscopic depletion analysis was developed

  16. Quantitative Computed Tomography and image analysis for advanced muscle assessment

    Directory of Open Access Journals (Sweden)

    Kyle Joseph Edmunds

    2016-06-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration.

  17. Computational Efforts in Support of Advanced Coal Research

    Energy Technology Data Exchange (ETDEWEB)

    Suljo Linic

    2006-08-17

    The focus of this project was to employ first-principles computational methods to study the underlying molecular elementary processes that govern hydrogen diffusion through Pd membranes, as well as the elementary processes that govern the CO- and S-poisoning of these membranes. Our computational methodology integrated a multiscale hierarchical modeling approach, wherein a molecular understanding of the interactions between various species is gained from ab-initio quantum chemical Density Functional Theory (DFT) calculations, while a mesoscopic statistical mechanical model like Kinetic Monte Carlo is employed to predict the key macroscopic membrane properties such as permeability. The key developments are: (1) We have systematically coupled the ab initio calculations with Kinetic Monte Carlo (KMC) simulations to model hydrogen diffusion through the Pd-based membranes. The predicted tracer diffusivity of hydrogen atoms through the bulk of the Pd lattice from KMC simulations is in excellent agreement with experiments. (2) The KMC simulations of dissociative adsorption of H2 over the Pd(111) surface indicate that for thin membranes (less than 10 µm thick), the diffusion of hydrogen from the surface to the first subsurface layer is rate limiting. (3) Sulfur poisons the Pd surface by altering the electronic structure of the Pd atoms in the vicinity of the S atom. The KMC simulations indicate that increasing sulfur coverage drastically reduces the hydrogen coverage on the Pd surface and hence the driving force for diffusion through the membrane.
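
    To make the DFT-to-KMC coupling concrete, the following is a generic lattice KMC sketch of tracer diffusion (Python/NumPy). The barrier, attempt frequency, and hop distance are hypothetical placeholders, not the project's DFT-derived values.

        # Generic KMC sketch: random walkers hop on a 2-D lattice at an
        # Arrhenius rate; diffusivity follows from the Einstein relation.
        import numpy as np

        kB, T = 8.617e-5, 600.0              # Boltzmann constant (eV/K), temperature (K)
        Ea, nu = 0.15, 1.0e13                # hypothetical barrier (eV), attempt freq. (1/s)
        a = 2.75e-10                         # hop distance (m), illustrative
        rate = nu * np.exp(-Ea / (kB * T))   # total hop rate out of a site

        rng = np.random.default_rng(0)
        n_walkers, n_hops = 200, 2000
        moves = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])
        pos = np.zeros((n_walkers, 2))
        t = 0.0
        for _ in range(n_hops):
            pos += moves[rng.integers(0, 4, size=n_walkers)]  # uniform hop direction
            t += 1.0 / rate        # mean residence time (full KMC samples an exponential)
        msd = (a ** 2) * (pos ** 2).sum(axis=1).mean()        # mean squared displacement
        D = msd / (4.0 * t)                                   # Einstein relation in 2-D
        print(f"estimated tracer diffusivity: {D:.3e} m^2/s")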

  18. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing. The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems. Sampling-based simulation techniques are now an invaluable tool for exploring statistical models. This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods. It also includes some advanced met...
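
    A minimal example of the sampling-based estimation the book covers, assuming Python/NumPy (the integrand is an arbitrary illustration, not taken from the book):

        # Monte Carlo sketch: estimate E[g(X)] for X ~ N(0, 1) by simple sampling,
        # with a 95% confidence half-width from the sample standard error.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        x = rng.standard_normal(n)
        g = np.exp(-x ** 2)                  # E[exp(-X^2)] = 1/sqrt(3) exactly
        est, se = g.mean(), g.std(ddof=1) / np.sqrt(n)
        print(f"estimate = {est:.4f} +/- {1.96 * se:.4f}  (exact = {1 / np.sqrt(3):.4f})")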

  19. Soft computing approaches to uncertainty propagation in environmental risk mangement

    OpenAIRE

    Kumar, Vikas

    2008-01-01

    Real-world problems, especially those that involve natural systems, are complex and composed of many nondeterministic components having non-linear coupling. It turns out that in dealing with such systems, one has to face a high degree of uncertainty and tolerate imprecision. Classical system models based on numerical analysis, crisp logic or binary logic have characteristics of precision and categoricity and are classified as hard computing approaches. In contrast, soft computing approaches like pro...

  20. Biologically motivated computationally intensive approaches to image pattern recognition

    NARCIS (Netherlands)

    Petkov, Nikolay

    1995-01-01

    This paper presents some of the research activities of the research group on vision, a grand challenge problem whose solution is estimated to need the power of Tflop/s computers and for which computational methods have yet to be developed. The approaches concerned are biologically motivated, in th...

  1. An Approach to Dynamic Provisioning of Social and Computational Services

    NARCIS (Netherlands)

    Bonino da Silva Santos, Luiz Olavo; Sorathia, Vikram; Ferreira Pires, Luis; Sinderen, van Marten

    2010-01-01

    Service-Oriented Computing (SOC) builds upon the intuitive notion of service already known and used in our society for a long time. SOC-related approaches are based on computer-executable functional units that often represent automation of services that exist at the social level, i.e., services at t...

  2. Analyzing the Drivers of Advanced Sustainable Manufacturing System Using AHP Approach

    Directory of Open Access Journals (Sweden)

    K. Madan Shankar

    2016-08-01

    A number of current manufacturing sectors are striving hard to introduce innovative long-term strategies into their operations. As a result, many scholarly studies have found it fruitful to investigate advanced manufacturing strategies such as agile, computer-integrated, and cellular manufacturing. Through downstream cases, manufacturing sectors have learned that the financial benefits garnered through automated technologies cannot be counted on as the sole measure to ensure their success in today's competitive and fluctuating marketplaces. The objective of this study is to integrate those advanced techniques with sustainable operations, to promote advanced sustainable manufacturing so that manufacturing sectors can thrive even in uncertain markets. To establish this connection, this study analyzes the drivers of advanced sustainable manufacturing through a proposed framework validated through a case study in India. Common drivers are collected from the literature, calibrated with opinions from experts, and analyzed through the analytical hierarchy process (AHP), a multi-criteria decision making (MCDM) approach. This study reveals that quality is the primary driver that pressures manufacturing sectors to adopt advanced sustainable manufacturing. Manufacturers can easily note the top-ranked driver and adopt it to soundly implement advanced sustainable manufacturing. In addition, some key future scopes are explored along with possible recommendations for effective implementation of advanced sustainable manufacturing systems.
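
    The core AHP computation behind such a ranking is compact. The sketch below uses an assumed 3x3 pairwise-comparison matrix with illustrative judgments, not the study's expert data:

        # AHP sketch: priority weights from the principal eigenvector of a
        # pairwise-comparison matrix, plus Saaty's consistency ratio (CR).
        import numpy as np

        A = np.array([[1.0,   3.0,   5.0],    # e.g., quality vs. cost vs. flexibility
                      [1/3.0, 1.0,   3.0],
                      [1/5.0, 1/3.0, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                          # priority weights

        n = A.shape[0]
        CI = (eigvals.real[k] - n) / (n - 1)  # consistency index
        RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
        print("weights:", np.round(w, 3), " CR =", round(CI / RI, 3))  # CR < 0.1 is acceptable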

  3. General approaches in ensemble quantum computing

    Indian Academy of Sciences (India)

    V Vimalan; N Chandrakumar

    2008-01-01

    We have developed methodology for NMR quantum computing focusing on enhancing the efficiency of initialization, of logic gate implementation and of readout. Our general strategy involves the application of rotating frame pulse sequences to prepare pseudopure states and to perform logic operations. We demonstrate our methodology experimentally for both homonuclear and heteronuclear spin ensembles. On model two-spin systems, the initialization time of one of our sequences is three-fourths (in the heteronuclear case) or one-fourth (in the homonuclear case) of that of the typical pulsed free precession sequences, attaining the same initialization efficiency. We have implemented the logical SWAP operation in homonuclear AMX spin systems using selective isotropic mixing, reducing the duration taken to a third compared to the standard re-focused INEPT-type sequence. We introduce the 1D version for readout of the rotating frame SWAP operation, in an attempt to reduce readout time. We further demonstrate the Hadamard mode of 1D SWAP, which offers a 2N-fold reduction in experiment time for a system with N working bits, attaining the same sensitivity as the standard 1D version.

  4. Block sparse Cholesky algorithms on advanced uniprocessor computers

    Energy Technology Data Exchange (ETDEWEB)

    Ng, E.G.; Peyton, B.W.

    1991-12-01

    As with many other linear algebra algorithms, devising a portable implementation of sparse Cholesky factorization that performs well on the broad range of computer architectures currently available is a formidable challenge. Even after limiting our attention to machines with only one processor, as we have done in this report, there are still several interesting issues to consider. For dense matrices, it is well known that block factorization algorithms are the best means of achieving this goal. We take this approach for sparse factorization as well. This paper has two primary goals. First, we examine two sparse Cholesky factorization algorithms, the multifrontal method and a blocked left-looking sparse Cholesky method, in a systematic and consistent fashion, both to illustrate the strengths of the blocking techniques in general and to obtain a fair evaluation of the two approaches. Second, we assess the impact of various implementation techniques on time and storage efficiency, paying particularly close attention to the work-storage requirement of the two methods and their variants.
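
    The blocking idea for the dense case can be sketched briefly. The following is a generic left-looking blocked factorization in Python/NumPy, illustrative only; the report's sparse variants add supernode and frontal-matrix handling not shown here:

        # Left-looking blocked Cholesky: each block column is updated by the
        # already-factored panels to its left, then factored and solved.
        import numpy as np

        def blocked_cholesky(A, nb=64):
            A = A.copy()
            n = A.shape[0]
            for j in range(0, n, nb):
                je = min(j + nb, n)
                # left-looking update from previously factored columns
                A[j:, j:je] -= A[j:, :j] @ A[j:je, :j].T
                # factor the diagonal block, then solve for the panel below it
                A[j:je, j:je] = np.linalg.cholesky(A[j:je, j:je])
                if je < n:
                    A[je:, j:je] = np.linalg.solve(A[j:je, j:je], A[je:, j:je].T).T
            return np.tril(A)

        rng = np.random.default_rng(0)
        M = rng.standard_normal((300, 300))
        A = M @ M.T + 300 * np.eye(300)       # symmetric positive definite test matrix
        L = blocked_cholesky(A)
        print(np.allclose(L @ L.T, A))        # True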

  5. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, and, more recently, in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and of correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.
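
    As an illustration of correlation-matrix-based anomaly detection, the sketch below uses synthetic traffic features and an arbitrary deviation score; it is not the paper's exact method:

        # Flag traffic windows whose correlation structure deviates from a baseline.
        import numpy as np

        rng = np.random.default_rng(1)
        U = np.array([[1.0, 0.3, 0.2],
                      [0.0, 1.0, 0.3],
                      [0.0, 0.0, 1.0]])        # mixing matrix inducing baseline correlations

        def corr_matrix(window):               # rows: time samples, cols: features
            return np.corrcoef(window, rowvar=False)

        base = rng.normal(size=(200, 3)) @ U    # baseline: pkts, bytes, flows (synthetic)
        C0 = corr_matrix(base)

        def anomaly_score(window):
            return np.abs(corr_matrix(window) - C0).max()   # max deviation from baseline

        normal = rng.normal(size=(50, 3)) @ U
        attack = normal.copy()
        attack[:, 0] = attack[:, 1] * 5 + rng.normal(scale=0.01, size=50)  # lockstep surge

        print("normal:", round(anomaly_score(normal), 2))
        print("attack:", round(anomaly_score(attack), 2))   # markedly higher score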

  6. Integrated Computational Materials Engineering (ICME) for Third Generation Advanced High-Strength Steel Development

    Energy Technology Data Exchange (ETDEWEB)

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham; Sachdev, Anil K.; Quinn, James; Krupitzer, Ronald; Sun, Xin

    2015-06-01

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  7. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    Energy Technology Data Exchange (ETDEWEB)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop, system is under construction. 10 refs., 7 figs.

  8. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Computational intelligence approaches constitute a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  9. What is intrinsic motivation? A typology of computational approaches

    Directory of Open Access Journals (Sweden)

    Pierre-Yves Oudeyer

    2009-11-01

    Intrinsic motivation, the causal mechanism for spontaneous exploration and curiosity, is a central concept in developmental psychology. It has been argued to be a crucial mechanism for open-ended cognitive development in humans, and as such has gathered growing interest from developmental roboticists in recent years. The goal of this paper is threefold. First, it provides a synthesis of the different approaches to intrinsic motivation in psychology. Second, by interpreting these approaches in a computational reinforcement learning framework, we argue that they are not operational and even sometimes inconsistent. Third, we set the ground for a systematic operational study of intrinsic motivation by presenting a formal typology of possible computational approaches. This typology is partly based on existing computational models, but also presents new ways of conceptualizing intrinsic motivation. We argue that this kind of computational typology might be useful for opening new avenues for research both in psychology and developmental robotics.
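
    One common entry in such typologies, knowledge-based intrinsic motivation with reward equal to forward-model prediction error, can be made operational in a few lines. The toy chain environment, tabular model, and learning rate below are illustrative assumptions, not the paper's formalism:

        # Intrinsic reward = surprise (-log predicted probability of the next state);
        # it decays as the forward model learns, the signature of curiosity.
        import numpy as np

        rng = np.random.default_rng(0)
        n_states, n_actions = 10, 2
        step = lambda s, a: max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)

        # tabular forward model: predicted next-state distribution per (s, a)
        model = np.full((n_states, n_actions, n_states), 1.0 / n_states)

        s = 0
        for t in range(500):
            a = rng.integers(n_actions)
            s2 = step(s, a)
            r_int = -np.log(model[s, a, s2])     # intrinsic reward (prediction error)
            model[s, a] *= 0.9                   # learn: shift mass toward observed s2
            model[s, a, s2] += 0.1
            if t % 100 == 0:
                print(f"t={t:3d}  intrinsic reward = {r_int:.2f}")
            s = s2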

  10. 78 FR 68058 - Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational...

    Science.gov (United States)

    2013-11-13

    ENVIRONMENTAL PROTECTION AGENCY. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. ... The comment period for the draft document, "Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology," was published on September 30, 2013. At the request of the American Chemistry Council, the...

  11. An Integrated Computer-Aided Approach for Environmental Studies

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Chen, Fei; Jaksland, Cecilia;

    1997-01-01

    A general framework for an integrated computer-aided approach to solve process design, control, and environmental problems simultaneously is presented. Physicochemical properties and their relationships to the molecular structure play an important role in the proposed integrated approach. The scope and applicability of the integrated approach is highlighted through examples involving estimation of properties and environmental pollution prevention. The importance of mixture effects on some environmentally important properties is also demonstrated.

  12. Propagation of computer virus both across the Internet and external computers: A complex-network approach

    Science.gov (United States)

    Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi; Jin, Jian; He, Li

    2014-08-01

    Based on the assumption that external computers (particularly infected ones) are connected to the Internet, and by considering the influence of the Internet topology on computer virus spreading, this paper establishes a novel computer virus propagation model with a complex-network approach. This model possesses a unique (viral) equilibrium which is globally attractive. Some numerical simulations are also given to illustrate this result. Further study shows that computers with higher node degrees are more susceptible to infection than those with lower node degrees. In this regard, some appropriate protective measures are suggested.
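
    The degree effect reported here is easy to reproduce qualitatively with a synthetic SIS-style simulation (Python with networkx). The infection and cure rates below are arbitrary choices, not the paper's model parameters:

        # On a scale-free (BA) network, high-degree nodes spend more time infected.
        import networkx as nx
        import numpy as np

        rng = np.random.default_rng(0)
        G = nx.barabasi_albert_graph(2000, 3, seed=0)
        beta, gamma = 0.05, 0.1                  # infection / cure probability per step
        infected = set(rng.choice(2000, size=20, replace=False))
        exposure = np.zeros(2000)                # time steps spent infected, per node

        for _ in range(300):
            new = set(infected)
            for u in infected:
                for v in G[u]:                   # try to infect susceptible neighbors
                    if v not in infected and rng.random() < beta:
                        new.add(v)
                if rng.random() < gamma:         # cure (back to susceptible)
                    new.discard(u)
            infected = new
            for u in infected:
                exposure[u] += 1

        deg = np.array([G.degree(v) for v in G])
        lo, hi = deg <= np.median(deg), deg > np.median(deg)
        print("mean infected time, low-degree :", exposure[lo].mean())
        print("mean infected time, high-degree:", exposure[hi].mean())  # larger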

  13. An integrated approach to emotion recognition for advanced emotional intelligence

    OpenAIRE

    Panagiotis D Bamidis; Frantzidis, Christos A.; Konstantinidis, Evdokimos I.; Luneski, Andrej; Lithari, Chrysa; Klados, Manousos A.; Bratsas, Charalambos; Papadelis, Christos; Pappas, Costas

    2009-01-01

    Emotion identification is beginning to be considered as an essential feature in human-computer interaction. However, most of the studies are mainly focused on facial expression classifications and speech recognition and not much attention has been paid until recently to physiological pattern recognition. In this paper, an integrative approach is proposed to emotional interaction by fusing multi-modal signals. Subjects are exposed to pictures selected from the International Affective Pic...

  14. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness of the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational thinking. We present two main theses on which the subject is based, and we present the included knowledge areas and didactical design principles. Finally, we summarize the status and future plans for the subject and related development projects.

  15. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    Science.gov (United States)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  16. Recent Advances in Computational Methods for Nuclear Magnetic Resonance Data Processing

    KAUST Repository

    Gao, Xin

    2013-01-11

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  17. A tale of three bio-inspired computational approaches

    Science.gov (United States)

    Schaffer, J. David

    2014-05-01

    I will provide a high-level walk-through of three computational approaches derived from Nature. First, evolutionary computation implements what we may call the "mother of all adaptive processes." Some variants on the basic algorithms will be sketched, and some lessons I have gleaned from three decades of working with EC will be covered. Next come neural networks, computational approaches that have long been studied as possible ways to make "thinking machines" (an old dream of man's) and that are based upon the only known existing example of intelligence. Then I will give a little overview of attempts to combine these two approaches, which some hope will allow us to evolve machines we could never hand-craft. Finally, I will touch on artificial immune systems, Nature's highly sophisticated defense mechanism, which has emerged in two major stages: the innate and the adaptive immune systems. This technology is finding applications in the cyber security world.
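
    A minimal evolutionary computation sketch of the first approach, using the OneMax toy problem; the operator choices and rates are illustrative, not the speaker's:

        # Tiny genetic algorithm: tournament selection, one-point crossover,
        # bit-flip mutation, maximizing the number of 1-bits in a string.
        import numpy as np

        rng = np.random.default_rng(0)
        pop = rng.integers(0, 2, size=(40, 64))        # 40 bitstrings of length 64

        for gen in range(60):
            fit = pop.sum(axis=1)
            i, j = rng.integers(0, 40, (2, 40))        # tournament of size two
            parents = pop[np.where(fit[i] > fit[j], i, j)]
            kids = parents.copy()
            cuts = rng.integers(1, 64, 20)             # one-point crossover per pair
            for k, c in enumerate(cuts):
                kids[2 * k, c:] = parents[2 * k + 1, c:]
                kids[2 * k + 1, c:] = parents[2 * k, c:]
            kids ^= (rng.random(kids.shape) < 0.01).astype(kids.dtype)  # mutation
            pop = kids

        print("best fitness:", pop.sum(axis=1).max(), "/ 64")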

  18. Advanced multiresponse process optimisation an intelligent and integrated approach

    CERN Document Server

    Šibalija, Tatjana V

    2016-01-01

    This book presents an intelligent, integrated, problem-independent method for multiresponse process optimization. In contrast to traditional approaches, the idea of this method is to provide a unique model for the optimization of various processes, without imposition of assumptions relating to the type of process, the type and number of process parameters and responses, or interdependences among them. The presented method for experimental design of processes with multiple correlated responses is composed of three modules: an expert system that selects the experimental plan based on the orthogonal arrays; the factor effects approach, which performs processing of experimental data based on Taguchi’s quality loss function and multivariate statistical methods; and process modeling and optimization based on artificial neural networks and metaheuristic optimization algorithms. The implementation is demonstrated using four case studies relating to high-tech industries and advanced, non-conventional processes.

  19. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    Science.gov (United States)

    1985-01-01

    Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and in the long-term that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  20. Parallel computing in genomic research: advances and applications

    Directory of Open Access Journals (Sweden)

    Ocaña K

    2015-11-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. Keywords: high-performance computing, genomic research, cloud computing, grid computing, cluster computing, parallel computing

  1. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    Energy Technology Data Exchange (ETDEWEB)

    Goodarz Ahmadi

    2004-10-01

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column, and made use of the Lagrangian trajectory analysis for the bubble and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for bubble motion in a constant shear flow field was also developed. The flow conditions in the simple shear flow device were studied using the PIV method. Concentration and velocity of particles of different sizes near a wall in a duct flow were also measured; the technique of Phase-Doppler anemometry was used in these studies. An Eulerian volume of fluid (VOF) computational model for the flow condition in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed; the model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied, and the simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows, with and without chemical reaction, in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.

  2. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    International Nuclear Information System (INIS)

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column, and made use of the Lagrangian trajectory analysis for the bubble and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for bubble motion in a constant shear flow field was also developed. The flow conditions in the simple shear flow device were studied using the PIV method. Concentration and velocity of particles of different sizes near a wall in a duct flow were also measured; the technique of Phase-Doppler anemometry was used in these studies. An Eulerian volume of fluid (VOF) computational model for the flow condition in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed; the model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied, and the simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows, with and without chemical reaction, in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.

  3. Bio-inspired computational techniques based on advanced condition monitoring

    Institute of Scientific and Technical Information of China (English)

    Su Liangcheng; He Shan; Li Xiaoli; Li Xinglin

    2011-01-01

    The application of bio-inspired computational techniques to the field of condition monitoring is addressed. First, the bio-inspired computational techniques are briefly introduced, and the advantages and disadvantages of these computational methods are made clear. Then, the roles of condition monitoring in predictive maintenance and failure prediction, and the development trends of condition monitoring, are discussed. Finally, a case study on the condition monitoring of a grinding machine is described, which shows the application of bio-inspired computational techniques to a practical condition monitoring system.

  4. Computational challenges of structure-based approaches applied to HIV.

    Science.gov (United States)

    Forli, Stefano; Olson, Arthur J

    2015-01-01

    Here, we review some of the opportunities and challenges that we face in computational modeling of HIV therapeutic targets and structural biology, both in terms of methodology development and structure-based drug design (SBDD). Computational methods have provided fundamental support to HIV research since the initial structural studies, helping to unravel details of HIV biology. Computational models have proved to be a powerful tool to analyze and understand the impact of mutations and to overcome their structural and functional influence in drug resistance. With the availability of structural data, in silico experiments have been instrumental in exploiting and improving interactions between drugs and viral targets, such as HIV protease, reverse transcriptase, and integrase. Issues such as viral target dynamics and mutational variability, as well as the role of water and estimates of binding free energy in characterizing ligand interactions, are areas of active computational research. Ever-increasing computational resources and theoretical and algorithmic advances have played a significant role in progress to date, and we envision a continually expanding role for computational methods in our understanding of HIV biology and SBDD in the future.

  5. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    Energy Technology Data Exchange (ETDEWEB)

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    Tri-isotropic (TRISO) fuel particle coating is critical for the future use of nuclear energy produced by advanced gas reactors (AGRs). The fuel kernels are coated using chemical vapor deposition in a spouted fluidized bed. The challenges encountered in operating TRISO fuel coaters are due to the fact that in modern AGRs, such as High Temperature Gas Reactors (HTGRs), the acceptable level of defective/failed coated particles is essentially zero. This specification requires processes that produce coated spherical particles with even coatings having extremely low defect fractions. Unfortunately, the scale-up and design of the current processes and coaters have been based on empirical approaches and are operated as black boxes. Hence, a voluminous amount of experimental development and trial-and-error work has been conducted. It has been clearly demonstrated that the quality of the coating applied to the fuel kernels is impacted by the hydrodynamics, solids flow field, and flow regime characteristics of the spouted bed coaters, which themselves are influenced by design parameters and operating variables. Further complicating the outlook for future fuel-coating technology and nuclear energy production is the fact that a variety of new concepts will involve fuel kernels of different sizes and with compositions of different densities. Therefore, without a fundamental understanding of the underlying phenomena of the spouted bed TRISO coater, a significant amount of effort is required for production of each type of particle, with a significant risk of not meeting the specifications. This difficulty will significantly and negatively impact the applications of AGRs for power generation and pose further challenges to them as an alternative source of commercial energy production. Accordingly, the proposed work seeks to overcome such hurdles and advance the scale-up, design, and performance of TRISO fuel particle spouted bed coaters. The overall objectives of the proposed work are...

  6. The Advance of Computing from the Ground to the Cloud

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  7. Advances in soft computing, intelligent robotics and control

    CERN Document Server

    Fullér, Robert

    2014-01-01

    Soft computing, intelligent robotics and control are at the core of contemporary engineering interest. Essential characteristics of soft computing methods are the ability to handle vague information, to apply human-like reasoning, their learning capability, and ease of application. Soft computing techniques are widely applied in the control of dynamic systems, including mobile robots. The present volume is a collection of 20 chapters written by respected experts in the fields, addressing various theoretical and practical aspects of soft computing, intelligent robotics and control. The first part of the book concerns issues of intelligent robotics, including robust fixed point transformation design, experimental verification of the input-output feedback linearization of a differentially driven mobile robot, and applying kinematic synthesis to micro-electro-mechanical systems design. The second part of the book is devoted to fundamental aspects of soft computing. This includes practical aspects of fuzzy rule ...

  8. A computationally efficient approach for hidden-Markov model-augmented fingerprint-based positioning

    Science.gov (United States)

    Roth, John; Tummala, Murali; McEachen, John

    2016-09-01

    This paper presents a computationally efficient approach to mobile subscriber position estimation in wireless networks. A method of data scaling assisted by timing adjust is introduced for fingerprint-based location estimation under a framework which allows computational cost to be minimised. The proposed method maintains a level of accuracy comparable to the traditional case where no data scaling is used, and is evaluated in a simulated environment under varying channel conditions. The proposed scheme is studied when it is augmented by a hidden-Markov model to match the internal parameters to the channel conditions that are present, thus minimising computational cost while maximising accuracy. Furthermore, the timing adjust quantity, available in modern wireless signalling messages, is shown to further reduce computational cost and increase accuracy when available. The results may be seen as a significant step towards integrating advanced position-based modelling with power-sensitive mobile devices.
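
    The hidden-Markov augmentation of fingerprinting can be sketched as Viterbi smoothing over fingerprinted cells. The synthetic one-dimensional corridor, Gaussian RSS model, and all parameters below are assumptions for illustration, not the paper's configuration:

        # Viterbi decoding of the most likely cell sequence from noisy RSS readings.
        import numpy as np

        rng = np.random.default_rng(0)
        n_cells, n_aps = 20, 4
        db = rng.normal(-60, 8, size=(n_cells, n_aps))    # stored RSS fingerprints (dBm)

        # transition model: the subscriber stays put or moves to an adjacent cell
        T = np.zeros((n_cells, n_cells))
        for c in range(n_cells):
            for d in (-1, 0, 1):
                if 0 <= c + d < n_cells:
                    T[c, c + d] = 1.0
        T /= T.sum(axis=1, keepdims=True)

        def log_emission(obs, sigma=4.0):                 # Gaussian RSS likelihood
            return -((db - obs) ** 2).sum(axis=1) / (2 * sigma ** 2)

        def viterbi(observations):
            logp = np.log(np.full(n_cells, 1.0 / n_cells)) + log_emission(observations[0])
            back = []
            for obs in observations[1:]:
                cand = logp[:, None] + np.log(T + 1e-12)  # prev-cell x next-cell scores
                back.append(cand.argmax(axis=0))
                logp = cand.max(axis=0) + log_emission(obs)
            path = [int(logp.argmax())]
            for b in reversed(back):                      # backtrack best predecessors
                path.append(int(b[path[-1]]))
            return path[::-1]

        true_path = [3, 4, 5, 5, 6, 7, 8, 8, 9, 10]
        obs = db[true_path] + rng.normal(0, 4.0, size=(len(true_path), n_aps))
        print("estimated:", viterbi(obs))
        print("true     :", true_path)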

  9. A Unitifed Computational Approach to Oxide Aging Processes

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, D.J.; Fleetwood, D.M.; Hjalmarson, H.P.; Schultz, P.A.

    1999-01-27

    In this paper we describe a unified, hierarchical computational approach to aging and reliability problems caused by materials changes in the oxide layers of Si-based microelectronic devices. We apply this method to a particular low-dose-rate radiation effects problem.

  10. A computational approach to mechanistic and predictive toxicology of pesticides

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Vinggaard, Anne Marie; Hadrup, Niels;

    2014-01-01

    Emerging challenges of managing and interpreting large amounts of complex biological data have given rise to the growing field of computational biology. We investigated the applicability of an integrated systems toxicology approach on five selected pesticides to get an overview of their modes of action...

  11. Tutorial on Computing: Technological Advances, Social Implications, Ethical and Legal Issues

    OpenAIRE

    Debnath, Narayan

    2012-01-01

    Computing and information technology have made significant advances. The use of computing and technology is a major aspect of our lives, and this use will only continue to increase in our lifetime. Electronic digital computers and high performance communication networks are central to contemporary information technology. The computing applications in a wide range of areas including business, communications, medical research, transportation, entertainments, and education are transforming lo...

  12. Advanced neural network-based computational schemes for robust fault diagnosis

    CERN Document Server

    Mrugalski, Marcin

    2014-01-01

    The present book is devoted to problems of adapting artificial neural networks to robust fault diagnosis schemes. It presents neural network-based modelling and estimation techniques used for designing robust fault diagnosis schemes for non-linear dynamic systems. Part of the book focuses on fundamental issues such as architectures of dynamic neural networks, methods for designing neural networks and fault diagnosis schemes, and the importance of robustness. The book has tutorial value and can be perceived as a good starting point for newcomers to this field. The book is also devoted to advanced schemes for describing neural model uncertainty. In particular, methods for computing neural network uncertainty with robust parameter estimation are presented. Moreover, a novel approach for system identification with the state-space GMDH neural network is delivered. All the concepts described in this book are illustrated by both simple academic illustrative examples and practica...

  13. Relaxed resource advance reservation policy in grid computing

    Institute of Scientific and Technical Information of China (English)

    XIAO Peng; HU Zhi-gang

    2009-01-01

    The advance reservation technique has been widely applied in many grid systems to provide end-to-end quality of service (QoS). However, it will result in a low resource utilization rate and a high rejection rate when the reservation rate is high. To mitigate these negative effects brought about by advance reservation, a relaxed advance reservation policy is proposed, which allows accepting new reservation requests that overlap the existing reservations under certain conditions. Both the benefits and the risks of the proposed policy are presented theoretically. The experimental results show that the policy can achieve a higher resource utilization rate and lower rejection rate compared to the conventional reservation policy and backfilling technique. In addition, the policy shows better adaptability when grid systems operate under a high reservation rate.
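
    A minimal sketch of how such a relaxed admission test might look is given below; the capacity-plus-slack condition is a hypothetical stand-in for the paper's own acceptance criterion, shown only to make the idea of conditionally accepting overlaps concrete.

```python
from dataclasses import dataclass

@dataclass
class Reservation:
    start: float   # reservation window [start, end)
    end: float
    cpus: int

def admit(new, existing, capacity, slack=0.0):
    """Accept a request that overlaps existing reservations as long as
    aggregate demand in every sub-interval stays within
    capacity * (1 + slack); slack > 0 relaxes the conventional policy."""
    all_res = existing + [new]
    events = sorted({t for r in all_res for t in (r.start, r.end)})
    for t0, t1 in zip(events, events[1:]):
        mid = (t0 + t1) / 2
        demand = sum(r.cpus for r in all_res if r.start <= mid < r.end)
        if demand > capacity * (1 + slack):
            return False
    return True

booked = [Reservation(0, 10, 6), Reservation(5, 15, 2)]
req = Reservation(8, 12, 2)
print(admit(req, booked, capacity=9))             # False: strict policy rejects
print(admit(req, booked, capacity=9, slack=0.2))  # True: relaxed policy accepts
```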

  14. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer-assisted virtual environment, or CAVE(TM), that allows scientists and engineers to literally walk into their data...

  15. Building an Advanced Computing Environment with SAN Support

    Institute of Scientific and Technical Information of China (English)

    Dajian YANG; Mei MA; et al.

    2001-01-01

    The current computing environment of our Computing Center at IHEP uses a SAS (Server Attached Storage) architecture, attaching all the storage devices directly to the machines. This kind of storage strategy cannot properly meet the requirements of our BEPC II/BESIII project. We have therefore designed and implemented a SAN-based computing environment, which consists of several computing farms, a three-level storage pool, a set of storage management software and a web-based data management system. The features of our system include cross-platform data sharing, fast data access, high scalability, convenient storage management and data management.

  16. Advances in Physarum machines sensing and computing with Slime mould

    CERN Document Server

    2016-01-01

    This book is devoted to the slime mould Physarum polycephalum, which is a large single cell capable of distributed sensing, concurrent information processing, parallel computation and decentralized actuation. The ease of culturing and experimenting with Physarum makes this slime mould an ideal substrate for real-world implementations of unconventional sensing and computing devices. The book is a treatise of theoretical and experimental laboratory studies on sensing and computing properties of slime mould, and on the development of mathematical and logical theories of Physarum behavior. It is shown how to make logical gates and circuits, electronic devices (memristors, diodes, transistors, wires, chemical and tactile sensors) with the slime mould. The book demonstrates how to modify properties of Physarum computing circuits with functional nano-particles and polymers, to interface the slime mould with field-programmable arrays, and to use Physarum as a controller of microbial fuel cells. A unique multi-agent model...

  17. A computational intelligence approach to the Mars Precision Landing problem

    Science.gov (United States)

    Birge, Brian Kent, III

    Various proposed Mars missions, such as the Mars Sample Return Mission (MRSR) and the Mars Smart Lander (MSL), require precise re-entry terminal position and velocity states. This is to achieve mission objectives including rendezvous with a previously landed mission, or reaching a particular geographic landmark. The current state-of-the-art footprint is on the order of kilometers. For this research a Mars Precision Landing is achieved with a landed footprint of no more than 100 meters, for a set of initial entry conditions representing worst-guess dispersions. Obstacles to reducing the landed footprint include trajectory dispersions due to initial atmospheric entry conditions (entry angle, parachute deployment height, etc.), environment (wind, atmospheric density, etc.), parachute deployment dynamics, unavoidable injection error (propagated error from launch on), etc. Weather and atmospheric models have been developed. Three descent scenarios have been examined. First, terminal re-entry is achieved via a ballistic parachute with concurrent thrusting events while on the parachute, followed by a gravity turn. Second, terminal re-entry is achieved via a ballistic parachute followed by gravity turn to hover and then thrust vector to desired location. Third, a guided parafoil approach followed by vectored thrusting to reach terminal velocity is examined. The guided parafoil is determined to be the best architecture. The purpose of this study is to examine the feasibility of using a computational intelligence strategy to facilitate precision planetary re-entry, specifically to take an approach that is somewhat more intuitive and less rigid, and see where it leads. The test problems used for all research are variations on proposed Mars landing mission scenarios developed by NASA. A relatively recent method of evolutionary computation is Particle Swarm Optimization (PSO), which can be considered to be in the same general class as Genetic Algorithms. An improvement over
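
    The abstract singles out Particle Swarm Optimization; the minimal PSO loop below optimizes a toy quadratic cost standing in for the mission-specific landing-error objective (all parameter values here are illustrative defaults, not the study's settings). Each particle is pulled toward its own best point and the swarm's best, which is essentially the whole algorithm.

```python
import numpy as np

def pso(f, lo, hi, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimiser (illustrative defaults)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(n, len(lo)))     # particle positions
    v = np.zeros_like(x)                           # particle velocities
    pbest = x.copy()
    pval = np.array([f(p) for p in x])
    g = pbest[pval.argmin()]                       # swarm's best point
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # pull each particle toward its own best and the swarm's best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()]
    return g, pval.min()

# Toy landing-error cost with optimum at (1, -2); a real objective would
# come from trajectory simulation under dispersed entry conditions.
best, err = pso(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2,
                np.array([-5.0, -5.0]), np.array([5.0, 5.0]))
print(best, err)
```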

  18. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date several attempts have been made toward predicting the effects of such mutations. The most successful of these is a computational approach called "Sorting Intolerant From Tolerant" (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which
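
    SIFT's central ingredient is position-specific conservation across homologous sequences. As a toy stand-in, the sketch below scores alignment columns by Shannon entropy; low-entropy (conserved) positions are where substitutions are most likely to be damaging. This is not the SIFT implementation, only an illustration of the underlying idea.

```python
from collections import Counter
import math

def conservation_scores(alignment):
    """Per-column Shannon entropy over a multiple sequence alignment
    (equal-length strings, '-' for gaps). Low entropy marks conserved
    positions, where substitutions are more likely to be damaging."""
    scores = []
    for col in zip(*alignment):
        counts = Counter(c for c in col if c != '-')
        total = sum(counts.values())
        if total == 0:          # all-gap column carries no information
            scores.append(float('nan'))
            continue
        h = -sum(n / total * math.log2(n / total) for n in counts.values())
        scores.append(h)
    return scores

msa = ["MKVLIT", "MKVLIS", "MKILIT", "MRVLIT"]
print(conservation_scores(msa))   # ~0 at fully conserved columns
```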

  19. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    Directory of Open Access Journals (Sweden)

    Grover Kearns

    2010-06-01

    Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft of or destruction of intellectual property, and fraud. Education of accountants to use forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting students, however, may not view information technology as vital to their career paths and need motivation to acquire forensic knowledge and skills. This paper presents a curriculum design methodology for teaching graduate accounting students computer forensics. The methodology is tested using perceptions of the students about the success of the methodology and their acquisition of forensics knowledge and skills. An important component of the pedagogical approach is the use of an annotated list of over 50 forensic web-based tools.

  20. Advances in computational design and analysis of airbreathing propulsion systems

    Science.gov (United States)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  1. Genomic and physiological approaches to advancing forest tree improvement.

    Science.gov (United States)

    Nelson, C Dana; Johnsen, Kurt H

    2008-07-01

    The recent completion of a draft sequence of the poplar (Populus trichocarpa Torr. & Gray ex Brayshaw) genome has advanced forest tree genetics to an unprecedented level. A "parts list" for a forest tree has been produced, opening up new opportunities for dissecting the inner workings of tree growth and development. In the relatively near future we can anticipate additional reference genome sequences, including the much larger Pinus genome. One goal is to use this information to define the genomic attributes that affect the phenotypic performances of trees growing in various environments. A first step is the definition of ideotypes that constitute optimal tree and stand-level performance. Following this, the genome can be systematically searched for genetic elements and their allelic variants that affect the specified traits. Knowledge of these alleles and their effects will facilitate the development of efficient tree improvement programs through genome-guided breeding and genetic engineering and further our mechanistic understanding of trait variation. Improved mechanistic understanding of tree growth and development is needed to develop process models that will allow us to anticipate and manage change in forest ecosystems. Here we consider the development of an ideotype for loblolly pine (Pinus taeda L.) and discuss genomic approaches for studying the component traits that will enable advances in process model development and the genetic improvement of this important conifer. PMID:18450578

  2. Advancements in Violin-Related Human-Computer Interaction

    DEFF Research Database (Denmark)

    Overholt, Daniel

    2014-01-01

    of human intelligence and emotion is at the core of the Musical Interface Technology Design Space, MITDS. This is a framework that endeavors to retain and enhance such traits of traditional instruments in the design of interactive live performance interfaces. Utilizing the MITDS, advanced Human...

  3. The use of advanced computer simulation in structural design

    Energy Technology Data Exchange (ETDEWEB)

    Field, C.J.; Mole, A. [Arup, San Francisco, CA (United States)]; Arkinstall, M. [Arup, Sydney (Australia)]

    2005-07-01

    The benefits that can be gained from the application of advanced numerical simulation in building design were discussed. A review of current practices in structural engineering was presented along with an illustration of a range of international project case studies. Structural engineers use analytical methods to evaluate both static and dynamic loads. Structural design is prescribed by a range of building codes, depending on location, building type and loading, but often, buildings do not fit well within the codes, particularly if one wants to take advantage of new technologies and developments in design that are not covered by the code. Advanced simulation refers to the application of mathematical modeling to complex problems to allow a wider consideration of building types and conditions that can be designed reliably using standard practices. Advanced simulation is used to address virtual testing and prototyping, verifying innovative design ideas, forensic engineering, and design optimization. The benefits of advanced simulation include enhanced creativity, improved performance, cost savings, risk management, sustainable design solutions, and better communication. The following 5 case studies illustrated the value gained by using advanced simulation as an integral part of the design process: the earthquake resistant Maison Hermes in Tokyo; the seismic resistant braces known as the Unbonded Brace for use in the United States; a simulation of the existing Disney Museum to evaluate its capacity to resist earthquakes; simulation of the MIT Brain and Cognitive Science Project to evaluate the effect of different foundation types on the vibration entering the building; and, the Beijing Aquatic Center whose design was streamlined by optimized structural analysis. It was suggested that industry should encourage the transfer of technology from other professions and should try to collaborate towards a global building model to construct buildings in a more efficient manner. 7 refs

  4. Advances in Computing and Information Technology : Proceedings of the Second International

    CERN Document Server

    Nagamalai, Dhinaharan; Chaki, Nabendu

    2012-01-01

    The international conference on Advances in Computing and Information technology (ACITY 2012) provides an excellent international forum for both academics and professionals for sharing knowledge and results in theory, methodology and applications of Computer Science and Information Technology. The Second International Conference on Advances in Computing and Information technology (ACITY 2012), held in Chennai, India, during July 13-15, 2012, covered a number of topics in all major fields of Computer Science and Information Technology including: networking and communications, network security and applications, web and internet computing, ubiquitous computing, algorithms, bioinformatics, digital image processing and pattern recognition, artificial intelligence, soft computing and applications. Following a rigorous review process, a number of high-quality papers, presenting not only innovative ideas but also well-founded evaluations and strong argumentation, were selected and collected in the present proceedings, ...

  5. Advances in a computer aided bilateral manipulator system

    International Nuclear Information System (INIS)

    This paper relates developments and experiments carried out at Saclay within the framework of the ARA^b program by the computer aided teleoperation (CAT) group. The goal is to improve the efficiency and operational safety of remote operations using computers and sensors. They make it possible to stand in for the operator(s), in time sharing and/or in parallel, and to augment the amount and/or quality of sensory feedback. After a description of the test facility at Saclay, the developments of the various participants are described. Results of this work will be commercially available with the MA23M and the future MAE 200 from La Calhene (France, UK, Japan).

  6. Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact

    Directory of Open Access Journals (Sweden)

    Abadjiev Valentin

    2014-09-01

    Computer-aided design has advanced through the development of different types of software for scientific research in the field of gearing theory, as well as for providing adequate scientific support to gear drive manufacture. Computer programs based on mathematical models resulting from scientific research are presented here. Modern gear transmissions require new mathematical approaches to their geometric, technological and strength analysis. The process of optimization, synthesis and design is based on adequate iteration procedures to find an optimal solution by varying definite parameters.

  7. An evolutionary computational approach for the dynamic Stackelberg competition problems

    Directory of Open Access Journals (Sweden)

    Lorena Arboleda-Castro

    2016-06-01

    Stackelberg competition models are an important family of economic decision problems from game theory, in which the main goal is to find optimal strategies between two competitors taking into account their hierarchical relationship. Although these models have been widely studied in the past, it is important to note that very few works deal with uncertainty scenarios, especially those that vary over time. In this regard, the present research studies this topic and proposes a computational method for efficiently solving dynamic Stackelberg competition models. The computational experiments suggest that the proposed approach is effective for problems of this nature.
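
    To make the bilevel structure concrete, the sketch below evolves leader candidates, scoring each against the follower's best response; the payoff functions and operators are toy assumptions, and for a dynamic problem the loop would be kept running (or re-run) as the payoffs drift over time.

```python
import random

def follower_best_response(x):
    """Follower reacts optimally to the leader's decision x (toy quadratic
    payoff solved by grid search; a real model is problem-specific)."""
    grid = [i / 100 for i in range(101)]
    return max(grid, key=lambda y: -(y - 0.5 * x) ** 2)

def evolve_leader(pop=20, gens=50, seed=1):
    """Evolve leader candidates, scoring each against the follower's best
    response (the bilevel evaluation at the heart of Stackelberg models)."""
    rng = random.Random(seed)
    def leader_payoff(x):
        y = follower_best_response(x)
        return x * (1 - x) + 0.2 * y              # toy leader objective
    xs = [rng.uniform(0, 1) for _ in range(pop)]
    for _ in range(gens):
        xs.sort(key=leader_payoff, reverse=True)
        elite = xs[:pop // 2]                     # keep the better half
        xs = elite + [min(1.0, max(0.0, e + rng.gauss(0, 0.05)))
                      for e in elite]             # mutate to refill
    return max(xs, key=leader_payoff)

print(evolve_leader())   # converges near the toy optimum x = 0.55
```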

  8. Computer vision approaches to medical image analysis. Revised papers

    International Nuclear Information System (INIS)

    This book constitutes the thoroughly refereed post-proceedings of the international workshop on Computer Vision Approaches to Medical Image Analysis, CVAMIA 2006, held in Graz, Austria, in May 2006 as a satellite event of the 9th European Conference on Computer Vision, ECCV 2006. The 10 revised full papers and 11 revised poster papers presented together with 1 invited talk were carefully reviewed and selected from 38 submissions. The papers are organized in topical sections on clinical applications, image registration, image segmentation and analysis, and the poster session. (orig.)

  9. Current advances in molecular, biochemical, and computational modeling analysis of microalgal triacylglycerol biosynthesis.

    Science.gov (United States)

    Lenka, Sangram K; Carbonaro, Nicole; Park, Rudolph; Miller, Stephen M; Thorpe, Ian; Li, Yantao

    2016-01-01

    Triacylglycerols (TAGs) are highly reduced energy storage molecules ideal for biodiesel production. Microalgal TAG biosynthesis has been studied extensively in recent years, both at the molecular level and systems level through experimental studies and computational modeling. However, discussions of the strategies and products of the experimental and modeling approaches are rarely integrated and summarized together in a way that promotes collaboration among modelers and biologists in this field. In this review, we outline advances toward understanding the cellular and molecular factors regulating TAG biosynthesis in unicellular microalgae with an emphasis on recent studies on rate-limiting steps in fatty acid and TAG synthesis, while also highlighting new insights obtained from the integration of multi-omics datasets with mathematical models. Computational methodologies such as kinetic modeling, metabolic flux analysis, and new variants of flux balance analysis are explained in detail. We discuss how these methods have been used to simulate algae growth and lipid metabolism in response to changing culture conditions and how they have been used in conjunction with experimental validations. Since emerging evidence indicates that TAG synthesis in microalgae operates through coordinated crosstalk between multiple pathways in diverse subcellular destinations including the endoplasmic reticulum and plastids, we discuss new experimental studies and models that incorporate these findings for discovering key regulatory checkpoints. Finally, we describe tools for genetic manipulation of microalgae and their potential for future rational algal strain design. This comprehensive review explores the potential synergistic impact of pathway analysis, computational approaches, and molecular genetic manipulation strategies on improving TAG production in microalgae. PMID:27321475

  11. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Science.gov (United States)

    2012-07-31

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY DOE... at (301) 903-7486 or email at: Melea.Baker@science.doe.gov. You must make your request for an oral... Computing Web site (www.sc.doe.gov/ascr) for viewing. Issued at Washington, DC, on July 25, 2012....

  12. 77 FR 62231 - DOE/Advanced Scientific Computing Advisory Committee

    Science.gov (United States)

    2012-10-12

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY DOE.... Computational Science Graduate Fellowship (CSGF) Longitudinal Study. Update on Exascale. Update from DOE data... contact Melea Baker, (301) 903-7486 or by email at: Melea.Baker@science.doe.gov. You must make...

  13. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  14. Advanced Micro Optics Characterization Using Computer Generated Holograms

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, S.; Maxey, L.C.; Moreshead, W.; Nogues, J.L.

    1998-11-01

    This CRADA has enabled the validation of Computer Generated Holograms (CGH) testing for certain classes of micro optics. It has also identified certain issues that are significant when considering the use of CGHs in this application. Both contributions are advantageous in the pursuit of better manufacturing and testing technologies for these important optical components.

  15. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Brunst, Holger [Dresden Univ. of Technology (Germany)]

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  16. Proceedings: Workshop on Advanced Mathematics and Computer Science for Power Systems Analysis

    Energy Technology Data Exchange (ETDEWEB)

    None

    1991-08-01

    EPRI's Office of Exploratory Research sponsors a series of workshops that explore how to apply recent advances in mathematics and computer science to the problems of the electric utility industry. In this workshop, participants identified research objectives that may significantly improve the mathematical methods and computer architecture currently used for power system analysis.

  17. Integrated Computer Aided Planning and Manufacture of Advanced Technology Jet Engines

    Directory of Open Access Journals (Sweden)

    B. K. Subhas

    1987-10-01

    This paper highlights an attempt at evolving a computer aided manufacturing system on a personal computer. A case study of an advanced technology jet engine component is included to illustrate various outputs from the system. The proposed system could be an alternate solution to sophisticated and expensive CAD/CAM workstations.

  18. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    Science.gov (United States)

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  19. Style: A Computational and Conceptual Blending-Based Approach

    Science.gov (United States)

    Goguen, Joseph A.; Harrell, D. Fox

    This chapter proposes a new approach to style, arising from our work on computational media using structural blending, which enriches the conceptual blending of cognitive linguistics with structure building operations in order to encompass syntax and narrative as well as metaphor. We have implemented both conceptual and structural blending, and conducted initial experiments with poetry, including interactive multimedia poetry, although the approach generalizes to other media. The central idea is to generate multimedia content and analyze style in terms of blending principles, based on our finding that different principles from those of common sense blending are often needed for some contemporary poetic metaphors.

  20. Advances in bio-inspired computing for combinatorial optimization problems

    CERN Document Server

    Pintea, Camelia-Mihaela

    2013-01-01

    'Advances in Bio-inspired Combinatorial Optimization Problems' illustrates several recent efficient bio-inspired algorithms for solving NP-hard problems. Theoretical bio-inspired concepts and models, in particular for agents, ants and virtual robots, are described. Large-scale optimization problems, for example the Generalized Traveling Salesman Problem and the Railway Traveling Salesman Problem, are solved and their results are discussed. Some of the main concepts and models described in this book are: inner rule to guide ant search - a recent model in ant optimization, heterogeneous sensitive a

  1. Advanced Modulation Techniques for High-Performance Computing Optical Interconnects

    DEFF Research Database (Denmark)

    Karinou, Fotini; Borkowski, Robert; Zibar, Darko;

    2013-01-01

    We experimentally assess the performance of a 64 × 64 optical switch fabric used for ns-speed optical cell switching in supercomputer optical interconnects. More specifically, we study four alternative modulation formats and detection schemes, namely, 10-Gb/s nonreturn-to-zero differential phase...... of the optical shared memory supercomputer interconnect system switch fabric. In particular, we investigate the resilience of the aforementioned advanced modulation formats to the nonlinearities of semiconductor optical amplifiers, used as ON/OFF gates in the supercomputer optical switch fabric under study...

  2. Advances in Computer Science and Information Engineering Volume 1

    CERN Document Server

    Lin, Sally

    2012-01-01

    CSIE2012 is an integrated conference focusing on Computer Science and Information Engineering. In the proceedings, readers can learn about work in Computer Science and Information Engineering by researchers from all around the world. The main role of the proceedings is to serve as an exchange platform for researchers working in these fields. To meet the high quality standards of Springer's AISC series, the organizing committee took several steps. First, poor-quality papers were rejected after review by anonymous expert referees. Second, review meetings were held with the reviewers about five times to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of many people and departments, the conference promises to be successful and fruitful.

  3. Recent advances in swarm intelligence and evolutionary computation

    CERN Document Server

    2015-01-01

    This timely review volume summarizes the state-of-the-art developments in nature-inspired algorithms and applications with the emphasis on swarm intelligence and bio-inspired computation. Topics include the analysis and overview of swarm intelligence and evolutionary computation, hybrid metaheuristic algorithms, bat algorithm, discrete cuckoo search, firefly algorithm, particle swarm optimization, and harmony search as well as convergent hybridization. Application case studies have focused on the dehydration of fruits and vegetables by the firefly algorithm and goal programming, feature selection by the binary flower pollination algorithm, job shop scheduling, single row facility layout optimization, training of feed-forward neural networks, damage and stiffness identification, synthesis of cross-ambiguity functions by the bat algorithm, web document clustering, truss analysis, water distribution networks, sustainable building designs and others. As a timely review, this book can serve as an ideal reference f...

  4. Advances in Computer Science and Information Engineering Volume 2

    CERN Document Server

    Lin, Sally

    2012-01-01

    CSIE2012 is an integrated conference focusing on Computer Science and Information Engineering. In the proceedings, readers can learn about work in Computer Science and Information Engineering by researchers from all around the world. The main role of the proceedings is to serve as an exchange platform for researchers working in these fields. To meet the high quality standards of Springer's AISC series, the organizing committee took several steps. First, poor-quality papers were rejected after review by anonymous expert referees. Second, review meetings were held with the reviewers about five times to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of many people and departments, the conference promises to be successful and fruitful.

  5. Advances in neural networks computational intelligence for ICT

    CERN Document Server

    Esposito, Anna; Morabito, Francesco; Pasero, Eros

    2016-01-01

    This carefully edited book puts emphasis on computational and artificial intelligence methods for learning and their applications in robotics, embedded systems, and ICT interfaces for psychological and neurological diseases. The book is a follow-up to the scientific workshop on Neural Networks (WIRN 2015) held in Vietri sul Mare, Italy, from the 20th to the 22nd of May 2015. The workshop, at its 27th edition, has become a traditional scientific event bringing together scientists from many countries and several scientific disciplines. Each chapter is an extended version of the original contribution presented at the workshop, and together with the reviewers' peer revisions it also benefits from the live discussion during the presentation. The content of the book is organized in the following sections: 1. Introduction, 2. Machine Learning, 3. Artificial Neural Networks: Algorithms and models, 4. Intelligent Cyberphysical and Embedded System, 5. Computational Intelligence Methods for Biomedical ICT in...

  6. Computer Mechatronics: A Radical Approach to Mechatronics Education

    OpenAIRE

    Nilsson, Martin

    2005-01-01

    This paper describes some distinguishing features of a course on mechatronics, based on computer science. We propose a teaching approach called Controlled Problem-Based Learning (CPBL). We have applied this method on three generations (2003-2005) of mainly fourth-year undergraduate students at Lund University (LTH). Although students found the course difficult, there were no dropouts, and all students attended the examination 2005.

  7. Transparency and deliberation within the FOMC: a computational linguistics approach

    OpenAIRE

    Hansen, Stephen; McMahon, Michael; Prat, Andrea

    2014-01-01

    How does transparency, a key feature of central bank design, affect the deliberation of monetary policymakers? We exploit a natural experiment in the Federal Open Market Committee in 1993 together with computational linguistic models (particularly Latent Dirichlet Allocation) to measure the effect of increased transparency on debate. Commentators have hypothesized both a beneficial discipline effect and a detrimental conformity effect. A difference-in-differences approach inspired by the care...
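
    Latent Dirichlet Allocation, the model named in the abstract, is available off the shelf; a minimal sketch with scikit-learn is shown below on placeholder snippets (the study works on FOMC transcripts with its own preprocessing and a difference-in-differences design on top).

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Placeholder snippets; the study fits LDA to FOMC meeting transcripts.
statements = [
    "inflation pressures warrant a measured increase in the funds rate",
    "labor market slack argues for keeping policy accommodative",
    "the committee debated the timing of balance sheet normalization",
    "price stability remains the primary objective of monetary policy",
]

counts = CountVectorizer(stop_words="english").fit_transform(statements)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
doc_topics = lda.transform(counts)  # per-statement topic proportions: the raw
print(doc_topics)                   # material for measuring breadth of debate
```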

  8. National facility for advanced computational science: A sustainable path to scientific discovery

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  9. WSRC approach to validation of criticality safety computer codes

    International Nuclear Information System (INIS)

    Recent hardware and operating system changes at Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy will be illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (Keff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be: (1) repeatable; (2) demonstrated with defined confidence; and (3) identify the range of neutronic conditions (area of applicability) for which the correlations are valid. The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems with principal second isotope ²³⁶U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed
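
    As a simplified illustration of correlating calculated Keff values with benchmarks, the sketch below computes the bias and scatter over a set of critical experiments and a one-sided margin; formal validation uses proper tolerance-limit statistics (e.g. per ANSI/ANS-8 guidance), so treat this only as the shape of the calculation.

```python
import statistics

def validation_bias(calc_keff, bench_keff, z=1.645):
    """Bias of calculated k_eff against benchmark experiments and a
    one-sided margin (simplified stand-in for tolerance-limit methods)."""
    diffs = [c - b for c, b in zip(calc_keff, bench_keff)]
    bias = statistics.mean(diffs)        # mean bias over the benchmark set
    s = statistics.stdev(diffs)          # scatter of the correlation
    upper_limit = 1.0 + bias - z * s     # sketch of a safe k_eff bound
    return bias, s, upper_limit

calc = [0.9981, 1.0012, 0.9975, 1.0003, 0.9990]
bench = [1.0000] * 5                     # critical experiments, k_eff = 1
print(validation_bias(calc, bench))
```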

  10. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating system and hardware dependencies. One past approach to preserve computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach to preserve computational capabilities is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  11. Parallel computing in genomic research: advances and applications.

    Science.gov (United States)

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.

  12. Reservoir Computing approach to Great Lakes water level forecasting

    Science.gov (United States)

    Coulibaly, Paulin

    2010-02-01

    The use of echo state networks (ESN) for dynamical system modeling is known as Reservoir Computing and has been shown to be effective for a number of applications, including signal processing, learning grammatical structure, time series prediction and motor/system control. However, the performance of the Reservoir Computing approach on hydrological time series remains largely unexplored. This study investigates the potential of ESN or Reservoir Computing for long-term prediction of lake water levels. Great Lakes water levels from 1918 to 2005 are used to develop and evaluate the ESN models. The forecast performance of the ESN-based models is compared with the results obtained from two benchmark models, the conventional recurrent neural network (RNN) and the Bayesian neural network (BNN). The test results indicate a strong ability of ESN models to provide improved lake level forecasts up to 10 months ahead, suggesting that the inherent structure and innovative learning approach of the ESN is suitable for hydrological time series modeling. Another particular advantage of the ESN learning approach is that it simplifies the network training complexity and avoids the limitations inherent to the gradient descent optimization method. Overall, it is shown that the ESN can be a good alternative method for improved lake level forecasting, performing better than both the RNN and the BNN on the four selected Great Lakes time series, namely, the Lakes Erie, Huron-Michigan, Ontario, and Superior.
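
    The appeal of the ESN noted in the abstract, training only a linear readout over a fixed random reservoir, is easy to show in code. The sketch below uses illustrative sizes and scalings; the study's configuration and multi-month forecasting setup differ.

```python
import numpy as np

def esn_fit_predict(u, y, n_res=200, rho=0.9, seed=0):
    """Minimal echo state network: fixed random reservoir, ridge-regressed
    linear readout. Sizes and scalings are illustrative only."""
    rng = np.random.default_rng(seed)
    w_in = rng.uniform(-0.5, 0.5, n_res)
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))  # set spectral radius
    x = np.zeros(n_res)
    states = []
    for ut in u:                       # drive the reservoir with the input
        x = np.tanh(w_in * ut + W @ x)
        states.append(x.copy())
    X = np.array(states)
    # Only the readout is trained -- a linear solve, no gradient descent.
    w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
    return X @ w_out                   # in-sample predictions

t = np.linspace(0, 20, 500)
u = np.sin(t)              # stand-in for a monthly lake-level series
y = np.sin(t + 0.5)        # target: the series some steps ahead
print(np.abs(esn_fit_predict(u, y) - y).mean())   # small fit error
```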

  13. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California 92093 (United States)]; Kagadis, George C. [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 26504 (Greece)]; McNutt, Todd R. [Department of Radiation Oncology and Molecular Radiation Science, School of Medicine, Johns Hopkins University, Baltimore, Maryland 21231 (United States)]; Mutic, Sasa [Department of Radiation Oncology, Washington University in St. Louis, St. Louis, Missouri 63110 (United States)]

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  14. Discovering and understanding oncogenic gene fusions through data intensive computational approaches.

    Science.gov (United States)

    Latysheva, Natasha S; Babu, M Madan

    2016-06-01

    Although gene fusions have been recognized as important drivers of cancer for decades, our understanding of the prevalence and function of gene fusions has been revolutionized by the rise of next-generation sequencing, advances in bioinformatics theory and an increasing capacity for large-scale computational biology. The computational work on gene fusions has been vastly diverse, and the present state of the literature is fragmented. It will be fruitful to merge three camps of gene fusion bioinformatics that appear to rarely cross over: (i) data-intensive computational work characterizing the molecular biology of gene fusions; (ii) development research on fusion detection tools, candidate fusion prioritization algorithms and dedicated fusion databases; and (iii) clinical research that seeks to either therapeutically target fusion transcripts and proteins or leverage advances in detection tools to perform large-scale surveys of gene fusion landscapes in specific cancer types. In this review, we unify these different, yet highly complementary and symbiotic, approaches with the view that increased synergy will catalyze advancements in gene fusion identification, characterization and significance evaluation. PMID:27105842

  15. Organic and inorganic nitrogen dynamics in soil - advanced Ntrace approach

    Science.gov (United States)

    Andresen, Louise C.; Björsne, Anna-Karin; Bodé, Samuel; Klemedtsson, Leif; Boeckx, Pascal; Rütting, Tobias

    2016-04-01

    Depolymerization of soil organic nitrogen (SON) into monomers (e.g. amino acids) is currently thought to be the rate limiting step for the terrestrial nitrogen (N) cycle. The production of free amino acids (AA) is followed by AA mineralization to ammonium, which is an important fraction of the total N mineralization. Accurate assessment of depolymerization and AA mineralization rates is important for a better understanding of the rate limiting steps. Recent developments in the 15N pool dilution techniques, based on 15N labelling of AAs, allow quantifying gross rates of SON depolymerization and AA mineralization (Wanek et al., 2010; Andresen et al., 2015) in addition to gross N mineralization. However, it is well known that the 15N pool dilution approach has limitations; in particular, gross rates of consumption processes (e.g. AA mineralization) are overestimated. This has consequences for evaluating the rate limiting step of the N cycle, as well as for estimating the nitrogen use efficiency (NUE). Here we present a novel 15N tracing approach, which combines 15N-AA labelling with an advanced version of the 15N tracing model Ntrace (Müller et al., 2007) explicitly accounting for AA turnover in soil. This approach (1) provides a more robust quantification of gross depolymerization and AA mineralization and (2) suggests a more realistic estimate for the microbial NUE of amino acids. Advantages of the new 15N tracing approach will be discussed and further improvements will be identified. References: Andresen, L.C., Bodé, S., Tietema, A., Boeckx, P., and Rütting, T.: Amino acid and N mineralization dynamics in heathland soil after long-term warming and repetitive drought, SOIL, 1, 341-349, 2015. Müller, C., Rütting, T., Kattge, J., Laughlin, R. J., and Stevens, R. J.: Estimation of parameters in complex 15N tracing models via Monte Carlo sampling, Soil Biology & Biochemistry, 39, 715-726, 2007. Wanek, W., Mooshammer, M., Blöchl, A., Hanreich, A., and Richter

  16. Advances in Computational Social Science and Social Simulation

    OpenAIRE

    Miguel Quesada, Francisco J.; Amblard, Frédéric; Juan A. Barceló; Madella, Marco; Aguirre, Cristián; Ahrweiler, Petra; Aldred, Rachel; Ali Abbas, Syed Muhammad; Lopez Rojas, Edgar Alonso; Alonso Betanzos, Amparo; Alvarez Galvez, Javier; Andrighetto, Giulia; Antunes, Luis; Araghi, Yashar; Asatani, Kimitaka

    2014-01-01

    This conference is the joint celebration of the "10th Artificial Economics Conference (AE)", the "10th Conference of the European Social Simulation Association (ESSA)" and the "1st Simulating the Past to Understand Human History (SPUHH)" conference, organized by the Laboratory for Socio-Historical Dynamics Simulation (LSDS-UAB) of the Universitat Autònoma de Barcelona. Readers will find results of recent research on computational social science and social simulation in economics, management, so...

  17. Advanced wellbore thermal simulator GEOTEMP2. Appendix. Computer program listing

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, R.F.

    1982-02-01

    This appendix gives the program listing of GEOTEMP2 with comments and discussion to make the program organization more understandable. This appendix is divided into an introduction and four main blocks of code: main program, program initiation, wellbore flow, and wellbore heat transfer. The purpose and use of each subprogram is discussed and the program listing is given. Flowcharts will be included to clarify code organization when needed. GEOTEMP2 was written in FORTRAN IV. Efforts have been made to keep the programming as conventional as possible so that GEOTEMP2 will run without modification on most computers.

  18. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
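
    The ancestor of the DRAM sampler used in the paper is plain random-walk Metropolis; the sketch below runs it on a toy strain-residual posterior with a stand-in surrogate model (names and values are illustrative, and DRAM's delayed rejection and adaptation are omitted).

```python
import numpy as np

def metropolis(log_post, theta0, steps=5000, prop_sd=0.1, seed=0):
    """Random-walk Metropolis over the damage-parameter posterior."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = []
    for _ in range(steps):
        cand = theta + rng.normal(0.0, prop_sd, size=theta.shape)
        lp_cand = log_post(cand)
        if np.log(rng.random()) < lp_cand - lp:   # accept with MH probability
            theta, lp = cand, lp_cand
        chain.append(theta.copy())
    return np.array(chain)

# Toy setup: a stand-in surrogate g(theta) maps damage size to 3 strains.
g = lambda th: th[0] * np.array([1.0, 2.0, 3.0])
data = np.array([1.1, 2.0, 2.9])                  # noisy "measurements"
log_post = lambda th: -0.5 * np.sum((g(th) - data) ** 2) / 0.01
samples = metropolis(log_post, [0.0])
print(samples.mean(), samples.std())   # point estimate plus uncertainty
```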

  19. Oxidative Stress in Aging: Advances in Proteomic Approaches

    Directory of Open Access Journals (Sweden)

    Daniel Ortuño-Sahagún

    2014-01-01

    Aging is a gradual, complex process in which cells, tissues, organs, and the whole organism itself deteriorate in a progressive and irreversible manner that, in the majority of cases, implies pathological conditions that affect the individual's Quality of Life (QOL). Despite extensive research efforts in recent years, strategies for anticipating, preventing, or treating aging continue to face major limitations. In this review, the focus is essentially on the compilation of the advances generated by cellular expression profile analysis through proteomics studies (two-dimensional [2D] electrophoresis and mass spectrometry [MS]), which are currently used as an integral approach to study the aging process. Additionally, the relevance of oxidative stress factors is discussed. Emphasis is placed on postmitotic tissues, such as neuronal, muscular, and red blood cells, which appear to be those most frequently studied with respect to aging. Additionally, models for the study of aging are discussed in a number of organisms, such as Caenorhabditis elegans, senescence-accelerated mouse prone-8 (SAMP8), the naked mole-rat (Heterocephalus glaber), and the beagle canine. Proteomic studies in specific tissues and organisms have revealed the extensive involvement of reactive oxygen species (ROS) and oxidative stress in aging.

  20. A New Approach for Quality Management in Pervasive Computing Environments

    Directory of Open Access Journals (Sweden)

    Alti Adel

    2013-01-01

    This paper provides an extension of MDA called Context-aware Quality Model Driven Architecture (CQ-MDA), which can be used for quality control in pervasive computing environments. The proposed CQ-MDA approach, based on ContextualArchRQMM (Contextual ARCHitecture Quality Requirement MetaModel) and an extension of MDA, allows quality and resource-awareness to be considered while conducting the design process. The contributions of this paper are a meta-model for architecture quality control of context-aware applications and a model-driven approach to separate architecture concerns from context and quality concerns and to configure reconfigurable software architectures of distributed systems. To demonstrate the utility of our approach, we use a videoconference system.

  1. A Computational Differential Geometry Approach to Grid Generation

    CERN Document Server

    Liseikin, Vladimir D

    2007-01-01

    The process of breaking up a physical domain into smaller sub-domains, known as meshing, facilitates the numerical solution of partial differential equations used to simulate physical systems. This monograph gives a detailed treatment of applications of geometric methods to advanced grid technology. It focuses on and describes a comprehensive approach based on the numerical solution of inverted Beltramian and diffusion equations with respect to monitor metrics for generating both structured and unstructured grids in domains and on surfaces. In this second edition the author takes a more detailed and practice-oriented approach towards explaining how to implement the method by: Employing geometric and numerical analyses of monitor metrics as the basis for developing efficient tools for controlling grid properties. Describing new grid generation codes based on finite differences for generating both structured and unstructured surface and domain grids. Providing examples of applications of the codes to the genera...

  2. 16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT)

    CERN Document Server

    Lokajicek, M; Tumova, N

    2015-01-01

    16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT). The ACAT workshop series, formerly AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), was created back in 1990. Its main purpose is to gather researchers involved in computing for physics research, from both the physics and computer science sides, and give them a chance to communicate with each other. It has established bridges between physics and computer science research, facilitating advances in our understanding of the Universe at its smallest and largest scales. With the Large Hadron Collider and many astronomy and astrophysics experiments collecting larger and larger amounts of data, such bridges are needed now more than ever. The 16th edition of ACAT aims to bring related researchers together, once more, to explore and confront the boundaries of computing, automatic data analysis and theoretical calculation technologies. It will create a forum for exchanging ideas among the fields an...

  3. Advances in x-ray computed microtomography at the NSLS

    International Nuclear Information System (INIS)

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines, from industrial materials processing to environmental science. The most recent applications are presented here, as well as a description of the facility, which has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, based on a refinement of the gridding algorithm first developed for use with radio-astronomical data. This program has reduced the reconstruction time to 8.5 s for a 929 × 929-pixel slice on an R10,000 CPU, a more than 8× reduction compared with the Filtered Back-Projection method
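
    The gridding-based FFBT itself is more elaborate, but the baseline it accelerates is classical filtered back-projection. A minimal NumPy sketch of that baseline (ramp filter, linear interpolation along rays; function name and defaults are illustrative) reads:

        import numpy as np

        def filtered_backprojection(sinogram, angles_deg):
            """Classical filtered back-projection with a ramp filter."""
            n_angles, n_det = sinogram.shape
            ramp = np.abs(np.fft.fftfreq(n_det))                       # ramp filter
            filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
            grid = np.arange(n_det) - n_det / 2
            X, Y = np.meshgrid(grid, grid)
            recon = np.zeros((n_det, n_det))
            for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
                t = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2  # detector coordinate
                recon += np.interp(t.ravel(), np.arange(n_det), proj).reshape(n_det, n_det)
            return recon * np.pi / (2 * n_angles)

    Roughly speaking, a gridding approach instead resamples the projection data onto a Cartesian Fourier grid followed by a single inverse FFT, which is where the reported speedup comes from.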

  4. Recent advances in computational intelligence in defense and security

    CERN Document Server

    Falcon, Rafael; Zincir-Heywood, Nur; Abbass, Hussein

    2016-01-01

    This volume is an initiative undertaken by the IEEE Computational Intelligence Society’s Task Force on Security, Surveillance and Defense to consolidate and disseminate the role of CI techniques in the design, development and deployment of security and defense solutions. Applications range from the detection of buried explosive hazards in a battlefield to the control of unmanned underwater vehicles, the delivery of superior video analytics for protecting critical infrastructures or the development of stronger intrusion detection systems and the design of military surveillance networks. Defense scientists, industry experts, academicians and practitioners alike will all benefit from the wide spectrum of successful applications compiled in this volume. Senior undergraduate or graduate students may also discover uncharted territory for their own research endeavors.

  5. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Full Text Available Research in scientific programming enables us to realize more and more complex applications, while application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches become more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is presented. It automatically generates C code from a problem specification expressed by the Lagrange formalism using Maple.
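
    For reference, the iterative solver mentioned above is, at its core, the textbook conjugate gradient method. The SPINET implementation was written in C and parallelized for the MUSIC machine; the serial Python sketch below only illustrates the underlying algorithm:

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
            """Solve A x = b for a symmetric positive-definite matrix A."""
            x = np.zeros_like(b)
            r = b - A @ x                 # residual
            p = r.copy()                  # search direction
            rs_old = r @ r
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rs_old / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs_old) * p
                rs_old = rs_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        print(conjugate_gradient(A, b))   # approx [0.0909, 0.6364]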

  6. Computational approaches to detect allosteric pathways in transmembrane molecular machines.

    Science.gov (United States)

    Stolzenberg, Sebastian; Michino, Mayako; LeVine, Michael V; Weinstein, Harel; Shi, Lei

    2016-07-01

    Many of the functions of transmembrane proteins involved in signal processing and transduction across the cell membrane are determined by allosteric couplings that propagate the functional effects well beyond the original site of activation. Data gathered from breakthroughs in biochemistry, crystallography, and single molecule fluorescence have established a rich basis of information for the study of molecular mechanisms in the allosteric couplings of such transmembrane proteins. The mechanistic details of these couplings, many of which have therapeutic implications, however, have only become accessible in synergy with molecular modeling and simulations. Here, we review some recent computational approaches that analyze allosteric coupling networks (ACNs) in transmembrane proteins, and in particular the recently developed Protein Interaction Analyzer (PIA) designed to study ACNs in the structural ensembles sampled by molecular dynamics simulations. The power of these computational approaches in interrogating the functional mechanisms of transmembrane proteins is illustrated with selected examples of recent experimental and computational studies pursued synergistically in the investigation of secondary active transporters and GPCRs. This article is part of a Special Issue entitled: Membrane Proteins edited by J.C. Gumbart and Sergei Noskov. PMID:26806157

  7. Experimental and computing strategies in advanced material characterization problems

    Science.gov (United States)

    Bolzon, G.

    2015-10-01

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that make it possible to acquire a large amount of data and, at the same time, to reduce the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses while, in some situations, reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication intends to summarize some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.

  8. Experimental and computing strategies in advanced material characterization problems

    Energy Technology Data Exchange (ETDEWEB)

    Bolzon, G. [Department of Civil and Environmental Engineering, Politecnico di Milano, piazza Leonardo da Vinci 32, 20133 Milano (Italy)]. E-mail: gabriella.bolzon@polimi.it

    2015-10-28

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that make it possible to acquire a large amount of data and, at the same time, to reduce the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses while, in some situations, reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication intends to summarize some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.

  9. A First Attempt to Bring Computational Biology into Advanced High School Biology Classrooms

    OpenAIRE

    Suzanne Renick Gallagher; William Coon; Kristin Donley; Abby Scott; Goldberg, Debra S.

    2011-01-01

    Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and biology teachers who would like to acquaint their students with updated approaches in...

  10. Canadian Educational Approaches for the Advancement of Pharmacy Practice

    OpenAIRE

    Frankel, Grace; Louizos, Christopher; Austin, Zubin

    2014-01-01

    Canadian faculties (schools) of pharmacy are actively engaged in the advancement and restructuring of their programs in response to the shift toward pharmacists assuming an advanced practitioner role. Unfortunately, there is a paucity of evidence outlining optimal strategies for accomplishing this task. This review explores several educational changes proposed in the literature to aid in the advancement of pharmacy education, such as program admission requirements, critical-thin...

  11. Benchmarking of computer codes and approaches for modeling exposure scenarios

    International Nuclear Information System (INIS)

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided

  12. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    Full Text Available High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria, and these regions are also known as pathogenicity islands (PAIs). PAIs have some detectable properties, such as having different genomic signatures than the rest of the host genome, and containing mobility genes so that they can be integrated into the host genome. In this review, we discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources are also discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.
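
    One of the signature cues mentioned above, anomalous GC content, is simple to sketch. The illustrative Python scan below flags windows whose GC fraction deviates strongly from the genome-wide mean; window size, step, and threshold are arbitrary choices, and real PAI predictors combine many further features (mobility genes, flanking tRNA sites, codon usage):

        import statistics
        from collections import Counter

        def gc_anomalies(genome, window=5000, step=1000, z_thresh=2.0):
            """Flag windows whose GC fraction deviates from the genome-wide mean."""
            gcs, starts = [], []
            for start in range(0, len(genome) - window + 1, step):
                win = genome[start:start + window]
                counts = Counter(win)
                gcs.append((counts['G'] + counts['C']) / window)
                starts.append(start)
            mu, sd = statistics.mean(gcs), statistics.stdev(gcs)
            return [(s, gc) for s, gc in zip(starts, gcs)
                    if abs(gc - mu) > z_thresh * sd]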

  13. Benchmarking of computer codes and approaches for modeling exposure scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R.R. [EG and G Idaho, Inc., Idaho Falls, ID (United States); Rittmann, P.D.; Wood, M.I. [Westinghouse Hanford Co., Richland, WA (United States); Cook, J.R. [Westinghouse Savannah River Co., Aiken, SC (United States)

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided.

  14. Advances in the MQDT approach of electron/molecular cation reactive collisions: High precision extensive calculations for applications

    Directory of Open Access Journals (Sweden)

    Motapon O.

    2015-01-01

    Full Text Available Recent advances in the stepwise multichannel quantum defect theory approach of electron/molecular cation reactive collisions have been applied to perform computations of cross sections and rate coefficients for dissociative recombination and electron-impact ro-vibrational transitions of H2+, BeH+ and their deuterated isotopomers. At very low energy, rovibronic interactions play a significant role in the dynamics, whereas at high energy, the dissociative excitation strongly competes with all other reactive processes.

  15. Advanced computer algebra algorithms for the expansion of Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Ablinger, Jakob; Round, Mark; Schneider, Carsten [Johannes Kepler Univ., Linz (Austria). Research Inst. for Symbolic Computation; Bluemlein, Johannes [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)

    2012-10-15

    Two-point Feynman parameter integrals, with at most one mass and containing local operator insertions in 4+ε-dimensional Minkowski space, can be transformed to multi-integrals or multi-sums over hyperexponential and/or hypergeometric functions depending on a discrete parameter n. Given such a specific representation, we utilize an enhanced version of the multivariate Almkvist-Zeilberger algorithm (for multi-integrals) and a common summation framework of the holonomic and difference field approach (for multi-sums) to calculate recurrence relations in n. Finally, solving the recurrence we can decide efficiently if the first coefficients of the Laurent series expansion of a given Feynman integral can be expressed in terms of indefinite nested sums and products; if yes, the all-n solution is returned in compact representations, i.e., no algebraic relations exist among the occurring sums and products.
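
    The final step, solving a recurrence in n, can be illustrated on a toy case with SymPy's rsolve; the actual Feynman-integral recurrences are treated with specialized difference-field machinery, so this is only a minimal stand-in:

        from sympy import Function, rsolve, symbols

        n = symbols('n', integer=True)
        y = Function('y')

        # Toy first-order recurrence y(n+1) = (n+1) y(n) with y(0) = 1
        sol = rsolve(y(n + 1) - (n + 1) * y(n), y(n), {y(0): 1})
        print(sol)   # factorial(n)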

  16. Proceedings: Workshop on advanced mathematics and computer science for power systems analysis

    Energy Technology Data Exchange (ETDEWEB)

    Esselman, W.H.; Iveson, R.H. (Electric Power Research Inst., Palo Alto, CA (United States))

    1991-08-01

    The Mathematics and Computer Workshop on Power System Analysis was held February 21--22, 1989, in Palo Alto, California. The workshop was the first in a series sponsored by EPRI's Office of Exploratory Research as part of its effort to develop ways in which recent advances in mathematics and computer science can be applied to the problems of the electric utility industry. The purpose of this workshop was to identify research objectives in the field of advanced computational algorithms needed for the application of advanced parallel processing architecture to problems of power system control and operation. Approximately 35 participants heard six presentations on power flow problems, transient stability, power system control, electromagnetic transients, user-machine interfaces, and database management. In the discussions that followed, participants identified five areas warranting further investigation: system load flow analysis, transient power and voltage analysis, structural instability and bifurcation, control systems design, and proximity to instability. 63 refs.

  17. How Children and Adults Learn to Use Computers: A Developmental Approach

    Science.gov (United States)

    Yan, Zheng; Fischer, Kurt W.

    2004-01-01

    How do children and adults learn to use computers? What developmental processes are involved in learning to use computers? This chapter reviews current understanding of these issues and presents empirical studies demonstrating how to advance that understanding. (Contains 2 figures.)

  18. ADVANCED METHODS FOR THE COMPUTATION OF PARTICLE BEAM TRANSPORT AND THE COMPUTATION OF ELECTROMAGNETIC FIELDS AND MULTIPARTICLE PHENOMENA

    Energy Technology Data Exchange (ETDEWEB)

    Alex J. Dragt

    2012-08-31

    Since 1980, under the grant DEFG02-96ER40949, the Department of Energy has supported the educational and research work of the University of Maryland Dynamical Systems and Accelerator Theory (DSAT) Group. The primary focus of this educational/research group has been on the computation and analysis of charged-particle beam transport using Lie algebraic methods, and on advanced methods for the computation of electromagnetic fields and multiparticle phenomena. This Final Report summarizes the accomplishments of the DSAT Group from its inception in 1980 through its end in 2011.

  19. TAMIS for rectal tumors: advancements of a new approach.

    Science.gov (United States)

    Rega, Daniela; Pace, Ugo; Niglio, Antonello; Scala, Dario; Sassaroli, Cinzia; Delrio, Paolo

    2016-03-01

    TAMIS allows transanal excision of rectal lesions by means of a single-incision access port and traditional laparoscopic instruments. This technique represents a promising treatment of rectal neoplasms, since it guarantees precise dissection and a reproducible approach. From May 2010 to September 2015, we performed excisions of rectal lesions in 55 patients using a SILS port. The pre-operative diagnosis comprised 26 tumours, 26 low- and high-grade dysplasias, and 3 other benign neoplasias. Eleven patients had neoadjuvant treatment. Pneumorectum was established at a pressure of 15-20 mmHg CO2 with continuous insufflation, and ordinary laparoscopic instruments were used to perform full-thickness resection of the rectal neoplasm with a conventional 5-mm 30° laparoscopic camera. The average operative time was 78 min. Postoperative recovery was uneventful in 53 cases: in one case, a Hartmann procedure was necessary on postoperative day two due to an intraoperative intraperitoneal perforation; in another case, a diverting colostomy was required on postoperative day five due to an intraoperative perforation of the vaginal wall. Unclear resection margins were detected in six patients: five of them subsequently underwent radical surgery; the remaining patient was unfit for radical surgery but is currently alive and well. Patients were discharged after a median of 3 days. Transanal minimally invasive surgery is an advanced transanal platform that provides a safe and effective method for low rectal tumors. The feasibility of TAMIS also for malignant lesions treated in a neoadjuvant setting could be cautiously evaluated in the future. PMID:27052544

  20. A pencil beam approach to proton computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Rescigno, Regina, E-mail: regina.rescigno@iphc.cnrs.fr; Bopp, Cécile; Rousseau, Marc; Brasse, David [Université de Strasbourg, IPHC, 23 rue du Loess, Strasbourg 67037, France and CNRS, UMR7178, Strasbourg 67037 (France)

    2015-11-15

    Purpose: A new approach to proton computed tomography (pCT) is presented. In this approach, protons are not tracked one-by-one but a beam of particles is considered instead. The elements of the pCT reconstruction problem (residual energy and path) are redefined on the basis of this new approach. An analytical image reconstruction algorithm applicable to this scenario is also proposed. Methods: The pencil beam (PB) and its propagation in matter were modeled by making use of the generalization of the Fermi–Eyges theory to account for multiple Coulomb scattering (MCS). This model was integrated into the pCT reconstruction problem, allowing the definition of the mean beam path concept similar to the most likely path (MLP) used in the single-particle approach. A numerical validation of the model was performed. The algorithm of filtered backprojection along MLPs was adapted to the beam-by-beam approach. The acquisition of a perfect proton scan was simulated and the data were used to reconstruct images of the relative stopping power of the phantom with the single-proton and beam-by-beam approaches. The resulting images were compared in a qualitative way. Results: The parameters of the modeled PB (mean and spread) were compared to Monte Carlo results in order to validate the model. For a water target, good agreement was found for the mean value of the distributions. As far as the spread is concerned, depth-dependent discrepancies as large as 2%–3% were found. For a heterogeneous phantom, discrepancies in the distribution spread ranged from 6% to 8%. The image reconstructed with the beam-by-beam approach showed a high level of noise compared to the one reconstructed with the classical approach. Conclusions: The PB approach to proton imaging may allow technical challenges imposed by the current proton-by-proton method to be overcome. In this framework, an analytical algorithm is proposed. Further work will involve a detailed study of the performances and limitations of
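
    The Fermi-Eyges ingredient of the PB model admits a compact numerical illustration: the lateral variance of the beam grows as the second moment of the scattering power T, i.e. sigma_x^2(z) = integral from 0 to z of T(u) (z - u)^2 du. The sketch below uses an arbitrary constant T, purely for illustration and unrelated to the paper's beam data:

        import numpy as np

        def lateral_variance(z, scattering_power, n=1000):
            """Fermi-Eyges lateral variance: sigma_x^2(z) = int_0^z T(u) (z-u)^2 du."""
            u = np.linspace(0.0, z, n)
            T = scattering_power(u)                      # scattering power [rad^2/cm]
            return np.sum(T * (z - u) ** 2) * (u[1] - u[0])

        # Constant T gives sigma^2 = T z^3 / 3 analytically, a quick sanity check.
        sigma2 = lateral_variance(10.0, lambda u: np.full_like(u, 1e-4))
        print(np.sqrt(sigma2), "cm rms lateral spread after 10 cm")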

  1. Computational systems biology approaches to anti-angiogenic cancer therapeutics.

    Science.gov (United States)

    Finley, Stacey D; Chu, Liang-Hui; Popel, Aleksander S

    2015-02-01

    Angiogenesis is an exquisitely regulated process that is required for physiological processes and is also important in numerous diseases. Tumors utilize angiogenesis to generate the vascular network needed to supply the cancer cells with nutrients and oxygen, and many cancer drugs aim to inhibit tumor angiogenesis. Anti-angiogenic therapy involves inhibiting multiple cell types, molecular targets, and intracellular signaling pathways. Computational tools are useful in guiding treatment strategies, predicting the response to treatment, and identifying new targets of interest. Here, we describe progress that has been made in applying mathematical modeling and bioinformatics approaches to study anti-angiogenic therapeutics in cancer.

  2. A New Computational Scheme for Computing Greeks by the Asymptotic Expansion Approach

    OpenAIRE

    Matsuoka, Ryosuke; Takahashi, Akihiko; Uchida, Yoshihiko

    2005-01-01

    We developed a new scheme for computing "Greeks" of derivatives by an asymptotic expansion approach. In particular, we derived analytical approximation formulae for deltas and vegas of plain vanilla and average European call options under general Markovian processes of underlying asset prices. Moreover, we introduced a new variance reduction method for Monte Carlo simulations based on the asymptotic expansion scheme. Finally, several numerical examples under CEV processes confirmed the validity...
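
    For context, the plain Monte Carlo baseline that such schemes refine can be sketched as a pathwise delta estimator. The example below uses geometric Brownian motion rather than the CEV processes of the paper, and all parameters are illustrative:

        import numpy as np

        def pathwise_delta_call(S0, K, r, sigma, T, n_paths=100_000, seed=0):
            """Pathwise Monte Carlo estimator of a European call delta under GBM."""
            rng = np.random.default_rng(seed)
            Z = rng.standard_normal(n_paths)
            ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
            # d/dS0 of exp(-rT) * max(ST - K, 0) equals exp(-rT) * 1{ST > K} * ST / S0
            return np.exp(-r * T) * np.mean((ST > K) * ST / S0)

        print(pathwise_delta_call(100, 100, 0.05, 0.2, 1.0))   # ~0.637 (Black-Scholes delta)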

  3. Systems Thinking: An Approach for Advancing Workplace Information Literacy

    Science.gov (United States)

    Somerville, Mary M.; Howard, Zaana

    2008-01-01

    As the importance of information literacy has gained increased recognition, so too have academic library professionals intensified their efforts to champion, activate, and advance these capabilities in others. To date, however, little attention has focused on advancing these essential competencies amongst practitioner advocates. This paper helps…

  4. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  5. Advances and Computational Tools towards Predictable Design in Biological Engineering

    Directory of Open Access Journals (Sweden)

    Lorenzo Pasotti

    2014-01-01

    Full Text Available The design process of complex systems in all fields of engineering requires a set of quantitatively characterized components and a method to predict the output of systems composed of such elements. This strategy relies on the modularity of the components used, or on the prediction of their context-dependent behaviour when a part's functioning depends on the specific context. Mathematical models usually support the whole process by guiding the selection of parts and by predicting the output of interconnected systems. Such a bottom-up design process cannot be trivially adopted for biological systems engineering, since part function is hard to predict when components are reused in different contexts. This issue and the intrinsic complexity of living systems limit the capability of synthetic biologists to predict the quantitative behaviour of biological systems. The high potential of synthetic biology strongly depends on the capability of mastering this issue. This review discusses the predictability issues of basic biological parts (promoters, ribosome binding sites, coding sequences, transcriptional terminators, and plasmids) when used to engineer simple and complex gene expression systems in Escherichia coli. A comparison between bottom-up and trial-and-error approaches is performed for all the discussed elements, and mathematical models supporting the prediction of part behaviour are illustrated.

  6. Use of computational modeling approaches in studying the binding interactions of compounds with human estrogen receptors.

    Science.gov (United States)

    Wang, Pan; Dang, Li; Zhu, Bao-Ting

    2016-01-01

    Estrogens have a whole host of physiological functions in many human organs and systems, including the reproductive, cardiovascular, and central nervous systems. Many naturally-occurring compounds with estrogenic or antiestrogenic activity are present in our environment and food sources. Synthetic estrogens and antiestrogens are also important therapeutic agents. At the molecular level, estrogen receptors (ERs) mediate most of the well-known actions of estrogens. Given recent advances in computational modeling tools, it is now highly practical to use these tools to study the interaction of human ERs with various types of ligands. There are two common categories of modeling techniques: one is quantitative structure-activity relationship (QSAR) analysis, which uses the structural information of the interacting ligands to predict the binding site properties of a macromolecule, and the other is molecular docking-based computational analysis, which uses the 3-dimensional structural information of both the ligands and the receptor to predict the binding interaction. In this review, we discuss recent results that employed these and other related computational modeling approaches to characterize the binding interaction of various estrogens and antiestrogens with the human ERs. These examples clearly demonstrate that computational modeling approaches, when used in combination with other experimental methods, are powerful tools that can precisely predict the binding interaction of various estrogenic ligands and their derivatives with the human ERs.
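
    The first category, QSAR, reduces in its simplest linear form to a regression from molecular descriptors to an activity measure. The toy least-squares fit below uses synthetic descriptors and affinities, purely to show the shape of such a model:

        import numpy as np

        # Columns: [logP, H-bond donors, molecular weight / 100]; values synthetic.
        descriptors = np.array([
            [3.2, 1, 2.7], [2.1, 2, 2.4], [4.0, 0, 3.1],
            [1.5, 3, 1.9], [3.8, 1, 2.9], [2.9, 2, 2.6],
        ])
        affinity = np.array([7.1, 6.0, 7.8, 5.2, 7.5, 6.4])     # e.g. pIC50

        A = np.column_stack([descriptors, np.ones(len(descriptors))])  # intercept
        coef, *_ = np.linalg.lstsq(A, affinity, rcond=None)
        print("coefficients:", coef)
        print("fitted affinities:", A @ coef)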

  7. Stochastic Computational Approach for Complex Nonlinear Ordinary Differential Equations

    Institute of Scientific and Technical Information of China (English)

    Junaid Ali Khan; Muhammad Asif Zahoor Raja; Ijaz Mansoor Qureshi

    2011-01-01

    We present an evolutionary computational approach for the solution of nonlinear ordinary differential equations (NLODEs). The mathematical modeling is performed by a feed-forward artificial neural network that defines an unsupervised error. The training of these networks is achieved by a hybrid intelligent algorithm, a combination of global search with a genetic algorithm and local search by a pattern search technique. The applicability of this approach ranges from single-order NLODEs to systems of coupled differential equations. We illustrate the method by solving a variety of model problems and present comparisons with solutions obtained by exact methods and classical numerical methods. The solution is provided on a continuous finite time interval, unlike other numerical techniques of comparable accuracy. With the advent of neuroprocessors and digital signal processors, the method becomes particularly interesting due to the expected essential gains in execution speed.
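
    A minimal sketch of this idea trains a tiny network to satisfy y' = -y, y(0) = 1, through the unsupervised residual loss alone. A simple (1+λ) evolution strategy stands in here for the paper's genetic-algorithm/pattern-search hybrid; the architecture and all hyperparameters are illustrative:

        import numpy as np

        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 2.0, 41)

        def net(params, x):
            """Tiny one-hidden-layer network N(x)."""
            w1, b1, w2 = params[:8], params[8:16], params[16:24]
            return np.tanh(np.outer(x, w1) + b1) @ w2

        def residual_loss(params, y0=1.0, h=1e-4):
            """Unsupervised loss: squared residual of y' = -y for trial y = y0 + t N(t)."""
            y = y0 + t * net(params, t)
            y_plus = y0 + (t + h) * net(params, t + h)
            y_minus = y0 + (t - h) * net(params, t - h)
            dy = (y_plus - y_minus) / (2 * h)
            return np.mean((dy + y) ** 2)

        best = rng.normal(0.0, 0.5, 24)
        best_loss = residual_loss(best)
        for _ in range(2000):                       # (1+lambda) evolution strategy
            trials = best + rng.normal(0.0, 0.05, (20, 24))
            losses = np.array([residual_loss(p) for p in trials])
            i = int(np.argmin(losses))
            if losses[i] < best_loss:
                best, best_loss = trials[i], losses[i]

        print(best_loss)                                            # residual loss
        print(np.max(np.abs(1.0 + t * net(best, t) - np.exp(-t))))  # error vs exact e^{-t}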

  8. Computational Approach to Diarylprolinol-Silyl Ethers in Aminocatalysis.

    Science.gov (United States)

    Halskov, Kim Søholm; Donslund, Bjarke S; Paz, Bruno Matos; Jørgensen, Karl Anker

    2016-05-17

    Asymmetric organocatalysis has witnessed a remarkable development since its "re-birth" in the beginning of the millennium. In this rapidly growing field, computational investigations have proven to be an important contribution for the elucidation of mechanisms and rationalizations of the stereochemical outcomes of many of the reaction concepts developed. The improved understanding of mechanistic details has facilitated the further advancement of the field. The diarylprolinol-silyl ethers have since their introduction been one of the most applied catalysts in asymmetric aminocatalysis due to their robustness and generality. Although aminocatalytic methods at first glance appear to follow relatively simple mechanistic principles, more comprehensive computational studies have shown that this notion in some cases is deceiving and that more complex pathways might be operating. In this Account, the application of density functional theory (DFT) and other computational methods on systems catalyzed by the diarylprolinol-silyl ethers is described. It will be illustrated how computational investigations have shed light on the structure and reactivity of important intermediates in aminocatalysis, such as enamines and iminium ions formed from aldehydes and α,β-unsaturated aldehydes, respectively. Enamine and iminium ion catalysis can be classified as HOMO-raising and LUMO-lowering activation modes. In these systems, the exclusive reactivity through one of the possible intermediates is often a requisite for achieving high stereoselectivity; therefore, the appreciation of subtle energy differences has been vital for the efficient development of new stereoselective reactions. The diarylprolinol-silyl ethers have also allowed for novel activation modes for unsaturated aldehydes, which have opened up avenues for the development of new remote functionalization reactions of poly-unsaturated carbonyl compounds via di-, tri-, and tetraenamine intermediates and vinylogous iminium ions

  10. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  11. Advances in Computational Fluid-Structure Interaction and Flow Simulation Conference

    CERN Document Server

    Takizawa, Kenji

    2016-01-01

    This contributed volume celebrates the work of Tayfun E. Tezduyar on the occasion of his 60th birthday. The articles it contains were born out of the Advances in Computational Fluid-Structure Interaction and Flow Simulation (AFSI 2014) conference, also dedicated to Prof. Tezduyar and held at Waseda University in Tokyo, Japan on March 19-21, 2014. The contributing authors represent a group of international experts in the field who discuss recent trends and new directions in computational fluid dynamics (CFD) and fluid-structure interaction (FSI). Organized into seven distinct parts arranged by thematic topics, the papers included cover basic methods and applications of CFD, flows with moving boundaries and interfaces, phase-field modeling, computer science and high-performance computing (HPC) aspects of flow simulation, mathematical methods, biomedical applications, and FSI. Researchers, practitioners, and advanced graduate students working on CFD, FSI, and related topics will find this collection to be a defi...

  12. A new approach in CHP steam turbines thermodynamic cycles computations

    Directory of Open Access Journals (Sweden)

    Grković Vojin R.

    2012-01-01

    Full Text Available This paper presents a new approach to the mathematical modeling of thermodynamic cycles and electric power of utility district-heating and cogeneration steam turbines. The approach is based on the application of dimensionless mass flows, which describe the thermodynamic cycle of a combined heat and power steam turbine. The mass flows are calculated relative to the mass flow to the low-pressure turbine. The procedure introduces the extraction mass flow load parameter νh, which clearly reflects the energy transformation process and the cogeneration turbine design features, as well as the turbine's fitness for the electrical energy system requirements. The presented approach allows fast computations, as well as direct calculation of the selected energy efficiency indicators. The approach is exemplified with calculated results for the district heat power to electric power ratio, as well as the cycle efficiency, versus νh. The influence of νh on the conformity of a combined heat and power turbine to the grid requirements is also analyzed and discussed. [Project of the Ministry of Science of the Republic of Serbia, no. 33049: Development of a CHP demonstration plant with gasification of biomass]
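
    A heavily simplified numerical illustration of the dimensionless-flow bookkeeping (not the paper's model): with all flows expressed relative to the low-pressure turbine flow, the extraction flow equals νh, and placeholder enthalpy drops then give the heat-to-power ratio directly:

        def heat_to_power_ratio(nu_h, dh_hp=500.0, dh_lp=700.0, dh_cond=2200.0):
            """Illustrative two-section extraction turbine; enthalpy drops in kJ/kg
            are placeholder values. The HP section carries 1 + nu_h, the LP section 1."""
            p_el = (1.0 + nu_h) * dh_hp + 1.0 * dh_lp   # specific electric power
            q_dh = nu_h * dh_cond                       # specific district-heat power
            return q_dh / p_el

        for nu_h in (0.0, 0.5, 1.0, 2.0):
            print(nu_h, round(heat_to_power_ratio(nu_h), 2))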

  13. NATO Advanced Study Institute on Advances in the Computer Simulations of Liquid Crystals

    CERN Document Server

    Zannoni, Claudio

    2000-01-01

    Computer simulations provide an essential set of tools for understanding the macroscopic properties of liquid crystals and of their phase transitions in terms of molecular models. While simulations of liquid crystals are based on the same general Monte Carlo and molecular dynamics techniques as are used for other fluids, they present a number of specific problems and peculiarities connected to the intrinsic properties of these mesophases. The field of computer simulations of anisotropic fluids is interdisciplinary and is evolving very rapidly. The present volume covers a variety of techniques and model systems, from lattices to hard particle and Gay-Berne to atomistic, for thermotropics, lyotropics, and some biologically interesting liquid crystals. Contributions are written by an excellent panel of international lecturers and provide a timely account of the techniques and problems in the field.

  14. 76 FR 52954 - Workshop: Advancing Research on Mixtures; New Perspectives and Approaches for Predicting Adverse...

    Science.gov (United States)

    2011-08-24

    ... HUMAN SERVICES Workshop: Advancing Research on Mixtures; New Perspectives and Approaches for Predicting... ``Advancing Research on Mixtures: New Perspectives and Approaches for Predicting Adverse Human Health Effects... Research and Training, NIEHS, P.O. Box 12233, MD K3-04, Research Triangle Park, NC 27709, (telephone)...

  15. Crowd Computing as a Cooperation Problem: An Evolutionary Approach

    Science.gov (United States)

    Christoforou, Evgenia; Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Sánchez, Angel

    2013-05-01

    Cooperation is one of the socio-economic issues that has received the most attention from the physics community. The problem has mostly been considered by studying games such as the Prisoner's Dilemma or the Public Goods Game. Here, we take a step forward by studying cooperation in the context of crowd computing. We introduce a model loosely based on principal-agent theory in which people (workers) contribute to the solution of a distributed problem by computing answers and reporting to the problem proposer (master). To go beyond classical approaches involving the concept of Nash equilibrium, we work in an evolutionary framework in which both the master and the workers update their behavior through reinforcement learning. Using a Markov chain approach, we show theoretically that under certain (not very restrictive) conditions, the master can ensure the reliability of the answer resulting from the process. Then, we study the model by numerical simulations, finding that convergence, meaning that the system reaches a point in which it always produces reliable answers, may in general be much faster than the upper bounds given by the theoretical calculation. We also discuss the effects of the master's level of tolerance to defectors, about which the theory does not provide information. The discussion shows that the system works even with very large tolerances. We conclude with a discussion of our results and possible directions to carry this research further.
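
    A toy simulation in this spirit couples reinforcement-learning workers to an auditing master; the payoffs, learning rate, and audit probability below are invented for illustration and are not the paper's parameters:

        import random

        def simulate(n_workers=9, rounds=2000, p_audit=0.1, seed=42):
            """Workers reinforce 'cheat' vs 'compute' from payoffs; the master
            audits with probability p_audit and fines detected cheaters."""
            random.seed(seed)
            cost, cheat_gain, fine, pay = 0.1, 1.0, 2.0, 1.0
            p_cheat, lr = [0.5] * n_workers, 0.01
            for _ in range(rounds):
                audited = random.random() < p_audit
                for i in range(n_workers):
                    cheats = random.random() < p_cheat[i]
                    payoff = (-fine if audited else cheat_gain) if cheats else pay - cost
                    target = 1.0 if cheats else 0.0
                    # Reinforce (or suppress) the chosen action according to its payoff
                    p_cheat[i] += lr * payoff * (target - p_cheat[i])
                    p_cheat[i] = min(1.0, max(0.0, p_cheat[i]))
            return sum(p_cheat) / n_workers

        print(simulate())   # mean cheating probability after learning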

  16. Advanced computational methods for the assessment of reactor core behaviour during reactivity initiated accidents. Final report

    International Nuclear Information System (INIS)

    The document at hand serves as the final report for the reactor safety research project RS1183, "Advanced Computational Methods for the Assessment of Reactor Core Behavior During Reactivity-Initiated Accidents". The work performed in the framework of this project was dedicated to the development, validation and application of advanced computational methods for the simulation of transients and accidents of nuclear installations. These simulation tools describe in particular the behavior of the reactor core (with respect to neutronics, thermal-hydraulics and thermal mechanics) at a very high level of detail. The overall goal of this project was the deployment of a modern nuclear computational chain which provides, besides advanced 3D tools for coupled neutronics/ thermal-hydraulics full core calculations, also appropriate tools for the generation of multi-group cross sections and Monte Carlo models for the verification of the individual calculational steps. This computational chain shall primarily be deployed for light water reactors (LWR), but should beyond that also be applicable for innovative reactor concepts. Thus, validation on computational benchmarks and critical experiments was of paramount importance. Finally, appropriate methods for uncertainty and sensitivity analysis were to be integrated into the computational framework, in order to assess and quantify the uncertainties due to insufficient knowledge of data, as well as due to methodological aspects.

  17. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    Science.gov (United States)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should be used to provide a tool for designing better aerospace vehicles while at the same time reducing development costs by performing computations using Navier-Stokes equations solution algorithms and permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry, therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  18. Computational Benefits Using an Advanced Concatenation Scheme Based on Reduced Order Models for RF Structures

    CERN Document Server

    Heller, Johann; Van Rienen, Ursula; doi:10.1016/j.phpro.2015.11.060

    2015-01-01

    The computation of electromagnetic fields and parameters derived thereof for lossless radio frequency (RF) structures filled with isotropic media is an important task for the design and operation of particle accelerators. Unfortunately, these computations are often highly demanding with regard to computational effort. The entire computational demand of the problem can be reduced using decomposition schemes in order to solve the field problems on standard workstations. This paper presents one of the first detailed comparisons between the recently proposed state-space concatenation approach (SSC) and a direct computation for an accelerator cavity with coupler-elements that break the rotational symmetry.

  19. Advanced approaches to high intensity laser-driven ion acceleration

    International Nuclear Information System (INIS)

    Since the pioneering work that was carried out 10 years ago, the generation of highly energetic ion beams from laser-plasma interactions has been investigated in much detail in the regime of target normal sheath acceleration (TNSA). Creation of ion beams with small longitudinal and transverse emittance and energies extending up to tens of MeV fueled visions of compact, laser-driven ion sources for applications such as ion beam therapy of tumors or fast ignition inertial confinement fusion. However, new pathways are of crucial importance to push the current limits of laser-generated ion beams further towards parameters necessary for those applications. The presented PhD work was intended to develop and explore advanced approaches to high intensity laser-driven ion acceleration that reach beyond TNSA. In this spirit, ion acceleration from two novel target systems was investigated, namely mass-limited microspheres and nm-thin, free-standing diamond-like carbon (DLC) foils. Using such ultrathin foils, a new regime of ion acceleration was found where the laser transfers energy to all electrons located within the focal volume. While for TNSA the accelerating electric field is stationary and ion acceleration is spatially separated from laser absorption into electrons, now a localized longitudinal field enhancement is present that co-propagates with the ions as the accompanying laser pulse pushes the electrons forward. Unprecedented maximum ion energies were obtained, reaching beyond 0.5 GeV for carbon C6+ and thus exceeding previous TNSA results by about one order of magnitude. When changing the laser polarization to circular, electron heating and expansion were shown to be efficiently suppressed, resulting for the first time in a phase-stable acceleration that is dominated by the laser radiation pressure which led to the observation of a peaked C6+ spectrum. Compared to quasi-monoenergetic ion beam generation within the TNSA regime, a more than 40 times increase in

  20. Advanced approaches to high intensity laser-driven ion acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Henig, Andreas

    2010-04-26

    Since the pioneering work that was carried out 10 years ago, the generation of highly energetic ion beams from laser-plasma interactions has been investigated in much detail in the regime of target normal sheath acceleration (TNSA). Creation of ion beams with small longitudinal and transverse emittance and energies extending up to tens of MeV fueled visions of compact, laser-driven ion sources for applications such as ion beam therapy of tumors or fast ignition inertial confinement fusion. However, new pathways are of crucial importance to push the current limits of laser-generated ion beams further towards parameters necessary for those applications. The presented PhD work was intended to develop and explore advanced approaches to high intensity laser-driven ion acceleration that reach beyond TNSA. In this spirit, ion acceleration from two novel target systems was investigated, namely mass-limited microspheres and nm-thin, free-standing diamond-like carbon (DLC) foils. Using such ultrathin foils, a new regime of ion acceleration was found where the laser transfers energy to all electrons located within the focal volume. While for TNSA the accelerating electric field is stationary and ion acceleration is spatially separated from laser absorption into electrons, now a localized longitudinal field enhancement is present that co-propagates with the ions as the accompanying laser pulse pushes the electrons forward. Unprecedented maximum ion energies were obtained, reaching beyond 0.5 GeV for carbon C6+ and thus exceeding previous TNSA results by about one order of magnitude. When changing the laser polarization to circular, electron heating and expansion were shown to be efficiently suppressed, resulting for the first time in a phase-stable acceleration that is dominated by the laser radiation pressure which led to the observation of a peaked C6+ spectrum. Compared to quasi-monoenergetic ion beam generation within the TNSA regime, a more than 40 times

  1. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk considerations presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing the coefficients of a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can be similarly handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and Conditional Value-at-Risk (CVaR).
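
    The parametric idea can be sketched in a few lines: fix a one-parameter family of liquidation schedules, estimate the expected cost and CVaR by simulation, and choose the parameter by static optimization. The impact model, objective weights, and every numeric value below are illustrative assumptions:

        import numpy as np

        def cvar(samples, alpha=0.95):
            """Empirical CVaR: mean of the worst (1 - alpha) tail of the costs."""
            s = np.sort(samples)
            return s[int(np.ceil(alpha * len(s))):].mean()

        def execution_cost(kappa, X=1e5, N=20, sigma=0.02, eta=1e-6,
                           n_sims=5000, seed=0):
            """Cost of an exponential schedule x_k ~ exp(-kappa k): quadratic
            temporary impact plus price risk on the remaining inventory."""
            rng = np.random.default_rng(seed)
            w = np.exp(-kappa * np.arange(N))
            trades = X * w / w.sum()
            remaining = X - np.cumsum(trades)           # inventory after each trade
            noise = rng.standard_normal((n_sims, N)) * sigma
            return eta * (trades ** 2).sum() - (noise * remaining).sum(axis=1)

        objective = lambda k: 0.5 * execution_cost(k).mean() + 0.5 * cvar(execution_cost(k))
        best_kappa = min(np.linspace(0.0, 1.0, 21), key=objective)
        print("best kappa:", best_kappa)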

  2. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    Science.gov (United States)

    Wigton, Larry

    1996-01-01

    Improvements to the numerical linear algebra routines for use in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR, are reported. A fast distance-calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was devoted to improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  3. A Grounded Theory Approach to Physical Activity and Advanced Cancer

    Directory of Open Access Journals (Sweden)

    Sonya S. Lowe

    2015-11-01

    Full Text Available Background: Physical activity has demonstrated benefits for cancer-related fatigue and physical functioning in early-stage cancer patients; however, the role of physical activity at the end stage of cancer has not been established. To challenge positivist-empiricist assumptions, I am seeking to develop a new theoretical framework that is grounded in the advanced cancer patient's experience of activity. Aim: To gain an in-depth understanding of the experience of activity and quality of life in advanced cancer patients. Objectives: (1) to explore the meaning of activity for advanced cancer patients in the context of their day-to-day life, (2) to elicit advanced cancer patients' perceptions of activity with respect to their quality of life, and (3) to elicit advanced cancer patients' views of barriers and facilitators to activity in the context of their day-to-day life. Study Design: A two-phase, cross-sectional, qualitative study will be conducted through the postpositivist lens of subtle realism and informed by the principles of grounded theory methods. Study Methods: Advanced cancer patients will be recruited through the outpatient department of a tertiary cancer center. For Phase One, participants will wear an activPAL™ activity monitor and fill out a daily record sheet for seven days. For Phase Two, the activity monitor output and daily record sheets will be used as qualitative probes for face-to-face, semistructured interviews. Concurrent coding, constant comparative analysis, and theoretical sampling will continue with the aim of achieving as close as possible to theoretical saturation. Ethics and Discussion: Ethical and scientific approval will be obtained from all local institutional review boards prior to study commencement. The findings will generate new mid-level theory about the experience of activity and quality of life in advanced cancer patients and aid in the development of a new theoretical framework for designing

  4. Computers-for-edu: An Advanced Business Application Programming (ABAP) Teaching Case

    Science.gov (United States)

    Boyle, Todd A.

    2007-01-01

    The "Computers-for-edu" case is designed to provide students with hands-on exposure to creating Advanced Business Application Programming (ABAP) reports and dialogue programs, as well as navigating various mySAP Enterprise Resource Planning (ERP) transactions needed by ABAP developers. The case requires students to apply a wide variety of ABAP…

  5. Advanced computational tools and methods for nuclear analyses of fusion technology systems

    International Nuclear Information System (INIS)

    An overview is presented of advanced computational tools and methods developed recently for nuclear analyses of Fusion Technology systems such as the experimental device ITER ('International Thermonuclear Experimental Reactor') and the intense neutron source IFMIF ('International Fusion Material Irradiation Facility'). These include Monte Carlo based computational schemes for the calculation of three-dimensional shut-down dose rate distributions, methods, codes and interfaces for the use of CAD geometry models in Monte Carlo transport calculations, algorithms for Monte Carlo based sensitivity/uncertainty calculations, as well as computational techniques and data for IFMIF neutronics and activation calculations. (author)

  6. A machine-learning approach for computation of fractional flow reserve from coronary computed tomography.

    Science.gov (United States)

    Itu, Lucian; Rapaka, Saikiran; Passerini, Tiziano; Georgescu, Bogdan; Schwemmer, Chris; Schoebinger, Max; Flohr, Thomas; Sharma, Puneet; Comaniciu, Dorin

    2016-07-01

    Fractional flow reserve (FFR) is a functional index quantifying the severity of coronary artery lesions and is clinically obtained using an invasive, catheter-based measurement. Recently, physics-based models have shown great promise in being able to noninvasively estimate FFR from patient-specific anatomical information, e.g., obtained from computed tomography scans of the heart and the coronary arteries. However, these models have high computational demand, limiting their clinical adoption. In this paper, we present a machine-learning-based model for predicting FFR as an alternative to physics-based approaches. The model is trained on a large database of synthetically generated coronary anatomies, where the target values are computed using the physics-based model. The trained model predicts FFR at each point along the centerline of the coronary tree, and its performance was assessed by comparing the predictions against physics-based computations and against invasively measured FFR for 87 patients and 125 lesions in total. Correlation between machine-learning and physics-based predictions was excellent (0.9994, P < 0.001). Compared against invasively measured FFR, functionally significant lesions were identified by the machine-learning algorithm with a sensitivity of 81.6%, a specificity of 83.9%, and an accuracy of 83.2%. The correlation with invasive FFR was 0.729 (P < 0.001), and execution times were substantially shorter for the machine-learning model on a workstation with a 3.4-GHz Intel i7 8-core processor.
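
    The training strategy, supervised learning on synthetic anatomies labeled by the physics-based model, can be caricatured as follows. The analytic "FFR" surrogate and the feature set here are invented stand-ins for the CFD-computed targets and geometric features of the paper:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        n = 20000

        # Synthetic lesion geometries (features) with a crude analytic surrogate
        # standing in for the physics-based FFR target.
        area_reduction = rng.uniform(0.0, 0.9, n)    # fractional lumen area reduction
        lesion_length = rng.uniform(2.0, 30.0, n)    # mm
        vessel_radius = rng.uniform(1.0, 2.5, n)     # mm
        X = np.column_stack([area_reduction, lesion_length, vessel_radius])
        ffr = 1.0 - 0.5 * area_reduction**2 * (lesion_length / 30.0) / vessel_radius
        ffr = np.clip(ffr + rng.normal(0.0, 0.01, n), 0.3, 1.0)

        model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, ffr)
        print(model.predict([[0.7, 15.0, 1.5]]))     # predicted FFR for a new lesion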

  7. Photonic reservoir computing: a new approach to optical information processing

    Science.gov (United States)

    Vandoorne, Kristof; Fiers, Martin; Verstraeten, David; Schrauwen, Benjamin; Dambre, Joni; Bienstman, Peter

    2010-06-01

    Despite ever increasing computational power, recognition and classification problems remain challenging to solve. Recently, advances have been made by the introduction of the new concept of reservoir computing. This is a methodology coming from the field of machine learning and neural networks that has been successfully used in several pattern classification problems, like speech and image recognition. Thus far, most implementations have been in software, limiting their speed and power efficiency. Photonics could be an excellent platform for a hardware implementation of this concept because of its inherent parallelism and unique nonlinear behaviour. Moreover, a photonic implementation offers the promise of massively parallel information processing with low power and high speed. We propose using a network of coupled Semiconductor Optical Amplifiers (SOA) and show in simulation that it could be used as a reservoir by comparing it to conventional software implementations using a benchmark speech recognition task. In spite of the differences with classical reservoir models, the performance of our photonic reservoir is comparable to that of conventional implementations and sometimes slightly better. As our implementation uses coherent light for information processing, we find that phase tuning is crucial to obtain high performance. In parallel we investigate the use of a network of photonic crystal cavities. The coupled mode theory (CMT) is used to investigate these resonators. A new framework is designed to model networks of resonators and SOAs. The same network topologies are used, but feedback is added to control the internal dynamics of the system. By adjusting the readout weights of the network in a controlled manner, we can generate arbitrary periodic patterns.
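
    For context, the conventional software reservoir that such photonic implementations are benchmarked against is often an echo state network: a fixed random recurrent network in which only the linear readout is trained. A minimal numpy sketch (sizes, scaling factors, and the toy memory task are arbitrary choices, not the paper's benchmark):

        import numpy as np

        rng = np.random.default_rng(1)
        n_res, n_in = 200, 1

        # Fixed random input and reservoir weights; only the readout is trained.
        W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.standard_normal((n_res, n_res))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

        def run_reservoir(u):
            """Collect reservoir states for an input sequence u of shape (T, n_in)."""
            x = np.zeros(n_res)
            states = []
            for u_t in u:
                x = np.tanh(W @ x + W_in @ u_t)
                states.append(x.copy())
            return np.array(states)

        u = rng.standard_normal((500, n_in))
        target = np.roll(u[:, 0], 3)            # toy task: recall the input 3 steps back
        S = run_reservoir(u)
        W_out, *_ = np.linalg.lstsq(S, target, rcond=None)  # train linear readout
        print(np.corrcoef(S @ W_out, target)[0, 1])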

  8. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  9. Innovations and advances in computing, informatics, systems sciences, networking and engineering

    CERN Document Server

    Elleithy, Khaled

    2015-01-01

    Innovations and Advances in Computing, Informatics, Systems Sciences, Networking and Engineering. This book includes a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of Computer Science, Informatics, and Systems Sciences, and Engineering. It includes selected papers from the conference proceedings of the Eighth and some selected papers of the Ninth International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2012 & CISSE 2013). Coverage includes topics in: Industrial Electronics, Technology & Automation, Telecommunications and Networking, Systems, Computing Sciences and Software Engineering, Engineering Education, Instructional Technology, Assessment, and E-learning. · Provides the latest in a series of books growing out of the International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering; · Includes chapters in the most a...

  10. 1st International Conference on Computational Advancement in Communication Circuits and Systems

    CERN Document Server

    Dalapati, Goutam; Banerjee, P; Mallick, Amiya; Mukherjee, Moumita

    2015-01-01

    This book comprises the proceedings of the 1st International Conference on Computational Advancement in Communication Circuits and Systems (ICCACCS 2014), organized by Narula Institute of Technology under the patronage of the JIS group and affiliated to West Bengal University of Technology. The conference was supported by the Technical Education Quality Improvement Program (TEQIP), New Delhi, India, and had technical collaboration with the IEEE Kolkata Section, with Springer as publication partner. The book contains 62 refereed papers that aim to highlight new theoretical and experimental findings in the field of electronics and communication engineering, including interdisciplinary fields like advanced computing, pattern recognition and analysis, and signal and image processing. The proceedings cover the principles, techniques and applications in microwave & devices, communication & networking, signal & image processing, and computations & mathematics & control. The proceedings reflect the conference’s emp...

  11. From computer-assisted intervention research to clinical impact: The need for a holistic approach.

    Science.gov (United States)

    Ourselin, Sébastien; Emberton, Mark; Vercauteren, Tom

    2016-10-01

    The early days of the field of medical image computing (MIC) and computer-assisted intervention (CAI), when publishing a strong self-contained methodological algorithm was enough to produce impact, are over. As a community, we now have a substantial responsibility to translate our scientific progress into improved patient care. In the field of computer-assisted interventions, the emphasis is also shifting from the mere use of well-known established imaging modalities and position trackers to the design and combination of innovative sensing, elaborate computational models and fine-grained clinical workflow analysis to create devices with unprecedented capabilities. The barriers to translating such devices in the complex and understandably heavily regulated surgical and interventional environment can seem daunting. Whether we leave the translation task mostly to our industrial partners or welcome, as researchers, an important share of it is up to us. We argue that embracing the complexity of surgical and interventional sciences is mandatory to the evolution of the field. Being able to do so requires large-scale infrastructure and a critical mass of expertise that very few research centres have. In this paper, we emphasise the need for a holistic approach to computer-assisted interventions where clinical, scientific, engineering and regulatory expertise are combined as a means of moving towards clinical impact. To ensure that the breadth of infrastructure and expertise required for translational computer-assisted intervention research does not lead to a situation where the field advances only thanks to a handful of exceptionally large research centres, we also advocate that solutions need to be designed to lower the barriers to entry. Inspired by fields such as particle physics and astronomy, we claim that centralised very large innovation centres with state-of-the-art technology and health technology assessment capabilities backed by core support staff and open

  12. A Computational Drug Repositioning Approach for Targeting Oncogenic Transcription Factors.

    Science.gov (United States)

    Gayvert, Kaitlyn M; Dardenne, Etienne; Cheung, Cynthia; Boland, Mary Regina; Lorberbaum, Tal; Wanjala, Jackline; Chen, Yu; Rubin, Mark A; Tatonetti, Nicholas P; Rickman, David S; Elemento, Olivier

    2016-06-14

    Mutations in transcription factor (TF) genes are frequently observed in tumors, often leading to aberrant transcriptional activity. Unfortunately, TFs are often considered undruggable due to the absence of targetable enzymatic activity. To address this problem, we developed CRAFTT, a computational drug-repositioning approach for targeting TF activity. CRAFTT combines ChIP-seq with drug-induced expression profiling to identify small molecules that can specifically perturb TF activity. Application to ENCODE ChIP-seq datasets revealed known drug-TF interactions, and a global drug-protein network analysis supported these predictions. Application of CRAFTT to ERG, a pro-invasive, frequently overexpressed oncogenic TF, predicted that dexamethasone would inhibit ERG activity. Dexamethasone significantly decreased cell invasion and migration in an ERG-dependent manner. Furthermore, analysis of electronic medical record data indicates a protective role for dexamethasone against prostate cancer. Altogether, our method provides a broadly applicable strategy for identifying drugs that specifically modulate TF activity. PMID:27264179
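
    A hedged sketch of the style of ranking CRAFTT performs — scoring each drug by how strongly its induced expression signature pushes a TF's target genes down relative to background. This is purely illustrative: the data are random stand-ins, and the real method combines ChIP-seq target assignment with curated drug-perturbation profiles.

        import numpy as np

        # Rows of drug_profiles: drug-induced expression changes per gene
        # (e.g., from a perturbation compendium). Random stand-ins here.
        rng = np.random.default_rng(2)
        n_genes = 1000
        drug_profiles = {d: rng.standard_normal(n_genes) for d in ["drugA", "drugB", "drugC"]}

        # TF target set, e.g., genes bound in ChIP-seq and activated by the TF.
        tf_targets = set(rng.choice(n_genes, 50, replace=False))
        in_set = np.array([i in tf_targets for i in range(n_genes)])

        def score(profile):
            # Negative score => the drug pushes TF targets down vs. background.
            return profile[in_set].mean() - profile[~in_set].mean()

        ranked = sorted(drug_profiles, key=lambda d: score(drug_profiles[d]))
        print(ranked[0], "is the best candidate to inhibit the TF program")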

  13. Leaching from Heterogeneous Heck Catalysts: A Computational Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The possibility of carrying out a purely heterogeneous Heck reaction in practice, without Pd leaching, has previously been considered by a number of research groups, but no general consensus has yet been reached. Here, the reaction was, for the first time, evaluated by a simple computational approach. Modelling experiments were performed on one of the initial catalytic steps: phenyl halide attachment on Pd (111) to (100) and (111) to (111) ridges of a Pd crystal. Three surface structures of the resulting [PhPdX] were identified as possible reactive intermediates. Following potential energy minimisation calculations based on a universal force field, the relative stabilities of these surface species were then determined. Results showed the most stable species to be one in which a Pd ridge atom is removed from the Pd crystal structure, suggesting that Pd leaching induced by phenyl halides is energetically favourable.

  14. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov

    2013-07-01

    Full Text Available Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into a whole-system-level understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand the information they contain about diet–oral microbiome–host mucosal transcriptome interactions. In particular we focus on systems biology of Porphyromonas gingivalis in the context of high-throughput computational methods tightly integrated with translational systems medicine. These approaches have applications both for basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, and for human disease, where we can validate new mechanisms and biomarkers for the prevention and treatment of chronic disorders.

  15. Vehicular traffic noise prediction using soft computing approach.

    Science.gov (United States)

    Singh, Daljeet; Nigam, S P; Agrawal, V P; Kumar, Maneek

    2016-12-01

    A new approach for the development of vehicular traffic noise prediction models is presented. Four different soft computing methods, namely, Generalized Linear Model, Decision Trees, Random Forests and Neural Networks, have been used to develop models to predict the hourly equivalent continuous sound pressure level, Leq, at different locations in the Patiala city in India. The input variables include the traffic volume per hour, percentage of heavy vehicles and average speed of vehicles. The performance of the four models is compared on the basis of performance criteria of coefficient of determination, mean square error and accuracy. 10-fold cross validation is done to check the stability of the Random Forest model, which gave the best results. A t-test is performed to check the fit of the model with the field data.
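
    A minimal sketch of the best-performing setup as described — a random forest over the three inputs, checked with 10-fold cross-validation (assuming scikit-learn; the data and the toy Leq formula are placeholders):

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        n = 500

        # Placeholder inputs: traffic volume (veh/h), % heavy vehicles, mean speed (km/h).
        X = np.column_stack([
            rng.integers(100, 3000, n),
            rng.uniform(0, 40, n),
            rng.uniform(20, 80, n),
        ])
        # Toy hourly equivalent sound level in dB(A), noisy function of the inputs.
        leq = 50 + 8 * np.log10(X[:, 0]) + 0.1 * X[:, 1] + rng.normal(0, 1, n)

        model = RandomForestRegressor(n_estimators=200, random_state=0)
        scores = cross_val_score(model, X, leq, cv=10, scoring="r2")
        print(scores.mean())  # stability check across the 10 folds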

  16. Computer Modeling of Violent Intent: A Content Analysis Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    2014-01-03

    We present a computational approach to modeling the intent of a communication source representing a group or an individual to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, that have proved to serve as effective channels in furthering sociopolitical change.

  17. Computer Aided Interpretation Approach for Optical Tomographic Images

    CERN Document Server

    Klose, Christian D; Netz, Uwe; Beuthan, Juergen; Hielscher, Andreas H

    2010-01-01

    A computer-aided interpretation approach is proposed to detect rheumatoid arthritis (RA) of human finger joints in optical tomographic images. The image interpretation method employs a multi-variate signal detection analysis aided by a machine-learning classification algorithm called Self-Organizing Mapping (SOM). Unlike in previous studies, this allows for combining multiple physical image parameters, such as minimum and maximum values of the absorption coefficient, for identifying affected and unaffected joints. Classification performances obtained by the proposed method were evaluated in terms of sensitivity, specificity, Youden index, and mutual information. Different methods (i.e., clinical diagnostics, ultrasound imaging, magnetic resonance imaging and inspection of optical tomographic images) were used as "ground truth" benchmarks to determine the performance of image interpretations. Using data from 100 finger joints, findings suggest that some parameter combinations lead to higher sensitivities while...
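
    The reported evaluation quantities follow directly from a confusion matrix; a small helper (illustrative, not the authors' code):

        def detection_metrics(tp, fp, tn, fn):
            """Sensitivity, specificity, and Youden index from confusion counts."""
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            youden = sensitivity + specificity - 1.0  # 0 = chance, 1 = perfect
            return sensitivity, specificity, youden

        # Example: classifier output for 100 finger joints vs. a clinical benchmark.
        print(detection_metrics(tp=38, fp=9, tn=41, fn=12))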

  18. Computational Approach to Seasonal Changes of Living Leaves

    Directory of Open Access Journals (Sweden)

    Ying Tang

    2013-01-01

    Full Text Available This paper proposes a computational approach to seasonal changes of living leaves by combining geometric deformations and textural color changes. The geometric model of a leaf is generated by triangulating the scanned image of a leaf using an optimized mesh. The triangular mesh of the leaf is deformed by an improved mass-spring model, with the deformation controlled by setting different mass values for the vertices on the leaf model. In order to adaptively control the deformation of different regions of the leaf, the mass values of vertices are set in proportion to the pixel intensities of a corresponding user-specified grayscale mask map. The geometric deformations as well as the textural color changes of a leaf are used to simulate the seasonal changing process of leaves based on a Markov chain model with different environmental parameters including temperature, humidity, and time. Experimental results show that the method successfully simulates the seasonal changes of leaves.
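
    Two of the ingredients are easy to illustrate in isolation: mask-driven vertex masses for the mass-spring model, and a Markov chain over color states stepped forward in time. A sketch only — the state set and transition matrix below are invented, and in the paper the transitions depend on temperature, humidity, and time.

        import numpy as np

        rng = np.random.default_rng(4)

        # Vertex masses proportional to the grayscale mask intensity at each vertex,
        # so heavier (brighter-masked) regions deform less in the mass-spring model.
        mask_intensity = rng.integers(0, 256, size=500)   # one value per vertex
        masses = 1.0 + mask_intensity / 255.0             # scaled into [1, 2]

        # Markov chain over leaf color states; each row of P sums to 1.
        states = ["green", "yellow", "red", "brown"]
        P = np.array([[0.90, 0.10, 0.00, 0.00],
                      [0.00, 0.80, 0.15, 0.05],
                      [0.00, 0.00, 0.85, 0.15],
                      [0.00, 0.00, 0.00, 1.00]])  # brown is absorbing

        state = 0
        for day in range(60):
            state = rng.choice(4, p=P[state])
        print("state after 60 steps:", states[state])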

  19. A computational approach for identifying pathogenicity islands in prokaryotic genomes

    Directory of Open Access Journals (Sweden)

    Oh Tae Kwang

    2005-07-01

    Full Text Available Abstract Background Pathogenicity islands (PAIs), distinct genomic segments of pathogens encoding virulence factors, represent a subgroup of genomic islands (GIs) that have been acquired by horizontal gene transfer events. Up to now, computational approaches for identifying PAIs have focused on the detection of genomic regions which differ from the rest of the genome only in their base composition and codon usage. These approaches often lead to the identification of genomic islands rather than PAIs. Results We present a computational method for detecting potential PAIs in complete prokaryotic genomes by combining sequence similarities and abnormalities in genomic composition. We first collected 207 GenBank accessions containing either part or all of the reported PAI loci. In sequenced genomes, strips of PAI homologs were defined based on the proximity of the homologs of genes in the same PAI accession. An algorithm reminiscent of a sequence-assembly procedure was then devised to merge overlapping or adjacent genomic strips into a large genomic region. Among the defined genomic regions, PAI-like regions were identified by the presence of homolog(s) of virulence genes. Also, GIs were postulated by calculating G+C content anomalies and codon usage bias. Of 148 prokaryotic genomes examined, 23 pathogenic and 6 non-pathogenic bacteria contained 77 candidate PAIs that partly or entirely overlap GIs. Conclusion Supporting the validity of our method, the list of candidate PAIs included thirty-four PAIs previously identified from genome sequencing papers. Furthermore, in some instances, our method was able to detect entire PAIs for which only partial sequences are available. Our method proved to be an efficient method for demarcating potential PAIs in our study. Also, the function(s) and origin(s) of a candidate PAI can be inferred by investigating the PAI queries comprising it. Identification and analysis of potential PAIs in prokaryotic
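
    The assembly-like merging step — collapsing overlapping or adjacent strips of PAI homologs into larger candidate regions — is essentially interval merging; a sketch with an invented adjacency threshold:

        def merge_strips(strips, max_gap=5000):
            """Merge genomic intervals that overlap or lie within max_gap bp."""
            merged = []
            for start, end in sorted(strips):
                if merged and start <= merged[-1][1] + max_gap:
                    merged[-1][1] = max(merged[-1][1], end)   # extend current region
                else:
                    merged.append([start, end])               # open a new region
            return merged

        strips = [(1200, 4000), (4500, 9000), (30000, 35000), (8000, 12000)]
        print(merge_strips(strips))  # -> [[1200, 12000], [30000, 35000]]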

  20. A computational approach for deciphering the organization of glycosaminoglycans.

    Directory of Open Access Journals (Sweden)

    Jean L Spencer

    Full Text Available BACKGROUND: Increasing evidence has revealed important roles for complex glycans as mediators of normal and pathological processes. Glycosaminoglycans are a class of glycans that bind and regulate the function of a wide array of proteins at the cell-extracellular matrix interface. The specific sequence and chemical organization of these polymers likely define function; however, identification of the structure-function relationships of glycosaminoglycans has been met with challenges associated with the unique level of complexity and the nontemplate-driven biosynthesis of these biopolymers. METHODOLOGY/PRINCIPAL FINDINGS: To address these challenges, we have devised a computational approach to predict fine structure and patterns of domain organization of the specific glycosaminoglycan, heparan sulfate (HS). Using chemical composition data obtained after complete and partial digestion of mixtures of HS chains with specific degradative enzymes, the computational analysis produces populations of theoretical HS chains with structures that meet both biosynthesis and enzyme degradation rules. The model performs these operations through a modular format consisting of input/output sections and three routines called chainmaker, chainbreaker, and chainsorter. We applied this methodology to analyze HS preparations isolated from pulmonary fibroblasts and epithelial cells. Significant differences in the general organization of these two HS preparations were observed, with HS from epithelial cells having a greater frequency of highly sulfated domains. Epithelial HS also showed a higher density of specific HS domains that have been associated with inhibition of neutrophil elastase. Experimental analysis of elastase inhibition was consistent with the model predictions and demonstrated that HS from epithelial cells had greater inhibitory activity than HS from fibroblasts. CONCLUSIONS/SIGNIFICANCE: This model establishes the conceptual framework for a new class of
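
    The chainmaker/chainbreaker/chainsorter flow can be caricatured as generate-and-filter: propose chains, digest them in silico, and keep those whose fragment composition matches the measurements. A toy sketch — the disaccharide alphabet, cutting rule, and target composition are invented placeholders, not the published biosynthesis or enzyme rules.

        import random

        random.seed(5)
        UNITS = "ABC"  # toy disaccharide alphabet (e.g., differing sulfation states)

        def chainmaker(length, n):
            """Propose random candidate chains."""
            return ["".join(random.choices(UNITS, k=length)) for _ in range(n)]

        def chainbreaker(chain, cut_after={"A"}):
            """Simulate enzymatic digestion: cut after designated units."""
            frags, cur = [], ""
            for u in chain:
                cur += u
                if u in cut_after:
                    frags.append(cur)
                    cur = ""
            if cur:
                frags.append(cur)
            return frags

        def chainsorter(chains, observed_frag_counts):
            """Keep chains whose digest matches the measured composition."""
            def counts(ch):
                frags = chainbreaker(ch)
                return {f: frags.count(f) for f in set(frags)}
            return [c for c in chains if counts(c) == observed_frag_counts]

        candidates = chainmaker(length=6, n=5000)
        target = {"BA": 2, "CA": 1}  # toy "experimental" digest composition
        print(chainsorter(candidates, target)[:5])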

  1. A NEW APPROACH TOWARDS INTEGRATED CLOUD COMPUTING ARCHITECTURE

    Directory of Open Access Journals (Sweden)

    Niloofar Khanghahi

    2014-03-01

    Full Text Available Today, across various businesses, administrative and senior managers are seeking new technologies and approaches that they can utilize more easily and affordably, and thereby raise their competitive profit and utility. Information and Communications Technology (ICT) is no exception to this principle. The cloud computing concept and technology, with its inherent advantages, has created a new ecosystem in the world of computing and is driving the ICT industry one step forward. This technology can play an important role in an organization’s durability and IT strategies. Nowadays, due to the progress and global popularity of cloud environments, many organizations are moving to the cloud, and some well-known IT solution providers such as IBM and Oracle have introduced specific architectures to be deployed in cloud environments. On the other hand, the use of IT frameworks can be the best way to integrate business processes and other different processes. The purpose of this paper is to provide a novel architecture for the cloud environment, based on recent best practices and frameworks and other cloud reference architectures. Meanwhile, a new service model is introduced in this proposed architecture. This architecture is finally compared with several other architectures in the form of statistical graphs to show its benefits.

  2. Towards scalable quantum communication and computation: Novel approaches and realizations

    Science.gov (United States)

    Jiang, Liang

    Quantum information science involves exploration of fundamental laws of quantum mechanics for information processing tasks. This thesis presents several new approaches towards scalable quantum information processing. First, we consider a hybrid approach to scalable quantum computation, based on an optically connected network of few-qubit quantum registers. Specifically, we develop a novel scheme for scalable quantum computation that is robust against various imperfections. To justify that nitrogen-vacancy (NV) color centers in diamond can be a promising realization of the few-qubit quantum register, we show how to isolate a few proximal nuclear spins from the rest of the environment and use them for the quantum register. We also demonstrate experimentally that the nuclear spin coherence is only weakly perturbed under optical illumination, which allows us to implement quantum logical operations that use the nuclear spins to assist the repetitive-readout of the electronic spin. Using this technique, we demonstrate more than two-fold improvement in signal-to-noise ratio. Apart from direct application to enhance the sensitivity of the NV-based nano-magnetometer, this experiment represents an important step towards the realization of robust quantum information processors using electronic and nuclear spin qubits. We then study realizations of quantum repeaters for long distance quantum communication. Specifically, we develop an efficient scheme for quantum repeaters based on atomic ensembles. We use dynamic programming to optimize various quantum repeater protocols. In addition, we propose a new protocol of quantum repeater with encoding, which efficiently uses local resources (about 100 qubits) to identify and correct errors, to achieve fast one-way quantum communication over long distances. Finally, we explore quantum systems with topological order. Such systems can exhibit remarkable phenomena such as quasiparticles with anyonic statistics and have been proposed as

  3. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which allows much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features, and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integrating genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  4. Computational approaches to protein inference in shotgun proteomics.

    Science.gov (United States)

    Li, Yong Fuga; Radivojac, Predrag

    2012-01-01

    Shotgun proteomics has recently emerged as a powerful approach to characterizing proteomes in biological samples. Its overall objective is to identify the form and quantity of each protein in a high-throughput manner by coupling liquid chromatography with tandem mass spectrometry. As a consequence of its high throughput nature, shotgun proteomics faces challenges with respect to the analysis and interpretation of experimental data. Among such challenges, the identification of proteins present in a sample has been recognized as an important computational task. This task generally consists of (1) assigning experimental tandem mass spectra to peptides derived from a protein database, and (2) mapping assigned peptides to proteins and quantifying the confidence of identified proteins. Protein identification is fundamentally a statistical inference problem with a number of methods proposed to address its challenges. In this review we categorize current approaches into rule-based, combinatorial optimization and probabilistic inference techniques, and present them using integer programming and Bayesian inference frameworks. We also discuss the main challenges of protein identification and propose potential solutions with the goal of spurring innovative research in this area. PMID:23176300
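
    Step (2), mapping assigned peptides to proteins, is often posed as a minimum set-cover (parsimony) problem; a greedy sketch (illustrative only — production tools weight peptide-level probabilities rather than treating identifications as certain):

        def greedy_protein_inference(peptide_to_proteins):
            """Pick a small set of proteins explaining all identified peptides."""
            unexplained = set(peptide_to_proteins)
            selected = []
            # Invert the mapping: protein -> peptides it could explain.
            prot_peps = {}
            for pep, prots in peptide_to_proteins.items():
                for p in prots:
                    prot_peps.setdefault(p, set()).add(pep)
            while unexplained:
                best = max(prot_peps, key=lambda p: len(prot_peps[p] & unexplained))
                selected.append(best)
                unexplained -= prot_peps[best]
            return selected

        peps = {"pep1": {"P1"}, "pep2": {"P1", "P2"}, "pep3": {"P2", "P3"}}
        print(greedy_protein_inference(peps))  # -> ['P1', 'P2']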

  5. Computational approaches to protein inference in shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Li Yong

    2012-11-01

    Full Text Available Abstract Shotgun proteomics has recently emerged as a powerful approach to characterizing proteomes in biological samples. Its overall objective is to identify the form and quantity of each protein in a high-throughput manner by coupling liquid chromatography with tandem mass spectrometry. As a consequence of its high-throughput nature, shotgun proteomics faces challenges with respect to the analysis and interpretation of experimental data. Among such challenges, the identification of proteins present in a sample has been recognized as an important computational task. This task generally consists of (1) assigning experimental tandem mass spectra to peptides derived from a protein database, and (2) mapping assigned peptides to proteins and quantifying the confidence of identified proteins. Protein identification is fundamentally a statistical inference problem with a number of methods proposed to address its challenges. In this review we categorize current approaches into rule-based, combinatorial optimization and probabilistic inference techniques, and present them using integer programming and Bayesian inference frameworks. We also discuss the main challenges of protein identification and propose potential solutions with the goal of spurring innovative research in this area.

  6. An Evolutionary Computation Approach to Examine Functional Brain Plasticity.

    Science.gov (United States)

    Roy, Arnab; Campbell, Colin; Bernier, Rachel A; Hillary, Frank G

    2016-01-01

    One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signal representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore functional relationships within an ROI-pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI-pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional-pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional-pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in the strength
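
    The quantity being optimized can be sketched as a fitness over candidate voxel subsets: the between-session change in correlation of the subsets' mean signals. A toy version on random data — the actual procedure evolves spatially connected voxel subsets and tests statistical significance.

        import numpy as np

        rng = np.random.default_rng(6)
        T = 150                                  # time points per session
        roi1 = rng.standard_normal((T, 50, 2))   # (time, voxels, session)
        roi2 = rng.standard_normal((T, 60, 2))

        def fitness(sel1, sel2):
            """Gain in functional connectivity from session 0 to session 1."""
            def conn(s):
                a = roi1[:, sel1, s].mean(axis=1)   # mean BOLD over sub-region 1
                b = roi2[:, sel2, s].mean(axis=1)   # mean BOLD over sub-region 2
                return np.corrcoef(a, b)[0, 1]
            return conn(1) - conn(0)

        # Evaluate one random candidate sub-regional pair (EC would evolve these).
        sel1 = rng.choice(50, 10, replace=False)
        sel2 = rng.choice(60, 10, replace=False)
        print(fitness(sel1, sel2))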

  7. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab eRoy

    2016-04-01

    Full Text Available One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signal representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore functional relationships within an ROI-pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI-pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional-pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional-pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in

  8. Human Computer Interaction Approach in Developing Customer Relationship Management

    Directory of Open Access Journals (Sweden)

    Mohd H.N.M. Nasir

    2008-01-01

    Full Text Available Problem statement: Many published studies have found that more than 50% of Customer Relationship Management (CRM) system implementations have failed due to poor system usability and unmet user expectations. This study presented the issues that contributed to the failures of CRM systems and proposed a prototype CRM system developed using Human Computer Interaction approaches in order to resolve the identified issues. Approach: In order to capture the users' requirements, a single in-depth case study of a multinational company was chosen in this research, in which the background, current conditions and environmental interactions were observed, recorded and analyzed for stages of patterns in relation to internal and external influences. Blended data-gathering techniques, namely interviews, naturalistic observation and the study of user documentation, were employed, and a prototype CRM system was then developed incorporating a User-Centered Design (UCD) approach, Hierarchical Task Analysis (HTA), metaphor, and identification of users' behaviors and characteristics. The implementation of these techniques was then measured in terms of usability. Results: Based on the usability testing conducted, the results showed that most of the users agreed that the system is comfortable to work with, taking as measurement parameters the quality attributes of learnability, memorability, utility, sortability, font, visualization, user metaphor, ease of information viewing, and color. Conclusions/Recommendations: By combining all these techniques, a comfort level that leads to user satisfaction and a higher degree of usability can be achieved in the proposed CRM system. It is therefore important that companies take the usability quality attributes into consideration before developing or procuring a CRM system, to ensure its successful implementation.

  9. A Representation-Theoretic Approach to Reversible Computation with Applications

    DEFF Research Database (Denmark)

    Maniotis, Andreas Milton

    Reversible computing is a sub-discipline of computer science that helps to understand the foundations of the interplay between physics, algebra, and logic in the context of computation. Its subjects of study are computational devices and abstract models of computation that satisfy the constraint of reversibility. Yet there is still no uniform and consistent theory that is general in the sense of giving a model-independent account of the field.

  10. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, Donald P. [Cornell Univ., Ithaca, NY (United States); Hencey, Brandon M. [Cornell Univ., Ithaca, NY (United States)

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  11. The "liver-first approach" for patients with locally advanced rectal cancer and synchronous liver metastases.

    NARCIS (Netherlands)

    Verhoef, C.; Pool, A.E. van der; Nuyttens, J.J.; Planting, A.S.; Eggermont, A.M.M.; Wilt, J.H.W. de

    2009-01-01

    PURPOSE: This study was designed to investigate the outcome of "the liver-first" approach in patients with locally advanced rectal cancer and synchronous liver metastases. METHODS: Patients with locally advanced rectal cancer and synchronous liver metastases were primarily treated for their liver metastases.

  12. Continued rise of the cloud advances and trends in cloud computing

    CERN Document Server

    Mahmood, Zaigham

    2014-01-01

    Cloud computing is no longer a novel paradigm, but instead an increasingly robust and established technology, yet new developments continue to emerge in this area. Continued Rise of the Cloud: Advances and Trends in Cloud Computing captures the state of the art in cloud technologies, infrastructures, and service delivery and deployment models. The book provides guidance and case studies on the development of cloud-based services and infrastructures from an international selection of expert researchers and practitioners. A careful analysis is provided of relevant theoretical frameworks, prac

  13. Advances in the Development and Application of Computational Methodologies for Structural Modeling of G-Protein Coupled Receptors

    Science.gov (United States)

    Mobarec, Juan Carlos

    2009-01-01

    Background Despite the large amount of experimental data accumulated in the past decade on G-protein coupled receptor (GPCR) structure and function, understanding of the molecular mechanisms underlying GPCR signaling is still far from complete, thus impairing the design of effective and selective pharmaceuticals. Objective Understanding of GPCR function has been challenged even further by more recent experimental evidence that several of these receptors are organized in the cell membrane as homo- or hetero-oligomers, and that they may exhibit unique pharmacological properties. Given the complexity of these new signaling systems, researchers' efforts are turning increasingly to molecular modeling, bioinformatics and computational simulations for mechanistic insights into GPCR functional plasticity. Methods We review here current advances in the development and application of computational approaches to improve prediction of GPCR structure and dynamics, thus enhancing current understanding of GPCR signaling. Results/Conclusions Models resulting from use of these computational approaches, further supported by experiments, are expected to help elucidate the complex allosterism that propagates through GPCR complexes, ultimately aiming at successful structure-based rational drug design. PMID:19672320

  14. Robotics, Stem Cells and Brain Computer Interfaces in Rehabilitation and Recovery from Stroke; Updates and Advances

    Science.gov (United States)

    Boninger, Michael L; Wechsler, Lawrence R.; Stein, Joel

    2014-01-01

    Objective To describe the current state and latest advances in robotics, stem cells, and brain computer interfaces in rehabilitation and recovery for stroke. Design The authors of this summary recently reviewed this work as part of a national presentation. The paper represents the information included in each area. Results Each area has seen great advances and challenges as products move to market and experiments are ongoing. Conclusion Robotics, stem cells, and brain computer interfaces all have tremendous potential to reduce disability and lead to better outcomes for patients with stroke. Continued research and investment will be needed as the field moves forward. With this investment, the potential for recovery of function is likely substantial. PMID:25313662

  15. Computational methods in the prediction of advanced subsonic and supersonic propeller induced noise: ASSPIN users' manual

    Science.gov (United States)

    Dunn, M. H.; Tarkenton, G. M.

    1992-01-01

    This document describes the computational aspects of propeller noise prediction in the time domain and the use of high speed propeller noise prediction program ASSPIN (Advanced Subsonic and Supersonic Propeller Induced Noise). These formulations are valid in both the near and far fields. Two formulations are utilized by ASSPIN: (1) one is used for subsonic portions of the propeller blade; and (2) the second is used for transonic and supersonic regions on the blade. Switching between the two formulations is done automatically. ASSPIN incorporates advanced blade geometry and surface pressure modelling, adaptive observer time grid strategies, and contains enhanced numerical algorithms that result in reduced computational time. In addition, the ability to treat the nonaxial inflow case has been included.

  16. Individual Approach In Treatment Of Advanced Stomach Cancer

    Directory of Open Access Journals (Sweden)

    D Juraev

    2010-04-01

    Full Text Available Background: To study the efficiency of combined treatment of advanced gastric cancer with the inclusion of trastuzumab. Material: We present an intermediate analysis of the use of targeted therapy with trastuzumab in patients with HER2-positive gastric cancer. Up to 01.10.2009, 118 patients had been tested for HER2 expression, and 24 gastric cancer patients were found to have HER2-positive tumors. All of these patients received chemotherapy with the PLF regimen plus Herceptin at a dose of 6 mg/kg once every 3 weeks (6 cycles). In the control group, 26 patients received chemotherapy with the PLF regimen alone once every 3 weeks (6 cycles). Results: At the moment of this preliminary analysis, the median remission duration in the compared groups was 8.3 months and 5.2 months, respectively. Conclusion: In advanced gastric cancer with high HER2 expression, trastuzumab increases the objective response rate and the median remission duration.

  17. Advances in Single-Photon Emission Computed Tomography Hardware and Software.

    Science.gov (United States)

    Piccinelli, Marina; Garcia, Ernest V

    2016-02-01

    Nuclear imaging techniques remain today's most reliable modality for the assessment and quantification of myocardial perfusion. In recent years, the field has experienced tremendous progress both in terms of dedicated cameras for cardiac applications and software techniques for image reconstruction. The most recent advances in single-photon emission computed tomography hardware and software are reviewed, focusing on how these improvements have resulted in an even more powerful diagnostic tool with reduced injected radiation dose and acquisition time.

  18. Recent advances in rational approaches for enzyme engineering

    Directory of Open Access Journals (Sweden)

    Kerstin Steiner

    2012-09-01

    Full Text Available Enzymes are an attractive alternative in the asymmetric synthesis of chiral building blocks. To meet the requirements of industrial biotechnology and to introduce new functionalities, enzymes need to be optimized by protein engineering. This article specifically reviews rational approaches to enzyme engineering and de novo enzyme design involving structure-based approaches developed in recent years for the improvement of enzyme performance, the broadening of substrate range, and the creation of novel functionalities to obtain products with high added value for industrial applications.

  19. Applying a cloud computing approach to storage architectures for spacecraft

    Science.gov (United States)

    Baldor, Sue A.; Quiroz, Carlos; Wood, Paul

    As sensor technologies, processor speeds, and memory densities increase, spacecraft command, control, processing, and data storage systems have grown in complexity to take advantage of these improvements and expand the possible missions of spacecraft. Spacecraft systems engineers are increasingly looking for novel ways to address this growth in complexity and mitigate associated risks. Looking to conventional computing, many solutions have been executed to solve both the problem of complexity and heterogeneity in systems. In particular, the cloud-based paradigm provides a solution for distributing applications and storage capabilities across multiple platforms. In this paper, we propose utilizing a cloud-like architecture to provide a scalable mechanism for providing mass storage in spacecraft networks that can be reused on multiple spacecraft systems. By presenting a consistent interface to applications and devices that request data to be stored, complex systems designed by multiple organizations may be more readily integrated. Behind the abstraction, the cloud storage capability would manage wear-leveling, power consumption, and other attributes related to the physical memory devices, critical components in any mass storage solution for spacecraft. Our approach employs SpaceWire networks and SpaceWire-capable devices, although the concept could easily be extended to non-heterogeneous networks consisting of multiple spacecraft and potentially the ground segment.

  20. Computational Approach for Epitaxial Polymorph Stabilization through Substrate Selection.

    Science.gov (United States)

    Ding, Hong; Dwaraknath, Shyam S; Garten, Lauren; Ndione, Paul; Ginley, David; Persson, Kristin A

    2016-05-25

    With the ultimate goal of finding new polymorphs through targeted synthesis conditions and techniques, we outline a computational framework to select optimal substrates for epitaxial growth using first-principles calculations of formation energies, elastic strain energy, and topological information. To demonstrate the approach, we study the stabilization of metastable VO2 compounds, which provide a rich chemical and structural polymorph space. We find that common polymorph statistics, lattice matching, and energy-above-hull considerations recommend homostructural growth on TiO2 substrates, where the VO2 brookite phase would be preferentially grown on the a-c TiO2 brookite plane while the columbite and anatase structures favor the a-b plane on the respective TiO2 phases. Overall, we find that a model which incorporates a geometric unit-cell area matching between the substrate and the target film as well as the resulting strain energy density of the film provides qualitative agreement with experimental observations for the heterostructural growth of known VO2 polymorphs: rutile, A and B phases. The minimal interfacial geometry matching and estimated strain energy criteria provide several suggestions of substrates and substrate-film orientations for the heterostructural growth of the hitherto hypothetical anatase, brookite, and columbite polymorphs. These criteria serve as preliminary guidance for experimental efforts to stabilize new materials and/or polymorphs through epitaxy. The current screening algorithm is being integrated within the Materials Project online framework and database, and will hence be publicly available. PMID:27145398
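
    A toy version of the geometric screening criterion — unit-cell area mismatch between film and substrate plus a biaxial strain energy density estimate. The formulas are textbook elasticity approximations and the numbers are invented, not the paper's exact model.

        import numpy as np

        def area_mismatch(film_cell, substrate_cell):
            """Relative mismatch of in-plane unit-cell areas (2x2 lattice matrices)."""
            a_f = abs(np.linalg.det(film_cell))
            a_s = abs(np.linalg.det(substrate_cell))
            return abs(a_f - a_s) / a_s

        def strain_energy_density(eps, c11, c12):
            """Elastic energy density for equal biaxial strain eps (cubic, free surface)."""
            return (c11 + c12 - 2 * c12**2 / c11) * eps**2

        film = np.array([[4.55, 0.0], [0.0, 5.70]])       # toy in-plane lattice (Angstrom)
        substrate = np.array([[4.59, 0.0], [0.0, 5.44]])  # toy substrate lattice (Angstrom)

        print(area_mismatch(film, substrate))
        print(strain_energy_density(eps=0.01, c11=250e9, c12=120e9), "J/m^3")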

  1. Computational Approach for Epitaxial Polymorph Stabilization through Substrate Selection

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Hong; Dwaraknath, Shyam S.; Garten, Lauren; Ndione, Paul; Ginley, David; Persson, Kristin A.

    2016-05-25

    With the ultimate goal of finding new polymorphs through targeted synthesis conditions and techniques, we outline a computational framework to select optimal substrates for epitaxial growth using first-principles calculations of formation energies, elastic strain energy, and topological information. To demonstrate the approach, we study the stabilization of metastable VO2 compounds, which provide a rich chemical and structural polymorph space. We find that common polymorph statistics, lattice matching, and energy-above-hull considerations recommend homostructural growth on TiO2 substrates, where the VO2 brookite phase would be preferentially grown on the a-c TiO2 brookite plane while the columbite and anatase structures favor the a-b plane on the respective TiO2 phases. Overall, we find that a model which incorporates a geometric unit-cell area matching between the substrate and the target film as well as the resulting strain energy density of the film provides qualitative agreement with experimental observations for the heterostructural growth of known VO2 polymorphs: rutile, A and B phases. The minimal interfacial geometry matching and estimated strain energy criteria provide several suggestions of substrates and substrate-film orientations for the heterostructural growth of the hitherto hypothetical anatase, brookite, and columbite polymorphs. These criteria serve as preliminary guidance for experimental efforts to stabilize new materials and/or polymorphs through epitaxy. The current screening algorithm is being integrated within the Materials Project online framework and database, and will hence be publicly available.

  2. A Near-Term Quantum Computing Approach for Hard Computational Problems in Space Exploration

    CERN Document Server

    Smelyanskiy, Vadim N; Knysh, Sergey I; Williams, Colin P; Johnson, Mark W; Thom, Murray C; Macready, William G; Pudenz, Kristen L

    2012-01-01

    In this article, we show how to map a sampling of the hardest artificial intelligence problems in space exploration onto equivalent Ising models that can then be attacked using quantum annealing implemented in the D-Wave machine. We overview the existing results and propose new Ising model implementations for quantum annealing. We review supervised and unsupervised learning algorithms for classification and clustering with applications to feature identification and anomaly detection. We introduce algorithms for data fusion and image matching for remote sensing applications. We overview planning problems for space exploration mission applications and algorithms for diagnostics and recovery with applications to deep space missions. We describe combinatorial optimization algorithms for task assignment in the context of autonomous unmanned exploration. Finally, we discuss ways to circumvent the limitation of the Ising mapping using a "blackbox" approach based on ideas from probabilistic computing. In this ...
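
    The common denominator of these mappings is a quadratic binary (Ising/QUBO) energy function; below is a toy task-assignment QUBO, minimized by brute force in place of an annealer (all coefficients invented):

        import itertools
        import numpy as np

        # Toy assignment: 2 tasks x 2 agents, one binary variable per (task, agent).
        # QUBO energy: E(x) = x^T Q x, costs on the diagonal, penalties off it.
        cost = {(0, 0): 1.0, (0, 1): 3.0, (1, 0): 2.0, (1, 1): 1.0}
        P = 10.0  # penalty weight enforcing "each task assigned exactly once"

        n = 4  # variable index = 2*task + agent
        Q = np.zeros((n, n))
        for (t, a), c in cost.items():
            i = 2 * t + a
            Q[i, i] = c - P          # from expanding P*(sum_a x_ta - 1)^2
            other = 2 * t + (1 - a)
            Q[i, other] += P         # pairwise penalty within a task

        best = min(itertools.product([0, 1], repeat=n),
                   key=lambda x: np.array(x) @ Q @ np.array(x))
        print(best)  # expected: task0 -> agent0, task1 -> agent1, i.e. (1, 0, 0, 1)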

  3. A MOBILE COMPUTING TECHNOLOGY FORESIGHT STUDY WITH SCENARIO PLANNING APPROACH

    Directory of Open Access Journals (Sweden)

    Wei-Hsiu Weng

    2015-09-01

    Full Text Available Although the importance of mobile computing is gradually being recognized, mobile computing technology development and adoption have not been clearly realized. This paper focuses on the technology planning strategy for organizations that have an interest in developing or adopting mobile computing technology. By using scenario analysis, a technology planning strategy is constructed. In this study, thirty mobile computing technologies are classified into six groups, and the importance and risk factors of these technologies are then evaluated under two possible scenarios. The main research findings include the discovery that most mobile computing software technologies are rated high to medium in importance and low risk in both scenarios, and that scenario changes will have less impact on mobile computing devices and on mobile computing software technologies. These results provide a reference for organizations interested in developing or adopting mobile computing technology.

  4. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung-Taek (Korea Atomic Energy Research Institute, Daejon, Korea); Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.

  5. A MOBILE COMPUTING TECHNOLOGY FORESIGHT STUDY WITH SCENARIO PLANNING APPROACH

    OpenAIRE

    Wei-Hsiu Weng; Woo-Tsong Lin

    2015-01-01

    Although the importance of mobile computing is gradually being recognized, mobile computing technology development and adoption have not been clearly realized. This paper focuses on the technology planning strategy for organizations that have an interest in developing or adopting mobile computing technology. By using scenario analysis, a technology planning strategy is constructed. In this study, thirty mobile computing technologies are classified into six groups, and the importance and risk ...

  6. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    Science.gov (United States)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to

  7. Tuneable resolution as a systems biology approach for multi-scale, multi-compartment computational models.

    Science.gov (United States)

    Kirschner, Denise E; Hunt, C Anthony; Marino, Simeone; Fallahi-Sichani, Mohammad; Linderman, Jennifer J

    2014-01-01

    The use of multi-scale mathematical and computational models to study complex biological processes is becoming increasingly productive. Multi-scale models span a range of spatial and/or temporal scales and can encompass multi-compartment (e.g., multi-organ) models. Modeling advances are enabling virtual experiments to explore and answer questions that are problematic to address in the wet-lab. Wet-lab experimental technologies now allow scientists to observe, measure, record, and analyze experiments focusing on different system aspects at a variety of biological scales. We need the technical ability to mirror that same flexibility in virtual experiments using multi-scale models. Here we present a new approach, tuneable resolution, which can begin providing that flexibility. Tuneable resolution involves fine- or coarse-graining existing multi-scale models at the user's discretion, allowing adjustment of the level of resolution specific to a question, an experiment, or a scale of interest. Tuneable resolution expands options for revising and validating mechanistic multi-scale models, can extend the longevity of multi-scale models, and may increase computational efficiency. The tuneable resolution approach can be applied to many model types, including differential equation, agent-based, and hybrid models. We demonstrate our tuneable resolution ideas with examples relevant to infectious disease modeling, illustrating key principles at work.
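
    As a minimal illustration of the idea, the sketch below exposes one compartment at two resolutions: a coarse mean-field ODE and a fine stochastic birth-death process with the same average dynamics, switched by a user flag. The logistic dynamics and parameter values are illustrative assumptions, not the authors' models.

        import numpy as np

        rng = np.random.default_rng(1)

        def coarse(n0, r, K, t_end, dt=0.01):
            """Coarse graining: deterministic logistic ODE, Euler-stepped."""
            n, t = float(n0), 0.0
            while t < t_end:
                n += dt * r * n * (1 - n / K)
                t += dt
            return n

        def fine(n0, r, K, t_end):
            """Fine graining: stochastic birth-death (Gillespie) with the same mean dynamics."""
            n, t = int(n0), 0.0
            while t < t_end and n > 0:
                birth, death = r * n, r * n * n / K
                t += rng.exponential(1 / (birth + death))
                n += 1 if rng.random() < birth / (birth + death) else -1
            return n

        def simulate(resolution="coarse", **kw):
            """Tuneable-resolution entry point: same question, user-selected graining."""
            return coarse(**kw) if resolution == "coarse" else fine(**kw)

        kw = dict(n0=10, r=1.0, K=500, t_end=10.0)
        print("coarse:", simulate("coarse", **kw))
        print("fine  :", np.mean([simulate("fine", **kw) for _ in range(20)]))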

  8. The responsive approach by the Basel Committee (on Banking Supervision) to regulation: Meta risk regulation, the Internal Ratings Based Approaches and the Advanced Measurement Approaches.

    OpenAIRE

    Ojo, Marianne

    2009-01-01

    The use of complex and sophisticated financial instruments, such as derivatives, in the modern financial environment has triggered the emergence of new forms of risk. In addition to the need to manage such risks, this paper investigates the developments which prompted the Basel Committee to devise advanced risk management techniques such as the Internal Ratings Based (IRB) approaches and the Advanced Measurement Approaches (AMA). Developments since the inception of the 1988 Base...

  9. Cooperative technology development: An approach to advancing energy technology

    International Nuclear Information System (INIS)

    Technology development requires an enormous financial investment over a long period of time. Scarce national and corporate resources, the result of highly competitive markets, decreased profit margins, wide currency fluctuations, and growing debt, often preclude continuous development of energy technology by single entities, i.e., corporations, institutions, or nations. Although the energy needs of the developed world are generally being met by existing institutions, it is becoming increasingly clear that existing capital formation and technology transfer structures have failed to aid developing nations in meeting their growing electricity needs. This paper will describe a method for meeting the electricity needs of the developing world through technology transfer and international cooperative technology development. The role of nuclear power and the advanced passive plant design will be discussed. (author)

  10. Advanced free space optics (FSO) a systems approach

    CERN Document Server

    Majumdar, Arun K

    2015-01-01

    This book provides a comprehensive, unified tutorial covering the most recent advances in the technology of free-space optics (FSO). It is an all-inclusive source of information on the fundamentals of FSO as well as up-to-date information on the state-of-the-art in technologies available today. This text is intended for graduate students, and will also be useful for research scientists and engineers with an interest in the field. FSO communication is a practical solution for creating a three dimensional global broadband communications grid, offering bandwidths far beyond what is possible in the Radio Frequency (RF) range. However, the attributes of atmospheric turbulence and scattering impose perennial limitations on availability and reliability of FSO links. From a systems point-of-view, this groundbreaking book provides a thorough understanding of channel behavior, which can be used to design and evaluate optimum transmission techniques that operate under realistic atmospheric conditions. Topics addressed...

  11. Advanced methods for the computation of particle beam transport and the computation of electromagnetic fields and beam-cavity interactions

    International Nuclear Information System (INIS)

    The University of Maryland Dynamical Systems and Accelerator Theory Group carries out research in two broad areas: the computation of charged particle beam transport using Lie algebraic methods and advanced methods for the computation of electromagnetic fields and beam-cavity interactions. Important improvements in the state of the art are believed to be possible in both of these areas. In addition, applications of these methods are made to problems of current interest in accelerator physics including the theoretical performance of present and proposed high energy machines. The Lie algebraic method of computing and analyzing beam transport handles both linear and nonlinear beam elements. Tests show this method to be superior to the earlier matrix or numerical integration methods. It has wide application to many areas including accelerator physics, intense particle beams, ion microprobes, high resolution electron microscopy, and light optics. With regard to the area of electromagnetic fields and beam cavity interactions, work is carried out on the theory of beam breakup in single pulses. Work is also done on the analysis of the high frequency behavior of longitudinal and transverse coupling impedances, including the examination of methods which may be used to measure these impedances. Finally, work is performed on the electromagnetic analysis of coupled cavities and on the coupling of cavities to waveguides
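
    For contrast with the Lie algebraic treatment, the earlier matrix method mentioned in the abstract reduces linear transport to per-element transfer matrices. A minimal sketch with a drift and a thin-lens quadrupole (lengths and focal length are illustrative):

        import numpy as np

        def drift(L):
            """Transfer matrix of a field-free drift of length L (m)."""
            return np.array([[1.0, L], [0.0, 1.0]])

        def thin_quad(f):
            """Thin-lens quadrupole of focal length f (m); focusing for f > 0."""
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        # Half of a FODO-like cell: drift, focusing quad, drift.
        M = drift(2.0) @ thin_quad(5.0) @ drift(2.0)

        x0 = np.array([1e-3, 0.0])           # initial (x [m], x' [rad]) phase-space vector
        print("final state:", M @ x0)
        print("det(M) =", np.linalg.det(M))  # area-preserving: determinant stays 1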

  12. Differentiating Information Skills and Computer Skills: A Factor Analytic Approach

    OpenAIRE

    Pask, Judith M.; Saunders, E. Stewart

    2004-01-01

    A basic tenet of information literacy programs is that the skills needed to use computers and the skills needed to find and evaluate information are two separate sets of skills. Outside the library this is not always the view. The claim is sometimes made that information skills are acquired by learning computer skills. All that is needed is a computer lab and someone to teach computer skills. This study uses data from a survey of computer and information skills to determine whether or not...
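
    A minimal sketch of the factor analytic idea: if survey items are generated by two distinct latent skills, a two-factor model recovers separate loadings for the information-skill items and the computer-skill items. The synthetic data below are illustrative, not the study's survey.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(6)

        # Synthetic survey: two latent skills generate six observed item scores.
        n = 300
        latent = rng.standard_normal((n, 2))             # info skill, computer skill
        loadings = np.array([[0.9, 0.1], [0.8, 0.2], [0.85, 0.0],   # info items
                             [0.1, 0.9], [0.0, 0.8], [0.2, 0.85]])  # computer items
        X = latent @ loadings.T + 0.4 * rng.standard_normal((n, 6))

        fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
        fa.fit(X)
        print(np.round(fa.components_.T, 2))  # items load on two distinct factors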

  13. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy

  14. New advances in the statistical parton distributions approach*

    Directory of Open Access Journals (Sweden)

    Soffer Jacques

    2016-01-01

    The quantum statistical parton distributions approach proposed more than a decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. Many serious challenging issues remain. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, using recent data and also forthcoming experimental results.

  15. A computational study of advanced exhaust system transition ducts with experimental validation

    Science.gov (United States)

    Wu, C.; Farokhi, S.; Taghavi, R.

    1992-01-01

    The current study is an application of CFD to a 'real' design and analysis environment. A subsonic, three-dimensional parabolized Navier-Stokes (PNS) code is used to construct stall margin design charts for optimum-length circular-to-rectangular transition ducts of advanced exhaust systems. Computer code validation has been conducted to examine the capability of wall static pressure predictions. The comparison of measured and computed wall static pressures indicates reasonable accuracy of the PNS computer code results. Computations have also been conducted on 15 transition ducts, covering three area ratios and five aspect ratios. The three area ratios investigated are a constant area ratio of unity, a moderately contracting area ratio of 0.8, and a highly contracting area ratio of 0.5. The degree of mean flow acceleration is identified as a dominant parameter in establishing the minimum duct length requirement. The effect of increasing aspect ratio in the minimum-length transition duct is to increase the length requirement, as well as to increase the mass-averaged total pressure losses. The design guidelines constructed from this investigation may aid in the design and manufacture of advanced exhaust systems for modern fighter aircraft.

  16. A Monomial Chaos Approach for Efficient Uncertainty Quantification in Computational Fluid Dynamics

    NARCIS (Netherlands)

    Witteveen, J.A.S.; Bijl, H.

    2006-01-01

    A monomial chaos approach is proposed for efficient uncertainty quantification in nonlinear computational problems. Propagating uncertainty through nonlinear equations can still be computationally intensive for existing uncertainty quantification methods. It usually results in a set of nonlinear equ
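
    The underlying idea can be sketched in a few lines: propagate an input uncertainty through a nonlinear function with a low-order polynomial expansion around the mean and compare the moments with Monte Carlo. The function, distribution, and second-order truncation below are illustrative assumptions, not the authors' monomial chaos formulation.

        import numpy as np

        rng = np.random.default_rng(2)

        def g(u):
            """Nonlinear model output as a function of an uncertain input u."""
            return np.exp(-u) * np.sin(u)

        mu, sigma = 1.0, 0.1        # input u = mu + sigma * xi, xi ~ N(0, 1)

        # Second-order expansion around the mean; derivatives by central differences.
        h = 1e-5
        g0 = g(mu)
        g1 = (g(mu + h) - g(mu - h)) / (2 * h)
        g2 = (g(mu + h) - 2 * g0 + g(mu - h)) / h ** 2

        # Moments of g0 + g1*(sigma*xi) + 0.5*g2*(sigma*xi)^2 with E[xi^2] = 1, Var[xi^2] = 2.
        mean_poly = g0 + 0.5 * g2 * sigma ** 2
        var_poly = (g1 * sigma) ** 2 + 0.5 * (g2 * sigma ** 2) ** 2

        samples = g(mu + sigma * rng.standard_normal(100_000))   # Monte Carlo reference
        print(f"expansion  : mean={mean_poly:.6f} var={var_poly:.3e}")
        print(f"Monte Carlo: mean={samples.mean():.6f} var={samples.var():.3e}")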

  17. A trait-based approach to advance coral reef science

    DEFF Research Database (Denmark)

    Madin, Joshua S.; Hoogenboom, Mia O.; Connolly, Sean R.;

    2016-01-01

    Coral reefs are biologically diverse and ecologically complex ecosystems constructed by stony corals. Despite decades of research, basic coral population biology and community ecology questions remain. Quantifying trait variation among species can help resolve these questions, but progress has been ... a large amount of variation for a range of biological and ecological processes. Such an approach can accelerate our understanding of coral ecology and our ability to protect critically threatened global ecosystems.

  18. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in the Earth sciences, multivariate data analysis, and automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing, and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Sciences Details of committees and sponsors are available in the PDF

  19. NATO Advanced Research Workshop on Exploiting Mental Imagery with Computers in Mathematics Education

    CERN Document Server

    Mason, John

    1995-01-01

    The advent of fast and sophisticated computer graphics has brought dynamic and interactive images under the control of professional mathematicians and mathematics teachers. This volume in the NATO Special Programme on Advanced Educational Technology takes a comprehensive and critical look at how the computer can support the use of visual images in mathematical problem solving. The contributions are written by researchers and teachers from a variety of disciplines including computer science, mathematics, mathematics education, psychology, and design. Some focus on the use of external visual images and others on the development of individual mental imagery. The book is the first collected volume in a research area that is developing rapidly, and the authors pose some challenging new questions.

  20. Recent advances in computational methods and clinical applications for spine imaging

    CERN Document Server

    Glocker, Ben; Klinder, Tobias; Li, Shuo

    2015-01-01

    This book contains the full papers presented at the MICCAI 2014 workshop on Computational Methods and Clinical Applications for Spine Imaging. The workshop brought together scientists and clinicians in the field of computational spine imaging. The chapters included in this book present and discuss new advances and challenges in these fields, using several methods and techniques to address more efficiently a range of timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modeling, simulation and surgical planning, image-guided robot-assisted surgery, and image-based diagnosis. The book also includes papers and reports from the first challenge on vertebra segmentation held at the workshop.

  1. Computer Hardware, Advanced Mathematics and Model Physics pilot project final report

    International Nuclear Information System (INIS)

    The Computer Hardware, Advanced Mathematics and Model Physics (CHAMMP) Program was launched in January 1990. A principal objective of the program has been to utilize the emerging capabilities of massively parallel scientific computers in the challenge of regional-scale predictions of decade-to-century climate change. CHAMMP has already demonstrated the feasibility of achieving a 10,000-fold increase in computational throughput for climate modeling in this decade. What we have also recognized, however, is the need for new algorithms and computer software to capitalize on the radically new computing architectures. This report describes the pilot CHAMMP projects at the DOE National Laboratories and the National Center for Atmospheric Research (NCAR). The pilot projects were selected to identify the principal challenges to CHAMMP and to entrain new scientific computing expertise. The success of some of these projects has aided in the definition of the CHAMMP scientific plan. Many of the papers in this report have been or will be submitted for publication in the open literature. Readers are urged to consult with the authors directly for questions or comments about their papers

  2. Advances in a distributed approach for ocean model data interoperability

    Science.gov (United States)

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
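
    A typical access pattern enabled by these standards is to open a CF-compliant dataset over OPeNDAP and subset it remotely, so that only the requested slab crosses the network. The URL below is a hypothetical placeholder for a provider's THREDDS/OPeNDAP endpoint, and the variable name and axis order are assumptions.

        import netCDF4

        # Hypothetical OPeNDAP endpoint for an aggregated, CF-compliant model dataset.
        URL = "http://example.org/thredds/dodsC/ocean_model/aggregated.nc"

        ds = netCDF4.Dataset(URL)        # opens remotely; only metadata is fetched here
        sst = ds.variables["temp"]       # assumed CF-described temperature variable
        print(sst.dimensions, getattr(sst, "units", "?"))

        # Remote subsetting: last time step, top layer (assumed time/depth/lat/lon order).
        surface_field = sst[-1, 0, :, :]
        print(surface_field.shape)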

  3. Advancing Partnerships Towards an Integrated Approach to Oil Spill Response

    Science.gov (United States)

    Green, D. S.; Stough, T.; Gallegos, S. C.; Leifer, I.; Murray, J. J.; Streett, D.

    2015-12-01

    Oil spills can cause enormous ecological and economic devastation, necessitating application of the best science and technology available, and remote sensing is playing a growing, critical role in the detection and monitoring of oil spills, as well as facilitating validation of remote sensing oil spill products. The FOSTERRS (Federal Oil Science Team for Emergency Response Remote Sensing) interagency working group seeks to ensure that during an oil spill, remote sensing assets (satellites/aircraft/instruments) and analysis techniques are quickly, effectively, appropriately, and seamlessly available to oil spill responders. Yet significant challenges remain in addressing oils spanning a vast range of chemical properties that may be spilled anywhere from the Tropics to the Arctic, with algorithms and scientific understanding needing advances to keep up with technology. Thus, FOSTERRS promotes enabling scientific discovery to ensure robust utilization of available technology, as well as identifying technologies moving up the TRL (Technology Readiness Level) scale. A recent FOSTERRS-facilitated support activity involved deployment of AVIRIS-NG (Airborne Visible/Infrared Imaging Spectrometer - Next Generation) during the Santa Barbara oil spill to validate the potential of airborne hyperspectral imaging to map beach tar coverage in real time, including surface validation data. Many developing airborne technologies have the potential to transition to space-based platforms, providing global readiness.

  4. Advances in a Distributed Approach for Ocean Model Data Interoperability

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2014-03-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.

  5. A Soft Computing Approach to Kidney Diseases Evaluation.

    Science.gov (United States)

    Neves, José; Martins, M Rosário; Vilhena, João; Neves, João; Gomes, Sabino; Abelha, António; Machado, José; Vicente, Henrique

    2015-10-01

    Kidney renal failure means that one's kidneys have unexpectedly stopped functioning, i.e., once chronic disease is exposed, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome has to be diagnosed. Although the patient's history and physical examination may denote good practice, some key information has to be obtained from evaluation of the glomerular filtration rate and the analysis of serum biomarkers. Indeed, chronic kidney disease denotes abnormal kidney function and/or structure, and there is evidence that treatment may avoid or delay its progression, by reducing and preventing the development of some associated complications, namely hypertension, obesity, diabetes mellitus, and cardiovascular complications. Acute kidney injury appears abruptly, with a rapid deterioration of renal function, but is often reversible if it is recognized early and treated promptly. In both situations, i.e., acute kidney injury and chronic kidney disease, an early intervention can significantly improve the prognosis. The assessment of these pathologies is therefore mandatory, although it is hard to do with traditional methodologies and existing tools for problem solving. Hence, in this work, we focus on the development of a hybrid decision support system, in terms of its knowledge representation and reasoning procedures based on Logic Programming, that allows one to consider incomplete, unknown, and even contradictory information, complemented with an approach to computing centered on Artificial Neural Networks, in order to weigh the Degree-of-Confidence that one has in such a happening. The present study involved 558 patients with an average age of 51.7 years, and chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables, grouped into five main categories. The proposed model showed a good performance in the diagnosis of chronic kidney disease, since the
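
    The neural network component can be pictured with a minimal sketch: train a small feed-forward network on labelled cases and read its class probability as a degree-of-confidence score. The synthetic features below merely stand in for the study's twenty-four clinical variables and are purely illustrative.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)

        # Synthetic stand-in for the clinical dataset: 558 cases, six features.
        n = 558
        X = rng.standard_normal((n, 6))
        risk = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2]  # assumed latent risk
        y = (risk + 0.5 * rng.standard_normal(n) > 0.9).astype(int)

        model = make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
        )
        model.fit(X, y)

        # Degree-of-confidence for a new case: the network's class-1 probability.
        case = rng.standard_normal((1, 6))
        print("P(chronic kidney disease) =", model.predict_proba(case)[0, 1])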

  6. Advances in Assays and Analytical Approaches for Botulinum Toxin Detection

    Energy Technology Data Exchange (ETDEWEB)

    Grate, Jay W.; Ozanich, Richard M.; Warner, Marvin G.; Bruckner-Lea, Cindy J.; Marks, James D.

    2010-08-04

    Methods to detect botulinum toxin, the most poisonous substance known, are reviewed. Current assays are being developed with two main objectives in mind: 1) to obtain sufficiently low detection limits to replace the mouse bioassay with an in vitro assay, and 2) to develop rapid assays for screening purposes that are as sensitive as possible while requiring an hour or less to process the sample and obtain the result. This review emphasizes the diverse analytical approaches and devices that have been developed over the last decade, while also briefly reviewing representative older immunoassays to provide background and context.

  7. Advanced welding for closed structure. Pt. 2 The ultrasonic approach

    Energy Technology Data Exchange (ETDEWEB)

    Sacripanti, A.; Paoloni, M.; Sagratella, G. [ENEA Centro Ricerche Casaccia, Rome (Italy). Dipt. Innovazione

    1999-07-01

    This report describes the activities developed for the European Contract BRITE AWCS III to study the use of ultrasonic sensing techniques to obtain accurate detection of the internal reinforcements of the closed steel structures employed in the shipbuilding industry. After a description of the methods, techniques, and problems of conventional ultrasonic materials testing, a new multiple reflection-absorption method is introduced, together with its experimental tests and results. The conclusions show that, in the new approach, ultrasonic non-destructive testing techniques could be used to assemble a complete sensing system with two receivers, one thermal and one ultrasonic. [Italian] This report describes the experimental activities carried out under the European contract BRITE AWCS III, in which ultrasonic techniques were used to obtain accurate detection of the internal reinforcements of closed metal structures used in the shipbuilding industry. After a description of the methods, techniques, and problems of ultrasonic materials testing, an innovative approach based on the multiple reflection-absorption method was introduced, together with the experimental results. The conclusions show that, in the new approach, non-destructive ultrasonic testing should be useful for assembling a sensing system with two sensors, one thermal and one ultrasonic.
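
    The multiple reflection-absorption idea can be sketched with a toy echo-train model: each round trip of the pulse loses energy at the back wall, and where a reinforcement touches the plate, extra energy leaks into it, so the echo train decays faster. The reflection coefficients below are illustrative assumptions, not measured values from the project.

        import numpy as np

        def echo_train(n_echoes, r_back):
            """Amplitudes of successive back-wall echoes for reflection coefficient r_back."""
            return np.array([r_back ** k for k in range(1, n_echoes + 1)])

        free_plate = echo_train(10, r_back=0.95)  # no reinforcement behind the wall
        over_web = echo_train(10, r_back=0.70)    # assumed leakage into the web

        def decay_rate(echoes):
            """Exponential decay rate fitted to the logarithm of the echo amplitudes."""
            k = np.arange(1, len(echoes) + 1)
            return -np.polyfit(k, np.log(echoes), 1)[0]

        for name, train in [("free plate", free_plate), ("over web", over_web)]:
            print(f"{name}: decay rate = {decay_rate(train):.3f}")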

  8. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  9. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  10. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  11. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  12. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  13. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  14. SMARTPHONE-BASED APPROACH TO ADVANCED DRIVER ASSISTANCE SYSTEM (ADAS) RESEARCH AND DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    I. B. Lashkov

    2015-11-01

    Subject of Research. The paper presents a smartphone-based approach to advanced driver assistance system (ADAS) research and development, based on the data of smartphone cameras and sensors. The line of research is associated with the development of a mobile advanced driver assistance system. Method. The proposed approach is based on the use of driver and vehicle behavior ontologies. Current ADAS can be divided into two main categories according to the method of implementation: mobile applications, manually installed by the driver from application stores, and safety hardware and software systems, integrated into vehicles by manufacturers or in automotive service centers. A mobile application installed on the smartphone uses the built-in rear and front-facing cameras and sensors to monitor both the road and the vehicles ahead, and at the same time the driver, in order to prevent traffic collisions. The service consists of components for recognizing objects in the images obtained with the cameras, and components for traffic situation analysis. Main Results. The driver safety mobile application has been developed for use on mobile phones. The mobile phone is mounted on the windshield of a car. When a dangerous event occurs, the application engine makes an audible or vibration signal to prompt the driver to concentrate and be more vigilant. For example, road obstacles and rear-end and stationary-vehicle accidents are the most common accident types. The mobile application detects whether a crash is imminent by computing the 'Time To Contact' (TTC), taking into account host vehicle speed, relative speed, and relative acceleration. If the driver does not maintain a safe minimum distance from the car immediately ahead, the mobile application alerts the driver by displaying an attention icon with an audible alert. The dual-camera sensing application is designed to help drivers increase trip safety
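
    The time-to-contact test can be written down directly: with a gap d, closing speed v (positive when closing), and closing acceleration a, TTC is the smallest positive root of d = v*t + a*t^2/2. The function below is a minimal sketch of that computation; the 4-second alert threshold and the numbers in the example are illustrative assumptions.

        import math

        def time_to_contact(d, v, a):
            """Smallest positive t with d = v*t + 0.5*a*t**2 (d: gap in m, v: closing
            speed in m/s, a: closing acceleration in m/s^2). Returns math.inf if the
            gap is never closed."""
            if abs(a) < 1e-9:
                return d / v if v > 0 else math.inf
            disc = v * v + 2 * a * d
            if disc < 0:
                return math.inf
            roots = [(-v + s * math.sqrt(disc)) / a for s in (+1, -1)]
            positive = [t for t in roots if t > 0]
            return min(positive) if positive else math.inf

        # Example: 30 m gap, closing at 10 m/s, lead car braking 1 m/s^2 harder than host.
        ttc = time_to_contact(30.0, 10.0, 1.0)
        print(f"TTC = {ttc:.2f} s" + (" -> alert the driver" if ttc < 4.0 else ""))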

  15. Funnel function approach to determine uncertainty: Some advances

    Science.gov (United States)

    Routh, P. S.

    2006-12-01

    Given a finite number of noisy data it is difficult (perhaps impossible) to obtain a unique average of the model value in any region of the model (Backus & Gilbert, 1970; Oldenburg, 1983). This difficulty motivated Backus and Gilbert to construct averaging kernels that are in some sense close to a delta function. Averaging kernels describe how the true model is averaged over the entire domain to generate the model value in the region of interest. A unique average value is difficult to obtain theoretically; however, we can compute bounds on the average value, and this gives us a measure of uncertainty. This idea was proposed by Oldenburg (1983). As the region of interest increases, the uncertainty associated with the average value decreases, giving a funnel-like shape. Mathematically this is equivalent to solving minimization and maximization problems for the average value (Oldenburg, 1983). In this work I developed a nonlinear interior point method to solve this min-max problem and construct the bounds. The bounds determined in this manner honor all types of available information: (a) geophysical data with errors, (b) deterministic or statistical prior information, and (c) complementary information from other data sets at different scales (such as hydrology or other geophysical data) if they are formulated in a joint inversion framework.
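
    In the linear case the min-max problem reduces to a pair of linear programs: extremize the average of the model over a region subject to the data equations holding within their errors, plus any deterministic bounds. A tiny discretized sketch (the kernels, noise levels, region, and bound of 2 are all illustrative assumptions):

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(4)

        n = 20                                     # model cells
        m_true = np.sin(np.linspace(0, np.pi, n))  # "true" model, illustrative
        G = rng.random((8, n))                     # assumed data kernels (8 measurements)
        err = 0.05 * np.ones(8)
        d = G @ m_true + err * rng.uniform(-1, 1, 8)

        region = slice(5, 10)                      # region whose average we want to bound
        c = np.zeros(n)
        c[region] = 1.0 / 5                        # objective: average of m over the region

        # Data constraints |G m - d| <= err as two inequality blocks;
        # prior information: 0 <= m <= 2 (illustrative deterministic bounds).
        A_ub = np.vstack([G, -G])
        b_ub = np.concatenate([d + err, -(d - err)])
        lo = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 2)] * n)
        hi = linprog(-c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 2)] * n)
        print(f"average in [{lo.fun:.3f}, {-hi.fun:.3f}]; true = {c @ m_true:.3f}")

    Widening the region tightens these bounds, which is exactly the funnel shape described above.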

  16. An evolutionary approach to advanced water cooled reactors

    International Nuclear Information System (INIS)

    Based on the results of the Feasibility Study undertaken since 1991, Indonesia may enter a new nuclear era with the introduction of several Nuclear Power Plants into our energy supply system. Requirements for the future NPPs are developed in a two-step approach. The first step covers the immediate future, the next 50 years, during which the system will be dominated by A-LWRs/A-PHWRs; the second step covers the time period beyond 50 years, in which new reactor systems may start to dominate. The integral reactor concept provides revolutionary improvements in terms of concept and safety. However, it creates a new set of complex machinery and operational problems of its own. The paper provides a brief description of the status of nuclear technology in Indonesia and a qualitative assessment of the integral reactor concept. (author)

  17. Advanced welding for closed structure. Pt. 1 The magnetic approach

    Energy Technology Data Exchange (ETDEWEB)

    Sacripanti, A.; Paoloni, M.; Sagratella, G. [ENEA Centro Ricerche Casaccia, Rome (Italy). Dipt. Innovazione

    1999-07-01

    This report describes the activities developed for the European Contract BRITE AWCS III to study the use of magnetic sensing techniques to obtain accurate detection of the internal reinforcements of the closed steel structures employed in the shipbuilding industry. After a description of the methods, techniques, and problems of conventional magnetic materials testing, a new method was tried to obtain the desired results. The conclusions show that the magnetic non-destructive testing approach produces effects that are too small to measure reliably and is overly sensitive to the anisotropy of the magnetic properties of the steel plates and to the quality of the contact with the reinforcement. The system is therefore not flexible enough to assemble a sensing system for the goals of BRITE AWCS III. [Italian] This report describes the experimental activities carried out under the European contract BRITE AWCS III, in which magnetic techniques were used to obtain accurate detection of the internal reinforcements of closed metal structures used in the shipbuilding industry. After a description of the methods, techniques, and problems of magnetic materials testing, an innovative approach based on purpose-built electromagnets was introduced. The conclusions show that, in the new approach, non-destructive magnetic testing produces perturbations too small to be properly measured, is too dependent on the anisotropies and on the quality of the contact between plate and web, and is not flexible enough to satisfy the technical requirements of BRITE AWCS III.

  18. A note on “A new approach for the selection of advanced manufacturing technologies: Data envelopment analysis with double frontiers”

    Directory of Open Access Journals (Sweden)

    Hossein Azizi

    2015-08-01

    Recently, using the data envelopment analysis (DEA) with double frontiers approach, Wang and Chin (2009) proposed a new measure for the selection of the best advanced manufacturing technologies (AMTs). In this note, we show that their proposed overall performance measure for the selection of the best AMT imposes an additional computational burden. Moreover, we propose a new measure for developing a complete ranking of AMTs. Numerical examples are examined using the proposed measure to show its simplicity and usefulness in AMT selection and justification.
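
    The optimistic half of a double-frontier evaluation is the standard input-oriented CCR model, one small linear program per technology (the pessimistic frontier swaps the objective sense). The sketch below implements the textbook CCR multiplier model on made-up data; it is not the note's proposed measure.

        import numpy as np
        from scipy.optimize import linprog

        # Illustrative AMT data: rows = technologies, one input (cost), two outputs.
        X = np.array([[5.0], [8.0], [7.0], [4.0]])                       # inputs
        Y = np.array([[9.0, 4.0], [5.0, 7.0], [4.0, 9.0], [6.0, 6.0]])   # outputs

        def ccr_efficiency(j):
            """Input-oriented CCR multiplier model for unit j:
            max u.y_j  s.t.  v.x_j = 1,  u.y_i - v.x_i <= 0 for all i,  u, v >= 0."""
            n_in, n_out = X.shape[1], Y.shape[1]
            c = np.concatenate([np.zeros(n_in), -Y[j]])   # linprog minimizes, so use -u.y_j
            A_ub = np.hstack([-X, Y])                     # rows encode u.y_i - v.x_i <= 0
            A_eq = np.concatenate([X[j], np.zeros(n_out)]).reshape(1, -1)
            res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(X)), A_eq=A_eq, b_eq=[1.0])
            return -res.fun

        for j in range(len(X)):
            print(f"AMT {j + 1}: optimistic efficiency = {ccr_efficiency(j):.3f}")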

  19. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    OpenAIRE

    Vladimiras Dolgopolovas; Valentina Dagienė; Saulius Minkevičius; Leonidas Sakalauskas

    2015-01-01

    The aim of this study is to present an approach to the introduction of pipeline and parallel computing, using a model of a multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included in the curriculum. At the same ti...
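
    The pipeline idea itself fits in a few lines: stages connected by queues, each stage a worker process, with throughput set by the slowest stage. The paper works in C; the sketch below is a Python analogue using multiprocessing, with illustrative stage functions.

        from multiprocessing import Process, Queue

        SENTINEL = None

        def stage(fn, q_in, q_out):
            """Generic pipeline stage: apply fn to each item until the sentinel arrives."""
            while (item := q_in.get()) is not SENTINEL:
                q_out.put(fn(item))
            q_out.put(SENTINEL)          # propagate shutdown downstream

        def square(x): return x * x
        def plus_one(x): return x + 1

        if __name__ == "__main__":
            q1, q2, q3 = Queue(), Queue(), Queue()
            workers = [Process(target=stage, args=(square, q1, q2)),
                       Process(target=stage, args=(plus_one, q2, q3))]
            for w in workers:
                w.start()
            for x in range(5):           # feed the pipeline
                q1.put(x)
            q1.put(SENTINEL)
            results = []
            while (y := q3.get()) is not SENTINEL:
                results.append(y)
            for w in workers:
                w.join()
            print(results)               # [1, 2, 5, 10, 17]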

  20. The effects of advance organizers according learning styles in computer assisted instruction software on academic achievement

    Directory of Open Access Journals (Sweden)

    Buket Demir, Ertuğrul Usta

    2011-09-01

    This study aims to investigate the effects of advance organizers in computer assisted instruction software on the academic achievement of students who have different learning styles. A semi-empirical pretest-posttest design with a control group was used. The research sample was composed of 131 students taking the Information Technology Course at Süleyman Türkmani Primary School, located in Kırşehir, in the 2010-2011 academic year. Research data were collected using Kolb's Learning Style Inventory and an Academic Achievement Test (KR-20: 0.82). One-way ANOVA and independent-samples t-tests were conducted on all the data collected, and these results emerged: the existence of advance organizers in instructional software affected the academic achievement of students, and there was also a difference in academic achievement among field-independent learners who studied in computer-assisted environments with and without advance organizers.
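
    The statistical tests named above are standard; a minimal sketch with hypothetical score data (the group means, spreads, and sizes are illustrative, not the study's data):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)

        # Hypothetical achievement scores: with vs. without advance organizers.
        with_ao = rng.normal(75, 10, 66)
        without_ao = rng.normal(68, 10, 65)
        t, p = stats.ttest_ind(with_ao, without_ao)
        print(f"independent-samples t = {t:.2f}, p = {p:.4f}")

        # One-way ANOVA across hypothetical learning-style groups.
        groups = [rng.normal(mu, 10, 33) for mu in (70, 72, 74, 69)]
        F, p = stats.f_oneway(*groups)
        print(f"one-way ANOVA F = {F:.2f}, p = {p:.4f}")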

  2. Computer Science Contests for Secondary School Students: Approaches to Classification

    Directory of Open Access Journals (Sweden)

    Wolfgang POHL

    2006-04-01

    The International Olympiad in Informatics currently provides a model which is imitated by the majority of contests for secondary school students in Informatics or Computer Science. However, the IOI model can be criticized, and alternative contest models exist. To support the discussion about contests in Computer Science, several dimensions for characterizing and classifying contests are suggested.

  3. Gesture Recognition by Computer Vision: An Integral Approach

    NARCIS (Netherlands)

    Lichtenauer, J.F.

    2009-01-01

    The fundamental objective of this Ph.D. thesis is to gain more insight into what is involved in the practical application of a computer vision system, when the conditions of use cannot be controlled completely. The basic assumption is that research on isolated aspects of computer vision often leads

  4. Development of Computer Science Disciplines - A Social Network Analysis Approach

    CERN Document Server

    Pham, Manh Cuong; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science treats conference publications as primary research outputs. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their work with peers. Previous work on knowledge mapping focused on mapping all sciences or a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most of the important journals, it lacks computer science conference and workshop proceedings, which results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming at providing a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at the journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...
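
    A minimal version of this kind of analysis: build a directed, weighted citation graph at the venue level and rank venues by centrality, then look for community structure. The edges below are made-up placeholders, not DBLP/CiteSeerX data.

        import networkx as nx

        # Hypothetical venue-level citations: (citing venue, cited venue, count).
        edges = [("ICSE", "TSE", 120), ("TSE", "ICSE", 95), ("KDD", "TKDE", 80),
                 ("TKDE", "KDD", 70), ("ICSE", "KDD", 15), ("KDD", "TSE", 10)]

        G = nx.DiGraph()
        G.add_weighted_edges_from(edges)

        # PageRank as a proxy for venue influence within the citation network.
        rank = nx.pagerank(G, weight="weight")
        for venue, score in sorted(rank.items(), key=lambda kv: -kv[1]):
            print(f"{venue}: {score:.3f}")

        # Communities in the undirected projection hint at sub-discipline structure.
        communities = nx.community.greedy_modularity_communities(G.to_undirected())
        print([sorted(c) for c in communities])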

  5. Exploring the Process of Adult Computer Software Training Using Andragogy, Situated Cognition, and a Minimalist Approach

    Science.gov (United States)

    Hurt, Andrew C.

    2007-01-01

    With technology advances, computer software becomes increasingly difficult to learn. Adults often rely on software training to keep abreast of these changes. Instructor-led software training is frequently used to teach adults new software skills; however there is limited research regarding the best practices in adult computer software training.…

  6. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes, and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability and further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  7. An advanced safeguards approach for a model 200t/a reprocessing facility, (1)

    International Nuclear Information System (INIS)

    This report describes an advanced safeguards approach which has been developed for a model 200 t/a reprocessing plant, using near-real-time materials accountancy in the process MBA, and borrowing advanced ideas from TASTEX, the IWG-RPS, or the authors' own inventions for the spent fuel storage and plutonium nitrate storage MBAs. In the spent fuel storage MBA, primary reliance is placed on 100% inspector observation and verification of all spent fuel receipts, and on surveillance measures to ensure that the inspector is aware of all receipts or other activities in the spent fuel cask receiving bay. The advanced safeguards approach gives more detailed consideration to the mechanical, or chop-leach, cell than most conventional approaches. Safeguards in the process MBA are based on n.r.t. accountancy. The n.r.t. accountancy model used assumes weekly in-process physical inventories of solution in some five buffer storage tanks. The safeguards approach suggested for the plutonium nitrate storage MBA is not significantly different from conventional approaches. The use of sequential statistical techniques for the analysis of n.r.t. accountancy data requires a significantly different philosophical approach to anomalies and anomaly resolution. This report summarizes anomaly resolution procedures, at least through the earlier stages, and provides a summary estimate of the inspection effort likely to be needed to implement the advanced safeguards approach. (author)
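
    The sequential analysis of near-real-time accountancy data can be sketched with a one-sided Page CUSUM applied to standardized weekly material balances (MUF): an anomaly is flagged as soon as the cumulative statistic crosses a threshold. The loss pattern, measurement sigma, and the slack k and threshold h below are illustrative choices, not the report's parameters.

        import numpy as np

        rng = np.random.default_rng(5)

        sigma = 1.0      # measurement-error std of each weekly balance (illustrative)
        k, h = 0.5, 5.0  # CUSUM slack and decision threshold, in sigma units

        # 52 weekly material balances: zero-mean noise, plus a small loss from week 30 on.
        muf = sigma * rng.standard_normal(52)
        muf[30:] += 0.8 * sigma                  # hypothetical protracted diversion

        s, alarm = 0.0, None
        for week, x in enumerate(muf, start=1):
            s = max(0.0, s + x / sigma - k)      # one-sided Page CUSUM update
            if s > h:
                alarm = week
                break

        print("alarm raised at week:", alarm)    # anomaly resolution would start here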

  8. Recent Advances in Computational Simulation of Macro-, Meso-, and Micro-Scale Biomimetics Related Fluid Flow Problems

    Institute of Scientific and Technical Information of China (English)

    Y. Y. Yan

    2007-01-01

    Over the last decade, computational methods have been intensively applied to a variety of scientific research and engineering designs. Although the computational fluid dynamics (CFD) method has played a dominant role in studying and simulating transport phenomena involving fluid flow and heat and mass transfer, in recent years other numerical methods for simulations at meso- and micro-scales have also been actively applied to solve the physics of complex flow and fluid-interface interactions. This paper presents a review of recent advances in multi-scale computational simulation of biomimetics-related fluid flow problems. The state-of-the-art numerical techniques, such as the lattice Boltzmann method (LBM), molecular dynamics (MD), and conventional CFD, are introduced together with their application to problems such as fish flow, the electro-osmotic effect in earthworm motion, and self-cleaning hydrophobic surfaces. The new challenges of modelling biomimetic problems, such as establishing the physical conditions of self-cleaning hydrophobic surfaces, are discussed.
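
    As a taste of the meso-scale methods reviewed, here is a minimal lattice Boltzmann (BGK) sketch for pure diffusion of a passive scalar on a one-dimensional D1Q3 lattice; real biomimetic simulations use multi-dimensional lattices with flow coupling, and all parameters here are illustrative.

        import numpy as np

        # D1Q3 lattice: velocities 0, +1, -1 with weights 2/3, 1/6, 1/6 (cs^2 = 1/3).
        w = np.array([2 / 3, 1 / 6, 1 / 6])
        nx, tau, steps = 200, 0.8, 500
        D = (tau - 0.5) / 3                  # lattice diffusivity, cs^2 * (tau - 1/2)

        rho = np.zeros(nx)
        rho[nx // 2] = 1.0                   # initial pulse of scalar concentration
        f = w[:, None] * rho                 # start at equilibrium

        for _ in range(steps):
            feq = w[:, None] * rho           # equilibrium for a resting medium (u = 0)
            f += (feq - f) / tau             # BGK collision
            f[1] = np.roll(f[1], 1)          # stream right-movers
            f[2] = np.roll(f[2], -1)         # stream left-movers (periodic boundaries)
            rho = f.sum(axis=0)

        x = np.arange(nx) - nx // 2
        var = (rho * x ** 2).sum() / rho.sum()
        print(f"measured variance {var:.1f} vs diffusion theory {2 * D * steps:.1f}")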

  9. Recent Advances in Treatment Approaches of Mucopolysaccharidosis VI.

    Science.gov (United States)

    Giugliani, Roberto; Carvalho, Clarissa Gutiérrez; Herber, Silvani; de Camargo Pinto, Louise Lapagesse

    2011-06-01

    Mucopolysaccharidosis VI (MPS VI) is caused by accumulation of the glycosaminoglycan dermatan sulfate in all tissues due to decreased activity of the enzyme arylsulfatase B. Patients exhibit multisystemic signs and symptoms in a chronic and progressive manner, especially changes in the skeleton, cardiopulmonary system, cornea, skin, liver, spleen and meninges. Patients usually have normal intelligence. In the past, treatment of the mucopolysaccharidoses was limited to palliative medical care. The outcome for affected patients improved with the introduction of new technologies such as hematopoietic stem cell transplantation, which was relegated to specific situations after enzyme replacement therapy (ERT) became available. The specific ERT for MPS VI, galsulfase (Naglazyme®, Biomarin Pharmaceutical), was approved in 2005 by the FDA and in 2006 by the EMEA, and three clinical studies including 56 patients have evaluated its efficacy and safety. Long-term follow-up data from patients treated for up to 5 years showed that ERT is well tolerated and associated with sustained improvements in the patients' clinical condition. Intrathecal ERT may be considered in situations of high neurosurgical risk, but it is still experimental in humans, as is intra-articular ERT. It is possible that the full impact of this therapy will only be demonstrated when patients are identified and treated soon after birth, as it was shown that early introduction of ERT produced immune tolerance and improved enzyme effectiveness in the cat model. New insights into the pathophysiology of MPS disorders are leading to alternative therapeutic approaches, such as gene therapy, inflammatory response modulators and substrate reduction therapy.

  10. Recent advances in lipoprotein and atherosclerosis: nutrigenomic approach

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, S.; Ortega, A.; Varela, L.; Bermudez, B.; Muriana, F. J. G.; Abaia, R.

    2009-07-01

    Atherosclerosis is a disease in which multiple factors contribute to the degeneration of the vascular wall. Many risk factors have been identified as having influence on the progression of atherosclerosis, among them the type of diet. Multifactorial interaction among lipoproteins, vascular wall cells, and inflammatory mediators has been recognised as the basis of atherogenesis. Dietary intake affects lipoprotein concentration and composition, providing risk or protection at several stages of atherosclerosis. More intriguingly, it has been demonstrated that the extent to which each lipid or lipoprotein is associated with cardiovascular disease depends on the time since the last meal; thus, postprandial lipoproteins, the main lipoproteins in blood after a high-fat meal, have been shown to strongly influence atherogenesis. As a complex biological process, the full cellular and molecular characterization of diet-induced atherosclerosis calls for application of the newly developing omics techniques of analysis. This review considers recent studies using high-throughput technologies and a nutrigenomic approach to reveal the patho-physiological effects that fasting and postprandial lipoproteins may exert on the vascular wall. (Author) 55 refs.

  11. Neutron stimulated emission computed tomography: a Monte Carlo simulation approach

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, A C [Department of Biomedical Engineering, Duke University, 136 Hudson Hall, Durham, NC 27708 (United States); Harrawood, B P [Duke Advance Imaging Labs, Department of Radiology, 2424 Erwin Rd, Suite 302, Durham, NC 27705 (United States); Bender, J E [Department of Biomedical Engineering, Duke University, 136 Hudson Hall, Durham, NC 27708 (United States); Tourassi, G D [Duke Advance Imaging Labs, Department of Radiology, 2424 Erwin Rd, Suite 302, Durham, NC 27705 (United States); Kapadia, A J [Department of Biomedical Engineering, Duke University, 136 Hudson Hall, Durham, NC 27708 (United States)

    2007-10-21

    A Monte Carlo simulation has been developed for neutron stimulated emission computed tomography (NSECT) using the GEANT4 toolkit. NSECT is a new approach to biomedical imaging that allows spectral analysis of the elements present within the sample. In NSECT, a beam of high-energy neutrons interrogates a sample and the nuclei in the sample are stimulated to an excited state by inelastic scattering of the neutrons. The characteristic gammas emitted by the excited nuclei are captured in a spectrometer to form multi-energy spectra. Currently, a tomographic image is formed using a collimated neutron beam to define the line integral paths for the tomographic projections. These projection data are reconstructed to form a representation of the distribution of individual elements in the sample. To facilitate the development of this technique, a Monte Carlo simulation model has been constructed from the GEANT4 toolkit. This simulation includes modeling of the neutron beam source and collimation, the samples, the neutron interactions within the samples, the emission of characteristic gammas, and the detection of these gammas in a germanium crystal. In addition, the model allows the absorbed radiation dose to be calculated for internal components of the sample. NSECT presents challenges not typically addressed in Monte Carlo modeling of high-energy physics applications. In order to address issues critical to the clinical development of NSECT, this paper will describe the GEANT4 simulation environment and three separate simulations performed to accomplish three specific aims. First, comparison of a simulation to a tomographic experiment will verify the accuracy of both the gamma energy spectra produced and the positioning of the beam relative to the sample. Second, parametric analysis of simulations performed with different user-defined variables will determine the best way to effectively model low energy neutrons in tissue, which is a concern with the high hydrogen content in tissue.
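
    GEANT4 handles the full transport physics; the counting principle behind the multi-energy spectra can nevertheless be caricatured in a few lines. The toy Monte Carlo below samples inelastic-scatter events along a pencil beam and histograms the blurred characteristic gamma energies (the well-known 4.44 MeV and 6.13 MeV lines of 12C and 16O); the interaction probability, elemental composition, and detector resolution are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Characteristic gammas (MeV) from (n, n'gamma) inelastic scattering
lines = {"C": 4.44, "O": 6.13}           # 12C and 16O first excited states
abundance = {"C": 0.6, "O": 0.4}         # toy elemental composition of sample
n_neutrons = 200_000
p_interact = 0.05                        # toy per-neutron interaction probability
sigma_det = 0.01                         # toy detector energy resolution (MeV)

energies = []
for _ in range(n_neutrons):
    if rng.random() < p_interact:        # neutron scatters inelastically
        el = rng.choice(list(lines), p=[abundance[e] for e in lines])
        energies.append(rng.normal(lines[el], sigma_det))  # blurred energy

hist, edges = np.histogram(energies, bins=700, range=(0, 7))
print(f"strongest line near {edges[np.argmax(hist)]:.2f} MeV")  # ~4.44 MeV here
```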

  12. What Computational Approaches Should be Taught for Physics?

    Science.gov (United States)

    Landau, Rubin

    2005-03-01

    The standard Computational Physics courses are designed for upper-level physics majors who already have some computational skills. We believe that it is important for first-year physics students to learn modern computing techniques that will be useful throughout their college careers, even before they have learned the math and science required for Computational Physics. Teaching such Introductory Scientific Computing courses requires that some choices be made as to what subjects and computer languages will be taught. Our survey of colleagues active in Computational Physics and Physics Education shows no predominant choice, with strong positions taken for the compiled languages Java, C, C++ and Fortran90, as well as for problem-solving environments like Maple and Mathematica. Over the last seven years we have developed an Introductory course and have written those courses up as textbooks for others to use. We will describe our model of using both a problem-solving environment and a compiled language. The developed materials are available in both Maple and Mathematica, and Java and Fortran90 (Princeton University Press, to be published; www.physics.orst.edu/~rubin/IntroBook/).

  13. Advanced welding for closed structure. Pt. 3 The thermal approach

    Energy Technology Data Exchange (ETDEWEB)

    Sacripanti, A.; Bonanno, G.; Paoloni, M.; Sagratella, G. [ENEA Centro Ricerche Casaccia, Rome (Italy). Dipt. Innovazione; Arborino, A.; Varesi, R.; Antonucci, A. [DUNE, (Italy)

    1999-07-01

    This report describes the activities developed under the European contract BRITE AWCS III to study the use of thermal sensing techniques to obtain accurate detection of the internal reinforcement of the closed steel structures employed in the shipbuilding industry. After a description of the methods, developed mainly in Russia, concerning the techniques and problems of conventional thermal testing of materials, a new thermal detector is introduced: a bolometric thermal camera coupled with dedicated software for online image analysis. The experimental tests and results are also presented. The conclusions show that thermal non-destructive testing with the new detector should be useful for assembling a complete sensing system together with an ultrasonic head.

  14. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Full Text Available Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  15. Loss tolerant one-way quantum computation -- a horticultural approach

    CERN Document Server

    Varnava, M; Rudolph, T; Varnava, Michael; Browne, Daniel E.; Rudolph, Terry

    2005-01-01

    We introduce a scheme for fault-tolerantly dealing with losses in cluster-state computation that can tolerate up to 50% qubit loss. This is achieved passively: no coherent measurements or coherent corrections are required. We then use this procedure within a specific linear optical quantum computation proposal to show that: (i) given perfect sources, detector inefficiencies of up to 50% can be tolerated and (ii) given perfect detectors, the purity of the photon source (overlap of the photonic wavefunction with the desired single mode) need only be greater than 66.6% for efficient computation to be possible.

  16. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general-education and worldview functions of computer science call for additional research into the…

  17. Liposuction for Advanced Lymphedema: A Multidisciplinary Approach for Complete Reduction of Arm and Leg Swelling

    OpenAIRE

    Boyages, John; Kastanias, Katrina; Koelmeyer, Louise A.; Winch, Caleb J.; Lam, Thomas C.; Sherman, Kerry A.; Munnoch, David Alex; Brorson, Håkan; Ngo, Quan D.; Heydon-White, Asha; Magnussen, John S.; Mackie, Helen

    2015-01-01

    Purpose This research describes and evaluates a liposuction surgery and multidisciplinary rehabilitation approach for advanced lymphedema of the upper and lower extremities. Methods A prospective clinical study was conducted at an Advanced Lymphedema Assessment Clinic (ALAC) comprised of specialists in plastic surgery, rehabilitation, imaging, oncology, and allied health, at Macquarie University, Australia. Between May 2012 and 31 May 2014, a total of 104 patients attended the ALAC. Eligibili...

  18. Diffuse globally, compute locally: a cyclist approach to modeling long time robot locomotion

    Science.gov (United States)

    Zhang, Tingnan; Goldman, Daniel; Cvitanović, Predrag

    2015-03-01

    To advance autonomous robots we are interested in developing a statistical/dynamical description of diffusive self-propulsion on heterogeneous terrain. We consider a minimal model for such diffusion, the 2-dimensional Lorentz gas, which abstracts the motion of a light, point-like particle bouncing within a large number of heavy scatterers (e.g. small robots in a boulder field). We present a precise computation (based on the exact periodic orbit theory formula for the diffusion constant) for a periodic triangular Lorentz gas with finite horizon. We formulate a new approach to tiling the plane in terms of three elementary tiling generators which, for the first time, enables the use of periodic orbits computed in the fundamental domain (that is, 1/12 of the hexagonal elementary cell whose translations tile the entire plane). Compared with previous literature, our fundamental-domain value of the diffusion constant converges quickly for inter-disk separation/disk radius > 0.2, with the cycle expansion truncated to only a few hundred periodic orbits of up to 5 billiard wall bounces. For small inter-disk separations, with periodic orbits of up to 6 bounces, our diffusion constants are close to simulation estimates and to the probabilistic estimates in the recent literature.

  19. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Subcommittee Report on Scientific and Technical Information

    Energy Technology Data Exchange (ETDEWEB)

    Hey, Tony [eScience Institute, University of Washington; Agarwal, Deborah [Lawrence Berkeley National Laboratory; Borgman, Christine [University of California, Los Angeles; Cartaro, Concetta [SLAC National Accelerator Laboratory; Crivelli, Silvia [Lawrence Berkeley National Laboratory; Van Dam, Kerstin Kleese [Pacific Northwest National Laboratory; Luce, Richard [University of Oklahoma; Arjun, Shankar [CADES, Oak Ridge National Laboratory; Trefethen, Anne [University of Oxford; Wade, Alex [Microsoft Research, Microsoft Corporation; Williams, Dean [Lawrence Livermore National Laboratory

    2015-09-04

    The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy’s Office of Scientific and Technical Information (OSTI) and to begin by assessing the quality and effectiveness of OSTI’s recent and current products and services and to comment on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services and other materials. This report summaries their initial findings and recommendations.

  20. Recent advances in computational biology, bioinformatics, medicine, and healthcare by modern OR

    OpenAIRE

    Türkay, Metin; Weber, Gerhard-Wilhelm; Blazewicz, Jacek; Rauner, Marion

    2014-01-01

    At the occasion of the 25th European Conference on Operational Research, EURO XXV 2012, July 8–11, 2012, in Vilnius, Lithuania (http://www.euro-2012.lt/), the …

  1. Advanced Computing Technologies for Rocket Engine Propulsion Systems: Object-Oriented Design with C++

    Science.gov (United States)

    Bekele, Gete

    2002-01-01

    This document explores the use of advanced computer technologies with an emphasis on object-oriented design to be applied in the development of software for a rocket engine to improve vehicle safety and reliability. The primary focus is on phase one of this project, the smart start sequence module. The objectives are: 1) To use current sound software engineering practices, object-orientation; 2) To improve on software development time, maintenance, execution and management; 3) To provide an alternate design choice for control, implementation, and performance.

  2. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    Science.gov (United States)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a cluster architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the SLURM software resource manager and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 personal computers with quad-core CPUs, reaching a computing power of 300 gigaflops (300x10^9 floating-point operations per second), with 120 GB of RAM and 7.5 terabytes (TB) of storage in UFS configuration plus 6 TB for the user area. AVES was designed and built to solve the growing problems raised by the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB), which increases every year. The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel execution. To overcome this limitation we developed a series of programs to distribute the analysis workload over the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained with a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of online data storage. The AVES software package consists of about 50 specific programs. The overall computing time has thus been improved by up to a factor of 70 compared to that of a single-processor personal computer.
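
    The roughly 50 AVES programs are not public, but the workload-splitting pattern the abstract describes maps naturally onto a SLURM job array. The sketch below chunks a worklist and submits one array task per chunk; the science-window IDs and the run_osa_chunk.sh wrapper are hypothetical placeholders, not AVES components.

```python
import subprocess
from pathlib import Path

# Hypothetical list of INTEGRAL science-window IDs to analyse
scws = [f"0044{i:04d}0010" for i in range(300)]
n_jobs = 30                                   # e.g. one chunk per quad-core node

# Round-robin split into n_jobs chunks, one worklist file per array task
chunks = [scws[i::n_jobs] for i in range(n_jobs)]
for i, chunk in enumerate(chunks):
    Path(f"worklist_{i}.txt").write_text("\n".join(chunk))

# Submit a SLURM job array; each task reads worklist_$SLURM_ARRAY_TASK_ID.txt
# and runs the (hypothetical) OSA pipeline wrapper on its share of the data.
subprocess.run(["sbatch", f"--array=0-{n_jobs - 1}", "run_osa_chunk.sh"],
               check=True)
```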

  3. A computational approach to George Boole's discovery of mathematical logic

    OpenAIRE

    Ledesma, Luis de; Pérez, Aurora; Borrajo, Daniel; Laita, Luis M.

    1997-01-01

    This paper reports a computational model of Boole's discovery of Logic as a part of Mathematics. George Boole (1815–1864) found that the symbols of Logic behaved as algebraic symbols, and he then rebuilt the whole contemporary theory of Logic by the use of methods such as the solution of algebraic equations. Study of the different historical factors that influenced this achievement has served as background for our two main contributions: a computational representation of Boole's Logic before ...

  4. AN ETHICAL ASSESSMENT OF COMPUTER ETHICS USING SCENARIO APPROACH

    OpenAIRE

    Maslin Masrom; Zuraini Ismail; Ramlah Hussein

    2010-01-01

    Ethics refers to a set of rules that define right and wrong behavior, used for moral decision making. In this case, computer ethics is one of the major issues in information technology (IT) and information system (IS). The ethical behaviour of IT students and professionals need to be studied in an attempt to reduce many unethical practices such as software piracy, hacking, and software intellectual property violations. This paper attempts to address computer-related scenarios that can be used...

  5. Collaboration in computer science: a network science approach. Part I

    OpenAIRE

    Franceschet, Massimo

    2010-01-01

    Co-authorship in publications within a discipline uncovers interesting properties of the analysed field. We represent collaboration in academic papers of computer science in terms of differently grained networks, including those sub-networks that emerge from conference and journal co-authorship only. We take advantage of the network science paraphernalia to take a picture of computer science collaboration including all papers published in the field since 1936. We investigate typical bibliomet...
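
    The representation the record describes, papers inducing weighted cliques among their authors, is compact to sketch. Below is a toy co-authorship graph built with networkx together with a few of the standard indicators such studies report; the four "papers" are invented, whereas a real study would parse a bibliography such as DBLP.

```python
import networkx as nx
from itertools import combinations

# Toy corpus: each paper is just its author list
papers = [
    ["Erdos", "Renyi"],
    ["Erdos", "Ko", "Rado"],
    ["Ko", "Rado"],
    ["Renyi", "Ko"],
]

G = nx.Graph()
for authors in papers:
    for a, b in combinations(authors, 2):   # co-authorship = clique per paper
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1          # repeated collaborations
        else:
            G.add_edge(a, b, weight=1)

print(nx.number_connected_components(G))           # connectedness of the field
print(nx.average_clustering(G))                     # local collaboration density
print(sorted(G.degree, key=lambda kv: -kv[1])[:3])  # most-connected authors
```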

  6. An Econometric Approach of Computing Competitiveness Index in Human Capital

    OpenAIRE

    Salahodjaev, Raufhon; Nazarov, Zafar

    2013-01-01

    The aim of this paper is to provide a methodology for estimating one of the components (pillars) of the Global Competitiveness Index (GCI), the health and primary education (HPE) pillar, for countries not included in the Global Competitiveness Report, using conventional econometric techniques. Specifically, using weighted least squares and bootstrapping methods, we are able to compute the HPE for two countries of the former Soviet Union, Uzbekistan and Belarus, and then compare the computed…
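
    The paper's exact specification is not reproduced in the record, but the general pattern it names, fit a weighted least squares regression on countries with observed pillar scores, bootstrap it for uncertainty, then predict the pillar for missing countries, can be sketched as follows. All data, weights, and coefficients here are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: X = country-level predictors (with intercept), y = observed HPE
# pillar scores, w = observation weights (all hypothetical stand-ins)
n, k = 80, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([4.0, 0.8, -0.3, 0.5])
y = X @ beta_true + rng.normal(0, 0.4, n)
w = rng.uniform(0.5, 1.5, n)

def wls(X, y, w):
    Xw = X * w[:, None]                    # weighted normal equations
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

beta = wls(X, y, w)

# Bootstrap the fit (resampling rows) to attach standard errors
boot = np.array([wls(X[i], y[i], w[i])
                 for i in (rng.integers(0, n, n) for _ in range(1000))])
print(beta, boot.std(axis=0))

# Impute the pillar for a country missing from the report
x_new = np.array([1.0, 0.2, -1.0, 0.4])
print(x_new @ beta)
```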

  7. Computational Approaches for Probing the Formation of Atmospheric Molecular Clusters

    DEFF Research Database (Denmark)

    Elm, Jonas

    This thesis presents the investigation of atmospheric molecular clusters using computational methods. Previous investigations have focused on solving problems related to atmospheric nucleation, and have not been targeted at the performance of the applied methods. This thesis focuses on assessing the performance of computational strategies in order to identify a sturdy methodology, which should be applicable for handling various issues related to atmospheric cluster formation. Density functional theory (DFT) is applied to study individual cluster formation steps. Utilizing large test sets of numerous…

  8. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hendrickson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  9. Computational methods to extract meaning from text and advance theories of human cognition.

    Science.gov (United States)

    McNamara, Danielle S

    2011-01-01

    Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in text corpora. This research exemplifies how statistical models of semantics play an important role in our understanding of cognition and contribute to the field of cognitive science. Importantly, these models afford large-scale representations of human knowledge and allow researchers to explore various questions regarding knowledge, discourse processing, text comprehension, and language. This topic includes the latest progress by the leading researchers in the endeavor to go beyond LSA.
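
    At its core, LSA is a truncated singular value decomposition of a term-document matrix, with meaning compared by cosine similarity in the reduced space. A minimal sketch on an invented toy corpus (real LSA uses large corpora and tf-idf or entropy weighting rather than raw counts):

```python
import numpy as np

docs = ["the cat sat on the mat",
        "the dog sat on the log",
        "quantum fields and particle physics",
        "particle physics of quantum systems"]

# Raw term-document count matrix
vocab = sorted({w for d in docs for w in d.split()})
A = np.array([[d.split().count(w) for d in docs] for w in vocab], float)

# Truncated SVD: keep k latent dimensions
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T      # documents in latent space

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cos(doc_vecs[0], doc_vecs[1]))   # high: shared latent topic
print(cos(doc_vecs[0], doc_vecs[2]))   # low: different topic
```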

  10. Advances in Intelligent Modelling and Simulation Artificial Intelligence-Based Models and Techniques in Scalable Computing

    CERN Document Server

    Khan, Samee; Burczyński, Tadeusz

    2012-01-01

    One of the most challenging issues in today's large-scale computational modeling and design is to effectively manage the complex distributed environments, such as computational clouds, grids, ad hoc, and P2P networks, operating under various types of users with evolving relationships fraught with uncertainties. In this context, the IT resources and services usually belong to different owners (institutions, enterprises, or individuals) and are managed by different administrators. Moreover, uncertainties are presented to the system at hand in various forms of information that are incomplete, imprecise, fragmentary, or overloading, which hinders the full and precise resolution of the evaluation criteria, sequencing and selection, and the assignment of scores. Intelligent scalable systems enable flexible routing and charging, advanced user interactions, and the aggregation and sharing of geographically-distributed resources in modern large-scale systems. This book presents new ideas, theories, models...

  11. New Approaches to Quantum Computing using Nuclear Magnetic Resonance Spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Colvin, M; Krishnan, V V

    2003-02-07

    The power of a quantum computer (QC) relies on the fundamental concept of superposition in quantum mechanics, which allows an inherently large-scale parallelization of computation. In a QC, binary information is embodied in a quantum system, such as the spin degrees of freedom of a spin-1/2 particle, forming the qubits (quantum mechanical bits), over which appropriate logical gates perform the computation. In classical computers, the basic unit of information is the bit, which can take a value of either 0 or 1. Bits are connected together by logic gates to form logic circuits that implement complex logical operations. The expansion of modern computers has been driven by the development of faster, smaller and cheaper logic gates. As the size of logic gates shrinks toward atomic dimensions, the performance of such a system is no longer classical but is instead governed by quantum mechanics. Quantum computers offer the potentially superior prospect of solving computational problems that are intractable to classical computers, such as efficient database searches and cryptography. A variety of algorithms have been developed recently, most notably Shor's algorithm for factorizing long numbers into prime factors in polynomial time and Grover's quantum search algorithm. These algorithms were of only theoretical interest until recently, when several methods were proposed to build an experimental QC. These methods include trapped ions, cavity-QED, coupled quantum dots, Josephson junctions, spin resonance transistors, linear optics and nuclear magnetic resonance. Nuclear magnetic resonance (NMR) is uniquely capable of constructing small QCs, and several algorithms have been implemented successfully. NMR-QC differs from other implementations in one important way: it is not a single QC, but a statistical ensemble of them. Thus, quantum computing based on NMR is considered ensemble quantum computing. In NMR quantum computing, the
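
    The gate model the abstract refers to is easy to make concrete with plain linear algebra: a register of n qubits is a vector of 2^n complex amplitudes acted on by unitary matrices. The sketch below, independent of any NMR detail, applies a Hadamard and a CNOT to |00> to produce a maximally entangled superposition.

```python
import numpy as np

# Single-qubit Hadamard and the two-qubit CNOT, as unitary matrices
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

psi = np.array([1, 0, 0, 0], dtype=complex)   # start in |00>
psi = np.kron(H, I) @ psi                     # Hadamard on first qubit
psi = CNOT @ psi                              # entangle: (|00> + |11>)/sqrt(2)

probs = np.abs(psi) ** 2                      # measurement statistics
print(dict(zip(["00", "01", "10", "11"], probs.round(3))))
```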

  12. New Approaches to Quantum Computing using Nuclear Magnetic Resonance Spectroscopy

    International Nuclear Information System (INIS)

    The power of a quantum computer (QC) relies on the fundamental concept of superposition in quantum mechanics, which allows an inherently large-scale parallelization of computation. In a QC, binary information is embodied in a quantum system, such as the spin degrees of freedom of a spin-1/2 particle, forming the qubits (quantum mechanical bits), over which appropriate logical gates perform the computation. In classical computers, the basic unit of information is the bit, which can take a value of either 0 or 1. Bits are connected together by logic gates to form logic circuits that implement complex logical operations. The expansion of modern computers has been driven by the development of faster, smaller and cheaper logic gates. As the size of logic gates shrinks toward atomic dimensions, the performance of such a system is no longer classical but is instead governed by quantum mechanics. Quantum computers offer the potentially superior prospect of solving computational problems that are intractable to classical computers, such as efficient database searches and cryptography. A variety of algorithms have been developed recently, most notably Shor's algorithm for factorizing long numbers into prime factors in polynomial time and Grover's quantum search algorithm. These algorithms were of only theoretical interest until recently, when several methods were proposed to build an experimental QC. These methods include trapped ions, cavity-QED, coupled quantum dots, Josephson junctions, spin resonance transistors, linear optics and nuclear magnetic resonance. Nuclear magnetic resonance (NMR) is uniquely capable of constructing small QCs, and several algorithms have been implemented successfully. NMR-QC differs from other implementations in one important way: it is not a single QC, but a statistical ensemble of them. Thus, quantum computing based on NMR is considered ensemble quantum computing. In NMR quantum computing, the spins with

  13. Recent advances in lipoprotein and atherosclerosis: A nutrigenomic approach

    Directory of Open Access Journals (Sweden)

    López, Sergio

    2009-03-01

    Full Text Available Atherosclerosis is a disease in which multiple factors contribute to the degeneration of the vascular wall. Many risk factors have been identified as having influence on the progression of atherosclerosis, among them the type of diet. Multifactorial interaction among lipoproteins, vascular wall cells, and inflammatory mediators has been recognised as the basis of atherogenesis. Dietary intake affects lipoprotein concentration and composition, providing risk or protection at several stages of atherosclerosis. More intriguingly, it has been demonstrated that the extent to which each lipid or lipoprotein is associated with cardiovascular disease depends on the time since the last meal; thus, postprandial lipoproteins, the main lipoproteins in blood after a high-fat meal, have been shown to strongly influence atherogenesis. As a complex biological process, the full cellular and molecular characterization of diet-induced atherosclerosis calls for application of the newly developing “omics” techniques of analysis. This review considers recent studies using high-throughput technologies and a nutrigenomic approach to reveal the patho-physiological effects that fasting and postprandial lipoproteins may exert on the vascular wall.

  14. Demand side management scheme in smart grid with cloud computing approach using stochastic dynamic programming

    Directory of Open Access Journals (Sweden)

    S. Sofana Reka

    2016-09-01

    Full Text Available This paper proposes a cloud computing framework in a smart grid environment that creates a small integrated energy hub supporting real-time computing for handling huge volumes of data. A stochastic programming model is developed within the cloud computing scheme for effective demand side management (DSM) in the smart grid. Simulation results are obtained using a GUI and the Gurobi optimizer in Matlab in order to reduce the electricity demand by creating energy networks in a smart hub approach.
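
    The record does not spell out the optimization, so the sketch below illustrates the generic stochastic-dynamic-programming core of such a DSM problem: scheduling a deferrable load that must run for L of T slots under i.i.d. random prices, solved by backward induction. The prices, probabilities, and horizon are invented, and the authors' actual model (solved with Gurobi) is surely richer.

```python
import numpy as np

prices = np.array([0.10, 0.20, 0.40])    # possible slot prices ($/kWh), toy values
probs  = np.array([0.3, 0.5, 0.2])       # i.i.d. probability of each price
T, L   = 12, 4                           # slots in horizon, slots the load must run

# V[t, r] = expected cost-to-go at slot t with r run-slots still required
V = np.full((T + 1, L + 1), np.inf)
V[T, 0] = 0.0
for t in range(T - 1, -1, -1):
    for r in range(min(L, T - t) + 1):   # only feasible states
        run  = (prices + V[t + 1, r - 1]) if r > 0 else np.full_like(prices, np.inf)
        idle = np.full_like(prices, V[t + 1, r])
        V[t, r] = probs @ np.minimum(run, idle)   # expectation over the price draw

print(V[0, L])   # optimal expected cost of serving the whole load
# Online rule at slot t with r left: run iff price < V[t+1, r] - V[t+1, r-1]
```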

  15. Challenges and possible approaches: towards the petaflops computers

    Institute of Scientific and Technical Information of China (English)

    Depei QIAN; Danfeng ZHU

    2009-01-01

    In parallel with the R&D efforts in the USA and Europe, China's National High-tech R&D Program has set its goal of developing petaflops computers. Researchers and engineers worldwide are looking for appropriate methods and technologies to achieve petaflops computer systems. Based on a discussion of important design issues in developing the petaflops computer, this paper raises the major technological challenges, including the memory wall, low-power system design, interconnects, and programming support. Current efforts in addressing some of these challenges and in pursuing possible solutions for developing petaflops systems are presented. Several existing systems are briefly introduced as examples, including Roadrunner, Cray XT5 Jaguar, Dawning 5000A/6000, and Lenovo DeepComp 7000. Architectures proposed by Chinese researchers for implementing the petaflops computer are also introduced, and the advantages of the architecture as well as the difficulties in its implementation are discussed. Finally, future research directions in the development of high-productivity computing systems are discussed.

  16. AN ETHICAL ASSESSMENT OF COMPUTER ETHICS USING SCENARIO APPROACH

    Directory of Open Access Journals (Sweden)

    Maslin Masrom

    2010-06-01

    Full Text Available Ethics refers to a set of rules that define right and wrong behavior, used for moral decision making. Computer ethics is accordingly one of the major issues in information technology (IT) and information systems (IS). The ethical behaviour of IT students and professionals needs to be studied in an attempt to reduce unethical practices such as software piracy, hacking, and software intellectual property violations. This paper attempts to address computer-related scenarios that can be used to examine computer ethics. A computer-related scenario consists of a short description of an ethical situation, whereby the subjects of the study, such as IT professionals or students, rate the ethics of the scenario and attempt to identify the ethical issues involved. This paper also reviews several measures of computer ethics in different settings. The perceptions of various dimensions of ethical behaviour in IT that relate to the circumstances of the ethical scenario are also presented.

  17. Using Data Mining and Computational Approaches to Study Intermediate Filament Structure and Function.

    Science.gov (United States)

    Parry, David A D

    2016-01-01

    Experimental and theoretical research aimed at determining the structure and function of the family of intermediate filament proteins has made significant advances over the past 20 years. Much of this has either contributed to or relied on the amino acid sequence databases that are now available online, and the data mining approaches that have been developed to analyze these sequences. As the quality of sequence data is generally high, it follows that it is the design of the computational and graphical methodologies that are of especial importance to researchers who aspire to gain a greater understanding of those sequence features that specify both function and structural hierarchy. However, these techniques are necessarily subject to limitations and it is important that these be recognized. In addition, no single method is likely to be successful in solving a particular problem, and a coordinated approach using a suite of methods is generally required. A final step in the process involves the interpretation of the results obtained and the construction of a working model or hypothesis that suggests further experimentation. While such methods allow meaningful progress to be made it is still important that the data are interpreted correctly and conservatively. New data mining methods are continually being developed, and it can be expected that even greater understanding of the relationship between structure and function will be gleaned from sequence data in the coming years.

  18. Computational approaches and metrics required for formulating biologically realistic nanomaterial pharmacokinetic models

    International Nuclear Information System (INIS)

    The field of nanomaterial pharmacokinetics is in its infancy, with major advances largely restricted by a lack of biologically relevant metrics, fundamental differences between particles and small molecules of organic chemicals and drugs relative to biological processes involved in disposition, a scarcity of sufficiently rich and characterized in vivo data and a lack of computational approaches to integrating nanomaterial properties to biological endpoints. A central concept that links nanomaterial properties to biological disposition, in addition to their colloidal properties, is the tendency to form a biocorona which modulates biological interactions including cellular uptake and biodistribution. Pharmacokinetic models must take this crucial process into consideration to accurately predict in vivo disposition, especially when extrapolating from laboratory animals to humans since allometric principles may not be applicable. The dynamics of corona formation, which modulates biological interactions including cellular uptake and biodistribution, is thereby a crucial process involved in the rate and extent of biodisposition. The challenge will be to develop a quantitative metric that characterizes a nanoparticle's surface adsorption forces that are important for predicting biocorona dynamics. These types of integrative quantitative approaches discussed in this paper for the dynamics of corona formation must be developed before realistic engineered nanomaterial risk assessment can be accomplished. (paper)

  19. Computational approaches and metrics required for formulating biologically realistic nanomaterial pharmacokinetic models

    Science.gov (United States)

    Riviere, Jim E.; Scoglio, Caterina; Sahneh, Faryad D.; Monteiro-Riviere, Nancy A.

    2013-01-01

    The field of nanomaterial pharmacokinetics is in its infancy, with major advances largely restricted by a lack of biologically relevant metrics, fundamental differences between particles and small molecules of organic chemicals and drugs relative to biological processes involved in disposition, a scarcity of sufficiently rich and characterized in vivo data and a lack of computational approaches to integrating nanomaterial properties to biological endpoints. A central concept that links nanomaterial properties to biological disposition, in addition to their colloidal properties, is the tendency to form a biocorona which modulates biological interactions including cellular uptake and biodistribution. Pharmacokinetic models must take this crucial process into consideration to accurately predict in vivo disposition, especially when extrapolating from laboratory animals to humans since allometric principles may not be applicable. The dynamics of corona formation, which modulates biological interactions including cellular uptake and biodistribution, is thereby a crucial process involved in the rate and extent of biodisposition. The challenge will be to develop a quantitative metric that characterizes a nanoparticle's surface adsorption forces that are important for predicting biocorona dynamics. These types of integrative quantitative approaches discussed in this paper for the dynamics of corona formation must be developed before realistic engineered nanomaterial risk assessment can be accomplished.
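
    As a baseline for the kind of model the authors argue must be extended, a classical two-compartment pharmacokinetic model is a few lines of ODE code. In the sketch below all rate constants are hypothetical; the paper's point is that for nanomaterials the distribution rate should itself evolve with biocorona formation rather than stay constant, as it does here.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic two-compartment model: plasma (c1) and tissue (c2) amounts.
# k12/k21 = distribution rates, k10 = elimination from plasma (all 1/h,
# hypothetical values; a nanomaterial model would let k12 depend on the
# evolving biocorona instead of keeping it constant).
k12, k21, k10 = 0.8, 0.3, 0.2

def rhs(t, y):
    c1, c2 = y
    return [-(k12 + k10) * c1 + k21 * c2,   # plasma: loss to tissue + elimination
            k12 * c1 - k21 * c2]            # tissue: exchange with plasma

sol = solve_ivp(rhs, (0, 24), [1.0, 0.0], t_eval=np.linspace(0, 24, 9))
print(np.round(sol.y, 3))   # plasma decays; tissue rises, then washes out
```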

  20. Computationally inexpensive approach for pitch control of offshore wind turbine on barge floating platform.

    Science.gov (United States)

    Zuo, Shan; Song, Y D; Wang, Lei; Song, Qing-wang

    2013-01-01

    Offshore floating wind turbines (OFWT) have gained increasing attention during the past decade because of high-quality offshore wind power and the complex load environment. The control system is a tradeoff between power tracking and fatigue load reduction in the above-rated wind speed region. To address the external disturbances and uncertain system parameters of OFWTs due to the proximity to load centers and strong wave coupling, this paper proposes a computationally inexpensive robust adaptive control approach with memory-based compensation for blade pitch control. The method is tested and compared with a baseline controller and a conventional individual blade pitch controller, with the "NREL offshore 5 MW baseline wind turbine" mounted on a barge platform, run on FAST and Matlab/Simulink in the above-rated condition. It is shown that the advanced control approach is not only robust to complex wind and wave disturbances but also adaptive to varying and uncertain system parameters. The simulation results demonstrate that the proposed method performs better in reducing power fluctuations, fatigue loads and platform vibration as compared to conventional individual blade pitch control.

  1. Computationally Inexpensive Approach for Pitch Control of Offshore Wind Turbine on Barge Floating Platform

    Directory of Open Access Journals (Sweden)

    Shan Zuo

    2013-01-01

    Full Text Available Offshore floating wind turbines (OFWT) have gained increasing attention during the past decade because of high-quality offshore wind power and the complex load environment. The control system is a tradeoff between power tracking and fatigue load reduction in the above-rated wind speed region. To address the external disturbances and uncertain system parameters of OFWTs due to the proximity to load centers and strong wave coupling, this paper proposes a computationally inexpensive robust adaptive control approach with memory-based compensation for blade pitch control. The method is tested and compared with a baseline controller and a conventional individual blade pitch controller, with the “NREL offshore 5 MW baseline wind turbine” mounted on a barge platform, run on FAST and Matlab/Simulink in the above-rated condition. It is shown that the advanced control approach is not only robust to complex wind and wave disturbances but also adaptive to varying and uncertain system parameters. The simulation results demonstrate that the proposed method performs better in reducing power fluctuations, fatigue loads and platform vibration as compared to conventional individual blade pitch control.
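
    The paper's memory-based compensator is not reproduced in the record; as a stand-in for the general idea of adaptation under disturbance, the toy below regulates a first-order model of rotor-speed error with an MIT-rule-style adaptive feedback gain. The plant constants, adaptation rate, and noise level are all invented.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, n = 0.01, 60_000            # time step (s), number of steps (600 s total)
a_p, b_p = 0.2, 1.0             # toy rotor-speed-error dynamics (a_p unknown)
x, k = 1.0, 0.0                 # speed error, adaptive feedback gain
gamma = 5.0                     # adaptation rate

log = []
for _ in range(n):
    d = 0.05 * rng.standard_normal()        # turbulent wind/wave disturbance
    u = -k * x                              # pitch correction command
    x += (a_p * x + b_p * u + d) * dt       # plant step
    k += gamma * x * x * dt                 # MIT-rule-style gain adaptation
    log.append(x)

print(abs(np.mean(log[:1000])), abs(np.mean(log[-1000:])))  # error shrinks
print(k)   # gain settles above the stabilizing value a_p / b_p
```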

  2. Computational consideration on advanced oxidation degradation of phenolic preservative, methylparaben, in water: mechanisms, kinetics, and toxicity assessments

    International Nuclear Information System (INIS)

    Graphical abstract: - Highlights: • Computational approach is effective to reveal the transformation mechanism of MPB. • MPB degradation was more dependent on the [• OH] than temperature during AOPs. • O2 could enhance MPB degradation, but more harmful products were formed. • The risks of MPB products in natural waters should be considered seriously. • The risks of MPB products can be overlooked in AOPs due to short half-time. - Abstract: Hydroxyl radicals (• OH) are strong oxidants that can degrade organic pollutants in advanced oxidation processes (AOPs). The mechanisms, kinetics, and toxicity assessment of the • OH-initiated oxidative degradation of the phenolic preservative, methylparaben (MPB), were systematically investigated using a computational approach, as supplementary information to the experimental data. Results showed that MPB can be initially attacked by • OH via OH-addition and H-abstraction routes. Among these routes, the • OH addition to the C atom at the ortho-position of the phenolic hydroxyl group was the most significant route. However, the methyl-H-abstraction route also cannot be neglected. Further, the formed transient intermediates, the OH-adduct (• MPB-OH1) and the dehydrogenated radical (• MPB(-H)α), could easily be transformed to several stable degradation products in the presence of O2 and • OH. To better understand the potential toxicity of MPB and its products to aquatic organisms, both acute and chronic toxicities were assessed computationally at three trophic levels. Both MPB and its products, particularly the OH-addition products, are harmful to aquatic organisms. Therefore, the application of AOPs to remove MPB should be carefully performed for safe water treatment.

  3. Computational consideration on advanced oxidation degradation of phenolic preservative, methylparaben, in water: mechanisms, kinetics, and toxicity assessments

    Energy Technology Data Exchange (ETDEWEB)

    Gao, Yanpeng [State Key Laboratory of Organic Geochemistry and Guangdong Key Laboratory of Environmental Resources Utilization and Protection, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); An, Taicheng, E-mail: antc99@gig.ac.cn [State Key Laboratory of Organic Geochemistry and Guangdong Key Laboratory of Environmental Resources Utilization and Protection, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); Fang, Hansun [State Key Laboratory of Organic Geochemistry and Guangdong Key Laboratory of Environmental Resources Utilization and Protection, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Ji, Yuemeng; Li, Guiying [State Key Laboratory of Organic Geochemistry and Guangdong Key Laboratory of Environmental Resources Utilization and Protection, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China)

    2014-08-15

    Graphical abstract: - Highlights: • Computational approach is effective to reveal the transformation mechanism of MPB. • MPB degradation was more dependent on the [•OH] than temperature during AOPs. • O2 could enhance MPB degradation, but more harmful products were formed. • The risks of MPB products in natural waters should be considered seriously. • The risks of MPB products can be overlooked in AOPs due to short half-time. - Abstract: Hydroxyl radicals (•OH) are strong oxidants that can degrade organic pollutants in advanced oxidation processes (AOPs). The mechanisms, kinetics, and toxicity assessment of the •OH-initiated oxidative degradation of the phenolic preservative, methylparaben (MPB), were systematically investigated using a computational approach, as the supplementary information for experimental data. Results showed that MPB can be initially attacked by •OH via OH-addition and H-abstraction routes. Among these routes, the •OH addition to the C atom at the ortho-position of phenolic hydroxyl group was the most significant route. However, the methyl-H-abstraction route also cannot be neglected. Further, the formed transient intermediates, OH-adduct (•MPB-OH1) and dehydrogenated radical (•MPB(-H)α), could be easily transformed to several stable degradation products in the presence of O2 and •OH. To better understand the potential toxicity of MPB and its products to aquatic organisms, both acute and chronic toxicities were assessed computationally at three trophic levels. Both MPB and its products, particularly the OH-addition products, are harmful to aquatic organisms. Therefore, the application of AOPs to remove MPB should be carefully performed for safe water treatment.

  4. Computer Mediated Learning: An Example of an Approach.

    Science.gov (United States)

    Arcavi, Abraham; Hadas, Nurit

    2000-01-01

    There are several possible approaches in which dynamic computerized environments play a significant and possibly unique role in supporting innovative learning trajectories in mathematics in general and geometry in particular. Describes an approach based on a problem situation and some experiences using it with students and teachers. (Contains 15…

  5. Numerical Methods for Stochastic Computations A Spectral Method Approach

    CERN Document Server

    Xiu, Dongbin

    2010-01-01

    The first graduate-level textbook to focus on fundamental aspects of numerical methods for stochastic computations, this book describes the class of numerical methods based on generalized polynomial chaos (gPC). These fast, efficient, and accurate methods are an extension of the classical spectral methods to high-dimensional random spaces. Designed to simulate complex systems subject to random inputs, these methods are widely used in many areas of computer science and engineering. The book introduces polynomial approximation theory and probability theory, and describes the basic theory of gPC meth…
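
    To make the gPC idea concrete: for a model output f(ξ) with ξ a standard Gaussian, the coefficients of the (probabilists') Hermite expansion are projections computed by Gauss-Hermite quadrature, and the mean and variance fall out of the coefficients. A minimal sketch, assuming the simple test function f(ξ) = e^ξ, for which the exact answers are known:

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial

f = np.exp                      # model output as a function of xi ~ N(0,1)
P = 8                           # gPC expansion order

# Gauss-Hermite(e) nodes/weights for weight exp(-x^2/2); normalize by sqrt(2*pi)
x, w = He.hermegauss(40)
w = w / np.sqrt(2 * np.pi)

# c_k = E[f(xi) He_k(xi)] / k!   (He_k are orthogonal with E[He_k^2] = k!)
c = np.array([(w * f(x) * He.hermeval(x, [0]*k + [1])).sum() / factorial(k)
              for k in range(P + 1)])

mean = c[0]
var = sum(c[k]**2 * factorial(k) for k in range(1, P + 1))
print(mean, np.e**0.5)          # exact mean: e^(1/2)
print(var, np.e**2 - np.e)      # exact variance: e^2 - e
```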

  6. Advanced computational sensors technology: testing and evaluation in visible, SWIR, and LWIR imaging

    Science.gov (United States)

    Rizk, Charbel G.; Wilson, John P.; Pouliquen, Philippe

    2015-05-01

    The Advanced Computational Sensors Team at the Johns Hopkins University Applied Physics Laboratory and the Johns Hopkins University Department of Electrical and Computer Engineering has been developing advanced readout integrated circuit (ROIC) technology for more than 10 years with a particular focus on the key challenges of dynamic range, sampling rate, system interface and bandwidth, and detector materials or band dependencies. Because the pixel array offers parallel sampling by default, the team successfully demonstrated that adding smarts in the pixel and the chip can increase performance significantly. Each pixel becomes a smart sensor and can operate independently in collecting, processing, and sharing data. In addition, building on the digital circuit revolution, the effective well size can be increased by orders of magnitude within the same pixel pitch over analog designs. This research has yielded an innovative class of a system-on-chip concept: the Flexible Readout and Integration Sensor (FRIS) architecture. All key parameters are programmable and/or can be adjusted dynamically, and this architecture can potentially be sensor and application agnostic. This paper reports on the testing and evaluation of one prototype that can support either detector polarity and includes sample results with visible, short-wavelength infrared (SWIR), and long-wavelength infrared (LWIR) imaging.

  7. A Social Network Approach to Provisioning and Management of Cloud Computing Services for Enterprises

    OpenAIRE

    Kuada, Eric; Olesen, Henning

    2011-01-01

    This paper proposes a social network approach to the provisioning and management of cloud computing services termed Opportunistic Cloud Computing Services (OCCS), for enterprises; and presents the research issues that need to be addressed for its implementation. We hypothesise that OCCS will facilitate the adoption process of cloud computing services by enterprises. OCCS deals with the concept of enterprises taking advantage of cloud computing services to meet their business needs without hav...

  8. A Comparative Study in Dynamic Job Scheduling Approaches in Grid Computing Environment

    Directory of Open Access Journals (Sweden)

    Amr Rekaby

    2013-09-01

    Full Text Available Grid computing is one of the most interesting research areas for present and future computing strategy and methodology. The dramatic changes in the complexity of scientific applications, and of some non-scientific applications, increase the need for distributed systems in general and grid computing specifically. One of the main challenges in a grid computing environment is the way of handling jobs (tasks) in the grid environment. Job scheduling is the activity of scheduling the submitted jobs in the grid environment, and there are many approaches to job scheduling in grid computing. This paper provides an experimental study of different approaches to grid computing job scheduling. The approaches involved in this paper are “4-levels/RMFF” and our previously published approach “X-Levels/XD-Binary Tree”. First, an introduction to grid computing and job scheduling techniques is provided. Then the description of the currently existing approaches is presented. After that, experiments and the provided results give a practical evaluation of these approaches from different perspectives. The conclusion of the comparative study states that the overall average task waiting time is improved by approximately 30% by using the X-levels/XD-binary tree approach over the 4-levels/RMFF approach.
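
    The two schemes compared in the paper are not reproduced here, but the metric they are judged on, average task waiting time, is easy to demonstrate. The sketch below measures it for two elementary single-resource policies (first-come-first-served versus shortest-job-first) on synthetic runtimes; it illustrates the metric, not the paper's algorithms.

```python
import numpy as np

rng = np.random.default_rng(7)
runtimes = rng.exponential(10.0, size=200)   # toy job service times (s)

def avg_wait(order):
    """Average waiting time when jobs run back-to-back in the given order."""
    waits = np.concatenate([[0.0], np.cumsum(runtimes[order])[:-1]])
    return waits.mean()

fcfs = np.arange(len(runtimes))              # first-come, first-served
sjf = np.argsort(runtimes)                   # shortest job first

print(f"FCFS avg wait: {avg_wait(fcfs):8.1f} s")
print(f"SJF  avg wait: {avg_wait(sjf):8.1f} s")   # SJF provably minimizes it
```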

  9. Exploring polymorphism in molecular crystals with a computational approach

    NARCIS (Netherlands)

    Ende, J.A. van den

    2016-01-01

    Different crystal structures can possess different properties, and the control of polymorphism in molecular crystals is therefore a goal in multiple industries, e.g. the pharmaceutical industry. Part I of this thesis is a computational study at the molecular scale of a particular solid-solid polymorphic transition.

  10. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  11. R for cloud computing an approach for data scientists

    CERN Document Server

    Ohri, A

    2014-01-01

    R for Cloud Computing looks at some of the tasks performed by business analysts on the desktop (PC era)  and helps the user navigate the wealth of information in R and its 4000 packages as well as transition the same analytics using the cloud.  With this information the reader can select both cloud vendors  and the sometimes confusing cloud ecosystem as well  as the R packages that can help process the analytical tasks with minimum effort and cost, and maximum usefulness and customization. The use of Graphical User Interfaces (GUI)  and Step by Step screenshot tutorials is emphasized in this book to lessen the famous learning curve in learning R and some of the needless confusion created in cloud computing that hinders its widespread adoption. This will help you kick-start analytics on the cloud including chapters on cloud computing, R, common tasks performed in analytics, scrutiny of big data analytics, and setting up and navigating cloud providers. Readers are exposed to a breadth of cloud computing ch...

  12. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  13. Percutaneous Irreversible Electroporation of Locally Advanced Pancreatic Carcinoma Using the Dorsal Approach: A Case Report

    Energy Technology Data Exchange (ETDEWEB)

    Scheffer, Hester J., E-mail: hj.scheffer@vumc.nl; Melenhorst, Marleen C. A. M., E-mail: m.melenhorst@vumc.nl [VU University Medical Center, Department of Radiology and Nuclear Medicine (Netherlands); Vogel, Jantien A., E-mail: j.a.vogel@amc.uva.nl [Academic Medical Center, Department of Surgery (Netherlands); Tilborg, Aukje A. J. M. van, E-mail: a.vantilborg@vumc.nl [VU University Medical Center, Department of Radiology and Nuclear Medicine (Netherlands); Nielsen, Karin, E-mail: k.nielsen@vumc.nl; Kazemier, Geert, E-mail: g.kazemier@vumc.nl [VU University Medical Center, Department of Surgery (Netherlands); Meijerink, Martijn R., E-mail: mr.meijerink@vumc.nl [VU University Medical Center, Department of Radiology and Nuclear Medicine (Netherlands)

    2015-06-15

    Irreversible electroporation (IRE) is a novel image-guided ablation technique that is increasingly used to treat locally advanced pancreatic carcinoma (LAPC). We describe a 67-year-old male patient with a 5 cm stage III pancreatic tumor who was referred for IRE. Because the ventral approach for electrode placement was considered dangerous due to the vicinity of the tumor to collateral vessels and the duodenum, the dorsal approach was chosen. Under CT guidance, six electrodes were advanced into the tumor, approaching paravertebrally alongside the aorta and inferior vena cava. Ablation was performed without complications. This case shows that when ventral electrode placement for pancreatic IRE is not feasible, the dorsal approach may be considered as an alternative.

  14. A genetic and computational approach to structurally classify neuronal types

    OpenAIRE

    Sümbül, Uygar; Song, Sen; McCulloch, Kyle; Becker, Michael; Lin, Bin; Sanes, Joshua R.; Masland, Richard H.; Seung, H. Sebastian

    2014-01-01

    The importance of cell types in understanding brain function is widely appreciated but only a tiny fraction of neuronal diversity has been catalogued. Here, we exploit recent progress in genetic definition of cell types in an objective structural approach to neuronal classification. The approach is based on highly accurate quantification of dendritic arbor position relative to neurites of other cells. We test the method on a population of 363 mouse retinal ganglion cells. For each cell, we de...

  15. A comprehensive approach to decipher biological computation to achieve next generation high-performance exascale computing.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.; Schiess, Adrian B.; Howell, Jamie; Baca, Michael J.; Partridge, L. Donald; Finnegan, Patrick Sean; Wolfley, Steven L.; Dagel, Daryl James; Spahn, Olga Blum; Harper, Jason C.; Pohl, Kenneth Roy; Mickel, Patrick R.; Lohn, Andrew; Marinella, Matthew

    2013-10-01

    The human brain (volume = 1200 cm^3) consumes 20 W and is capable of performing > 10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a 10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.
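
    The figure of merit quoted above is easy to re-derive. The sketch below (Python; the numbers are simply those from the abstract) reproduces the claimed 10^12 advantage in operations/s/W/cm^3.

        # Re-derivation of the brain-vs-supercomputer figure of merit
        brain_ops = 1e16            # operations/s
        brain_power = 20.0          # W
        brain_volume = 1200.0       # cm^3

        super_ops = 1e15            # operations/s
        super_power = 3e6           # W (3 MW)
        super_volume = 1500 * 1e6   # cm^3 (1500 m^3)

        brain_fom = brain_ops / (brain_power * brain_volume)   # ~4.2e11
        super_fom = super_ops / (super_power * super_volume)   # ~2.2e-1
        print(f"advantage: {brain_fom / super_fom:.1e}")       # ~1.9e12, i.e. ~10^12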

  16. Practical Approach to Knowledge-based Question Answering with Natural Language Understanding and Advanced Reasoning

    CERN Document Server

    Wong, Wilson

    2007-01-01

    This research hypothesizes that a practical approach, in the form of a solution framework known as Natural Language Understanding and Reasoning for Intelligence (NaLURI), which combines full-discourse natural language understanding, a powerful representation formalism capable of exploiting ontological information, and a reasoning approach with advanced features, can solve the following problems without compromising practicality: 1) restriction on the nature of question and response, and 2) limited ability to scale across domains and to real-life natural language text.

  17. Energy Therapies in Advanced Practice Oncology: An Evidence-Informed Practice Approach

    OpenAIRE

    Potter, Pamela J.

    2013-01-01

    Advanced practitioners in oncology want patients to receive state-of-the-art care and support for their healing process. Evidence-informed practice (EIP), an approach to evaluating evidence for clinical practice, considers the varieties of evidence in the context of patient preference and condition as well as practitioner knowledge and experience. This article offers an EIP approach to energy therapies, namely, Therapeutic Touch (TT), Healing Touch (HT), and Reiki, as supportive interventions...

  18. Sensitivity analysis of scenario models for operational risk Advanced Measurement Approach

    OpenAIRE

    Chaudhary, Dinesh

    2014-01-01

    Scenario analysis (SA) plays a key role in the determination of operational risk capital under the Basel II Advanced Measurement Approach. However, operational risk capital based on scenario data may exhibit high or wrong-way sensitivity to scenario inputs. In this paper, we first discuss scenario generation using a quantile approach and parameter estimation using quantile matching. We then use the single-loss approximation (SLA) to examine the sensitivity of scenario-based capital to scenario inputs.
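
    For readers unfamiliar with the SLA, a minimal sketch follows. It assumes a compound Poisson loss model with lognormal severity; the frequency, severity parameters, and the 0.1-sigma bump are illustrative placeholders, not the paper's scenario data.

        import math
        from scipy.stats import lognorm

        def sla_capital(lam, mu, sigma, alpha=0.999):
            # Single-loss approximation: for heavy-tailed severity, the
            # alpha-quantile of the annual aggregate loss is approximately
            # the severity quantile at p = 1 - (1 - alpha) / lam.
            p = 1.0 - (1.0 - alpha) / lam
            return lognorm.ppf(p, s=sigma, scale=math.exp(mu))

        base = sla_capital(lam=5.0, mu=10.0, sigma=2.0)
        bumped = sla_capital(lam=5.0, mu=10.0, sigma=2.1)   # small scenario shift
        print(f"capital rises {100 * (bumped / base - 1):.0f}% for +0.1 sigma")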

  19. TOWARD HIGHLY SECURE AND AUTONOMIC COMPUTING SYSTEMS: A HIERARCHICAL APPROACH

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hsien-Hsin S

    2010-05-11

    The overall objective of this research project is to develop novel architectural techniques as well as system software to achieve a highly secure and intrusion-tolerant computing system. Such a system will be autonomous, self-adapting, and introspective, with self-healing capability under improper operations, abnormal workloads, and malicious attacks. The scope of this research includes: (1) system-wide, unified introspection techniques for autonomic systems, (2) secure information-flow microarchitecture, (3) memory-centric security architecture, (4) authentication control and its implications for security, (5) digital rights management, and (6) microarchitectural denial-of-service attacks on shared resources. During the project, we developed several architectural techniques and system software for achieving a robust, secure, and reliable computing system toward this goal.

  20. Computational Model of Music Sight Reading: A Reinforcement Learning Approach

    CERN Document Server

    Yahya, Keyvan

    2010-01-01

    Although music sight reading has usually been studied from cognitive or neurological viewpoints, computational learning methods such as reinforcement learning have not yet been used to model the process. In this paper, with regard to the essential properties of our specific problem, we consider the value-function concept and show that the optimal policy can be obtained by the method we offer without computing the complex value functions, which are in most cases inexact. The algorithm we offer here is a PDE-based algorithm associated with stochastic optimization programming, and we consider it more applicable in this case than normative algorithms such as the temporal-difference method.
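
    For contrast with the "normative" baseline the abstract mentions, a minimal tabular TD(0) value update is sketched below; states, rewards, and parameters are placeholders, and this is the baseline method, not the paper's PDE-based algorithm.

        def td0(episodes, n_states, alpha=0.1, gamma=0.9):
            # episodes: iterable of [(state, reward, next_state), ...] transitions;
            # next_state is None at the end of an episode.
            V = [0.0] * n_states
            for episode in episodes:
                for s, r, s_next in episode:
                    target = r + gamma * (V[s_next] if s_next is not None else 0.0)
                    V[s] += alpha * (target - V[s])   # move V(s) toward the bootstrapped target
            return V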

  1. Analytic reconstruction approach for parallel translational computed tomography.

    Science.gov (United States)

    Kong, Huihua; Yu, Hengyong

    2015-01-01

    To develop low-cost and low-dose computed tomography (CT) scanners for developing countries, a parallel translational computed tomography (PTCT) scheme was recently proposed, in which the source and detector are translated in opposite directions with respect to the imaged object, without a slip ring. In this paper, we develop an analytic filtered-backprojection (FBP)-type reconstruction algorithm for two-dimensional (2D) fan-beam PTCT and extend it to three-dimensional (3D) cone-beam geometry in a Feldkamp-type framework. In particular, a weighting function is constructed to deal with data redundancy in multiple-translation PTCT and eliminate image artifacts. Extensive numerical simulations are performed to validate and evaluate the proposed analytic reconstruction algorithms, and the results confirm their correctness and merits. PMID:25882732
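
    As background, a generic parallel-beam FBP (ramp filter followed by backprojection) can be sketched in a few lines of Python/NumPy; the PTCT-specific fan-beam geometry and redundancy weighting of the paper are not reproduced here.

        import numpy as np

        def fbp(sinogram, thetas):
            # sinogram: (n_angles, n_det) array; thetas: projection angles (radians)
            n_angles, n_det = sinogram.shape
            ramp = np.abs(np.fft.fftfreq(n_det))              # ramp filter |f|
            filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
            mid = n_det // 2
            xs = np.arange(n_det) - mid
            X, Y = np.meshgrid(xs, xs)
            img = np.zeros((n_det, n_det))
            for proj, th in zip(filtered, thetas):            # backprojection step
                t = np.round(X * np.cos(th) + Y * np.sin(th)).astype(int) + mid
                ok = (t >= 0) & (t < n_det)
                img[ok] += proj[t[ok]]
            return img * np.pi / n_angles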

  2. Distance Based Asynchronous Recovery Approach In Mobile Computing Environment

    Directory of Open Access Journals (Sweden)

    Yogita Khatri,

    2012-06-01

    Full Text Available A mobile computing system is a distributed system in which at least one of the processes is mobile. Such systems are constrained by a lack of stable storage, low network bandwidth, mobility, frequent disconnection, and limited battery life. Checkpointing is one of the techniques commonly used to provide fault tolerance in mobile computing environments. To suit the mobile environment, a distance-based recovery scheme is proposed which is based on checkpointing and message logging. After the system recovers from failures, only the failed processes roll back and restart from their respective recent checkpoints, independently of the others. The salient feature of this scheme is that it reduces transfer and recovery cost: while the mobile host moves within a specific range, recovery information is not moved, and it is transferred to a nearby station only if the mobile host moves out of that range.
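
    The core distance-based decision can be captured in a short sketch; the class name, the hop-count metric, and the threshold below are illustrative, not the paper's exact protocol.

        class MobileHostRecovery:
            def __init__(self, home_cell, max_hops=3):
                self.home_cell = home_cell    # cell storing the checkpoint and message log
                self.max_hops = max_hops      # allowed distance before transfer

            def on_move(self, new_cell, hops_from_home):
                # hops_from_home: distance between home_cell and new_cell
                if hops_from_home > self.max_hops:
                    self.transfer_recovery_info(new_cell)   # ship checkpoint + log
                    self.home_cell = new_cell               # new storage location
                # otherwise the recovery information stays put

            def transfer_recovery_info(self, cell):
                print(f"transferring checkpoint and message log to {cell}")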

  3. Behavioral and Cognitive-Behavioral Approaches to Chronic Pain: Recent Advances and Future Directions.

    Science.gov (United States)

    Keefe, Francis J.; And Others

    1992-01-01

    Reviews and highlights recent research advances and future research directions concerned with behavioral and cognitive-behavioral approaches to chronic pain. Reviews assessment research on studies of social context of pain, relationship of chronic pain to depression, cognitive variables affecting pain, and comprehensive assessment measures.…

  4. Exploring Advanced Piano Students' Approaches to Sight-Reading

    Science.gov (United States)

    Zhukov, Katie

    2014-01-01

    The ability to read music fluently is fundamental for undergraduate music study yet the training of sight-reading is often neglected. This study compares approaches to sight-reading and accompanying by students with extensive sight-reading experience to those with limited experience, and evaluates the importance of this skill to advanced pianists…

  5. Advanced light source's approach to ensure conditions for safe top-off operation

    International Nuclear Information System (INIS)

    The purpose of this document is to outline the Advanced Light Source (ALS) approach for preventing a radiation accident scenario on the ALS experimental floor due to top-off operation. The document will describe the potential risks, the analysis, and the resulting specifications for the controls.

  6. Development of a computer code for dynamic analysis of the primary circuit of advanced reactors

    International Nuclear Information System (INIS)

    Currently, advanced reactors are being developed with the aims of enhanced safety, better performance, and low environmental impact. Reactor designs must pass through several steps and numerous tests before a conceptual project can be certified, and computational tools are indispensable in the preparation of such projects. This study therefore aimed at the development of a computational tool for thermal-hydraulic analysis, coupling two computer codes to evaluate the influence of transients caused by pressure variations and flow surges in the region of the IRIS reactor primary circuit between the core and the pressurizer. An 'insurge' situation, characterized by the entry of water into the pressurizer due to the expansion of the coolant in the primary circuit, was simulated. The expansion was represented by a step-shaped pressure disturbance applied through the SIMULINK 'step' block, thereby starting the transient. The results showed that the dynamic tool obtained by coupling the codes generated very satisfactory responses within the model's limitations, preserving the most important phenomena in the process. (author)

  7. Development of a computer code for dynamic analysis of the primary circuit of advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Rocha, Jussie Soares da; Lira, Carlos A.B.O.; Magalhaes, Mardson A. de Sa, E-mail: cabol@ufpe.b [Universidade Federal de Pernambuco (DEN/UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear

    2011-07-01

    Currently, advanced reactors are being developed with the aims of enhanced safety, better performance, and low environmental impact. Reactor designs must pass through several steps and numerous tests before a conceptual project can be certified, and computational tools are indispensable in the preparation of such projects. This study therefore aimed at the development of a computational tool for thermal-hydraulic analysis, coupling two computer codes to evaluate the influence of transients caused by pressure variations and flow surges in the region of the IRIS reactor primary circuit between the core and the pressurizer. An 'insurge' situation, characterized by the entry of water into the pressurizer due to the expansion of the coolant in the primary circuit, was simulated. The expansion was represented by a step-shaped pressure disturbance applied through the SIMULINK 'step' block, thereby starting the transient. The results showed that the dynamic tool obtained by coupling the codes generated very satisfactory responses within the model's limitations, preserving the most important phenomena in the process. (author)
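
    The step-driven transient described in these two records can be mimicked with a first-order surrogate model; the time constant, gain, and step time below are placeholders, not IRIS design data.

        import numpy as np

        def insurge_step_response(t_end=50.0, dt=0.01, tau=5.0, gain=1.0, step_at=1.0):
            t = np.arange(0.0, t_end, dt)
            p = np.where(t >= step_at, 1.0, 0.0)   # unit pressure step (the 'step' block)
            x = np.zeros_like(t)                   # surrogate insurge response
            for i in range(1, len(t)):
                # forward-Euler integration of tau * dx/dt = gain * p - x
                x[i] = x[i-1] + dt * (gain * p[i-1] - x[i-1]) / tau
            return t, x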

  8. Variations of geometric invariant quotients for pairs, a computational approach

    OpenAIRE

    Gallardo, Patricio; Martinez-Garcia, Jesus

    2016-01-01

    We study, from a computational viewpoint, the GIT compactifications of pairs formed by a hypersurface and a hyperplane. We provide a general setting and algorithms to calculate all polarizations which give different GIT quotients, the finite number of one-parameter subgroups required to detect the lack of stability, and all maximal orbits of non-stable pairs. Our algorithms have been fully implemented in Python for all dimensions and degrees. We apply our work to the case of cubic surface...

  9. Computational approaches to identify functional genetic variants in cancer genomes

    Science.gov (United States)

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris; Ritchie, Graham R.S.; Creixell, Pau; Karchin, Rachel; Vazquez, Miguel; Fink, J. Lynn; Kassahn, Karin S.; Pearson, John V.; Bader, Gary; Boutros, Paul C.; Muthuswamy, Lakshmi; Ouellette, B.F. Francis; Reimand, Jüri; Linding, Rune; Shibata, Tatsuhiro; Valencia, Alfonso; Butler, Adam; Dronov, Serge; Flicek, Paul; Shannon, Nick B.; Carter, Hannah; Ding, Li; Sander, Chris; Stuart, Josh M.; Stein, Lincoln D.; Lopez-Bigas, Nuria

    2014-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor, but only a minority drive tumor progression. We present the result of discussions within the ICGC on how to address the challenge of identifying mutations that contribute to oncogenesis, tumor maintenance or response to therapy, and recommend computational techniques to annotate somatic variants and predict their impact on cancer phenotype. PMID:23900255

  10. A learning strategy approach for teaching novice computer programmers

    OpenAIRE

    Begley, Donald D.

    1984-01-01

    Approved for public release; distribution is unlimited. The purpose of this thesis is to investigate various learning strategies and present some suggested applications for teaching computer programming to Marine Corps entry-level programmers. These learning strategies are used to develop a cognitively designed structure for teaching the software engineering process. This structure is designed so that programmers can have readily available in their thinking process modern ...

  11. Analysis of diabetic retinopathy biomarker VEGF gene by computational approaches

    OpenAIRE

    Jayashree Sadasivam; Ramesh, N; Vijayalakshmi, K.; Vinni Viridi; Shiva prasad

    2012-01-01

    Diabetic retinopathy, the most common diabetic eye disease, is caused by changes in the blood vessels of the retina and remains a major cause of vision loss. It is characterized by vascular permeability and increased tissue ischemia and angiogenesis. The Vascular Endothelial Growth Factor (VEGF) gene has been identified by computational analysis as one of the biomarkers for diabetic retinopathy. VEGF is a sub-family of growth factors within the platelet-derived growth factor family of cystine-knot growth factors...

  12. Computational Approaches to Viral Evolution and Rational Vaccine Design

    Science.gov (United States)

    Bhattacharya, Tanmoy

    2006-10-01

    Viral pandemics, including HIV, are a major health concern across the world. Experimental techniques available today have uncovered a great wealth of information about how these viruses infect, grow, and cause disease, as well as how our body attempts to defend itself against them. Nevertheless, due to the high variability and fast evolution of many of these viruses, the traditional method of developing vaccines by presenting a heuristically chosen strain to the body fails, and an effective intervention strategy still eludes us. A large amount of carefully curated genomic data on a number of these viruses is now available, often annotated with disease and immunological context. The availability of parallel computers has made it possible to carry out a systematic analysis of this data within an evolutionary framework. I will describe, as an example, how computations on such data have allowed us to understand the origins and diversification of HIV, the causative agent of AIDS. On the practical side, computations on the same data are now being used to inform the choice or design of optimal vaccine strains.

  13. SOFT COMPUTING APPROACH TO PREDICT INTRACRANIAL PRESSURE VALUES

    Directory of Open Access Journals (Sweden)

    Mario Versaci

    2014-01-01

    Full Text Available The estimation and prediction of intracranial pressure (ICP) values is an important step in evaluating the compliance of the human brain, above all in those cases in which increased ICP creates high-risk conditions for the patient. The standard therapy is neurosurgical but, while awaiting it, a targeted pharmacological therapy is needed that overloads kidney function. It therefore becomes necessary to establish an effective and efficient procedure for predicting ICP values over a time horizon suitable for limiting the systematic pharmacological action to deliveries that are really necessary. The prediction techniques most commonly used in the literature, while providing a good time window, are characterized by a heavy computational complexity that makes them unattractive for real-time applications and technology transfer. In addition, ICP sampling techniques are not free from uncertainties due to confounding factors (breath, heartbeat, voluntary/involuntary movement), requiring the manipulation of uncertain and imprecise data. The choice of soft-computing prediction techniques therefore appears reasonable: first, because they handle data affected by uncertainty and/or imprecision effectively and, second, because for the same predictive time frame they require a reduced computational load. In this study the author presents the prediction of ICP values through a two-factor fuzzy time series, comparing the results with more sophisticated techniques.
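
    As a simplified illustration of the fuzzy time-series idea (one factor and first order, in the spirit of Chen's method, rather than the paper's two-factor model), consider the sketch below; the interval count is an arbitrary choice.

        import numpy as np

        def fuzzy_forecast(series, n_intervals=7):
            lo, hi = min(series), max(series)
            edges = np.linspace(lo, hi, n_intervals + 1)
            mids = (edges[:-1] + edges[1:]) / 2

            def fuzzify(v):   # index of the interval containing v
                return min(int(np.searchsorted(edges, v, side='right')) - 1, n_intervals - 1)

            labels = [fuzzify(v) for v in series]
            flr = {}          # first-order fuzzy logical relationships A_i -> {A_j}
            for a, b in zip(labels, labels[1:]):
                flr.setdefault(a, set()).add(b)
            successors = flr.get(labels[-1], {labels[-1]})
            return float(np.mean([mids[j] for j in successors]))   # defuzzified forecast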

  14. Computational morphology a computational geometric approach to the analysis of form

    CERN Document Server

    Toussaint, GT

    1988-01-01

    Computational Geometry is a new discipline of computer science that deals with the design and analysis of algorithms for solving geometric problems. There are many areas of study in different disciplines which, while being of a geometric nature, have as their main component the extraction of a description of the shape or form of the input data. This notion is more imprecise and subjective than pure geometry. Such fields include cluster analysis in statistics, computer vision and pattern recognition, and the measurement of form and form-change in such areas as stereology and developmental biology.

  15. An engineering based approach for hydraulic computations in river flows

    Science.gov (United States)

    Di Francesco, S.; Biscarini, C.; Pierleoni, A.; Manciola, P.

    2016-06-01

    This paper presents an engineering-based approach to hydraulic risk evaluation. The aim of the research is to identify a criterion for choosing the simplest appropriate model to use in different scenarios, as the characteristics of the main river channel vary. The complete flow field, generally expressed in terms of pressures, velocities, and accelerations, can be described through a three-dimensional approach that considers all flow properties varying in all directions. In many practical applications of river flow studies, however, the greatest changes occur in only two dimensions, or even only one. In these cases the use of simplified approaches can lead to accurate results, with easy-to-build and faster simulations. The study has been conducted taking into account a dimensionless channel parameter, the ratio of the radius of curvature to the channel width (R/B).
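
    The selection criterion can be phrased as a simple rule on R/B; the thresholds in the sketch below are hypothetical placeholders, not values from the paper.

        def choose_model(radius_of_curvature, channel_width,
                         straight_threshold=10.0, mild_threshold=3.0):
            ratio = radius_of_curvature / channel_width
            if ratio > straight_threshold:
                return "1D"   # nearly straight reach: cross-section-averaged model suffices
            if ratio > mild_threshold:
                return "2D"   # mild bends: depth-averaged model
            return "3D"       # strong curvature: secondary currents matter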

  16. Computational approach for calculating bound states in quantum field theory

    Science.gov (United States)

    Lv, Q. Z.; Norris, S.; Brennan, R.; Stefanovich, E.; Su, Q.; Grobe, R.

    2016-09-01

    We propose a nonperturbative approach to calculate bound-state energies and wave functions for quantum field theoretical models. It is based on the direct diagonalization of the corresponding quantum field theoretical Hamiltonian in an effectively discretized and truncated Hilbert space. We illustrate this approach for a Yukawa-like interaction between fermions and bosons in one spatial dimension and show where it agrees with the traditional method based on the potential picture and where it deviates due to recoil and radiative corrections. The method also permits us to obtain some insight into the spatial characteristics of the distribution of the fermions in the ground state, such as the bremsstrahlung-induced widening.
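
    A toy version of the direct-diagonalization idea, restricted to the potential picture, is sketched below: a 1D Hamiltonian with a Yukawa-like well is discretized on a grid and diagonalized. Grid size, coupling, and boson mass are illustrative, and recoil and radiative effects are of course absent.

        import numpy as np

        def bound_states(n=400, box=40.0, g=2.0, m_boson=1.0, n_states=3):
            dx = box / n
            x = (np.arange(n) - n / 2) * dx
            V = -g * np.exp(-m_boson * np.abs(x))         # Yukawa-like attractive well
            # Kinetic term -(1/2) d^2/dx^2 via central finite differences (hbar = m = 1)
            main = 1.0 / dx**2 + V
            off = -0.5 / dx**2 * np.ones(n - 1)
            H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
            E, psi = np.linalg.eigh(H)                    # truncated-basis diagonalization
            return E[:n_states], psi[:, :n_states]        # lowest (bound) levels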

  17. Computer-assisted modeling: Contributions of computational approaches to elucidating macromolecular structure and function: Final report

    International Nuclear Information System (INIS)

    The Committee, asked to provide an assessment of computer-assisted modeling of molecular structure, has highlighted the signal successes and the significant limitations for a broad panoply of technologies and has projected plausible paths of development over the next decade. As with any assessment of such scope, differing opinions about present or future prospects were expressed. The conclusions and recommendations, however, represent a consensus of our views of the present status of computational efforts in this field

  18. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data
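
    The flavor of these calculations is easy to convey with a toy sketch covering two of the three tasks, cost-index escalation and lifetime economics via net present value; all figures are illustrative and unrelated to the original FORTRAN codes.

        def escalated_cost(base_cost, index_base, index_now):
            # Scale a historical capital estimate by the ratio of cost indexes
            return base_cost * index_now / index_base

        def npv(cash_flows, rate):
            # cash_flows[t] is the net cash flow in year t (t = 0 is today)
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        capital = escalated_cost(100e6, index_base=210.0, index_now=260.0)
        print(npv([-capital] + [18e6] * 30, rate=0.07))   # lifetime economic performance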

  19. Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.

    Science.gov (United States)

    Fong, Stephen S

    2014-08-01

    Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.
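
    A canonical prospective design computation of the kind this mini-review surveys is flux balance analysis; the three-reaction network below is a made-up toy, not a curated model.

        import numpy as np
        from scipy.optimize import linprog

        # Stoichiometric matrix: rows = metabolites A, B; columns = reactions
        # (uptake -> conversion -> biomass drain)
        S = np.array([[1, -1,  0],
                      [0,  1, -1]])
        bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 units
        c = [0, 0, -1]                             # maximize biomass flux v3

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print("optimal biomass flux:", -res.fun)   # 10.0, limited by uptake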

  20. Workflow Scheduling in Grid Computing Environment using a Hybrid GAACO Approach

    Science.gov (United States)

    Sathish, Kuppani; RamaMohan Reddy, A.

    2016-06-01

    In recent trends, grid computing is one of the emerging computing platforms supporting parallel and distributed environments. A central problem in grid computing is the scheduling of workflows according to user specifications, a challenging task that also impacts performance. This paper proposes a hybrid GAACO approach, a combination of a genetic algorithm and ant colony optimization. The GAACO approach proposes different types of scheduling heuristics for the grid environment. The main objective of this approach is to satisfy all the defined constraints and user parameters.
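
    The GA half of such a hybrid is easy to sketch: chromosomes assign tasks to machines and fitness is (negative) makespan. The ACO component and grid-specific constraints are omitted here, and the population size and rates are arbitrary.

        import random

        def makespan(assign, task_cost, n_machines):
            loads = [0.0] * n_machines
            for task, machine in enumerate(assign):
                loads[machine] += task_cost[task]
            return max(loads)

        def ga_schedule(task_cost, n_machines, pop=30, gens=200, mut=0.1):
            n = len(task_cost)
            popn = [[random.randrange(n_machines) for _ in range(n)] for _ in range(pop)]
            for _ in range(gens):
                popn.sort(key=lambda a: makespan(a, task_cost, n_machines))
                elite = popn[:pop // 2]                      # keep the best half
                children = []
                while len(children) < pop - len(elite):
                    p1, p2 = random.sample(elite, 2)
                    cut = random.randrange(1, n)
                    child = p1[:cut] + p2[cut:]              # one-point crossover
                    if random.random() < mut:
                        child[random.randrange(n)] = random.randrange(n_machines)
                    children.append(child)
                popn = elite + children
            return min(popn, key=lambda a: makespan(a, task_cost, n_machines))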