WorldWideScience

Sample records for advanced computational approaches

  1. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  2. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  3. A Computationally Based Approach to Homogenizing Advanced Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Jablonski, P D; Cowen, C J

    2011-02-27

We have developed a computationally based approach to optimizing the homogenization heat treatment of complex alloys. The Scheil module within the Thermo-Calc software is used to predict the as-cast segregation present within alloys, and DICTRA (Diffusion Controlled TRAnsformations) is used to model the homogenization kinetics as a function of time, temperature and microstructural scale. We will discuss this approach as it is applied to both Ni-based superalloys as well as the more complex (computationally) case of alloys that solidify with more than one matrix phase as a result of segregation. Such is the case typically observed in martensitic steels. With these alloys it is doubly important to homogenize them correctly, especially at the laboratory scale, since they are austenitic at high temperature and thus constituent elements will diffuse slowly. The computationally designed heat treatment and the subsequent verification on real castings are presented.
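    The as-cast segregation that the Scheil module predicts follows the classical Scheil-Gulliver relation; the sketch below is a minimal illustration of that relation only (a single solute with a constant partition coefficient k and no back-diffusion), not the Thermo-Calc implementation:

```python
# Scheil-Gulliver solidification: composition of solid forming at
# fraction solid fs, Cs(fs) = k * C0 * (1 - fs)**(k - 1).
# Illustrative only: constant partition coefficient, no back-diffusion.

def scheil_cs(fs, c0, k):
    """Solid composition forming at fraction solid fs (0 <= fs < 1)."""
    return k * c0 * (1.0 - fs) ** (k - 1.0)

# Hypothetical example: 5 wt% solute, k = 0.5 (solute rejected into liquid).
c0, k = 5.0, 0.5
first_solid = scheil_cs(0.0, c0, k)  # k * C0 at the start of freezing
late_solid = scheil_cs(0.9, c0, k)   # strong enrichment near the end
```

    For k < 1 the last solid to form is strongly enriched in solute, which is exactly the interdendritic segregation a homogenization heat treatment must remove.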

  4. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and  considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  5. Advances in computers

    CERN Document Server

    Memon, Atif

    2012-01-01

Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. In-depth surveys and tutorials on new computer technology. Well-known authors and researchers in the field. Extensive bibliographies with m

  6. Recent Advances in Evolutionary Computation

    Institute of Scientific and Technical Information of China (English)

    Xin Yao; Yong Xu

    2006-01-01

    Evolutionary computation has experienced a tremendous growth in the last decade in both theoretical analyses and industrial applications. Its scope has evolved beyond its original meaning of "biological evolution" toward a wide variety of nature inspired computational algorithms and techniques, including evolutionary, neural, ecological, social and economical computation, etc., in a unified framework. Many research topics in evolutionary computation nowadays are not necessarily "evolutionary". This paper provides an overview of some recent advances in evolutionary computation that have been made in CERCIA at the University of Birmingham, UK. It covers a wide range of topics in optimization, learning and design using evolutionary approaches and techniques, and theoretical results in the computational time complexity of evolutionary algorithms. Some issues related to future development of evolutionary computation are also discussed.
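    Theoretical results on the computational time complexity of evolutionary algorithms, of the kind the paper surveys, are usually illustrated with the (1+1)-EA maximizing OneMax; the sketch below is a generic textbook version with illustrative parameters, not code from CERCIA:

```python
import random

def one_plus_one_ea(n, max_iters=100_000, seed=0):
    """(1+1)-EA on OneMax: flip each bit independently with probability
    1/n and keep the offspring if it is at least as good as the parent.
    Expected optimization time on OneMax is Theta(n log n)."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    best = sum(x)
    for _ in range(max_iters):
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        fy = sum(y)
        if fy >= best:
            x, best = y, fy
        if best == n:
            break
    return x, best

x, best = one_plus_one_ea(30)
```

    The elitist accept-if-not-worse rule is what makes the runtime analysis tractable: progress is never lost, so the expected waiting times for each improvement can simply be summed.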

  7. Advances in Computers

    CERN Document Server

    Zelkowitz, Marvin

    2010-01-01

    This is volume 79 of Advances in Computers. This series, which began publication in 1960, is the oldest continuously published anthology that chronicles the ever- changing information technology field. In these volumes we publish from 5 to 7 chapters, three times per year, that cover the latest changes to the design, development, use and implications of computer technology on society today. Covers the full breadth of innovations in hardware, software, theory, design, and applications.Many of the in-depth reviews have become standard references that co

  8. Computational approaches for conflict resolution in decision making: New advances and developments

    OpenAIRE

    Aydogan, Reyhan; Sánchez Anguix, Víctor; Julian Inglada, Vicente Javier; Broekens, Joost; Jonker, Catholijn

    2014-01-01

    Conflict is an omnipresent phenomenon in human society. It spans from individual decision-making trade-offs such as deciding what to do next (sleep, eat, work, play), to complex scenarios including politics and business. The social sciences, psychology, economy, and biology study the nature of conflict, its consequences, and strategies to successfully deal with it. Over the last decades computer science has joined those disciplines and studies conflict from a computational perspective. This s...

  9. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

    2012-07-31

This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) Prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual-loop chemical looping process models and dynamic simulation software tools; development and testing of several advanced control concepts and applications for chemical looping transport control; and investigation of several sensor concepts, with two feasible sensor candidates established and recommended for further prototype development and controls integration. There are three sections in this summary and conclusions. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

  10. Advances in computational dynamics of particles, materials and structures a unified approach

    CERN Document Server

    Har, Jason

    2012-01-01

    Computational methods for the modeling and simulation of the dynamic response and behavior of particles, materials and structural systems have had a profound influence on science, engineering and technology. Complex science and engineering applications dealing with complicated structural geometries and materials that would be very difficult to treat using analytical methods have been successfully simulated using computational tools. With the incorporation of quantum, molecular and biological mechanics into new models, these methods are poised to play an even bigger role in the future. Ad

  11. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Bobby, R., Ph.D.

    2003-06-27

OAK - B135 This project final report summarizes modeling research conducted in the U.S. Department of Energy (DOE), Low Dose Radiation Research Program at the Lovelace Respiratory Research Institute from October 1998 through June 2003. The modeling research described involves critically evaluating the validity of the linear nonthreshold (LNT) risk model as it relates to stochastic effects induced in cells by low doses of ionizing radiation and genotoxic chemicals. The LNT model plays a central role in low-dose risk assessment for humans. With the LNT model, any radiation (or genotoxic chemical) exposure is assumed to increase one's risk of cancer. Based on the LNT model, others have predicted tens of thousands of cancer deaths related to environmental exposure to radioactive material from nuclear accidents (e.g., Chernobyl) and fallout from nuclear weapons testing. Our research has focused on developing biologically based models that explain the shape of dose-response curves for low-dose radiation and genotoxic chemical-induced stochastic effects in cells. Understanding the shape of the dose-response curve for radiation and genotoxic chemical-induced stochastic effects in cells helps to better understand the shape of the dose-response curve for cancer induction in humans. We have used a modeling approach that facilitated model revisions over time, allowing for timely incorporation of new knowledge gained related to the biological basis for low-dose-induced stochastic effects in cells. Both deleterious (e.g., genomic instability, mutations, and neoplastic transformation) and protective (e.g., DNA repair and apoptosis) effects have been included in our modeling. Our most advanced model, NEOTRANS2, involves differing levels of genomic instability. Persistent genomic instability is presumed to be associated with nonspecific, nonlethal mutations and to increase both the risk for neoplastic transformation and for cancer occurrence. Our research results, based on
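    The contrast the project investigates can be stated in one line of arithmetic: under LNT, excess risk is strictly proportional to dose with no safe threshold, whereas a threshold model assigns zero excess risk below some dose. The sketch below is purely schematic, with made-up coefficients rather than fitted values from the study:

```python
def lnt_risk(dose, slope=0.05):
    """Linear nonthreshold (LNT): any dose > 0 adds excess risk."""
    return slope * dose

def threshold_risk(dose, threshold=0.1, slope=0.05):
    """Threshold alternative: doses below the threshold add no risk."""
    return slope * max(0.0, dose - threshold)

low_dose = 0.05  # below the (hypothetical) threshold
lnt = lnt_risk(low_dose)        # nonzero under LNT
thr = threshold_risk(low_dose)  # zero under the threshold model
```

    The biological models described in the report (repair, apoptosis, genomic instability) are precisely attempts to decide which of these low-dose shapes the cellular data actually support.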

  12. Collaborative Learning: Cognitive and Computational Approaches. Advances in Learning and Instruction Series.

    Science.gov (United States)

    Dillenbourg, Pierre, Ed.

    Intended to illustrate the benefits of collaboration between scientists from psychology and computer science, namely machine learning, this book contains the following chapters, most of which are co-authored by scholars from both sides: (1) "Introduction: What Do You Mean by 'Collaborative Learning'?" (Pierre Dillenbourg); (2) "Learning Together:…

  13. Compute Canada: Advancing Computational Research

    International Nuclear Information System (INIS)

    High Performance Computing (HPC) is redefining the way that research is done. Compute Canada's HPC infrastructure provides a national platform that enables Canadian researchers to compete on an international scale, attracts top talent to Canadian universities and broadens the scope of research.

  14. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  15. International Conference on Advanced Computing for Innovation

    CERN Document Server

    Angelova, Galia; Agre, Gennady

    2016-01-01

This volume is a selected collection of papers presented and discussed at the International Conference “Advanced Computing for Innovation (AComIn 2015)”. The Conference was held on 10th–11th of November, 2015 in Sofia, Bulgaria, and was aimed at providing a forum for international scientific exchange between Central/Eastern Europe and the rest of the world on several fundamental topics of computational intelligence. The papers report innovative approaches and solutions in hot topics of computational intelligence – advanced computing, language and semantic technologies, signal and image processing, as well as optimization and intelligent control.

  16. Advanced intelligence and mechanism approach

    Institute of Scientific and Technical Information of China (English)

    ZHONG Yixin

    2007-01-01

Advanced intelligence will be the focus of intelligence research over the next 50 years. An understanding of the concept of advanced intelligence, as well as its importance, is provided first, followed by a detailed analysis of an approach suitable to advanced intelligence research: the mechanism approach. The mutual relationship among the mechanism approach, the traditional approaches in artificial intelligence research, and cognitive informatics is then discussed. It is interesting to discover that the mechanism approach is well suited to advanced intelligence research and is a unified form of the existing approaches to artificial intelligence.

  17. Computational Intelligence Paradigms in Advanced Pattern Classification

    CERN Document Server

    Jain, Lakhmi

    2012-01-01

This monograph presents selected areas of application of pattern recognition and classification approaches including handwriting recognition, medical image analysis and interpretation, development of cognitive systems for image computer understanding, moving object detection, advanced image filtration and intelligent multi-object labelling and classification. It is directed at scientists, application engineers, professors, and students, who will find this book useful.

  18. Computational photography: advances and challenges

    OpenAIRE

    Lam, EYM

    2011-01-01

    In the mid-1990s when digital photography began to enter the consumer market, Professor Joseph Goodman and I set out to explore how computation would impact the imaging system design. The field of study has since grown to be known as computational photography. In this paper I'll describe some of its recent advances and challenges, and discuss what the future holds. © 2011 Copyright Society of Photo-Optical Instrumentation Engineers (SPIE).

  19. Advanced computations in plasma physics

    International Nuclear Information System (INIS)

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  20. Recent advances in computational optimization

    CERN Document Server

    2013-01-01

Optimization is part of our everyday life. We try to organize our work in a better way, and optimization occurs in minimizing time and cost or maximizing profit, quality and efficiency. Many real-world problems arising in engineering, economics, medicine and other domains can also be formulated as optimization tasks. This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization. This book presents recent advances in computational optimization. The volume includes important real-world problems like parameter settings for controlling processes in a bioreactor, robot skin wiring, strip packing, project scheduling, tuning of a PID controller and so on. Some of them can be solved by applying traditional numerical methods, but others need a huge amount of computational resources. For them it is shown that it is appropriate to develop algorithms based on metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming etc...

  1. International Conference on Advanced Computing

    CERN Document Server

    Patnaik, Srikanta

    2014-01-01

    This book is composed of the Proceedings of the International Conference on Advanced Computing, Networking, and Informatics (ICACNI 2013), held at Central Institute of Technology, Raipur, Chhattisgarh, India during June 14–16, 2013. The book records current research articles in the domain of computing, networking, and informatics. The book presents original research articles, case-studies, as well as review articles in the said field of study with emphasis on their implementation and practical application. Researchers, academicians, practitioners, and industry policy makers around the globe have contributed towards formation of this book with their valuable research submissions.

  2. Advances in embedded computer vision

    CERN Document Server

    Kisacanin, Branislav

    2014-01-01

    This illuminating collection offers a fresh look at the very latest advances in the field of embedded computer vision. Emerging areas covered by this comprehensive text/reference include the embedded realization of 3D vision technologies for a variety of applications, such as stereo cameras on mobile devices. Recent trends towards the development of small unmanned aerial vehicles (UAVs) with embedded image and video processing algorithms are also examined. The authoritative insights range from historical perspectives to future developments, reviewing embedded implementation, tools, technolog

  3. Advanced Computer Algebra for Determinants

    CERN Document Server

    Koutschan, Christoph

    2011-01-01

    We prove three conjectures concerning the evaluation of determinants, which are related to the counting of plane partitions and rhombus tilings. One of them has been posed by George Andrews in 1980, the other two are by Guoce Xin and Christian Krattenthaler. Our proofs employ computer algebra methods, namely the holonomic ansatz proposed by Doron Zeilberger and variations thereof. These variations make Zeilberger's original approach even more powerful and allow for addressing a wider variety of determinants. Finally we present, as a challenge problem, a conjecture about a closed form evaluation of Andrews's determinant.

  4. Advances in Computer Science and Engineering

    CERN Document Server

    Second International Conference on Advances in Computer Science and Engineering (CES 2012)

    2012-01-01

This book includes the proceedings of the second International Conference on Advances in Computer Science and Engineering (CES 2012), which was held during January 13-14, 2012 in Sanya, China. The papers in these proceedings of CES 2012 focus on the researchers' advanced work in the fields of Computer Science and Engineering, mainly organized into four topics: (1) Software Engineering, (2) Intelligent Computing, (3) Computer Networks, and (4) Artificial Intelligence Software.

  5. Computational approaches to vision

    Science.gov (United States)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  6. Handbook of computational approaches to counterterrorism

    CERN Document Server

    Subrahmanian, VS

    2012-01-01

Terrorist groups throughout the world have been studied primarily through the use of social science methods. However, major advances in IT during the past decade have led to significant new ways of studying terrorist groups, making forecasts, learning models of their behaviour, and shaping policies about their behaviour. Handbook of Computational Approaches to Counterterrorism provides the first in-depth look at how advanced mathematics and modern computing technology are shaping the study of terrorist groups. This book includes contributions from world experts in the field, and presents extens

  7. Advances in Computer Science and its Applications

    CERN Document Server

    Yen, Neil; Park, James; CSA 2013

    2014-01-01

The theme of CSA is focused on the various aspects of computer science and its applications, and the conference provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of computer science and its applications. This book therefore includes the various theories and practical applications in computer science and its applications.

  8. Toward exascale computing through neuromorphic approaches.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.

    2010-09-01

    While individual neurons function at relatively low firing rates, naturally-occurring nervous systems not only surpass manmade systems in computing power, but accomplish this feat using relatively little energy. It is asserted that the next major breakthrough in computing power will be achieved through application of neuromorphic approaches that mimic the mechanisms by which neural systems integrate and store massive quantities of data for real-time decision making. The proposed LDRD provides a conceptual foundation for SNL to make unique advances toward exascale computing. First, a team consisting of experts from the HPC, MESA, cognitive and biological sciences and nanotechnology domains will be coordinated to conduct an exercise with the outcome being a concept for applying neuromorphic computing to achieve exascale computing. It is anticipated that this concept will involve innovative extension and integration of SNL capabilities in MicroFab, material sciences, high-performance computing, and modeling and simulation of neural processes/systems.

  9. Bringing Advanced Computational Techniques to Energy Research

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  10. Soft computing in advanced robotics

    CERN Document Server

    Kobayashi, Ichiro; Kim, Euntai

    2014-01-01

Intelligent systems and robotics are inevitably bound up: intelligent robots embody system integration by using intelligent systems. Intelligent systems are to robots what cells are to the body, and the two technologies have progressed in synchrony. Leveraging robotics and intelligent systems, applications cover a boundless range from our daily life to the space station: manufacturing, healthcare, environment, energy, education, personal assistance, and logistics. This book aims at presenting research results relevant to intelligent robotics technology. We propose to researchers and practitioners some methods to advance intelligent systems and apply them to advanced robotics technology. This book consists of 10 contributions that feature mobile robots, robot emotion, electric power steering, multi-agent systems, fuzzy visual navigation, adaptive network-based fuzzy inference systems, swarm EKF localization and inspection robots. Th...

  11. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  12. Computer networks and advanced communications

    International Nuclear Information System (INIS)

    One of the major methods for getting the most productivity and benefits from computer usage is networking. However, for those who are contemplating a change from stand-alone computers to a network system, the investigation of actual networks in use presents a paradox: network systems can be highly productive and beneficial; at the same time, these networks can create many complex, frustrating problems. The issue becomes a question of whether the benefits of networking are worth the extra effort and cost. In response to this issue, the authors review in this paper the implementation and management of an actual network in the LSU Petroleum Engineering Department. The network, which has been in operation for four years, is large and diverse (50 computers, 2 sites, PC's, UNIX RISC workstations, etc.). The benefits, costs, and method of operation of this network will be described, and an effort will be made to objectively weigh these elements from the point of view of the average computer user

  13. Advanced laptop and small personal computer technology

    Science.gov (United States)

    Johnson, Roger L.

    1991-01-01

Advanced laptop and small personal computer technology is presented in the form of viewgraphs. The following areas of hand-carried computers and mobile workstation technology are covered: background, applications, high-end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  14. Advanced Biomedical Computing Center (ABCC) | DSITP

    Science.gov (United States)

    The Advanced Biomedical Computing Center (ABCC), located in Frederick Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to provide collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  15. Advance Trends in Soft Computing

    CERN Document Server

    Kreinovich, Vladik; Kacprzyk, Janusz; WCSC 2013

    2014-01-01

This book is the proceedings of the 3rd World Conference on Soft Computing (WCSC), which was held in San Antonio, TX, USA, on December 16-18, 2013. It presents state-of-the-art theory and applications of soft computing together with an in-depth discussion of current and future challenges in the field, providing readers with a 360 degree view on soft computing. Topics range from fuzzy sets, to fuzzy logic, fuzzy mathematics, neuro-fuzzy systems, fuzzy control, decision making in fuzzy environments, image processing and many more. The book is dedicated to Lotfi A. Zadeh, a renowned specialist in signal analysis and control systems research who proposed the idea of fuzzy sets, in which an element may have a partial membership, in the early 1960s, followed by the idea of fuzzy logic, in which a statement can be true only to a certain degree, with degrees described by numbers in the interval [0,1]. The performance of fuzzy systems can often be improved with the help of optimization techniques, e.g. evolutionary co...
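    Zadeh's partial membership and the degrees in [0,1] described above can be made concrete with a triangular membership function and the standard min/max connectives; the sets "warm" and "hot" below are toy examples, not material from the proceedings:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Two hypothetical fuzzy sets over temperature in degrees C.
warm = lambda t: tri(t, 15.0, 22.0, 30.0)
hot = lambda t: tri(t, 25.0, 35.0, 45.0)

t = 27.0
mu_warm = warm(t)              # partial membership in "warm"
mu_hot = hot(t)                # partial membership in "hot"
mu_and = min(mu_warm, mu_hot)  # fuzzy AND (Zadeh's min operator)
mu_or = max(mu_warm, mu_hot)   # fuzzy OR (Zadeh's max operator)
```

    A temperature of 27 °C is thus simultaneously somewhat "warm" and somewhat "hot", with a degree for each, which is exactly the departure from crisp set membership that fuzzy sets introduce.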

  16. Elliptic curves a computational approach

    CERN Document Server

    Schmitt, Susanne; Pethö, Attila

    2003-01-01

    The basics of the theory of elliptic curves should be known to everybody, be he (or she) a mathematician or a computer scientist. Especially everybody concerned with cryptography should know the elements of this theory. The purpose of the present textbook is to give an elementary introduction to elliptic curves. Since this branch of number theory is particularly accessible to computer-assisted calculations, the authors make use of it by approaching the theory under a computational point of view. Specifically, the computer-algebra package SIMATH can be applied on several occasions. However, the book can be read also by those not interested in any computations. Of course, the theory of elliptic curves is very comprehensive and becomes correspondingly sophisticated. That is why the authors made a choice of the topics treated. Topics covered include the determination of torsion groups, computations regarding the Mordell-Weil group, height calculations, S-integral points. The contents is kept as elementary as poss...
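    The computer-assisted flavour the authors describe can be imitated in a few lines: counting the points of an elliptic curve over a small prime field by brute force (a toy stand-in for the SIMATH computations the book actually uses; curve and field chosen for illustration):

```python
def ec_points(a, b, p):
    """All affine points on y^2 = x^3 + a*x + b over F_p (p a small prime),
    found by exhaustive search over both coordinates."""
    pts = []
    for x in range(p):
        rhs = (x * x * x + a * x + b) % p
        for y in range(p):
            if (y * y) % p == rhs:
                pts.append((x, y))
    return pts

# Example: y^2 = x^3 + x + 1 over F_5.
pts = ec_points(1, 1, 5)
order = len(pts) + 1  # +1 for the point at infinity
```

    Brute force is only feasible for tiny fields, of course; the serious point-counting and Mordell-Weil computations the book treats require the far more sophisticated algorithms it surveys.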

  17. Advances in randomized parallel computing

    CERN Document Server

    Rajasekaran, Sanguthevar

    1999-01-01

    The technique of randomization has been employed to solve numerous problems of computing, both sequentially and in parallel. Examples of randomized algorithms that are asymptotically better than their deterministic counterparts in solving various fundamental problems abound. Randomized algorithms have the advantages of simplicity and better performance, both in theory and often in practice. This book is a collection of articles written by renowned experts in the area of randomized parallel computing. A brief introduction to randomized algorithms: in the analysis of algorithms, at least three different measures of performance can be used: the best case, the worst case, and the average case. Often, the average-case run time of an algorithm is much smaller than the worst case. For instance, the worst-case run time of Hoare's quicksort is O(n²), whereas its average-case run time is only O(n log n). The average-case analysis is conducted with an assumption on the input space. The assumption made to arrive at t...
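The quicksort example above can be made concrete: choosing the pivot uniformly at random gives an expected O(n log n) run time on every input, turning the average-case guarantee into one that no adversarial input ordering can defeat. A minimal sketch (not from the book itself):

```python
import random

# Randomized quicksort: a uniformly random pivot removes the O(n^2)
# worst case tied to any fixed pivot rule (e.g. always picking the
# first element, which degrades on already-sorted input).

def rquicksort(xs):
    if len(xs) <= 1:
        return list(xs)
    pivot = random.choice(xs)
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    return rquicksort(less) + equal + rquicksort(greater)
```

On an already-sorted input, a deterministic first-element-pivot quicksort does Θ(n²) work, while the randomized version still runs in expected O(n log n).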

  18. Antenna arrays a computational approach

    CERN Document Server

    Haupt, Randy L

    2010-01-01

    This book covers a wide range of antenna array topics that are becoming increasingly important in wireless applications, particularly in design and computer modeling. Signal processing and numerical modeling algorithms are explored, and MATLAB computer codes are provided for many of the design examples. Pictures of antenna arrays and components provided by industry and government sources are presented with explanations of how they work. Antenna Arrays is a valuable reference for practicing engineers and scientists in wireless communications, radar, and remote sensing, and an excellent textbook for advanced antenna courses.

  19. Computational Approaches to Nucleic Acid Origami.

    Science.gov (United States)

    Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo

    2015-10-12

    Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. Despite these advantages, the development of RNA origami lags far behind that of DNA origami. Furthermore, although computational methods have proven to be effective in designing DNA and RNA origami structures and in their evaluation, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms. PMID:26348196
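The complementary base pairing the review refers to can be expressed as a tiny checker. This is an illustrative sketch, not the review's software: it covers the Watson-Crick pairs plus the common G-U wobble pair, one instance of the "non-canonical" RNA pairing mentioned above.

```python
# Watson-Crick pairs for RNA, plus the G-U wobble pair (a common
# non-canonical interaction that DNA lacks).
WATSON_CRICK = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G")}
WOBBLE = {("G", "U"), ("U", "G")}

def can_pair(b1, b2, allow_wobble=True):
    """Can two RNA bases pair, optionally allowing G-U wobble?"""
    pair = (b1.upper(), b2.upper())
    return pair in WATSON_CRICK or (allow_wobble and pair in WOBBLE)

def reverse_complement(rna):
    """Strand that pairs antiparallel to the given RNA sequence."""
    comp = {"A": "U", "U": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(rna.upper()))
```

Design tools for origami structures build on exactly such pairing rules, extended with thermodynamic parameters, to predict which strand segments will fold together.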

  20. Computational approaches to energy materials

    CERN Document Server

    Catlow, Richard; Walsh, Aron

    2013-01-01

    The development of materials for clean and efficient energy generation and storage is one of the most rapidly developing, multi-disciplinary areas of contemporary science, driven primarily by concerns over global warming, diminishing fossil-fuel reserves, the need for energy security, and increasing consumer demand for portable electronics. Computational methods are now an integral and indispensable part of the materials characterisation and development process.   Computational Approaches to Energy Materials presents a detailed survey of current computational techniques for the

  1. Advances and challenges in computational plasma science

    International Nuclear Information System (INIS)

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 
This

  2. Computational electromagnetics recent advances and engineering applications

    CERN Document Server

    2014-01-01

    Emerging Topics in Computational Electromagnetics presents advances in computational electromagnetics (CEM). This book is designed to fill the existing gap in the CEM literature, which covers only the conventional numerical techniques for solving traditional EM problems. The book examines new algorithms and applications of these algorithms for solving problems of current interest that are not readily amenable to efficient treatment by the existing techniques. The authors discuss solution techniques for problems arising in nanotechnology, bioEM, and metamaterials, as well as multiscale problems. They present techniques that utilize recent advances in computer technology, such as parallel architectures, and address the increasing need to solve large and complex problems in a time-efficient manner by using highly scalable algorithms.

  3. Advances and Challenges in Computational Plasma Science

    Energy Technology Data Exchange (ETDEWEB)

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  4. Advances and Challenges in Computational Plasma Science

    International Nuclear Information System (INIS)

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  5. Mobile Cloud Computing: A Review on Smartphone Augmentation Approaches

    OpenAIRE

    Abolfazli, Saeid; Sanaei, Zohreh; Gani, Abdullah

    2012-01-01

    Smartphones have recently gained significant popularity for heavy mobile processing, while users' expectations of a rich computing experience keep rising. However, resource limitations and current mobile computing advancements hinder this vision. Therefore, resource-intensive application execution remains a challenging task in mobile computing that necessitates device augmentation. In this article, smartphone augmentation approaches are reviewed and classified in two main groups, name...

  6. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    Energy Technology Data Exchange (ETDEWEB)

    None

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  7. Computational Approach for Developing Blood Pump

    Science.gov (United States)

    Kwak, Dochan

    2002-01-01

    This viewgraph presentation provides an overview of the computational approach to developing a ventricular assist device (VAD) which utilizes NASA aerospace technology. The VAD is used as a temporary support for sick ventricles for those who suffer from late-stage congestive heart failure (CHF). The need for donor hearts is much greater than their availability, and the VAD is seen as a bridge-to-transplant. The computational issues confronting the design of a more advanced, reliable VAD include the modeling of viscous incompressible flow. A computational approach provides the possibility of quantifying the flow characteristics, which is especially valuable for analyzing a compact design with highly sensitive operating conditions. Computational fluid dynamics (CFD) and rocket engine technology have been applied to modify the design of a VAD which enabled human transplantation. The computing requirement for this project is still large, however, and the unsteady analysis of the entire system from natural heart to aorta involves several hundred revolutions of the impeller. Further study is needed to assess the impact of mechanical VADs on the human body.

  8. Advances in computers improving the web

    CERN Document Server

    Zelkowitz, Marvin

    2010-01-01

    This is volume 78 of Advances in Computers. This series, which began publication in 1960, is the oldest continuously published anthology that chronicles the ever-changing information technology field. In these volumes we publish from 5 to 7 chapters, three times per year, that cover the latest changes to the design, development, use and implications of computer technology on society today. Covers the full breadth of innovations in hardware, software, theory, design, and applications. Many of the in-depth reviews have become standard references that continue to be of significant, lasting value i

  9. Computational Approaches to Vestibular Research

    Science.gov (United States)

    Ross, Muriel D.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    The Biocomputation Center at NASA Ames Research Center is dedicated to a union between computational, experimental and theoretical approaches to the study of neuroscience and of life sciences in general. The current emphasis is on computer reconstruction and visualization of vestibular macular architecture in three dimensions (3-D), and on mathematical modeling and computer simulation of neural activity in the functioning system. Our methods are being used to interpret the influence of spaceflight on mammalian vestibular maculas in a model system, that of the adult Sprague-Dawley rat. More than twenty 3-D reconstructions of type I and type II hair cells and their afferents have been completed by digitization of contours traced from serial sections photographed in a transmission electron microscope. This labor-intensive method has now been replaced by a semiautomated method developed in the Biocomputation Center in which conventional photography is eliminated. All viewing, storage and manipulation of original data is done using Silicon Graphics workstations. Recent improvements to the software include a new mesh generation method for connecting contours. This method will permit the investigator to describe any surface, regardless of complexity, including highly branched structures such as are routinely found in neurons. This same mesh can be used for 3-D, finite volume simulation of synapse activation and voltage spread on neuronal surfaces visualized via the reconstruction process. These simulations help the investigator interpret the relationship between neuroarchitecture and physiology, and are of assistance in determining which experiments will best test theoretical interpretations. Data are also used to develop abstract, 3-D models that dynamically display neuronal activity ongoing in the system. Finally, the same data can be used to visualize the neural tissue in a virtual environment. 
Our exhibit will depict capabilities of our computational approaches and

  10. Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  11. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big pictur

  12. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  13. SPEECH RECOGNITION - A COMPUTER MEDIATED APPROACH

    OpenAIRE

    Kaliyaperumal Karthikeyan

    2012-01-01

    The computer revolution is now well advanced, but although we see a growing presence of computer machines in many forms of work people do, the domain of computers is still significantly small because of the specialized training needed to use them and the lack of intelligence in computer systems. In the history of computer science, five generations have passed by, each adding a new innovative technology that brought computers nearer and nearer to the people. Now it is the sixth generation, whos...

  14. Foreword: Advanced Science Letters (ASL), Special Issue on Computational Astrophysics

    CERN Document Server

    ,

    2009-01-01

    Computational astrophysics has undergone unprecedented development over the last decade, becoming a field of its own. The challenge ahead of us will involve increasingly complex multi-scale simulations. These will bridge the gap between areas of astrophysics such as star and planet formation, or star formation and galaxy formation, that have evolved separately until today. A global knowledge of the physics and modeling techniques of astrophysical simulations is thus an important asset for the next generation of modelers. With the aim of fostering such a global approach, we present the Special Issue on Computational Astrophysics for the Advanced Science Letters (http://www.aspbs.com/science.htm). The Advanced Science Letters (ASL) is a new multi-disciplinary scientific journal which will extensively cover computational astrophysics and cosmology, and will act as a forum for the presentation and discussion of novel work attempting to connect different research areas. This Special Issue collects 9 reviews on 9 k...

  15. Aerodynamic optimization studies on advanced architecture computers

    Science.gov (United States)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.

  19. GRID COMPUTING AND CHECKPOINT APPROACH

    OpenAIRE

    Pankaj gupta

    2011-01-01

    Grid computing is a means of allocating the computational power of a large number of computers to a complex or difficult computation or problem. Grid computing is a distributed computing paradigm that differs from traditional distributed computing in that it is aimed toward large-scale systems that even span organizational boundaries. In this paper we investigate the different techniques of fault tolerance which are used in many real-time distributed systems. The main focus is on types of fault occu...
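One fault-tolerance technique of the kind this paper surveys is checkpoint/restart: periodically persisting a computation's state so that a failed node can resume from the last checkpoint instead of from scratch. The sketch below is a minimal single-process illustration under assumed names, not the paper's actual system.

```python
import os
import pickle

# Minimal checkpoint/restart sketch: the loop state is pickled every
# `interval` steps; on startup, any existing checkpoint is loaded so a
# restarted run resumes where the previous one left off.

def run_with_checkpoints(n_steps, ckpt_path, interval=10):
    state = {"step": 0, "total": 0}
    if os.path.exists(ckpt_path):              # restart path
        with open(ckpt_path, "rb") as f:
            state = pickle.load(f)
    while state["step"] < n_steps:
        state["total"] += state["step"]        # the "work" of one step
        state["step"] += 1
        if state["step"] % interval == 0:      # periodic checkpoint
            with open(ckpt_path, "wb") as f:
                pickle.dump(state, f)
    return state["total"]
```

In a grid setting the checkpoint would be written to reliable shared storage, and a scheduler would restart the job on another node after a failure; the trade-off is checkpoint frequency (lost work on failure) versus I/O overhead.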

  20. Fuzzy multiple linear regression: A computational approach

    Science.gov (United States)

    Juang, C. H.; Huang, X. H.; Fleming, J. W.

    1992-01-01

    This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, and this is followed by examples to illustrate the new procedure.
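The "computational rather than symbolic" treatment described above can be illustrated with a minimal sketch. This is not the paper's actual formulation: it assumes symmetric triangular fuzzy outputs represented as (center, spread), fits the centers with conventional least squares as the abstract suggests, and carries a spread through the fitted line.

```python
# Illustrative sketch only: fuzzy observations as symmetric triangular
# numbers (center, spread); the regression itself follows the
# conventional (crisp) least-squares technique applied to the centers.

def fit_centers(xs, ys):
    """Ordinary least squares y = b0 + b1*x on the crisp centers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
         sum((x - mx) ** 2 for x in xs)
    b0 = my - b1 * mx
    return b0, b1

def predict_fuzzy(x, coeffs, spread):
    """Triangular fuzzy prediction (left, center, right) at x."""
    b0, b1 = coeffs
    c = b0 + b1 * x
    return (c - spread, c, c + spread)
```

For centers lying exactly on y = 1 + 2x, the fit recovers b0 = 1 and b1 = 2, and the prediction at any x is a triangular number centered on the crisp regression value.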

  1. Computational approach to Riemann surfaces

    CERN Document Server

    Klein, Christian

    2011-01-01

    This volume offers a well-structured overview of existent computational approaches to Riemann surfaces and those currently in development. The authors of the contributions represent the groups providing publicly available numerical codes in this field. Thus this volume illustrates which software tools are available and how they can be used in practice. In addition, examples of solutions to partial differential equations and of applications in surface theory are presented. The intended audience of this book is twofold. It can be used as a textbook for a graduate course in numerics of Riemann surfaces, in which case the standard undergraduate background, i.e., calculus and linear algebra, is required. In particular, no knowledge of the theory of Riemann surfaces is expected; the necessary background in this theory is contained in the Introduction chapter. At the same time, this book is also intended for specialists in geometry and mathematical physics applying the theory of Riemann surfaces in their research. It is the first...

  2. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved to be effective in the solution of several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  3. Computational Design of Advanced Nuclear Fuels

    Energy Technology Data Exchange (ETDEWEB)

    Savrasov, Sergey [Univ. of California, Davis, CA (United States); Kotliar, Gabriel [Rutgers Univ., Piscataway, NJ (United States); Haule, Kristjan [Rutgers Univ., Piscataway, NJ (United States)

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, and carbides, as well as some rare earth materials whose 4f electrons provide a simplified framework for understanding the complex behavior of the f electrons. We addressed the issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics, as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer-based simulations and avoid costly experiments.

  4. International Conference on Computers and Advanced Technology in Education

    CERN Document Server

    Advanced Information Technology in Education

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computers and Advanced Technology in Education. With the development of computers and advanced technology, human social activities are changing fundamentally. Education, and especially education reform in different countries, has benefited greatly from computers and advanced technology. Generally speaking, education is a field which needs more information, while computers, advanced technology and the internet are good information providers. Also, with the aid of computers and advanced technology, educators can combine these resources effectively. Therefore, computers and advanced technology should be regarded as an important medium in modern education. The volume Advanced Information Technology in Education provides a forum for researchers, educators, engineers, and government officials involved in the general areas of computers and advanced technology in education to d...

  5. Computer Forensics Education - the Open Source Approach

    Science.gov (United States)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of open source software tools in computer forensics education at the tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain an in-depth understanding and appreciation of the computer forensic process, as opposed to familiarity with one software product, however complex and multi-functional. With access to all source programs, the students become more than just consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain the necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that, without exception, more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain the confidence to use a variety of tools, not just a single product they are familiar with.

  6. Advanced control room evaluation: General approach and rationale

    International Nuclear Information System (INIS)

    Advanced control rooms (ACRs) for future nuclear power plants (NPPs) are being designed utilizing computer-based technologies. The US Nuclear Regulatory Commission reviews the human engineering aspects of such control rooms to ensure that they are designed according to good human factors engineering principles and that operator performance and reliability are appropriately supported in order to protect public health and safety. This paper describes the rationale and general approach to the development of a human factors review guideline for ACRs. The factors influencing the guideline development are discussed, including the review environment, the types of advanced technologies being addressed, the human factors issues associated with advanced technology, and the current state-of-the-art of human factors guidelines for advanced human-system interfaces (HSIs). The proposed approach to ACR review would track the design and implementation process through the application of review guidelines reflecting four review modules: planning, design process analysis, human factors engineering review, and dynamic performance evaluation. 21 refs

  7. Advanced Safeguards Approaches for New Reprocessing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Durst, Philip C.; Therios, Ike; Bean, Robert; Dougan, A.; Boyer, Brian; Wallace, Richard; Ehinger, Michael H.; Kovacic, Don N.; Tolk, K.

    2007-06-24

    U.S. efforts to promote the international expansion of nuclear energy through the Global Nuclear Energy Partnership (GNEP) will result in a dramatic expansion of nuclear fuel cycle facilities in the United States. New demonstration facilities, such as the Advanced Fuel Cycle Facility (AFCF), the Advanced Burner Reactor (ABR), and the Consolidated Fuel Treatment Center (CFTC) will use advanced nuclear and chemical process technologies that must incorporate increased proliferation resistance to enhance nuclear safeguards. The ASA-100 Project, “Advanced Safeguards Approaches for New Nuclear Fuel Cycle Facilities,” commissioned by the NA-243 Office of NNSA, has been tasked with reviewing and developing advanced safeguards approaches for these demonstration facilities. Because one goal of GNEP is developing and sharing proliferation-resistant nuclear technology and services with partner nations, the safeguards approaches considered are consistent with international safeguards as currently implemented by the International Atomic Energy Agency (IAEA). This first report reviews possible safeguards approaches for the new fuel reprocessing processes to be deployed at the AFCF and CFTC facilities. Similar analyses addressing the ABR and transuranic (TRU) fuel fabrication lines at AFCF and CFTC will be presented in subsequent reports.

  8. Transport modeling and advanced computer techniques

    International Nuclear Information System (INIS)

    A workshop was held at the University of Texas in June 1988 to consider the current state of transport codes and whether improved user interfaces would make the codes more usable and accessible to the fusion community. Also considered was the possibility that a software standard could be devised to ease the exchange of routines between groups. It was noted that two of the major obstacles to exchanging routines now are the variety of geometrical representations and choices of units. While the workshop formulated no standards, it was generally agreed that good software engineering would aid in the exchange of routines, and that a continued exchange of ideas between groups would be worthwhile. It seems that before we begin to discuss software standards, we should review the current state of computer technology, both hardware and software, to see what influence recent advances might have on our software goals. This is done in this paper

  9. Advanced Scientific Computing Research Network Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  10. Advanced proton imaging in computed tomography

    CERN Document Server

    Mattiazzo, S; Giubilato, P; Pantano, D; Pozzobon, N; Snoeys, W; Wyss, J

    2015-01-01

    In recent years the use of hadrons for cancer radiation treatment has grown in importance, and many facilities are currently operational or under construction worldwide. To fully exploit the therapeutic advantages offered by hadron therapy, precise body imaging for accurate beam delivery is decisive. Proton computed tomography (pCT) scanners, currently in their R&D phase, provide the ultimate 3D imaging for hadron treatment guidance. A key component of a pCT scanner is the detector used to track the protons, which has a great impact on the scanner's performance and ultimately limits its maximum speed. In this article, a novel proton-tracking detector is presented that would offer higher scanning speed, better spatial resolution and a lower material budget with respect to present state-of-the-art detectors, leading to enhanced performance. This advancement in performance is achieved by employing the very latest developments in monolithic active pixel detectors (to build high granularity, low material budget, ...

  11. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2011-01-01

    The computing world today is in the middle of a revolution: mobile clients and cloud computing have emerged as the dominant paradigms driving programming and hardware innovation today. The Fifth Edition of Computer Architecture focuses on this dramatic shift, exploring the ways in which software and technology in the cloud are accessed by cell phones, tablets, laptops, and other mobile computing devices. Each chapter includes two real-world examples, one mobile and one datacenter, to illustrate this revolutionary change. Updated to cover the mobile computing revolution. Emphasizes the two most im

  12. OPENING REMARKS: Scientific Discovery through Advanced Computing

    Science.gov (United States)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about 70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such

  13. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    Science.gov (United States)

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  14. Machine Computation; An Algorithmic Approach.

    Science.gov (United States)

    Gonzalez, Richard F.; McMillan, Claude, Jr.

    Designed for undergraduate social science students, this textbook concentrates on using the computer in a straightforward way to manipulate numbers and variables in order to solve problems. The text is problem oriented and assumes that the student has had little prior experience with either a computer or programming languages. An introduction to…

  15. An institutional approach to computational social creativity

    OpenAIRE

    Corneli, Joseph

    2016-01-01

    Elinor Ostrom's Nobel Memorial Prize-winning work on "the analysis of economic governance, especially the commons" scaffolds an argument for an institutional approach to computational social creativity. Several Ostrom-inspired "creativity design principles" are explored and exemplified to illustrate the computational and institutional structures that are employed in current and potential computational creativity practice.

  16. Soft Computing Approaches To Fault Tolerant Systems

    Directory of Open Access Journals (Sweden)

    Neeraj Prakash Srivastava

    2014-05-01

    Full Text Available In this paper we present an introduction to soft computing techniques for fault tolerant systems, together with the associated terminology and the different ways of achieving fault tolerance. The paper focuses on the problem of fault tolerance using soft computing techniques. The fundamentals of soft computing approaches and their types, with an introduction to fault tolerance, are discussed. The main objective is to show how to implement soft computing approaches for fault detection, isolation and identification. The paper details the application of soft computing, illustrated with a wireless sensor network as a fault tolerant system.
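The fault detection and isolation loop described in the abstract can be sketched very simply. A minimal illustration, under assumed parameters not taken from the paper: a sensor node whose reading deviates from the neighbourhood median by more than a threshold is flagged as faulty and its value replaced, which is one of the simplest soft ways to tolerate a sensor fault in a wireless sensor network.

```python
# Sketch of sensor-fault detection/isolation in a wireless sensor network.
# The median-residual rule and the threshold of 5.0 are illustrative
# assumptions; real soft computing approaches would use fuzzy rules or
# neural models instead of a fixed threshold.
from statistics import median

def detect_and_isolate(readings, threshold=5.0):
    """Return (fault_ids, corrected) for a dict of node_id -> reading."""
    m = median(readings.values())
    faults = [n for n, v in readings.items() if abs(v - m) > threshold]
    # Isolate faulty nodes by replacing their readings with the median.
    corrected = {n: (m if n in faults else v) for n, v in readings.items()}
    return faults, corrected

readings = {"s1": 21.0, "s2": 20.5, "s3": 48.9, "s4": 21.3}
faults, corrected = detect_and_isolate(readings)
print(faults)           # ['s3'] -- the outlier node is flagged
```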

  17. Ontological Approach toward Cybersecurity in Cloud Computing

    OpenAIRE

    Takahashi, Takeshi; Kadobayashi, Youki; FUJIWARA, HIROYUKI

    2014-01-01

    Widespread deployment of the Internet enabled building of an emerging IT delivery model, i.e., cloud computing. Albeit cloud computing-based services have rapidly developed, their security aspects are still at the initial stage of development. In order to preserve cybersecurity in cloud computing, cybersecurity information that will be exchanged within it needs to be identified and discussed. For this purpose, we propose an ontological approach to cybersecurity in cloud computing. We build an...

  18. Infinitesimal symmetries: a computational approach

    International Nuclear Information System (INIS)

    This thesis is concerned with computational aspects of the determination of infinitesimal symmetries and Lie-Baecklund transformations of differential equations. Moreover, some problems are calculated explicitly. A brief introduction to some concepts in the theory of symmetries and Lie-Baecklund transformations relevant to this thesis is given. The mathematical formalism is briefly reviewed. The jet bundle formulation is chosen, in which, by its algebraic nature, objects can be described very precisely. Consequently it is appropriate for implementation. A number of procedures are discussed which make it possible to carry out computations with the help of a computer. These computations are very extensive in practice. The Lie algebras of infinitesimal symmetries of a number of differential equations in Mathematical Physics are established and some of their applications are discussed, i.e., Maxwell equations, nonlinear diffusion equation, nonlinear Schroedinger equation, nonlinear Dirac equations and self-dual SU(2) Yang-Mills equations. Lie-Baecklund transformations of Burgers' equation, the Classical Boussinesq equation and the Massive Thirring Model are determined. Furthermore, nonlocal Lie-Baecklund transformations of the last equation are derived. (orig.)

  19. Advanced intelligent computational technologies and decision support systems

    CERN Document Server

    Kountchev, Roumen

    2014-01-01

    This book offers a state-of-the-art collection covering themes related to Advanced Intelligent Computational Technologies and Decision Support Systems, which can be applied to fields like healthcare, assisting humans in solving problems. The book brings forward a wealth of ideas, algorithms and case studies in themes like: intelligent predictive diagnosis; intelligent analysis of medical images; a new format for coding single and sequences of medical images; Medical Decision Support Systems; diagnosis of Down’s syndrome; computational perspectives for electronic fetal monitoring; efficient compression of CT images; adaptive interpolation and halftoning for medical images; applications of artificial neural networks to real-life problem solving; the present state and perspectives for Electronic Healthcare Record Systems; adaptive approaches for noise reduction in sequences of CT images, etc.

  20. GRID COMPUTING AND CHECKPOINT APPROACH

    Directory of Open Access Journals (Sweden)

    Pankaj gupta

    2011-05-01

    Full Text Available Grid computing is a means of allocating the computational power of a large number of computers to complex, difficult computations or problems. Grid computing is a distributed computing paradigm that differs from traditional distributed computing in that it is aimed toward large scale systems that even span organizational boundaries. In this paper we investigate the different techniques of fault tolerance which are used in many real time distributed systems. The main focus is on the types of fault occurring in the system, fault detection techniques and the recovery techniques used. A fault can occur due to link failure, resource failure or any other reason, and must be tolerated to keep the system working smoothly and accurately. These faults can be detected and recovered by many techniques used accordingly. An appropriate fault detector can avoid loss due to a system crash, and a reliable fault tolerance technique can save the system from failure. This paper shows how these methods are applied to detect and tolerate faults in various Real Time Distributed Systems. The advantages of utilizing the checkpointing functionality are obvious; however, so far the Grid community has not developed a widely accepted standard that would allow the Grid environment to consciously utilize low level checkpointing packages. Therefore, such a standard, named the Grid Checkpointing Architecture, is being designed. The fault tolerance mechanism used here sets the job checkpoints based on the resource failure rate. If a resource failure occurs, the job is restarted from its last successful state using a checkpoint file from another grid resource. A critical aspect for an automatic recovery is the availability of checkpoint files. A strategy to increase the availability of checkpoints is replication. Grid is a form of distributed computing used mainly to virtualize and utilize geographically distributed idle resources. A grid is a distributed computational and storage environment often composed of
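The checkpoint-and-restart mechanism the abstract describes can be sketched in a few lines. This is a toy illustration under assumed details (checkpoint interval, JSON file layout, simulated crash), not the Grid Checkpointing Architecture itself: a long job periodically saves its state, and after a resource failure it resumes from the last successful checkpoint, which in a grid would be a replica fetched from another resource.

```python
# Toy checkpoint/restart: a job sums 0..99, checkpointing every 10 steps.
# A simulated resource failure at step 57 loses steps 50-56; the restart
# resumes from the checkpoint written after step 49 and still finishes.
import json
import os
import tempfile

CKPT = os.path.join(tempfile.gettempdir(), "job.ckpt")

def run_job(total_steps, ckpt_every=10, fail_at=None):
    # Recover from the last checkpoint if one exists.
    if os.path.exists(CKPT):
        with open(CKPT) as f:
            state = json.load(f)
    else:
        state = {"step": 0, "acc": 0}
    for step in range(state["step"], total_steps):
        if fail_at is not None and step == fail_at:
            raise RuntimeError("resource failure")   # simulated crash
        state = {"step": step + 1, "acc": state["acc"] + step}
        if (step + 1) % ckpt_every == 0:
            with open(CKPT, "w") as f:
                json.dump(state, f)                  # write checkpoint
    return state["acc"]

if os.path.exists(CKPT):
    os.remove(CKPT)                   # start with a clean slate
try:
    run_job(100, fail_at=57)          # crashes mid-run...
except RuntimeError:
    result = run_job(100)             # ...and restarts from step 50
print(result)                         # 4950 == sum(range(100))
```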

  1. COMPUTATIONAL APPROACH TO ORGANIZATIONAL DESIGN

    OpenAIRE

    Alexander Arenas; Roger Guimera; Joan R. Alabart; Hans-Joerg Witt; Albert Diaz-Guilera

    2000-01-01

    The idea of the work is to propose an abstract and sufficiently simple agent-based model for company dynamics, in order to be able to deal computationally and even analytically with the problem of organizational design. Nevertheless, the model should be able to reproduce the essential characteristics of real organizations. The natural way of modeling a company is as a network where the nodes represent employees and the links between them represent communication lines. In our model, problems ar...

  2. Immune based computer virus detection approaches

    Institute of Scientific and Technical Information of China (English)

    TAN Ying; ZHANG Pengtao

    2013-01-01

    The computer virus is considered one of the most horrifying threats to the security of computer systems worldwide. The rapid development of evasion techniques used in viruses causes signature based computer virus detection techniques to be ineffective. Many novel computer virus detection approaches have been proposed in the past to cope with this ineffectiveness, mainly classified into three categories: static, dynamic and heuristic techniques. Noting the natural similarities between the biological immune system (BIS) and the computer security system (CSS), the artificial immune system (AIS) was developed as a new prototype in the community of anti-virus research. The immune mechanisms in the BIS provide the opportunity to construct computer virus detection models that are robust and adaptive, with the ability to detect unseen viruses. In this paper, a variety of classic computer virus detection approaches are introduced and reviewed against the background of computer virus history. Next, a variety of immune based computer virus detection approaches are discussed in detail. Promising experimental results suggest that immune based computer virus detection approaches are able to detect new variants and unseen viruses at lower false positive rates, which has paved a new way for anti-virus research.
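The core immune mechanism behind such detectors, negative selection, is easy to sketch. In this simplified illustration (pattern length, substring matching and detector count are assumptions far removed from a production AIS), random byte patterns that match any "self" (clean) sample are discarded during training; any sample matched by a surviving detector is flagged as non-self, i.e. potentially viral, even if it was never seen before.

```python
# Negative-selection sketch: detectors are random 3-byte patterns that
# survive only if they match no clean ("self") sample; matching here is
# plain substring containment, a simplification of r-contiguous matching.
import random

def matches(detector, data):
    return detector in data

def train_detectors(self_set, n=500, length=3, seed=0):
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n:
        cand = bytes(rng.randrange(256) for _ in range(length))
        if not any(matches(cand, s) for s in self_set):
            detectors.append(cand)     # survives negative selection
    return detectors

def is_suspicious(sample, detectors):
    return any(matches(d, sample) for d in detectors)

clean = [b"normal program bytes", b"another benign file"]
detectors = train_detectors(clean)
print(is_suspicious(b"normal program bytes", detectors))  # False by construction
```

By construction no detector can fire on a self sample, so the false positive rate on the training set is zero; detecting actual malicious payloads depends on detector coverage of the non-self space.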

  3. Mobile Cloud Computing: A Review on Smartphone Augmentation Approaches

    CERN Document Server

    Abolfazli, Saeid; Gani, Abdullah

    2012-01-01

    Smartphones have recently gained significant popularity in heavy mobile processing while users are increasing their expectations toward rich computing experience. However, resource limitations and current mobile computing advancements hinder this vision. Therefore, resource-intensive application execution remains a challenging task in mobile computing that necessitates device augmentation. In this article, smartphone augmentation approaches are reviewed and classified in two main groups, namely hardware and software. Generating high-end hardware is a subset of hardware augmentation approaches, whereas conserving local resources and reducing resource requirements are grouped under software augmentation methods. Our study advocates that conserving smartphones' native resources, which is mainly done via task offloading, is more appropriate for already-developed applications than for new ones, due to the costly re-development process. Cloud computing has recently gained considerable ground as one of the major co...
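The offloading decision at the heart of the software augmentation approaches surveyed above reduces to a cost comparison: execute locally, or ship state to the cloud and execute remotely. A back-of-envelope sketch, where the linear cost model and all parameter names are illustrative assumptions rather than the article's own formulation:

```python
# Offload if remote execution time (compute + data transfer + round trip)
# beats local execution time. All parameters are assumed for illustration:
# cycles (CPU cycles), speeds (cycles/s), data_bytes, bandwidth (bytes/s).
def offload_beneficial(cycles, local_speed, cloud_speed,
                       data_bytes, bandwidth, rtt=0.05):
    """True if offloading to the cloud is expected to be faster."""
    t_local = cycles / local_speed
    t_remote = cycles / cloud_speed + data_bytes / bandwidth + rtt
    return t_remote < t_local

# A compute-heavy task with little state to ship: offloading wins
# (5 s locally vs. 0.5 s compute + 0.1 s transfer + 0.05 s RTT remotely).
print(offload_beneficial(cycles=5e9, local_speed=1e9, cloud_speed=1e10,
                         data_bytes=1e5, bandwidth=1e6))   # True
```

The same model shows why a cheap task with a large working set should stay on the device: the transfer term dominates. An energy-oriented variant would weight the same terms by power draw instead of time.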

  4. Advanced computational modelling for drying processes – A review

    International Nuclear Information System (INIS)

    Highlights: • Understanding the product dehydration process is a key aspect in drying technology. • Advanced modelling thereof plays an increasingly important role for developing next-generation drying technology. • Dehydration modelling should be more energy-oriented. • An integrated “nexus” modelling approach is needed to produce more energy-smart products. • Multi-objective process optimisation requires development of more complete multiphysics models. - Abstract: Drying is one of the most complex and energy-consuming chemical unit operations. R and D efforts in drying technology have skyrocketed in the past decades, as new drivers emerged in this industry next to procuring prime product quality and high throughput, namely reduction of energy consumption and carbon footprint as well as improving food safety and security. Solutions are sought in optimising existing technologies or developing new ones which increase energy and resource efficiency, use renewable energy, recuperate waste heat and reduce product loss, and thus also the embodied energy therein. Novel tools are required to push such technological innovations and their subsequent implementation. Computer-aided drying process engineering in particular has large potential for developing next-generation drying technology, including more energy-smart and environmentally-friendly products and dryer systems. This review paper deals with rapidly emerging advanced computational methods for modelling dehydration of porous materials, particularly foods. Drying is approached as a combined multiphysics, multiscale and multiphase problem. These advanced methods include computational fluid dynamics, several multiphysics modelling methods (e.g. conjugate modelling), multiscale modelling and modelling of material properties and the associated propagation of material property variability. Apart from the current challenges for each of these, future perspectives should be directed towards material property
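The simplest member of the model family this review covers is 1-D moisture diffusion in a food slab (Fick's second law), which can be solved with an explicit finite-difference scheme in a few lines. Diffusivity, geometry and boundary values below are illustrative assumptions, not data from the review:

```python
# 1-D moisture diffusion in a slab: dM/dt = D * d2M/dx2, explicit FTCS
# scheme with a zero-flux (symmetry) centre and a fixed surface value.
# D, L and the moisture values are assumed for illustration only.
D = 1e-9        # moisture diffusivity, m^2/s (assumed)
L = 5e-3        # slab half-thickness, m
N = 50          # grid cells
dx = L / N
dt = 0.4 * dx * dx / D          # respects the stability limit dt <= dx^2/(2D)

M = [0.8] * (N + 1)             # initial moisture content (dry basis)
M[N] = 0.1                      # surface in equilibrium with the drying air

for _ in range(20000):          # march forward in time
    new = M[:]
    for i in range(1, N):
        new[i] = M[i] + D * dt / dx**2 * (M[i+1] - 2*M[i] + M[i-1])
    new[0] = new[1]             # symmetry (zero-flux) at the slab centre
    M = new

print(round(M[0], 3))           # centre moisture, well below the initial 0.8
```

A full drying model couples this moisture equation to heat transfer and shrinkage (the multiphysics aspect) and resolves the airflow around the product with CFD (the conjugate aspect); the sketch above is only the innermost building block.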

  5. Science based integrated approach to advanced nuclear fuel development - vision, approach, and overview

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory; Pasamehmetoglu, Kemal [IDAHO NATIONAL LAB; Carmack, Jon [IDAHO NATIONAL LAB

    2010-01-01

    Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model nuclear fuel systems is critical. In order to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. The purpose of this paper is to identify the modeling and simulation approach needed to deliver predictive tools for advanced fuels development. The coordination between experimental nuclear fuel design and development technical experts and computational fuel modeling and simulation technical experts is a critical aspect of the approach; it naturally leads to an integrated, goal-oriented, science-based R & D approach and strengthens both the experimental and computational efforts. The Advanced Fuels Campaign (AFC) and Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Integrated Performance and Safety Code (IPSC) are working together to determine experimental data and modeling needs. The primary objective of the NEAMS fuels IPSC project is to deliver a coupled, three-dimensional, predictive computational platform for modeling the fabrication and both normal and abnormal operation of nuclear fuel pins and assemblies, applicable to both existing and future reactor fuel designs. The science based program is pursuing the development of an integrated multi-scale and multi-physics modeling and simulation platform for nuclear fuels. This overview paper discusses the vision, goals, and approaches for developing and implementing the new approach.

  6. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2007-01-01

    The era of seemingly unlimited growth in processor performance is over: single chip architectures can no longer overcome the performance limitations imposed by the power they consume and the heat they generate. Today, Intel and other semiconductor firms are abandoning the single fast processor model in favor of multi-core microprocessors--chips that combine two or more processors in a single package. In the fourth edition of Computer Architecture, the authors focus on this historic shift, increasing their coverage of multiprocessors and exploring the most effective ways of achieving parallelis

  7. Learning and geometry computational approaches

    CERN Document Server

    Smith, Carl

    1996-01-01

    The field of computational learning theory arose out of the desire to formally understand the process of learning. As potential applications to artificial intelligence became apparent, the new field grew rapidly. The learning of geometric objects became a natural area of study. The possibility of using learning techniques to compensate for unsolvability provided an attraction for individuals with an immediate need to solve such difficult problems. Researchers at the Center for Night Vision were interested in solving the problem of interpreting data produced by a variety of sensors. Current vision techniques, which have a strong geometric component, can be used to extract features. However, these techniques fall short of useful recognition of the sensed objects. One potential solution is to incorporate learning techniques into the geometric manipulation of sensor data. As a first step toward realizing such a solution, the Systems Research Center at the University of Maryland, in conjunction with the C...

  8. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an

  9. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements in big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and applications to handling real life problems. The applications are mostly taken from real life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  10. UNEDF: Advanced Scientific Computing Transforms the Low-Energy Nuclear Many-Body Problem

    CERN Document Server

    Stoitsov, M; Nazarewicz, W; Bulgac, A; Hagen, G; Kortelainen, M; Pei, J C; Roche, K J; Schunck, N; Thompson, I; Vary, J P; Wild, S M

    2011-01-01

    The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper illustrates significant milestones accomplished by UNEDF through integration of the theoretical approaches, advanced numerical algorithms, and leadership class computational resources.

  11. Advanced Technologies, Embedded and Multimedia for Human-Centric Computing

    CERN Document Server

    Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013

    2014-01-01

    The themes of HumanCom and EMC are focused on the various aspects of human-centric computing for advances in computer science and its applications, and on embedded and multimedia computing, and they provide an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. The theme of EMC (Advances in Embedded and Multimedia Computing) is focused on the various aspects of embedded systems, smart grid, cloud and multimedia computing, and it provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of embedded and multimedia computing. Therefore this book includes various theories and practical applications in human-centric computing and embedded and multimedia computing.

  12. The ACP (Advanced Computer Program) multiprocessor system at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Case, G.; Cook, A.; Fischler, M.; Gaines, I.; Hance, R.; Husby, D.

    1986-09-01

    The Advanced Computer Program at Fermilab has developed a multiprocessor system which is easy to use and uniquely cost effective for many high energy physics problems. The system is based on single board computers which cost under $2000 each to build including 2 Mbytes of on board memory. These standard VME modules each run experiment reconstruction code in Fortran at speeds approaching that of a VAX 11/780. Two versions have been developed: one uses Motorola's 68020 32 bit microprocessor, the other runs with AT&T's 32100. Both include the corresponding floating point coprocessor chip. The first system, when fully configured, uses 70 each of the two types of processors. A 53 processor system has been operated for several months with essentially no down time by computer operators in the Fermilab Computer Center, performing at nearly the capacity of 6 CDC Cyber 175 mainframe computers. The VME crates in which the processing "nodes" sit are connected via a high speed "Branch Bus" to one or more MicroVAX computers which act as hosts handling system resource management and all I/O in offline applications. An interface from Fastbus to the Branch Bus has been developed for online use which has been tested error free at 20 Mbytes/sec for 48 hours. ACP hardware modules are now available commercially. A major package of software, including a simulator that runs on any VAX, has been developed. It allows easy migration of existing programs to this multiprocessor environment. This paper describes the ACP Multiprocessor System and early experience with it at Fermilab and elsewhere.

  13. The ACP [Advanced Computer Program] multiprocessor system at Fermilab

    International Nuclear Information System (INIS)

    The Advanced Computer Program at Fermilab has developed a multiprocessor system which is easy to use and uniquely cost effective for many high energy physics problems. The system is based on single board computers which cost under $2000 each to build, including 2 Mbytes of on-board memory. These standard VME modules each run experiment reconstruction code in Fortran at speeds approaching that of a VAX 11/780. Two versions have been developed: one uses Motorola's 68020 32-bit microprocessor, the other runs with AT&T's 32100. Both include the corresponding floating point coprocessor chip. The first system, when fully configured, uses 70 each of the two types of processors. A 53 processor system has been operated for several months with essentially no down time by computer operators in the Fermilab Computer Center, performing at nearly the capacity of 6 CDC Cyber 175 mainframe computers. The VME crates in which the processing ''nodes'' sit are connected via a high speed ''Branch Bus'' to one or more MicroVAX computers which act as hosts handling system resource management and all I/O in offline applications. An interface from Fastbus to the Branch Bus has been developed for online use which has been tested error-free at 20 Mbytes/sec for 48 hours. ACP hardware modules are now available commercially. A major package of software, including a simulator that runs on any VAX, has been developed. It allows easy migration of existing programs to this multiprocessor environment. This paper describes the ACP Multiprocessor System and early experience with it at Fermilab and elsewhere.

  14. Transonic wing analysis using advanced computational methods

    Science.gov (United States)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  15. Second International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Konar, Amit; Chakraborty, Aruna

    2014-01-01

    Advanced Computing, Networking and Informatics are three distinct disciplines of knowledge, yet their convergence is observed in many real-world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, and pervasive computing, among many others. These two-volume proceedings explore the combined use of Advanced Computing and Informatics in next generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 148 scholarly papers, which were accepted for presentation from over 640 submissions at the Second International Conference on Advanced Computing, Networking and Informatics, 2014, held in Kolkata, India, during June 24-26, 2014. The first volume includes innovative computing techniques and relevant research results in informatics with selective applications in pattern recognition, signal/image process...

  16. GRID COMPUTING AND FAULT TOLERANCE APPROACH

    Directory of Open Access Journals (Sweden)

    Pankaj Gupta,

    2011-10-01

    Full Text Available Grid computing is a means of allocating the computational power of a large number of computers to a single complex computation or problem. Grid computing is a distributed computing paradigm that differs from traditional distributed computing in that it is aimed toward large-scale systems that even span organizational boundaries. This paper proposes a method to achieve maximum fault tolerance in the Grid environment, combining reliability considerations with a replication approach and a checkpoint approach. Fault tolerance is an important property for large-scale computational grid systems, where geographically distributed nodes co-operate to execute a task. In order to achieve a high level of reliability and availability, the grid infrastructure should be fault tolerant. Since the failure of resources affects job execution fatally, a fault tolerance service is essential to satisfy QoS requirements in grid computing. Commonly utilized techniques for providing fault tolerance are job checkpointing and replication. Both techniques mitigate the amount of work lost due to changing system availability but can introduce significant runtime overhead. The latter largely depends on the length of the checkpointing interval and the chosen number of replicas, respectively. In the case of complex scientific workflows, where tasks must execute in a well-defined order, reliability is another major challenge because of the unreliable nature of grid resources.
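
The checkpointing technique the paper builds on can be sketched in a few lines. The Python toy below (all names are ours, not from the paper) simulates an unreliable node and shows the key property: on a failure, only the work done since the last checkpoint is lost and redone:

```python
import random

def run_with_checkpoints(work_items, interval, fail_prob=0.3, seed=42):
    """Run jobs on a simulated unreliable node, checkpointing results
    every `interval` items; on a failure, resume from the last checkpoint.
    Returns (results, number_of_restarts)."""
    rng = random.Random(seed)
    checkpoint = 0          # index of the first item not yet made durable
    results = []            # durable results, saved at each checkpoint
    restarts = 0
    while checkpoint < len(work_items):
        pending = []        # work done since the last checkpoint (volatile)
        try:
            for i in range(checkpoint, len(work_items)):
                if rng.random() < fail_prob:
                    raise RuntimeError("node failure")   # simulated outage
                pending.append(work_items[i] ** 2)       # the 'job' itself
                if len(pending) == interval:             # checkpoint reached
                    results.extend(pending)
                    checkpoint += len(pending)
                    pending = []
            results.extend(pending)                      # final partial batch
            checkpoint = len(work_items)
        except RuntimeError:
            restarts += 1    # only `pending` is lost; checkpoints survive
    return results, restarts
```

Shrinking `interval` reduces lost work per failure but raises checkpointing overhead, which is exactly the trade-off the abstract describes.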

  17. Advances in Computational Techniques to Study GPCR-Ligand Recognition.

    Science.gov (United States)

    Ciancetta, Antonella; Sabbadin, Davide; Federico, Stephanie; Spalluto, Giampiero; Moro, Stefano

    2015-12-01

    G-protein-coupled receptors (GPCRs) are among the most intensely investigated drug targets. The recent revolutions in protein engineering and molecular modeling algorithms have overturned the research paradigm in the GPCR field. While the numerous ligand-bound X-ray structures determined have provided invaluable insights into GPCR structure and function, the development of algorithms exploiting graphics processing units (GPUs) has made the simulation of GPCRs in explicit lipid-water environments feasible within reasonable computation times. In this review we present a survey of the recent advances in structure-based drug design approaches with a particular emphasis on the elucidation of the ligand recognition process in class A GPCRs by means of membrane molecular dynamics (MD) simulations. PMID:26538318

  18. Computational biomechanics for medicine new approaches and new applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2015-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises twelve of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, France, Spain and Switzerland. Some of the interesting topics discussed are: real-time simulations; growth and remodelling of soft tissues; inverse and meshless solutions; medical image analysis; and patient-specific solid mechanics simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  19. Advances in Future Computer and Control Systems v.2

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems (FCCS2012)

    2012-01-01

    FCCS2012 is an integrated conference concentrating its focus on Future Computer and Control Systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, including recent research results on Future Computer and Control Systems from researchers all around the world.

  20. Advances in Future Computer and Control Systems v.1

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems (FCCS2012)

    2012-01-01

    FCCS2012 is an integrated conference concentrating its focus on Future Computer and Control Systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, including recent research results on Future Computer and Control Systems from researchers all around the world.

  1. Computing Algorithms for Nuffield Advanced Physics.

    Science.gov (United States)

    Summers, M. K.

    1978-01-01

    Defines all recurrence relations used in the Nuffield course, to solve first- and second-order differential equations, and describes a typical algorithm for computer generation of solutions. (Author/GA)
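
A typical recurrence of this kind, here the central-difference scheme for a second-order equation such as simple harmonic motion (our illustrative choice; the abstract does not list the course's specific algorithms), can be sketched as:

```python
def solve_shm(x0, v0, omega, h, steps):
    """Solve x'' = -omega^2 * x numerically via the standard
    central-difference recurrence:
        x_{n+1} = 2*x_n - x_{n-1} + h^2 * a(x_n)
    Returns the list [x_0, x_1, ..., x_steps] at times t_n = n*h."""
    a = lambda x: -omega ** 2 * x
    # Start-up step from a second-order Taylor expansion, since the
    # recurrence needs two previous values to get going.
    xs = [x0, x0 + h * v0 + 0.5 * h * h * a(x0)]
    for _ in range(steps - 1):
        xs.append(2 * xs[-1] - xs[-2] + h * h * a(xs[-1]))
    return xs
```

With x(0) = 1, v(0) = 0 and omega = 1 the exact solution is cos(t), so the recurrence can be checked directly against it.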

  2. Power-efficient computer architectures recent advances

    CERN Document Server

    Själander, Magnus; Kaxiras, Stefanos

    2014-01-01

    As Moore's Law and Dennard scaling trends have slowed, the challenges of building high-performance computer architectures while maintaining acceptable power efficiency levels have heightened. Over the past ten years, architecture techniques for power efficiency have shifted from primarily focusing on module-level efficiencies, toward more holistic design styles based on parallelism and heterogeneity. This work highlights and synthesizes recent techniques and trends in power-efficient computer architecture.Table of Contents: Introduction / Voltage and Frequency Management / Heterogeneity and Sp

  3. Advanced Metamorphic Techniques in Computer Viruses

    OpenAIRE

    Beaucamps, Philippe

    2007-01-01

    Nowadays viruses use polymorphic techniques to mutate their code on each replication, thus evading detection by antiviruses. However, detection by emulation can defeat simple polymorphism; thus metamorphic techniques are used, which thoroughly change the viral code, even after decryption. We briefly detail this evolution of virus protection techniques against detection and then study the MetaPHOR virus, today's most advanced metamorphic virus.

  4. Advances in computing, and their impact on scientific computing.

    Science.gov (United States)

    Giles, Mike

    2002-01-01

    This paper begins by discussing the developments and trends in computer hardware, starting with the basic components (microprocessors, memory, disks, system interconnect, networking and visualization) before looking at complete systems (death of vector supercomputing, slow demise of large shared-memory systems, rapid growth in very large clusters of PCs). It then considers the software side, the relative maturity of shared-memory (OpenMP) and distributed-memory (MPI) programming environments, and new developments in 'grid computing'. Finally, it touches on the increasing importance of software packages in scientific computing, and the increased importance and difficulty of introducing good software engineering practices into very large academic software development projects. PMID:12539947

  5. Preface: Special issue: ten years of advances in computer entertainment

    NARCIS (Netherlands)

    Katayose, Haruhiro; Reidsma, Dennis; Rauterberg, M

    2014-01-01

    This special issue celebrates the 10th edition of the International Conference on Advances in Computer Entertainment (ACE) by collecting six selected and revised papers from among this year’s accepted contributions.

  6. Advanced Approach of Multiagent Based Buoy Communication

    Directory of Open Access Journals (Sweden)

    Gediminas Gricius

    2015-01-01

    Full Text Available Usually, a hydrometeorological information system is faced with great data flows, but the data levels are often excessive, depending on the observed region of the water. The paper presents advanced buoy communication technologies based on multiagent interaction and data exchange between several monitoring system nodes. The proposed management of buoy communication is based on a clustering algorithm, which enables the performance of the hydrometeorological information system to be enhanced. The experiment is based on the design and analysis of an inexpensive but reliable Baltic Sea autonomous monitoring network (buoys), which would be able to continuously monitor and collect temperature, waviness, and other required data. The proposed approach of multiagent based buoy communication enables all the data from the coastal-based station to be monitored with limited transition speed by setting different tasks for the agent-based buoy system according to the clustering information.
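
The clustering-based management the abstract mentions can be illustrated with the simplest such algorithm. The abstract does not name the specific clustering method, so the k-means sketch below is only indicative: buoy positions are grouped so that each cluster could, for example, elect a head that relays its group's data:

```python
def kmeans(points, k, iters=20):
    """Minimal k-means over 2-D buoy coordinates (illustrative only).
    Returns (cluster_centers, cluster_member_lists)."""
    centers = points[:k]                  # naive initialisation: first k points
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each buoy joins its nearest cluster center.
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            groups[j].append(p)
        # Update step: move each center to the mean of its members.
        centers = [
            (sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))
            if g else centers[j]
            for j, g in enumerate(groups)
        ]
    return centers, groups
```

On two well-separated groups of buoys the algorithm recovers the obvious partition within a couple of iterations.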

  7. 3rd International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Chaki, Nabendu

    2016-01-01

    Advanced Computing, Networking and Informatics are three distinct disciplines of knowledge, yet their convergence is observed in many real-world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, and pervasive computing, among many others. These two-volume proceedings explore the combined use of Advanced Computing and Informatics in next generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 132 scholarly articles, which were accepted for presentation from over 550 submissions at the Third International Conference on Advanced Computing, Networking and Informatics, 2015, held in Bhubaneswar, India during June 23–25, 2015.

  8. Advances in Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), held in Zhuhai, China, November 19-20, 2011. Topics covered in this volume include signal and image processing, speech and audio processing, video processing and analysis, artificial intelligence, computing and intelligent systems, machine learning, sensor and neural networks, knowledge discovery and data mining, fuzzy mathematics and applications, knowledge-based systems, hybrid systems modeling and design, risk analysis and management, and system modeling and simulation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find it stimulating in the process.

  9. Parallel Computation in Econometrics: A Simplified Approach

    OpenAIRE

    Jurgen A. Doornik; Shephard, Neil; Hendry, David F.

    2004-01-01

    Parallel computation has a long history in econometric computing, but is not at all widespread. We believe that a major impediment is the labour cost of coding for parallel architectures. Moreover, programs for specific hardware often become obsolete quite quickly. Our approach is to take a popular matrix programming language (Ox), and implement a message-passing interface using MPI. Next, object-oriented programming allows us to hide the specific parallelization code, so that a program does...
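
The paper's idea, hiding the parallelization code behind an object-oriented interface so that the same user program runs serially or in parallel, can be sketched as follows. This is a Python stand-in for the authors' Ox/MPI implementation; the class and method names are ours, and a thread pool replaces MPI:

```python
from concurrent.futures import ThreadPoolExecutor

class ParallelLoop:
    """Hide the parallel machinery behind one object: user code calls
    .map() and never touches the backend, so the same program runs
    serially (workers=None) or in parallel without modification."""

    def __init__(self, workers=None):
        self.workers = workers            # None => plain serial execution

    def map(self, func, items):
        if self.workers is None:
            return [func(x) for x in items]        # serial fallback
        with ThreadPoolExecutor(max_workers=self.workers) as pool:
            return list(pool.map(func, items))     # order-preserving
```

Swapping the backend (threads, processes, or MPI ranks) would change only this class, not the econometric code that uses it, which is the labour-cost saving the paper argues for.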

  10. A Big Data Approach to Computational Creativity

    CERN Document Server

    Varshney, Lav R; Varshney, Kush R; Bhattacharjya, Debarun; Schoergendorfer, Angela; Chee, Yi-Min

    2013-01-01

    Computational creativity is an emerging branch of artificial intelligence that places computers in the center of the creative process. Broadly, creativity involves a generative step to produce many ideas and a selective step to determine the ones that are the best. Many previous attempts at computational creativity, however, have not been able to achieve a valid selective step. This work shows how bringing data sources from the creative domain and from hedonic psychophysics together with big data analytics techniques can overcome this shortcoming to yield a system that can produce novel and high-quality creative artifacts. Our data-driven approach is demonstrated through a computational creativity system for culinary recipes and menus we developed and deployed, which can operate either autonomously or semi-autonomously with human interaction. We also comment on the volume, velocity, variety, and veracity of data in computational creativity.
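
The generative/selective loop the abstract describes can be sketched in miniature. The ingredient domain and the scoring function below are stand-ins, not the authors' system, which learns its quality measures from data:

```python
import itertools
import random

def generate_and_select(ingredients, score, n_best=3, size=3, seed=0):
    """Two-step creativity loop: a generative step enumerates candidate
    'recipes' (ingredient combinations), a selective step ranks them by
    a quality/novelty score and keeps the best few."""
    # Generative step: produce many candidate artifacts.
    candidates = list(itertools.combinations(sorted(ingredients), size))
    random.Random(seed).shuffle(candidates)   # generation order is arbitrary
    # Selective step: rank by the (stand-in) quality model, keep the best.
    return sorted(candidates, key=score, reverse=True)[:n_best]
```

The abstract's point is that the selective step is the hard part; here a toy score takes its place, whereas the deployed system scores candidates with data-driven models of hedonic psychophysics.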

  11. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
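
The basic computation behind such methods, estimating a failure probability from a limit state, can be illustrated with crude Monte Carlo sampling, the brute-force baseline that the advanced methods surveyed in the paper are designed to outperform:

```python
import random

def failure_probability(mu_r, sd_r, mu_s, sd_s, n=100_000, seed=1):
    """Crude Monte Carlo estimate of P[g < 0] for the classic limit
    state g = R - S (resistance minus load), with R and S independent
    normal random variables. Failure occurs when load exceeds resistance."""
    rng = random.Random(seed)
    fails = sum(
        1 for _ in range(n)
        if rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0
    )
    return fails / n
```

For rare failures this converges slowly, which is precisely why importance sampling and the fast probability-integration methods discussed in the paper were developed.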

  12. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  13. Advanced Safeguards Approaches for New Fast Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Durst, Philip C.; Therios, Ike; Bean, Robert; Dougan, A.; Boyer, Brian; Wallace, Rick L.; Ehinger, Michael H.; Kovacic, Don N.; Tolk, K.

    2007-12-15

    This third report in the series reviews possible safeguards approaches for new fast reactors in general, and the ABR in particular. Fast-neutron spectrum reactors have been used since the early 1960s on an experimental and developmental level, generally with fertile blanket fuels to “breed” nuclear fuel such as plutonium. Whether the reactor is designed to breed plutonium, or transmute and “burn” actinides depends mainly on the design of the reactor neutron reflector and whether the blanket fuel is “fertile” or suitable for transmutation. However, the safeguards issues are very similar, since they pertain mainly to the receipt, shipment and storage of fresh and spent plutonium and actinide-bearing “TRU”-fuel. For these reasons, the design of existing fast reactors and details concerning how they have been safeguarded were studied in developing advanced safeguards approaches for the new fast reactors. In this regard, the design of the Experimental Breeder Reactor-II “EBR-II” at the Idaho National Laboratory (INL) was of interest, because it was designed as a collocated fast reactor with a pyrometallurgical reprocessing and fuel fabrication line – a design option being considered for the ABR. Similarly, the design of the Fast Flux Test Facility (FFTF) on the Hanford Site was studied, because it was a successful prototype fast reactor that ran for two decades to evaluate fuels and the design for commercial-scale fast reactors.

  14. Uncertainty quantification approaches for advanced reactor analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The Commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
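
The 95/95 criterion mentioned above is commonly met via the first-order Wilks formula, which fixes the number of best-estimate code runs required so that the largest observed output bounds the 95th percentile with 95% confidence. A minimal sketch (the report compares several methodologies; this is just the simplest nonparametric one):

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """First-order Wilks formula: smallest number of random code runs n
    such that the maximum observed value bounds the `coverage` quantile
    of the output with the given confidence, i.e. the smallest n with
    1 - coverage**n >= confidence."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n
```

For the standard 95/95 case this gives the well-known answer of 59 runs, independent of the number of uncertain input parameters.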

  15. Advances in Computing and Information Technology : Proceedings of the Second International Conference on Advances in Computing and Information Technology

    CERN Document Server

    Nagamalai, Dhinaharan; Chaki, Nabendu

    2013-01-01

    The international conference on Advances in Computing and Information Technology (ACITY 2012) provides an excellent international forum for both academics and professionals for sharing knowledge and results in theory, methodology and applications of Computer Science and Information Technology. The Second International Conference on Advances in Computing and Information Technology (ACITY 2012), held in Chennai, India, during July 13-15, 2012, covered a number of topics in all major fields of Computer Science and Information Technology including: networking and communications, network security and applications, web and internet computing, ubiquitous computing, algorithms, bioinformatics, digital image processing and pattern recognition, artificial intelligence, soft computing and applications. Following a strict review process, a number of high-quality papers, presenting not only innovative ideas but also a well-founded evaluation and strong argumentation of the same, were selected and collected in the present proceedings, ...

  16. Advances in Computer Science and Education

    CERN Document Server

    Huang, Xiong

    2012-01-01

    CSE2011 is an integrated conference concentrating its focus on computer science and education. The proceedings present research on computer science and education from researchers all around the world, and their main role is to serve as an exchange platform for researchers working in the mentioned fields. In order to meet the high quality standards of the Springer AISC series, the organization committee made the following efforts. Firstly, papers of poor quality were rejected after review by anonymous referee experts. Secondly, periodic review meetings were held with the reviewers about five times to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  17. Advances in Computer-Based Autoantibodies Analysis

    Science.gov (United States)

    Soda, Paolo; Iannello, Giulio

    Indirect Immunofluorescence (IIF) imaging is the recommended method to detect autoantibodies in patient serum, whose common markers are antinuclear autoantibodies (ANA) and autoantibodies directed against double strand DNA (anti-dsDNA). Since the availability of accurately performed and correctly reported laboratory determinations is crucial for the clinicians, an evident medical demand is the development of Computer Aided Diagnosis (CAD) tools supporting physicians' decisions.

  18. Advances in Neurotechnology for Brain Computer Interfaces

    OpenAIRE

    Fazli, Siamac

    2011-01-01

    Brain-computer interfaces have attracted enormous scientific interest over the last 10 years. On closer inspection, however, this exciting technology still reveals several hurdles that have so far prevented the development of applications suitable for a mass market: among others, the long preparation time of a BCI system, the lack of control available to some users, and the non-stationarities within a recording. This dissertation introduces a ...

  19. Recent Advances in Computer Engineering and Applications

    OpenAIRE

    Manoj Jha, Stephen Lagakos, Leonid Perlovsky; Covaci, Brindusa; Nikos Mastorakis, Azami Zaharim

    2010-01-01

    This year the 4th WSEAS International Conference on COMPUTER ENGINEERING and APPLICATIONS (CEA '10) was held at Harvard University, Cambridge, USA, January 27-29, 2010. The conference remains faithful to its original idea of providing a platform to discuss network architecture, network design software, mobile networks and mobile services, digital broadcasting, e-commerce, optical networks, hacking, Trojan horses, viruses, worms, spam, information security, standards of information security: ...

  20. Extending the horizons advances in computing, optimization, and decision technologies

    CERN Document Server

    Joseph, Anito; Mehrotra, Anuj; Trick, Michael

    2007-01-01

    Computer Science and Operations Research continue to have a synergistic relationship and this book represents the results of cross-fertilization between OR/MS and CS/AI. It is this interface of OR/CS that makes possible advances that could not have been achieved in isolation. Taken collectively, these articles are indicative of the state-of-the-art in the interface between OR/MS and CS/AI and of the high caliber of research being conducted by members of the INFORMS Computing Society. EXTENDING THE HORIZONS: Advances in Computing, Optimization, and Decision Technologies is a volume that presents the latest, leading research in the design and analysis of algorithms, computational optimization, heuristic search and learning, modeling languages, parallel and distributed computing, simulation, computational logic and visualization. This volume also emphasizes a variety of novel applications in the interface of CS, AI, and OR/MS.

  1. Proceedings of International Conference on Advances in Computing

    CERN Document Server

    R, Selvarani; Kumar, T

    2012-01-01

    This is the first International Conference on Advances in Computing (ICAdC-2012). The scope of the conference includes all the areas of Theoretical Computer Science, Systems and Software, and Intelligent Systems. The conference proceedings are a culmination of research results, papers and theory related to all three major areas of computing, i.e., Theoretical Computer Science, Systems and Software, and Intelligent Systems. They help budding researchers and graduates in the areas of Computer Science, Information Science, Electronics, Telecommunication, Instrumentation, and Networking to take their research work forward, building on the reviewed results in the papers and interacting with the authors through the e-mail contacts given in the proceedings.

  2. Advances in computational fluid dynamics solvers for modern computing environments

    Science.gov (United States)

    Hertenstein, Daniel; Humphrey, John R.; Paolini, Aaron L.; Kelmelis, Eric J.

    2013-05-01

    EM Photonics has been investigating the application of massively multicore processors to a key problem area: Computational Fluid Dynamics (CFD). While the capabilities of CFD solvers have continually increased and improved to support features such as moving bodies and adjoint-based mesh adaptation, the software architecture has often lagged behind. This has led to poor scaling as core counts reach the tens of thousands. In the modern High Performance Computing (HPC) world, clusters with hundreds of thousands of cores are becoming the standard. In addition, accelerator devices such as NVIDIA GPUs and Intel Xeon Phi are being installed in many new systems. It is important for CFD solvers to take advantage of the new hardware as the computations involved are well suited for the massively multicore architecture. In our work, we demonstrate that new features in NVIDIA GPUs are able to empower existing CFD solvers by example using AVUS, a CFD solver developed by the Air Force Research Laboratory (AFRL) and the Volcanic Ash Advisory Center (VAAC). The effort has resulted in increased performance and scalability without sacrificing accuracy. There are many well-known codes in the CFD space that can benefit from this work, such as FUN3D, OVERFLOW, and TetrUSS. Such codes are widely used in the commercial, government, and defense sectors.
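
One reason CFD maps so well onto massively multicore hardware is that stencil updates are independent per grid point. A minimal Jacobi relaxation sweep for the 2-D Laplace equation illustrates the pattern (purely illustrative; it is unrelated to the AVUS code itself):

```python
def jacobi_step(grid):
    """One Jacobi sweep for the 2-D Laplace equation on a rectangular
    grid (list of lists of floats). Every interior point is replaced by
    the average of its four neighbours, reading only the old grid, so
    all updates are independent: the data-parallel pattern GPUs excel at.
    Boundary values are held fixed."""
    n, m = len(grid), len(grid[0])
    new = [row[:] for row in grid]        # old values stay read-only
    for i in range(1, n - 1):
        for j in range(1, m - 1):
            new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j]
                                + grid[i][j - 1] + grid[i][j + 1])
    return new
```

On a GPU each interior point would be one thread; here the nested loops stand in for that grid of threads.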

  3. Intelligent Software Tools for Advanced Computing

    Energy Technology Data Exchange (ETDEWEB)

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  4. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of the hybrid soft computing methodology in general with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis,  (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters enhance the exhibition of the hybrid soft computing paradigm in the field of intelligent computing.

  5. Advances in optimal routing through computer networks

    Science.gov (United States)

    Paz, I. M.

    1977-01-01

    The optimal routing problem is defined. Progress in solving the problem during the previous decade is reviewed, with special emphasis on technical developments made during the last few years. The relationships between the routing, the throughput, and the switching technology used are discussed and their future trends are reviewed. Economic aspects are also briefly considered. Modern technical approaches for handling the routing problems and, more generally, the flow control problems are reviewed.
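
    The core of most routing formulations of this era is a shortest-path computation over link costs (delay, in the simplest throughput-oriented model). A standard Dijkstra sketch, with an invented toy topology:

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra's algorithm -- the classic building block of
    minimum-delay routing in computer networks.
    graph: node -> list of (neighbour, link cost) pairs."""
    dist = {src: 0}
    prev = {}
    heap = [(0, src)]
    seen = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in seen:
            continue
        seen.add(u)
        if u == dst:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # Walk predecessors back from the destination (assumes dst reachable).
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return dist[dst], path[::-1]

# Hypothetical four-node network; link weights model per-hop delay.
net = {'A': [('B', 1), ('C', 4)],
       'B': [('C', 1), ('D', 5)],
       'C': [('D', 1)]}
```

    Adaptive routing schemes of the period differ mainly in how and how often these link costs are measured and redistributed, not in the underlying shortest-path machinery.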

  6. Towards a Resource Reservation Approach for an Opportunistic Computing Environment

    International Nuclear Information System (INIS)

    Advance reservation has been used in grid environments to provide quality of service (QoS) and to guarantee that resources are available at execution time. However, in grid subtypes such as opportunistic grid computing, providing QoS and guaranteeing resource availability is a challenge. In this article, we propose a new advance reservation approach that offers users the possibility of selecting resources in advance for future utilization. The main goal of this proposal is to offer a best-effort feature to users of an opportunistic configuration. In such environments it is normally not possible to provide QoS, because there are usually no guarantees of resource availability and, consequently, of the execution of user applications. In addition, this work provides a way to organize executions, which can improve scheduling and system operations. Experimental results, obtained through a case study, show the efficiency and relevance of our proposal

  7. Computational neuroscience for advancing artificial intelligence

    Directory of Open Access Journals (Sweden)

    Fernando P. Ponce

    2011-07-01

    Full Text Available Review of the book by Alonso, E. and Mondragón, E. (2011). Hershey, NY: Medical Information Science Reference. Neuroscience, as a discipline, pursues an understanding of the brain and its relation to the functioning of the mind through the analysis of the interaction of diverse physical, chemical and biological processes (Bassett & Gazzaniga, 2011). At the same time, numerous disciplines have progressively made significant contributions to this enterprise, among them mathematics, psychology and philosophy. As a product of this effort, complementary disciplines such as cognitive neuroscience, neuropsychology and computational neuroscience have appeared alongside traditional neuroscience (Bengio, 2007; Dayan & Abbott, 2005). It is in the context of computational neuroscience as a discipline complementary to traditional neuroscience that Alonso and Mondragón (2011) edit the book Computational Neuroscience for Advancing Artificial Intelligence: Models, Methods and Applications.

  8. Computational Approach To Understanding Autism Spectrum Disorders

    Directory of Open Access Journals (Sweden)

    Włodzisław Duch

    2012-01-01

    Full Text Available Every year the prevalence of Autism Spectrum Disorders (ASD) is rising. Is there a unifying mechanism of various ASD cases at the genetic, molecular, cellular or systems level? The hypothesis advanced in this paper is focused on neural dysfunctions that lead to problems with attention in autistic people. Simulations of attractor neural networks performing cognitive functions help to assess long-term system neurodynamics. The Fuzzy Symbolic Dynamics (FSD) technique is used for the visualization of attractors in the semantic layer of the neural model of reading. Large-scale simulation of brain structures characterized by a high order of complexity requires enormous computational power, especially if biologically motivated neuron models are used to investigate the influence of cellular structure dysfunctions on the network dynamics. Such simulations have to be implemented on computer clusters in grid-based architectures.
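
    The attractor dynamics referred to above can be illustrated with the simplest attractor network of all, a Hopfield model: stored patterns become fixed points, and the dynamics pull noisy states back to them. A minimal sketch (a stand-in for, not a reproduction of, the large-scale simulations described in the paper):

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian weight matrix for a Hopfield attractor network.
    patterns: array of shape (n_patterns, n_neurons) with +/-1 entries."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)          # no self-connections
    return W

def recall(W, state, steps=10):
    """Iterate the network; states flow toward a stored attractor."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = train_hopfield(patterns)
noisy = patterns[0].astype(float)
noisy[0] = -noisy[0]                # corrupt one bit
restored = recall(W, noisy)
```

    Attention-related dysfunctions in the paper's hypothesis correspond, loosely, to attractor basins that are too deep or too shallow, which in a model like this would show up as states that fail to settle, or fail to leave, a pattern.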

  9. Advanced Methods and Applications in Computational Intelligence

    CERN Document Server

    Nikodem, Jan; Jacak, Witold; Chaczko, Zenon; ACASE 2012

    2014-01-01

    This book offers an excellent presentation of intelligent engineering and informatics foundations for researchers in this field as well as many examples with industrial application. It contains extended versions of selected papers presented at the inaugural ACASE 2012 Conference dedicated to the Applications of Systems Engineering. This conference was held from the 6th to the 8th of February 2012, at the University of Technology, Sydney, Australia, organized by the University of Technology, Sydney (Australia), Wroclaw University of Technology (Poland) and the University of Applied Sciences in Hagenberg (Austria). The  book is organized into three main parts. Part I contains papers devoted to the heuristic approaches that are applicable in situations where the problem cannot be solved by exact methods, due to various characteristics or  dimensionality problems. Part II covers essential issues of the network management, presents intelligent models of the next generation of networks and distributed systems ...

  10. Advances in FDTD computational electrodynamics photonics and nanotechnology

    CERN Document Server

    Oskooi, Ardavan; Johnson, Steven G

    2013-01-01

    Advances in photonics and nanotechnology have the potential to revolutionize humanity's ability to communicate and compute. To pursue these advances, it is mandatory to understand and properly model interactions of light with materials such as silicon and gold at the nanoscale, i.e., the span of a few tens of atoms laid side by side. These interactions are governed by the fundamental Maxwell's equations of classical electrodynamics, supplemented by quantum electrodynamics. This book presents the current state-of-the-art in formulating and implementing computational models of these interactions. Maxwell's equations are solved using the finite-difference time-domain (FDTD) technique, pioneered by the senior editor, whose prior Artech books in this area are among the top ten most-cited in the history of engineering. You discover the most important advances in all areas of FDTD and PSTD computational modeling of electromagnetic wave interactions. This cutting-edge resource helps you understand the latest develo...

  11. Reliability of an Interactive Computer Program for Advance Care Planning

    OpenAIRE

    Schubart, Jane R.; Levi, Benjamin H.; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J.

    2012-01-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demon...

  12. The Advanced Computational Methods Center, University of Georgia

    OpenAIRE

    Nute, Donald; Covington, Michael; Rankin, Terry

    1986-01-01

    The Advanced Computational Methods Center (ACMC) established at the University of Georgia in 1984, supports several research projects in artificial intelligence. The primary goal of AI research at ACMC is the design and installation of a logic-programming environment with advanced natural language processing and knowledge-acquisition capabilities on the university's highly parallel CYBERPLUS system from Control Data Corporation. This article briefly describes current research projects in arti...

  13. Isogenies of Elliptic Curves: A Computational Approach

    CERN Document Server

    Shumow, Daniel

    2009-01-01

    Isogenies, the mappings of elliptic curves, have become a useful tool in cryptology. These mathematical objects have been proposed for use in computing pairings, constructing hash functions and random number generators, and analyzing the reducibility of the elliptic curve discrete logarithm problem. With such diverse uses, understanding these objects is important for anyone interested in the field of elliptic curve cryptography. This paper, targeted at an audience with a knowledge of the basic theory of elliptic curves, provides an introduction to the necessary theoretical background for understanding what isogenies are and their basic properties. This theoretical background is used to explain some of the basic computational tasks associated with isogenies. Herein, algorithms for computing isogenies are collected and presented with proofs of correctness and complexity analyses. As opposed to the complex analytic approach provided in most texts on the subject, the proofs in this paper are primarily algebraic i...
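
    As background for the computational tasks the paper collects, recall that an isogeny is a map between elliptic curves that respects the group law. That group law itself fits in a few lines; the sketch below uses a small toy curve over F_17 of the kind common in teaching materials (it shows the group structure isogenies preserve, not an isogeny-computation algorithm):

```python
def ec_add(P, Q, a, p):
    """Affine point addition on y^2 = x^3 + a*x + b over F_p.
    None represents the point at infinity (the group identity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                  # P + (-P) = O
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

# Toy curve y^2 = x^3 + 2x + 2 over F_17; G = (5, 1) generates a
# group of order 19.
a, b, p = 2, 2, 17
G = (5, 1)
```

    An isogeny phi between two curves satisfies phi(P + Q) = phi(P) + phi(Q) with respect to exactly this operation on each side; the algorithms the paper surveys compute such maps from a specification of their kernel.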

  14. 9th International Conference on Advanced Computing & Communication Technologies

    CERN Document Server

    Mandal, Jyotsna; Auluck, Nitin; Nagarajaram, H

    2016-01-01

    This book highlights a collection of high-quality peer-reviewed research papers presented at the Ninth International Conference on Advanced Computing & Communication Technologies (ICACCT-2015) held at Asia Pacific Institute of Information Technology, Panipat, India during 27–29 November 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry present their original work and exchange ideas, information, techniques and applications in the field of Advanced Computing and Communication Technology.

  15. UNEDF: Advanced Scientific Computing Collaboration Transforms the Low-Energy Nuclear Many-Body Problem

    CERN Document Server

    Nam, H; Nazarewicz, W; Bulgac, A; Hagen, G; Kortelainen, M; Maris, P; Pei, J C; Roche, K J; Schunck, N; Thompson, I; Vary, J P; Wild, S M

    2012-01-01

    The demands of cutting-edge science are driving the need for larger and faster computing resources. With the rapidly growing scale of computing systems and the prospect of technologically disruptive architectures to meet these needs, scientists face the challenge of effectively using complex computational resources to advance scientific discovery. Multidisciplinary collaborating networks of researchers with diverse scientific backgrounds are needed to address these complex challenges. The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper describes UNEDF and identifies attributes that classify it as a successful computational collaboration. We illustrate significant milestones accomplished by UNEDF through integrative solutions using the most reliable theoretical approaches, most advanced algorithms, and leadershi...

  16. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    Science.gov (United States)

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  17. Advanced quantum communications an engineering approach

    CERN Document Server

    Imre, Sandor

    2012-01-01

    The book provides an overview of the most advanced quantum informational geometric techniques, which can help quantum communication theorists analyze quantum channels, such as security or additivity properties. Each section addresses an area of major research of quantum information theory and quantum communication networks. The authors present the fundamental theoretical results of quantum information theory, while also presenting the details of advanced quantum communication protocols with clear mathematical and information theoretical background. This book bridges the gap between quantum ph

  18. Innovations and Advances in Computer, Information, Systems Sciences, and Engineering

    CERN Document Server

    Sobh, Tarek

    2013-01-01

    Innovations and Advances in Computer, Information, Systems Sciences, and Engineering includes the proceedings of the International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2011). The contents of this book are a set of rigorously reviewed, world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of  Industrial Electronics, Technology and Automation, Telecommunications and Networking, Systems, Computing Sciences and Software Engineering, Engineering Education, Instructional Technology, Assessment, and E-learning.

  19. Advances in computers dependable and secure systems engineering

    CERN Document Server

    Hurson, Ali

    2012-01-01

    Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. In-depth surveys and tutorials on new computer technology; well-known authors and researchers in the field; extensive bibliographies with m...

  20. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  1. [Activities of Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  2. Introducing Computational Approaches in Intermediate Mechanics

    Science.gov (United States)

    Cook, David M.

    2006-12-01

    In the winter of 2003, we at Lawrence University moved Lagrangian mechanics and rigid body dynamics from a required sophomore course to an elective junior/senior course, freeing 40% of the time for computational approaches to ordinary differential equations (trajectory problems, the large amplitude pendulum, non-linear dynamics); evaluation of integrals (finding centers of mass and moment of inertia tensors, calculating gravitational potentials for various sources); and finding eigenvalues and eigenvectors of matrices (diagonalizing the moment of inertia tensor, finding principal axes), and to generating graphical displays of computed results. Further, students begin to use LaTeX to prepare some of their submitted problem solutions. Placed in the middle of the sophomore year, this course provides the background that permits faculty members as appropriate to assign computer-based exercises in subsequent courses. Further, students are encouraged to use our Computational Physics Laboratory on their own initiative whenever that use seems appropriate. (Curricular development supported in part by the W. M. Keck Foundation, the National Science Foundation, and Lawrence University.)
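
    The large-amplitude pendulum named above is a standard first exercise in such a course: beyond the small-angle regime there is no elementary closed form, so the trajectory must be computed. A hand-rolled fourth-order Runge-Kutta sketch of one plausible version of the assignment (dimensionless units, g/l = 1; not taken from the Lawrence University materials):

```python
import numpy as np

def pendulum_rhs(state, g_over_l=1.0):
    """Full (large-amplitude) pendulum: theta'' = -(g/l) sin(theta)."""
    theta, omega = state
    return np.array([omega, -g_over_l * np.sin(theta)])

def rk4_step(f, state, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Release from rest at 170 degrees -- far outside the small-angle regime.
state = np.array([np.radians(170.0), 0.0])
dt = 0.001
energy0 = 1.0 - np.cos(state[0])          # dimensionless total energy
for _ in range(20000):                    # integrate to t = 20
    state = rk4_step(pendulum_rhs, state, dt)
energy = 0.5 * state[1] ** 2 + 1.0 - np.cos(state[0])
```

    A natural check for students is that the total energy stays constant over the run, which also exposes the step-size dependence of the integrator.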

  3. Predicting microbial interactions through computational approaches.

    Science.gov (United States)

    Li, Chenhao; Lim, Kun Ming Kenneth; Chng, Kern Rei; Nagarajan, Niranjan

    2016-06-01

    Microorganisms play a vital role in various ecosystems and characterizing interactions between them is an essential step towards understanding the organization and function of microbial communities. Computational prediction has recently become a widely used approach to investigate microbial interactions. We provide a thorough review of emerging computational methods organized by the type of data they employ. We highlight three major challenges in inferring interactions using metagenomic survey data and discuss the underlying assumptions and mathematics of interaction inference algorithms. In addition, we review interaction prediction methods relying on metabolic pathways, which are increasingly used to reveal mechanisms of interactions. Furthermore, we also emphasize the importance of mining the scientific literature for microbial interactions - a largely overlooked data source for experimentally validated interactions. PMID:27025964
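
    The simplest class of methods the review covers infers interactions from co-occurrence: taxa whose abundances rise and fall together across samples are flagged as candidate positive interactions, anti-correlated taxa as candidate negative ones. A deliberately naive Pearson-correlation sketch on invented data (real tools must additionally handle compositionality and sparsity, which this ignores):

```python
import numpy as np

def correlation_interactions(abundance, threshold=0.8):
    """Flag taxon pairs whose abundance profiles are strongly
    (anti-)correlated across samples.
    abundance: array with rows = samples, columns = taxa."""
    r = np.corrcoef(abundance, rowvar=False)
    pairs = []
    n = r.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            if abs(r[i, j]) >= threshold:
                pairs.append((i, j, round(float(r[i, j]), 2)))
    return pairs

rng = np.random.default_rng(1)
base = rng.random(50)                       # 50 hypothetical samples
# Taxon 1 tracks taxon 0 (e.g. cross-feeding); taxon 2 is antagonistic
# to taxon 0; taxon 3 is independent.
data = np.column_stack([
    base,
    base + 0.05 * rng.standard_normal(50),
    1.0 - base + 0.05 * rng.standard_normal(50),
    rng.random(50),
])
```

    The review's caution applies directly here: correlation on relative-abundance data can produce spurious edges, which is exactly why the compositional and model-based methods it surveys were developed.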

  4. Computational approaches to analogical reasoning current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it was studied early in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  5. 2014 National Workshop on Advances in Communication and Computing

    CERN Document Server

    Prasanna, S; Sarma, Kandarpa; Saikia, Navajit

    2015-01-01

    The present volume is a compilation of research work in computation, communication, vision sciences, device design, fabrication, upcoming materials and related process design, etc. It is derived from selected manuscripts submitted to the 2014 National Workshop on Advances in Communication and Computing (WACC 2014), Assam Engineering College, Guwahati, Assam, India, which is emerging as a premier platform for discussion and dissemination of knowhow in this part of the world. The papers included in the volume are indicative of the recent thrust in computation, communications and emerging technologies. Certain recent advances in ZnO nanostructures for alternate energy generation provide emerging insights into an area that has promises for the energy sector including conservation and green technology. Similarly, scholarly contributions have focused on malware detection and related issues. Several contributions have focused on biomedical aspects including contributions related to cancer detection using act...

  6. Sculpting the band gap: a computational approach.

    Science.gov (United States)

    Prasai, Kiran; Biswas, Parthapratim; Drabold, D A

    2015-01-01

    Materials with optimized band gap are needed in many specialized applications. In this work, we demonstrate that Hellmann-Feynman forces associated with the gap states can be used to find atomic coordinates that yield desired electronic density of states. Using tight-binding models, we show that this approach may be used to arrive at electronically designed models of amorphous silicon and carbon. We provide a simple recipe to include a priori electronic information in the formation of computer models of materials, and prove that this information may have profound structural consequences. The models are validated with plane-wave density functional calculations. PMID:26490203
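
    For readers unfamiliar with tight-binding models, the eigenvalue spectrum of even the simplest chain Hamiltonian already forms a band whose edges are what gap engineering manipulates. A minimal sketch (a generic illustration of tight-binding spectra, not the authors' Hellmann-Feynman procedure):

```python
import numpy as np

def tb_chain_hamiltonian(n, t=-1.0):
    """Tight-binding Hamiltonian of a 1-D chain of n sites with
    nearest-neighbour hopping t and periodic boundary conditions."""
    H = np.zeros((n, n))
    for i in range(n):
        H[i, (i + 1) % n] = t
        H[(i + 1) % n, i] = t
    return H

# Diagonalize; analytically the band is E(k) = 2*t*cos(k), so the
# spectrum must fill the interval [-2|t|, 2|t|].
evals = np.linalg.eigvalsh(tb_chain_hamiltonian(100))
```

    The paper's approach works one level up from this: it perturbs atomic coordinates, using forces associated with gap states, until the eigenvalue spectrum of the (much larger, disordered) Hamiltonian has the desired density of states.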

  7. Computational modeling approaches in gonadotropin signaling.

    Science.gov (United States)

    Ayoub, Mohammed Akli; Yvinec, Romain; Crépieux, Pascale; Poupon, Anne

    2016-07-01

    Follicle-stimulating hormone and LH play essential roles in animal reproduction. They exert their function through binding to their cognate receptors, which belong to the large family of G protein-coupled receptors. This recognition at the plasma membrane triggers a plethora of cellular events, whose processing and integration ultimately lead to an adapted biological response. Understanding the nature and the kinetics of these events is essential for innovative approaches in drug discovery. The study and manipulation of such complex systems requires the use of computational modeling approaches combined with robust in vitro functional assays for calibration and validation. Modeling brings a detailed understanding of the system and can also be used to understand why existing drugs do not work as well as expected, and how to design more efficient ones. PMID:27165991
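
    The first layer of any such kinetic model is ligand-receptor binding at the plasma membrane. A minimal one-step sketch (forward Euler on a single ODE; the models the review discusses couple many such equations, and the rate constants below are invented):

```python
def simulate_binding(kon, koff, L, R0, dt=0.001, t_end=50.0):
    """Integrate dC/dt = kon*L*(R0 - C) - koff*C, the simplest
    ligand-receptor binding kinetics, with forward Euler.
    C = bound-receptor concentration; L is held constant (excess ligand)."""
    C = 0.0
    for _ in range(int(t_end / dt)):
        C += dt * (kon * L * (R0 - C) - koff * C)
    return C

# Equilibrium occupancy should match the textbook value L / (L + Kd),
# with Kd = koff / kon -- a basic calibration check for the integrator.
kon, koff, L, R0 = 1.0, 0.5, 2.0, 1.0
occupancy = simulate_binding(kon, koff, L, R0)
```

    Checking the simulated steady state against the analytic equilibrium is the same calibration-and-validation logic the review advocates, just at toy scale.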

  8. Computational approaches for rational design of proteins with novel functionalities

    Directory of Open Access Journals (Sweden)

    Manish Kumar Tiwari

    2012-09-01

    Full Text Available Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein design has recently succeeded in engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes.

  9. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Forest M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bochev, Pavel B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cameron-Smith, Philip J.. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Easter, Richard C [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Scott M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ghan, Steven J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Xiaohong [Univ. of Wyoming, Laramie, WY (United States); Lowrie, Robert B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lucas, Donald D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ma, Po-lun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sacks, William J. [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Shrivastava, Manish [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Singh, Balwinder [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tautges, Timothy J. [Argonne National Lab. (ANL), Argonne, IL (United States); Taylor, Mark A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Vertenstein, Mariana [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Worley, Patrick H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.
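
    Tracer advection of the kind named above solves dq/dt + u dq/dx = 0 for each tracer field q. A first-order upwind sketch on a periodic 1-D domain (a minimal stand-in for the high-order, multi-tracer schemes the project actually optimizes; the setup is invented):

```python
import numpy as np

def upwind_advect(q, u, dx, dt):
    """One first-order upwind step for dq/dt + u*dq/dx = 0 with u > 0
    on a periodic domain. Conservative: the total of q is preserved."""
    return q - u * dt / dx * (q - np.roll(q, 1))

n = 100
dx, u = 1.0 / n, 1.0
dt = 0.5 * dx / u                         # CFL-stable step (Courant 0.5)
x = np.linspace(0.0, 1.0, n, endpoint=False)
q = np.exp(-((x - 0.3) / 0.05) ** 2)      # Gaussian tracer blob at x = 0.3
total0 = q.sum()
for _ in range(100):                      # advect a distance of 0.5
    q = upwind_advect(q, u, dx, dt)
```

    The blob ends up centered near x = 0.8 with its total mass unchanged, though visibly diffused, which is why production ESM schemes use higher-order, shape-preserving variants rather than plain upwinding.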

  10. Advanced computer graphics techniques as applied to the nuclear industry

    International Nuclear Information System (INIS)

    Computer graphics is a rapidly advancing technological area in computer science. This is being motivated by increased hardware capability coupled with reduced hardware costs. This paper will cover six topics in computer graphics, with examples forecasting how each of these capabilities could be used in the nuclear industry. These topics are: (1) Image Realism with Surfaces and Transparency; (2) Computer Graphics Motion; (3) Graphics Resolution Issues and Examples; (4) Iconic Interaction; (5) Graphic Workstations; and (6) Data Fusion - illustrating data coming from numerous sources, for display through high dimensional, greater than 3-D, graphics. All topics will be discussed using extensive examples with slides, video tapes, and movies. Illustrations have been omitted from the paper due to the complexity of color reproduction. 11 refs., 2 figs., 3 tabs

  11. Computer code qualification program for the Advanced CANDU Reactor

    International Nuclear Information System (INIS)

    Atomic Energy of Canada Ltd (AECL) has developed and implemented a Software Quality Assurance (SQA) program to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. This paper provides an overview of the computer programs used in Advanced CANDU Reactor (ACR) safety analysis, and an assessment of their applicability in the safety analyses of the ACR design. An outline of the incremental validation program, and an overview of the experimental program in support of the code validation are also presented. An outline of the SQA program used to qualify these computer codes is also briefly presented. To provide context to the differences in the SQA with respect to current CANDUs, the paper also provides an overview of the ACR design features that have an impact on the computer code qualification. (author)

  12. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  13. Recent advances in diagnostic approaches for sub-arachnoid hemorrhage.

    Science.gov (United States)

    Kumar, Ashish; Kato, Yoko; Hayakawa, Motoharu; Junpei, Oda; Watabe, Takeya; Imizu, Shuei; Oguri, Daikichi; Hirose, Yuichi

    2011-07-01

    Sub-arachnoid hemorrhage (SAH) is one of the most debilitating neurosurgical entities in terms of stroke-related case mortality and morbidity, with case fatality rates ranging from 32% to 67%. Advances in the diagnostic accuracy of the available imaging methods have contributed significantly to reducing the morbidity associated with this deadly disease. The mainstay diagnostic techniques are currently computed tomography angiography (CTA), magnetic resonance angiography (MRA), and digital subtraction angiography (DSA), including three-dimensional DSA. Non-invasive angiography in the form of CTA and MRA has evolved over the last decade into a rapid, widely available, and economical means of diagnosing the cause of SAH. The role of three-dimensional computed tomography angiography (3D-CTA) in the management of aneurysms has been fairly well acknowledged in the past, and numerous articles in the literature discuss its potential to challenge the conventional "gold standard", DSA. The most recent addition is the introduction of a fourth dimension to the established 3D-CT angiography (4D-CTA). At many centers, DSA is still the first choice of investigation. Although CT angiography still has some limitations, it can provide an unmatched multi-directional view of aneurysmal morphology and its surroundings, including relations with the skull base and blood vessels. We review the recent advances in the diagnostic approaches to SAH, with special emphasis on 3D-CTA and 4D-CTA as upcoming technologies. PMID:22347331

  14. Activities of the Research Institute for Advanced Computer Science

    Science.gov (United States)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  15. Advanced computer architecture specification for automated weld systems

    Science.gov (United States)

    Katsinis, Constantine

    1994-01-01

    This report describes the requirements for an advanced automated weld system and the associated computer architecture, and defines the overall system specification from a broad perspective. Based on the requirements of welding procedures as they relate to an integrated multiaxis motion control and sensor architecture, the computer system requirements are developed from a proven multiple-processor design with an expandable, distributed-memory, single-global-bus architecture, in which individual processors are assigned to specific tasks that support sensor or control processes. The specified architecture is sufficiently flexible to integrate previously developed equipment, be upgradable, and allow on-site modifications.

  16. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, James H. [University of North Florida; Cox, Philip [University of North Florida; Harrington, William J [University of North Florida; Campbell, Joseph L [University of North Florida

    2013-09-03

    ABSTRACT Project Title: Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing PROJECT OBJECTIVE The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density, and lifetime. These targets were laid out in the DOE’s R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically, this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly, and integrated system durability and validation testing. This work has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications. PROJECT TASKS The proposed work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: to engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, further refining them to miniaturize and integrate their functionality and thereby increase the system power density and energy density. Benefits of UNF’s novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  17. Soft computing in design and manufacturing of advanced materials

    Science.gov (United States)

    Cios, Krzysztof J.; Baaklini, George Y; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.
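
    The fuzzy-set side of the abstract above can be illustrated with a small sketch. This is not the authors' NASA Lewis code; it is a minimal, hypothetical example of how a fuzzy membership function grades a processing variable (here, an invented sintering temperature) smoothly, rather than classifying it with a hard threshold:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Degree to which a (hypothetical) sintering temperature is "optimal".
def optimal(temp_celsius):
    return triangular(temp_celsius, 1200.0, 1400.0, 1600.0)
```

    A crisp rule would call 1399 degrees optimal and 1601 degrees not; the fuzzy grade instead degrades smoothly away from the peak, which is what makes such sets useful for relating processing variables to material properties.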

  18. NATO Advanced Study Institute on Methods in Computational Molecular Physics

    CERN Document Server

    Diercksen, Geerd

    1992-01-01

    This volume records the lectures given at a NATO Advanced Study Institute on Methods in Computational Molecular Physics held in Bad Windsheim, Germany, from 22nd July until 2nd August, 1991. This NATO Advanced Study Institute sought to bridge the quite considerable gap which exists between the presentation of molecular electronic structure theory found in contemporary monographs such as, for example, McWeeny's Methods of Molecular Quantum Mechanics (Academic Press, London, 1989) or Wilson's Electron Correlation in Molecules (Clarendon Press, Oxford, 1984) and the realization of the sophisticated computational algorithms required for their practical application. It sought to underline the relation between the electronic structure problem and the study of nuclear motion. Software for performing molecular electronic structure calculations is now being applied in an increasingly wide range of fields in both the academic and the commercial sectors. Numerous applications are reported in areas as diverse as catalysi...

  19. Condition Monitoring Through Advanced Sensor and Computational Technology

    International Nuclear Information System (INIS)

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, one with a check valve (an active component) and one with a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect the mechanical vibratory responses of the check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis of the processed sensor data indicates that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable whether these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.

  20. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and is closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should...
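
    As a toy illustration of the raw-audio approach described above (not the dissertation's actual feature set), a single timbral feature such as the zero-crossing rate can be computed directly from the signal; real systems combine many such features with a statistical classifier:

```python
import math

def zero_crossing_rate(signal):
    """Fraction of adjacent sample pairs whose signs differ."""
    crossings = sum(
        1 for a, b in zip(signal, signal[1:]) if (a >= 0) != (b >= 0)
    )
    return crossings / (len(signal) - 1)

# Synthetic examples: a high-pitched tone crosses zero more often than a low one.
sr = 8000  # assumed sample rate in Hz
low = [math.sin(2 * math.pi * 110 * t / sr) for t in range(sr)]
high = [math.sin(2 * math.pi * 880 * t / sr) for t in range(sr)]
```

    Here `zero_crossing_rate(high)` exceeds `zero_crossing_rate(low)`, so even this crude feature separates the two tones; genre classifiers work the same way, only with richer features and learned decision boundaries.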

  1. A computational approach to negative priming

    Science.gov (United States)

    Schrobsdorff, H.; Ihrke, M.; Kabisch, B.; Behrendt, J.; Hasselhorn, M.; Herrmann, J. Michael

    2007-09-01

    Priming is characterized by a sensitivity of reaction times to the sequence of stimuli in psychophysical experiments. The reduction of the reaction time observed in positive priming is well-known and experimentally understood (Scarborough et al., J. Exp. Psychol.: Hum. Percept. Perform., 3, pp. 1-17, 1977). Negative priming—the opposite effect—is experimentally less tangible (Fox, Psychonom. Bull. Rev., 2, pp. 145-173, 1995), as it depends sensitively on subtle parameter changes such as the response-stimulus interval. This sensitivity gives the negative priming effect great potential for applications in research on memory, selective attention, and ageing effects. We develop and analyse a computational realization, CISAM, of a recent psychological model for action decision making, the ISAM (Kabisch, PhD thesis, Friedrich-Schiller-Universität, 2003), which is sensitive to priming conditions. With the dynamical systems approach of the CISAM, we show that a single adaptive threshold mechanism is sufficient to explain both positive and negative priming effects. This is achieved by comparing results obtained by the computational modelling with experimental data from our laboratory. The implementation provides a rich base from which testable predictions can be derived, e.g. with respect to hitherto untested stimulus combinations (e.g. single-object trials).
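
    The single-threshold idea can be caricatured in a few lines. The sketch below is not the CISAM implementation; it is a hypothetical linear accumulator in which prior exposure merely shifts the starting activation, which is enough to reproduce the qualitative pattern of both priming effects:

```python
def reaction_time(initial_activation, gain=0.1, threshold=1.0):
    """Steps until activation crosses the threshold (a proxy for reaction time)."""
    activation, steps = initial_activation, 0
    while activation < threshold:
        activation += gain
        steps += 1
    return steps

unprimed = reaction_time(0.0)    # control trial
positive = reaction_time(0.3)    # target recently attended: head start
negative = reaction_time(-0.3)   # target recently ignored: residual inhibition
```

    The ordering positive < unprimed < negative mirrors the experimental pattern: responses are faster after positive priming and slower after negative priming.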

  2. A Desktop Grid Computing Approach for Scientific Computing and Visualization

    OpenAIRE

    Constantinescu-Fuløp, Zoran

    2008-01-01

    Scientific Computing is the collection of tools, techniques, and theories required to solve, on a computer, mathematical models of problems from science and engineering; its main goal is to gain insight into such problems. Generally, it is difficult to understand or communicate information from the complex or large datasets generated by Scientific Computing methods and techniques (computational simulations, complex experiments, observational instruments, etc.). Therefore, support of Scientific Vi...

  3. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. 
The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  4. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. 
The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  5. A Survey on Recent Advances of Computer Vision Algorithms for Egocentric Video

    OpenAIRE

    Bambach, Sven

    2015-01-01

    Recent technological advances have made lightweight, head mounted cameras both practical and affordable and products like Google Glass show first approaches to introduce the idea of egocentric (first-person) video to the mainstream. Interestingly, the computer vision community has only recently started to explore this new domain of egocentric vision, where research can roughly be categorized into three areas: Object recognition, activity detection/recognition, video summarization. In this pap...

  6. Multimodality approach for locally advanced esophageal cancer

    Institute of Scientific and Technical Information of China (English)

    Khaldoun Almhanna; Jonathan R Strosberg

    2012-01-01

    Carcinoma of the esophagus is an aggressive and lethal malignancy with an increasing incidence world-wide. Incidence rates vary internationally, with the highest rates found in Southern and Eastern Africa and Eastern Asia, and the lowest in Western and Middle Africa and Central America. Patients with locally advanced disease face a poor prognosis, with 5-year survival rates ranging from 15%-34%. Recent clinical trials have evaluated different strategies for management of locoregional cancer; however, because of stage migration and changes in disease epidemiology, applying these trials to clinical practice has become a daunting task. We searched Medline and conference abstracts for randomized studies published in the last 3 decades. We restricted our search to articles published in English. Neoadjuvant chemoradiotherapy followed by surgical resection is an accepted standard of care in the United States. Esophagectomy remains an essential component of treatment and can lead to improved overall survival, especially when performed at high volume institutions. The role of adjuvant chemotherapy following curative resection is still unclear. External beam radiation therapy alone is considered palliative and is typically reserved for patients with a poor performance status.

  7. Computability and Analysis, a Historical Approach

    OpenAIRE

    Brattka, Vasco

    2016-01-01

    The history of computability theory and the history of analysis have been surprisingly intertwined since the beginning of the twentieth century. For one, Émile Borel discussed his ideas on computable real number functions in his introduction to measure theory. On the other hand, Alan Turing had computable real numbers in mind when he introduced his now famous machine model. Here we want to focus on a particular aspect of computability and analysis, namely on computability properties of theorem...

  8. Advancing a holistic approach to openness

    DEFF Research Database (Denmark)

    Søndergaard, Helle Alsted; Araújo, Ana Luiza Lara de

    Open innovation has emerged as a new and interesting research area, and with this paper we wish to contribute to the research on open innovation by proposing a more holistic approach to openness that includes the internal sphere of openness. We use data from 170 Danish SMEs in the high-tech and medium high-tech industries and study the moderating role of internal openness on the relationship between external openness and performance. Our results show that the internal spheres of transformative and exploitative learning moderate the effect of external openness on performance, and a model with the interaction effects of these variables gives significantly better prediction power. This indicates that internal knowledge practices are important for a firm's ability to leverage open innovation strategies.

  9. A comparison of computer architectures for the NASA demonstration advanced avionics system

    Science.gov (United States)

    Seacord, C. L.; Bailey, D. G.; Larson, J. C.

    1979-01-01

    The paper compares computer architectures for the NASA demonstration advanced avionics system. Two computer architectures are described, with an unusual approach to fault tolerance: a single spare processor can correct for faults in any of the distributed processors by taking on the role of a failed module. It was shown that the system must be viewed from a functional point of view to properly apply redundancy and achieve fault tolerance and ultra-reliability. Data are presented on complexity and mission failure probability which show that the revised version offers equivalent mission reliability at lower cost, as measured by hardware and software complexity.
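
    The spare-processor scheme described above can be sketched as follows. This is a hypothetical illustration (names and structure invented, not the NASA design): a pool of role-assigned processors plus one idle spare that assumes the role of whichever module fails first:

```python
from dataclasses import dataclass

@dataclass
class Processor:
    role: str
    healthy: bool = True

def failover(processors, spare):
    """Let the single spare assume the role of the first failed processor."""
    for p in processors:
        if not p.healthy:
            spare.role = p.role
            p.role = None          # failed module is taken out of service
            return spare.role
    return None  # nothing failed; spare stays idle

# Usage: fail the navigation module and let the spare take over its role.
modules = [Processor("navigation"), Processor("flight-control"), Processor("display")]
spare = Processor("spare")
modules[0].healthy = False
assumed = failover(modules, spare)  # spare now runs "navigation"
```

    The key design point is the one the abstract makes: because roles are functional assignments rather than fixed hardware bindings, one spare covers faults in any module.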

  10. Advanced measurement approach with loss distribution in operational risk management

    OpenAIRE

    Atilla ÇİFTER; Chambers, Nurgül

    2007-01-01

    According to the latest proposal by the Basel Committee, commercial banks are allowed to use the advanced measurement approach for operational risk. Since the basic indicator and standard approaches consider operational risk as a percentage of gross profit, these methodologies are not satisfactory, as actual losses and loss probabilities are not taken into consideration. In this article, the loss distribution approach is applied with simulated data. 20 nonparametric loss distributions and mixing internal and externa...

  11. Human Computer Interaction: An intellectual approach

    OpenAIRE

    Mr. Kuntal Saroha; Sheela Sharma; Gurpreet Bhatia

    2011-01-01

    This paper discusses the research that has been done in the field of Human Computer Interaction (HCI) relating to human psychology. Human-computer interaction (HCI) is the study of how people design, implement, and use interactive computer systems and how computers affect individuals, organizations, and society. This encompasses not only ease of use but also new interaction techniques for supporting user tasks, providing better access to information, and creating more powerful forms of communication. ...

  12. A first attempt to bring computational biology into advanced high school biology classrooms.

    Directory of Open Access Journals (Sweden)

    Suzanne Renick Gallagher

    2011-10-01

    Full Text Available Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and to biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing into advanced biology classes at two local high schools a computational biology unit on genetic evolution. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. This curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his/her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach it alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of this curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors.

  13. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to the rapid improvement of climate models. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes, that fully utilizes the hardware and software capabilities of new computer architectures, that probes the limits of climate predictability, and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  14. Physics and computer science: quantum computation and other approaches

    OpenAIRE

    Salvador E. Venegas-Andraca

    2011-01-01

    This is a position paper written as an introduction to the special volume on quantum algorithms I edited for the journal Mathematical Structures in Computer Science (Volume 20 - Special Issue 06 (Quantum Algorithms), 2010).

  15. Cluster Computing: A Mobile Code Approach

    OpenAIRE

    Patel, R. B.; Manpreet Singh

    2006-01-01

    Cluster computing harnesses the combined computing power of multiple processors in a parallel configuration. Cluster computing environments built from commodity hardware have provided a cost-effective solution for many scientific and high-performance applications. In this paper we present the design and implementation of a cluster-based framework using mobile code. The cluster implementation involves the design of a server named MCLUSTER which manages the configuring and resetting of clust...

  16. What is Computation: An Epistemic Approach

    Czech Academy of Sciences Publication Activity Database

    Wiedermann, Jiří; van Leeuwen, J.

    Berlin: Springer, 2015 - (Italiano, G.; Margaria-Steffen, T.; Pokorný, J.; Quisquater, J.; Wattenhofer, R.), s. 1-13. (Lecture Notes in Computer Science. 8939). ISBN 978-3-662-46077-1. ISSN 0302-9743. [Sofsem 2015. International Conference on Current Trends in Theory and Practice of Computer Science /41./. Pec pod Sněžkou (CZ), 24.01.2015-29.01.2015] R&D Projects: GA ČR GAP202/10/1333 Institutional support: RVO:67985807 Keywords : computation * knowledge generation * information technology Subject RIV: IN - Informatics, Computer Science

  17. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    Energy Technology Data Exchange (ETDEWEB)

    Reed, Daniel [University of Iowa; Berzins, Martin [University of Utah; Pennington, Robert; Sarkar, Vivek [Rice University; Taylor, Valerie [Texas A&M University

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  18. Computer science approach to quantum control

    OpenAIRE

    Janzing, Dominik

    2006-01-01

    This work considers several hypothetical control processes at the nanoscopic level and shows their analogy to computation processes. It shows that measuring certain types of quantum observables is such a complex task that any instrument able to perform it would necessarily be an extremely powerful computer.

  19. International conference on Advances in Intelligent Control and Innovative Computing

    CERN Document Server

    Castillo, Oscar; Huang, Xu; Intelligent Control and Innovative Computing

    2012-01-01

    In the lightning-fast world of intelligent control and cutting-edge computing, it is vitally important to stay abreast of developments that seem to follow each other without pause. This publication features the very latest and some of the very best current research in the field, with 32 revised and extended research articles written by prominent researchers in the field. Culled from contributions to the key 2011 conference Advances in Intelligent Control and Innovative Computing, held in Hong Kong, the articles deal with a wealth of relevant topics, from the most recent work in artificial intelligence and decision-supporting systems, to automated planning, modelling and simulation, signal processing, and industrial applications. Not only does this work communicate the current state of the art in intelligent control and innovative computing, it is also an illuminating guide to up-to-date topics for researchers and graduate students in the field. The quality of the contents is absolutely assured by the high pro...

  20. Advances in Intelligent Control Systems and Computer Science

    CERN Document Server

    2013-01-01

    The conception of real-time control networks taking into account, as an integrating approach, both the specific aspects of information and knowledge processing and the dynamic and energetic particularities of physical processes and of communication networks represents one of the newest scientific and technological challenges. The new paradigm of Cyber-Physical Systems (CPS) reflects this tendency and will certainly change the evolution of the technology, with major social and economic impact. This book presents significant results in the field of process control and advanced information and knowledge processing, with applications in the fields of robotics, biotechnology, environment, energy, and transportation, among others. It introduces intelligent control concepts and strategies as well as real-time implementation aspects for complex control approaches. One of the sections is dedicated to the complex problem of designing software systems for distributed information processing networks. Problems such as complexity an...

  1. Mathematics of shape description a morphological approach to image processing and computer graphics

    CERN Document Server

    Ghosh, Pijush K

    2009-01-01

    Image processing problems are often not well defined because real images are contaminated with noise and other uncertain factors. In Mathematics of Shape Description, the authors take a mathematical approach to address these problems, using the morphological and set-theoretic approach to image processing and computer graphics and presenting a simple shape model based on two basic shape operators called Minkowski addition and decomposition. This book is ideal for professional researchers and engineers in Information Processing, Image Measurement, Shape Description, Shape Representation and Computer Graphics. Post-graduate and advanced undergraduate students in pure and applied mathematics, computer sciences, robotics and engineering will also benefit from this book. Key features: explains the fundamental and advanced relationships between algebraic systems and shape description through the set-theoretic approach; promotes interaction of image processing, geochronology and mathematics in the field of algebraic geometry; P...
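
    The Minkowski addition named in this abstract can be illustrated on finite point sets. The sketch below is a hypothetical example (the book treats general sets and binary images); it shows that dilating a single pixel by a structuring element simply translates the element to that pixel.

    ```python
    # Minkowski addition of two finite 2-D point sets A and B:
    #   A (+) B = { a + b : a in A, b in B }
    # Morphological dilation of a binary image is this operation applied
    # to the set of foreground pixels and a structuring element.

    def minkowski_sum(A, B):
        """Return the Minkowski sum of two sets of 2-D integer points."""
        return {(ax + bx, ay + by) for (ax, ay) in A for (bx, by) in B}

    # Dilating a single pixel by a 3x3 square structuring element yields
    # the structuring element translated to that pixel.
    square = {(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}
    print(len(minkowski_sum({(5, 5)}, square)))  # 9 translated points
    ```

    Minkowski decomposition, the inverse problem of factoring a shape into such summands, is considerably harder and is one of the book's central topics.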

  2. Adapting advanced engineering design approaches to building design. Potential benefits

    NARCIS (Netherlands)

    Böhms, M.

    2006-01-01

    A number of industries continuously advance their design approaches in response to changing market constraints. Industries such as car, ship and airplane manufacturing utilize process setups and techniques that differ significantly from the processes and techniques used by the tra

  3. DISTRIBUTED COMPUTING APPROACHES FOR SCALABILITY AND HIGH PERFORMANCE

    Directory of Open Access Journals (Sweden)

    MANJULA K A

    2010-06-01

    Full Text Available Distributed computing is a science which solves a large problem by giving small parts of the problem to many computers to solve and then combining the solutions for the parts into a solution for the whole. This distributed computing framework suits projects that have an insatiable appetite for computing power; two such popular projects are SETI@Home and Folding@Home. Different architectures and approaches for distributed computing are being proposed as part of work progressing around the world. One way of distributing both data and computing power, known as grid computing, taps the Internet to put petabyte processing on every researcher's desktop. Grid technology is finding its way out of the academic incubator and entering commercial environments. Cloud computing, a variant of grid computing, has emerged as a potentially competing approach for architecting large distributed systems. Clouds can be viewed as a logical and next higher-level abstraction from Grids.
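
    The "split the problem, solve the parts, combine the solutions" pattern described above can be sketched on a single machine with a process pool; a real grid or volunteer-computing system (SETI@Home-style) distributes the same chunks over a network instead. The function names and workload here are illustrative only.

    ```python
    # Divide-and-combine on a process pool: each worker computes a partial
    # result for its chunk; the partial results are then combined.
    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(chunk):
        """Solve one small part of the problem: sum of squares of a chunk."""
        return sum(x * x for x in chunk)

    def distributed_sum_of_squares(data, workers=4):
        size = max(1, len(data) // workers)
        chunks = [data[i:i + size] for i in range(0, len(data), size)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            # Combining the partial solutions gives the full solution.
            return sum(pool.map(partial_sum, chunks))

    if __name__ == "__main__":
        print(distributed_sum_of_squares(list(range(1000))))  # 332833500
    ```

    The same decomposition applies whether the workers are local processes, grid nodes, or volunteered desktops; only the transport of chunks and results changes.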

  4. Assessing creativity in computer music ensembles: a computational approach

    OpenAIRE

    Comajuncosas, Josep M.

    2016-01-01

    Over the last decade Laptop Orchestras and Mobile Ensembles have proliferated. As a result, a large body of research has arisen on infrastructure, evaluation, design principles and compositional methodologies for Computer Music Ensembles (CME). However, little research has addressed, and very little is known about, the challenges and opportunities CMEs provide for creativity in musical performance. Therefore, one of the most common issues CMEs have to deal with is the lack of ...

  5. Human Computer Interaction: An intellectual approach

    Directory of Open Access Journals (Sweden)

    Kuntal Saroha

    2011-08-01

    Full Text Available This paper discusses the research that has been done in the field of Human Computer Interaction (HCI) relating to human psychology. Human-computer interaction (HCI) is the study of how people design, implement, and use interactive computer systems and how computers affect individuals, organizations, and society. This encompasses not only ease of use but also new interaction techniques for supporting user tasks, providing better access to information, and creating more powerful forms of communication. It involves input and output devices and the interaction techniques that use them; how information is presented and requested; how the computer's actions are controlled and monitored; all forms of help, documentation, and training; the tools used to design, build, test, and evaluate user interfaces; and the processes that developers follow when creating interfaces.

  6. Uncertainty in biology: a computational modeling approach

    OpenAIRE

    2015-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building...

  7. NETWORK SECURITY: AN APPROACH TOWARDS SECURE COMPUTING

    OpenAIRE

    Rahul Pareek

    2011-01-01

    The security of computer networks plays a strategic role in modern computer systems. In order to enforce high protection levels against malicious attacks, a number of software tools have been developed. Intrusion detection systems have recently become a heated research topic due to their capability of detecting and preventing attacks from malicious network users. A pattern-matching IDS for network security is proposed in this paper. Many network security applications...
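
    The core operation of a signature-based, pattern-matching IDS like the one this abstract describes can be sketched in a few lines. The signatures below are hypothetical examples for illustration, not rules from the paper or from any real rule set such as Snort's.

    ```python
    # Minimal signature matching: scan a packet payload for known attack
    # byte patterns and report every signature that occurs in it.
    SIGNATURES = [
        b"/etc/passwd",        # path-traversal attempt (hypothetical rule)
        b"' OR '1'='1",        # SQL-injection probe (hypothetical rule)
        b"\x90\x90\x90\x90",   # NOP-sled fragment (hypothetical rule)
    ]

    def match_payload(payload: bytes):
        """Return the list of signatures found in a packet payload."""
        return [sig for sig in SIGNATURES if sig in payload]

    alerts = match_payload(b"GET /../../etc/passwd HTTP/1.1")
    print([s.decode("latin-1") for s in alerts])  # ['/etc/passwd']
    ```

    Production systems replace the naive `in` scan with multi-pattern algorithms (e.g. Aho-Corasick) so that thousands of signatures can be checked in one pass over the payload.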

  8. ADVANCED FILE BASED SECURITY MECHANISM IN CLOUD COMPUTING: A REVIEW

    Directory of Open Access Journals (Sweden)

    Nisha Nisha

    2015-04-01

    Full Text Available Cloud computing is a broad solution that delivers IT as a service. Cloud computing uses the internet and central remote servers to support different data and applications. It is an internet-based technology that permits users to access their personal files from any computer with internet access. Cloud computing flexibility is a function of the allocation of resources on request. The cloud abstracts all the complexities of the network, which may include everything from cables, routers and servers to data centers and other such devices. Cloud-based systems store data of multiple organizations on shared hardware systems. This paper attempts to secure data from unauthorized access. The method of data security is the AES algorithm, which provides data security by encrypting the given data. AES is based on a design principle known as a substitution-permutation network and is fast in both software and hardware. The algorithms used in AES are simple enough to be implemented using cheap processors and a minimum amount of memory, and the data can then be decrypted only by an authorized person using his private key.
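
    The design principle the abstract names, a substitution-permutation network, can be illustrated with a toy round; real AES is far larger (8-bit S-box, 128-bit state, MixColumns). The 4-bit S-box and the rotation used as the "permutation" below are invented for this sketch and are not AES components.

    ```python
    # One round of a toy 16-bit substitution-permutation network (SPN):
    # substitute each nibble through an S-box, permute the bits, then mix
    # in a round key. Illustrative only -- NOT real AES, NOT secure.
    SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
            0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]

    def substitute(state):
        """Apply the 4-bit S-box to each nibble of a 16-bit state."""
        return sum(SBOX[(state >> (4 * i)) & 0xF] << (4 * i) for i in range(4))

    def permute(state):
        """Rotate the 16-bit state left by 5 bits (a toy bit permutation)."""
        return ((state << 5) | (state >> 11)) & 0xFFFF

    def spn_round(state, round_key):
        """Substitution, then permutation, then key mixing via XOR."""
        return permute(substitute(state)) ^ round_key
    ```

    A full cipher iterates such rounds many times with a key schedule; the alternation of nonlinear substitution and bit permutation is what produces the confusion and diffusion that AES relies on.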

  9. Computational approaches for urban environments: An editorial

    NARCIS (Netherlands)

    Helbich, M; Jokar Arsanjani, J; Leitner, M

    2015-01-01

    Cities are under continuous pressure due to an increasing urbanization which will have far-reaching consequences for housing, transportation, retail, etc. To cope with these challenges, methodological advances in quantitative modeling coupled with growing amounts of spatial and spatiotemporal data c

  10. Computational approaches to natural product discovery

    NARCIS (Netherlands)

    Medema, M.H.; Fischbach, M.A.

    2015-01-01

    Starting with the earliest Streptomyces genome sequences, the promise of natural product genome mining has been captivating: genomics and bioinformatics would transform compound discovery from an ad hoc pursuit to a high-throughput endeavor. Until recently, however, genome mining has advanced natura

  11. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  12. Advanced information processing system: Inter-computer communication services

    Science.gov (United States)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  13. Computational modeling, optimization and manufacturing simulation of advanced engineering materials

    CERN Document Server

    2016-01-01

    This volume presents recent research work focused in the development of adequate theoretical and numerical formulations to describe the behavior of advanced engineering materials.  Particular emphasis is devoted to applications in the fields of biological tissues, phase changing and porous materials, polymers and to micro/nano scale modeling. Sensitivity analysis, gradient and non-gradient based optimization procedures are involved in many of the chapters, aiming at the solution of constitutive inverse problems and parameter identification. All these relevant topics are exposed by experienced international and inter institutional research teams resulting in a high level compilation. The book is a valuable research reference for scientists, senior undergraduate and graduate students, as well as for engineers acting in the area of computational material modeling.

  14. Towards Advanced Data Analysis by Combining Soft Computing and Statistics

    CERN Document Server

    Gil, María; Sousa, João; Verleysen, Michel

    2013-01-01

    Soft computing, as an engineering science, and statistics, as a classical branch of mathematics, emphasize different aspects of data analysis. Soft computing focuses on obtaining working solutions quickly, accepting approximations and unconventional approaches. Its strength lies in its flexibility to create models that suit the needs arising in applications. In addition, it emphasizes the need for intuitive and interpretable models, which are tolerant to imprecision and uncertainty. Statistics is more rigorous and focuses on establishing objective conclusions based on experimental data by analyzing the possible situations and their (relative) likelihood. It emphasizes the need for mathematical methods and tools to assess solutions and guarantee performance. Combining the two fields enhances the robustness and generalizability of data analysis methods, while preserving the flexibility to solve real-world problems efficiently and intuitively.

  15. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    International Nuclear Information System (INIS)

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop, system is under construction. 10 refs., 7 figs

  16. Reliability of an interactive computer program for advance care planning.

    Science.gov (United States)

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson Formula 20 [KR-20]=0.83-0.95, and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830
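
    The Kuder-Richardson Formula 20 reported above can be computed from scratch on a matrix of binary (endorsed/not endorsed) responses. The data below are made up for illustration, and population variance is used for the total-score variance, which is one common convention.

    ```python
    # KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / var(total scores)),
    # where k is the item count, p_i the proportion endorsing item i,
    # and q_i = 1 - p_i.
    from statistics import pvariance

    def kr20(responses):
        """Internal consistency of binary items; rows = respondents."""
        k = len(responses[0])                      # number of items
        totals = [sum(row) for row in responses]   # per-respondent scores
        pq = 0.0
        for item in range(k):
            p = sum(row[item] for row in responses) / len(responses)
            pq += p * (1 - p)
        return (k / (k - 1)) * (1 - pq / pvariance(totals))

    data = [[1, 1, 1, 0], [1, 1, 0, 0], [1, 0, 0, 0], [1, 1, 1, 1]]
    print(round(kr20(data), 3))  # 0.667
    ```

    KR-20 is the special case of Cronbach's alpha for dichotomous items, which is why it is reported here as an internal-consistency index alongside the test-retest correlations.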

  17. Reliability of an Interactive Computer Program for Advance Care Planning

    Science.gov (United States)

    Levi, Benjamin H.; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-01-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson Formula 20 [KR-20]=0.83–0.95, and 0.86–0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830

  18. Computational approaches to enhance mass spectrometry-based proteomics

    OpenAIRE

    Neuhauser, Nadin

    2014-01-01

    In this thesis I present three computational approaches that improve the analysis of mass spectrometry-based proteomics data. The novel search engine Andromeda allows efficient identification of peptides and proteins. Implementation of a rule-based expert system extracts more of the detailed information contained in the mass spectra. Furthermore, I adapted our computational proteomics pipeline to high-performance computers.

  19. Advanced Safeguards Approaches for New TRU Fuel Fabrication Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Durst, Philip C.; Ehinger, Michael H.; Boyer, Brian; Therios, Ike; Bean, Robert; Dougan, A.; Tolk, K.

    2007-12-15

    This second report in a series of three reviews possible safeguards approaches for the new transuranic (TRU) fuel fabrication processes to be deployed at AFCF – specifically, the ceramic TRU (MOX) fuel fabrication line and the metallic (pyroprocessing) line. The most common TRU fuel has been fuel composed of mixed plutonium and uranium dioxide, referred to as “MOX”. However, under the Advanced Fuel Cycle projects custom-made fuels with higher contents of neptunium, americium, and curium may also be produced to evaluate if these “minor actinides” can be effectively burned and transmuted through irradiation in the ABR. A third and final report in this series will evaluate and review the advanced safeguards approach options for the ABR. In reviewing and developing the advanced safeguards approach for the new TRU fuel fabrication processes envisioned for AFCF, the existing international (IAEA) safeguards approach at the Plutonium Fuel Production Facility (PFPF) and the conceptual approach planned for the new J-MOX facility in Japan have been considered as a starting point of reference. The pyro-metallurgical reprocessing and fuel fabrication process at EBR-II near Idaho Falls also provided insight for safeguarding the additional metallic pyroprocessing fuel fabrication line planned for AFCF.

  20. Computational dynamics for robotics systems using a non-strict computational approach

    Science.gov (United States)

    Orin, David E.; Wong, Ho-Cheung; Sadayappan, P.

    1989-01-01

    A Non-Strict computational approach for real-time robotics control computations is proposed. In contrast to the traditional approach to scheduling such computations, based strictly on task dependence relations, the proposed approach relaxes precedence constraints and scheduling is guided instead by the relative sensitivity of the outputs with respect to the various paths in the task graph. An example of the computation of the Inverse Dynamics of a simple inverted pendulum is used to demonstrate the reduction in effective computational latency through use of the Non-Strict approach. A speedup of 5 has been obtained when the processes of the task graph are scheduled to reduce the latency along the crucial path of the computation. While error is introduced by the relaxation of precedence constraints, the Non-Strict approach has a smaller error than the conventional Strict approach for a wide range of input conditions.
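
    Strict scheduling is bounded below by the longest (critical) path through the task graph, which is exactly the bound the Non-Strict approach relaxes. The sketch below computes that strict-scheduling lower bound for a small task graph; the tasks and costs are illustrative, not the actual inverted-pendulum Inverse Dynamics decomposition.

    ```python
    # Earliest finish times under strict, dependency-respecting scheduling:
    # a task may start only after all of its predecessors finish, so the
    # minimum makespan equals the longest cost-weighted path in the DAG.
    from functools import lru_cache

    COST = {"read": 1, "dyn1": 4, "dyn2": 4, "combine": 2}
    DEPS = {"read": [], "dyn1": ["read"], "dyn2": ["read"],
            "combine": ["dyn1", "dyn2"]}

    @lru_cache(maxsize=None)
    def finish_time(task):
        """Earliest finish time of `task` when all precedences are honored."""
        start = max((finish_time(d) for d in DEPS[task]), default=0)
        return start + COST[task]

    print(finish_time("combine"))  # 7 = 1 + 4 + 2 along the critical path
    ```

    Relaxing a precedence edge, as the Non-Strict approach does for low-sensitivity paths, lets a successor start before its predecessor finishes, shortening the effective latency at the cost of a bounded error in the output.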

  1. Computer-Aided Approaches for Targeting HIVgp41

    Directory of Open Access Journals (Sweden)

    William J. Allen

    2012-08-01

    Full Text Available Virus-cell fusion is the primary means by which the human immunodeficiency virus-1 (HIV) delivers its genetic material into the human T-cell host. Fusion is mediated in large part by the viral glycoprotein 41 (gp41), which advances through four distinct conformational states: (i) native, (ii) pre-hairpin intermediate, (iii) fusion active (fusogenic), and (iv) post-fusion. The pre-hairpin intermediate is a particularly attractive step for therapeutic intervention given that the gp41 N-terminal heptad repeat (NHR) and C-terminal heptad repeat (CHR) domains are transiently exposed prior to the formation of the six-helix bundle required for fusion. Most peptide-based inhibitors, including the FDA-approved drug T20, target the intermediate, and there are significant efforts to develop small-molecule alternatives. Here, we review current approaches to studying interactions of inhibitors with gp41 with an emphasis on atomic-level computer modeling methods including molecular dynamics, free energy analysis, and docking. Atomistic modeling yields a unique level of structural and energetic detail, complementary to experimental approaches, which will be important for the design of improved next-generation anti-HIV drugs.

  2. Recent advances in diagnostic approaches for sub-arachnoid hemorrhage

    OpenAIRE

    Kumar, Ashish; Kato, Yoko; Hayakawa, Motoharu; Junpei, ODA; Watabe, Takeya; Imizu, Shuei; Oguri, Daikichi; Hirose, Yuichi

    2011-01-01

    Sub-arachnoid hemorrhage (SAH) is one of the most debilitating neurosurgical entities as far as stroke-related mortality and morbidity rates are concerned. To date, it has case fatality rates ranging from 32 to 67%. Advances in the diagnostic accuracy of the available imaging methods have contributed significantly to reducing the morbidity associated with this deadly disease. We currently have computed tomography angiography (CTA), magnetic resonance angiography (MRA) and the digit...

  3. Advanced and intelligent computations in diagnosis and control

    CERN Document Server

    2016-01-01

    This book is devoted to the demands of research and industrial centers for diagnostics, monitoring and decision making systems that result from the increasing complexity of automation and systems, the need to ensure the highest level of reliability and safety, and continuing research and the development of innovative approaches to fault diagnosis. The contributions combine domains of engineering knowledge for diagnosis, including detection, isolation, localization, identification, reconfiguration and fault-tolerant control. The book is divided into six parts:  (I) Fault Detection and Isolation; (II) Estimation and Identification; (III) Robust and Fault Tolerant Control; (IV) Industrial and Medical Diagnostics; (V) Artificial Intelligence; (VI) Expert and Computer Systems.

  4. Human brain mapping: Experimental and computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Wood, C.C.; George, J.S.; Schmidt, D.M.; Aine, C.J. [Los Alamos National Lab., NM (US); Sanders, J. [Albuquerque VA Medical Center, NM (US); Belliveau, J. [Massachusetts General Hospital, Boston, MA (US)

    1998-11-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The project combined Los Alamos' and collaborators' strengths in noninvasive brain imaging and high-performance computing to develop potential contributions to the multi-agency Human Brain Project led by the National Institute of Mental Health. The experimental component of the project emphasized the optimization of spatial and temporal resolution of functional brain imaging by combining: (a) structural MRI measurements of brain anatomy; (b) functional MRI measurements of blood flow and oxygenation; and (c) MEG measurements of time-resolved neuronal population currents. The computational component of the project emphasized development of a high-resolution 3-D volumetric model of the brain based on anatomical MRI, in which structural and functional information from multiple imaging modalities can be integrated into a single computational framework for modeling, visualization, and database representation.

  5. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: (1) model establishment under uncertainty; (2) model selection and parameter fitting; (3) sensitivity analysis and model adaptation; and (4) model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  6. Computational Approach To Understanding Autism Spectrum Disorders

    OpenAIRE

    Włodzisław Duch; Wiesław Nowak; Jaroslaw Meller; Grzegorz Osiński; Krzysztof Dobosz; Dariusz Mikołajewski; Grzegorz Marcin Wójcik

    2012-01-01

    Every year the prevalence of Autism Spectrum Disorders (ASD) is rising. Is there a unifying mechanism of the various ASD cases at the genetic, molecular, cellular or systems level? The hypothesis advanced in this paper focuses on neural dysfunctions that lead to problems with attention in autistic people. Simulations of attractor neural networks performing cognitive functions help to assess long-term system neurodynamics. The Fuzzy Symbolic Dynamics (FSD) technique is used for the visualiza...

  7. Dye Sensitised Solar Cells: A Computational Approach

    OpenAIRE

    O Rourke, C.

    2013-01-01

    Dye sensitised solar cells (DSSCs) mimic charge excitation and transfer processes found in natural photosynthesis to directly convert sunlight into electricity. Combining easy assembly with relatively cheap materials they offer a potentially cost effective solution to our energy requirements. Numerous physical processes are at work within a DSSC and the underlying complexity of these competing processes has meant that, despite considerable research effort, advances in obtaining a viable devic...

  8. Advances in computer technology: impact on the practice of medicine.

    Science.gov (United States)

    Groth-Vasselli, B; Singh, K; Farnsworth, P N

    1995-01-01

    Advances in computer technology provide a wide range of applications which are revolutionizing the practice of medicine. The development of new software for the office creates a web of communication among physicians, staff members, health care facilities and associated agencies. This provides the physician with the prospect of a paperless office. At the other end of the spectrum, the development of 3D work stations and software based on computational chemistry permits visualization of protein molecules involved in disease. Computer assisted molecular modeling has been used to construct working 3D models of lens alpha-crystallin. The 3D structure of alpha-crystallin is basic to our understanding of the molecular mechanisms involved in lens fiber cell maturation, stabilization of the inner nuclear region, the maintenance of lens transparency and cataractogenesis. The major component of the high molecular weight aggregates that occur during cataractogenesis is alpha-crystallin subunits. Subunits of alpha-crystallin occur in other tissues of the body. In the central nervous system accumulation of these subunits in the form of dense inclusion bodies occurs in pathological conditions such as Alzheimer's disease, Huntington's disease, multiple sclerosis and toxoplasmosis (Iwaki, Wisniewski et al., 1992), as well as neoplasms of astrocyte origin (Iwaki, Iwaki, et al., 1991). Also cardiac ischemia is associated with an increased alpha B synthesis (Chiesi, Longoni et al., 1990). On a more global level, the molecular structure of alpha-crystallin may provide information pertaining to the function of small heat shock proteins, hsp, in maintaining cell stability under the stress of disease. PMID:8721907

  9. Heterogeneous Computing in Economics: A Simplified Approach

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.; Grassi, Stefano

    This paper shows the potential of heterogeneous computing in solving dynamic equilibrium models in economics. We illustrate the power and simplicity of the C++ Accelerated Massive Parallelism recently introduced by Microsoft. Starting from the same exercise as Aldrich et al. (2011) we document a...

  10. Heterogeneous Computing in Economics: A Simplified Approach

    OpenAIRE

    Dziubinski, Matt P.; Grassi, Stefano

    2012-01-01

    This paper shows the potential of heterogeneous computing in solving dynamic equilibrium models in economics. We illustrate the power and simplicity of the C++ Accelerated Massive Parallelism recently introduced by Microsoft. Starting from the same exercise as Aldrich et al. (2011) we document a speed gain together with a simplified programming style that naturally enables parallelization.

  11. Cluster Computing: A Mobile Code Approach

    Directory of Open Access Journals (Sweden)

    R. B. Patel

    2006-01-01

    Full Text Available Cluster computing harnesses the combined computing power of multiple processors in a parallel configuration. Cluster computing environments built from commodity hardware have provided a cost-effective solution for many scientific and high-performance applications. In this paper we present the design and implementation of a cluster-based framework using mobile code. The cluster implementation involves the design of a server named MCLUSTER, which manages the configuring and resetting of the cluster. It allows a user to provide the necessary information regarding the application to be executed via a graphical user interface (GUI). The framework handles the generation of application mobile code and its distribution to appropriate client nodes, the efficient handling of results generated and communicated by a number of client nodes, and the recording of application execution time. The client node receives and executes the mobile code that defines the distributed job submitted by the MCLUSTER server and returns the results. We have also analyzed the performance of the developed system, emphasizing the tradeoff between communication and computation overhead.

  12. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly 50% of the human genome consists of noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well studied category of regulatory elements is that of enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an active area of research, and several approaches have been proposed to date. However, current state-of-the-art methods face limitations: the function of enhancers is clear, but their mechanism of action is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks, presented in separate chapters. First, since many enhancer functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and comprehensively survey over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze the advantages and disadvantages of existing solutions and report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret enhancer content and to generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational...

  13. Recent advances on hybrid approaches for designing intelligent systems

    CERN Document Server

    Melin, Patricia; Pedrycz, Witold; Kacprzyk, Janusz

    2014-01-01

    This book describes recent advances in hybrid intelligent systems using soft computing techniques for diverse areas of application, such as intelligent control and robotics, pattern recognition, time series prediction, and the optimization of complex problems. Soft Computing (SC) consists of several intelligent computing paradigms, including fuzzy logic, neural networks, and bio-inspired optimization algorithms, which can be combined to produce powerful hybrid intelligent systems. The book is organized in five main parts, each containing a group of papers on a similar subject. The first part consists of papers on the main theme of type-2 fuzzy logic, which propose new models and applications for type-2 fuzzy systems. The second part contains papers on the main theme of bio-inspired optimization algorithms, which use nature-inspired techniques to solve complex optimization problems in diverse areas of application. The third part contains pape...

  14. Music Genre Classification Systems - A Computational Approach

    OpenAIRE

    Ahrendt, Peter; Hansen, Lars Kai

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular...

  15. Computational and Game-Theoretic Approaches for Modeling Bounded Rationality

    NARCIS (Netherlands)

    L. Waltman (Ludo)

    2011-01-01

    textabstractThis thesis studies various computational and game-theoretic approaches to economic modeling. Unlike traditional approaches to economic modeling, the approaches studied in this thesis do not rely on the assumption that economic agents behave in a fully rational way. Instead, economic age

  16. SOFT COMPUTING APPROACH FOR NOISY IMAGE RESTORATION

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A genetic learning algorithm based fuzzy neural network was proposed for noisy image restoration, which can adaptively find and extract the fuzzy rules contained in noise. It can efficiently remove image noise while preserving as much detailed image information as possible. The experimental results show that the proposed approach performs far better than conventional noise-removal techniques.

  17. Advanced business communication: a multi-media approach

    OpenAIRE

    Christopher, E

    2009-01-01

    The newly launched "Advanced Business Communication" course in the Faculty of Business and Economics adopts a fully integrated multi-media approach to enhancing students’ communication skills through various media such as case videos, case texts, print articles as well as online resources. A key feature of the course is also a fully dedicated interactive course website which contains necessary course information as well as interactive language and discipline related exercises, video related m...

  18. A complex network approach to cloud computing

    CERN Document Server

    Travieso, Gonzalo; Bruno, Odemir Martinez; Costa, Luciano da Fontoura

    2015-01-01

    Cloud computing has become an important means to speed up computing. One problem that heavily influences the performance of such systems is the choice of nodes as servers responsible for executing the users' tasks. In this article we report how complex networks can be used to model this problem. More specifically, we investigate the processing performance of cloud systems underlain by Erdos-Renyi (ER) and Barabasi-Albert (BA) topologies containing two servers. Cloud networks involving two communities, not necessarily of the same size, are also considered in our analysis. The performance of each configuration is quantified in terms of two indices: the cost of communication between the user and the nearest server, and the balance of the distribution of tasks between the two servers. Regarding the latter index, the ER topology provides better performance than the BA case for smaller average degrees, with the opposite behavior for larger average degrees. With respect to the cost, smaller values are found in the BA ...
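    The two indices described in the abstract can be sketched in a few lines of Python. This is my own illustrative code, not the authors' model: it grows a small Barabási–Albert graph by preferential attachment, then computes the mean distance to the nearest of two servers (communication cost) and the share of nodes closer to one server (task balance). All function names and parameter values are hypothetical.

    ```python
    import random
    from collections import deque

    def barabasi_albert(n, m, seed=0):
        """Grow a Barabasi-Albert graph: each new node attaches to m
        existing nodes chosen with probability proportional to degree."""
        rng = random.Random(seed)
        edges = set()
        for i in range(m + 1):              # complete core of m+1 nodes
            for j in range(i + 1, m + 1):
                edges.add((i, j))
        repeated = []                       # each node appears once per degree
        for a, b in edges:
            repeated += [a, b]
        for new in range(m + 1, n):
            chosen = set()
            while len(chosen) < m:          # preferential attachment
                chosen.add(rng.choice(repeated))
            for t in chosen:
                edges.add((t, new))
                repeated += [t, new]
        adj = {i: set() for i in range(n)}
        for a, b in edges:
            adj[a].add(b); adj[b].add(a)
        return adj

    def bfs_dist(adj, src):
        """Hop distances from src to every node (graph is connected)."""
        dist, q = {src: 0}, deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    def evaluate(adj, s1, s2):
        """Cost = mean distance to the nearest server; balance = share of
        nodes assigned (by proximity) to server s1."""
        d1, d2 = bfs_dist(adj, s1), bfs_dist(adj, s2)
        n = len(adj)
        cost = sum(min(d1[u], d2[u]) for u in adj) / n
        to_s1 = sum(1 for u in adj if d1[u] <= d2[u])
        return cost, to_s1 / n
    ```

    An Erdős–Rényi generator could be swapped in to compare the two topologies as the paper does.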

  19. Approach for Application on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Shiv Kumar

    2012-08-01

    Full Text Available A web application is any application that uses a web browser as its client; equivalently, it is a dynamic version of a web or application server. There are two types of web applications, based on orientation: 1. A presentation-oriented web application generates interactive web pages containing various types of markup language (HTML, XML, etc.) and dynamic content in response to requests. 2. A service-oriented web application implements the endpoint of a web service. Web applications commonly use server-side scripts like ASP, PHP, etc. and client-side scripts like HTML, JavaScript, etc. to develop the application. Web applications are used in the fields of banking, insurance, marketing, finance, services, etc. "Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." - U.S. National Institute of Standards and Technology (NIST). A general and simple definition of cloud computing is using web applications and/or server services that you pay to access, rather than software or hardware that you buy and install.

  20. Integrated Computable General Equilibrium (CGE) microsimulation approach

    OpenAIRE

    John Cockburn; Erwin Corong; Caesar Cororaton

    2010-01-01

    Conventionally, the analysis of macro-economic shocks and the analysis of income distribution and poverty require very different methodological techniques and sources of data. Over the last decade however, the natural divide between both approaches has diminished, as evaluating the impact of macro-economic shocks on poverty and income distribution within a CGE framework complemented by household survey data has flourished. This paper focuses on explicitly integrating into a CGE model each hou...

  1. Advances in neural networks computational and theoretical issues

    CERN Document Server

    Esposito, Anna; Morabito, Francesco

    2015-01-01

    This book collects research works that exploit neural networks and machine learning techniques from a multidisciplinary perspective. Subjects covered include theoretical, methodological and computational topics which are grouped together into chapters devoted to the discussion of novelties and innovations related to the field of Artificial Neural Networks as well as the use of neural networks for applications, pattern recognition, signal processing, and special topics such as the detection and recognition of multimodal emotional expressions and daily cognitive functions, and bio-inspired memristor-based networks. Providing insights into the latest research interest from a pool of international experts coming from different research fields, the volume becomes valuable to all those with any interest in a holistic approach to implement believable, autonomous, adaptive, and context-aware Information Communication Technologies.

  2. Development of advanced nodal diffusion methods for modern computer architectures

    International Nuclear Information System (INIS)

    A family of highly efficient multidimensional multigroup advanced neutron-diffusion nodal methods, ILLICO, were implemented on sequential, vector, and vector-concurrent computers. Three-dimensional realistic benchmark problems can be solved in vectorized mode in less than 0.73 s (33.86 Mflops) on a Cray X-MP/48. Vector-concurrent implementations yield speedups as high as 9.19 on an Alliant FX/8. These results show that the ILLICO method preserves essentially all of its speed advantage over finite-difference methods. A self-consistent higher-order nodal diffusion method was developed and implemented. Nodal methods for global nuclear reactor multigroup diffusion calculations which account explicitly for heterogeneities in the assembly nuclear properties were developed and evaluated. A systematic analysis of the zero-order variable cross section nodal method was conducted. Analyzing the KWU PWR depletion benchmark problem, it is shown that when burnup heterogeneities arise, ordinary nodal methods, which do not explicitly treat the heterogeneities, suffer a significant systematic error that accumulates. A nodal method that treats explicitly the space dependence of diffusion coefficients was developed and implemented. A consistent burnup-correction method for nodal microscopic depletion analysis was developed

  3. Panel discussion: Innovative approaches to high performance computing

    International Nuclear Information System (INIS)

    A large part of research in lattice field theory is carried out via computer simulations. Some research groups use computer clusters readily assembled using off-the-shelf components, while others have been developing dedicated closely coupled massively parallel supercomputers. Pros and cons of these approaches, in particular the affordability and performance of these computers, were discussed. All the options being explored have different specific uses, and it is a good sign for the future that the computer industry is now taking active interest in building special purpose high performance computers

  4. Advanced approaches to intelligent information and database systems

    CERN Document Server

    Boonjing, Veera; Chittayasothorn, Suphamit

    2014-01-01

    This book consists of 35 chapters presenting different theoretical and practical aspects of Intelligent Information and Database Systems. Nowadays both Intelligent and Database Systems are applied in most areas of human activity, which necessitates further research in these areas. In this book various interesting issues related to intelligent information models and methods as well as their advanced applications, database systems applications, data models and their analysis, and digital multimedia methods and applications are presented and discussed from both practical and theoretical points of view. The book is organized in four parts devoted to intelligent systems models and methods, intelligent systems advanced applications, database systems methods and applications, and multimedia systems methods and applications. The book will be interesting for both practitioners and researchers, especially graduate and PhD students of information technology and computer science, as well as more experienced ...

  5. Securing applications in personal computers: the relay race approach.

    OpenAIRE

    Wright, James Michael

    1991-01-01

    Approved for public release; distribution is unlimited This thesis reviews the increasing need for security in a personal computer (PC) environment and proposes a new approach for securing PC applications at the application layer. The Relay Race Approach extends two standard approaches, data encryption and password access control at the main program level, to the subprogram level through the use of a special parameter, the "Baton". The applicability of this approach is de...

  6. Computational Efforts in Support of Advanced Coal Research

    Energy Technology Data Exchange (ETDEWEB)

    Suljo Linic

    2006-08-17

    The focus of this project was to employ first-principles computational methods to study the underlying molecular elementary processes that govern hydrogen diffusion through Pd membranes, as well as the elementary processes that govern the CO- and S-poisoning of these membranes. Our computational methodology integrated a multiscale hierarchical modeling approach, wherein a molecular understanding of the interactions between various species is gained from ab initio quantum chemical Density Functional Theory (DFT) calculations, while a mesoscopic statistical mechanical model such as Kinetic Monte Carlo is employed to predict key macroscopic membrane properties such as permeability. The key developments are: (1) We have systematically coupled the ab initio calculations with Kinetic Monte Carlo (KMC) simulations to model hydrogen diffusion through the Pd-based membranes. The tracer diffusivity of hydrogen atoms through the bulk of the Pd lattice predicted by the KMC simulations is in excellent agreement with experiments. (2) The KMC simulations of dissociative adsorption of H2 over the Pd(111) surface indicate that for thin membranes (less than 10 µm thick), the diffusion of hydrogen from the surface to the first subsurface layer is rate limiting. (3) Sulfur poisons the Pd surface by altering the electronic structure of the Pd atoms in the vicinity of the S atom. The KMC simulations indicate that increasing sulfur coverage drastically reduces the hydrogen coverage on the Pd surface and hence the driving force for diffusion through the membrane.
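    The KMC idea in the abstract can be caricatured with a one-dimensional hop model: events are lattice hops, waiting times are drawn from the total escape rate, and the tracer diffusivity is recovered as D = ⟨x²⟩ / 2t. This is my own minimal sketch, not the project's multiscale code; the hop rate and lattice constant are placeholders, and real KMC would use DFT-derived barriers.

    ```python
    import random

    def kmc_tracer_diffusivity(hop_rate, a, n_steps, n_walkers, seed=0):
        """Kinetic Monte Carlo for a tracer atom hopping on a 1-D lattice.
        Each event is a hop of +/- a; the total escape rate is 2*hop_rate,
        and time advances by exponentially distributed waiting times.
        Returns the estimated diffusivity D = <x^2> / (2 t)."""
        rng = random.Random(seed)
        msd, total_t = 0.0, 0.0
        for _ in range(n_walkers):
            x, t = 0.0, 0.0
            for _ in range(n_steps):
                t += rng.expovariate(2.0 * hop_rate)   # waiting time
                x += a if rng.random() < 0.5 else -a   # pick a hop direction
            msd += x * x
            total_t += t
        return (msd / n_walkers) / (2.0 * (total_t / n_walkers))
    ```

    For a symmetric 1-D hop model the exact result is D = hop_rate × a², so the estimate should cluster around that value.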

  7. Computational Diagnostic: A Novel Approach to View Medical Data.

    Energy Technology Data Exchange (ETDEWEB)

    Mane, K. K. (Ketan Kirtiraj); Börner, K. (Katy)

    2007-01-01

    A transition from traditional paper-based medical records to electronic health records is largely underway. The use of electronic records offers tremendous potential to personalize patient diagnosis and treatment. In this paper, we discuss a computational diagnostic tool that uses digital medical records to help doctors gain better insight into a patient's medical condition. The paper details different interactive features of the tool, which offer the potential to practice evidence-based medicine and advance patient diagnosis practices. The healthcare industry is a constantly evolving domain. Research from this domain is often translated into a better understanding of different medical conditions, and this new knowledge often contributes to improved diagnosis and treatment solutions for patients. But the healthcare industry has been slow to reap the immediate benefits of this new knowledge, as it still adheres to the traditional paper-based approach to keeping track of medical records. Recently, however, we have noticed a drive promoting a transition toward the electronic health record (EHR). An EHR stores patient medical records in digital format and offers the potential to replace paper health records. Earlier attempts at an EHR replicated the paper layout on the screen, represented the medical history of a patient in a graphical time-series format, or provided interactive visualization with 2D/3D images generated by an imaging device. But an EHR can be much more than just an 'electronic view' of the paper record or a collection of images from an imaging device. In this paper, we present an EHR called the 'Computational Diagnostic Tool', which provides a novel computational approach to looking at patient medical data. The developed EHR system is knowledge-driven and acts as a clinical decision support tool. The EHR tool provides two visual views of the medical data. Dynamic interaction with the data is supported to help doctors practice evidence-based decisions and make judicious...

  8. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    In this paper we discuss three approaches to modeling tree crown development: experimental (i.e. regressive), theoretical (i.e. analytical), and simulation (i.e. computer) modeling. The assumption common to all three is that a tree can be regarded as a fractal object, i.e. a collection of self-similar objects that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and of light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model on experimental data. In the paper the different stages of the above-mentioned approaches are described. The experimental data for spruce, the description of the computer system for modeling, and a variant of the computer model are presented. (author). 9 refs, 4 figs
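    The "fractal measure" of a self-similar crown can be illustrated with the standard similarity dimension, D = log N / log(1/r), which indeed falls between 2 and 3 for plausible branching parameters. This is a generic textbook sketch under assumed branching numbers, not the paper's model; both function names are hypothetical.

    ```python
    import math

    def branch_tips(levels, branches_per_node=3):
        """Number of terminal shoots in a self-similar crown after a
        given number of branching levels."""
        return branches_per_node ** levels

    def fractal_dimension(branches_per_node, scale_ratio):
        """Similarity dimension D = log N / log(1/r) of a crown whose
        every branch splits into N sub-branches scaled by the ratio r."""
        return math.log(branches_per_node) / math.log(1.0 / scale_ratio)
    ```

    With N = 4 branches at half scale, D = 2 exactly; N = 6 at half scale gives D ≈ 2.585, a body "between" a surface and a volume as the abstract suggests.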

  9. Q-P Wave traveltime computation by an iterative approach

    KAUST Repository

    Ma, Xuxin

    2013-01-01

    In this work, we present a new approach to compute anisotropic traveltime based on solving successively elliptical isotropic traveltimes. The method shows good accuracy and is very simple to implement.

  10. Computing material fronts with a Lagrange-Projection approach

    CERN Document Server

    Chalons, Christophe

    2010-01-01

    This paper reports investigations on the computation of material fronts in multi-fluid models using a Lagrange-Projection approach. Various forms of the Projection step are considered. Particular attention is paid to minimization of conservation errors.

  11. Block sparse Cholesky algorithms on advanced uniprocessor computers

    Energy Technology Data Exchange (ETDEWEB)

    Ng, E.G.; Peyton, B.W.

    1991-12-01

    As with many other linear algebra algorithms, devising a portable implementation of sparse Cholesky factorization that performs well on the broad range of computer architectures currently available is a formidable challenge. Even after limiting our attention to machines with only one processor, as we have done in this report, there are still several interesting issues to consider. For dense matrices, it is well known that block factorization algorithms are the best means of achieving this goal. We take this approach for sparse factorization as well. This paper has two primary goals. First, we examine two sparse Cholesky factorization algorithms, the multifrontal method and a blocked left-looking sparse Cholesky method, in a systematic and consistent fashion, both to illustrate the strengths of the blocking techniques in general and to obtain a fair evaluation of the two approaches. Second, we assess the impact of various implementation techniques on time and storage efficiency, paying particularly close attention to the work-storage requirement of the two methods and their variants.
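    The panel structure that makes blocked factorization effective can be sketched on a dense matrix in plain Python: factor a diagonal panel, triangular-solve the rows below it, then apply a GEMM-like update to the trailing submatrix. This is an illustrative right-looking blocked variant of my own, not the report's multifrontal or left-looking sparse codes, which additionally exploit sparsity and machine-specific kernels.

    ```python
    import math

    def block_cholesky(A, nb):
        """Blocked Cholesky of a symmetric positive-definite matrix A
        (list of lists): returns lower-triangular L with A = L L^T,
        processing nb-wide column panels."""
        n = len(A)
        L = [row[:] for row in A]            # factor a copy in place
        for k0 in range(0, n, nb):
            k1 = min(k0 + nb, n)
            # 1. factor the diagonal panel (unblocked Cholesky)
            for j in range(k0, k1):
                for p in range(k0, j):
                    for i in range(j, k1):
                        L[i][j] -= L[i][p] * L[j][p]
                L[j][j] = math.sqrt(L[j][j])
                for i in range(j + 1, k1):
                    L[i][j] /= L[j][j]
            # 2. triangular solve for the sub-diagonal block rows
            for i in range(k1, n):
                for j in range(k0, k1):
                    for p in range(k0, j):
                        L[i][j] -= L[i][p] * L[j][p]
                    L[i][j] /= L[j][j]
            # 3. GEMM-like trailing-matrix update with the new panel
            for i in range(k1, n):
                for j in range(k1, i + 1):
                    for p in range(k0, k1):
                        L[i][j] -= L[i][p] * L[j][p]
        for i in range(n):                   # zero the strict upper triangle
            for j in range(i + 1, n):
                L[i][j] = 0.0
        return L
    ```

    In production codes step 3 is where blocking pays off, since it maps to dense matrix-matrix kernels with good cache reuse.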

  12. Quantum Computing Approach to Nonrelativistic and Relativistic Molecular Energy Calculations

    Czech Academy of Sciences Publication Activity Database

    Veis, Libor; Pittner, Jiří

    Hoboken : John Wiley, 2014 - (Kais, S.), s. 107-135 ISBN 978-1-118-49566-7. - (Advances in Chemical Physics. Vol. 154) R&D Projects: GA ČR GA203/08/0626 Institutional support: RVO:61388955 Keywords : full configuration interaction (FCI) calculations * nonrelativistic molecular hamiltonians * quantum computing Subject RIV: CF - Physical ; Theoretical Chemistry

  13. Soft computing approaches to uncertainty propagation in environmental risk mangement

    OpenAIRE

    Kumar, Vikas

    2008-01-01

    Real-world problems, especially those involving natural systems, are complex and composed of many nondeterministic components with non-linear coupling. It turns out that in dealing with such systems, one has to face a high degree of uncertainty and tolerate imprecision. Classical system models based on numerical analysis, crisp logic or binary logic have the characteristics of precision and categoricity and are classified as the hard computing approach. In contrast, soft computing approaches like pro...

  14. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing. The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems. Sampling-based simulation techniques are now an invaluable tool for exploring statistical models. This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods. It also includes some advanced met...
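    The two classical topics the blurb names, random number generation and Monte Carlo, meet in inverse-transform sampling: if U is uniform on (0,1), then −log(1−U)/λ is Exponential(λ), and a Monte Carlo average of such draws estimates the mean 1/λ. A minimal sketch (function names are my own):

    ```python
    import math
    import random

    def sample_exponential(rate, n, seed=0):
        """Inverse-transform sampling: invert the Exponential(rate) CDF
        F(x) = 1 - exp(-rate*x) at a uniform random point."""
        rng = random.Random(seed)
        return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]

    def monte_carlo_mean(samples):
        """Plain Monte Carlo estimate of the expectation."""
        return sum(samples) / len(samples)
    ```

    With rate 2.0 the estimated mean should approach 1/2 at the usual 1/√n rate.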

  15. A computational approach to chemical etiologies of diabetes

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Brunak, Søren; Grandjean, Philippe

    2013-01-01

    Computational meta-analysis can link environmental chemicals to genes and proteins involved in human diseases, thereby elucidating possible etiologies and pathogeneses of non-communicable diseases. We used an integrated computational systems biology approach to examine possible pathogenetic..., thus offering promising guidance for future research in regard to the etiology and pathogenesis of complex diseases...

  16. General approaches in ensemble quantum computing

    Indian Academy of Sciences (India)

    V Vimalan; N Chandrakumar

    2008-01-01

    We have developed methodology for NMR quantum computing focusing on enhancing the efficiency of initialization, of logic gate implementation, and of readout. Our general strategy involves the application of rotating-frame pulse sequences to prepare pseudopure states and to perform logic operations. We demonstrate our methodology experimentally for both homonuclear and heteronuclear spin ensembles. On model two-spin systems, the initialization time of one of our sequences is three-fourths (in the heteronuclear case) or one-fourth (in the homonuclear case) of that of typical pulsed free precession sequences, attaining the same initialization efficiency. We have implemented the logical SWAP operation in homonuclear AMX spin systems using selective isotropic mixing, reducing the duration to a third of that of the standard re-focused INEPT-type sequence. We introduce the 1D version for readout of the rotating-frame SWAP operation, in an attempt to reduce readout time. We further demonstrate the Hadamard mode of 1D SWAP, which offers a 2N-fold reduction in experiment time for a system with N working bits, attaining the same sensitivity as the standard 1D version.
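    The logical SWAP operation implemented here via selective isotropic mixing is, in the circuit model, equivalent to three alternating CNOT gates; this is a standard identity, not the paper's pulse sequence, shown below as a quick numerical check on the 4×4 two-qubit matrices.

    ```python
    def mat_mul(A, B):
        """Multiply two square matrices given as lists of lists."""
        n = len(A)
        return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
                for i in range(n)]

    # Two-qubit basis order |00>, |01>, |10>, |11> (qubit 0 is leftmost).
    # CNOT with qubit 0 as control, qubit 1 as target:
    CNOT01 = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]]
    # CNOT with qubit 1 as control, qubit 0 as target:
    CNOT10 = [[1, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0]]

    # SWAP = CNOT01 . CNOT10 . CNOT01 exchanges |01> and |10>.
    SWAP = mat_mul(CNOT01, mat_mul(CNOT10, CNOT01))
    ```

    The product is the permutation matrix that leaves |00> and |11> fixed and exchanges |01> with |10>.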

  17. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, etc., and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.
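    The correlation-matrix idea can be sketched directly: compute a Pearson correlation matrix over traffic features during a baseline window, then flag a window whose matrix deviates strongly from it. This is an illustrative sketch of the general technique, not the paper's exact method; the deviation score and all names are my own assumptions.

    ```python
    import math

    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length series."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        vx = math.sqrt(sum((a - mx) ** 2 for a in x))
        vy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (vx * vy)

    def corr_matrix(features):
        """features: list of equal-length series, one per traffic metric."""
        k = len(features)
        return [[pearson(features[i], features[j]) for j in range(k)]
                for i in range(k)]

    def anomaly_score(baseline, window):
        """Largest absolute entry-wise deviation between the baseline
        correlation matrix and the one observed in the current window."""
        C0, C1 = corr_matrix(baseline), corr_matrix(window)
        k = len(C0)
        return max(abs(C0[i][j] - C1[i][j]) for i in range(k) for j in range(k))
    ```

    A scan or flood that decouples normally correlated metrics (e.g. packet and byte counts) drives the score up.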

  18. The advanced light water reactor program approach for gaining acceptance

    International Nuclear Information System (INIS)

    Electric utilities in the US face several obstacles and disincentives to building new nuclear power plants. A reformed licensing process that permits true one-step licensing is a prerequisite, as is timely implementation of the technical and political solutions of the high-level waste issue by the federal government and low-level waste storage by state governments. In addition, a more tangible acceptance of the need, benefits, and residual risk of nuclear power by all concerned institutions is an essential impetus for the return of the nuclear option. An improved version of the light water reactor (LWR) is expected to be the preferred choice for the next increment of nuclear capacity ordered in the US. The paper discusses the need for research and advanced LWR development. The advanced LWR program, sponsored jointly by the utility industry and the US Department of Energy (DOE), offers the most promising approach to developing the next generation of nuclear power plants

  19. An integrated approach to emotion recognition for advanced emotional intelligence

    OpenAIRE

    Panagiotis D Bamidis; Frantzidis, Christos A.; Konstantinidis, Evdokimos I.; Luneski, Andrej; Lithari, Chrysa; Klados, Manousos A.; Bratsas, Charalambos; Papadelis, Christos; Pappas, Costas

    2009-01-01

    Emotion identification is beginning to be considered as an essential feature in human-computer interaction. However, most of the studies are mainly focused on facial expression classifications and speech recognition and not much attention has been paid until recently to physiological pattern recognition. In this paper, an integrative approach is proposed to emotional interaction by fusing multi-modal signals. Subjects are exposed to pictures selected from the International Affective Pic...

  20. An Advanced Survey on Cloud Computing and State-of-the-art Research Issues

    Directory of Open Access Journals (Sweden)

    Mohiuddin Ahmed

    2012-01-01

    Full Text Available Cloud Computing is considered one of the emerging arenas of computer science in recent times. It provides excellent facilities to business entrepreneurs through flexible infrastructure. Although cloud computing is facilitating the Information Technology industry, research and development in this arena is yet to be satisfactory. Our contribution in this paper is an advanced survey focusing on the cloud computing concept and the most advanced research issues. This paper provides a better understanding of cloud computing and identifies important research issues in this burgeoning area of computer science.

  1. What is intrinsic motivation? A typology of computational approaches

    Directory of Open Access Journals (Sweden)

    Pierre-Yves Oudeyer

    2009-11-01

    Full Text Available Intrinsic motivation, the causal mechanism for spontaneous exploration and curiosity, is a central concept in developmental psychology. It has been argued to be a crucial mechanism for open-ended cognitive development in humans, and as such has attracted growing interest from developmental roboticists in recent years. The goal of this paper is threefold. First, it provides a synthesis of the different approaches to intrinsic motivation in psychology. Second, by interpreting these approaches in a computational reinforcement learning framework, we argue that they are not operational and are even sometimes inconsistent. Third, we set the ground for a systematic operational study of intrinsic motivation by presenting a formal typology of possible computational approaches. This typology is partly based on existing computational models, but also presents new ways of conceptualizing intrinsic motivation. We argue that this kind of computational typology might be useful for opening new avenues for research both in psychology and developmental robotics.
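    One family in such typologies, knowledge-based intrinsic motivation, rewards an agent for prediction error: well-predicted (boring) stimuli yield low reward, novel ones high reward. The sketch below is my own toy illustration of that idea within a reinforcement-learning framing, not a model from the paper; the scalar predictor and learning rate are placeholder assumptions.

    ```python
    def intrinsic_rewards(observations, lr=0.5):
        """Compute prediction-error intrinsic rewards for a stream of
        scalar observations: the agent keeps a running prediction of the
        next observation, and the intrinsic reward at each step is the
        magnitude of its prediction error."""
        pred, rewards = 0.0, []
        for obs in observations:
            rewards.append(abs(obs - pred))   # surprise = intrinsic reward
            pred += lr * (obs - pred)         # simple learned forward model
        return rewards
    ```

    On a repeated stimulus the reward decays (habituation), and a sudden change makes it spike again, reproducing the curiosity-like dynamics the concept is meant to capture.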

  2. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution-set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
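    The abstract does not give the formulas for its two indices, so the following is only a plausible stand-in of my own devising: variation rate as the fraction of the current solution set that is new relative to the previous set, and progress rate as the drop in the best (minimized) fitness between consecutive sets.

    ```python
    def variation_rate(prev_set, curr_set):
        """Fraction of the current solution set not present in the
        previous one -- a proxy for how much the population still varies."""
        prev = set(prev_set)
        return sum(1 for s in curr_set if s not in prev) / len(curr_set)

    def progress_rate(prev_set, curr_set, fitness):
        """Drop in the best (minimized) fitness between consecutive
        solution sets; positive values indicate progress."""
        return min(map(fitness, prev_set)) - min(map(fitness, curr_set))
    ```

    Tracking these two numbers along a run of a GA, ACO, or PSO gives the kind of variety-vs-optimality trace the paper's convergence analysis is built on.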

  3. Propagation of computer virus both across the Internet and external computers: A complex-network approach

    Science.gov (United States)

    Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi; Jin, Jian; He, Li

    2014-08-01

    Based on the assumption that external computers (particularly, infected external computers) are connected to the Internet, and by considering the influence of the Internet topology on computer virus spreading, this paper establishes a novel computer virus propagation model with a complex-network approach. This model possesses a unique (viral) equilibrium which is globally attractive. Some numerical simulations are also given to illustrate this result. Further study shows that the computers with higher node degrees are more susceptible to infection than those with lower node degrees. In this regard, some appropriate protective measures are suggested.
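    The finding that high-degree nodes are more susceptible can be reproduced with the standard degree-based mean-field treatment of an SIS-type epidemic on a heterogeneous network. This is a generic textbook sketch, not the paper's exact model (which also couples external computers to the Internet); λ and the degree distribution below are assumptions.

    ```python
    def sis_stationary_by_degree(degree_dist, lam, iters=2000):
        """Degree-based mean-field SIS on a heterogeneous network:
        iterate Theta = sum_k k P(k) rho_k / <k> with
        rho_k = lam*k*Theta / (1 + lam*k*Theta) to the stationary point.
        Returns {degree: stationary infected fraction}."""
        mean_k = sum(k * p for k, p in degree_dist.items())
        theta = 0.5                                  # initial guess
        for _ in range(iters):
            rho = {k: lam * k * theta / (1.0 + lam * k * theta)
                   for k in degree_dist}
            theta = sum(k * p * rho[k] for k, p in degree_dist.items()) / mean_k
        return {k: lam * k * theta / (1.0 + lam * k * theta)
                for k in degree_dist}
    ```

    Because ρ_k is increasing in k, the stationary infection level rises with node degree, matching the qualitative conclusion of the paper.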

  4. Advanced multiresponse process optimisation an intelligent and integrated approach

    CERN Document Server

    Šibalija, Tatjana V

    2016-01-01

    This book presents an intelligent, integrated, problem-independent method for multiresponse process optimization. In contrast to traditional approaches, the idea of this method is to provide a unique model for the optimization of various processes, without imposition of assumptions relating to the type of process, the type and number of process parameters and responses, or interdependences among them. The presented method for experimental design of processes with multiple correlated responses is composed of three modules: an expert system that selects the experimental plan based on the orthogonal arrays; the factor effects approach, which performs processing of experimental data based on Taguchi’s quality loss function and multivariate statistical methods; and process modeling and optimization based on artificial neural networks and metaheuristic optimization algorithms. The implementation is demonstrated using four case studies relating to high-tech industries and advanced, non-conventional processes.
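The factor-effects module described above builds on Taguchi's quadratic quality loss function, in which any deviation of a response y from its target m incurs a loss proportional to the squared deviation. A minimal sketch follows; the loss coefficient, target, and measurements are hypothetical numbers, not taken from the book's case studies.

```python
# Taguchi nominal-the-best quality loss: L(y) = k * (y - m)**2.
def quality_loss(y, target, k=1.0):
    """Loss incurred by one response value y relative to the target."""
    return k * (y - target) ** 2

def mean_loss(responses, target, k=1.0):
    """Average loss over replicated measurements of one response."""
    return sum(quality_loss(y, target, k) for y in responses) / len(responses)

# Example: response target 10.0 with unit loss coefficient.
losses = mean_loss([9.8, 10.1, 10.3], 10.0)
```

In a multiresponse setting like the book's, per-response losses such as this are typically combined (for instance via weighting or multivariate statistics) into a single optimization criterion.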

  5. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    Science.gov (United States)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, the specification of a graphical language for expressing distributed computations, and the specification of an expert system for aiding grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  6. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness of the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to computing in Danish high schools, based on a conceptual framework derived from ideas related to computational thinking. We present two main theses on which the subject is based, and we present the included knowledge areas and didactical design principles. Finally, we summarize the status and future plans for the subject and related development projects.

  7. Recent Advances in Computational Methods for Nuclear Magnetic Resonance Data Processing

    KAUST Repository

    Gao, Xin

    2013-01-11

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  8. Parallel computing in genomic research: advances and applications

    Directory of Open Access Journals (Sweden)

    Ocaña K

    2015-11-01

    Full Text Available Kary Ocaña,1 Daniel de Oliveira2 1National Laboratory of Scientific Computing, Petrópolis, Rio de Janeiro, 2Institute of Computing, Fluminense Federal University, Niterói, Brazil Abstract: Today's genomic experiments have to process so-called "biological big data", which is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments so as to benefit from parallelism techniques and HPC capabilities. Keywords: high-performance computing, genomic research, cloud computing, grid computing, cluster computing, parallel computing
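The parallelism techniques surveyed in the review above can be illustrated on a toy genomics task: computing the GC content of sequence chunks across a worker pool. This sketch uses a thread pool for portability; real pipelines would stream FASTA files and typically run on processes, clusters, or GPUs as the article discusses. The sequences are made up.

```python
# Parallel map over sequence chunks: each worker computes GC content
# (fraction of G/C bases) for one chunk.
from concurrent.futures import ThreadPoolExecutor

def gc_content(seq):
    """Fraction of G/C bases in one sequence chunk."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def parallel_gc(chunks, workers=4):
    """Compute GC content of every chunk in parallel, preserving order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(gc_content, chunks))

if __name__ == "__main__":
    chunks = ["ATGCGC", "ATATAT", "GGGGCC", "ATGCAT"]
    print(parallel_gc(chunks))
```

The same map-over-chunks pattern scales from a workstation thread pool to the cloud, grid, and cluster environments the review covers; only the executor changes.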

  9. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    International Nuclear Information System (INIS)

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column and made use of Lagrangian trajectory analysis for the bubble and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data, and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for bubble motion in a constant shear flow field was also developed. The flow conditions in the simple shear flow device were studied using the PIV method. Concentrations and velocities of particles of different sizes near a wall in a duct flow were also measured. The technique of Phase-Doppler anemometry was used in these studies. An Eulerian volume of fluid (VOF) computational model for the flow conditions in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed, and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed. The model predictions were compared with the experimental data, and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied, and the simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows, with and without chemical reaction, in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.

  10. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    Energy Technology Data Exchange (ETDEWEB)

    Goodarz Ahmadi

    2004-10-01

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column and made use of Lagrangian trajectory analysis for the bubble and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data, and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for bubble motion in a constant shear flow field was also developed. The flow conditions in the simple shear flow device were studied using the PIV method. Concentrations and velocities of particles of different sizes near a wall in a duct flow were also measured. The technique of Phase-Doppler anemometry was used in these studies. An Eulerian volume of fluid (VOF) computational model for the flow conditions in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed, and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed. The model predictions were compared with the experimental data, and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied, and the simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows, with and without chemical reaction, in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.

  11. An approach to computing direction relations between separated object groups

    Directory of Open Access Journals (Sweden)

    H. Yan

    2013-06-01

    Full Text Available Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation, and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, theoretically based on Gestalt principles and the idea of multi-directions. The approach first triangulates the two object groups; then it constructs the Voronoi diagram between the two groups using the triangular network; after this, the normal of each Voronoi edge is calculated and the quantitative expression of the direction relations is constructed; finally, the quantitative direction relations are transformed into qualitative ones. Psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and that the results are correct from the point of view of spatial cognition.
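The authors' full pipeline (triangulation, Voronoi construction, edge normals) is too long to sketch here, but the final step, turning a quantitative bearing into a qualitative direction relation, can be illustrated with a centroid-based simplification. This is a deliberate departure from the Voronoi-based method of the record, shown only to make the quantitative-to-qualitative mapping concrete.

```python
# Map the bearing between two group centroids to one of eight qualitative
# direction relations (east, northeast, north, ...).
import math

def centroid(points):
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def qualitative_direction(group_a, group_b):
    """Qualitative direction of group B as seen from group A (8 sectors)."""
    ax, ay = centroid(group_a)
    bx, by = centroid(group_b)
    angle = math.degrees(math.atan2(by - ay, bx - ax)) % 360
    sectors = ["east", "northeast", "north", "northwest",
               "west", "southwest", "south", "southeast"]
    return sectors[int(((angle + 22.5) % 360) // 45)]  # 45-degree sectors
```

A single centroid bearing discards the multi-direction information the Voronoi edge normals preserve; that loss is precisely why the authors propose the richer construction.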

  12. An approach to computing direction relations between separated object groups

    Science.gov (United States)

    Yan, H.; Wang, Z.; Li, J.

    2013-09-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation, and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, theoretically based on Gestalt principles and the idea of multi-directions. The approach first triangulates the two object groups, and then it constructs the Voronoi diagram between the two groups using the triangular network. After this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed. Finally, the quantitative direction relations are transformed into qualitative ones. Psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and that the results are correct from the point of view of spatial cognition.

  13. A tale of three bio-inspired computational approaches

    Science.gov (United States)

    Schaffer, J. David

    2014-05-01

    I will provide a high-level walk-through of three computational approaches derived from Nature. First, evolutionary computation implements what we may call the "mother of all adaptive processes." Some variants on the basic algorithms will be sketched, along with some lessons I have gleaned from three decades of working with EC. Next come neural networks, computational approaches that have long been studied as a possible route to "thinking machines," an old dream of humanity, and that are based upon the only known existing example of intelligence. Then I give a brief overview of attempts to combine these two approaches, which some hope will allow us to evolve machines we could never hand-craft. Finally, I will touch on artificial immune systems, Nature's highly sophisticated defense mechanism, which has emerged in two major stages: the innate and the adaptive immune systems. This technology is finding applications in the cyber security world.
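The first approach the talk covers, evolutionary computation, can be illustrated with a plain generational genetic algorithm on the OneMax toy problem (maximize the number of 1-bits). This is a generic textbook sketch, not a variant from the talk, and every parameter value is illustrative.

```python
# Generational GA with tournament selection, one-point crossover, and
# bit-flip mutation, maximizing the number of 1-bits in a bitstring.
import random

def onemax(bits):
    return sum(bits)

def genetic_algorithm(n_bits=30, pop_size=40, generations=60,
                      p_mut=0.02, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)              # binary tournament
            return a if onemax(a) >= onemax(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)         # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(onemax(ind) for ind in pop)
```

Selection pressure drives the population well above the random expectation of n_bits/2 within a few dozen generations; hybrids of this loop with neural networks (neuroevolution) are the combination the talk alludes to.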

  14. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    Science.gov (United States)

    1985-01-01

    Slides are reproduced that describe the importance of having high-performance number-crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user and that, in the long term, Ames knows the best possible solutions for its number-crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at extracting maximum information from the three-dimensional calculations by using real-time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random in the same way that the dense and regular computed results are displayed.

  15. A GPU-Computing Approach to Solar Stokes Profile Inversion

    OpenAIRE

    Harker, Brian J.; Mighell, Kenneth J.

    2012-01-01

    We present a new computational approach to the inversion of solar photospheric Stokes polarization profiles, under the Milne-Eddington model, for vector magnetography. Our code, named GENESIS (GENEtic Stokes Inversion Strategy), employs multi-threaded parallel-processing techniques to harness the computing power of graphics processing units (GPUs), along with algorithms designed to exploit the inherent parallelism of the Stokes inversion problem. Using a genetic algorithm (GA) engineered specif...

  16. Computational approaches for the study of biotechnologically-relevant macromolecules.

    OpenAIRE

    Filippi, G.

    2016-01-01

    Computer-based techniques have become especially important in molecular biology, since they often represent the only viable way to understand some phenomena at the atomic and molecular level. The complexity of biological systems, which usually needs to be analyzed at different levels of accuracy, requires the application of different approaches. Computational methodologies applied to biotechnology allow a molecular comprehension of biological systems at different levels of depth. Quantum mech...

  17. Distributed computer-controlled systems: the DEAR-COTS approach

    OpenAIRE

    Veríssimo, P; A. Casimiro; L. M. Pinho; vasques, f; Rodrigues, L.; E. Tovar

    2000-01-01

    This paper proposes a new architecture targeting real-time and reliable Distributed Computer-Controlled Systems (DCCS). This architecture provides a structured approach for the integration of soft and/or hard real-time applications with Commercial Off-The-Shelf (COTS) components. The Timely Computing Base model is used as the reference model to deal with the heterogeneity of system components with respect to guaranteeing the timeliness of applications. The reliability and ava...

  18. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    Energy Technology Data Exchange (ETDEWEB)

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    Tri-isotropic (TRISO) fuel particle coating is critical for the future use of nuclear energy produced by advanced gas reactors (AGRs). The fuel kernels are coated using chemical vapor deposition in a spouted fluidized bed. The challenges encountered in operating TRISO fuel coaters stem from the fact that in modern AGRs, such as High Temperature Gas Reactors (HTGRs), the acceptable level of defective/failed coated particles is essentially zero. This specification requires processes that produce coated spherical particles with even coatings and extremely low defect fractions. Unfortunately, the scale-up and design of current processes and coaters have been based on empirical approaches, and the coaters are operated as black boxes. Hence, a voluminous amount of experimental development and trial-and-error work has been conducted. It has been clearly demonstrated that the quality of the coating applied to the fuel kernels is affected by the hydrodynamics, solids flow field, and flow regime characteristics of the spouted bed coaters, which themselves are influenced by design parameters and operating variables. Further complicating the outlook for future fuel-coating technology and nuclear energy production is the fact that a variety of new concepts will involve fuel kernels of different sizes and with compositions of different densities. Therefore, without a fundamental understanding of the underlying phenomena in the spouted bed TRISO coater, a significant amount of effort is required to produce each type of particle, with a significant risk of not meeting the specifications. This difficulty will significantly and negatively impact the application of AGRs for power generation and pose further challenges to them as an alternative source of commercial energy production. Accordingly, the proposed work seeks to overcome such hurdles and advance the scale-up, design, and performance of TRISO fuel particle spouted bed coaters.
The overall objectives of the proposed work are

  19. The Advance of Computing from the Ground to the Cloud

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  20. Advances in soft computing, intelligent robotics and control

    CERN Document Server

    Fullér, Robert

    2014-01-01

    Soft computing, intelligent robotics and control are at the core of contemporary engineering interest. Essential characteristics of soft computing methods are the ability to handle vague information, to apply human-like reasoning, their learning capability, and ease of application. Soft computing techniques are widely applied in the control of dynamic systems, including mobile robots. The present volume is a collection of 20 chapters written by respected experts in these fields, addressing various theoretical and practical aspects of soft computing, intelligent robotics and control. The first part of the book addresses issues of intelligent robotics, including robust fixed point transformation design, experimental verification of the input-output feedback linearization of a differentially driven mobile robot, and applying kinematic synthesis to micro electro-mechanical systems design. The second part of the book is devoted to fundamental aspects of soft computing. This includes practical aspects of fuzzy rule ...

  1. Developing a New Atomic Physics Computer Program (HTAC) to Perform Atomic Structure and Transition Rate Calculations in Three Advanced Methods

    OpenAIRE

    Amani Tahat; Mahmoud Abu-Allaban; Safeia Hamasha

    2011-01-01

    In this study, a new atomic physics program (HTAC) is introduced and tested. It is a utility program designed to automate the computation of various atomic structure and spectral data. It is the first comprehensive code that enables performing atomic calculations based on three advanced theories: the fully relativistic configuration interaction approach, multi-reference many-body perturbation theory, and the R-matrix method. It has been designed to generate tabulated atomic data files tha...

  2. Simulating advanced life support systems to test integrated control approaches

    Science.gov (United States)

    Kortenkamp, D.; Bell, S.

    Simulations allow for testing of life support control approaches before hardware is designed and built. Simulations also allow for the safe exploration of alternative control strategies during life support operation. As such, they are an important component of any life support research program and testbed. This paper describes a specific advanced life support simulation being created at NASA Johnson Space Center. It is a discrete-event simulation that is dynamic and stochastic. It simulates all major components of an advanced life support system, including crew (with variable ages, weights and genders), biomass production (with scalable plantings of ten different crops), water recovery, air revitalization, food processing, solid waste recycling and energy production. Each component is modeled as a producer of certain resources and a consumer of certain resources. The control system must monitor (via sensors) and control (via actuators) the flow of resources throughout the system to provide life support functionality. The simulation is written in an object-oriented paradigm that makes it portable, extensible and reconfigurable.
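The producer/consumer resource model described above can be sketched in a few lines: each component consumes certain resources and produces others each tick, and the controller's job is to keep the stores within bounds. The components and rates below are hypothetical illustrations, not JSC's actual model.

```python
# One tick of a producer/consumer life support model: each component runs
# only if all of its input resources are available in the stores.
def step(stores, components):
    for consumes, produces in components:
        if all(stores[r] >= amt for r, amt in consumes.items()):
            for r, amt in consumes.items():
                stores[r] -= amt
            for r, amt in produces.items():
                stores[r] = stores.get(r, 0.0) + amt
    return stores

# Crew consumes water and O2, producing CO2 and wastewater; water recovery
# returns most of the water; air revitalization closes the O2/CO2 loop.
components = [
    ({"water": 3.0, "o2": 0.8}, {"co2": 1.0, "wastewater": 3.0}),  # crew
    ({"wastewater": 3.0}, {"water": 2.7}),       # water recovery (90% eff.)
    ({"co2": 1.0}, {"o2": 0.8}),                 # air revitalization
]
stores = {"water": 100.0, "o2": 10.0, "co2": 0.0, "wastewater": 0.0}
for _ in range(10):
    step(stores, components)
```

With these rates the O2 loop is closed but the water loop loses 0.3 units per tick, the kind of slow resource drift a life support control system must detect and correct.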

  3. A computationally efficient approach for hidden-Markov model-augmented fingerprint-based positioning

    Science.gov (United States)

    Roth, John; Tummala, Murali; McEachen, John

    2016-09-01

    This paper presents a computationally efficient approach for mobile subscriber position estimation in wireless networks. A method of data scaling assisted by timing adjust is introduced in fingerprint-based location estimation under a framework which allows for minimising computational cost. The proposed method maintains a level of accuracy comparable to the traditional case where no data scaling is used, and is evaluated in a simulated environment under varying channel conditions. The proposed scheme is studied when augmented by a hidden Markov model that matches the internal parameters to the prevailing channel conditions, thus minimising computational cost while maximising accuracy. Furthermore, the timing adjust quantity, available in modern wireless signalling messages, is shown to further reduce computational cost and increase accuracy when available. The results may be seen as a significant step towards integrating advanced position-based modelling with power-sensitive mobile devices.
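The fingerprint-matching core of such a scheme can be sketched simply: a survey database maps known positions to received-signal-strength (RSSI) vectors, and a query is located at the best-matching fingerprint. The `scale` helper below only loosely stands in for the paper's data-scaling idea (coarser vectors mean cheaper comparisons); the database values are made up.

```python
# Nearest-neighbour fingerprint positioning with an optional scaling step.
def locate(query, database):
    """Return the surveyed position whose fingerprint is nearest the query."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(database, key=lambda pos: dist2(query, database[pos]))

def scale(rssi, factor):
    """Uniformly scale an RSSI vector (coarser resolution, cheaper matching)."""
    return [round(v / factor) for v in rssi]

# Fingerprints: position -> RSSI from three base stations (dBm, illustrative)
database = {
    (0, 0): [-40, -70, -80],
    (5, 0): [-70, -45, -75],
    (0, 5): [-75, -72, -42],
}
pos = locate([-42, -68, -79], database)   # query taken near (0, 0)
```

In the paper's framework, the HMM's role is to choose how aggressively to scale given the current channel conditions; here the factor is simply a fixed parameter.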

  4. Advanced neural network-based computational schemes for robust fault diagnosis

    CERN Document Server

    Mrugalski, Marcin

    2014-01-01

    The present book is devoted to problems of adapting artificial neural networks to robust fault diagnosis schemes. It presents neural-network-based modelling and estimation techniques used for designing robust fault diagnosis schemes for non-linear dynamic systems. Part of the book focuses on fundamental issues such as architectures of dynamic neural networks, methods for designing neural networks and fault diagnosis schemes, and the importance of robustness. The book is of tutorial value and can serve as a good starting point for newcomers to this field. The book is also devoted to advanced schemes for describing neural model uncertainty. In particular, methods for computing neural network uncertainty with robust parameter estimation are presented. Moreover, a novel approach to system identification with the state-space GMDH neural network is delivered. All the concepts described in this book are illustrated by both simple academic examples and practica...
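The residual-based idea underlying such schemes can be sketched compactly: compare measured outputs with model estimates and flag a fault when the residual leaves an acceptance band. The book derives adaptive bounds from neural-model uncertainty; the fixed threshold and stubbed model predictions below are simplifications for illustration only.

```python
# Residual-based fault detection: flag samples where the measurement
# deviates from the model estimate by more than a threshold.
def detect_faults(measured, estimated, threshold=0.5):
    """Return indices where |measurement - model estimate| > threshold."""
    return [i for i, (y, yhat) in enumerate(zip(measured, estimated))
            if abs(y - yhat) > threshold]

measured  = [1.0, 1.1, 1.0, 2.4, 1.0]   # sample 3 carries a fault
estimated = [1.0, 1.0, 1.0, 1.0, 1.0]   # neural-model prediction (stub)
faults = detect_faults(measured, estimated)   # -> [3]
```

Robustness, in the book's sense, comes from replacing the fixed threshold with bounds that widen where the neural model itself is uncertain, so that modelling error is not mistaken for a fault.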

  5. Tutorial on Computing: Technological Advances, Social Implications, Ethical and Legal Issues

    OpenAIRE

    Debnath, Narayan

    2012-01-01

    Computing and information technology have made significant advances. The use of computing and technology is a major aspect of our lives, and this use will only continue to increase in our lifetime. Electronic digital computers and high performance communication networks are central to contemporary information technology. The computing applications in a wide range of areas including business, communications, medical research, transportation, entertainments, and education are transforming lo...

  6. Advances in Physarum machines sensing and computing with Slime mould

    CERN Document Server

    2016-01-01

    This book is devoted to the slime mould Physarum polycephalum, a large single cell capable of distributed sensing, concurrent information processing, parallel computation, and decentralized actuation. The ease of culturing and experimenting with Physarum makes this slime mould an ideal substrate for real-world implementations of unconventional sensing and computing devices. The book is a treatise of theoretical and experimental laboratory studies on the sensing and computing properties of slime mould, and on the development of mathematical and logical theories of Physarum behavior. It is shown how to make logical gates and circuits, and electronic devices (memristors, diodes, transistors, wires, chemical and tactile sensors), with the slime mould. The book demonstrates how to modify the properties of Physarum computing circuits with functional nano-particles and polymers, to interface the slime mould with field-programmable arrays, and to use Physarum as a controller of microbial fuel cells. A unique multi-agent model...

  7. Building an Advanced Computing Environment with SAN Support

    Institute of Scientific and Technical Information of China (English)

    Dajian YANG; Mei MA; et al.

    2001-01-01

    The current computing environment of our Computing Center at IHEP uses a SAS (Server Attached Storage) architecture, attaching all the storage devices directly to the machines. This kind of storage strategy cannot properly meet the requirements of our BEPC II/BESⅢ project. Thus we designed and implemented a SAN-based computing environment, which consists of several computing farms, a three-level storage pool, a set of storage management software, and a web-based data management system. The features of our system include cross-platform data sharing, fast data access, high scalability, convenient storage management, and data management.

  8. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer-assisted virtual environment, or CAVE(TM), that allows scientists and engineers to literally walk into their data...

  9. Parallel computing in genomic research: advances and applications

    OpenAIRE

    Ocaña K; de Oliveira D

    2015-01-01

    Kary Ocaña,1 Daniel de Oliveira2 1National Laboratory of Scientific Computing, Petrópolis, Rio de Janeiro, 2Institute of Computing, Fluminense Federal University, Niterói, Brazil Abstract: Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques...

  10. Parallel computing in genomic research: advances and applications

    OpenAIRE

    Oliveira, Daniel De

    2015-01-01

    Kary Ocaña,1 Daniel de Oliveira2 1National Laboratory of Scientific Computing, Petrópolis, Rio de Janeiro, 2Institute of Computing, Fluminense Federal University, Niterói, Brazil Abstract: Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations...

  11. Advanced Simulation and Computing: A Summary Report to the Director's Review

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M G; Peck, T

    2003-06-01

    It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, was reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress in all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called "Advanced Simulation and Computing." Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; that appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures that more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and identifies the documentation expected to be included in the "Assessment File".

  12. Advances in computational design and analysis of airbreathing propulsion systems

    Science.gov (United States)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  13. Advancements in Violin-Related Human-Computer Interaction

    DEFF Research Database (Denmark)

    Overholt, Daniel

    2014-01-01

    of human intelligence and emotion is at the core of the Musical Interface Technology Design Space, MITDS. This is a framework that endeavors to retain and enhance such traits of traditional instruments in the design of interactive live performance interfaces. Utilizing the MITDS, advanced Human...

  14. Advanced Modulation Techniques for High-Performance Computing Optical Interconnects

    DEFF Research Database (Denmark)

    Karinou, Fotini; Borkowski, Robert; Zibar, Darko;

    2013-01-01

    optical shared memory supercomputer interconnect system switch fabric. In particular, we investigate the resilience of the aforementioned advanced modulation formats to the nonlinearities of semiconductor optical amplifiers, used as ON/OFF gates in the supercomputer optical switch fabric under study. In...

  15. A computational intelligence approach to the Mars Precision Landing problem

    Science.gov (United States)

    Birge, Brian Kent, III

    Various proposed Mars missions, such as the Mars Sample Return Mission (MRSR) and the Mars Smart Lander (MSL), require precise re-entry terminal position and velocity states. This is to achieve mission objectives including rendezvous with a previously landed mission, or reaching a particular geographic landmark. The current state-of-the-art footprint is on the order of kilometers. For this research, a Mars precision landing is achieved with a landed footprint of no more than 100 meters, for a set of initial entry conditions representing worst-guess dispersions. Obstacles to reducing the landed footprint include trajectory dispersions due to initial atmospheric entry conditions (entry angle, parachute deployment height, etc.), environment (wind, atmospheric density, etc.), parachute deployment dynamics, and unavoidable injection error (propagated error from launch on). Weather and atmospheric models have been developed. Three descent scenarios have been examined. First, terminal re-entry is achieved via a ballistic parachute with concurrent thrusting events while on the parachute, followed by a gravity turn. Second, terminal re-entry is achieved via a ballistic parachute followed by a gravity turn to hover and then thrust vectoring to the desired location. Third, a guided parafoil approach followed by vectored thrusting to reach terminal velocity is examined. The guided parafoil is determined to be the best architecture. The purpose of this study is to examine the feasibility of using a computational intelligence strategy to facilitate precision planetary re-entry, specifically to take an approach that is somewhat more intuitive and less rigid, and see where it leads. The test problems used for all research are variations on proposed Mars landing mission scenarios developed by NASA. A relatively recent method of evolutionary computation is Particle Swarm Optimization (PSO), which can be considered to be in the same general class as Genetic Algorithms.
An improvement over
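The canonical PSO update the abstract refers to can be sketched in a few lines. This is an illustrative, generic implementation on a toy stand-in for a landed-footprint error (squared distance to a hypothetical target state), not the dissertation's actual guidance code; all parameter values are assumptions.

```python
import random

def pso(objective, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Canonical particle swarm optimization (minimization)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # per-particle best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull (own best) + social pull (swarm best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for a landed-footprint error: squared distance to a target point.
target = [1.0, -2.0, 0.5]
err = lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target))
best, best_err = pso(err, dim=3, bounds=(-5.0, 5.0))
```

In a landing application the "position" vector would encode guidance parameters and the objective would be the miss distance from a trajectory simulation; the swarm mechanics are unchanged.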

  16. The use of advanced computer simulation in structural design

    Energy Technology Data Exchange (ETDEWEB)

    Field, C.J.; Mole, A. [Arup, San Francisco, CA (United States); Arkinstall, M. [Arup, Sydney (Australia)

    2005-07-01

    The benefits that can be gained from the application of advanced numerical simulation in building design were discussed. A review of current practices in structural engineering was presented along with an illustration of a range of international project case studies. Structural engineers use analytical methods to evaluate both static and dynamic loads. Structural design is prescribed by a range of building codes, depending on location, building type and loading, but often, buildings do not fit well within the codes, particularly if one wants to take advantage of new technologies and developments in design that are not covered by the code. Advanced simulation refers to the application of mathematical modelling to complex problems, allowing reliable design of a wider range of building types and conditions than is possible using standard practice. Advanced simulation is used to address virtual testing and prototyping, verifying innovative design ideas, forensic engineering, and design optimization. The benefits of advanced simulation include enhanced creativity, improved performance, cost savings, risk management, sustainable design solutions, and better communication. The following 5 case studies illustrated the value gained by using advanced simulation as an integral part of the design process: the earthquake resistant Maison Hermes in Tokyo; the seismic resistant braces known as the Unbonded Brace for use in the United States; a simulation of the existing Disney Museum to evaluate its capacity to resist earthquakes; simulation of the MIT Brain and Cognitive Science Project to evaluate the effect of different foundation types on the vibration entering the building; and, the Beijing Aquatic Center whose design was streamlined by optimized structural analysis. It was suggested that industry should encourage the transfer of technology from other professions and should try to collaborate towards a global building model to construct buildings in a more efficient manner.
7 refs

  17. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date, several attempts have been made to predict the effects of such mutations. The most successful of these is a computational approach called ''Sorting Intolerant From Tolerant'' (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which
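The conservation signal that SIFT-style methods exploit can be illustrated with per-column Shannon entropy over a multiple sequence alignment: invariant columns (low entropy) mark residues where a substitution is likely deleterious. This is a minimal sketch of the general idea, not the SIFT algorithm itself, and the toy alignment is hypothetical.

```python
from math import log2
from collections import Counter

def conservation(alignment):
    """Per-column Shannon entropy of a gapless multiple alignment.

    Low entropy marks conserved (likely functionally important) positions,
    the same signal SIFT-style methods exploit.
    """
    ncols = len(alignment[0])
    scores = []
    for c in range(ncols):
        counts = Counter(seq[c] for seq in alignment)
        n = sum(counts.values())
        scores.append(-sum((k / n) * log2(k / n) for k in counts.values()))
    return scores

# Hypothetical toy alignment: columns 0 and 2 are invariant (conserved),
# column 3 varies freely (tolerant).
aln = ["MKLV", "MRLA", "MKLG", "MSLT"]
h = conservation(aln)
```

A mutation hitting a zero-entropy column would be flagged as intolerant; one hitting a high-entropy column as tolerant.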

  18. A GPU-Computing Approach to Solar Stokes Profile Inversion

    CERN Document Server

    Harker, Brian J

    2012-01-01

    We present a new computational approach to the inversion of solar photospheric Stokes polarization profiles, under the Milne-Eddington model, for vector magnetography. Our code, named GENESIS (GENEtic Stokes Inversion Strategy), employs multi-threaded parallel-processing techniques to harness the computing power of graphics processing units (GPUs), along with algorithms designed to exploit the inherent parallelism of the Stokes inversion problem. Using a genetic algorithm (GA) engineered specifically for use with a GPU, we produce full-disc maps of the photospheric vector magnetic field from polarized spectral line observations recorded by the Synoptic Optical Long-term Investigations of the Sun (SOLIS) Vector Spectromagnetograph (VSM) instrument. We show the advantages of pairing a population-parallel genetic algorithm with data-parallel GPU-computing techniques, and present an overview of the Stokes inversion problem, including a description of our adaptation to the GPU-computing paradigm. Full-disc vector ma...
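The population-based search underlying a GA inversion can be sketched generically: each individual encodes a candidate parameter vector, and fitness is a least-squares misfit between modeled and observed profiles. This is an illustrative real-coded GA (tournament selection, uniform crossover, Gaussian mutation), not the GENESIS code; operator choices and parameters are assumptions.

```python
import random

def ga_minimize(fitness, dim, bounds, pop_size=60, gens=150,
                mut_sigma=0.1, mut_rate=0.2, seed=2):
    """Minimal real-coded genetic algorithm (minimization)."""
    rng = random.Random(seed)
    lo, hi = bounds
    clip = lambda x: min(max(x, lo), hi)
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness)
        children = [scored[0], scored[1]]          # elitism: keep best two
        while len(children) < pop_size:
            # tournament selection of two parents
            p1 = min(rng.sample(scored, 3), key=fitness)
            p2 = min(rng.sample(scored, 3), key=fitness)
            # uniform crossover
            child = [p1[d] if rng.random() < 0.5 else p2[d] for d in range(dim)]
            # Gaussian mutation on a fraction of genes
            child = [clip(g + rng.gauss(0, mut_sigma))
                     if rng.random() < mut_rate else g for g in child]
            children.append(child)
        pop = children
    return min(pop, key=fitness)

# Toy "inversion": recover parameters minimizing a least-squares misfit.
true_params = [0.8, -0.3, 1.2]
misfit = lambda p: sum((a - b) ** 2 for a, b in zip(p, true_params))
best = ga_minimize(misfit, dim=3, bounds=(-2.0, 2.0))
```

In the GPU setting described in the record, each individual's fitness (a full forward Milne-Eddington synthesis) is what gets evaluated in parallel across the population.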

  19. Advances in a computer aided bilateral manipulator system

    International Nuclear Information System (INIS)

    This paper relates developments and experiments carried out at Saclay within the ARA program by the computer-aided teleoperation (CAT) group. The goal is to improve the efficiency and operational safety of remote operations using computers and sensors, which can substitute for the operator(s) in time sharing and/or in parallel, and augment the amount and/or quality of sensory feedback. After describing the test facility at Saclay, the developments of the various participants are described. The results of this work will be commercially available with the MA23M and the future MAE 200 from La Calhene (France, UK, Japan)

  20. Recent advances in computer modelling of granular systems

    OpenAIRE

    Jullien, R.; Meakin, P.; Pavlovitch, A.

    1993-01-01

    We present simple computer algorithms able to build random packings of spheres using the ballistic deposition model and we show how they can be used to investigate several size segregation phenomena occurring in granular systems: 1) penetration of a small sphere in a packing of large ones, 2) size-segregation in the formation of a heap or when pouring a silo, 3) size-segregation by shaking. In the last case, the computer simulation provides a very simple geometrical explanation of the phenome...
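The basic ballistic deposition step is simple enough to sketch in 2D: each disc drops vertically at a random horizontal position and sticks at its first contact with the floor or an earlier disc. This is a minimal illustration of the model class (no rolling or relaxation, equal-size discs); all parameters are illustrative and this is not the papers' code.

```python
import random, math

def ballistic_deposition(n, radius=0.5, width=20.0, seed=3):
    """Drop n equal discs at random x; each falls vertically and sticks at
    the highest contact with the floor or an earlier disc (no rolling)."""
    rng = random.Random(seed)
    discs = []
    for _ in range(n):
        x = rng.uniform(radius, width - radius)
        y = radius                      # resting height if it reaches the floor
        for (xj, yj) in discs:
            dx = abs(x - xj)
            if dx < 2 * radius:         # falling disc will touch disc j
                y = max(y, yj + math.sqrt((2 * radius) ** 2 - dx ** 2))
        discs.append((x, y))
    return discs

packing = ballistic_deposition(300)
heights = [y for _, y in packing]
```

Taking the resting height as the maximum over all contact heights guarantees a non-overlapping packing; size segregation studies then replace the single radius with a mixture of radii and track where each species ends up.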

  1. Current advances in molecular, biochemical, and computational modeling analysis of microalgal triacylglycerol biosynthesis.

    Science.gov (United States)

    Lenka, Sangram K; Carbonaro, Nicole; Park, Rudolph; Miller, Stephen M; Thorpe, Ian; Li, Yantao

    2016-01-01

    Triacylglycerols (TAGs) are highly reduced energy storage molecules ideal for biodiesel production. Microalgal TAG biosynthesis has been studied extensively in recent years, both at the molecular level and systems level through experimental studies and computational modeling. However, discussions of the strategies and products of the experimental and modeling approaches are rarely integrated and summarized together in a way that promotes collaboration among modelers and biologists in this field. In this review, we outline advances toward understanding the cellular and molecular factors regulating TAG biosynthesis in unicellular microalgae with an emphasis on recent studies on rate-limiting steps in fatty acid and TAG synthesis, while also highlighting new insights obtained from the integration of multi-omics datasets with mathematical models. Computational methodologies such as kinetic modeling, metabolic flux analysis, and new variants of flux balance analysis are explained in detail. We discuss how these methods have been used to simulate algae growth and lipid metabolism in response to changing culture conditions and how they have been used in conjunction with experimental validations. Since emerging evidence indicates that TAG synthesis in microalgae operates through coordinated crosstalk between multiple pathways in diverse subcellular destinations including the endoplasmic reticulum and plastids, we discuss new experimental studies and models that incorporate these findings for discovering key regulatory checkpoints. Finally, we describe tools for genetic manipulation of microalgae and their potential for future rational algal strain design. This comprehensive review explores the potential synergistic impact of pathway analysis, computational approaches, and molecular genetic manipulation strategies on improving TAG production in microalgae. PMID:27321475
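The kinetic-modeling idea the review describes can be illustrated with a toy two-step pathway integrated by explicit Euler: substrate is converted to fatty acids and then to TAG, each step following Michaelis-Menten kinetics, so the smaller Vmax acts as the rate-limiting step. The pathway lumping and all rate constants here are hypothetical, not from any published algal model.

```python
def simulate_pathway(v1max, v2max, km1=0.5, km2=0.5, s0=5.0, dt=0.01, t_end=50.0):
    """Toy two-step kinetic model: substrate S -> fatty acid F -> TAG T,
    each step Michaelis-Menten, integrated with explicit Euler."""
    S, F, T = s0, 0.0, 0.0
    t = 0.0
    while t < t_end:
        v1 = v1max * S / (km1 + S)   # S -> F flux
        v2 = v2max * F / (km2 + F)   # F -> T flux
        S += -v1 * dt
        F += (v1 - v2) * dt
        T += v2 * dt
        t += dt
    return S, F, T

# With the second step limiting (v2max << v1max), fatty acids transiently
# accumulate upstream of TAG before the pool drains.
S, F, T = simulate_pathway(v1max=1.0, v2max=0.2)
```

Flux balance analysis drops the kinetic detail entirely and instead solves for steady-state fluxes under stoichiometric constraints; the kinetic sketch above is the complementary, parameter-hungry approach the review contrasts it with.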

  2. Advanced Micro Optics Characterization Using Computer Generated Holograms

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, S.; Maxey, L.C.; Moreshead, W.; Nogues, J.L.

    1998-11-01

    This CRADA has enabled the validation of Computer Generated Holograms (CGH) testing for certain classes of micro optics. It has also identified certain issues that are significant when considering the use of CGHs in this application. Both contributions are advantageous in the pursuit of better manufacturing and testing technologies for these important optical components.

  3. Computing support for advanced medical data analysis and imaging

    CERN Document Server

    Wiślicki, W; Białas, P; Czerwiński, E; Kapłon, Ł; Kochanowski, A; Korcyl, G; Kowal, J; Kowalski, P; Kozik, T; Krzemień, W; Molenda, M; Moskal, P; Niedźwiecki, S; Pałka, M; Pawlik, M; Raczyński, L; Rudy, Z; Salabura, P; Sharma, N G; Silarski, M; Słomski, A; Smyrski, J; Strzelecki, A; Wieczorek, A; Zieliński, M; Zoń, N

    2014-01-01

    We discuss computing issues for data analysis and image reconstruction for the PET-TOF medical scanner and other medical scanning devices producing large volumes of data. A service architecture based on grid and cloud concepts for distributed processing is proposed and critically discussed.

  4. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Science.gov (United States)

    2012-07-31

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY DOE... at (301) 903-7486 or email at: Melea.Baker@science.doe.gov . You must make your request for an oral... Computing Web site ( www.sc.doe.gov/ascr ) for viewing. Issued at Washington, DC, on July 25, 2012....

  5. 77 FR 62231 - DOE/Advanced Scientific Computing Advisory Committee

    Science.gov (United States)

    2012-10-12

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY DOE.... Computational Science Graduate Fellowship (CSGF) Longitudinal Study. Update on Exascale. Update from DOE data... contact Melea Baker, (301) 903-7486 or by email at: Melea.Baker@science.doe.gov . You must make...

  6. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  7. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pasccci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  8. Computer vision approaches to medical image analysis. Revised papers

    International Nuclear Information System (INIS)

    This book constitutes the thoroughly refereed post-proceedings of the international workshop Computer Vision Approaches to Medical Image Analysis, CVAMIA 2006, held in Graz, Austria in May 2006 as a satellite event of the 9th European Conference on Computer Vision, ECCV 2006. The 10 revised full papers and 11 revised poster papers presented together with 1 invited talk were carefully reviewed and selected from 38 submissions. The papers are organized in topical sections on clinical applications, image registration, image segmentation and analysis, and the poster session. (orig.)

  9. Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact

    Directory of Open Access Journals (Sweden)

    Abadjiev Valentin

    2014-09-01

    Full Text Available Computer-aided design has advanced the development of different types of software for scientific research in the field of gearing theory, as well as providing adequate scientific support for gear drive manufacture. Attached here are computer programs based on mathematical models resulting from scientific research. Modern gear transmissions require the construction of new mathematical approaches to their geometric, technological and strength analysis. The process of optimization, synthesis and design is based on adequate iteration procedures to find an optimal solution by varying definite parameters.

  10. Integrated Computer Aided Planning and Manufacture of Advanced Technology Jet Engines

    Directory of Open Access Journals (Sweden)

    B. K. Subhas

    1987-10-01

    Full Text Available This paper highlights an attempt at evolving a computer aided manufacturing system on a personal computer. A case study of an advanced technology jet engine component is included to illustrate various outputs from the system. The proposed system could be an alternate solution to sophisticated and expensive CAD/CAM workstations.

  11. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    Science.gov (United States)

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  12. Advances in bio-inspired computing for combinatorial optimization problems

    CERN Document Server

    Pintea, Camelia-Mihaela

    2013-01-01

    ''Advances in Bio-inspired Combinatorial Optimization Problems'' illustrates several recent bio-inspired efficient algorithms for solving NP-hard problems. Theoretical bio-inspired concepts and models, in particular for agents, ants and virtual robots, are described. Large-scale optimization problems, for example the Generalized Traveling Salesman Problem and the Railway Traveling Salesman Problem, are solved and their results are discussed. Some of the main concepts and models described in this book are: inner rule to guide ant search - a recent model in ant optimization, heterogeneous sensitive a

  13. Organic and inorganic nitrogen dynamics in soil - advanced Ntrace approach

    Science.gov (United States)

    Andresen, Louise C.; Björsne, Anna-Karin; Bodé, Samuel; Klemedtsson, Leif; Boeckx, Pascal; Rütting, Tobias

    2016-04-01

    Depolymerization of soil organic nitrogen (SON) into monomers (e.g. amino acids) is currently thought to be the rate limiting step for the terrestrial nitrogen (N) cycle. The production of free amino acids (AA) is followed by AA mineralization to ammonium, which is an important fraction of the total N mineralization. Accurate assessment of depolymerization and AA mineralization rates is important for a better understanding of the rate limiting steps. Recent developments in the 15N pool dilution techniques, based on 15N labelling of AAs, allow quantifying gross rates of SON depolymerization and AA mineralization (Wanek et al., 2010; Andresen et al., 2015) in addition to gross N mineralization. However, it is well known that the 15N pool dilution approach has limitations; in particular, gross rates of consumption processes (e.g. AA mineralization) are overestimated. This has consequences for evaluating the rate limiting step of the N cycle, as well as for estimating the nitrogen use efficiency (NUE). Here we present a novel 15N tracing approach, which combines 15N-AA labelling with an advanced version of the 15N tracing model Ntrace (Müller et al., 2007) explicitly accounting for AA turnover in soil. This approach (1) provides a more robust quantification of gross depolymerization and AA mineralization and (2) suggests a more realistic estimate for the microbial NUE of amino acids. Advantages of the new 15N tracing approach will be discussed and further improvements will be identified. References: Andresen, L.C., Bodé, S., Tietema, A., Boeckx, P., and Rütting, T.: Amino acid and N mineralization dynamics in heathland soil after long-term warming and repetitive drought, SOIL, 1, 341-349, 2015. Müller, C., Rütting, T., Kattge, J., Laughlin, R. J., and Stevens, R. J.: Estimation of parameters in complex 15N tracing models via Monte Carlo sampling, Soil Biology & Biochemistry, 39, 715-726, 2007. Wanek, W., Mooshammer, M., Blöchl, A., Hanreich, A., and Richter
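The classical pool-dilution arithmetic that 15N approaches build on (Kirkham and Bartholomew's analytical equations) fits in a few lines: given pool sizes and 15N enrichments at two time points, gross production and consumption follow directly. This is a sketch of the classical two-pool math only, not the Ntrace model the record describes; the numbers are a self-consistency check, not data.

```python
from math import log

def gross_rates(M0, Mt, H0, Ht, t):
    """Kirkham-Bartholomew pool-dilution equations.
    M0, Mt: pool sizes at time 0 and t; H0, Ht: 15N atom% excess.
    Returns (gross production p, gross consumption c); assumes p != c
    and constant rates over the interval."""
    p = (M0 - Mt) / t * log(H0 / Ht) / log(M0 / Mt)
    c = p - (Mt - M0) / t          # from dM/dt = p - c
    return p, c

# Self-check against the model's own analytical solution with known rates:
# M(t) = M0 + (p - c) t,  H(t) = H0 (M(t)/M0)^(-p/(p-c)).
p_true, c_true, M0, H0, t = 2.0, 3.0, 20.0, 0.4, 5.0
Mt = M0 + (p_true - c_true) * t
Ht = H0 * (Mt / M0) ** (-p_true / (p_true - c_true))
p_est, c_est = gross_rates(M0, Mt, H0, Ht, t)
```

The overestimation of consumption rates mentioned in the abstract arises when the real system violates these equations' assumptions (e.g. remineralization of the label), which is exactly what a numerical tracing model like Ntrace is built to handle.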

  14. Recent advances in swarm intelligence and evolutionary computation

    CERN Document Server

    2015-01-01

    This timely review volume summarizes the state-of-the-art developments in nature-inspired algorithms and applications with the emphasis on swarm intelligence and bio-inspired computation. Topics include the analysis and overview of swarm intelligence and evolutionary computation, hybrid metaheuristic algorithms, bat algorithm, discrete cuckoo search, firefly algorithm, particle swarm optimization, and harmony search as well as convergent hybridization. Application case studies have focused on the dehydration of fruits and vegetables by the firefly algorithm and goal programming, feature selection by the binary flower pollination algorithm, job shop scheduling, single row facility layout optimization, training of feed-forward neural networks, damage and stiffness identification, synthesis of cross-ambiguity functions by the bat algorithm, web document clustering, truss analysis, water distribution networks, sustainable building designs and others. As a timely review, this book can serve as an ideal reference f...

  15. Advances in neural networks computational intelligence for ICT

    CERN Document Server

    Esposito, Anna; Morabito, Francesco; Pasero, Eros

    2016-01-01

    This carefully edited book places emphasis on computational and artificial intelligence methods for learning and their applications in robotics, embedded systems, and ICT interfaces for psychological and neurological diseases. The book is a follow-up of the scientific workshop on Neural Networks (WIRN 2015) held in Vietri sul Mare, Italy, from the 20th to the 22nd of May 2015. The workshop, at its 27th edition, has become a traditional scientific event that brings together scientists from many countries and several scientific disciplines. Each chapter is an extended version of the original contribution presented at the workshop, and together with the reviewers’ peer revisions it also benefits from the live discussion during the presentation. The content of the book is organized in the following sections. 1. Introduction, 2. Machine Learning, 3. Artificial Neural Networks: Algorithms and models, 4. Intelligent Cyberphysical and Embedded System, 5. Computational Intelligence Methods for Biomedical ICT in...

  16. Advances in Computer Science and Information Engineering Volume 2

    CERN Document Server

    Lin, Sally

    2012-01-01

    CSIE2012 is an integrated conference concentrating on Computer Science and Information Engineering. In the proceedings, you can learn much more about Computer Science and Information Engineering from researchers all around the world. The main role of the proceedings is to serve as an exchange pillar for researchers working in these fields. In order to meet the high quality standards of Springer's AISC series, the organizing committee made the following efforts. Firstly, poor-quality papers were rejected after review by anonymous expert referees. Secondly, periodic review meetings were held with the reviewers about five times to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  17. Advances in Computer Science and Information Engineering Volume 1

    CERN Document Server

    Lin, Sally

    2012-01-01

    CSIE2012 is an integrated conference concentrating on Computer Science and Information Engineering. In the proceedings, you can learn much more about Computer Science and Information Engineering from researchers all around the world. The main role of the proceedings is to serve as an exchange pillar for researchers working in these fields. In order to meet the high quality standards of Springer's AISC series, the organizing committee made the following efforts. Firstly, poor-quality papers were rejected after review by anonymous expert referees. Secondly, periodic review meetings were held with the reviewers about five times to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  18. The ergonomics of computer aided design within advanced manufacturing technology.

    Science.gov (United States)

    John, P A

    1988-03-01

    Many manufacturing companies have now awakened to the significance of computer aided design (CAD), although the majority of them have only been able to purchase computerised draughting systems of which only a subset produce direct manufacturing data. Such companies are moving steadily towards the concept of computer integrated manufacture (CIM), and this demands CAD to address more than draughting. CAD architects are thus having to rethink the basic specification of such systems, although they typically suffer from an insufficient understanding of the design task and have consequently been working with inadequate specifications. It is at this fundamental level that ergonomics has much to offer, making its contribution by encouraging user-centred design. The discussion considers the relationships between CAD and: the design task; the organisation and people; creativity; and artificial intelligence. It finishes with a summary of the contribution of ergonomics. PMID:15676646

  19. SciDAC Advances and Applications in Computational Beam Dynamics

    OpenAIRE

    Ryne, R.; Abell, D.; Adelmann, A.; Amundson, J; Bohn, C; Cary, J; Colella, P.; Dechow, D.; Decyk, V.; Dragt, A.; Gerber, R.; Habib, S.; Higdon, D.; Katsouleas, T.; Ma, K.-L.

    2005-01-01

    SciDAC has had a major impact on computational beam dynamics and the design of particle accelerators. Particle accelerators -- which account for half of the facilities in the DOE Office of Science Facilities for the Future of Science 20 Year Outlook -- are crucial for US scientific, industrial, and economic competitiveness. Thanks to SciDAC, accelerator design calculations that were once thought impossible are now carried out routinely, and new challenging and important calculations are with...

  20. Foreword: Advanced Science Letters (ASL), Special Issue on Computational Astrophysics

    OpenAIRE

    Mayer, Lucio

    2009-01-01

    Computational astrophysics has undergone unprecedented development over the last decade, becoming a field of its own. The challenge ahead of us will involve increasingly complex multi-scale simulations. These will bridge the gap between areas of astrophysics such as star and planet formation, or star formation and galaxy formation, that have evolved separately until today. A global knowledge of the physics and modeling techniques of astrophysical simulations is thus an important asset for the...

  1. National facility for advanced computational science: A sustainable path to scientific discovery

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  2. Transparency and deliberation within the FOMC: a computational linguistics approach

    OpenAIRE

    Hansen, Stephen; McMahon, Michael; Prat, Andrea

    2014-01-01

    How does transparency, a key feature of central bank design, affect the deliberation of monetary policymakers? We exploit a natural experiment in the Federal Open Market Committee in 1993 together with computational linguistic models (particularly Latent Dirichlet Allocation) to measure the effect of increased transparency on debate. Commentators have hypothesized both a beneficial discipline effect and a detrimental conformity effect. A difference-in-differences approach inspired by the care...
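The difference-in-differences logic the abstract invokes reduces, in its simplest two-group, two-period form, to comparing the before/after change in a treated group against the same change in a control group. The data below are hypothetical; this is a bare illustration of the estimator, not the paper's (text-based, LDA-derived) outcome measures.

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Plain two-group, two-period difference-in-differences estimate.
    Inputs are lists of outcome measurements (e.g. a speaker-level
    deliberation statistic before/after the transparency change)."""
    mean = lambda xs: sum(xs) / len(xs)
    # (change in treated group) minus (change in control group)
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))

# Hypothetical data: both groups drift upward by ~1; treatment adds ~2 more.
effect = diff_in_diff(
    treat_pre=[1.0, 1.2, 0.8], treat_post=[4.0, 4.2, 3.8],
    ctrl_pre=[1.1, 0.9, 1.0],  ctrl_post=[2.1, 1.9, 2.0],
)
```

The subtraction of the control-group change is what nets out common time trends, isolating the transparency effect under the parallel-trends assumption.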

  3. Computational Fluid Dynamic Approach for Biological System Modeling

    OpenAIRE

    Huang, Weidong; Wu, Chundu; Xiao, Bingjia; Xia, Weidong

    2005-01-01

    Various biological system models have been proposed in systems biology, based on the complex reaction kinetics of their various components. These models are often not practical because we lack kinetic information. In this paper, it is found that enzymatic and multi-order reaction rates are often controlled by the transport of the reactants in biological systems. A Computational Fluid Dynamic (CFD) approach, which is based on transport of the components and kinetics of b...

  4. Computer Mechatronics: A Radical Approach to Mechatronics Education

    OpenAIRE

    Nilsson, Martin

    2005-01-01

    This paper describes some distinguishing features of a course on mechatronics, based on computer science. We propose a teaching approach called Controlled Problem-Based Learning (CPBL). We have applied this method on three generations (2003-2005) of mainly fourth-year undergraduate students at Lund University (LTH). Although students found the course difficult, there were no dropouts, and all students attended the examination 2005.

  5. WSRC approach to validation of criticality safety computer codes

    Energy Technology Data Exchange (ETDEWEB)

    Finch, D.R.; Mincey, J.F.

    1991-12-31

    Recent hardware and operating system changes at Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy will be illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (K{sub eff}) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be: (1) repeatable; (2) demonstrated with defined confidence; and (3) identify the range of neutronic conditions (area of applicability) for which the correlations are valid. The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems with principal second isotope {sup 236}U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.
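The core bookkeeping of such a validation — correlating calculated K-eff against known benchmark values to extract a bias and a dispersion with defined confidence — can be sketched directly. The benchmark numbers below are hypothetical, and this is an illustration of the statistical idea only, not the WSRC procedure.

```python
import math

def keff_bias(calculated, experimental):
    """Bias (mean calculated-minus-benchmark difference) and sample standard
    deviation of calculated k_eff over a set of benchmark experiments."""
    diffs = [c - e for c, e in zip(calculated, experimental)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, sd

# Hypothetical benchmark set: the code overpredicts k_eff slightly.
calc = [1.0023, 0.9981, 1.0049, 1.0012, 0.9995]
expt = [1.0000, 0.9970, 1.0030, 1.0000, 1.0000]
bias, sd = keff_bias(calc, expt)
# A subcritical margin would then be set from the bias and a statistical
# multiple of sd, valid only inside the benchmarks' area of applicability.
```

Broad validation, as described in the record, amounts to making the benchmark set span the full range of reflection, concentration, and moderation conditions so the derived bias and confidence apply across all process conditions.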

  6. WSRC approach to validation of criticality safety computer codes

    Energy Technology Data Exchange (ETDEWEB)

    Finch, D.R.; Mincey, J.F.

    1991-01-01

    Recent hardware and operating system changes at Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy will be illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (K{sub eff}) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be: (1) repeatable; (2) demonstrated with defined confidence; and (3) identify the range of neutronic conditions (area of applicability) for which the correlations are valid. The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems with principal second isotope {sup 236}U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.

  7. Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact

    Science.gov (United States)

    Abadjiev, Valentin; Kawasaki, Haruhisa

    2014-09-01

    Computer-aided design has produced different types of software both for scientific research in the theory of gearing and for adequate scientific support of gear-drive manufacture. Such computer programs are based on mathematical models resulting from scientific research. Modern gear transmissions require new mathematical approaches to their geometric, technological and strength analysis. The process of optimization, synthesis and design relies on adequate iteration procedures to find an optimal solution by varying definite parameters. The study is dedicated to the accepted methodology in the creation of software for the synthesis of a class of high-reduction hyperboloid gears - Spiroid and Helicon ones (Spiroid and Helicon are trademarks registered by the Illinois Tool Works, Chicago, Ill.). The developed basic computer products are software based on original mathematical models, namely the two mathematical models for the synthesis: "upon a pitch contact point" and "upon a mesh region". Computer programs are worked out on the basis of the described mathematical models, and the relations between them are shown. The application of these approaches to the synthesis of the gear drives in question is illustrated.

  8. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data are knowledge, but because they are often costly to acquire and are sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This future-forward dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.
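    The "appropriate metadata" mentioned above might capture the full technology stack of an archived image. The record below is a hypothetical sketch - the field names are illustrative, not a published archiving standard - showing the kind of dependencies that would need to be recorded alongside a container or VM image.

    ```python
    import json

    # Hypothetical preservation metadata for an archived software system.
    # All field names and values are illustrative placeholders.
    record = {
        "software": "analysis-pipeline",
        "version": "2.3.1",
        "archived_as": "container-image",      # vs. "virtual-machine", "source"
        "image_digest": "sha256:<digest>",     # content-addressed identifier
        "technology_stack": {
            "os": "Debian 11",
            "runtime": "Python 3.9",
            "libraries": ["numpy 1.24", "scipy 1.10"],
        },
        "hardware_assumptions": ["x86_64", "little-endian"],
        "linked_datasets": ["doi:<dataset-doi>"],  # ties capability to the data archive
    }

    serialized = json.dumps(record, indent=2)  # what would be stored with the image
    restored = json.loads(serialized)
    ```

    Storing such a record next to the image lets a future archivist decide whether the stack can still be reproduced before attempting to run the software.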

  9. A computational approach to developing mathematical models of polyploid meiosis.

    Science.gov (United States)

    Rehmsmeier, Marc

    2013-04-01

    Mathematical models of meiosis that relate offspring to parental genotypes through parameters such as meiotic recombination frequency have been difficult to develop for polyploids. Existing models have limitations with respect to their analytic potential, their compatibility with insights into mechanistic aspects of meiosis, and their treatment of model parameters in terms of parameter dependencies. In this article I put forward a computational approach to the probabilistic modeling of meiosis. A computer program enumerates all possible paths through the phases of replication, pairing, recombination, and segregation, while keeping track of the probabilities of the paths according to the various parameters involved. Probabilities for classes of genotypes or phenotypes are added, and the resulting formulas are simplified by the symbolic-computation system Mathematica. An example application to autotetraploids results in a model that remedies the limitations of previous models mentioned above. In addition to the immediate implications, the computational approach presented here can be expected to be useful through opening avenues for modeling a host of processes, including meiosis in higher-order ploidies. PMID:23335332
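    The enumeration idea - walk every path through the meiotic phases while accumulating path probabilities per gamete class - can be shown on a much smaller model than the paper's. The sketch below uses a diploid two-locus genotype AB/ab rather than an autotetraploid, and accumulates numeric rather than symbolic probabilities (the paper hands the symbolic formulas to Mathematica for simplification).

    ```python
    # Toy path enumeration in the spirit of the paper's approach:
    # phase 1 decides whether recombination occurs (frequency r),
    # phase 2 segregates one of the two resulting chromatids (probability 1/2).
    def gamete_probs(r):
        probs = {}
        for recombined, p_rec in ((False, 1 - r), (True, r)):
            chromatids = ("Ab", "aB") if recombined else ("AB", "ab")
            for g in chromatids:
                probs[g] = probs.get(g, 0.0) + p_rec * 0.5  # accumulate path probability
        return probs

    p = gamete_probs(0.2)  # parental gametes (1-r)/2 each, recombinants r/2 each
    ```

    Scaling the same bookkeeping to tetraploid pairing configurations is exactly where automated enumeration plus symbolic simplification pays off.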

  10. WSRC approach to validation of criticality safety computer codes

    International Nuclear Information System (INIS)

    Recent hardware and operating system changes at Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy will be illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (Keff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be: (1) repeatable; (2) demonstrated with defined confidence; and (3) identify the range of neutronic conditions (area of applicability) for which the correlations are valid. The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems with principal second isotope 236U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed

  11. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California 92093 (United States); Kagadis, George C. [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 26504 (Greece); McNutt, Todd R. [Department of Radiation Oncology and Molecular Radiation Science, School of Medicine, Johns Hopkins University, Baltimore, Maryland 21231 (United States); Mutic, Sasa [Department of Radiation Oncology, Washington University in St. Louis, St. Louis, Missouri 63110 (United States)

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  12. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    International Nuclear Information System (INIS)

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy

  13. Reservoir Computing approach to Great Lakes water level forecasting

    Science.gov (United States)

    Coulibaly, Paulin

    2010-02-01

    The use of echo state networks (ESN) for dynamical system modeling is known as Reservoir Computing and has been shown to be effective for a number of applications, including signal processing, learning grammatical structure, time series prediction and motor/system control. However, the performance of the Reservoir Computing approach on hydrological time series remains largely unexplored. This study investigates the potential of ESN, or Reservoir Computing, for long-term prediction of lake water levels. Great Lakes water levels from 1918 to 2005 are used to develop and evaluate the ESN models. The forecast performance of the ESN-based models is compared with the results obtained from two benchmark models, the conventional recurrent neural network (RNN) and the Bayesian neural network (BNN). The test results indicate a strong ability of ESN models to provide improved lake level forecasts up to 10 months ahead - suggesting that the inherent structure and innovative learning approach of the ESN are suitable for hydrological time series modeling. Another particular advantage of the ESN learning approach is that it simplifies the network training complexity and avoids the limitations inherent to the gradient descent optimization method. Overall, it is shown that the ESN can be a good alternative method for improved lake level forecasting, performing better than both the RNN and the BNN on the four selected Great Lakes time series, namely, the Lakes Erie, Huron-Michigan, Ontario, and Superior.
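    The training simplification claimed above - only a linear readout is fitted, the recurrent weights stay random - is easy to see in a minimal sketch. Reservoir size, weight scalings, and the ridge parameter below are illustrative choices on a toy signal, not the study's configuration.

    ```python
    import numpy as np

    # Minimal echo state network for one-step-ahead prediction of a toy series.
    rng = np.random.default_rng(0)
    N = 100                                    # reservoir size
    W_in = rng.uniform(-0.5, 0.5, N)           # input weights (fixed, random)
    W = rng.uniform(-0.5, 0.5, (N, N))         # recurrent weights (fixed, random)
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo state property)

    def reservoir_states(u):
        x, states = np.zeros(N), []
        for u_t in u:
            x = np.tanh(W_in * u_t + W @ x)    # driven state update
            states.append(x.copy())
        return np.array(states)

    series = np.sin(0.1 * np.arange(400))      # toy stand-in for a level series
    u, y = series[:-1], series[1:]             # predict the next sample
    X = reservoir_states(u)

    train = slice(50, 300)                     # discard a washout of 50 steps
    # Only the readout is trained, by ridge regression -- no gradient descent.
    W_out = np.linalg.solve(X[train].T @ X[train] + 1e-6 * np.eye(N),
                            X[train].T @ y[train])
    rmse = np.sqrt(np.mean((X[300:] @ W_out - y[300:]) ** 2))
    ```

    Because training reduces to one linear least-squares solve, the vanishing-gradient issues of conventional RNN training never arise.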

  14. NPP advanced pipework technologies: recent backfitting projects and computational analyses

    International Nuclear Information System (INIS)

    Some recent international NPP projects involving advanced engineering services for existing installations, such as replacement of valves, containment penetrations and pipework as well as design and installations of pipe supports and dynamic restraints, are summarized. Examples of thermomechanical analyses of operational phenomena performed as part of comprehensive plant design, licensing, and commissioning support activities are presented. Turnkey project management with system function warranty resulted in most effective use of all resources drawn together from several highly qualified subcontractors and international equipment manufacturers working under the supervision of an in-house team. Experience collected to date in backfitting various plants of different design and age provides a strong knowledge basis. It is available for evaluating any plant currently in operation or under construction in order to check the need for modifications and recommend the appropriate scheduling and level of effort. (authors)

  15. Discovering and understanding oncogenic gene fusions through data intensive computational approaches.

    Science.gov (United States)

    Latysheva, Natasha S; Babu, M Madan

    2016-06-01

    Although gene fusions have been recognized as important drivers of cancer for decades, our understanding of the prevalence and function of gene fusions has been revolutionized by the rise of next-generation sequencing, advances in bioinformatics theory and an increasing capacity for large-scale computational biology. The computational work on gene fusions has been vastly diverse, and the present state of the literature is fragmented. It will be fruitful to merge three camps of gene fusion bioinformatics that appear to rarely cross over: (i) data-intensive computational work characterizing the molecular biology of gene fusions; (ii) development research on fusion detection tools, candidate fusion prioritization algorithms and dedicated fusion databases and (iii) clinical research that seeks to either therapeutically target fusion transcripts and proteins or leverages advances in detection tools to perform large-scale surveys of gene fusion landscapes in specific cancer types. In this review, we unify these different-yet highly complementary and symbiotic-approaches with the view that increased synergy will catalyze advancements in gene fusion identification, characterization and significance evaluation. PMID:27105842

  16. Advances in Computational Social Science and Social Simulation

    OpenAIRE

    Miguel Quesada, Francisco J.; Amblard, Frédéric; Juan A. Barceló; Madella, Marco; Aguirre, Cristián; Ahrweiler, Petra; Aldred, Rachel; Ali Abbas, Syed Muhammad; Lopez Rojas, Edgar Alonso; Alonso Betanzos, Amparo; Alvarez Galvez, Javier; Andrighetto, Giulia; Antunes, Luis; Araghi, Yashar; Asatani, Kimitaka

    2014-01-01

    This conference is the joint celebration of the 10th Artificial Economics Conference AE, the 10th Conference of the European Social Simulation Association ESSA and the 1st Simulating the Past to Understand Human History SPUHH. The conference was organized by the Laboratory for Socio-Historical Dynamics Simulation (LSDS-UAB) of the Universitat Autònoma de Barcelona. Readers will find results of recent research on computational social science and social simulation economics, management, so...

  17. A combinatorial approach to the discovery of advanced materials

    Science.gov (United States)

    Sun, Xiao-Dong

    This thesis discusses the application of combinatorial methods to the search of advanced materials. The goal of this research is to develop a "parallel" or "fast sequential" methodology for both the synthesis and characterization of materials with novel electronic, magnetic and optical properties. Our hope is to dramatically accelerate the rate at which materials are generated and studied. We have developed two major combinatorial methodologies to this end. One involves generating thin film materials libraries using a combination of various thin film deposition and masking strategies with multi-layer thin film precursors. The second approach is to generate powder materials libraries with solution precursors delivered with a multi-nozzle inkjet system. The first step in this multistep combinatorial process involves the design and synthesis of high density libraries of diverse materials aimed at exploring a large segment of the compositional space of interest based on our understanding of the physical and structural properties of a particular class of materials. Rapid, sensitive measurements of one or more relevant physical properties of each library member result in the identification of a family of "lead" compositions with a desired property. These compositions are then optimized by continuously varying the stoichiometries of a more focused set of precursors. Materials with the optimal composition are then synthesized in quantities sufficient for detailed characterization of their structural and physical properties. Finally, the information obtained from this process should enhance our predictive ability in subsequent experiments. Combinatorial methods have been successfully used in the synthesis and discovery of materials with novel properties. For example, a class of cobaltite based giant magnetoresistance (GMR) ceramics was discovered; Application of this method to luminescence materials has resulted in the discovery of a few highly efficient tricolor
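    The library-design step described above - spanning a large segment of compositional space on a grid of stoichiometries - can be sketched for a hypothetical ternary system. The 10% grid step and the abstract components are placeholders; no specific materials system from the thesis is implied.

    ```python
    from itertools import product

    # Enumerate all ternary compositions A_x B_y C_z with x + y + z = 1
    # on a uniform stoichiometry grid -- one library row per composition.
    def ternary_grid(step=0.1):
        n = round(1 / step)
        comps = []
        for i, j in product(range(n + 1), repeat=2):
            k = n - i - j
            if k >= 0:                      # keep only valid simplex points
                comps.append((i * step, j * step, k * step))
        return comps

    library = ternary_grid(0.1)             # 66 compositions on a 10% grid
    ```

    A "lead" composition found by rapid screening would then seed a finer grid around itself, mirroring the focus-and-optimize loop the thesis describes.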

  18. A note on “A new approach for the selection of advanced manufacturing technologies: Data envelopment analysis with double frontiers”

    OpenAIRE

    Hossein Azizi

    2015-01-01

    Recently, using the data envelopment analysis (DEA) with double frontiers approach, Wang and Chin (2009) proposed a new approach for the selection of advanced manufacturing technologies: DEA with double frontiers and a new measure for the selection of the best advanced manufacturing technologies (AMTs). In this note, we show that their proposed overall performance measure for the selection of the best AMT has an additional computational burden. Moreover, we propose a new measure for developin...

  19. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
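    The basic inference loop - a forward model inside a likelihood, sampled by MCMC - can be shown with a toy one-parameter problem. The Gaussian-dip "strain signature" below is a hypothetical stand-in for the finite element (or surrogate) model, and plain random-walk Metropolis stands in for DRAM; the weighted likelihood is not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    xs = np.linspace(0.0, 1.0, 20)             # sensor positions

    def forward(loc):                          # hypothetical strain signature of damage at loc
        return 1.0 - 0.5 * np.exp(-((xs - loc) ** 2) / 0.02)

    sigma, true_loc = 0.01, 0.4
    data = forward(true_loc) + rng.normal(0.0, sigma, xs.size)  # polluted measurements

    def log_post(loc):                         # uniform prior on [0, 1] + Gaussian likelihood
        if not 0.0 <= loc <= 1.0:
            return -np.inf
        return -0.5 * np.sum((data - forward(loc)) ** 2) / sigma ** 2

    loc, lp, samples = 0.5, log_post(0.5), []
    for _ in range(5000):
        prop = loc + rng.normal(0.0, 0.05)     # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            loc, lp = prop, lp_prop
        samples.append(loc)
    estimate = np.mean(samples[1000:])         # posterior mean after burn-in
    ```

    Each iteration calls `forward` once, which is exactly why the paper replaces the expensive finite element model with a sparse-grid surrogate.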

  20. Canadian Educational Approaches for the Advancement of Pharmacy Practice

    OpenAIRE

    Frankel, Grace; Louizos, Christopher; Austin, Zubin

    2014-01-01

    Canadian faculties (schools) of pharmacy are actively engaged in the advancement and restructuring of their programs in response to the shift in pharmacy to pharmacists having/assuming an advanced practitioner role. Unfortunately, there is a paucity of evidence outlining optimal strategies for accomplishing this task. This review explores several educational changes proposed in the literature to aid in the advancement of pharmacy education such as program admission requirements, critical-thin...

  1. Advances in x-ray computed microtomography at the NSLS

    International Nuclear Information System (INIS)

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines from industrial materials processing to environmental science. The most recent applications are presented here as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, that is based on a refinement of the gridding algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 sec for a 929 x 929 pixel² slice on an R10,000 CPU, more than 8x reduction compared with the Filtered Back-Projection method

  2. ADVANCES IN X-RAY COMPUTED MICROTOMOGRAPHY AT THE NSLS.

    Energy Technology Data Exchange (ETDEWEB)

    DOWD,B.A.

    1998-08-07

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines from industrial materials processing to environmental science. The most recent applications are presented here as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, that is based on a refinement of the "gridding" algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 sec for a 929 x 929 pixel{sup 2} slice on an R10,000 CPU, more than 8x reduction compared with the Filtered Back-Projection method.

  3. Advances in x-ray computed microtomography at the NSLS

    Energy Technology Data Exchange (ETDEWEB)

    Dowd, B.A.; Andrews, A.B.; Marr, R.B.; Siddons, D.P.; Jones, K.W.; Peskin, A.M.

    1998-08-01

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines from industrial materials processing to environmental science. The most recent applications are presented here as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, that is based on a refinement of the gridding algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 sec for a 929 x 929 pixel{sup 2} slice on an R10,000 CPU, more than 8x reduction compared with the Filtered Back-Projection method.

  4. Recent advances in computational intelligence in defense and security

    CERN Document Server

    Falcon, Rafael; Zincir-Heywood, Nur; Abbass, Hussein

    2016-01-01

    This volume is an initiative undertaken by the IEEE Computational Intelligence Society’s Task Force on Security, Surveillance and Defense to consolidate and disseminate the role of CI techniques in the design, development and deployment of security and defense solutions. Applications range from the detection of buried explosive hazards in a battlefield to the control of unmanned underwater vehicles, the delivery of superior video analytics for protecting critical infrastructures or the development of stronger intrusion detection systems and the design of military surveillance networks. Defense scientists, industry experts, academicians and practitioners alike will all benefit from the wide spectrum of successful applications compiled in this volume. Senior undergraduate or graduate students may also discover uncharted territory for their own research endeavors.

  5. Experimental and computing strategies in advanced material characterization problems

    International Nuclear Information System (INIS)

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that make it possible to acquire a large amount of data and, at the same time, to reduce the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses while, in some situations, reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication intends to summarize some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities

  6. Experimental and computing strategies in advanced material characterization problems

    Science.gov (United States)

    Bolzon, G.

    2015-10-01

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that make it possible to acquire a large amount of data and, at the same time, to reduce the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses while, in some situations, reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication intends to summarize some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.
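    The "reduced models based on decomposition techniques" mentioned above can be illustrated with a proper orthogonal decomposition (POD) via the SVD. The parametrized sine family below is a toy stand-in for snapshots of expensive non-linear analyses; the number of retained modes is an illustrative choice.

    ```python
    import numpy as np

    # Snapshots of a parameter-dependent field, one column per parameter value.
    x = np.linspace(0.0, 1.0, 200)
    snapshots = np.stack([np.sin(np.pi * (1 + 0.02 * k) * x) for k in range(30)],
                         axis=1)                     # 200 points x 30 snapshots

    # POD: the leading left singular vectors are the dominant spatial modes.
    U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
    r = 5                                            # retained POD modes
    reduced = U[:, :r] @ np.diag(s[:r]) @ Vt[:r]     # rank-r reconstruction
    rel_err = np.linalg.norm(snapshots - reduced) / np.linalg.norm(snapshots)
    ```

    Once the modes are extracted offline, a new parameter case reduces to fitting a handful of mode coefficients - which is what makes near-real-time identification plausible.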

  7. Experimental and computing strategies in advanced material characterization problems

    Energy Technology Data Exchange (ETDEWEB)

    Bolzon, G. [Department of Civil and Environmental Engineering, Politecnico di Milano, piazza Leonardo da Vinci 32, 20133 Milano (Italy); gabriella.bolzon@polimi.it

    2015-10-28

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that make it possible to acquire a large amount of data and, at the same time, to reduce the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses while, in some situations, reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication intends to summarize some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.

  8. An Approach for Indoor Path Computation among Obstacles that Considers User Dimension

    Directory of Open Access Journals (Sweden)

    Liu Liu

    2015-12-01

    People often transport objects within indoor environments and need enough space for the motion. In such cases, the accessibility of indoor spaces depends on the combined dimensions of the person and her/his operated objects. This paper proposes a new approach to avoid obstacles and compute indoor paths with respect to the user dimension. The approach excludes inaccessible spaces for a user in five steps: (1) compute the minimum distance between obstacles and find the inaccessible gaps; (2) group obstacles according to the inaccessible gaps; (3) identify groups of obstacles that influence the path between two locations; (4) compute boundaries for the selected groups; and (5) build a network in the accessible area around the obstacles in the room. Compared to the Minkowski sum method for outlining inaccessible spaces, the proposed approach generates simpler polygons for groups of obstacles that do not contain inner rings. The creation of a navigation network becomes easier based on these simple polygons. By using this approach, we can create user- and task-specific networks in advance. Alternatively, the accessible path can be generated on the fly before the user enters a room.
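    The first two steps of the pipeline - finding gaps narrower than the user dimension and grouping the corresponding obstacles - can be sketched with a union-find structure. Modelling obstacles as circles is a simplification for illustration; the paper works with polygonal obstacles.

    ```python
    from itertools import combinations

    # Group obstacles whose mutual gap is narrower than the user's width.
    # Obstacles are circles (x, y, radius); the gap is the clear distance
    # between their boundaries.
    def group_obstacles(circles, user_width):
        parent = list(range(len(circles)))

        def find(i):                       # union-find with path halving
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        for i, j in combinations(range(len(circles)), 2):
            (x1, y1, r1), (x2, y2, r2) = circles[i], circles[j]
            gap = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 - r1 - r2
            if gap < user_width:           # inaccessible gap: merge the obstacles
                parent[find(i)] = find(j)

        groups = {}
        for i in range(len(circles)):
            groups.setdefault(find(i), []).append(i)
        return sorted(groups.values())

    # The first two circles are 0.5 apart (narrower than the user), the third is far away.
    groups = group_obstacles([(0, 0, 1), (2.5, 0, 1), (10, 0, 1)], user_width=1.0)
    ```

    Each resulting group would then be outlined by a single boundary polygon in steps (4)-(5), which is what keeps the navigation network simple.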

  9. Computational approach to interference phase detection and linearity error correction in laser interferometry

    Czech Academy of Sciences Publication Activity Database

    Řeřucha, Šimon; Šarbort, Martin; Buchta, Zdeněk; Číp, Ondřej; Lazar, Josef

    Budva: University of Montenegro, 2013. s. 54. [ALT´13. Annual International Conference on Advanced Laser Technologies /21./. 16.09.2013-20.09.2013, Budva] R&D Projects: GA ČR GAP102/10/1813; GA MŠk ED0017/01/01; GA MŠk EE2.3.30.0054; GA MPO FR-TI2/705; GA MPO FR-TI1/241 Institutional support: RVO:68081731 Keywords : laser interferometry * computational approach Subject RIV: BH - Optics, Masers, Lasers
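    The computational approach named in the record can be illustrated for the quadrature case: the interference phase is the arctangent of a signal pair, and linearity errors appear when the pair has offsets or unequal gains. The sketch below applies a simplified offset/gain correction before the arctangent; a full Heydemann-type ellipse fit (which the actual method may use) is not reproduced.

    ```python
    import numpy as np

    # Synthetic distorted quadrature pair: offsets and a gain imbalance
    # would produce a periodic (non-linear) phase error if left uncorrected.
    true_phase = np.linspace(0.0, 20 * np.pi, 2000)
    sig_i = 0.10 + 1.2 * np.cos(true_phase)    # in-phase channel
    sig_q = -0.05 + 0.8 * np.sin(true_phase)   # quadrature channel

    def correct(ch):                           # remove offset, equalize gain
        ch = ch - ch.mean()
        return ch / ch.std()

    # Phase detection: arctangent of the corrected pair, then unwrapping.
    phase = np.unwrap(np.arctan2(correct(sig_q), correct(sig_i)))
    max_err = np.max(np.abs((phase - phase[0]) - (true_phase - true_phase[0])))
    ```

    With offsets and gains equalized, the unwrapped arctangent tracks the true phase ramp; the residual error is what a more complete ellipse-fit correction would further reduce.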

  11. TAMIS for rectal tumors: advancements of a new approach.

    Science.gov (United States)

    Rega, Daniela; Pace, Ugo; Niglio, Antonello; Scala, Dario; Sassaroli, Cinzia; Delrio, Paolo

    2016-03-01

TAMIS allows transanal excision of rectal lesions by means of a single-incision access port and traditional laparoscopic instruments. This technique represents a promising treatment of rectal neoplasms, since it guarantees precise dissection and reproducible approaches. From May 2010 to September 2015, we performed excisions of rectal lesions in 55 patients using a SILS port. The pre-operative diagnosis was 26 tumours, 26 low- and high-grade dysplasias and 3 other benign neoplasias. 11 patients had a neoadjuvant treatment. Pneumorectum was established at a pressure of 15-20 mmHg CO2 with continuous insufflation, and ordinary laparoscopic instruments were used to perform full-thickness resection of the rectal neoplasm with a conventional 5-mm 30° laparoscopic camera. The average operative time was 78 min. Postoperative recovery was uneventful in 53 cases: in one case a Hartmann procedure was necessary on postoperative day two due to an intraoperative intraperitoneal perforation; in another case, a diverting colostomy was required on postoperative day five due to an intraoperative perforation of the vaginal wall. Unclear resection margins were detected in six patients: five of them subsequently underwent radical surgery; the remaining patient was unfit for radical surgery but is currently alive and well. Patients were discharged after a median of 3 days. Transanal minimally invasive surgery is an advanced transanal platform that provides a safe and effective method for low rectal tumors. The feasibility of TAMIS also for malignant lesions treated in a neoadjuvant setting could be cautiously evaluated in the future. PMID:27052544

  12. Advances in the MQDT approach of electron/molecular cation reactive collisions: High precision extensive calculations for applications

    Directory of Open Access Journals (Sweden)

    Motapon O.

    2015-01-01

    Full Text Available Recent advances in the stepwise multichannel quantum defect theory approach of electron/molecular cation reactive collisions have been applied to perform computations of cross sections and rate coefficients for dissociative recombination and electron-impact ro-vibrational transitions of H2+, BeH+ and their deuterated isotopomers. At very low energy, rovibronic interactions play a significant role in the dynamics, whereas at high energy, the dissociative excitation strongly competes with all other reactive processes.

  13. A First Attempt to Bring Computational Biology into Advanced High School Biology Classrooms

    OpenAIRE

    Suzanne Renick Gallagher; William Coon; Kristin Donley; Abby Scott; GOLDBERG, DEBRA S.

    2011-01-01

    Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and biology teachers who would like to acquaint their students with updated approaches in...

  14. Computational approaches to detect allosteric pathways in transmembrane molecular machines.

    Science.gov (United States)

    Stolzenberg, Sebastian; Michino, Mayako; LeVine, Michael V; Weinstein, Harel; Shi, Lei

    2016-07-01

    Many of the functions of transmembrane proteins involved in signal processing and transduction across the cell membrane are determined by allosteric couplings that propagate the functional effects well beyond the original site of activation. Data gathered from breakthroughs in biochemistry, crystallography, and single molecule fluorescence have established a rich basis of information for the study of molecular mechanisms in the allosteric couplings of such transmembrane proteins. The mechanistic details of these couplings, many of which have therapeutic implications, however, have only become accessible in synergy with molecular modeling and simulations. Here, we review some recent computational approaches that analyze allosteric coupling networks (ACNs) in transmembrane proteins, and in particular the recently developed Protein Interaction Analyzer (PIA) designed to study ACNs in the structural ensembles sampled by molecular dynamics simulations. The power of these computational approaches in interrogating the functional mechanisms of transmembrane proteins is illustrated with selected examples of recent experimental and computational studies pursued synergistically in the investigation of secondary active transporters and GPCRs. This article is part of a Special Issue entitled: Membrane Proteins edited by J.C. Gumbart and Sergei Noskov. PMID:26806157

  15. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

Full Text Available Research in scientific programming enables us to realize more and more complex applications, while, on the other hand, application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches become more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is presented. It automatically generates C code from a problem specification expressed by the Lagrange formalism using Maple.
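The iterative solver mentioned in the abstract can be illustrated with a minimal, dense-matrix sketch of the conjugate gradient method (the actual SPINET solver is a parallel sparse implementation in C; this is only the textbook algorithm):

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for a symmetric positive-definite matrix A,
    given as nested lists. tol bounds the squared residual norm."""
    n = len(b)
    x = [0.0] * n
    r = b[:]              # residual b - A x, with x = 0 initially
    p = r[:]              # search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
print(x)  # exact solution is [1/11, 7/11]
```

For an n-by-n system, CG needs at most n iterations in exact arithmetic, which is why it suits the large sparse systems arising from finite element discretizations.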

  16. Computational approaches to parameter estimation and model selection in immunology

    Science.gov (United States)

    Baker, C. T. H.; Bocharov, G. A.; Ford, J. M.; Lumb, P. M.; Norton, S. J.; Paul, C. A. H.; Junt, T.; Krebs, P.; Ludewig, B.

    2005-12-01

One of the significant challenges in biomathematics (and other areas of science) is to formulate meaningful mathematical models. Our problem is to decide on a parametrized model which is, in some sense, most likely to represent the information in a set of observed data. In this paper, we illustrate the computational implementation of an information-theoretic approach (associated with a maximum likelihood treatment) to modelling in immunology. The approach is illustrated by modelling LCMV infection using a family of models based on systems of ordinary differential and delay differential equations. The models (which use parameters that have a scientific interpretation) are chosen to fit data arising from experimental studies of virus-cytotoxic T lymphocyte kinetics; the parametrized models that result are arranged in a hierarchy by the computation of Akaike indices. The practical illustration is used to convey more general insight. Because the mathematical equations that comprise the models are solved numerically, the accuracy in the computation has a bearing on the outcome, and we address this and other practical details in our discussion.
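The Akaike-index ranking step can be sketched as follows, using the standard least-squares form of the criterion, AIC = n·ln(RSS/n) + 2k (constant terms dropped); the model names and residuals are hypothetical, not from the paper:

```python
import math

def aic(rss, n, k):
    """Least-squares Akaike information criterion (up to an additive
    constant): n observations, k fitted parameters, residual sum of
    squares rss. Lower is better."""
    return n * math.log(rss / n) + 2 * k

def rank_models(fits, n):
    """Order candidate models, given as (name, rss, k), by AIC."""
    scored = sorted((aic(rss, n, k), name) for name, rss, k in fits)
    return [(name, score) for score, name in scored]

# Hypothetical hierarchy: a 2-parameter ODE model with slightly worse fit
# beats a 5-parameter DDE model whose extra parameters barely help.
models = [("ode_2param", 10.0, 2), ("dde_5param", 9.5, 5)]
print(rank_models(models, n=50))
```

This captures the trade-off the paper exploits: extra parameters must buy enough reduction in residuals to justify their complexity penalty.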

  17. Recent developments in the INAA advance prediction computer program

    International Nuclear Information System (INIS)

    Full text: The INAA APCP developed at U.C. Irvine and tested experimentally with the UCI 250 kW TRIGA Mark I reactor has been very successful and useful. Commencing several years ago, copies of it have been distributed to many laboratories that requested them, quite a few of whom have put the program to good use. The BASIC-PLUS program (run earlier at UCI on a PDP-11/45) includes all (n, γ) products, for both thermal-and-epithermal-neutron fluxes. The more extensive FORTRAN-IV program (run earlier at UCI on a DEC-10) - also includes fission-spectrum and 14 MeV-neutron fast-neutron products. The most recent extension of the INAA APCP, to be discussed, is an (n, γ) program for high-intensity (e.g., 1000 MW) TRIGA pulses. At present, both the steady-state and pulsing (n, γ) programs are being rewritten for use with an IBM (or IBM-compatible) Personal Computer. Once the PC version of the steady-state (both thermal and epithermal) APCP is operational, processing of a large number of reference materials of interest to activation analysts (NGS SRM's, IAEA and USGS reference materials, etc.) will be continued and the compiled results made available. (author)

  18. Advanced computer algebra algorithms for the expansion of Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Ablinger, Jakob; Round, Mark; Schneider, Carsten [Johannes Kepler Univ., Linz (Austria). Research Inst. for Symbolic Computation; Bluemlein, Johannes [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)

    2012-10-15

Two-point Feynman parameter integrals, with at most one mass and containing local operator insertions in 4+ε-dimensional Minkowski space, can be transformed to multi-integrals or multi-sums over hyperexponential and/or hypergeometric functions depending on a discrete parameter n. Given such a specific representation, we utilize an enhanced version of the multivariate Almkvist-Zeilberger algorithm (for multi-integrals) and a common summation framework of the holonomic and difference field approach (for multi-sums) to calculate recurrence relations in n. Finally, solving the recurrence we can decide efficiently if the first coefficients of the Laurent series expansion of a given Feynman integral can be expressed in terms of indefinite nested sums and products; if yes, the all-n solution is returned in compact representations, i.e., no algebraic relations exist among the occurring sums and products.

  19. Computing electronic structures: A new multiconfiguration approach for excited states

    International Nuclear Information System (INIS)

We present a new method for the computation of electronic excited states of molecular systems. This method is based upon a recent theoretical definition of multiconfiguration excited states [due to one of us, see M. Lewin, Solutions of the multiconfiguration equations in quantum chemistry, Arch. Rat. Mech. Anal. 171 (2004) 83-114]. Our algorithm, dedicated to the computation of the first excited state, always converges to a stationary state of the multiconfiguration model, which can be interpreted as an approximate excited state of the molecule. The definition of this approximate excited state is variational. An interesting feature is that it satisfies a non-linear Hylleraas-Undheim-MacDonald type principle: the energy of the approximate excited state is an upper bound to the true excited state energy of the N-body Hamiltonian. To compute the first excited state, one has to deform paths on a manifold, like this is usually done in the search for transition states between reactants and products on potential energy surfaces. We propose here a general method for the deformation of paths which could also be useful in other settings. We also compare our method to other approaches used in Quantum Chemistry and give some explanation of the unsatisfactory behaviours which are sometimes observed when using the latter. Numerical results for the special case of two-electron systems are provided: we compute the first singlet excited state potential energy surface of the H2 molecule

  20. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

Full Text Available High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria, and these regions are also known as pathogenicity islands (PAIs). PAIs have some detectable properties, such as having different genomic signatures than the rest of the host genomes, and containing mobility genes so that they can be integrated into the host genome. In this review, we will discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources will also be discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.
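The "genomic signature" cue mentioned above can be illustrated with the simplest such signature, GC content: windows whose GC fraction deviates strongly from the genome-wide average are candidate horizontally transferred regions. This toy sketch (function names and thresholds are illustrative; real PAI predictors combine many features) flags such windows:

```python
def gc_content(seq):
    """Fraction of G and C bases in a DNA string."""
    return sum(seq.count(b) for b in "GC") / len(seq)

def flag_atypical_windows(genome, window=8, threshold=0.25):
    """Flag non-overlapping windows whose GC content deviates from the
    genome-wide average by more than `threshold` -- one simple cue for
    horizontally transferred regions such as PAIs."""
    base = gc_content(genome)
    flagged = []
    for start in range(0, len(genome) - window + 1, window):
        w = genome[start:start + window]
        if abs(gc_content(w) - base) > threshold:
            flagged.append((start, start + window))
    return flagged

# A GC-rich insert embedded in an otherwise balanced toy genome.
genome = "ATGCATGC" * 3 + "GGGGCCCC" + "ATGCATGC" * 3
print(flag_atypical_windows(genome))
```

Real tools additionally test codon usage, dinucleotide bias, and the presence of mobility genes before calling a region a PAI.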

  1. An Improved Computational Approach for Salient Region Detection

    Directory of Open Access Journals (Sweden)

    Qiaorong Zhang

    2010-07-01

Full Text Available Salient region detection in images is very useful for image processing applications like image compression, image segmentation, object detection and recognition. In this paper, an improved approach to detect salient regions is presented. The proposed method can generate a robust saliency map and extract salient regions with precise boundaries. In the proposed method, local saliency, global saliency and rarity saliency of three kinds of low-level feature contrast (intensity, color and orientation) are used to compute the visual saliency. A new feature integration strategy is proposed in this paper. This method can select features and compute their weights dynamically by analyzing the effect of different features on the saliency. A more robust saliency map is thus obtained. The method has been tested on many images to evaluate its validity and effectiveness. We also compare our method with other salient region detection methods, and ours outperforms them in detection results.
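To make the notion of low-level feature contrast concrete, here is a minimal center-surround sketch for the intensity feature alone: each pixel's local saliency is its absolute deviation from the mean of its 3x3 neighbourhood. The reviewed method goes further, combining colour and orientation with dynamically computed weights; this sketch shows only the basic contrast computation:

```python
def intensity_saliency(img):
    """Local intensity contrast: |pixel - mean of its 3x3 neighbourhood|,
    computed on a grayscale image given as nested lists."""
    h, w = len(img), len(img[0])
    sal = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            neigh = [img[ny][nx]
                     for ny in range(max(0, y - 1), min(h, y + 2))
                     for nx in range(max(0, x - 1), min(w, x + 2))]
            sal[y][x] = abs(img[y][x] - sum(neigh) / len(neigh))
    return sal

# A single bright pixel on a dark background dominates the saliency map.
img = [[0, 0, 0, 0],
       [0, 9, 0, 0],
       [0, 0, 0, 0]]
sal = intensity_saliency(img)
peak = max((v, (y, x)) for y, row in enumerate(sal) for x, v in enumerate(row))
print(peak[1])
```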

  2. Benchmarking of computer codes and approaches for modeling exposure scenarios

    International Nuclear Information System (INIS)

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided
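The "most fundamental level" spreadsheet comparison boils down to the standard pathway formula: dose = concentration x annual intake x dose conversion factor. A one-line sketch for the water-ingestion pathway (all numbers are illustrative placeholders, not values from the report):

```python
def ingestion_dose(conc_bq_per_l, intake_l_per_yr, dcf_sv_per_bq):
    """Annual ingestion-pathway dose (Sv/yr): activity concentration in
    water times annual water intake times a dose conversion factor."""
    return conc_bq_per_l * intake_l_per_yr * dcf_sv_per_bq

# Hypothetical unit-concentration case: 1 Bq/L in drinking water,
# 730 L/yr intake, and a placeholder DCF of 2.8e-8 Sv/Bq.
dose = ingestion_dose(1.0, 730.0, 2.8e-8)
print(f"{dose * 1e6:.3f} microSv/yr")
```

Codes such as GENII or PATHRAE-EPA wrap this same product in transport modelling, decay chains, and many pathways at once, which is precisely why the subteam benchmarked them against hand spreadsheets for unit concentrations.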

  3. [Computer work and De Quervain's tenosynovitis: an evidence based approach].

    Science.gov (United States)

    Gigante, M R; Martinotti, I; Cirla, P E

    2012-01-01

The debate on the role of personal computer work as a cause of De Quervain's tenosynovitis has developed only partially, without considering the available multidisciplinary data. A systematic review of the literature, using an evidence-based approach, was performed. Among disorders associated with the use of VDUs, we must distinguish those of the upper limbs and, among them, those related to overload. Experimental studies on the occurrence of De Quervain's tenosynovitis are quite limited, and the occupational etiology is clinically difficult to prove, considering the interference of other activities of daily living and of biological susceptibility (i.e. anatomical variability, sex, age, exercise). At present there is no evidence of any connection between De Quervain syndrome and time of use of the personal computer or keyboard; limited evidence of correlation is found with time using a mouse. No data are available regarding the use, exclusively or predominantly, of personal laptops or mobile "smart phones". PMID:23405595

  4. ADVANCED METHODS FOR THE COMPUTATION OF PARTICLE BEAM TRANSPORT AND THE COMPUTATION OF ELECTROMAGNETIC FIELDS AND MULTIPARTICLE PHENOMENA

    Energy Technology Data Exchange (ETDEWEB)

    Alex J. Dragt

    2012-08-31

    Since 1980, under the grant DEFG02-96ER40949, the Department of Energy has supported the educational and research work of the University of Maryland Dynamical Systems and Accelerator Theory (DSAT) Group. The primary focus of this educational/research group has been on the computation and analysis of charged-particle beam transport using Lie algebraic methods, and on advanced methods for the computation of electromagnetic fields and multiparticle phenomena. This Final Report summarizes the accomplishments of the DSAT Group from its inception in 1980 through its end in 2011.

  5. A pencil beam approach to proton computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Rescigno, Regina, E-mail: regina.rescigno@iphc.cnrs.fr; Bopp, Cécile; Rousseau, Marc; Brasse, David [Université de Strasbourg, IPHC, 23 rue du Loess, Strasbourg 67037, France and CNRS, UMR7178, Strasbourg 67037 (France)

    2015-11-15

    Purpose: A new approach to proton computed tomography (pCT) is presented. In this approach, protons are not tracked one-by-one but a beam of particles is considered instead. The elements of the pCT reconstruction problem (residual energy and path) are redefined on the basis of this new approach. An analytical image reconstruction algorithm applicable to this scenario is also proposed. Methods: The pencil beam (PB) and its propagation in matter were modeled by making use of the generalization of the Fermi–Eyges theory to account for multiple Coulomb scattering (MCS). This model was integrated into the pCT reconstruction problem, allowing the definition of the mean beam path concept similar to the most likely path (MLP) used in the single-particle approach. A numerical validation of the model was performed. The algorithm of filtered backprojection along MLPs was adapted to the beam-by-beam approach. The acquisition of a perfect proton scan was simulated and the data were used to reconstruct images of the relative stopping power of the phantom with the single-proton and beam-by-beam approaches. The resulting images were compared in a qualitative way. Results: The parameters of the modeled PB (mean and spread) were compared to Monte Carlo results in order to validate the model. For a water target, good agreement was found for the mean value of the distributions. As far as the spread is concerned, depth-dependent discrepancies as large as 2%–3% were found. For a heterogeneous phantom, discrepancies in the distribution spread ranged from 6% to 8%. The image reconstructed with the beam-by-beam approach showed a high level of noise compared to the one reconstructed with the classical approach. Conclusions: The PB approach to proton imaging may allow technical challenges imposed by the current proton-by-proton method to be overcome. In this framework, an analytical algorithm is proposed. Further work will involve a detailed study of the performances and limitations of

  6. A pencil beam approach to proton computed tomography

    International Nuclear Information System (INIS)

    Purpose: A new approach to proton computed tomography (pCT) is presented. In this approach, protons are not tracked one-by-one but a beam of particles is considered instead. The elements of the pCT reconstruction problem (residual energy and path) are redefined on the basis of this new approach. An analytical image reconstruction algorithm applicable to this scenario is also proposed. Methods: The pencil beam (PB) and its propagation in matter were modeled by making use of the generalization of the Fermi–Eyges theory to account for multiple Coulomb scattering (MCS). This model was integrated into the pCT reconstruction problem, allowing the definition of the mean beam path concept similar to the most likely path (MLP) used in the single-particle approach. A numerical validation of the model was performed. The algorithm of filtered backprojection along MLPs was adapted to the beam-by-beam approach. The acquisition of a perfect proton scan was simulated and the data were used to reconstruct images of the relative stopping power of the phantom with the single-proton and beam-by-beam approaches. The resulting images were compared in a qualitative way. Results: The parameters of the modeled PB (mean and spread) were compared to Monte Carlo results in order to validate the model. For a water target, good agreement was found for the mean value of the distributions. As far as the spread is concerned, depth-dependent discrepancies as large as 2%–3% were found. For a heterogeneous phantom, discrepancies in the distribution spread ranged from 6% to 8%. The image reconstructed with the beam-by-beam approach showed a high level of noise compared to the one reconstructed with the classical approach. Conclusions: The PB approach to proton imaging may allow technical challenges imposed by the current proton-by-proton method to be overcome. In this framework, an analytical algorithm is proposed. Further work will involve a detailed study of the performances and limitations of

  7. 76 FR 52954 - Workshop: Advancing Research on Mixtures; New Perspectives and Approaches for Predicting Adverse...

    Science.gov (United States)

    2011-08-24

    ... HUMAN SERVICES Workshop: Advancing Research on Mixtures; New Perspectives and Approaches for Predicting... ``Advancing Research on Mixtures: New Perspectives and Approaches for Predicting Adverse Human Health Effects... Research and Training, NIEHS, P.O. Box 12233, MD K3-04, Research Triangle Park, NC 27709, (telephone)...

  8. Advances and Computational Tools towards Predictable Design in Biological Engineering

    Directory of Open Access Journals (Sweden)

    Lorenzo Pasotti

    2014-01-01

Full Text Available The design process of complex systems in all fields of engineering requires a set of quantitatively characterized components and a method to predict the output of systems composed of such elements. This strategy relies on the modularity of the used components or on the prediction of their context-dependent behaviour, when the functioning of parts depends on the specific context. Mathematical models usually support the whole process by guiding the selection of parts and by predicting the output of interconnected systems. Such a bottom-up design process cannot be trivially adopted for biological systems engineering, since the function of parts is hard to predict when components are reused in different contexts. This issue and the intrinsic complexity of living systems limit the capability of synthetic biologists to predict the quantitative behaviour of biological systems. The high potential of synthetic biology strongly depends on the capability of mastering this issue. This review discusses the predictability issues of basic biological parts (promoters, ribosome binding sites, coding sequences, transcriptional terminators, and plasmids) when used to engineer simple and complex gene expression systems in Escherichia coli. A comparison between bottom-up and trial-and-error approaches is performed for all the discussed elements, and mathematical models supporting the prediction of part behaviour are illustrated.

  9. Monitoring Neuromotricity On-line: a Cloud Computing Approach

    OpenAIRE

    Lefebvre, Olivier; Riba, Pau; Gagnon-Marchand, Jules; Fournier, Charles; Fornes, Alicia; Llados, Josep; Plamondon, Réjean

    2015-01-01

    The goal of our experiment is to develop a useful and accessible tool that can be used to evaluate a patient's health by analyzing handwritten strokes. We use a cloud computing approach to analyze stroke data sampled on a commercial tablet working on the Android platform and a distant server to perform complex calculations using the Delta and Sigma lognormal algorithms. A Google Drive account is used to store the data and to ease the development of the project. The communication between the t...
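The Sigma-lognormal analysis mentioned above models each handwritten stroke's speed as a scaled lognormal impulse response. A minimal sketch of one stroke's speed profile follows (parameter values are illustrative, not fitted to real tablet data; `sigma_lognormal_speed` is a hypothetical name):

```python
import math

def sigma_lognormal_speed(t, D, t0, mu, sigma):
    """Speed of a single stroke in the Sigma-lognormal model: a lognormal
    impulse response starting at t0, scaled by stroke amplitude D."""
    if t <= t0:
        return 0.0
    x = t - t0
    return (D / (sigma * math.sqrt(2 * math.pi) * x)
            * math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2)))

# Because the lognormal density integrates to 1, the area under the
# speed profile recovers the stroke amplitude D.
D, t0, mu, sigma = 5.0, 0.1, -1.0, 0.3
dt = 1e-4
area = sum(sigma_lognormal_speed(t0 + i * dt, D, t0, mu, sigma) * dt
           for i in range(1, 200000))
print(round(area, 3))
```

In the described system, such per-stroke parameters (D, t0, mu, sigma) would be extracted on the server from the sampled tablet strokes and used as health indicators.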

  10. A computational approach to mechanistic and predictive toxicology of pesticides

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Vinggaard, Anne Marie; Hadrup, Niels;

    2014-01-01

    Emerging challenges of managing and interpreting large amounts of complex biological data have given rise to the growing field of computational biology. We investigated the applicability of an integrated systems toxicology approach on five selected pesticides to get an overview of their modes...... protein interactome. Then, we explored modes of action of the chemicals, by integrating protein-disease information to the resulting protein networks. The dominating human adverse effects affected were reproductive disorders followed by adrenal diseases. Our results indicated that prochloraz, tebuconazole...

  11. A New Computational Scheme for Computing Greeks by the Asymptotic Expansion Approach

    OpenAIRE

    Matsuoka, Ryosuke; Takahashi, Akihiko; Uchida, Yoshihiko

    2005-01-01

We developed a new scheme for computing "Greeks" of derivatives by an asymptotic expansion approach. In particular, we derived analytical approximation formulae for deltas and Vegas of plain vanilla and average European call options under general Markovian processes of underlying asset prices. Moreover, we introduced a new variance reduction method for Monte Carlo simulations based on the asymptotic expansion scheme. Finally, several numerical examples under CEV processes confirmed the validity...
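The paper's asymptotic-expansion formulae are not reproduced here, but the baseline they improve on can be sketched: a plain Monte Carlo pathwise estimator of a call delta under geometric Brownian motion, checked against the closed-form Black-Scholes delta (this is the standard estimator, not the paper's method; parameter values are illustrative):

```python
import math
import random

def bs_delta(s0, k, r, sigma, t):
    """Closed-form Black-Scholes call delta N(d1), the benchmark."""
    d1 = (math.log(s0 / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    return 0.5 * (1.0 + math.erf(d1 / math.sqrt(2.0)))

def pathwise_delta(s0, k, r, sigma, t, n, seed=1):
    """Monte Carlo pathwise delta estimator under GBM:
    E[ exp(-rT) * 1{S_T > K} * S_T / S_0 ]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        st = s0 * math.exp((r - 0.5 * sigma ** 2) * t + sigma * math.sqrt(t) * z)
        if st > k:
            total += math.exp(-r * t) * st / s0
    return total / n

est = pathwise_delta(100.0, 100.0, 0.05, 0.2, 1.0, n=200000)
print(round(est, 3), round(bs_delta(100.0, 100.0, 0.05, 0.2, 1.0), 3))
```

The paper's variance reduction idea is to use the asymptotic-expansion approximation as a control on such simulations, shrinking the Monte Carlo error for the same n.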

  12. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.
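The core update a DBN performs when fusing a feature stream over time is recursive Bayesian (forward) filtering. A two-state toy sketch follows; the Shape-quality labels, observation symbols, and probabilities are all hypothetical stand-ins for the paper's motion-capture features:

```python
def forward_filter(obs, states, trans, emit, prior):
    """Recursive Bayesian filtering over hidden states: at each step,
    predict through the transition model, weight by the observation
    likelihood, and normalise."""
    belief = dict(prior)
    for o in obs:
        pred = {s: sum(belief[p] * trans[p][s] for p in states) for s in states}
        upd = {s: pred[s] * emit[s][o] for s in states}
        z = sum(upd.values())
        belief = {s: upd[s] / z for s in states}
    return belief

states = ("rising", "sinking")   # two hypothetical Shape qualities
trans = {"rising": {"rising": 0.8, "sinking": 0.2},
         "sinking": {"rising": 0.2, "sinking": 0.8}}
emit = {"rising": {"up": 0.7, "down": 0.3},
        "sinking": {"up": 0.2, "down": 0.8}}
prior = {"rising": 0.5, "sinking": 0.5}

belief = forward_filter(["up", "up"], states, trans, emit, prior)
print(max(belief, key=belief.get))
```

A real DBN extends this single chain to many coupled feature streams across the body, but the predict-update-normalise cycle is the same.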

  13. NATO Advanced Study Institute on Advances in the Computer Simulations of Liquid Crystals

    CERN Document Server

    Zannoni, Claudio

    2000-01-01

    Computer simulations provide an essential set of tools for understanding the macroscopic properties of liquid crystals and of their phase transitions in terms of molecular models. While simulations of liquid crystals are based on the same general Monte Carlo and molecular dynamics techniques as are used for other fluids, they present a number of specific problems and peculiarities connected to the intrinsic properties of these mesophases. The field of computer simulations of anisotropic fluids is interdisciplinary and is evolving very rapidly. The present volume covers a variety of techniques and model systems, from lattices to hard particle and Gay-Berne to atomistic, for thermotropics, lyotropics, and some biologically interesting liquid crystals. Contributions are written by an excellent panel of international lecturers and provides a timely account of the techniques and problems in the field.

  14. Advanced approaches to high intensity laser-driven ion acceleration

    International Nuclear Information System (INIS)

Since the pioneering work that was carried out 10 years ago, the generation of highly energetic ion beams from laser-plasma interactions has been investigated in much detail in the regime of target normal sheath acceleration (TNSA). Creation of ion beams with small longitudinal and transverse emittance and energies extending up to tens of MeV fueled visions of compact, laser-driven ion sources for applications such as ion beam therapy of tumors or fast ignition inertial confinement fusion. However, new pathways are of crucial importance to push the current limits of laser-generated ion beams further towards parameters necessary for those applications. The presented PhD work was intended to develop and explore advanced approaches to high intensity laser-driven ion acceleration that reach beyond TNSA. In this spirit, ion acceleration from two novel target systems was investigated, namely mass-limited microspheres and nm-thin, free-standing diamond-like carbon (DLC) foils. Using such ultrathin foils, a new regime of ion acceleration was found where the laser transfers energy to all electrons located within the focal volume. While for TNSA the accelerating electric field is stationary and ion acceleration is spatially separated from laser absorption into electrons, now a localized longitudinal field enhancement is present that co-propagates with the ions as the accompanying laser pulse pushes the electrons forward. Unprecedented maximum ion energies were obtained, reaching beyond 0.5 GeV for carbon C6+ and thus exceeding previous TNSA results by about one order of magnitude. When changing the laser polarization to circular, electron heating and expansion were shown to be efficiently suppressed, resulting for the first time in a phase-stable acceleration that is dominated by the laser radiation pressure which led to the observation of a peaked C6+ spectrum. Compared to quasi-monoenergetic ion beam generation within the TNSA regime, a more than 40 times increase in

  15. Advanced approaches to high intensity laser-driven ion acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Henig, Andreas

    2010-04-26

    Since the pioneering work that was carried out 10 years ago, the generation of highly energetic ion beams from laser-plasma interactions has been investigated in much detail in the regime of target normal sheath acceleration (TNSA). Creation of ion beams with small longitudinal and transverse emittance and energies extending up to tens of MeV fueled visions of compact, laser-driven ion sources for applications such as ion beam therapy of tumors or fast ignition inertial confinement fusion. However, new pathways are of crucial importance to push the current limits of laser-generated ion beams further towards the parameters necessary for those applications. The presented PhD work was intended to develop and explore advanced approaches to high intensity laser-driven ion acceleration that reach beyond TNSA. In this spirit, ion acceleration from two novel target systems was investigated, namely mass-limited microspheres and nm-thin, free-standing diamond-like carbon (DLC) foils. Using such ultrathin foils, a new regime of ion acceleration was found in which the laser transfers energy to all electrons located within the focal volume. While for TNSA the accelerating electric field is stationary and ion acceleration is spatially separated from laser absorption into electrons, now a localized longitudinal field enhancement is present that co-propagates with the ions as the accompanying laser pulse pushes the electrons forward. Unprecedented maximum ion energies were obtained, reaching beyond 0.5 GeV for carbon C6+ and thus exceeding previous TNSA results by about one order of magnitude. When the laser polarization was changed to circular, electron heating and expansion were shown to be efficiently suppressed, resulting for the first time in a phase-stable acceleration that is dominated by the laser radiation pressure, which led to the observation of a peaked C6+ spectrum. Compared to quasi-monoenergetic ion beam generation within the TNSA regime, a more than 40 times increase in

  16. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.

  17. Stochastic Computational Approach for Complex Nonlinear Ordinary Differential Equations

    Institute of Scientific and Technical Information of China (English)

    Junaid Ali Khan; Muhammad Asif Zahoor Raja; Ijaz Mansoor Qureshi

    2011-01-01

    We present an evolutionary computational approach for the solution of nonlinear ordinary differential equations (NLODEs). The mathematical modeling is performed by a feed-forward artificial neural network that defines an unsupervised error. The training of these networks is achieved by a hybrid intelligent algorithm, a combination of global search with a genetic algorithm and local search by a pattern search technique. The applicability of this approach ranges from single-order NLODEs to systems of coupled differential equations. We illustrate the method by solving a variety of model problems and present comparisons with solutions obtained by exact methods and classical numerical methods. Unlike other numerical techniques of comparable accuracy, the solution is provided on a continuous finite time interval. With the advent of neuroprocessors and digital signal processors the method becomes particularly interesting due to the expected substantial gains in execution speed.
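The collocation idea described above can be sketched in a few lines: a trial solution built from a small feed-forward network is tuned so that the ODE residual (the "unsupervised error") vanishes at sample points. This is only an illustrative sketch, not the authors' implementation; the model problem y' = -y with y(0) = 1, the network size, and the simple random-population-plus-pattern-search optimizer (a crude stand-in for their genetic-algorithm/pattern-search hybrid) are all assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Collocation points on [0, 2] for the model problem y' = -y, y(0) = 1
t = np.linspace(0.0, 2.0, 21)

def net(params, t):
    # One hidden layer of 8 tanh units: params = (w1[8], b1[8], w2[8])
    w1, b1, w2 = params[:8], params[8:16], params[16:24]
    h = np.tanh(np.outer(t, w1) + b1)          # shape (len(t), 8)
    return h @ w2

def trial(params, t):
    # Trial solution y(t) = 1 + t * N(t) satisfies y(0) = 1 by construction
    return 1.0 + t * net(params, t)

def residual(params):
    # Unsupervised error: mean squared ODE residual y' + y at the collocation points
    eps = 1e-4
    dy = (trial(params, t + eps) - trial(params, t - eps)) / (2 * eps)
    return np.mean((dy + trial(params, t)) ** 2)

# Global search: best of a random population (GA stand-in) ...
pop = [rng.normal(scale=0.5, size=24) for _ in range(200)]
best = min(pop, key=residual)

# ... followed by local coordinate pattern search with shrinking step
step = 0.1
while step > 1e-4:
    improved = False
    for i in range(24):
        for s in (step, -step):
            cand = best.copy()
            cand[i] += s
            if residual(cand) < residual(best):
                best, improved = cand, True
    if not improved:
        step /= 2

print("max |y - exp(-t)| on grid:", np.max(np.abs(trial(best, t) - np.exp(-t))))
```

Because the boundary condition is built into the trial form, only the interior residual is optimized, which is the key feature giving a continuous (closed-form) approximate solution rather than values on a discrete mesh.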

  18. Computational Approach to Diarylprolinol-Silyl Ethers in Aminocatalysis.

    Science.gov (United States)

    Halskov, Kim Søholm; Donslund, Bjarke S; Paz, Bruno Matos; Jørgensen, Karl Anker

    2016-05-17

    Asymmetric organocatalysis has witnessed a remarkable development since its "re-birth" at the beginning of the millennium. In this rapidly growing field, computational investigations have proven to be an important contribution to the elucidation of mechanisms and the rationalization of the stereochemical outcomes of many of the reaction concepts developed. The improved understanding of mechanistic details has facilitated the further advancement of the field. The diarylprolinol-silyl ethers have, since their introduction, been among the most widely applied catalysts in asymmetric aminocatalysis due to their robustness and generality. Although aminocatalytic methods at first glance appear to follow relatively simple mechanistic principles, more comprehensive computational studies have shown that this notion is in some cases deceiving and that more complex pathways might be operating. In this Account, the application of density functional theory (DFT) and other computational methods to systems catalyzed by the diarylprolinol-silyl ethers is described. It will be illustrated how computational investigations have shed light on the structure and reactivity of important intermediates in aminocatalysis, such as enamines and iminium ions formed from aldehydes and α,β-unsaturated aldehydes, respectively. Enamine and iminium ion catalysis can be classified as HOMO-raising and LUMO-lowering activation modes. In these systems, exclusive reactivity through one of the possible intermediates is often a requisite for achieving high stereoselectivity; therefore, the appreciation of subtle energy differences has been vital for the efficient development of new stereoselective reactions. The diarylprolinol-silyl ethers have also allowed for novel activation modes for unsaturated aldehydes, which have opened up avenues for the development of new remote functionalization reactions of poly-unsaturated carbonyl compounds via di-, tri-, and tetraenamine intermediates and vinylogous iminium ions.

  19. Computational approaches to predict bacteriophage-host relationships.

    Science.gov (United States)

    Edwards, Robert A; McNair, Katelyn; Faust, Karoline; Raes, Jeroen; Dutilh, Bas E

    2016-03-01

    Metagenomics has changed the face of virus discovery by enabling the accurate identification of viral genome sequences without requiring isolation of the viruses. As a result, metagenomic virus discovery leaves the first and most fundamental question about any novel virus unanswered: What host does the virus infect? The diversity of the global virosphere and the volumes of data obtained in metagenomic sequencing projects demand computational tools for virus-host prediction. We focus on bacteriophages (phages, viruses that infect bacteria), the most abundant and diverse group of viruses found in environmental metagenomes. By analyzing 820 phages with annotated hosts, we review and assess the predictive power of in silico phage-host signals. Sequence homology approaches are the most effective at identifying known phage-host pairs. Compositional and abundance-based methods contain significant signal for phage-host classification, providing opportunities for analyzing the unknowns in viral metagenomes. Together, these computational approaches further our knowledge of the interactions between phages and their hosts. Importantly, we find that all reviewed signals significantly link phages to their hosts, illustrating how current knowledge and insights about the interaction mechanisms and ecology of coevolving phages and bacteria can be exploited to predict phage-host relationships, with potential relevance for medical and industrial applications. PMID:26657537
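Of the signals reviewed above, the compositional one is the simplest to illustrate: phages tend to mimic the oligonucleotide usage of their host. The sketch below is a hypothetical toy, not the authors' benchmark pipeline; it predicts the host whose k-mer composition profile lies nearest to the phage's, using invented miniature "genomes" named host_a and host_b.

```python
from collections import Counter
from itertools import product

def kmer_profile(seq, k=3):
    # Normalized k-mer frequency vector over all 4^k possible DNA k-mers
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values()) or 1
    return [counts[m] / total for m in kmers]

def predict_host(phage_seq, host_seqs, k=3):
    # Nearest host genome by Euclidean distance between composition profiles
    pv = kmer_profile(phage_seq, k)
    def dist(name):
        hv = kmer_profile(host_seqs[name], k)
        return sum((a - b) ** 2 for a, b in zip(pv, hv)) ** 0.5
    return min(host_seqs, key=dist)

# Toy genomes (hypothetical): host_a is AT-rich, host_b is GC-rich, and the
# phage genome shares host_a's composition bias
hosts = {"host_a": "ATATTTAAATTA" * 40, "host_b": "GCGGCCGCGGCC" * 40}
phage = "ATAATTTAATAT" * 40
print(predict_host(phage, hosts))  # host_a
```

On real metagenomes one would use longer k-mers and many candidate hosts, and, as the abstract notes, combine this compositional signal with homology- and abundance-based evidence.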

  20. Advanced computational methods for the assessment of reactor core behaviour during reactivity initiated accidents. Final report

    International Nuclear Information System (INIS)

    The document at hand serves as the final report for the reactor safety research project RS1183 ''Advanced Computational Methods for the Assessment of Reactor Core Behavior During Reactivity-Initiated Accidents''. The work performed in the framework of this project was dedicated to the development, validation and application of advanced computational methods for the simulation of transients and accidents of nuclear installations. These simulation tools describe in particular the behavior of the reactor core (with respect to neutronics, thermal-hydraulics and thermal mechanics) at a very high level of detail. The overall goal of this project was the deployment of a modern nuclear computational chain which provides, besides advanced 3D tools for coupled neutronics/ thermal-hydraulics full core calculations, also appropriate tools for the generation of multi-group cross sections and Monte Carlo models for the verification of the individual calculational steps. This computational chain shall primarily be deployed for light water reactors (LWR), but should beyond that also be applicable for innovative reactor concepts. Thus, validation on computational benchmarks and critical experiments was of paramount importance. Finally, appropriate methods for uncertainty and sensitivity analysis were to be integrated into the computational framework, in order to assess and quantify the uncertainties due to insufficient knowledge of data, as well as due to methodological aspects.

  1. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    Science.gov (United States)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods in use at the Lockheed-Georgia Company to aid the evaluation and design of new and modified aircraft indicates that large, specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should provide a tool for designing better aerospace vehicles while reducing development costs, both by performing computations with Navier-Stokes solution algorithms and by allowing less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; the computer should therefore handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations, and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  2. Computational Benefits Using an Advanced Concatenation Scheme Based on Reduced Order Models for RF Structures

    CERN Document Server

    Heller, Johann; Van Rienen, Ursula; 10.1016/j.phpro.2015.11.060

    2015-01-01

    The computation of electromagnetic fields and parameters derived thereof for lossless radio frequency (RF) structures filled with isotropic media is an important task for the design and operation of particle accelerators. Unfortunately, these computations are often highly demanding with regard to computational effort. The entire computational demand of the problem can be reduced using decomposition schemes in order to solve the field problems on standard workstations. This paper presents one of the first detailed comparisons between the recently proposed state-space concatenation approach (SSC) and a direct computation for an accelerator cavity with coupler-elements that break the rotational symmetry.

  3. Computer visualization for enhanced operator performance for advanced nuclear power plants

    International Nuclear Information System (INIS)

    The operators of nuclear power plants are presented with an often uncoordinated and arbitrary array of displays and controls. Information is presented in different formats and on physically dissimilar instruments. In an accident situation, an operator must be very alert to quickly diagnose and respond to the state of the plant as represented by the control room displays. Improvements in display technology and increased automation have helped reduce operator burden; however, too much automation may lead to operator apathy and decreased efficiency. A proposed approach to the human-system interface uses modern graphics technology and advances in computational power to provide a visualization or ''virtual reality'' framework for the operator. This virtual reality comprises a simulated perception of another existence, complete with three-dimensional structures, backgrounds, and objects. By placing the operator in an environment that presents an integrated, graphical, and dynamic view of the plant, his attention is directly engaged. Through computer simulation, the operator can view plant equipment, read local displays, and manipulate controls as if he were in the local area. This process not only keeps an operator involved in plant operation and testing procedures, but also reduces personnel exposure. In addition, operator stress is reduced because, with realistic views of plant areas and equipment, the status of the plant can be accurately grasped without interpreting a large number of displays. Since a single operator can quickly ''visit'' many different plant areas without physically moving from the control room, these techniques are useful in reducing labor requirements for surveillance and maintenance activities. This concept requires a plant dynamic model continuously updated via real-time process monitoring. This model interacts with a three-dimensional, solid-model architectural configuration of the physical plant

  4. Advances in multi-physics and high performance computing in support of nuclear reactor power systems modeling and simulation

    International Nuclear Information System (INIS)

    Significant advances in computational performance have occurred over the past two decades, achieved not only by the introduction of more powerful processors but also by the incorporation of parallelism in computer hardware at all levels. Simultaneous with these hardware and associated system software advances have been advances in modeling physical phenomena and in the numerical algorithms that allow their use in simulation. This paper presents a review of the advances in computer performance, discusses the modeling and simulation capabilities required to address the multi-physics and multi-scale phenomena applicable to a nuclear reactor core simulator, and presents examples of the performance of relevant physics simulation codes on high performance computers.

  5. Inferring haplotypes at the NAT2 locus: the computational approach

    Directory of Open Access Journals (Sweden)

    Sabbagh Audrey

    2005-06-01

    Full Text Available Abstract Background Numerous studies have attempted to relate genetic polymorphisms within the N-acetyltransferase 2 gene (NAT2) to interindividual differences in response to drugs or in disease susceptibility. However, genotyping of individuals' single-nucleotide polymorphisms (SNPs) alone may not always provide enough information to reach these goals. It is important to link SNPs in terms of haplotypes, which carry more information about the genotype-phenotype relationship. Special analytical techniques have been designed to unequivocally determine the allocation of mutations to either DNA strand. However, molecular haplotyping methods are labour-intensive and expensive and do not appear to be good candidates for routine clinical applications. A cheap and relatively straightforward alternative is the use of computational algorithms. The objective of this study was to assess the performance of the computational approach in NAT2 haplotype reconstruction from phase-unknown genotype data, for population samples of various ethnic origin. Results We empirically evaluated the effectiveness of four haplotyping algorithms in predicting haplotype phases at NAT2, by comparing the results with those directly obtained through molecular haplotyping. All computational methods provided remarkably accurate and reliable estimates for NAT2 haplotype frequencies and individual haplotype phases. The Bayesian algorithm implemented in the PHASE program performed the best. Conclusion This investigation provides a solid basis for the confident and rational use of computational methods, which appear to be a good alternative for inferring haplotype phases in the particular case of the NAT2 gene, where there is near complete linkage disequilibrium between polymorphic markers.
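PHASE itself is Bayesian, but the core idea of statistical phasing can be illustrated with the simpler expectation-maximization scheme (Excoffier-Slatkin style): enumerate the haplotype pairs compatible with each phase-unknown genotype, weight them by current haplotype frequencies, and re-estimate. The genotype coding and the toy two-SNP sample below are assumptions made for this sketch, not data from the study.

```python
from collections import defaultdict
from itertools import product

def compatible_pairs(gt):
    # All ordered haplotype pairs consistent with genotype gt, where each site
    # is coded 0 = homozygous ref, 1 = heterozygous, 2 = homozygous alt
    opts = {0: [(0, 0)], 1: [(0, 1), (1, 0)], 2: [(1, 1)]}
    for combo in product(*(opts[g] for g in gt)):
        yield tuple(c[0] for c in combo), tuple(c[1] for c in combo)

def em_haplotype_freqs(genotypes, iters=50):
    # EM estimate of population haplotype frequencies from unphased genotypes
    haps = {h for g in genotypes for pair in compatible_pairs(g) for h in pair}
    freq = {h: 1.0 / len(haps) for h in haps}
    for _ in range(iters):
        counts = defaultdict(float)
        for g in genotypes:                      # E-step: weight each resolution
            pairs = list(compatible_pairs(g))
            w = [freq[a] * freq[b] for a, b in pairs]
            z = sum(w)
            for (a, b), wi in zip(pairs, w):
                counts[a] += wi / z
                counts[b] += wi / z
        total = sum(counts.values())             # M-step: renormalize
        freq = {h: c / total for h, c in counts.items()}
    return freq

# Toy two-SNP sample: unambiguous homozygotes plus three double heterozygotes
genotypes = [(0, 0), (2, 2), (1, 1), (1, 1), (1, 1)]
freq = em_haplotype_freqs(genotypes)
phase = max(compatible_pairs((1, 1)), key=lambda p: freq[p[0]] * freq[p[1]])
print(sorted(phase))  # the double hets resolve to the (0,0)/(1,1) pair
```

The unambiguous homozygotes anchor the frequencies, so the otherwise ambiguous double heterozygotes are phased as (0,0) with (1,1), which is exactly the leverage that strong linkage disequilibrium (as at NAT2) gives computational phasing.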

  6. Proceedings of the national conference on advanced communication and computing techniques

    International Nuclear Information System (INIS)

    The objective of the conference was to take stock of technological innovation aimed at the improvement and development of humanity. The main areas discussed in the conference were: advanced computer architecture, next generation networking, optical wireless communication, wireless networking, embedded systems etc. Papers relevant to INIS are indexed separately

  7. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    Energy Technology Data Exchange (ETDEWEB)

    A.V.G. Chizmeshya; M.J. McKelvy; G.H. Wolf; R.W. Carpenter; D.A. Gormley; J.R. Diefenbacher; R. Marzke

    2006-03-01

    have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein have been strategically integrated with our new DOE-supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach has provided a deeper understanding of the key reaction mechanisms than either individual approach can alone.
We used ab initio techniques to significantly advance our understanding of atomic-level processes at the solid

  8. Advanced computational tools and methods for nuclear analyses of fusion technology systems

    International Nuclear Information System (INIS)

    An overview is presented of advanced computational tools and methods developed recently for nuclear analyses of Fusion Technology systems such as the experimental device ITER ('International Thermonuclear Experimental Reactor') and the intense neutron source IFMIF ('International Fusion Material Irradiation Facility'). These include Monte Carlo based computational schemes for the calculation of three-dimensional shut-down dose rate distributions, methods, codes and interfaces for the use of CAD geometry models in Monte Carlo transport calculations, algorithms for Monte Carlo based sensitivity/uncertainty calculations, as well as computational techniques and data for IFMIF neutronics and activation calculations. (author)

  9. Computational Approaches for Microalgal Biofuel Optimization: A Review

    Directory of Open Access Journals (Sweden)

    Joseph Koussa

    2014-01-01

    Full Text Available The increased demand and consumption of fossil fuels have raised interest in finding renewable energy sources throughout the globe. Much focus has been placed on optimizing microorganisms and primarily microalgae, to efficiently produce compounds that can substitute for fossil fuels. However, the path to achieving economic feasibility is likely to require strain optimization through using available tools and technologies in the fields of systems and synthetic biology. Such approaches invoke a deep understanding of the metabolic networks of the organisms and their genomic and proteomic profiles. The advent of next generation sequencing and other high throughput methods has led to a major increase in availability of biological data. Integration of such disparate data can help define the emergent metabolic system properties, which is of crucial importance in addressing biofuel production optimization. Herein, we review major computational tools and approaches developed and used in order to potentially identify target genes, pathways, and reactions of particular interest to biofuel production in algae. As the use of these tools and approaches has not been fully implemented in algal biofuel research, the aim of this review is to highlight the potential utility of these resources toward their future implementation in algal research.

  10. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  11. 1st International Conference on Computational Advancement in Communication Circuits and Systems

    CERN Document Server

    Dalapati, Goutam; Banerjee, P; Mallick, Amiya; Mukherjee, Moumita

    2015-01-01

    This book comprises the proceedings of the 1st International Conference on Computational Advancement in Communication Circuits and Systems (ICCACCS 2014), organized by Narula Institute of Technology under the patronage of the JIS group, affiliated to West Bengal University of Technology. The conference was supported by the Technical Education Quality Improvement Program (TEQIP), New Delhi, India, and was held in technical collaboration with the IEEE Kolkata Section, with Springer as publication partner. The book contains 62 refereed papers that aim to highlight new theoretical and experimental findings in the field of electronics and communication engineering, including interdisciplinary fields like Advanced Computing, Pattern Recognition and Analysis, and Signal and Image Processing. The proceedings cover the principles, techniques and applications in microwave & devices, communication & networking, signal & image processing, and computations & mathematics & control. The proceedings reflect the conference’s emp...

  12. A Novel Approach of Load Balancing in Cloud Computing using Computational Intelligence

    Directory of Open Access Journals (Sweden)

    Shabnam Sharma

    2016-02-01

    Full Text Available Nature-inspired meta-heuristic algorithms have proved beneficial for solving real-world combinatorial problems such as the minimum spanning tree, the knapsack problem, process planning problems, load balancing and many more. In this research work, existing meta-heuristic approaches are discussed. Due to the astonishing feature of echolocation, the bat algorithm has drawn major attention in recent years and is applicable to different applications such as vehicle routing optimization, time-tabling in railway optimization problems, load balancing in cloud computing, etc. Later, the biological behaviour of bats is explored and various areas of further research are discussed. Finally, the main objective of the research paper is to propose an algorithm for one of the most important applications, namely load balancing in a cloud computing environment.
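To make the bat-algorithm idea concrete for load balancing, the sketch below minimizes the makespan (load of the busiest VM) of a task-to-VM assignment. It is a deliberately simplified, hypothetical illustration rather than the paper's proposal: the task lengths and VM count are invented, loudness and pulse-rate dynamics of the full bat algorithm are collapsed into a fixed-probability local walk, and continuous bat positions are decoded to discrete assignments by rounding.

```python
import random

random.seed(1)
tasks = [4, 8, 2, 7, 5, 1, 9, 3]    # hypothetical task lengths
M = 3                                # number of virtual machines

def decode(x):
    # Map a continuous bat position to a discrete task -> VM assignment
    return [int(abs(v)) % M for v in x]

def makespan(assign):
    # Load of the busiest VM; the objective to minimize
    loads = [0.0] * M
    for length, vm in zip(tasks, assign):
        loads[vm] += length
    return max(loads)

N, n = 20, len(tasks)                # population size, problem dimension
X = [[random.uniform(0, 3 * M) for _ in range(n)] for _ in range(N)]
V = [[0.0] * n for _ in range(N)]
best = min(X, key=lambda x: makespan(decode(x)))

for _ in range(100):
    for i in range(N):
        f = random.uniform(0, 1)     # pulse frequency drawn per bat per step
        V[i] = [v + (x - b) * f for v, x, b in zip(V[i], X[i], best)]
        cand = [x + v for x, v in zip(X[i], V[i])]
        if random.random() > 0.5:    # occasional local random walk around the best bat
            cand = [b + random.gauss(0, 0.6) for b in best]
        if makespan(decode(cand)) <= makespan(decode(X[i])):
            X[i] = cand
            if makespan(decode(cand)) < makespan(decode(best)):
                best = cand

print(makespan(decode(best)))
```

For these task lengths the total work is 39, so the makespan cannot drop below 13; the search typically ends at or near that bound, which is what makes such swarm heuristics attractive for NP-hard balancing problems.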

  13. Innovations and advances in computing, informatics, systems sciences, networking and engineering

    CERN Document Server

    Elleithy, Khaled

    2015-01-01

    Innovations and Advances in Computing, Informatics, Systems Sciences, Networking and Engineering  This book includes a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of Computer Science, Informatics, and Systems Sciences, and Engineering. It includes selected papers from the conference proceedings of the Eighth and some selected papers of the Ninth International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2012 & CISSE 2013). Coverage includes topics in: Industrial Electronics, Technology & Automation, Telecommunications and Networking, Systems, Computing Sciences and Software Engineering, Engineering Education, Instructional Technology, Assessment, and E-learning.  ·       Provides the latest in a series of books growing out of the International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering; ·       Includes chapters in the most a...

  14. Leaching from Heterogeneous Heck Catalysts: A Computational Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The possibility of carrying out a purely heterogeneous Heck reaction in practice, without Pd leaching, has been previously considered by a number of research groups, but no general consensus has yet been reached. Here, the reaction was, for the first time, evaluated by a simple computational approach. Modelling experiments were performed on one of the initial catalytic steps: phenyl halide attachment on the (111)-to-(100) and (111)-to-(111) ridges of a Pd crystal. Three surface structures of the resulting [PhPdX] were identified as possible reactive intermediates. Following potential energy minimisation calculations based on a universal force field, the relative stabilities of these surface species were then determined. Results showed the most stable species to be one in which a Pd ridge atom is removed from the Pd crystal structure, suggesting that Pd leaching induced by phenyl halides is energetically favourable.

  15. Computer Modeling of Violent Intent: A Content Analysis Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    2014-01-03

    We present a computational approach to modeling the intent of a communication source, representing a group or an individual, to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent in order to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, which have proved to serve as effective channels in furthering sociopolitical change.

  16. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov

    2013-07-01

    Full Text Available Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into a whole-system level of understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand the information they contain on diet – oral microbiome – host mucosal transcriptome interactions. In particular we focus on the systems biology of Porphyromonas gingivalis in the context of high-throughput computational methods tightly integrated with translational systems medicine. Those approaches have applications both for basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, and for human disease, where we can validate new mechanisms and biomarkers for the prevention and treatment of chronic disorders.

  17. Computer Aided Interpretation Approach for Optical Tomographic Images

    CERN Document Server

    Klose, Christian D; Netz, Uwe; Beuthan, Juergen; Hielscher, Andreas H

    2010-01-01

    A computer-aided interpretation approach is proposed to detect rheumatoid arthritis (RA) of human finger joints in optical tomographic images. The image interpretation method employs a multi-variate signal detection analysis aided by a machine learning classification algorithm, called Self-Organizing Mapping (SOM). Unlike in previous studies, this allows for combining multiple physical image parameters, such as minimum and maximum values of the absorption coefficient, for identifying affected and unaffected joints. Classification performances obtained by the proposed method were evaluated in terms of sensitivity, specificity, Youden index, and mutual information. Different methods (i.e., clinical diagnostics, ultrasound imaging, magnetic resonance imaging and inspection of optical tomographic images) were used as "ground truth" benchmarks to determine the performance of image interpretations. Using data from 100 finger joints, findings suggest that some parameter combinations lead to higher sensitivities while...
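
    A minimal, self-contained sketch of the classification step described above: a 1-D self-organizing map trained on two-element feature vectors (e.g. minimum and maximum absorption coefficient per joint). This is a generic SOM in pure Python, not the study's implementation; the node count, learning schedule, and data are illustrative assumptions.

```python
import math
import random

def train_som(data, n_nodes=4, epochs=500, lr0=0.5, radius0=2.0, seed=0):
    """Train a 1-D self-organizing map on feature vectors such as
    (min absorption coefficient, max absorption coefficient) per joint."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.random() for _ in range(dim)] for _ in range(n_nodes)]
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)                 # decaying learning rate
        radius = max(radius0 * (1.0 - t / epochs), 0.5)
        x = data[rng.randrange(len(data))]
        bmu = classify(nodes, x)                      # best-matching unit
        for i in range(n_nodes):
            # Gaussian neighborhood: nodes near the BMU move more
            h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
            nodes[i] = [w + lr * h * (xj - w) for w, xj in zip(nodes[i], x)]
    return nodes

def classify(nodes, x):
    """Index of the map node closest to a feature vector."""
    return min(range(len(nodes)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(nodes[i], x)))
```

    After training, joints mapping to different nodes fall into different clusters, which can then be labeled as affected or unaffected against a ground-truth benchmark.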

  18. A Computational Drug Repositioning Approach for Targeting Oncogenic Transcription Factors.

    Science.gov (United States)

    Gayvert, Kaitlyn M; Dardenne, Etienne; Cheung, Cynthia; Boland, Mary Regina; Lorberbaum, Tal; Wanjala, Jackline; Chen, Yu; Rubin, Mark A; Tatonetti, Nicholas P; Rickman, David S; Elemento, Olivier

    2016-06-14

    Mutations in transcription factor (TF) genes are frequently observed in tumors, often leading to aberrant transcriptional activity. Unfortunately, TFs are often considered undruggable due to the absence of targetable enzymatic activity. To address this problem, we developed CRAFTT, a computational drug-repositioning approach for targeting TF activity. CRAFTT combines ChIP-seq with drug-induced expression profiling to identify small molecules that can specifically perturb TF activity. Application to ENCODE ChIP-seq datasets revealed known drug-TF interactions, and a global drug-protein network analysis supported these predictions. Application of CRAFTT to ERG, a pro-invasive, frequently overexpressed oncogenic TF, predicted that dexamethasone would inhibit ERG activity. Dexamethasone significantly decreased cell invasion and migration in an ERG-dependent manner. Furthermore, analysis of electronic medical record data indicates a protective role for dexamethasone against prostate cancer. Altogether, our method provides a broadly applicable strategy for identifying drugs that specifically modulate TF activity. PMID:27264179

  19. From computer-assisted intervention research to clinical impact: The need for a holistic approach.

    Science.gov (United States)

    Ourselin, Sébastien; Emberton, Mark; Vercauteren, Tom

    2016-10-01

    The early days of the field of medical image computing (MIC) and computer-assisted intervention (CAI), when publishing a strong self-contained methodological algorithm was enough to produce impact, are over. As a community, we now have substantial responsibility to translate our scientific progresses into improved patient care. In the field of computer-assisted interventions, the emphasis is also shifting from the mere use of well-known established imaging modalities and position trackers to the design and combination of innovative sensing, elaborate computational models and fine-grained clinical workflow analysis to create devices with unprecedented capabilities. The barriers to translating such devices in the complex and understandably heavily regulated surgical and interventional environment can seem daunting. Whether we leave the translation task mostly to our industrial partners or welcome, as researchers, an important share of it is up to us. We argue that embracing the complexity of surgical and interventional sciences is mandatory to the evolution of the field. Being able to do so requires large-scale infrastructure and a critical mass of expertise that very few research centres have. In this paper, we emphasise the need for a holistic approach to computer-assisted interventions where clinical, scientific, engineering and regulatory expertise are combined as a means of moving towards clinical impact. To ensure that the breadth of infrastructure and expertise required for translational computer-assisted intervention research does not lead to a situation where the field advances only thanks to a handful of exceptionally large research centres, we also advocate that solutions need to be designed to lower the barriers to entry. Inspired by fields such as particle physics and astronomy, we claim that centralised very large innovation centres with state of the art technology and health technology assessment capabilities backed by core support staff and open

  20. A computational approach for identifying pathogenicity islands in prokaryotic genomes

    Directory of Open Access Journals (Sweden)

    Oh Tae Kwang

    2005-07-01

    Full Text Available Abstract Background Pathogenicity islands (PAIs), distinct genomic segments of pathogens encoding virulence factors, represent a subgroup of genomic islands (GIs) that have been acquired by horizontal gene transfer events. Up to now, computational approaches for identifying PAIs have focused on the detection of genomic regions that differ from the rest of the genome only in their base composition and codon usage. These approaches often lead to the identification of genomic islands rather than PAIs. Results We present a computational method for detecting potential PAIs in complete prokaryotic genomes by combining sequence similarities with abnormalities in genomic composition. We first collected 207 GenBank accessions containing either part or all of the reported PAI loci. In sequenced genomes, strips of PAI homologs were defined based on the proximity of the homologs of genes in the same PAI accession. An algorithm reminiscent of a sequence-assembly procedure was then devised to merge overlapping or adjacent genomic strips into a larger genomic region. Among the defined genomic regions, PAI-like regions were identified by the presence of homolog(s) of virulence genes. In addition, GIs were postulated by calculating G+C content anomalies and codon usage bias. Of 148 prokaryotic genomes examined, 23 pathogenic and 6 non-pathogenic bacteria contained 77 candidate PAIs that partly or entirely overlap GIs. Conclusion Supporting the validity of our method, the list of candidate PAIs included thirty-four PAIs previously identified in genome sequencing papers. Furthermore, in some instances, our method was able to detect entire PAIs for those for which only partial sequences are available. Our method proved efficient for demarcating potential PAIs in our study. In addition, the function(s) and origin(s) of a candidate PAI can be inferred by investigating the PAI queries comprising it. 
Identification and analysis of potential PAIs in prokaryotic
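
    The compositional signal used above to postulate GIs (G+C content anomaly) can be sketched in a few lines. The function below is an illustrative, simplified stand-in for that step, not the authors' code; the window size, step, and z-score cutoff are arbitrary assumptions.

```python
def gc_anomalous_windows(genome, window=5000, step=1000, z_cut=1.5):
    """Flag sliding windows whose G+C content deviates from the genome-wide
    window mean by more than z_cut standard deviations."""
    def gc(s):
        return (s.count('G') + s.count('C')) / len(s)
    wins = [(i, gc(genome[i:i + window]))
            for i in range(0, len(genome) - window + 1, step)]
    mean = sum(v for _, v in wins) / len(wins)
    sd = (sum((v - mean) ** 2 for _, v in wins) / len(wins)) ** 0.5 or 1e-9
    # return (start, G+C fraction) for each anomalous window
    return [(i, v) for i, v in wins if abs(v - mean) / sd > z_cut]
```

    Windows flagged here would then be intersected with the homology-based PAI-like regions; a full implementation would add the codon usage bias signal as well.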

  1. A computational approach for deciphering the organization of glycosaminoglycans.

    Directory of Open Access Journals (Sweden)

    Jean L Spencer

    Full Text Available BACKGROUND: Increasing evidence has revealed important roles for complex glycans as mediators of normal and pathological processes. Glycosaminoglycans are a class of glycans that bind and regulate the function of a wide array of proteins at the cell-extracellular matrix interface. The specific sequence and chemical organization of these polymers likely define function; however, identification of the structure-function relationships of glycosaminoglycans has been met with challenges associated with the unique level of complexity and the nontemplate-driven biosynthesis of these biopolymers. METHODOLOGY/PRINCIPAL FINDINGS: To address these challenges, we have devised a computational approach to predict fine structure and patterns of domain organization of the specific glycosaminoglycan, heparan sulfate (HS). Using chemical composition data obtained after complete and partial digestion of mixtures of HS chains with specific degradative enzymes, the computational analysis produces populations of theoretical HS chains with structures that meet both biosynthesis and enzyme degradation rules. The model performs these operations through a modular format consisting of input/output sections and three routines called chainmaker, chainbreaker, and chainsorter. We applied this methodology to analyze HS preparations isolated from pulmonary fibroblasts and epithelial cells. Significant differences in the general organization of these two HS preparations were observed, with HS from epithelial cells having a greater frequency of highly sulfated domains. Epithelial HS also showed a higher density of specific HS domains that have been associated with inhibition of neutrophil elastase. Experimental analysis of elastase inhibition was consistent with the model predictions and demonstrated that HS from epithelial cells had greater inhibitory activity than HS from fibroblasts. 
CONCLUSIONS/SIGNIFICANCE: This model establishes the conceptual framework for a new class of
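
    The chainmaker/chainbreaker/chainsorter routines named above can be mimicked with a toy model. The sketch below is a deliberately simplified illustration, not the published model: chains are strings of 'S' (sulfated) and 'U' (unsulfated) disaccharide codes, and the cleavage rule for the hypothetical enzyme is invented for illustration.

```python
import random

def chainmaker(length, p_sulfated=0.4, seed=1):
    """Generate a theoretical HS chain as a string of disaccharide codes:
    'S' = sulfated, 'U' = unsulfated (a toy stand-in for biosynthesis rules)."""
    rng = random.Random(seed)
    return ''.join('S' if rng.random() < p_sulfated else 'U'
                   for _ in range(length))

def chainbreaker(chain, site='S'):
    """Cleave after every occurrence of the enzyme's target disaccharide,
    mimicking complete digestion by a site-specific degradative enzyme."""
    frags, cur = [], ''
    for d in chain:
        cur += d
        if d == site:
            frags.append(cur)
            cur = ''
    if cur:
        frags.append(cur)
    return frags

def chainsorter(frags):
    """Tally the fragment-length distribution, the quantity that would be
    compared against experimentally measured composition data."""
    dist = {}
    for f in frags:
        dist[len(f)] = dist.get(len(f), 0) + 1
    return dist
```

    In the real model, candidate chains whose digestion products do not match the measured composition data would be rejected, leaving a population of chains consistent with both biosynthesis and degradation rules.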

  2. A NEW APPROACH TOWARDS INTEGRATED CLOUD COMPUTING ARCHITECTURE

    Directory of Open Access Journals (Sweden)

    Niloofar Khanghahi

    2014-03-01

    Full Text Available Today, across various businesses, administrative and senior managers are seeking new technologies and approaches that they can adopt easily and affordably, thereby raising their competitive profit and utility. Information and Communications Technology (ICT) is no exception to this principle. The cloud computing concept and technology, with its inherent advantages, has created a new ecosystem in the world of computing and is driving the ICT industry one step forward. This technology can play an important role in an organization’s durability and IT strategies. Nowadays, owing to the progress and global popularity of cloud environments, many organizations are moving to the cloud, and some well-known IT solution providers such as IBM and Oracle have introduced specific architectures to be deployed in cloud environments. On the other hand, the use of IT frameworks can be the best way to integrate business processes with other, different processes. The purpose of this paper is to provide a novel architecture for the cloud environment based on recent best practices, frameworks, and other cloud reference architectures. In addition, a new service model is introduced in this proposed architecture. The architecture is finally compared with several other architectures in the form of statistical graphs to show its benefits.

  3. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches for large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands and tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which allows much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features, and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integrating genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  4. Computational approaches to protein inference in shotgun proteomics.

    Science.gov (United States)

    Li, Yong Fuga; Radivojac, Predrag

    2012-01-01

    Shotgun proteomics has recently emerged as a powerful approach to characterizing proteomes in biological samples. Its overall objective is to identify the form and quantity of each protein in a high-throughput manner by coupling liquid chromatography with tandem mass spectrometry. As a consequence of its high throughput nature, shotgun proteomics faces challenges with respect to the analysis and interpretation of experimental data. Among such challenges, the identification of proteins present in a sample has been recognized as an important computational task. This task generally consists of (1) assigning experimental tandem mass spectra to peptides derived from a protein database, and (2) mapping assigned peptides to proteins and quantifying the confidence of identified proteins. Protein identification is fundamentally a statistical inference problem with a number of methods proposed to address its challenges. In this review we categorize current approaches into rule-based, combinatorial optimization and probabilistic inference techniques, and present them using integer programming and Bayesian inference frameworks. We also discuss the main challenges of protein identification and propose potential solutions with the goal of spurring innovative research in this area. PMID:23176300

  5. Computational approaches to protein inference in shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Li Yong

    2012-11-01

    Full Text Available Abstract Shotgun proteomics has recently emerged as a powerful approach to characterizing proteomes in biological samples. Its overall objective is to identify the form and quantity of each protein in a high-throughput manner by coupling liquid chromatography with tandem mass spectrometry. As a consequence of its high-throughput nature, shotgun proteomics faces challenges with respect to the analysis and interpretation of experimental data. Among such challenges, the identification of proteins present in a sample has been recognized as an important computational task. This task generally consists of (1) assigning experimental tandem mass spectra to peptides derived from a protein database, and (2) mapping assigned peptides to proteins and quantifying the confidence of identified proteins. Protein identification is fundamentally a statistical inference problem with a number of methods proposed to address its challenges. In this review we categorize current approaches into rule-based, combinatorial optimization and probabilistic inference techniques, and present them using integer programming and Bayesian inference frameworks. We also discuss the main challenges of protein identification and propose potential solutions with the goal of spurring innovative research in this area.
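
    Step (2), mapping assigned peptides to proteins, can be illustrated with the simplest member of the rule-based category the review names: greedy parsimony inference, a set-cover heuristic. This is a generic sketch, not any specific tool; the peptide and protein names in the example are invented.

```python
def infer_proteins(pep_to_prots):
    """Greedy parsimony protein inference: repeatedly keep the protein that
    explains the most still-unexplained peptides, until all are covered."""
    unexplained = set(pep_to_prots)
    inferred = []
    while unexplained:
        cover = {}
        for pep in unexplained:
            for prot in pep_to_prots[pep]:
                cover[prot] = cover.get(prot, 0) + 1
        # break count ties lexicographically so the result is deterministic
        best = max(sorted(cover), key=lambda p: cover[p])
        inferred.append(best)
        unexplained -= {p for p in unexplained if best in pep_to_prots[p]}
    return inferred
```

    Probabilistic approaches replace this hard set-cover rule with Bayesian inference over peptide-to-protein assignment confidences, but the input (a peptide-to-candidate-protein map) is the same.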

  6. A computational approach to finding novel targets for existing drugs.

    Directory of Open Access Journals (Sweden)

    Yvonne Y Li

    2011-09-01

    Full Text Available Repositioning existing drugs for new therapeutic uses is an efficient approach to drug discovery. We have developed a computational drug repositioning pipeline to perform large-scale molecular docking of small molecule drugs against protein drug targets, in order to map the drug-target interaction space and find novel interactions. Our method emphasizes removing false positive interaction predictions using criteria from known interaction docking, consensus scoring, and specificity. In all, our database contains 252 human protein drug targets that we classify as reliable-for-docking as well as 4621 approved and experimental small molecule drugs from DrugBank. These were cross-docked, then filtered through stringent scoring criteria to select top drug-target interactions. In particular, we used MAPK14 and the kinase inhibitor BIM-8 as examples where our stringent thresholds enriched the predicted drug-target interactions with known interactions up to 20 times compared to standard score thresholds. We validated nilotinib as a potent MAPK14 inhibitor in vitro (IC50 = 40 nM), suggesting a potential use for this drug in treating inflammatory diseases. The published literature indicated experimental evidence for 31 of the top predicted interactions, highlighting the promising nature of our approach. Novel interactions discovered may lead to the drug being repositioned as a therapeutic treatment for its off-target's associated disease, added insight into the drug's mechanism of action, and added insight into the drug's side effects.
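
    Two of the false-positive filters the abstract names (known-interaction cutoffs and specificity) can be sketched as simple post-processing of a cross-docking score table. This is an illustrative simplification, not the published pipeline; the drug and target names in the example, and the one-best-target specificity rule, are assumptions made for the sketch.

```python
def filter_hits(scores, known_cutoffs):
    """Filter cross-docking results: (1) keep drug-target pairs docking at
    least as well as a known binder of that target (per-target cutoff from
    known-interaction docking); (2) a crude specificity filter keeping only
    each drug's best-scoring target.
    scores[(drug, target)] is a docking score; lower = stronger binding."""
    passed = [(d, t, s) for (d, t), s in scores.items()
              if s <= known_cutoffs.get(t, float('-inf'))]
    best = {}
    for d, t, s in passed:
        if d not in best or s < best[d][1]:
            best[d] = (t, s)
    return {(d, t) for d, (t, _) in best.items()}
```

    A full pipeline would also apply consensus scoring, i.e. require agreement across several scoring functions before a pair passes.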

  7. An Evolutionary Computation Approach to Examine Functional Brain Plasticity.

    Science.gov (United States)

    Roy, Arnab; Campbell, Colin; Bernier, Rachel A; Hillary, Frank G

    2016-01-01

    One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even in recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signal representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore functional relationships within a ROI-pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI-pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional-pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional-pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC based procedure is able to detect functional plasticity where a traditional averaging based approach fails. The subject-specific plasticity estimates obtained using the EC-procedure are highly consistent across multiple runs. 
Group-level analyses using these plasticity estimates showed an increase in the strength
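
    A drastically simplified sketch of the idea in the record above: a (1+1) evolutionary search over voxel subsets that maximizes the session-to-session gain in ROI-pair correlation. Unlike the published procedure it ignores spatial connectivity and the recursive search for all significant sub-regional-pairs; the data layout, mutation scheme, and generation count are illustrative assumptions.

```python
import random

def pearson(x, y):
    """Pearson correlation; returns 0.0 for degenerate (constant) series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy) if vx > 0 and vy > 0 else 0.0

def mean_signal(roi, voxels):
    """Average time series over the selected voxels of an ROI.
    `roi` is a list of per-voxel time series of equal length."""
    return [sum(roi[v][t] for v in voxels) / len(voxels)
            for t in range(len(roi[0]))]

def plasticity_gain(a1, b1, a2, b2, vox_a, vox_b):
    """Fitness: change in ROI-pair connectivity from session 1 to session 2."""
    return (pearson(mean_signal(a2, vox_a), mean_signal(b2, vox_b))
            - pearson(mean_signal(a1, vox_a), mean_signal(b1, vox_b)))

def evolve(a1, b1, a2, b2, gens=300, seed=0):
    """(1+1) evolutionary search over voxel subsets: toggle one voxel per
    ROI and keep the child when fitness does not decrease (elitism)."""
    rng = random.Random(seed)
    vox_a, vox_b = set(range(len(a1))), set(range(len(b1)))
    best = plasticity_gain(a1, b1, a2, b2, vox_a, vox_b)
    for _ in range(gens):
        ca, cb = set(vox_a), set(vox_b)
        ca ^= {rng.randrange(len(a1))}
        cb ^= {rng.randrange(len(b1))}
        if not ca or not cb:
            continue
        f = plasticity_gain(a1, b1, a2, b2, ca, cb)
        if f >= best:
            vox_a, vox_b, best = ca, cb, f
    return vox_a, vox_b, best
```

    By construction, the returned fitness is never below the whole-ROI averaging baseline, which is how a voxel-level search can expose plasticity that full-ROI averaging washes out.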

  8. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab eRoy

    2016-04-01

    Full Text Available One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even in recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signal representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore functional relationships within a ROI-pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI-pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional-pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional-pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC based procedure is able to detect functional plasticity where a traditional averaging based approach fails. The subject-specific plasticity estimates obtained using the EC-procedure are highly consistent across multiple runs. 
Group-level analyses using these plasticity estimates showed an increase in

  9. The "liver-first approach" for patients with locally advanced rectal cancer and synchronous liver metastases.

    NARCIS (Netherlands)

    Verhoef, C.; Pool, A.E. van der; Nuyttens, J.J.; Planting, A.S.; Eggermont, A.M.M.; Wilt, J.H.W. de

    2009-01-01

    PURPOSE: This study was designed to investigate the outcome of "the liver-first" approach in patients with locally advanced rectal cancer and synchronous liver metastases. METHODS: Patients with locally advanced rectal cancer and synchronous liver metastases were primarily treated for their liver metastases

  10. Driving profile modeling and recognition based on soft computing approach.

    Science.gov (United States)

    Wahab, Abdul; Quek, Chai; Tan, Chin Keong; Takeda, Kazuya

    2009-04-01

    Advancements in biometrics-based authentication have led to its increasing prominence, and biometric methods are being incorporated into everyday tasks. Existing vehicle security systems rely only on alarms or smart cards as forms of protection. A biometric driver recognition system utilizing driving behaviors is a highly novel and personalized approach and could be incorporated into existing vehicle security systems to form a multimodal identification system and offer a greater degree of multilevel protection. In this paper, detailed studies have been conducted to model individual driving behavior in order to identify features that may be efficiently and effectively used to profile each driver. Feature extraction techniques based on Gaussian mixture models (GMMs) are proposed and implemented. Features extracted from the accelerator and brake pedal pressure were then used as inputs to a fuzzy neural network (FNN) system to ascertain the identity of the driver. Two fuzzy neural networks, namely, the evolving fuzzy neural network (EFuNN) and the adaptive network-based fuzzy inference system (ANFIS), are used to demonstrate the viability of the two proposed feature extraction techniques. The performances were compared against an artificial neural network (NN) implementation using the multilayer perceptron (MLP) network and a statistical method based on the GMM. Extensive testing was conducted and the results show great potential in the use of the FNN for real-time driver identification and verification. In addition, the profiling of driver behaviors has numerous other potential applications for use by law enforcement and companies dealing with bus and truck drivers. PMID:19258199
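
    As a sketch of the GMM-based feature extraction step, the snippet below fits a two-component 1-D Gaussian mixture to pedal-pressure samples with EM and returns (weight, mean, variance) triples that could serve as a driver's feature vector. This is a generic EM implementation under simplifying assumptions (fixed two components, deterministic min/max initialization), not the paper's method, and the pedal-pressure values in the example are synthetic.

```python
import math

def fit_gmm_1d(x, iters=60):
    """Fit a two-component 1-D Gaussian mixture to samples x via EM.
    Returns [(weight, mean, variance), ...] sorted by mean."""
    mu = [min(x), max(x)]                      # deterministic initialization
    v0 = sum((xi - sum(x) / len(x)) ** 2 for xi in x) / len(x)
    var = [v0, v0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        resp = []
        for xi in x:
            p = [w[j] / math.sqrt(2 * math.pi * var[j])
                 * math.exp(-(xi - mu[j]) ** 2 / (2 * var[j])) for j in (0, 1)]
            s = sum(p) or 1e-300
            resp.append([pj / s for pj in p])
        # M-step: re-estimate weights, means, variances
        for j in (0, 1):
            nj = sum(r[j] for r in resp) or 1e-12
            w[j] = nj / len(x)
            mu[j] = sum(r[j] * xi for r, xi in zip(resp, x)) / nj
            var[j] = max(sum(r[j] * (xi - mu[j]) ** 2
                             for r, xi in zip(resp, x)) / nj, 1e-6)
    return sorted(zip(w, mu, var), key=lambda t: t[1])
```

    In a driver-identification pipeline, such per-driver mixture parameters (or the responsibilities themselves) would feed the downstream FNN classifier.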

  11. Approaches for computing uncertainties in predictions of complex-codes

    International Nuclear Information System (INIS)

    Uncertainty analysis aims at characterizing the errors associated with experiments and predictions of computer codes, in contradistinction with sensitivity analysis, which aims at determining the rate of change (i.e., derivative) in the predictions of codes when one or more (typically uncertain) input parameters varies within its range of interest. This paper reviews the salient features of three independent approaches for estimating uncertainties associated with predictions of complex system codes. The first approach reviewed in this paper, the prototype for propagation of code input errors, is the so-called 'GRS method'; this class also includes the so-called 'CSAU method' (Code Scaling, Applicability and Uncertainty) and the majority of methods adopted by the nuclear industry. Although the entire set of input parameters for a typical NPP (Nuclear Power Plant) input deck, ranging up to about 10^5 input parameters, could theoretically be considered as uncertainty sources by these methods, only a 'manageable' number (of the order of several tens) is actually taken into account in practice. Ranges of variation, together with suitable PDFs (Probability Density Functions), are then assigned to each of the uncertain input parameters actually considered in the analysis. The number of computations using the code under investigation needed for obtaining the desired confidence in the results can be determined theoretically (it is of the order of 100). Subsequently, an additional number of computations (ca. 100) with the code is performed to propagate the uncertainties inside the code, from inputs to outputs (results). The second approach reviewed in this paper is the propagation of code output errors, as representatively illustrated by the UMAE-CIAU (Uncertainty Method based upon Accuracy Extrapolation 'embedded' into the Code with capability of Internal Assessment of Uncertainty). Note that this class of methods includes only a few applications from industry
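
    The "order of 100" run count quoted above comes from the one-sided, first-order Wilks formula (59 runs for a 95%/95% tolerance limit), which the GRS-class methods use to size the sample. The sketch below shows that calculation and a generic input-error propagation loop; the sampler names and the toy code function are illustrative assumptions, not any specific plant model.

```python
import math
import random

def wilks_n(beta=0.95, gamma=0.95):
    """Smallest number of runs n such that the largest observed output bounds
    the beta-quantile with confidence gamma (one-sided, first-order Wilks:
    smallest n with 1 - beta**n >= gamma)."""
    return math.ceil(math.log(1 - gamma) / math.log(beta))

def propagate(code, samplers, n=None, seed=0):
    """GRS-style propagation of code input errors: draw each uncertain input
    from its assigned PDF, run the code n times, return sorted outputs."""
    rng = random.Random(seed)
    n = n or wilks_n()
    outs = []
    for _ in range(n):
        inputs = {name: draw(rng) for name, draw in samplers.items()}
        outs.append(code(**inputs))
    return sorted(outs)
```

    With the default 95%/95% setting, `wilks_n()` returns 59, and the largest element of the sorted output list serves as the one-sided upper tolerance limit for the code prediction.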

  12. Recent advances in rational approaches for enzyme engineering

    Directory of Open Access Journals (Sweden)

    Kerstin Steiner

    2012-09-01

    Full Text Available Enzymes are an attractive alternative in the asymmetric synthesis of chiral building blocks. To meet the requirements of industrial biotechnology and to introduce new functionalities, enzymes need to be optimized by protein engineering. This article reviews rational, structure-based approaches to enzyme engineering and de novo enzyme design developed in recent years for improving the enzymes’ performance, broadening their substrate range, and creating novel functionalities to obtain products with high added value for industrial applications.

  13. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, Donald P. [Cornell Univ., Ithaca, NY (United States); Hencey, Brandon M. [Cornell Univ., Ithaca, NY (United States)

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  14. A trait-based approach to advance coral reef science

    DEFF Research Database (Denmark)

    Madin, Joshua S.; Hoogenboom, Mia O.; Connolly, Sean R.;

    2016-01-01

    hampered by a paucity of trait data for the many, often rare, species and by a reliance on nonquantitative approaches. Therefore, we propose filling data gaps by prioritizing traits that are easy to measure, estimating key traits for species with missing data, and identifying ‘supertraits’ that capture...

  15. Continued rise of the cloud advances and trends in cloud computing

    CERN Document Server

    Mahmood, Zaigham

    2014-01-01

    Cloud computing is no longer a novel paradigm, but instead an increasingly robust and established technology, yet new developments continue to emerge in this area. Continued Rise of the Cloud: Advances and Trends in Cloud Computing captures the state of the art in cloud technologies, infrastructures, and service delivery and deployment models. The book provides guidance and case studies on the development of cloud-based services and infrastructures from an international selection of expert researchers and practitioners. A careful analysis is provided of relevant theoretical frameworks, prac

  16. The responsive approach by the Basel Committee (on Banking Supervision) to regulation: Meta risk regulation, the Internal Ratings Based Approaches and the Advanced Measurement Approaches.

    OpenAIRE

    Ojo, Marianne

    2009-01-01

    The use of complex and sophisticated financial instruments, such as derivatives, in the modern financial environment has triggered the emergence of new forms of risk. Alongside the need to manage such risks, this paper investigates the developments that prompted the Basel Committee to develop advanced risk management techniques such as the Internal Ratings Based (IRB) approaches and the Advanced Measurement Approaches (AMA). Developments since the inception of the 1988 Base...

  17. Computational methods in the prediction of advanced subsonic and supersonic propeller induced noise: ASSPIN users' manual

    Science.gov (United States)

    Dunn, M. H.; Tarkenton, G. M.

    1992-01-01

    This document describes the computational aspects of propeller noise prediction in the time domain and the use of high speed propeller noise prediction program ASSPIN (Advanced Subsonic and Supersonic Propeller Induced Noise). These formulations are valid in both the near and far fields. Two formulations are utilized by ASSPIN: (1) one is used for subsonic portions of the propeller blade; and (2) the second is used for transonic and supersonic regions on the blade. Switching between the two formulations is done automatically. ASSPIN incorporates advanced blade geometry and surface pressure modelling, adaptive observer time grid strategies, and contains enhanced numerical algorithms that result in reduced computational time. In addition, the ability to treat the nonaxial inflow case has been included.

  18. A task-specific approach to computational imaging system design

    Science.gov (United States)

    Ashok, Amit

    The traditional approach to imaging system design places the sole burden of image formation on optical components. In contrast, a computational imaging system relies on a combination of optics and post-processing to produce the final image and/or output measurement. Therefore, the joint-optimization (JO) of the optical and the post-processing degrees of freedom plays a critical role in the design of computational imaging systems. The JO framework also allows us to incorporate task-specific performance measures to optimize an imaging system for a specific task. In this dissertation, we consider the design of computational imaging systems within a JO framework for two separate tasks: object reconstruction and iris-recognition. The goal of these design studies is to optimize the imaging system to overcome the performance degradations introduced by under-sampled image measurements. Within the JO framework, we engineer the optical point spread function (PSF) of the imager, representing the optical degrees of freedom, in conjunction with the post-processing algorithm parameters to maximize the task performance. For the object reconstruction task, the optimized imaging system achieves a 50% improvement in resolution and nearly 20% lower reconstruction root-mean-square error (RMSE) compared to the un-optimized imaging system. For the iris-recognition task, the optimized imaging system achieves a 33% improvement in false rejection ratio (FRR) at a fixed false alarm ratio (FAR) relative to the conventional imaging system. The effect of performance measures such as resolution, RMSE, FRR, and FAR on the optimal design highlights the crucial role of task-specific design metrics in the JO framework. We introduce a fundamental measure of task-specific performance known as task-specific information (TSI), an information-theoretic measure that quantifies the information content of an image measurement relevant to a specific task. 
A variety of source-models are derived to illustrate
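
    The TSI measure is information-theoretic at its core. As a generic illustration (not the dissertation's actual estimator), the sketch below computes the discrete Shannon mutual information between a measurement variable and a task variable; the joint distribution is a toy example.

```python
# Sketch: discrete Shannon mutual information I(X;Y), the kind of quantity
# underlying a task-specific information (TSI) measure. The joint pmf below
# is a made-up example, not derived from the dissertation's source models.
import math

def mutual_information(pxy):
    """I(X;Y) in bits from a joint pmf given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# Perfectly correlated binary measurement and task variable: I = 1 bit
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(joint))  # 1.0
```

A task-relevant measurement drives this quantity up; an under-sampled or noisy one drives it toward zero.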

  19. Cooperative technology development: An approach to advancing energy technology

    International Nuclear Information System (INIS)

    Technology development requires an enormous financial investment over a long period of time. Scarce national and corporate resources, the result of highly competitive markets, decreased profit margins, wide currency fluctuations, and growing debt, often preclude continuous development of energy technology by single entities, i.e., corporations, institutions, or nations. Although the energy needs of the developed world are generally being met by existing institutions, it is becoming increasingly clear that existing capital formation and technology transfer structures have failed to aid developing nations in meeting their growing electricity needs. This paper will describe a method for meeting the electricity needs of the developing world through technology transfer and international cooperative technology development. The role of nuclear power and the advanced passive plant design will be discussed. (author)

  20. Advanced free space optics (FSO) a systems approach

    CERN Document Server

    Majumdar, Arun K

    2015-01-01

    This book provides a comprehensive, unified tutorial covering the most recent advances in the technology of free-space optics (FSO). It is an all-inclusive source of information on the fundamentals of FSO as well as up-to-date information on the state-of-the-art in technologies available today. This text is intended for graduate students, and will also be useful for research scientists and engineers with an interest in the field. FSO communication is a practical solution for creating a three-dimensional global broadband communications grid, offering bandwidths far beyond what is possible in the Radio Frequency (RF) range. However, the attributes of atmospheric turbulence and scattering impose perennial limitations on availability and reliability of FSO links. From a systems point-of-view, this groundbreaking book provides a thorough understanding of channel behavior, which can be used to design and evaluate optimum transmission techniques that operate under realistic atmospheric conditions. Topics addressed...

  1. The Numerical Tours of Signal Processing - Advanced Computational Signal and Image Processing

    OpenAIRE

    Peyré, Gabriel

    2011-01-01

    The Numerical Tours of Signal Processing is an online collection of tutorials to learn advanced computational signal and image processing. These tours allow one to follow a step-by-step Matlab or Scilab implementation of many important processing algorithms. This implementation is commented, and the connections with the relevant mathematical notions are made explicit. These algorithms are applied to various signal, image, movie and 3D mesh datasets. These tours are suitable for practitioners in the f...

  2. A Trait-Based Approach to Advance Coral Reef Science.

    Science.gov (United States)

    Madin, Joshua S; Hoogenboom, Mia O; Connolly, Sean R; Darling, Emily S; Falster, Daniel S; Huang, Danwei; Keith, Sally A; Mizerek, Toni; Pandolfi, John M; Putnam, Hollie M; Baird, Andrew H

    2016-06-01

    Coral reefs are biologically diverse and ecologically complex ecosystems constructed by stony corals. Despite decades of research, basic coral population biology and community ecology questions remain. Quantifying trait variation among species can help resolve these questions, but progress has been hampered by a paucity of trait data for the many, often rare, species and by a reliance on nonquantitative approaches. Therefore, we propose filling data gaps by prioritizing traits that are easy to measure, estimating key traits for species with missing data, and identifying 'supertraits' that capture a large amount of variation for a range of biological and ecological processes. Such an approach can accelerate our understanding of coral ecology and our ability to protect critically threatened global ecosystems. PMID:26969335

  3. New advances in the statistical parton distributions approach*

    Directory of Open Access Journals (Sweden)

    Soffer Jacques

    2016-01-01

    Full Text Available The quantum statistical parton distributions approach proposed more than a decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. Many serious challenges remain. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, using recent data and also forthcoming experimental results.

  4. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung-Taek (Korea Atomic Energy Research Institute, Daejon, Korea); Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from the Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect the mechanical vibratory response of the check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of the processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable whether these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainty.
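
    As a flavor of the signal-processing layer in such condition monitoring, the sketch below applies the simplest possible noise-reduction step, a moving-average filter, to a synthetic trace. It is a generic illustration, not one of the project's actual algorithms, and the sample values are made up.

```python
# Sketch: a moving-average filter, the most basic noise-reduction step
# applied to vibratory sensor traces. Window size and signal are illustrative.
def moving_average(signal, window=3):
    """Smooth a 1-D signal by averaging each sample with its neighbors."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - half): i + half + 1]  # clipped at the edges
        out.append(sum(chunk) / len(chunk))
    return out

# Synthetic noisy sensor trace (illustrative values only)
noisy = [0.0, 1.0, 0.2, 0.9, 0.1, 1.1, 0.0]
print(moving_average(noisy))
```

Real monitoring chains would layer band-pass filtering and pattern recognition on top of such smoothing before attempting to classify degradation.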

  5. High-Throughput Computational Design of Advanced Functional Materials: Topological Insulators and Two-Dimensional Electron Gas Systems

    Science.gov (United States)

    Yang, Kesong

    As a rapidly growing area of materials science, high-throughput (HT) computational materials design is playing a crucial role in accelerating the discovery and development of novel functional materials. In this presentation, I will first introduce the strategy of HT computational materials design, and then use the HT discovery of topological insulators (TIs) as a practical example of the approach. Topological insulators are one of the most studied classes of novel materials because of their great potential for applications ranging from spintronics to quantum computers. Here I will show that, by defining a reliable and accessible descriptor, which represents the topological robustness or feasibility of the candidate, and by searching the quantum materials repository aflowlib.org, we have automatically discovered 28 TIs (some of them already known) in five different symmetry families. Next, I will talk about our recent research work on the HT computational design of the perovskite-based two-dimensional electron gas (2DEG) systems. The 2DEG formed on the perovskite oxide heterostructure (HS) has potential applications in next-generation nanoelectronic devices. In order to achieve practical implementation of the 2DEG in the device design, desired physical properties such as high charge carrier density and mobility are necessary. Here I show that, using the same strategy as for the HT discovery of TIs, and by introducing a series of combinatorial descriptors, we have successfully identified a series of candidate 2DEG systems based on the perovskite oxides. This work provides another exemplar of applying the HT computational design approach to the discovery of advanced functional materials.

  6. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    Science.gov (United States)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple-year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed to enable spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have a major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to

  7. Computational Approach for Epitaxial Polymorph Stabilization through Substrate Selection.

    Science.gov (United States)

    Ding, Hong; Dwaraknath, Shyam S; Garten, Lauren; Ndione, Paul; Ginley, David; Persson, Kristin A

    2016-05-25

    With the ultimate goal of finding new polymorphs through targeted synthesis conditions and techniques, we outline a computational framework to select optimal substrates for epitaxial growth using first-principles calculations of formation energies, elastic strain energy, and topological information. To demonstrate the approach, we study the stabilization of metastable VO2 compounds, which provide a rich chemical and structural polymorph space. We find that common polymorph statistics, lattice matching, and energy above hull considerations recommend homostructural growth on TiO2 substrates, where the VO2 brookite phase would be preferentially grown on the a-c TiO2 brookite plane while the columbite and anatase structures favor the a-b plane on the respective TiO2 phases. Overall, we find that a model which incorporates a geometric unit cell area matching between the substrate and the target film as well as the resulting strain energy density of the film provides qualitative agreement with experimental observations for the heterostructural growth of known VO2 polymorphs: rutile, A and B phases. The minimal interfacial geometry matching and estimated strain energy criteria provide several suggestions for substrates and substrate-film orientations for the heterostructural growth of the hitherto hypothetical anatase, brookite, and columbite polymorphs. These criteria serve as preliminary guidance for experimental efforts to stabilize new materials and/or polymorphs through epitaxy. The current screening algorithm is being integrated within the Materials Project online framework, and the data are hence publicly available. PMID:27145398
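
    The geometric side of such screening can be illustrated with a back-of-the-envelope misfit-strain estimate. The sketch below is a generic illustration of the idea; the lattice constants ("substrate-A/B"), film constant, and biaxial modulus are all assumed values, not data computed in the paper or taken from the Materials Project.

```python
# Sketch: misfit strain and a harmonic strain-energy estimate, the simplest
# geometric ingredients of epitaxial substrate screening. All numbers below
# are illustrative assumptions for hypothetical materials.
def misfit_strain(a_film, a_sub):
    """In-plane strain when the film is forced to the substrate lattice constant."""
    return (a_sub - a_film) / a_film

def strain_energy_density(eps, biaxial_modulus):
    """Harmonic estimate: u ~ M * eps^2 per unit film volume."""
    return biaxial_modulus * eps**2

# Hypothetical in-plane lattice constants in angstrom (assumed numbers)
substrates = {"substrate-A": 4.59, "substrate-B": 3.78}
a_film = 4.55
for name, a_sub in substrates.items():
    eps = misfit_strain(a_film, a_sub)
    u = strain_energy_density(eps, 200e9)  # assumed 200 GPa biaxial modulus
    print(f"{name}: strain = {eps:+.3%}, u = {u:.3g} J/m^3")
```

A full framework like the one described adds formation energies and interfacial topology on top of this purely elastic ranking.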

  8. Computational Approach for Epitaxial Polymorph Stabilization through Substrate Selection

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Hong; Dwaraknath, Shyam S.; Garten, Lauren; Ndione, Paul; Ginley, David; Persson, Kristin A.

    2016-05-25

    With the ultimate goal of finding new polymorphs through targeted synthesis conditions and techniques, we outline a computational framework to select optimal substrates for epitaxial growth using first-principles calculations of formation energies, elastic strain energy, and topological information. To demonstrate the approach, we study the stabilization of metastable VO2 compounds, which provide a rich chemical and structural polymorph space. We find that common polymorph statistics, lattice matching, and energy above hull considerations recommend homostructural growth on TiO2 substrates, where the VO2 brookite phase would be preferentially grown on the a-c TiO2 brookite plane while the columbite and anatase structures favor the a-b plane on the respective TiO2 phases. Overall, we find that a model which incorporates a geometric unit cell area matching between the substrate and the target film as well as the resulting strain energy density of the film provides qualitative agreement with experimental observations for the heterostructural growth of known VO2 polymorphs: rutile, A and B phases. The minimal interfacial geometry matching and estimated strain energy criteria provide several suggestions for substrates and substrate-film orientations for the heterostructural growth of the hitherto hypothetical anatase, brookite, and columbite polymorphs. These criteria serve as preliminary guidance for experimental efforts to stabilize new materials and/or polymorphs through epitaxy. The current screening algorithm is being integrated within the Materials Project online framework, and the data are hence publicly available.

  9. Applying a cloud computing approach to storage architectures for spacecraft

    Science.gov (United States)

    Baldor, Sue A.; Quiroz, Carlos; Wood, Paul

    As sensor technologies, processor speeds, and memory densities increase, spacecraft command, control, processing, and data storage systems have grown in complexity to take advantage of these improvements and expand the possible missions of spacecraft. Spacecraft systems engineers are increasingly looking for novel ways to address this growth in complexity and mitigate associated risks. Looking to conventional computing, many solutions have been executed to solve both the problem of complexity and heterogeneity in systems. In particular, the cloud-based paradigm provides a solution for distributing applications and storage capabilities across multiple platforms. In this paper, we propose utilizing a cloud-like architecture to provide a scalable mechanism for providing mass storage in spacecraft networks that can be reused on multiple spacecraft systems. By presenting a consistent interface to applications and devices that request data to be stored, complex systems designed by multiple organizations may be more readily integrated. Behind the abstraction, the cloud storage capability would manage wear-leveling, power consumption, and other attributes related to the physical memory devices, critical components in any mass storage solution for spacecraft. Our approach employs SpaceWire networks and SpaceWire-capable devices, although the concept could easily be extended to non-heterogeneous networks consisting of multiple spacecraft and potentially the ground segment.

  10. Computational approach for a pair of bubble coalescence process

    Energy Technology Data Exchange (ETDEWEB)

    Nurul Hasan, E-mail: nurul_hasan@petronas.com.my [Department of Chemical Engineering, Universiti Teknologi Petronas, Bandar Seri Iskandar, Perak 31750 (Malaysia); Zalinawati binti Zakaria [Department of Chemical Engineering, Universiti Teknologi Petronas, Bandar Seri Iskandar, Perak 31750 (Malaysia)

    2011-06-15

    The coalescence of bubbles has great value in mineral recovery and the oil industry. In this paper, two co-axial bubbles rising in a cylinder are modelled to study the coalescence of bubbles for four computational experimental test cases. The Reynolds number (Re) is chosen between 8.50 and 10, the Bond number Bo ≈ 4.25-50, and the Morton number M 0.0125-14.7. The viscosity ratio (μr) and density ratio (ρr) of liquid to bubble are kept constant (100 and 850 respectively). It was found that the Bo number has a significant effect on the coalescence process for constant Re, μr and ρr. The bubble-bubble distance over time was validated against published experimental data. The results show that the VOF approach can be used to model these phenomena accurately. The surface tension was changed to alter Bo, and the density of the fluids to alter Re and M, keeping μr and ρr the same. It was found that for lower Bo, the bubbles coalesce more slowly and the pocket at the lower part of the leading bubble is less concave (towards downward), which is supported by the experimental data.
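
    The dimensionless groups named in the abstract follow standard definitions; the sketch below evaluates them for an illustrative liquid/bubble pair. The property values are assumptions chosen only to match the paper's density ratio of 850, not its actual test-case parameters.

```python
# Sketch: the dimensionless groups (Bo, M, Re) that parameterize bubble
# coalescence studies. Formulas are the standard definitions; fluid
# properties below are illustrative assumptions.
def bond_number(delta_rho, g, d, sigma):
    """Bo: buoyancy vs. surface tension, delta_rho * g * d^2 / sigma."""
    return delta_rho * g * d**2 / sigma

def morton_number(g, mu, delta_rho, rho, sigma):
    """M: fluid-property group, g * mu^4 * delta_rho / (rho^2 * sigma^3)."""
    return g * mu**4 * delta_rho / (rho**2 * sigma**3)

def reynolds_number(rho, u, d, mu):
    """Re: inertia vs. viscosity for a bubble of diameter d rising at speed u."""
    return rho * u * d / mu

# Illustrative liquid/bubble pair with a liquid-to-bubble density ratio of 850
rho_l, rho_b = 850.0, 1.0       # densities, kg/m^3 (assumed)
mu_l = 0.1                      # liquid viscosity, Pa.s (assumed)
sigma = 0.02                    # surface tension, N/m (assumed)
d, u, g = 0.005, 0.04, 9.81     # diameter (m), rise speed (m/s), gravity (m/s^2)

print(f"Bo = {bond_number(rho_l - rho_b, g, d, sigma):.2f}")
print(f"M  = {morton_number(g, mu_l, rho_l - rho_b, rho_l, sigma):.3f}")
print(f"Re = {reynolds_number(rho_l, u, d, mu_l):.2f}")
```

Varying sigma alone moves Bo (and M) while leaving the ratios μr and ρr fixed, which is exactly the parameter sweep the paper describes.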

  11. A Near-Term Quantum Computing Approach for Hard Computational Problems in Space Exploration

    CERN Document Server

    Smelyanskiy, Vadim N; Knysh, Sergey I; Williams, Colin P; Johnson, Mark W; Thom, Murray C; Macready, William G; Pudenz, Kristen L

    2012-01-01

    In this article, we show how to map a sampling of the hardest artificial intelligence problems in space exploration onto equivalent Ising models that can then be attacked using quantum annealing implemented in the D-Wave machine. We overview the existing results as well as propose new Ising model implementations for quantum annealing. We review supervised and unsupervised learning algorithms for classification and clustering with applications to feature identification and anomaly detection. We introduce algorithms for data fusion and image matching for remote sensing applications. We overview planning problems for space exploration mission applications and algorithms for diagnostics and recovery with applications to deep space missions. We describe combinatorial optimization algorithms for task assignment in the context of autonomous unmanned exploration. Finally, we discuss the ways to circumvent the limitation of the Ising mapping using a "blackbox" approach based on ideas from probabilistic computing. In this ...
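
    The Ising mapping at the heart of this approach reduces each problem to minimizing an energy function over spins s_i in {-1, +1}. The sketch below brute-forces the ground state of a tiny instance; the fields and couplings are made-up example values, not a mapping of any problem from the article.

```python
# Sketch: brute-force ground state of a tiny Ising model, the energy function
# that quantum annealers such as the D-Wave machine are built to minimize.
# The fields h and couplings J below are illustrative assumptions.
import itertools

def ising_energy(spins, h, J):
    """E(s) = sum_i h_i*s_i + sum_(i<j) J_ij*s_i*s_j, with s_i in {-1, +1}."""
    energy = sum(h[i] * s for i, s in enumerate(spins))
    energy += sum(j_ij * spins[i] * spins[j] for (i, j), j_ij in J.items())
    return energy

h = [0.5, -0.5, 0.0]                 # local fields (assumed)
J = {(0, 1): -1.0, (1, 2): 1.0}      # pairwise couplings (assumed)

# Exhaustive search stands in for the annealer on this 3-spin toy problem;
# a real annealer samples low-energy states of much larger instances.
best = min(itertools.product([-1, 1], repeat=3),
           key=lambda s: ising_energy(s, h, J))
print(best, ising_energy(best, h, J))  # ground-state energy is -2.0
```

Classification, clustering, and planning problems become usable once their cost functions are expressed in this quadratic-in-spins form.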

  12. An overview of bacterial efflux pumps and computational approaches to study efflux pump inhibitors.

    Science.gov (United States)

    Jamshidi, Shirin; Sutton, J Mark; Rahman, Khondaker M

    2016-02-01

    Micro-organisms express a wide range of transmembrane pumps known as multidrug efflux pumps that improve the micro-organism's ability to survive in severe environments and contribute to resistance against antibiotic and antimicrobial agents. There is significant interest in developing efflux inhibitors as an adjunct to treatment with current and next generation of antibiotics. A greater understanding of drug recognition and transport by multidrug efflux pumps is needed to develop clinically useful inhibitors, given the breadth of molecules that can be effluxed by these systems. We summarize some structural and functional data that could provide insights into the inhibition of transport mechanisms of these intricate molecular nanomachines with a focus on the advances in computational approaches. PMID:26824720

  13. A MOBILE COMPUTING TECHNOLOGY FORESIGHT STUDY WITH SCENARIO PLANNING APPROACH

    Directory of Open Access Journals (Sweden)

    Wei-Hsiu Weng

    2015-09-01

    Full Text Available Although the importance of mobile computing is gradually being recognized, mobile computing technology development and adoption have not been clearly realized. This paper focuses on the technology planning strategy for organizations that have an interest in developing or adopting mobile computing technology. By using scenario analysis, a technology planning strategy is constructed. In this study, thirty mobile computing technologies are classified into six groups, and the importance and risk factors of these technologies are then evaluated under two possible scenarios. The main research findings include the discovery that most mobile computing software technologies are rated high to medium in importance and low risk in both scenarios, and that scenario changes will have less impact on mobile computing devices and on mobile computing software technologies. These results provide a reference for organizations interested in developing or adopting mobile computing technology.

  14. A MOBILE COMPUTING TECHNOLOGY FORESIGHT STUDY WITH SCENARIO PLANNING APPROACH

    OpenAIRE

    Wei-Hsiu Weng; Woo-Tsong Lin

    2015-01-01

    Although the importance of mobile computing is gradually being recognized, mobile computing technology development and adoption have not been clearly realized. This paper focuses on the technology planning strategy for organizations that have an interest in developing or adopting mobile computing technology. By using scenario analysis, a technology planning strategy is constructed. In this study, thirty mobile computing technologies are classified into six groups, and the importance and risk ...

  15. Computational approaches to stochastic systems in physics and biology

    Science.gov (United States)

    Jeraldo Maldonado, Patricio Rodrigo

    In this dissertation, I devise computational approaches to model and understand two very different systems which exhibit stochastic behavior: quantum fluids with topological defects arising during quenches and forcing, and complex microbial communities living and evolving within the gastrointestinal tracts of vertebrates. As such, this dissertation is organized into two parts. In Part I, I create a model for quantum fluids, which incorporates a conservative and a dissipative part, and I also allow the fluid to be externally forced by a normal fluid. I then use this model to calculate scaling laws arising from the stochastic interactions of the topological defects exhibited by the modeled fluid while undergoing a quench. In Chapter 2 I give a detailed description of this model of quantum fluids. Unlike more traditional approaches, this model is based on Cell Dynamical Systems (CDS), an approach that captures relevant physical features of the system and allows for long time steps during its evolution. I devise a two-step CDS model, implementing both conservative and dissipative dynamics present in quantum fluids. I also couple the model with an external normal fluid field that drives the system. I then validate the results of the model by measuring different scaling laws predicted for quantum fluids. I also propose an extension of the model that incorporates the excitations of the fluid and couples its dynamics with the dynamics of the condensate. In Chapter 3 I use the above model to calculate scaling laws predicted for the velocity of topological defects undergoing a critical quench. To accomplish this, I numerically implement an algorithm that extracts from the order parameter field the velocity components of the defects as they move during the quench process. This algorithm is robust and extensible to any system where defects are located by the zeros of the order parameter. The algorithm is also applied to a sheared stripe-forming system, allowing the

  16. Advancing Partnerships Towards an Integrated Approach to Oil Spill Response

    Science.gov (United States)

    Green, D. S.; Stough, T.; Gallegos, S. C.; Leifer, I.; Murray, J. J.; Streett, D.

    2015-12-01

    Oil spills can cause enormous ecological and economic devastation, necessitating application of the best science and technology available, and remote sensing is playing a growing, critical role in the detection and monitoring of oil spills, as well as facilitating validation of remote sensing oil spill products. The FOSTERRS (Federal Oil Science Team for Emergency Response Remote Sensing) interagency working group seeks to ensure that during an oil spill, remote sensing assets (satellite/aircraft/instruments) and analysis techniques are quickly, effectively, appropriately, and seamlessly available to oil spill responders. Yet significant challenges remain in addressing oils spanning a vast range of chemical properties that may be spilled from the Tropics to the Arctic, with algorithms and scientific understanding needing advances to keep pace with technology. Thus, FOSTERRS promotes enabling scientific discovery to ensure robust utilization of available technology, as well as identifying technologies moving up the TRL (Technology Readiness Level) scale. A recent FOSTERRS-facilitated support activity involved deployment of AVIRIS-NG (Airborne Visible/Infrared Imaging Spectrometer - Next Generation) during the Santa Barbara oil spill to validate the potential of airborne hyperspectral imaging for real-time mapping of beach tar coverage, including surface validation data. Many developing airborne technologies have the potential to transition to space-based platforms, providing global readiness.

  17. Advanced Modular Power Approach to Affordable, Supportable Space Systems

    Science.gov (United States)

    Oeftering, Richard C.; Kimnach, Greg L.; Fincannon, James; Mckissock, Barbara I.; Loyselle, Patricia L.; Wong, Edmond

    2013-01-01

    Recent studies of missions to the Moon, Mars and Near Earth Asteroids (NEA) indicate that these missions often involve several distinct separately launched vehicles that must ultimately be integrated together in-flight and operate as one unit. Therefore, it is important to see these vehicles as elements of a larger segmented spacecraft rather than separate spacecraft flying in formation. The evolution of large multi-vehicle exploration architecture creates the need (and opportunity) to establish a global power architecture that is common across all vehicles. The Advanced Exploration Systems (AES) Modular Power System (AMPS) project managed by NASA Glenn Research Center (GRC) is aimed at establishing the modular power system architecture that will enable power systems to be built from a common set of modular building blocks. The project is developing, demonstrating and evaluating key modular power technologies that are expected to minimize non-recurring development costs and reduce recurring integration costs, as well as mission operational and support costs. Further, modular power is expected to enhance mission flexibility, vehicle reliability, scalability and overall mission supportability. The AMPS project not only supports multi-vehicle architectures but should enable multi-mission capability as well. The AMPS technology development includes near-term demonstrations with developmental prototype vehicles and field demonstrations. These operational demonstrations not only serve as a means of evaluating modular technology but also provide feedback to developers to assure that they progress toward a truly flexible and operationally supportable modular power architecture.

  18. Advances in a Distributed Approach for Ocean Model Data Interoperability

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2014-03-01

    Full Text Available An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.

  19. Advances in a distributed approach for ocean model data interoperability

    Science.gov (United States)

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
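Tools of the kind described above typically fetch remote NetCDF data through standards-based services such as OPeNDAP. As a minimal sketch of how such a request is formed, the function below builds a DAP2 hyperslab constraint expression; the endpoint URL and the variable name `salinity` are hypothetical, not taken from the article.

```python
def dap_subset_url(base_url, var, slices):
    """Build an OPeNDAP (DAP2) request for a hyperslab of `var`:
    each (start, stride, stop) triple becomes the DAP index range
    [start:stride:stop], with `stop` inclusive."""
    ranges = "".join(f"[{a}:{b}:{c}]" for a, b, c in slices)
    return f"{base_url}.dods?{var}{ranges}"

# Hypothetical THREDDS endpoint: first time step of a 50x100 tile.
url = dap_subset_url(
    "http://example.org/thredds/dodsC/model_output.nc",
    "salinity",
    [(0, 1, 0), (0, 1, 49), (0, 1, 99)],
)
# url ends with "?salinity[0:1:0][0:1:49][0:1:99]"
```

Client libraries issue such requests behind the scenes, so users can subset a large remote dataset without downloading whole files.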

  20. Advanced methods for the computation of particle beam transport and the computation of electromagnetic fields and beam-cavity interactions

    International Nuclear Information System (INIS)

    The University of Maryland Dynamical Systems and Accelerator Theory Group carries out research in two broad areas: the computation of charged particle beam transport using Lie algebraic methods, and advanced methods for the computation of electromagnetic fields and beam-cavity interactions. Important improvements in the state of the art are believed to be possible in both of these areas. In addition, these methods are applied to problems of current interest in accelerator physics, including the theoretical performance of present and proposed high energy machines. The Lie algebraic method of computing and analyzing beam transport handles both linear and nonlinear beam elements. Tests show this method to be superior to earlier matrix or numerical integration methods. It has wide application to many areas, including accelerator physics, intense particle beams, ion microprobes, high resolution electron microscopy, and light optics. With regard to electromagnetic fields and beam-cavity interactions, work is carried out on the theory of beam breakup in single pulses. Work is also done on the analysis of the high frequency behavior of longitudinal and transverse coupling impedances, including the examination of methods which may be used to measure these impedances. Finally, work is performed on the electromagnetic analysis of coupled cavities and on the coupling of cavities to waveguides.

  1. Advances in Assays and Analytical Approaches for Botulinum Toxin Detection

    Energy Technology Data Exchange (ETDEWEB)

    Grate, Jay W.; Ozanich, Richard M.; Warner, Marvin G.; Bruckner-Lea, Cindy J.; Marks, James D.

    2010-08-04

    Methods to detect botulinum toxin, the most poisonous substance known, are reviewed. Current assays are being developed with two main objectives in mind: 1) to obtain sufficiently low detection limits to replace the mouse bioassay with an in vitro assay, and 2) to develop rapid assays for screening purposes that are as sensitive as possible while requiring an hour or less to process the sample and obtain the result. This review emphasizes the diverse analytical approaches and devices that have been developed over the last decade, while also briefly reviewing representative older immunoassays to provide background and context.

  2. The Metacognitive Approach to Computer Education: Making Explicit the Learning Journey

    Science.gov (United States)

    Phelps, Renata

    2007-01-01

    This paper presents a theoretical and practical exploration of a metacognitive approach to computer education, developed through a three-year action research project. It is argued that the approach contrasts significantly with often-employed directive and competency-based approaches to computer education and is more appropriate in addressing the…

  3. A new approach in development of data flow control and investigation system for computer networks

    International Nuclear Information System (INIS)

    This paper describes a new approach in the development of a data flow control and investigation system for computer networks. The approach was developed and applied at the Moscow Radiotechnical Institute for control and investigation of the Institute's computer network, and allowed us to solve our current network problems successfully. A description of the approach is presented below, along with the most interesting results of our work. (author)

  4. Advanced welding for closed structure. Pt. 2 The ultrasonic approach

    Energy Technology Data Exchange (ETDEWEB)

    Sacripanti, A.; Paoloni, M.; Sagratella, G. [ENEA Centro Ricerche Casaccia, Rome (Italy). Dipt. Innovazione

    1999-07-01

    This report describes the activities developed for the European contract BRITE AWCS III to study the use of ultrasonic sensing techniques to obtain accurate detection of the internal reinforcement of the closed steel structures employed in the shipbuilding industry. After a description of the methods, techniques and problems of the ultrasonic testing of materials in the conventional approach, a new method based on multiple reflection-absorption is introduced, together with its experimental tests and results. The conclusions show that, in the new approach, ultrasonic non-destructive testing techniques should be useful for assembling a complete sensing system with two receivers, one thermal and one ultrasonic. [Translated from Italian] This report describes the experimental activities developed within the European contract BRITE AWCS III, in which ultrasonic techniques were used to obtain precise detection of the internal reinforcements of closed metal structures used in the shipbuilding industry. After a description of the methods, techniques and problems concerning the ultrasonic testing of materials, an innovative approach based on the multiple reflection-absorption method was introduced, together with the experimental results. The conclusions show that, in the new approach, non-destructive ultrasonic testing should be useful for assembling a sensing system with two sensors, one thermal and one ultrasonic.

  5. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy

  6. Differentiating Information Skills and Computer Skills: A Factor Analytic Approach

    OpenAIRE

    Pask, Judith M.; Saunders, E. Stewart

    2004-01-01

    A basic tenet of information literacy programs is that the skills needed to use computers and the skills needed to find and evaluate information are two separate sets of skills. Outside the library this is not always the view. The claim is sometimes made that information skills are acquired by learning computer skills. All that is needed is a computer lab and someone to teach computer skills. This study uses data from a survey of computer and information skills to determine whether or not...

  7. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics ranging from the universe in a computer and computing in the Earth sciences to multivariate data analysis and automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  8. NATO Advanced Research Workshop on Exploiting Mental Imagery with Computers in Mathematics Education

    CERN Document Server

    Mason, John

    1995-01-01

    The advent of fast and sophisticated computer graphics has brought dynamic and interactive images under the control of professional mathematicians and mathematics teachers. This volume in the NATO Special Programme on Advanced Educational Technology takes a comprehensive and critical look at how the computer can support the use of visual images in mathematical problem solving. The contributions are written by researchers and teachers from a variety of disciplines including computer science, mathematics, mathematics education, psychology, and design. Some focus on the use of external visual images and others on the development of individual mental imagery. The book is the first collected volume in a research area that is developing rapidly, and the authors pose some challenging new questions.

  9. Recent advances in computational methods and clinical applications for spine imaging

    CERN Document Server

    Glocker, Ben; Klinder, Tobias; Li, Shuo

    2015-01-01

    This book contains the full papers presented at the MICCAI 2014 workshop on Computational Methods and Clinical Applications for Spine Imaging. The workshop brought together scientists and clinicians in the field of computational spine imaging. The chapters included in this book present and discuss new advances and challenges in these fields, using several methods and techniques to address, more efficiently, different and timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modeling, simulation and surgical planning, image-guided robot-assisted surgery and image-based diagnosis. The book also includes papers and reports from the first challenge on vertebra segmentation held at the workshop.

  10. Computer Hardware, Advanced Mathematics and Model Physics pilot project final report

    International Nuclear Information System (INIS)

    The Computer Hardware, Advanced Mathematics and Model Physics (CHAMMP) Program was launched in January 1990. A principal objective of the program has been to utilize the emerging capabilities of massively parallel scientific computers in the challenge of regional scale predictions of decade-to-century climate change. CHAMMP has already demonstrated the feasibility of achieving a 10,000-fold increase in computational throughput for climate modeling in this decade. What we have also recognized, however, is the need for new algorithms and computer software to capitalize on the radically new computing architectures. This report describes the pilot CHAMMP projects at the DOE National Laboratories and the National Center for Atmospheric Research (NCAR). The pilot projects were selected to identify the principal challenges to CHAMMP and to entrain new scientific computing expertise. The success of some of these projects has aided in the definition of the CHAMMP scientific plan. Many of the papers in this report have been or will be submitted for publication in the open literature. Readers are urged to consult with the authors directly for questions or comments about their papers

  11. Funnel function approach to determine uncertainty: Some advances

    Science.gov (United States)

    Routh, P. S.

    2006-12-01

    Given a finite number of noisy data it is difficult (perhaps impossible) to obtain a unique average of the model value in any region of the model (Backus & Gilbert, 1970; Oldenburg, 1983). This difficulty motivated Backus and Gilbert to construct averaging kernels that are, in some sense, close to a delta function. Averaging kernels describe how the true model is averaged over the entire domain to generate the model value in the region of interest. A unique average value is difficult to obtain theoretically. However, we can compute bounds on the average value, and this allows us to obtain a measure of uncertainty. This idea was proposed by Oldenburg (1983). As the region of interest increases, the uncertainty associated with the average value decreases, giving a funnel-like shape. Mathematically this is equivalent to solving minimization and maximization problems for the average value (Oldenburg, 1983). In this work I developed a nonlinear interior point method to solve this min-max problem and construct the bounds. The bounds determined in this manner honor all types of available information: (a) geophysical data with errors, (b) deterministic or statistical prior information, and (c) complementary information from other data sets at different scales (such as hydrology or other geophysical data) if they are formulated in a joint inversion framework.
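The min-max idea behind the funnel can be sketched in a few lines of Python. The two-parameter model, single datum and brute-force grid search below are purely illustrative (not the nonlinear interior-point method of the abstract): bounds on the average are wide for a small region and collapse as the region grows to cover the whole model.

```python
import itertools

def average_bounds(region, datum=2.0, lo=0.0, hi=2.0, n=201, tol=1e-9):
    """Brute-force 'funnel' bounds: min and max of the average model
    value over the index set `region`, taken over all two-parameter
    models (m1, m2) on a grid that satisfy the single datum
    m1 + m2 = datum and the prior bounds lo <= m_i <= hi."""
    grid = [lo + (hi - lo) * k / (n - 1) for k in range(n)]
    lo_avg, hi_avg = float("inf"), float("-inf")
    for m in itertools.product(grid, grid):
        if abs(m[0] + m[1] - datum) > tol:
            continue  # model does not fit the datum
        avg = sum(m[i] for i in region) / len(region)
        lo_avg, hi_avg = min(lo_avg, avg), max(hi_avg, avg)
    return lo_avg, hi_avg

wide = average_bounds((0,))      # average over m1 alone: bounds [0, 2]
narrow = average_bounds((0, 1))  # average over both: bounds pinch to 1
```

Plotting bound width against region size for a real inverse problem traces out the funnel shape described above.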

  12. An evolutionary approach to advanced water cooled reactors

    International Nuclear Information System (INIS)

    Based on the results of the Feasibility Study undertaken since 1991, Indonesia may enter the new nuclear era by introducing several Nuclear Power Plants into its energy supply system. Requirements for future NPPs are developed in a two-step approach. The first step covers the immediate future, the next 50 years, in which the system will be dominated by A-LWRs/A-PHWRs; the second step covers the period beyond 50 years, in which new reactor systems may start to dominate. The integral reactor concept provides revolutionary improvements in terms of concept and safety. However, it creates a new set of complex machinery and operational problems of its own. The paper gives a brief description of the nuclear technology status in Indonesia and a qualitative assessment of the integral reactor concept. (author)

  13. Advanced welding for closed structure. Pt. 1 The magnetic approach

    Energy Technology Data Exchange (ETDEWEB)

    Sacripanti, A.; Paoloni, M.; Sagratella, G. [ENEA Centro Ricerche Casaccia, Rome (Italy). Dipt. Innovazione

    1999-07-01

    This report describes the activities developed for the European contract BRITE AWCS III to study the use of magnetic sensing techniques to obtain accurate detection of the internal reinforcement of the closed steel structures employed in the shipbuilding industry. After a description of the methods, techniques and problems of the magnetic testing of materials in the conventional approach, a new method was tried to obtain the desired results. The conclusions show that the magnetic non-destructive testing approach produces effects that are too small to measure reliably, is too sensitive to the anisotropy of the magnetic properties of the steel plates and to the quality of the contact with the reinforcement, and is not flexible enough to assemble a sensing system for the goal of BRITE AWCS III. [Translated from Italian] This report describes the experimental activities developed within the European contract BRITE AWCS III, in which magnetic techniques were used to obtain precise detection of the internal reinforcements of closed metal structures used in the shipbuilding industry. After a description of the methods, techniques and problems concerning the magnetic testing of materials, an innovative approach based on purpose-built electromagnets was introduced. The conclusions show that, in the new approach, non-destructive magnetic testing produces perturbations too small to be measured correctly, is too dependent on anisotropies and on the quality of the contact between plate and web, and appears insufficiently flexible to satisfy the technical requirements of BRITE AWCS III.

  14. A note on “A new approach for the selection of advanced manufacturing technologies: Data envelopment analysis with double frontiers”

    Directory of Open Access Journals (Sweden)

    Hossein Azizi

    2015-08-01

    Recently, using the data envelopment analysis (DEA) with double frontiers approach, Wang and Chin (2009) proposed a new approach for the selection of advanced manufacturing technologies (AMTs), together with a new measure for selecting the best AMT. In this note, we show that their proposed overall performance measure for the selection of the best AMT carries an additional computational burden. Moreover, we propose a new measure for developing a complete ranking of AMTs. Numerical examples are examined using the proposed measure to show its simplicity and usefulness in AMT selection and justification.
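In the single-input, single-output case DEA efficiencies reduce to simple ratios, so the double-frontier idea can be sketched without a linear-programming solver. The geometric-mean overall score below follows the general double-frontier scheme (optimistic efficiency against the best performer, pessimistic against the worst); the numbers are illustrative, not from either paper.

```python
import math

def double_frontier_scores(inputs, outputs):
    """Toy double-frontier DEA for one input and one output.
    Optimistic efficiency benchmarks each unit against the best
    output/input ratio (score <= 1); the pessimistic view benchmarks
    against the worst ratio (score >= 1). The overall score used for
    ranking is the geometric mean of the two."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best, worst = max(ratios), min(ratios)
    return [math.sqrt((r / best) * (r / worst)) for r in ratios]

# Three hypothetical AMTs: (cost, throughput).
scores = double_frontier_scores([2, 4, 5], [4, 6, 5])
# ratios are [2.0, 1.5, 1.0], so AMT 1 ranks first
```

A full DEA treatment with multiple inputs and outputs requires solving one linear program per unit and per frontier, which is where the computational-burden argument of the note applies.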

  15. A Soft Computing Approach to Kidney Diseases Evaluation.

    Science.gov (United States)

    Neves, José; Martins, M Rosário; Vilhena, João; Neves, João; Gomes, Sabino; Abelha, António; Machado, José; Vicente, Henrique

    2015-10-01

    Renal failure means that one's kidneys have unexpectedly stopped functioning, i.e., once chronic disease is exposed, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome has to be diagnosed. Although the patient's history and physical examination may denote good practice, some key information has to be obtained from evaluation of the glomerular filtration rate and the analysis of serum biomarkers. Indeed, chronic kidney disease denotes abnormal kidney function and/or structure, and there is evidence that treatment may avoid or delay its progression, by reducing or preventing the development of associated complications, namely hypertension, obesity, diabetes mellitus, and cardiovascular complications. Acute kidney injury appears abruptly, with a rapid deterioration of renal function, but is often reversible if it is recognized early and treated promptly. In both situations, i.e., acute kidney injury and chronic kidney disease, an early intervention can significantly improve the prognosis. The assessment of these pathologies is therefore mandatory, although it is hard to do with traditional methodologies and existing tools for problem solving. Hence, in this work, we focus on the development of a hybrid decision support system, in terms of its knowledge representation and reasoning procedures based on Logic Programming, that allows one to consider incomplete, unknown, and even contradictory information, complemented with a computing approach centered on Artificial Neural Networks, in order to weigh the Degree-of-Confidence that one has in such a happening. The present study involved 558 patients with an average age of 51.7 years; chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables, grouped into five main categories. The proposed model showed a good performance in the diagnosis of chronic kidney disease, since the
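The Degree-of-Confidence idea can be sketched with a small pure-Python example. In work from this group, an attribute whose value is incomplete or unknown is represented as an interval normalized to [0, 1], and the confidence in that value is often taken as DoC = sqrt(1 - width²); the feature names and record below are hypothetical, and this is only the uncertainty-weighting step, not the full Logic Programming plus neural-network system.

```python
import math

def degree_of_confidence(interval):
    """DoC = sqrt(1 - width**2) for a value known only to lie in
    `interval`, normalized to [0, 1]: an exactly known value gives
    DoC = 1, a completely unknown value (width 1) gives DoC = 0."""
    lo, hi = interval
    width = hi - lo
    return math.sqrt(1.0 - width ** 2)

# Hypothetical patient record: each feature as a normalized interval.
patient = {
    "creatinine": (0.30, 0.30),    # exactly known
    "albumin": (0.40, 0.60),       # partially known
    "blood_pressure": (0.0, 1.0),  # unknown
}
global_doc = sum(degree_of_confidence(v) for v in patient.values()) / len(patient)
```

A record's global DoC can then accompany the classifier's output, so a diagnosis made from vague data is flagged as less trustworthy than one made from complete data.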

  16. An advanced safeguards approach for a model 200t/a reprocessing facility, (1)

    International Nuclear Information System (INIS)

    This report describes an advanced safeguards approach which has been developed for a model 200 t/a reprocessing plant, using near-real-time (n.r.t.) materials accountancy in the process MBA, and borrowing advanced ideas from TASTEX, the IWG-RPS, or the authors' own invention for the spent fuel storage and plutonium nitrate storage MBAs. In the spent fuel storage MBA, primary reliance is placed on 100% inspector observation and verification of all spent fuel receipts, and on surveillance measures to ensure that the inspector is aware of all receipts or other activities in the spent fuel cask receiving bay. The advanced safeguards approach gives more detailed consideration to the mechanical or chop-leach cell than most conventional approaches. Safeguards in the process MBA are based on n.r.t. accountancy. The n.r.t. accountancy model used assumes weekly in-process physical inventories of solution in some five buffer storage tanks. The safeguards approach suggested for the plutonium nitrate storage MBA is not significantly different from conventional approaches. The use of sequential statistical techniques for the analysis of n.r.t. accountancy data requires a significantly different philosophical approach to anomalies and anomaly resolution. This report summarizes anomaly resolution procedures, at least through the earlier stages, and gives a summary estimate of the inspection effort likely to be needed to implement the advanced safeguards approach. (author)
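Sequential tests of the kind applied to near-real-time accountancy data can be illustrated with a one-sided CUSUM on a series of standardized material-unaccounted-for (MUF) values. The reference value and decision limit below are illustrative defaults, not parameters taken from the safeguards approach described here.

```python
def cusum_alarm(muf_values, k=0.5, h=4.0):
    """One-sided CUSUM: accumulate positive deviations of each
    standardized MUF value above the reference value k, resetting at
    zero, and flag the first period where the cumulative sum exceeds
    the decision limit h. Returns (alarm_index or None, final sum)."""
    s = 0.0
    for t, x in enumerate(muf_values):
        s = max(0.0, s + x - k)
        if s > h:
            return t, s
    return None, s

# In-control noise followed by a sustained 1.5-sigma loss pattern.
series = [0.1, -0.2, 0.3, 0.0] + [1.5] * 6
```

Unlike a per-period test, the sequential statistic accumulates evidence across balance periods, which is why anomaly resolution has to treat a crossing as the end point of a developing sequence rather than a single-period event.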

  17. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one that

  18. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  19. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one that

  20. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  1. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  2. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M; Kusnezov, D; Bikkel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one

  3. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from

  4. Advanced Urothelial Carcinoma: Overcoming treatment resistance through novel treatment approaches

    Directory of Open Access Journals (Sweden)

    RichardMBambury

    2013-02-01

    Full Text Available The current standard of care for metastatic urothelial carcinoma (UC) is cisplatin-based chemotherapy, but treatment is generally not curative. Mechanisms of resistance to conventional cytotoxic regimens include tumor cell drug efflux pumps, intracellular anti-oxidants and enhanced anti-apoptotic signaling. Blockade of signaling pathways with small molecule tyrosine kinase inhibitors has produced dramatic responses in subsets of other cancers. Multiple potential signaling pathway targets are altered in UC. Blockade of the PI3K/Akt/mTOR pathway may prove efficacious because 21% of tumors have activating PI3K mutations and another 30% have PTEN inactivation (which leads to activation of this pathway). The fibroblast growth factor receptor 3 protein may be overactive in 50-60% of cases, and agents which block this pathway are under active development. Blockade of multiple other pathways, including HER2 and aurora kinase, also has potential efficacy. Anti-angiogenic and immunotherapy strategies are also under development in UC and are discussed in this review. Novel therapeutic approaches are needed in UC. We review the various strategies under development in this disease and discuss how best to evaluate and optimize their efficacy.

  5. Computation on Information, Meaning and Representations. An Evolutionary Approach

    OpenAIRE

    Menant, Christophe

    2011-01-01

    Understanding computation as “a process of the dynamic change of information” leads us to look at the different types of computation and information. Computation of information does not exist alone by itself but is to be considered as part of a system that uses it for some given purpose. Information can be meaningless like a thunderstorm noise, or it can be meaningful like an alert signal or the representation of a desired food. A thunderstorm noise participates in the generation of meani...

  6. The effects of advance organizers according learning styles in computer assisted instruction software on academic achievement

    Directory of Open Access Journals (Sweden)

    Buket Demir

    2011-09-01

    Full Text Available This study aims to investigate the effects of advance organizers in computer assisted instruction software on the academic achievement of students with different learning styles. A semi-empirical pretest–posttest design with a control group was used. The research sample was composed of 131 students taking the Information Technology Course at Süleyman Türkmani Primary School in Kırşehir in the 2010–2011 academic year. Research data were collected using Kolb’s Learning Style Inventory and an Academic Achievement Test (KR–20: 0.82). One-way ANOVA and independent-samples t-tests were conducted on the collected data, with these results: the presence of advance organizers in instructional software affected the academic achievement of students, and there was also a difference in academic achievement between field-independent learners who studied in the computer assisted environments with and without advance organizers.
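
A minimal stand-alone sketch of the statistics reported above, using invented placeholder scores rather than the study's data, shows how the one-way ANOVA F and independent-samples t statistics are formed:

```python
# Sketch of the reported analyses with invented placeholder scores
# (NOT data from the study): a one-way ANOVA F statistic across four
# learning-style groups, and a pooled-variance independent-samples
# t statistic comparing software with vs. without advance organizers.
from statistics import mean, variance

def one_way_anova_F(groups):
    """F = between-group mean square / within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def t_independent(a, b):
    """Pooled-variance independent-samples t statistic."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# Invented achievement scores for four Kolb learning-style groups
styles = [[72, 68, 75, 70, 74],   # diverger
          [80, 78, 82, 79, 81],   # assimilator
          [77, 73, 76, 75, 78],   # converger
          [69, 71, 70, 68, 72]]   # accommodator
F = one_way_anova_F(styles)

# Invented scores: software with vs. without advance organizers
t = t_independent([81, 79, 84, 80, 83, 82], [74, 72, 76, 73, 75, 74])
print(f"F = {F:.2f}, t = {t:.2f}")
```

Where SciPy is available, `scipy.stats.f_oneway` and `scipy.stats.ttest_ind` return the same statistics together with p-values.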

  8. Advanced welding for closed structure. Pt. 3 The thermal approach

    Energy Technology Data Exchange (ETDEWEB)

    Sacripanti, A.; Bonanno, G.; Paoloni, M.; Sagratella, G. [ENEA Centro Ricerche Casaccia, Rome (Italy). Dipt. Innovazione]; Arborino, A.; Varesi, R.; Antonucci, A. [DUNE (Italy)]

    1999-07-01

    This report describes the activities carried out under the European contract BRITE AWCS III to study the use of thermal sensing techniques for the accurate detection of the internal reinforcements of the closed steel structures employed in the shipbuilding industry. After a description of the conventional methods for the thermal testing of materials, developed mainly in Russia, and of their associated techniques and problems, a new approach based on a new detector is introduced: a bolometric thermal camera with dedicated software for on-line image analysis. The experimental tests and results are also presented. The conclusions show that thermal non-destructive testing with the new detector could usefully be combined with an ultrasonic head to assemble a complete sensing system.

  9. A proposed approach for developing next-generation computational electromagnetics software

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K.; Kruger, R.P. [Los Alamos National Lab., NM (United States); Moraites, S. [Simulated Life Systems, Inc., Chambersburg, PA (United States)

    1993-02-01

    Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly "user friendly" called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

  11. Liposuction for Advanced Lymphedema: A Multidisciplinary Approach for Complete Reduction of Arm and Leg Swelling

    OpenAIRE

    Boyages, John; Kastanias, Katrina; Koelmeyer, Louise A.; Winch, Caleb J.; Lam, Thomas C.; Sherman, Kerry A.; Munnoch, David Alex; Brorson, Håkan; Ngo, Quan D.; Heydon-White, Asha; Magnussen, John S.; Mackie, Helen

    2015-01-01

    Purpose This research describes and evaluates a liposuction surgery and multidisciplinary rehabilitation approach for advanced lymphedema of the upper and lower extremities. Methods A prospective clinical study was conducted at an Advanced Lymphedema Assessment Clinic (ALAC) comprised of specialists in plastic surgery, rehabilitation, imaging, oncology, and allied health, at Macquarie University, Australia. Between May 2012 and 31 May 2014, a total of 104 patients attended the ALAC. Eligibili...

  12. Design approach to the development of an advanced HANARO research reactor

    International Nuclear Information System (INIS)

    Based on the experiences of the HANARO construction and operation, a project to design an advanced research reactor was launched in 2003 to prepare for the future needs of a research reactor. Many improvements identified during the HANARO operation and utilization will be incorporated into the design of the advanced research reactor. This paper deals with the basic principles of the design approach and the preliminary design features of the reactor under study

  13. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability, further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  14. Recent Advances in Computational Simulation of Macro-, Meso-, and Micro-Scale Biomimetics Related Fluid Flow Problems

    Institute of Scientific and Technical Information of China (English)

    Y. Y. Yan

    2007-01-01

    Over the last decade, computational methods have been intensively applied to a variety of scientific research and engineering design problems. Although the computational fluid dynamics (CFD) method has played a dominant role in studying and simulating transport phenomena involving fluid flow and heat and mass transfers, in recent years, other numerical methods for simulations at meso- and micro-scales have also been actively applied to solve the physics of complex flow and fluid-interface interactions. This paper presents a review of recent advances in multi-scale computational simulation of biomimetics-related fluid flow problems. State-of-the-art numerical techniques, such as the lattice Boltzmann method (LBM), molecular dynamics (MD), and conventional CFD, are introduced, along with their application to problems such as fish flow, the electro-osmosis effect in earthworm motion, and self-cleaning hydrophobic surfaces. The new challenges in modelling biomimetics problems, such as establishing the physical conditions of self-cleaning hydrophobic surfaces, are discussed.
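
Of the techniques surveyed, the lattice Boltzmann method is the easiest to sketch compactly. The toy model below is our illustration, not code from the review: a 1-D two-velocity collide-and-stream scheme that diffuses a density spike.

```python
# Minimal 1-D two-velocity lattice Boltzmann sketch (illustration only):
# two populations stream in opposite directions and relax toward a
# shared equilibrium, which makes the total density diffuse.
N = 64          # lattice sites (periodic)
tau = 1.0       # relaxation time; diffusivity grows with tau - 1/2

f_plus = [0.0] * N            # population streaming right
f_minus = [0.0] * N           # population streaming left
f_plus[N // 2] = f_minus[N // 2] = 0.5   # unit-mass spike in the middle

for step in range(200):
    # Collision: relax each population toward its equilibrium rho/2.
    for i in range(N):
        rho_i = f_plus[i] + f_minus[i]
        f_plus[i] += (rho_i / 2 - f_plus[i]) / tau
        f_minus[i] += (rho_i / 2 - f_minus[i]) / tau
    # Streaming: shift each population one site in its direction (periodic).
    f_plus = [f_plus[(i - 1) % N] for i in range(N)]
    f_minus = [f_minus[(i + 1) % N] for i in range(N)]

rho = [fp + fm for fp, fm in zip(f_plus, f_minus)]
print(f"total mass = {sum(rho):.6f}, peak density = {max(rho):.4f}")
```

Mass is conserved by both steps while the spike spreads diffusively; production LBM codes use multi-velocity stencils such as D2Q9 and physically tuned relaxation times.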

  15. Evaluating fluid behavior in advanced reactor systems using coupled computational fluid dynamics and systems analysis tools

    International Nuclear Information System (INIS)

    Simulation of some fluid phenomena associated with Generation IV reactors requires the capability of modeling mixing in two- or three-dimensional flow. At the same time, the flow condition of interest is often transient and depends upon boundary conditions dictated by the system behavior as a whole. Computational Fluid Dynamics (CFD) is an ideal tool for simulating mixing and three-dimensional flow in system components, whereas a system analysis tool is ideal for modeling the entire system. This paper presents the reasoning which has led to coupled CFD and systems analysis code software to analyze advanced reactor fluid system behavior. In addition, the kinds of scenarios where this capability is important are identified. The important role of a coupled CFD/systems analysis code tool in the overall calculation scheme for a Very High Temperature Reactor is described. The manner in which coupled systems analysis and CFD codes will be used to evaluate the mixing behavior in a plenum for transient boundary conditions is described. The calculation methodology forms the basis for future coupled calculations that will examine the behavior of such systems at a spectrum of conditions, including transient accident conditions, that define the operational and accident envelope of the subject system. The methodology and analysis techniques demonstrated herein are a key technology that in part forms the backbone of the advanced techniques employed in the evaluation of advanced designs and their operational characteristics for the Generation IV advanced reactor systems. (authors)
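
The coupling strategy described, with a system code supplying boundary conditions to a CFD region and receiving integral results back each time step, can be caricatured in a few lines. The models and numbers below are invented purely for illustration:

```python
# Toy explicit coupling loop (illustration only, not the paper's method):
# a 0-D "system" model hands an inlet temperature to a 1-D "CFD" region
# each step; the CFD region returns its outlet temperature.
N = 20                       # CFD cells in the plenum model
T_cfd = [300.0] * N          # K, initial plenum temperature
T_sys = 400.0                # K, system-side coolant temperature

for _ in range(100):
    inlet = T_sys                    # boundary condition from the system code
    # 1-D upwind advection through the plenum: one cell per time step
    T_cfd = [inlet] + T_cfd[:-1]
    outlet = T_cfd[-1]               # integral result returned to the system code
    # 0-D system model relaxes toward the returned outlet temperature
    T_sys += 0.1 * (outlet - T_sys)

print(f"system T = {T_sys:.1f} K, outlet T = {outlet:.1f} K")
```

Even this caricature shows the key feature of the coupled approach: the transient boundary condition seen by the "CFD" region is not prescribed in advance but emerges from the feedback with the system model.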

  16. Computer Science Contests for Secondary School Students: Approaches to Classification

    Directory of Open Access Journals (Sweden)

    Wolfgang POHL

    2006-04-01

    Full Text Available The International Olympiad in Informatics currently provides a model which is imitated by the majority of contests for secondary school students in Informatics or Computer Science. However, the IOI model can be criticized, and alternative contest models exist. To support the discussion about contests in Computer Science, several dimensions for characterizing and classifying contests are suggested.

  17. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    OpenAIRE

    Vladimiras Dolgopolovas; Valentina Dagienė; Saulius Minkevičius; Leonidas Sakalauskas

    2015-01-01

    The aim of this study is to present an approach to the introduction into pipeline and parallel computing, using a model of the multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included in the curriculum. At the same ti...

  18. Development of Computer Science Disciplines - A Social Network Analysis Approach

    CERN Document Server

    Pham, Manh Cuong; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science treats conference publications as an important venue alongside journals. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss the paper with peers. Previous work on knowledge mapping focused on the map of all sciences or of a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and workshop proceedings, which results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming at providing a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...
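
The venue-level citation network the authors build can be illustrated with a toy aggregation; the papers, venues, and citation links below are invented for the sketch:

```python
# Toy venue-level citation network (invented data, illustration only):
# paper-level citation edges are aggregated to the journal/conference
# level, then venues are ranked by incoming citation weight.
from collections import Counter, defaultdict

# paper -> venue (hypothetical)
venue = {
    "p1": "SIGMOD", "p2": "SIGMOD", "p3": "VLDB",
    "p4": "ICSE", "p5": "ICSE", "p6": "TSE",
}
# directed citation edges: (citing paper, cited paper)
citations = [("p1", "p3"), ("p2", "p3"), ("p4", "p6"),
             ("p5", "p6"), ("p5", "p4"), ("p1", "p2")]

# Aggregate to a weighted venue-level network.
venue_edges = Counter()
for src, dst in citations:
    if venue[src] != venue[dst]:          # drop within-venue self-loops
        venue_edges[(venue[src], venue[dst])] += 1

# Incoming citation weight per venue.
in_weight = defaultdict(int)
for (_, dst), w in venue_edges.items():
    in_weight[dst] += w

print(sorted(in_weight.items(), key=lambda kv: -kv[1]))
```

On the real DBLP/CiteSeerX data the same aggregation yields a weighted graph on which community detection can expose sub-discipline structure.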

  19. Exploring the Process of Adult Computer Software Training Using Andragogy, Situated Cognition, and a Minimalist Approach

    Science.gov (United States)

    Hurt, Andrew C.

    2007-01-01

    With technology advances, computer software becomes increasingly difficult to learn. Adults often rely on software training to keep abreast of these changes. Instructor-led software training is frequently used to teach adults new software skills; however there is limited research regarding the best practices in adult computer software training.…

  20. A Challenging Surgical Approach to Locally Advanced Primary Urethral Carcinoma: A Case Report and Literature Review.

    Science.gov (United States)

    Lucarelli, Giuseppe; Spilotros, Marco; Vavallo, Antonio; Palazzo, Silvano; Miacola, Carlos; Forte, Saverio; Matera, Matteo; Campagna, Marcello; Colamonico, Ottavio; Schiralli, Francesco; Sebastiani, Francesco; Di Cosmo, Federica; Bettocchi, Carlo; Di Lorenzo, Giuseppe; Buonerba, Carlo; Vincenti, Leonardo; Ludovico, Giuseppe; Ditonno, Pasquale; Battaglia, Michele

    2016-05-01

    Primary urethral carcinoma (PUC) is a rare and aggressive cancer, often underdetected and consequently unsatisfactorily treated. We report a case of advanced PUC, surgically treated with combined approaches. A 47-year-old man underwent transurethral resection of a urethral lesion with histological evidence of a poorly differentiated squamous cancer of the bulbomembranous urethra. Computed tomography (CT) and bone scans excluded metastatic spread of the disease but showed involvement of both corpora cavernosa (cT3N0M0). A radical surgical approach was advised, but the patient refused this and opted for chemotherapy. After 17 months the patient was referred to our department due to the evidence of a fistula in the scrotal area. CT scan showed bilateral metastatic disease in the inguinal, external iliac, and obturator lymph nodes as well as the involvement of both corpora cavernosa. Additionally, a fistula originating from the right corpus cavernosum extended to the scrotal skin. At this stage, the patient accepted the surgical treatment, consisting of different phases. Phase I: Radical extraperitoneal cystoprostatectomy with iliac-obturator lymph nodes dissection. Phase II: Creation of a urinary diversion through a Bricker ileal conduit. Phase III: Repositioning of the patient in lithotomic position for an overturned Y skin incision, total penectomy, fistula excision, and "en bloc" removal of surgical specimens including the bladder, through the perineal breach. Phase IV: Right inguinal lymphadenectomy. The procedure lasted 9-and-a-half hours, was complication-free, and intraoperative blood loss was 600 mL. The patient was discharged 8 days after surgery. Pathological examination documented a T4N2M0 tumor. The clinical situation was stable during the first 3 months postoperatively but then metastatic spread occurred, not responsive to adjuvant chemotherapy, which led to the patient's death 6 months after surgery. Patients with advanced stage tumors of the

  1. What Computational Approaches Should be Taught for Physics?

    Science.gov (United States)

    Landau, Rubin

    2005-03-01

    The standard Computational Physics courses are designed for upper-level physics majors who already have some computational skills. We believe that it is important for first-year physics students to learn modern computing techniques that will be useful throughout their college careers, even before they have learned the math and science required for Computational Physics. To teach such Introductory Scientific Computing courses requires that some choices be made as to what subjects and computer languages will be taught. Our survey of colleagues active in Computational Physics and Physics Education shows no predominant choice, with strong positions taken for the compiled languages Java, C, C++ and Fortran90, as well as for problem-solving environments like Maple and Mathematica. Over the last seven years we have developed an Introductory course and have written up those courses as textbooks for others to use. We will describe our model of using both a problem-solving environment and a compiled language. The developed materials are available in both Maple and Mathematica, and Java and Fortran90 (Princeton University Press, to be published; www.physics.orst.edu/~rubin/IntroBook/).
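
A flavor of such an Introductory Scientific Computing exercise, sketched here under our own assumptions rather than from the authors' materials, is to integrate a model equation numerically and compare against the closed-form answer a problem-solving environment would return:

```python
# Illustrative first-year exercise (our example, not the course's):
# integrate dy/dt = -y by forward Euler, the kind of loop one writes in
# a compiled language, then compare with the closed-form answer exp(-t)
# that a symbolic problem-solving environment would produce.
import math

dt, t_end = 0.001, 1.0
y = 1.0            # initial condition y(0) = 1
t = 0.0
while t < t_end - 1e-12:
    y += dt * (-y)     # forward-Euler step for dy/dt = -y
    t += dt

exact = math.exp(-1.0)
print(f"Euler: {y:.6f}, exact: {exact:.6f}, error: {abs(y - exact):.2e}")
```

Halving `dt` roughly halves the error, a first-order convergence check that makes a good follow-up exercise.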

  2. Computational Models of Exercise on the Advanced Resistance Exercise Device (ARED)

    Science.gov (United States)

    Newby, Nate; Caldwell, Erin; Scott-Pandorf, Melissa; Peters, Brian; Fincke, Renita; DeWitt, John; Poutz-Snyder, Lori

    2011-01-01

    Muscle and bone loss remain a concern for crew returning from space flight. The advanced resistance exercise device (ARED) is used for on-orbit resistance exercise to help mitigate these losses. However, characterization of how the ARED loads the body in microgravity has yet to be determined. Computational models allow us to analyze ARED exercise in both 1G and 0G environments. To this end, biomechanical models of the squat, single-leg squat, and deadlift exercise on the ARED have been developed to further investigate bone and muscle forces resulting from the exercises.

  3. Recent advances in computational biology, bioinformatics, medicine, and healthcare by modern OR

    OpenAIRE

    Türkay, Metin; Weber, Gerhard-Wilhelm; Blazewicz, Jacek; Rauner, Marion

    2014-01-01

    CEJOR (2014) 22:427–430, DOI 10.1007/s10100-013-0327-2. Editorial, published online 7 September 2013, Springer-Verlag Berlin Heidelberg. At the occasion of the 25th European Conference on Operational Research, EURO XXV 2012, July 8–11, 2012, in Vilnius, Lithuania (http://www.euro-2012.lt/), the ...

  4. Advanced Computing Technologies for Rocket Engine Propulsion Systems: Object-Oriented Design with C++

    Science.gov (United States)

    Bekele, Gete

    2002-01-01

    This document explores the use of advanced computer technologies with an emphasis on object-oriented design to be applied in the development of software for a rocket engine to improve vehicle safety and reliability. The primary focus is on phase one of this project, the smart start sequence module. The objectives are: 1) To use current sound software engineering practices, object-orientation; 2) To improve on software development time, maintenance, execution and management; 3) To provide an alternate design choice for control, implementation, and performance.

  5. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Subcommittee Report on Scientific and Technical Information

    Energy Technology Data Exchange (ETDEWEB)

    Hey, Tony [eScience Institute, University of Washington; Agarwal, Deborah [Lawrence Berkeley National Laboratory; Borgman, Christine [University of California, Los Angeles; Cartaro, Concetta [SLAC National Accelerator Laboratory; Crivelli, Silvia [Lawrence Berkeley National Laboratory; Van Dam, Kerstin Kleese [Pacific Northwest National Laboratory; Luce, Richard [University of Oklahoma; Arjun, Shankar [CADES, Oak Ridge National Laboratory; Trefethen, Anne [University of Oxford; Wade, Alex [Microsoft Research, Microsoft Corporation; Williams, Dean [Lawrence Livermore National Laboratory

    2015-09-04

    The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy’s Office of Scientific and Technical Information (OSTI) and to begin by assessing the quality and effectiveness of OSTI’s recent and current products and services and to comment on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services and other materials. This report summarizes their initial findings and recommendations.

  6. Loss tolerant one-way quantum computation -- a horticultural approach

    CERN Document Server

    Varnava, Michael; Browne, Daniel E.; Rudolph, Terry

    2005-01-01

    We introduce a scheme for fault-tolerantly dealing with losses in cluster-state computation that can tolerate up to 50% qubit loss. This is achieved passively: no coherent measurements or coherent corrections are required. We then use this procedure within a specific linear optical quantum computation proposal to show that: (i) given perfect sources, detector inefficiencies of up to 50% can be tolerated and (ii) given perfect detectors, the purity of the photon source (the overlap of the photonic wavefunction with the desired single mode) need only be greater than 66.6% for efficient computation to be possible.

  7. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hendrickson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  8. Further development of the Dynamic Control Assemblies Worth Measurement Method for Advanced Reactivity Computers

    International Nuclear Information System (INIS)

    The dynamic control assembly worth measurement technique is a quick method for validating predicted control assembly worths. The dynamic measurement utilizes space-time corrections, calculated by the DYN 3D computer code, for the measured out-of-core ionization chamber readings. The space-time correction arising from the prompt neutron density redistribution in the measured ionization chamber reading can be applied directly in the advanced reactivity computer. The second correction, concerning the difference in the spatial distribution of delayed neutrons, can be calculated by simulating the measurement procedure with the dynamic version of the DYN 3D code. The paper presents some results of dynamic control assembly worth measurements applied at NPP Mochovce. (Authors)
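
    At its core, an advanced reactivity computer evaluates the inverse point kinetics equation on a measured flux trace. The sketch below is a minimal one-delayed-group illustration with assumed parameters (real reactivity computers use six delayed-neutron groups plus the space-time corrections described above):

```python
import numpy as np

# Minimal one-group inverse point kinetics, the core of a "reactivity
# computer": infer reactivity rho(t) from a measured flux trace n(t).
# Parameters are illustrative, not plant data.
beta, lam, Lam = 0.0065, 0.08, 1e-4   # delayed fraction, decay const, gen. time

def reactivity(t, n):
    dt = t[1] - t[0]
    c = beta * n[0] / (Lam * lam)     # equilibrium precursors at t = 0
    rho = np.empty_like(n)
    dndt = np.gradient(n, dt)
    for i, ni in enumerate(n):
        rho[i] = beta + Lam * dndt[i] / ni - lam * Lam * c / ni
        c += (beta * ni / Lam - lam * c) * dt   # explicit Euler precursor update
    return rho

t = np.linspace(0, 10, 1001)
n = np.ones_like(t)                   # steady flux -> critical reactor
print(np.max(np.abs(reactivity(t, n))))
```

    For a steady flux the inferred reactivity is zero, as expected for a critical reactor; a control assembly drop would appear as a negative step in `rho`.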

  9. Advances in Intelligent Modelling and Simulation Artificial Intelligence-Based Models and Techniques in Scalable Computing

    CERN Document Server

    Khan, Samee; Burczyński, Tadeusz

    2012-01-01

    One of the most challenging issues in today's large-scale computational modeling and design is to effectively manage complex distributed environments, such as computational clouds, grids, ad hoc, and P2P networks operating under various types of users with evolving relationships fraught with uncertainties. In this context, the IT resources and services usually belong to different owners (institutions, enterprises, or individuals) and are managed by different administrators. Moreover, uncertainties are presented to the system at hand in various forms of information that is incomplete, imprecise, fragmentary, or overloading, which hinders the full and precise resolution of the evaluation criteria, the sequencing and selection, and the assignment of scores. Intelligent scalable systems enable flexible routing and charging, advanced user interactions, and the aggregation and sharing of geographically distributed resources in modern large-scale systems.   This book presents new ideas, theories, models...

  10. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general educational and worldview functions of computer science call for additional research into the…

  11. Diffuse globally, compute locally: a cyclist approach to modeling long time robot locomotion

    Science.gov (United States)

    Zhang, Tingnan; Goldman, Daniel; Cvitanović, Predrag

    2015-03-01

    To advance autonomous robots we are interested in developing a statistical/dynamical description of diffusive self-propulsion on heterogeneous terrain. We consider a minimal model for such diffusion, the 2-dimensional Lorentz gas, which abstracts the motion of a light, point-like particle bouncing within a large number of heavy scatterers (e.g. small robots in a boulder field). We present a precise computation (based on the exact periodic orbit theory formula for the diffusion constant) for a periodic triangular Lorentz gas with finite horizon. We formulate a new approach to tiling the plane in terms of three elementary tiling generators which, for the first time, enables use of periodic orbits computed in the fundamental domain (that is, 1/12 of the hexagonal elementary cell whose translations tile the entire plane). Compared with previous literature, our fundamental domain value of the diffusion constant converges quickly for inter-disk separation/disk radius > 0.2, with the cycle expansion truncated to only a few hundred periodic orbits of up to 5 billiard wall bounces. For small inter-disk separations, with periodic orbits up to 6 bounces, our diffusion constants are close to simulation estimates and to recent probabilistic estimates in the literature.
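
    For comparison, the "simulation estimates" of a diffusion constant mentioned above boil down to a mean-squared-displacement measurement, D = <|r(t)|^2>/(2dt). The sketch below applies that definition to free random flights rather than to an actual billiard, so it illustrates the estimator only, not the Lorentz-gas dynamics:

```python
import numpy as np

rng = np.random.default_rng(0)

def diffusion_constant(n_walkers=5000, n_steps=500, step=1.0):
    """Estimate D from the mean squared displacement of 2-D random
    flights, D = <|r(t)|^2> / (2 d t) with d = 2 dimensions."""
    angles = rng.uniform(0.0, 2.0 * np.pi, size=(n_walkers, n_steps))
    steps = step * np.stack([np.cos(angles), np.sin(angles)], axis=-1)
    r = steps.sum(axis=1)                    # endpoint of each walk
    msd = np.mean(np.sum(r * r, axis=1))     # <|r|^2>
    return msd / (2 * 2 * n_steps)           # 2 * d * t

D = diffusion_constant()
# For unit steps the expected value is step^2 / (2 d) = 0.25
print(D)
```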

  12. Computational Approaches for Probing the Formation of Atmospheric Molecular Clusters

    DEFF Research Database (Denmark)

    Elm, Jonas

    This thesis presents the investigation of atmospheric molecular clusters using computational methods. Previous investigations have focused on solving problems related to atmospheric nucleation, and have not been targeted at the performance of the applied methods. This thesis focuses on assessing...

  13. A computational approach to George Boole's discovery of mathematical logic

    OpenAIRE

    Ledesma, Luis de; Pérez, Aurora; Borrajo, Daniel; Laita, Luis M.

    1997-01-01

    This paper reports a computational model of Boole's discovery of Logic as a part of Mathematics. George Boole (1815–1864) found that the symbols of Logic behaved as algebraic symbols, and he then rebuilt the whole contemporary theory of Logic by the use of methods such as the solution of algebraic equations. Study of the different historical factors that influenced this achievement has served as background for our two main contributions: a computational representation of Boole's Logic before ...

  14. AN ETHICAL ASSESSMENT OF COMPUTER ETHICS USING SCENARIO APPROACH

    OpenAIRE

    Maslin Masrom; Zuraini Ismail; Ramlah Hussein

    2010-01-01

    Ethics refers to a set of rules that define right and wrong behavior, used for moral decision making. In this context, computer ethics is one of the major issues in information technology (IT) and information systems (IS). The ethical behaviour of IT students and professionals needs to be studied in an attempt to reduce many unethical practices such as software piracy, hacking, and software intellectual property violations. This paper attempts to address computer-related scenarios that can be used...

  15. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science places strong emphasis on conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss the paper with peers. Previous work on knowledge mapping focused on the map of all sciences or a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most of the important journals, it lacks computer science conference and ...

  16. An Econometric Approach of Computing Competitiveness Index in Human Capital

    OpenAIRE

    Salahodjaev, Raufhon; Nazarov, Zafar

    2013-01-01

    The aim of this paper is to provide a methodology for estimating one of the components (pillars) of the Global Competitiveness Index (GCI), the health and primary education (HPE) pillar, for countries not included in the Global Competitiveness Report, using conventional econometric techniques. Specifically, using weighted least squares and bootstrapping methods, we are able to compute the HPE for two countries of the former Soviet Union, Uzbekistan and Belarus, and then compare the computed...
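
    The weighted-least-squares-plus-bootstrap recipe can be sketched as follows; all data, weights and coefficients below are synthetic, chosen only to illustrate the mechanics of imputing a pillar score for an out-of-sample country:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical training data: pillar scores y explained by two
# country-level indicators X (e.g. life expectancy, enrolment rate).
n = 60
X = np.column_stack([np.ones(n), rng.normal(70, 5, n), rng.normal(90, 4, n)])
beta_true = np.array([0.5, 0.04, 0.02])
y = X @ beta_true + rng.normal(0, 0.1, n)
w = rng.uniform(0.5, 1.5, n)          # observation weights

def wls(X, y, w):
    """Weighted least squares via rescaling: minimise sum w_i e_i^2."""
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta

beta = wls(X, y, w)
x_new = np.array([1.0, 68.0, 88.0])   # indicators for an out-of-sample country

# Bootstrap the predicted pillar score to get an uncertainty band.
preds = []
for _ in range(500):
    idx = rng.integers(0, n, n)
    preds.append(x_new @ wls(X[idx], y[idx], w[idx]))
lo, hi = np.percentile(preds, [2.5, 97.5])
print(x_new @ beta, lo, hi)
```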

  17. Collaboration in computer science: a network science approach. Part I

    OpenAIRE

    Franceschet, Massimo

    2010-01-01

    Co-authorship in publications within a discipline uncovers interesting properties of the analysed field. We represent collaboration in academic papers of computer science in terms of differently grained networks, including those sub-networks that emerge from conference and journal co-authorship only. We take advantage of the network science paraphernalia to take a picture of computer science collaboration including all papers published in the field since 1936. We investigate typical bibliomet...
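
    The kind of co-authorship network analysed above can be built directly from paper author lists. A toy sketch with invented author names:

```python
from itertools import combinations
from collections import defaultdict

# Toy paper records: each entry is the author list of one publication.
papers = [
    ["Erdos", "Renyi"],
    ["Erdos", "Ko", "Rado"],
    ["Ko", "Rado"],
    ["Renyi", "Sos"],
]

# Build an undirected, weighted co-authorship graph: an edge for every
# pair of authors on the same paper, weight = number of joint papers.
graph = defaultdict(lambda: defaultdict(int))
for authors in papers:
    for a, b in combinations(sorted(set(authors)), 2):
        graph[a][b] += 1
        graph[b][a] += 1

# Degree = number of distinct collaborators, a basic bibliometric property.
degree = {a: len(nbrs) for a, nbrs in graph.items()}
print(degree["Erdos"])   # collaborates with Ko, Rado, Renyi -> 3
```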

  18. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    Science.gov (United States)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a "Cluster" architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the software resource manager SLURM and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 Personal Computers with Quad-Core CPUs able to reach a computing power of 300 Giga Flops (300x10^9 Floating point Operations Per Second), with 120 GB of RAM and 7.5 Tera Bytes (TB) of storage memory in UFS configuration plus 6 TB for the users area. AVES was designed and built to solve the growing problems raised by the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB), which increases every year. The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the analysis workload across the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained by a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of on-line data storage. The AVES software package consists of about 50 specific programs. The whole computing time, compared to that of a Personal Computer with a single processor, has thus been reduced by up to a factor of 70.
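
    The workload-splitting step described above (dividing an analysis into N jobs for N cores) can be sketched as a simple partitioning routine; the science-window identifiers and the OSA command name in the comment are placeholders, not the actual AVES interface:

```python
def partition(items, n_jobs):
    """Split a list of analysis tasks into n_jobs nearly equal chunks,
    one chunk per cluster core (mimicking the N-jobs-to-N-cores split)."""
    n_jobs = min(n_jobs, len(items)) or 1
    base, extra = divmod(len(items), n_jobs)
    chunks, start = [], 0
    for i in range(n_jobs):
        size = base + (1 if i < extra else 0)
        chunks.append(items[start:start + size])
        start += size
    return chunks

# Hypothetical INTEGRAL science windows to analyse independently.
scws = [f"scw_{i:04d}" for i in range(10)]
for i, chunk in enumerate(partition(scws, 4)):
    # On the cluster each line would become one batch job handed to
    # SLURM, e.g. `sbatch --wrap "<osa_command> <ids>"` (command assumed).
    print(f"job {i}: {' '.join(chunk)}")
```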

  19. Validation of physics and thermalhydraulic computer codes for advanced Candu reactor applications

    International Nuclear Information System (INIS)

    Atomic Energy of Canada Ltd. (AECL) is developing an Advanced Candu Reactor (ACR) that is an evolutionary advancement of the currently operating Candu 6 reactors. The ACR is being designed to produce electrical power for a capital cost and at a unit-energy cost significantly less than that of the current reactor designs. The ACR retains the modular Candu concept of horizontal fuel channels surrounded by a heavy water moderator. However, ACR uses slightly enriched uranium fuel compared to the natural uranium used in Candu 6. This achieves the twin goals of improved economics (via large reductions in the heavy water moderator volume and replacement of the heavy water coolant with light water coolant) and improved safety. AECL has developed and implemented a software quality assurance program to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. Since the basic design of the ACR is equivalent to that of the Candu 6, most of the key phenomena associated with the safety analyses of ACR are common, and the Candu industry standard tool-set of safety analysis codes can be applied to the analysis of the ACR. A systematic assessment of computer code applicability addressing the unique features of the ACR design was performed covering the important aspects of the computer code structure, models, constitutive correlations, and validation database. Arising from this assessment, limited additional requirements for code modifications and extensions to the validation databases have been identified. This paper provides an outline of the AECL software quality assurance program process for the validation of computer codes used to perform physics and thermal-hydraulics safety analyses of the ACR. It describes the additional validation work that has been identified for these codes and the planned, and ongoing, experimental programs to extend the code validation as required to address specific ACR design

  20. New Approaches to Quantum Computing using Nuclear Magnetic Resonance Spectroscopy

    International Nuclear Information System (INIS)

    The power of a quantum computer (QC) relies on the fundamental concept of superposition in quantum mechanics, which allows an inherently large-scale parallelization of computation. In a QC, binary information embodied in a quantum system, such as the spin degrees of freedom of a spin-1/2 particle, forms the qubits (quantum mechanical bits), over which appropriate logical gates perform the computation. In classical computers, the basic unit of information is the bit, which can take a value of either 0 or 1. Bits are connected together by logic gates to form logic circuits that implement complex logical operations. The expansion of modern computers has been driven by the development of faster, smaller and cheaper logic gates. As the size of the logic gates shrinks toward atomic dimensions, the performance of such a system is no longer classical but is instead governed by quantum mechanics. Quantum computers offer the potentially superior prospect of solving computational problems that are intractable for classical computers, such as efficient database searches and cryptography. A variety of algorithms have been developed recently, most notably Shor's algorithm for factorizing long numbers into prime factors in polynomial time and Grover's quantum search algorithm. These algorithms were until recently of only theoretical interest, but several methods have since been proposed to build an experimental QC. These methods include trapped ions, cavity-QED, coupled quantum dots, Josephson junctions, spin resonance transistors, linear optics and nuclear magnetic resonance. Nuclear magnetic resonance (NMR) is uniquely capable of constructing small QCs, and several algorithms have been implemented successfully. NMR-QC differs from other implementations in one important way: it is not a single QC, but a statistical ensemble of them. Thus, quantum computing based on NMR is considered ensemble quantum computing. In NMR quantum computing, the spins with
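
    Grover's search algorithm, mentioned above, is small enough to demonstrate with plain linear algebra. The sketch below runs one Grover iteration on a 2-qubit (N = 4) register, which already finds the marked state with certainty; it is implementation-agnostic, not NMR-specific:

```python
import numpy as np

# 2-qubit Grover search: N = 4 states, marked item |10> (index 2).
N, marked = 4, 2
psi = np.full(N, 1 / np.sqrt(N))          # uniform superposition

oracle = np.eye(N)
oracle[marked, marked] = -1               # phase-flip the marked state

diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # inversion about the mean

psi = diffusion @ (oracle @ psi)          # one Grover iteration
probs = np.abs(psi) ** 2
print(np.argmax(probs), probs[marked])    # marked state found with prob 1
```

    For N = 4 a single iteration suffices; larger registers need on the order of sqrt(N) iterations, which is the source of Grover's quadratic speedup over classical search.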

  1. Percutaneous irreversible electroporation of locally advanced pancreatic carcinoma using the dorsal approach: a case report.

    Science.gov (United States)

    Scheffer, Hester J; Melenhorst, Marleen C A M; Vogel, Jantien A; van Tilborg, Aukje A J M; Nielsen, Karin; Kazemier, Geert; Meijerink, Martijn R

    2015-06-01

    Irreversible electroporation (IRE) is a novel image-guided ablation technique that is increasingly used to treat locally advanced pancreatic carcinoma (LAPC). We describe a 67-year-old male patient with a 5 cm stage III pancreatic tumor who was referred for IRE. Because the ventral approach for electrode placement was considered dangerous due to the vicinity of the tumor to collateral vessels and the duodenum, the dorsal approach was chosen. Under CT guidance, six electrodes were advanced into the tumor, approaching paravertebrally alongside the aorta and inferior vena cava. Ablation was performed without complications. This case describes that when ventral electrode placement for pancreatic IRE is impaired, the dorsal approach can be considered as an alternative. PMID:25288173

  2. Percutaneous Irreversible Electroporation of Locally Advanced Pancreatic Carcinoma Using the Dorsal Approach: A Case Report

    International Nuclear Information System (INIS)

    Irreversible electroporation (IRE) is a novel image-guided ablation technique that is increasingly used to treat locally advanced pancreatic carcinoma (LAPC). We describe a 67-year-old male patient with a 5 cm stage III pancreatic tumor who was referred for IRE. Because the ventral approach for electrode placement was considered dangerous due to the vicinity of the tumor to collateral vessels and the duodenum, the dorsal approach was chosen. Under CT guidance, six electrodes were advanced into the tumor, approaching paravertebrally alongside the aorta and inferior vena cava. Ablation was performed without complications. This case describes that when ventral electrode placement for pancreatic IRE is impaired, the dorsal approach can be considered as an alternative

  3. Percutaneous Irreversible Electroporation of Locally Advanced Pancreatic Carcinoma Using the Dorsal Approach: A Case Report

    Energy Technology Data Exchange (ETDEWEB)

    Scheffer, Hester J., E-mail: hj.scheffer@vumc.nl; Melenhorst, Marleen C. A. M., E-mail: m.melenhorst@vumc.nl [VU University Medical Center, Department of Radiology and Nuclear Medicine (Netherlands); Vogel, Jantien A., E-mail: j.a.vogel@amc.uva.nl [Academic Medical Center, Department of Surgery (Netherlands); Tilborg, Aukje A. J. M. van, E-mail: a.vantilborg@vumc.nl [VU University Medical Center, Department of Radiology and Nuclear Medicine (Netherlands); Nielsen, Karin, E-mail: k.nielsen@vumc.nl; Kazemier, Geert, E-mail: g.kazemier@vumc.nl [VU University Medical Center, Department of Surgery (Netherlands); Meijerink, Martijn R., E-mail: mr.meijerink@vumc.nl [VU University Medical Center, Department of Radiology and Nuclear Medicine (Netherlands)

    2015-06-15

    Irreversible electroporation (IRE) is a novel image-guided ablation technique that is increasingly used to treat locally advanced pancreatic carcinoma (LAPC). We describe a 67-year-old male patient with a 5 cm stage III pancreatic tumor who was referred for IRE. Because the ventral approach for electrode placement was considered dangerous due to the vicinity of the tumor to collateral vessels and the duodenum, the dorsal approach was chosen. Under CT guidance, six electrodes were advanced into the tumor, approaching paravertebrally alongside the aorta and inferior vena cava. Ablation was performed without complications. This case describes that when ventral electrode placement for pancreatic IRE is impaired, the dorsal approach can be considered as an alternative.

  4. Computational consideration on advanced oxidation degradation of phenolic preservative, methylparaben, in water: mechanisms, kinetics, and toxicity assessments

    International Nuclear Information System (INIS)

    Graphical abstract: - Highlights: • A computational approach is effective for revealing the transformation mechanism of MPB. • MPB degradation was more dependent on [•OH] than on temperature during AOPs. • O2 could enhance MPB degradation, but more harmful products were formed. • The risks of MPB products in natural waters should be considered seriously. • The risks of MPB products can be overlooked in AOPs due to their short half-lives. - Abstract: Hydroxyl radicals (•OH) are strong oxidants that can degrade organic pollutants in advanced oxidation processes (AOPs). The mechanisms, kinetics, and toxicity assessment of the •OH-initiated oxidative degradation of the phenolic preservative methylparaben (MPB) were systematically investigated using a computational approach, as supplementary information to experimental data. Results showed that MPB can be initially attacked by •OH via OH-addition and H-abstraction routes. Among these routes, •OH addition to the C atom at the ortho position of the phenolic hydroxyl group was the most significant; however, the methyl H-abstraction route also cannot be neglected. Further, the transient intermediates formed, the OH-adduct (•MPB-OH1) and the dehydrogenated radical (•MPB(-H)α), could easily be transformed into several stable degradation products in the presence of O2 and •OH. To better understand the potential toxicity of MPB and its products to aquatic organisms, both acute and chronic toxicities were assessed computationally at three trophic levels. Both MPB and its products, particularly the OH-addition products, are harmful to aquatic organisms. Therefore, the application of AOPs to remove MPB should be carefully performed for safe water treatment
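
    The abstract's observation that MPB degradation depends on [•OH] reflects standard pseudo-first-order kinetics: at a steady-state radical concentration the effective rate constant is k' = k_OH[•OH]ss. A sketch with assumed illustrative values (not the paper's computed constants):

```python
import math

# Pseudo-first-order decay of MPB under a steady-state *OH concentration:
#   d[MPB]/dt = -k_OH * [*OH]ss * [MPB]  =>  [MPB](t) = [MPB]0 * exp(-k' t)
# Both numbers below are assumed, order-of-magnitude illustrations.
k_OH = 1.0e10          # M^-1 s^-1, typical magnitude for *OH + aromatics
oh_ss = 1.0e-12        # M, steady-state *OH during an AOP (assumed)
k_prime = k_OH * oh_ss # effective first-order constant, s^-1

half_life = math.log(2) / k_prime
print(half_life)       # seconds; ~69 s for these assumed values
```

    Such short half-lives under AOP conditions are why the highlights note that the risks of MPB products can be overlooked during treatment, while persisting in natural waters where [•OH] is far lower.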

  5. Computationally Inexpensive Approach for Pitch Control of Offshore Wind Turbine on Barge Floating Platform

    Science.gov (United States)

    Zuo, Shan; Song, Y. D.; Wang, Lei; Song, Qing-wang

    2013-01-01

    Offshore floating wind turbines (OFWT) have gained increasing attention during the past decade because of high-quality offshore wind power and the complex load environment. The control system is a tradeoff between power tracking and fatigue load reduction in the above-rated wind speed region. To address the external disturbances and uncertain system parameters of OFWTs due to the proximity to load centers and strong wave coupling, this paper proposes a computationally inexpensive robust adaptive control approach with memory-based compensation for blade pitch control. The method is tested and compared with a baseline controller and a conventional individual blade pitch controller, with the "NREL offshore 5 MW baseline wind turbine" mounted on a barge platform, run in FAST and Matlab/Simulink, operating in the above-rated condition. It is shown that the advanced control approach is not only robust to complex wind and wave disturbances but also adaptive to varying and uncertain system parameters. The simulation results demonstrate that the proposed method performs better in reducing power fluctuations, fatigue loads and platform vibration as compared to conventional individual blade pitch control. PMID:24453834

  6. Computational approaches and metrics required for formulating biologically realistic nanomaterial pharmacokinetic models

    International Nuclear Information System (INIS)

    The field of nanomaterial pharmacokinetics is in its infancy, with major advances largely restricted by a lack of biologically relevant metrics, fundamental differences between particles and small molecules of organic chemicals and drugs relative to biological processes involved in disposition, a scarcity of sufficiently rich and characterized in vivo data and a lack of computational approaches to integrating nanomaterial properties to biological endpoints. A central concept that links nanomaterial properties to biological disposition, in addition to their colloidal properties, is the tendency to form a biocorona which modulates biological interactions including cellular uptake and biodistribution. Pharmacokinetic models must take this crucial process into consideration to accurately predict in vivo disposition, especially when extrapolating from laboratory animals to humans since allometric principles may not be applicable. The dynamics of corona formation, which modulates biological interactions including cellular uptake and biodistribution, is thereby a crucial process involved in the rate and extent of biodisposition. The challenge will be to develop a quantitative metric that characterizes a nanoparticle's surface adsorption forces that are important for predicting biocorona dynamics. These types of integrative quantitative approaches discussed in this paper for the dynamics of corona formation must be developed before realistic engineered nanomaterial risk assessment can be accomplished. (paper)

  7. Computational approaches and metrics required for formulating biologically realistic nanomaterial pharmacokinetic models

    Science.gov (United States)

    Riviere, Jim E.; Scoglio, Caterina; Sahneh, Faryad D.; Monteiro-Riviere, Nancy A.

    2013-01-01

    The field of nanomaterial pharmacokinetics is in its infancy, with major advances largely restricted by a lack of biologically relevant metrics, fundamental differences between particles and small molecules of organic chemicals and drugs relative to biological processes involved in disposition, a scarcity of sufficiently rich and characterized in vivo data and a lack of computational approaches to integrating nanomaterial properties to biological endpoints. A central concept that links nanomaterial properties to biological disposition, in addition to their colloidal properties, is the tendency to form a biocorona which modulates biological interactions including cellular uptake and biodistribution. Pharmacokinetic models must take this crucial process into consideration to accurately predict in vivo disposition, especially when extrapolating from laboratory animals to humans since allometric principles may not be applicable. The dynamics of corona formation, which modulates biological interactions including cellular uptake and biodistribution, is thereby a crucial process involved in the rate and extent of biodisposition. The challenge will be to develop a quantitative metric that characterizes a nanoparticle's surface adsorption forces that are important for predicting biocorona dynamics. These types of integrative quantitative approaches discussed in this paper for the dynamics of corona formation must be developed before realistic engineered nanomaterial risk assessment can be accomplished.

  8. Energy Therapies in Advanced Practice Oncology: An Evidence-Informed Practice Approach

    OpenAIRE

    Potter, Pamela J.

    2013-01-01

    Advanced practitioners in oncology want patients to receive state-of-the-art care and support for their healing process. Evidence-informed practice (EIP), an approach to evaluating evidence for clinical practice, considers the varieties of evidence in the context of patient preference and condition as well as practitioner knowledge and experience. This article offers an EIP approach to energy therapies, namely, Therapeutic Touch (TT), Healing Touch (HT), and Reiki, as supportive interventions...

  9. Sensitivity analysis of scenario models for operational risk Advanced Measurement Approach

    OpenAIRE

    Chaudhary, Dinesh

    2014-01-01

    Scenario Analysis (SA) plays a key role in determination of operational risk capital under Basel II Advanced Measurement Approach. However, operational risk capital based on scenario data may exhibit high sensitivity or wrong-way sensitivity to scenario inputs. In this paper, we first discuss scenario generation using quantile approach and parameter estimation using quantile matching. Then we use single-loss approximation (SLA) to examine sensitivity of scenario based capital to scenario inputs.

  10. Practical Approach to Knowledge-based Question Answering with Natural Language Understanding and Advanced Reasoning

    CERN Document Server

    Wong, Wilson

    2007-01-01

    This research hypothesized that a practical approach in the form of a solution framework known as Natural Language Understanding and Reasoning for Intelligence (NaLURI), which combines full-discourse natural language understanding, powerful representation formalism capable of exploiting ontological information and reasoning approach with advanced features, will solve the following problems without compromising practicality factors: 1) restriction on the nature of question and response, and 2) limitation to scale across domains and to real-life natural language text.

  11. AN ETHICAL ASSESSMENT OF COMPUTER ETHICS USING SCENARIO APPROACH

    Directory of Open Access Journals (Sweden)

    Maslin Masrom

    2010-06-01

    Ethics refers to a set of rules that define right and wrong behavior, used for moral decision making. In this context, computer ethics is one of the major issues in information technology (IT) and information systems (IS). The ethical behaviour of IT students and professionals needs to be studied in an attempt to reduce many unethical practices such as software piracy, hacking, and software intellectual property violations. This paper attempts to address computer-related scenarios that can be used to examine computer ethics. A computer-related scenario consists of a short description of an ethical situation; the subjects of the study, such as IT professionals or students, then rate the ethics of the scenario, namely attempt to identify the ethical issues involved. This paper also reviews several measures of computer ethics in different settings. The perceptions of various dimensions of ethical behaviour in IT that are related to the circumstances of the ethical scenario are also presented.

  12. Advanced computational sensors technology: testing and evaluation in visible, SWIR, and LWIR imaging

    Science.gov (United States)

    Rizk, Charbel G.; Wilson, John P.; Pouliquen, Philippe

    2015-05-01

    The Advanced Computational Sensors Team at the Johns Hopkins University Applied Physics Laboratory and the Johns Hopkins University Department of Electrical and Computer Engineering has been developing advanced readout integrated circuit (ROIC) technology for more than 10 years with a particular focus on the key challenges of dynamic range, sampling rate, system interface and bandwidth, and detector materials or band dependencies. Because the pixel array offers parallel sampling by default, the team successfully demonstrated that adding smarts in the pixel and the chip can increase performance significantly. Each pixel becomes a smart sensor and can operate independently in collecting, processing, and sharing data. In addition, building on the digital circuit revolution, the effective well size can be increased by orders of magnitude within the same pixel pitch over analog designs. This research has yielded an innovative class of a system-on-chip concept: the Flexible Readout and Integration Sensor (FRIS) architecture. All key parameters are programmable and/or can be adjusted dynamically, and this architecture can potentially be sensor and application agnostic. This paper reports on the testing and evaluation of one prototype that can support either detector polarity and includes sample results with visible, short-wavelength infrared (SWIR), and long-wavelength infrared (LWIR) imaging.

  13. Numerical Methods for Stochastic Computations A Spectral Method Approach

    CERN Document Server

    Xiu, Dongbin

    2010-01-01

    The first graduate-level textbook to focus on fundamental aspects of numerical methods for stochastic computations, this book describes the class of numerical methods based on generalized polynomial chaos (gPC). These fast, efficient, and accurate methods are an extension of the classical spectral methods to high-dimensional random spaces. Designed to simulate complex systems subject to random inputs, these methods are widely used in many areas of computer science and engineering. The book introduces polynomial approximation theory and probability theory; describes the basic theory of gPC meth
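
    The gPC construction the book covers can be illustrated with a small numerical sketch (our own illustration, not taken from the book): expand f(Z) = exp(Z) for a standard normal input Z in probabilists' Hermite polynomials He_n, which are orthogonal under the N(0,1) measure, and read the mean and variance of f(Z) directly off the expansion coefficients.

    ```python
    import math
    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss, hermeval

    # Probabilists' Gauss-Hermite nodes/weights (weight exp(-x^2/2); weights sum to sqrt(2*pi))
    nodes, weights = hermegauss(20)
    weights /= np.sqrt(2 * np.pi)          # normalize to the N(0,1) measure

    f = np.exp                             # model output as a function of Z ~ N(0,1)
    N = 8                                  # gPC truncation order

    # Projection: c_n = E[f(Z) He_n(Z)] / n!, since E[He_n(Z)^2] = n!
    coeffs = []
    for n in range(N + 1):
        basis = np.zeros(n + 1)
        basis[n] = 1.0                     # coefficient vector selecting He_n
        c_n = np.dot(weights, f(nodes) * hermeval(nodes, basis)) / math.factorial(n)
        coeffs.append(c_n)

    mean = coeffs[0]                       # E[f(Z)] is the zeroth coefficient
    var = sum(c * c * math.factorial(n) for n, c in enumerate(coeffs) if n > 0)
    print(mean, var)                       # close to exp(0.5) and exp(2) - exp(1)
    ```

    For this choice of f the exact coefficients are c_n = exp(1/2)/n!, so even a low truncation order recovers the mean and variance to high accuracy; this coefficient decay is what makes spectral stochastic methods fast.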

  14. Exploring Advanced Piano Students' Approaches to Sight-Reading

    Science.gov (United States)

    Zhukov, Katie

    2014-01-01

    The ability to read music fluently is fundamental for undergraduate music study, yet the training of sight-reading is often neglected. This study compares approaches to sight-reading and accompanying by students with extensive sight-reading experience to those with limited experience, and evaluates the importance of this skill to advanced pianists…

  15. Advanced light source's approach to ensure conditions for safe top-off operation

    International Nuclear Information System (INIS)

    The purpose of this document is to outline the Advanced Light Source (ALS) approach for preventing a radiation accident scenario on the ALS experimental floor due to top-off operation. The document will describe the potential risks, the analysis, and the resulting specifications for the controls.

  16. A Social Network Approach to Provisioning and Management of Cloud Computing Services for Enterprises

    OpenAIRE

    Kuada, Eric; Olesen, Henning

    2011-01-01

    This paper proposes a social network approach to the provisioning and management of cloud computing services for enterprises, termed Opportunistic Cloud Computing Services (OCCS), and presents the research issues that need to be addressed for its implementation. We hypothesise that OCCS will facilitate the adoption of cloud computing services by enterprises. OCCS deals with the concept of enterprises taking advantage of cloud computing services to meet their business needs without hav...

  17. Simulation of quantum computation: A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  18. Statistical Learning of Phonetic Categories: Insights from a Computational Approach

    Science.gov (United States)

    McMurray, Bob; Aslin, Richard N.; Toscano, Joseph C.

    2009-01-01

    Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model…
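
    The general idea behind such models can be sketched as unsupervised density estimation over a phonetic cue: a mixture of Gaussians fit by expectation-maximization recovers category structure from unlabeled tokens. The sketch below uses hypothetical, synthetic voice-onset-time-like data of our own; the model in the paper differs in its details.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic bimodal cue distribution: tokens from two hypothetical phonetic categories
    data = np.concatenate([rng.normal(0.0, 5.0, 500), rng.normal(40.0, 8.0, 500)])

    # Two-component Gaussian mixture fit by EM from a deliberately poor initialization
    mu = np.array([10.0, 30.0])
    sigma = np.array([10.0, 10.0])
    pi = np.array([0.5, 0.5])
    for _ in range(50):
        # E-step: responsibility of each category for each token (shared constants cancel)
        dens = pi * np.exp(-0.5 * ((data[:, None] - mu) / sigma) ** 2) / sigma
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate category means, spreads, and mixing weights
        nk = resp.sum(axis=0)
        mu = (resp * data[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk)
        pi = nk / len(data)
    print(mu)   # two recovered category means, near 0 and 40
    ```

    With well-separated categories the means converge quickly; the interesting developmental questions arise when the distributions overlap, which is where the sufficiency of pure statistical learning is tested.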

  19. R for Cloud Computing: An Approach for Data Scientists

    CERN Document Server

    Ohri, A

    2014-01-01

    R for Cloud Computing looks at some of the tasks performed by business analysts on the desktop (PC era) and helps the reader navigate the wealth of information in R and its 4,000 packages, as well as transition the same analytics to the cloud. With this information the reader can select both a cloud vendor in the sometimes confusing cloud ecosystem and the R packages that can help process analytical tasks with minimum effort and cost and maximum usefulness and customization. The book emphasizes graphical user interfaces (GUIs) and step-by-step screenshot tutorials, to flatten R's famously steep learning curve and clear up some of the needless confusion in cloud computing that hinders its widespread adoption. It will help you kick-start analytics on the cloud, with chapters on cloud computing, R, common tasks performed in analytics, scrutiny of big data analytics, and setting up and navigating cloud providers. Readers are exposed to a breadth of cloud computing ch...

  20. A genetic and computational approach to structurally classify neuronal types

    OpenAIRE

    Sümbül, Uygar; Song, Sen; McCulloch, Kyle; Becker, Michael; Lin, Bin; Sanes, Joshua R.; Masland, Richard H.; Seung, H. Sebastian

    2014-01-01

    The importance of cell types in understanding brain function is widely appreciated but only a tiny fraction of neuronal diversity has been catalogued. Here, we exploit recent progress in genetic definition of cell types in an objective structural approach to neuronal classification. The approach is based on highly accurate quantification of dendritic arbor position relative to neurites of other cells. We test the method on a population of 363 mouse retinal ganglion cells. For each cell, we de...