WorldWideScience

Sample records for computing challenges progress

  1. Recent progress and modern challenges in applied mathematics, modeling and computational science

    CERN Document Server

    Makarov, Roman; Belair, Jacques

    2017-01-01

    This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science.  The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modelling, and computational sciences, and is ideal for interdisciplinary collaborations.

  2. Chips challenging champions games, computers and artificial intelligence

    CERN Document Server

    Schaeffer, J

    2002-01-01

    One of the earliest dreams of the fledgling field of artificial intelligence (AI) was to build computer programs that could play games as well as or better than the best human players. Despite early optimism in the field, the challenge proved to be surprisingly difficult. However, the 1990s saw amazing progress. Computers are now better than humans in checkers, Othello and Scrabble; are at least as good as the best humans in backgammon and chess; and are rapidly improving at hex, go, poker, and shogi. This book documents the progress made in computers playing games and puzzles. The book is the

  3. Editorial: Modelling and computational challenges in granular materials

    OpenAIRE

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss the current progress and latest advancements in the field of advanced numerical methods and modelling of granular materials. The focus will be on computational methods, improved algorithms and the m...

  4. The challenge of computer mathematics.

    Science.gov (United States)

    Barendregt, Henk; Wiedijk, Freek

    2005-10-15

    Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations are dealing with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. Also there are very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.

  5. Merging Library and Computing Services at Kenyon College: A Progress Report.

    Science.gov (United States)

    Oden, Robert A., Jr.; Temple, Daniel B.; Cottrell, Janet R.; Griggs, Ronald K.; Turney, Glen W.; Wojcik, Frank M.

    2001-01-01

    Describes the evolution and progress toward a uniquely integrated library and computer services unit at Kenyon College. Discusses its focus on constituencies; merging of the divisions; benefits for students, faculty, administrative units, and the institution; meeting challenges; and generalizing from the model. (EV)

  6. Progress in Computational Physics (PiCP) Volume 1 Wave Propagation in Periodic Media

    CERN Document Server

    Ehrhardt, Matthias

    2010-01-01

    Progress in Computational Physics is a new e-book series devoted to recent research trends in computational physics. It contains chapters contributed by outstanding experts of modeling of physical problems. The series focuses on interdisciplinary computational perspectives of current physical challenges, new numerical techniques for the solution of mathematical wave equations and describes certain real-world applications. With the help of powerful computers and sophisticated methods of numerical mathematics it is possible to simulate many ultramodern devices, e.g. photonic crystals structures,

  7. Progress in computational toxicology.

    Science.gov (United States)

    Ekins, Sean

    2014-01-01

    Computational methods have been widely applied to toxicology across pharmaceutical, consumer product and environmental fields over the past decade. Progress in computational toxicology is now reviewed. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition various publications have been highlighted that use machine learning methods. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that have been increasingly used for predictions. It is shown that across many different models Bayesian and SVM perform similarly based on cross validation data. Considerable progress has been made in computational toxicology in a decade in both model development and availability of larger scale or 'big data' models. The future efforts in toxicology data generation will likely provide us with hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications. Copyright © 2013 Elsevier Inc. All rights reserved.
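
    As a minimal, hypothetical sketch of the kind of comparison the review describes (Bayesian versus Support Vector Machine learning under cross-validation), the snippet below evaluates both model families on a stand-in fingerprint dataset; the data, descriptors, and endpoint labels are placeholders, not the review's actual datasets.

```python
# Hypothetical sketch: Bayesian (naive Bayes) vs. SVM classification of a toxicology
# endpoint with cross-validation. The random "fingerprints" and labels below are
# placeholders for real descriptor matrices and DILI/cardiotoxicity annotations.
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 1024))   # stand-in for binary molecular fingerprints
y = rng.integers(0, 2, size=500)           # stand-in for toxic / non-toxic labels

for name, model in [("Bernoulli naive Bayes", BernoulliNB()),
                    ("SVM (RBF kernel)", SVC(kernel="rbf", C=1.0))]:
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean 5-fold ROC AUC = {scores.mean():.3f}")
```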

  8. Development of indigenous irradiator - current progress and challenges

    International Nuclear Information System (INIS)

    Anwar A Rahman; Mohd Arif Hamzah; Muhd Nor Atan; Aznor Hassan; Fadil Ismail; Julia A Karim; Rosli Darmawan

    2009-01-01

    The development of an indigenous irradiator is one of the Prototype Development Center's main projects in support of Nuclear Malaysia services. Three (3) projects have been identified, and all are currently in the final stage of design. Some issues and challenges have been encountered, which delayed the project's progress. This paper discusses the current progress of the development and the challenges faced in designing the irradiator. (Author)

  9. Smart garments in chronic disease management: progress and challenges

    Science.gov (United States)

    Khosla, Ajit

    2012-10-01

    This paper presents the progress made in the development of smart garments for chronic disease management over the last 10 years. A large number of health-monitoring smart garments and wearable sensors have been manufactured to monitor a patient's physiological parameters, such as electrocardiogram, blood pressure, body temperature, heart rate, and oxygen saturation, while the patient is not in hospital. In the last few years, with advances in smartphones and cloud computing, it has become possible to send the measured physiological data to any desired location. However, there are many challenges in the development of smart garment systems; the two major ones are the development of new lightweight power sources and the need for global standardization and a roadmap for smart garment development. In this paper we discuss current state-of-the-art smart garments and wearable sensor systems, as well as the new emerging trends in smart garment research and development.

  10. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    Science.gov (United States)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks at micrometer to tens-of-nanometer resolution and is quickly becoming a new technology for studying the petrophysical properties of materials. An important step is the upscaling of these properties, since imaging at micron or sub-micron resolution can only be performed on samples of millimeter scale or smaller. We present here a recently developed computational workflow for the analysis of microstructures, including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at the micro to nano scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples, biological materials, and food science materials. We have also applied the technique to high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big-data problem of analyzing petrophysical properties and their subsequent upscaling. We discuss the following challenges: 1) characterization of microtomography for extremely large data sets - our current capability; 2) computational fluid dynamics simulations at the pore scale for permeability estimation - methods, computing cost, and accuracy; 3) solid mechanical computations at the pore scale for estimating elasto-plastic properties - computational stability, cost, and efficiency; 4) extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from a current research problem into a robust computational big-data tool for multi-scale scientific and engineering problems.
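
    As an illustration only (not the authors' workflow), the short sketch below shows the most basic characterization step such a pipeline starts from: computing porosity from a segmented micro-CT sub-volume and checking whether any pore cluster spans the sample, a precursor to percolation-based upscaling. The file name and connectivity choice are assumptions.

```python
# Illustrative sketch, not the authors' workflow: porosity and pore connectivity of a
# segmented (binary) micro-CT sub-volume. "segmented_subvolume.npy" is a hypothetical file.
import numpy as np
from scipy import ndimage

volume = np.load("segmented_subvolume.npy")      # binary array: 1 = pore, 0 = grain
porosity = volume.mean()                         # fraction of pore voxels

labels, n_clusters = ndimage.label(volume)       # face-connected pore clusters (default structure)
spanning = (set(np.unique(labels[0])) & set(np.unique(labels[-1]))) - {0}

print(f"porosity = {porosity:.3f}, pore clusters = {n_clusters}, "
      f"percolates along z = {bool(spanning)}")
```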

  11. Gradient models in molecular biophysics: progress, challenges, opportunities

    Science.gov (United States)

    Bardhan, Jaydeep P.

    2013-12-01

    In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g., molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding nonlocal dielectric response. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain, and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost 40 years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The review concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics.
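
    For readers unfamiliar with nonlocal response, one commonly cited simplified form in the Dogonadze-Kornyshev tradition (shown here as an illustration, not as the review's own model) makes the permittivity wavevector-dependent through a single correlation length lambda and recovers the local model as lambda vanishes:

```latex
\varepsilon(k) \;=\; \varepsilon_\infty \;+\; \frac{\varepsilon_s - \varepsilon_\infty}{1 + \lambda^2 k^2},
\qquad \lim_{\lambda \to 0} \varepsilon(k) \;=\; \varepsilon_s .
```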

  12. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  13. Lattice QCD computations: Recent progress with modern Krylov subspace methods

    Energy Technology Data Exchange (ETDEWEB)

    Frommer, A. [Bergische Universitaet GH Wuppertal (Germany)]

    1996-12-31

    Quantum chromodynamics (QCD) is the fundamental theory of the strong interaction of matter. In order to compare the theory with results from experimental physics, the theory has to be reformulated as a discrete problem of lattice gauge theory using stochastic simulations. The computational challenge consists in solving several hundreds of very large linear systems with several right hand sides. A considerable part of the world's supercomputer time is spent in such QCD calculations. This paper presents results on solving systems for the Wilson fermions. Recent progress is reviewed on algorithms obtained in cooperation with partners from theoretical physics.
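
    A toy illustration of the computational pattern described here (one large sparse system solved repeatedly for several right-hand sides with a Krylov method) is sketched below; the tridiagonal matrix is a simple stand-in, not a lattice Dirac operator.

```python
# Toy sketch of "several right-hand sides, one large sparse system" with a Krylov solver.
# The matrix is a simple symmetric stand-in, not a Wilson-fermion Dirac operator.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import bicgstab

n = 100_000
A = sp.diags([-1.0, 4.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
rhs_list = [np.random.rand(n) for _ in range(4)]    # several right-hand sides

for k, b in enumerate(rhs_list):
    x, info = bicgstab(A, b)                        # info == 0 signals convergence
    print(k, info, np.linalg.norm(A @ x - b))
```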

  14. Reference Structures: Stagnation, Progress, and Future Challenges.

    Science.gov (United States)

    Greenberg, Jane

    1997-01-01

    Assesses the current state of reference structures in online public access catalogs (OPACs) in a framework defined by stagnation, progress, and future challenges. Outlines six areas for reference structure development. Twenty figures provide illustrations. (AEF)

  15. Gradient Models in Molecular Biophysics: Progress, Challenges, Opportunities.

    Science.gov (United States)

    Bardhan, Jaydeep P

    2013-12-01

    In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g. molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding features such as nonlocal dielectric response, and nonlinearities resulting from dielectric saturation. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost forty years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The paper concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics.

  16. Gradient Models in Molecular Biophysics: Progress, Challenges, Opportunities

    Science.gov (United States)

    Bardhan, Jaydeep P.

    2014-01-01

    In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g. molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding features such as nonlocal dielectric response, and nonlinearities resulting from dielectric saturation. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost forty years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The paper concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics. PMID:25505358

  17. E-Government in the Asia-Pacific Region: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Clay Wescott

    2005-12-01

    Full Text Available This paper will focus on two issues: (i) recent e-government progress and challenges, and (ii) the practices regional organizations follow to cope with the challenges, while maximizing the benefits. Beginning with an overview of efforts to improve governance in the region, it then analyzes recent progress in the use of information and communication technology (ICT) in the Asia-Pacific region to promote more efficient, cost-effective, and participatory government, facilitate more convenient government services, allow greater public access to information, and make government more accountable to citizens. Successful adoption of e-government presents major challenges. The paper concludes by examining the practices regional organizations follow to cope with the challenges, while maximizing the benefits.

  18. Swallowable Wireless Capsule Endoscopy: Progress and Technical Challenges

    Directory of Open Access Journals (Sweden)

    Guobing Pan

    2012-01-01

    Full Text Available Wireless capsule endoscopy (WCE) offers a feasible, noninvasive way to examine the whole gastrointestinal (GI) tract and has revolutionized diagnosis technology. However, compared with wired endoscopies, the limited working time, low frame rate, and low image resolution limit its wider application. The progress of this new technology is reviewed in this paper, and the evolution tendencies are identified as higher image resolution, higher frame rate, and longer working time. Unfortunately, the power supply of the capsule endoscope (CE) is the bottleneck. Wireless power transmission (WPT) is the promising solution to this problem, but it is also a technical challenge. The active CE is another tendency and will be the next generation of the WCE. Nevertheless, it will not come about soon unless a practical locomotion mechanism for the active CE in the GI tract is achieved. The locomotion mechanism is the other technical challenge, besides that of WPT. Progress on WPT and active capsule technology is reviewed.

  19. Progress and Challenge of Artificial Intelligence

    Institute of Scientific and Technical Information of China (English)

    Zhong-Zhi Shi; Nan-Ning Zheng

    2006-01-01

    Artificial Intelligence (AI) is generally considered to be a subfield of computer science that is concerned with the simulation, extension, and expansion of human intelligence. Artificial intelligence has enjoyed tremendous success over the last fifty years. In this paper we focus only on visual perception, granular computing, agent computing, and the semantic grid. Human-level intelligence is the long-term goal of artificial intelligence. Joint research on the basic theory and technology of intelligence should be pursued across brain science, cognitive science, artificial intelligence, and other fields. A new cross-discipline, intelligence science, is undergoing rapid development. Future challenges are given in the final section.

  20. Orion Absolute Navigation System Progress and Challenge

    Science.gov (United States)

    Holt, Greg N.; D'Souza, Christopher

    2012-01-01

    The absolute navigation design of NASA's Orion vehicle is described. It has undergone several iterations and modifications since its inception, and continues as a work-in-progress. This paper seeks to benchmark the current state of the design and some of the rationale and analysis behind it. There are specific challenges to address when preparing a timely and effective design for the Exploration Flight Test (EFT-1), while still looking ahead and providing software extensibility for future exploration missions. The primary onboard measurements in a Near-Earth or Mid-Earth environment consist of GPS pseudo-range and delta-range, but for future exploration missions the use of star-tracker and optical navigation sources needs to be considered. Discussions are presented for state size and composition, processing techniques, and consider states. A presentation is given for the processing technique using the computationally stable and robust UDU formulation with an Agee-Turner Rank-One update. This allows for computational savings when dealing with many parameters which are modeled as slowly varying Gauss-Markov processes. Preliminary analysis shows up to a 50% reduction in computation versus a more traditional formulation. Several state elements are discussed and evaluated, including position, velocity, attitude, clock bias/drift, and GPS measurement biases in addition to bias, scale factor, misalignment, and non-orthogonalities of the accelerometers and gyroscopes. Another consideration is the initialization of the EKF in various scenarios. Scenarios such as single-event upset, ground command, and cold start are discussed, as are strategies for whole and partial state updates as well as covariance considerations. Strategies are given for dealing with latent measurements and high-rate propagation using a multi-rate architecture. The details of the rate groups and the data flow between the elements are discussed and evaluated.
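
    As a minimal sketch of one ingredient mentioned above (sensor biases modeled as slowly varying first-order Gauss-Markov processes), the snippet below propagates a scalar bias state and its variance; the time constant and steady-state sigma are illustrative values, not Orion parameters.

```python
# Minimal sketch, not Orion flight software: propagation of a scalar first-order
# Gauss-Markov bias state, the model the abstract cites for slowly varying parameters.
import numpy as np

def propagate_gauss_markov(x, p, dt, tau, sigma):
    """One discrete propagation step; sigma is the steady-state standard deviation."""
    phi = np.exp(-dt / tau)               # state transition factor
    q = sigma**2 * (1.0 - phi**2)         # process noise that preserves sigma^2 at steady state
    return phi * x, phi**2 * p + q

x, p = 0.05, 1e-4                         # illustrative bias estimate and variance
for _ in range(10):
    x, p = propagate_gauss_markov(x, p, dt=0.1, tau=3600.0, sigma=0.02)
print(x, p)
```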

  1. Cancer nanomedicine: progress, challenges and opportunities.

    Science.gov (United States)

    Shi, Jinjun; Kantoff, Philip W; Wooster, Richard; Farokhzad, Omid C

    2017-01-01

    The intrinsic limits of conventional cancer therapies prompted the development and application of various nanotechnologies for more effective and safer cancer treatment, herein referred to as cancer nanomedicine. Considerable technological success has been achieved in this field, but the main obstacles to nanomedicine becoming a new paradigm in cancer therapy stem from the complexities and heterogeneity of tumour biology, an incomplete understanding of nano-bio interactions and the challenges regarding chemistry, manufacturing and controls required for clinical translation and commercialization. This Review highlights the progress, challenges and opportunities in cancer nanomedicine and discusses novel engineering approaches that capitalize on our growing understanding of tumour biology and nano-bio interactions to develop more effective nanotherapeutics for cancer patients.

  2. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    Science.gov (United States)

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists are facing the challenge of gaining a profound insight into the deepest biological functions from big biological data. This in turn requires massive computational resources. Therefore, high-performance computing (HPC) platforms are needed, as well as efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study to compare the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
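
    Since the paper's case study is the classical sequence alignment problem, a small reference implementation of the underlying dynamic program (Needleman-Wunsch global alignment scoring) is sketched below; HPC platforms parallelize this same recurrence across cells, sequences, or devices. The scoring values are arbitrary.

```python
# Illustrative reference implementation of Needleman-Wunsch global alignment scoring,
# the dynamic-programming kernel behind classical biological sequence alignment.
import numpy as np

def nw_score(a: str, b: str, match=2, mismatch=-1, gap=-2) -> int:
    H = np.zeros((len(a) + 1, len(b) + 1), dtype=int)
    H[:, 0] = gap * np.arange(len(a) + 1)      # leading gaps in b
    H[0, :] = gap * np.arange(len(b) + 1)      # leading gaps in a
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            diag = H[i - 1, j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i, j] = max(diag, H[i - 1, j] + gap, H[i, j - 1] + gap)
    return int(H[-1, -1])

print(nw_score("GATTACA", "GCATGCU"))          # small illustrative pair
```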

  3. COMPLEX NETWORKS IN CLIMATE SCIENCE: PROGRESS, OPPORTUNITIES AND CHALLENGES

    Data.gov (United States)

    National Aeronautics and Space Administration — Karsten Steinhaeuser, Nitesh V. Chawla, and Auroop R. Ganguly. Abstract: Networks have...

  4. Progress in computer vision.

    Science.gov (United States)

    Jain, A. K.; Dorai, C.

    Computer vision has emerged as a challenging and important area of research, both as an engineering and a scientific discipline. The growing importance of computer vision is evident from the fact that it was identified as one of the "Grand Challenges" and also from its prominent role in the National Information Infrastructure. While the design of a general-purpose vision system continues to be elusive, machine vision systems are being used successfully in specific application domains. Building a practical vision system requires a careful selection of appropriate sensors, extraction and integration of information from available cues in the sensed data, and evaluation of system robustness and performance. The authors discuss and demonstrate advantages of (1) multi-sensor fusion, (2) combination of features and classifiers, (3) integration of visual modules, and (4) admissibility and goal-directed evaluation of vision algorithms. The requirements of several prominent real-world applications such as biometry, document image analysis, image and video database retrieval, and automatic object model construction offer exciting problems and new opportunities to design and evaluate vision algorithms.

  5. Challenges and solutions in enterprise computing

    NARCIS (Netherlands)

    van Sinderen, Marten J.

    2008-01-01

    The emergence of the networked enterprise has a profound effect on enterprise computing. This introduction discusses some important challenges in enterprise computing, which are the result of the mentioned networking trend, and positions the articles of this special issue with respect to these

  6. Challenges and Security in Cloud Computing

    Science.gov (United States)

    Chang, Hyokyung; Choi, Euiin

    People want to solve problems as soon as they arise. The IT technology called ubiquitous computing is meant to make such situations easier to handle, and cloud computing is the technology that makes it even better and more powerful. Cloud computing, however, is only at the beginning of its implementation and use, and it faces many challenges in technical matters and security issues. This paper looks at cloud computing security.

  7. Cloud Computing: Opportunities and Challenges for Businesses

    Directory of Open Access Journals (Sweden)

    İbrahim Halil Seyrek

    2011-12-01

    Full Text Available Cloud computing represents a new approach for supplying and using information technology services. Considering its benefits for firms and the potential changes it may lead to, it is envisioned that cloud computing can be the most important innovation in information technology since the development of the internet. In this study, the development of cloud computing and related technologies is first explained and classified, with current application examples. Then the benefits of this new computing model for businesses are elaborated, especially in terms of cost, flexibility, and service quality. In spite of its benefits, cloud computing also poses some risks for firms, of which security is one of the most important, and there are some challenges in its implementation. This study points out the risks that companies should be wary of and some legal challenges related to cloud computing. Lastly, approaches that companies may take toward cloud computing and different strategies that they may adopt are discussed, and some recommendations are made.

  8. Orion Absolute Navigation System Progress and Challenges

    Science.gov (United States)

    Holt, Greg N.; D'Souza, Christopher

    2011-01-01

    The Orion spacecraft is being designed as NASA's next-generation exploration vehicle for crewed missions beyond Low-Earth Orbit. The navigation system for the Orion spacecraft is being designed in a Multi-Organizational Design Environment (MODE) team including contractor and NASA personnel. The system uses an Extended Kalman Filter to process measurements and determine the state. The design of the navigation system has undergone several iterations and modifications since its inception, and continues as a work-in-progress. This paper seeks to benchmark the current state of the design and some of the rationale and analysis behind it. There are specific challenges to address when preparing a timely and effective design for the Exploration Flight Test (EFT-1), while still looking ahead and providing software extensibility for future exploration missions. The primary measurements in a Near-Earth or Mid-Earth environment consist of GPS pseudorange and deltarange, but for future exploration missions the use of star-tracker and optical navigation sources needs to be considered. Discussions are presented for state size and composition, processing techniques, and consider states. A presentation is given for the processing technique using the computationally stable and robust UDU formulation with an Agee-Turner Rank-One update. This allows for computational savings when dealing with many parameters which are modeled as slowly varying Gauss-Markov processes. Preliminary analysis shows up to a 50% reduction in computation versus a more traditional formulation. Several state elements are discussed and evaluated, including position, velocity, attitude, clock bias/drift, and GPS measurement biases in addition to bias, scale factor, misalignment, and non-orthogonalities of the accelerometers and gyroscopes. Another consideration is the initialization of the EKF in various scenarios. Scenarios such as single-event upset, ground command, pad alignment, and cold start are discussed as are

  9. Workplace Charging Challenge Progress Update 2016: A New Sustainable Commute

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2017-01-31

    In the 2016 Workplace Charging Challenge annual survey, partners shared how their efforts were making an impact in their communities and helped identify best practices for workplace charging. The Workplace Charging Challenge Progress Update highlights the findings from this survey and recognizes leading employers for their workplace charging efforts.

  10. Security and Privacy in Fog Computing: Challenges

    OpenAIRE

    Mukherjee, Mithun; Matam, Rakesh; Shu, Lei; Maglaras, Leandros; Ferrag, Mohamed Amine; Choudhry, Nikumani; Kumar, Vikas

    2017-01-01

    open access article The fog computing paradigm extends the storage, networking, and computing facilities of cloud computing toward the edge of the network, while offloading the cloud data centers and reducing service latency for end users. However, the characteristics of fog computing give rise to new security and privacy challenges. The existing security and privacy measures for cloud computing cannot be directly applied to fog computing due to its features, such as mobility, heteroge...

  11. The computational challenges of Earth-system science.

    Science.gov (United States)

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.
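
    A standard back-of-envelope argument, not taken from the paper itself, illustrates why the spatial-resolution challenge translates directly into demands on computing power: with three-dimensional grid refinement and a CFL-limited time step, the cost grows roughly as the inverse fourth power of the grid spacing,

```latex
\text{cost} \;\propto\; \left(\frac{L}{\Delta x}\right)^{3} \cdot \frac{T}{\Delta t},
\qquad \Delta t \propto \Delta x \;\;\Rightarrow\;\; \text{cost} \;\propto\; \Delta x^{-4},
```

    so halving the grid spacing multiplies the computational cost by roughly a factor of 16.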

  12. Potentials and Challenges of Student Progress Portfolio Innovation ...

    African Journals Online (AJOL)

    This paper aims at stimulating discussion on Students Progress Portfolio (SPP) Innovation in assessment. It analyses the potential and challenges of SPP as well as how it can be harnessed to improve assessment practices and its contribution to quality education. The paper is based on a recent qualitative research which ...

  13. Computer graphics visions and challenges: a European perspective.

    Science.gov (United States)

    Encarnação, José L

    2006-01-01

    I have briefly described important visions and challenges in computer graphics. They are a personal and therefore subjective selection. But most of these issues have to be addressed and solved--no matter if we call them visions or challenges or something else--if we want to make and further develop computer graphics into a key enabling technology for our IT-based society.

  14. Computing challenges of the CMS experiment

    International Nuclear Information System (INIS)

    Krammer, N.; Liko, D.

    2017-01-01

    The success of the LHC experiments is due to the magnificent performance of the detector systems and the excellent operation of the computing systems. The CMS offline software and computing system is successfully fulfilling the LHC Run 2 requirements. For the increased data rates of future LHC operation, together with high pile-up interactions, improvements in the usage of the current computing facilities and new technologies have become necessary. Especially for the challenge of the future HL-LHC, a more flexible and sophisticated computing model is needed. In this presentation, I will discuss the current computing system used in LHC Run 2 and future computing facilities for the HL-LHC runs using flexible computing technologies such as commercial and academic computing clouds. These cloud resources are highly virtualized and can be deployed for a variety of computing tasks, providing the capacity for the increasing needs of large-scale scientific computing.

  15. Editorial: Modelling and computational challenges in granular materials

    NARCIS (Netherlands)

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss

  16. Homogeneous Buchberger algorithms and Sullivant's computational commutative algebra challenge

    DEFF Research Database (Denmark)

    Lauritzen, Niels

    2005-01-01

    We give a variant of the homogeneous Buchberger algorithm for positively graded lattice ideals. Using this algorithm we solve the Sullivant computational commutative algebra challenge.
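
    For readers who want to see what such a computation looks like in practice, the toy example below computes a Groebner basis of a classic lattice (toric) ideal with a general-purpose Buchberger implementation in SymPy; it is illustrative only and is neither the specialized homogeneous variant nor the Sullivant challenge ideal itself.

```python
# Illustrative only: Groebner basis of the binomial ideal of the twisted cubic,
# a classic small lattice ideal, using SymPy's general-purpose Buchberger routine.
from sympy import symbols, groebner

x, y, z, w = symbols("x y z w")
gens = [x*z - y**2, y*w - z**2, x*w - y*z]
print(groebner(gens, x, y, z, w, order="grevlex"))
```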

  17. Progress and challenges in implementing the women, peace and ...

    African Journals Online (AJOL)

    This article provides an initial overview of the African Union's progress and challenges in implementing the Women, Peace and Security (WPS) agenda in its peace and security architecture. It reviews implementation in relation to representation, programming and in peacekeeping. The article contends that the WPS agenda ...

  18. Therapeutic strategies to fight HIV-1 latency: progress and challenges

    CSIR Research Space (South Africa)

    Manoto, Sello L

    2017-10-01

    Full Text Available Sello Lebohang Manoto, Lebogang Thobakgale, Rudzani Malabi, Charles Maphanga, Saturnin Ombinda-Lemboumba, Patience Mthunzi-Kufa. Abstract: The life...

  19. Advances and Challenges in Computational Plasma Science

    International Nuclear Information System (INIS)

    Tang, W.M.; Chan, V.S.

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology

  20. CLOUD COMPUTING OVERVIEW AND CHALLENGES: A REVIEW PAPER

    OpenAIRE

    Satish Kumar*, Vishal Thakur, Payal Thakur, Ashok Kumar Kashyap

    2017-01-01

    The cloud computing era is the most resourceful, elastic, and scalable period yet for internet technology, enabling computing resources to be used over the internet successfully. Cloud computing provides not only speed, accuracy, storage capacity, and efficiency for computing, but has also helped propagate green computing and better resource utilization. In this research paper, a brief description of cloud computing, cloud services, and cloud security challenges is given. Also the literature review o...

  1. Photons, photosynthesis, and high-performance computing: challenges, progress, and promise of modeling metabolism in green algae

    International Nuclear Information System (INIS)

    Chang, C H; Graf, P; Alber, D M; Kim, K; Murray, G; Posewitz, M; Seibert, M

    2008-01-01

    The complexity associated with biological metabolism considered at a kinetic level presents a challenge to quantitative modeling. In particular, the relatively sparse knowledge of parameters for enzymes with known kinetic responses is problematic. The possible space of these parameters is high-dimensional, and sampling of such a space typifies a combinatorial explosion of possible dynamic states. However, with sufficient quantitative transcriptomics, proteomics, and metabolomics data at hand, these challenges could be met by high-performance software with sampling, fitting, and optimization capabilities. With this in mind, we present the High-Performance Systems Biology Toolkit HiPer SBTK, an evolving software package to simulate, fit, and optimize metabolite concentrations and fluxes within the space of rate and binding parameters associated with detailed enzyme kinetic models. We present our chosen modeling paradigm for the formulation of metabolic pathway models, the means to address the challenge of representing such models in a precise and persistent fashion using the standardized Systems Biology Markup Language, and our second-generation model of H2-associated Chlamydomonas metabolism. Processing of such models for hierarchically parallelized simulation and optimization, job specification by the user through a GUI interface, software capabilities and initial scaling data, and the mapping of the computation to biological questions are also discussed. Moreover, we present near-term future software and model development goals.
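
    As a hedged, minimal sketch (not the HiPer SBTK toolkit itself), the snippet below integrates a single Michaelis-Menten rate law, the elementary building block of the detailed enzyme kinetic pathway models described above; the rate parameters are arbitrary placeholders.

```python
# Minimal sketch of an enzyme-kinetic building block: one Michaelis-Menten reaction
# integrated as an ODE. Parameter values are arbitrary placeholders.
import numpy as np
from scipy.integrate import solve_ivp

def michaelis_menten(t, y, vmax=1.0, km=0.5):
    s, p = y                                  # substrate and product concentrations
    rate = vmax * s / (km + s)
    return [-rate, rate]

sol = solve_ivp(michaelis_menten, t_span=(0.0, 20.0), y0=[2.0, 0.0],
                t_eval=np.linspace(0.0, 20.0, 50))
print(sol.y[:, -1])                           # final substrate and product concentrations
```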

  2. Cloud computing challenges, limitations and R&D solutions

    CERN Document Server

    Mahmood, Zaigham

    2014-01-01

    This important text/reference reviews the challenging issues that present barriers to greater implementation of the cloud computing paradigm, together with the latest research into developing potential solutions. Exploring the strengths and vulnerabilities of cloud provision and cloud environments, Cloud Computing: Challenges, Limitations and R&D Solutions provides case studies from a diverse selection of researchers and practitioners of international repute. The implications of emerging cloud technologies are also analyzed from the perspective of consumers. Topics and features: presents

  3. Progress and challenges of carbon nanotube membrane in water treatment

    KAUST Repository

    Lee, Jieun; Jeong, Sanghyun; Liu, Zongwen

    2016-01-01

    review of the progress of CNT membranes addressing the current epidemic—whether (i) the CNT membranes could tackle current challenges in the pressure- or thermally driven membrane processes and (ii) CNT hybrid nanocomposite as a new generation

  4. Progress and Challenges in Implementing the Women, Peace and ...

    African Journals Online (AJOL)

    This article provides an initial overview of the African Union's progress and challenges ... peace initiatives to protect women and girls from gender-based violence. (GBV); to ... bilateral aid on gender equality to fragile states has quadrupled (UN. Women ..... (support of school supplies for ten children borne as a result of rape.

  5. Advances and challenges in computational plasma science

    International Nuclear Information System (INIS)

    Tang, W M; Chan, V S

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This

  6. Computational challenges in modeling gene regulatory events.

    Science.gov (United States)

    Pataskar, Abhijeet; Tiwari, Vijay K

    2016-10-19

    Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.

  7. Novel spintronics devices for memory and logic: prospects and challenges for room temperature all spin computing

    Science.gov (United States)

    Wang, Jian-Ping

    An energy-efficient memory and logic device for the post-CMOS era has been the goal of a variety of research fields. The limits of scaling, which we expect to reach by the year 2025, demand that future advances in computational power will not be realized from ever-shrinking device sizes, but rather by innovative designs and new materials and physics. Magnetoresistive devices have been promising candidates for future integrated magnetic computation because of their unique non-volatility and functionalities. The application of perpendicular magnetic anisotropy for potential STT-RAM applications was demonstrated and has since been intensively investigated by both academia and industry, but there is no clear pathway for how scaling will eventually work for both memory and logic applications. One of the main reasons is that no material stack candidate has been demonstrated that could lead to a scaling scheme down to sub-10 nm. Another challenge for the use of magnetoresistive devices for logic applications is the available switching speed and writing energy. Although good progress has been made in demonstrating fast switching of a thermally stable magnetic tunnel junction (MTJ) down to 165 ps, it is still several times slower than its CMOS counterpart. In this talk, I will review the recent progress by my research group and my C-SPIN colleagues, then discuss the opportunities, challenges and some potential pathways for magnetoresistive devices for memory and logic applications and their integration into a room-temperature all-spin computing system.

  8. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  9. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    International Nuclear Information System (INIS)

    Tang, William M.

    2011-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP), a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  10. Scientific and computational challenges of the fusion simulation program (FSP)

    International Nuclear Information System (INIS)

    Tang, William M.

    2011-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) - a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  11. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    Tang, W M

    2008-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics

  12. Progressing neurobiological strategies against proteostasis failure: Challenges in neurodegeneration.

    Science.gov (United States)

    Amanullah, Ayeman; Upadhyay, Arun; Joshi, Vibhuti; Mishra, Ribhav; Jana, Nihar Ranjan; Mishra, Amit

    2017-12-01

    Proteins are ordered, functional cellular entities required for normal health and for an organism's survival. The proteome is the complete set of proteins expressed by a cell, and it regulates a wide range of physiological functions linked with all domains of life. In aging cells or under unfavorable cellular conditions, misfolding of proteins generates common pathological events linked with neurodegenerative diseases and aging. Current advances in proteome studies are systematically improving our knowledge of how the misfolding of proteins, or their accumulation, can contribute to the impairment or depletion of proteome functions. Still, the underlying causes of this unrecoverable loss are not clear, nor is it understood how such unresolved transitions give rise to the multifactorial and challenging degenerative pathological conditions of neurodegeneration. In this review, we specifically focus on and systematically summarize various molecular mechanisms of proteostasis maintenance, and we discuss progressing neurobiological strategies and promising natural and pharmacological candidates that may be useful in counteracting the problem of proteopathies. Our article emphasizes the urgent need to understand the fundamentals of proteostasis in order to design a new molecular framework and fruitful strategies for uncovering how proteome defects are associated with aging and neurodegenerative diseases. An enhanced understanding of the links between the proteome and neurobiological challenges may provide new basic concepts in the near future, based on pharmacological agents, for addressing impaired proteostasis and neurodegenerative diseases. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. The ATLAS computing challenge for HL-LHC

    CERN Document Server

    Campana, Simone; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment successfully commissioned a software and computing infrastructure to support the physics program during LHC Run 2. The next phases of the accelerator upgrade will present new challenges in the offline area. In particular, at the High Luminosity LHC (also known as Run 4) the data-taking conditions will be very demanding in terms of computing resources: between 5 and 10 kHz of event rate from the HLT to be reconstructed (and possibly further reprocessed) with an average pile-up of up to 200 events per collision, and an equivalent number of simulated samples to be produced. The same parameters for the current run are lower by up to an order of magnitude. While processing and storage resources would need to scale accordingly, the funding situation allows one at best to consider a flat budget over the next few years for offline computing needs. In this paper we present a study quantifying the challenge in terms of computing resources for HL-LHC and present ideas about the possible evolution of the ...
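
    A rough back-of-envelope sketch of the scaling argument made above; all numbers below (Run 2 rate and pile-up, years until HL-LHC, yearly hardware gain at flat cost) are illustrative assumptions and not estimates from the paper.

```python
# Back-of-envelope sketch: naive scaling of reconstruction CPU need from Run 2 to
# HL-LHC versus the capacity a flat budget can buy if hardware improves ~20% per year.
# All numbers are illustrative assumptions, not the paper's estimates.
run2_rate_khz, hllhc_rate_khz = 1.0, 7.5   # HLT output rate (assumed Run 2 value, HL-LHC mid-range)
run2_pileup, hllhc_pileup = 30, 200        # average pile-up (assumed)
years = 10                                 # assumed time until HL-LHC data taking
tech_gain_per_year = 1.20                  # assumed capacity gain per year at flat cost

# Assume reconstruction cost grows roughly linearly with both rate and pile-up.
naive_cpu_factor = (hllhc_rate_khz / run2_rate_khz) * (hllhc_pileup / run2_pileup)
flat_budget_factor = tech_gain_per_year ** years

print(f"naive CPU need:                  x{naive_cpu_factor:.0f}")
print(f"flat-budget capacity:            x{flat_budget_factor:.1f}")
print(f"gap to close by software/models: x{naive_cpu_factor / flat_budget_factor:.1f}")
```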

  14. Multimodal Challenge: Analytics Beyond User-computer Interaction Data

    NARCIS (Netherlands)

    Di Mitri, Daniele; Schneider, Jan; Specht, Marcus; Drachsler, Hendrik

    2018-01-01

    This contribution describes one of the challenges explored in the Fourth LAK Hackathon. This challenge aims at shifting the focus from learning situations that can be easily traced through user-computer interaction data and concentrating more on user-world interaction events, typical of co-located and

  15. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    Science.gov (United States)

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  16. High-End Computing Challenges in Aerospace Design and Engineering

    Science.gov (United States)

    Bailey, F. Ronald

    2004-01-01

    High-End Computing (HEC) has had a significant impact on aerospace design and engineering and is poised to have even more in the future. In this paper we describe four aerospace design and engineering challenges: Digital Flight, Launch Simulation, Rocket Fuel System and Digital Astronaut. The paper discusses the modeling capabilities needed for each challenge and presents projections of future near- and far-term HEC computing requirements. NASA's HEC Project Columbia is described, and programming strategies necessary to achieve high real performance are presented.

  17. Female challenges in acquiring computer education at the federal ...

    African Journals Online (AJOL)

    Computer education and the application of computer skills in the knowledge-based society are ever increasing. It is in recognition of this that this study determined the challenges facing female students in the acquisition of computer education, using the Federal Polytechnic, Idah as a case study. The data were obtained from 72 female ...

  18. Nucleosynthesis in the early Galaxy: Progress and challenges.

    Science.gov (United States)

    Montes, Fernando

    2015-10-01

    Chemical imprints left by the first stars in the oldest stars of the Milky Way give clues about the stellar nucleosynthesis responsible for the creation of elements heavier than iron. Recent progress in astronomical observations and in the modeling of the chemical evolution of the Galaxy has shown that multiple nucleosynthesis processes may operate at those early times. In this talk I will review some of that evidence along with the important role that nuclear reactions play in those processes. I will focus on progress in our understanding of the rapid neutron capture process (r-process) and on new results on nucleosynthesis in core-collapse supernovae and neutrino-driven winds that produce elements up to silver. I will show some examples of recent nuclear physics measurements addressing the need for better nuclear data and give an outlook on the remaining challenges and future plans to continue those measurements.

  19. Achieving efficient RNAi therapy: progress and challenges

    Directory of Open Access Journals (Sweden)

    Kun Gao

    2013-07-01

    Full Text Available RNA interference (RNAi) has been harnessed to produce a new class of drugs for the treatment of various diseases. This review summarizes the most important parameters that govern the silencing efficiency and duration of the RNAi effect, such as small interfering RNA (siRNA) stability and modification, the type of delivery system and particle sizing methods. It also discusses the predominant barriers to siRNA delivery, such as off-target effects, and introduces internalization, endosomal escape and mathematical modeling in RNAi therapy, as well as combinatorial RNAi. At present, effective delivery of RNAi therapeutics in vivo remains a challenge although significant progress has been made in this field.

  20. Progress and challenges in the development and qualification of multi-level multi-physics coupled methodologies for reactor analysis

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.

    2007-01-01

    Current trends in nuclear power generation and regulation, as well as the design of next-generation reactor concepts, along with the continuing progress in computer technology, stimulate the development, qualification and application of multi-physics multi-scale coupled code systems. The efforts have been focused on extending the analysis capabilities by coupling models which simulate different phenomena or system components, as well as on refining the scale and level of detail of the coupling. This paper reviews the progress made in this area and outlines the remaining challenges. The discussion is illustrated with examples based on neutronics/thermohydraulics coupling in reactor core modeling. In both fields recent advances and developments are towards more physics-based high-fidelity simulations, which require implementation of improved and flexible coupling methodologies. First, the progress in coupling of different physics codes along with the advances in multi-level techniques for coupled code simulations is discussed. Second, the issues related to the consistent qualification of coupled multi-physics and multi-scale code systems for design and safety evaluation are presented. The increased importance of uncertainty and sensitivity analysis is discussed along with approaches to propagate the uncertainty quantification between the codes. The upcoming OECD LWR Uncertainty Analysis in Modeling (UAM) benchmark is the first international activity to address this issue and it is described in the paper. Finally, the remaining challenges with multi-physics coupling are outlined. (authors)

  1. Progress and challenges in the development and qualification of multi-level multi-physics coupled methodologies for reactor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, K.; Avramova, M. [Pennsylvania State Univ., University Park, PA (United States)

    2007-07-01

    Current trends in nuclear power generation and regulation, as well as the design of next-generation reactor concepts, along with the continuing progress in computer technology, stimulate the development, qualification and application of multi-physics multi-scale coupled code systems. The efforts have been focused on extending the analysis capabilities by coupling models which simulate different phenomena or system components, as well as on refining the scale and level of detail of the coupling. This paper reviews the progress made in this area and outlines the remaining challenges. The discussion is illustrated with examples based on neutronics/thermohydraulics coupling in reactor core modeling. In both fields recent advances and developments are towards more physics-based high-fidelity simulations, which require implementation of improved and flexible coupling methodologies. First, the progress in coupling of different physics codes along with the advances in multi-level techniques for coupled code simulations is discussed. Second, the issues related to the consistent qualification of coupled multi-physics and multi-scale code systems for design and safety evaluation are presented. The increased importance of uncertainty and sensitivity analysis is discussed along with approaches to propagate the uncertainty quantification between the codes. The upcoming OECD LWR Uncertainty Analysis in Modeling (UAM) benchmark is the first international activity to address this issue and it is described in the paper. Finally, the remaining challenges with multi-physics coupling are outlined. (authors)

  2. New challenges in computational collective intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ngoc Thanh; Katarzyniak, Radoslaw Piotr [Wroclaw Univ. of Technology (Poland). Inst. of Informatics; Janiak, Adam (eds.) [Wroclaw Univ. of Technology (Poland). Inst. of Computer Engineering, Control and Robotics

    2009-07-01

    The book consists of 29 chapters which have been selected and invited from the submissions to the 1st International Conference on Collective Intelligence - Semantic Web, Social Networks and Multiagent Systems (ICCCI 2009). All chapters in the book discuss various examples of applications of computational collective intelligence and related technologies to such fields as the semantic web, information systems ontologies, social networks, and agent and multiagent systems. The editors hope that the book can be useful for graduate and Ph.D. students in Computer Science, in particular participants in courses on Soft Computing, Multi-Agent Systems and Robotics. This book can also be useful for researchers working on the concept of computational collective intelligence in artificial populations. It is the hope of the editors that readers of this volume can find many inspiring ideas and use them to create new cases of intelligent collectives. Many such challenges are suggested by the particular approaches and models presented in the chapters of this book. (orig.)

  3. Peptide Vaccine: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Weidang Li

    2014-07-01

    Full Text Available Conventional vaccine strategies have been highly efficacious for several decades in reducing mortality and morbidity due to infectious diseases. The bane of conventional vaccines, such as those that include whole organisms or large proteins, appears to be the inclusion of unnecessary antigenic load that not only contributes little to the protective immune response, but also complicates the situation by inducing allergenic and/or reactogenic responses. Peptide vaccines are an attractive alternative strategy that relies on the usage of short peptide fragments to engineer the induction of highly targeted immune responses, consequently avoiding allergenic and/or reactogenic sequences. Conversely, peptide vaccines used in isolation are often weakly immunogenic and require particulate carriers for delivery and adjuvanting. In this article, we discuss the specific advantages and considerations in the targeted induction of immune responses by peptide vaccines and progress in the development of such vaccines against various diseases. Additionally, we also discuss the development of particulate carrier strategies and the inherent challenges with regard to safety when combining such technologies with peptide vaccines.

  4. Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Johnson, Gary M.; Washington, Warren M.

    2009-07-02

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER), in partnership with the Office of Advanced Scientific Computing Research (ASCR), held a workshop on the challenges in climate change science and the role of computing at the extreme scale, November 6-7, 2008, in Bethesda, Maryland. At the workshop, participants identified the scientific challenges facing the field of climate science and outlined the research directions of highest priority that should be pursued to meet these challenges. Representatives from the national and international climate change research community as well as representatives from the high-performance computing community attended the workshop. This group represented a broad mix of expertise. Of the 99 participants, 6 were from international institutions. Before the workshop, each of the four panels prepared a white paper, which provided the starting place for the workshop discussions. These four panels of workshop attendees devoted their efforts to the following themes: Model Development and Integrated Assessment; Algorithms and Computational Environment; Decadal Predictability and Prediction; Data, Visualization, and Computing Productivity. The recommendations of the panels are summarized in the body of this report.

  5. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  6. Symbol manipulation by computer applied to plasma physics. Technical progress report 2

    International Nuclear Information System (INIS)

    Rosen, B.

    1977-09-01

    Progress has been made in automating the analytical calculation of parametric processes by computer. The computations are performed automatically to lowest order, quickly and efficiently. Work has started on a method for solving the nonlinear differential equations describing the interacting modes.
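
    The report's own method is symbolic, but the nonlinear equations it refers to are of the standard coupled-mode type; the sketch below, which is an illustration and not the report's algorithm, numerically integrates the textbook three-wave parametric-decay equations (the coupling constant K and the initial amplitudes are arbitrary assumptions).

```python
# Illustrative numerical integration of the standard three-wave coupled-mode equations
# for parametric decay: a pump wave a3 feeds two daughter waves a1 and a2.
# K and the initial amplitudes are arbitrary assumptions, not values from the report.
import numpy as np
from scipy.integrate import solve_ivp

K = 1.0  # assumed coupling coefficient

def three_wave(t, a):
    a1, a2, a3 = a
    return [K * a2 * a3,    # growth of daughter wave 1
            K * a1 * a3,    # growth of daughter wave 2
            -K * a1 * a2]   # depletion of the pump

sol = solve_ivp(three_wave, (0.0, 10.0), [1e-3, 1e-3, 1.0], dense_output=True)

for t in np.linspace(0.0, 10.0, 5):
    a1, a2, a3 = sol.sol(t)
    print(f"t={t:4.1f}  a1={a1:.3f}  a2={a2:.3f}  a3={a3:.3f}")
```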

  7. Computational Psychiatry and the Challenge of Schizophrenia

    Science.gov (United States)

    Murray, John D.; Chekroud, Adam M.; Corlett, Philip R.; Yang, Genevieve; Wang, Xiao-Jing; Anticevic, Alan

    2017-01-01

    Schizophrenia research is plagued by enormous challenges in integrating and analyzing large datasets and difficulties developing formal theories related to the etiology, pathophysiology, and treatment of this disorder. Computational psychiatry provides a path to enhance analyses of these large and complex datasets and to promote the development and refinement of formal models for features of this disorder. This presentation introduces the reader to the notion of computational psychiatry and describes discovery-oriented and theory-driven applications to schizophrenia involving machine learning, reinforcement learning theory, and biophysically-informed neural circuit models. PMID:28338845
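
    As a concrete illustration of the theory-driven side mentioned above, the sketch below implements a generic delta-rule (prediction-error) learner of the kind used in reinforcement-learning models of behaviour; the reward schedule and learning rate are assumptions for illustration and are not taken from the cited work.

```python
# Minimal delta-rule learner: the expected value of an option is updated by a
# reward prediction error scaled by a learning rate alpha. In computational
# psychiatry, alpha fitted to behaviour can be compared between groups.
# The 70% reward schedule and alpha=0.3 are illustrative assumptions.
import random

def run_learner(rewards, alpha):
    v = 0.0
    trajectory = []
    for r in rewards:
        delta = r - v        # reward prediction error
        v += alpha * delta   # delta-rule update
        trajectory.append(v)
    return trajectory

random.seed(0)
rewards = [1 if random.random() < 0.7 else 0 for _ in range(20)]
print(f"learned value after 20 trials: {run_learner(rewards, alpha=0.3)[-1]:.2f}")
```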

  8. The Abbott Districts in 2005-06: Progress and Challenges, Spring 2006

    Science.gov (United States)

    Hirsch, Lesley

    2006-01-01

    New Jersey's urban--or "Abbott"--schools have improved at the preschool and elementary school level, but lag when it comes to middle and high school performance. These are the key findings of an Abbott Indicators Project report entitled, "The Abbott Districts in 2005-06: Progress and Challenges." The report was prepared by…

  9. Progress and challenges of disaster health management in China: a scoping review.

    Science.gov (United States)

    Zhong, Shuang; Clark, Michele; Hou, Xiang-Yu; Zang, Yuli; FitzGerald, Gerard

    2014-01-01

    Despite the importance of an effective health system response to various disasters, relevant research is still in its infancy, especially in middle- and low-income countries. This paper provides an overview of the status of disaster health management in China, with its aim to promote the effectiveness of the health response for reducing disaster-related mortality and morbidity. A scoping review method was used to address the recent progress of and challenges to disaster health management in China. Major health electronic databases were searched to identify English and Chinese literature that were relevant to the research aims. The review found that since 2003 considerable progress has been achieved in the health disaster response system in China. However, there remain challenges that hinder effective health disaster responses, including low standards of disaster-resistant infrastructure safety, the lack of specific disaster plans, poor emergency coordination between hospitals, lack of portable diagnostic equipment and underdeveloped triage skills, surge capacity, and psychological interventions. Additional challenges include the fragmentation of the emergency health service system, a lack of specific legislation for emergencies, disparities in the distribution of funding, and inadequate cost-effective considerations for disaster rescue. One solution identified to address these challenges appears to be through corresponding policy strategies at multiple levels (e.g. community, hospital, and healthcare system level).

  10. Impedimetric biosensors for medical applications current progress and challenges

    CERN Document Server

    Rushworth, Jo V; Goode, Jack A; Pike, Douglas J; Ahmed, Asif; Millner, Paul

    2014-01-01

    In this monograph, the authors discuss the current progress in the medical application of impedimetric biosensors, along with the key challenges in the field. First, a general overview of biosensor development, structure and function is presented, followed by a detailed discussion of impedimetric biosensors and the principles of electrochemical impedance spectroscopy. Next, the current state-of-the art in terms of the science and technology underpinning impedance-based biosensors is reviewed in detail. The layer-by-layer construction of impedimetric sensors is described, including the design of electrodes, their nano-modification, transducer surface functionalization and the attachment of different bioreceptors. The current challenges of translating lab-based biosensor platforms into commercially-available devices that function with real patient samples at the POC are presented; this includes a consideration of systems integration, microfluidics and biosensor regeneration. The final section of this monograph ...

  11. 3rd International Symposium on Big Data and Cloud Computing Challenges

    CERN Document Server

    Neelanarayanan, V

    2016-01-01

    This proceedings volume contains selected papers that were presented at the 3rd International Symposium on Big Data and Cloud Computing Challenges, 2016, held at VIT University, India, on March 10 and 11. New research issues, challenges and opportunities shaping the future agenda in the field of Big Data and Cloud Computing are identified and presented throughout the book, which is intended for researchers, scholars, students, software developers and practitioners working at the forefront of their field. This book acts as a platform for exchanging ideas, setting questions for discussion, and sharing experience in the Big Data and Cloud Computing domain.

  12. Emerging nanomedicine applications and manufacturing: progress and challenges.

    Science.gov (United States)

    Sartain, Felicity; Greco, Francesca; Hill, Kathryn; Rannard, Steve; Owen, Andrew

    2016-03-01

    APS 6th International PharmSci Conference 2015, 7-9 September 2015, East Midlands Conference Centre, University of Nottingham, Nottingham, UK. As part of the 6th APS International PharmSci Conference, a nanomedicine session was organised to address challenges and share experiences in this field. Topics ranged from reports on the latest results and advances in the development of targeted therapeutics to the needs the community faces in progressing these exciting proof-of-concept results into products. Here we provide an overview of the discussion and highlight some of the initiatives that have recently been established to support the translation of nanomedicines into the clinic.

  13. A heterogeneous computing environment to solve the 768-bit RSA challenge

    OpenAIRE

    Kleinjung, Thorsten; Bos, Joppe Willem; Lenstra, Arjen K.; Osvik, Dag Arne; Aoki, Kazumaro; Contini, Scott; Franke, Jens; Thomé, Emmanuel; Jermini, Pascal; Thiémard, Michela; Leyland, Paul; Montgomery, Peter L.; Timofeev, Andrey; Stockinger, Heinz

    2010-01-01

    In December 2009 the 768-bit, 232-digit number RSA-768 was factored using the number field sieve. Overall, the computational challenge would take more than 1700 years on a single, standard core. In the article we present the heterogeneous computing approach, involving different compute clusters and Grid computing environments, used to solve this problem.
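
    The figure of more than 1700 single-core years explains why a heterogeneous, multi-cluster approach was needed; the simple arithmetic below (pool sizes are assumptions, and parallelization overheads are ignored) translates that figure into wall-clock time for pools of different sizes.

```python
# Translate ~1700 single-core years of work into wall-clock time for different pool
# sizes, ignoring parallelization overheads. Pool sizes are illustrative assumptions.
core_years = 1700
for cores in (100, 1000, 10000):
    years = core_years / cores
    print(f"{cores:6d} cores -> about {years:6.2f} years ({years * 365:7.0f} days)")
```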

  14. Laser ignited engines: progress, challenges and prospects.

    Science.gov (United States)

    Dearden, Geoff; Shenton, Tom

    2013-11-04

    Laser ignition (LI) has been shown to offer many potential benefits compared to spark ignition (SI) for improving the performance of internal combustion (IC) engines. This paper outlines progress made in recent research on laser-ignited IC engines, discusses the potential advantages and control opportunities, and considers the challenges faced and the prospects for its future implementation. An experimental research effort has been underway at the University of Liverpool (UoL) to extend the stratified speed/load operating region of the gasoline direct injection (GDI) engine through LI research, for which an overview of some of the approaches, testing and results to date is presented. These indicate how LI can be used to improve control of the engine for leaner operation, reductions in emissions, lower idle speed and improved combustion stability.

  15. U.S. Department of Energy Workplace Charging Challenge - Progress Update 2016: A New Sustainable Commute

    Energy Technology Data Exchange (ETDEWEB)

    2017-01-01

    In June 2016, the Workplace Charging Challenge distributed its third annual survey to 295 partners with the goal of tracking partners' progress and identifying trends in workplace charging. This document summarizes findings from the survey and highlights accomplishments of the EV Everywhere Workplace Charging Challenge.

  16. Protein Biomarkers for Early Detection of Pancreatic Ductal Adenocarcinoma: Progress and Challenges.

    Science.gov (United States)

    Root, Alex; Allen, Peter; Tempst, Paul; Yu, Kenneth

    2018-03-07

    Approximately 75% of patients with pancreatic ductal adenocarcinoma are diagnosed with advanced cancer, which cannot be safely resected. The most commonly used biomarker, CA19-9, has inadequate sensitivity and specificity for early detection, which we define as Stage I/II cancers. Therefore, progress in next-generation biomarkers is greatly needed. Recent reports have validated a number of biomarkers, including combination assays of proteins and DNA mutations; however, the history of translating promising biomarkers to clinical utility suggests that several major hurdles require careful consideration by the medical community. The first set of challenges involves nominating and verifying biomarkers. Candidate biomarkers need to discriminate disease from benign controls with high sensitivity and specificity for an intended use, which we describe as a two-tiered strategy of identifying and screening high-risk patients. Community-wide efforts to share samples, data, and analysis methods have been beneficial, and progress in meeting this challenge has been achieved. The second set of challenges is assay optimization and validating biomarkers. After initial candidate validation, assays need to be refined into accurate, cost-effective, highly reproducible, and multiplexed targeted panels and then validated in large cohorts. To move the most promising candidates forward, ideally, biomarker panels, head-to-head comparisons, meta-analysis, and assessment in independent data sets might mitigate the risk of failure. Much more investment is needed to overcome these challenges. The third challenge is achieving clinical translation. To moonshot an early detection test to the clinic requires a large clinical trial and organizational, regulatory, and entrepreneurial know-how. Additional factors, such as imaging technologies, will likely need to improve concomitant with molecular biomarker development. The magnitude of the clinical translational challenge is uncertain, but interdisciplinary
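
    The emphasis on sensitivity and specificity for a two-tiered, high-risk screening strategy can be made concrete with a small positive-predictive-value calculation; the sensitivity, specificity and prevalence values below are illustrative assumptions, not figures from the article.

```python
# Positive predictive value of a screening biomarker at different disease prevalences,
# illustrating why screening is restricted to high-risk groups. All numbers are
# illustrative assumptions, not values from the article.
def ppv(sensitivity, specificity, prevalence):
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

print(f"general population (prevalence 0.01%): PPV = {ppv(0.90, 0.95, 0.0001):.2%}")
print(f"high-risk cohort   (prevalence 1%):    PPV = {ppv(0.90, 0.95, 0.01):.2%}")
```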

  17. Protein Biomarkers for Early Detection of Pancreatic Ductal Adenocarcinoma: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Alex Root

    2018-03-01

    Full Text Available Approximately 75% of patients with pancreatic ductal adenocarcinoma are diagnosed with advanced cancer, which cannot be safely resected. The most commonly used biomarker, CA19-9, has inadequate sensitivity and specificity for early detection, which we define as Stage I/II cancers. Therefore, progress in next-generation biomarkers is greatly needed. Recent reports have validated a number of biomarkers, including combination assays of proteins and DNA mutations; however, the history of translating promising biomarkers to clinical utility suggests that several major hurdles require careful consideration by the medical community. The first set of challenges involves nominating and verifying biomarkers. Candidate biomarkers need to discriminate disease from benign controls with high sensitivity and specificity for an intended use, which we describe as a two-tiered strategy of identifying and screening high-risk patients. Community-wide efforts to share samples, data, and analysis methods have been beneficial, and progress in meeting this challenge has been achieved. The second set of challenges is assay optimization and validating biomarkers. After initial candidate validation, assays need to be refined into accurate, cost-effective, highly reproducible, and multiplexed targeted panels and then validated in large cohorts. To move the most promising candidates forward, ideally, biomarker panels, head-to-head comparisons, meta-analysis, and assessment in independent data sets might mitigate the risk of failure. Much more investment is needed to overcome these challenges. The third challenge is achieving clinical translation. To moonshot an early detection test to the clinic requires a large clinical trial and organizational, regulatory, and entrepreneurial know-how. Additional factors, such as imaging technologies, will likely need to improve concomitant with molecular biomarker development. The magnitude of the clinical translational challenge is uncertain, but

  18. Undergraduate students’ challenges with computational modelling in physics

    Directory of Open Access Journals (Sweden)

    Simen A. Sørby

    2012-12-01

    Full Text Available In recent years, computational perspectives have become essential parts of several of the University of Oslo's natural science programmes. In this paper we discuss some main findings from a qualitative study of the computational perspectives' impact on the students' work with their first course in physics - mechanics - and their learning and meaning making of its contents. Discussions of the students' learning of physics are based on sociocultural theory, which originates in Vygotsky and Bakhtin, and on subsequent physics education research. Results imply that the greatest challenge for students when working with computational assignments is to combine knowledge from previously known, but separate, contexts. Integrating knowledge of informatics, numerical and analytical mathematics and conceptual understanding of physics appears as a clear challenge for the students. We also observe a lack of awareness concerning the limitations of physical modelling. The students need help with identifying the appropriate knowledge system or "tool set" for the different tasks at hand; they need help to create a plan for their modelling and to become aware of its limits. In light of this, we propose that an instructive and dialogic text as the basis for the exercises, in which the emphasis is on specification, clarification and elaboration, would be of potentially great aid for students who are new to computational modelling.
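
    The kind of assignment the study refers to, where numerics, programming and physics must be combined, can be illustrated with a minimal example; the specific task and parameter values below are assumptions for illustration and are not taken from the Oslo course.

```python
# Minimal first-course computational mechanics exercise: Euler integration of a
# falling ball with quadratic air resistance. The drag coefficient, mass and time
# step are illustrative assumptions.
g = 9.81    # gravitational acceleration [m/s^2]
m = 0.145   # mass [kg]
D = 0.0025  # assumed quadratic drag coefficient [kg/m]
dt = 0.01   # time step [s]

v, t = 0.0, 0.0
while t < 5.0:
    a = g - (D / m) * v * abs(v)  # net downward acceleration with drag
    v += a * dt                   # forward-Euler update of velocity
    t += dt

print(f"speed after {t:.1f} s: {v:.1f} m/s (terminal speed ~ {(g * m / D) ** 0.5:.1f} m/s)")
```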

  19. Health impact assessment in China: Emergence, progress and challenges

    Energy Technology Data Exchange (ETDEWEB)

    Huang Zheng, E-mail: huangzhg@mails.tjmu.edu.cn

    2012-01-15

    The values, concepts and approaches of health impact assessment (HIA) were outlined in the Gothenburg consensus paper, and some industrialized countries have implemented HIA for many years. HIA has played an important role in environmental protection in China; however, the emergence, progress and challenges of HIA in China have not been well described. In this paper, the evolution of HIA in China is analyzed and the challenges of HIA are presented based on the author's experiences. HIA contributed to decision-making for large capital construction projects, such as the Three Gorges Dam project, in its emergence stage. Increasing attention has been given to HIA in recent years due to supportive policies underpinning the development of the draft HIA guidelines in 2008. However, enormous challenges lie ahead in ensuring the institutionalization of HIA into the project, program and policy decision-making process, due to limited scope, immature tools and insufficient numbers of professionals in HIA practice. HIA should broaden its horizons by encompassing physical, chemical, biological and socio-economic aspects, and constant attempts should be made to integrate HIA into the decision-making process, not only for projects and programs but also for policies as well.

  20. New Challenges for Computing in High Energy Physics

    International Nuclear Information System (INIS)

    Santoro, Alberto

    2003-01-01

    In view of the new scientific programs established for the LHC (Large Hadron Collider) era, the way to face the technological challenges in computing was to develop a new concept of GRID computing. We show some examples and, in particular, a proposal for high energy physicists in countries like Brazil. Due to the large amount of data and the need for close collaboration, it will be impossible to work in research centers and universities very far from Fermilab or CERN unless a GRID architecture is built. An important effort is being made by the international community to bring their computing infrastructure and networks up to date

  1. Computational enzyme design approaches with significant biological outcomes: progress and challenges

    OpenAIRE

    Li, Xiaoman; Zhang, Ziding; Song, Jiangning

    2012-01-01

    Enzymes are powerful biocatalysts; however, so far there is still a large gap between the number of enzyme-based practical applications and that of naturally occurring enzymes. Multiple experimental approaches have been applied to generate nearly all possible mutations of target enzymes, allowing the identification of desirable variants with improved properties to meet practical needs. Meanwhile, an increasing number of computational methods have been developed to assist in the modificati...

  2. Silicon spintronics: Progress and challenges

    Energy Technology Data Exchange (ETDEWEB)

    Sverdlov, Viktor; Selberherr, Siegfried, E-mail: Selberherr@TUWien.ac.at

    2015-07-14

    Electron spin attracts much attention as an alternative to the electron charge degree of freedom for low-power reprogrammable logic and non-volatile memory applications. Silicon appears to be the perfect material for spin-driven applications. Recent progress and challenges regarding spin-based devices are reviewed. An order of magnitude enhancement of the electron spin lifetime in silicon thin films by shear strain is predicted and its impact on spin transport in SpinFETs is discussed. A relatively weak coupling between spin and effective electric field in silicon allows magnetoresistance modulation at room temperature, however, for long channel lengths. Due to tunneling magnetoresistance and spin transfer torque effects, a much stronger coupling between the spin (magnetization) orientation and charge current is achieved in magnetic tunnel junctions. Magnetic random access memory (MRAM) built on magnetic tunnel junctions is CMOS compatible and possesses all properties needed for future universal memory. Designs of spin-based non-volatile MRAM cells are presented. By means of micromagnetic simulations it is demonstrated that a substantial reduction of the switching time can be achieved. Finally, it is shown that any two arbitrary memory cells from an MRAM array can be used to perform a logic operation. Thus, an intrinsic non-volatile logic-in-memory architecture can be realized.
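
    For reference, the dynamical equation that such micromagnetic switching simulations integrate is, up to sign conventions, the Landau-Lifshitz-Gilbert equation with a Slonczewski spin-transfer-torque term; this is the general form, not necessarily the exact model used in the cited work. Here m is the unit magnetization, H_eff the effective field, alpha the Gilbert damping, J the current density, eta the spin polarization efficiency, M_s the saturation magnetization, d the free-layer thickness and p the polarizer direction.

```latex
% Landau-Lifshitz-Gilbert equation with a Slonczewski spin-transfer-torque term
% (general form, up to sign conventions; not necessarily the authors' exact model).
\frac{d\mathbf{m}}{dt} =
  -\gamma\,\mathbf{m}\times\mathbf{H}_{\mathrm{eff}}
  + \alpha\,\mathbf{m}\times\frac{d\mathbf{m}}{dt}
  - \frac{\gamma \hbar \eta J}{2 e M_s d}\,\mathbf{m}\times(\mathbf{m}\times\mathbf{p})
```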

  3. Silicon spintronics: Progress and challenges

    International Nuclear Information System (INIS)

    Sverdlov, Viktor; Selberherr, Siegfried

    2015-01-01

    Electron spin attracts much attention as an alternative to the electron charge degree of freedom for low-power reprogrammable logic and non-volatile memory applications. Silicon appears to be the perfect material for spin-driven applications. Recent progress and challenges regarding spin-based devices are reviewed. An order of magnitude enhancement of the electron spin lifetime in silicon thin films by shear strain is predicted and its impact on spin transport in SpinFETs is discussed. A relatively weak coupling between spin and effective electric field in silicon allows magnetoresistance modulation at room temperature, however, for long channel lengths. Due to tunneling magnetoresistance and spin transfer torque effects, a much stronger coupling between the spin (magnetization) orientation and charge current is achieved in magnetic tunnel junctions. Magnetic random access memory (MRAM) built on magnetic tunnel junctions is CMOS compatible and possesses all properties needed for future universal memory. Designs of spin-based non-volatile MRAM cells are presented. By means of micromagnetic simulations it is demonstrated that a substantial reduction of the switching time can be achieved. Finally, it is shown that any two arbitrary memory cells from an MRAM array can be used to perform a logic operation. Thus, an intrinsic non-volatile logic-in-memory architecture can be realized

  4. Addressing Cloud Computing in Enterprise Architecture: Issues and Challenges

    OpenAIRE

    Khan, Khaled; Gangavarapu, Narendra

    2009-01-01

    This article discusses how the characteristics of cloud computing affect the enterprise architecture in four domains: business, data, application and technology. The ownership and control of architectural components are shifted from organisational perimeters to cloud providers. It argues that although cloud computing promises numerous benefits to enterprises, this shift of control over architectural components from enterprises to cloud providers introduces several architectural challenges. The d...

  5. SIRT6 knockout cells resist apoptosis initiation but not progression: a computational method to evaluate the progression of apoptosis.

    Science.gov (United States)

    Domanskyi, Sergii; Nicholatos, Justin W; Schilling, Joshua E; Privman, Vladimir; Libert, Sergiy

    2017-11-01

    Apoptosis is essential for numerous processes, such as development, resistance to infections, and suppression of tumorigenesis. Here, we investigate the influence of the nutrient sensing and longevity-assuring enzyme SIRT6 on the dynamics of apoptosis triggered by serum starvation. Specifically, we characterize the progression of apoptosis in wild type and SIRT6 deficient mouse embryonic fibroblasts using time-lapse flow cytometry and computational modelling based on rate-equations and cell distribution analysis. We find that SIRT6 deficient cells resist apoptosis by delaying its initiation. Interestingly, once apoptosis is initiated, the rate of its progression is higher in SIRT6 null cells compared to identically cultured wild type cells. However, SIRT6 null cells succumb to apoptosis more slowly, not only in response to nutrient deprivation but also in response to other stresses. Our data suggest that SIRT6 plays a role in several distinct steps of apoptosis. Overall, we demonstrate the utility of our computational model to describe stages of apoptosis progression and the integrity of the cellular membrane. Such measurements will be useful in a broad range of biological applications.
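
    The rate-equation modelling mentioned above can be illustrated with a generic two-step kinetic scheme (live cells initiate apoptosis, then progress to death); the rate constants below are placeholders for illustration and are not the values fitted in the paper.

```python
# Generic two-step rate-equation model of apoptosis: live -> initiated -> apoptotic.
# k_init and k_prog are placeholder rates, not the constants fitted in the paper.
from scipy.integrate import solve_ivp

k_init, k_prog = 0.05, 0.20  # assumed initiation and progression rates [1/h]

def apoptosis(t, y):
    live, initiated, dead = y
    return [-k_init * live,
            k_init * live - k_prog * initiated,
            k_prog * initiated]

sol = solve_ivp(apoptosis, (0.0, 48.0), [1.0, 0.0, 0.0], t_eval=[0, 12, 24, 48])
for t, (live, initiated, dead) in zip(sol.t, sol.y.T):
    print(f"t={t:4.0f} h  live={live:.2f}  initiated={initiated:.2f}  apoptotic={dead:.2f}")
```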

  6. Scenario-Based Digital Forensics Challenges in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Erik Miranda Lopez

    2016-10-01

    Full Text Available The aim of digital forensics is to extract information to answer the 5Ws (Why, When, Where, What, and Who) from the data extracted from the evidence. In order to achieve this, most digital forensic processes assume absolute control of digital evidence. However, in a cloud environment forensic investigation, this is not always possible. Additionally, the unique characteristics of cloud computing create new technical, legal and architectural challenges when conducting a forensic investigation. We propose a hypothetical scenario to uncover and explain the challenges forensic practitioners face during cloud investigations. Additionally, we also provide solutions to address the challenges. Our hypothetical case scenario has shown that, in the long run, better live forensic tools, the development of new methods tailored for cloud investigations, and new procedures and standards are indeed needed. Furthermore, we have come to the conclusion that the biggest challenge in forensic investigations is not technical but legal.

  7. Mobile Computing: The Emerging Technology, Sensing, Challenges and Applications

    International Nuclear Information System (INIS)

    Bezboruah, T.

    2010-12-01

    Mobile computing is a computing system in which a computer and all necessary accessories, such as files and software, are taken out into the field. It is a form of computing in which a computing device can be used even while the user is mobile and therefore changing location. Portability is one of the important aspects of mobile computing. Mobile phones are being used to gather scientific data from remote and isolated places that could not be retrieved by other means. Scientists are beginning to use mobile devices and web-based applications to systematically explore interesting scientific aspects of their surroundings, ranging from climate change and environmental pollution to earthquake monitoring. This mobile revolution enables new ideas and innovations to spread out more quickly and efficiently. Here we discuss in brief the mobile computing technology, its sensing, challenges and applications. (author)

  8. Computational brain connectivity mapping: A core health and scientific challenge.

    Science.gov (United States)

    Deriche, Rachid

    2016-10-01

    One third of the burden of all the diseases in Europe is due to problems caused by diseases affecting the brain. Although exceptional progress has been made in exploring the brain during the past decades, it is still terra incognita and calls for specific research efforts to better understand its architecture and functioning. To take up this great challenge of modern science and to overcome the limited view of the brain provided by any single imaging modality, this article advocates the idea, developed in my research group, of a global approach involving a new generation of models for brain connectivity mapping and strong interactions between structural and functional connectivities. Capitalizing on the strengths of integrated and complementary non-invasive imaging modalities such as diffusion Magnetic Resonance Imaging (dMRI) and Electro- & Magneto-Encephalography (EEG & MEG) will contribute to achieving new frontiers for identifying and characterizing structural and functional brain connectivities and to providing a detailed mapping of the brain connectivity, both in space and time. This will lead to an added clinical value for high-impact diseases, with new perspectives in computational neuro-imaging and cognitive neuroscience. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. ceRNAs in plants: computational approaches and associated challenges for target mimic research.

    Science.gov (United States)

    Paschoal, Alexandre Rossi; Lozada-Chávez, Irma; Domingues, Douglas Silva; Stadler, Peter F

    2017-05-30

    The competing endogenous RNA hypothesis has gained increasing attention as a potential global regulatory mechanism of microRNAs (miRNAs), and as a powerful tool to predict the function of many noncoding RNAs, including miRNAs themselves. Most studies have focused on animals, although target mimic (TM) discovery, as well as important computational and experimental advances, has been developed in plants over the past decade. Thus, our contribution summarizes recent progress in computational approaches for research on miRNA:TM interactions. We divided this article into three main contributions. First, a general overview of research on TMs in plants is presented, with practical descriptions of the available literature, tools, data, databases and computational reports. Second, we describe a common protocol for the computational and experimental analyses of TMs. Third, we provide a bioinformatics approach for the prediction of TM motifs potentially cross-targeting members within the same or from different miRNA families, based on the identification of consensus miRNA-binding sites from known TMs across sequenced genomes, transcriptomes and known miRNAs. This computational approach is promising because, in contrast to animals, miRNA families in plants are large, with identical or similar members, several of which are also highly conserved. Of the three consensus TM motifs found with our approach (MIM166, MIM171 and MIM159/319), the last one has found strong support in the recent experimental work by Reichel and Millar [Specificity of plant microRNA TMs: cross-targeting of mir159 and mir319. J Plant Physiol 2015;180:45-8]. Finally, we discuss the major computational and associated experimental challenges that have to be faced in future ceRNA studies. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
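
    The third contribution described above rests on locating consensus miRNA-binding sites in candidate transcripts; the toy scanner below shows the general idea by sliding the reverse complement of a miRNA along a transcript and reporting near-complementary windows. The sequences and the mismatch threshold are invented for illustration and do not reproduce the article's pipeline.

```python
# Toy scan for candidate miRNA-binding sites in a putative target mimic: slide the
# reverse complement of the miRNA along the transcript and report windows within a
# mismatch budget. Sequences and threshold are illustrative assumptions only.
COMP = str.maketrans("ACGU", "UGCA")

def revcomp(rna):
    return rna.translate(COMP)[::-1]

def scan(transcript, mirna, max_mismatch=3):
    site = revcomp(mirna)
    hits = []
    for i in range(len(transcript) - len(site) + 1):
        window = transcript[i:i + len(site)]
        mismatches = sum(a != b for a, b in zip(window, site))
        if mismatches <= max_mismatch:
            hits.append((i, mismatches))
    return hits

mirna = "UUGGACUGAAGGGAGCUCCU"                 # hypothetical miRNA sequence
transcript = "AAGCUAGGAGCUCUCUUCAGUCCAAACGUA"  # hypothetical target-mimic fragment
print(scan(transcript, mirna))                 # [(5, 1)] -> one near-complementary site
```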

  10. Mathematical challenges from theoretical/computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-31

    The committee believes that this report has relevance and potentially valuable suggestions for a wide range of readers. Target audiences include: graduate departments in the mathematical and chemical sciences; federal and private agencies that fund research in the mathematical and chemical sciences; selected industrial and government research and development laboratories; developers of software and hardware for computational chemistry; and selected individual researchers. Chapter 2 of this report covers some history of computational chemistry for the nonspecialist, while Chapter 3 illustrates the fruits of some past successful cross-fertilization between mathematical scientists and computational/theoretical chemists. In Chapter 4 the committee has assembled a representative, but not exhaustive, survey of research opportunities. Most of these are descriptions of important open problems in computational/theoretical chemistry that could gain much from the efforts of innovative mathematical scientists, written so as to be accessible introductions to the nonspecialist. Chapter 5 is an assessment, necessarily subjective, of cultural differences that must be overcome if collaborative work is to be encouraged between the mathematical and the chemical communities. Finally, the report ends with a brief list of conclusions and recommendations that, if followed, could promote accelerated progress at this interface. Recognizing that bothersome language issues can inhibit prospects for collaborative research at the interface between distinctive disciplines, the committee has attempted throughout to maintain an accessible style, in part by using illustrative boxes, and has included at the end of the report a glossary of technical terms that may be familiar to only a subset of the target audiences listed above.

  11. Progress and challenges in bioinformatics approaches for enhancer identification

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2017-02-03

    Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration.
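
    The supervised strand of the methods surveyed here can be illustrated with a minimal classifier on chromatin-mark features; the feature values below are synthetic and the mark names are only stand-ins, so this is an illustration of the generic strategy rather than any specific published method.

```python
# Minimal sketch of supervised enhancer prediction: classify genomic windows as
# enhancer vs background from chromatin-mark signals. Features are synthetic
# stand-ins (nominally H3K4me1, H3K27ac, H3K4me3), not real epigenomic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
enhancers = rng.normal(loc=[2.0, 2.5, 0.5], scale=0.5, size=(200, 3))
background = rng.normal(loc=[0.5, 0.5, 0.5], scale=0.5, size=(200, 3))

X = np.vstack([enhancers, background])
y = np.array([1] * 200 + [0] * 200)

clf = LogisticRegression().fit(X, y)
print("training accuracy:", clf.score(X, y))
```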

  12. Progress and challenges in bioinformatics approaches for enhancer identification

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Kalnis, Panos; Bajic, Vladimir B.

    2017-01-01

    Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration.

  13. Nuclear Data Covariances in the Indian Context – Progress, Challenges, Excitement and Perspectives

    International Nuclear Information System (INIS)

    Ganesan, S.

    2015-01-01

    We present a brief overview of progress, challenges, excitement and perspectives in developing nuclear data covariances in the Indian context in relation to target accuracies and sensitivity studies that are of great importance to Bhabha's 3-stage nuclear programme for energy and non-energy applications

  14. Computer Network Security- The Challenges of Securing a Computer Network

    Science.gov (United States)

    Scotti, Vincent, Jr.

    2011-01-01

    This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.

  15. Progress and challenges associated with halal authentication of consumer packaged goods.

    Science.gov (United States)

    Premanandh, Jagadeesan; Bin Salem, Samara

    2017-11-01

    Abusive business practices are increasingly evident in consumer packaged goods. Although consumers have the right to protect themselves against such practices, rapid urbanization and industrialization result in greater distances between producers and consumers, raising serious concerns about the supply chain. The operational complexities surrounding halal authentication pose serious challenges to the integrity of consumer packaged goods. This article attempts to address the progress and challenges associated with halal authentication. Advances in, and concerns about, the application of new, rapid analytical methods for halal authentication are discussed. The significance of a zero-tolerance policy in consumer packaged foods and its impact on analytical testing are presented. The role of halal assurance systems and their challenges are also considered. In conclusion, consensus on the establishment of one standard approach, coupled with a sound traceability system and constant monitoring, would certainly improve and ensure the halal status of consumer packaged goods. © 2017 Society of Chemical Industry.

  16. The Awareness and Challenges of Cloud Computing Adoption on Tertiary Education in Malaysia

    Science.gov (United States)

    Hazreeni Hamzah, Nor; Mahmud, Maziah; Zukri, Shamsunarnie Mohamed; Yaacob, Wan Fairos Wan; Yacob, Jusoh

    2017-09-01

    This preliminary study aims to investigate awareness of the adoption of cloud computing among academicians in tertiary education in Malaysia. Besides this, the study also aims to explore the possible challenges faced by academicians while adopting this new technology. The pilot study was conducted on 40 lecturers at Universiti Teknologi MARA Kampus Kota Bharu (UiTMKB) using a self-administered questionnaire. The results found that almost half (40 percent) were not aware of the existence of cloud computing in the teaching and learning (T&L) process. The challenges confronting the adoption of cloud computing are data insecurity, unsolicited advertisement, lock-in, reluctance to eliminate staff positions, privacy concerns, reliability challenges, regulatory compliance concerns/user control and institutional culture/resistance to change in technology. These possible challenges can be grouped into two major factors: a security and dependency factor, and a user control and mentality factor.

  17. Progress and challenges in the computational prediction of gene function using networks [v1; ref status: indexed, http://f1000r.es/SqmJUM

    Directory of Open Access Journals (Sweden)

    Paul Pavlidis

    2012-09-01

    Full Text Available In this opinion piece, we attempt to unify recent arguments we have made that serious confounds affect the use of network data to predict and characterize gene function. The development of computational approaches to determine gene function is a major strand of computational genomics research. However, progress beyond using BLAST to transfer annotations has been surprisingly slow. We have previously argued that a large part of the reported success in using "guilt by association" in network data is due to the tendency of methods to simply assign new functions to already well-annotated genes. While such predictions will tend to be correct, they are generic; it is true, but not very helpful, that a gene with many functions is more likely to have any function. We have also presented evidence that much of the remaining performance in cross-validation cannot be usefully generalized to new predictions, making progressive improvement in analysis difficult to engineer. Here we summarize our findings about how these problems will affect network analysis, discuss some ongoing responses within the field to these issues, and consolidate some recommendations and speculation, which we hope will modestly increase the reliability and specificity of gene function prediction.
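
    The "guilt by association" strategy under discussion can be written down in a few lines; the toy network and annotations below are hypothetical, and the scorer is a generic neighbour-voting illustration, not the authors' implementation.

```python
# Generic neighbour-voting ("guilt by association") scorer: an unannotated gene is
# scored by the fraction of its network neighbours already carrying the function.
# The edge list and annotation set are hypothetical.
edges = [("geneA", "geneB"), ("geneA", "geneC"), ("geneB", "geneD"),
         ("geneC", "geneD"), ("geneD", "geneE")]
annotated = {"geneB", "geneC"}  # genes known to carry the function of interest

neighbours = {}
for u, v in edges:
    neighbours.setdefault(u, set()).add(v)
    neighbours.setdefault(v, set()).add(u)

for gene, nbrs in sorted(neighbours.items()):
    if gene not in annotated:
        score = sum(n in annotated for n in nbrs) / len(nbrs)
        print(f"{gene}: guilt-by-association score = {score:.2f}")
```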

  18. Inclusive Education in Georgia: Current Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Nikoloz Kavelashvili

    2017-05-01

    Full Text Available Purpose and Originality: The paper provides a realistic picture of how the implementation of inclusive education in Georgia is developing, the problems encountered, and the needs that must be met to stimulate the process. Today's challenge in the country is to make inclusive practices available to everybody, everywhere and all the time. This article discusses the status of the efforts being made to meet this challenge. In the course of that discussion, some comprehensive changes are described that systemic school-improvement efforts must achieve to continue making progress towards fully inclusive learning. Method: The study was conducted in Georgia. A qualitative research design was employed along with closed-ended and open-ended questionnaires, which allowed participants to express their points of view, skills and knowledge. Data were collected through semi-structured interviews and observation of respondents. Results: The study uncovers the challenges that obstruct the implementation process: indifferent attitudes of teachers and parents towards inclusion, lack of awareness of the issue among educators, minimal involvement of parents, and the need for infrastructural development. Society: The results should raise awareness among the population of Georgia and increase understanding of the problem. Limitations / further research: Although there were sufficient informants at the school level (special teachers, principals), many other possible respondents could add something valuable to a better understanding of the process of inclusion at schools. The theoretical approach employed in the study and the empirical research could be validated.

  19. Towards a cyberinfrastructure for the biological sciences: progress, visions and challenges.

    Science.gov (United States)

    Stein, Lincoln D

    2008-09-01

    Biology is an information-driven science. Large-scale data sets from genomics, physiology, population genetics and imaging are driving research at a dizzying rate. Simultaneously, interdisciplinary collaborations among experimental biologists, theorists, statisticians and computer scientists have become the key to making effective use of these data sets. However, too many biologists have trouble accessing and using these electronic data sets and tools effectively. A 'cyberinfrastructure' is a combination of databases, network protocols and computational services that brings people, information and computational tools together to perform science in this information-driven world. This article reviews the components of a biological cyberinfrastructure, discusses current and pending implementations, and notes the many challenges that lie ahead.

  20. The FCC-ee study: Progress and challenges

    CERN Document Server

    Koratzinos, Michael; Bogomyagkov, Anton; Boscolo, Manuela; Cook, Charlie; Doblhammer, Andreas; Härer, Bastian; Tomás, Rogelio; Levichev, Evgeny; Medina Medrano, Luis; Shatilov, Dmitry; Wienands, Ulrich; Zimmermann, Frank

    The FCC (Future Circular Collider) study represents a vision for the next large project in high energy physics, comprising an 80-100 km tunnel that can house a future 100 TeV hadron collider. The study also includes a high-luminosity e+e- collider operating in the centre-of-mass energy range of 90-350 GeV as a possible intermediate step, the FCC-ee. The FCC-ee aims at definitive electroweak precision measurements of the Z, W, H and top particles, and at the search for rare phenomena. Although the FCC-ee is based on known technology, the goal performance in luminosity and energy calibration makes it quite challenging. During 2014 the study went through an exploration phase. The study has now entered its second year, and the aim is to produce a conceptual design report during the next three to four years. We report here on progress since the last IPAC conference.

  1. Advanced teaching labs in physics - celebrating progress; challenges ahead

    Science.gov (United States)

    Peterson, Richard

    A few examples of optical physics experiments may help us first reflect on how much more effectively advanced lab initiatives can now be developed, discussed, and disseminated than only 10 or 15 years ago. Many cooperative developments of the last decade are having profound impacts on advanced lab workers and students. Central to these changes are the programs of the Advanced Laboratory Physics Association (ALPhA) (Immersions, BFY conferences), AAPT (advlab-l server, ComPADRE, apparatus competitions, summer workshops/sessions), APS (Reichert Award, FEd activities and sessions), and the Jonathan F. Reichert Foundation (ALPhA support and institution-matched equipment grants for Immersion participants). Broad NSF support has helped undergird several of these initiatives. Two of the most significant challenges before this new advanced lab community are (a) to enhance funding opportunities for teaching equipment and apparatus in an era of minimal NSF equipment support, and (b) to help develop a more complementary relationship between research-based advanced lab pedagogies and the development of fresh physics experiments that enable the mentoring and experimental challenge of our students.

  2. Computer Graphics Research Laboratory Quarterly Progress Report Number 49, July-September 1993

    Science.gov (United States)

    1993-11-22

    [Recoverable contents fragments] Texture Sampling and Strength Guided Motion: Jeffry S. Nimeroff; Radiosity: Min-Zhi Shao; Blended Shape Primitives: Douglas DeCarlo. Topics include extensions of radiosity rendering and a discussion of blended shape primitives and their applications in computer vision and computer ... Radiosity: an improved version of the radiosity renderer is included; this version uses a fast over-relaxation progressive refinement algorithm.
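    The record mentions a radiosity renderer based on a fast over-relaxation progressive refinement algorithm. The following is only a bare-bones sketch of plain progressive refinement radiosity (the report's actual renderer and its over-relaxation scheme are not reproduced); the three-patch scene, form factors, areas, reflectances and emission are invented:

```python
import numpy as np

# Progressive refinement radiosity sketch for a hypothetical 3-patch scene.
F = np.array([[0.0, 0.3, 0.2],
              [0.3, 0.0, 0.4],
              [0.2, 0.4, 0.0]])          # F[i, j]: form factor from patch i to patch j (invented)
area = np.array([1.0, 1.0, 2.0])         # patch areas
rho  = np.array([0.7, 0.5, 0.3])         # diffuse reflectances
E    = np.array([1.0, 0.0, 0.0])         # patch 0 is the only emitter

B      = E.copy()                        # current radiosity estimate
unshot = E.copy()                        # radiosity not yet distributed ("unshot" energy)

for _ in range(50):                      # shoot from the patch with the most unshot energy
    i = int(np.argmax(unshot * area))
    if unshot[i] * area[i] < 1e-6:
        break
    for j in range(len(B)):
        if j == i:
            continue
        # reciprocity: F[j, i] = F[i, j] * area[i] / area[j]
        dB = rho[j] * F[i, j] * area[i] / area[j] * unshot[i]
        B[j]      += dB
        unshot[j] += dB
    unshot[i] = 0.0                      # patch i has now shot all of its unshot energy

print("final radiosities:", B)
```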

  3. Achievements and Challenges in Computational Protein Design.

    Science.gov (United States)

    Samish, Ilan

    2017-01-01

    Computational protein design (CPD), a yet evolving field, includes computer-aided engineering for partial or full de novo designs of proteins of interest. Designs are defined by a requested structure, function, or working environment. This chapter describes the birth and maturation of the field by presenting 101 CPD examples in a chronological order emphasizing achievements and pending challenges. Integrating these aspects presents the plethora of CPD approaches with the hope of providing a "CPD 101". These reflect on the broader structural bioinformatics and computational biophysics field and include: (1) integration of knowledge-based and energy-based methods, (2) hierarchical designated approach towards local, regional, and global motifs and the integration of high- and low-resolution design schemes that fit each such region, (3) systematic differential approaches towards different protein regions, (4) identification of key hot-spot residues and the relative effect of remote regions, (5) assessment of shape-complementarity, electrostatics and solvation effects, (6) integration of thermal plasticity and functional dynamics, (7) negative design, (8) systematic integration of experimental approaches, (9) objective cross-assessment of methods, and (10) successful ranking of potential designs. Future challenges also include dissemination of CPD software to the general use of life-sciences researchers and the emphasis of success within an in vivo milieu. CPD increases our understanding of protein structure and function and the relationships between the two along with the application of such know-how for the benefit of mankind. Applied aspects range from biological drugs, via healthier and tastier food products to nanotechnology and environmentally friendly enzymes replacing toxic chemicals utilized in the industry.

  4. Reviews on Security Issues and Challenges in Cloud Computing

    Science.gov (United States)

    An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.

    2016-11-01

    Cloud computing is an Internet-based computing service, provided by a third party, that allows resources and data to be shared among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes how the Information Technology (IT) of an organization is organized and managed. It provides many benefits, such as simplicity and lower costs, almost unlimited storage, minimal maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment and a lower barrier to entry. As the use of cloud computing services increases, the security issues of cloud computing become a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first describes the architecture of cloud computing, then discusses the most common security issues of using the cloud and some solutions to those issues, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.

  5. Computational challenges in atomic, molecular and optical physics.

    Science.gov (United States)

    Taylor, Kenneth T

    2002-06-15

    Six challenges are discussed. These are the laser-driven helium atom; the laser-driven hydrogen molecule and hydrogen molecular ion; electron scattering (with ionization) from one-electron atoms; the vibrational and rotational structure of molecules such as H₃⁺ and water at their dissociation limits; laser-heated clusters; and quantum degeneracy and Bose-Einstein condensation. The first four concern fundamental few-body systems where use of high-performance computing (HPC) is currently making possible accurate modelling from first principles. This leads to reliable predictions and support for laboratory experiment as well as true understanding of the dynamics. Important aspects of these challenges addressable only via a terascale facility are set out. Such a facility makes the last two challenges in the above list meaningfully accessible for the first time, and the scientific interest together with the prospective role for HPC in these is emphasized.

  6. Gaucher disease: Progress and ongoing challenges.

    Science.gov (United States)

    Mistry, Pramod K; Lopez, Grisel; Schiffmann, Raphael; Barton, Norman W; Weinreb, Neal J; Sidransky, Ellen

    Over the past decades, tremendous progress has been made in the field of Gaucher disease, the inherited deficiency of the lysosomal enzyme glucocerebrosidase. Many of the colossal achievements took place during the course of the sixty-year tenure of Dr. Roscoe Brady at the National Institutes of Health. These include the recognition of the enzymatic defect involved, the isolation and characterization of the protein, the localization and characterization of the gene and its nearby pseudogene, as well as the identification of the first mutant alleles in patients. The first treatment for Gaucher disease, enzyme replacement therapy, was conceived of, developed and tested at the Clinical Center of the National Institutes of Health. Advances including recombinant production of the enzyme, the development of mouse models, pioneering gene therapy experiments, high throughput screens of small molecules and the generation of induced pluripotent stem cell models have all helped to catapult research in Gaucher disease into the twenty-first century. The appreciation that mutations in the glucocerebrosidase gene are an important risk factor for parkinsonism further expands the impact of this work. However, major challenges still remain, some of which are described here, that will provide opportunities, excitement and discovery for the next generations of Gaucher investigators. Published by Elsevier Inc.

  7. Scientific Grand Challenges: Crosscutting Technologies for Computing at the Exascale - February 2-4, 2010, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2011-02-06

    The goal of the "Scientific Grand Challenges - Crosscutting Technologies for Computing at the Exascale" workshop in February 2010, jointly sponsored by the U.S. Department of Energy's Office of Advanced Scientific Computing Research and the National Nuclear Security Administration, was to identify the elements of a research and development agenda that will address the challenges of computing at the exascale and create a comprehensive exascale computing environment. This exascale computing environment will enable the science applications identified in the eight workshops previously held in the Scientific Grand Challenges Workshop Series.

  8. Center for computation and visualization of geometric structures. [Annual], Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-02-12

    The mission of the Center is to establish a unified environment promoting research, education, and software and tool development. The work is centered on computing, interpreted in a broad sense to include the relevant theory, development of algorithms, and actual implementation. The research aspects of the Center are focused on geometry; correspondingly, the computational aspects are focused on three- (and higher-) dimensional visualization. The educational aspects are likewise centered on computing and focused on geometry. A broader term than education is 'communication', which encompasses the challenge of explaining to the world current research in mathematics, and specifically geometry.

  9. Achievements and challenges in structural bioinformatics and computational biophysics.

    Science.gov (United States)

    Samish, Ilan; Bourne, Philip E; Najmanovich, Rafael J

    2015-01-01

    The field of structural bioinformatics and computational biophysics has undergone a revolution in the last 10 years, with developments captured annually through the 3DSIG meeting, upon which this article reflects. An increase in the accessible data, computational resources and methodology has resulted in an increase in the size and resolution of studied systems and in the complexity of the questions amenable to research. Concomitantly, the parameterization and efficiency of the methods have markedly improved, along with their cross-validation with other computational and experimental results. The field exhibits an ever-increasing integration with biochemistry, biophysics and other disciplines. In this article, we discuss recent achievements along with current challenges within the field. © The Author 2014. Published by Oxford University Press.

  10. Precision Medicine and PET/Computed Tomography: Challenges and Implementation.

    Science.gov (United States)

    Subramaniam, Rathan M

    2017-01-01

    Precision medicine is about selecting the right therapy for the right patient at the right time, specific to the molecular targets expressed by the disease or tumor, in the context of the patient's environment and lifestyle. Some of the challenges for delivering precision medicine in oncology include biomarkers for patient selection and enrichment (precision diagnostics), mapping out the tumor heterogeneity that contributes to therapy failures, and early therapy assessment to identify resistance to therapies. PET/computed tomography offers solutions in these important areas of challenge and facilitates the implementation of precision medicine. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Computational Challenge of Fractional Differential Equations and the Potential Solutions: A Survey

    Directory of Open Access Journals (Sweden)

    Chunye Gong

    2015-01-01

    Full Text Available We present a survey of fractional differential equations and, in particular, of the computational cost of their numerical solution from the viewpoint of computer science. The computational complexities of time-fractional, space-fractional, and space-time fractional equations are O(N²M), O(NM²), and O(NM(M + N)), respectively, compared with O(MN) for classical partial differential equations solved with finite difference methods, where M and N are the numbers of space grid points and time steps. The potential solutions for this challenge include, but are not limited to, parallel computing, memory access optimization (fractional precomputing operator), the short memory principle, fast Fourier transform (FFT)-based solutions, the alternating direction implicit method, the multigrid method, and preconditioner technology. The relationships of these solutions for both space fractional derivatives and time fractional derivatives are discussed. The authors point out that parallel computing should be regarded as a basic method to overcome this challenge, and that attention should be paid to fractional killer applications, high performance iteration methods, high order schemes, and Monte Carlo methods. Since the computation of fractional equations with high dimension and variable order is even heavier, researchers from mathematics and computer science have the opportunity to lay cornerstones in the area of fractional calculus.
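    The quadratic-in-time cost of time-fractional problems comes from the memory of the fractional derivative: each new time step sums over the entire history, giving O(N²) work for N steps (times M space points). A minimal sketch, assuming the Grünwald-Letnikov approximation and an invented test function, illustrates the full-history sum and the short memory principle truncation mentioned in the survey:

```python
import numpy as np

def gl_weights(alpha, n):
    """Grünwald-Letnikov weights g_0..g_n via g_k = g_{k-1} * (1 - (alpha + 1) / k)."""
    g = np.empty(n + 1)
    g[0] = 1.0
    for k in range(1, n + 1):
        g[k] = g[k - 1] * (1.0 - (alpha + 1.0) / k)
    return g

def frac_derivative(f_vals, alpha, h, memory=None):
    """
    GL approximation of the alpha-order derivative at every grid point.
    memory=None sums the full history (O(N^2) work for N steps);
    memory=L keeps only the last L terms (short memory principle, O(N*L)).
    """
    N = len(f_vals)
    g = gl_weights(alpha, N)
    D = np.zeros(N)
    for n in range(N):
        kmax = n if memory is None else min(n, memory)
        acc = 0.0
        for k in range(kmax + 1):
            acc += g[k] * f_vals[n - k]
        D[n] = acc / h**alpha
    return D

h = 0.01
t = np.arange(0.0, 1.0 + h, h)
f = t**2                                          # invented test function
full  = frac_derivative(f, 0.5, h)                # full history
short = frac_derivative(f, 0.5, h, memory=100)    # truncated history
print("max truncation error:", np.max(np.abs(full - short)))
```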

  12. US DOE Grand Challenge in Computational Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, R.; Habib, S.; Qiang, J.; Ko, K.; Li, Z.; McCandless, B.; Mi, W.; Ng, C.; Saparov, M.; Srinivas, V.; Sun, Y.; Zhan, X.; Decyk, V.; Golub, G.

    1998-01-01

    Particle accelerators are playing an increasingly important role in basic and applied science, and are enabling new accelerator-driven technologies. But the design of next-generation accelerators, such as linear colliders and high intensity linacs, will require a major advance in numerical modeling capability due to extremely stringent beam control and beam loss requirements, and the presence of highly complex three-dimensional accelerator components. To address this situation, the U.S. Department of Energy has approved a "Grand Challenge" in Computational Accelerator Physics, whose primary goal is to develop a parallel modeling capability that will enable high performance, large scale simulations for the design, optimization, and numerical validation of next-generation accelerators. In this paper we report on the status of the Grand Challenge.

  13. BigData and computing challenges in high energy and nuclear physics

    Science.gov (United States)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-06-01

    In this contribution we discuss various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve in the future when moving from the LHC to the HL-LHC ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new super-computing facilities, cloud computing and volunteer computing for the future is a big challenge, which we are successfully mastering with a considerable contribution from many super-computing centres around the world and from academic and commercial cloud providers. We also discuss R&D computing projects started recently in the National Research Center "Kurchatov Institute".

  14. BigData and computing challenges in high energy and nuclear physics

    International Nuclear Information System (INIS)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-01-01

    In this contribution we discuss various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve in the future when moving from the LHC to the HL-LHC ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new super-computing facilities, cloud computing and volunteer computing for the future is a big challenge, which we are successfully mastering with a considerable contribution from many super-computing centres around the world and from academic and commercial cloud providers. We also discuss R&D computing projects started recently in the National Research Center "Kurchatov Institute".

  15. Agent 2003 Conference on Challenges in Social Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Margaret Clemmons, ed.

    2003-01-01

    Welcome to the Proceedings of the fourth in a series of agent simulation conferences cosponsored by Argonne National Laboratory and The University of Chicago. Agent 2003 is the second conference in which three Special Interest Groups from the North American Association for Computational Social and Organizational Science (NAACSOS) have been involved in planning the program--Computational Social Theory; Simulation Applications; and Methods, Toolkits and Techniques. The theme of Agent 2003, Challenges in Social Simulation, is especially relevant, as there seems to be no shortage of such challenges. Agent simulation has been applied with increasing frequency to social domains for several decades, and its promise is clear and increasingly visible. Like any nascent scientific methodology, however, it faces a number of problems or issues that must be addressed in order to progress. These challenges include: (1) Validating models relative to the social settings they are designed to represent; (2) Developing agents and interactions simple enough to understand but sufficiently complex to do justice to the social processes of interest; (3) Bridging the gap between empirically spare artificial societies and naturally occurring social phenomena; (4) Building multi-level models that span processes across domains; (5) Promoting a dialog among theoretical, qualitative, and empirical social scientists and area experts, on the one hand, and mathematical and computational modelers and engineers, on the other; (6) Using that dialog to facilitate substantive progress in the social sciences; and (7) Fulfilling the aspirations of users in business, government, and other application areas, while recognizing and addressing the preceding challenges. Although this list hardly exhausts the challenges the field faces, it does identify topics addressed throughout the presentations of Agent 2003. Agent 2003 is part of a much larger process in which new methods and techniques are applied to

  16. Challenges and opportunities of cloud computing for atmospheric sciences

    Science.gov (United States)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is that a research project no longer depends on access to a large local cyberinfrastructure to be funded or performed. Cloud computing can avoid the maintenance expenses of large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are the computational cost and the uncertainty of meteorological forecasting and climate projections. Both problems are closely related: uncertainty can usually be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of applying cloud computing resources to climate modelling, using the cloud infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and of research.

  17. Multiscale methods in turbulent combustion: strategies and computational challenges

    International Nuclear Information System (INIS)

    Echekki, Tarek

    2009-01-01

    A principal challenge in modeling turbulent combustion flows is associated with their complex, multiscale nature. Traditional paradigms in the modeling of these flows have attempted to address this nature through different strategies, including exploiting the separation of turbulence and combustion scales and a reduced description of the composition space. The resulting moment-based methods often yield reasonable predictions of flow and reactive scalars' statistics under certain conditions. However, these methods must constantly evolve to address combustion at different regimes, modes or with dominant chemistries. In recent years, alternative multiscale strategies have emerged, which although in part inspired by the traditional approaches, also draw upon basic tools from computational science, applied mathematics and the increasing availability of powerful computational resources. This review presents a general overview of different strategies adopted for multiscale solutions of turbulent combustion flows. Within these strategies, some specific models are discussed or outlined to illustrate their capabilities and underlying assumptions. These strategies may be classified under four different classes, including (i) closure models for atomistic processes, (ii) multigrid and multiresolution strategies, (iii) flame-embedding strategies and (iv) hybrid large-eddy simulation-low-dimensional strategies. A combination of these strategies and models can potentially represent a robust alternative strategy to moment-based models; but a significant challenge remains in the development of computational frameworks for these approaches as well as their underlying theories. (topical review)

  18. Static Load Balancing Algorithms in Cloud Computing: Challenges & Solutions

    Directory of Open Access Journals (Sweden)

    Nadeem Shah

    2015-08-01

    Full Text Available Cloud computing provides on-demand hosted computing resources and services over the Internet on a pay-per-use basis. It is currently becoming the favored method of communication and computation over scalable networks due to numerous attractive attributes such as high availability, scalability, fault tolerance, simplicity of management and low cost of ownership. Due to the huge demand for cloud computing, efficient load balancing becomes critical to ensure that computational tasks are evenly distributed across servers and to prevent bottlenecks. The aim of this review paper is to understand the current challenges in cloud computing, primarily in cloud load balancing using static algorithms, and to find gaps to bridge for more efficient static cloud load balancing in the future. We believe the ideas suggested as new solutions will allow researchers to redesign better algorithms for better functionality and improved user experience in simple cloud systems. This could assist small businesses that cannot afford infrastructure that supports complex and dynamic load balancing algorithms.
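    As a toy illustration of what "static" means here - the assignment is fixed in advance from server capacities and never reacts to runtime load - the following sketch implements weighted round-robin dispatch; the server names and weights are hypothetical and not taken from the reviewed paper:

```python
from itertools import cycle

# Static weighted round-robin: the schedule is derived once from fixed server
# weights (capacities) and never adapts to the servers' actual runtime load.
servers = {"server-1": 3, "server-2": 2, "server-3": 1}   # hypothetical weights

def build_schedule(weights):
    """Expand the weights into a repeating assignment sequence, e.g. 1,1,1,2,2,3."""
    seq = []
    for name, w in weights.items():
        seq.extend([name] * w)
    return cycle(seq)

schedule = build_schedule(servers)
tasks = [f"task-{i}" for i in range(10)]

assignment = {task: next(schedule) for task in tasks}
for task, server in assignment.items():
    print(task, "->", server)
```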

  19. Progress of the Computer-Aided Engineering of Electric Drive Vehicle Batteries (CAEBAT) (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A. A.; Han, T.; Hartridge, S.; Shaffer, C.; Kim, G. H.; Pannala, S.

    2013-06-01

    This presentation, Progress of the Computer-Aided Engineering of Electric Drive Vehicle Batteries (CAEBAT), is about simulation and computer-aided engineering (CAE) tools that are widely used to speed up the research and development cycle and reduce the number of build-and-break steps, particularly in the automotive industry. Realizing this, DOE's Vehicle Technologies Program initiated the CAEBAT project in April 2010 to develop a suite of software tools for designing batteries.

  20. A computational method for computing an Alzheimer’s Disease Progression Score; experiments and validation with the ADNI dataset

    Science.gov (United States)

    Jedynak, Bruno M.; Liu, Bo; Lang, Andrew; Gel, Yulia; Prince, Jerry L.

    2014-01-01

    Understanding the time-dependent changes of biomarkers related to Alzheimer's disease (AD) is key to assessing disease progression and to measuring the outcomes of disease-modifying therapies. In this paper, we validate an Alzheimer's disease progression score model which uses multiple biomarkers to quantify the AD progression of subjects under three assumptions: (1) there is a unique disease progression for all subjects, (2) each subject has a different age of onset and rate of progression, and (3) each biomarker is sigmoidal as a function of disease progression. Fitting the parameters of this model is a challenging problem, which we approach using an alternating least squares optimization algorithm. In order to validate this optimization scheme under realistic conditions, we use the Alzheimer's Disease Neuroimaging Initiative (ADNI) cohort. With the help of Monte Carlo simulations, we show that most of the global parameters of the model are tightly estimated, thus enabling an ordering of the biomarkers by how well they fit the model: the Rey auditory verbal learning test with 30 minutes delay; the sum of the two lateral hippocampal volumes divided by the intra-cranial volume; followed by the clinical dementia rating sum of boxes score and the mini mental state examination score, in no particular order; and lastly the Alzheimer's disease assessment scale-cognitive subscale. PMID:25444605
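    A highly simplified sketch of the kind of alternating least squares fit described above (not the authors' code): synthetic data, a single sigmoidal biomarker, and subject-specific linear transforms from age to progression score are assumed purely for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

def sigmoid(s, c, d):
    return 1.0 / (1.0 + np.exp(-c * (s - d)))

# --- synthetic data: 20 subjects, one sigmoidal biomarker (all values invented)
n_subj, n_visits = 20, 5
true_a = rng.uniform(0.5, 1.5, n_subj)                 # subject-specific rate
true_b = rng.uniform(-3.0, 3.0, n_subj)                # subject-specific offset
ages = rng.uniform(0.0, 5.0, (n_subj, n_visits))
score = true_a[:, None] * ages + true_b[:, None]       # latent progression score
y = sigmoid(score, 1.0, 2.0) + 0.02 * rng.standard_normal(score.shape)

# --- alternating least squares -------------------------------------------
a = np.ones(n_subj)       # initial subject rates
b = np.zeros(n_subj)      # initial subject offsets
c, d = 1.0, 0.0           # initial sigmoid parameters

for sweep in range(10):
    # (1) fit the global sigmoid with subject parameters held fixed
    s = a[:, None] * ages + b[:, None]
    res = least_squares(lambda p: (sigmoid(s, p[0], p[1]) - y).ravel(), x0=[c, d])
    c, d = res.x
    # (2) fit each subject's (a_i, b_i) with the sigmoid held fixed
    for i in range(n_subj):
        def resid(p, i=i):
            return sigmoid(p[0] * ages[i] + p[1], c, d) - y[i]
        sol = least_squares(resid, x0=[a[i], b[i]])
        a[i], b[i] = sol.x
    rmse = np.sqrt(np.mean((sigmoid(a[:, None] * ages + b[:, None], c, d) - y) ** 2))
    print(f"sweep {sweep}: rmse = {rmse:.4f}")
```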

  1. Progress and Challenges in Assessing NOAA Data Management

    Science.gov (United States)

    de la Beaujardiere, J.

    2016-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) produces large volumes of environmental data from a great variety of observing systems including satellites, radars, aircraft, ships, buoys, and other platforms. These data are irreplaceable assets that must be properly managed to ensure they are discoverable, accessible, usable, and preserved. A policy framework has been established which informs data producers of their responsibilities and which supports White House-level mandates such as the Executive Order on Open Data and the OSTP Memorandum on Increasing Access to the Results of Federally Funded Scientific Research. However, assessing the current state and the progress toward completion for the many NOAA datasets is a challenge. This presentation will discuss work toward establishing assessment methodologies and dashboard-style displays. Ideally, metrics would be gathered through software and be automatically updated whenever an individual improvement was made. In practice, however, some level of manual information collection is required. Differing approaches to dataset granularity in different branches of NOAA yield additional complexity.

  2. Conference Scene: From innovative polymers to advanced nanomedicine: Key challenges, recent progress and future perspectives

    NARCIS (Netherlands)

    Feijen, Jan; Hennink, W.E.; Zhong, Zhiyuan

    2013-01-01

    Recent developments in polymer-based controlled delivery systems have made a significant clinical impact. The second Symposium on Innovative Polymers for Controlled Delivery (SIPCD) was held in Suzhou, China to address the key challenges and provide up-to-date progress and future perspectives in the

  3. Progress report of a research program in computational physics

    International Nuclear Information System (INIS)

    Guralnik, G.S.

    1990-01-01

    Task D's research is focused on understanding elementary particle physics through the techniques of quantum field theory. We make intensive use of computers to aid our research. During the last year we have made significant progress in understanding the weak interactions through the use of Monte Carlo methods as applied to the equations of quenched lattice QCD. We have launched a program to understand full (not quenched) lattice QCD on relatively large lattices using massively parallel computers. Because we are aware that Monte Carlo methods might not be able to give a good solution to field theories with the computer power likely to be available to us for the foreseeable future, we have launched an entirely different numerical approach to these problems. This 'Source Galerkin' method is based on an algebraic approach to the field-theoretic equations of motion and is (somewhat) related to variational and finite element techniques applied to a source space rather than a coordinate space. The results for relatively simple problems are sensationally good. In particular, fermions can be treated in a way which allows them to retain their status as independent dynamical entities in the theory. 8 refs

  4. PROGRESS & CHALLENGES IN CLEANUP OF HANFORD'S TANK WASTES

    Energy Technology Data Exchange (ETDEWEB)

    HEWITT, W.M.; SCHEPENS, R.

    2006-01-23

    The River Protection Project (RPP), which is managed by the Department of Energy (DOE) Office of River Protection (ORP), is highly complex from technical, regulatory, legal, political, and logistical perspectives and is the largest ongoing environmental cleanup project in the world. Over the past three years, ORP has made significant advances in its planning and execution of the cleanup of the Hanford tank wastes. The 149 single-shell tanks (SSTs), 28 double-shell tanks (DSTs), and 60 miscellaneous underground storage tanks (MUSTs) at Hanford contain approximately 200,000 m³ (53 million gallons) of mixed radioactive wastes, some of which date back to the first days of the Manhattan Project. The plan for treating and disposing of the waste stored in large underground tanks is to: (1) retrieve the waste, (2) treat the waste to separate it into high-level (sludge) and low-activity (supernatant) fractions, (3) remove key radionuclides (e.g., Cs-137, Sr-90, actinides) from the low-activity fraction to the maximum extent technically and economically practical, (4) immobilize both the high-level and low-activity waste fractions by vitrification, (5) store the high-level waste fraction on an interim basis for ultimate disposal off-site at the federal HLW repository, (6) dispose of the low-activity fraction on-site in the Integrated Disposal Facility (IDF), and (7) close the waste management areas consisting of tanks, ancillary equipment, soils, and facilities. Design and construction of the Waste Treatment and Immobilization Plant (WTP), the cornerstone of the RPP, has progressed substantially despite challenges arising from new seismic information for the WTP site. We have looked closely at the waste and aligned our treatment and disposal approaches with the waste characteristics. For example, approximately 11,000 m³ (2-3 million gallons) of metal sludges in twenty tanks were not created during spent nuclear fuel reprocessing and have low fission product concentrations. We

  5. Computer systems and nuclear industry

    International Nuclear Information System (INIS)

    Nkaoua, Th.; Poizat, F.; Augueres, M.J.

    1999-01-01

    This article deals with computer systems in the nuclear industry. In most nuclear facilities it is necessary to handle a great deal of data and actions in order to help the plant operator run the plant, control physical processes and assure safety. The design of reactors requires reliable computer codes able to simulate neutronic, mechanical and thermo-hydraulic behaviour. Calculations and simulations play an important role in safety analysis. In each of these domains, computer systems have progressively emerged as efficient tools to challenge and master complexity. (A.C.)

  6. Research program in computational physics: [Progress report for Task D

    International Nuclear Information System (INIS)

    Guralnik, G.S.

    1987-01-01

    Studies are reported of several aspects of the purely gluonic sector of QCD, including methods for efficiently generating gauge configurations, properties of the standard Wilson action and improved actions, and properties of the pure glue theory itself. Simulation of quantum chromodynamics in the 'quenched approximation', in which the back reaction of quarks upon gauge fields is neglected, is studied with fermions introduced on the lattice via both Wilson and staggered formulations. Efforts are also reported to compute QCD matrix elements and to simulate QCD beyond the quenched approximation, considering the effect of the quarks on the gauge fields. Work is in progress toward improving the algorithms used to generate the gauge field configurations and to compute the quark propagators. Implementation of lattice QCD on a hypercube is also reported.

  7. Opportunities and Challenges of Cloud Computing to Improve Health Care Services

    Science.gov (United States)

    2011-01-01

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed. PMID:21937354

  8. Opportunities and challenges of cloud computing to improve health care services.

    Science.gov (United States)

    Kuo, Alex Mu-Hsing

    2011-09-21

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed.

  9. Applications of computational intelligence in nuclear reactors

    International Nuclear Information System (INIS)

    Jayalal, M.L.; Jehadeesan, R.

    2016-01-01

    Computational intelligence techniques have been successfully employed in a wide range of applications, including the domains of medicine, bioinformatics, electronics, communications and business. There has been progress in applying computational intelligence in the nuclear reactor domain during the last two decades. The stringent nuclear safety regulations pertaining to the reactor environment present challenges for the application of computational intelligence in various nuclear sub-systems. The applications of various methods of computational intelligence in the domain of nuclear reactors are discussed in this paper. (author)

  10. Psychotherapy for Borderline Personality Disorder: Progress and Remaining Challenges.

    Science.gov (United States)

    Links, Paul S; Shah, Ravi; Eynan, Rahel

    2017-03-01

    The main purpose of this review was to critically evaluate the literature on psychotherapies for borderline personality disorder (BPD) published over the past 5 years, to identify progress and remaining challenges, and to determine priority areas for future research. A systematic review of the literature over the last 5 years was undertaken. The review yielded 184 relevant abstracts, and after applying inclusion criteria, 16 articles were fully reviewed based on their implications for future research and/or clinical practice. Our review indicated that patients with various severities benefited from psychotherapy; more intensive therapies were not significantly superior to less intensive therapies; enhancing emotion regulation processes and fostering more coherent self-identity were important mechanisms of change; therapies have been extended to patients with BPD and posttraumatic stress disorder; and more research needs to be directed at functional outcomes.

  11. Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling

    Science.gov (United States)

    Ormsbee, L.; Tufail, M.

    2005-12-01

    The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in the area of water resources and environmental systems. These range from the development of data-driven models to optimal control strategies that assist in a more informed and intelligent decision-making process. Availability of data is critical to such applications: scarce data may lead to models that do not represent the response function over the entire domain, while too much data has a tendency to over-constrain the problem. This paper describes the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The applications include the development of inductive models as substitutes for more complex process-based models to predict key water quality constituents (such as dissolved oxygen), and their use in an optimization framework for optimal load reductions. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. Some of the challenges faced in this application include 1) uncertainty in data sets, 2) model application, and 3) development of cause-and-effect relationships between water quality constituents and watershed parameters through the use of inductive models. The paper will discuss these challenges and how they affect the desired goals of the project.
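    As a purely illustrative sketch of the coupling described in the abstract - an inductive surrogate model embedded in an evolutionary search - the following uses an invented dissolved-oxygen surrogate and a tiny genetic algorithm to look for low-cost load reductions that keep predicted DO above a target; every function, weight and threshold is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sources = 5                          # hypothetical pollutant sources in the watershed

def surrogate_do(reduction):
    """Stand-in for a trained inductive (e.g. neural-network) model:
    predicted dissolved oxygen (mg/L) as a function of fractional load reductions."""
    weights = np.array([0.8, 1.2, 0.5, 1.5, 1.0])   # invented sensitivities
    return 4.0 + weights @ reduction                # invented baseline DO of 4.0 mg/L

def cost(reduction):
    """Invented implementation cost plus a penalty if predicted DO misses 5.0 mg/L."""
    c = np.sum(reduction ** 2) * 10.0
    shortfall = max(0.0, 5.0 - surrogate_do(reduction))
    return c + 1000.0 * shortfall

# --- a tiny genetic algorithm over fractional reductions in [0, 1] ----------
pop = rng.uniform(0.0, 1.0, (40, n_sources))
for generation in range(100):
    fitness = np.array([cost(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:20]]                     # truncation selection
    children = []
    for _ in range(20):
        p1, p2 = parents[rng.integers(20)], parents[rng.integers(20)]
        child = np.where(rng.random(n_sources) < 0.5, p1, p2)   # uniform crossover
        child += 0.05 * rng.standard_normal(n_sources)          # Gaussian mutation
        children.append(np.clip(child, 0.0, 1.0))
    pop = np.vstack([parents, children])

best = pop[np.argmin([cost(ind) for ind in pop])]
print("best reductions:", np.round(best, 2))
print("predicted DO:", round(surrogate_do(best), 2), "cost:", round(cost(best), 1))
```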

  12. Progression Analysis and Stage Discovery in Continuous Physiological Processes Using Image Computing

    Directory of Open Access Journals (Sweden)

    Ferrucci Luigi

    2010-01-01

    Full Text Available We propose an image computing-based method for quantitative analysis of continuous physiological processes that can be sensed by medical imaging and demonstrate its application to the analysis of morphological alterations of the bone structure, which correlate with the progression of osteoarthritis (OA. The purpose of the analysis is to quantitatively estimate OA progression in a fashion that can assist in understanding the pathophysiology of the disease. Ultimately, the texture analysis will be able to provide an alternative OA scoring method, which can potentially reflect the progression of the disease in a more direct fashion compared to the existing clinically utilized classification schemes based on radiology. This method can be useful not just for studying the nature of OA, but also for developing and testing the effect of drugs and treatments. While in this paper we demonstrate the application of the method to osteoarthritis, its generality makes it suitable for the analysis of other progressive clinical conditions that can be diagnosed and prognosed by using medical imaging.

  13. Research Progress of Global Land Domain Service Computing:Take GlobeLand 30 as an Example

    Directory of Open Access Journals (Sweden)

    CHEN Jun

    2017-10-01

    Full Text Available Combining service-computing technology with domain requirements is one of the important development directions of geographic information under Internet+, which provides highly efficient technical means for information sharing and collaborative services. Using GlobeLand 30 as an example, this paper analyzes the basic problems of integrating land cover information processing and service computing, introduces the latest research progress in domain service modeling, online computing method and dynamic service technology, and the GlobeLand 30 information service platform. The paper also discusses the further development directions of GlobeLand 30 domain service computing.

  14. Impact of changing computer technology on hydrologic and water resource modeling

    OpenAIRE

    Loucks, D.P.; Fedra, K.

    1987-01-01

    The increasing availability of substantial computer power at relatively low cost, and the increasing ease of using computer graphics, of communicating with other computers and data bases, and of programming using high-level problem-oriented computer languages, are providing new opportunities and challenges for those developing and using hydrologic and water resources models. This paper reviews some of the progress made towards the development and application of computer support systems designe...

  15. Technical progress faced with the challenges of the energy sector in the future

    International Nuclear Information System (INIS)

    Maillard, D.

    1999-01-01

    The colloquium organised by the Association of Energy Economists on the theme 'Technical progress faced with the challenges of the energy sector in the future' takes place against a backdrop of ever-increasing initiatives in this field, for example at the World Energy Council or the International Energy Agency. Faith in technical progress is widespread but should be supported by studies free of preconceived ideas. Research and development efforts must be fully supported, and in a climate of market opening and liberalization the public authorities have a major role to play. Historically, the markets have always been able to meet new needs thanks to technology, but the ambitious targets that the international community has set itself regarding greenhouse gas emissions imply technical improvements and major investments. (authors)

  16. Progression criteria for cancer antigen 15.3 and carcinoembryonic antigen in metastatic breast cancer compared by computer simulation of marker data

    DEFF Research Database (Denmark)

    Sölétormos, G; Hyltoft Petersen, P; Dombernowsky, P

    2000-01-01

    BACKGROUND: We investigated the utility of computer simulation models for performance comparisons of different tumor marker assessment criteria to define progression or nonprogression of metastatic breast cancer. METHODS: Clinically relevant values for progressive cancer antigen 15.3 and carcinoembryonic antigen concentrations were combined with representative values for background variations in a computer simulation model. Fifteen criteria for assessment of longitudinal tumor marker data were obtained from the literature and computerized. Altogether, 7200 different patients, each based on 50... CONCLUSIONS: The computer simulation model is a fast, effective, and inexpensive approach for comparing the diagnostic potential of assessment criteria during clinically relevant conditions of steady-state and progressive disease. The model systems can be used to generate tumor marker assessment...
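    The simulation approach can be illustrated in a deliberately simplified, hypothetical form (this is not the authors' model or any of their fifteen criteria): marker series with multiplicative background variation are generated around either a steady-state or a progressively increasing level, and one example assessment criterion is scored against the known truth.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_series(n_points, progressive, cv=0.15, step_increase=0.10):
    """Marker concentrations with multiplicative background variation (CV ~ 15%).
    Progressive disease adds an invented steady percentage increase per measurement."""
    level = (30.0 * (1.0 + step_increase) ** np.arange(n_points)
             if progressive else np.full(n_points, 30.0))
    return level * np.exp(rng.normal(0.0, cv, n_points))

def criterion_flags_progression(series, threshold=1.25):
    """Example criterion: two consecutive values at least 25% above the baseline value."""
    above = series[1:] > threshold * series[0]
    return any(above[i] and above[i + 1] for i in range(len(above) - 1))

n_patients = 1000
calls_prog   = [criterion_flags_progression(simulate_series(10, True))  for _ in range(n_patients)]
calls_steady = [criterion_flags_progression(simulate_series(10, False)) for _ in range(n_patients)]

print("sensitivity (progressive disease detected):", np.mean(calls_prog))
print("false-positive rate (steady state flagged):", np.mean(calls_steady))
```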

  17. Progress and challenges of carbon nanotube membrane in water treatment

    KAUST Repository

    Lee, Jieun

    2016-05-25

    The potential of carbon nanotube (CNT) membranes in water treatment has become much clearer during the last decade. According to work published to date, the unique and excellent characteristics of CNTs allow them to outperform conventional polymer membranes. Such achievements of CNT membranes depend greatly on their fabrication methods. Further, the intrinsic properties of CNTs can be a critical factor in their applicability to membrane processes. This article provides an explicit and systematic review of the progress of CNT membranes, addressing two pressing questions: (i) whether CNT membranes can tackle current challenges in pressure- or thermally driven membrane processes, and (ii) whether CNT hybrid nanocomposites, as a new generation of materials, can complement current CNT-enhanced membranes. © 2016 Taylor & Francis Group, LLC.

  18. Bringing high-performance computing to the biologist's workbench: approaches, applications, and challenges

    International Nuclear Information System (INIS)

    Oehmen, C S; Cannon, W R

    2008-01-01

    Data-intensive and high-performance computing are poised to significantly impact the future of biological research, which is increasingly driven by the prevalence of high-throughput experimental methodologies for genome sequencing, transcriptomics, proteomics, and other areas. Large centers (such as NIH's National Center for Biotechnology Information, The Institute for Genomic Research, and the DOE's Joint Genome Institute) have made extensive use of multiprocessor architectures to deal with some of the challenges of processing, storing and curating exponentially growing genomic and proteomic datasets, thus enabling users to rapidly access a growing public data source, as well as to use analysis tools transparently on high-performance computing resources. Applying this computational power to single-investigator analysis, however, often relies on users to provide their own computational resources, forcing them to endure the learning curve of porting, building, and running software on multiprocessor architectures. Solving the next generation of large-scale biology challenges using multiprocessor machines - from small clusters to emerging petascale machines - can most practically be realized if this learning curve can be minimized through a combination of workflow management, data management and resource allocation, as well as intuitive interfaces and compatibility with existing common data formats.

  19. Impacts of mothers' occupation status and parenting styles on levels of self-control, addiction to computer games, and educational progress of adolescents.

    Science.gov (United States)

    Abedini, Yasamin; Zamani, Bibi Eshrat; Kheradmand, Ali; Rajabizadeh, Ghodratollah

    2012-01-01

    Addiction to computer (video) games in adolescents and its relationship with educational progress has recently attracted the attention of rearing and education experts as well as organizations and institutes involved in physical and mental health. The current research attempted to propose a structural model of the relationships between parenting styles, mothers' occupation status, and addiction to computer games, self-control, and educational progress of secondary school students. Using multistage cluster random sampling, 500 female and male secondary school students in Kerman (Iran) were selected and studied. The research tools included self-control, parenting styles, and addiction to computer games questionnaires and a self-made questionnaire containing demographic details. The data was analyzed using exploratory factor analysis, Cronbach's alpha coefficient and route analysis (in LISREL). We found self-control to have a linking role in the relationship between four parenting styles and educational progress. Mothers' occupation status was directly and significantly correlated with addiction to computer games. Although four parenting styles directly and significantly affected addiction to computer games, the findings did not support the linking role of addiction to computer games in the relationship between four parenting styles and educational progress. In agreement with previous studies, the current research reflected the impact of four parenting styles on self-control, addiction to computer games, and educational progress of students. Among the parenting styles, authoritative style can affect the severity of addiction to computer games through self-control development. It can thus indirectly influence the educational progress of students. Parents are recommended to use authoritative parenting style to help both self-management and psychological health of their children. The employed mothers are also recommended to have more supervision and control on the degree

  20. Impacts of Mothers’ Occupation Status and Parenting Styles on Levels of Self-Control, Addiction to Computer Games, and Educational Progress of Adolescents

    Science.gov (United States)

    Abedini, Yasamin; Zamani, Bibi Eshrat; Kheradmand, Ali; Rajabizadeh, Ghodratollah

    2012-01-01

    Background Addiction to computer (video) games in adolescents and its relationship with educational progress has recently attracted the attention of rearing and education experts as well as organizations and institutes involved in physical and mental health. The current research attempted to propose a structural model of the relationships between parenting styles, mothers’ occupation status, and addiction to computer games, self-control, and educational progress of secondary school students. Methods Using multistage cluster random sampling, 500 female and male secondary school students in Kerman (Iran) were selected and studied. The research tools included self-control, parenting styles, and addiction to computer games questionnaires and a self-made questionnaire containing demographic details. The data was analyzed using exploratory factor analysis, Cronbach’s alpha coefficient and route analysis (in LISREL). Findings We found self-control to have a linking role in the relationship between four parenting styles and educational progress. Mothers’ occupation status was directly and significantly correlated with addiction to computer games. Although four parenting styles directly and significantly affected addiction to computer games, the findings did not support the linking role of addiction to computer games in the relationship between four parenting styles and educational progress. Conclusion In agreement with previous studies, the current research reflected the impact of four parenting styles on self-control, addiction to computer games, and educational progress of students. Among the parenting styles, authoritative style can affect the severity of addiction to computer games through self-control development. It can thus indirectly influence the educational progress of students. Parents are recommended to use authoritative parenting style to help both self-management and psychological health of their children. The employed mothers are also recommended to

  1. Multiagent Work Practice Simulation: Progress and Challenges

    Science.gov (United States)

    Clancey, William J.; Sierhuis, Maarten

    2002-01-01

    Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and computer systems. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3d space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).
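    As a loose, hypothetical illustration of the kind of model described - agents with group membership, locations and scheduled activities stepped through time - the following sketch is not Brahms and does not use its language; all agents, groups, locations and activities are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    group: str
    location: str
    schedule: dict = field(default_factory=dict)   # hour -> (activity, location)

    def step(self, hour):
        """Perform the scheduled activity for this hour (or stay idle) and move there."""
        activity, place = self.schedule.get(hour, ("idle", self.location))
        self.location = place
        return f"{hour:02d}:00  {self.name} ({self.group}) does '{activity}' in {place}"

# Invented agents, groups, locations and activities for illustration.
crew = [
    Agent("Ann", "flight-controllers", "control-room",
          {9: ("shift handover", "control-room"), 10: ("monitor telemetry", "control-room")}),
    Agent("Bo", "engineers", "office",
          {9: ("impromptu meeting", "hallway"), 10: ("repair simulator", "lab")}),
]

for hour in range(9, 12):
    for agent in crew:
        print(agent.step(hour))
```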

  2. [Facing the challenges of ubiquitous computing in the health care sector].

    Science.gov (United States)

    Georgieff, Peter; Friedewald, Michael

    2010-01-01

    The steady progress of microelectronics, communications and information technology will enable the realisation of the vision for "ubiquitous computing" where the Internet extends into the real world embracing everyday objects. The necessary technical basis is already in place. Due to their diminishing size, constantly falling price and declining energy consumption, processors, communications modules and sensors are being increasingly integrated into everyday objects today. This development is opening up huge opportunities for both the economy and individuals. In the present paper we discuss possible applications, but also technical, social and economic barriers to a widespread use of ubiquitous computing in the health care sector.

  3. Challenges and considerations for the design and production of a purpose-optimized body-worn wrist-watch computer

    Science.gov (United States)

    Narayanaswami, Chandra; Raghunath, Mandayam T.

    2004-09-01

    We outline a collection of technological challenges in the design of wearable computers with a focus on one of the most desirable form-factors, the wrist watch. We describe our experience with building three generations of wrist watch computers. We built these research prototypes as platforms to investigate the fundamental limitations of wearable computing. Results of our investigations are presented in the form of challenges that have been overcome and those that still remain.

  4. Challenges to Software/Computing for Experimentation at the LHC

    Science.gov (United States)

    Banerjee, Sunanda

    The demands of future high energy physics experiments towards software and computing have led the experiments to plan the related activities as a full-fledged project and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently put together in an LHC-wide framework based on Grid technology. The current status and understandings have been broadly outlined.

  5. Towards Cloud Computing SLA Risk Management: Issues and Challenges

    OpenAIRE

    Morin, Jean-Henry; Aubert, Jocelyn; Gateau, Benjamin

    2012-01-01

    Cloud Computing has become mainstream technology offering a commoditized approach to software, platform and infrastructure as a service over the Internet on a global scale. This raises important new security issues beyond traditional perimeter based approaches. This paper attempts to identify these issues and their corresponding challenges, proposing to use risk and Service Level Agreement (SLA) management as the basis for a service level framework to improve governance, risk and compliance i...

  6. Computational Science And Engineering Software Sustainability And Productivity (CSESSP) Challenges Workshop Report

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This report details the challenges and opportunities discussed at the NITRD sponsored multi-agency workshop on Computational Science and Engineering Software...

  7. Abduction aiming at empirical progress or even at truth approximation, leading to challenge for computational modelling

    NARCIS (Netherlands)

    Kuipers, Theo A.F.

    1999-01-01

    This paper primarily deals with the conceptual prospects for generalizing the aim of abduction from the standard one of explaining surprising or anomalous observations to that of empirical progress or even truth approximation. It turns out that the main abduction task then becomes the

  8. The Office of Site Closure: Progress in the Face of Challenges

    International Nuclear Information System (INIS)

    Fiore, J. J.; Murphie, W. E.; Meador, S. W.

    2002-01-01

    The Office of Site Closure (OSC) was formed in November 1999 when the Department of Energy's (DOE's) Office of Environmental Management (EM) reorganized to focus specifically on site cleanup and closure. OSC's objective is to achieve safe and cost-effective cleanups and closures that are protective of our workers, the public, and the environment, now and in the future. Since its inception, OSC has focused on implementing a culture of safe closure, with emphasis in three primary areas: complete our responsibility for the Closure Sites Rocky Flats, Mound, Fernald, Ashtabula, and Weldon Spring; complete our responsibility for cleanup at sites where the DOE mission has been completed (examples include Battelle King Avenue and Battelle West Jefferson in Columbus, and General Atomics) or where other Departmental organizations have an ongoing mission (examples include the Brookhaven, Livermore, or Los Alamos National Laboratories, and the Nevada Test Site); and create a framework and develop specific business closure tools that will help sites close, such as guidance for and decisions on post-contract benefit liabilities, records retention, and Federal employee incentives for site closure. This paper discusses OSC's 2001 progress in achieving site cleanups, moving towards site closure, and developing specific business closure tools to support site closure. It describes the tools used to achieve progress towards cleanup and closure, such as the application of new technologies, changes in contracting approaches, and the development of agreements between sites and with host states. The paper also identifies upcoming challenges and explores options for how Headquarters and the sites can work together to address these challenges. Finally, it articulates OSC's new focus on oversight of Field Offices to ensure they have the systems in place to oversee contractor activities resulting in site cleanups and closures.

  9. The Glen Canyon Dam adaptive management program: progress and immediate challenges

    Science.gov (United States)

    Hamill, John F.; Melis, Theodore S.; Boon, Philip J.; Raven, Paul J.

    2012-01-01

    Adaptive management emerged as an important resource management strategy for major river systems in the United States (US) in the early 1990s. The Glen Canyon Dam Adaptive Management Program (‘the Program’) was formally established in 1997 to fulfill a statutory requirement in the 1992 Grand Canyon Protection Act (GCPA). The GCPA aimed to improve natural resource conditions in the Colorado River corridor in the Glen Canyon National Recreation Area and Grand Canyon National Park, Arizona that were affected by the Glen Canyon dam. The Program achieves this by using science and a variety of stakeholder perspectives to inform decisions about dam operations. Since the Program started, the ecosystem is now much better understood and several biological and physical improvements have been achieved. These improvements include: (i) an estimated 50% increase in the adult population of endangered humpback chub (Gila cypha) between 2001 and 2008, following previous decline; (ii) a 90% decrease in non-native rainbow trout (Oncorhynchus mykiss), which are known to compete with and prey on native fish, as a result of removal experiments; and (iii) the widespread reappearance of sandbars in response to an experimental high-flow release of dam water in March 2008. Although substantial progress has been made, the Program faces several immediate challenges. These include: (i) defining specific, measurable objectives and desired future conditions for important natural, cultural and recreational attributes to inform science and management decisions; (ii) implementing structural and operational changes to improve collaboration among stakeholders; (iii) establishing a long-term experimental programme and management plan; and (iv) securing long-term funding for monitoring programmes to assess ecosystem and other responses to management actions. Addressing these challenges and building on recent progress will require strong and consistent leadership from the US Department of the Interior.

  10. Progress and challenges to the global waste management system.

    Science.gov (United States)

    Singh, Jagdeep; Laurenti, Rafael; Sinha, Rajib; Frostell, Björn

    2014-09-01

    Rapid economic growth, urbanization and increasing population have caused (materially intensive) resource consumption to increase, and consequently the release of large amounts of waste to the environment. From a global perspective, current waste and resource management lacks a holistic approach covering the whole chain of product design, raw material extraction, production, consumption, recycling and waste management. In this article, progress and different sustainability challenges facing the global waste management system are presented and discussed. The study leads to the conclusion that the current, rather isolated efforts, in different systems for waste management, waste reduction and resource management are indeed not sufficient in a long term sustainability perspective. In the future, to manage resources and wastes sustainably, waste management requires a more systems-oriented approach that addresses the root causes for the problems. A specific issue to address is the development of improved feedback information (statistics) on how waste generation is linked to consumption. © The Author(s) 2014.

  11. Progress and Challenges in Developing Aptamer-Functionalized Targeted Drug Delivery Systems

    Directory of Open Access Journals (Sweden)

    Feng Jiang

    2015-10-01

    Aptamers, which can be screened via systematic evolution of ligands by exponential enrichment (SELEX), are superior ligands for molecular recognition due to their high selectivity and affinity. The interest in the use of aptamers as ligands for targeted drug delivery has been increasing due to their unique advantages. Based on their different compositions and preparation methods, aptamer-functionalized targeted drug delivery systems can be divided into two main categories: aptamer-small molecule conjugated systems and aptamer-nanomaterial conjugated systems. In this review, we not only summarize recent progress in aptamer selection and the application of aptamers in these targeted drug delivery systems but also discuss the advantages, challenges and new perspectives associated with these delivery systems.

  12. Quantum Computation: Entangling with the Future

    Science.gov (United States)

    Jiang, Zhang

    2017-01-01

    Commercial applications of quantum computation have become viable due to the rapid progress of the field in the recent years. Efficient quantum algorithms are discovered to cope with the most challenging real-world problems that are too hard for classical computers. Manufactured quantum hardware has reached unprecedented precision and controllability, enabling fault-tolerant quantum computation. Here, I give a brief introduction on what principles in quantum mechanics promise its unparalleled computational power. I will discuss several important quantum algorithms that achieve exponential or polynomial speedup over any classical algorithm. Building a quantum computer is a daunting task, and I will talk about the criteria and various implementations of quantum computers. I conclude the talk with near-future commercial applications of a quantum computer.
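
    As a concrete illustration of the entanglement mentioned above, the following sketch builds a two-qubit Bell state with plain NumPy state-vector arithmetic. It is an editorial illustration of the underlying principle, not code from the talk; the gate matrices are the standard Hadamard and CNOT.

```python
# Minimal state-vector sketch of entanglement (a Bell state), assuming only NumPy.
import numpy as np

# Single-qubit Hadamard and the two-qubit CNOT gate (control = qubit 0).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply H to qubit 0, then CNOT -> (|00> + |11>)/sqrt(2).
state = np.zeros(4)
state[0] = 1.0
state = np.kron(H, I) @ state
state = CNOT @ state

# Measurement probabilities: only |00> and |11> appear, each with probability 0.5,
# a correlation that classical bits cannot reproduce.
print(np.round(np.abs(state) ** 2, 3))   # [0.5 0.  0.  0.5]
```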

  13. Recent Progress in First-Principles Methods for Computing the Electronic Structure of Correlated Materials

    Directory of Open Access Journals (Sweden)

    Fredrik Nilsson

    2018-03-01

    Substantial progress has been achieved in the last couple of decades in computing the electronic structure of correlated materials from first principles. This progress has been driven by parallel development in theory and numerical algorithms. Theoretical development in combining ab initio approaches and many-body methods is particularly promising. A crucial role is also played by a systematic method for deriving a low-energy model, which bridges the gap between real and model systems. In this article, an overview is given tracing the development from the LDA+U to the latest progress in combining the GW method and extended dynamical mean-field theory (GW+EDMFT). The emphasis is on conceptual and theoretical aspects rather than technical ones.

  14. PROGRESS and CHALLENGES IN CLEANUP OF HANFORD'S TANK WASTES

    International Nuclear Information System (INIS)

    HEWITT, W.M.; SCHEPENS, R.

    2006-01-01

    The River Protection Project (RPP), which is managed by the Department of Energy (DOE) Office of River Protection (ORP), is highly complex from technical, regulatory, legal, political, and logistical perspectives and is the largest ongoing environmental cleanup project in the world. Over the past three years, ORP has made significant advances in its planning and execution of the cleanup of the Hanford tank wastes. The 149 single-shell tanks (SSTs), 28 double-shell tanks (DSTs), and 60 miscellaneous underground storage tanks (MUSTs) at Hanford contain approximately 200,000 m³ (53 million gallons) of mixed radioactive wastes, some of which dates back to the first days of the Manhattan Project. The plan for treating and disposing of the waste stored in large underground tanks is to: (1) retrieve the waste, (2) treat the waste to separate it into high-level (sludge) and low-activity (supernatant) fractions, (3) remove key radionuclides (e.g., Cs-137, Sr-90, actinides) from the low-activity fraction to the maximum extent technically and economically practical, (4) immobilize both the high-level and low-activity waste fractions by vitrification, (5) interim store the high-level waste fraction for ultimate disposal off-site at the federal HLW repository, (6) dispose the low-activity fraction on-site in the Integrated Disposal Facility (IDF), and (7) close the waste management areas consisting of tanks, ancillary equipment, soils, and facilities. Design and construction of the Waste Treatment and Immobilization Plant (WTP), the cornerstone of the RPP, has progressed substantially despite challenges arising from new seismic information for the WTP site. We have looked closely at the waste and aligned our treatment and disposal approaches with the waste characteristics. For example, approximately 11,000 m³ (2-3 million gallons) of metal sludges in twenty tanks were not created during spent nuclear fuel reprocessing and have low fission product concentrations. We plan to

  15. The challenges of developing computational physics: the case of South Africa

    International Nuclear Information System (INIS)

    Salagaram, T; Chetty, N

    2013-01-01

    Most modern scientific research problems are complex and interdisciplinary in nature. It is impossible to study such problems in detail without the use of computation in addition to theory and experiment. Although it is widely agreed that students should be introduced to computational methods at the undergraduate level, it remains a challenge to do this in a full traditional undergraduate curriculum. In this paper, we report on a survey that we conducted of undergraduate physics curricula in South Africa to determine the content and the approach taken in the teaching of computational physics. We also considered the pedagogy of computational physics at the postgraduate and research levels at various South African universities, research facilities and institutions. We conclude that the state of computational physics training in South Africa, especially at the undergraduate teaching level, is generally weak and needs to be given more attention at all universities. Failure to do so will impact negatively on the country's capacity to grow its endeavours generally in the field of computational sciences, with negative impacts on research, and in commerce and industry.

  16. Computed tomographic findings of progressive supranuclear palsy compared with Parkinson's disease

    Energy Technology Data Exchange (ETDEWEB)

    Yuki, Nobuhiro; Sato, Shuzo; Yuasa, Tatsuhiko; Ito, Jusuke; Miyatake, Tadashi [Niigata Univ. (Japan). School of Dentistry]

    1990-10-01

    We investigated computed tomographic (CT) films of 4 pathologically documented cases of progressive supranuclear palsy (PSP) in which the clinical presentations were atypical and compared the findings with those of 15 patients with Parkinson's disease (PD). Dilatation of the third ventricle, atrophy of the midbrain tegmentum, and enlargement of the interpeduncular cistern toward the aqueduct were found to be the characteristic findings in PSP. Thus, radiological findings can be useful when the differential diagnosis between PSP and PD is clinically difficult. (author).

  17. A community computational challenge to predict the activity of pairs of compounds.

    Science.gov (United States)

    Bansal, Mukesh; Yang, Jichen; Karan, Charles; Menden, Michael P; Costello, James C; Tang, Hao; Xiao, Guanghua; Li, Yajuan; Allen, Jeffrey; Zhong, Rui; Chen, Beibei; Kim, Minsoo; Wang, Tao; Heiser, Laura M; Realubit, Ronald; Mattioli, Michela; Alvarez, Mariano J; Shen, Yao; Gallahan, Daniel; Singer, Dinah; Saez-Rodriguez, Julio; Xie, Yang; Stolovitzky, Gustavo; Califano, Andrea

    2014-12-01

    Recent therapeutic successes have renewed interest in drug combinations, but experimental screening approaches are costly and often identify only small numbers of synergistic combinations. The DREAM consortium launched an open challenge to foster the development of in silico methods to computationally rank 91 compound pairs, from the most synergistic to the most antagonistic, based on gene-expression profiles of human B cells treated with individual compounds at multiple time points and concentrations. Using scoring metrics based on experimental dose-response curves, we assessed 32 methods (31 community-generated approaches and SynGen), four of which performed significantly better than random guessing. We highlight similarities between the methods. Although the accuracy of predictions was not optimal, we find that computational prediction of compound-pair activity is possible, and that community challenges can be useful to advance the field of in silico compound-synergy prediction.
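
    To make the ranking task concrete, the sketch below scores hypothetical compound pairs with a Bliss-independence excess, one common synergy measure, and sorts them from most synergistic to most antagonistic. The challenge's actual scoring used experimental dose-response curves, and the input numbers here are invented for illustration.

```python
# A minimal sketch of ranking compound pairs by predicted synergy, assuming a
# Bliss-independence baseline as the null model. Inputs are hypothetical.
def bliss_excess(effect_a, effect_b, effect_ab):
    """Positive = synergy, negative = antagonism (effects as fractions in [0, 1])."""
    expected = effect_a + effect_b - effect_a * effect_b  # Bliss independence
    return effect_ab - expected

# Hypothetical single-agent and combination effects for three compound pairs.
pairs = {
    ("drugA", "drugB"): (0.30, 0.40, 0.75),
    ("drugA", "drugC"): (0.30, 0.20, 0.40),
    ("drugB", "drugC"): (0.40, 0.20, 0.45),
}

scores = {pair: bliss_excess(*effects) for pair, effects in pairs.items()}
ranking = sorted(scores, key=scores.get, reverse=True)  # most synergistic first
for pair in ranking:
    print(pair, round(scores[pair], 3))
```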

  18. Association between Smoking and the Progression of Computed Tomography Findings in Chronic Pancreatitis.

    Science.gov (United States)

    Lee, Jeong Woo; Kim, Ho Gak; Lee, Dong Wook; Han, Jimin; Kwon, Hyuk Yong; Seo, Chang Jin; Oh, Ji Hye; Lee, Joo Hyoung; Jung, Jin Tae; Kwon, Joong Goo; Kim, Eun Young

    2016-05-23

    Smoking and alcohol intake are two well-known risk factors for chronic pancreatitis. However, there are few studies examining the association between smoking and changes in computed tomography (CT) findings in chronic pancreatitis. The authors evaluated associations between smoking, drinking and the progression of calcification on CT in chronic pancreatitis. In this retrospective study, 59 patients with chronic pancreatitis who had undergone initial and follow-up CT between January 2002 and September 2010 were included. Progression of calcification among CT findings was compared according to the amount of alcohol intake and smoking. The median duration of follow-up was 51.6 months (range, 17.1 to 112.7 months). At initial CT findings, there was pancreatic calcification in 35 patients (59.3%). In the follow-up CT, progression of calcification was observed in 37 patients (62.7%). Progression of calcification was more common in smokers according to the multivariate analysis (odds ratio [OR], 9.987; p=0.006). The amount of smoking was a significant predictor for progression of calcification in the multivariate analysis (OR, 6.051 in less than 1 pack per day smokers; OR, 36.562 in more than 1 pack per day smokers; p=0.008). Continued smoking accelerates pancreatic calcification, and the amount of smoking is associated with the progression of calcification in chronic pancreatitis.

  19. IAEA Mission Sees Significant Progress in Georgia’s Regulatory Framework, Challenges Ahead

    International Nuclear Information System (INIS)

    2018-01-01

    An International Atomic Energy Agency (IAEA) team of experts said Georgia has made significant progress in strengthening its regulatory framework for nuclear and radiation safety. The team also pointed to challenges ahead as Georgia seeks to achieve further progress. The Integrated Regulatory Review Service (IRRS) team concluded a 10-day mission on 28 February to assess the regulatory safety framework in Georgia. The mission was conducted at the request of the Government and hosted by the Agency of Nuclear and Radiation Safety (ANRS), which is responsible for regulatory oversight in the country. IRRS missions are designed to strengthen the effectiveness of the national safety regulatory infrastructure, while recognizing the responsibility of each State to ensure nuclear and radiation safety. Georgia uses radioactive sources in medicine and industry and operates radioactive waste management facilities. It has decommissioned its only research reactor and has no nuclear power plants. In recent years, the Government and ANRS, with assistance from the IAEA, introduced new safety regulations and increased the number of regulatory inspections.

  20. CMS results in the Combined Computing Readiness Challenge CCRC'08

    International Nuclear Information System (INIS)

    Bonacorsi, D.; Bauerdick, L.

    2009-01-01

    During February and May 2008, CMS participated in the Combined Computing Readiness Challenge (CCRC'08) together with all other LHC experiments. The purpose of this worldwide exercise was to check the readiness of the Computing infrastructure for LHC data taking. Another set of major CMS tests called Computing, Software and Analysis challenge (CSA'08) - as well as CMS cosmic runs - were also running at the same time: CCRC augmented the load on computing with additional tests to validate and stress-test all CMS computing workflows at full data taking scale, also extending this to the global WLCG community. CMS exercised most aspects of the CMS computing model, with very comprehensive tests. During May 2008, CMS moved more than 3.6 Petabytes among more than 300 links in the complex Grid topology. CMS demonstrated that it is able to safely move data out of CERN to the Tier-1 sites, sustaining more than 600 MB/s as a daily average for more than seven days in a row, with enough headroom and with hourly peaks of up to 1.7 GB/s. CMS ran hundreds of simultaneous jobs at each Tier-1 site, re-reconstructing and skimming hundreds of millions of events. After re-reconstruction the fresh AOD (Analysis Object Data) has to be synchronized between Tier-1 centers: CMS demonstrated that the required inter-Tier-1 transfers are achievable within a few days. CMS also showed that skimmed analysis data sets can be transferred to Tier-2 sites for analysis at sufficient rate, regionally as well as inter-regionally, achieving all goals in about 90% of >200 links. Simultaneously, CMS also ran a large Tier-2 analysis exercise, where realistic analysis jobs were submitted to a large set of Tier-2 sites by a large number of people to produce a chaotic workload across the systems, and with more than 400 analysis users in May. Taken all together, CMS routinely achieved submissions of 100k jobs/day, with peaks up to 200k jobs/day. The achieved results in CCRC'08 - focussing on the distributed
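
    A quick back-of-envelope check of the quoted transfer figures, assuming decimal units for network rates; this is an editorial illustration, not part of the CCRC'08 report.

```python
# Back-of-envelope check of the transfer figures quoted above, assuming decimal
# units (1 GB = 1e9 bytes), the usual convention for network rates.
sustained_mb_per_s = 600          # daily-average export rate out of CERN (MB/s)
peak_gb_per_s = 1.7               # hourly peak (GB/s)
days_sustained = 7

per_day_tb = sustained_mb_per_s * 86_400 / 1e6        # MB/day -> TB/day
total_pb = per_day_tb * days_sustained / 1e3          # TB -> PB over the week

print(f"~{per_day_tb:.1f} TB/day sustained")                      # ~51.8 TB/day
print(f"~{total_pb:.2f} PB over {days_sustained} days")           # ~0.36 PB
print(f"peak is ~{peak_gb_per_s * 1e3 / sustained_mb_per_s:.1f}x the daily average")
```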

  1. Factors associated with coronary artery disease progression assessed by serial coronary computed tomography angiography

    International Nuclear Information System (INIS)

    Camargo, Gabriel Cordeiro; Gottlieb, Ilan; Rothstein, Tamara; Derenne, Maria Eduarda; Sabioni, Leticia; Lima, Ronaldo de Souza Leão; Lima, João A. C.

    2017-01-01

    Background: Coronary computed tomography angiography (CCTA) allows for noninvasive coronary artery disease (CAD) phenotyping. Factors related to CAD progression are epidemiologically valuable. Objective: To identify factors associated with CAD progression in patients undergoing sequential CCTA testing. Methods: We retrospectively analyzed 384 consecutive patients who had at least two CCTA studies between December 2005 and March 2013. Due to limitations in the quantification of CAD progression, we excluded patients who had undergone surgical revascularization previously or percutaneous coronary intervention (PCI) between studies. CAD progression was defined as any increase in the adapted segment stenosis score (calculated using the number of diseased segments and stenosis severity) in all coronary segments without stent (in-stent restenosis was excluded from the analysis). Stepwise logistic regression was used to assess variables associated with CAD progression. Results: From a final population of 234 patients, a total of 117 (50%) had CAD progression. In a model accounting for major CAD risk factors and other baseline characteristics, only age (odds ratio [OR] 1.04, 95% confidence interval [95%CI] 1.01–1.07), interstudy interval (OR 1.03, 95%CI 1.01–1.04), and past PCI (OR 3.66, 95%CI 1.77–7.55) showed an independent relationship with CAD progression. Conclusions: A history of PCI with stent placement was independently associated with a 3.7-fold increase in the odds of CAD progression, excluding in-stent restenosis. Age and interstudy interval were also independent predictors of progression. (author)
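
    The adapted segment stenosis score is only described above as combining the number of diseased segments with stenosis severity. The sketch below assumes one common convention (per-segment severity grades 0-3 summed over non-stented segments) to show how the "any increase" progression criterion could be applied; the study's exact adaptation may differ.

```python
# A hedged sketch of a segment stenosis score and the progression criterion described
# above, assuming grades 0-3 per coronary segment summed over segments, with stented
# segments excluded from the comparison. The study's exact scoring may differ.
SEVERITY_GRADE = {"none": 0, "mild": 1, "moderate": 2, "severe": 3}

def segment_stenosis_score(segments):
    """segments: dict of segment name -> severity label; stented segments omitted."""
    return sum(SEVERITY_GRADE[severity] for severity in segments.values())

baseline  = {"LAD_prox": "mild", "LAD_mid": "none", "RCA_prox": "moderate"}
follow_up = {"LAD_prox": "moderate", "LAD_mid": "none", "RCA_prox": "moderate"}

progressed = segment_stenosis_score(follow_up) > segment_stenosis_score(baseline)
print(progressed)  # True: any increase in the score counts as CAD progression
```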

  2. Factors associated with coronary artery disease progression assessed by serial coronary computed tomography angiography

    Energy Technology Data Exchange (ETDEWEB)

    Camargo, Gabriel Cordeiro; Gottlieb, Ilan, E-mail: ilangottlieb@gmail.com [Casa de Saúde São José, Rio de Janeiro, RJ (Brazil); Rothstein, Tamara; Derenne, Maria Eduarda; Sabioni, Leticia; Lima, Ronaldo de Souza Leão [Centro de Diagnóstico por Imagem CDPI, Rio de Janeiro, RJ (Brazil); Lima, João A. C. [Johns Hopkins University, Baltimore (United States)

    2017-05-15

    Background: Coronary computed tomography angiography (CCTA) allows for noninvasive coronary artery disease (CAD) phenotyping. Factors related to CAD progression are epidemiologically valuable. Objective: To identify factors associated with CAD progression in patients undergoing sequential CCTA testing. Methods: We retrospectively analyzed 384 consecutive patients who had at least two CCTA studies between December 2005 and March 2013. Due to limitations in the quantification of CAD progression, we excluded patients who had undergone surgical revascularization previously or percutaneous coronary intervention (PCI) between studies. CAD progression was defined as any increase in the adapted segment stenosis score (calculated using the number of diseased segments and stenosis severity) in all coronary segments without stent (in-stent restenosis was excluded from the analysis). Stepwise logistic regression was used to assess variables associated with CAD progression. Results: From a final population of 234 patients, a total of 117 (50%) had CAD progression. In a model accounting for major CAD risk factors and other baseline characteristics, only age (odds ratio [OR] 1.04, 95% confidence interval [95%CI] 1.01–1.07), interstudy interval (OR 1.03, 95%CI 1.01–1.04), and past PCI (OR 3.66, 95%CI 1.77–7.55) showed an independent relationship with CAD progression. Conclusions: A history of PCI with stent placement was independently associated with a 3.7-fold increase in the odds of CAD progression, excluding in-stent restenosis. Age and interstudy interval were also independent predictors of progression. (author)

  3. Progress Toward Source-to-Target Simulation

    International Nuclear Information System (INIS)

    Grote, D.P.; Friedman, A.; Craig, G.D.; Sharp, W.M.; Haber, I.

    2000-01-01

    Source-to-target simulation of an accelerator provides a thorough check on the consistency of the design as well as a detailed understanding of the beam behavior. Issues such as envelope mis-match and emittance growth can be examined in a self-consistent manner, including the details of accelerator transitions, long-term transport, and longitudinal compression. The large range in scales, from centimeter-scale transverse beam size and applied field scale-length, to meter-scale beam length, to kilometer-scale accelerator length, poses a significant computational challenge. The ever-increasing computational power that is becoming available through massively parallel computers is making such simulation realizable. This paper discusses the progress toward source-to-target simulation using the WARP particle-in-cell code. Representative examples are shown, including 3-D, long-term transport simulations of Integrated Research Experiment (IRE) scale accelerators.
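
    To give a feel for the scale problem described above, the sketch below advances a single particle through a kilometre-scale linear focusing channel with a centimetre-scale step using a leapfrog push, the kind of elementary operation a particle-in-cell beam code repeats for millions of particles. The focusing strength and step size are hypothetical, and this is not WARP's actual algorithm or interface.

```python
# Minimal leapfrog push of one particle in a linear focusing channel (x'' = -k*x),
# illustrating why km-scale machines with cm-scale beams need very many steps.
import math

k = 4.0          # focusing strength [1/m^2] (hypothetical)
ds = 0.01        # step size along the machine [m]; must resolve cm-scale motion
length = 1000.0  # accelerator length [m] -> 100,000 steps for a single particle

x, xp = 0.005, 0.0                 # initial offset 5 mm, zero slope
xp -= 0.5 * ds * k * x             # half kick to start the leapfrog
steps = int(length / ds)
for _ in range(steps):
    x += ds * xp                   # drift
    xp -= ds * k * x               # kick
xp += 0.5 * ds * k * x             # close the final half step

# The oscillation amplitude should stay near 5 mm for a stable linear lattice.
amplitude = math.hypot(x, xp / math.sqrt(k))
print(f"{steps} steps, final amplitude = {amplitude * 1e3:.2f} mm")
```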

  4. Cone beam computed tomographic imaging: perspective, challenges, and the impact of near-trend future applications.

    Science.gov (United States)

    Cavalcanti, Marcelo Gusmão Paraiso

    2012-01-01

    Cone beam computed tomography (CBCT) can be considered as a valuable imaging modality for improving diagnosis and treatment planning to achieve true guidance for several craniofacial surgical interventions. A new concept and perspective in medical informatics is the highlight discussion about the new imaging interactive workflow. The aim of this article was to present, in a short literature review, the usefulness of CBCT technology as an important alternative imaging modality, highlighting current practices and near-term future applications in cutting-edge thought-provoking perspectives for craniofacial surgical assessment. This article explains the state of the art of CBCT improvements, medical workstation, and perspectives of the dedicated unique hardware and software, which can be used from the CBCT source. In conclusion, CBCT technology is developing rapidly, and many advances are on the horizon. Further progress in medical workstations, engineering capabilities, and improvement in independent software (some open source) should be attempted with this new imaging method. The perspectives, challenges, and pitfalls in CBCT will be delineated and evaluated along with the technological developments.

  5. Computation-aware algorithm selection approach for interlaced-to-progressive conversion

    Science.gov (United States)

    Park, Sang-Jun; Jeon, Gwanggil; Jeong, Jechang

    2010-05-01

    We discuss deinterlacing results in a computationally constrained and varied environment. The proposed computation-aware algorithm selection approach (CASA) for fast interlaced-to-progressive conversion consists of three methods: the line-averaging (LA) method for plain regions, the modified edge-based line-averaging (MELA) method for medium regions, and the proposed covariance-based adaptive deinterlacing (CAD) method for complex regions. The proposed CASA uses two criteria, mean-squared error (MSE) and CPU time, for assigning the method. The principal idea of CAD is based on the correspondence between the high- and low-resolution covariances. We estimated the local covariance coefficients from an interlaced image using Wiener filtering theory and then used these optimal minimum MSE interpolation coefficients to obtain a deinterlaced image. The CAD method, though more robust than most known methods, was not found to be very fast compared to the others. To alleviate this issue, we proposed an adaptive selection approach using a fast deinterlacing algorithm rather than using only one CAD algorithm. This hybrid approach of switching between the conventional schemes (LA and MELA) and our CAD reduces the overall computational load. A reliable condition to be used for switching the schemes was presented after a wide set of initial training processes. The results of computer simulations showed that the proposed methods outperformed a number of methods presented in the literature.
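
    A hedged sketch of the selection idea follows: use cheap line averaging (LA) in plain regions and hand complex regions to a costlier interpolator. The activity measure and threshold below are placeholders, and the paper's MELA and covariance-based CAD methods are not reproduced.

```python
# Sketch of region-adaptive deinterlacing: LA for plain regions, a pluggable costlier
# method for complex regions. Threshold and activity measure are illustrative only.
import numpy as np

def line_average(above, below):
    """LA: the missing line is the average of the lines above and below."""
    return (above.astype(np.float64) + below) / 2.0

def deinterlace_field(field, threshold=100.0, complex_method=None):
    """field: 2-D array holding every other line; returns the full-height frame."""
    h, w = field.shape
    frame = np.zeros((2 * h - 1, w), dtype=np.float64)
    frame[::2] = field
    for i in range(h - 1):
        above, below = field[i], field[i + 1]
        local_activity = np.var(above - below.astype(np.float64))
        if local_activity < threshold or complex_method is None:
            frame[2 * i + 1] = line_average(above, below)     # plain region: cheap LA
        else:
            frame[2 * i + 1] = complex_method(above, below)   # complex region: costlier
    return frame

# Usage on a small synthetic 8-bit field of four lines.
field = np.array([[10, 10, 10, 10],
                  [12, 12, 12, 12],
                  [200, 10, 200, 10],
                  [10, 200, 10, 200]], dtype=np.uint8)
print(deinterlace_field(field))
```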

  6. The Challenge of Computers.

    Science.gov (United States)

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  7. Progress in Application of the Neurosciences to an Understanding of Human Learning: The Challenge of Finding a Middle-Ground Neuroeducational Theory

    Science.gov (United States)

    Anderson, O. Roger

    2014-01-01

    Modern neuroscientific research has substantially enhanced our understanding of the human brain. However, many challenges remain in developing a strong, brain-based theory of human learning, especially in complex environments such as educational settings. Some of the current issues and challenges in our progress toward developing comprehensive…

  8. Computing in the Curriculum: Challenges and Strategies from a Teacher's Perspective

    Science.gov (United States)

    Sentance, Sue; Csizmadia, Andrew

    2017-01-01

    Computing is being introduced into the curriculum in many countries. Teachers' perspectives enable us to discover what challenges this presents, and also the strategies teachers claim to be using successfully in teaching the subject across primary and secondary education. The study described in this paper was carried out in the UK in 2014 where…

  9. Research to application: Supercomputing trends for the 90's - Opportunities for interdisciplinary computations

    International Nuclear Information System (INIS)

    Shankar, V.

    1991-01-01

    The progression of supercomputing is reviewed from the point of view of computational fluid dynamics (CFD), and multidisciplinary problems impacting the design of advanced aerospace configurations are addressed. The application of full potential and Euler equations to transonic and supersonic problems in the 70s and early 80s is outlined, along with Navier-Stokes computations widespread during the late 80s and early 90s. Multidisciplinary computations currently in progress are discussed, including CFD and aeroelastic coupling for both static and dynamic flexible computations, CFD, aeroelastic, and controls coupling for flutter suppression and active control, and the development of a computational electromagnetics technology based on CFD methods. Attention is given to computational challenges standing in the way of the concept of establishing a computational environment including many technologies. 40 refs

  10. The Security Challenges in the IoT Enabled Cyber-Physical Systems and Opportunities for Evolutionary Computing & Other Computational Intelligence

    OpenAIRE

    He, H.; Maple, C.; Watson, T.; Tiwari, A.; Mehnen, J.; Jin, Y.; Gabrys, Bogdan

    2016-01-01

    Internet of Things (IoT) has given rise to the fourth industrial revolution (Industrie 4.0), and it brings great benefits by connecting people, processes and data. However, cybersecurity has become a critical challenge in the IoT enabled cyber physical systems, from connected supply chain, Big Data produced by huge amount of IoT devices, to industry control systems. Evolutionary computation combining with other computational intelligence will play an important role for cybersecurity, such as ...

  11. Surgical treatment of progressive ethmoidal hematoma aided by computed tomography in a foal

    International Nuclear Information System (INIS)

    Colbourne, C.M.; Rosenstein, D.S.; Steficek, B.A.; Yovich, J.V.; Stick, J.A.

    1997-01-01

    A progressive ethmoidal hematoma (PEH) was treated successfully in a 4-week-old Belgian filly by surgical removal, using a frontonasal bone flap. The filly had respiratory stridor, epistaxis, and facial enlargement over the left paranasal sinuses, which had progressively increased in size since birth. Computed tomographic images of the head obtained with the foal under general anesthesia were useful in determining the extent and nature of the soft-tissue mass and planning surgical intervention. On the basis of the histologic appearance of the mass, a diagnosis of PEH was made. Twelve months after surgery, the facial appearance was normal and the abnormal appearance of the ethmoid region on endoscopic evaluation was less obvious, with return of the nasal septum to a normal position. Progressive ethmoidal hematoma is uncommon and, to our knowledge, has not been reported in a neonate. Clinical signs of PEH in this foal were atypical because of the rapid enlargement of the mass, extent of facial deformity, and minimal epistaxis and intraoperative hemorrhage.

  12. The progress and challenges of implementation of the Framework Convention on Tobacco Control (WHO FCTC) in Kyrgyz Republic

    OpenAIRE

    Chinara Bekbasarova

    2018-01-01

    Background and challenges to implementation: The Kyrgyz Republic has been a Party to the WHO FCTC since August 23, 2006. This abstract analyzes progress and challenges during 10 years of implementation of WHO's FCTC. Intervention or response: The National Tobacco Control (TC) Law was adopted on August 21, 2006, entered into force on December 19, 2006, and was amended and supplemented twice during those 10 years. TC measures were included, as one of the main priorities, in the National Program on Heal...

  13. Computer-Aided Design/Computer-Assisted Manufacture Monolithic Restorations for Severely Worn Dentition: A Case History Report.

    Science.gov (United States)

    Abou-Ayash, Samir; Boldt, Johannes; Vuck, Alexander

    Full-arch rehabilitation of patients with severe tooth wear due to parafunctional behavior is a challenge for dentists and dental technicians, especially when a highly esthetic outcome is desired. A variety of different treatment options and prosthetic materials are available for such a clinical undertaking. The ongoing progress of computer-aided design/computer-assisted manufacture technologies in combination with all-ceramic materials provides a predictable workflow for these complex cases. This case history report describes a comprehensive, step-by-step treatment protocol leading to an optimally predictable treatment outcome for an esthetically compromised patient.

  14. Computing in research and development in Africa benefits, trends, challenges and solutions

    CERN Document Server

    2015-01-01

    This book describes the trends, challenges and solutions in computing use for scientific research and development within different domains in Africa, such as health, agriculture, environment, economy, energy, education and engineering. The benefits expected are discussed by a number of recognized, domain-specific experts, with a common theme being computing as solution enabler. This book is the first document providing such a representative up-to-date view on this topic at the continent level.   • Discusses computing for scientific research and development on the African continent, addressing domains such as engineering, health, agriculture, environment, economy, energy, and education; • Describes the state-of-the-art in usage of computing to address problems in developing countries pertaining to health, productivity, economic growth, and renewable energy; • Offers insights applicable to all developing countries on the use of computing technologies to address a variety of societal issues.

  15. Microbial production of nattokinase: current progress, challenge and prospect.

    Science.gov (United States)

    Cai, Dongbo; Zhu, Chengjun; Chen, Shouwen

    2017-05-01

    Nattokinase (EC 3.4.21.62) is a profibrinolytic serine protease with a potent fibrin-degrading activity, and it has been produced by many host strains. Compared to other fibrinolytic enzymes (urokinase, t-PA and streptokinase), nattokinase shows the advantages of having no side effects, low cost and long life-time, and it has the potential to be used as a drug for treating cardiovascular disease and to serve as a functional food additive. In this review, we focus on the screening of producing strains, genetic engineering, and fermentation process optimization for microbial nattokinase production; the extraction and purification of nattokinase are also discussed. The selection of an optimal nattokinase-producing strain is the crucial starting element for improvement of nattokinase production. Genetic engineering, protein engineering, fermentation optimization and process control have proved to be effective strategies for enhancement of nattokinase production. Also, extraction and purification of nattokinase are critical for the quality evaluation of nattokinase. Finally, the prospect of microbial nattokinase production is also discussed regarding recent progress, challenges, and trends in this field.

  16. Engineering brain-computer interfaces: past, present and future.

    Science.gov (United States)

    Hughes, M A

    2014-06-01

    Electricity governs the function of both nervous systems and computers. Whilst ions move in polar fluids to depolarize neuronal membranes, electrons move in the solid-state lattices of microelectronic semiconductors. Joining these two systems together, to create an iono-electric brain-computer interface, is an immense challenge. However, such interfaces offer (and in select clinical contexts have already delivered) a method of overcoming disability caused by neurological or musculoskeletal pathology. To fulfill their theoretical promise, several specific challenges demand consideration. Rate-limiting steps cover a diverse range of disciplines including microelectronics, neuro-informatics, engineering, and materials science. As those who work at the tangible interface between brain and outside world, neurosurgeons are well placed to contribute to, and inform, this cutting edge area of translational research. This article explores the historical background, status quo, and future of brain-computer interfaces; and outlines the challenges to progress and opportunities available to the clinical neurosciences community.

  17. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, like the prompt reconstruction, the data streaming, calibration and alignment iterative executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also exercised to gain access to the data and the resources by providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  18. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, like the prompt reconstruction, the data streaming, calibration and alignment iterative executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also exercised to gain access to the data and the resources by providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  19. High Performance Numerical Computing for High Energy Physics: A New Challenge for Big Data Science

    International Nuclear Information System (INIS)

    Pop, Florin

    2014-01-01

    Modern physics is based on both theoretical analysis and experimental validation. Complex scenarios like subatomic dimensions, high energy, and lower absolute temperature are frontiers for many theoretical models. Simulation with stable numerical methods represents an excellent instrument for high accuracy analysis, experimental validation, and visualization. High performance computing support offers the possibility to run simulations at large scale, in parallel, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high energy physics (HEP) analyzed from two perspectives: numerical methods and high performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory used in the analysis of particle spectra. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
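
    As a minimal example in the spirit of the Monte Carlo methods listed above, the sketch below estimates an integral by random sampling and shows the 1/sqrt(N) decrease of the statistical error, which is one reason such analyses become data- and compute-intensive. The integrand is a stand-in, not a specific HEP observable.

```python
# Minimal Monte Carlo integration sketch: accuracy improves only as 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(seed=1)

def mc_estimate(n_samples):
    """Estimate the integral of exp(-x^2) over [0, 1] from uniform samples."""
    x = rng.uniform(0.0, 1.0, n_samples)
    values = np.exp(-x ** 2)
    return values.mean(), values.std(ddof=1) / np.sqrt(n_samples)

for n in (10**3, 10**5, 10**7):
    estimate, error = mc_estimate(n)
    print(f"N={n:>9,d}  estimate={estimate:.5f}  +/- {error:.5f}")

# The exact value is ~0.74682; per-mille precision already needs ~10^6 samples,
# which hints at why full-scale HEP simulation campaigns are so data-intensive.
```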

  20. Evaluating a multi-player brain-computer interface game: challenge versus co-experience

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Volpe, G; Reidsma, Dennis; Poel, Mannes; Camurri, A.; Obbink, Michel; Nijholt, Antinus

    2013-01-01

    Brain–computer interfaces (BCIs) have started to be considered as game controllers. The low level of control they provide prevents them from providing perfect control but allows the design of challenging games which can be enjoyed by players. Evaluation of enjoyment, or user experience (UX), is

  1. New Challenges for Design Participation in the Era of Ubiquitous Computing

    DEFF Research Database (Denmark)

    Brereton, Margot; Buur, Jacob

    2008-01-01

    Since the advent of participatory design in the work democracy projects of the 1970’s and 1980’s in Scandinavia, computing technology and people’s engagement with it have undergone fundamental changes. Although participatory design continues to be a precondition for designing computing that aligns with human practices, the motivations to engage in participatory design have changed, and the new era requires formats that are different from the original ones. Through the analysis of three case studies this paper seeks to explain why participatory design must be brought to bear on the field of ubiquitous computing, and how this challenges the original participatory design thinking. In particular we will argue that more casual, exploratory formats of engagement with people are required, and rather than planning the all-encompassing systems development project, participatory design needs to move towards...

  2. Best Performers Announced for the NCI-CPTAC DREAM Proteogenomics Computational Challenge | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The National Cancer Institute (NCI) Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the teams led by Jaewoo Kang (Korea University) and by Yuanfang Guan with Hongyang Li (University of Michigan) as the best performers of the NCI-CPTAC DREAM Proteogenomics Computational Challenge. Over 500 participants from 20 countries registered for the Challenge, which offered $25,000 in cash awards contributed by the NVIDIA Foundation through its Compute the Cure initiative.

  3. Military Readiness: Progress and Challenges in Implementing the Navy’s Optimized Fleet Response Plan

    Science.gov (United States)

    2016-05-02

    command and control under the OFRP contributes to wide swings in port workload, which in turn can have a negative effect on the private-sector industrial...for 53 percent of all private-sector aircraft carrier maintenance contracts and 70 percent of cruiser and destroyer contracts from fiscal years...their impact on the Navy; (2) the Navy's goals and progress in implementing the OFRP; and (3) challenges faced by public and private shipyards

  4. Leaderboard Now Open: CPTAC’s DREAM Proteogenomics Computational Challenge | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The National Cancer Institute’s Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the opening of the leaderboard to its Proteogenomics Computational DREAM Challenge. The leaderboard remains open for submissions from September 25, 2017 through October 8, 2017, with the Challenge expected to run until November 17, 2017.

  5. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.
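
    A hedged sketch of the sliding failure-mode estimation mentioned for the 2011 Benchmark Workshop: sample uncertain strength parameters, evaluate a factor of safety, and count the fraction of samples that fail. All values, distributions and loads below are invented for illustration and do not reproduce the benchmark's models.

```python
# Monte Carlo sketch of a gravity-dam sliding failure probability for one load case.
# All numbers and distributions are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=7)
n = 1_000_000

# Uncertain inputs (illustrative): friction coefficient and cohesion on the dam base.
friction = rng.normal(loc=0.75, scale=0.10, size=n)                 # tan(phi)
cohesion_kpa = rng.lognormal(mean=np.log(100), sigma=0.4, size=n)

# Deterministic loads for one load case (illustrative, per metre of dam length).
weight_kn = 40_000          # dam self-weight
uplift_kn = 12_000          # uplift reduces the effective normal force
horizontal_kn = 18_000      # hydrostatic thrust driving sliding
base_area_m2 = 60           # contact area per metre of dam

resisting = (weight_kn - uplift_kn) * friction + cohesion_kpa * base_area_m2
factor_of_safety = resisting / horizontal_kn
prob_failure = np.mean(factor_of_safety < 1.0)   # fraction of samples that slide
print(f"Estimated P(sliding failure) for this load case = {prob_failure:.2e}")
```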

  6. Ex Machina: Analytical platforms, Law and the Challenges of Computational Legal Science

    Directory of Open Access Journals (Sweden)

    Nicola Lettieri

    2018-04-01

    Over the years, computation has become a fundamental part of the scientific practice in several research fields that goes far beyond the boundaries of natural sciences. Data mining, machine learning, simulations and other computational methods lie today at the heart of the scientific endeavour in a growing number of social research areas from anthropology to economics. In this scenario, an increasingly important role is played by analytical platforms: integrated environments allowing researchers to experiment with cutting-edge data-driven and computation-intensive analyses. The paper discusses the appearance of such tools in the emerging field of computational legal science. After a general introduction to the impact of computational methods on both natural and social sciences, we describe the concept and the features of an analytical platform exploring innovative cross-methodological approaches to the academic and investigative study of crime. Stemming from an ongoing project involving researchers from law, computer science and bioinformatics, the initiative is presented and discussed as an opportunity to raise a debate about the future of legal scholarship and, inside of it, about the challenges of computational legal science.

  7. Granular computing: perspectives and challenges.

    Science.gov (United States)

    Yao, JingTao; Vasilakos, Athanasios V; Pedrycz, Witold

    2013-12-01

    Granular computing, as a new and rapidly growing paradigm of information processing, has attracted many researchers and practitioners. Granular computing is an umbrella term to cover any theories, methodologies, techniques, and tools that make use of information granules in complex problem solving. The aim of this paper is to review foundations and schools of research and to elaborate on current developments in granular computing research. We first review some basic notions of granular computing. Classification and descriptions of various schools of research in granular computing are given. We also present and identify some research directions in granular computing.

  8. Computed tomography of skeletal muscles in childhood spinal progressive muscular atrophies

    International Nuclear Information System (INIS)

    Arai, Yumi; Osawa, Makiko; Sumida, Sawako; Shishikura, Keiko; Suzuki, Haruko; Fukuyama, Yukio; Kohno, Atsushi

    1992-01-01

    Computed tomographic (CT) scanning of skeletal muscles was performed in patients with type 1 and type 2 spinal progressive muscular atrophy (SPMA) and Kugelberg-Welander disease (K-W) to delineate the characteristic CT features of each category. Marked muscular atrophy was observed in type 1 SPMA, and both muscular atrophy and intramuscular low density areas in type 2 SPMA, changes being more pronounced in older patients. In contrast, in K-W, muscular atrophy was slight, and intramuscular low density areas constituted the most prominent findings. These observations indicate that SPMA and K-W are each characterized by distinct CT findings. (author)

  9. Quality Assurance Challenges for Motion-Adaptive Radiation Therapy: Gating, Breath Holding, and Four-Dimensional Computed Tomography

    International Nuclear Information System (INIS)

    Jiang, Steve B.; Wolfgang, John; Mageras, Gig S.

    2008-01-01

    Compared with conventional three-dimensional (3D) conformal radiation therapy and intensity-modulated radiation therapy treatments, quality assurance (QA) for motion-adaptive radiation therapy involves various challenges because of the added temporal dimension. Here we discuss those challenges for three specific techniques related to motion-adaptive therapy: namely respiratory gating, breath holding, and four-dimensional computed tomography. Similar to the introduction of any other new technologies in clinical practice, typical QA measures should be taken for these techniques also, including initial testing of equipment and clinical procedures, as well as frequent QA examinations during the early stage of implementation. Here, rather than covering every QA aspect in depth, we focus on some major QA challenges. The biggest QA challenge for gating and breath holding is how to ensure treatment accuracy when internal target position is predicted using external surrogates. Recommended QA measures for each component of treatment, including simulation, planning, patient positioning, and treatment delivery and verification, are discussed. For four-dimensional computed tomography, some major QA challenges have also been discussed

  10. Transformation of the education of health professionals in China: progress and challenges.

    Science.gov (United States)

    Hou, Jianlin; Michaud, Catherine; Li, Zhihui; Dong, Zhe; Sun, Baozhi; Zhang, Junhua; Cao, Depin; Wan, Xuehong; Zeng, Cheng; Wei, Bo; Tao, Lijian; Li, Xiaosong; Wang, Weimin; Lu, Yingqing; Xia, Xiulong; Guo, Guifang; Zhang, Zhiyong; Cao, Yunfei; Guan, Yuanzhi; Meng, Qingyue; Wang, Qing; Zhao, Yuhong; Liu, Huaping; Lin, Huiqing; Ke, Yang; Chen, Lincoln

    2014-08-30

    In this Review we examine the progress and challenges of China's ambitious 1998 reform of the world's largest health professional educational system. The reforms merged training institutions into universities and greatly expanded enrolment of health professionals. Positive achievements include an increase in the number of graduates to address human resources shortages, acceleration of production of diploma nurses to correct skill-mix imbalance, and priority for general practitioner training, especially of rural primary care workers. These developments have been accompanied by concerns: rapid expansion of the number of students without commensurate faculty strengthening, worries about dilution effect on quality, outdated curricular content, and ethical professionalism challenged by narrow technical training and growing admissions of students who did not express medicine as their first career choice. In this Review we underscore the importance of rebalance of the roles of health sciences institutions and government in educational policies and implementation. The imperative for reform is shown by a looming crisis of violence against health workers hypothesised as a result of many factors including deficient educational preparation and harmful profit-driven clinical practices. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Computational intelligence in wireless sensor networks recent advances and future challenges

    CERN Document Server

    Falcon, Rafael; Koeppen, Mario

    2017-01-01

    This book emphasizes the increasingly important role that Computational Intelligence (CI) methods are playing in solving a myriad of entangled Wireless Sensor Networks (WSN) related problems. The book serves as a guide for surveying several state-of-the-art WSN scenarios in which CI approaches have been employed. The reader finds in this book how CI has contributed to solving a wide range of challenging problems, ranging from balancing the cost and accuracy of heterogeneous sensor deployments to recovering from real-time sensor failures to detecting attacks launched by malicious sensor nodes and enacting CI-based security schemes. Network managers, industry experts, academicians and practitioners alike (mostly in computer engineering, computer science or applied mathematics) benefit from the spectrum of successful applications reported in this book. Senior undergraduate or graduate students may discover in this book some problems well suited for their own research endeavors. USP: Presents recent advances and fu...

  12. Annual progress report FY 1977. [Computer calculations of light water reactor dynamics and safety

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, K.F.; Henry, A.F.

    1977-07-01

    Progress is summarized in a project directed toward development of numerical methods suitable for the computer solution of problems in reactor dynamics and safety. Specific areas of research include methods of integration of the time-dependent diffusion equations by finite difference and finite element methods; representation of reactor properties by various homogenization procedures; application of synthesis methods; and development of response matrix techniques.
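
    As a rough illustration of the finite-difference approach mentioned in the report, the sketch below integrates a one-group, one-dimensional time-dependent diffusion equation with an explicit scheme; the grid and all coefficients are invented for demonstration and are not taken from the project.

```python
# Illustrative sketch only: explicit finite-difference integration of
#   dphi/dt = v * (D * d2phi/dx2 + (nu_sigma_f - sigma_a) * phi)
# on a slab with zero-flux boundaries. Coefficients are made up.
import numpy as np

nx, L = 101, 100.0          # grid points, slab width (cm)
dx = L / (nx - 1)
D, sigma_a, nu_sigma_f, v = 1.0, 0.012, 0.0125, 2.2e5   # cm, 1/cm, 1/cm, cm/s
dt = 0.4 * dx**2 / (v * D)  # below the explicit stability limit

phi = np.sin(np.pi * np.linspace(0.0, L, nx))   # initial flux shape
for _ in range(2000):
    lap = np.zeros_like(phi)
    lap[1:-1] = (phi[2:] - 2.0 * phi[1:-1] + phi[:-2]) / dx**2
    phi = phi + dt * v * (D * lap + (nu_sigma_f - sigma_a) * phi)
    phi[0] = phi[-1] = 0.0                      # zero-flux boundaries

print(f"peak flux after integration: {phi.max():.3e} (arbitrary units)")
```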

  13. Impacts of Mothers’ Occupation Status and Parenting Styles on Levels of Self-Control, Addiction to Computer Games, and Educational Progress of Adolescents

    OpenAIRE

    Abedini, Yasamin; Zamani, Bibi Eshrat; Kheradmand, Ali; Rajabizadeh, Ghodratollah

    2012-01-01

    Background Addiction to computer (video) games in adolescents and its relationship with educational progress has recently attracted the attention of rearing and education experts as well as organizations and institutes involved in physical and mental health. The current research attempted to propose a structural model of the relationships between parenting styles, mothers’ occupation status, and addiction to computer games, self-control, and educational progress of secondary school students. ...

  14. Neonatal tetanus elimination in Pakistan: progress and challenges.

    Science.gov (United States)

    Lambo, Jonathan A; Nagulesapillai, Tharsiya

    2012-12-01

    Pakistan is one of the 34 countries that have not achieved the neonatal tetanus (NT) global elimination target set by the World Health Organization (WHO). NT, caused by Clostridium tetani, is a highly fatal infection of the neonatal period. It is one of the most underreported diseases and remains a major but preventable cause of neonatal and infant mortality in many developing countries. In 1989, the World Health Assembly called for the elimination of NT by 1995, and since then considerable progress has been made using the following strategies: clean delivery practices, routine tetanus toxoid (TT) immunization of pregnant women, and immunization of all women of childbearing age with three doses of TT vaccine in high-risk areas during supplementary immunization campaigns. This review presents the activities, progress, and challenges in achieving NT elimination in Pakistan. A review of the literature found TT vaccination coverage in Pakistan ranged from 60% to 74% over the last decade. Low vaccination coverage, the main driver for NT in Pakistan, is due to many factors, including demand failure for TT vaccine resulting from inadequate knowledge of TT vaccine among reproductive age females and inadequate information about the benefits of TT provided by health care workers and the media. Other factors linked to low vaccination coverage include residing in rural areas, lack of formal education, poor knowledge about place and time to get vaccinated, and lack of awareness about the importance of vaccination. A disparity exists in TT vaccination coverage and antenatal care between urban and rural areas due to access and utilization of health care services. NT reporting is incomplete, as cases from the private sector and rural areas are underreported. To successfully eliminate NT, women of reproductive age must be made aware of the benefits of TT vaccine, not only to themselves, but also to their families. Effective communication strategies for TT vaccine delivery and

  15. A Step Towards A Computing Grid For The LHC Experiments ATLAS Data Challenge 1

    CERN Document Server

    Sturrock, R; Epp, B; Ghete, V M; Kuhn, D; Mello, A G; Caron, B; Vetterli, M C; Karapetian, G V; Martens, K; Agarwal, A; Poffenberger, P R; McPherson, R A; Sobie, R J; Amstrong, S; Benekos, N C; Boisvert, V; Boonekamp, M; Brandt, S; Casado, M P; Elsing, M; Gianotti, F; Goossens, L; Grote, M; Hansen, J B; Mair, K; Nairz, A; Padilla, C; Poppleton, A; Poulard, G; Richter-Was, Elzbieta; Rosati, S; Schörner-Sadenius, T; Wengler, T; Xu, G F; Ping, J L; Chudoba, J; Kosina, J; Lokajícek, M; Svec, J; Tas, P; Hansen, J R; Lytken, E; Nielsen, J L; Wäänänen, A; Tapprogge, Stefan; Calvet, D; Albrand, S; Collot, J; Fulachier, J; Ledroit-Guillon, F; Ohlsson-Malek, F; Viret, S; Wielers, M; Bernardet, K; Corréard, S; Rozanov, A; De Vivie de Régie, J B; Arnault, C; Bourdarios, C; Hrivnác, J; Lechowski, M; Parrour, G; Perus, A; Rousseau, D; Schaffer, A; Unal, G; Derue, F; Chevalier, L; Hassani, S; Laporte, J F; Nicolaidou, R; Pomarède, D; Virchaux, M; Nesvadba, N; Baranov, S; Putzer, A; Khonich, A; Duckeck, G; Schieferdecker, P; Kiryunin, A E; Schieck, J; Lagouri, T; Duchovni, E; Levinson, L; Schrager, D; Negri, G; Bilokon, H; Spogli, L; Barberis, D; Parodi, F; Cataldi, G; Gorini, E; Primavera, M; Spagnolo, S; Cavalli, D; Heldmann, M; Lari, T; Perini, L; Rebatto, D; Resconi, S; Tatarelli, F; Vaccarossa, L; Biglietti, M; Carlino, G; Conventi, F; Doria, A; Merola, L; Polesello, G; Vercesi, V; De Salvo, A; Di Mattia, A; Luminari, L; Nisati, A; Reale, M; Testa, M; Farilla, A; Verducci, M; Cobal, M; Santi, L; Hasegawa, Y; Ishino, M; Mashimo, T; Matsumoto, H; Sakamoto, H; Tanaka, J; Ueda, I; Bentvelsen, Stanislaus Cornelius Maria; Fornaini, A; Gorfine, G; Groep, D; Templon, J; Köster, L J; Konstantinov, A; Myklebust, T; Ould-Saada, F; Bold, T; Kaczmarska, A; Malecki, P; Szymocha, T; Turala, M; Kulchitskii, Yu A; Khoreauli, G; Gromova, N; Tsulaia, V; Minaenko, A A; Rudenko, R; Slabospitskaya, E; Solodkov, A; Gavrilenko, I; Nikitine, N; Sivoklokov, S Yu; Toms, K; Zalite, A; Zalite, Yu; Kervesan, B; Bosman, M; González, S; Sánchez, J; Salt, J; Andersson, N; Nixon, L; Eerola, Paule Anna Mari; Kónya, B; Smirnova, O G; Sandgren, A; Ekelöf, T J C; Ellert, M; Gollub, N; Hellman, S; Lipniacka, A; Corso-Radu, A; Pérez-Réale, V; Lee, S C; CLin, S C; Ren, Z L; Teng, P K; Faulkner, P J W; O'Neale, S W; Watson, A; Brochu, F; Lester, C; Thompson, S; Kennedy, J; Bouhova-Thacker, E; Henderson, R; Jones, R; Kartvelishvili, V G; Smizanska, M; Washbrook, A J; Drohan, J; Konstantinidis, N P; Moyse, E; Salih, S; Loken, J; Baines, J T M; Candlin, D; Candlin, R; Clifft, R; Li, W; McCubbin, N A; George, S; Lowe, A; Buttar, C; Dawson, I; Moraes, A; Tovey, Daniel R; Gieraltowski, J; Malon, D; May, E; LeCompte, T J; Vaniachine, A; Adams, D L; Assamagan, Ketevi A; Baker, R; Deng, W; Fine, V; Fisyak, Yu; Gibbard, B; Ma, H; Nevski, P; Paige, F; Rajagopalan, S; Smith, J; Undrus, A; Wenaus, T; Yu, D; Calafiura, P; Canon, S; Costanzo, D; Hinchliffe, Ian; Lavrijsen, W; Leggett, C; Marino, M; Quarrie, D R; Sakrejda, I; Stravopoulos, G; Tull, C; Loch, P; Youssef, S; Shank, J T; Engh, D; Frank, E; Sen-Gupta, A; Gardner, R; Meritt, F; Smirnov, Y; Huth, J; Grundhoefer, L; Luehring, F C; Goldfarb, S; Severini, H; Skubic, P L; Gao, Y; Ryan, T; De, K; Sosebee, M; McGuigan, P; Ozturk, N

    2004-01-01

    The ATLAS Collaboration at CERN is preparing for the data taking and analysis at the LHC that will start in 2007. Therefore, a series of Data Challenges was started in 2002 whose goals are the validation of the Computing Model, of the complete software suite, of the data model, and to ensure the correctness of the technical choices to be made for the final offline computing environment. A major feature of the first Data Challenge (DC1) was the preparation and the deployment of the software required for the production of large event samples as a worldwide distributed activity. It should be noted that it was not an option to "run the complete production at CERN" even if we had wanted to; the resources were not available at CERN to carry out the production on a reasonable time-scale. The great challenge of organising and carrying out this large-scale production at a significant number of sites around the world had therefore to be faced. However, the benefits of this are manifold: apart from realising the require...

  16. Progress and challenges in utilization of palm oil biomass as fuel for decentralized electricity generation

    Energy Technology Data Exchange (ETDEWEB)

    Bazmi, Aqeel Ahmed [Process Systems Engineering Centre (PROSPECT), Department of Chemical Engineering, Faculty of Chemical and Natural Resources Engineering, University Technology Malaysia, Skudai 81310, Johor Bahru, JB (Malaysia); Biomass Conversion Research Center (BCRC), Department of Chemical Engineering, COMSATS Institute of Information Technology, Lahore (Pakistan); Zahedi, Gholamreza; Hashim, Haslenda [Process Systems Engineering Centre (PROSPECT), Department of Chemical Engineering, Faculty of Chemical and Natural Resources Engineering, University Technology Malaysia, Skudai 81310, Johor Bahru, JB (Malaysia)

    2011-01-15

    It has been broadly accepted worldwide that global warming, indeed, is the greatest threat of the time to the environment. Renewable energy (RE) is expected as a perfect solution to reduce global warming and to endorse sustainable development. Progressive release of greenhouse gases (GHG) from increasing energy-intensive industries has eventually caused human civilization to suffer. Realizing the exigency of reducing emissions and simultaneously catering to needs of industries, researchers foresee the RE as the perfect entrant to overcome these challenges. RE provides an effective option for the provision of energy services from the technical point of view while biomass, a major source of energy in the world until before industrialization when fossil fuels become dominant, appears an important renewable source of energy and researches have proven from time to time its viability for large-scale production. Being a widely spread source, biomass offers the execution of decentralized electricity generation gaining importance in liberalized electricity markets. The decentralized power is characterized by generation of electricity nearer to the demand centers, meeting the local energy needs. Researchers envisaged an increasing decentralization of power supply, expected to make a particular contribution to climate protection. This article investigates the progress and challenges for decentralized electricity generation by palm oil biomass according to the overall concept of sustainable development. (author)

  17. Progress and challenges in utilization of palm oil biomass as fuel for decentralized electricity generation

    International Nuclear Information System (INIS)

    Bazmi, Aqeel Ahmed; Zahedi, Gholamreza; Hashim, Haslenda

    2011-01-01

    It has been broadly accepted worldwide that global warming, indeed, is the greatest threat of the time to the environment. Renewable energy (RE) is expected as a perfect solution to reduce global warming and to endorse sustainable development. Progressive release of greenhouse gases (GHG) from increasing energy-intensive industries has eventually caused human civilization to suffer. Realizing the exigency of reducing emissions and simultaneously catering to needs of industries, researchers foresee the RE as the perfect entrant to overcome these challenges. RE provides an effective option for the provision of energy services from the technical point of view while biomass, a major source of energy in the world until before industrialization when fossil fuels become dominant, appears an important renewable source of energy and researches have proven from time to time its viability for large-scale production. Being a widely spread source, biomass offers the execution of decentralized electricity generation gaining importance in liberalized electricity markets. The decentralized power is characterized by generation of electricity nearer to the demand centers, meeting the local energy needs. Researchers envisaged an increasing decentralization of power supply, expected to make a particular contribution to climate protection. This article investigates the progress and challenges for decentralized electricity generation by palm oil biomass according to the overall concept of sustainable development. (author)

  18. Eight challenges for network epidemic models

    Directory of Open Access Journals (Sweden)

    Lorenzo Pellis

    2015-03-01

    Networks offer a fertile framework for studying the spread of infection in human and animal populations. However, owing to the inherent high-dimensionality of networks themselves, modelling transmission through networks is mathematically and computationally challenging. Even the simplest network epidemic models present unanswered questions. Attempts to improve the practical usefulness of network models by including realistic features of contact networks and of host–pathogen biology (e.g. waning immunity) have made some progress, but robust analytical results remain scarce. A more general theory is needed to understand the impact of network structure on the dynamics and control of infection. Here we identify a set of challenges that provide scope for active research in the field of network epidemic models.
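
    For readers unfamiliar with the class of models being discussed, the sketch below shows a minimal discrete-time SIR simulation on a random contact network; the network, parameters and update rule are illustrative assumptions, not taken from the article.

```python
# Minimal sketch (assumptions only): discrete-time SIR spread on a
# random contact network, the simplest kind of network epidemic model.
import random

random.seed(1)
n, p_edge, p_transmit, p_recover = 200, 0.03, 0.1, 0.05
# Erdos-Renyi contact network stored as adjacency lists
adj = {i: [] for i in range(n)}
for i in range(n):
    for j in range(i + 1, n):
        if random.random() < p_edge:
            adj[i].append(j)
            adj[j].append(i)

state = {i: "S" for i in range(n)}
state[0] = "I"                       # single initial infective
for step in range(100):
    new_state = dict(state)
    for i, s in state.items():
        if s == "I":
            for j in adj[i]:
                if state[j] == "S" and random.random() < p_transmit:
                    new_state[j] = "I"
            if random.random() < p_recover:
                new_state[i] = "R"
    state = new_state

print({s: list(state.values()).count(s) for s in "SIR"})
```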

  19. High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations

    Science.gov (United States)

    Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.

    2003-01-01

    Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.

  20. Computer-assisted learning and simulation systems in dentistry--a challenge to society.

    Science.gov (United States)

    Welk, A; Splieth, Ch; Wierinck, E; Gilpatrick, R O; Meyer, G

    2006-07-01

    Computer technology is increasingly used in practical training at universities. However, in spite of their potential, computer-assisted learning (CAL) and computer-assisted simulation (CAS) systems still appear to be underutilized in dental education. Advantages, challenges, problems, and solutions of computer-assisted learning and simulation in dentistry are discussed by means of MEDLINE, open Internet platform searches, and key results of a study among German dental schools. The advantages of computer-assisted learning are seen for example in self-paced and self-directed learning and increased motivation. It is useful for both objective theoretical and practical tests and for training students to handle complex cases. CAL can lead to more structured learning and can support training in evidence-based decision-making. The reasons for the still relatively rare implementation of CAL/CAS systems in dental education include an inability to finance, lack of studies of CAL/CAS, and too much effort required to integrate CAL/CAS systems into the curriculum. To overcome the reasons for the relative low degree of computer technology use, we should strive for multicenter research and development projects monitored by the appropriate national and international scientific societies, so that the potential of computer technology can be fully realized in graduate, postgraduate, and continuing dental education.

  1. Security Techniques for protecting data in Cloud Computing

    OpenAIRE

    Maddineni, Venkata Sravan Kumar; Ragi, Shivashanker

    2012-01-01

    Context: Over the past few years, there has been rapid progress in Cloud Computing. With the increasing number of companies resorting to using resources in the Cloud, there is a necessity for protecting the data of various users using centralized resources. Some major challenges being faced by Cloud Computing are to secure, protect and process the data which is the property of the user. Aims and Objectives: The main aim of this research is to understand the security threats and ident...

  2. Fundamental challenging problems for developing new nuclear safety standard computer codes

    International Nuclear Information System (INIS)

    Wong, P.K.; Wong, A.E.; Wong, A.

    2005-01-01

    Based on the claims of US patents 5,084,232, 5,848,377 and 6,430,516 (retrievable by entering the patent numbers at http://164.195.100.11/netahtml/srchnum.htm) and on the associated technical papers presented and published at international conferences over the last three years, all of which were sent to the US-NRC by e-mail on March 26, 2003 at 2:46 PM, three fundamental challenging problems for developing new nuclear safety standard computer codes were presented at the US-NRC RIC2003 Session W4 (2:15-3:15 PM) in the Presidential Ballroom of the Washington D.C. Capital Hilton Hotel on April 16, 2003, in front of more than 800 nuclear professionals from many countries worldwide. The objective and scope of this paper is to invite all nuclear professionals to examine and evaluate the computer codes currently used in their own countries by comparing numerical data from these three specific, openly challenging fundamental problems, in order to set up a global safety standard for all nuclear power plants in the world. (authors)

  3. Biomolecular simulations on petascale: promises and challenges

    International Nuclear Information System (INIS)

    Agarwal, Pratul K; Alam, Sadaf R

    2006-01-01

    Proteins work as highly efficient machines at the molecular level and are responsible for a variety of processes in all living cells. There is wide interest in understanding these machines for implications in biochemical/biotechnology industries as well as in health related fields. Over the last century, investigations of proteins based on a variety of experimental techniques have provided a wealth of information. More recently, theoretical and computational modeling using large scale simulations is providing novel insights into the functioning of these machines. The next generation of supercomputers, with petascale computing power, holds great promise as well as challenges for biomolecular simulation scientists. We briefly discuss the progress being made in this area.
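
    To make the notion of large-scale biomolecular simulation concrete, the sketch below shows the velocity-Verlet time-stepping scheme that underlies most molecular dynamics engines, applied to a single harmonic "bond" rather than a real biomolecule; it is an illustration only, not code from the project.

```python
# Illustrative sketch: velocity-Verlet integration on a toy harmonic bond.
def force(x, k=1.0):
    return -k * x                     # harmonic restoring force

x, v, m, dt = 1.0, 0.0, 1.0, 0.01     # position, velocity, mass, time step
f = force(x)
for _ in range(1000):
    x += v * dt + 0.5 * (f / m) * dt**2
    f_new = force(x)
    v += 0.5 * (f + f_new) / m * dt
    f = f_new

energy = 0.5 * m * v**2 + 0.5 * x**2
print(f"x={x:.3f}, v={v:.3f}, total energy={energy:.4f}")  # energy stays ~0.5
```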

  4. Scientific Discovery through Advanced Computing in Plasma Science

    Science.gov (United States)

    Tang, William

    2005-03-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's ``Scientific Discovery through Advanced Computing'' (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations

  5. Progress in computer aided diagnosis for medical images by information technology

    International Nuclear Information System (INIS)

    Mekada, Yoshito

    2007-01-01

    This paper describes the history, present state and future prospects of computer aided diagnosis (CAD) based on processing, recognition and visualization of chest and abdominal images. Primitive forms of CAD appeared as early as the 1960s for lung cancer detection. Today, multi-dimensional medical imaging by CT, MRI, single photon emission computed tomography (SPECT) and positron emission tomography (PET) requires doctors to read vast amounts of information, which makes the need for CAD evident. At present, simultaneous CAD for multiple organs and multiple diseases is in progress; the interaction between images and medical doctors is leading to newer systems such as virtual endoscopy; objective evaluation of CAD systems, using receiver operating characteristic analysis, is necessary for approval by authorities such as the Food and Drug Administration (FDA); and cooperation between the medical and technological fields is therefore increasingly important. In the future, CAD should account for individual differences and for changes in disease state, be usable across time and space, be more widely recognized by doctors, and contribute more directly to therapeutic practice. (R.T.)

  6. Building a Grad Nation: Progress and Challenge in Ending the High School Dropout Epidemic. Annual Update, 2010-2011

    Science.gov (United States)

    Balfanz, Robert; Bridgeland, John M.; Fox, Joanna Hornig; Moore, Laura A.

    2011-01-01

    America continues to make progress in meeting its high school dropout challenge. Leaders in education, government, nonprofits and business have awakened to the individual, social and economic costs of the dropout crisis and are working together to solve it. This year, all states, districts, and schools are required by law to calculate high school…

  7. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Despite the evolution and enhancements that mobile devices have experienced, they are still considered limited computing devices. Today, users have become more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles limitations of Smart Mobile Devices (SMDs) such as limited battery lifetime, limited processing capabilities, and limited storage capacity by offloading the execution and workload to other, richer systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores different important parameters on which the frameworks are based, such as the offloading method and the level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.
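
    The core offloading trade-off described above can be illustrated with a simple rule that compares estimated local and remote completion times; the function name, parameters and numbers below are assumptions for illustration, not an API from any of the surveyed frameworks.

```python
# Hedged sketch: offload a task only if remote execution (round trip +
# data transfer + cloud compute) is expected to finish before local execution.
def should_offload(cycles, local_speed, cloud_speed, data_bytes, bandwidth, rtt):
    """Return True if remote execution is expected to finish sooner."""
    local_time = cycles / local_speed
    remote_time = rtt + data_bytes / bandwidth + cycles / cloud_speed
    return remote_time < local_time

# Example: 5e9 CPU cycles, 1 GHz phone vs 10 GHz cloud share,
# 2 MB of input over a 5 Mbit/s link with 50 ms round-trip time.
print(should_offload(cycles=5e9, local_speed=1e9, cloud_speed=1e10,
                     data_bytes=2e6, bandwidth=5e6 / 8, rtt=0.05))
```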

  8. Computing Challenges in Coded Mask Imaging

    Science.gov (United States)

    Skinner, Gerald

    2009-01-01

    This slide presentation reviews the complications and challenges in developing computer systems for coded mask imaging telescopes. The coded mask technique is used when there is no other way to build the telescope (i.e., when wide fields of view, energies too high for focusing optics or too low for Compton/tracker techniques, and very good angular resolution are required). The coded mask telescope is described, and the mask is reviewed. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, and a chart showing the types of position sensitive detectors used for coded mask telescopes is also reviewed. Slides describe the mechanism of recovering an image from the masked pattern. The correlation with the mask pattern is described. The matrix approach is reviewed, and other approaches to image reconstruction are described. Included in the presentation is a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, a comparison of the EXIST/HET with the SWIFT/BAT, and details of the design of the EXIST/HET.
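
    The correlation-based reconstruction mentioned in the presentation can be sketched in one dimension: a detector pattern produced by a point source through a random mask is decoded by correlating it with the mask. The toy example below is an illustration only (real instruments such as INTEGRAL use carefully designed 2-D patterns and more refined methods), yet it recovers the source position.

```python
# Toy sketch of the cross-correlation step in coded mask imaging
# (1-D, random mask, single point source, cyclic geometry assumed).
import numpy as np

rng = np.random.default_rng(2)
n = 64
mask = rng.integers(0, 2, n).astype(float)      # open (1) / closed (0) elements
sky = np.zeros(n)
sky[20] = 100.0                                  # one point source

# Each sky direction projects the mask pattern onto the detector,
# shifted according to the source position.
detector = sum(sky[k] * np.roll(mask, k) for k in range(n))

# Decode by correlating the detector pattern with the (balanced) mask.
decode = mask - mask.mean()
image = np.array([np.dot(detector, np.roll(decode, k)) for k in range(n)])

print("reconstructed source position:", int(np.argmax(image)))  # ~20
```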

  9. Delaware: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  10. Oklahoma: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  11. Arkansas: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  12. Mississippi: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  13. Texas: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  14. Georgia: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  15. Maryland: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  16. Louisiana: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  17. Tennessee: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  18. Kentucky: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  19. The nature of the (visualization) game: Challenges and opportunities from computational geophysics

    Science.gov (United States)

    Kellogg, L. H.

    2016-12-01

    As the geosciences enters the era of big data, modeling and visualization become increasingly vital tools for discovery, understanding, education, and communication. Here, we focus on modeling and visualization of the structure and dynamics of the Earth's surface and interior. The past decade has seen accelerated data acquisition, including higher resolution imaging and modeling of Earth's deep interior, complex models of geodynamics, and high resolution topographic imaging of the changing surface, with an associated acceleration of computational modeling through better scientific software, increased computing capability, and the use of innovative methods of scientific visualization. The role of modeling is to describe a system, answer scientific questions, and test hypotheses; the term "model" encompasses mathematical models, computational models, physical models, conceptual models, statistical models, and visual models of a structure or process. These different uses of the term require thoughtful communication to avoid confusion. Scientific visualization is integral to every aspect of modeling. Not merely a means of communicating results, the best uses of visualization enable scientists to interact with their data, revealing the characteristics of the data and models to enable better interpretation and inform the direction of future investigation. Innovative immersive technologies like virtual reality, augmented reality, and remote collaboration techniques, are being adapted more widely and are a magnet for students. Time-varying or transient phenomena are especially challenging to model and to visualize; researchers and students may need to investigate the role of initial conditions in driving phenomena, while nonlinearities in the governing equations of many Earth systems make the computations and resulting visualization especially challenging. Training students how to use, design, build, and interpret scientific modeling and visualization tools prepares them

  20. Alabama: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP, high school graduation rates, and access…

  1. Virginia: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP, high school graduation rates, and access…

  2. Florida: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP, high school graduation rates, and access…

  3. Challenges and Recent Progress in the Development of a Closed-loop Artificial Pancreas.

    Science.gov (United States)

    Bequette, B Wayne

    2012-12-01

    Pursuit of a closed-loop artificial pancreas that automatically controls the blood glucose of individuals with type 1 diabetes has intensified during the past six years. Here we discuss the recent progress and challenges in the major steps towards a closed-loop system. Continuous insulin infusion pumps have been widely available for over two decades, but "smart pump" technology has made the devices easier to use and more powerful. Continuous glucose monitoring (CGM) technology has improved and the devices are more widely available. A number of approaches are currently under study for fully closed-loop systems; most manipulate only insulin, while others manipulate insulin and glucagon. Algorithms include on-off (for prevention of overnight hypoglycemia), proportional-integral-derivative (PID), model predictive control (MPC) and fuzzy logic based learning control. Meals cause a major "disturbance" to blood glucose, and we discuss techniques that our group has developed to predict when a meal is likely to be consumed and its effect. We further examine both physiology and device-related challenges, including insulin infusion set failure and sensor signal attenuation. Finally, we discuss the next steps required to make a closed-loop artificial pancreas a commercial reality.
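
    As a purely illustrative sketch of the PID option listed above (parameters are invented and this is in no way clinical guidance), the following shows a discrete PID rule that maps continuous glucose readings to an insulin infusion rate.

```python
# Minimal PID sketch for closed-loop insulin dosing (illustrative only).
def pid_insulin_rate(glucose, target, state, kp=0.02, ki=0.0005, kd=0.05, dt=5.0):
    """Return an insulin infusion rate (U/h) from a glucose reading (mg/dL)."""
    error = glucose - target
    state["integral"] += error * dt
    derivative = (error - state["prev_error"]) / dt
    state["prev_error"] = error
    rate = kp * error + ki * state["integral"] + kd * derivative
    return max(rate, 0.0)            # pumps cannot deliver negative insulin

state = {"integral": 0.0, "prev_error": 0.0}
for reading in [180, 175, 168, 160, 150]:        # CGM samples every 5 minutes
    print(f"{reading} mg/dL -> {pid_insulin_rate(reading, 120, state):.2f} U/h")
```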

  4. "Defining Computer 'Speed': An Unsolved Challenge"

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Abstract: The reason we use computers is their speed, and the reason we use parallel computers is that they're faster than single-processor computers. Yet, after 70 years of electronic digital computing, we still do not have a solid definition of what computer 'speed' means, or even what it means to be 'faster'. Unlike measures in physics, where the definition of speed is rigorous and unequivocal, in computing there is no definition of speed that is universally accepted. As a result, computer customers have made purchases misguided by dubious information, computer designers have optimized their designs for the wrong goals, and computer programmers have chosen methods that optimize the wrong things. This talk describes why some of the obvious and historical ways of defining 'speed' haven't served us well, and the things we've learned in the struggle to find a definition that works. Biography: Dr. John Gustafson is a Director ...

  5. Progress and challenges of engineering a biophysical CO2-concentrating mechanism into higher plants.

    Science.gov (United States)

    Rae, Benjamin D; Long, Benedict M; Förster, Britta; Nguyen, Nghiem D; Velanis, Christos N; Atkinson, Nicky; Hee, Wei Yih; Mukherjee, Bratati; Price, G Dean; McCormick, Alistair J

    2017-06-01

    Growth and productivity in important crop plants is limited by the inefficiencies of the C3 photosynthetic pathway. Introducing CO2-concentrating mechanisms (CCMs) into C3 plants could overcome these limitations and lead to increased yields. Many unicellular microautotrophs, such as cyanobacteria and green algae, possess highly efficient biophysical CCMs that increase CO2 concentrations around the primary carboxylase enzyme, Rubisco, to enhance CO2 assimilation rates. Algal and cyanobacterial CCMs utilize distinct molecular components, but share several functional commonalities. Here we outline the recent progress and current challenges of engineering biophysical CCMs into C3 plants. We review the predicted requirements for a functional biophysical CCM based on current knowledge of cyanobacterial and algal CCMs, the molecular engineering tools and research pipelines required to translate our theoretical knowledge into practice, and the current challenges to achieving these goals. © The Author 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  6. Nanoparticle-Based Drug Delivery for Therapy of Lung Cancer: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Anish Babu

    2013-01-01

    The last decade has witnessed enormous advances in the development and application of nanotechnology in cancer detection, diagnosis, and therapy, culminating in the development of the nascent field of “cancer nanomedicine.” A nanoparticle, as per the National Institutes of Health (NIH) guidelines, is any material that is used in the formulation of a drug resulting in a final product smaller than 1 micron in size. Nanoparticle-based therapeutic systems have gained immense popularity due to their ability to overcome biological barriers, effectively deliver hydrophobic therapies, and preferentially target disease sites. Currently, many formulations of nanocarriers are utilized, including lipid-based, polymeric and branched polymeric, metal-based, magnetic, and mesoporous silica. Innovative strategies have been employed to exploit the multicomponent, three-dimensional constructs imparting multifunctional capabilities. Engineering such designs allows simultaneous drug delivery of chemotherapeutics and anticancer gene therapies to site-specific targets. In lung cancer, nanoparticle-based therapeutics is paving the way in the diagnosis, imaging, screening, and treatment of primary and metastatic tumors. However, translating such advances from the bench to the bedside has been severely hampered by challenges encountered in the areas of pharmacology, toxicology, immunology, large-scale manufacturing, and regulatory issues. This review summarizes current progress and challenges in nanoparticle-based drug delivery systems, citing recent examples targeted at lung cancer treatment.

  7. Progression of vestibular schwannoma after GammaKnife radiosurgery: A challenge for microsurgical resection.

    Science.gov (United States)

    Aboukaïs, Rabih; Bonne, Nicolas-Xavier; Touzet, Gustavo; Vincent, Christophe; Reyns, Nicolas; Lejeune, Jean-Paul

    2018-05-01

    We aimed to evaluate the outcome of patients who underwent salvage microsurgery for vestibular schwannoma (VS) that failed primary GammaKnife radiosurgery (GKS). Among the 1098 patients who received GKS for the treatment of VS in our center between January 2004 and December 2012, the follow-up was organized in our institution for 290 patients who lived in our recruitment area. Tumor progression was noted in 23 patients. A salvage microsurgical resection was performed in 11 patients, who were included in our study. Grading of facial function was done according to the House & Brackmann scale. The mean age at diagnosis was 50.2 years (19-68 years) and the mean follow-up was 9.4 years (4-13 years). The mean dose was 11.8 Gy (11-12 Gy) and the mean volume was 922 mm³ (208-2500 mm³). The mean period between GKS and diagnosis of tumor progression was 32 months (18-72 months). Concerning salvage microsurgery, complete resection was obtained in 8 patients. Small residual tumor on the facial nerve was deliberately left in 3 patients and no tumor progression was noted with a mean follow-up of 26 months. At last follow-up, facial nerve function was grade 1 in 4 patients, grade 2 in 3 patients, grade 3 in 1 patient and grade 4 in 3 patients. Salvage surgery of recurrent vestibular schwannoma after failed initial GKS remains a good treatment. However, facial nerve preservation is more challenging in this case and a small tumor remnant may sometimes be deliberately left. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    International Nuclear Information System (INIS)

    Khaleel, Mohammad A.

    2009-01-01

    This report is an account of the deliberations and conclusions of the workshop on 'Forefront Questions in Nuclear Science and the Role of High Performance Computing' held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to (1) identify forefront scientific challenges in nuclear physics and then determine which-if any-of these could be aided by high performance computing at the extreme scale; (2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; (3) provide nuclear physicists the opportunity to influence the development of high performance computing; and (4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  9. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2009-10-01

    This report is an account of the deliberations and conclusions of the workshop on "Forefront Questions in Nuclear Science and the Role of High Performance Computing" held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to 1) identify forefront scientific challenges in nuclear physics and then determine which-if any-of these could be aided by high performance computing at the extreme scale; 2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; 3) provide nuclear physicists the opportunity to influence the development of high performance computing; and 4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  10. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity.

    Science.gov (United States)

    Liang, Jie; Qian, Hong

    2010-01-01

    Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge of understanding macromolecular dynamics led the way for computations to become part of the tool set used to study molecular biology. Twenty-five years ago, the demand from genome science inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand "complex behavior" and complexity theory, and from which important biological insight can be gained.
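
    For readers new to the CME framework, the sketch below runs the Gillespie stochastic simulation algorithm on the simplest possible system, a birth-death process for one molecular species; the rate constants are arbitrary illustrations, not values from the paper.

```python
# Minimal Gillespie (stochastic simulation algorithm) sketch for a
# birth-death process: synthesis and degradation of one species.
import math
import random

random.seed(3)
k_make, k_degrade = 2.0, 0.1         # molecules/s, 1/s
n, t, t_end = 0, 0.0, 100.0
while t < t_end:
    rates = [k_make, k_degrade * n]  # propensities of the two reactions
    total = sum(rates)
    t += -math.log(1.0 - random.random()) / total    # exponential waiting time
    if random.random() * total < rates[0]:
        n += 1                                       # synthesis event
    else:
        n -= 1                                       # degradation event

print("copy number at t =", t_end, "is", n)  # fluctuates around k_make/k_degrade = 20
```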

  11. Progressive geometric algorithms

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Bagautdinov, T.M.; de Berg, M.T.; Bouts, Q.W.; ten Brink, Alex P.; Buchin, K.A.; Westenberg, M.A.

    2015-01-01

    Progressive algorithms are algorithms that, on the way to computing a complete solution to the problem at hand, output intermediate solutions that approximate the complete solution increasingly well. We present a framework for analyzing such algorithms, and develop efficient progressive algorithms
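
    As an illustration of the idea (not an algorithm from the paper), the sketch below computes the diameter of a planar point set "progressively": after each batch of points it reports the exact diameter of the prefix processed so far, which approximates the final answer increasingly well.

```python
# Illustrative sketch: a progressive brute-force diameter computation
# that emits increasingly accurate intermediate answers batch by batch.
import random

random.seed(4)
points = [(random.random(), random.random()) for _ in range(400)]

def dist2(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

best = 0.0
processed = []
batch_size = 100
for start in range(0, len(points), batch_size):
    batch = points[start:start + batch_size]
    for p in batch:                  # compare new points with all points seen so far
        for q in processed + batch:
            best = max(best, dist2(p, q))
    processed += batch
    print(f"after {len(processed)} points: diameter >= {best ** 0.5:.4f}")
```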

  12. Progressive geometric algorithms

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Bagautdinov, T.M.; Berg, de M.T.; Bouts, Q.W.; Brink, ten A.P.; Buchin, K.; Westenberg, M.A.

    2014-01-01

    Progressive algorithms are algorithms that, on the way to computing a complete solution to the problem at hand, output intermediate solutions that approximate the complete solution increasingly well. We present a framework for analyzing such algorithms, and develop efficient progressive algorithms

  13. North Carolina: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  14. West Virginia: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  15. South Carolina: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  16. Computer animation algorithms and techniques

    CERN Document Server

    Parent, Rick

    2012-01-01

    Driven by the demands of research and the entertainment industry, the techniques of animation are pushed to render increasingly complex objects with ever-greater life-like appearance and motion. This rapid progression of knowledge and technique impacts professional developers, as well as students. Developers must maintain their understanding of conceptual foundations, while their animation tools become ever more complex and specialized. The second edition of Rick Parent's Computer Animation is an excellent resource for the designers who must meet this challenge. The first edition establ

  17. Progress Report 2004-2005

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-11-01

    The Faculty of Electrical Engineering and Computing (FER), part of the University of Zagreb, has its roots in the Technical Faculty Zagreb, founded in 1919, which evolved into the Faculty of Electrical Engineering in 1956 and was upgraded into the Faculty of Electrical Engineering and Computing in 1994. Owing to the continuing progress and advances in electrical and electronic engineering as well as in computer science and information technologies, the Faculty has become the largest technical faculty and the leading educational and R and D institution in the fields of electrical engineering and computing in Croatia. More than 13000 graduate students, more than 1900 postgraduate students who received the Master degree, and more than 540 students with PhD degrees are today's totals, which highlights the Faculty's strong commitment to teaching. In addition, there are 3800 undergraduate students as well as about 500 graduates each year. Organised in 11 departments, the present educational staff comprises 130 professors and 200 teaching assistants and researchers operating in more than 60 laboratories covering an area of more than 35000 m². Education and research are crucial factors determining economic and social progress and equality of opportunity in our societies, even more so in the digital age, in order to ensure life-long learning and the emergence of new generations of creators, researchers and entrepreneurs empowered to play an active role in the knowledge society. The experience gained at the university level should be transferred to others, and the Faculty can help to do that as part of this global challenge. The Faculty offers a broad spectrum of services to business and industry, from research and consultancy to conference facilities, training and postgraduate recruitment. The Faculty is a leading research-led institution and undertakes research at the highest levels of international standing. The Faculty is an integral part of the

  18. Progress Report 2004-2005

    International Nuclear Information System (INIS)

    2005-11-01

    The Faculty of Electrical Engineering and Computing (FER), part of the University of Zagreb, has its roots in the Technical Faculty Zagreb, founded in 1919, which evolved into the Faculty of Electrical Engineering in 1956 and was upgraded into the Faculty of Electrical Engineering and Computing in 1994. Owing to the continuing progress and advances in electrical and electronic engineering as well as in computer science and information technologies, the Faculty has become the largest technical faculty and the leading educational and R and D institution in the fields of electrical engineering and computing in Croatia. More than 13000 graduate students, more than 1900 postgraduate students who received the Master degree, and more than 540 students with PhD degrees are today's totals, which highlights the Faculty's strong commitment to teaching. In addition, there are 3800 undergraduate students as well as about 500 graduates each year. Organised in 11 departments, the present educational staff comprises 130 professors and 200 teaching assistants and researchers operating in more than 60 laboratories covering an area of more than 35000 m². Education and research are crucial factors determining economic and social progress and equality of opportunity in our societies, even more so in the digital age, in order to ensure life-long learning and the emergence of new generations of creators, researchers and entrepreneurs empowered to play an active role in the knowledge society. The experience gained at the university level should be transferred to others, and the Faculty can help to do that as part of this global challenge. The Faculty offers a broad spectrum of services to business and industry, from research and consultancy to conference facilities, training and postgraduate recruitment. The Faculty is a leading research-led institution and undertakes research at the highest levels of international standing. The Faculty is an integral part of the community

  19. Label-free SERS in biological and biomedical applications: Recent progress, current challenges and opportunities

    Science.gov (United States)

    Zheng, Xiao-Shan; Jahn, Izabella Jolan; Weber, Karina; Cialla-May, Dana; Popp, Jürgen

    2018-05-01

    To gain insight into biomolecular processes at the cellular level and into the development of diseases, as well as to reliably detect metabolites and pathogens, a modern analytical tool is needed that is highly sensitive, molecular-specific and offers fast detection. Surface-enhanced Raman spectroscopy (SERS) is known to meet these requirements and, within this review article, the recent progress of label-free SERS in biological and biomedical applications is summarized and discussed. This includes the detection of biomolecules such as metabolites, nucleic acids and proteins. Further, the characterization and identification of microorganisms has been achieved by label-free SERS-based approaches. Eukaryotic cells can be characterized by SERS in order to gain information about the outer cell wall or to detect intracellular molecules and metabolites. The potential of SERS for medically relevant detection schemes is emphasized by the label-free detection of tissue, the investigation of body fluids as well as applications for therapeutic and illicit drug monitoring. The review article concludes with an evaluation of the recent progress and current challenges in order to highlight the direction of label-free SERS in the future.

  20. In this issue: Time to replace doctors’ judgement with computers

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2015-11-01

    Full Text Available Informaticians continue to rise to the challenge, set by the English Health Minister, of trying to replace doctors’ judgement with computers. This issue describes successes and where there are barriers. However, whilst there is progress, it tends to be incremental and there are grand challenges to be overcome before computers can replace clinicians. These grand challenges include: (1) improving usability so that technology can be more readily incorporated into clinical workflow; (2) rigorous new analytic methods that make use of the mass of available data, ‘Big data’, to create real-world evidence; (3) faster ways of meeting regulatory and legal requirements, including ensuring privacy; (4) provision of reimbursement models to fund innovative technology that can substitute for clinical time; and (5) recognition that innovations that improve quality also often increase cost. Informatics is more likely to support and augment clinical decision making than to replace clinicians.

  1. EPA and GSA Webinar: E Scrap Management, Computers for Learning and the Federal Green Challenge

    Science.gov (United States)

    EPA and the General Services Administration (GSA) are hosting a webinar on May 2, 2018. Topics will include policies and procedures on E Scrap management, a review of the Computers for Learning Program, and benefits of joining the Federal Green Challenge.

  2. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  3. Climate Reanalysis: Progress and Future Prospects

    Science.gov (United States)

    Gelaro, Ron

    2018-01-01

    Reanalysis is the process whereby an unchanging data assimilation system is used to provide a consistent reprocessing of observations, typically spanning an extended segment of the historical data record. The process relies on an underlying model to combine often-disparate observations in a physically consistent manner, enabling production of gridded data sets for a broad range of applications including the study of historical weather events, preparation of climatologies, business sector development and, more recently, climate monitoring. Over the last few decades, several generations of reanalyses of the global atmosphere have been produced by various operational and research centers, focusing more or less on the period of regular conventional and satellite observations beginning in the mid to late twentieth century. There have also been successful efforts to extend atmospheric reanalyses back to the late nineteenth and early twentieth centuries, using mostly surface observations. Much progress has resulted from (and contributed to) advancements in numerical weather prediction, especially improved models and data assimilation techniques, increased computing capacity, the availability of new observation types and efforts to recover and improve the quality of historical ones. The recent extension of forecast systems that allow integrated modeling of meteorological, oceanic, land surface, and chemical variables provides the basic elements for coupled data assimilation. This has opened the door to the development of a new generation of coupled reanalyses of the Earth system, or integrated Earth system analyses (IESA). Evidence so far suggests that this approach can improve the analysis of currently uncoupled components of the Earth system, especially at their interface, and lead to increased predictability. However, extensive analysis coupling as envisioned for IESA, while progressing, still presents significant challenges. These include model biases that can be
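
    As a toy illustration of the data assimilation step at the heart of any reanalysis, the sketch below applies a scalar Kalman-style analysis update that blends a model background with an observation according to their error variances. It is a minimal sketch of the general principle, not the assimilation scheme of any particular reanalysis; all variable names and numbers are hypothetical.

```python
def analysis_update(background, obs, var_background, var_obs):
    """Blend a model background with an observation (scalar Kalman-style update).

    The gain weights the observation by the relative error variances; in
    simplified scalar form this is the basic operation an assimilation
    system repeats for every observation when producing a reanalysis.
    """
    gain = var_background / (var_background + var_obs)
    analysis = background + gain * (obs - background)
    var_analysis = (1.0 - gain) * var_background
    return analysis, var_analysis

# Hypothetical example: a 288.0 K model background with 1.0 K^2 error
# variance corrected by a 287.2 K observation with 0.25 K^2 error variance.
print(analysis_update(288.0, 287.2, 1.0, 0.25))  # -> (287.36, 0.2)
```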

  4. Development of scan analysis techniques employing a small computer. Progress report, August 1, 1974--July 31, 1975

    International Nuclear Information System (INIS)

    Kuhl, D.E.

    1975-01-01

    Progress is reported in the development of equipment and counting techniques for transverse section scanning of the brain following the administration of radiopharmaceuticals to evaluate regional blood flow. The scanning instrument has an array of 32 scintillation detectors that surround the head and scan data are analyzed using a small computer. (U.S.)

  5. Spectroscopy, modeling and computation of metal chelate solubility in supercritical CO2. 1998 annual progress report

    International Nuclear Information System (INIS)

    Brennecke, J.F.; Chateauneuf, J.E.; Stadtherr, M.A.

    1998-01-01

    'This report summarizes work after 1 year and 8 months (9/15/96-5/14/98) of a 3 year project. Thus far, progress has been made in: (1) the measurement of the solubility of metal chelates in SC CO2 with and without added cosolvents, (2) the spectroscopic determination of preferential solvation of metal chelates by cosolvents in SC CO2 solutions, and (3) the development of a totally reliable computational technique for phase equilibrium computations. An important factor in the removal of metals from solid matrices with CO2/chelate mixtures is the equilibrium solubility of the metal chelate complex in the CO2.'

  6. Aquatic Toxic Analysis by Monitoring Fish Behavior Using Computer Vision: A Recent Progress

    Directory of Open Access Journals (Sweden)

    Chunlei Xia

    2018-01-01

    Full Text Available Video-tracking-based biological early warning systems have achieved great progress with advanced computer vision and machine learning methods. The ability to video-track multiple biological organisms has improved substantially in recent years. Video-based behavioral monitoring has become a common tool for acquiring quantified behavioral data for aquatic risk assessment. Investigation of behavioral responses under chemical and environmental stress has been boosted by rapidly developing machine learning and artificial intelligence. In this paper, we introduce the fundamentals of video tracking and present pioneering work on the precise tracking of groups of individuals in 2D and 3D space. Technical and practical issues encountered in video tracking are explained. Subsequently, toxicity analysis based on fish behavioral data is summarized. Frequently used computational methods and machine learning approaches are explained together with their applications in aquatic toxicity detection and abnormal pattern analysis. Finally, the advantages of recently developed deep learning approaches for toxicity prediction are presented.
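
    The detection step underlying such video tracking (find each animal in every frame before linking detections into trajectories) can be sketched roughly as follows. This is a minimal illustration assuming OpenCV background subtraction; the video file name, area threshold and other parameters are hypothetical and not taken from the article.

```python
import cv2
import numpy as np

# Hypothetical input video of a fish tank; thresholds are illustrative.
capture = cv2.VideoCapture("fish_tank.avi")
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

detections_per_frame = []  # per-frame centroids; linking them into
                           # trajectories would be the next step
while True:
    ok, frame = capture.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for contour in contours:
        if cv2.contourArea(contour) < 50:   # ignore small noise blobs
            continue
        m = cv2.moments(contour)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    detections_per_frame.append(centroids)
capture.release()
```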

  7. Topic 14+16: High-performance and scientific applications and extreme-scale computing (Introduction)

    KAUST Repository

    Downes, Turlough P.

    2013-01-01

    As our understanding of the world around us increases it becomes more challenging to make use of what we already know, and to increase our understanding still further. Computational modeling and simulation have become critical tools in addressing this challenge. The requirements of high-resolution, accurate modeling have outstripped the ability of desktop computers and even small clusters to provide the necessary compute power. Many applications in the scientific and engineering domains now need very large amounts of compute time, while other applications, particularly in the life sciences, frequently have large data I/O requirements. There is thus a growing need for a range of high performance applications which can utilize parallel compute systems effectively, which have efficient data handling strategies and which have the capacity to utilise current and future systems. The High Performance and Scientific Applications topic aims to highlight recent progress in the use of advanced computing and algorithms to address the varied, complex and increasing challenges of modern research throughout both the "hard" and "soft" sciences. This necessitates being able to use large numbers of compute nodes, many of which are equipped with accelerators, and to deal with difficult I/O requirements. © 2013 Springer-Verlag.

  8. Some computational challenges of developing efficient parallel algorithms for data-dependent computations in thermal-hydraulics supercomputer applications

    International Nuclear Information System (INIS)

    Woodruff, S.B.

    1994-01-01

    The Transient Reactor Analysis Code (TRAC), which features a two-fluid treatment of thermal-hydraulics, is designed to model transients in water reactors and related facilities. One of the major computational costs associated with TRAC and similar codes is calculating constitutive coefficients. Although the formulations for these coefficients are local, the costs are flow-regime- or data-dependent; i.e., the computations needed for a given spatial node often vary widely as a function of time. Consequently, a fixed, uniform assignment of nodes to parallel processors will result in degraded computational efficiency due to the poor load balancing. A standard method for treating data-dependent models on vector architectures has been to use gather operations (or indirect addressing) to sort the nodes into subsets that (temporarily) share a common computational model. However, this method is not effective on distributed memory data parallel architectures, where indirect addressing involves expensive communication overhead. Another serious problem with this method involves software engineering challenges in the areas of maintainability and extensibility. For example, an implementation that was hand-tuned to achieve good computational efficiency would have to be rewritten whenever the decision tree governing the sorting was modified. Using an example based on the calculation of the wall-to-liquid and wall-to-vapor heat-transfer coefficients for three nonboiling flow regimes, we describe how the Fortran 90 WHERE construct and automatic inlining of functions can be used to ameliorate this problem while improving both efficiency and software engineering. Unfortunately, a general automatic solution to the load-balancing problem associated with data-dependent computations is not yet available for massively parallel architectures. We discuss why developers should either wait for such solutions or consider alternative numerical algorithms, such as a neural network
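
    The masked-evaluation idea attributed above to the Fortran 90 WHERE construct can be illustrated in NumPy as well: every node keeps its place in the array, and the regime-appropriate correlation is simply selected per node instead of gathering nodes into regime subsets. The correlations and numbers below are invented for illustration and are not TRAC's actual models.

```python
import numpy as np

# Hypothetical per-node data: flow regime index and local temperatures.
regime = np.array([0, 1, 2, 1, 0, 2])          # three nonboiling regimes
t_wall = np.array([550., 560., 570., 555., 545., 565.])
t_liquid = np.array([520., 530., 540., 525., 515., 535.])

# Illustrative (not physical) heat-transfer correlations, one per regime.
h0 = 1.0e3 * (t_wall - t_liquid) ** 0.25
h1 = 2.0e3 * (t_wall - t_liquid) ** 0.33
h2 = 0.5e3 * (t_wall - t_liquid)

# Masked selection keeps the whole array in place and picks the
# regime-appropriate value per node, mirroring WHERE-style semantics
# instead of gather/scatter onto regime subsets.
h_wall_liquid = np.select([regime == 0, regime == 1, regime == 2],
                          [h0, h1, h2])
print(h_wall_liquid)
```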

  9. Characteristics detected on computed tomography angiography predict coronary artery plaque progression in non-culprit lesions

    Energy Technology Data Exchange (ETDEWEB)

    Tan, Ya Hang; Zhou, Jia Zhou; Zhou, Ying; Yang, Xiaobo; Yang, Jun Jie; Chen, Yun Dai [Dept. of Cardiology, Chinese PLA General Hospital, Beijing (China)]

    2017-06-15

    This study sought to determine whether variables detected on coronary computed tomography angiography (CCTA) would predict plaque progression in non-culprit lesions (NCL). In this single-center trial, we analyzed 103 consecutive patients who were undergoing CCTA and percutaneous coronary intervention (PCI) for culprit lesions. Follow-up CCTA was scheduled 12 months after the PCI, and all patients were followed for 3 years after their second CCTA examination. High-risk plaque features and epicardial adipose tissue (EAT) volume were assessed by CCTA. Each NCL stenosis grade was compared visually between two CCTA scans to detect plaque progression, and patients were stratified into two groups based on this. Logistic regression analysis was used to evaluate the factors that were independently associated with plaque progression in NCLs. Time-to-event curves were compared using the log-rank statistic. Overall, 34 of 103 patients exhibited NCL plaque progression (33%). Logistic regression analyses showed that the NCL progression was associated with a history of ST-elevated myocardial infarction (odds ratio [OR] = 5.855, 95% confidence interval [CI] = 1.391–24.635, p = 0.016), follow-up low-density lipoprotein cholesterol level (OR = 6.832, 95% CI = 2.103–22.200, p = 0.001), baseline low-attenuation plaque (OR = 7.311, 95% CI = 1.242–43.028, p = 0.028) and EAT (OR = 1.015, 95% CI = 1.000–1.029, p = 0.044). Following the second CCTA examination, major adverse cardiac events (MACEs) were observed in 12 patients, and NCL plaque progression was significantly associated with future MACEs (log rank p = 0.006). Noninvasive assessment of NCLs by CCTA has potential prognostic value.
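
    A sketch of the kind of analysis reported above (logistic regression of non-culprit-lesion plaque progression on baseline CCTA variables, summarized as odds ratios with 95% confidence intervals) is given below. The column names and the randomly generated data frame are hypothetical stand-ins for the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

np.random.seed(0)

# Hypothetical one-row-per-patient data frame mimicking the study variables.
df = pd.DataFrame({
    "progression": np.random.randint(0, 2, 103),
    "prior_stemi": np.random.randint(0, 2, 103),
    "followup_ldl": np.random.normal(100, 25, 103),
    "low_attenuation_plaque": np.random.randint(0, 2, 103),
    "eat_volume": np.random.normal(120, 40, 103),
})

X = sm.add_constant(df[["prior_stemi", "followup_ldl",
                        "low_attenuation_plaque", "eat_volume"]])
fit = sm.Logit(df["progression"], X).fit()

print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```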

  10. Acute Pancreatitis—Progress and Challenges

    Science.gov (United States)

    Afghani, Elham; Pandol, Stephen J.; Shimosegawa, Tooru; Sutton, Robert; Wu, Bechien U.; Vege, Santhi Swaroop; Gorelick, Fred; Hirota, Morihisa; Windsor, John; Lo, Simon K.; Freeman, Martin L.; Lerch, Markus M.; Tsuji, Yoshihisa; Melmed, Gil Y.; Wassef, Wahid; Mayerle, Julia

    2016-01-01

    An international symposium entitled “Acute pancreatitis: progress and challenges” was held on November 5, 2014 at the Hapuna Beach Hotel, Big Island, Hawaii, as part of the 45th Anniversary Meeting of the American Pancreatic Association and the Japanese Pancreas Society. The course was organized and directed by Drs. Stephen Pandol, Tooru Shimosegawa, Robert Sutton, Bechien Wu, and Santhi Swaroop Vege. The symposium objectives were to: (1) highlight current issues in management of acute pancreatitis, (2) discuss promising treatments, (3) consider development of quality indicators and improved measures of disease activity, and (4) present a framework for international collaboration for development of new therapies. This article represents a compilation and adaptation of brief summaries prepared by speakers at the symposium with the purpose of broadly disseminating information and initiatives. PMID:26465949

  11. Candidiasis: a fungal infection--current challenges and progress in prevention and treatment.

    Science.gov (United States)

    Hani, Umme; Shivakumar, Hosakote G; Vaghela, Rudra; Osmani, Riyaz Ali M; Shrivastava, Atul

    2015-01-01

    Despite therapeutic advances, candidiasis remains a common fungal infection most frequently caused by C. albicans and may occur as vulvovaginal candidiasis or thrush, a mucocutaneous candidiasis. Candidiasis frequently occurs in newborns, in immune-deficient people like AIDS patients, and in people being treated with broad spectrum antibiotics. It is mainly due to C. albicans, while other species such as C. tropicalis, C. glabrata, C. parapsilosis and C. krusei are increasingly isolated. OTC antifungal dosage forms such as creams and gels can be used for effective treatment of local candidiasis, whereas antifungal chemotherapy is preferred for preventing spread of the disease to deeper vital organs. Use of probiotics and the development of novel vaccines are advanced approaches for the prevention of candidiasis. The present review summarizes the diagnosis, current status and challenges in the treatment and prevention of candidiasis, with prime focus on host defense against candidiasis, advancements in diagnosis, the role of probiotics and recent progress in the development of vaccines against candidiasis.

  12. Grid Computing in High Energy Physics

    International Nuclear Information System (INIS)

    Avery, Paul

    2004-01-01

    Over the next two decades, major high energy physics (HEP) experiments, particularly at the Large Hadron Collider, will face unprecedented challenges to achieving their scientific potential. These challenges arise primarily from the rapidly increasing size and complexity of HEP datasets that will be collected and the enormous computational, storage and networking resources that will be deployed by global collaborations in order to process, distribute and analyze them. Coupling such vast information technology resources to globally distributed collaborations of several thousand physicists requires extremely capable computing infrastructures supporting several key areas: (1) computing (providing sufficient computational and storage resources for all processing, simulation and analysis tasks undertaken by the collaborations); (2) networking (deploying high speed networks to transport data quickly between institutions around the world); (3) software (supporting simple and transparent access to data and software resources, regardless of location); (4) collaboration (providing tools that allow members full and fair access to all collaboration resources and enable distributed teams to work effectively, irrespective of location); and (5) education, training and outreach (providing resources and mechanisms for training students and for communicating important information to the public). It is believed that computing infrastructures based on Data Grids and optical networks can meet these challenges and can offer data intensive enterprises in high energy physics and elsewhere a comprehensive, scalable framework for collaboration and resource sharing. A number of Data Grid projects have been underway since 1999. Interestingly, the most exciting and far ranging of these projects are led by collaborations of high energy physicists, computer scientists and scientists from other disciplines in support of experiments with massive, near-term data needs. I review progress in this

  13. Recent progress and new challenges in isospin physics with heavy-ion reactions

    Energy Technology Data Exchange (ETDEWEB)

    Li Baoan [Department of Physics, Texas A and M University-Commerce, Commerce, TX 75429-3011 (United States)], E-mail: Bao-An_Li@Tamu-Commerce.edu; Chen Liewen [Institute of Theoretical Physics, Shanghai Jiao Tong University, Shanghai 200240 (China)], E-mail: Lwchen@Sjtu.edu.cn; Ko, Che Ming [Cyclotron Institute and Physics Department, Texas A and M University, College Station, TX 77843-3366 (United States)], E-mail: Ko@Comp.tamu.edu

    2008-08-15

    The ultimate goal of studying isospin physics via heavy-ion reactions with neutron-rich, stable and/or radioactive nuclei is to explore the isospin dependence of in-medium nuclear effective interactions and the equation of state of neutron-rich nuclear matter, particularly the isospin-dependent term in the equation of state, i.e., the density dependence of the symmetry energy. Because of its great importance for understanding many phenomena in both nuclear physics and astrophysics, the study of the density dependence of the nuclear symmetry energy has been the main focus of the intermediate-energy heavy-ion physics community during the last decade, and significant progress has been achieved both experimentally and theoretically. In particular, a number of phenomena or observables have been identified as sensitive probes to the density dependence of nuclear symmetry energy. Experimental studies have confirmed some of these interesting isospin-dependent effects and allowed us to constrain relatively stringently the symmetry energy at sub-saturation densities. The impact of this constrained density dependence of the symmetry energy on the properties of neutron stars have also been studied, and they were found to be very useful for the astrophysical community. With new opportunities provided by the various radioactive beam facilities being constructed around the world, the study of isospin physics is expected to remain one of the forefront research areas in nuclear physics. In this report, we review the major progress achieved during the last decade in isospin physics with heavy ion reactions and discuss future challenges to the most important issues in this field.
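
    For orientation, the isospin-dependent term referred to above is conventionally written through the parabolic expansion of the equation of state of asymmetric nuclear matter (a standard textbook relation, not quoted from this report):

        E(\rho, \delta) \simeq E(\rho, \delta = 0) + E_{\mathrm{sym}}(\rho)\, \delta^{2}, \qquad \delta = (\rho_n - \rho_p)/\rho,

    where E_{\mathrm{sym}}(\rho) is the density-dependent nuclear symmetry energy that the heavy-ion observables discussed in the review are designed to constrain.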

  14. Progress in multidimensional neutron transport computation

    International Nuclear Information System (INIS)

    Lewis, E.E.

    1977-01-01

    The methods available for solution of the time-independent neutron transport problems arising in the analysis of nuclear systems are examined. The merits of deterministic and Monte Carlo methods are briefly compared. The capabilities of deterministic computational methods derived from the first-order form of the transport equation, from the second-order even-parity form of this equation, and from integral transport formulations are discussed in some detail. Emphasis is placed on the approaches for dealing with the related problems of computer memory requirements, computational cost, and achievable accuracy. Attention is directed to some areas where problems exist currently and where the need for further work appears to be particularly warranted
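
    For reference, the first-order form of the time-independent transport equation from which the methods above are derived reads, in standard notation (a textbook expression, not taken from the paper):

        \hat{\Omega} \cdot \nabla \psi(\mathbf{r}, \hat{\Omega}, E) + \Sigma_t(\mathbf{r}, E)\, \psi(\mathbf{r}, \hat{\Omega}, E) = \int_{4\pi} \int_{0}^{\infty} \Sigma_s(\mathbf{r}, E' \to E, \hat{\Omega}' \cdot \hat{\Omega})\, \psi(\mathbf{r}, \hat{\Omega}', E')\, \mathrm{d}E'\, \mathrm{d}\Omega' + Q(\mathbf{r}, \hat{\Omega}, E),

    and the second-order even-parity form mentioned in the text is obtained by splitting \psi into components that are even and odd in \hat{\Omega} and eliminating the odd part.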

  15. A Novel Quantitative Computed Tomographic Analysis Suggests How Sirolimus Stabilizes Progressive Air Trapping in Lymphangioleiomyomatosis.

    Science.gov (United States)

    Argula, Rahul G; Kokosi, Maria; Lo, Pechin; Kim, Hyun J; Ravenel, James G; Meyer, Cristopher; Goldin, Jonathan; Lee, Hye-Seung; Strange, Charlie; McCormack, Francis X

    2016-03-01

    The Multicenter International Lymphangioleiomyomatosis Efficacy and Safety of Sirolimus (MILES) trial demonstrated that sirolimus stabilized lung function and improved measures of functional performance and quality of life in patients with lymphangioleiomyomatosis. The physiologic mechanisms of these beneficial actions of sirolimus are incompletely understood. To prospectively determine the longitudinal computed tomographic lung imaging correlates of lung function change in MILES patients treated with placebo or sirolimus. We determined the baseline to 12-month change in computed tomographic image-derived lung volumes and the volume of the lung occupied by cysts in the 31 MILES participants (17 in sirolimus group, 14 in placebo group) with baseline and 12-month scans. There was a trend toward an increase in median expiratory cyst volume percentage in the placebo group and a reduction in the sirolimus group (+2.68% vs. +0.97%, respectively; P = 0.10). The computed tomographic image-derived residual volume and the ratio of residual volume to total lung capacity increased more in the placebo group than in the sirolimus group (+214.4 ml vs. +2.9 ml [P = 0.054] and +0.05 ml vs. -0.01 ml [P = 0.0498], respectively). A Markov transition chain analysis of respiratory cycle cyst volume changes revealed greater dynamic variation in the sirolimus group than in the placebo group at the 12-month time point. Collectively, these data suggest that sirolimus attenuates progressive gas trapping in lymphangioleiomyomatosis, consistent with a beneficial effect of the drug on airflow obstruction. We speculate that a reduction in lymphangioleiomyomatosis cell burden around small airways and cyst walls alleviates progressive airflow limitation and facilitates cyst emptying.

  16. Tackling some of the most intricate geophysical challenges via high-performance computing

    Science.gov (United States)

    Khosronejad, A.

    2016-12-01

    Recently, the world has been witnessing significant enhancements in the computing power of supercomputers. Computer clusters, in conjunction with advanced mathematical algorithms, have set the stage for developing and applying powerful numerical tools to tackle some of the most intricate geophysical challenges that today's engineers face. One such challenge is to understand how turbulent flows, in real-world settings, interact with (a) rigid and/or mobile complex bed bathymetry of waterways and sea-beds in the coastal areas; (b) objects with complex geometry that are fully or partially immersed; and (c) the free surface of waterways and water surface waves in the coastal area. This understanding is especially important because the turbulent flows in real-world environments are often bounded by geometrically complex boundaries, which dynamically deform and give rise to multi-scale and multi-physics transport phenomena, and characterized by multi-lateral interactions among various phases (e.g. air/water/sediment phases). Herein, I present some of the multi-scale and multi-physics geophysical fluid mechanics processes that I have attempted to study using an in-house high-performance computational model, the so-called VFS-Geophysics. More specifically, I will present the simulation results of turbulence/sediment/solute/turbine interactions in real-world settings. Parts of the simulations I present are performed to gain scientific insights into processes such as sand wave formation (A. Khosronejad and F. Sotiropoulos, (2014), Numerical simulation of sand waves in a turbulent open channel flow, Journal of Fluid Mechanics, 753:150-216), while others are carried out to predict the effects of climate change and large flood events on societal infrastructures (A. Khosronejad, et al., (2016), Large eddy simulation of turbulence and solute transport in a forested headwater stream, Journal of Geophysical Research, doi: 10.1002/2014JF003423).

  17. Frequency of educational computer use as a longitudinal predictor of educational outcome in young people with specific language impairment.

    Directory of Open Access Journals (Sweden)

    Kevin Durkin

    Full Text Available Computer use draws on linguistic abilities. Using this medium thus presents challenges for young people with Specific Language Impairment (SLI) and raises questions of whether computer-based tasks are appropriate for them. We consider theoretical arguments predicting impaired performance and negative outcomes relative to peers without SLI versus the possibility of positive gains. We examine the relationship between frequency of computer use (for leisure and educational purposes) and educational achievement; in particular, examination performance at the end of compulsory education and level of educational progress two years later. Participants were 49 young people with SLI and 56 typically developing (TD) young people. At around age 17, the two groups did not differ in frequency of educational computer use or leisure computer use. There were no associations between computer use and educational outcomes in the TD group. In the SLI group, after PIQ was controlled for, educational computer use at around 17 years of age contributed substantially to the prediction of educational progress at 19 years. The findings suggest that educational uses of computers are conducive to educational progress in young people with SLI.

  18. Assessing the Progress of Trapped-Ion Processors Towards Fault-Tolerant Quantum Computation

    Science.gov (United States)

    Bermudez, A.; Xu, X.; Nigmatullin, R.; O'Gorman, J.; Negnevitsky, V.; Schindler, P.; Monz, T.; Poschinger, U. G.; Hempel, C.; Home, J.; Schmidt-Kaler, F.; Biercuk, M.; Blatt, R.; Benjamin, S.; Müller, M.

    2017-10-01

    A quantitative assessment of the progress of small prototype quantum processors towards fault-tolerant quantum computation is a problem of current interest in experimental and theoretical quantum information science. We introduce a necessary and fair criterion for quantum error correction (QEC), which must be achieved in the development of these quantum processors before their sizes are sufficiently big to consider the well-known QEC threshold. We apply this criterion to benchmark the ongoing effort in implementing QEC with topological color codes using trapped-ion quantum processors and, more importantly, to guide the future hardware developments that will be required in order to demonstrate beneficial QEC with small topological quantum codes. In doing so, we present a thorough description of a realistic trapped-ion toolbox for QEC and a physically motivated error model that goes beyond standard simplifications in the QEC literature. We focus on laser-based quantum gates realized in two-species trapped-ion crystals in high-optical aperture segmented traps. Our large-scale numerical analysis shows that, with the foreseen technological improvements described here, this platform is a very promising candidate for fault-tolerant quantum computation.

  19. IMPLEMENTING THE COMPUTER-BASED NATIONAL EXAMINATION IN INDONESIAN SCHOOLS: THE CHALLENGES AND STRATEGIES

    Directory of Open Access Journals (Sweden)

    Heri Retnawati

    2017-12-01

    Full Text Available In line with technological development, the computer-based national examination (CBNE) has become an urgent matter as its implementation faces various challenges, especially in developing countries. Strategies in implementing CBNE are thus needed to face the challenges. The aim of this research was to analyse the challenges and strategies of Indonesian schools in implementing CBNE. This research was qualitative phenomenological in nature. The data were collected through a questionnaire and a focus group discussion. The research participants were teachers who were test supervisors and technicians at junior high schools and senior high schools (i.e. Levels 1 and 2) and vocational high schools implementing CBNE in Yogyakarta, Indonesia. The data were analysed using the Bogdan and Biklen model. The results indicate that (1) in implementing CBNE, the schools should initially make efforts to provide the electronic equipment supporting it; (2) the implementation of CBNE is challenged by problems concerning the Internet and the electricity supply; (3) the test supervisors have to learn their duties by themselves and (4) the students are not yet familiar with the beneficial use of information technology. To deal with such challenges, the schools employed strategies by making efforts to provide the standard electronic equipment through collaboration with the students’ parents and improving the curriculum content by adding information technology as a school subject.

  20. Internet ware cloud computing: Challenges

    OpenAIRE

    Qamar, S; Lal, Niranjan; Singh, Mrityunjay

    2010-01-01

    After decades of engineering development and infrastructural investment, Internet connections have become a commodity product in many countries, and Internet-scale “cloud computing” has started to compete with traditional software business through its technological advantages and economy of scale. Cloud computing is a promising enabling technology of Internet ware. Cloud computing is termed the next big thing in the modern corporate world. Apart from the present day software and technologies,...

  1. The Glass Ceiling: Progress and Persistent Challenges

    Science.gov (United States)

    McLlwain, Wendy M.

    2012-01-01

    It has been written that since 2001, there has not been any significant progress and the glass ceiling is still intact. Women are still underrepresented in top positions (Anonymous, 2004). If this is true, the glass ceiling presents a major barrier between women and their desire to advance into executive or senior management positions. In addition…

  2. Tobacco Control Policies in Vietnam: Review on MPOWER Implementation Progress and Challenges.

    Science.gov (United States)

    Minh, Hoang Van; Ngan, Tran Thu; Mai, Vu Quynh; My, Nguyen Thi Tuyet; Chung, Le Hong; Kien, Vu Duy; Anh, Tran Tuan; Ngoc, Nguyen Bao; Giap, Vu Van; Cuong, Nguyen Manh; Manh, Pham Duc; Giang, Kim Bao

    2016-01-01

    In Vietnam, the WHO Framework Convention on Tobacco Control (WHO FCTC) took effect in March 2005, while MPOWER has been implemented since 2008. This paper describes the progress and challenges of implementation of the MPOWER package in Vietnam. We can report that, in terms of monitoring, Vietnam is very active in the Global Tobacco Surveillance System, completing two rounds of the Global Adult Tobacco Survey (GATS) and three rounds of the Global Youth Tobacco Survey (GYTS). To protect people from tobacco smoke, Vietnam has issued and enforced a law requiring comprehensive smoking bans at workplaces and public places since 2013. Tobacco advertising and promotion are also prohibited with the exception of point-of-sale displays of tobacco products. Violations come in the form of promotion girls, corporate social responsibility activities from tobacco manufacturers and packages displayed by retail vendors. Vietnam is one of the 77 countries that require pictorial health warnings to be printed on cigarette packages to warn about the danger of tobacco, and the warnings have been implemented effectively. Cigarette tax is 70% of factory price, which is equal to less than 45% of retail price and much lower than the recommendation of WHO. However, Vietnam is one of the very few countries that require manufacturers and importers to make "compulsory contributions" at 1-2% of the factory price of cigarettes sold in Vietnam for the establishment of a Tobacco Control Fund (TCF). The TCF is operating well. In 2015, 67 units of 63 provinces/cities, 22 ministries and political-social organizations and 6 hospitals received funding from TCF to implement a wide range of tobacco control activities. Cessation services have started with a toll-free quit-line but need to be further strengthened. In conclusion, Vietnam has constantly put efforts into the tobacco control field with high commitment from the government, scientists and activists. Though several remarkable achievements

  3. Computing at the leading edge: Research in the energy sciences

    Energy Technology Data Exchange (ETDEWEB)

    Mirin, A.A.; Van Dyke, P.T. [eds.]

    1994-02-01

    The purpose of this publication is to highlight selected scientific challenges that have been undertaken by the DOE Energy Research community. The high quality of the research reflected in these contributions underscores the growing importance both to the Grand Challenge scientific efforts sponsored by DOE and of the related supporting technologies that the National Energy Research Supercomputer Center (NERSC) and other facilities are able to provide. The continued improvement of the computing resources available to DOE scientists is prerequisite to ensuring their future progress in solving the Grand Challenges. Titles of articles included in this publication include: the numerical tokamak project; static and animated molecular views of a tumorigenic chemical bound to DNA; toward a high-performance climate systems model; modeling molecular processes in the environment; lattice Boltzmann models for flow in porous media; parallel algorithms for modeling superconductors; parallel computing at the Superconducting Super Collider Laboratory; the advanced combustion modeling environment; adaptive methodologies for computational fluid dynamics; lattice simulations of quantum chromodynamics; simulating high-intensity charged-particle beams for the design of high-power accelerators; electronic structure and phase stability of random alloys.

  4. Computing at the leading edge: Research in the energy sciences

    International Nuclear Information System (INIS)

    Mirin, A.A.; Van Dyke, P.T.

    1994-01-01

    The purpose of this publication is to highlight selected scientific challenges that have been undertaken by the DOE Energy Research community. The high quality of the research reflected in these contributions underscores the growing importance both to the Grand Challenge scientific efforts sponsored by DOE and of the related supporting technologies that the National Energy Research Supercomputer Center (NERSC) and other facilities are able to provide. The continued improvement of the computing resources available to DOE scientists is prerequisite to ensuring their future progress in solving the Grand Challenges. Titles of articles included in this publication include: the numerical tokamak project; static and animated molecular views of a tumorigenic chemical bound to DNA; toward a high-performance climate systems model; modeling molecular processes in the environment; lattice Boltzmann models for flow in porous media; parallel algorithms for modeling superconductors; parallel computing at the Superconducting Super Collider Laboratory; the advanced combustion modeling environment; adaptive methodologies for computational fluid dynamics; lattice simulations of quantum chromodynamics; simulating high-intensity charged-particle beams for the design of high-power accelerators; electronic structure and phase stability of random alloys

  5. Computational science: Emerging opportunities and challenges

    International Nuclear Information System (INIS)

    Hendrickson, Bruce

    2009-01-01

    In the past two decades, computational methods have emerged as an essential component of the scientific and engineering enterprise. A diverse assortment of scientific applications has been simulated and explored via advanced computational techniques. Computer vendors have built enormous parallel machines to support these activities, and the research community has developed new algorithms and codes, and agreed on standards to facilitate ever more ambitious computations. However, this track record of success will be increasingly hard to sustain in coming years. Power limitations constrain processor clock speeds, so further performance improvements will need to come from ever more parallelism. This higher degree of parallelism will require new thinking about algorithms, programming models, and architectural resilience. Simultaneously, cutting edge science increasingly requires more complex simulations with unstructured and adaptive grids, and multi-scale and multi-physics phenomena. These new codes will push existing parallelization strategies to their limits and beyond. Emerging data-rich scientific applications are also in need of high performance computing, but their complex spatial and temporal data access patterns do not perform well on existing machines. These interacting forces will reshape high performance computing in the coming years.

  6. Thrifty: An Exascale Architecture for Energy Proportional Computing

    Energy Technology Data Exchange (ETDEWEB)

    Torrellas, Josep [Univ. of Illinois, Champaign, IL (United States)]

    2014-12-23

    The objective of this project is to design different aspects of a novel exascale architecture called Thrifty. Our goal is to focus on the challenges of power/energy efficiency, performance, and resiliency in exascale systems. The project includes work on computer architecture (Josep Torrellas from University of Illinois), compilation (Daniel Quinlan from Lawrence Livermore National Laboratory), runtime and applications (Laura Carrington from University of California San Diego), and circuits (Wilfred Pinfold from Intel Corporation). In this report, we focus on the progress at the University of Illinois during the last year of the grant (September 1, 2013 to August 31, 2014). We also point to the progress in the other collaborating institutions when needed.

  7. Progress and challenges in cleaning up Hanford

    Energy Technology Data Exchange (ETDEWEB)

    Wagoner, J.D. [Dept. of Energy, Richland, WA (United States)]

    1997-08-01

    This paper presents captioned viewgraphs which briefly summarize cleanup efforts at the Hanford Site. Underground waste tank and spent nuclear fuel issues are described. Progress is reported for the Plutonium Finishing Plant, PUREX plant, B-Plant/Waste Encapsulation Storage Facility, and Fast Flux Test Facility. A very brief overview of costs and number of sites remediated and/or decommissioned is given.

  8. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    Science.gov (United States)

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
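
    The stoichiometric principle behind ICPMS-based absolute quantification can be reduced to a one-line calculation: the amount of protein follows from the measured amount of a hetero-element (or tagged element) divided by the number of atoms of that element per protein molecule. The numbers below are purely illustrative.

```python
# Illustration of the stoichiometric principle described above; all values
# are hypothetical, not measurements from the article.

sulfur_measured_nmol = 12.0   # ICPMS result for the isolated protein fraction
sulfur_atoms_per_protein = 4  # e.g. known Cys + Met content of the protein

protein_nmol = sulfur_measured_nmol / sulfur_atoms_per_protein
print(protein_nmol)  # 3.0 nmol of protein
```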

  9. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Science.gov (United States)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  10. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Energy Technology Data Exchange (ETDEWEB)

    King, W. E., E-mail: weking@llnl.gov [Physical and Life Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A. [Engineering Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Kamath, C. [Computation Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Rubenchik, A. M. [NIF and Photon Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States)

    2015-12-15

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  11. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    International Nuclear Information System (INIS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A.; Kamath, C.; Rubenchik, A. M.

    2015-01-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process

  12. Relearning and Retaining Personally-Relevant Words using Computer-Based Flashcard Software in Primary Progressive Aphasia

    Directory of Open Access Journals (Sweden)

    William Streicher Evans

    2016-11-01

    Full Text Available Although anomia treatments have often focused on training small sets of words in the hopes of promoting generalization to untrained items, an alternative is to directly train a larger set of words more efficiently. The current case report describes a novel treatment for a patient with semantic variant Primary Progressive Aphasia (svPPA), in which the patient was taught to make and practice flashcards for personally-relevant words using an open-source computer program (Anki). Results show that the patient was able to relearn and retain a large subset of her studied words over a 20-month period. At the end of treatment, she showed good retention for 139 studied words, far more than the number typically treated in svPPA studies. Furthermore, she showed evidence of stimulus generalization to confrontation-naming tasks for studied items, and of relearning forgotten items with additional practice. This case represents a successful example of patient-centered, computer-based asynchronous telepractice. It also illustrates how data captured from computer-based treatments can provide powerful practice-based evidence, obtained during routine clinical care.
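
    Programs such as Anki schedule each card with a spaced-repetition rule in the spirit of the SM-2 algorithm. The sketch below is a deliberately simplified, hypothetical version of such a scheduler, intended only to illustrate why successfully recalled cards are practiced at growing intervals while lapsed cards return quickly; it is not the program's actual algorithm.

```python
def next_interval(interval_days, ease, quality):
    """Simplified SM-2-style spaced-repetition update.

    quality: self-rated recall on a 0-5 scale. The constants are
    illustrative and only approximate real flashcard schedulers.
    """
    if quality < 3:                        # lapse: relearn from a short interval
        return 1, max(1.3, ease - 0.2)
    ease = max(1.3, ease + 0.1 - (5 - quality) * 0.08)
    return round(interval_days * ease), ease

# A word card reviewed successfully three times in a row:
interval, ease = 1, 2.5
for q in (4, 5, 4):
    interval, ease = next_interval(interval, ease, q)
    print(interval, round(ease, 2))   # intervals grow roughly geometrically
```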

  13. Nuclear challenges and progress in designing stellarator fusion power plants

    International Nuclear Information System (INIS)

    El-Guebaly, L.A.; Wilson, P.; Henderson, D.; Sawan, M.; Sviatoslavsky, G.; Tautges, T.; Slaybaugh, R.; Kiedrowski, B.; Ibrahim, A.

    2008-01-01

    Over the past 5-6 decades, stellarator power plants have been studied in the US, Europe, and Japan as an alternate to the mainline magnetic fusion tokamaks, offering steady-state operation and eliminating the risk of plasma disruptions. The earlier 1980s studies suggested large-scale stellarator power plants with an average major radius exceeding 20 m. The most recent development of the compact stellarator concept delivered ARIES-CS - a compact stellarator with 7.75 m average major radius, approaching that of tokamaks. For stellarators, the most important engineering parameter that determines the machine size and cost is the minimum distance between the plasma boundary and mid-coil. Accommodating the breeding blanket and necessary shield within this distance to protect the ARIES-CS superconducting magnet represents a challenging task. Selecting the ARIES-CS nuclear and engineering parameters to produce an economic optimum, modeling the complex geometry for 3D nuclear analysis to confirm the key parameters, and minimizing the radwaste stream received considerable attention during the design process. These engineering design elements combined with advanced physics helped enable the compact stellarator to be a viable concept. This paper provides a brief historical overview of the progress in designing stellarator power plants and a perspective on the successful integration of the nuclear activity into the final ARIES-CS configuration

  14. Cloud Computing Security Issues and Challenges

    OpenAIRE

    Kuyoro S. O.; Ibikunle F; Awodele O

    2011-01-01

    Cloud computing is a set of IT services that are provided to a customer over a network on a leased basis and with the ability to scale their service requirements up or down. Usually cloud computing services are delivered by a third-party provider who owns the infrastructure. Its advantages, to mention but a few, include scalability, resilience, flexibility, efficiency and outsourcing of non-core activities. Cloud computing offers an innovative business model for organizations to adopt IT services w...

  15. Computational design of proteins with novel structure and functions

    International Nuclear Information System (INIS)

    Yang Wei; Lai Lu-Hua

    2016-01-01

    Computational design of proteins is a relatively new field, where scientists search the enormous sequence space for sequences that can fold into desired structure and perform desired functions. With the computational approach, proteins can be designed, for example, as regulators of biological processes, novel enzymes, or as biotherapeutics. These approaches not only provide valuable information for understanding of sequence–structure–function relations in proteins, but also hold promise for applications to protein engineering and biomedical research. In this review, we briefly introduce the rationale for computational protein design, then summarize the recent progress in this field, including de novo protein design, enzyme design, and design of protein–protein interactions. Challenges and future prospects of this field are also discussed. (topical review)

  16. Accomplishments and challenges of surgical simulation.

    Science.gov (United States)

    Satava, R M

    2001-03-01

    For nearly a decade, advanced computer technologies have created extraordinary educational tools using three-dimensional (3D) visualization and virtual reality. Pioneering efforts in surgical simulation with these tools have resulted in a first generation of simulators for surgical technical skills. Accomplishments include simulations with 3D models of anatomy for practice of surgical tasks, initial assessment of student performance in technical skills, and awareness by professional societies of potential in surgical education and certification. However, enormous challenges remain, which include improvement of technical fidelity, standardization of accurate metrics for performance evaluation, integration of simulators into a robust educational curriculum, stringent evaluation of simulators for effectiveness and value added to surgical training, determination of simulation application to certification of surgical technical skills, and a business model to implement and disseminate simulation successfully throughout the medical education community. This review looks at the historical progress of surgical simulators, their accomplishments, and the challenges that remain.

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  18. Computational analyses of ancient pathogen DNA from herbarium samples: challenges and prospects.

    Science.gov (United States)

    Yoshida, Kentaro; Sasaki, Eriko; Kamoun, Sophien

    2015-01-01

    The application of DNA sequencing technology to the study of ancient DNA has enabled the reconstruction of past epidemics from genomes of historically important plant-associated microbes. Recently, the genome sequences of the potato late blight pathogen Phytophthora infestans were analyzed from 19th century herbarium specimens. These herbarium samples originated from infected potatoes collected during and after the Irish potato famine. Herbaria have therefore great potential to help elucidate past epidemics of crops, date the emergence of pathogens, and inform about past pathogen population dynamics. DNA preservation in herbarium samples was unexpectedly good, raising the possibility of a whole new research area in plant and microbial genomics. However, the recovered DNA can be extremely fragmented resulting in specific challenges in reconstructing genome sequences. Here we review some of the challenges in computational analyses of ancient DNA from herbarium samples. We also applied the recently developed linkage method to haplotype reconstruction of diploid or polyploid genomes from fragmented ancient DNA.

  19. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    Science.gov (United States)

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

    Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of Pharmaceutical Science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel potential compounds for a certain target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, specifications and advantages are compared between experimental and computational FBDD, and additionally, limitations and future prospects are discussed and emphasized.

  20. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States)]; Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States)]; Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States)]; Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)]

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
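
    The scale of the burden described above follows from simple arithmetic: a yearlong QSTS run at 1-second resolution is a chain of tens of millions of power flows that must be solved sequentially, because each step's controller states depend on the previous step. The per-solve time assumed below is hypothetical.

```python
# Back-of-the-envelope view of the QSTS computational burden.
steps_per_year = 365 * 24 * 3600     # 31,536,000 sequential power flows
seconds_per_power_flow = 0.005       # assumed solve time for one feeder

total_hours = steps_per_year * seconds_per_power_flow / 3600
print(round(total_hours, 1))         # ~43.8 h, within the 10-120 h range cited
```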

  1. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  2. Copy-Right for Software and Computer Games: Strategies and Challenges

    Directory of Open Access Journals (Sweden)

    Hojatollah Ayoubi

    2009-11-01

    Full Text Available Copyright was initially applied in the cultural and art industries. Since then, there have been two different approaches to the matter: the commercial-economic approach, which is concerned with the rights of suppliers and investors; and the cultural approach, which is concerned above all with the rights of the author. The first approach is rooted in Anglo-American countries, while the second is originally French. The expansion of the computer market and the separation of the software and hardware markets led to the so-called velvet robbery, that is, the illegal reproduction of software in the market. Consequently, there have been efforts all over the world to protect the rights of software producers. The present study reviews the different strategies for meeting this challenge, along with the domestic and international difficulties these strategies encounter.

  3. Towards Effective Non-Invasive Brain-Computer Interfaces Dedicated to Gait Rehabilitation Systems

    Directory of Open Access Journals (Sweden)

    Thierry Castermans

    2013-12-01

    Full Text Available In the last few years, significant progress has been made in the field of gait rehabilitation. Motor cortex signals in bipedal monkeys have been decoded to predict walking kinematics. Epidural electrical stimulation in rats and in one young paraplegic patient has been used to partially restore motor control after spinal cord injury. However, these experimental trials are far from being applicable to all patients suffering from motor impairments, and simpler rehabilitation systems are desirable in the meantime. The goal of this review is to describe and summarize the progress made in the development of non-invasive brain-computer interfaces dedicated to motor rehabilitation systems. In the first part, the main principles of human locomotion control are presented. The paper then focuses on the mechanisms of supra-spinal centers active during gait, including results from electroencephalography, functional brain imaging technologies [near-infrared spectroscopy (NIRS), functional magnetic resonance imaging (fMRI), positron-emission tomography (PET), single-photon emission-computed tomography (SPECT)] and invasive studies. The first brain-computer interface (BCI) applications to gait rehabilitation are then presented, with a discussion of the different strategies developed in the field. The challenges to be addressed by future systems are identified and discussed. Finally, we present some proposals to address these challenges, in order to contribute to the improvement of BCI for gait rehabilitation.
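
    Many non-invasive BCIs of the kind surveyed here derive features from the power of EEG rhythms in specific frequency bands. The minimal sketch below, an illustration rather than any system described in the review, computes mu- and beta-band power from a synthetic single-channel signal with SciPy's Welch estimator; the sampling rate and signal are assumptions.

        import numpy as np
        from scipy.signal import welch

        fs = 250                                   # sampling rate in Hz (assumption)
        t = np.arange(0, 4, 1 / fs)                # 4 s of synthetic single-channel EEG
        eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

        # Power spectral density via Welch's method.
        freqs, psd = welch(eeg, fs=fs, nperseg=fs)

        def band_power(freqs, psd, lo, hi):
            """Integrate the PSD over a frequency band."""
            mask = (freqs >= lo) & (freqs <= hi)
            return np.trapz(psd[mask], freqs[mask])

        mu_power = band_power(freqs, psd, 8, 12)      # mu rhythm
        beta_power = band_power(freqs, psd, 13, 30)   # beta rhythm
        print(f"mu: {mu_power:.3f}, beta: {beta_power:.3f}")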

  4. Relationship of computed tomography perfusion and positron emission tomography to tumour progression in malignant glioma

    International Nuclear Information System (INIS)

    Yeung, Timothy P C; Yartsev, Slav; Lee, Ting-Yim; Wong, Eugene; He, Wenqing; Fisher, Barbara; VanderSpek, Lauren L; Macdonald, David; Bauman, Glenn

    2014-01-01

    Introduction: This study aimed to explore the potential of computed tomography (CT) perfusion and 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) in predicting sites of future progressive tumour on a voxel-by-voxel basis after radiotherapy and chemotherapy. Methods: Ten patients underwent pre-radiotherapy magnetic resonance (MR) imaging, FDG-PET and CT perfusion near the end of radiotherapy, and repeated post-radiotherapy follow-up MR scans. The relationships between these images and tumour progression were assessed using logistic regression. Cross-validation with receiver operating characteristic (ROC) analysis was used to assess the value of these images in predicting sites of tumour progression. Results: Pre-radiotherapy MR-defined gross tumour; near-end-of-radiotherapy CT-defined enhancing lesion; CT perfusion blood flow (BF), blood volume (BV) and permeability-surface area (PS) product; FDG-PET standard uptake value (SUV); and SUV:BF showed significant associations with tumour progression on follow-up MR imaging (P < 0.0001). The mean sensitivity (±standard deviation), specificity and area under the ROC curve (AUC) of PS were 0.64 ± 0.15, 0.74 ± 0.07 and 0.72 ± 0.12 respectively. This mean AUC was higher than that of the pre-radiotherapy MR-defined gross tumour and near-end-of-radiotherapy CT-defined enhancing lesion (both AUCs = 0.6 ± 0.1, P ≤ 0.03). The multivariate model using BF, BV, PS and SUV had a mean AUC of 0.8 ± 0.1, but this was not significantly higher than that of the PS-only model. Conclusion: PS is the single best predictor of tumour progression when compared to the other parameters, but voxel-based prediction based on logistic regression had modest sensitivity and specificity.
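
    The voxel-wise analysis described above can be reproduced in outline with standard tools: fit a logistic regression of progression status on the imaging parameters and cross-validate the ROC AUC. The sketch below uses synthetic data and scikit-learn and is only a schematic of the approach, not the authors' implementation; all variable names and values are made up.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        # Synthetic voxel data: columns stand in for BF, BV, PS and SUV (arbitrary units).
        n_voxels = 2000
        X = rng.normal(size=(n_voxels, 4))
        # Synthetic ground truth: progression loosely driven by the "PS" column.
        y = (X[:, 2] + 0.5 * rng.normal(size=n_voxels) > 0.5).astype(int)

        model = LogisticRegression()
        auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print(f"cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")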

  5. Relationship of computed tomography perfusion and positron emission tomography to tumour progression in malignant glioma

    Energy Technology Data Exchange (ETDEWEB)

    Yeung, Timothy P C [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Robarts Research Institute, The University of Western Ontario, Ontario, Canada, N6A 5B7 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Yartsev, Slav [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Lee, Ting-Yim [Robarts Research Institute, The University of Western Ontario, Ontario, Canada, N6A 5B7 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Department of Medical Imaging, The University of Western Ontario, London Health Sciences Centre, Victoria Hospital, Ontario, Canada, N6A 5W9 (Australia); Lawson Health Research Institute, St. Joseph' s Health Care London, Ontario, Canada, N6A 4V2 (Canada); Wong, Eugene [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Department of Physics and Astronomy, The University of Western Ontario, Ontario, Canada, N6A 3K7 (Canada); He, Wenqing [Department of Statistical and Actuarial Sciences, The University of Western Ontario, Ontario, Canada, N6A 5B7 (Canada); Fisher, Barbara; VanderSpek, Lauren L [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Macdonald, David [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Department of Clinical Neurological Sciences, The University of Western Ontario, London Health Sciences Centre, University Hospital, Ontario, Canada, N6A 5A5 (Canada); Bauman, Glenn, E-mail: glenn.bauman@lhsc.on.ca [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada)

    2014-02-15

    Introduction: This study aimed to explore the potential for computed tomography (CT) perfusion and 18-Fluorodeoxyglucose positron emission tomography (FDG-PET) in predicting sites of future progressive tumour on a voxel-by-voxel basis after radiotherapy and chemotherapy. Methods: Ten patients underwent pre-radiotherapy magnetic resonance (MR), FDG-PET and CT perfusion near the end of radiotherapy and repeated post-radiotherapy follow-up MR scans. The relationships between these images and tumour progression were assessed using logistic regression. Cross-validation with receiver operating characteristic (ROC) analysis was used to assess the value of these images in predicting sites of tumour progression. Results: Pre-radiotherapy MR-defined gross tumour; near-end-of-radiotherapy CT-defined enhancing lesion; CT perfusion blood flow (BF), blood volume (BV) and permeability-surface area (PS) product; FDG-PET standard uptake value (SUV); and SUV:BF showed significant associations with tumour progression on follow-up MR imaging (P < 0.0001). The mean sensitivity (±standard deviation), specificity and area under the ROC curve (AUC) of PS were 0.64 ± 0.15, 0.74 ± 0.07 and 0.72 ± 0.12 respectively. This mean AUC was higher than that of the pre-radiotherapy MR-defined gross tumour and near-end-of-radiotherapy CT-defined enhancing lesion (both AUCs = 0.6 ± 0.1, P ≤ 0.03). The multivariate model using BF, BV, PS and SUV had a mean AUC of 0.8 ± 0.1, but this was not significantly higher than the PS only model. Conclusion: PS is the single best predictor of tumour progression when compared to other parameters, but voxel-based prediction based on logistic regression had modest sensitivity and specificity.

  6. Computer-aided detection systems to improve lung cancer early diagnosis: state-of-the-art and challenges

    International Nuclear Information System (INIS)

    Traverso, A; Lopez Torres, E; Cerello, P; Fantacci, M E

    2017-01-01

    Lung cancer is one of the most lethal types of cancer because it is rarely diagnosed early enough. In fact, the detection of pulmonary nodules, which are potential lung cancers, in Computed Tomography scans is a very challenging and time-consuming task for radiologists. To support radiologists, researchers have developed Computer-Aided Diagnosis (CAD) systems for the automated detection of pulmonary nodules in chest Computed Tomography scans. Despite the high level of technological development and the proven benefits for overall detection performance, the use of Computer-Aided Diagnosis in clinical practice is far from being a common procedure. In this paper we investigate the causes underlying this discrepancy and present a solution to tackle it: the M5L WEB- and Cloud-based on-demand Computer-Aided Diagnosis. In addition, we show how the combination of traditional image processing techniques with state-of-the-art classification algorithms allows building a system whose performance can be considerably better than that of any Computer-Aided Diagnosis system developed so far. This outcome opens the possibility of using CAD as clinical decision support for radiologists. (paper)
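
    A typical CAD pipeline of the kind discussed here first extracts nodule candidates with classical image processing and then classifies each candidate with a learned model. The sketch below is a deliberately simplified, hypothetical illustration of those two stages using scikit-image and scikit-learn on synthetic data; it is not the M5L system, and the training examples are invented.

        import numpy as np
        from skimage import measure
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)

        # Synthetic 2-D "CT slice": background noise plus a few bright blobs.
        image = rng.normal(0.2, 0.05, size=(128, 128))
        image[30:36, 40:46] += 0.6    # a bright blob standing in for a nodule
        image[80:83, 90:93] += 0.4    # a fainter blob

        # Stage 1: classical candidate detection by thresholding and labelling.
        mask = image > 0.5
        labels = measure.label(mask)
        candidates = measure.regionprops(labels, intensity_image=image)
        features = np.array([[c.area, c.mean_intensity] for c in candidates])

        # Stage 2: classify candidates (here trained on made-up labelled examples).
        train_X = np.array([[30, 0.8], [5, 0.55], [40, 0.9], [4, 0.5]])
        train_y = np.array([1, 0, 1, 0])           # 1 = nodule, 0 = false positive
        clf = RandomForestClassifier(random_state=0).fit(train_X, train_y)
        print("candidate predictions:", clf.predict(features))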

  7. Computer-aided detection systems to improve lung cancer early diagnosis: state-of-the-art and challenges

    Science.gov (United States)

    Traverso, A.; Lopez Torres, E.; Fantacci, M. E.; Cerello, P.

    2017-05-01

    Lung cancer is one of the most lethal types of cancer because it is rarely diagnosed early enough. In fact, the detection of pulmonary nodules, which are potential lung cancers, in Computed Tomography scans is a very challenging and time-consuming task for radiologists. To support radiologists, researchers have developed Computer-Aided Diagnosis (CAD) systems for the automated detection of pulmonary nodules in chest Computed Tomography scans. Despite the high level of technological development and the proven benefits for overall detection performance, the use of Computer-Aided Diagnosis in clinical practice is far from being a common procedure. In this paper we investigate the causes underlying this discrepancy and present a solution to tackle it: the M5L WEB- and Cloud-based on-demand Computer-Aided Diagnosis. In addition, we show how the combination of traditional image processing techniques with state-of-the-art classification algorithms allows building a system whose performance can be considerably better than that of any Computer-Aided Diagnosis system developed so far. This outcome opens the possibility of using CAD as clinical decision support for radiologists.

  8. Addressing current challenges in cancer immunotherapy with mathematical and computational modelling.

    Science.gov (United States)

    Konstorum, Anna; Vella, Anthony T; Adler, Adam J; Laubenbacher, Reinhard C

    2017-06-01

    The goal of cancer immunotherapy is to boost a patient's immune response to a tumour. Yet, the design of an effective immunotherapy is complicated by various factors, including a potentially immunosuppressive tumour microenvironment, immune-modulating effects of conventional treatments and therapy-related toxicities. These complexities can be incorporated into mathematical and computational models of cancer immunotherapy that can then be used to aid in rational therapy design. In this review, we survey modelling approaches under the umbrella of the major challenges facing immunotherapy development, which encompass tumour classification, optimal treatment scheduling and combination therapy design. Although overlapping, each challenge has presented unique opportunities for modellers to make contributions using analytical and numerical analysis of model outcomes, as well as optimization algorithms. We discuss several examples of models that have grown in complexity as more biological information has become available, showcasing how model development is a dynamic process interlinked with the rapid advances in tumour-immune biology. We conclude the review with recommendations for modellers both with respect to methodology and biological direction that might help keep modellers at the forefront of cancer immunotherapy development. © 2017 The Author(s).
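
    As a flavour of the kind of model surveyed here, the sketch below integrates a minimal two-equation tumour-immune system (logistic tumour growth, an immune kill term, and immune recruitment, stimulation and decay) with SciPy. The equations and parameter values are illustrative assumptions only and are not taken from any specific model in the review.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative parameters (hypothetical, not from the review).
        r, K = 0.4, 1e9        # tumour growth rate and carrying capacity
        k = 1e-9               # immune-mediated kill rate
        s, d = 1e4, 0.1        # baseline immune recruitment and decay
        a = 1e-10              # tumour-stimulated immune proliferation

        def tumour_immune(t, y):
            T, E = y
            dT = r * T * (1 - T / K) - k * T * E     # tumour cells
            dE = s + a * T * E - d * E               # effector immune cells
            return [dT, dE]

        sol = solve_ivp(tumour_immune, (0, 200), [1e6, 1e5])
        T_final, E_final = sol.y[:, -1]
        print(f"after 200 days: tumour ~{T_final:.2e} cells, effectors ~{E_final:.2e}")

    In practice, an immunotherapy term (for example, an extra recruitment or kill rate switched on over a treatment schedule) would be added and the schedule optimized against such a model.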

  9. Progresses in application of computational fluid dynamics methods to large scale wind turbine aerodynamics

    Institute of Scientific and Technical Information of China (English)

    Zhenyu ZHANG; Ning ZHAO; Wei ZHONG; Long WANG; Bofeng XU

    2016-01-01

    The computational fluid dynamics (CFD) methods are applied to aerodynamic problems for large scale wind turbines. The progresses including the aerodynamic analyses of wind turbine profiles, numerical flow simulation of wind turbine blades, evaluation of aerodynamic performance, and multi-objective blade optimization are discussed. Based on the CFD methods, significant improvements are obtained to predict two/three-dimensional aerodynamic characteristics of wind turbine airfoils and blades, and the vortical structure in their wake flows is accurately captured. Combining with a multi-objective genetic algorithm, a 1.5 MW NH-1500 optimized blade is designed with high efficiency in wind energy conversion.
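
    Multi-objective blade optimization of the kind mentioned above relies on ranking candidate designs by Pareto dominance. The short sketch below extracts the non-dominated set from a list of hypothetical (power coefficient, blade mass) design evaluations; it illustrates only the selection step of a multi-objective genetic algorithm, not the NH-1500 design process itself, and the numbers are invented.

        # Hypothetical designs evaluated as (power coefficient, blade mass in tonnes).
        designs = [(0.47, 20.5), (0.45, 18.0), (0.48, 22.0), (0.44, 17.5), (0.46, 21.0)]

        def dominates(a, b):
            """a dominates b if it has no lower power and no higher mass, and is strictly better in one."""
            no_worse = a[0] >= b[0] and a[1] <= b[1]
            strictly_better = a[0] > b[0] or a[1] < b[1]
            return no_worse and strictly_better

        pareto_front = [d for d in designs
                        if not any(dominates(other, d) for other in designs)]
        print("non-dominated designs:", pareto_front)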

  10. Comparison of progressive addition lenses for general purpose and for computer vision: an office field study.

    Science.gov (United States)

    Jaschinski, Wolfgang; König, Mirjam; Mekontso, Tiofil M; Ohlendorf, Arne; Welscher, Monique

    2015-05-01

    Two types of progressive addition lenses (PALs) were compared in an office field study: 1. General purpose PALs with continuous clear vision between infinity and near reading distances and 2. Computer vision PALs with a wider zone of clear vision at the monitor and in near vision but no clear distance vision. Twenty-three presbyopic participants wore each type of lens for two weeks in a double-masked four-week quasi-experimental procedure that included an adaptation phase (Weeks 1 and 2) and a test phase (Weeks 3 and 4). Questionnaires on visual and musculoskeletal conditions as well as preferences regarding the type of lenses were administered. After eight more weeks of free use of the spectacles, the preferences were assessed again. The ergonomic conditions were analysed from photographs. Head inclination when looking at the monitor was significantly lower by 2.3 degrees with the computer vision PALs than with the general purpose PALs. Vision at the monitor was judged significantly better with computer PALs, while distance vision was judged better with general purpose PALs; however, the reported advantage of computer vision PALs differed in extent between participants. Accordingly, 61 per cent of the participants preferred the computer vision PALs, when asked without information about lens design. After full information about lens characteristics and additional eight weeks of free spectacle use, 44 per cent preferred the computer vision PALs. On average, computer vision PALs were rated significantly better with respect to vision at the monitor during the experimental part of the study. In the final forced-choice ratings, approximately half of the participants preferred either the computer vision PAL or the general purpose PAL. Individual factors seem to play a role in this preference and in the rated advantage of computer vision PALs. © 2015 The Authors. Clinical and Experimental Optometry © 2015 Optometry Australia.

  11. Summary of researches being performed in the Institute of Mathematics and Computer Science on computer science and information technologies

    Directory of Open Access Journals (Sweden)

    Artiom Alhazov

    2008-07-01

    Full Text Available Evolution of the informatization notion (which assumes automation of the majority of human activities using computers, computer networks and information technologies) towards the notion of the Global Information Society (GIS) challenges the determination of new paradigms of society: automation and intellectualization of production, a new level of education and teaching, formation of new styles of work, active participation in decision making, etc. Assuring the transition to GIS for any society, including that of the Republic of Moldova, requires both special training and broad application of progressive technologies and information systems. Methodological aspects concerning the impact of GIS creation on citizens, economic units and the national economy as a whole demand profound study; without a systematic approach to these aspects, the creation of GIS would confront great difficulties. The collective of researchers of the Institute of Mathematics and Computer Science (IMCS) of the Academy of Sciences of Moldova working in the field of computer science constitutes a centre of advanced research and is active in those directions of computer science research that facilitate the technologies and applications without which the development of GIS cannot be assured.

  12. Applied & Computational Mathematics Challenges for the Design and Control of Dynamic Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; Burns, J A; Collis, S; Grosh, J; Jacobson, C A; Johansen, H; Mezic, I; Narayanan, S; Wetter, M

    2011-03-10

    The Energy Independence and Security Act of 2007 (EISA) was passed with the goal 'to move the United States toward greater energy independence and security.' Energy security and independence cannot be achieved unless the United States addresses the issue of energy consumption in the building sector and significantly reduces energy consumption in buildings. Commercial and residential buildings account for approximately 40% of the U.S. energy consumption and emit 50% of CO2 emissions in the U.S., which is more than twice the total energy consumption of the entire U.S. automobile and light truck fleet. A 50%-80% improvement in building energy efficiency in both new construction and in retrofitting existing buildings could significantly reduce U.S. energy consumption and mitigate climate change. Reaching these aggressive building efficiency goals will not happen without significant Federal investments in areas of computational and mathematical sciences. Applied and computational mathematics are required to enable the development of algorithms and tools to design, control and optimize energy efficient buildings. The challenge has been issued by the U.S. Secretary of Energy, Dr. Steven Chu (emphasis added): 'We need to do more transformational research at DOE including computer design tools for commercial and residential buildings that enable reductions in energy consumption of up to 80 percent with investments that will pay for themselves in less than 10 years.' On July 8-9, 2010 a team of technical experts from industry, government and academia were assembled in Arlington, Virginia to identify the challenges associated with developing and deploying new computational methodologies and tools that will address building energy efficiency. These experts concluded that investments in fundamental applied and computational mathematics will be required to build enabling technology that can be used to realize the target of 80% reductions in energy

  13. The use of the Climate-science Computational End Station (CCES) development and grand challenge team for the next IPCC assessment: an operational plan

    International Nuclear Information System (INIS)

    Washington, W M; Buja, L; Gent, P; Drake, J; Erickson, D; Anderson, D; Bader, D; Dickinson, R; Ghan, S; Jones, P; Jacob, R

    2008-01-01

    The grand challenge of climate change science is to predict future climates based on scenarios of anthropogenic emissions and other changes resulting from options in energy and development policies. Addressing this challenge requires a Climate Science Computational End Station consisting of a sustained climate model research, development, and application program combined with world-class DOE leadership computing resources to enable advanced computational simulation of the Earth system. This project provides the primary computer allocations for the DOE SciDAC and Climate Change Prediction Program. It builds on the successful interagency collaboration of the National Science Foundation and the U.S. Department of Energy in developing and applying the Community Climate System Model (CCSM) for climate change science. It also includes collaboration with the National Aeronautics and Space Administration in carbon data assimilation and university partners with expertise in high-end computational climate research

  14. ATLAS distributed computing: experience and evolution

    International Nuclear Information System (INIS)

    Nairz, A

    2014-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb −1 of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centres around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics programme including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2015 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, energies and event complexities. An essential requirement will be the efficient utilisation of current and future processor technologies as well as a broad range of computing platforms, including supercomputing and cloud resources. We will report on experience gained thus far and our progress in preparing ATLAS computing for the future

  15. Science Education Reform in Qatar: Progress and Challenges

    Science.gov (United States)

    Said, Ziad

    2016-01-01

    Science education reform in Qatar has had limited success. In the Trends in International Mathematics and Science Study (TIMMS), Qatari 4th and 8th grade students have shown progress in science achievement, but they remain significantly below the international average. Also, in the Program for International Student Assessment (PISA), Qatari…

  16. Safety risk management of underground engineering in China: Progress, challenges and strategies

    Directory of Open Access Journals (Sweden)

    Qihu Qian

    2016-08-01

    Full Text Available Underground construction in China is featured by large scale, high speed, long construction period, complex operation and frustrating situations regarding project safety. Various accidents have been reported from time to time, resulting in serious social impact and huge economic loss. This paper presents the main progress in the safety risk management of underground engineering in China over the last decade, i.e. (1) establishment of laws and regulations for safety risk management of underground engineering, (2) implementation of the safety risk management plan, (3) establishment of decision support system for risk management and early-warning based on information technology, and (4) strengthening the study on safety risk management, prediction and prevention. Based on the analysis of the typical accidents in China in the last decade, the new challenges in the safety risk management for underground engineering are identified as follows: (1) control of unsafe human behaviors; (2) technological innovation in safety risk management; and (3) design of safety risk management regulations. Finally, the strategies for safety risk management of underground engineering in China are proposed in six aspects, i.e. the safety risk management system and policy, law, administration, economy, education and technology.

  17. Petroleum industry's current progress and challenges in dealing with the year 2000 problem

    International Nuclear Information System (INIS)

    Kraus, D.I.; Radu, C.G.; McKenzie, S.; Stuart, K.

    1998-01-01

    The steps that some major oil companies in Canada have taken to prepare their computers and automated equipment for the year 2000 (Y2K) are described. It is acknowledged that with 1700 retail service stations, over 300 wholesale operations, and 26 terminals, the extent of the problem is great. In addition some 38 upstream and 65 downstream applications have been identified as mission critical, not counting the 30 mission critical control systems in the upstream field operations and approximately the same number of operations in the downstream refinery. The good news is that remediation and testing are well underway nationally and the Calgary test laboratory will be commercially available in 1999. Getting management on-side, selling the positive aspects of Y2K, making good use of reputable consulting companies, and keeping employees properly informed of problems and progress are some of the key criteria in solving Y2K problems successfully

  18. Measles and rubella elimination in the WHO Region for Europe: progress and challenges.

    Science.gov (United States)

    O'Connor, P; Jankovic, D; Muscat, M; Ben-Mamou, M; Reef, S; Papania, M; Singh, S; Kaloumenos, T; Butler, R; Datta, S

    2017-08-01

    Globally measles remains one of the leading causes of death among young children even though a safe and cost-effective vaccine is available. The World Health Organization (WHO) European Region has seen a decline in measles and rubella cases in recent years. The recent outbreaks have primarily affected adolescents and young adults with no vaccination or an incomplete vaccination history. Eliminating measles and rubella is one of the top immunization priorities of the European Region as outlined in the European Vaccine Action Plan 2015-2020. Following the 2010 decision by the Member States in the Region to initiate the process of verifying elimination, the European Regional Verification Commission for Measles and Rubella Elimination (RVC) was established in 2011. The RVC meets every year to evaluate the status of measles and rubella elimination in the Region based on documentation submitted by each country's National Verification Committees. The verification process was however modified in late 2014 to assess the elimination status at the individual country level instead of at regional level. The WHO European Region has made substantial progress towards measles and rubella elimination over the past 5 years. The RVC's conclusion in 2016 that 70% and 66% of the 53 Member States in the Region had interrupted the endemic transmission of measles and rubella, respectively, by 2015 is a testament to this progress. Nevertheless, where measles and rubella remain endemic, challenges in vaccination service delivery and disease surveillance will need to be addressed through focused technical assistance from WHO and development partners. Copyright © 2017 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  19. Geant4 Hadronic Cascade Models and CMS Data Analysis : Computational Challenges in the LHC era

    CERN Document Server

    Heikkinen, Aatos

    This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis work to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, the computer technology of clusters, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions, up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in Geant4 simulation tool, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we es...

  20. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC year 1 quarter 4 progress report.

    Energy Technology Data Exchange (ETDEWEB)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C. (Energy Systems)

    2011-12-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability

  1. Exercising CMS dataflows and workflows in computing challenges at the Spanish Tier-1 and Tier-2 sites

    International Nuclear Information System (INIS)

    Caballero, J; Colino, N; Peris, A D; G-Abia, P; Hernandez, J M; R-Calonge, F J; Cabrillo, I; Caballero, I G; Marco, R; Matorras, F; Flix, J; Merino, G

    2008-01-01

    An overview of the data transfer, processing and analysis operations conducted at the Spanish Tier-1 (PIC, Barcelona) and Tier-2 (CIEMAT-Madrid and IFCA-Santander federation) centres during the past CMS CSA06 Computing, Software and Analysis challenge and in preparation for CSA07 is presented

  2. Combining Brain–Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges

    Science.gov (United States)

    Millán, J. d. R.; Rupp, R.; Müller-Putz, G. R.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kübler, A.; Leeb, R.; Neuper, C.; Müller, K.-R.; Mattia, D.

    2010-01-01

    In recent years, new research has brought the field of electroencephalogram (EEG)-based brain–computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user–machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human–computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices. PMID:20877434

  3. Storming the Citadel: The Fundamental Revolution Against Progressive Education.

    Science.gov (United States)

    Vetterli, Richard

    The first four chapters ("Progressive Education,""The Impact of Progressive Education,""The Remedies: Focusing on the Wrong Problems," and "Progressive Education Challenged") examine the deleterious effect that progressive education has had on student achievement and on society as a whole. The last five…

  4. Egyptian women in physics: Progress and challenges

    Science.gov (United States)

    Mohsen, M.; Hosni, Hala; Mohamed, Hadeer; Gadalla, Afaf; Kahil, Heba; Hashem, Hassan

    2015-12-01

    The present study shows a progressive increase in the number of female physicists as undergraduates and postgraduates in several governmental universities. For instance, in Ain Shams University, the percentage of women who selected physics as a major course of study increased from 7.2% in 2012 to 10.8% in 2013 and 15.7% in 2014. The study also provides the current gender distribution in the various positions among the teaching staff in seven governmental universities. The data supports the fact that female teaching assistants are increasing in these universities.

  5. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  6. Looking from Within: Prospects and Challenges for Progressive Education in Indonesia

    Science.gov (United States)

    Zulfikar, Teuku

    2013-01-01

    Many Indonesian scholars (Azra, 2002; Darmaningtyas, 2004; Yunus, 2004), have attempted to bring progressive education to their country. They believe that progressive practices such as critical thinking, critical dialogue and child-centered instruction will help students learn better. However, this implementation is resisted because of cultural…

  7. Exercising CMS dataflows and workflows in computing challenges at the Spanish Tier-1 and Tier-2 sites

    Energy Technology Data Exchange (ETDEWEB)

    Caballero, J; Colino, N; Peris, A D; G-Abia, P; Hernandez, J M; R-Calonge, F J [CIEMAT, Madrid (Spain); Cabrillo, I; Caballero, I G; Marco, R; Matorras, F [IFCA, Santander (Spain); Flix, J; Merino, G [PIC, Barcelona (Spain)], E-mail: jose.hernandez@ciemat.es

    2008-07-15

    An overview of the data transfer, processing and analysis operations conducted at the Spanish Tier-1 (PIC, Barcelona) and Tier-2 (CIEMAT-Madrid and IFCA-Santander federation) centres during the past CMS CSA06 Computing, Software and Analysis challenge and in preparation for CSA07 is presented.

  8. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    OpenAIRE

    Dang Hung; Dinh Tien Tuan Anh; Chang Ee-Chien; Ooi Beng Chin

    2017-01-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using generic solutions such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation effi...
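
    The scramble-then-compute idea can be illustrated very schematically: permute the records with a secret permutation inside the trusted environment so that subsequent data movement no longer reveals the original positions, run the computation on the scrambled data, then undo the permutation for position-dependent results. The sketch below is only a toy illustration of that flow, not the paper's algorithms or security guarantees, and the data are hypothetical.

        import secrets

        data = [17, 3, 42, 8, 23, 5]               # hypothetical sensitive records

        # Scramble: a secret random permutation generated inside trusted memory.
        perm = list(range(len(data)))
        rng = secrets.SystemRandom()
        rng.shuffle(perm)
        scrambled = [data[i] for i in perm]

        # Compute on the scrambled data; observed access patterns now refer to shuffled positions.
        ranks_scrambled = sorted(range(len(scrambled)), key=lambda i: scrambled[i])

        # Un-scramble: map results back to the original record order.
        rank_of = [0] * len(data)
        for rank, shuffled_pos in enumerate(ranks_scrambled):
            rank_of[perm[shuffled_pos]] = rank
        print("rank of each original record:", rank_of)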

  9. Cloud Computing Security Issues - Challenges and Opportunities

    OpenAIRE

    Vaikunth, Pai T.; Aithal, P. S.

    2017-01-01

    Cloud computing services, enabled through information and communication technology and delivered to customers as services over the Internet on a leased basis, have the capability to scale up or down according to service requirements or needs. In this model, the infrastructure is owned by a third-party vendor and the cloud computing services are delivered to the requesting customers. The cloud computing model has many advantages including scalability, flexibility, elasticity, efficiency, and supports outsourcing ...

  10. Nuclear challenges and progress in designing stellarator power plants

    International Nuclear Information System (INIS)

    El-Guebaly, L.

    2007-01-01

    As an alternative to the mainline magnetic fusion tokamaks, the stellarator concept offers steady-state operation without externally driven current, eliminating the risk of plasma disruptions. Over the past 2-3 decades, stellarator power plants have been studied in the U.S., Japan, and Europe to enhance the physics and engineering aspects and optimize the design parameters that are subject to numerous constraints. The earlier 1980s studies delivered large stellarators with an average major radius exceeding 20 m. The more recent development of the compact stellarator concept has led to the construction of the National Compact Stellarator Experiment (NCSX) in the U.S. and the three-year power plant study of ARIES-CS, a compact stellarator with a 7.75 m average major radius, approaching that of tokamaks. The ARIES-CS first wall configuration deviates from the standard practice of a uniform toroidal shape in order to achieve compactness. Modeling such a complex geometry for 3-D nuclear analysis was a challenging engineering task. A novel approach based on coupling the CAD model with the MCNP Monte Carlo code was developed to model, for the first time ever, the complex stellarator geometry for nuclear assessments. The most important parameter that determines the stellarator size and cost is the minimum distance between the plasma boundary and mid-coil. Accommodating the breeding blanket and the shield necessary to protect the superconducting magnet represented another challenging task. An innovative approach utilizing a non-uniform blanket combined with a highly efficient WC shield for this highly constrained area reduced the radial standoff (and machine size and cost) by 25-30%, which is significant. As stellarators generate more radwaste than tokamaks, managing ARIES-CS active materials during operation and after plant decommissioning was essential for the environmental attractiveness of the machine. The geological disposal option could be replaced with more attractive scenarios

  11. Exploring the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms: A Formative Analysis

    Directory of Open Access Journals (Sweden)

    Robin H. Kay

    2011-04-01

    Full Text Available Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess beneficial and challenging laptop behaviours in higher education classrooms. Both quantitative and qualitative data were collected from 177 undergraduate university students (89 males, 88 females). Key benefits observed include note-taking activities, in-class laptop-based academic tasks, collaboration, increased focus, improved organization and efficiency, and addressing special needs. Key challenges noted include other students' distracting laptop behaviours, instant messaging, surfing the web, playing games, watching movies, and decreased focus. Nearly three-quarters of the students claimed that laptops were useful in supporting their academic experience. Twice as many benefits were reported compared to challenges. It is speculated that the integration of meaningful laptop activities is a critical determinant of the benefits and challenges experienced in higher education classrooms.

  12. Fifth annual progress report for Canada's climate change voluntary challenge and registry program

    International Nuclear Information System (INIS)

    1999-10-01

    Suncor Energy is a growing Canada-based integrated energy company comprising a corporate group and four operating businesses including: Oil Sands with a mine and upgrading facility at Fort McMurray, AB, Exploration and Production with conventional and heavy oil business in Western Canada, a Sunoco refining and marketing operation, and the Stuart Oil Shale Development Project in Queensland, Australia. While the emphasis is laid on technical and economic advances made by the company, the environmental tradeoffs, namely, greater greenhouse gas emissions and the need to reduce them, are noted. The most important positive item in the report is the incredible transformation occurring in Suncor's business operations. The company has begun a $2 billion expansion in its Oil Sands business that will more than double production of crude oil and fuel products by 2002. The expansion initiative provides a wonderful opportunity to demonstrate the huge leaps in performance that can be implemented at the time of capital stock turnover. The new expansion facilities are designed to be twice as energy efficient as the existing plant. Equally dramatic and hard won are the multitude of incremental improvements achieved in existing facilities. Through energy management systems and operating practices and procedures, exploration and production is reversing the trend of rising greenhouse gas (GHG) emission intensity associated with mature conventional reservoirs, and Suncor achieved its best ever operating performance in 1998. However, the volume of Suncor greenhouse gas emissions remains on an upward trend, which is a challenge for the future. As part of its mission to become a sustainable energy company, Suncor will continue to attempt to limit its net volume contribution of GHGs to the atmosphere to 1990 levels by pursuing domestic and international offsets and the development of alternative and renewable sources of energy. Progress towards sustainability for both Suncor and Canada

  13. Public Service Motivation Research : Achievements, Challenges, and Future Directions

    NARCIS (Netherlands)

    Perry, James L.; Vandenabeele, Wouter

    2015-01-01

    This article takes stock of public service motivation research to identify achievements, challenges, and an agenda for research to build on progress made since 1990. After enumerating achievements and challenges, the authors take stock of progress on extant proposals to strengthen research. In

  14. Progress and challenges in improving the nutritional quality of rice (Oryza sativa L.).

    Science.gov (United States)

    Birla, Deep Shikha; Malik, Kapil; Sainger, Manish; Chaudhary, Darshna; Jaiwal, Ranjana; Jaiwal, Pawan K

    2017-07-24

    Rice is a staple food for more than 3 billion people in more than 100 countries of the world but ironically it is deficient in many bioavailable vitamins, minerals, essential amino- and fatty-acids and phytochemicals that prevent chronic diseases like type 2 diabetes, heart disease, cancers, and obesity. To enhance the nutritional and other quality aspects of rice, a better understanding of the regulation of the processes involved in the synthesis, uptake, transport, and metabolism of macro-(starch, seed storage protein and lipid) and micronutrients (vitamins, minerals and phytochemicals) is required. With the publication of a high quality genomic sequence of rice, significant progress has been made in the identification, isolation, and characterization of novel genes and their regulation for the nutritional and quality enhancement of rice. During the last decade, numerous efforts have been made to refine the nutritional and other quality traits either by using traditional breeding with high-throughput technologies such as marker-assisted selection and breeding, or by adopting the transgenic approach. A significant improvement in vitamins (A, folate, and E), mineral (iron), essential amino acid (lysine), and flavonoid levels has been achieved in the edible part of rice, i.e., the endosperm (biofortification), to meet the daily dietary allowance. However, studies on the bioavailability and allergenicity of biofortified rice are still required. Despite the numerous efforts, the commercialization of biofortified rice has not yet been achieved. The present review summarizes the progress and challenges of genetic engineering and/or metabolic engineering technologies to improve rice grain quality, and presents the future prospects in developing nutrient dense rice to save the ever-increasing population that depends solely on rice as a staple food from widespread nutritional deficiencies.

  15. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    Science.gov (United States)

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-01-01

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients’ electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents the unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While being devoted to promote the application of information technology onto telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology. PMID:24232290

  16. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    Directory of Open Access Journals (Sweden)

    Chung-Chi Yang

    2013-11-01

    Full Text Available Many studies have indicated that computing technology can enable off-site cardiologists to read patients’ electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents the unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While being devoted to promote the application of information technology onto telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology.

  17. Mobile, cloud, and big data computing: contributions, challenges, and new directions in telecardiology.

    Science.gov (United States)

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-11-13

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients' electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents the unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While being devoted to promote the application of information technology onto telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology.
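
    Of the three issues listed above, data confidentiality in the cloud is the most directly illustrated in code: records can be encrypted before they leave the hospital. The sketch below assumes the widely used Python cryptography package and shows symmetric encryption of a hypothetical ECG record with Fernet; key management and the other two issues (interoperability, latency) are outside its scope, and the record content is invented.

        from cryptography.fernet import Fernet

        # Key generated and kept inside the hospital; never uploaded with the data.
        key = Fernet.generate_key()
        cipher = Fernet(key)

        ecg_record = b'{"patient": "anon-0042", "lead_II_mV": [0.1, 0.4, 1.1, 0.3]}'

        token = cipher.encrypt(ecg_record)          # ciphertext safe to store in the cloud
        print("stored in cloud:", token[:32], b"...")

        restored = cipher.decrypt(token)            # decrypted on-site for tele-diagnosis
        assert restored == ecg_record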

  18. The Office of Science Data-Management Challenge

    Energy Technology Data Exchange (ETDEWEB)

    Mount, Richard P.; /SLAC

    2005-10-10

    Science--like business, national security, and even everyday life--is becoming more and more data intensive. In some sciences the data-management challenge already exceeds the compute-power challenge in its needed resources. Leadership in applying computing to science will necessarily require both world-class computing and world-class data management. The Office of Science program needs a leadership-class capability in scientific data management. Currently two-thirds of Office of Science research and development in data management is left to the individual scientific programs. About $18M/year is spent by the programs on data-management research and development targeted at their most urgent needs. This is to be compared with the $9M/year spent on data management by DOE computer science. This highly mission-directed approach has been effective, but only in meeting just the highest-priority needs of individual programs. A coherent, leadership-class, program of data management is clearly warranted by the scale and nature of the Office of Science programs. More directly, much of the Office of Science portfolio is in desperate need of such a program; without it, data management could easily become the primary bottleneck to scientific progress within the next five years. When grouped into simulation-intensive science, experiment/observation-intensive science, and information-intensive science, the Office of Science programs show striking commonalities in their data-management needs. Not just research and development but also packaging and hardening as well as maintenance and support are required. Meeting these needs is a medium- to long-term effort requiring a well-planned program of evolving investment. We propose an Office of Science Data-Management Program at an initial scale of $32M/year of new funding. The program should be managed by a Director charged with creating and maintaining a forward-looking approach to multiscience data-management challenges. The program

  19. Nispero: a cloud-computing based Scala tool specially suited for bioinformatics data processing

    OpenAIRE

    Evdokim Kovach; Alexey Alekhin; Eduardo Pareja Tobes; Raquel Tobes; Eduardo Pareja; Marina Manrique

    2014-01-01

    Nowadays it is widely accepted that bioinformatics data analysis is a real bottleneck in many research activities related to the life sciences. High-throughput technologies like Next Generation Sequencing (NGS) have completely reshaped the biology and bioinformatics landscape. Undoubtedly NGS has allowed important progress in many life-sciences-related fields but has also presented interesting challenges in terms of computation capabilities and algorithms. Many kinds of tasks related to NGS...

  20. Challenges and progress in turbomachinery design systems

    International Nuclear Information System (INIS)

    Van den Braembussche, R A

    2013-01-01

    This paper first describes the requirements that a modern design system should meet, followed by a comparison between design systems based on inverse design or optimization techniques. The second part of the paper presents how these challenges are addressed by an optimization method combining evolutionary theory and a metamodel. Extensions to multi-disciplinary, multi-point and multi-objective optimization are illustrated by examples.
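
    The combination of an evolutionary search with a metamodel can be illustrated with a short Python sketch: a cheap surrogate, trained on the designs already evaluated, pre-screens mutated candidates so that only the most promising ones are passed to the expensive objective. The objective function and all parameter values below are hypothetical stand-ins for a real blade-design problem, not the method described in the paper.

        import numpy as np

        # Hypothetical "expensive" objective standing in for a CFD evaluation of a
        # candidate geometry; in practice this would be a flow solver returning a loss.
        def expensive_objective(x):
            return np.sum((x - 0.3) ** 2) + 0.1 * np.sin(10 * x).sum()

        rng = np.random.default_rng(0)
        dim, pop_size, generations = 4, 20, 15

        # Archive of truly evaluated designs, used to train the surrogate (metamodel).
        X_arch = rng.uniform(0, 1, size=(pop_size, dim))
        y_arch = np.array([expensive_objective(x) for x in X_arch])

        def surrogate_predict(X_query):
            """Inverse-distance-weighted interpolation over the archive (a simple metamodel)."""
            d = np.linalg.norm(X_query[:, None, :] - X_arch[None, :, :], axis=2) + 1e-12
            w = 1.0 / d ** 2
            return (w @ y_arch) / w.sum(axis=1)

        population = X_arch.copy()
        for gen in range(generations):
            # Evolutionary step: mutate parents to create a large candidate pool.
            candidates = np.clip(np.repeat(population, 5, axis=0)
                                 + rng.normal(0, 0.05, size=(pop_size * 5, dim)), 0, 1)
            # Metamodel step: pre-screen candidates cheaply, keep the most promising.
            promising = candidates[np.argsort(surrogate_predict(candidates))[:pop_size]]
            # Evaluate only the screened designs with the expensive objective.
            y_new = np.array([expensive_objective(x) for x in promising])
            X_arch = np.vstack([X_arch, promising])
            y_arch = np.concatenate([y_arch, y_new])
            # Keep the best designs evaluated so far as the next parent population.
            population = X_arch[np.argsort(y_arch)[:pop_size]]

        print("best design:", X_arch[np.argmin(y_arch)], "objective:", float(y_arch.min()))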

  1. Progressive systemic sclerosis: high-resolution computed tomography findings; Esclerose sistemica progressiva: aspectos na tomografia computadorizada de alta resolucao

    Energy Technology Data Exchange (ETDEWEB)

    Gasparetto, Emerson L.; Pimenta, Rodrigo; Ono, Sergio E.; Escuissato, Dante L. [Parana Univ., Curitiba, PR (Brazil). Hospital de Clinicas. Servico de Radiologia Medica]. E-mail: dante.luiz@onda.com.br; Inoue, Cesar [Parana Univ., Curitiba, PR (Brazil). Faculdade de Medicina

    2005-09-15

    Objective: To describe the high-resolution computed tomography findings in the lung of patients with systemic sclerosis, independently of the respiratory symptoms. Materials and methods: Seventy-three high-resolution computed tomography scans of 44 patients with clinical diagnosis of systemic sclerosis were reviewed and defined by the consensus of two radiologists. Results: Abnormalities were seen in 91.8% (n = 67) of the scans. The most frequent findings were reticular pattern (90.4%), ground-glass opacities (63%), traction bronchiectasis and bronchiolectasis (56.2%), esophageal dilatation (46.6%), honeycombing pattern (28.8%) and signs of pulmonary hypertension (15.6%). In most cases the lesions were bilateral (89%) and symmetrical (58.5%). The lesions were predominantly located in the basal (91.2%) and peripheral (92.2%) regions. Conclusion: In the majority of the patients, progressive systemic sclerosis can cause pulmonary fibrosis mainly characterized by reticular pattern with basal and peripheral distribution on high-resolution computed tomography. (author)

  2. Advances in medical image computing.

    Science.gov (United States)

    Tolxdorff, T; Deserno, T M; Handels, H; Meinzer, H-P

    2009-01-01

    Medical image computing has become a key technology in high-tech applications in medicine and a ubiquitous part of modern imaging systems and the related processes of clinical diagnosis and intervention. Over the past years, significant progress has been made in the field, at both the methodological and the application level. Despite this progress, there are still big challenges to meet in order to establish image processing routinely in health care. In this issue, selected contributions of the German Conference on Medical Image Processing (BVM) are assembled to present the latest advances in the field of medical image computing. The winners of scientific awards of the German Conference on Medical Image Processing (BVM) 2008 were invited to submit a manuscript on their latest developments and results for possible publication in Methods of Information in Medicine. Finally, seven excellent papers were selected to describe important aspects of recent advances in the field of medical image processing. The selected papers give an impression of the breadth and heterogeneity of new developments. New methods for improved image segmentation, non-linear image registration and modeling of organs are presented together with applications of image analysis methods in different medical disciplines. Furthermore, state-of-the-art tools and techniques to support the development and evaluation of medical image processing systems in practice are described. The selected articles describe different aspects of the intense development in medical image computing. The image processing methods presented enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.

  3. Computational approaches to identify functional genetic variants in cancer genomes

    DEFF Research Database (Denmark)

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris

    2013-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor but only a minority of these drive tumor progression. We present the result of discussions within the ICGC on how to address the challenge of identifying mutations that contribute to oncogenesis, tumor maintenance or response to therapy, and recommend computational techniques to annotate somatic variants and predict their impact on cancer phenotype.
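
    As a toy illustration of one of the simplest signals such techniques build on, recurrence of mutations in the same gene across independent tumors, the short Python sketch below counts per-gene recurrence from a hypothetical list of somatic calls; real driver-detection pipelines combine this with functional-impact scores and background mutation-rate models.

        from collections import Counter

        # Hypothetical (gene, tumor_id) somatic mutation calls; real data would come
        # from annotated VCF/MAF files produced by a variant-calling pipeline.
        calls = [("TP53", "t1"), ("TP53", "t2"), ("KRAS", "t1"),
                 ("TTN", "t3"), ("TP53", "t4"), ("KRAS", "t4")]

        # Count in how many distinct tumors each gene is mutated; recurrence across
        # tumors is one simple hint that a mutation may be a driver rather than a passenger.
        recurrence = Counter(gene for gene, tumor in set(calls))
        for gene, n_tumors in recurrence.most_common():
            print(f"{gene}: mutated in {n_tumors} tumors")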

  4. Engineering of obligate intracellular bacteria: progress, challenges and paradigms

    Science.gov (United States)

    Over twenty years have passed since the first report of genetic manipulation of an obligate intracellular bacterium. Through progress interspersed by bouts of stagnation, microbiologists and geneticists have developed approaches to genetically manipulate obligates. A brief overview of the current ge...

  5. The quantum computer game: citizen science

    Science.gov (United States)

    Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob

    2013-05-01

    Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context with a global high-score that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models such as Gross-Pitaevskii and Bose-Hubbard are currently under development. The game has also been incorporated into science education at the high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research, CODER.
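
    The kind of score the game assigns to a player's candidate control can be illustrated with a toy two-level system in Python; this is not the optical-lattice model used in the actual game, and the pulse shape, penalty weight and error model are assumptions made purely for illustration.

        import numpy as np
        from scipy.linalg import expm

        # Toy qubit driven by a constant-amplitude pulse that rotates it about the x axis.
        sx = np.array([[0, 1], [1, 0]], dtype=complex)
        target = sx  # ideal X (NOT) gate

        def gate(amplitude, duration):
            """Unitary produced by a square pulse with Hamiltonian H = amplitude * sx / 2."""
            return expm(-1j * amplitude * sx / 2 * duration)

        def fidelity(U):
            return abs(np.trace(target.conj().T @ U)) / 2

        def score(amplitude, duration, time_penalty=0.05, error=0.02):
            """Reward high fidelity, short gate time, and robustness to amplitude errors."""
            f_nominal = fidelity(gate(amplitude, duration))
            f_perturbed = fidelity(gate(amplitude * (1 + error), duration))
            robustness = 1 - abs(f_nominal - f_perturbed)
            return f_nominal * robustness - time_penalty * duration

        # A "player move": pick pulse parameters and see the resulting score.
        for duration in (1.0, np.pi, 5.0):
            print(f"duration={duration:.3f}  score={score(1.0, duration):.4f}")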

  6. Real-time fMRI neurofeedback: Progress and challenges

    Science.gov (United States)

    Sulzer, J.; Haller, S.; Scharnowski, F.; Weiskopf, N.; Birbaumer, N.; Blefari, M.L.; Bruehl, A.B.; Cohen, L.G.; deCharms, R.C.; Gassert, R.; Goebel, R.; Herwig, U.; LaConte, S.; Linden, D.; Luft, A.; Seifritz, E.; Sitaram, R.

    2016-01-01

    In February of 2012, the first international conference on real time functional magnetic resonance imaging (rtfMRI) neurofeedback was held at the Swiss Federal Institute of Technology Zurich (ETHZ), Switzerland. This review summarizes progress in the field, introduces current debates, elucidates open questions, and offers viewpoints derived from the conference. The review offers perspectives on study design, scientific and clinical applications, rtfMRI learning mechanisms and future outlook. PMID:23541800

  7. Community Cloud Computing

    Science.gov (United States)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  8. Coupled Atmosphere-Wave-Ocean Modeling of Tropical Cyclones: Progress, Challenges, and Ways Forward

    Science.gov (United States)

    Chen, Shuyi

    2015-04-01

    It is found that the air-sea fluxes are quite asymmetric around a storm with complex features representing various air-sea interaction processes in TCs. A unique observation in Typhoon Fanapi is the development of a stable boundary layer in the near-storm cold wake region, which has a direct impact on TC inner core structure and intensity. Despite this progress, challenges remain. Air-sea momentum exchange at wind speeds greater than 30-40 m/s is largely unresolved. Directional wind-wave stress and wave-current stress are difficult to determine from observations. Effects of sea spray on the air-sea fluxes are still not well understood. This talk will provide an overview of progress made in recent years, challenges we are facing, and ways forward. An integrated coupled observational and atmosphere-wave-ocean modeling system is urgently needed, in which coupled model development and targeted observations from field campaigns and lab measurements together form the core of the research and prediction system. Another important aspect is that fully coupled models provide explicit, integrated impact forecasts of wind, rain, waves, ocean currents and surges in TCs and winter storms, which are missing in most current NWP models. It requires a new strategy for model development, evaluation, and verification. Ensemble forecasts using high-resolution coupled atmosphere-wave-ocean models can provide probabilistic forecasts and quantitative uncertainty estimates, which also allow us to explore new methodologies to verify probabilistic impact forecasts and evaluate model physics using a stochastic approach. Examples of such an approach in TCs, including Superstorm Sandy, will be presented.

  9. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  10. Developing E-Governance in the Eurasian Economic Union: Progress, Challenges and Prospects

    Directory of Open Access Journals (Sweden)

    Lyudmila Vidiasova

    2017-03-01

    Full Text Available The article provides an overview of e-governance development in the members of the Eurasian Economic Union (EEU). There is a lack of integrated research on e-governance in the EEU countries, although given the strengthening of this regional bloc, new information and communication technologies (ICT) could serve as a significant growth driver. Given the history and specifics of regional cooperation in the post-Soviet space and international best practices in ICT use in regional blocs, this article reviews the development of e-governance in the EEU members. The research methodology was based on a three-stage concept of regionalism [Van Langenhov, Coste, 2005]. The study examines three key components: progress in developing e-governance, barriers to that development and future prospects. It used qualitative and quantitative methods. Data sources included the results of the United Nations E-Government rating, EEU countries’ regulations based on 3,200 documents and the authors’ expert survey, in which 18 experts (12 EEU representatives and six international experts) participated. The study revealed the progress made by EEU countries in improving technological development and reducing human capital development indicators. The survey identified key barriers to e-governance development in the EEU: low motivation and information technology skills among civil servants, and citizens’ low computer literacy. The analysis of EEU members’ national economic priorities revealed a common focus on ICT development. The authors concluded that prospects for e-governance in the EEU were associated with strengthening regional cooperation in standardizing information systems, implementing one-stop-shop services, managing electronic documents and expanding online services. The authors presented two areas for developing e-governance within the EEU. The first is external integration, which, if strengthened, would affect the economy positively and optimize business processes

  11. Challenges for the computational fluid dynamics codes in the nineties. Various examples of application

    International Nuclear Information System (INIS)

    Chabard, J.P.; Viollet, P.L.

    1991-08-01

    Most of the computational fluid dynamics applications which are encountered at the Research and Development Division of EDF (RDD) deal with thermal exchanges. The development of numerical tools for the simulation of flows, devoted to this class of application, has been under way for 15 years. At the beginning this work was mainly concerned with a good simulation of the dynamics of the flow. Now these tools can be used to compute flows with thermal exchanges. The presentation will be limited to incompressible, single-phase flows. First, the software developed at RDD will be presented. Then some applications of these tools to flows with thermal exchanges will be discussed. To conclude, the paper will treat the general case of the CFD codes. The challenges for the next years will be detailed in order to make these tools available for users involved in complex physical modeling.

  12. Computational electromagnetics—retrospective and outlook in honor of Wolfgang J.R. Hoefer

    CERN Document Server

    Chen, Zhizhang

    2015-01-01

    The book will cover the past, present and future developments of field theory and computational electromagnetics. The first two chapters will give an overview of the historical developments and the present state of the art in computational electromagnetics. These two chapters will set the stage for discussing recent progress, new developments, challenges, trends, and major directions in computational electromagnetics with three main emphases:
      a. Modeling of ever larger structures with multi-scale dimensions and multi-level descriptions (behavioral, circuit, network and field levels) and transient behaviours
      b. Inclusion of physical effects other than electromagnetic: quantum effects, thermal effects, mechanical effects and nanoscale features
      c. New developments in available computer hardware, programming paradigms (MPI, OpenMP, CUDA, and OpenCL) and the associated new modeling approaches
    These are the current emerging topics in the area of computational electromagnetics and may provide reader...

  13. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres

  14. Computer, Informatics, Cybernetics and Applications : Proceedings of the CICA 2011

    CERN Document Server

    Hua, Ertian; Lin, Yun; Liu, Xiaozhu

    2012-01-01

    Computer Informatics Cybernetics and Applications offers 91 papers chosen for publication from among 184 papers accepted for presentation to the International Conference on Computer, Informatics, Cybernetics and Applications 2011 (CICA 2011), held in Hangzhou, China, September 13-16, 2011. The CICA 2011 conference provided a forum for engineers and scientists in academia, industry, and government to address the most innovative research and development including technical challenges and social, legal, political, and economic issues, and to present and discuss their ideas, results, work in progress and experience on all aspects of Computer, Informatics, Cybernetics and Applications. Reflecting the broad scope of the conference, the contents are organized in these topical categories: Communication Technologies and Applications Intelligence and Biometrics Technologies Networks Systems and Web Technologies Data Modeling and Programming Languages Digital Image Processing Optimization and Scheduling Education and In...

  15. Advanced methods for the computation of particle beam transport and the computation of electromagnetic fields and beam-cavity interactions. Progress report, July 1993--August 1994

    International Nuclear Information System (INIS)

    Dragt, A.J.; Gluckstern, R.L.

    1994-08-01

    The University of Maryland Dynamical Systems and Accelerator Theory Group has been carrying out long-term research work in the general area of Dynamical Systems with a particular emphasis on applications to Accelerator Physics. This work is broadly divided into two tasks: the computation of charged particle beam transport and the computation of electromagnetic fields and beam-cavity interactions. Each of these tasks is described briefly. Work is devoted both to the development of new methods and the application of these methods to problems of current interest in accelerator physics including the theoretical performance of present and proposed high energy machines. In addition to its research effort, the Dynamical Systems and Accelerator Theory Group is actively engaged in the education of students and postdoctoral research associates. Substantial progress in research has been made during the past year. These achievements are summarized in the following report

  16. The Computational Fluid Dynamics Rupture Challenge 2013--Phase II: Variability of Hemodynamic Simulations in Two Intracranial Aneurysms.

    Science.gov (United States)

    Berg, Philipp; Roloff, Christoph; Beuing, Oliver; Voss, Samuel; Sugiyama, Shin-Ichiro; Aristokleous, Nicolas; Anayiotos, Andreas S; Ashton, Neil; Revell, Alistair; Bressloff, Neil W; Brown, Alistair G; Chung, Bong Jae; Cebral, Juan R; Copelli, Gabriele; Fu, Wenyu; Qiao, Aike; Geers, Arjan J; Hodis, Simona; Dragomir-Daescu, Dan; Nordahl, Emily; Bora Suzen, Yildirim; Owais Khan, Muhammad; Valen-Sendstad, Kristian; Kono, Kenichi; Menon, Prahlad G; Albal, Priti G; Mierka, Otto; Münster, Raphael; Morales, Hernán G; Bonnefous, Odile; Osman, Jan; Goubergrits, Leonid; Pallares, Jordi; Cito, Salvatore; Passalacqua, Alberto; Piskin, Senol; Pekkan, Kerem; Ramalho, Susana; Marques, Nelson; Sanchi, Stéphane; Schumacher, Kristopher R; Sturgeon, Jess; Švihlová, Helena; Hron, Jaroslav; Usera, Gabriel; Mendina, Mariana; Xiang, Jianping; Meng, Hui; Steinman, David A; Janiga, Gábor

    2015-12-01

    With the increased availability of computational resources, the past decade has seen a rise in the use of computational fluid dynamics (CFD) for medical applications. There has been an increase in the application of CFD to attempt to predict the rupture of intracranial aneurysms; however, while many hemodynamic parameters can be obtained from these computations, to date, no consistent methodology for the prediction of the rupture has been identified. One particular challenge to CFD is that many factors contribute to its accuracy; the mesh resolution and spatial/temporal discretization can alone contribute to a variation in accuracy. This failure to identify the importance of these factors and identify a methodology for the prediction of ruptures has limited the acceptance of CFD among physicians for rupture prediction. The International CFD Rupture Challenge 2013 seeks to comment on the sensitivity of these various CFD assumptions to predict the rupture by undertaking a comparison of the rupture and blood-flow predictions from a wide range of independent participants utilizing a range of CFD approaches. Twenty-six groups from 15 countries took part in the challenge. Participants were provided with surface models of two intracranial aneurysms and asked to carry out the corresponding hemodynamics simulations, free to choose their own mesh, solver, and temporal discretization. They were requested to submit velocity and pressure predictions along the centerline and on specified planes. The first phase of the challenge, described in a separate paper, was aimed at predicting which of the two aneurysms had previously ruptured and where the rupture site was located. The second phase, described in this paper, aims to assess the variability of the solutions and the sensitivity to the modeling assumptions. Participants were free to choose boundary conditions in the first phase, whereas they were prescribed in the second phase but all other CFD modeling parameters were not

  17. Poverty Reduction and Shared Prosperity in Moldova : Progress and Prospects

    OpenAIRE

    World Bank Group

    2016-01-01

    Moldova has experienced rapid economic growth in the past decade, which has been accompanied by reductions in poverty and good performance in shared prosperity. Nonetheless, Moldova remains one of the poorest countries in Europe and faces challenges in sustaining the progress. The challenges for progress include spatial and cross-group inequalities, particularly because of unequal access t...

  18. From Computational Thinking to Computational Empowerment: A 21st Century PD Agenda

    DEFF Research Database (Denmark)

    Iversen, Ole Sejer; Smith, Rachel Charlotte; Dindler, Christian

    2018-01-01

    We propose computational empowerment as an approach, and a Participatory Design response, to challenges related to digitalization of society and the emerging need for digital literacy in K12 education. Our approach extends the current focus on computational thinking to include contextual, human-centred and societal challenges and impacts involved in students’ creative and critical engagement with digital technology. Our research is based on the FabLab@School project, in which a PD approach to computational empowerment provided opportunities as well as further challenges for the complex agenda of digital ... technology in education. We argue that PD has the potential to drive a computational empowerment agenda in education, by connecting political PD with contemporary visions for addressing a future digitalized labor market and society.

  19. Challenges in computational fluid dynamics simulation for the nineties. Various examples of application

    International Nuclear Information System (INIS)

    Chabard, J.P.; Viollet, P.L.

    1991-01-01

    Most of the computational fluid dynamics applications which are encountered at the Research Branch of EDF (DER) deal with thermal exchanges. The development of numerical tools for the simulation of flows, devoted to this class of application, has been under way for 15 years. At the beginning this work was mainly concerned with a good simulation of the dynamics of the flow. Now these tools can be used to compute flows with thermal exchanges. The presentation will be limited to incompressible, single-phase flows (the DER developments on two-phase flows are discussed in the paper by Hery, Boivin and Viollet in the present magazine). First, the software developed at DER will be presented. Then some applications of these tools to flows with thermal exchanges will be discussed. To conclude, the paper will treat the general case of the CFD codes. The challenges for the next years will be detailed in order to make these tools available for users involved in complex physical modeling.

  20. Algorithms for limited-view computed tomography: an annotated bibliography and a challenge

    International Nuclear Information System (INIS)

    Rangayyan, R.; Dhawan, A.P.; Gordon, R.

    1985-01-01

    In many applications of computed tomography, it may not be possible to acquire projection data at all angles, as required by the most commonly used algorithm of convolution backprojection. In such a limited-data situation, we face an ill-posed problem in attempting to reconstruct an image from an incomplete set of projections. Many techniques have been proposed to tackle this situation, employing diverse theories such as signal recovery, image restoration, constrained deconvolution, and constrained optimization, as well as novel schemes such as iterative object-dependent algorithms incorporating a priori knowledge and use of multispectral radiation. The authors present an overview of such techniques and offer a challenge to all readers to reconstruct images from a set of limited-view data provided here
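
    A minimal illustration of the iterative, constraint-based schemes surveyed here is the SIRT-style Python sketch below, which reconstructs a toy phantom from only four view angles and enforces non-negativity as a priori knowledge; the rotation-based projector and all parameter choices are simplifications for illustration, not any specific published algorithm.

        import numpy as np
        from scipy.ndimage import rotate

        # Tiny phantom and a limited set of view angles (far fewer than a full scan).
        phantom = np.zeros((64, 64))
        phantom[20:44, 24:40] = 1.0
        angles = [0, 20, 40, 60]            # limited angular coverage, in degrees

        def project(img, angle):
            """Parallel-beam projection: rotate the image, then sum along one axis."""
            return rotate(img, angle, reshape=False, order=1).sum(axis=0)

        sinogram = [project(phantom, a) for a in angles]

        # SIRT-like iteration: back-project the residual of each view and apply a
        # non-negativity constraint (a simple form of a priori knowledge).
        recon = np.zeros_like(phantom)
        for it in range(30):
            for a, measured in zip(angles, sinogram):
                residual = measured - project(recon, a)
                update = np.tile(residual / phantom.shape[0], (phantom.shape[0], 1))
                recon += 0.5 * rotate(update, -a, reshape=False, order=1)
            recon = np.clip(recon, 0, None)

        print("mean absolute reconstruction error:", float(np.abs(recon - phantom).mean()))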

  1. The Challenges and Benefits of Using Computer Technology for Communication and Teaching in the Geosciences

    Science.gov (United States)

    Fairley, J. P.; Hinds, J. J.

    2003-12-01

    The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, which has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.

  2. Exploring the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms: A Formative Analysis

    OpenAIRE

    Robin H. Kay; Sharon Lauricella

    2011-01-01

    Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess beneficial and challenging laptop behaviours in higher education classrooms. Both quantitative and qualitative data were collected from 177 undergrad...

  3. Progress report, Physics Division

    International Nuclear Information System (INIS)

    1986-03-01

    This report reviews events and progress in the following areas: development of the TASCC facility; experimental and theoretical nuclear physics research; radionuclide standardization; condensed matter research; applied mathematics; and computer facility operation

  4. Recent progress and future challenges in algal biofuel production [version 1; referees: 4 approved

    Directory of Open Access Journals (Sweden)

    Jonathan B. Shurin

    2016-10-01

    Full Text Available Modern society is fueled by fossil energy produced millions of years ago by photosynthetic organisms. Cultivating contemporary photosynthetic producers to generate energy and capture carbon from the atmosphere is one potential approach to sustaining society without disrupting the climate. Algae, photosynthetic aquatic microorganisms, are the fastest growing primary producers in the world and can therefore produce more energy with less land, water, and nutrients than terrestrial plant crops. We review recent progress and challenges in developing bioenergy technology based on algae. A variety of high-value products in addition to biofuels can be harvested from algal biomass, and these may be key to developing algal biotechnology and realizing the commercial potential of these organisms. Aspects of algal biology that differentiate them from plants demand an integrative approach based on genetics, cell biology, ecology, and evolution. We call for a systems approach to research on algal biotechnology rooted in understanding their biology, from the level of genes to ecosystems, and integrating perspectives from physical, chemical, and social sciences to solve one of the most critical outstanding technological problems.

  5. Neurofeedback therapy for enhancing visual attention: state-of-the-art and challenges

    Directory of Open Access Journals (Sweden)

    Mehdi Ordikhani-Seyedlar

    2016-08-01

    Full Text Available We have witnessed a rapid development of brain-computer interfaces (BCIs) linking the brain to external devices. BCIs can be utilized to treat neurological conditions and even to augment brain functions. BCIs offer a promising treatment for mental disorders, including disorders of attention. Here we review the current state of the art and challenges of attention-based BCIs, with a focus on visual attention. Attention-based BCIs utilize electroencephalograms (EEGs) or other recording techniques to generate neurofeedback, which patients use to improve their attention, a complex cognitive function. Although progress has been made in the studies of neural mechanisms of attention, extraction of attention-related neural signals needed for BCI operations is a difficult problem. To attain good BCI performance, it is important to select the features of neural activity that represent attentional signals. BCI decoding of attention-related activity may be hindered by the presence of different neural signals. Therefore, BCI accuracy can be improved by signal processing algorithms that dissociate signals of interest from irrelevant activities. Notwithstanding recent progress, optimal processing of attentional neural signals remains a fundamental challenge for the development of efficient therapies for disorders of attention.
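
    A minimal sketch of the kind of feature extraction such neurofeedback systems start from is given below in Python, using a synthetic single-channel signal; the sampling rate, band limits and the beta/theta feedback mapping are illustrative assumptions, not a validated protocol.

        import numpy as np
        from scipy.signal import welch

        fs = 250                      # sampling rate in Hz (hypothetical EEG amplifier)
        t = np.arange(0, 4, 1 / fs)   # a 4-second analysis window, one channel

        # Synthetic EEG-like signal: a 10 Hz alpha rhythm plus broadband noise.
        rng = np.random.default_rng(1)
        eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1.0, t.size)

        def band_power(signal, fs, lo, hi):
            """Average power in the [lo, hi] Hz band, estimated with Welch's method."""
            freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
            mask = (freqs >= lo) & (freqs <= hi)
            return psd[mask].mean()

        # A commonly used attention-related ratio; turning it into a feedback signal
        # shown to the user is where the real protocol-design work begins.
        theta = band_power(eeg, fs, 4, 7)
        beta = band_power(eeg, fs, 13, 30)
        print(f"theta={theta:.3f}  beta={beta:.3f}  feedback score={beta / theta:.2f}")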

  6. Catalysis Research of Relevance to Carbon Management: Progress, Challenges, and Opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Arakawa, Hironori; Aresta, Michele; Armor, John; Barteau, Mark; Beckman, Eric J.; Bell, Alexis T.; Bercaw, John E.; Creutz, Carol; Dinjus, Eckhard; Dixon, David A.; Domen, Kazunari; Dubois, Daniel L.; Eckert, Juergen; Fujita, Etsuko; Gibson, Dorothy H.; Goddard, William A.; Goodman, Wayne D.; Keller, Jay; Kubas, Gregory J.; Kung, Harold H.; Lyons, James E.; Manzer, Leo; Marks, Tobin J.; Morokuma, Keiji; Nicholas, Kenneth M.; Periana, Roy; Que, Lawrence; Rostrup-Nielson, Jens; Sachtler, Woflgang M H.; Schmidt, Lanny D.; Sen, Ayusman; Somorjai, Gabor A.; Stair, Peter C.; Stults, Bailey R.; Tumas, William

    2001-04-11

    The goal of the 'Opportunities for Catalysis Research in Carbon Management' workshop was to review within the context of greenhouse gas/carbon issues the current state of knowledge, barriers to further scientific and technological progress, and basic scientific research needs in the areas of H{sub 2} generation and utilization, light hydrocarbon activation and utilization, carbon dioxide activation, utilization, and sequestration, emerging techniques and research directions in relevant catalysis research, and in catalysis for more efficient transportation engines. Several overarching themes emerge from this review. First and foremost, there is a pressing need to better understand in detail the catalytic mechanisms involved in almost every process area mentioned above. This includes the structures, energetics, lifetimes, and reactivities of the species thought to be important in the key catalytic cycles. As much of this type of information as is possible to acquire would also greatly aid in better understanding perplexing, incomplete/inefficient catalytic cycles and in inventing new, efficient ones. The most productive way to attack such problems must include long-term, in-depth fundamental studies of both commercial and model processes, by conventional research techniques and, importantly, by applying various promising new physicochemical and computational approaches which would allow incisive, in situ elucidation of reaction pathways. There is also a consensus that more exploratory experiments, especially high-risk, unconventional catalytic and model studies, should be undertaken. Such an effort will likely require specialized equipment, instrumentation, and computational facilities. The most expeditious and cost-effective means to carry out this research would be by close coupling of academic, industrial, and national laboratory catalysis efforts worldwide. Completely new research approaches should be vigorously explored, ranging from novel compositions

  7. Sustainable Tourism: Progress Challenges and Opportunities

    DEFF Research Database (Denmark)

    Budeanu, Adriana; Miller, Graham; Moscardo, Gianna

    2016-01-01

    The term sustainable tourism emerged in the late 1980s and has become firmly established in both tourism policies and strategies and tourism research (Hall, 2011). After more than 25 years of attention it is timely to consider the state of research and practice in sustainable tourism. This special volume was established with exactly that goal in mind and this introduction seeks to set the context for this critical examination and reflection on sustainable tourism. Another objective of this introduction was to briefly describe the range of contributions selected for this SV. The articles are organised into four thematic areas of research: community stakeholders' perspectives and business approaches to sustainability in tourism, cultural responses, and methodological challenges related to sustainability. The articles shine a light on issues of importance within sustainable tourism, and in so...

  8. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Wirth, Brian D., E-mail: bdwirth@utk.edu [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Nuclear Science and Engineering Directorate, Oak Ridge National Laboratory, Oak Ridge, TN (United States); Hammond, K.D. [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Krasheninnikov, S.I. [University of California, San Diego, La Jolla, CA (United States); Maroudas, D. [University of Massachusetts, Amherst, Amherst, MA 01003 (United States)

    2015-08-15

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma surface interactions are among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including such processes as helium clustering, which serve as nuclei for gas bubbles; and trap mutation, dislocation loop punching and bubble bursting; which together initiate surface morphological modification.

  9. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    International Nuclear Information System (INIS)

    Wirth, Brian D.; Hammond, K.D.; Krasheninnikov, S.I.; Maroudas, D.

    2015-01-01

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma surface interactions are among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including such processes as helium clustering, which serve as nuclei for gas bubbles; and trap mutation, dislocation loop punching and bubble bursting; which together initiate surface morphological modification

  10. Language development: Progress and challenges in a multilingual ...

    African Journals Online (AJOL)

    Some such challenges discussed include issues like language selection for development, absence of clear language policy and the important issue of attitudes of respective language communities towards language research programmes. The article also looks at how the project and the institute have managed to make ...

  11. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  12. QM/MM free energy simulations: recent progress and challenges

    Science.gov (United States)

    Lu, Xiya; Fang, Dong; Ito, Shingo; Okamoto, Yuko; Ovchinnikov, Victor

    2016-01-01

    Due to the higher computational cost relative to pure molecular mechanical (MM) simulations, hybrid quantum mechanical/molecular mechanical (QM/MM) free energy simulations particularly require a careful consideration of balancing computational cost and accuracy. Here we review several recent developments in free energy methods most relevant to QM/MM simulations and discuss several topics motivated by these developments using simple but informative examples that involve processes in water. For chemical reactions, we highlight the value of invoking enhanced sampling techniques (e.g., replica-exchange) in umbrella sampling calculations and the value of including collective environmental variables (e.g., hydration level) in metadynamics simulations; we also illustrate the sensitivity of string calculations, especially free energy along the path, to various parameters in the computation. Alchemical free energy simulations with a specific thermodynamic cycle are used to probe the effect of including the first solvation shell in the QM region when computing solvation free energies. For cases where high-level QM/MM potential functions are needed, we analyze two different approaches: the QM/MM-MFEP method of Yang and co-workers and perturbative correction to low-level QM/MM free energy results. For the examples analyzed here, both approaches seem productive, although care needs to be exercised when analyzing the perturbative corrections. PMID:27563170
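
    The perturbative correction mentioned at the end can be illustrated in a few lines of Python using the Zwanzig exponential-averaging (free energy perturbation) formula on hypothetical energy gaps; the numbers are synthetic, and only the formula itself is taken from standard free-energy-perturbation theory.

        import numpy as np

        kT = 0.593  # kcal/mol at roughly 298 K

        # Hypothetical energy gaps E_high - E_low (kcal/mol) evaluated on snapshots
        # sampled with the low-level potential; real data would come from re-evaluating
        # QM/MM energies along an existing trajectory.
        rng = np.random.default_rng(2)
        delta_E = rng.normal(loc=1.5, scale=0.8, size=2000)

        # Zwanzig estimate of the free energy correction from the low-level to the
        # high-level potential: dA = -kT * ln < exp(-dE / kT) >_low
        dA = -kT * np.log(np.mean(np.exp(-delta_E / kT)))

        # The variance of dE is a quick diagnostic of ensemble overlap: a large
        # variance signals that the perturbative estimate should not be trusted.
        print(f"perturbative correction dA = {dA:.2f} kcal/mol, var(dE) = {delta_E.var():.2f}")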

  13. Shape perception in human and computer vision an interdisciplinary perspective

    CERN Document Server

    Dickinson, Sven J

    2013-01-01

    This comprehensive and authoritative text/reference presents a unique, multidisciplinary perspective on Shape Perception in Human and Computer Vision. Rather than focusing purely on the state of the art, the book provides viewpoints from world-class researchers reflecting broadly on the issues that have shaped the field. Drawing upon many years of experience, each contributor discusses the trends followed and the progress made, in addition to identifying the major challenges that still lie ahead. Topics and features: examines each topic from a range of viewpoints, rather than promoting a speci

  14. Stable isotope views on ecosystem function: challenging or challenged?

    Science.gov (United States)

    Resco, Víctor; Querejeta, José I; Ogle, Kiona; Voltas, Jordi; Sebastià, Maria-Teresa; Serrano-Ortiz, Penélope; Linares, Juan C; Moreno-Gutiérrez, Cristina; Herrero, Asier; Carreira, José A; Torres-Cañabate, Patricia; Valladares, Fernando

    2010-06-23

    Stable isotopes and their potential for detecting various and complex ecosystem processes are attracting an increasing number of scientists. Progress is challenging, particularly under global change scenarios, but some established views have been challenged. The IX meeting of the Spanish Association of Terrestrial Ecology (AAET, Ubeda, 18-22 October 2009) hosted a symposium on the ecology of stable isotopes where the linear mixing model approach of partitioning sinks and sources of carbon and water fluxes within an ecosystem was challenged, and new applications of stable isotopes for the study of plant interactions were evaluated. Discussion was also centred on the need for networks that monitor ecological processes using stable isotopes and key ideas for fostering future research with isotopes.
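
    In its simplest two-source form, the linear mixing model debated at the symposium reduces to a one-line formula; the Python sketch below applies it to made-up d13C values, so the source labels and numbers are purely illustrative.

        def two_source_mixing(delta_mix, delta_a, delta_b):
            """Fraction of source A in a mixture, from the two-end-member linear mixing model:
            delta_mix = f * delta_a + (1 - f) * delta_b  =>  f = (delta_mix - delta_b) / (delta_a - delta_b)
            """
            return (delta_mix - delta_b) / (delta_a - delta_b)

        # Hypothetical d13C values (per mil): partitioning soil respiration between a
        # root-derived source (A) and a microbial source (B); numbers are illustrative only.
        f_a = two_source_mixing(delta_mix=-26.0, delta_a=-28.5, delta_b=-23.0)
        print(f"fraction from source A: {f_a:.2f}, from source B: {1 - f_a:.2f}")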

  15. Computing challenges in HEP for WLHC grid

    CERN Document Server

    Muralidharan, Servesh

    2017-01-01

    As CERN moves towards preparation for increasing the luminosity of the particle beam towards the HL-LHC, predictions show that computing demand would outgrow our conservative scaling estimates by over ten times. Fortunately, we are talking about a time scale of roughly ten years to develop new techniques and novel solutions to address this gap in compute resources. Experiments at CERN face a unique scenario wherein they need to scale both latency-sensitive workloads, such as data acquisition from the detectors, and throughput-based ones, such as simulations and reconstruction of high-level events and physics processes. In this talk we cover some of the ongoing research at the Tier-0 at CERN which investigates several aspects of throughput-sensitive workloads that consume significant compute cycles.

  16. Computed tomographic findings in progressive supranuclear palsy

    Energy Technology Data Exchange (ETDEWEB)

    Saitoh, H; Yoshii, F; Shinohara, Y

    1987-03-01

    CT findings of 6 patients with progressive supranuclear palsy (PSP) are described, with emphasis on their supratentorial changes in comparison with those of control subjects and patients with Parkinson's disease (PD). As estimated from CT films, the lateral ventricles, third ventricle and prepontine cistern were significantly enlarged in PSP patients compared with both controls and PD patients. It is suggested that the patients with PSP have not only infratentorial but also supratentorial lesions.

  17. Review of The SIAM 100-Digit Challenge: A Study in High-Accuracy Numerical Computing

    International Nuclear Information System (INIS)

    Bailey, David

    2005-01-01

    In the January 2002 edition of SIAM News, Nick Trefethen announced the '$100, 100-Digit Challenge'. In this note he presented ten easy-to-state but hard-to-solve problems of numerical analysis, and challenged readers to find each answer to ten-digit accuracy. Trefethen closed with the enticing comment: 'Hint: They're hard. If anyone gets 50 digits in total, I will be impressed.' This challenge obviously struck a chord in hundreds of numerical mathematicians worldwide, as 94 teams from 25 nations later submitted entries. Many of these submissions exceeded the target of 50 correct digits; in fact, 20 teams achieved a perfect score of 100 correct digits. Trefethen had offered $100 for the best submission. Given the overwhelming response, a generous donor (William Browning, founder of Applied Mathematics, Inc.) provided additional funds to provide a $100 award to each of the 20 winning teams. Soon after the results were out, four participants, each from a winning team, got together and agreed to write a book about the problems and their solutions. The team is truly international: Bornemann is from Germany, Laurie is from South Africa, Wagon is from the USA, and Waldvogel is from Switzerland. This book provides some mathematical background for each problem, and then shows in detail how each of them can be solved. In fact, multiple solution techniques are mentioned in each case. The book describes how to extend these solutions to much larger problems and much higher numeric precision (hundreds or thousands of digit accuracy). The authors also show how to compute error bounds for the results, so that one can say with confidence that one's results are accurate to the level stated. Numerous numerical software tools are demonstrated in the process, including the commercial products Mathematica, Maple and Matlab. Computer programs that perform many of the algorithms mentioned in the book are provided, both in an appendix to the book and on a website. In the process, the
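
    As a flavour of the high-precision tooling the book demonstrates, the short Python/mpmath sketch below evaluates a series to 100 significant digits and checks it against a closed form; this is a generic illustration, not one of the ten challenge problems.

        import mpmath

        mpmath.mp.dps = 105                      # work with a few digits of headroom

        # Sum 1/k^2 over k = 1..infinity and compare with the closed form pi^2 / 6.
        series = mpmath.nsum(lambda k: 1 / k ** 2, [1, mpmath.inf])
        exact = mpmath.pi ** 2 / 6

        print(mpmath.nstr(series, 100))
        print("difference from closed form:", mpmath.nstr(abs(series - exact), 5))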

  18. Progress in Harmonizing Tiered HIV Laboratory Systems: Challenges and Opportunities in 8 African Countries.

    Science.gov (United States)

    Williams, Jason; Umaru, Farouk; Edgil, Dianna; Kuritsky, Joel

    2016-09-28

    In 2014, the Joint United Nations Programme on HIV/AIDS released its 90-90-90 targets, which make laboratory diagnostics a cornerstone for measuring efforts toward the epidemic control of HIV. A data-driven laboratory harmonization and standardization approach is one way to create efficiencies and ensure optimal laboratory procurements. Following the 2008 "Maputo Declaration on Strengthening of Laboratory Systems"-a call for government leadership in harmonizing tiered laboratory networks and standardizing testing services-several national ministries of health requested that the United States Government and in-country partners help implement the recommendations by facilitating laboratory harmonization and standardization workshops, with a primary focus on improving HIV laboratory service delivery. Between 2007 and 2015, harmonization and standardization workshops were held in 8 African countries. This article reviews progress in the harmonization of laboratory systems in these 8 countries. We examined agreed-upon instrument lists established at the workshops and compared them against instrument data from laboratory quantification exercises over time. We used this measure as an indicator of adherence to national procurement policies. We found high levels of diversity across laboratories' diagnostic instruments, equipment, and services. This diversity contributes to different levels of compliance with expected service delivery standards. We believe the following challenges to be the most important to address: (1) lack of adherence to procurement policies, (2) absence or limited influence of a coordinating body to fully implement harmonization proposals, and (3) misalignment of laboratory policies with minimum packages of care and with national HIV care and treatment guidelines. Overall, the effort to implement the recommendations from the Maputo Declaration has had mixed success and is a work in progress. Program managers should continue efforts to advance the

  19. Progress in Harmonizing Tiered HIV Laboratory Systems: Challenges and Opportunities in 8 African Countries

    Science.gov (United States)

    Williams, Jason; Umaru, Farouk; Edgil, Dianna; Kuritsky, Joel

    2016-01-01

    ABSTRACT In 2014, the Joint United Nations Programme on HIV/AIDS released its 90-90-90 targets, which make laboratory diagnostics a cornerstone for measuring efforts toward the epidemic control of HIV. A data-driven laboratory harmonization and standardization approach is one way to create efficiencies and ensure optimal laboratory procurements. Following the 2008 “Maputo Declaration on Strengthening of Laboratory Systems”—a call for government leadership in harmonizing tiered laboratory networks and standardizing testing services—several national ministries of health requested that the United States Government and in-country partners help implement the recommendations by facilitating laboratory harmonization and standardization workshops, with a primary focus on improving HIV laboratory service delivery. Between 2007 and 2015, harmonization and standardization workshops were held in 8 African countries. This article reviews progress in the harmonization of laboratory systems in these 8 countries. We examined agreed-upon instrument lists established at the workshops and compared them against instrument data from laboratory quantification exercises over time. We used this measure as an indicator of adherence to national procurement policies. We found high levels of diversity across laboratories’ diagnostic instruments, equipment, and services. This diversity contributes to different levels of compliance with expected service delivery standards. We believe the following challenges to be the most important to address: (1) lack of adherence to procurement policies, (2) absence or limited influence of a coordinating body to fully implement harmonization proposals, and (3) misalignment of laboratory policies with minimum packages of care and with national HIV care and treatment guidelines. Overall, the effort to implement the recommendations from the Maputo Declaration has had mixed success and is a work in progress. Program managers should continue efforts to

  20. Interdisciplinary and physics challenges of network theory

    Science.gov (United States)

    Bianconi, Ginestra

    2015-09-01

    Network theory has unveiled the underlying structure of complex systems such as the Internet or the biological networks in the cell. It has identified universal properties of complex networks, and the interplay between their structure and dynamics. After almost twenty years of the field, new challenges lie ahead. These challenges concern the multilayer structure of most of the networks, the formulation of a network geometry and topology, and the development of a quantum theory of networks. Making progress on these aspects of network theory can open new venues to address interdisciplinary and physics challenges including progress on brain dynamics, new insights into quantum technologies, and quantum gravity.

  1. Radioactive waste management in Canada: progress and challenges 15 years after the policy framework

    International Nuclear Information System (INIS)

    McCauley, D.

    2011-01-01

    from that development - the establishment of the Nuclear Waste Management Organization, the study of options for the long-term management of nuclear fuel waste, the Government's decision on the options, the agreement on a funding formula for nuclear fuel waste management, and the launch of the NWMO's siting process. In this same period, we also have witnessed progress on a long-term waste management facility for low and intermediate-level radioactive waste in Ontario - including an agreement with the hosting community. In addition, there has been further advancement in the management of uranium tailings, notably the launch of cleanup efforts at the Gunnar mine in northern Saskatchewan. Finally, the federal government has established robust programs for the management of historic and legacy wastes across the country. In terms of historic wastes, the Port Hope Area Initiative has advanced to the point where critical decisions will be made in 2011 on the launch of the implementation phase of that Project and the Low-Level Radioactive Waste Management Office continues to manage historic wastes at other sites across the country. As for legacy wastes, decisions are expected prior to the end of 2010 on the continuation of the Nuclear Legacy Liabilities Program which addresses decommissioning and radioactive waste liabilities at AECL sites in Manitoba, Ontario, Quebec, and Nova Scotia. The coming years will see the further advancement of these initiatives, all of which will face their own challenges. Nevertheless, there is generally a defined strategy or path and the appropriate elements are in place to achieve success. Despite these initiatives, there remain gaps in Canada's approach to radioactive waste management. In particular, while there has been progress on the management of low and intermediate-level radioactive waste in Ontario to address wastes from Ontario Power Generation's facilities, there is, as yet, no long-term management approach defined for

  2. The natural history of primary progressive multiple sclerosis

    NARCIS (Netherlands)

    Koch, Marcus; Kingwell, Elaine; Rieckmann, Peter; Tremlett, Helen

    2009-01-01

    Background: Primary progressive multiple sclerosis (PPMS) carries the worst prognosis of the multiple sclerosis (MS) subtypes and is currently untreatable. A previous analysis of the British Columbia MS database challenged the view that disability progression is rapid in PPMS, but identified few

  3. Determinants of Local Progression After Computed Tomography-Guided Percutaneous Radiofrequency Ablation for Unresectable Lung Tumors: 9-Year Experience in a Single Institution

    International Nuclear Information System (INIS)

    Okuma, Tomohisa; Matsuoka, Toshiyuki; Yamamoto, Akira; Oyama, Yoshimasa; Hamamoto, Shinichi; Toyoshima, Masami; Nakamura, Kenji; Miki, Yukio

    2010-01-01

    The purpose of this study was to retrospectively determine the local control rate and contributing factors to local progression after computed tomography (CT)-guided radiofrequency ablation (RFA) for unresectable lung tumors. This study included 138 lung tumors in 72 patients (56 men, 16 women; mean age 70.0 ± 11.6 years, range 31-94; mean tumor size 2.1 ± 1.2 cm, range 0.2-9 cm) who underwent lung RFA between June 2000 and May 2009. Mean follow-up periods for patients and tumors were 14 and 12 months, respectively. The local progression-free rate and survival rate were calculated to determine the contributing factors to local progression. During follow-up, 44 of 138 (32%) lung tumors showed local progression. The 1-, 2-, 3-, and 5-year overall local control rates were 61, 57, 57, and 38%, respectively. The risk factors for local progression were age (≥70 years), tumor size (≥2 cm), sex (male), and no achievement of roll-off during RFA (P < 0.05). Multivariate analysis identified tumor size ≥2 cm as the only independent factor for local progression (P = 0.003). For tumors <2 cm, 17 of 68 (25%) showed local progression, and the 1-, 2-, and 3-year overall local control rates were 77, 73, and 73%, respectively. Multivariate analysis identified that age ≥70 years was an independent determinant of local progression for tumors <2 cm in diameter (P = 0.011). The present study showed that 32% of lung tumors developed local progression after CT-guided RFA. The significant risk factor for local progression after RFA for lung tumors was tumor size ≥2 cm.
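
    The 1-, 2-, 3-, and 5-year local control rates quoted above are the kind of figures a Kaplan-Meier estimator produces from censored follow-up data. As a rough illustration only (this is not the authors' analysis, and the follow-up data below are invented), a minimal Kaplan-Meier computation looks like this:

```python
# Minimal sketch (not the authors' code): Kaplan-Meier estimate of a
# local progression-free rate from censored follow-up data.
# Times are months from RFA; "event" is 1 if local progression was observed.
import numpy as np

def kaplan_meier(times, events):
    """Return (event times, progression-free probability after each time)."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n_at_risk = len(times)
    surv = 1.0
    out_t, out_s = [], []
    for t in np.unique(times):
        mask = times == t
        d = events[mask].sum()          # progressions observed at time t
        if d > 0:
            surv *= (1.0 - d / n_at_risk)
            out_t.append(t)
            out_s.append(surv)
        n_at_risk -= mask.sum()         # drop both events and censored cases
    return np.array(out_t), np.array(out_s)

# Hypothetical follow-up data (months, progression indicator).
t = [3, 5, 8, 12, 12, 18, 24, 30, 36, 40]
e = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0]
times, surv = kaplan_meier(t, e)
for yr in (1, 2, 3):
    s = surv[times <= 12 * yr][-1] if np.any(times <= 12 * yr) else 1.0
    print(f"{yr}-year local control rate: {s:.0%}")
```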

  4. Research progress in machine learning methods for gene-gene interaction detection.

    Science.gov (United States)

    Peng, Zhe-Ye; Tang, Zi-Jun; Xie, Min-Zhu

    2018-03-20

    Complex diseases are the results of gene-gene and gene-environment interactions. However, the detection of high-dimensional gene-gene interactions is computationally challenging. In the last two decades, machine-learning approaches have been developed to detect gene-gene interactions with some success. In this review, we summarize the progress in research on machine learning methods as applied to gene-gene interaction detection. The review systematically examines the principles and limitations of the machine learning methods currently used in genome-wide association studies (GWAS) to detect gene-gene interactions, such as neural networks (NN), random forests (RF), support vector machines (SVM) and multifactor dimensionality reduction (MDR), and provides some insights on future research directions in the field.
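
    As a toy illustration of why tree ensembles such as random forests are attractive for epistasis detection (a generic sketch, not a method described in the review; the simulated genotypes, penetrance values and scikit-learn settings are assumptions), the following fits a random forest to SNPs whose effect on the phenotype is purely interactive:

```python
# Minimal sketch: use a random forest to flag SNPs involved in a purely
# epistatic (XOR-like) interaction with almost no marginal effect, a setting
# where single-SNP association tests tend to fail.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_samples, n_snps = 2000, 50
# Genotypes coded 0/1/2 (minor allele counts); all SNPs are noise except
# SNP 0 and SNP 1, which interact.
X = rng.integers(0, 3, size=(n_samples, n_snps))
interacting = (X[:, 0] % 2) ^ (X[:, 1] % 2)        # XOR of allele parities
p_case = np.where(interacting == 1, 0.7, 0.3)      # penetrance model
y = rng.random(n_samples) < p_case

rf = RandomForestClassifier(n_estimators=500, max_depth=6, random_state=0)
rf.fit(X, y)

ranking = np.argsort(rf.feature_importances_)[::-1]
print("Top 5 SNPs by RF importance:", ranking[:5])
# The causal pair (SNPs 0 and 1) should appear near the top despite their weak
# marginal effects, illustrating why tree ensembles are used for epistasis.
```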

  5. Exploring the challenges faced by polytechnic students

    Science.gov (United States)

    Matore, Mohd Effendi @ Ewan Mohd; Khairani, Ahmad Zamri

    2015-02-01

    This study aims to identify challenges, beyond those already identified, faced by students in seven polytechnics in Malaysia, as a continuation of previous research that had identified 52 main challenges faced by students using the Rasch Model. The exploratory study focuses on the challenges that are not included in the Mooney Problem Checklist (MPCL). A total of 121 polytechnic students submitted 183 written responses through the open questions provided. Two hundred and fifty-two students responded to the dichotomous questions regarding their view of the challenges faced. The data were analysed qualitatively using NVivo 8.0. The findings showed that students from Politeknik Seberang Perai (PSP) gave the highest response, which was 56 (30.6%), and Politeknik Metro Kuala Lumpur (PMKL) had the lowest response of 2 (1.09%). Five dominant challenges were identified: the English language (32, 17.5%), learning (14, 7.7%), vehicles (13, 7.1%), information and communication technology (ICT) (13, 7.1%), and peers (11, 6.0%). This article, however, focuses on three apparent challenges, namely the English language, vehicles, and computers and ICT, as the challenges of learning and peers had been analysed in the previous MPCL. The English language challenge raised concerned weaknesses in command of speech and fluency. The computer and ICT challenge covered weaknesses in mastering ICT and computers, as well as computer breakdowns and low-performance computers. The challenge of vehicles emphasized the unavailability of vehicles to attend lectures and go elsewhere, the lack of a transportation service in the polytechnic, and not having a valid driving license. These challenges are very relevant and need to be discussed in an effort to prepare polytechnics for their ongoing transformation.

  6. Cloud Computing Governance Lifecycle

    Directory of Open Access Journals (Sweden)

    Soňa Karkošková

    2016-06-01

    Full Text Available Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges such as the need for business process redefinition, the establishment of specialized governance and management, new organizational structures and relationships with external providers, and the management of new types of risk arising from dependency on external providers. There is a general consensus that cloud computing brings many benefits in addition to these challenges, but it is unclear how to achieve them. Cloud computing governance helps to create business value by obtaining benefits from the use of cloud computing services while optimizing investment and risk. The challenge organizations face in governing cloud services is how to design and implement cloud computing governance so as to gain the expected benefits. This paper aims to provide guidance on the implementation activities of the proposed cloud computing governance lifecycle from the cloud consumer perspective. The proposed model is based on the SOA Governance Framework and consists of a lifecycle for the implementation and continuous improvement of a cloud computing governance model.

  7. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features drive changes in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
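
    The spurious-correlation issue mentioned above is easy to demonstrate numerically: with a fixed sample size, the largest sample correlation between an outcome and a growing set of completely irrelevant predictors keeps increasing. A minimal sketch (the sample size and seed are arbitrary choices, not values from the article):

```python
# Minimal sketch: spurious correlation in high dimensions. Even when every
# predictor is independent of the response, the maximum absolute sample
# correlation grows with the number of predictors.
import numpy as np

rng = np.random.default_rng(42)
n = 60                                   # sample size
for p in (10, 100, 1000, 10000):         # number of (irrelevant) predictors
    X = rng.standard_normal((n, p))
    y = rng.standard_normal(n)           # response unrelated to X
    # Sample correlation of each column of X with y.
    Xc = (X - X.mean(0)) / X.std(0)
    yc = (y - y.mean()) / y.std()
    corr = Xc.T @ yc / n
    print(f"p={p:6d}  max |corr| = {np.abs(corr).max():.2f}")
# Output (roughly): the maximum spurious correlation climbs from about 0.3
# toward 0.6 as p grows, which is why variable screening needs calibration.
```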

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  9. Challenges to the development of complex virtual reality surgical simulations.

    Science.gov (United States)

    Seymour, N E; Røtnes, J S

    2006-11-01

    Virtual reality simulation in surgical training has become more widely used and intensely investigated in an effort to develop safer, more efficient, measurable training processes. The development of virtual reality simulation of surgical procedures has begun, but well-described technical obstacles must be overcome to permit varied training in a clinically realistic computer-generated environment. These challenges include development of realistic surgical interfaces and physical objects within the computer-generated environment, modeling of realistic interactions between objects, rendering of the surgical field, and development of signal processing for complex events associated with surgery. Of these, the realistic modeling of tissue objects that are fully responsive to surgical manipulations is the most challenging. Threats to early success include relatively limited resources for development and procurement, as well as smaller potential for return on investment than in other simulation industries that face similar problems. Despite these difficulties, steady progress continues to be made in these areas. If executed properly, virtual reality offers inherent advantages over other training systems in creating a realistic surgical environment and facilitating measurement of surgeon performance. Once developed, complex new virtual reality training devices must be validated for their usefulness in formative training and assessment of skill to be established.

  10. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    Science.gov (United States)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new corner stone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations

  11. The challenge of ubiquitous computing in health care: technology, concepts and solutions. Findings from the IMIA Yearbook of Medical Informatics 2005.

    Science.gov (United States)

    Bott, O J; Ammenwerth, E; Brigl, B; Knaup, P; Lang, E; Pilgram, R; Pfeifer, B; Ruderich, F; Wolff, A C; Haux, R; Kulikowski, C

    2005-01-01

    To review recent research efforts in the field of ubiquitous computing in health care. To identify current research trends and further challenges for medical informatics. Analysis of the contents of the Yearbook on Medical Informatics 2005 of the International Medical Informatics Association (IMIA). The Yearbook of Medical Informatics 2005 includes 34 original papers selected from 22 peer-reviewed scientific journals related to several distinct research areas: health and clinical management, patient records, health information systems, medical signal processing and biomedical imaging, decision support, knowledge representation and management, education and consumer informatics as well as bioinformatics. A special section on ubiquitous health care systems is devoted to recent developments in the application of ubiquitous computing in health care. Besides additional synoptical reviews of each of the sections the Yearbook includes invited reviews concerning E-Health strategies, primary care informatics and wearable healthcare. Several publications demonstrate the potential of ubiquitous computing to enhance effectiveness of health services delivery and organization. But ubiquitous computing is also a societal challenge, caused by the surrounding but unobtrusive character of this technology. Contributions from nearly all of the established sub-disciplines of medical informatics are demanded to turn the visions of this promising new research field into reality.

  12. Human rights in Japan: progress and challenges

    Directory of Open Access Journals (Sweden)

    Yolanda Muñoz González

    2007-11-01

    Full Text Available The aim of this paper is to present an overview of the improvements and challenges that Japan has been facing between 1983 and 2007. The paper explores the interaction among the different stakeholders –i.e. the Japanese Government, international organizations and civil society- to advance full access to citizenship regarding gender equality, the elimination of social and physical barriers for the inclusion of people with disabilities and elderly persons; ethnic minorities –specifically the situation of the Ainu people and the Buraku community – and the persons considered as “foreigners” living in Japan.

  13. Progress in Cell Marking for Synchrotron X-ray Computed Tomography

    Science.gov (United States)

    Hall, Christopher; Sturm, Erica; Schultke, Elisabeth; Arfelli, Fulvia; Menk, Ralf-Hendrik; Astolfo, Alberto; Juurlink, Bernhard H. J.

    2010-07-01

    Recently there has been an increase in research activity into finding ways of marking cells in live animals for pre-clinical trials. Development of certain drugs and other therapies crucially depend on tracking particular cells or cell types in living systems. Therefore cell marking techniques are required which will enable longitudinal studies, where individuals can be examined several times over the course of a therapy or study. The benefits of being able to study both disease and therapy progression in individuals, rather than cohorts are clear. The need for high contrast 3-D imaging, without harming or altering the biological system requires a non-invasive yet penetrating imaging technique. The technique will also have to provide an appropriate spatial and contrast resolution. X-ray computed tomography offers rapid acquisition of 3-D images and is set to become one of the principal imaging techniques in this area. Work by our group over the last few years has shown that marking cells with gold nano-particles (GNP) is an effective means of visualising marked cells in-vivo using x-ray CT. Here we report the latest results from these studies. Synchrotron X-ray CT images of brain lesions in rats taken using the SYRMEP facility at the Elettra synchrotron in 2009 have been compared with histological examination of the tissues. Some deductions are drawn about the visibility of the gold loaded cells in both light microscopy and x-ray imaging.

  14. How to Build a Quantum Computer

    Science.gov (United States)

    Sanders, Barry C.

    2017-11-01

    Quantum computer technology is progressing rapidly with dozens of qubits and hundreds of quantum logic gates now possible. Although current quantum computer technology is distant from being able to solve computational problems beyond the reach of non-quantum computers, experiments have progressed well beyond simply demonstrating the requisite components. We can now operate small quantum logic processors with connected networks of qubits and quantum logic gates, which is a great stride towards functioning quantum computers. This book aims to be accessible to a broad audience with basic knowledge of computers, electronics and physics. The goal is to convey key notions relevant to building quantum computers and to present state-of-the-art quantum-computer research in various media such as trapped ions, superconducting circuits, photonics and beyond.

  15. Progress report 1979

    International Nuclear Information System (INIS)

    1980-12-01

    This progress report deals with technical and research work done at the AAEC Research Establishment in the twelve month period ending September 30, 1979. Work done in the following research divisions is reported: Applied Maths and Computing, Chemical Technology, Engineering Research, Environmental Science, Instrumentation and Control, Isotope, Materials and Physics

  16. Stable isotope views on ecosystem function: challenging or challenged?

    Science.gov (United States)

    Resco, Víctor; Querejeta, José I.; Ogle, Kiona; Voltas, Jordi; Sebastià, Maria-Teresa; Serrano-Ortiz, Penélope; Linares, Juan C.; Moreno-Gutiérrez, Cristina; Herrero, Asier; Carreira, José A.; Torres-Cañabate, Patricia; Valladares, Fernando

    2010-01-01

    Stable isotopes and their potential for detecting various and complex ecosystem processes are attracting an increasing number of scientists. Progress is challenging, particularly under global change scenarios, but some established views have been challenged. The IX meeting of the Spanish Association of Terrestrial Ecology (AAET, Úbeda, 18–22 October 2009) hosted a symposium on the ecology of stable isotopes where the linear mixing model approach of partitioning sinks and sources of carbon and water fluxes within an ecosystem was challenged, and new applications of stable isotopes for the study of plant interactions were evaluated. Discussion was also centred on the need for networks that monitor ecological processes using stable isotopes and key ideas for fostering future research with isotopes. PMID:20015858

  17. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  18. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)
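
    The role of probability discussed in the two records above can be illustrated with a single qubit: outcome probabilities come from squared amplitudes, and a simple dephasing channel (a toy stand-in for the intrinsic decoherence mentioned) removes the interference. The sketch below is a generic illustration, not code from either paper:

```python
# Minimal sketch: measurement probabilities for a single qubit and the effect
# of dephasing (a toy decoherence model) on interference.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

psi = H @ ket0                              # (|0> + |1>)/sqrt(2)
print("P(0) after one Hadamard:", abs(psi[0]) ** 2)           # 0.5

def dephase(rho, lam):
    """Damp off-diagonal elements by (1 - lam); lam=1 is full decoherence."""
    out = rho.copy()
    out[0, 1] *= (1 - lam)
    out[1, 0] *= (1 - lam)
    return out

for lam in (0.0, 0.5, 1.0):
    rho = np.outer(psi, psi.conj())         # density matrix of the superposition
    rho = dephase(rho, lam)
    rho2 = H @ rho @ H.conj().T             # second Hadamard (interference step)
    p0 = rho2[0, 0].real
    print(f"dephasing={lam:.1f}  P(0) after second Hadamard = {p0:.2f}")
# With no dephasing the amplitudes interfere and P(0) = 1; with full dephasing
# the qubit behaves like a classical coin and P(0) = 0.5.
```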

  19. Challenges in computational statistics and data mining

    CERN Document Server

    Mielniczuk, Jan

    2016-01-01

    This volume contains nineteen research papers belonging to the areas of computational statistics, data mining, and their applications. Those papers, all written specifically for this volume, are their authors' contributions to honour and celebrate Professor Jacek Koronacki on the occasion of his 70th birthday. The book's related and often interconnected topics represent Jacek Koronacki's research interests and their evolution. They also clearly indicate how close the areas of computational statistics and data mining are.

  20. Investigation of Cloud Computing: Applications and Challenges

    OpenAIRE

    Amid Khatibi Bardsiri; Anis Vosoogh; Fatemeh Ahoojoosh

    2014-01-01

    Cloud computing is a model for storing data or knowledge on remote servers accessed through the Internet. It can save the required memory space and reduce the cost of extending memory capacity on users' own machines. Therefore, cloud computing has several benefits for individuals as well as organizations. It provides protection for personal and organizational data. Further, with the help of cloud services, a business owner, organization manager or service provider will be able to make privacy an...

  1. Recent development in computational actinide chemistry

    International Nuclear Information System (INIS)

    Li Jun

    2008-01-01

    Ever since the Manhattan Project in World War II, actinide chemistry has been essential for nuclear science and technology. Yet scientists still seek the ability to interpret and predict chemical and physical properties of actinide compounds and materials using first-principles theory and computational modeling. Actinide compounds are challenging for computational chemistry because of their complicated electron correlation effects and relativistic effects, including spin-orbit coupling effects. There have been significant developments in theoretical studies on actinide compounds in the past several years. These theoretical capabilities, coupled with new experimental characterization techniques, now offer a powerful combination for unraveling the complexities of actinide chemistry. In this talk, we will provide an overview of our own research in this field, with particular emphasis on applications of relativistic density functional and ab initio quantum chemical methods to the geometries, electronic structures, spectroscopy and excited-state properties of small actinide molecules such as CUO and UO2 and some large actinide compounds relevant to separation and environmental science. The performance of various density functional approaches and wavefunction theory-based electron correlation methods will be compared. The results of computational modeling on the vibrational, electronic, and NMR spectra of actinide compounds will be briefly discussed as well [1-4]. We will show that progress in relativistic quantum chemistry, computer hardware and computational chemistry software has enabled computational actinide chemistry to emerge as a powerful and predictive tool for research in actinide chemistry. (authors)

  2. Computers and virtual reality for surgical education in the 21st century.

    Science.gov (United States)

    Haluck, R S; Krummel, T M

    2000-07-01

    Surgeons must learn to perform operations. The current system of surgical resident education is facing many challenges in terms of time efficiency, costs, and patient safety. In addition, as new types of operations are developed rapidly, practicing surgeons may find a need for more efficient methods of surgical skill education. An in-depth examination of the current learning environment and the literature of motor skills learning provides insights into ways in which surgical skills education can be improved. Computers will certainly be a part of this process. Computer-based training in technical skills has the potential to solve many of the educational, economic, ethical, and patient safety issues related to learning to perform operations. Although full virtual-reality systems are still in development, there has been early progress that should encourage surgeons to incorporate computer simulation into the surgical curriculum.

  3. Quantum Computing's Classical Problem, Classical Computing's Quantum Problem

    OpenAIRE

    Van Meter, Rodney

    2013-01-01

    Tasked with the challenge to build better and better computers, quantum computing and classical computing face the same conundrum: the success of classical computing systems. Small quantum computing systems have been demonstrated, and intermediate-scale systems are on the horizon, capable of calculating numeric results or simulating physical systems far beyond what humans can do by hand. However, to be commercially viable, they must surpass what our wildly successful, highly advanced classica...

  4. Setting a research agenda for progressive multiple sclerosis: the International Collaborative on Progressive MS.

    Science.gov (United States)

    Fox, Robert J; Thompson, Alan; Baker, David; Baneke, Peer; Brown, Doug; Browne, Paul; Chandraratna, Dhia; Ciccarelli, Olga; Coetzee, Timothy; Comi, Giancarlo; Feinstein, Anthony; Kapoor, Raj; Lee, Karen; Salvetti, Marco; Sharrock, Kersten; Toosy, Ahmed; Zaratin, Paola; Zuidwijk, Kim

    2012-11-01

    Despite significant progress in the development of therapies for relapsing MS, progressive MS remains comparatively disappointing. Our objective, in this paper, is to review the current challenges in developing therapies for progressive MS and identify key priority areas for research. A collaborative was convened by volunteer and staff leaders from several MS societies with the mission to expedite the development of effective disease-modifying and symptom management therapies for progressive forms of multiple sclerosis. Through a series of scientific and strategic planning meetings, the collaborative identified and developed new perspectives on five key priority areas for research: experimental models, identification and validation of targets and repurposing opportunities, proof-of-concept clinical trial strategies, clinical outcome measures, and symptom management and rehabilitation. Our conclusions, tackling the impediments in developing therapies for progressive MS will require an integrated, multi-disciplinary approach to enable effective translation of research into therapies for progressive MS. Engagement of the MS research community through an international effort is needed to address and fund these research priorities with the ultimate goal of expediting the development of disease-modifying and symptom-relief treatments for progressive MS.

  5. Setting a research agenda for progressive multiple sclerosis: The International Collaborative on Progressive MS

    Science.gov (United States)

    Thompson, Alan; Baker, David; Baneke, Peer; Brown, Doug; Browne, Paul; Chandraratna, Dhia; Ciccarelli, Olga; Coetzee, Timothy; Comi, Giancarlo; Feinstein, Anthony; Kapoor, Raj; Lee, Karen; Salvetti, Marco; Sharrock, Kersten; Toosy, Ahmed; Zaratin, Paola; Zuidwijk, Kim

    2012-01-01

    Despite significant progress in the development of therapies for relapsing MS, progressive MS remains comparatively disappointing. Our objective, in this paper, is to review the current challenges in developing therapies for progressive MS and identify key priority areas for research. A collaborative was convened by volunteer and staff leaders from several MS societies with the mission to expedite the development of effective disease-modifying and symptom management therapies for progressive forms of multiple sclerosis. Through a series of scientific and strategic planning meetings, the collaborative identified and developed new perspectives on five key priority areas for research: experimental models, identification and validation of targets and repurposing opportunities, proof-of-concept clinical trial strategies, clinical outcome measures, and symptom management and rehabilitation. Our conclusions, tackling the impediments in developing therapies for progressive MS will require an integrated, multi-disciplinary approach to enable effective translation of research into therapies for progressive MS. Engagement of the MS research community through an international effort is needed to address and fund these research priorities with the ultimate goal of expediting the development of disease-modifying and symptom-relief treatments for progressive MS. PMID:22917690

  6. Research progress on quantum informatics and quantum computation

    Science.gov (United States)

    Zhao, Yusheng

    2018-03-01

    Quantum informatics is an emerging interdisciplinary subject that developed from the combination of quantum mechanics, information science, and computer science in the 1980s. The birth and development of quantum information science has far-reaching significance for science and technology, and applying quantum information technology has become a major focus of effort. The preparation, storage, purification, regulation, transmission, and coding and decoding of quantum states have become research hotspots, with a profound impact on the economy, technology, and defense. This paper first summarizes the background of quantum information science and quantum computers and the current state of domestic and international research, and then introduces the basic knowledge and concepts of quantum computing. Finally, several quantum algorithms are introduced in detail, including the quantum Fourier transform, the Deutsch-Jozsa algorithm, Shor's algorithm, and quantum phase estimation.
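
    Of the algorithms listed above, the Deutsch-Jozsa algorithm is small enough to simulate directly with a state vector. The sketch below is an illustration under the usual textbook formulation (it is not taken from the paper): a single oracle query distinguishes a constant Boolean function from a balanced one.

```python
# Minimal sketch: statevector simulation of the Deutsch-Jozsa algorithm.
# For an n-bit oracle f promised to be constant or balanced, one quantum
# query decides which: the all-zero measurement outcome on the input
# register has probability 1 iff f is constant.
import numpy as np

def deutsch_jozsa_prob_all_zero(f, n):
    """Probability of the |0...0> outcome on the input register."""
    N = 2 ** n
    # State of the n input qubits after H^n, with the oracle's phase kickback
    # (-1)^f(x) applied; the ancilla qubit is left implicit.
    amps = np.array([(-1) ** f(x) for x in range(N)], dtype=float) / np.sqrt(N)
    # Final H^n: the amplitude of |0...0> is the uniform sum of all amplitudes.
    amp_zero = amps.sum() / np.sqrt(N)
    return amp_zero ** 2

n = 3
constant = lambda x: 1                             # constant oracle
balanced = lambda x: bin(x).count("1") % 2         # parity oracle: balanced
print("constant f -> P(|000>) =", round(deutsch_jozsa_prob_all_zero(constant, n), 3))
print("balanced f -> P(|000>) =", round(deutsch_jozsa_prob_all_zero(balanced, n), 3))
# Expected output: 1.0 for the constant oracle and 0.0 for the balanced one.
```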

  7. Tracking progress towards equitable child survival in a Nicaraguan community: neonatal mortality challenges to meet the MDG 4

    Directory of Open Access Journals (Sweden)

    Persson Lars-Åke

    2011-06-01

    Full Text Available Abstract Background Nicaragua has made progress in the reduction of under-five mortality since the 1980s. Data for the national trends indicate that this poor Central American country is on track to reach Millennium Development Goal 4 by 2015. Despite this progress, neonatal mortality has not shown the same progress. The aim of this study is to analyse trends and social differentials in neonatal and under-five mortality in a Nicaraguan community from 1970 to 2005. Methods Two linked community-based reproductive surveys in 1993 and 2002 followed by a health and demographic surveillance system providing information on all births and child deaths in urban and rural areas of León municipality, Nicaragua. A total of 49 972 live births were registered. Results A rapid reduction in under-five mortality was observed during the late 1970s (from 103 deaths/1000 live births) and the 1980s, followed by a gradual decline to the level of 23 deaths/1000 live births in 2005. This community is on track for Millennium Development Goal 4 for improved child survival. However, neonatal mortality has increased lately in spite of good coverage of skilled assistance at delivery. After some years in the 1990s with a very small gap in neonatal survival between children of mothers of different educational levels, this divide is increasing. Conclusions After the reduction of high under-five mortality that coincided with improved equity in survival in this Nicaraguan community, the current challenge is neonatal mortality, where questions of equitable perinatal care of good quality must be addressed.

  8. Single-Cell Transcriptomics Bioinformatics and Computational Challenges

    Directory of Open Access Journals (Sweden)

    Lana Garmire

    2016-09-01

    Full Text Available The emerging single-cell RNA-Seq (scRNA-Seq) technology holds the promise to revolutionize our understanding of diseases and associated biological processes at an unprecedented resolution. It opens the door to revealing intercellular heterogeneity and has been employed in a variety of applications, ranging from characterizing cancer cell subpopulations to elucidating tumor resistance mechanisms. Parallel to improving experimental protocols to deal with technological issues, deriving new analytical methods to reveal the complexity in scRNA-Seq data is just as challenging. Here we review the current state-of-the-art bioinformatics tools and methods for scRNA-Seq analysis, and address some critical analytical challenges that the field faces.
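
    A typical downstream scRNA-Seq workflow of the kind such tools implement (normalization, dimensionality reduction, clustering) can be sketched generically; the snippet below uses synthetic counts and scikit-learn, and is not any specific tool from the review:

```python
# Minimal sketch of a generic scRNA-Seq downstream step: normalize counts per
# cell, log-transform, reduce with PCA, and cluster cells with k-means.
# The count matrix here is synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_cells, n_genes = 300, 2000
# Two synthetic cell populations differing in 50 marker genes.
counts = rng.poisson(1.0, size=(n_cells, n_genes)).astype(float)
labels_true = np.repeat([0, 1], n_cells // 2)
counts[labels_true == 1, :50] += rng.poisson(5.0, size=(n_cells // 2, 50))

# Library-size normalization to counts per 10k, then log1p.
lib = counts.sum(axis=1, keepdims=True)
norm = np.log1p(counts / lib * 1e4)

# Dimensionality reduction and clustering.
pcs = PCA(n_components=10, random_state=0).fit_transform(norm)
pred = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pcs)

# Agreement with the planted populations (up to label swapping).
agree = max((pred == labels_true).mean(), (pred != labels_true).mean())
print(f"cluster/population agreement: {agree:.2f}")
```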

  9. Progress report for a research program in computational physics: Progress report, January 1, 1988-December 31, 1988

    International Nuclear Information System (INIS)

    Guralnik, G.S.

    1988-01-01

    The projects of this progress report are all ultimately concerned with various aspects of numerical simulations of lattice gauge theories. These aspects include algorithms, machines, theoretical studies and actual simulations. We made progress in four general areas: studies of new algorithms, determination of the SU(3) β-function, studies of the finite temperature QCD phase transition in the presence of fermions, and the calculation of hadronic matrix elements. We will describe each of these in turn. 7 refs

  10. Workplace Charging Challenge Mid-Program Review: Employees Plug In

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2015-12-31

    The EV Everywhere Workplace Charging Challenge aims to have 500 U.S. employers offering workplace charging by 2018. These reports describe the progress made in the Challenge. In 2015, the Workplace Charging Challenge celebrated a major milestone – it reached the halfway point to its goal of 500 Challenge partners committed to installing workplace charging by 2018. More than 250 employers have joined as Challenge partners and the installation of workplace charging as a sustainable business practice is growing across the country. Their efforts have resulted in more than 600 workplaces with over 5,500 charging stations accessible to nearly one million employees. In 2015, more than 9,000 PEV-driving employees charged at these worksites on a regular basis. Our Workplace Charging Challenge Mid-Program Review reports this progress and other statistics related to workplace charging, including employee satisfaction and charger usage.

  11. Static, rheological and mechanical properties of polymer nanocomposites studied by computer modeling and simulation.

    Science.gov (United States)

    Liu, Jun; Zhang, Liqun; Cao, Dapeng; Wang, Wenchuan

    2009-12-28

    Polymer nanocomposites (PNCs) often exhibit excellent mechanical, thermal, electrical and optical properties, because they combine the performances of both polymers and inorganic or organic nanoparticles. Recently, computer modeling and simulation are playing an important role in exploring the reinforcement mechanism of the PNCs and even the design of functional PNCs. This report provides an overview of the progress made in past decades in the investigation of the static, rheological and mechanical properties of polymer nanocomposites studied by computer modeling and simulation. Emphases are placed on exploring the mechanisms at the molecular level for the dispersion of nanoparticles in nanocomposites, the effects of nanoparticles on chain conformation and glass transition temperature (T(g)), as well as viscoelastic and mechanical properties. Finally, some future challenges and opportunities in computer modeling and simulation of PNCs are addressed.

  12. Parameterized algorithmics for computational social choice : nine research challenges

    NARCIS (Netherlands)

    Bredereck, R.; Chen, J.; Faliszewski, P.; Guo, J.; Niedermeier, R.; Woeginger, G.J.

    2014-01-01

    Computational Social Choice is an interdisciplinary research area involving Economics, Political Science, and Social Science on the one side, and Mathematics and Computer Science (including Artificial Intelligence and Multiagent Systems) on the other side. Typical computational problems studied in

  13. Enamel Regeneration - Current Progress and Challenges

    Science.gov (United States)

    Baswaraj; H.K, Navin; K.B, Prasanna

    2014-01-01

    Dental enamel is the outermost covering of teeth and the hardest mineralized tissue in the human body. Enamel faces the challenge of maintaining its integrity through constant demineralization and remineralization within the oral environment, and it is vulnerable to wear, damage, and decay. It cannot regenerate itself, because it is formed by a layer of cells that is lost after tooth eruption. Conventional treatment relies on synthetic materials to restore lost enamel, but these cannot mimic natural enamel. Advances in materials science and in the understanding of the basic principles of organic matrix-mediated mineralization pave the way for the formation of synthetic enamel. Knowledge of enamel formation, an understanding of protein interactions and the function of their gene products, the isolation of postnatal stem cells from various sources in the oral cavity, and the development of smart materials for cell and growth factor delivery together make biologically based enamel regeneration a possibility. This article reviews recent endeavors in biomimetic synthesis and cell-based strategies for enamel regeneration. PMID:25386548

  14. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  15. Biogas Opportunities Roadmap Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2015-12-01

    In support of the Obama Administration's Climate Action Plan, the U.S. Department of Energy, the U.S. Environmental Protection Agency, and U.S. Department of Agriculture jointly released the Biogas Opportunities Roadmap Progress Report, updating the federal government's progress to reduce methane emissions through biogas systems since the Biogas Opportunities Roadmap was completed by the three agencies in July 2014. The report highlights actions taken, outlines challenges and opportunities, and identifies next steps to the growth of a robust biogas industry.

  16. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.

  17. Computational methods for 2D materials: discovery, property characterization, and application design.

    Science.gov (United States)

    Paul, J T; Singh, A K; Dong, Z; Zhuang, H; Revard, B C; Rijal, B; Ashton, M; Linscheid, A; Blonsky, M; Gluhovic, D; Guo, J; Hennig, R G

    2017-11-29

    The discovery of two-dimensional (2D) materials comes at a time when computational methods are mature and can predict novel 2D materials, characterize their properties, and guide the design of 2D materials for applications. This article reviews the recent progress in computational approaches for 2D materials research. We discuss the computational techniques and provide an overview of the ongoing research in the field. We begin with an overview of known 2D materials, common computational methods, and available cyber infrastructures. We then move onto the discovery of novel 2D materials, discussing the stability criteria for 2D materials, computational methods for structure prediction, and interactions of monolayers with electrochemical and gaseous environments. Next, we describe the computational characterization of the 2D materials' electronic, optical, magnetic, and superconducting properties and the response of the properties under applied mechanical strain and electrical fields. From there, we move on to discuss the structure and properties of defects in 2D materials, and describe methods for 2D materials device simulations. We conclude by providing an outlook on the needs and challenges for future developments in the field of computational research for 2D materials.

  18. Philanthropy and disparities: progress, challenges, and unfinished business.

    Science.gov (United States)

    Mitchell, Faith; Sessions, Kathryn

    2011-10-01

    Philanthropy has invested millions of dollars to reduce disparities in health care and improve minority health. Grants to strengthen providers' cultural competence, diversify health professions, and collect data have improved understanding of and spurred action on disparities. The persistence of disparities in spite of these advances has shifted philanthropic attention toward strategies to change social, economic, and environmental conditions. We argue that these evolving perspectives, along with earlier groundwork, present new opportunities for funders, especially in combination with progress toward universal health coverage. This article looks at how philanthropy has addressed health disparities over the past decade, with a focus on accomplishments, the work remaining to be done, and how funders can help advance the disparities agenda.

  19. Maternal and child health in Brazil: progress and challenges.

    Science.gov (United States)

    Victora, Cesar G; Aquino, Estela M L; do Carmo Leal, Maria; Monteiro, Carlos Augusto; Barros, Fernando C; Szwarcwald, Celia L

    2011-05-28

    In the past three decades, Brazil has undergone rapid changes in major social determinants of health and in the organisation of health services. In this report, we examine how these changes have affected indicators of maternal health, child health, and child nutrition. We use data from vital statistics, population censuses, demographic and health surveys, and published reports. In the past three decades, infant mortality rates have reduced substantially, decreasing by 5·5% a year in the 1980s and 1990s, and by 4·4% a year since 2000 to reach 20 deaths per 1000 livebirths in 2008. Neonatal deaths account for 68% of infant deaths. Stunting prevalence among children younger than 5 years decreased from 37% in 1974-75 to 7% in 2006-07. Regional differences in stunting and child mortality also decreased. Access to most maternal-health and child-health interventions increased sharply to almost universal coverage, and regional and socioeconomic inequalities in access to such interventions were notably reduced. The median duration of breastfeeding increased from 2·5 months in the 1970s to 14 months by 2006-07. Official statistics show stable maternal mortality ratios during the past 10 years, but modelled data indicate a yearly decrease of 4%, a trend which might not have been noticeable in official reports because of improvements in death registration and the increased number of investigations into deaths of women of reproductive age. The reasons behind Brazil's progress include: socioeconomic and demographic changes (economic growth, reduction in income disparities between the poorest and wealthiest populations, urbanisation, improved education of women, and decreased fertility rates), interventions outside the health sector (a conditional cash transfer programme and improvements in water and sanitation), vertical health programmes in the 1980s (promotion of breastfeeding, oral rehydration, and immunisations), creation of a tax-funded national health service in 1988

  20. IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.

    Science.gov (United States)

    Chen, Ying; Elenee Argentinis, J D; Weber, Griff

    2016-04-01

    Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  1. Computational algebraic geometry for statistical modeling FY09Q2 progress.

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, David C.; Rojas, Joseph Maurice; Pebay, Philippe Pierre

    2009-03-01

    This is a progress report on polynomial system solving for statistical modeling. This quarter we have developed our first model of shock response data and an algorithm for identifying the chamber cone containing a polynomial system in n variables with n+k terms within polynomial time - a significant improvement over previous algorithms, all having exponential worst-case complexity. We have implemented and verified the chamber cone algorithm for n+3 and are working to extend the implementation to handle arbitrary k. Later sections of this report explain chamber cones in more detail; the next section provides an overview of the project and how the current progress fits into it.
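
    For readers unfamiliar with the setting, a polynomial system in n variables with few terms can be solved symbolically for tiny cases; the example below (plain SymPy, not the chamber-cone algorithm of the report) simply shows the kind of system being discussed, with n = 2 variables:

```python
# Tiny illustration of polynomial system solving (this is NOT the chamber-cone
# algorithm from the report, just a small symbolic example with SymPy).
import sympy as sp

x, y = sp.symbols("x y")
# A system in n = 2 variables with a small number of terms per polynomial.
f1 = x**2 + y**2 - 5
f2 = x * y - 2

solutions = sp.solve([f1, f2], [x, y], dict=True)
for s in solutions:
    print(s)
# Expected real solutions: (x, y) in {(1, 2), (2, 1), (-1, -2), (-2, -1)}.
```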

  2. Security in hybrid cloud computing

    OpenAIRE

    Koudelka, Ondřej

    2016-01-01

    This bachelor thesis deals with the area of hybrid cloud computing, specifically with its security. The major aim of the thesis is to analyze and compare the chosen hybrid cloud providers. As a minor aim, the thesis compares the security challenges of the hybrid cloud against those of other deployment models. In order to accomplish these aims, the thesis defines the terms cloud computing and hybrid cloud computing in its theoretical part. Furthermore, the security challenges for cloud computing a...

  3. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  4. 20 CFR 411.180 - What is timely progress toward self-supporting employment?

    Science.gov (United States)

    2010-04-01

    ... Who Are Using a Ticket Introduction § 411.180 What is timely progress toward self-supporting... the previous 12-month progress certification period. In computing any 12-month progress certification...

  5. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for the quantum programming and execution models, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  6. Early detection of emphysema progression

    DEFF Research Database (Denmark)

    Gorbunova, Vladlena; Jacobs, Sander S A M; Lo, Pechin

    2010-01-01

    Emphysema is one of the most widespread diseases in subjects with a smoking history. The gold standard method for estimating the severity of emphysema is a lung function test, such as forced expiratory volume in the first second (FEV1). However, several clinical studies have shown that chest CT scans offer more sensitive estimates of emphysema progression. The standard CT densitometric score of emphysema is the relative area of voxels below a threshold (RA). The RA score is a global measurement and reflects overall emphysema progression. In this work, we propose a framework for estimation of local emphysema progression from longitudinal chest CT scans. First, images are registered to a common system of coordinates and then local image dissimilarities are computed in corresponding anatomical locations. Finally, the obtained dissimilarity representation is converted into a single emphysema progression...
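
    The RA score mentioned above is simple to compute once a lung segmentation is available: it is the fraction of lung voxels whose attenuation falls below a threshold. The sketch below uses a synthetic volume and the commonly used -950 HU cutoff (an assumption for illustration; the abstract does not state a threshold):

```python
# Minimal sketch: the relative-area (RA) densitometric score of emphysema,
# i.e. the fraction of lung voxels with attenuation below a threshold.
# The -950 HU threshold is a common choice, assumed here; the CT volume and
# lung mask are synthetic stand-ins.
import numpy as np

def relative_area(ct_hu, lung_mask, threshold_hu=-950):
    """Fraction of voxels inside the lung mask below the HU threshold."""
    lung_voxels = ct_hu[lung_mask]
    return float((lung_voxels < threshold_hu).mean())

rng = np.random.default_rng(0)
ct = rng.normal(-820, 90, size=(64, 64, 32))     # synthetic lung attenuation (HU)
mask = np.ones(ct.shape, dtype=bool)             # pretend the whole volume is lung

print(f"RA-950 = {relative_area(ct, mask):.1%}")
# Tracking this global score (or, as the paper proposes, local dissimilarities
# after registration) across longitudinal scans gives a progression estimate.
```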

  7. Crystal growth and computational materials science

    International Nuclear Information System (INIS)

    Jayakumar, S.; Ravindran, P.; Arun Kumar, R.; Sudarshan, C.

    2012-01-01

    The proceedings of the international conference on advanced materials discusses the advances being made in the area of single crystals, their preparation and device development from these crystals and details of the progress that is taking place in the computational field relating to materials science. Computational materials science makes use of advanced simulation tools and computer interfaces to develop a virtual platform which can provide a model for real-time experiments. This book includes selected papers in topics of crystal growth and computational materials science. We are confident that the new concepts and results presented will stimulate and enhance progress of research on crystal growth and computational materials science. Papers relevant to INIS are indexed separately

  8. FFTF integrated leak rate computer system

    International Nuclear Information System (INIS)

    Hubbard, J.A.

    1987-01-01

    The Fast Flux Test Facility (FFTF) is a liquid-metal-cooled test reactor located on the Hanford site. The FFTF is the only reactor of this type designed and operated to meet the licensing requirements of the Nuclear Regulatory Commission. Unique characteristics of the FFTF that present special challenges related to leak rate testing include thin wall containment vessel construction, cover gas systems that penetrate containment, and a low-pressure design basis accident. The successful completion of the third FFTF integrated leak rate test 5 days ahead of schedule and 10% under budget was a major achievement for the Westinghouse Hanford Company. The success of this operational safety test was due in large part to a special network (LAN) of three IBM PC/XT computers, which monitored the sensor data, calculated the containment vessel leak rate, and displayed test results. The equipment configuration allowed continuous monitoring of the progress of the test independent of the data acquisition and analysis functions, and it also provided overall improved system reliability by permitting immediate switching to backup computers in the event of equipment failure
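
    Integrated leak rate tests of this general kind infer leakage from the decay of the contained air mass, obtained from pressure and temperature via the ideal gas law, with the leak rate taken as a regression slope over the test period. The sketch below is a generic illustration of that calculation with invented data; it is not the FFTF software, and the numbers are not FFTF test values:

```python
# Generic illustration (not the FFTF code): mass-point integrated leak rate.
# The contained air mass at each time is proportional to P/T (ideal gas law);
# a least-squares slope through the mass points gives the leak rate in %/day.
import numpy as np

# Hypothetical hourly test data over 24 h: absolute pressure (kPa) and
# containment-average temperature (K).
hours = np.arange(0, 25)
pressure = 180.0 - 0.015 * hours + np.random.default_rng(0).normal(0, 0.005, hours.size)
temperature = 295.0 + 0.02 * np.sin(hours / 4.0)

mass = pressure / temperature                    # proportional to contained air mass
slope, intercept = np.polyfit(hours, mass, 1)    # linear fit: mass change per hour

leak_rate_pct_per_day = -slope / intercept * 24 * 100
print(f"Estimated leak rate: {leak_rate_pct_per_day:.3f} % of contained mass per day")
```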

  9. Challenges in scaling NLO generators to leadership computers

    Science.gov (United States)

    Benjamin, D.; Childers, JT; Hoeche, S.; LeCompte, T.; Uram, T.

    2017-10-01

    Exascale computing resources are roughly a decade away and will be capable of 100 times more computing than current supercomputers. In the last year, Energy Frontier experiments crossed a milestone of 100 million core-hours used at the Argonne Leadership Computing Facility, Oak Ridge Leadership Computing Facility, and NERSC. The Fortran-based leading-order parton generator called Alpgen was successfully scaled to millions of threads to achieve this level of usage on Mira. Sherpa and MadGraph are next-to-leading order generators used heavily by LHC experiments for simulation. Integration times for high-multiplicity or rare processes can take a week or more on standard Grid machines, even using all 16-cores. We will describe our ongoing work to scale the Sherpa generator to thousands of threads on leadership-class machines and reduce run-times to less than a day. This work allows the experiments to leverage large-scale parallel supercomputers for event generation today, freeing tens of millions of grid hours for other work, and paving the way for future applications (simulation, reconstruction) on these and future supercomputers.

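    The scaling strategy described amounts to decomposing independent event generation across many workers. The sketch below shows one hedged way such a decomposition could look with mpi4py; generate_events is a stand-in placeholder rather than the Sherpa or Alpgen interface, and the event counts and seeds are invented.

    ```python
    # Illustrative sketch of splitting event generation across MPI ranks, in the
    # spirit of the scaling work described above. This is NOT the Sherpa/Alpgen API.
    from mpi4py import MPI
    import random

    def generate_events(n, seed):
        """Stand-in for an expensive parton-level event generator."""
        rng = random.Random(seed)
        return [rng.random() for _ in range(n)]   # pretend these are event weights

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    total_events = 1_000_000
    per_rank = total_events // size                         # static decomposition
    events = generate_events(per_rank, seed=12345 + rank)   # independent seed per rank

    local_sum = sum(events)
    global_sum = comm.reduce(local_sum, op=MPI.SUM, root=0)
    if rank == 0:
        print(f"{size} ranks generated {per_rank * size} events, weight sum {global_sum:.1f}")
    ```
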
  10. Wearable sensors: modalities, challenges, and prospects.

    Science.gov (United States)

    Heikenfeld, J; Jajack, A; Rogers, J; Gutruf, P; Tian, L; Pan, T; Li, R; Khine, M; Kim, J; Wang, J; Kim, J

    2018-01-16

    Wearable sensors have recently seen a large increase in both research and commercialization. However, success in wearable sensors has been a mix of both progress and setbacks. Most of the commercial progress has been in smart adaptation of existing mechanical, electrical and optical methods of measuring the body. This adaptation has involved innovations in how to miniaturize sensing technologies, how to make them conformal and flexible, and in the development of companion software that increases the value of the measured data. However, chemical sensing modalities have experienced greater challenges in commercial adoption, especially for non-invasive chemical sensors. There have also been significant challenges in making fundamental improvements to existing mechanical, electrical, and optical sensing modalities, especially in improving their specificity of detection. Many of these challenges can be understood by appreciating the body's surface (skin) as more of an information barrier than as an information source. With a deeper understanding of the fundamental challenges faced for wearable sensors and of the state-of-the-art for wearable sensor technology, the roadmap becomes clearer for creating the next generation of innovations and breakthroughs.

  11. The Artificial Leaf: Recent Progress and Remaining Challenges

    Directory of Open Access Journals (Sweden)

    Mark D Symes

    2016-12-01

    Full Text Available The prospect of a device that uses solar energy to split water into H2 and O2 is highly attractive in terms of producing hydrogen as a carbon-neutral fuel. In this mini review, key research milestones that have been reached in this field over the last two decades will be discussed, with special focus on devices that use earth-abundant materials. Finally, the remaining challenges in the development of such “artificial leaves” will be highlighted.

  12. Cloud service performance evaluation: status, challenges, and opportunities – a survey from the system modeling perspective

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2017-05-01

    Full Text Available With rapid advancement of Cloud computing and networking technologies, a wide spectrum of Cloud services have been developed by various providers and utilized by numerous organizations as indispensable ingredients of their information systems. Cloud service performance has a significant impact on performance of the future information infrastructure. Thorough evaluation of Cloud service performance is crucial and beneficial to both service providers and consumers; it thus forms an active research area. Some key technologies for Cloud computing, such as virtualization and the Service-Oriented Architecture (SOA), bring special challenges to service performance evaluation. A tremendous amount of effort has been put in by the research community to address these challenges, and exciting progress has been made. Among the work on Cloud performance analysis, evaluation approaches developed with a system modeling perspective play an important role. However, related works are scattered across different sections of the literature, and a big picture showing the latest status of this area is lacking. The objective of this article is to present a survey that reflects the state of the art of Cloud service performance evaluation from the system modeling perspective. This article also examines open issues and challenges to the surveyed evaluation approaches and identifies possible opportunities for future research in this important field.

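    System-model-based evaluation of Cloud services often starts from queueing abstractions. As a minimal illustration of that modeling style, far simpler than the surveyed approaches, the sketch below computes utilization and mean response time for an M/M/1 queue; the arrival and service rates are made-up numbers.

    ```python
    # A minimal, illustrative queueing-model calculation of the kind used in
    # system-model-based Cloud performance evaluation (M/M/1 approximation).
    def mm1_metrics(arrival_rate, service_rate):
        """Return utilization, mean jobs in system and mean response time for M/M/1."""
        if arrival_rate >= service_rate:
            raise ValueError("system is unstable: arrival rate must be below service rate")
        rho = arrival_rate / service_rate                     # utilization
        mean_in_system = rho / (1.0 - rho)                    # L = rho / (1 - rho)
        mean_response = 1.0 / (service_rate - arrival_rate)   # W = 1 / (mu - lambda)
        return rho, mean_in_system, mean_response

    rho, L, W = mm1_metrics(arrival_rate=80.0, service_rate=100.0)   # requests/s
    print(f"utilization={rho:.2f}, jobs in system={L:.1f}, mean response={W*1000:.1f} ms")
    ```
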
  13. Automation of Knowledge Work in Medicine and Health care: Future and Challenges

    Directory of Open Access Journals (Sweden)

    Farzan Majidfar

    2017-07-01

    Full Text Available Increases in computing speed, machine learning and human-interface capabilities have brought artificial intelligence applications to an important stage. It is predicted that the use of artificial intelligence (AI) to automate knowledge-based occupations (such as medicine, engineering and law) may have an enormous global economic impact in the near future. Applications based on artificial intelligence could improve health and quality of life for millions in the coming years. Although clinical applications of computer science are moving slowly from the laboratory to the real world, there are promising signs that the pace of innovation will improve. In the near future, AI-based applications that automate knowledge work in diagnosis and treatment, nursing and health care, robotic surgery and the development of new drugs will have a transformative effect on the health sector. To achieve this, artificial intelligence systems must work closely with health providers and patients to gain their trust. How smart machines will interact naturally with healthcare professionals, patients and patients' families is therefore very important, yet challenging. In this article, we review the future of AI-enabled automation of knowledge work in medicine and healthcare in seven categories: big medical data mining, computer-aided diagnosis, online consultations, evidence-based medicine, health assistance, precision medicine and drug creation. The challenges involved, including cultural, organizational, legal and social barriers, are also described.

  14. Interdisciplinary research and education at the biology-engineering-computer science interface: a perspective.

    Science.gov (United States)

    Tadmor, Brigitta; Tidor, Bruce

    2005-09-01

    Progress in the life sciences, including genome sequencing and high-throughput experimentation, offers an opportunity for understanding biology and medicine from a systems perspective. This 'new view', which complements the more traditional component-based approach, involves the integration of biological research with approaches from engineering disciplines and computer science. The result is more than a new set of technologies. Rather, it promises a fundamental reconceptualization of the life sciences based on the development of quantitative and predictive models to describe crucial processes. To achieve this change, learning communities are being formed at the interface of the life sciences, engineering and computer science. Through these communities, research and education will be integrated across disciplines and the challenges associated with multidisciplinary team-based science will be addressed.

  15. Single-molecule techniques in biophysics: a review of the progress in methods and applications

    Science.gov (United States)

    Miller, Helen; Zhou, Zhaokun; Shepherd, Jack; Wollman, Adam J. M.; Leake, Mark C.

    2018-02-01

    Single-molecule biophysics has transformed our understanding of biology, but also of the physics of life. More exotic than simple soft matter, biomatter lives far from thermal equilibrium, covering multiple length scales from the nanoscale of single molecules up to several orders of magnitude larger in cells, tissues and organisms. Biomolecules are often characterized by underlying instability: multiple metastable free energy states exist, separated by levels of just a few multiples of the thermal energy scale k_BT, where k_B is the Boltzmann constant and T the absolute temperature, implying complex inter-conversion kinetics in the relatively hot, wet environment of active biological matter. A key benefit of single-molecule biophysics techniques is their ability to probe heterogeneity of free energy states across a molecular population, which is in general too challenging for conventional ensemble-average approaches. Parallel developments in experimental and computational techniques have catalysed the birth of multiplexed, correlative techniques to tackle previously intractable biological questions. Experimentally, progress has been driven by improvements in sensitivity and speed of detectors, and the stability and efficiency of light sources, probes and microfluidics. We discuss the motivation and requirements for these recent experiments, including the underpinning mathematics. These methods are broadly divided into tools which detect molecules and those which manipulate them. For the former we discuss the progress of super-resolution microscopy, transformative for addressing many longstanding questions in the life sciences, and for the latter we include progress in ‘force spectroscopy’ techniques that mechanically perturb molecules. We also consider in silico progress of single-molecule computational physics, and how simulation and experimentation may be drawn together to give a more complete understanding. Increasingly, combinatorial techniques are now used, including

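    To make the "few multiples of k_BT" remark concrete, the short calculation below evaluates the Boltzmann population ratio between two states separated by a given energy gap. It is a generic textbook relation rather than anything specific to the techniques reviewed.

    ```python
    # A small worked example of why free-energy gaps of "a few k_B T" imply
    # frequent inter-conversion: relative Boltzmann populations of two states.
    import math

    def relative_population(delta_e_in_kT):
        """Population ratio p(high) / p(low) for an energy gap given in units of k_B*T."""
        return math.exp(-delta_e_in_kT)

    for gap in (1, 2, 5, 10):
        print(f"gap = {gap:2d} k_BT -> higher state populated at "
              f"{relative_population(gap):.3%} of the lower state")
    ```
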
  16. The challenge of networked enterprises for cloud computing interoperability

    OpenAIRE

    Mezgár, István; Rauschecker, Ursula

    2014-01-01

    Manufacturing enterprises have to organize themselves into effective system architectures forming different types of Networked Enterprises (NE) to match fast changing market demands. Cloud Computing (CC) is an important up to date computing concept for NE, as it offers significant financial and technical advantages beside high-level collaboration possibilities. As cloud computing is a new concept the solutions for handling interoperability, portability, security, privacy and standardization c...

  17. Performance engineering challenges: the view from RENCI

    International Nuclear Information System (INIS)

    Fowler, R; Gamblin, T; Porterfield, A; Dreher, P; Huang, S; Joo, B

    2008-01-01

    Trends in chip technology and system design are causing a revolution in high-performance computing. The emergence of multicore processor chips, the construction of very large computing systems, and the increasing need to deal with power and energy issues in these systems are three of the most significant changes. We focus on the way that these trends have created a new set of challenges in the area of performance engineering, the measurement, analysis, and tuning of computing systems and applications. We discuss these changes and outline recent work at the Renaissance Computing Institute to meet these challenges

  18. Radiation doses in pediatric computed tomography procedures: challenges facing new technologies

    International Nuclear Information System (INIS)

    Cotelo, E.; Padilla, M.; Dibarboure, L.

    2008-01-01

    Despite the fact that in recent years an increasing number of radiologists and radiological technologists have been applying radiation dose optimization techniques in paediatric Computed Tomography (CT) examinations, dual and multi-slice CT (MSCT) scanners present a new challenge in Radiation Protection (RP). While on one hand these scanners are provided with Automatic Exposure Control (AEC) devices, dose reduction modes and dose estimation software, on the other hand Quality Control (QC) tests, CT Kerma Index (C) measurements and patient dose estimation present specific difficulties and require changes or adaptations of traditional QC protocols. This is a major challenge in most developing countries, where Quality Assurance Programmes (QAP) have not yet been implemented and there is a shortage of medical physicists. This paper analyses clinical and technical protocols as well as patient doses in 204 CT body procedures performed on 154 children. The investigation was carried out in a paediatric reference hospital of Uruguay, where an average of 450 paediatric CT examinations per month are performed on a single dual-slice CT scanner. In addition, the C_VOL value reported on the scanner display was recorded so that it could be compared with the same dosimetric quantity derived from technical parameters and tabulated C values. Results showed that not all radiologists applied the same protocol in similar clinical situations, delivering unnecessary patient dose with no significant difference in image quality. Moreover, it was found that dose reduction modes make patient dose estimation difficult when the mA changes according to tissue attenuation, in most cases within each rotation. The study concluded on the importance of a QAP that must include education of radiologists and technologists in RP, as well as on the need for medical physicists to perform QC tests and patient dose estimations and measurements. (author)

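    As a hedged illustration of the kind of patient-dose estimate discussed above, the snippet below applies the standard dose-length-product relation (effective dose roughly equals CTDI_vol times scan length times a conversion coefficient k). The numerical inputs, including k, are purely illustrative and are not taken from the study.

    ```python
    # A minimal worked example of a CT effective-dose estimate via the
    # dose-length product (DLP). The conversion coefficient k depends on body
    # region and patient age; the value used here is purely illustrative.
    def effective_dose_msv(ctdi_vol_mgy, scan_length_cm, k_msv_per_mgy_cm):
        dlp = ctdi_vol_mgy * scan_length_cm        # dose-length product, mGy*cm
        return dlp * k_msv_per_mgy_cm

    dose = effective_dose_msv(ctdi_vol_mgy=3.0, scan_length_cm=25.0, k_msv_per_mgy_cm=0.02)
    print(f"Estimated effective dose: {dose:.2f} mSv")   # 3.0 * 25 * 0.02 = 1.5 mSv
    ```
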
  19. Fault tolerance in computational grids: perspectives, challenges, and issues.

    Science.gov (United States)

    Haider, Sajjad; Nazir, Babar

    2016-01-01

    Computational grids are established with the intention of providing shared access to hardware- and software-based resources, with special reference to increased computational capabilities. Fault tolerance is one of the most important issues faced by computational grids. The main contribution of this survey is the creation of an extended classification of problems that occur in computational grid environments. The proposed classification will help researchers, developers, and maintainers of grids to understand the types of issues to be anticipated. Moreover, different types of problems, such as omission, interaction, and timing-related faults, have been identified that need to be handled on various layers of the computational grid. In this survey, fault tolerance and fault detection mechanisms are also analysed and examined. Our conclusion is that a dependable and reliable grid can only be established when greater emphasis is placed on fault identification. Moreover, our survey reveals that adaptive and intelligent fault identification and tolerance techniques can improve the dependability of grid working environments.

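    One widely used fault-detection mechanism in grid environments is heartbeat monitoring. The sketch below is a minimal, hypothetical illustration of that idea; the class name, timeout policy, and node identifiers are assumptions rather than any particular grid middleware's API.

    ```python
    # Illustrative heartbeat-style fault detector; not a specific middleware API.
    import time

    class HeartbeatMonitor:
        def __init__(self, timeout_s=10.0):
            self.timeout_s = timeout_s
            self.last_seen = {}                     # node id -> time of last heartbeat

        def heartbeat(self, node_id):
            self.last_seen[node_id] = time.monotonic()

        def suspected_failures(self):
            """Nodes whose last heartbeat is older than the timeout are suspected failed."""
            now = time.monotonic()
            return [n for n, t in self.last_seen.items() if now - t > self.timeout_s]

    monitor = HeartbeatMonitor(timeout_s=5.0)
    monitor.heartbeat("worker-01")
    monitor.heartbeat("worker-02")
    # ... later, a scheduler would reschedule jobs from monitor.suspected_failures()
    print(monitor.suspected_failures())
    ```
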
  20. Continuing progress on a lattice QCD software infrastructure

    International Nuclear Information System (INIS)

    Joo, B

    2008-01-01

    We report on the progress of the software effort in the QCD application area of SciDAC. In particular, we discuss how the software developed under SciDAC enabled the aggressive exploitation of leadership computers, and we report on progress in the area of QCD software for multi-core architectures

  1. Galeras: Progress and challenges of disaster

    International Nuclear Information System (INIS)

    Dorado G, Lina Marlene

    2008-01-01

    The Galeras Volcano is presently considered the most active in Colombia. Over the last 17 years of constant surveillance, most eruptions of the Galeras Volcano have been classified as small. In the high-hazard zone live 7935 people who have to be evacuated every time level II volcanic activity is reached (probable eruption in the course of days or weeks). For the first time in Colombian history, a disaster situation was declared before the event itself occurred. On November 15th, 2005, the National Government, on the basis of Decreto 4106, declared the existence of a disaster situation within the municipalities of Pasto, Narino and La Florida, all part of the Narino Department. This declaration was made in view of the serious disruption of daily life to which the population would be exposed should the probable volcanic eruption occur. The present work is an analysis of the emergency procedures that have been carried out, using the PAR (pressure and release) methodology. This analysis contains some reflections on how difficulties were solved, and on the positive aspects, challenges and advances towards a better long-term management of evacuations such as those carried out in the high-hazard zones of Galeras volcano

  2. Computational pan-genomics: status, promises and challenges

    NARCIS (Netherlands)

    The Computational Pan-Genomics Consortium; T. Marschall (Tobias); M. Marz (Manja); T. Abeel (Thomas); L.J. Dijkstra (Louis); B.E. Dutilh (Bas); A. Ghaffaari (Ali); P. Kersey (Paul); W.P. Kloosterman (Wigard); V. Mäkinen (Veli); A.M. Novak (Adam); B. Paten (Benedict); D. Porubsky (David); E. Rivals (Eric); C. Alkan (Can); J.A. Baaijens (Jasmijn); P.I.W. de Bakker (Paul); V. Boeva (Valentina); R.J.P. Bonnal (Raoul); F. Chiaromonte (Francesca); R. Chikhi (Rayan); F.D. Ciccarelli (Francesca); C.P. Cijvat (Robin); E. Datema (Erwin); C.M. van Duijn (Cornelia); E.E. Eichler (Evan); C. Ernst (Corinna); E. Eskin (Eleazar); E. Garrison (Erik); M. El-Kebir (Mohammed); G.W. Klau (Gunnar); J.O. Korbel (Jan); E.-W. Lameijer (Eric-Wubbo); B. Langmead (Benjamin); M. Martin; P. Medvedev (Paul); J.C. Mu (John); P.B.T. Neerincx (Pieter); K. Ouwens (Klaasjan); P. Peterlongo (Pierre); N. Pisanti (Nadia); S. Rahmann (Sven); B.J. Raphael (Benjamin); K. Reinert (Knut); D. de Ridder (Dick); J. de Ridder (Jeroen); M. Schlesner (Matthias); O. Schulz-Trieglaff (Ole); A.D. Sanders (Ashley); S. Sheikhizadeh (Siavash); C. Shneider (Carl); S. Smit (Sandra); D. Valenzuela (Daniel); J. Wang (Jiayin); L.F.A. Wessels (Lodewyk); Y. Zhang (Ying); V. Guryev (Victor); F. Vandin (Fabio); K. Ye (Kai); A. Schönhuth (Alexander)

    2018-01-01

    textabstractMany disciplines, from human genetics and oncology to plant breeding, microbiology and virology, commonly face the challenge of analyzing rapidly increasing numbers of genomes. In case of Homo sapiens, the number of sequenced genomes will approach hundreds of thousands in the next few

  3. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    -based solutions to significant problems? The important issues of workflow and data flow are discussed together with fit-for-purpose model development. As well, the lack of tools around multiscale modelling provides opportunities for the development of efficient tools to address such challenges. The ability...

  4. Clay-Inspired MXene-Based Electrochemical Devices and Photo-Electrocatalyst: State-of-the-Art Progresses and Challenges.

    Science.gov (United States)

    Wang, Hou; Wu, Yan; Yuan, Xingzhong; Zeng, Guangming; Zhou, Jin; Wang, Xin; Chew, Jia Wei

    2018-03-01

    MXene, an important and increasingly popular category of postgraphene 2D nanomaterials, has been rigorously investigated since early 2011 because of advantages including flexible tunability in element composition, hydrophobicity, metallic nature, unique in-plane anisotropic structure, high charge-carrier mobility, tunable band gap, and favorable optical and mechanical properties. To fully exploit these potentials and further expand beyond the existing boundaries, novel functional nanostructures spanning monolayer, multilayer, nanoparticles, and composites have been developed by means of intercalation, delamination, functionalization, hybridization, among others. Undeniably, the cutting-edge developments and applications of clay-inspired 2D MXene platform as electrochemical electrode or photo-electrocatalyst have conferred superior performance and have made significant impact in the field of energy and advanced catalysis. This review provides an overview of the fundamental properties and synthesis routes of pure MXene, functionalized MXene and their hybrids, highlights the state-of-the-art progresses of MXene-based applications with respect to supercapacitors, batteries, electrocatalysis and photocatalysis, and presents the challenges and prospects in the burgeoning field. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. On the convergence of nanotechnology and Big Data analysis for computer-aided diagnosis.

    Science.gov (United States)

    Rodrigues, Jose F; Paulovich, Fernando V; de Oliveira, Maria Cf; de Oliveira, Osvaldo N

    2016-04-01

    An overview is provided of the challenges involved in building computer-aided diagnosis systems capable of precise medical diagnostics based on integration and interpretation of data from different sources and formats. The availability of massive amounts of data and computational methods associated with the Big Data paradigm has brought hope that such systems may soon be available in routine clinical practices, which is not the case today. We focus on visual and machine learning analysis of medical data acquired with varied nanotech-based techniques and on methods for Big Data infrastructure. Because diagnosis is essentially a classification task, we address the machine learning techniques with supervised and unsupervised classification, making a critical assessment of the progress already made in the medical field and the prospects for the near future. We also advocate that successful computer-aided diagnosis requires a merge of methods and concepts from nanotechnology and Big Data analysis.

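    Since the review frames diagnosis as essentially a classification task, a minimal supervised-learning sketch may help fix ideas. The example below uses scikit-learn on synthetic data standing in for nanotech-derived features; it is illustrative only and is not the pipeline proposed by the authors.

    ```python
    # Minimal supervised-classification sketch for a diagnosis-style task.
    # Synthetic data stands in for nanotech-derived patient features.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # synthetic "patient" feature vectors and binary diagnostic labels
    X, y = make_classification(n_samples=500, n_features=20, n_informative=8, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print(f"held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
    ```
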
  6. Lattice QCD - a challenge in large scale computing

    International Nuclear Information System (INIS)

    Schilling, K.

    1987-01-01

    The computation of the hadron spectrum within the framework of lattice QCD sets a demanding goal for the application of supercomputers in basic science. It requires both big computer capacities and clever algorithms to fight all the numerical evils that one encounters in the Euclidean space-time-world. The talk will attempt to introduce the present state of the art of spectrum calculations by lattice simulations. (orig.)

  7. Measuring and reporting on decommissioning progress

    International Nuclear Information System (INIS)

    Lange, B.A.

    2006-01-01

    One of the challenges facing AECL, as well as other organizations charged with the responsibility of decommissioning nuclear facilities, is the means by which to measure and report on decommissioning progress to various audiences which, in some cases, may only have a peripheral knowledge or understanding of the complexities associated with the decommissioning process. The reporting and measurement of decommissioning progress is important for a number of reasons: it provides a vehicle by which to effectively communicate the nature of the decommissioning process; it ensures that stakeholders and shareholders are provided with a transparent and understandable means for assessing value for money; and it provides a means by which to integrate the planning, measurement, and operational aspects of decommissioning. One underlying reason behind the challenge of reporting decommissioning progress lies in the fact that decommissioning programs are generally executed over periods of time that far exceed those generally associated with typical design and build projects. For example, a decommissioning program could take decades to complete, in which case progress on the order of a few percent in any one year might be typical. However, such progress may appear low compared to that seen with more typical projects that can be completed in a matter of years. As a consequence, AECL undertook to develop a system by which to measure decommissioning progress in a straightforward, meaningful, and understandable fashion. The system is not rigorously objective, and there are subjective aspects that are necessitated by the need to keep the system readily understandable. It is also important to note that while the system is simple in concept, there is, nonetheless, significant effort involved in generating and updating the parameters used as input, and in the actual calculations. (author)

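    A simple way to picture the kind of progress measure described is a weighted roll-up over work packages, where each package's completion is weighted by its share of the total programme effort. The sketch below is hypothetical: the package names, weights, and completion fractions are invented, and the actual AECL system is certainly more elaborate.

    ```python
    # Hypothetical weighted progress roll-up; names, weights and fractions are invented.
    work_packages = [
        # (name, weight relative to total programme effort, fraction complete)
        ("Fuel removal",              0.30, 1.00),
        ("Systems decontamination",   0.25, 0.40),
        ("Structure dismantling",     0.35, 0.05),
        ("Site restoration",          0.10, 0.00),
    ]

    overall = sum(weight * done for _, weight, done in work_packages)
    print(f"Overall decommissioning progress: {overall:.1%}")
    ```
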
  8. Towards Static Analysis of Policy-Based Self-adaptive Computing Systems

    DEFF Research Database (Denmark)

    Margheri, Andrea; Nielson, Hanne Riis; Nielson, Flemming

    2016-01-01

    For supporting the design of self-adaptive computing systems, the PSCEL language offers a principled approach that relies on declarative definitions of adaptation and authorisation policies enforced at runtime. Policies permit managing system components by regulating their interactions...... and by dynamically introducing new actions to accomplish task-oriented goals. However, the runtime evaluation of policies and their effects on system components make the prediction of system behaviour challenging. In this paper, we introduce the construction of a flow graph that statically points out the policy...... evaluations that can take place at runtime and exploit it to analyse the effects of policy evaluations on the progress of system components....

  9. Multiscale Mechanics of Articular Cartilage: Potentials and Challenges of Coupling Musculoskeletal, Joint, and Microscale Computational Models

    Science.gov (United States)

    Halloran, J. P.; Sibole, S.; van Donkelaar, C. C.; van Turnhout, M. C.; Oomens, C. W. J.; Weiss, J. A.; Guilak, F.; Erdemir, A.

    2012-01-01

    Articular cartilage experiences significant mechanical loads during daily activities. Healthy cartilage provides the capacity for load bearing and regulates the mechanobiological processes for tissue development, maintenance, and repair. Experimental studies at multiple scales have provided a fundamental understanding of macroscopic mechanical function, evaluation of the micromechanical environment of chondrocytes, and the foundations for mechanobiological response. In addition, computational models of cartilage have offered a concise description of experimental data at many spatial levels under healthy and diseased conditions, and have served to generate hypotheses for the mechanical and biological function. Further, modeling and simulation provides a platform for predictive risk assessment, management of dysfunction, as well as a means to relate multiple spatial scales. Simulation-based investigation of cartilage comes with many challenges including both the computational burden and often insufficient availability of data for model development and validation. This review outlines recent modeling and simulation approaches to understand cartilage function from a mechanical systems perspective, and illustrates pathways to associate mechanics with biological function. Computational representations at single scales are provided from the body down to the microstructure, along with attempts to explore multiscale mechanisms of load sharing that dictate the mechanical environment of the cartilage and chondrocytes. PMID:22648577

  10. Exploring the marketing challenges faced by assembled computer dealers

    OpenAIRE

    Kallimani, Rashmi

    2010-01-01

    There has been great competition in the computer market these days for obtaining a higher market share. The computer market, consisting of many branded and non-branded players, has been using various methods to match supply and demand in the best possible way and attain market dominance. Branded companies are seen to be investing large amounts in aggressive marketing techniques to reach customers and obtain a higher market share. Due to this, many small companies and non-branded computer...

  11. Modern Trends Of Computation, Simulation, and Communication, And Their Impacts On The Progress Of Scientific And Engineering Research, Development, And Education

    International Nuclear Information System (INIS)

    Bunjamin, Muhammad

    2001-01-01

    A short report on the modern trends of computation, simulation, and communication in the 1990s is presented, along with their impacts on the progress of scientific and engineering research, development, and education. A full description of this giant issue is certainly a 'mission impossible' for the author. Nevertheless, it is the author's hope that it will at least give an overall view of what is going on in this very dynamic field in the advanced countries. After 'thinking globally' through reading this report, we should then decide on what and how to 'act locally' to respond to these global trends. The main sources of information reported here were the computational science and engineering journals and books issued during the 1990s, as listed in the references below

  12. Technological progress and challenges towards cGMP manufacturing of human pluripotent stem cells based therapeutic products for allogeneic and autologous cell therapies.

    Science.gov (United States)

    Abbasalizadeh, Saeed; Baharvand, Hossein

    2013-12-01

    Recent technological advances in the generation, characterization, and bioprocessing of human pluripotent stem cells (hPSCs) have created new hope for their use as a source for production of cell-based therapeutic products. To date, a few clinical trials that have used therapeutic cells derived from hESCs have been approved by the Food and Drug Administration (FDA), but numerous new hPSC-based cell therapy products are under various stages of development in cell therapy-specialized companies and their future market is estimated to be very promising. However, the multitude of critical challenges regarding different aspects of hPSC-based therapeutic product manufacturing and their therapies have made progress toward the introduction of new products and clinical applications very slow. These challenges include scientific, technological, clinical, policy, and financial aspects. The technological aspects of manufacturing hPSC-based therapeutic products for allogeneic and autologous cell therapies according to good manufacturing practice (cGMP) quality requirements are among the most important and challenging emerging topics in the development of new hPSCs for clinical use. In this review, we describe the main critical challenges and highlight a series of technological advances in all aspects of hPSC-based therapeutic product manufacturing, including clinical grade cell line development, large-scale banking, upstream processing, downstream processing, and quality assessment of final cell therapeutic products, that have brought hPSCs closer to clinical application and commercial cGMP manufacturing. © 2013.

  13. Massively Parallel Computing: A Sandia Perspective

    Energy Technology Data Exchange (ETDEWEB)

    Dosanjh, Sudip S.; Greenberg, David S.; Hendrickson, Bruce; Heroux, Michael A.; Plimpton, Steve J.; Tomkins, James L.; Womble, David E.

    1999-05-06

    The computing power available to scientists and engineers has increased dramatically in the past decade, due in part to progress in making massively parallel computing practical and available. The expectation for these machines has been great. The reality is that progress has been slower than expected. Nevertheless, massively parallel computing is beginning to realize its potential for enabling significant break-throughs in science and engineering. This paper provides a perspective on the state of the field, colored by the authors' experiences using large scale parallel machines at Sandia National Laboratories. We address trends in hardware, system software and algorithms, and we also offer our view of the forces shaping the parallel computing industry.

  14. Thallium-201 single photon emission computed tomography (SPECT) in patients with Duchenne's progressive muscular dystrophy. A histopathologic correlation study

    International Nuclear Information System (INIS)

    Nishimura, Toru; Yanagisawa, Atsuo; Sakata, Konomi; Shimoyama, Katsuya; Yoshino, Hideaki; Ishikawa, Kyozo; Sakata, Hitomi; Ishihara, Tadayuki

    2001-01-01

    The pathomorphologic mechanism responsible for abnormal perfusion imaging during thallium-201 myocardial single photon emission computed tomography (201Tl-SPECT) in patients with Duchenne's progressive muscular dystrophy (DMD) was investigated. Hearts from 7 patients with DMD were evaluated histopathologically at autopsy and the results correlated with findings on initial and delayed resting 201Tl-SPECT images. The location of segments with perfusion defects correlated with the histopathologically abnormal segments in the hearts. Both the extent and degree of myocardial fibrosis were severe, especially in the posterolateral segment of the left ventricle. Severe transmural fibrosis and severe fatty infiltration were common in segments with perfusion defects. In areas of redistribution, the degree of fibrosis appeared to be greater than in areas of normal perfusion; and intermuscular edema was prominent. Thus, the degree and extent of perfusion defects detected by 201Tl-SPECT were compatible with the histopathology. The presence of the redistribution phenomenon may indicate ongoing fibrosis. Initial and delayed resting 201Tl-SPECT images can predict the site and progress of myocardial degeneration in patients with DMD. (author)

  15. Ovarian cancer immunotherapy: opportunities, progresses and challenges

    Directory of Open Access Journals (Sweden)

    Stevens Richard

    2010-02-01

    Full Text Available Due to the low survival rates from invasive ovarian cancer, new effective treatment modalities are urgently needed. Compelling evidence indicates that the immune response against ovarian cancer may play an important role in controlling this disease. We herein summarize multiple immune-based strategies that have been proposed and tested for potential therapeutic benefit against advanced stage ovarian cancer. We will examine the evidence for the premise that an effective therapeutic vaccine against ovarian cancer is useful not only for inducing remission of the disease but also for preventing disease relapse. We will also highlight the questions and challenges in the development of ovarian cancer vaccines, and critically discuss the limitations of some of the existing immunotherapeutic strategies. Finally, we will summarize our own experience on the use of patient-specific tumor-derived heat shock protein-peptide complex for the treatment of advanced ovarian cancer.

  16. The implementation of AI technologies in computer wargames

    Science.gov (United States)

    Tiller, John A.

    2004-08-01

    Computer wargames involve the most in-depth analysis of general game theory. The enumerated turns of a game like chess are dwarfed by the exponentially larger possibilities of even a simple computer wargame. Implementing challenging AI in computer wargames is an important goal in both the commercial and military environments. In the commercial marketplace, customers demand a challenging AI opponent when they play a computer wargame and are frustrated by a lack of competence on the part of the AI. In the military environment, challenging AI opponents are important for several reasons. A challenging AI opponent will force the military professional to avoid routine or set-piece approaches to situations and cause them to think much deeper about military situations before taking action. A good AI opponent would also include national characteristics of the opponent being simulated, thus providing the military professional with even more of a challenge in planning and approach. Implementing current AI technologies in computer wargames is a technological challenge. The goal is to join the needs of AI in computer wargames with the solutions of current AI technologies. This talk will address several of those issues, possible solutions, and currently unsolved problems.

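    The game-tree reasoning behind such AI opponents is classically captured by minimax search. The compact sketch below shows the bare algorithm on a toy "game" whose state is a single number; real wargame AIs add evaluation heuristics, pruning, and vastly larger state spaces.

    ```python
    # A compact minimax sketch; real game AIs add pruning and rich evaluation functions.
    def minimax(state, depth, maximizing, evaluate, moves, apply_move):
        if depth == 0 or not moves(state):
            return evaluate(state)
        values = (minimax(apply_move(state, m), depth - 1, not maximizing,
                          evaluate, moves, apply_move)
                  for m in moves(state))
        return max(values) if maximizing else min(values)

    # Tiny demo: the state is a number, each move adds or subtracts 1, players alternate.
    score = minimax(0, depth=3, maximizing=True,
                    evaluate=lambda s: s,
                    moves=lambda s: (+1, -1),
                    apply_move=lambda s, m: s + m)
    print(score)   # 1: over three plies the maximizer nets one move's worth of gain
    ```
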
  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and writing to tape at high speed.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successes prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  18. Examination of concept of next generation computer. Progress report 1999

    Energy Technology Data Exchange (ETDEWEB)

    Higuchi, Kenji; Hasegawa, Yukihiro; Hirayama, Toshio

    2000-12-01

    The Center for Promotion of Computational Science and Engineering has conducted R and D work on parallel processing technology and started examining the concept of a next-generation computer in 1999. This report describes the behavior analyses of quantum calculation codes. It also describes the considerations behind these analyses and the results of examining methods to reduce cache misses. Furthermore, it describes a performance simulator that is being developed to quantitatively examine the concept of the next generation computer. (author)

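    As a hedged illustration of the cache-miss analysis mentioned above, the toy model below simulates a direct-mapped cache over a synthetic access trace; the cache geometry and the trace are invented and bear no relation to the codes studied in the report.

    ```python
    # A toy direct-mapped cache model for illustrating miss-rate analysis.
    def simulate_direct_mapped(addresses, n_lines=64, line_bytes=64):
        tags = [None] * n_lines
        misses = 0
        for addr in addresses:
            block = addr // line_bytes
            index = block % n_lines
            tag = block // n_lines
            if tags[index] != tag:        # miss: fill the line
                tags[index] = tag
                misses += 1
        return misses

    # Strided access pattern typical of array sweeps in simulation codes
    trace = [i * 8 for i in range(10_000)]            # 8-byte elements, unit stride
    print(f"miss rate: {simulate_direct_mapped(trace) / len(trace):.1%}")
    ```
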
  19. Advanced computations in plasma physics

    International Nuclear Information System (INIS)

    Tang, W.M.

    2002-01-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

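    The remark that run-time and problem size "scale well with the number of processors" can be made concrete with the usual scaling laws. The snippet below evaluates Amdahl's law (fixed problem size) and Gustafson's law (problem size grown with the processor count) for an assumed 99% parallel fraction; the numbers are illustrative, not measurements from the codes discussed.

    ```python
    # Illustration of strong versus weak scaling on P processors.
    def amdahl_speedup(parallel_fraction, processors):
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / processors)

    def gustafson_speedup(parallel_fraction, processors):
        return (1.0 - parallel_fraction) + parallel_fraction * processors

    for P in (64, 1024, 16384):
        print(f"P={P:5d}  Amdahl={amdahl_speedup(0.99, P):8.1f}  "
              f"Gustafson={gustafson_speedup(0.99, P):9.1f}")
    ```
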
  20. A Challenge to Watson

    Science.gov (United States)

    Detterman, Douglas K.

    2011-01-01

    Watson's Jeopardy victory raises the question of the similarity of artificial intelligence and human intelligence. Those of us who study human intelligence issue a challenge to the artificial intelligence community. We will construct a unique battery of tests for any computer that would provide an actual IQ score for the computer. This is the same…

  1. Towards large-scale data analysis: challenges in the design of portable systems and use of Cloud computing.

    Science.gov (United States)

    Diaz, Javier; Arrizabalaga, Saioa; Bustamante, Paul; Mesa, Iker; Añorga, Javier; Goya, Jon

    2013-01-01

    Portable systems and global communications open a broad spectrum for new health applications. In the framework of electrophysiological applications, several challenges are faced when developing portable systems embedded in Cloud computing services. In order to facilitate new developers in this area based on our experience, five areas of interest are presented in this paper where strategies can be applied for improving the performance of portable systems: transducer and conditioning, processing, wireless communications, battery and power management. Likewise, for Cloud services, scalability, portability, privacy and security guidelines have been highlighted.

  2. Multicore Challenges and Benefits for High Performance Scientific Computing

    Directory of Open Access Journals (Sweden)

    Ida M.B. Nielsen

    2008-01-01

    Full Text Available Until recently, performance gains in processors were achieved largely by improvements in clock speeds and instruction level parallelism. Thus, applications could obtain performance increases with relatively minor changes by upgrading to the latest generation of computing hardware. Currently, however, processor performance improvements are realized by using multicore technology and hardware support for multiple threads within each core, and taking full advantage of this technology to improve the performance of applications requires exposure of extreme levels of software parallelism. We will here discuss the architecture of parallel computers constructed from many multicore chips as well as techniques for managing the complexity of programming such computers, including the hybrid message-passing/multi-threading programming model. We will illustrate these ideas with a hybrid distributed memory matrix multiply and a quantum chemistry algorithm for energy computation using Møller–Plesset perturbation theory.

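    As a toy counterpart to the hybrid distributed-memory matrix multiply mentioned in the abstract, the sketch below parallelizes a blocked matrix multiply over threads. Production codes would use MPI plus OpenMP (or a tuned BLAS) in C or Fortran, so this is only a shape-of-the-idea illustration.

    ```python
    # Toy multi-threaded blocked matrix multiply: parallelism is exposed over row blocks.
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def blocked_matmul(A, B, block=128, workers=4):
        n = A.shape[0]
        C = np.zeros((n, B.shape[1]))

        def compute_row_block(i0):
            i1 = min(i0 + block, n)
            C[i0:i1, :] = A[i0:i1, :] @ B          # each task owns a disjoint row block

        with ThreadPoolExecutor(max_workers=workers) as pool:
            list(pool.map(compute_row_block, range(0, n, block)))
        return C

    A, B = np.random.rand(512, 512), np.random.rand(512, 512)
    assert np.allclose(blocked_matmul(A, B), A @ B)
    ```
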
  3. CERN concludes year of strong progress towards the LHC

    CERN Multimedia

    2006-01-01

    Speaking at the 135th session of the CERN Council, the Director General, Robert Aymar, hailed a year of impressive progress towards the LHC project. 'In one year, we have made great progress,' he said. 'The challenge is not over, of course, but we have great confidence of maintaining the schedule for start-up in 2007.'

  4. Computing with memory for energy-efficient robust systems

    CERN Document Server

    Paul, Somnath

    2013-01-01

    This book analyzes energy and reliability as major challenges faced by designers of computing frameworks in the nanometer technology regime.  The authors describe the existing solutions to address these challenges and then reveal a new reconfigurable computing platform, which leverages high-density nanoscale memory for both data storage and computation to maximize the energy-efficiency and reliability. The energy and reliability benefits of this new paradigm are illustrated and the design challenges are discussed. Various hardware and software aspects of this exciting computing paradigm are de

  5. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher order data products, as well as user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high performance disk storage (SSD) for the hot areas and less expensive slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute intensive workloads like parallel computation of hydrologic routing on high resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user

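    The hot/cold tiering idea, placing frequently accessed regions of the archive on fast storage, can be sketched in a few lines. The example below is hypothetical: the tile names, the access log, and the threshold are invented and do not reflect OpenTopography's actual policy.

    ```python
    # Illustrative hot/cold data-tiering sketch: rank archive regions by access
    # counts within an analysis window and assign a storage tier accordingly.
    from collections import Counter

    access_log = ["tileA", "tileA", "tileB", "tileA", "tileC", "tileB", "tileA"]

    counts = Counter(access_log)
    hot_threshold = 3                                   # accesses in the window

    placement = {tile: ("SSD" if n >= hot_threshold else "HDD") for tile, n in counts.items()}
    print(placement)   # e.g. {'tileA': 'SSD', 'tileB': 'HDD', 'tileC': 'HDD'}
    ```
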
  6. Cloud Computing Governance Lifecycle

    OpenAIRE

    Soňa Karkošková; George Feuerlicht

    2016-01-01

    Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges such as need of business process redefinition, establishment of specialized governance and management, organizational structures and relationships with external providers and managing new types of risk arising from dependency on external providers. There is a general consensus that cloud computing in addition to challenges brings many benefits but it is uncle...

  7. Progress report 1971/72

    International Nuclear Information System (INIS)

    1972-01-01

    The progress report comprises reports from interdisciplinary task groups on radiation protection, isotope application and radiation measurement technique, microscopy, linear accelerators, process computers, biophysics and nuclear physics. The last task group reports on work with the electron linear accelerator, nuclear spectroscopy, neutron physics, work with polarized particles, and experiments with the GSI heavy ion accelerator. (orig./AK) [de

  8. Childhood Obesity – 2010: Progress and Challenges

    Science.gov (United States)

    Han, Joan C.; Lawlor, Debbie A.; Kimm, Sue Y.S.

    2010-01-01

    Summary The worldwide prevalence of childhood obesity has increased greatly over the past 3 decades. The increasing occurrence in children of disorders, such as type 2 diabetes, is believed to be a consequence of this obesity epidemic. Much progress has been made in understanding the genetics and physiology of appetite control and from this, the elucidation of the causes of some rare obesity syndromes. However, these rare disorders have so far taught us only limited lessons on how to prevent or reverse obesity in most children. Calorie intake and activity recommendations need to be re-assessed and better quantified, on a population level, given the more sedentary life of children today. For individual treatment, the currently recommended calorie prescriptions may be too conservative given the evolving insight on the “energy gap.” Whilst quality of research in both prevention and treatment has improved, there is still a need for high-quality multi-centre trials with long-term follow-up. Meanwhile, prevention and treatment approaches that aim to increase energy expenditure and decrease intake need to continue. Most recently, the spiralling increase in obesity prevalence may be abating for children. Thus, even greater efforts need to be made on all fronts to continue this potentially exciting trend. PMID:20451244

  9. Nationwide Natural Resource Inventory of the Philippines Using Lidar: Strategies, Progress, and Challenges

    Science.gov (United States)

    Blanco, A. C.; Tamondong, A.; Perez, A. M.; Ang, M. R. C.; Paringit, E.; Alberto, R.; Alibuyog, N.; Aquino, D.; Ballado, A.; Garcia, P.; Japitana, M.; Ignacio, M. T.; Macandog, D.; Novero, A.; Otadoy, R. E.; Regis, E.; Rodriguez, M.; Silapan, J.; Villar, R.

    2016-06-01

    The Philippines has embarked on a detailed nationwide natural resource inventory using LiDAR through the Phil-LiDAR 2 Program. This 3-year program has developed and has been implementing mapping methodologies and protocols to produce high-resolution maps of agricultural, forest, coastal marine, hydrological features, and renewable energy resources. The Program has adopted strategies on system and process development, capacity building and enhancement, and expanding the network of collaborations. These strategies include training programs (on point cloud and image processing, GIS, and field surveys), workshops, forums, and colloquiums (program-wide, cluster-based, and project-based), and collaboration with partner national government agencies and other organizations. In place is a cycle of training, implementation, and feedback in order to continually improve the system and processes. To date, the Program has achieved progress in the development of workflows and in rolling out products such as resource maps and GIS data layers, which are indispensable in planning and decision-making. Challenges remain in speeding up output production (including quality checks) and in ensuring sustainability given the short duration of the program. Enhancements in the workflows and protocols have been incorporated to address data quality and data availability issues. More training has been conducted for project staff hired to address human resource gaps. Collaborative arrangements with more partners are being established. To attain sustainability, the Program is developing and instituting a system of training, data updating and sharing, information utilization, and feedback. This requires collaboration and cooperation of the government agencies, LGUs, universities, other organizations, and the communities.

  10. Data Access, Interoperability and Sustainability: Key Challenges for the Evolution of Science Capabilities

    Science.gov (United States)

    Walton, A. L.

    2015-12-01

    In 2016, the National Science Foundation (NSF) will support a portfolio of activities and investments focused upon challenges in data access, interoperability, and sustainability. These topics are fundamental to science questions of increasing complexity that require multidisciplinary approaches and expertise. Progress has become tractable because of (and sometimes complicated by) unprecedented growth in data (both simulations and observations) and rapid advances in technology (such as instrumentation in all aspects of the discovery process, together with ubiquitous cyberinfrastructure to connect, compute, visualize, store, and discover). The goal is an evolution of capabilities for the research community based on these investments, scientific priorities, technology advances, and policies. Examples from multiple NSF directorates, including investments by the Advanced Cyberinfrastructure Division, are aimed at these challenges and can provide the geosciences research community with models and opportunities for participation. Implications for the future are highlighted, along with the importance of continued community engagement on key issues.

  11. Cloud ice: A climate model challenge with signs and expectations of progress

    Science.gov (United States)

    Waliser, Duane E.; Li, Jui-Lin F.; Woods, Christopher P.; Austin, Richard T.; Bacmeister, Julio; Chern, Jiundar; Del Genio, Anthony; Jiang, Jonathan H.; Kuang, Zhiming; Meng, Huan; Minnis, Patrick; Platnick, Steve; Rossow, William B.; Stephens, Graeme L.; Sun-Mack, Szedung; Tao, Wei-Kuo; Tompkins, Adrian M.; Vane, Deborah G.; Walker, Christopher; Wu, Dong

    2009-04-01

    Present-day shortcomings in the representation of upper tropospheric ice clouds in general circulation models (GCMs) lead to errors in weather and climate forecasts as well as account for a source of uncertainty in climate change projections. An ongoing challenge in rectifying these shortcomings has been the availability of adequate, high-quality, global observations targeting ice clouds and related precipitating hydrometeors. In addition, the inadequacy of the modeled physics and the often disjointed nature between model representation and the characteristics of the retrieved/observed values have hampered GCM development and validation efforts from making effective use of the measurements that have been available. Thus, even though parameterizations in GCMs accounting for cloud ice processes have, in some cases, become more sophisticated in recent years, this development has largely occurred independently of the global-scale measurements. With the relatively recent addition of satellite-derived products from Aura/Microwave Limb Sounder (MLS) and CloudSat, there are now considerably more resources with new and unique capabilities to evaluate GCMs. In this article, we illustrate the shortcomings evident in model representations of cloud ice through a comparison of the simulations assessed in the Intergovernmental Panel on Climate Change Fourth Assessment Report, briefly discuss the range of global observational resources that are available, and describe the essential components of the model parameterizations that characterize their "cloud" ice and related fields. Using this information as background, we (1) discuss some of the main considerations and cautions that must be taken into account in making model-data comparisons related to cloud ice, (2) illustrate present progress and uncertainties in applying satellite cloud ice (namely from MLS and CloudSat) to model diagnosis, (3) show some indications of model improvements, and finally (4) discuss a number of

  12. Canada's climate change voluntary challenge and registry program : 6. annual progress report

    International Nuclear Information System (INIS)

    2000-10-01

    Suncor Energy Inc., a Canadian integrated energy company, comprises a corporate group, three operating business units, and two emerging businesses. This annual Progress Report for Canada's Climate Change Voluntary Challenge and Registry (VCR) Program represents the sixth for this company. Suncor is committed to sustainable development. Initiatives undertaken by Suncor in 1999 included the Oil Sands Project Millennium, which will more than double current production of crude oil and fuel products by 2002. Suncor is divesting conventional oil properties in order to concentrate on exploration and production of natural gas. Alternative and renewable energy will see an investment of 100 million over the next five years. The money will be allocated to research and development, the production of fuels from biomass, and conversion of municipal solid waste to energy through the recovery of methane from landfills. Since 1990, carbon dioxide emissions have been reduced to 14 per cent below 1990 levels, a reduction of 622,000 tonnes of greenhouse gases. A comprehensive tracking, reporting, and management system for greenhouse gases was implemented. Ongoing improvements in quality and comprehensiveness have validated the methodology used to monitor emissions inventories and sources. Initiatives to raise internal and external awareness of greenhouse gases were implemented, such as speaking engagements at climate change activities, the retrofit of schools with advanced energy-efficient technology, education programs, employee suggestion programs, etc. Collaboration with external partners on research and development projects represents a major building block in this approach. Some of the research and development projects involve the development of advanced carbon dioxide capture and geologic sequestration technologies, work on the production of alternative and renewable energy from Canadian municipal landfills, and the study of a new process to extract heavy

  13. Knowledge-based computer systems for radiotherapy planning.

    Science.gov (United States)

    Kalet, I J; Paluszynski, W

    1990-08-01

    Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs that simulate anything the machinery can do, we now face a challenge to utilize this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning already has indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.

  14. Computational Design of Urban Layouts

    KAUST Repository

    Wonka, Peter

    2015-10-07

    A fundamental challenge in computational design is to compute layouts by arranging a set of shapes. In this talk I will present recent urban modeling projects with applications in computer graphics, urban planning, and architecture. The talk will look at different scales of urban modeling (streets, floorplans, parcels). A challenge common to all these modeling problems is the set of functional and aesthetic constraints that should be respected. The talk also highlights interesting links to geometry processing problems, such as field design and quad meshing.

  15. Portable computers - portable operating systems

    International Nuclear Information System (INIS)

    Wiegandt, D.

    1985-01-01

    Hardware development has made rapid progress over the past decade. Computers used to have attributes like "general purpose" or "universal"; nowadays they are labelled "personal" and "portable". Recently, a major manufacturing company started marketing a portable version of their personal computer. But even for these small computers the old truth still holds that the biggest disadvantage of a computer is that it must be programmed; hardware by itself does not make a computer. (orig.)

  16. Advanced Technologies, Embedded and Multimedia for Human-Centric Computing

    CERN Document Server

    Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013

    2014-01-01

    The theme of HumanCom and EMC is focused on the various aspects of human-centric computing for advances in computer science and its applications, embedded and multimedia computing, and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. The theme of EMC (Advances in Embedded and Multimedia Computing) is focused on the various aspects of embedded systems, smart grid, cloud, and multimedia computing, and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of embedded and multimedia computing. This book therefore includes the various theories and practical applications in human-centric computing and embedded and multimedia computing.

  17. FY13 Annual Progress Report for SECA Core Technology Program

    Energy Technology Data Exchange (ETDEWEB)

    Stevenson, Jeffry W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Koeppel, Brian J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-01-31

    This progress report covers technical work performed during fiscal year 2013 at PNNL under Field Work Proposal (FWP) 40552. The report highlights and documents technical progress in tasks related to advanced cell and stack component materials development and computational design and simulation.

  18. Case management considerations of progressive dementia in a home setting.

    Science.gov (United States)

    Pierce, Mary Ellen

    2010-01-01

    Nursing theory, research, and best practice guidelines contribute substantially to the field of dementia care. Interventional plans are challenged most by those dementias considered progressive and deteriorative in nature, requiring ongoing reassessment and modification of care practices as the clinical course changes. The purpose of this article is to provide guidelines for case managers in the development of effective, individualized care plans for clients with progressive dementia residing in a home setting. The application of these guidelines is illustrated through the presentation of an actual case. The practice setting is a private home in the Pacific Northwest. Geriatric case management is provided by an RN case manager. Progressive dementia presents challenges to home care. Professional case management using comprehensive, holistic assessment, collaborative approaches, and best practice fundamentals serve to create an effective, individualized plan of care. The increasing geriatric population presents great opportunities for case managers in strategic management for creating successful home care models in clients with progressive dementia. Use of nursing diagnoses, dementia research, and collaborative approaches with families and other medical providers creates a viable alternative for clients with progressive dementia.

  19. Challenges: a state and compact perspective

    International Nuclear Information System (INIS)

    Brown, H.

    1987-01-01

    The challenges facing states and compacts in their efforts to implement the Low-Level Waste Policy Amendments Act are described. Institutional challenges include: small-volume sites; compact maintenance; shifting agencies and changing personnel; and timing of progress. The technical challenge lies in the enormous number of plans, procedures, and regulations that have to be developed over the next four years. There are two main fiscal challenges: funding of day-to-day operations of compact commissions; and financing the siting and construction of new disposal sites. There are also two main regulatory challenges: host states must develop regulations for siting and selection of technology; and all states have to await federal regulations to be completed. The final challenge is political: whether a region can overcome public opposition and actually site a facility

  20. Workshop and conference on Grand Challenges applications and software technology

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-31

    On May 4--7, 1993, nine federal agencies sponsored a four-day meeting on Grand Challenge applications and software technology. The objective was to bring High-Performance Computing and Communications (HPCC) Grand Challenge applications research groups supported under the federal HPCC program together with HPCC software technologists to: discuss multidisciplinary computational science research issues and approaches, identify major technology challenges facing users and providers, and refine software technology requirements for Grand Challenge applications research. The first day and a half focused on applications. Presentations were given by speakers from universities, national laboratories, and government agencies actively involved in Grand Challenge research. Five areas of research were covered: environmental and earth sciences; computational physics; computational biology, chemistry, and materials sciences; computational fluid and plasma dynamics; and applications of artificial intelligence. The next day and a half was spent in working groups in which the applications researchers were joined by software technologists. Nine breakout sessions took place: I/O, Data, and File Systems; Parallel Programming Paradigms; Performance Characterization and Evaluation of Massively Parallel Processing Applications; Program Development Tools; Building Multidisciplinary Applications; Algorithms and Libraries I; Algorithms and Libraries II; Graphics and Visualization; and National HPCC Infrastructure.

  1. Math bytes Google bombs, chocolate-covered Pi, and other cool bits in computing

    CERN Document Server

    Chartier, Tim

    2014-01-01

    This book provides a fun, hands-on approach to learning how mathematics and computing relate to the world around us and help us to better understand it. How can reposting on Twitter kill a movie's opening weekend? How can you use mathematics to find your celebrity look-alike? What is Homer Simpson's method for disproving Fermat's Last Theorem? Each topic in this refreshingly inviting book illustrates a famous mathematical algorithm or result--such as Google's PageRank and the traveling salesman problem--and the applications grow more challenging as you progress through the chapters. But don't

  2. A hybrid online scheduling mechanism with revision and progressive techniques for autonomous Earth observation satellite

    Science.gov (United States)

    Li, Guoliang; Xing, Lining; Chen, Yingwu

    2017-11-01

    The autonomy of self-scheduling on Earth observation satellites and the increasing scale of satellite networks have attracted much attention from researchers in recent decades. In practice, the limited onboard computational resource presents a challenge for online scheduling algorithms. This study considered the online scheduling problem for a single autonomous Earth observation satellite within a satellite network environment. It especially addressed urgent tasks that arrive stochastically during the scheduling horizon. We described the problem and proposed a hybrid online scheduling mechanism with revision and progressive techniques to solve it. The mechanism includes two decision policies: a when-to-schedule policy combining periodic scheduling and critical cumulative number-based event-driven rescheduling, and a how-to-schedule policy combining progressive and revision approaches to accommodate two categories of task, normal tasks and urgent tasks. We then developed two heuristic (re)scheduling algorithms and compared them with other commonly used techniques. Computational experiments indicated that the percentage of urgent tasks taken into the schedule under the proposed mechanism is much higher than under a periodic scheduling mechanism, and that the specific performance depends strongly on several mechanism-relevant and task-relevant factors. For online scheduling, the modified weighted shortest imaging time first and dynamic profit system benefit heuristics outperformed the others on total profit and the percentage of successfully scheduled urgent tasks.
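
    The "weighted shortest imaging time first" idea referenced in the record can be illustrated with a very simplified greedy sketch (single resource, toy task data; this is an assumption-laden simplification, not the authors' algorithm):

        # Greedy heuristic: order tasks by profit per unit imaging time and pack
        # them into the remaining horizon. Task data are invented for illustration.

        def schedule(tasks, horizon):
            """tasks: list of (name, profit, imaging_time); returns accepted task names."""
            accepted, used = [], 0.0
            ranked = sorted(tasks, key=lambda t: t[1] / t[2], reverse=True)
            for name, profit, imaging_time in ranked:
                if used + imaging_time <= horizon:
                    accepted.append(name)
                    used += imaging_time
            return accepted

        tasks = [("normal-1", 5.0, 30.0), ("urgent-1", 9.0, 20.0),
                 ("normal-2", 4.0, 25.0), ("urgent-2", 7.0, 15.0)]
        print(schedule(tasks, horizon=60.0))  # short, high-profit (urgent) tasks win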

  3. Progress Toward Affordable High Fidelity Combustion Simulations Using Filtered Density Functions for Hypersonic Flows in Complex Geometries

    Science.gov (United States)

    Drozda, Tomasz G.; Quinlan, Jesse R.; Pisciuneri, Patrick H.; Yilmaz, S. Levent

    2012-01-01

    Significant progress has been made in the development of subgrid scale (SGS) closures based on a filtered density function (FDF) for large eddy simulations (LES) of turbulent reacting flows. The FDF is the counterpart of the probability density function (PDF) method, which has proven effective in Reynolds averaged simulations (RAS). However, while systematic progress is being made advancing the FDF models for relatively simple flows and lab-scale flames, the application of these methods in complex geometries and high speed, wall-bounded flows with shocks remains a challenge. The key difficulties are the significant computational cost associated with solving the FDF transport equation and numerically stiff finite rate chemistry. For LES/FDF methods to make a more significant impact in practical applications, a pragmatic approach must be taken that significantly reduces the computational cost while maintaining high modeling fidelity. An example of one such ongoing effort is at the NASA Langley Research Center, where the first-generation FDF models, namely the scalar filtered mass density function (SFMDF), are being implemented into VULCAN, a production-quality RAS and LES solver widely used for design of high speed propulsion flowpaths. This effort leverages internal and external collaborations to reduce the overall computational cost of high fidelity simulations in VULCAN by: implementing high order methods that allow reduction in the total number of computational cells without loss in accuracy; implementing a first generation of high fidelity scalar PDF/FDF models applicable to high-speed compressible flows; coupling RAS/PDF and LES/FDF into a hybrid framework to efficiently and accurately model the effects of combustion in the vicinity of the walls; developing efficient Lagrangian particle tracking algorithms to support robust solutions of the FDF equations for high speed flows; and utilizing finite rate chemistry parametrization, such as flamelet models, to reduce

  4. Progress and challenges of the bioartificial pancreas

    Science.gov (United States)

    Hwang, Patrick T. J.; Shah, Dishant K.; Garcia, Jacob A.; Bae, Chae Yun; Lim, Dong-Jin; Huiszoon, Ryan C.; Alexander, Grant C.; Jun, Ho-Wook

    2016-11-01

    Pancreatic islet transplantation has been validated as a treatment for type 1 diabetes since it maintains consistent and sustained type 1 diabetes reversal. However, one of the major challenges in pancreatic islet transplantation is the body's natural immune response to the implanted islets. Immunosuppressive drug treatment is the most popular immunomodulatory approach for islet graft survival. However, administration of immunosuppressive drugs gives rise to negative side effects, and long-term effects are not clearly understood. A bioartificial pancreas is a therapeutic approach to enable pancreatic islet transplantation without or with minimal immune suppression. The bioartificial pancreas encapsulates the pancreatic islets in a semi-permeable environment which protects islets from the body's immune responses, while allowing the permeation of insulin, oxygen, nutrients, and waste. Many groups have developed various types of the bioartificial pancreas and tested their efficacy in animal models. However, the clinical application of the bioartificial pancreas still requires further investigation. In this review, we discuss several types of bioartificial pancreases and address their advantages and limitations. We also discuss recent advances in bioartificial pancreas applications with microfluidic or micropatterning technology.

  5. Automation of multi-agent control for complex dynamic systems in heterogeneous computational network

    Science.gov (United States)

    Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan

    2017-01-01

    The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing give special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue. Control systems that operate in networks are especially affected by this issue. We propose a new approach to multi-agent management for scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automation of the problem solving. Advantages of the proposed approach are demonstrated on the example of parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control for the systems in a parallel mode with various degrees of detailed elaboration.

  6. The Waste Treatment Plant, a Work in Progress

    International Nuclear Information System (INIS)

    Hamel, W. F. Jr.; Duncan, G. M.

    2006-01-01

    There are many challenges in the design and construction of the Department of Energy's (DOE) Waste Treatment and Immobilization Plant (WTP) at the Hanford site. The plant is being built to process some 55 million gallons of radioactive waste from 177 underground tanks. Engineering and construction are progressing on this largest project in the DOE complex. This paper describes some of WTP's principal recent challenges and opportunities and how they are being addressed to minimize impact on the project, enhance the capabilities of the facilities, and reduce risk. A significant new development in 2005 was the need to account for higher seismic accelerations than originally specified for the facility structures and equipment. Efforts have centered on continuing design and construction with minimal risk while the final seismic design spectra were developed. Other challenges include development of an alternative cesium ion exchange resin to minimize the risk from reliance on a single product, implementing advanced analytical techniques to improve laboratory performance, adopting a thinner walled high level waste (HLW) canister to reduce waste volume and mission duration, and commissioning a comprehensive external flowsheet review of the design, along with its underpinning technologies, and projected plant operability. These challenges make it clear that WTP is a work in progress, but the challenges are being successfully resolved as the design and construction move on to completion. (authors)

  7. Private quantum computation: an introduction to blind quantum computing and related protocols

    Science.gov (United States)

    Fitzsimons, Joseph F.

    2017-06-01

    Quantum technologies hold the promise of not only faster algorithmic processing of data, via quantum computation, but also of more secure communications, in the form of quantum cryptography. In recent years, a number of protocols have emerged which seek to marry these concepts for the purpose of securing computation rather than communication. These protocols address the task of securely delegating quantum computation to an untrusted device while maintaining the privacy, and in some instances the integrity, of the computation. We present a review of the progress to date in this emerging area.

  8. A Blood Test for Alzheimer's Disease: Progress, Challenges, and Recommendations.

    Science.gov (United States)

    Kiddle, Steven J; Voyle, Nicola; Dobson, Richard J B

    2018-03-29

    Ever since the discovery of APOEɛ4 around 25 years ago, researchers have been excited about the potential of a blood test for Alzheimer's disease (AD). Since then researchers have looked for genetic, protein, metabolite, and/or gene expression markers of AD and related phenotypes. However, no blood test for AD is yet being used in the clinical setting. We first review the trends and challenges in AD blood biomarker research, before giving our personal recommendations to help researchers overcome these challenges. While some degree of consistency and replication has been seen across independent studies, several high-profile studies have seemingly failed to replicate. Partly due to academic incentives, there is a reluctance in the field to report predictive ability, to publish negative findings, and to independently replicate the work of others. If this can be addressed, then we will know sooner whether a blood test for AD or related phenotypes with clinical utility can be developed.

  9. Efficient computation of hashes

    International Nuclear Information System (INIS)

    Lopes, Raul H C; Franqueira, Virginia N L; Hobson, Peter R

    2014-01-01

    The sequential computation of hashes, which lies at the core of many distributed storage systems and is found, for example, in grid services, can hinder efficiency in service quality and even pose security challenges that can only be addressed by the use of parallel hash tree modes. The main contributions of this paper are, first, the identification of several efficiency and security challenges posed by the use of sequential hash computation based on the Merkle-Damgård engine. In addition, alternatives for the parallel computation of hash trees are discussed, and a prototype for a new parallel implementation of the Keccak function, the SHA-3 winner, is introduced.
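
    The hash-tree alternative to sequential Merkle-Damgård chaining can be sketched as a binary Merkle tree over fixed-size chunks, where each leaf can be hashed independently and hence in parallel. The snippet below uses Python's built-in SHA3-256 for illustration and is not the paper's Keccak tree-mode prototype:

        import hashlib

        def sha3(data: bytes) -> bytes:
            return hashlib.sha3_256(data).digest()

        def merkle_root(data: bytes, chunk_size: int = 1 << 20) -> bytes:
            """Hash chunks independently, then combine pairwise up to a single root."""
            chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
            level = [sha3(c) for c in chunks] or [sha3(b"")]
            while len(level) > 1:
                if len(level) % 2:  # duplicate the last node on odd-sized levels
                    level.append(level[-1])
                level = [sha3(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
            return level[0]

        print(merkle_root(b"example payload" * 100000).hex())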

  10. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC. Quarterly report January through March 2011. Year 1 Quarter 2 progress report.

    Energy Technology Data Exchange (ETDEWEB)

    Lottes, S. A.; Kulak, R. F.; Bojanowski, C. (Energy Systems)

    2011-05-19

    This project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at the Turner-Fairbank Highway Research Center for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water loads on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of January through March 2011.

  11. Progressive neuronal degeneration of childhood with liver disease

    International Nuclear Information System (INIS)

    Kendall, B.E.; Boyd, S.G.; Egger, J.; Harding, B.N.

    1987-01-01

    The clinical, electrophysiological and neuroradiological features of thirteen patients suffering from progressive neuronal degeneration of childhood with liver failure are presented. The disease commonly presents very early in life with progressive mental retardation, followed by intractable epilepsy, and should be suspected clinically especially if there is a family history of similar disorder in a sibling. On computed tomography there are low density regions, particularly in the occipital and posterior temporal lobes, involving both cortex and white matter, combined with or followed by progressive atrophy. Typical EEG findings may be confirmatory. (orig.)

  12. Can eye tracking boost usability evaluation of computer games?

    DEFF Research Database (Denmark)

    Johansen, Sune Alstrup; Noergaard, Mie; Soerensen, Janus Rau

    2008-01-01

    Good computer games need to be challenging while at the same time being easy to use. Accordingly, besides struggling with well known challenges for usability work, such as persuasiveness, the computer game industry also faces system-specific challenges, such as identifying methods that can provide data on players' attention during a game. This position paper discusses how eye tracking may address three core challenges faced by computer game producer IO Interactive in their on-going work to ensure games that are fun, usable, and challenging. These challenges are: (1) Persuading game designers about the relevance of usability results, (2) involving game designers in usability work, and (3) identifying methods that provide new data about user behaviour and experience.

  13. Exact computation and large angular momentum asymptotics of 3nj symbols: Semiclassical disentangling of spin networks

    International Nuclear Information System (INIS)

    Anderson, Roger W.; Aquilanti, Vincenzo; Silva Ferreira, Cristiane da

    2008-01-01

    Spin networks, namely, the 3nj symbols of quantum angular momentum theory and their generalizations to groups other than SU(2) and to quantum groups, permeate many areas of pure and applied science. The issues of their computation and characterization for large values of their entries are a challenge for diverse fields, such as spectroscopy and quantum chemistry, molecular and condensed matter physics, quantum computing, and the geometry of space time. Here we record progress both in their efficient calculation and in the study of the large j asymptotics. For the 9j symbol, a prototypical entangled network, we present and extensively check numerically formulas that illustrate the passage to the semiclassical limit, manifesting both the occurrence of disentangling and the discrete-continuum transition.
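
    For readers who want to reproduce small cases numerically, exact spin-network symbols can be evaluated with standard computer-algebra tools; the snippet below uses SymPy's wigner_9j routine (an off-the-shelf function, not the authors' algorithm, and the arguments are arbitrary illustrative choices):

        # Exact evaluation of a small 9j symbol with SymPy (sympy.physics.wigner).
        from sympy.physics.wigner import wigner_9j

        value = wigner_9j(1, 1, 1, 1, 1, 1, 1, 1, 0)  # entries chosen to satisfy the triangle rules
        print(value, float(value))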

  14. Further improvement in ABWR (part-4) open distributed plant process computer system

    International Nuclear Information System (INIS)

    Makino, Shigenori; Hatori, Yoshinori

    1999-01-01

    In the nuclear industry of Japan, the electric power companies have promoted plant process computer (PPC) technology for nuclear power plants (NPPs). When the PPC was first introduced to NPPs, large-scale customized computers were applied because of very tight requirements such as high reliability and high-speed processing. In the recent computer field, the large market has contributed to remarkable progress in engineering workstation (EWS) and personal computer (PC) technology. Moreover, because data transmission technology has been progressing at the same time, worldwide computer networks have been established. Thanks to the progress of both technologies, distributed computer systems can now be established at reasonable price. Tokyo Electric Power Company (TEPCO) is therefore trying to apply such a system to the PPC of NPPs. (author)

  15. Multicore Programming Challenges

    Science.gov (United States)

    Perrone, Michael

    The computer industry is facing fundamental challenges that are driving a major change in the design of computer processors. Due to restrictions imposed by quantum physics, one historical path to higher computer processor performance - by increased clock frequency - has come to an end. Increasing clock frequency now leads to power consumption costs that are too high to justify. As a result, we have seen in recent years that the processor frequencies have peaked and are receding from their high point. At the same time, competitive market conditions are giving business advantage to those companies that can field new streaming applications, handle larger data sets, and update their models to market conditions faster. The desire for newer, faster and larger is driving continued demand for higher computer performance.

  16. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    Final technical report (in-house effort covering September 2009 to September 2012), May 2013. Approved for public release; distribution unlimited. The recoverable abstract fragments contrast two neuromorphic computing paradigms, few sensors with complex computations versus many sensors with simple computations, and note challenges with nano-enabled neuromorphic chips; the remainder of the record consists of report cover-sheet metadata.

  17. Assessing the Global Development Agenda (Goal 1) in Uganda: The Progress Made and the Challenges that Persist

    Directory of Open Access Journals (Sweden)

    E. A. Ndaguba

    2016-12-01

    Full Text Available The international development agenda (2000-2015) that was hailed in Uganda was unsuccessful and powerless in elevating individuals and groups to a place of comfort through the achievement of the MDGs. Hence, according to a survey by the Directorate of Social Protection in 2012, 67% of the citizens of Uganda are either highly vulnerable to remaining in poverty or are poor. This study therefore assesses the gains of the global development agenda (2000-2015) in Uganda. The study relies heavily on review papers, secondary datasets and material, and a quasi-quantitative method in analyzing the research aim. Results show that the ambiguous and unrealistic targets of the MDGs did not take into cognizance the structures, institutions, and interaction of systems and governance issues in Uganda. Despite this, the gains were also shortchanged as a result of drought, flood, and high commodity prices due to low farm production in most (rural) areas of Uganda. In addition to drought and the negative effects of climate change, other challenges include deficient access to markets and marketplaces, lack of motorized and non-motorized load-carrying wheel vehicles, lack of capacity and infrastructure, lack of mechanized farming implements, and lack of access to credit, all of which reduced the potency of the achievement of most of its goals. However, significant strides were made and the country was able to achieve several targets, which are worth celebrating. The study contends that the realization of the SDGs will only be wishful thinking if challenges of rural poverty, governance and institutions are not put in check. Shared progress and prosperity as acclaimed by the World Bank will never be visible in Uganda.

  18. Development of computational fluid dynamics--habitat suitability (CFD-HSI) models to identify potential passage--Challenge zones for migratory fishes in the Penobscot River

    Science.gov (United States)

    Haro, Alexander J.; Dudley, Robert W.; Chelminski, Michael

    2012-01-01

    A two-dimensional computational fluid dynamics-habitat suitability (CFD–HSI) model was developed to identify potential zones of shallow depth and high water velocity that may present passage challenges for five anadromous fish species in the Penobscot River, Maine, upstream from two existing dams and as a result of the proposed future removal of the dams. Potential depth-challenge zones were predicted for larger species at the lowest flow modeled in the dam-removal scenario. Increasing flows under both scenarios increased the number and size of potential velocity-challenge zones, especially for smaller species. This application of the two-dimensional CFD–HSI model demonstrated its capabilities to estimate the potential effects of flow and hydraulic alteration on the passage of migratory fish.
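
    A minimal sketch of how such challenge zones can be flagged from gridded model output (the depth and velocity thresholds below are hypothetical placeholders for one species' passage limits, not values from the study):

        import numpy as np

        # Toy gridded output from a 2-D hydraulic model: depth in m, velocity in m/s.
        depth    = np.array([[0.4, 1.2, 2.0], [0.2, 0.9, 1.5], [0.1, 0.6, 1.1]])
        velocity = np.array([[2.8, 1.5, 0.9], [3.1, 2.2, 1.0], [3.4, 2.6, 1.3]])

        MIN_DEPTH, MAX_VELOCITY = 0.3, 2.5  # hypothetical passage thresholds

        depth_challenge = depth < MIN_DEPTH
        velocity_challenge = velocity > MAX_VELOCITY

        print("depth-challenge cells:", int(depth_challenge.sum()))
        print("velocity-challenge cells:", int(velocity_challenge.sum()))
        print("either challenge:", int((depth_challenge | velocity_challenge).sum()))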

  19. Biomedical Visual Computing: Case Studies and Challenges

    KAUST Repository

    Johnson, Christopher

    2012-01-01

    Advances in computational geometric modeling, imaging, and simulation let researchers build and test models of increasing complexity, generating unprecedented amounts of data. As recent research in biomedical applications illustrates, visualization will be critical in making this vast amount of data usable; it's also fundamental to understanding models of complex phenomena. © 2012 IEEE.

  20. Progress report on nuclear science and technology in China (Vol.3). Proceedings of academic annual meeting of China Nuclear Society in 2013, No.6--computational physics sub-volume

    International Nuclear Information System (INIS)

    2014-05-01

    Progress report on nuclear science and technology in China (Vol. 3) includes 13 articles presented at the third national academic annual meeting of the China Nuclear Society. There are 10 volumes in total; this is the sixth, the computational physics sub-volume.

  1. Biomedical Visual Computing: Case Studies and Challenges

    KAUST Repository

    Johnson, Christopher

    2012-01-01

    Advances in computational geometric modeling, imaging, and simulation let researchers build and test models of increasing complexity, generating unprecedented amounts of data. As recent research in biomedical applications illustrates, visualization will be critical in making this vast amount of data usable; it's also fundamental to understanding models of complex phenomena. © 2012 IEEE.

  2. Security Problems in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Rola Motawie

    2016-12-01

    Full Text Available Cloud is a pool of computing resources which are distributed among cloud users. Cloud computing has many benefits like scalability, flexibility, cost savings, reliability, maintenance and mobile accessibility. Since cloud-computing technology is growing day by day, it comes with many security problems. Securing the data in the cloud environment is one of the most critical challenges and acts as a barrier when implementing the cloud. Many new concepts that the cloud introduces, such as resource sharing, multi-tenancy, and outsourcing, create new challenges for the security community. In this work, we provide a comparative study of cloud computing privacy and security concerns. We identify and classify known security threats, cloud vulnerabilities, and attacks.

  3. Phenomenological analyses and their application to the Defense Waste Processing Facility probabilistic safety analysis accident progression event tree. Revision 1

    International Nuclear Information System (INIS)

    Kalinich, D.A.; Thomas, J.K.; Gough, S.T.; Bailey, R.T.; Kearnaghan, D.P.

    1995-01-01

    In the Defense Waste Processing Facility (DWPF) Safety Analysis Reports (SARs) for the Savannah River Site (SRS), risk-based perspectives have been included per US Department of Energy (DOE) Order 5480.23. The NUREG-1150 Level 2/3 Probabilistic Risk Assessment (PRA) methodology was selected as the basis for calculating facility risk. The backbone of this methodology is the generation of an Accident Progression Event Tree (APET), which is solved using the EVNTRE computer code. To support the development of the DWPF APET, deterministic modeling of accident phenomena was necessary. From these analyses, (1) accident progressions were identified for inclusion into the APET; (2) branch point probabilities and any attendant parameters were quantified; and (3) the radionuclide releases to the environment from accidents were determined. The phenomena of interest for accident progressions included explosions, fires, a molten glass spill, and the response of the facility confinement system during such challenges. A variety of methodologies, from hand calculations to large system-model codes, were used in the evaluation of these phenomena
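
    The accident progression event tree idea can be illustrated with a toy enumeration of branch outcomes and probabilities (the questions and values below are invented for illustration and bear no relation to the actual DWPF APET or EVNTRE input):

        from itertools import product

        # Each "question" has (outcome, probability) branches; hypothetical values.
        questions = [
            [("explosion", 0.02), ("no-explosion", 0.98)],
            [("confinement-fails", 0.10), ("confinement-holds", 0.90)],
        ]

        # Enumerate every accident progression path and its probability.
        for path in product(*questions):
            probability = 1.0
            for _, branch_probability in path:
                probability *= branch_probability
            print(" -> ".join(outcome for outcome, _ in path), f"{probability:.4f}")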

  4. Accomplishments and challenges of the severe accident research

    International Nuclear Information System (INIS)

    Sehgal, B.R.

    2001-01-01

    This paper briefly describes the progress of the severe accident research since 1980, in terms of the accomplishments made so far and the challenges that remain. Much has been accomplished: many important safety issues have been resolved and consensus is near on some others. However, some of the previously identified safety issues remain as challenges, while some new ones have arisen due to the shift in focus from containment to vessel integrity. New reactor designs have also created some new challenges. In general, the regulatory demands for new reactor designs are stricter, thereby requiring much greater attention to the safety issues concerned with the containment design of the new large reactors, and to the accident management procedures for mitigating the consequences of a severe accident. We apologize for not providing references to many fine investigations that contributed to the great progress made so far in the severe accident research

  5. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  6. Uterine sarcomas-Recent progress and future challenges

    International Nuclear Information System (INIS)

    Seddon, Beatrice M.; Davda, Reena

    2011-01-01

    Uterine sarcomas are a group of rare tumours that provide considerable challenges in their treatment. Radiological diagnosis prior to hysterectomy is difficult, with the diagnosis frequently made post-operatively. Current staging systems have been unsatisfactory, although a new FIGO staging system specifically for uterine sarcomas has now been introduced, and may allow better grouping of patients according to expected prognosis. While the mainstay of treatment of early disease is a total abdominal hysterectomy, it is less clear whether routine oophorectomy or lymphadenectomy is necessary. Adjuvant pelvic radiotherapy may improve local tumour control in high risk patients, but is not associated with an overall survival benefit. Similarly there is no good evidence for the routine use of adjuvant chemotherapy. For advanced leiomyosarcoma, newer chemotherapy agents including gemcitabine and docetaxel, and trabectedin, offer some promise, while hormonal therapies appear to be more useful in endometrial stromal sarcoma. Novel targeted agents are now being introduced for sarcomas, and uterine sarcomas, and show some indications of activity. Non-pharmacological treatments, including surgical metastatectomy, radiofrequency ablation, and CyberKnife radiotherapy, are important additions to systemic therapy for advanced metastatic disease.

  7. Progress in the Gondwanan Carboniferous–Permian palynology ...

    Indian Academy of Sciences (India)

    The first section describes challenges in the Carboniferous–Permian Gondwanan stratigraphic palynology, and progress in techniques such as presence of the 'rare-marine intervals', and 'radiometric dating' in some Gondwanan successions, e.g., South Africa, Australia and South America, as tools to confidently calibrate ...

  8. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, in computing Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
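
    As an illustration of the general idea (not the paper's exact formulation), a mixed Nash equilibrium of a two-player game can be located by minimizing a nonnegative "total deviation gain" function with an off-the-shelf evolutionary optimizer; the sketch below uses SciPy's differential evolution on matching pennies, whose unique equilibrium is (0.5, 0.5):

        import numpy as np
        from scipy.optimize import differential_evolution

        # Matching pennies: A is the row player's payoff matrix, B the column player's.
        A = np.array([[1.0, -1.0], [-1.0, 1.0]])
        B = -A

        def deviation_gain(z):
            """Nonnegative function; its global minima (value 0) are Nash equilibria."""
            x = np.array([z[0], 1.0 - z[0]])  # row player's mixed strategy
            y = np.array([z[1], 1.0 - z[1]])  # column player's mixed strategy
            row_gain = max(0.0, np.max(A @ y) - x @ A @ y)
            col_gain = max(0.0, np.max(x @ B) - x @ B @ y)
            return row_gain + col_gain

        result = differential_evolution(deviation_gain, bounds=[(0, 1), (0, 1)], seed=0)
        print(result.x, result.fun)  # probabilities near (0.5, 0.5), objective near 0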

  9. The National Institutes of Health Center for Human Immunology, Autoimmunity, and Inflammation: history and progress.

    Science.gov (United States)

    Dickler, Howard B; McCoy, J Philip; Nussenblatt, Robert; Perl, Shira; Schwartzberg, Pamela A; Tsang, John S; Wang, Ena; Young, Neil S

    2013-05-01

    The Center for Human Immunology, Autoimmunity, and Inflammation (CHI) is an exciting initiative of the NIH intramural program begun in 2009. It is uniquely trans-NIH in support (multiple institutes) and leadership (senior scientists from several institutes who donate their time). Its goal is an in-depth assessment of the human immune system using high-throughput multiplex technologies for examination of immune cells and their products, the genome, gene expression, and epigenetic modulation obtained from individuals both before and after interventions, adding information from in-depth clinical phenotyping, and then applying advanced biostatistical and computer modeling methods for mining these diverse data. The aim is to develop a comprehensive picture of the human "immunome" in health and disease, elucidate common pathogenic pathways in various diseases, identify and validate biomarkers that predict disease progression and responses to new interventions, and identify potential targets for new therapeutic modalities. Challenges, opportunities, and progress are detailed. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.

  10. The computational future for climate change research

    International Nuclear Information System (INIS)

    Washington, Warren M

    2005-01-01

    The development of climate models has a long history, starting with the building of atmospheric models and later ocean models. The early researchers were very aware of the goal of building climate models which could integrate our knowledge of complex physical interactions between atmospheric, land-vegetation, hydrology, ocean, cryospheric processes, and sea ice. The transition from climate models to earth system models is already underway with the coupling of active biochemical cycles. Progress is limited by present computer capability, which is needed for increasingly complex and higher resolution climate model versions. It would be a mistake to make models too complex or too high resolution. Arriving at a 'feasible' and useful model is the challenge for the climate model community. Some of the climate change history, scientific successes, and difficulties encountered with supercomputers will be presented.

  11. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Many computer-based image analysis methods of chest computed tomography (CT) used in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary functions, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologist for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
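
    Two of the simpler quantitative measures mentioned above, the mean CT value of the lungs and a density-mask fraction, can be sketched directly from an array of Hounsfield units; the -700 HU cutoff below is an illustrative assumption, not a value recommended by the review:

        import numpy as np

        # Toy Hounsfield-unit values for voxels inside a segmented lung mask.
        lung_hu = np.array([-870, -820, -640, -910, -560, -790, -300, -880, -720, -450])

        mean_lung_attenuation = lung_hu.mean()

        # Density mask: fraction of lung voxels above a fixed HU cutoff
        # (a crude surrogate for fibrotic "high attenuation areas").
        HAA_CUTOFF = -700
        high_attenuation_fraction = float(np.mean(lung_hu > HAA_CUTOFF))

        print(f"mean lung attenuation: {mean_lung_attenuation:.1f} HU")
        print(f"high-attenuation area fraction: {high_attenuation_fraction:.0%}")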

  12. New Mexico High School Supercomputing Challenge, 1990--1995: Five years of making a difference to students, teachers, schools, and communities. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Foster, M.; Kratzer, D.

    1996-02-01

    The New Mexico High School Supercomputing Challenge is an academic program dedicated to increasing interest in science and math among high school students by introducing them to high performance computing. This report provides a summary and evaluation of the first five years of the program, describes the program and shows the impact that it has had on high school students, their teachers, and their communities. Goals and objectives are reviewed and evaluated, growth and development of the program are analyzed, and future directions are discussed.

  13. Quantum Accelerators for High-Performance Computing Systems

    OpenAIRE

    Britt, Keith A.; Mohiyaddin, Fahd A.; Humble, Travis S.

    2017-01-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantu...

  14. Integrin-Targeted Hybrid Fluorescence Molecular Tomography/X-ray Computed Tomography for Imaging Tumor Progression and Early Response in Non-Small Cell Lung Cancer

    Directory of Open Access Journals (Sweden)

    Xiaopeng Ma

    2017-01-01

    Full Text Available Integrins play an important role in tumor progression, invasion and metastasis. Therefore we aimed to evaluate a preclinical imaging approach applying ανβ3 integrin targeted hybrid Fluorescence Molecular Tomography/X-ray Computed Tomography (FMT-XCT for monitoring tumor progression as well as early therapy response in a syngeneic murine Non-Small Cell Lung Cancer (NSCLC model. Lewis Lung Carcinomas were grown orthotopically in C57BL/6 J mice and imaged in-vivo using a ανβ3 targeted near-infrared fluorescence (NIRF probe. ανβ3-targeted FMT-XCT was able to track tumor progression. Cilengitide was able to substantially block the binding of the NIRF probe and suppress the imaging signal. Additionally mice were treated with an established chemotherapy regimen of Cisplatin and Bevacizumab or with a novel MEK inhibitor (Refametinib for 2 weeks. While μCT revealed only a moderate slowdown of tumor growth, ανβ3 dependent signal decreased significantly compared to non-treated mice already at one week post treatment. ανβ3 targeted imaging might therefore become a promising tool for assessment of early therapy response in the future.

  15. Empirical findings on progress and challenges in a novice students ...

    African Journals Online (AJOL)


  16. Fusion power plant simulations: a progress report

    International Nuclear Information System (INIS)

    Cook, J.M.; Pattern, J.S.; Amend, W.E.

    1976-01-01

    The objective of the fusion systems analysis at ANL is to develop simulations to compare alternative conceptual designs of magnetically confined fusion power plants. The power plant computer simulation progress is described. Some system studies are also discussed

  17. Imaging manifestations of progressive multifocal leukoencephalopathy

    International Nuclear Information System (INIS)

    Shah, R.; Bag, A.K.; Chapman, P.R.; Cure, J.K.

    2010-01-01

    Progressive multifocal leukoencephalopathy (PML) is a demyelinating disease caused by reactivation of JC virus in immunosuppressed patients. The diagnosis is usually suggested on imaging and confirmed by cerebrospinal fluid polymerase chain reaction (PCR) for JC virus DNA. In this article, we review the imaging manifestations of PML on computed tomography (CT), magnetic resonance imaging (MRI), diffusion-weighted imaging (DWI), diffusion tensor imaging (DTI), MR spectroscopy, single photon-emission computed tomography (SPECT) and positron-emission tomography (PET), and outline the role of imaging in follow-up and prognostication.

  18. Computational quantum magnetism: Role of noncollinear magnetism

    International Nuclear Information System (INIS)

    Freeman, Arthur J.; Nakamura, Kohji

    2009-01-01

    We are witnessing today a golden age of innovation with novel magnetic materials and with discoveries important for both basic science and device applications. Computation and simulation have played a key role in the dramatic advances of the past and those we are witnessing today. A goal driving computational science, simulations of ever-increasing complexity of more and more realistic models, has been brought into greater focus with greater computing power to run sophisticated and powerful software codes like our highly precise full-potential linearized augmented plane wave (FLAPW) method. Indeed, significant progress has been achieved from advanced first-principles FLAPW calculations for the prediction of surface/interface magnetism. One recently resolved challenging issue is the role of noncollinear magnetism (NCM), which arises not only through the spin-orbit coupling (SOC) but also from the breaking of symmetry at surfaces and interfaces. For this, we will further review some specific advances we are witnessing today, including complex magnetic phenomena from noncollinear magnetism with no shape approximation for the magnetization (perpendicular MCA in transition-metal overlayers and superlattices; unidirectional anisotropy and exchange bias in FM and AFM bilayers; constricted domain walls important in quantum spin interfaces; and curling magnetic nano-scale dots as new candidates for non-volatile memory applications) and, most recently, new predictions and understanding of magnetism in novel materials such as magnetic semiconductors and multi-ferroic systems.

  19. Dryland climate change: Recent progress and challenges

    Science.gov (United States)

    Huang, J.; Li, Y.; Fu, C.; Chen, F.; Fu, Q.; Dai, A.; Shinoda, M.; Ma, Z.; Guo, W.; Li, Z.; Zhang, L.; Liu, Y.; Yu, H.; He, Y.; Xie, Y.; Guan, X.; Ji, M.; Lin, L.; Wang, S.; Yan, H.; Wang, G.

    2017-09-01

    Drylands are home to more than 38% of the world's population and are one of the most sensitive areas to climate change and human activities. This review describes recent progress in dryland climate change research. Recent findings indicate that the long-term trend of the aridity index (AI) is mainly attributable to increased greenhouse gas emissions, while anthropogenic aerosols exert small effects but alter its attributions. Atmosphere-land interactions determine the intensity of regional response. The largest warming during the last 100 years was observed over drylands and accounted for more than half of the continental warming. The global pattern and interdecadal variability of aridity changes are modulated by oceanic oscillations. The different phases of those oceanic oscillations induce significant changes in land-sea and north-south thermal contrasts, which affect the intensity of the westerlies and planetary waves and the blocking frequency, thereby altering global changes in temperature and precipitation. During 1948-2008, the drylands in the Americas became wetter due to enhanced westerlies, whereas the drylands in the Eastern Hemisphere became drier because of the weakened East Asian summer monsoon. Drylands as defined by the AI have expanded over the last 60 years and are projected to expand in the 21st century. The largest expansion of drylands has occurred in semiarid regions since the early 1960s. Dryland expansion will lead to reduced carbon sequestration and enhanced regional warming. The increasing aridity, enhanced warming, and rapidly growing population will exacerbate the risk of land degradation and desertification in the near future in developing countries.

  20. Session Introduction: Challenges of Pattern Recognition in Biomedical Data.

    Science.gov (United States)

    Verma, Shefali Setia; Verma, Anurag; Basile, Anna Okula; Bishop, Marta-Byrska; Darabos, Christian

    2018-01-01

    The analysis of large biomedical data often presents various challenges related not just to the size of the data, but also to data quality issues such as heterogeneity, multidimensionality, noisiness, and incompleteness of the data. The data-intensive nature of computational genomics problems in biomedical informatics warrants the development and use of massive computer infrastructure and advanced software tools and platforms, including but not limited to the use of cloud computing. Our session aims to address these challenges in handling big data for designing a study, performing analysis, and interpreting outcomes of these analyses. These challenges have been prevalent in many studies, including those which focus on the identification of novel genetic variant-phenotype associations using data from sources like Electronic Health Records (EHRs) or multi-omic data. One of the biggest challenges to focus on is the imperfect nature of biomedical data, in which a lot of noise and sparseness is observed. In our session, we will present research articles that can help in identifying innovative ways to recognize and overcome newly arising challenges associated with pattern recognition in biomedical data.