WorldWideScience

Sample records for computing challenges progress

  1. Recent progress and modern challenges in applied mathematics, modeling and computational science

    CERN Document Server

    Makarov, Roman; Belair, Jacques

    2017-01-01

    This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science.  The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modelling, and computational sciences, and is ideal for interdisciplinary collaborations.

  2. Abduction aiming at empirical progress or even at truth approximation, leading to a challenge for computational modelling

    NARCIS (Netherlands)

    Kuipers, Theo A.F.

    1999-01-01

    This paper primarily deals with the conceptual prospects for generalizing the aim of abduction from the standard one of explaining surprising or anomalous observations to that of empirical progress or even truth approximation. It turns out that the main abduction task then becomes the

  3. Computational enzyme design approaches with significant biological outcomes: progress and challenges

    OpenAIRE

    Li, Xiaoman; Zhang, Ziding; Song, Jiangning

    2012-01-01

    Enzymes are powerful biocatalysts, however, so far there is still a large gap between the number of enzyme-based practical applications and that of naturally occurring enzymes. Multiple experimental approaches have been applied to generate nearly all possible mutations of target enzymes, allowing the identification of desirable variants with improved properties to meet the practical needs. Meanwhile, an increasing number of computational methods have been developed to assist in the modificati...

  4. Progress in computational toxicology.

    Science.gov (United States)

    Ekins, Sean

    2014-01-01

    Computational methods have been widely applied to toxicology across pharmaceutical, consumer product and environmental fields over the past decade. Progress in computational toxicology is now reviewed. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition, various publications that use machine learning methods have been highlighted. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that are increasingly used for predictions. It is shown that, across many different models, Bayesian and SVM methods perform similarly based on cross-validation data. Considerable progress has been made in computational toxicology in a decade, in both model development and the availability of larger scale or 'big data' models. Future efforts in toxicology data generation will likely provide us with hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications. Copyright © 2013 Elsevier Inc. All rights reserved.
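
    A minimal sketch of the kind of comparison described, assuming scikit-learn and simulated binary structural fingerprints; the dataset, endpoint, and parameters are invented for illustration, not the paper's own models:

        # Hedged sketch: Bayesian vs. SVM classifiers compared by 5-fold
        # cross-validation on a simulated toxicology endpoint.
        import numpy as np
        from sklearn.naive_bayes import BernoulliNB
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.integers(0, 2, size=(500, 256))   # simulated binary fingerprints
        y = rng.integers(0, 2, size=500)          # simulated toxic/non-toxic labels

        for name, model in [("Bayesian (BernoulliNB)", BernoulliNB()),
                            ("SVM (RBF kernel)", SVC(kernel="rbf"))]:
            scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
            print(f"{name}: ROC AUC = {scores.mean():.3f} +/- {scores.std():.3f}")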

  5. Progress in computer vision.

    Science.gov (United States)

    Jain, A. K.; Dorai, C.

    Computer vision has emerged as a challenging and important area of research, both as an engineering and a scientific discipline. The growing importance of computer vision is evident from the fact that it was identified as one of the "Grand Challenges" and also from its prominent role in the National Information Infrastructure. While the design of a general-purpose vision system continues to be elusive, machine vision systems are being used successfully in specific application domains. Building a practical vision system requires a careful selection of appropriate sensors, extraction and integration of information from available cues in the sensed data, and evaluation of system robustness and performance. The authors discuss and demonstrate advantages of (1) multi-sensor fusion, (2) combination of features and classifiers, (3) integration of visual modules, and (4) admissibility and goal-directed evaluation of vision algorithms. The requirements of several prominent real-world applications such as biometry, document image analysis, image and video database retrieval, and automatic object model construction offer exciting problems and new opportunities to design and evaluate vision algorithms.
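
    Point (2), combining classifiers, can be made concrete with a small sketch; the data here is synthetic (in a vision system the features would come from image cues), and scikit-learn is assumed:

        # Hedged sketch: majority-vote combination of three classifiers,
        # evaluated with cross-validated accuracy.
        from sklearn.datasets import make_classification
        from sklearn.ensemble import VotingClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import cross_val_score

        X, y = make_classification(n_samples=400, n_features=20, random_state=1)
        ensemble = VotingClassifier([
            ("lr", LogisticRegression(max_iter=1000)),
            ("knn", KNeighborsClassifier()),
            ("tree", DecisionTreeClassifier(random_state=1)),
        ], voting="hard")                 # each classifier casts one vote
        print("combined accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())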

  6. Photons, photosynthesis, and high-performance computing: challenges, progress, and promise of modeling metabolism in green algae

    International Nuclear Information System (INIS)

    Chang, C H; Graf, P; Alber, D M; Kim, K; Murray, G; Posewitz, M; Seibert, M

    2008-01-01

    The complexity associated with biological metabolism considered at a kinetic level presents a challenge to quantitative modeling. In particular, the relatively sparse knowledge of parameters for enzymes with known kinetic responses is problematic. The possible space of these parameters is of high dimension, and sampling of such a space typifies a combinatorial explosion of possible dynamic states. However, with sufficient quantitative transcriptomics, proteomics, and metabolomics data at hand, these challenges could be met by high-performance software with sampling, fitting, and optimization capabilities. With this in mind, we present the High-Performance Systems Biology Toolkit (HiPer SBTK), an evolving software package to simulate, fit, and optimize metabolite concentrations and fluxes within the space of rate and binding parameters associated with detailed enzyme kinetic models. We present our chosen modeling paradigm for the formulation of metabolic pathway models, the means to address the challenge of representing such models in a precise and persistent fashion using the standardized Systems Biology Markup Language, and our second-generation model of H2-associated Chlamydomonas metabolism. Processing of such models for hierarchically parallelized simulation and optimization, job specification by the user through a GUI, software capabilities and initial scaling data, and the mapping of the computation to biological questions are also discussed. Moreover, we present near-term future software and model development goals.
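
    The modeling paradigm described (metabolite concentrations evolving under detailed enzyme kinetics) reduces, in its simplest form, to integrating rate equations with Michaelis-Menten terms. A toy two-step pathway, assuming SciPy and invented Vmax/Km values; this is an illustration only, not HiPer SBTK itself:

        # Toy pathway S -> I -> P with Michaelis-Menten kinetics at each step.
        from scipy.integrate import solve_ivp

        def pathway(t, c, vmax1, km1, vmax2, km2):
            s, i, p = c                    # substrate, intermediate, product
            v1 = vmax1 * s / (km1 + s)     # rate of first enzymatic step
            v2 = vmax2 * i / (km2 + i)     # rate of second enzymatic step
            return [-v1, v1 - v2, v2]

        sol = solve_ivp(pathway, (0.0, 50.0), [10.0, 0.0, 0.0],
                        args=(1.0, 2.0, 0.5, 1.0))
        print("final [S, I, P]:", sol.y[:, -1])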

  7. The challenge of computer mathematics.

    Science.gov (United States)

    Barendregt, Henk; Wiedijk, Freek

    2005-10-15

    Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations are dealing with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. Also there are very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.
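
    A minimal example of what such machine-checked mathematics looks like, here in Lean 4 syntax (any comparable proof assistant would do):

        -- The statement and its proof are both verified by the machine:
        -- addition of natural numbers is commutative.
        theorem add_comm' (a b : Nat) : a + b = b + a := by
          induction b with
          | zero => simp
          | succ n ih => rw [Nat.add_succ, Nat.succ_add, ih]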

  8. Silicon spintronics: Progress and challenges

    Energy Technology Data Exchange (ETDEWEB)

    Sverdlov, Viktor; Selberherr, Siegfried, E-mail: Selberherr@TUWien.ac.at

    2015-07-14

    Electron spin attracts much attention as an alternative to the electron charge degree of freedom for low-power reprogrammable logic and non-volatile memory applications. Silicon appears to be the perfect material for spin-driven applications. Recent progress and challenges regarding spin-based devices are reviewed. An order of magnitude enhancement of the electron spin lifetime in silicon thin films by shear strain is predicted, and its impact on spin transport in SpinFETs is discussed. A relatively weak coupling between spin and effective electric field in silicon allows magnetoresistance modulation at room temperature, although only for long channel lengths. Due to tunneling magnetoresistance and spin transfer torque effects, a much stronger coupling between the spin (magnetization) orientation and charge current is achieved in magnetic tunnel junctions. Magnetic random access memory (MRAM) built on magnetic tunnel junctions is CMOS compatible and possesses all properties needed for future universal memory. Designs of spin-based non-volatile MRAM cells are presented. By means of micromagnetic simulations it is demonstrated that a substantial reduction of the switching time can be achieved. Finally, it is shown that any two arbitrary memory cells from an MRAM array can be used to perform a logic operation. Thus, an intrinsic non-volatile logic-in-memory architecture can be realized.
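
    The micromagnetic switching simulations mentioned above rest on integrating the Landau-Lifshitz-Gilbert (LLG) equation. A toy macrospin sketch, assuming NumPy; the parameter values are illustrative, not taken from the paper:

        # Single magnetization vector switching under a reversed applied field.
        import numpy as np

        gamma, alpha, dt = 1.76e11, 0.1, 1e-13   # gyromagnetic ratio (rad/s/T), damping, step (s)
        m = np.array([0.0, 0.1, 0.995])
        m /= np.linalg.norm(m)                   # unit magnetization, near +z
        H = np.array([0.0, 0.0, -0.2])           # applied field (T), opposing m

        for step in range(200000):
            mxH = np.cross(m, H)
            dmdt = -gamma / (1 + alpha**2) * (mxH + alpha * np.cross(m, mxH))
            m = m + dt * dmdt
            m /= np.linalg.norm(m)               # keep |m| = 1 (explicit Euler drifts)
            if m[2] < -0.9:                      # magnetization has reversed
                print(f"switched after {step * dt * 1e9:.2f} ns")
                break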

  9. Silicon spintronics: Progress and challenges

    International Nuclear Information System (INIS)

    Sverdlov, Viktor; Selberherr, Siegfried

    2015-01-01

    Electron spin attracts much attention as an alternative to the electron charge degree of freedom for low-power reprogrammable logic and non-volatile memory applications. Silicon appears to be the perfect material for spin-driven applications. Recent progress and challenges regarding spin-based devices are reviewed. An order of magnitude enhancement of the electron spin lifetime in silicon thin films by shear strain is predicted, and its impact on spin transport in SpinFETs is discussed. A relatively weak coupling between spin and effective electric field in silicon allows magnetoresistance modulation at room temperature, although only for long channel lengths. Due to tunneling magnetoresistance and spin transfer torque effects, a much stronger coupling between the spin (magnetization) orientation and charge current is achieved in magnetic tunnel junctions. Magnetic random access memory (MRAM) built on magnetic tunnel junctions is CMOS compatible and possesses all properties needed for future universal memory. Designs of spin-based non-volatile MRAM cells are presented. By means of micromagnetic simulations it is demonstrated that a substantial reduction of the switching time can be achieved. Finally, it is shown that any two arbitrary memory cells from an MRAM array can be used to perform a logic operation. Thus, an intrinsic non-volatile logic-in-memory architecture can be realized.

  10. Peptide Vaccine: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Weidang Li

    2014-07-01

    Full Text Available Conventional vaccine strategies have been highly efficacious for several decades in reducing mortality and morbidity due to infectious diseases. The bane of conventional vaccines, such as those that include whole organisms or large proteins, appears to be the inclusion of unnecessary antigenic load that not only contributes little to the protective immune response but also complicates the situation by inducing allergenic and/or reactogenic responses. Peptide vaccines are an attractive alternative strategy that relies on usage of short peptide fragments to engineer the induction of highly targeted immune responses, consequently avoiding allergenic and/or reactogenic sequences. However, peptide vaccines used in isolation are often weakly immunogenic and require particulate carriers for delivery and adjuvanting. In this article, we discuss the specific advantages and considerations in targeted induction of immune responses by peptide vaccines and progress in the development of such vaccines against various diseases. Additionally, we discuss the development of particulate carrier strategies and the inherent challenges with regard to safety when combining such technologies with peptide vaccines.

  11. The Challenge of Computers.

    Science.gov (United States)

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  12. Reference Structures: Stagnation, Progress, and Future Challenges.

    Science.gov (United States)

    Greenberg, Jane

    1997-01-01

    Assesses the current state of reference structures in online public access catalogs (OPACs) in a framework defined by stagnation, progress, and future challenges. Outlines six areas for reference structure development. Twenty figures provide illustrations. (AEF)

  13. Orion Absolute Navigation System Progress and Challenges

    Science.gov (United States)

    Holt, Greg N.; D'Souza, Christopher

    2012-01-01

    The absolute navigation design of NASA's Orion vehicle is described. It has undergone several iterations and modifications since its inception, and continues as a work-in-progress. This paper seeks to benchmark the current state of the design and some of the rationale and analysis behind it. There are specific challenges to address when preparing a timely and effective design for the Exploration Flight Test (EFT-1), while still looking ahead and providing software extensibility for future exploration missions. The primary onboard measurements in a Near-Earth or Mid-Earth environment consist of GPS pseudo-range and delta-range, but for future exploration missions the use of star-tracker and optical navigation sources needs to be considered. Discussions are presented for state size and composition, processing techniques, and consider states. A presentation is given for the processing technique using the computationally stable and robust UDU formulation with an Agee-Turner Rank-One update. This allows for computational savings when dealing with many parameters which are modeled as slowly varying Gauss-Markov processes. Preliminary analysis shows up to a 50% reduction in computation versus a more traditional formulation. Several state elements are discussed and evaluated, including position, velocity, attitude, clock bias/drift, and GPS measurement biases in addition to bias, scale factor, misalignment, and non-orthogonalities of the accelerometers and gyroscopes. Another consideration is the initialization of the EKF in various scenarios. Scenarios such as single-event upset, ground command, and cold start are discussed, as are strategies for whole and partial state updates as well as covariance considerations. Strategies are given for dealing with latent measurements and high-rate propagation using a multi-rate architecture. The details of the rate groups and the data flow between the elements are discussed and evaluated.
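
    The UDU formulation keeps the covariance factored as P = U D U^T (U unit upper triangular, D diagonal), and the Agee-Turner rank-one update modifies the factors of P + c a a^T directly. A sketch of the classical algorithm, assuming NumPy; this follows the textbook form (e.g., Bierman), not Orion flight code:

        # Rank-one UDU update: returns factors of U diag(d) U^T + c*a*a^T.
        import numpy as np

        def agee_turner(U, d, c, a):
            U, d, a = U.copy(), d.copy(), a.copy()
            n = len(d)
            for j in range(n - 1, 0, -1):
                s = a[j]
                dj = d[j] + c * s * s            # updated diagonal entry
                b = c * s / dj
                for k in range(j):
                    a[k] -= s * U[k, j]          # fold column j out of the vector
                    U[k, j] += b * a[k]          # updated unit-triangular factor
                c *= d[j] / dj                   # carry remaining weight down
                d[j] = dj
            d[0] += c * a[0] ** 2
            return U, d

        # self-check on a random 4x4 case
        rng = np.random.default_rng(2)
        U = np.triu(rng.normal(size=(4, 4)), 1) + np.eye(4)
        d = rng.uniform(0.5, 2.0, 4); a = rng.normal(size=4)
        U2, d2 = agee_turner(U, d, 0.7, a)
        assert np.allclose(U2 @ np.diag(d2) @ U2.T,
                           U @ np.diag(d) @ U.T + 0.7 * np.outer(a, a))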

  14. Progress and challenges in the computational prediction of gene function using networks [v1; ref status: indexed, http://f1000r.es/SqmJUM]

    Directory of Open Access Journals (Sweden)

    Paul Pavlidis

    2012-09-01

    Full Text Available In this opinion piece, we attempt to unify recent arguments we have made that serious confounds affect the use of network data to predict and characterize gene function. The development of computational approaches to determine gene function is a major strand of computational genomics research. However, progress beyond using BLAST to transfer annotations has been surprisingly slow. We have previously argued that a large part of the reported success in using "guilt by association" in network data is due to the tendency of methods to simply assign new functions to already well-annotated genes. While such predictions will tend to be correct, they are generic; it is true, but not very helpful, that a gene with many functions is more likely to have any function. We have also presented evidence that much of the remaining performance in cross-validation cannot be usefully generalized to new predictions, making progressive improvement in analysis difficult to engineer. Here we summarize our findings about how these problems will affect network analysis, discuss some ongoing responses within the field to these issues, and consolidate some recommendations and speculation, which we hope will modestly increase the reliability and specificity of gene function prediction.
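
    The "guilt by association" baseline the authors critique can be stated in a few lines: score an unannotated gene for a function by the fraction of its network neighbors carrying that annotation. The gene names and edges below are invented:

        # Naive neighbor-voting function prediction on a toy network.
        annotated = {"geneA": {"DNA repair"}, "geneB": {"DNA repair"},
                     "geneC": {"transport"}}
        edges = [("geneX", "geneA"), ("geneX", "geneB"), ("geneX", "geneC")]

        def neighbor_vote(gene, function):
            nbrs = [b if a == gene else a for a, b in edges if gene in (a, b)]
            hits = sum(function in annotated.get(n, set()) for n in nbrs)
            return hits / len(nbrs) if nbrs else 0.0

        print(neighbor_vote("geneX", "DNA repair"))   # 2 of 3 neighbors -> ~0.67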

  15. Orion Absolute Navigation System Progress and Challenges

    Science.gov (United States)

    Holt, Greg N.; D'Souza, Christopher

    2011-01-01

    The Orion spacecraft is being designed as NASA's next-generation exploration vehicle for crewed missions beyond Low-Earth Orbit. The navigation system for the Orion spacecraft is being designed in a Multi-Organizational Design Environment (MODE) team including contractor and NASA personnel. The system uses an Extended Kalman Filter to process measurements and determine the state. The design of the navigation system has undergone several iterations and modifications since its inception, and continues as a work-in-progress. This paper seeks to benchmark the current state of the design and some of the rationale and analysis behind it. There are specific challenges to address when preparing a timely and effective design for the Exploration Flight Test (EFT-1), while still looking ahead and providing software extensibility for future exploration missions. The primary measurements in a Near-Earth or Mid-Earth environment consist of GPS pseudorange and deltarange, but for future exploration missions the use of star-tracker and optical navigation sources needs to be considered. Discussions are presented for state size and composition, processing techniques, and consider states. A presentation is given for the processing technique using the computationally stable and robust UDU formulation with an Agee-Turner Rank-One update. This allows for computational savings when dealing with many parameters which are modeled as slowly varying Gauss-Markov processes. Preliminary analysis shows up to a 50% reduction in computation versus a more traditional formulation. Several state elements are discussed and evaluated, including position, velocity, attitude, clock bias/drift, and GPS measurement biases in addition to bias, scale factor, misalignment, and non-orthogonalities of the accelerometers and gyroscopes. Another consideration is the initialization of the EKF in various scenarios. Scenarios such as single-event upset, ground command, pad alignment, cold start are discussed as are

  16. Progress and Challenge of Artificial Intelligence

    Institute of Scientific and Technical Information of China (English)

    Zhong-Zhi Shi; Nan-Ning Zheng

    2006-01-01

    Artificial Intelligence (AI) is generally considered a subfield of computer science that is concerned with the simulation, extension and expansion of human intelligence. Artificial intelligence has enjoyed tremendous success over the last fifty years. In this paper we focus only on visual perception, granular computing, agent computing, and the semantic grid. Human-level intelligence is the long-term goal of artificial intelligence. We should do joint research on the basic theory and technology of intelligence across brain science, cognitive science, artificial intelligence and other fields. A new cross-discipline, intelligence science, is undergoing rapid development. Future challenges are given in the final section.

  17. Achieving efficient RNAi therapy: progress and challenges

    Directory of Open Access Journals (Sweden)

    Kun Gao

    2013-07-01

    Full Text Available RNA interference (RNAi) has been harnessed to produce a new class of drugs for treatment of various diseases. This review summarizes the most important parameters that govern the silencing efficiency and duration of the RNAi effect, such as small interfering RNA (siRNA) stability and modification, the type of delivery system and particle sizing methods. It also discusses the predominant barriers for siRNA delivery, such as off-target effects, and introduces internalization, endosomal escape and mathematical modeling in RNAi therapy and combinatorial RNAi. At present, effective delivery of RNAi therapeutics in vivo remains a challenge, although significant progress has been made in this field.

  18. Cancer nanomedicine: progress, challenges and opportunities.

    Science.gov (United States)

    Shi, Jinjun; Kantoff, Philip W; Wooster, Richard; Farokhzad, Omid C

    2017-01-01

    The intrinsic limits of conventional cancer therapies prompted the development and application of various nanotechnologies for more effective and safer cancer treatment, herein referred to as cancer nanomedicine. Considerable technological success has been achieved in this field, but the main obstacles to nanomedicine becoming a new paradigm in cancer therapy stem from the complexities and heterogeneity of tumour biology, an incomplete understanding of nano-bio interactions and the challenges regarding chemistry, manufacturing and controls required for clinical translation and commercialization. This Review highlights the progress, challenges and opportunities in cancer nanomedicine and discusses novel engineering approaches that capitalize on our growing understanding of tumour biology and nano-bio interactions to develop more effective nanotherapeutics for cancer patients.

  19. Editorial: Modelling and computational challenges in granular materials

    OpenAIRE

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss the current progress and latest advancements in the field of advanced numerical methods and modelling of granular materials. The focus will be on computational methods, improved algorithms and the m...

  20. Laser ignited engines: progress, challenges and prospects.

    Science.gov (United States)

    Dearden, Geoff; Shenton, Tom

    2013-11-04

    Laser ignition (LI) has been shown to offer many potential benefits compared to spark ignition (SI) for improving the performance of internal combustion (IC) engines. This paper outlines progress made in recent research on laser ignited IC engines, discusses the potential advantages and control opportunities and considers the challenges faced and prospects for its future implementation. An experimental research effort has been underway at the University of Liverpool (UoL) to extend the stratified speed/load operating region of the gasoline direct injection (GDI) engine through LI research, for which an overview of some of the approaches, testing and results to date are presented. These indicate how LI can be used to improve control of the engine for: leaner operation, reductions in emissions, lower idle speed and improved combustion stability.

  1. Granular computing: perspectives and challenges.

    Science.gov (United States)

    Yao, JingTao; Vasilakos, Athanasios V; Pedrycz, Witold

    2013-12-01

    Granular computing, as a new and rapidly growing paradigm of information processing, has attracted many researchers and practitioners. Granular computing is an umbrella term to cover any theories, methodologies, techniques, and tools that make use of information granules in complex problem solving. The aim of this paper is to review foundations and schools of research and to elaborate on current developments in granular computing research. We first review some basic notions of granular computing. Classification and descriptions of various schools of research in granular computing are given. We also present and identify some research directions in granular computing.
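
    One well-known formalism in this family (not named in the abstract, used here purely as an illustration) is rough set theory, where granules are classes of objects indiscernible under the chosen attributes, and a concept is described by lower and upper approximations. A sketch with an invented decision table:

        # Rough-set granules: indiscernibility classes and approximations.
        from collections import defaultdict

        table = {1: ("high", "yes"), 2: ("high", "yes"), 3: ("low", "no"),
                 4: ("low", "yes"), 5: ("low", "yes")}   # object -> attribute values
        target = {1, 2, 4}                               # concept to approximate

        granules = defaultdict(set)
        for obj, attrs in table.items():
            granules[attrs].add(obj)                     # the information granules

        lower = set().union(*(g for g in granules.values() if g <= target))
        upper = set().union(*(g for g in granules.values() if g & target))
        print("lower:", lower)   # {1, 2}: certainly in the concept
        print("upper:", upper)   # {1, 2, 4, 5}: possibly in the concept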

  2. Gaucher disease: Progress and ongoing challenges.

    Science.gov (United States)

    Mistry, Pramod K; Lopez, Grisel; Schiffmann, Raphael; Barton, Norman W; Weinreb, Neal J; Sidransky, Ellen

    Over the past decades, tremendous progress has been made in the field of Gaucher disease, the inherited deficiency of the lysosomal enzyme glucocerebrosidase. Many of the colossal achievements took place during the course of the sixty-year tenure of Dr. Roscoe Brady at the National Institutes of Health. These include the recognition of the enzymatic defect involved, the isolation and characterization of the protein, the localization and characterization of the gene and its nearby pseudogene, as well as the identification of the first mutant alleles in patients. The first treatment for Gaucher disease, enzyme replacement therapy, was conceived of, developed and tested at the Clinical Center of the National Institutes of Health. Advances including recombinant production of the enzyme, the development of mouse models, pioneering gene therapy experiments, high throughput screens of small molecules and the generation of induced pluripotent stem cell models have all helped to catapult research in Gaucher disease into the twenty-first century. The appreciation that mutations in the glucocerebrosidase gene are an important risk factor for parkinsonism further expands the impact of this work. However, major challenges still remain, some of which are described here, that will provide opportunities, excitement and discovery for the next generations of Gaucher investigators. Published by Elsevier Inc.

  3. Security and Privacy in Fog Computing: Challenges

    OpenAIRE

    Mukherjee, Mithun; Matam, Rakesh; Shu, Lei; Maglaras, Leandros; Ferrag, Mohamed Amine; Choudhry, Nikumani; Kumar, Vikas

    2017-01-01

    open access article The fog computing paradigm extends the storage, networking, and computing facilities of cloud computing toward the edge of the network, while offloading the cloud data centers and reducing service latency to end users. However, the characteristics of fog computing give rise to new security and privacy challenges. The existing security and privacy measures for cloud computing cannot be directly applied to fog computing due to its features, such as mobility, heteroge...

  4. Chips challenging champions games, computers and artificial intelligence

    CERN Document Server

    Schaeffer, J

    2002-01-01

    One of the earliest dreams of the fledgling field of artificial intelligence (AI) was to build computer programs that could play games as well as or better than the best human players. Despite early optimism in the field, the challenge proved to be surprisingly difficult. However, the 1990s saw amazing progress. Computers are now better than humans in checkers, Othello and Scrabble; are at least as good as the best humans in backgammon and chess; and are rapidly improving at hex, go, poker, and shogi. This book documents the progress made in computers playing games and puzzles. The book is the
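
    The common core of most of these game programs is depth-limited game-tree search with alpha-beta pruning. A sketch on a toy take-away game (take 1-3 counters; whoever takes the last one wins); real programs add evaluation functions, transposition tables, and much more:

        # Negamax with alpha-beta pruning; +1 = win for side to move, -1 = loss.
        def negamax(counters, alpha=float("-inf"), beta=float("inf")):
            if counters == 0:
                return -1                      # no counters left: side to move lost
            best = float("-inf")
            for take in (1, 2, 3):
                if take <= counters:
                    value = -negamax(counters - take, -beta, -alpha)
                    best = max(best, value)
                    alpha = max(alpha, value)
                    if alpha >= beta:          # prune: opponent avoids this line
                        break
            return best

        print(negamax(20))   # -1: multiples of 4 are losing for the side to move
        print(negamax(21))   # +1: take 1 counter, leaving the opponent 20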

  5. Smart garments in chronic disease management: progress and challenges

    Science.gov (United States)

    Khosla, Ajit

    2012-10-01

    This paper presents the progress made in the area of Smart Garments for chronic disease management over the last 10 years. A large number of health monitoring smart garments and wearable sensors have been manufactured to monitor a patient's physiological parameters, such as electrocardiogram, blood pressure, body temperature, heart rate, and oxygen saturation, while the patient is not in hospital. In the last few years, with the advancement in smartphones and cloud computing, it has become possible to send the measured physiological data to any desired location. However, there are many challenges in the development of smart garment systems. The two major challenges are the development of new lightweight power sources and the need for global standardization and a road map for the development of smart garments. In this paper we discuss current state-of-the-art smart garments and wearable sensor systems. Also discussed are the new emerging trends in smart garment research and development.

  6. Multiagent Work Practice Simulation: Progress and Challenges

    Science.gov (United States)

    Clancey, William J.; Sierhuis, Maarten

    2002-01-01

    Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and computer systems. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3d space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).

  7. COMPLEX NETWORKS IN CLIMATE SCIENCE: PROGRESS, OPPORTUNITIES AND CHALLENGES

    Data.gov (United States)

    National Aeronautics and Space Administration — Complex Networks in Climate Science: Progress, Opportunities and Challenges. Karsten Steinhaeuser, Nitesh V. Chawla, and Auroop R. Ganguly. Abstract: Networks have...

  8. Internet ware cloud computing: Challenges

    OpenAIRE

    Qamar, S; Lal, Niranjan; Singh, Mrityunjay

    2010-01-01

    After decades of engineering development and infrastructural investment, Internet connections have become a commodity product in many countries, and Internet-scale "cloud computing" has started to compete with traditional software business through its technological advantages and economy of scale. Cloud computing is a promising enabling technology of Internet ware. Cloud computing is termed the next big thing in the modern corporate world. Apart from the present day software and technologies,...

  9. Development of indigenous irradiator - current progress and challenges

    International Nuclear Information System (INIS)

    Anwar A Rahman; Mohd Arif Hamzah; Muhd Nor Atan; Aznor Hassan; Fadil Ismail; Julia A Karim; Rosli Darmawan

    2009-01-01

    The development of an indigenous irradiator is one of Prototype Development Center's main projects to support Nuclear Malaysia services. Three (3) projects have been identified, and currently their status is in the final stage of design. Some issues and challenges have been encountered, which delayed the projects' progress. This paper discusses the current progress of the development and the challenges faced in designing the irradiator. (Author)

  10. Challenges and Security in Cloud Computing

    Science.gov (United States)

    Chang, Hyokyung; Choi, Euiin

    People want to solve problems as they happen. An IT technology called ubiquitous computing helps make such situations easier to handle, and the technology that makes it even better and more powerful is called cloud computing. Cloud computing, however, is at the beginning stage of implementation and use, and it faces a lot of challenges in technical matters and security issues. This paper looks at cloud computing security.

  11. Computing challenges of the CMS experiment

    International Nuclear Information System (INIS)

    Krammer, N.; Liko, D.

    2017-01-01

    The success of the LHC experiments is due to the magnificent performance of the detector systems and the excellent operation of the computing systems. The CMS offline software and computing system is successfully fulfilling the LHC Run 2 requirements. For the increased data rate of future LHC operation, together with high pileup interactions, improvements in the usage of the current computing facilities and new technologies have become necessary. Especially for the challenge of the future HL-LHC, a more flexible and sophisticated computing model is needed. In this presentation, I will discuss the current computing system used in LHC Run 2 and future computing facilities for the HL-LHC runs using flexible computing technologies such as commercial and academic computing clouds. The cloud resources are highly virtualized and can be deployed for a variety of computing tasks, providing the capacity for the increasing needs of large-scale scientific computing.

  12. The Glass Ceiling: Progress and Persistent Challenges

    Science.gov (United States)

    McLlwain, Wendy M.

    2012-01-01

    It has been written that since 2001, there has not been any significant progress and the glass ceiling is still intact. Women are still underrepresented in top positions (Anonymous, 2004). If this is true, the glass ceiling presents a major barrier between women and their desire to advance into executive or senior management positions. In addition…

  13. Progress and challenges in cleaning up Hanford

    Energy Technology Data Exchange (ETDEWEB)

    Wagoner, J.D. [Dept. of Energy, Richland, WA (United States)

    1997-08-01

    This paper presents captioned viewgraphs which briefly summarize cleanup efforts at the Hanford Site. Underground waste tank and spent nuclear fuel issues are described. Progress is reported for the Plutonium Finishing Plant, PUREX plant, B-Plant/Waste Encapsulation Storage Facility, and Fast Flux Test Facility. A very brief overview of costs and number of sites remediated and/or decommissioned is given.

  14. Computational challenges in modeling gene regulatory events.

    Science.gov (United States)

    Pataskar, Abhijeet; Tiwari, Vijay K

    2016-10-19

    Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.

  15. Challenges and solutions in enterprise computing

    NARCIS (Netherlands)

    van Sinderen, Marten J.

    2008-01-01

    The emergence of the networked enterprise has a profound effect on enterprise computing. This introduction discusses some important challenges in enterprise computing, which are the result of the mentioned networking trend, and positions the articles of this special issue with respect to these challenges.

  16. Egyptian women in physics: Progress and challenges

    Science.gov (United States)

    Mohsen, M.; Hosni, Hala; Mohamed, Hadeer; Gadalla, Afaf; Kahil, Heba; Hashem, Hassan

    2015-12-01

    The present study shows a progressive increase in the number of female physicists as undergraduates and postgraduates in several governmental universities. For instance, in Ain Shams University, the percentage of women who selected physics as a major course of study increased from 7.2% in 2012 to 10.8% in 2013 and 15.7% in 2014. The study also provides the current gender distribution across the various positions among the teaching staff in seven governmental universities. The data support the fact that the number of female teaching assistants is increasing in these universities.

  17. Challenges and progress in turbomachinery design systems

    International Nuclear Information System (INIS)

    Van den Braembussche, R A

    2013-01-01

    This paper first describes the requirements that a modern design system should meet, followed by a comparison between design systems based on inverse design and those based on optimization techniques. The second part of the paper presents the way these challenges are addressed in an optimization method combining an Evolutionary theory and a Metamodel. Extensions to multi-disciplinary, multi-point and multi-objective optimization are illustrated by examples.

  18. Advances and Challenges in Computational Plasma Science

    International Nuclear Information System (INIS)

    Tang, W.M.; Chan, V.S.

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  19. Therapeutic strategies to fight HIV-1 latency: progress and challenges

    CSIR Research Space (South Africa)

    Manoto, Sello L

    2017-10-01

    Full Text Available Sello Lebohang Manoto, Lebogang Thobakgale, Rudzani Malabi, Charles Maphanga, Saturnin Ombinda-Lemboumba, Patience Mthunzi-Kufa. Abstract: The life...

  20. Progress and challenges of carbon nanotube membrane in water treatment

    KAUST Repository

    Lee, Jieun; Jeong, Sanghyun; Liu, Zongwen

    2016-01-01

    review of the progress of CNT membranes addressing the current epidemic—whether (i) the CNT membranes could tackle current challenges in the pressure- or thermally driven membrane processes and (ii) CNT hybrid nanocomposite as a new generation

  1. Human rights in Japan: progress and challenges

    Directory of Open Access Journals (Sweden)

    Yolanda Muñoz González

    2007-11-01

    Full Text Available The aim of this paper is to present an overview of the improvements and challenges that Japan has been facing between 1983 and 2007. The paper explores the interaction among the different stakeholders (i.e., the Japanese Government, international organizations, and civil society) to advance full access to citizenship regarding gender equality; the elimination of social and physical barriers for the inclusion of people with disabilities and elderly persons; ethnic minorities, specifically the situation of the Ainu people and the Buraku community; and the persons considered as "foreigners" living in Japan.

  2. Acute Pancreatitis—Progress and Challenges

    Science.gov (United States)

    Afghani, Elham; Pandol, Stephen J.; Shimosegawa, Tooru; Sutton, Robert; Wu, Bechien U.; Vege, Santhi Swaroop; Gorelick, Fred; Hirota, Morihisa; Windsor, John; Lo, Simon K.; Freeman, Martin L.; Lerch, Markus M.; Tsuji, Yoshihisa; Melmed, Gil Y.; Wassef, Wahid; Mayerle, Julia

    2016-01-01

    An international symposium entitled “Acute pancreatitis: progress and challenges” was held on November 5, 2014 at the Hapuna Beach Hotel, Big Island, Hawaii, as part of the 45th Anniversary Meeting of the American Pancreatic Association and the Japanese Pancreas Society. The course was organized and directed by Drs. Stephen Pandol, Tooru Shimosegawa, Robert Sutton, Bechien Wu, and Santhi Swaroop Vege. The symposium objectives were to: (1) highlight current issues in management of acute pancreatitis, (2) discuss promising treatments, (3) consider development of quality indicators and improved measures of disease activity, and (4) present a framework for international collaboration for development of new therapies. This article represents a compilation and adaptation of brief summaries prepared by speakers at the symposium with the purpose of broadly disseminating information and initiatives. PMID:26465949

  3. Sustainable Tourism: Progress Challenges and Opportunities

    DEFF Research Database (Denmark)

    Budeanu, Adriana; Miller, Graham; Moscardo, Gianna

    2016-01-01

    The term sustainable tourism emerged in the late 1980s and has become firmly established in both tourism policies and strategies and tourism research (Hall, 2011). After more than 25 years of attention it is timely to consider the state of research and practice in sustainable tourism. This special volume was established with exactly that goal in mind and this introduction seeks to set the context for this critical examination and reflection on sustainable tourism. Another objective of this introduction was to briefly describe the range of contributions selected for this SV. The articles are organised into four thematic areas of research: community stakeholders' perspectives and business approaches to sustainability in tourism, cultural responses, and methodological challenges related to sustainability. The articles shine a light on issues of importance within sustainable tourism, and in so...

  4. Ovarian cancer immunotherapy: opportunities, progresses and challenges

    Directory of Open Access Journals (Sweden)

    Stevens Richard

    2010-02-01

    Full Text Available Abstract Due to the low survival rates from invasive ovarian cancer, new effective treatment modalities are urgently needed. Compelling evidence indicates that the immune response against ovarian cancer may play an important role in controlling this disease. We herein summarize multiple immune-based strategies that have been proposed and tested for potential therapeutic benefit against advanced stage ovarian cancer. We will examine the evidence for the premise that an effective therapeutic vaccine against ovarian cancer is useful not only for inducing remission of the disease but also for preventing disease relapse. We will also highlight the questions and challenges in the development of ovarian cancer vaccines, and critically discuss the limitations of some of the existing immunotherapeutic strategies. Finally, we will summarize our own experience on the use of patient-specific tumor-derived heat shock protein-peptide complex for the treatment of advanced ovarian cancer.

  5. Childhood Obesity – 2010: Progress and Challenges

    Science.gov (United States)

    Han, Joan C.; Lawlor, Debbie A.; Kimm, Sue Y.S.

    2010-01-01

    Summary The worldwide prevalence of childhood obesity has increased greatly over the past 3 decades. The increasing occurrence in children of disorders, such as type 2 diabetes, is believed to be a consequence of this obesity epidemic. Much progress has been made in understanding the genetics and physiology of appetite control and from this, the elucidation of the causes of some rare obesity syndromes. However, these rare disorders have so far taught us only limited lessons on how to prevent or reverse obesity in most children. Calorie intake and activity recommendations need to be re-assessed and better quantified, on a population level, given the more sedentary life of children today. For individual treatment, the currently recommended calorie prescriptions may be too conservative given the evolving insight on the “energy gap.” Whilst quality of research in both prevention and treatment has improved, there is still a need for high-quality multi-centre trials with long-term follow-up. Meanwhile, prevention and treatment approaches that aim to increase energy expenditure and decrease intake need to continue. Most recently, the spiralling increase in obesity prevalence may be abating for children. Thus, even greater efforts need to be made on all fronts to continue this potentially exciting trend. PMID:20451244

  6. Lattice QCD computations: Recent progress with modern Krylov subspace methods

    Energy Technology Data Exchange (ETDEWEB)

    Frommer, A. [Bergische Universitaet GH Wuppertal (Germany)

    1996-12-31

    Quantum chromodynamics (QCD) is the fundamental theory of the strong interaction of matter. In order to compare the theory with results from experimental physics, the theory has to be reformulated as a discrete problem of lattice gauge theory using stochastic simulations. The computational challenge consists in solving several hundreds of very large linear systems with several right hand sides. A considerable part of the world's supercomputer time is spent in such QCD calculations. This paper presents results on solving systems for Wilson fermions. Recent progress is reviewed on algorithms obtained in cooperation with partners from theoretical physics.
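
    The kernel of such calculations is one large sparse system solved repeatedly for many right-hand sides; Krylov methods such as BiCGStab, with setup work (here a preconditioner) done once and reused, are the standard tool. A sketch with SciPy on a generic sparse stand-in, not an actual Wilson fermion operator:

        # One sparse matrix, several right-hand sides, shared ILU preconditioner.
        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import bicgstab, spilu, LinearOperator

        n = 2000
        A = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
        ilu = spilu(A)                                   # factorized once
        M = LinearOperator((n, n), matvec=ilu.solve)

        rng = np.random.default_rng(0)
        for k in range(4):                               # several right-hand sides
            b = rng.normal(size=n)
            x, info = bicgstab(A, b, M=M)
            print(f"rhs {k}: ok={info == 0}, "
                  f"residual={np.linalg.norm(A @ x - b):.2e}")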

  7. Workplace Charging Challenge Progress Update 2016: A New Sustainable Commute

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2017-01-31

    In the 2016 Workplace Charging Challenge annual survey, partners shared how their efforts were making an impact in their communities and helped identify best practices for workplace charging. The Workplace Charging Challenge Progress Update highlights the findings from this survey and recognizes leading employers for their workplace charging efforts.

  8. Dryland climate change: Recent progress and challenges

    Science.gov (United States)

    Huang, J.; Li, Y.; Fu, C.; Chen, F.; Fu, Q.; Dai, A.; Shinoda, M.; Ma, Z.; Guo, W.; Li, Z.; Zhang, L.; Liu, Y.; Yu, H.; He, Y.; Xie, Y.; Guan, X.; Ji, M.; Lin, L.; Wang, S.; Yan, H.; Wang, G.

    2017-09-01

    Drylands are home to more than 38% of the world's population and are one of the most sensitive areas to climate change and human activities. This review describes recent progress in dryland climate change research. Recent findings indicate that the long-term trend of the aridity index (AI) is mainly attributable to increased greenhouse gas emissions, while anthropogenic aerosols exert small effects but alter its attributions. Atmosphere-land interactions determine the intensity of regional response. The largest warming during the last 100 years was observed over drylands and accounted for more than half of the continental warming. The global pattern and interdecadal variability of aridity changes are modulated by oceanic oscillations. The different phases of those oceanic oscillations induce significant changes in land-sea and north-south thermal contrasts, which affect the intensity of the westerlies and planetary waves and the blocking frequency, thereby altering global changes in temperature and precipitation. During 1948-2008, the drylands in the Americas became wetter due to enhanced westerlies, whereas the drylands in the Eastern Hemisphere became drier because of the weakened East Asian summer monsoon. Drylands as defined by the AI have expanded over the last 60 years and are projected to expand in the 21st century. The largest expansion of drylands has occurred in semiarid regions since the early 1960s. Dryland expansion will lead to reduced carbon sequestration and enhanced regional warming. The increasing aridity, enhanced warming, and rapidly growing population will exacerbate the risk of land degradation and desertification in the near future in developing countries.
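
    The aridity index used throughout is AI = P / PET (annual precipitation over potential evapotranspiration), with AI < 0.65 conventionally defining drylands. A worked example with invented station values:

        # Classify sites by the standard UNEP aridity-index thresholds.
        def classify(ai):
            if ai < 0.05: return "hyper-arid"
            if ai < 0.20: return "arid"
            if ai < 0.50: return "semi-arid"
            if ai < 0.65: return "dry sub-humid"
            return "humid"

        sites = {"site1": (180.0, 1400.0),   # (precipitation mm, PET mm), invented
                 "site2": (420.0, 900.0),
                 "site3": (800.0, 950.0)}
        for name, (p, pet) in sites.items():
            ai = p / pet
            print(f"{name}: AI = {ai:.2f} -> {classify(ai)}")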

  9. Galeras: Progress and challenges of disaster

    International Nuclear Information System (INIS)

    Dorado G, Lina Marlene

    2008-01-01

    The Galeras Volcano is presently considered the most active in Colombia. During the last 17 years of constant vigilance, most eruptions of the Galeras Volcano have been classified as small. In the high hazard zone live 7,935 persons, who have to be evacuated every time level II volcanic activity is reached (probable eruption in the course of days or weeks). For the first time in Colombian history, a disaster situation was declared before the event itself. On November 15th, 2005, the National Government, on the basis of Decreto 4106, declared the existence of a disaster situation within the counties of Pasto, Narino and La Florida, all part of the Narino Department. This declaration was made considering the serious disruption of daily life to which the population would be exposed by a probable volcanic eruption. The present work is an analysis of the emergency procedures that have been carried out, with the help of the PAR (pressure and release) methodology. The analysis contains some reflections on how difficulties were solved, and on positive aspects, challenges and advances for a better long-term management of evacuations, like those carried out in the high hazard zones of the Galeras volcano.

  10. Enamel Regeneration - Current Progress and Challenges

    Science.gov (United States)

    Baswaraj; H.K, Navin; K.B, Prasanna

    2014-01-01

    Dental enamel is the outermost covering of teeth. It is the hardest mineralized tissue in the human body. Enamel faces the challenge of maintaining its integrity in a constant cycle of demineralization and remineralization within the oral environment, and it is vulnerable to wear, damage, and decay. It cannot regenerate itself, because it is formed by a layer of cells that is lost after tooth eruption. Conventional treatment relies on synthetic materials to restore lost enamel, but these cannot mimic natural enamel. Advances in materials science and understanding of the basic principles of organic-matrix-mediated mineralization pave the way for the formation of synthetic enamel. Knowledge of enamel formation, understanding of protein interactions and the function of their gene products, the isolation of postnatal stem cells from various sources in the oral cavity, and the development of smart materials for cell and growth factor delivery together make biologically based enamel regeneration possible. This article reviews recent endeavors in biomimetic synthesis and cell-based strategies for enamel regeneration. PMID:25386548

  11. Progress and challenges of the bioartificial pancreas

    Science.gov (United States)

    Hwang, Patrick T. J.; Shah, Dishant K.; Garcia, Jacob A.; Bae, Chae Yun; Lim, Dong-Jin; Huiszoon, Ryan C.; Alexander, Grant C.; Jun, Ho-Wook

    2016-11-01

    Pancreatic islet transplantation has been validated as a treatment for type 1 diabetes since it maintains consistent and sustained type 1 diabetes reversal. However, one of the major challenges in pancreatic islet transplantation is the body's natural immune response to the implanted islets. Immunosuppressive drug treatment is the most popular immunomodulatory approach for islet graft survival. However, administration of immunosuppressive drugs gives rise to negative side effects, and long-term effects are not clearly understood. A bioartificial pancreas is a therapeutic approach to enable pancreatic islet transplantation without or with minimal immune suppression. The bioartificial pancreas encapsulates the pancreatic islets in a semi-permeable environment which protects islets from the body's immune responses, while allowing the permeation of insulin, oxygen, nutrients, and waste. Many groups have developed various types of the bioartificial pancreas and tested their efficacy in animal models. However, the clinical application of the bioartificial pancreas still requires further investigation. In this review, we discuss several types of bioartificial pancreases and address their advantages and limitations. We also discuss recent advances in bioartificial pancreas applications with microfluidic or micropatterning technology.

  12. Gradient Models in Molecular Biophysics: Progress, Challenges, Opportunities.

    Science.gov (United States)

    Bardhan, Jaydeep P

    2013-12-01

    In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g. molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding features such as nonlocal dielectric response, and nonlinearities resulting from dielectric saturation. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost forty years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The paper concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics.
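
    The contrast between local and nonlocal response that this line of work builds on can be written compactly. This is the common textbook form, not notation taken from the paper; a delta-function kernel recovers the local model, while the kernel epsilon(r - r') carries the additional correlation length scale:

        % Local versus nonlocal dielectric constitutive relations.
        \begin{align}
          \mathbf{D}(\mathbf{r}) &= \varepsilon_0\,\varepsilon\,\mathbf{E}(\mathbf{r})
            && \text{(local response)} \\
          \mathbf{D}(\mathbf{r}) &= \varepsilon_0 \int
            \varepsilon(\mathbf{r}-\mathbf{r}')\,\mathbf{E}(\mathbf{r}')\,
            \mathrm{d}^3r'
            && \text{(nonlocal response)}
        \end{align}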

  13. Gradient Models in Molecular Biophysics: Progress, Challenges, Opportunities

    Science.gov (United States)

    Bardhan, Jaydeep P.

    2014-01-01

    In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g. molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding features such as nonlocal dielectric response, and nonlinearities resulting from dielectric saturation. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost forty years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The paper concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics. PMID:25505358

  14. Gradient models in molecular biophysics: progress, challenges, opportunities

    Science.gov (United States)

    Bardhan, Jaydeep P.

    2013-12-01

    In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g., molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding nonlocal dielectric response. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain, and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost 40 years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The review concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics.

  15. Cloud Computing: Opportunities and Challenges for Businesses

    Directory of Open Access Journals (Sweden)

    İbrahim Halil Seyrek

    2011-12-01

    Full Text Available Cloud computing represents a new approach for supplying and using information technology services. Considering its benefits for firms and the potential changes it may lead to, cloud computing is envisioned to be the most important innovation in information technology since the development of the internet. In this study, the development of cloud computing and related technologies is first explained and classified, with current application examples. The benefits of this new computing model for businesses are then elaborated, especially in terms of cost, flexibility and service quality. In spite of its benefits, cloud computing also poses some risks for firms, of which security is one of the most important, and there are challenges in its implementation. This study points out the risks that companies should be wary of and some legal challenges related to cloud computing. Lastly, approaches that companies may take towards cloud computing and the different strategies they may adopt are discussed, and some recommendations are made.

  16. Advances and challenges in computational plasma science

    International Nuclear Information System (INIS)

    Tang, W M; Chan, V S

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments.

  17. New challenges in computational collective intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ngoc Thanh; Katarzyniak, Radoslaw Piotr [Wroclaw Univ. of Technology (Poland). Inst. of Informatics; Janiak, Adam (eds.) [Wroclaw Univ. of Technology (Poland). Inst. of Computer Engineering, Control and Robotics

    2009-07-01

    The book consists of 29 chapters which have been selected and invited from the submissions to the 1st International Conference on Collective Intelligence - Semantic Web, Social Networks and Multiagent Systems (ICCCI 2009). All chapters in the book discuss various examples of applications of computational collective intelligence and related technologies to such fields as the semantic web, information systems ontologies, social networks, and agent and multiagent systems. The editors hope that the book can be useful for graduate and Ph.D. students in Computer Science, in particular participants in courses on Soft Computing, Multi-Agent Systems and Robotics. This book can also be useful for researchers working on the concept of computational collective intelligence in artificial populations. It is the hope of the editors that readers of this volume can find many inspiring ideas and use them to create new cases of intelligent collectives. Many such challenges are suggested by the particular approaches and models presented in individual chapters of this book. (orig.)

  18. Computational Psychiatry and the Challenge of Schizophrenia

    Science.gov (United States)

    Murray, John D.; Chekroud, Adam M.; Corlett, Philip R.; Yang, Genevieve; Wang, Xiao-Jing; Anticevic, Alan

    2017-01-01

    Abstract Schizophrenia research is plagued by enormous challenges in integrating and analyzing large datasets and difficulties developing formal theories related to the etiology, pathophysiology, and treatment of this disorder. Computational psychiatry provides a path to enhance analyses of these large and complex datasets and to promote the development and refinement of formal models for features of this disorder. This presentation introduces the reader to the notion of computational psychiatry and describes discovery-oriented and theory-driven applications to schizophrenia involving machine learning, reinforcement learning theory, and biophysically-informed neural circuit models. PMID:28338845

  19. Potentials and Challenges of Student Progress Portfolio Innovation ...

    African Journals Online (AJOL)

    This paper aims at stimulating discussion on Students Progress Portfolio (SPP) Innovation in assessment. It analyses the potential and challenges of SPP as well as how it can be harnessed to improve assessment practices and its contribution to quality education. The paper is based on a recent qualitative research which ...

  20. Progress and challenges in implementing the women, peace and ...

    African Journals Online (AJOL)

    This article provides an initial overview of the African Union's progress and challenges in implementing the Women, Peace and Security (WPS) agenda in its peace and security architecture. It reviews implementation in relation to representation, programming and in peacekeeping. The article contends that the WPS agenda ...

  1. Progress and Challenges in Implementing the Women, Peace and ...

    African Journals Online (AJOL)

    This article provides an initial overview of the African Union's progress and challenges in implementing the Women, Peace and Security (WPS) agenda, including peace initiatives to protect women and girls from gender-based violence (GBV). It notes that bilateral aid on gender equality to fragile states has quadrupled (UN Women).

  2. Swallowable Wireless Capsule Endoscopy: Progress and Technical Challenges

    Directory of Open Access Journals (Sweden)

    Guobing Pan

    2012-01-01

    Full Text Available Wireless capsule endoscopy (WCE) offers a feasible noninvasive way to examine the whole gastrointestinal (GI) tract and has revolutionized diagnostic technology. However, compared with wired endoscopies, the limited working time, low frame rate, and low image resolution limit its wider application. The progress of this new technology is reviewed in this paper, and the evolutionary trends are identified as higher image resolution, higher frame rate, and longer working time. Unfortunately, the power supply of the capsule endoscope (CE) is the bottleneck. Wireless power transmission (WPT) is the promising solution to this problem, but it is also a technical challenge. The active CE is another trend and will be the next generation of the WCE. Nevertheless, it will not come true soon unless a practical locomotion mechanism for the active CE in the GI tract is achieved. The locomotion mechanism is the other technical challenge, besides that of WPT. Progress in WPT and active capsule technology is reviewed.

  3. Computer Network Security- The Challenges of Securing a Computer Network

    Science.gov (United States)

    Scotti, Vincent, Jr.

    2011-01-01

    This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world so as to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.

  4. Mathematical challenges from theoretical/computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-31

    The committee believes that this report has relevance and potentially valuable suggestions for a wide range of readers. Target audiences include: graduate departments in the mathematical and chemical sciences; federal and private agencies that fund research in the mathematical and chemical sciences; selected industrial and government research and development laboratories; developers of software and hardware for computational chemistry; and selected individual researchers. Chapter 2 of this report covers some history of computational chemistry for the nonspecialist, while Chapter 3 illustrates the fruits of some past successful cross-fertilization between mathematical scientists and computational/theoretical chemists. In Chapter 4 the committee has assembled a representative, but not exhaustive, survey of research opportunities. Most of these are descriptions of important open problems in computational/theoretical chemistry that could gain much from the efforts of innovative mathematical scientists, written so as to be accessible introductions to the nonspecialist. Chapter 5 is an assessment, necessarily subjective, of cultural differences that must be overcome if collaborative work is to be encouraged between the mathematical and the chemical communities. Finally, the report ends with a brief list of conclusions and recommendations that, if followed, could promote accelerated progress at this interface. Recognizing that bothersome language issues can inhibit prospects for collaborative research at the interface between distinctive disciplines, the committee has attempted throughout to maintain an accessible style, in part by using illustrative boxes, and has included at the end of the report a glossary of technical terms that may be familiar to only a subset of the target audiences listed above.

  5. Achievements and Challenges in Computational Protein Design.

    Science.gov (United States)

    Samish, Ilan

    2017-01-01

    Computational protein design (CPD), a yet evolving field, includes computer-aided engineering for partial or full de novo designs of proteins of interest. Designs are defined by a requested structure, function, or working environment. This chapter describes the birth and maturation of the field by presenting 101 CPD examples in a chronological order emphasizing achievements and pending challenges. Integrating these aspects presents the plethora of CPD approaches with the hope of providing a "CPD 101". These reflect on the broader structural bioinformatics and computational biophysics field and include: (1) integration of knowledge-based and energy-based methods, (2) hierarchical designated approach towards local, regional, and global motifs and the integration of high- and low-resolution design schemes that fit each such region, (3) systematic differential approaches towards different protein regions, (4) identification of key hot-spot residues and the relative effect of remote regions, (5) assessment of shape-complementarity, electrostatics and solvation effects, (6) integration of thermal plasticity and functional dynamics, (7) negative design, (8) systematic integration of experimental approaches, (9) objective cross-assessment of methods, and (10) successful ranking of potential designs. Future challenges also include dissemination of CPD software to the general use of life-sciences researchers and the emphasis of success within an in vivo milieu. CPD increases our understanding of protein structure and function and the relationships between the two along with the application of such know-how for the benefit of mankind. Applied aspects range from biological drugs, via healthier and tastier food products to nanotechnology and environmentally friendly enzymes replacing toxic chemicals utilized in the industry.

  6. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    Science.gov (United States)

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists are facing the challenge of gaining a profound insight into the deepest biological functions from big biological data. This in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are urgently needed, as are efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study to compare the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
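
    As a concrete pointer to the case study mentioned above, the classical alignment problem it benchmarks is a small dynamic program. The sketch below is a minimal Needleman-Wunsch scorer in plain Python; the scoring constants are arbitrary illustrative choices, and real HPC implementations vectorize and parallelize this computation heavily:

        # Minimal Needleman-Wunsch global alignment score (illustrative sketch).
        def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
            n, m = len(a), len(b)
            # score[i][j] = best score aligning the prefixes a[:i] and b[:j]
            score = [[0] * (m + 1) for _ in range(n + 1)]
            for i in range(1, n + 1):
                score[i][0] = i * gap          # align a[:i] against all gaps
            for j in range(1, m + 1):
                score[0][j] = j * gap          # align b[:j] against all gaps
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    diag = score[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                    score[i][j] = max(diag, score[i-1][j] + gap, score[i][j-1] + gap)
            return score[n][m]

        print(needleman_wunsch("GATTACA", "GCATGCU"))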

  7. Computed tomographic findings in progressive supranuclear palsy

    Energy Technology Data Exchange (ETDEWEB)

    Saitoh, H; Yoshii, F; Shinohara, Y

    1987-03-01

    CT findings of 6 patients with progressive supranuclear palsy (PSP) are described, with emphasis on their supratentorial changes in comparison with those of control subjects and patients with Parkinson's disease (PD). As estimated from CT films, the lateral ventricles, third ventricle and prepontine cistern were significantly enlarged in PSP patients compared with both controls and PD patients. It is suggested that the patients with PSP have not only infratentorial but also supratentorial lesions.

  8. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. To this end, the Computing, Software and Analysis Challenge 2006 (CSA06) started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative execution of calibration and alignment, and data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting production and analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  9. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. To this end, the Computing, Software and Analysis Challenge 2006 (CSA06) started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative execution of calibration and alignment, and data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting production and analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  10. Progress in multidimensional neutron transport computation

    International Nuclear Information System (INIS)

    Lewis, E.E.

    1977-01-01

    The methods available for solution of the time-independent neutron transport problems arising in the analysis of nuclear systems are examined. The merits of deterministic and Monte Carlo methods are briefly compared. The capabilities of deterministic computational methods derived from the first-order form of the transport equation, from the second-order even-parity form of this equation, and from integral transport formulations are discussed in some detail. Emphasis is placed on the approaches for dealing with the related problems of computer memory requirements, computational cost, and achievable accuracy. Attention is directed to some areas where problems exist currently and where the need for further work appears to be particularly warranted
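
    For readers outside the field, the first-order equation referred to above can be written, in standard notation not quoted from the record, as

        \[ \mathbf{\Omega} \cdot \nabla \psi(\mathbf{r}, \mathbf{\Omega}, E) + \Sigma_t(\mathbf{r}, E)\, \psi(\mathbf{r}, \mathbf{\Omega}, E) = \int_0^\infty \! \int_{4\pi} \Sigma_s(\mathbf{r}, E' \to E, \mathbf{\Omega}' \cdot \mathbf{\Omega})\, \psi(\mathbf{r}, \mathbf{\Omega}', E')\, d\Omega'\, dE' + S(\mathbf{r}, \mathbf{\Omega}, E), \]

    where \psi is the angular neutron flux, \Sigma_t and \Sigma_s are the total and differential scattering cross sections, and S is the source term. The second-order even-parity form arises from splitting \psi into components that are even and odd in \mathbf{\Omega} and eliminating the odd part.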

  11. Progressing neurobiological strategies against proteostasis failure: Challenges in neurodegeneration.

    Science.gov (United States)

    Amanullah, Ayeman; Upadhyay, Arun; Joshi, Vibhuti; Mishra, Ribhav; Jana, Nihar Ranjan; Mishra, Amit

    2017-12-01

    Proteins are ordered, useful cellular entities, required for normal health and organism survival. The proteome is the complete set of cellularly expressed proteins, which regulates a wide range of physiological functions across all domains of life. In aging cells or under unfavorable cellular conditions, misfolding of proteins generates common pathological events linked with neurodegenerative diseases and aging. Current advances in proteome studies are systematically improving our knowledge of how the misfolding of proteins, or their accumulation, can contribute to the impairment or depletion of proteome functions. Still, the underlying causes of this unrecoverable loss are not clear, nor is it clear how such unresolved transitions give rise to multifactorial, challenging degenerative pathological conditions in neurodegeneration. In this review, we specifically focus on and systematically summarize various molecular mechanisms of proteostasis maintenance, and we discuss progressing neurobiological strategies and promising natural and pharmacological candidates that can be useful in counteracting the problem of proteopathies. Our article emphasizes the urgent need to grasp the fundamentals of proteostasis in order to design a new molecular framework and fruitful strategies for uncovering how proteome defects are associated with aging and neurodegenerative diseases. An enhanced understanding of the links between the proteome and neurobiological challenges may provide new basic concepts in the near future for pharmacological agents targeting impaired proteostasis and neurodegenerative diseases. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Nucleosynthesis in the early Galaxy: Progress and challenges.

    Science.gov (United States)

    Montes, Fernando

    2015-10-01

    Chemical imprints left by the first stars in the oldest stars of the Milky Way give clues to the stellar nucleosynthesis responsible for the creation of elements heavier than iron. Recent progress in astronomical observations and in the modeling of the chemical evolution of the Galaxy has shown that multiple nucleosynthesis processes may operate at those early times. In this talk I will review some of that evidence along with the important role that nuclear reactions play in those processes. I will focus on progress in our understanding of the rapid neutron capture process (r-process) and on new results on nucleosynthesis in core-collapse supernovae and neutrino-driven winds that produce elements up to silver. I will show some examples of recent nuclear physics measurements addressing the need for better nuclear data and give an outlook on the remaining challenges and future plans to continue those measurements.

  13. Emerging nanomedicine applications and manufacturing: progress and challenges.

    Science.gov (United States)

    Sartain, Felicity; Greco, Francesca; Hill, Kathryn; Rannard, Steve; Owen, Andrew

    2016-03-01

    APS 6th International PharmSci Conference 2015, 7-9 September 2015, East Midlands Conference Centre, University of Nottingham, Nottingham, UK. As part of the 6th APS International PharmSci Conference, a nanomedicine session was organised to address challenges and share experiences in this field. Topics ranged from reports on the latest results and advances in the development of targeted therapeutics to the needs the community faces in progressing these exciting proof-of-concept results into products. Here we provide an overview of the discussion and highlight some of the initiatives that have recently been established to support the translation of nanomedicines into the clinic.

  14. Progress and challenges in bioinformatics approaches for enhancer identification

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2017-02-03

    Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration.

  15. Progress and challenges in bioinformatics approaches for enhancer identification

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Kalnis, Panos; Bajic, Vladimir B.

    2017-01-01

    Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration.
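
    To give a flavor of the computational solutions such reviews cover, the sketch below implements a deliberately simple nearest-centroid classifier over k-mer frequency profiles, a common family of sequence features in enhancer prediction. Everything here (sequences, labels, and parameter choices) is a synthetic illustration, not a method taken from the review:

        # Toy enhancer-vs-background classifier from 3-mer frequencies (illustrative only).
        from collections import Counter
        from itertools import product

        KMERS = ["".join(p) for p in product("ACGT", repeat=3)]

        def kmer_profile(seq, k=3):
            counts = Counter(seq[i:i+k] for i in range(len(seq) - k + 1))
            total = max(sum(counts.values()), 1)
            return [counts[km] / total for km in KMERS]   # normalized 64-dim vector

        def centroid(profiles):
            return [sum(col) / len(profiles) for col in zip(*profiles)]

        def classify(seq, pos_centroid, neg_centroid):
            p = kmer_profile(seq)
            dist = lambda c: sum((x - y) ** 2 for x, y in zip(p, c))
            return "enhancer-like" if dist(pos_centroid) < dist(neg_centroid) else "background"

        # Synthetic training examples, purely for demonstration:
        pos = [kmer_profile(s) for s in ["CCGGCCGGTTAACCGG", "GGCCGGCCAATTGGCC"]]
        neg = [kmer_profile(s) for s in ["ATATATATATATATAT", "TTTTAAAATTTTAAAA"]]
        print(classify("CCGGTTAACCGGCCGG", centroid(pos), centroid(neg)))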

  16. Impedimetric biosensors for medical applications current progress and challenges

    CERN Document Server

    Rushworth, Jo V; Goode, Jack A; Pike, Douglas J; Ahmed, Asif; Millner, Paul

    2014-01-01

    In this monograph, the authors discuss the current progress in the medical application of impedimetric biosensors, along with the key challenges in the field. First, a general overview of biosensor development, structure and function is presented, followed by a detailed discussion of impedimetric biosensors and the principles of electrochemical impedance spectroscopy. Next, the current state-of-the art in terms of the science and technology underpinning impedance-based biosensors is reviewed in detail. The layer-by-layer construction of impedimetric sensors is described, including the design of electrodes, their nano-modification, transducer surface functionalization and the attachment of different bioreceptors. The current challenges of translating lab-based biosensor platforms into commercially-available devices that function with real patient samples at the POC are presented; this includes a consideration of systems integration, microfluidics and biosensor regeneration. The final section of this monograph ...
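
    As background for the spectroscopy being discussed, the impedance read out in such sensors is commonly interpreted through a simplified Randles equivalent circuit (a standard textbook model, not quoted from the monograph): a solution resistance R_s in series with a charge-transfer resistance R_{ct} in parallel with the double-layer capacitance C_{dl},

        \[ Z(\omega) = R_s + \frac{R_{ct}}{1 + j \omega R_{ct} C_{dl}}, \]

    where binding of the target analyte to the bioreceptor layer is typically detected as a change in R_{ct}.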

  17. Merging Library and Computing Services at Kenyon College: A Progress Report.

    Science.gov (United States)

    Oden, Robert A., Jr.; Temple, Daniel B.; Cottrell, Janet R.; Griggs, Ronald K.; Turney, Glen W.; Wojcik, Frank M.

    2001-01-01

    Describes the evolution and progress toward a uniquely integrated library and computer services unit at Kenyon College. Discusses its focus on constituencies; merging of the divisions; benefits for students, faculty, administrative units, and the institution; meeting challenges; and generalizing from the model. (EV)

  18. Computing Challenges in Coded Mask Imaging

    Science.gov (United States)

    Skinner, Gerald

    2009-01-01

    This slide presentation reviews the complications and challenges in developing computer systems for coded mask imaging telescopes. The coded mask technique is used when there is no other way to build the telescope (i.e., when wide fields of view are required, energies are too high for focusing optics or too low for Compton/tracker techniques, and very good angular resolution is needed). The coded mask telescope is described, and the mask is reviewed. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, and a chart showing the types of position-sensitive detectors used for coded mask telescopes is also reviewed. Slides describe the mechanism of recovering an image from the masked pattern. The correlation with the mask pattern is described. The matrix approach is reviewed, and other approaches to image reconstruction are described. Included in the presentation is a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, a comparison of the EXIST/HET with the SWIFT/BAT, and details of the design of the EXIST/HET.
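
    The image-recovery step described above can be demonstrated with a one-dimensional toy model (synthetic data and a random mask; real instruments such as INTEGRAL use carefully optimized two-dimensional patterns): the detector records the sky correlated with the mask, and correlating that record with a +1/-1 decoding array returns a peak at the source position.

        # 1-D toy of coded-mask image recovery by correlation (illustrative only).
        import numpy as np

        rng = np.random.default_rng(0)
        N = 64
        mask = rng.integers(0, 2, N)                # 1 = open mask element, 0 = opaque
        sky = np.zeros(N)
        sky[20] = 100.0                             # a single synthetic point source
        # Each shift of the mask casts a different shadow of the sky on the detector
        detector = np.array([np.roll(mask, -s) @ sky for s in range(N)])
        # Decoding array: +1 for open elements, -1 for closed ones
        G = 2 * mask - 1
        # Correlating the detector record with G reconstructs the sky
        recovered = np.array([np.roll(G, -t) @ detector for t in range(N)])
        print(int(np.argmax(recovered)))            # the peak recovers the source position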

  19. Progress in Computational Physics (PiCP) Volume 1 Wave Propagation in Periodic Media

    CERN Document Server

    Ehrhardt, Matthias

    2010-01-01

    Progress in Computational Physics is a new e-book series devoted to recent research trends in computational physics. It contains chapters contributed by outstanding experts of modeling of physical problems. The series focuses on interdisciplinary computational perspectives of current physical challenges, new numerical techniques for the solution of mathematical wave equations and describes certain real-world applications. With the help of powerful computers and sophisticated methods of numerical mathematics it is possible to simulate many ultramodern devices, e.g. photonic crystals structures,

  20. Health impact assessment in China: Emergence, progress and challenges

    Energy Technology Data Exchange (ETDEWEB)

    Huang Zheng, E-mail: huangzhg@mails.tjmu.edu.cn

    2012-01-15

    The values, concepts and approaches of health impact assessment (HIA) were outlined in the Gothenburg consensus paper, and some industrialized countries have implemented HIA for many years. HIA has played an important role in environmental protection in China; however, the emergence, progress and challenges of HIA in China have not been well described. In this paper, the evolution of HIA in China is analyzed and the challenges of HIA are presented based on the author's experience. HIA contributed to decision-making for large capital construction projects, such as the Three Gorges Dam project, in its emergence stage. Increasing attention has been given to HIA in recent years due to supportive policies underpinning development of the draft HIA guidelines in 2008. However, enormous challenges lie ahead in ensuring the institutionalization of HIA into the project, program and policy decision-making process, due to the limited scope, immature tools and insufficient professionals in HIA practice. HIA should broaden its horizons by encompassing physical, chemical, biological and socio-economic aspects, and constant attempts should be made to integrate HIA into the decision-making process, not only for projects and programs but also for policies.

  1. "Defining Computer 'Speed': An Unsolved Challenge"

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Abstract: The reason we use computers is their speed, and the reason we use parallel computers is that they're faster than single-processor computers. Yet, after 70 years of electronic digital computing, we still do not have a solid definition of what computer 'speed' means, or even what it means to be 'faster'. Unlike measures in physics, where the definition of speed is rigorous and unequivocal, in computing there is no definition of speed that is universally accepted. As a result, computer customers have made purchases misguided by dubious information, computer designers have optimized their designs for the wrong goals, and computer programmers have chosen methods that optimize the wrong things. This talk describes why some of the obvious and historical ways of defining 'speed' haven't served us well, and the things we've learned in the struggle to find a definition that works. Biography: Dr. John Gustafson is a Director ...
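
    As background for why 'faster' resists definition, consider the two textbook measures of parallel speedup (standard formulas, not taken from the talk; the second is in fact named after the speaker):

        \[ S_{\text{Amdahl}}(N) = \frac{1}{(1-p) + p/N}, \qquad S_{\text{Gustafson}}(N) = (1-p) + pN, \]

    where N is the number of processors and p the parallelizable fraction of the work. The two assume different things are held fixed (problem size versus run time), so the same machine can appear slow under one measure and nearly ideal under the other, which illustrates the talk's point that 'speed' has no single agreed definition.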

  2. CLOUD COMPUTING OVERVIEW AND CHALLENGES: A REVIEW PAPER

    OpenAIRE

    Satish Kumar*, Vishal Thakur, Payal Thakur, Ashok Kumar Kashyap

    2017-01-01

    The cloud computing era is the most resourceful, elastic and scalable period for internet technology, allowing computing resources to be used over the internet successfully. Cloud computing provides not only speed, accuracy, storage capacity and efficiency for computing, but also promotes green computing and better resource utilization. In this research paper, a brief description of cloud computing, cloud services and cloud security challenges is given. Also the literature review o...

  3. Progress and challenges of carbon nanotube membrane in water treatment

    KAUST Repository

    Lee, Jieun

    2016-05-25

    The potential of the carbon nanotube (CNT) membrane has been greatly strengthened in water treatment during the last decade. According to work published to date, the unique and excellent characteristics of CNTs outperform conventional polymer membranes. Such achievements of CNT membranes are greatly dependent on their fabrication methods. Further, the intrinsic properties of CNTs could be a critical factor in their applicability to membrane processes. This article provides an explicit and systematic review of the progress of CNT membranes, addressing two pressing questions: (i) whether CNT membranes can tackle the current challenges in pressure- or thermally driven membrane processes, and (ii) whether CNT hybrid nanocomposites, as a new generation of materials, can complement current CNT-enhanced membranes. © 2016 Taylor & Francis Group, LLC.

  4. Psychotherapy for Borderline Personality Disorder: Progress and Remaining Challenges.

    Science.gov (United States)

    Links, Paul S; Shah, Ravi; Eynan, Rahel

    2017-03-01

    The main purpose of this review was to critically evaluate the literature on psychotherapies for borderline personality disorder (BPD) published over the past 5 years to identify the progress with remaining challenges and to determine priority areas for future research. A systematic review of the literature over the last 5 years was undertaken. The review yielded 184 relevant abstracts, and after applying inclusion criteria, 16 articles were fully reviewed based on the articles' implications for future research and/or clinical practice. Our review indicated that patients with various severities benefited from psychotherapy; more intensive therapies were not significantly superior to less intensive therapies; enhancing emotion regulation processes and fostering more coherent self-identity were important mechanisms of change; therapies had been extended to patients with BPD and posttraumatic stress disorder; and more research was needed to be directed at functional outcomes.

  5. Progress and challenges to the global waste management system.

    Science.gov (United States)

    Singh, Jagdeep; Laurenti, Rafael; Sinha, Rajib; Frostell, Björn

    2014-09-01

    Rapid economic growth, urbanization and increasing population have caused (materially intensive) resource consumption to increase, and consequently the release of large amounts of waste to the environment. From a global perspective, current waste and resource management lacks a holistic approach covering the whole chain of product design, raw material extraction, production, consumption, recycling and waste management. In this article, progress and different sustainability challenges facing the global waste management system are presented and discussed. The study leads to the conclusion that the current, rather isolated efforts, in different systems for waste management, waste reduction and resource management are indeed not sufficient in a long term sustainability perspective. In the future, to manage resources and wastes sustainably, waste management requires a more systems-oriented approach that addresses the root causes for the problems. A specific issue to address is the development of improved feedback information (statistics) on how waste generation is linked to consumption. © The Author(s) 2014.

  6. The FCC-ee study: Progress and challenges

    CERN Document Server

    Koratzinos, Michael; Bogomyagkov, Anton; Boscolo, Manuela; Cook, Charlie; Doblhammer, Andreas; Härer, Bastian; Tomás, Rogelio; Levichev, Evgeny; Medina Medrano, Luis; Shatilov, Dmitry; Wienands, Ulrich; Zimmermann, Frank

    The FCC (Future Circular Collider) study represents a vision for the next large project in high energy physics, comprising an 80-100 km tunnel that can house a future 100 TeV hadron collider. The study also includes a high luminosity e+e- collider operating in the centre-of-mass energy range of 90-350 GeV as a possible intermediate step, the FCC-ee. The FCC-ee aims at definitive electro-weak precision measurements of the Z, W, H and top particles, and searches for rare phenomena. Although FCC-ee is based on known technology, the goal performance in luminosity and energy calibration makes it quite challenging. During 2014 the study went through an exploration phase. The study has now entered its second year, and the aim is to produce a conceptual design report during the next three to four years. Here we report on progress since the last IPAC conference.
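
    For orientation, the luminosity that sets the machine's difficulty is governed by the standard expression for head-on collisions of Gaussian beams (textbook background, not a formula quoted from this paper):

        \[ \mathcal{L} = \frac{N_1 N_2 f n_b}{4 \pi \sigma_x \sigma_y}, \]

    where N_1 and N_2 are the particles per bunch, f the revolution frequency, n_b the number of bunches, and \sigma_x, \sigma_y the transverse beam sizes at the interaction point. Reaching the quoted goals means pushing these parameters simultaneously while preserving the beam conditions needed for precise energy calibration.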

  7. Advanced teaching labs in physics - celebrating progress; challenges ahead

    Science.gov (United States)

    Peterson, Richard

    A few examples of optical physics experiments may help us first reflect on how much more effectively advanced lab initiatives can now be developed, discussed, and disseminated than only 10 or 15 years ago. Many cooperative developments of the last decade are having profound impacts on advanced lab workers and students. Central to these changes are the programs of the Advanced Laboratory Physics Association (ALPhA) (Immersions, BFY conferences), AAPT (advlab-l server, ComPADRE, apparatus competitions, summer workshops/sessions), APS (Reichert Award, FEd activities and sessions), and the Jonathan F. Reichert Foundation (ALPhA support and institution-matched equipment grants for Immersion participants). Broad NSF support has helped undergird several of these initiatives. Two of the most significant challenges before this new advanced lab community are (a) to somehow enhance funding opportunities for teaching equipment and apparatus in an era of minimal NSF equipment support, and (b) to help develop a more complementary relationship between research-based advanced lab pedagogies and the development of fresh physics experiments that help enable the mentoring and experimental challenge of our students.

  8. Inclusive Education in Georgia: Current Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Nikoloz Kavelashvili

    2017-05-01

    Full Text Available Purpose and Originality: The paper provides a realistic picture of how the implementation of inclusive education in Georgia is developing, the problems that are encountered, and what needs must be fulfilled to stimulate the process. Today's challenge in the country is to make inclusive practices available to everybody, everywhere and all the time. This article discusses the status of the efforts being made to meet this challenge. In the course of that discussion, some comprehensive changes will be described that systemic efforts of school improvement must achieve to continue making progress towards fully inclusive learning. Method: The study was conducted in Georgia. A qualitative research design was employed along with closed-ended and open-ended questionnaires, which allowed participants to express their points of view, skills and knowledge. Two data collection methods were applied: semi-structured interviews and observation of respondents. Results: The study uncovers the challenges that obstruct the implementation process: indifferent attitudes of teachers and parents towards inclusion, an absence of self-awareness of the issue amongst educators, slight involvement of parents, and the need for infrastructural development. Society: The results should raise the awareness of the population of Georgia as well as increase understanding of the problem. Limitations / further research: There were enough informants at the school level (special teachers, principals); however, there are still many other possible respondents who could add something valuable to a better understanding of the process of inclusion at schools. The theoretical approach employed in the study and the empirical research could be validated.

  9. Cloud Computing Security Issues and Challenges

    OpenAIRE

    Kuyoro S. O.; Ibikunle F; Awodele O

    2011-01-01

    Cloud computing is a set of IT services that are provided to a customer over a network on a leased basis and with the ability to scale service requirements up or down. Usually cloud computing services are delivered by a third-party provider who owns the infrastructure. Its advantages, to mention but a few, include scalability, resilience, flexibility, efficiency and the outsourcing of non-core activities. Cloud computing offers an innovative business model for organizations to adopt IT services w...

  10. Cloud Computing Security Issues - Challenges and Opportunities

    OpenAIRE

    Vaikunth, Pai T.; Aithal, P. S.

    2017-01-01

    Cloud computing services, enabled through information communication technology and delivered to a customer as services over the Internet on a leased basis, have the capability to scale service requirements or needs up or down. In this model, the infrastructure is owned by a third-party vendor and the cloud computing services are delivered to the requesting customers. The cloud computing model has many advantages including scalability, flexibility, elasticity, efficiency, and supports outsourcing ...

  11. Homogeneous Buchberger algorithms and Sullivant's computational commutative algebra challenge

    DEFF Research Database (Denmark)

    Lauritzen, Niels

    2005-01-01

    We give a variant of the homogeneous Buchberger algorithm for positively graded lattice ideals. Using this algorithm we solve the Sullivant computational commutative algebra challenge.
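
    For readers who want to see the kind of object involved, the sketch below computes a Gröbner basis of a small binomial (lattice-type) ideal with SymPy's general-purpose Buchberger routine. This is purely illustrative background: the record's contribution is a specialised homogeneous variant for positively graded lattice ideals, which SymPy does not implement.

        # Groebner basis of a small binomial ideal, for illustration only.
        from sympy import groebner, symbols

        x, y, z, w = symbols("x y z w")
        # Binomial generators of the toric (lattice) ideal of the twisted cubic curve
        I = [x*z - y**2, y*w - z**2, x*w - y*z]
        # Generic Buchberger-style computation under a degree-compatible order
        G = groebner(I, x, y, z, w, order="grevlex")
        print(G)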

  12. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Full Text Available Despite the evolution and enhancements that mobile devices have experienced, they are still considered limited computing devices. Today, users become more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles limitations of Smart Mobile Devices (SMDs) such as limited battery lifetime, limited processing capabilities, and limited storage capacity by offloading the execution and workload to other, richer systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores different important parameters based on which the frameworks are implemented, such as the offloading method and the level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.
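
    The offloading decision that such frameworks automate can be reduced, in its simplest latency-only form, to a one-line comparison. The sketch below is a toy model under stated assumptions; all parameter names and numbers are illustrative, not taken from the paper:

        # Toy offloading rule: ship work to the cloud only if upload + remote run
        # time beats local execution (energy, pricing, etc. are ignored here).
        def should_offload(cycles, data_bits, local_speed, cloud_speed, bandwidth):
            t_local = cycles / local_speed                          # seconds on the device
            t_remote = data_bits / bandwidth + cycles / cloud_speed # upload + cloud run
            return t_remote < t_local

        # Example: 5e9 CPU cycles, 2 MB of state, 1 GHz phone, 10 GHz cloud, 5 Mbit/s link
        print(should_offload(5e9, 2 * 8e6, 1e9, 10e9, 5e6))         # True: offloading wins here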

  13. PROGRESS & CHALLENGES IN CLEANUP OF HANFORDS TANK WASTES

    Energy Technology Data Exchange (ETDEWEB)

    HEWITT, W.M.; SCHEPENS, R.

    2006-01-23

    The River Protection Project (RPP), which is managed by the Department of Energy (DOE) Office of River Protection (ORP), is highly complex from technical, regulatory, legal, political, and logistical perspectives and is the largest ongoing environmental cleanup project in the world. Over the past three years, ORP has made significant advances in its planning and execution of the cleanup of the Hanford tank wastes. The 149 single-shell tanks (SSTs), 28 double-shell tanks (DSTs), and 60 miscellaneous underground storage tanks (MUSTs) at Hanford contain approximately 200,000 m³ (53 million gallons) of mixed radioactive wastes, some of which dates back to the first days of the Manhattan Project. The plan for treating and disposing of the waste stored in large underground tanks is to: (1) retrieve the waste, (2) treat the waste to separate it into high-level (sludge) and low-activity (supernatant) fractions, (3) remove key radionuclides (e.g., Cs-137, Sr-90, actinides) from the low-activity fraction to the maximum extent technically and economically practical, (4) immobilize both the high-level and low-activity waste fractions by vitrification, (5) interim store the high-level waste fraction for ultimate disposal off-site at the federal HLW repository, (6) dispose the low-activity fraction on-site in the Integrated Disposal Facility (IDF), and (7) close the waste management areas consisting of tanks, ancillary equipment, soils, and facilities. Design and construction of the Waste Treatment and Immobilization Plant (WTP), the cornerstone of the RPP, has progressed substantially despite challenges arising from new seismic information for the WTP site. We have looked closely at the waste and aligned our treatment and disposal approaches with the waste characteristics. For example, approximately 11,000 m³ (2-3 million gallons) of metal sludges in twenty tanks were not created during spent nuclear fuel reprocessing and have low fission product concentrations.

  14. PROGRESS and CHALLENGES IN CLEANUP OF HANFORDS TANK WASTES

    International Nuclear Information System (INIS)

    HEWITT, W.M.; SCHEPENS, R.

    2006-01-01

    The River Protection Project (RPP), which is managed by the Department of Energy (DOE) Office of River Protection (ORP), is highly complex from technical, regulatory, legal, political, and logistical perspectives and is the largest ongoing environmental cleanup project in the world. Over the past three years, ORP has made significant advances in its planning and execution of the cleanup of the Hanford tank wastes. The 149 single-shell tanks (SSTs), 28 double-shell tanks (DSTs), and 60 miscellaneous underground storage tanks (MUSTs) at Hanford contain approximately 200,000 m³ (53 million gallons) of mixed radioactive wastes, some of which dates back to the first days of the Manhattan Project. The plan for treating and disposing of the waste stored in large underground tanks is to: (1) retrieve the waste, (2) treat the waste to separate it into high-level (sludge) and low-activity (supernatant) fractions, (3) remove key radionuclides (e.g., Cs-137, Sr-90, actinides) from the low-activity fraction to the maximum extent technically and economically practical, (4) immobilize both the high-level and low-activity waste fractions by vitrification, (5) interim store the high-level waste fraction for ultimate disposal off-site at the federal HLW repository, (6) dispose the low-activity fraction on-site in the Integrated Disposal Facility (IDF), and (7) close the waste management areas consisting of tanks, ancillary equipment, soils, and facilities. Design and construction of the Waste Treatment and Immobilization Plant (WTP), the cornerstone of the RPP, has progressed substantially despite challenges arising from new seismic information for the WTP site. We have looked closely at the waste and aligned our treatment and disposal approaches with the waste characteristics. For example, approximately 11,000 m³ (2-3 million gallons) of metal sludges in twenty tanks were not created during spent nuclear fuel reprocessing and have low fission product concentrations.

  15. Challenges in computational statistics and data mining

    CERN Document Server

    Mielniczuk, Jan

    2016-01-01

    This volume contains nineteen research papers belonging to the areas of computational statistics, data mining, and their applications. Those papers, all written specifically for this volume, are their authors’ contributions to honour and celebrate Professor Jacek Koronacki on the occasion of his 70th birthday. The book’s related and often interconnected topics represent Jacek Koronacki’s research interests and their evolution. They also clearly indicate how close the areas of computational statistics and data mining are.

  16. Investigation of Cloud Computing: Applications and Challenges

    OpenAIRE

    Amid Khatibi Bardsiri; Anis Vosoogh; Fatemeh Ahoojoosh

    2014-01-01

    Cloud computing is a model for saving data or knowledge on distant servers accessed through the Internet. It can save memory space and reduce the cost of extending memory capacity on users’ own machines, among other benefits. Therefore, Cloud Computing has several benefits for individuals as well as organizations. It provides protection for personal and organizational data. Further, with the help of cloud services, a business owner, organization manager or service provider will be able to make privacy an...

  17. Computational science: Emerging opportunities and challenges

    International Nuclear Information System (INIS)

    Hendrickson, Bruce

    2009-01-01

    In the past two decades, computational methods have emerged as an essential component of the scientific and engineering enterprise. A diverse assortment of scientific applications has been simulated and explored via advanced computational techniques. Computer vendors have built enormous parallel machines to support these activities, and the research community has developed new algorithms and codes, and agreed on standards to facilitate ever more ambitious computations. However, this track record of success will be increasingly hard to sustain in coming years. Power limitations constrain processor clock speeds, so further performance improvements will need to come from ever more parallelism. This higher degree of parallelism will require new thinking about algorithms, programming models, and architectural resilience. Simultaneously, cutting edge science increasingly requires more complex simulations with unstructured and adaptive grids, and multi-scale and multi-physics phenomena. These new codes will push existing parallelization strategies to their limits and beyond. Emerging data-rich scientific applications are also in need of high performance computing, but their complex spatial and temporal data access patterns do not perform well on existing machines. These interacting forces will reshape high performance computing in the coming years.

  18. Editorial: Modelling and computational challenges in granular materials

    NARCIS (Netherlands)

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss these modelling and computational challenges.

  19. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    Science.gov (United States)

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  20. Female challenges in acquiring computer education at the federal ...

    African Journals Online (AJOL)

    Computer education and application of Computer skills in the knowledge-based society is ever increasing. It is in recognition of this that this study determined the challenges of female students in acquisition of Computer education using the Federal Polytechnic, Idah as a case study. The data were obtained from 72 female ...

  1. Nuclear challenges and progress in designing stellarator fusion power plants

    International Nuclear Information System (INIS)

    El-Guebaly, L.A.; Wilson, P.; Henderson, D.; Sawan, M.; Sviatoslavsky, G.; Tautges, T.; Slaybaugh, R.; Kiedrowski, B.; Ibrahim, A.

    2008-01-01

    Over the past 5-6 decades, stellarator power plants have been studied in the US, Europe, and Japan as an alternate to the mainline magnetic fusion tokamaks, offering steady-state operation and eliminating the risk of plasma disruptions. The earlier 1980s studies suggested large-scale stellarator power plants with an average major radius exceeding 20 m. The most recent development of the compact stellarator concept delivered ARIES-CS - a compact stellarator with 7.75 m average major radius, approaching that of tokamaks. For stellarators, the most important engineering parameter that determines the machine size and cost is the minimum distance between the plasma boundary and mid-coil. Accommodating the breeding blanket and necessary shield within this distance to protect the ARIES-CS superconducting magnet represents a challenging task. Selecting the ARIES-CS nuclear and engineering parameters to produce an economic optimum, modeling the complex geometry for 3D nuclear analysis to confirm the key parameters, and minimizing the radwaste stream received considerable attention during the design process. These engineering design elements combined with advanced physics helped enable the compact stellarator to be a viable concept. This paper provides a brief historical overview of the progress in designing stellarator power plants and a perspective on the successful integration of the nuclear activity into the final ARIES-CS configuration

  2. Microbial production of nattokinase: current progress, challenge and prospect.

    Science.gov (United States)

    Cai, Dongbo; Zhu, Chengjun; Chen, Shouwen

    2017-05-01

    Nattokinase (EC 3.4.21.62) is a profibrinolytic serine protease with potent fibrin-degrading activity, and it has been produced in many host strains. Compared to other fibrinolytic enzymes (urokinase, t-PA and streptokinase), nattokinase shows the advantages of having no side effects, low cost and a long lifetime, and it has the potential to be used as a drug for treating cardiovascular disease and to serve as a functional food additive. In this review, we focus on the screening of producing strains, genetic engineering and fermentation process optimization for microbial nattokinase production; the extraction and purification of nattokinase are also discussed. The selection of an optimal nattokinase-producing strain is the crucial starting element for improvement of nattokinase production. Genetic engineering, protein engineering, fermentation optimization and process control have proved to be effective strategies for enhancing nattokinase production. Also, the extraction and purification of nattokinase are critical for its quality evaluation. Finally, the prospects for microbial nattokinase production are discussed in view of the recent progress, challenges, and trends in this field.

  3. Progress and Challenges in Assessing NOAA Data Management

    Science.gov (United States)

    de la Beaujardiere, J.

    2016-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) produces large volumes of environmental data from a great variety of observing systems including satellites, radars, aircraft, ships, buoys, and other platforms. These data are irreplaceable assets that must be properly managed to ensure they are discoverable, accessible, usable, and preserved. A policy framework has been established which informs data producers of their responsibilities and which supports White House-level mandates such as the Executive Order on Open Data and the OSTP Memorandum on Increasing Access to the Results of Federally Funded Scientific Research. However, assessing the current state and progress toward completion for the many NOAA datasets is a challenge. This presentation will discuss work toward establishing assessment methodologies and dashboard-style displays. Ideally, metrics would be gathered through software and be automatically updated whenever an individual improvement was made. In practice, however, some level of manual information collection is required. Differing approaches to dataset granularity in different branches of NOAA yield additional complexity.

  4. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    -based solutions to significant problems? The important issues of workflow and data flow are discussed together with fit-for-purpose model development. As well, the lack of tools around multiscale modelling provides opportunities for the development of efficient tools to address such challenges. The ability...

  5. Computing challenges in HEP for the WLCG grid

    CERN Document Server

    Muralidharan, Servesh

    2017-01-01

    As CERN prepares to increase the luminosity of the particle beam towards the HL-LHC, predictions show that computing demand would outgrow our conservative scaling estimates by over ten times. Fortunately, we are talking about a time scale of roughly ten years in which to develop new techniques and novel solutions to address this gap in compute resources. Experiments at CERN face a unique scenario wherein they need to scale both latency-sensitive workloads, such as data acquisition from the detectors, and throughput-based ones, such as simulation and reconstruction of high-level events and physics processes. In this talk we cover some of the ongoing research at Tier-0 at CERN which investigates several aspects of throughput-sensitive workloads that consume significant compute cycles.

  6. Biomedical Visual Computing: Case Studies and Challenges

    KAUST Repository

    Johnson, Christopher

    2012-01-01

    Advances in computational geometric modeling, imaging, and simulation let researchers build and test models of increasing complexity, generating unprecedented amounts of data. As recent research in biomedical applications illustrates, visualization will be critical in making this vast amount of data usable; it's also fundamental to understanding models of complex phenomena. © 2012 IEEE.

  8. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    International Nuclear Information System (INIS)

    Tang, William M.

    2011-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) - a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  10. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    Tang, W M

    2008-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics

  11. Neonatal tetanus elimination in Pakistan: progress and challenges.

    Science.gov (United States)

    Lambo, Jonathan A; Nagulesapillai, Tharsiya

    2012-12-01

    Pakistan is one of the 34 countries that have not achieved the neonatal tetanus (NT) global elimination target set by the World Health Organization (WHO). NT, caused by Clostridium tetani, is a highly fatal infection of the neonatal period. It is one of the most underreported diseases and remains a major but preventable cause of neonatal and infant mortality in many developing countries. In 1989, the World Health Assembly called for the elimination of NT by 1995, and since then considerable progress has been made using the following strategies: clean delivery practices, routine tetanus toxoid (TT) immunization of pregnant women, and immunization of all women of childbearing age with three doses of TT vaccine in high-risk areas during supplementary immunization campaigns. This review presents the activities, progress, and challenges in achieving NT elimination in Pakistan. A review of the literature found TT vaccination coverage in Pakistan ranged from 60% to 74% over the last decade. Low vaccination coverage, the main driver for NT in Pakistan, is due to many factors, including demand failure for TT vaccine resulting from inadequate knowledge of TT vaccine among reproductive age females and inadequate information about the benefits of TT provided by health care workers and the media. Other factors linked to low vaccination coverage include residing in rural areas, lack of formal education, poor knowledge about place and time to get vaccinated, and lack of awareness about the importance of vaccination. A disparity exists in TT vaccination coverage and antenatal care between urban and rural areas due to access and utilization of health care services. NT reporting is incomplete, as cases from the private sector and rural areas are underreported. To successfully eliminate NT, women of reproductive age must be made aware of the benefits of TT vaccine, not only to themselves, but also to their families. Effective communication strategies for TT vaccine delivery and

  12. Challenges & Roadmap for Beyond CMOS Computing Simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Arun F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Frank, Michael P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    Simulating HPC systems is a difficult task and the emergence of “Beyond CMOS” architectures and execution models will increase that difficulty. This document presents a “tutorial” on some of the simulation challenges faced by conventional and non-conventional architectures (Section 1) and goals and requirements for simulating Beyond CMOS systems (Section 2). These provide background for proposed short- and long-term roadmaps for simulation efforts at Sandia (Sections 3 and 4). Additionally, a brief explanation of a proof-of-concept integration of a Beyond CMOS architectural simulator is presented (Section 2.3).

  13. Cloud computing challenges, limitations and R&D solutions

    CERN Document Server

    Mahmood, Zaigham

    2014-01-01

    This important text/reference reviews the challenging issues that present barriers to greater implementation of the cloud computing paradigm, together with the latest research into developing potential solutions. Exploring the strengths and vulnerabilities of cloud provision and cloud environments, Cloud Computing: Challenges, Limitations and R&D Solutions provides case studies from a diverse selection of researchers and practitioners of international repute. The implications of emerging cloud technologies are also analyzed from the perspective of consumers. Topics and features: presents

  14. Computer graphics visions and challenges: a European perspective.

    Science.gov (United States)

    Encarnação, José L

    2006-01-01

    I have briefly described important visions and challenges in computer graphics. They are a personal and therefore subjective selection. But most of these issues have to be addressed and solved--no matter if we call them visions or challenges or something else--if we want to make and further develop computer graphics into a key enabling technology for our IT-based society.

  15. Multimodal Challenge: Analytics Beyond User-computer Interaction Data

    NARCIS (Netherlands)

    Di Mitri, Daniele; Schneider, Jan; Specht, Marcus; Drachsler, Hendrik

    2018-01-01

    This contribution describes one the challenges explored in the Fourth LAK Hackathon. This challenge aims at shifting the focus from learning situations which can be easily traced through user-computer interactions data and concentrate more on user-world interactions events, typical of co-located and

  16. Computational brain connectivity mapping: A core health and scientific challenge.

    Science.gov (United States)

    Deriche, Rachid

    2016-10-01

    One third of the burden of all diseases in Europe is due to diseases affecting the brain. Although exceptional progress has been made in exploring the brain during the past decades, it is still terra incognita and calls for specific research efforts to better understand its architecture and functioning. To take up this great challenge of modern science and to overcome the limited view of the brain provided by any single imaging modality, this article advocates the idea, developed in my research group, of a global approach involving a new generation of models for brain connectivity mapping and strong interactions between structural and functional connectivities. Capitalizing on the strengths of integrated and complementary non-invasive imaging modalities such as diffusion Magnetic Resonance Imaging (dMRI) and Electro- & Magneto-Encephalography (EEG & MEG) will contribute to achieving new frontiers for identifying and characterizing structural and functional brain connectivities and to providing a detailed mapping of brain connectivity, both in space and time, thus leading to added clinical value for high-impact diseases, with new perspectives in computational neuroimaging and cognitive neuroscience. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Addressing Cloud Computing in Enterprise Architecture: Issues and Challenges

    OpenAIRE

    Khan, Khaled; Gangavarapu, Narendra

    2009-01-01

    This article discusses how the characteristics of cloud computing affect the enterprise architecture in four domains: business, data, application and technology. The ownership and control of architectural components are shifted from organisational perimeters to cloud providers. It argues that although cloud computing promises numerous benefits to enterprises, the shifting control from enterprises to cloud providers on architectural components introduces several architectural challenges. The d...

  18. Maternal and child health in Brazil: progress and challenges.

    Science.gov (United States)

    Victora, Cesar G; Aquino, Estela M L; do Carmo Leal, Maria; Monteiro, Carlos Augusto; Barros, Fernando C; Szwarcwald, Celia L

    2011-05-28

    In the past three decades, Brazil has undergone rapid changes in major social determinants of health and in the organisation of health services. In this report, we examine how these changes have affected indicators of maternal health, child health, and child nutrition. We use data from vital statistics, population censuses, demographic and health surveys, and published reports. In the past three decades, infant mortality rates have reduced substantially, decreasing by 5·5% a year in the 1980s and 1990s, and by 4·4% a year since 2000 to reach 20 deaths per 1000 livebirths in 2008. Neonatal deaths account for 68% of infant deaths. Stunting prevalence among children younger than 5 years decreased from 37% in 1974-75 to 7% in 2006-07. Regional differences in stunting and child mortality also decreased. Access to most maternal-health and child-health interventions increased sharply to almost universal coverage, and regional and socioeconomic inequalities in access to such interventions were notably reduced. The median duration of breastfeeding increased from 2·5 months in the 1970s to 14 months by 2006-07. Official statistics show stable maternal mortality ratios during the past 10 years, but modelled data indicate a yearly decrease of 4%, a trend which might not have been noticeable in official reports because of improvements in death registration and the increased number of investigations into deaths of women of reproductive age. The reasons behind Brazil's progress include: socioeconomic and demographic changes (economic growth, reduction in income disparities between the poorest and wealthiest populations, urbanisation, improved education of women, and decreased fertility rates), interventions outside the health sector (a conditional cash transfer programme and improvements in water and sanitation), vertical health programmes in the 1980s (promotion of breastfeeding, oral rehydration, and immunisations), creation of a tax-funded national health service in 1988

  19. Nuclear challenges and progress in designing stellarator power plants

    International Nuclear Information System (INIS)

    El-Guebaly, L.

    2007-01-01

    As an alternative to the mainline magnetic fusion tokamaks, the stellarator concept offers steady-state operation without externally driven current, eliminating the risk of plasma disruptions. Over the past 2-3 decades, stellarator power plants have been studied in the U.S., Japan, and Europe to enhance the physics and engineering aspects and to optimize the design parameters, which are subject to numerous constraints. The earlier 1980s studies delivered large stellarators with an average major radius exceeding 20 m. The most recent development of the compact stellarator concept has led to the construction of the National Compact Stellarator Experiment (NCSX) in the U.S. and the three-year power plant study of ARIES-CS, a compact stellarator with a 7.75 m average major radius, approaching that of tokamaks. The ARIES-CS first wall configuration deviates from the standard practice of a uniform toroidal shape in order to achieve compactness. Modeling such a complex geometry for 3-D nuclear analysis was a challenging engineering task. A novel approach based on coupling the CAD model with the MCNP Monte Carlo code was developed to model, for the first time ever, the complex stellarator geometry for nuclear assessments. The most important parameter that determines the stellarator size and cost is the minimum distance between the plasma boundary and the mid-coil. Accommodating the breeding blanket and the shield necessary to protect the superconducting magnet represented another challenging task. An innovative approach utilizing a non-uniform blanket combined with a highly efficient WC shield in this highly constrained area reduced the radial standoff (and hence machine size and cost) by a significant 25-30%. As stellarators generate more radwaste than tokamaks, managing ARIES-CS active materials during operation and after plant decommissioning was essential for the environmental attractiveness of the machine. The geological disposal option could be replaced with more attractive scenarios
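
    To make the radial-standoff argument concrete, here is a toy budget in the spirit of the non-uniform blanket approach described above; every thickness below is an invented placeholder, not an ARIES-CS design value.

```python
# Illustrative radial-build budget for the plasma-to-coil standoff.
# Component thicknesses are invented placeholders, not ARIES-CS values.

standard = {            # uniform blanket everywhere (m)
    "first wall": 0.05,
    "breeding blanket": 0.50,
    "shield": 0.40,
    "vacuum vessel": 0.15,
    "gaps + coil case": 0.20,
}
constrained = dict(standard)
# At the most constrained location, replace part of the blanket with
# a thinner, highly efficient WC-based shield (the idea sketched above).
constrained["breeding blanket"] = 0.20
constrained["shield"] = 0.35

d_std = sum(standard.values())
d_min = sum(constrained.values())
print(f"uniform build:     {d_std:.2f} m")
print(f"constrained build: {d_min:.2f} m ({1 - d_min / d_std:.0%} smaller)")
```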

  20. Mobile Computing: The Emerging Technology, Sensing, Challenges and Applications

    International Nuclear Information System (INIS)

    Bezboruah, T.

    2010-12-01

    Mobile computing is a computing paradigm in which a computer and all necessary accessories, such as files and software, are taken out into the field. It is a system of computing that allows a computing device to be used even while the user is mobile and therefore changing location. Portability is one of the important aspects of mobile computing. Mobile phones are being used to gather scientific data from remote and isolated places that could not be retrieved by other means. Scientists are beginning to use mobile devices and web-based applications to systematically explore interesting scientific aspects of their surroundings, ranging from climate change and environmental pollution to earthquake monitoring. This mobile revolution enables new ideas and innovations to spread out more quickly and efficiently. Here we discuss in brief the mobile computing technology, its sensing, challenges and applications. (author)

  1. Research program in computational physics: [Progress report for Task D]

    International Nuclear Information System (INIS)

    Guralnik, G.S.

    1987-01-01

    Studies are reported of several aspects of the purely gluonic sector of QCD, including methods for efficiently generating gauge configurations, properties of the standard Wilson action and improved actions, and properties of the pure glue theory itself. Simulation of quantum chromodynamics in the ''quenched approximation'', in which the back reaction of quarks upon gauge fields is neglected, is studied with fermions introduced on the lattice via both Wilson and staggered formulations. Efforts are also reported to compute QCD matrix elements and to simulate QCD theory beyond the quenched approximation considering the effect of the quarks on the gauge fields. Work is in progress toward improving the algorithms used to generate the gauge field configurations and to compute the quark propagators. Implementation of lattice QCD on a hypercube is also reported
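
    For readers unfamiliar with the Monte Carlo machinery the report refers to, the following sketch shows a Metropolis update on a deliberately simplified stand-in, a one-dimensional scalar lattice field rather than SU(3) gauge links, purely to illustrate how configurations are generated; it is not the report's actual algorithm.

```python
# Toy Metropolis Monte Carlo for a 1D lattice scalar field.
# A deliberately simplified stand-in for the gauge-configuration
# generation discussed in the report; real lattice QCD updates
# SU(3) link variables with a plaquette action instead.
import math
import random

N = 32          # lattice sites (periodic boundary)
MASS2 = 0.5     # bare mass squared in the toy action
STEP = 0.5      # proposal width
random.seed(1)

def action(phi):
    """Euclidean action: kinetic hopping term plus mass term."""
    s = 0.0
    for i in range(N):
        dphi = phi[(i + 1) % N] - phi[i]
        s += 0.5 * dphi * dphi + 0.5 * MASS2 * phi[i] * phi[i]
    return s

def metropolis_sweep(phi):
    """One sweep: propose a local change at each site, accept/reject."""
    accepted = 0
    for i in range(N):
        old = phi[i]
        s_old = action(phi)
        phi[i] = old + random.uniform(-STEP, STEP)
        ds = action(phi) - s_old
        if ds < 0 or random.random() < math.exp(-ds):
            accepted += 1      # keep the proposal
        else:
            phi[i] = old       # reject: restore old value
    return accepted / N

phi = [0.0] * N
for sweep in range(200):       # thermalize, then measure
    rate = metropolis_sweep(phi)
print("acceptance rate ~", round(rate, 2))
print("<phi^2> ~", round(sum(x * x for x in phi) / N, 3))
```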

  2. The computational challenges of Earth-system science.

    Science.gov (United States)

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.

  3. Science Education Reform in Qatar: Progress and Challenges

    Science.gov (United States)

    Said, Ziad

    2016-01-01

    Science education reform in Qatar has had limited success. In the Trends in International Mathematics and Science Study (TIMSS), Qatari 4th and 8th grade students have shown progress in science achievement, but they remain significantly below the international average. Also, in the Program for International Student Assessment (PISA), Qatari…

  4. Engineering of obligate intracellular bacteria: progress, challenges and paradigms

    Science.gov (United States)

    Over twenty years have passed since the first report of genetic manipulation of an obligate intracellular bacterium. Through progress interspersed by bouts of stagnation, microbiologists and geneticists have developed approaches to genetically manipulate obligates. A brief overview of the current ge...

  5. Empirical findings on progress and challenges in a novice students ...

    African Journals Online (AJOL)

    Journal of Computer Science and Its Application.

  6. High-End Computing Challenges in Aerospace Design and Engineering

    Science.gov (United States)

    Bailey, F. Ronald

    2004-01-01

    High-End Computing (HEC) has had significant impact on aerospace design and engineering and is poised to make even more in the future. In this paper we describe four aerospace design and engineering challenges: Digital Flight, Launch Simulation, Rocket Fuel System and Digital Astronaut. The paper discusses the modeling capabilities needed for each challenge and presents projections of future near- and far-term HEC computing requirements. NASA's HEC Project Columbia is described, and programming strategies necessary to achieve high real performance are presented.

  7. Real-time fMRI neurofeedback: Progress and challenges

    Science.gov (United States)

    Sulzer, J.; Haller, S.; Scharnowski, F.; Weiskopf, N.; Birbaumer, N.; Blefari, M.L.; Bruehl, A.B.; Cohen, L.G.; deCharms, R.C.; Gassert, R.; Goebel, R.; Herwig, U.; LaConte, S.; Linden, D.; Luft, A.; Seifritz, E.; Sitaram, R.

    2016-01-01

    In February of 2012, the first international conference on real time functional magnetic resonance imaging (rtfMRI) neurofeedback was held at the Swiss Federal Institute of Technology Zurich (ETHZ), Switzerland. This review summarizes progress in the field, introduces current debates, elucidates open questions, and offers viewpoints derived from the conference. The review offers perspectives on study design, scientific and clinical applications, rtfMRI learning mechanisms and future outlook. PMID:23541800

  8. Language development: Progress and challenges in a multilingual ...

    African Journals Online (AJOL)

    Some such challenges discussed include issues like language selection for development, absence of clear language policy and the important issue of attitudes of respective language communities towards language research programmes. The article also looks at how the project and the institute have managed to make ...

  9. Progress report of a research program in computational physics

    International Nuclear Information System (INIS)

    Guralnik, G.S.

    1990-01-01

    Task D's research is focused on understanding elementary particle physics through the techniques of quantum field theory, making intensive use of computers to aid the research. During the last year we have made significant progress in understanding the weak interactions through the use of Monte Carlo methods as applied to the equations of quenched lattice QCD. We have launched a program to understand full (not quenched) lattice QCD on relatively large lattices using massively parallel computers. Aware that Monte Carlo methods might not be able to give a good solution to field theories with the computer power likely to be available to us for the foreseeable future, we have launched an entirely different numerical approach to study these problems. This ''Source Galerkin'' method is based on an algebraic approach to the field theoretic equations of motion and is (somewhat) related to variational and finite element techniques applied to a source rather than a coordinate space. The results for relatively simple problems are sensationally good. In particular, fermions can be treated in a way which allows them to retain their status as independent dynamical entities in the theory. 8 refs

  10. New Challenges for Computing in High Energy Physics

    International Nuclear Information System (INIS)

    Santoro, Alberto

    2003-01-01

    In view of the new scientific programs established for the LHC (Large Hadron Collider) era, the way to face the technological challenges in computing was to develop a new concept: GRID computing. We show some examples and, in particular, a proposal for high energy physicists in countries like Brazil. Due to the large amount of data and the need for close collaboration, it will be impossible to work in research centers and universities very far from Fermilab or CERN unless a GRID architecture is built. An important effort is being made by the international community to update their computing infrastructures and networks.

  11. QM/MM free energy simulations: recent progress and challenges

    Science.gov (United States)

    Lu, Xiya; Fang, Dong; Ito, Shingo; Okamoto, Yuko; Ovchinnikov, Victor

    2016-01-01

    Due to the higher computational cost relative to pure molecular mechanical (MM) simulations, hybrid quantum mechanical/molecular mechanical (QM/MM) free energy simulations particularly require careful consideration of the balance between computational cost and accuracy. Here we review several recent developments in free energy methods most relevant to QM/MM simulations and discuss several topics motivated by these developments, using simple but informative examples that involve processes in water. For chemical reactions, we highlight the value of invoking enhanced sampling techniques (e.g., replica exchange) in umbrella sampling calculations and the value of including collective environmental variables (e.g., hydration level) in metadynamics simulations; we also illustrate the sensitivity of string calculations, especially free energy along the path, to various parameters in the computation. Alchemical free energy simulations with a specific thermodynamic cycle are used to probe the effect of including the first solvation shell in the QM region when computing solvation free energies. For cases where high-level QM/MM potential functions are needed, we analyze two different approaches: the QM/MM-MFEP method of Yang and co-workers and perturbative correction to low-level QM/MM free energy results. For the examples analyzed here, both approaches seem productive, although care needs to be exercised when analyzing the perturbative corrections. PMID:27563170
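
    As a concrete illustration of the umbrella-sampling machinery reviewed above, the following self-contained sketch biases a one-dimensional double-well "reaction coordinate" with harmonic windows. The potential and all parameters are invented for illustration; a real QM/MM application would obtain energies from a quantum/classical Hamiltonian and combine the overlapping windows with WHAM or MBAR to recover the free energy profile.

```python
# Minimal umbrella-sampling sketch on a 1D double-well potential.
# The potential and all parameters are illustrative; in QM/MM work
# the energy would come from a quantum/classical Hamiltonian.
import math
import random

random.seed(0)
BETA = 4.0                       # inverse temperature
K_UMB = 20.0                     # harmonic bias strength

def potential(x):
    return (x * x - 1.0) ** 2    # double well with minima at x = +/-1

def sample_window(center, n_steps=20000, step=0.2):
    """Metropolis sampling of the biased potential U(x) + k/2 (x-c)^2."""
    def biased(x):
        return potential(x) + 0.5 * K_UMB * (x - center) ** 2
    x, samples = center, []
    for _ in range(n_steps):
        xn = x + random.uniform(-step, step)
        if random.random() < math.exp(-BETA * (biased(xn) - biased(x))):
            x = xn               # accept the move
        samples.append(x)
    return samples

# Windows spanning the transition region; the overlapping histograms
# would normally be combined with WHAM/MBAR to obtain the free energy.
for center in [-1.0, -0.5, 0.0, 0.5, 1.0]:
    s = sample_window(center)
    mean = sum(s) / len(s)
    print(f"window at {center:+.1f}: mean x = {mean:+.3f}")
```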

  12. A Blood Test for Alzheimer's Disease: Progress, Challenges, and Recommendations.

    Science.gov (United States)

    Kiddle, Steven J; Voyle, Nicola; Dobson, Richard J B

    2018-03-29

    Ever since the discovery of APOEɛ4 around 25 years ago, researchers have been excited about the potential of a blood test for Alzheimer's disease (AD). Since then researchers have looked for genetic, protein, metabolite, and/or gene expression markers of AD and related phenotypes. However, no blood test for AD is yet being used in the clinical setting. We first review the trends and challenges in AD blood biomarker research, before giving our personal recommendations to help researchers overcome these challenges. While some degree of consistency and replication has been seen across independent studies, several high-profile studies have seemingly failed to replicate. Partly due to academic incentives, there is a reluctance in the field to report predictive ability, to publish negative findings, and to independently replicate the work of others. If this can be addressed, then we will know sooner whether a blood test for AD or related phenotypes with clinical utility can be developed.

  13. The challenge of the future. Technical progress and ecological perspectives

    International Nuclear Information System (INIS)

    Jischa, M.F.

    1993-01-01

    The book introduces readers to the interrelated global problems of population dynamics, energy supply, imminent climate catastrophe, environmental pollution, finite resources, and the conflict between the North and South. It encourages probing more deeply into the technical challenges of the future. The author demonstrates why economic and technical issues will soon be outstripped by questions of the environmental, human and social compatibility of new technologies. (orig./UA) [de]

  14. Occupational Exposure to HDI: Progress and Challenges in Biomarker Analysis

    OpenAIRE

    Flack, Sheila L.; Ball, Louise M.; Nylander-French, Leena A.

    2010-01-01

    1,6-hexamethylene diisocyanate (HDI) is extensively used in the automotive repair industry and is a commonly reported cause of occupational asthma in industrialized populations. However, the exact pathological mechanism remains uncertain. Characterization and quantification of biomarkers resulting from HDI exposure can fill important knowledge gaps between exposure, susceptibility, and the rise of immunological reactions and sensitization leading to asthma. Here, we discuss existing challenge...

  15. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  16. Challenges and opportunities of cloud computing for atmospheric sciences

    Science.gov (United States)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from access to a large cyberinfrastructure to fund or perform a research project. Cloud computing can avoid maintenance expenses for large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections, and the two are closely related: usually, uncertainty can be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources to climate modeling, using the cloud infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and of research.
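
    A back-of-envelope version of the cost and time-to-solution comparison described above might look as follows; every price, core count, and workload figure below is a made-up placeholder, not a value measured in the study.

```python
# Back-of-envelope cost/time-to-solution comparison for a fixed
# simulation workload. Every number below is a hypothetical
# placeholder, not a measured value from the study.

def time_to_solution(core_hours_needed, cores_available):
    """Wall-clock hours assuming perfect scaling (optimistic)."""
    return core_hours_needed / cores_available

def cloud_cost(core_hours_needed, price_per_core_hour):
    return core_hours_needed * price_per_core_hour

WORKLOAD = 50_000  # core-hours for one climate experiment (assumed)

# Scenario A: queue-limited HPC allocation of 512 cores.
# Scenario B: cloud burst to 4096 cores at an assumed on-demand price.
print(f"HPC   wall-clock: {time_to_solution(WORKLOAD, 512):.0f} h")
print(f"Cloud wall-clock: {time_to_solution(WORKLOAD, 4096):.1f} h")
print(f"Cloud cost:       ${cloud_cost(WORKLOAD, 0.05):,.0f}")
```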

  17. The Artificial Leaf: Recent Progress and Remaining Challenges

    Directory of Open Access Journals (Sweden)

    Mark D Symes

    2016-12-01

    The prospect of a device that uses solar energy to split water into H2 and O2 is highly attractive in terms of producing hydrogen as a carbon-neutral fuel. In this mini review, key research milestones that have been reached in this field over the last two decades will be discussed, with special focus on devices that use earth-abundant materials. Finally, the remaining challenges in the development of such “artificial leaves” will be highlighted.

  18. Philanthropy and disparities: progress, challenges, and unfinished business.

    Science.gov (United States)

    Mitchell, Faith; Sessions, Kathryn

    2011-10-01

    Philanthropy has invested millions of dollars to reduce disparities in health care and improve minority health. Grants to strengthen providers' cultural competence, diversify health professions, and collect data have improved understanding of and spurred action on disparities. The persistence of disparities in spite of these advances has shifted philanthropic attention toward strategies to change social, economic, and environmental conditions. We argue that these evolving perspectives, along with earlier groundwork, present new opportunities for funders, especially in combination with progress toward universal health coverage. This article looks at how philanthropy has addressed health disparities over the past decade, with a focus on accomplishments, the work remaining to be done, and how funders can help advance the disparities agenda.

  19. Scenario-Based Digital Forensics Challenges in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Erik Miranda Lopez

    2016-10-01

    The aim of digital forensics is to extract information to answer the 5Ws (Why, When, Where, What, and Who) from the data extracted from the evidence. In order to achieve this, most digital forensic processes assume absolute control of digital evidence. However, in a cloud environment forensic investigation, this is not always possible. Additionally, the unique characteristics of cloud computing create new technical, legal and architectural challenges when conducting a forensic investigation. We propose a hypothetical scenario to uncover and explain the challenges forensic practitioners face during cloud investigations, and we also provide solutions to address them. Our hypothetical case scenario has shown that, in the long run, better live forensic tools, new methods tailored for cloud investigations, and new procedures and standards are indeed needed. Furthermore, we have come to the conclusion that the biggest challenge in forensic investigations is not technical but legal.

  20. EEG Derived Neuronal Dynamics during Meditation: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Chamandeep Kaur

    2015-01-01

    Meditation promotes positivity, but how these behavioral and psychological changes are brought about can be explained by understanding the neurophysiological effects of meditation. In this paper, a broad spectrum of neural mechanics under a variety of meditation styles is reviewed. The overall aim of this study is to review existing scientific studies and future challenges concerning meditation effects, based on changing EEG brainwave patterns. Although existing research supports the efficacy of meditation in relieving anxiety and depression and producing psychological well-being, more rigorous studies are required, with better designs, consideration of client variables such as personality characteristics to avoid negative effects, randomized controlled trials, and large sample sizes. A greater number of clinical trials that concentrate on the use of meditation are required. The controversial subject of epileptiform EEG changes and other adverse effects during meditation is also raised.

  1. Governance of water resources in Colombia: Between progress and challenges

    International Nuclear Information System (INIS)

    Zamudio Rodriguez, Carmen

    2012-01-01

    This work provides an overview of water management in Colombia, emphasizing governance as a key element in this type of process. From the collection and analysis of secondary data, it identifies the evolution of water management in the country and, in doing so, aspects that reveal a crisis of governance in this area. Initially, some relevant issues are raised in order to analyze integrated water resource management and water governance. The work then addresses factors showing that, despite significant progress in water management in the country, a comprehensive approach that considers multiple criteria to provide governance of water resources has yet to emerge. We propose that there is a crisis of water governance, expressed in a lack of experience and international context, lack of coordination and dispersion of water policy, ignorance of the various forms of local government, a mistaken perception of the country's water abundance and richness, and dissimulation or disinterest that ignores the many pressures threatening water.

  2. Precision Medicine and PET/Computed Tomography: Challenges and Implementation.

    Science.gov (United States)

    Subramaniam, Rathan M

    2017-01-01

    Precision medicine is about selecting the right therapy for the right patient, at the right time, specific to the molecular targets expressed by disease or tumors, in the context of the patient's environment and lifestyle. Some of the challenges for the delivery of precision medicine in oncology include biomarkers for patient selection and enrichment (precision diagnostics), mapping out the tumor heterogeneity that contributes to therapy failures, and early therapy assessment to identify resistance to therapies. PET/computed tomography offers solutions in these important areas of challenge and facilitates the implementation of precision medicine. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. U.S. Department of Energy Workplace Charging Challenge - Progress Update 2016: A New Sustainable Commute

    Energy Technology Data Exchange (ETDEWEB)

    2017-01-01

    In June 2016, the Workplace Charging Challenge distributed its third annual survey to 295 partners with the goal of tracking partners' progress and identifying trends in workplace charging. This document summarizes findings from the survey and highlights accomplishments of the EV Everywhere Workplace Charging Challenge.

  4. Orphan drugs in development for Huntington's disease: challenges and progress

    Directory of Open Access Journals (Sweden)

    Burgunder JM

    2015-02-01

    Jean-Marc Burgunder, Swiss Huntington’s Disease Centre, Department of Neurology, University of Bern, Bern, Switzerland; Department of Neurology, West China Hospital, Sichuan University, Chengdu; Department of Neurology, Xiangya Hospital, Central South University, Changsha; Department of Neurology, Sun Yat-sen University, Guangzhou, People’s Republic of China. Abstract: Huntington’s disease is a monogenic disorder encompassing a variable phenotype with progressive cognitive, psychiatric, and movement disorders. Knowledge of the mechanisms involved in this disorder has advanced substantially since the discovery of the gene mutation. The dynamic mutation is the expansion of a CAG (cytosine-adenine-guanine) repeat in the huntingtin (HTT) gene, which is transcribed into an abnormal protein with an elongated polyglutamine tract. Polyglutamine HTT accumulates, and its function is changed in multifaceted ways related to the numerous roles of the normal protein. The protein is expressed in numerous areas of the brain and also in other organs. The major brain region involved in the disease process is the striatum, but it is clear that other systems are involved as well. This accumulated knowledge has now led to the development of treatment strategies based on specific molecular pathways for symptomatic and disease course-modifying treatment. The most proximal way to handle the disturbed protein is to hinder gene transcription and translation and/or to increase protein clearance. Other mechanisms now being approached include modulation of energy and intracellular signaling, induction of factors potentially leading to neuroprotection, as well as modulation of glial function. Several clinical trials based on these approaches are now under way, and it is becoming clear that a future disease-modifying therapy will be a combination of several approaches harmonized with symptomatic treatments. In this review, some of the most promising and

  5. Progress and challenges for abiotic stress proteomics of crop plants.

    Science.gov (United States)

    Barkla, Bronwyn J; Vera-Estrella, Rosario; Pantoja, Omar

    2013-06-01

    Plants are continually challenged to recognize and respond to adverse changes in their environment to avoid detrimental effects on growth and development. Understanding the mechanisms that crop plants employ to resist and tolerate abiotic stress is of considerable interest for designing agricultural breeding strategies to ensure sustainable productivity. The application of proteomics technologies to advance our knowledge of crop plant abiotic stress tolerance has increased dramatically in the past few years, as evidenced by the large number of publications in this area. This is attributed to advances in various technology platforms associated with MS-based techniques as well as the accessibility of proteomics units to a wider plant research community. This review summarizes the work that has been reported for major crop plants and evaluates the findings in the context of the approaches that are widely employed, with the aim of encouraging a broadening of the strategies used to increase coverage of the proteome. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. CAR-T therapy for leukemia: progress and challenges.

    Science.gov (United States)

    Wang, Xin; Xiao, Qing; Wang, Zhe; Feng, Wen-Li

    2017-04-01

    Despite the rapid development of therapeutic strategies, leukemia remains a difficult-to-treat hematopoietic malignancy, necessitating the introduction of more effective treatment options to improve patients' life expectancy and quality of life. Genetic engineering of adoptively transferred T cells to express antigen-specific chimeric antigen receptors (CARs) has proved highly powerful and efficacious in inducing sustained responses in patients with refractory malignancies, as exemplified by the success of CD19-targeting CAR-T treatment in patients with relapsed acute lymphoblastic leukemia. Recent strategies, including manipulating intracellular activating domains and transducing viral vectors, have resulted in better-designed and optimized CAR-T cells. This is further facilitated by the rapid identification of a growing number of potential leukemic antigens that may serve as therapeutic targets for CAR-T cells. This review will provide a comprehensive background and scrutinize recent important breakthrough studies on anti-leukemia CAR-T cells, with a focus on recently identified antigens for CAR-T therapy design and approaches to overcome critical challenges. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Uterine sarcomas-Recent progress and future challenges

    International Nuclear Information System (INIS)

    Seddon, Beatrice M.; Davda, Reena

    2011-01-01

    Uterine sarcomas are a group of rare tumours that provide considerable challenges in their treatment. Radiological diagnosis prior to hysterectomy is difficult, with the diagnosis frequently made post-operatively. Current staging systems have been unsatisfactory, although a new FIGO staging system specifically for uterine sarcomas has now been introduced and may allow better grouping of patients according to expected prognosis. While the mainstay of treatment of early disease is a total abdominal hysterectomy, it is less clear whether routine oophorectomy or lymphadenectomy is necessary. Adjuvant pelvic radiotherapy may improve local tumour control in high-risk patients, but is not associated with an overall survival benefit. Similarly, there is no good evidence for the routine use of adjuvant chemotherapy. For advanced leiomyosarcoma, newer chemotherapy agents, including gemcitabine and docetaxel, and trabectedin offer some promise, while hormonal therapies appear to be more useful in endometrial stromal sarcoma. Novel targeted agents are now being introduced for sarcomas, including uterine sarcomas, and show some indications of activity. Non-pharmacological treatments, including surgical metastasectomy, radiofrequency ablation, and CyberKnife radiotherapy, are important additions to systemic therapy for advanced metastatic disease.

  8. Reviews on Security Issues and Challenges in Cloud Computing

    Science.gov (United States)

    An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.

    2016-11-01

    Cloud computing is an Internet-based computing service, provided by a third party, that allows the sharing of resources and data among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes the way the Information Technology (IT) of an organization is organized and managed. It provides many benefits, such as simplicity and lower costs, almost unlimited storage, minimal maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment, and a lower barrier to entry. With the increasing use of cloud computing services in this new era, however, their security issues become a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first lays out the architecture of cloud computing, then discusses the most common security issues of using the cloud and some solutions to them, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.

  9. Progress and challenges in coupled hydrodynamic-ecological estuarine modeling

    Science.gov (United States)

    Ganju, Neil K.; Brush, Mark J.; Rashleigh, Brenda; Aretxabaleta, Alfredo L.; del Barrio, Pilar; Grear, Jason S.; Harris, Lora A.; Lake, Samuel J.; McCardell, Grant; O'Donnell, James; Ralston, David K.; Signell, Richard P.; Testa, Jeremy; Vaudrey, Jamie M. P.

    2016-01-01

    Numerical modeling has emerged over the last several decades as a widely accepted tool for investigations in environmental sciences. In estuarine research, hydrodynamic and ecological models have moved along parallel tracks with regard to complexity, refinement, computational power, and incorporation of uncertainty. Coupled hydrodynamic-ecological models have been used to assess ecosystem processes and interactions, simulate future scenarios, and evaluate remedial actions in response to eutrophication, habitat loss, and freshwater diversion. The need to couple hydrodynamic and ecological models to address research and management questions is clear because dynamic feedbacks between biotic and physical processes are critical interactions within ecosystems. In this review, we present historical and modern perspectives on estuarine hydrodynamic and ecological modeling, consider model limitations, and address aspects of model linkage, skill assessment, and complexity. We discuss the balance between spatial and temporal resolution and present examples using different spatiotemporal scales. Finally, we recommend future lines of inquiry, approaches to balance complexity and uncertainty, and model transparency and utility. It is idealistic to think we can pursue a “theory of everything” for estuarine models, but recent advances suggest that models for both scientific investigations and management applications will continue to improve in terms of realism, precision, and accuracy.
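
    As a toy illustration of the biotic-physical feedback these coupled models capture, the sketch below steps a single well-mixed estuarine box in which tidal exchange (physics) and nutrient-phytoplankton dynamics (ecology) update together; all rate constants are invented for illustration and bear no relation to any calibrated model.

```python
# Toy coupled physical-biological box model of an estuary.
# One well-mixed box exchanges nutrient with the ocean (physics)
# while phytoplankton take nutrient up and die (ecology).
# All rate constants are invented for illustration only.

DT = 0.05           # time step (days)
EXCHANGE = 0.2      # tidal exchange rate with the ocean (1/day)
N_OCEAN = 5.0       # ocean-side nutrient concentration (mmol/m^3)
UPTAKE = 0.8        # maximum phytoplankton uptake rate (1/day)
MORTALITY = 0.1     # phytoplankton loss rate (1/day)
K_N = 1.0           # half-saturation for nutrient limitation

nutrient, phyto = 2.0, 0.1
for step in range(int(100 / DT)):   # run 100 days
    # Physics: relax the box nutrient toward the ocean boundary value.
    mix = EXCHANGE * (N_OCEAN - nutrient)
    # Ecology: Michaelis-Menten uptake couples growth to nutrient.
    n_eff = max(nutrient, 0.0)      # clamp explicit-Euler artifacts
    growth = UPTAKE * n_eff / (K_N + n_eff) * phyto
    loss = MORTALITY * phyto
    nutrient += DT * (mix - growth + 0.5 * loss)  # partial recycling
    phyto += DT * (growth - loss)

print(f"day 100: nutrient={nutrient:.2f} mmol/m^3, phyto={phyto:.2f}")
```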

  10. Occupational exposure to HDI: progress and challenges in biomarker analysis.

    Science.gov (United States)

    Flack, Sheila L; Ball, Louise M; Nylander-French, Leena A

    2010-10-01

    1,6-Hexamethylene diisocyanate (HDI) is extensively used in the automotive repair industry and is a commonly reported cause of occupational asthma in industrialized populations. However, the exact pathological mechanism remains uncertain. Characterization and quantification of biomarkers resulting from HDI exposure can fill important knowledge gaps between exposure, susceptibility, and the rise of immunological reactions and sensitization leading to asthma. Here, we discuss existing challenges in HDI biomarker analysis including the quantification of N-acetyl-1,6-hexamethylene diamine (monoacetyl-HDA) and N,N'-diacetyl-1,6-hexamethylene diamine (diacetyl-HDA) in urine samples based on previously established methods for HDA analysis. In addition, we describe the optimization of reaction conditions for the synthesis of monoacetyl-HDA and diacetyl-HDA, and utilize these standards for the quantification of these metabolites in the urine of three occupationally exposed workers. Diacetyl-HDA was present in untreated urine at 0.015-0.060 μg/l. Using base hydrolysis, the concentration range of monoacetyl-HDA in urine was 0.19-2.2 μg/l, 60-fold higher than in the untreated samples on average. HDA was detected only in one sample after base hydrolysis (0.026 μg/l). In contrast, acid hydrolysis yielded HDA concentrations ranging from 0.36 to 10.1 μg/l in these three samples. These findings demonstrate HDI metabolism via N-acetylation metabolic pathway and protein adduct formation resulting from occupational exposure to HDI. Copyright © 2010 Elsevier B.V. All rights reserved.

  11. Improving outcome after traumatic brain injury--progress and challenges.

    Science.gov (United States)

    Gentleman, D

    1999-01-01

    This article describes the rapid advances in the head injury field which have taken place within the professional lifetime of many doctors in practice today. These have led to a better understanding of what happens in the injured brain and how these events might be manipulated to achieve better outcomes. Clinical tools we now take for granted, like the CT scanner and the Glasgow Coma Scale, were new developments 25 years ago. They provided a foundation on which clinicians and basic scientists could build what we now know: what to assess in the patient, how to respond to certain findings, what imaging to do, how to plan treatment rationally, how to minimise brain damage at different stages after injury, how to predict and measure outcome, what disabled survivors need, and how to organise the service to do the greatest good for the most people. Some of these topics raise as many questions as answers. The head injury field may be broad but it has essential unity. At one extreme, some patients have a life-threatening illness where the acts and omissions of the clinical team can powerfully influence not only survival but its quality. Later the drama of the acute phase gives way to the 'hidden disabilities' of the long-term deficits which so many survivors have. At the other end of the severity spectrum is the relatively vast number of people who suffer an apparently mild head injury, a few of whom deteriorate and need urgent treatment, and many of whom have unspectacular but, nevertheless, disabling problems. The article attempts to address this broad canvas. Clinicians, neuroscientists, policy makers, and service users must work together to address the major scientific, individual, and population challenges posed by head injury. Much has already been achieved, but much remains to be done, especially in translating 'what we know' into 'what we do'.

  12. Challenges to Software/Computing for Experimentation at the LHC

    Science.gov (United States)

    Banerjee, Sunanda

    The demands that future high energy physics experiments place on software and computing have led the experiments to plan the related activities as full-fledged projects and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently brought together in an LHC-wide framework based on Grid technology. The current status and understanding are broadly outlined.

  13. Towards Cloud Computing SLA Risk Management: Issues and Challenges

    OpenAIRE

    Morin, Jean-Henry; Aubert, Jocelyn; Gateau, Benjamin

    2012-01-01

    Cloud Computing has become mainstream technology offering a commoditized approach to software, platform and infrastructure as a service over the Internet on a global scale. This raises important new security issues beyond traditional perimeter based approaches. This paper attempts to identify these issues and their corresponding challenges, proposing to use risk and Service Level Agreement (SLA) management as the basis for a service level framework to improve governance, risk and compliance i...

  14. Computational challenges in atomic, molecular and optical physics.

    Science.gov (United States)

    Taylor, Kenneth T

    2002-06-15

    Six challenges are discussed. These are the laser-driven helium atom; the laser-driven hydrogen molecule and hydrogen molecular ion; electron scattering (with ionization) from one-electron atoms; the vibrational and rotational structure of molecules such as H₃⁺ and water at their dissociation limits; laser-heated clusters; and quantum degeneracy and Bose-Einstein condensation. The first four concern fundamental few-body systems where use of high-performance computing (HPC) is currently making possible accurate modelling from first principles. This leads to reliable predictions and support for laboratory experiment as well as true understanding of the dynamics. Important aspects of these challenges addressable only via a terascale facility are set out. Such a facility makes the last two challenges in the above list meaningfully accessible for the first time, and the scientific interest together with the prospective role for HPC in these is emphasized.

  15. The ATLAS computing challenge for HL-LHC

    CERN Document Server

    Campana, Simone; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment successfully commissioned a software and computing infrastructure to support the physics program during LHC Run 2. The next phases of the accelerator upgrade will present new challenges in the offline area. In particular, at the High Luminosity LHC (also known as Run 4) the data taking conditions will be very demanding in terms of computing resources: between 5 and 10 kHz of event rate from the HLT to be reconstructed (and possibly further reprocessed) with an average pile-up of up to 200 events per collision, and an equivalent number of simulated samples to be produced. The same parameters for the current run are lower by up to an order of magnitude. While processing and storage resources would need to scale accordingly, the funding situation allows one at best to consider a flat budget over the next few years for offline computing needs. In this paper we present a study quantifying the challenge in terms of computing resources for HL-LHC and present ideas about the possible evolution of the ...

  16. Undergraduate students’ challenges with computational modelling in physics

    Directory of Open Access Journals (Sweden)

    Simen A. Sørby

    2012-12-01

    Full Text Available In recent years, computational perspectives have become essential parts of several of the University of Oslo's natural science studies. In this paper we discuss some main findings from a qualitative study of the computational perspectives' impact on the students' work with their first course in physics (mechanics) and their learning and meaning making of its contents. Discussions of the students' learning of physics are based on sociocultural theory, which originates in Vygotsky and Bakhtin, and subsequent physics education research. Results imply that the greatest challenge for students when working with computational assignments is to combine knowledge from previously known, but separate contexts. Integrating knowledge of informatics, numerical and analytical mathematics, and conceptual understanding of physics appears as a clear challenge for the students. We also observe a lack of awareness concerning the limitations of physical modelling. The students need help with identifying the appropriate knowledge system or "tool set" for the different tasks at hand; they need help to create a plan for their modelling and to become aware of its limits. In light of this, we propose that an instructive and dialogic text as the basis for the exercises, in which the emphasis is on specification, clarification and elaboration, would be of potentially great aid for students who are new to computational modelling.

  17. Static Load Balancing Algorithms in Cloud Computing: Challenges & Solutions

    Directory of Open Access Journals (Sweden)

    Nadeem Shah

    2015-08-01

    Full Text Available Abstract Cloud computing provides on-demand hosted computing resources and services over the Internet on a pay-per-use basis. It is currently becoming the favored method of communication and computation over scalable networks due to numerous attractive attributes such as high availability, scalability, fault tolerance, simplicity of management, and low cost of ownership. Due to the huge demand for cloud computing, efficient load balancing becomes critical to ensure that computational tasks are evenly distributed across servers to prevent bottlenecks. The aim of this review paper is to understand the current challenges in cloud computing, primarily in cloud load balancing using static algorithms, and to find gaps to bridge for more efficient static cloud load balancing in the future. We believe the ideas suggested as new solutions will allow researchers to redesign better algorithms for better functionalities and improved user experiences in simple cloud systems. This could assist small businesses that cannot afford infrastructure that supports complex and dynamic load balancing algorithms.
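
    To make "static" concrete, the sketch below contrasts two classic static policies, round-robin and weighted round-robin, which assign tasks from fixed, pre-set rules rather than runtime feedback. The server names and weights are illustrative assumptions, not taken from the paper.

        from itertools import cycle

        servers = ["s1", "s2", "s3"]
        weights = {"s1": 3, "s2": 2, "s3": 1}   # static capacities, fixed in advance

        def round_robin(tasks):
            """Assign tasks to servers in a fixed rotation, ignoring capacity."""
            rr = cycle(servers)
            return {t: next(rr) for t in tasks}

        def weighted_round_robin(tasks):
            """Rotate through a list in which each server appears `weight` times."""
            expanded = [s for s in servers for _ in range(weights[s])]
            rr = cycle(expanded)
            return {t: next(rr) for t in tasks}

        print(round_robin(range(6)))           # even spread: s1, s2, s3, s1, s2, s3
        print(weighted_round_robin(range(6)))  # capacity-aware: s1, s1, s1, s2, s2, s3

    Because neither policy consults current server load, a burst of heavy tasks can still overload one server, which is exactly the weakness that motivates the gap analysis above.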

  18. Achievements and challenges in structural bioinformatics and computational biophysics.

    Science.gov (United States)

    Samish, Ilan; Bourne, Philip E; Najmanovich, Rafael J

    2015-01-01

    The field of structural bioinformatics and computational biophysics has undergone a revolution in the last 10 years; these developments are captured annually through the 3DSIG meeting, upon which this article reflects. An increase in the accessible data, computational resources and methodology has resulted in an increase in the size and resolution of studied systems and the complexity of the questions amenable to research. Concomitantly, the parameterization and efficiency of the methods have markedly improved, along with their cross-validation against other computational and experimental results. The field exhibits an ever-increasing integration with biochemistry, biophysics and other disciplines. In this article, we discuss recent achievements along with current challenges within the field. © The Author 2014. Published by Oxford University Press.

  19. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    Science.gov (United States)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks at micrometer to tens-of-nanometers resolution and is quickly turning into a new technology for studying the petrophysical properties of materials. An important step is the upscaling of these properties, since micron or sub-micron resolution imaging is only feasible on samples of millimeter scale or smaller. We present here a recently developed computational workflow for the analysis of microstructures, including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at micro to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples, biological and food science materials. We have also applied the technique to high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big-data problem of analyzing petrophysical properties and their subsequent upscaling. We discuss the following challenges: 1) Characterization of microtomography for extremely large data sets - our current capability. 2) Computational fluid dynamics simulations at pore-scale for permeability estimation - methods, computing cost and accuracy. 3) Solid mechanical computations at pore-scale for estimating elasto-plastic properties - computational stability, cost, and efficiency. 4) Extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big-data tool for multi-scale scientific and engineering problems.
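
    The renormalization step can be pictured with a minimal sketch: a binary pore/solid voxel image is repeatedly coarse-grained block by block while a scalar property (here simply porosity) is tracked across scales. The 2x2x2 majority rule below is an illustrative stand-in; a percolation-based procedure like the one the authors describe would use connectivity-preserving rules instead.

        import numpy as np

        def coarse_grain(grid):
            """Collapse each 2x2x2 block to one voxel: pore if >= 4 of its 8 voxels are pore."""
            n = grid.shape[0] // 2
            blocks = grid.reshape(n, 2, n, 2, n, 2).sum(axis=(1, 3, 5))
            return (blocks >= 4).astype(np.uint8)

        rng = np.random.default_rng(0)
        grid = (rng.random((64, 64, 64)) < 0.3).astype(np.uint8)  # synthetic 30%-porosity sample

        while grid.shape[0] > 1:
            print(f"scale {grid.shape[0]:>3}: porosity = {grid.mean():.3f}")
            grid = coarse_grain(grid)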

  20. Symbol manipulation by computer applied to plasma physics. Technical progress report 2

    International Nuclear Information System (INIS)

    Rosen, B.

    1977-09-01

    Progress has been made in automating the analytical calculation of parametric processes by computer. The computations are performed automatically to lowest order quickly and efficiently. Work has started on a method for solving the nonlinear differential equations describing interacting modes.

  1. Multiscale methods in turbulent combustion: strategies and computational challenges

    International Nuclear Information System (INIS)

    Echekki, Tarek

    2009-01-01

    A principal challenge in modeling turbulent combustion flows is associated with their complex, multiscale nature. Traditional paradigms in the modeling of these flows have attempted to address this nature through different strategies, including exploiting the separation of turbulence and combustion scales and a reduced description of the composition space. The resulting moment-based methods often yield reasonable predictions of flow and reactive scalars' statistics under certain conditions. However, these methods must constantly evolve to address combustion at different regimes, modes or with dominant chemistries. In recent years, alternative multiscale strategies have emerged, which although in part inspired by the traditional approaches, also draw upon basic tools from computational science, applied mathematics and the increasing availability of powerful computational resources. This review presents a general overview of different strategies adopted for multiscale solutions of turbulent combustion flows. Within these strategies, some specific models are discussed or outlined to illustrate their capabilities and underlying assumptions. These strategies may be classified under four different classes, including (i) closure models for atomistic processes, (ii) multigrid and multiresolution strategies, (iii) flame-embedding strategies and (iv) hybrid large-eddy simulation-low-dimensional strategies. A combination of these strategies and models can potentially represent a robust alternative strategy to moment-based models; but a significant challenge remains in the development of computational frameworks for these approaches as well as their underlying theories. (topical review)

  2. Novel spintronics devices for memory and logic: prospects and challenges for room temperature all spin computing

    Science.gov (United States)

    Wang, Jian-Ping

    An energy efficient memory and logic device for the post-CMOS era has been the goal of a variety of research fields. The limits of scaling, which we expect to reach by the year 2025, demand that future advances in computational power will not be realized from ever-shrinking device sizes, but rather by innovative designs and new materials and physics. Magnetoresistive devices have been a promising candidate for future integrated magnetic computation because of their unique non-volatility and functionality. The application of perpendicular magnetic anisotropy for potential STT-RAM applications was demonstrated and has since been intensively investigated by both academic and industry groups, but there is no clear pathway for how scaling will eventually work for both memory and logic applications. One of the main reasons is that no material stack candidate has been demonstrated that could lead to a scaling scheme down to sub-10 nm. Another challenge for the use of magnetoresistive devices in logic applications is the available switching speed and writing energy. Although good progress has been made in demonstrating fast switching of a thermally stable magnetic tunnel junction (MTJ) down to 165 ps, this is still several times slower than its CMOS counterpart. In this talk, I will review recent progress by my research group and my C-SPIN colleagues, then discuss the opportunities, challenges and some potential pathways for magnetoresistive devices for memory and logic applications and their integration for a room temperature all-spin computing system.

  3. US DOE Grand Challenge in Computational Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, R.; Habib, S.; Qiang, J.; Ko, K.; Li, Z.; McCandless, B.; Mi, W.; Ng, C.; Saparov, M.; Srinivas, V.; Sun, Y.; Zhan, X.; Decyk, V.; Golub, G.

    1998-01-01

    Particle accelerators are playing an increasingly important role in basic and applied science, and are enabling new accelerator-driven technologies. But the design of next-generation accelerators, such as linear colliders and high intensity linacs, will require a major advance in numerical modeling capability due to extremely stringent beam control and beam loss requirements, and the presence of highly complex three-dimensional accelerator components. To address this situation, the U.S. Department of Energy has approved a 'Grand Challenge' in Computational Accelerator Physics, whose primary goal is to develop a parallel modeling capability that will enable high performance, large scale simulations for the design, optimization, and numerical validation of next-generation accelerators. In this paper we report on the status of the Grand Challenge.

  4. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  5. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.
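
    The sliding-failure probability discussed for the 2011 workshop is, at heart, a limit-state integral that Monte Carlo sampling can estimate directly. The sketch below illustrates the idea for a gravity dam; the limit-state form, parameter values and distributions are invented for illustration and are not taken from the workshop.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 200_000

        # Random resistances and loads (illustrative distributions)
        friction = np.tan(np.radians(rng.normal(35.0, 3.0, n)))        # tan(friction angle)
        cohesion = rng.lognormal(mean=np.log(0.4), sigma=0.3, size=n)  # MPa
        uplift = rng.normal(3.0e5, 4.5e4, n)                           # kN
        push = rng.normal(4.0e5, 6.0e4, n)                             # kN, hydrostatic thrust

        area = 500.0      # m^2 of sliding plane (deterministic, illustrative)
        weight = 7.0e5    # kN, dam self-weight

        resisting = cohesion * 1000.0 * area + (weight - uplift) * friction  # kN
        fs = resisting / push                                          # factor of safety
        print("P(sliding failure) ~", (fs < 1.0).mean())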

  6. E-Government in the Asia-Pacific Region: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Clay Wescott

    2005-12-01

    Full Text Available This paper will focus on two issues: (i) recent e-government progress and challenges, and (ii) the practices regional organizations follow to cope with the challenges while maximizing the benefits. Beginning with an overview of efforts to improve governance in the region, it then analyzes recent progress in the use of information and communication technology (ICT) in the Asia-Pacific region to promote more efficient, cost-effective, and participatory government, facilitate more convenient government services, allow greater public access to information, and make government more accountable to citizens. Successful adoption of e-government presents major challenges. The paper concludes by examining the practices regional organizations follow to cope with the challenges while maximizing the benefits.

  7. CMS results in the Combined Computing Readiness Challenge CCRC'08

    International Nuclear Information System (INIS)

    Bonacorsi, D.; Bauerdick, L.

    2009-01-01

    During February and May 2008, CMS participated in the Combined Computing Readiness Challenge (CCRC'08) together with all other LHC experiments. The purpose of this worldwide exercise was to check the readiness of the computing infrastructure for LHC data taking. Another set of major CMS tests, the Computing, Software and Analysis challenge (CSA'08), as well as CMS cosmic runs, was also running at the same time: CCRC augmented the load on computing with additional tests to validate and stress-test all CMS computing workflows at full data-taking scale, also extending this to the global WLCG community. CMS exercised most aspects of the CMS computing model, with very comprehensive tests. During May 2008, CMS moved more than 3.6 Petabytes among more than 300 links in the complex Grid topology. CMS demonstrated that it is able to safely move data out of CERN to the Tier-1 sites, sustaining more than 600 MB/s as a daily average for more than seven days in a row, with enough headroom and with hourly peaks of up to 1.7 GB/s. CMS ran hundreds of simultaneous jobs at each Tier-1 site, re-reconstructing and skimming hundreds of millions of events. After re-reconstruction the fresh AOD (Analysis Object Data) has to be synchronized between Tier-1 centers: CMS demonstrated that the required inter-Tier-1 transfers are achievable within a few days. CMS also showed that skimmed analysis data sets can be transferred to Tier-2 sites for analysis at sufficient rate, regionally as well as inter-regionally, achieving all goals in about 90% of >200 links. Simultaneously, CMS also ran a large Tier-2 analysis exercise, where realistic analysis jobs were submitted to a large set of Tier-2 sites by a large number of people to produce a chaotic workload across the systems, with more than 400 analysis users in May. Taken all together, CMS routinely achieved submissions of 100k jobs/day, with peaks up to 200k jobs/day. The achieved results in CCRC'08 - focussing on the distributed

  8. The Abbott Districts in 2005-06: Progress and Challenges, Spring 2006

    Science.gov (United States)

    Hirsch, Lesley

    2006-01-01

    New Jersey's urban--or "Abbott"--schools have improved at the preschool and elementary school level, but lag when it comes to middle and high school performance. These are the key findings of an Abbott Indicators Project report entitled, "The Abbott Districts in 2005-06: Progress and Challenges." The report was prepared by…

  9. Nuclear Data Covariances in the Indian Context – Progress, Challenges, Excitement and Perspectives

    International Nuclear Information System (INIS)

    Ganesan, S.

    2015-01-01

    We present a brief overview of progress, challenges, excitement and perspectives in developing nuclear data covariances in the Indian context in relation to target accuracies and sensitivity studies that are of great importance to Bhabha's 3-stage nuclear programme for energy and non-energy applications.

  10. Conference Scene: From innovative polymers to advanced nanomedicine: Key challenges, recent progress and future perspectives

    NARCIS (Netherlands)

    Feijen, Jan; Hennink, W.E.; Zhong, Zhiyuan

    2013-01-01

    Recent developments in polymer-based controlled delivery systems have made a significant clinical impact. The second Symposium on Innovative Polymers for Controlled Delivery (SIPCD) was held in Suzhou, China to address the key challenges and provide up-to-date progress and future perspectives in the

  11. Examination of concept of next generation computer. Progress report 1999

    Energy Technology Data Exchange (ETDEWEB)

    Higuchi, Kenji; Hasegawa, Yukihiro; Hirayama, Toshio

    2000-12-01

    The Center for Promotion of Computational Science and Engineering has conducted R and D work on parallel processing technology and in 1999 began examining the concept of the next generation computer. This report describes behavior analyses of quantum calculation codes, together with the considerations arising from those analyses and the results of examining methods to reduce cache misses. Furthermore, it describes a performance simulator that is being developed to quantitatively examine the concept of the next generation computer. (author)
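
    The cache-miss effect such methods target is easy to demonstrate. In the hedged sketch below, summing a large matrix along its memory layout (row-major for numpy) runs markedly faster than striding across it, because contiguous access reuses cache lines; the matrix size is arbitrary.

        import time
        import numpy as np

        a = np.random.rand(4096, 4096)

        t0 = time.perf_counter()
        s_rows = sum(a[i, :].sum() for i in range(a.shape[0]))   # contiguous row access
        t1 = time.perf_counter()
        s_cols = sum(a[:, j].sum() for j in range(a.shape[1]))   # strided column access
        t2 = time.perf_counter()

        print(f"row-major pass:      {t1 - t0:.3f} s")
        print(f"column-strided pass: {t2 - t1:.3f} s  (slower: more cache misses)")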

  12. Progress and challenges in the development and qualification of multi-level multi-physics coupled methodologies for reactor analysis

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.

    2007-01-01

    Current trends in nuclear power generation and regulation, as well as the design of next generation reactor concepts, along with continuing progress in computer technology, stimulate the development, qualification and application of multi-physics multi-scale coupled code systems. The efforts have been focused on extending the analysis capabilities by coupling models which simulate different phenomena or system components, as well as on refining the scale and level of detail of the coupling. This paper reviews the progress made in this area and outlines the remaining challenges. The discussion is illustrated with examples based on neutronics/thermohydraulics coupling in reactor core modeling. In both fields, recent advances and developments are towards more physics-based high-fidelity simulations, which require the implementation of improved and flexible coupling methodologies. First, progress in coupling different physics codes is discussed along with advances in multi-level techniques for coupled code simulations. Second, the issues related to the consistent qualification of coupled multi-physics and multi-scale code systems for design and safety evaluation are presented. The increased importance of uncertainty and sensitivity analysis is discussed along with approaches to propagate the uncertainty quantification between the codes. The upcoming OECD LWR Uncertainty Analysis in Modeling (UAM) benchmark is the first international activity to address this issue, and it is described in the paper. Finally, the remaining challenges with multi-physics coupling are outlined. (authors)

  13. Progress and challenges in the development and qualification of multi-level multi-physics coupled methodologies for reactor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, K.; Avramova, M. [Pennsylvania State Univ., University Park, PA (United States)

    2007-07-01

    Current trends in nuclear power generation and regulation, as well as the design of next generation reactor concepts, along with continuing progress in computer technology, stimulate the development, qualification and application of multi-physics multi-scale coupled code systems. The efforts have been focused on extending the analysis capabilities by coupling models which simulate different phenomena or system components, as well as on refining the scale and level of detail of the coupling. This paper reviews the progress made in this area and outlines the remaining challenges. The discussion is illustrated with examples based on neutronics/thermohydraulics coupling in reactor core modeling. In both fields, recent advances and developments are towards more physics-based high-fidelity simulations, which require the implementation of improved and flexible coupling methodologies. First, progress in coupling different physics codes is discussed along with advances in multi-level techniques for coupled code simulations. Second, the issues related to the consistent qualification of coupled multi-physics and multi-scale code systems for design and safety evaluation are presented. The increased importance of uncertainty and sensitivity analysis is discussed along with approaches to propagate the uncertainty quantification between the codes. The upcoming OECD LWR Uncertainty Analysis in Modeling (UAM) benchmark is the first international activity to address this issue, and it is described in the paper. Finally, the remaining challenges with multi-physics coupling are outlined. (authors)
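
    A zero-dimensional caricature of the coupling loop may help fix ideas: a neutronics step updates power, a thermal-hydraulics step updates fuel temperature, and Doppler-type feedback links the two. The fixed-point (Picard) iteration below is one of the simplest coupling strategies such code systems employ; the feedback model and every constant are invented for illustration.

        # Hedged 0-D sketch of Picard iteration between two "physics" solvers.
        P0, T0 = 3000.0, 560.0   # MW, K: nominal power and fuel temperature
        alpha = 2.0e-3           # 1/K, illustrative feedback strength
        R = 0.05                 # K/MW, illustrative fuel thermal resistance
        T_cool = 420.0           # K, coolant temperature

        P, T = P0, T0
        for it in range(100):
            P_new = P0 / (1.0 + alpha * (T - T0))   # "neutronics" step with feedback
            T_new = T_cool + R * P_new              # "thermal-hydraulics" step
            if abs(P_new - P) < 1e-6 and abs(T_new - T) < 1e-6:
                break
            P, T = P_new, T_new
        print(f"converged after {it} iterations: P = {P:.1f} MW, T_fuel = {T:.1f} K")

    The iteration converges here because the toy feedback loop is contractive; production code systems face exactly the stability and convergence questions this sketch hides, which is part of the qualification challenge the paper describes.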

  14. Research progress on quantum informatics and quantum computation

    Science.gov (United States)

    Zhao, Yusheng

    2018-03-01

    Quantum informatics is an emerging interdisciplinary subject that developed from the combination of quantum mechanics, information science, and computer science in the 1980s. The birth and development of quantum information science has far-reaching significance for science and technology. At present, applying quantum information technology is the direction of much research effort. The preparation, storage, purification, regulation, and transmission of quantum states, together with quantum coding and decoding, have become focal points for scientists and engineers, with a profound impact on the national economy, people's livelihoods, and defense technology. This paper first summarizes the background of quantum information science and quantum computers and the current state of research at home and abroad, then introduces the basic knowledge and concepts of quantum computing. Finally, several quantum algorithms are introduced in detail, including the quantum Fourier transform, the Deutsch-Jozsa algorithm, Shor's algorithm, and quantum phase estimation.
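
    Of the algorithms just listed, the Deutsch-Jozsa algorithm is the easiest to see end to end, and for small systems it can be simulated classically with plain linear algebra. The sketch below is an illustrative numpy simulation, not code from the paper: given an oracle f promised to be constant or balanced, one circuit evaluation decides which, where a classical algorithm may need up to 2^(n-1)+1 queries.

        import numpy as np
        from functools import reduce

        H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard gate

        def kron_all(mats):
            return reduce(np.kron, mats)

        def deutsch_jozsa(f, n):
            """Simulate the circuit on n input qubits plus one ancilla."""
            dim = 2 ** (n + 1)
            # Oracle U_f |x>|y> = |x>|y XOR f(x)> as a permutation matrix.
            U = np.zeros((dim, dim))
            for x in range(2 ** n):
                for y in (0, 1):
                    U[(x << 1) | (y ^ f(x)), (x << 1) | y] = 1.0
            state = np.zeros(dim)
            state[1] = 1.0                                   # |0...0>|1>
            state = kron_all([H] * (n + 1)) @ state          # Hadamard everywhere
            state = U @ state                                # one oracle call
            state = kron_all([H] * n + [np.eye(2)]) @ state  # Hadamard on inputs
            # The all-zero input register has probability 1 iff f is constant.
            p_zero = abs(state[0]) ** 2 + abs(state[1]) ** 2
            return "constant" if np.isclose(p_zero, 1.0) else "balanced"

        print(deutsch_jozsa(lambda x: 0, 3))      # -> constant
        print(deutsch_jozsa(lambda x: x & 1, 3))  # -> balanced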

  15. Military Readiness: Progress and Challenges in Implementing the Navy’s Optimized Fleet Response Plan

    Science.gov (United States)

    2016-05-02

    command and control under the OFRP contributes to wide swings in port workload, which in turn can have a negative effect on the private-sector industrial...for 53 percent of all private-sector aircraft carrier maintenance contracts and 70 percent of cruiser and destroyer contracts from fiscal years...their impact on the Navy; (2) the Navy's goals and progress in implementing the OFRP; and (3) challenges faced by public and private shipyards

  16. Parameterized algorithmics for computational social choice : nine research challenges

    NARCIS (Netherlands)

    Bredereck, R.; Chen, J.; Faliszewski, P.; Guo, J.; Niedermeier, R.; Woeginger, G.J.

    2014-01-01

    Computational Social Choice is an interdisciplinary research area involving Economics, Political Science, and Social Science on the one side, and Mathematics and Computer Science (including Artificial Intelligence and Multiagent Systems) on the other side. Typical computational problems studied in

  17. Progress and challenges of disaster health management in China: a scoping review.

    Science.gov (United States)

    Zhong, Shuang; Clark, Michele; Hou, Xiang-Yu; Zang, Yuli; FitzGerald, Gerard

    2014-01-01

    Despite the importance of an effective health system response to various disasters, relevant research is still in its infancy, especially in middle- and low-income countries. This paper provides an overview of the status of disaster health management in China, with its aim to promote the effectiveness of the health response for reducing disaster-related mortality and morbidity. A scoping review method was used to address the recent progress of and challenges to disaster health management in China. Major health electronic databases were searched to identify English and Chinese literature that were relevant to the research aims. The review found that since 2003 considerable progress has been achieved in the health disaster response system in China. However, there remain challenges that hinder effective health disaster responses, including low standards of disaster-resistant infrastructure safety, the lack of specific disaster plans, poor emergency coordination between hospitals, lack of portable diagnostic equipment and underdeveloped triage skills, surge capacity, and psychological interventions. Additional challenges include the fragmentation of the emergency health service system, a lack of specific legislation for emergencies, disparities in the distribution of funding, and inadequate cost-effective considerations for disaster rescue. One solution identified to address these challenges appears to be through corresponding policy strategies at multiple levels (e.g. community, hospital, and healthcare system level).

  18. Progress and challenges associated with halal authentication of consumer packaged goods.

    Science.gov (United States)

    Premanandh, Jagadeesan; Bin Salem, Samara

    2017-11-01

    Abusive business practices are increasingly evident in consumer packaged goods. Although consumers have the right to protect themselves against such practices, rapid urbanization and industrialization result in greater distances between producers and consumers, raising serious concerns on the supply chain. The operational complexities surrounding halal authentication pose serious challenges on the integrity of consumer packaged goods. This article attempts to address the progress and challenges associated with halal authentication. Advancement and concerns on the application of new, rapid analytical methods for halal authentication are discussed. The significance of zero tolerance policy in consumer packaged foods and its impact on analytical testing are presented. The role of halal assurance systems and their challenges are also considered. In conclusion, consensus on the establishment of one standard approach coupled with a sound traceability system and constant monitoring would certainly improve and ensure halalness of consumer packaged goods. © 2017 Society of Chemical Industry.

  19. Protein Biomarkers for Early Detection of Pancreatic Ductal Adenocarcinoma: Progress and Challenges.

    Science.gov (United States)

    Root, Alex; Allen, Peter; Tempst, Paul; Yu, Kenneth

    2018-03-07

    Approximately 75% of patients with pancreatic ductal adenocarcinoma are diagnosed with advanced cancer, which cannot be safely resected. The most commonly used biomarker CA19-9 has inadequate sensitivity and specificity for early detection, which we define as Stage I/II cancers. Therefore, progress in next-generation biomarkers is greatly needed. Recent reports have validated a number of biomarkers, including combination assays of proteins and DNA mutations; however, the history of translating promising biomarkers to clinical utility suggests that several major hurdles require careful consideration by the medical community. The first set of challenges involves nominating and verifying biomarkers. Candidate biomarkers need to discriminate disease from benign controls with high sensitivity and specificity for an intended use, which we describe as a two-tiered strategy of identifying and screening high-risk patients. Community-wide efforts to share samples, data, and analysis methods have been beneficial and progress meeting this challenge has been achieved. The second set of challenges is assay optimization and validating biomarkers. After initial candidate validation, assays need to be refined into accurate, cost-effective, highly reproducible, and multiplexed targeted panels and then validated in large cohorts. To move the most promising candidates forward, ideally, biomarker panels, head-to-head comparisons, meta-analysis, and assessment in independent data sets might mitigate risk of failure. Much more investment is needed to overcome these challenges. The third challenge is achieving clinical translation. To moonshot an early detection test to the clinic requires a large clinical trial and organizational, regulatory, and entrepreneurial know-how. Additional factors, such as imaging technologies, will likely need to improve concomitant with molecular biomarker development. The magnitude of the clinical translational challenge is uncertain, but interdisciplinary

  20. Protein Biomarkers for Early Detection of Pancreatic Ductal Adenocarcinoma: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Alex Root

    2018-03-01

    Full Text Available Approximately 75% of patients with pancreatic ductal adenocarcinoma are diagnosed with advanced cancer, which cannot be safely resected. The most commonly used biomarker CA19-9 has inadequate sensitivity and specificity for early detection, which we define as Stage I/II cancers. Therefore, progress in next-generation biomarkers is greatly needed. Recent reports have validated a number of biomarkers, including combination assays of proteins and DNA mutations; however, the history of translating promising biomarkers to clinical utility suggests that several major hurdles require careful consideration by the medical community. The first set of challenges involves nominating and verifying biomarkers. Candidate biomarkers need to discriminate disease from benign controls with high sensitivity and specificity for an intended use, which we describe as a two-tiered strategy of identifying and screening high-risk patients. Community-wide efforts to share samples, data, and analysis methods have been beneficial and progress meeting this challenge has been achieved. The second set of challenges is assay optimization and validating biomarkers. After initial candidate validation, assays need to be refined into accurate, cost-effective, highly reproducible, and multiplexed targeted panels and then validated in large cohorts. To move the most promising candidates forward, ideally, biomarker panels, head-to-head comparisons, meta-analysis, and assessment in independent data sets might mitigate risk of failure. Much more investment is needed to overcome these challenges. The third challenge is achieving clinical translation. To moonshot an early detection test to the clinic requires a large clinical trial and organizational, regulatory, and entrepreneurial know-how. Additional factors, such as imaging technologies, will likely need to improve concomitant with molecular biomarker development. The magnitude of the clinical translational challenge is uncertain, but

  1. MIT Laboratory for Computer Science Progress Report 27

    Science.gov (United States)

    1990-06-01

    TR-388: Hirsch, D.E., An Expert System for Diagnosing Gait in Cerebral Palsy Patients, S.M. Thesis, EE & CS Department, May 1987 ... games - space wars and computer chess. These early developments laid the foundation for the Laboratory's work in the 1970s on knowledge-based systems ... domains and has seriously impaired the ability of developers to maintain them [264]. In response to these inadequacies, researchers turned in the early

  2. ceRNAs in plants: computational approaches and associated challenges for target mimic research.

    Science.gov (United States)

    Paschoal, Alexandre Rossi; Lozada-Chávez, Irma; Domingues, Douglas Silva; Stadler, Peter F

    2017-05-30

    The competing endogenous RNA hypothesis has gained increasing attention as a potential global regulatory mechanism of microRNAs (miRNAs), and as a powerful tool to predict the function of many noncoding RNAs, including miRNAs themselves. Most studies have focused on animals, although target mimic (TM) discovery, as well as important computational and experimental advances, has developed in plants over the past decade. Our contribution therefore summarizes recent progress in computational approaches for research on miRNA:TM interactions. We divided this article into three main contributions. First, a general overview of research on TMs in plants is presented, with practical descriptions of the available literature, tools, data, databases and computational reports. Second, we describe a common protocol for the computational and experimental analyses of TMs. Third, we provide a bioinformatics approach for the prediction of TM motifs potentially cross-targeting members within the same or different miRNA families, based on the identification of consensus miRNA-binding sites from known TMs across sequenced genomes, transcriptomes and known miRNAs. This computational approach is promising because, in contrast to animals, miRNA families in plants are large, with identical or similar members, several of which are also highly conserved. Of the three consensus TM motifs found with our approach, MIM166, MIM171 and MIM159/319, the last has found strong support in the recent experimental work by Reichel and Millar [Specificity of plant microRNA TMs: cross-targeting of mir159 and mir319. J Plant Physiol 2015;180:45-8]. Finally, we discuss the major computational and associated experimental challenges to be faced in future ceRNA studies. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
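
    A consensus-site scan of the kind mentioned above can be sketched very simply: plant TMs pair with their miRNA except for a small bulge opposite the expected cleavage site (miRNA positions 10-11), so one can slide a bulged reverse complement of the miRNA along a transcript and count mismatches. The code below is a naive illustration; the miRNA sequence, bulge placement and scoring thresholds are assumptions, not the authors' pipeline.

        COMP = {"A": "U", "U": "A", "G": "C", "C": "G"}

        def revcomp_rna(seq):
            return "".join(COMP[b] for b in reversed(seq))

        def find_tm_sites(transcript, mirna, bulge_len=3, max_mismatch=3):
            """Slide the bulged reverse complement of the miRNA along a transcript."""
            site = revcomp_rna(mirna)
            cut = len(site) - 10            # position facing miRNA bases 10/11
            win = len(site) + bulge_len
            hits = []
            for i in range(len(transcript) - win + 1):
                cand = transcript[i:i + win]
                paired = cand[:cut] + cand[cut + bulge_len:]   # skip the bulge bases
                mism = sum(a != b for a, b in zip(paired, site))
                if mism <= max_mismatch:
                    hits.append((i, mism))
            return hits

        mirna = "UUGGACUGAAGGGAGCUCCCU"      # miR159-like sequence, illustrative only
        site = revcomp_rna(mirna)
        demo = "AAA" + site[:len(site) - 10] + "UUU" + site[len(site) - 10:] + "GGG"
        print(find_tm_sites(demo, mirna))    # expect a perfect bulged site at offset 3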

  3. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  4. Computer Vision Malaria Diagnostic Systems—Progress and Prospects

    Directory of Open Access Journals (Sweden)

    Joseph Joel Pollak

    2017-08-01

    Full Text Available Accurate malaria diagnosis is critical to prevent malaria fatalities, curb overuse of antimalarial drugs, and promote appropriate management of other causes of fever. While several diagnostic tests exist, the need for a rapid and highly accurate malaria assay remains. Microscopy and rapid diagnostic tests are the main diagnostic modalities available, yet they can demonstrate poor performance and accuracy. Automated microscopy platforms have the potential to significantly improve and standardize malaria diagnosis. Based on image recognition and machine learning algorithms, these systems maintain the benefits of light microscopy and provide improvements such as quicker scanning time, greater scanning area, and increased consistency brought by automation. While these applications have been in development for over a decade, recently several commercial platforms have emerged. In this review, we discuss the most advanced computer vision malaria diagnostic technologies and investigate several of their features which are central to field use. Additionally, we discuss the technological and policy barriers to implementing these technologies in low-resource settings world-wide.
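
    To give a flavor of the image-recognition stage these platforms automate, the hedged sketch below flags dark, stained objects in a grayscale smear image by thresholding and connected-component counting. Real systems use trained machine-learning classifiers on curated data; every threshold and the synthetic image here are illustrative only.

        import numpy as np
        from scipy import ndimage

        def count_candidates(image, threshold=0.05, min_pixels=20):
            """Count connected dark regions large enough to be parasite candidates."""
            mask = image < threshold                # stained objects appear darker
            labels, n = ndimage.label(mask)         # connected components
            sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
            return int(np.sum(np.asarray(sizes) >= min_pixels))

        rng = np.random.default_rng(1)
        smear = rng.random((128, 128))              # stand-in for a microscopy field
        smear[40:46, 60:66] = 0.01                  # one synthetic dark blob
        print("candidate objects:", count_candidates(smear))   # expect 1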

  5. Technical progress faced with the challenges of the energy sector in the future

    International Nuclear Information System (INIS)

    Maillard, D.

    1999-01-01

    The colloquium organised by the Association of Energy Economists dealing with the theme 'Technical progress faced with the challenges of the energy sector in the future' takes place against a backdrop of ever-increasing initiatives in this field, for example at the World Energy Council or the International Energy Agency. Faith in technical progress is widespread but should be supported by studies without any preconceived ideas. Research and development efforts must be fully supported, and in a climate of opening markets and liberalization the public authorities have a major role to play. Historically, the markets have always been able to meet new needs thanks to technology, but the ambitious targets that the international community has set itself regarding the emission of greenhouse gases imply technical improvements and major investments. (authors)

  6. A heterogeneous computing environment to solve the 768-bit RSA challenge

    OpenAIRE

    Kleinjung, Thorsten; Bos, Joppe Willem; Lenstra, Arjen K.; Osvik, Dag Arne; Aoki, Kazumaro; Contini, Scott; Franke, Jens; Thomé, Emmanuel; Jermini, Pascal; Thiémard, Michela; Leyland, Paul; Montgomery, Peter L.; Timofeev, Andrey; Stockinger, Heinz

    2010-01-01

    In December 2009 the 768-bit, 232-digit number RSA-768 was factored using the number field sieve. Overall, the computational challenge would take more than 1700 years on a single, standard core. In the article we present the heterogeneous computing approach, involving different compute clusters and Grid computing environments, used to solve this problem.

  7. SIRT6 knockout cells resist apoptosis initiation but not progression: a computational method to evaluate the progression of apoptosis.

    Science.gov (United States)

    Domanskyi, Sergii; Nicholatos, Justin W; Schilling, Joshua E; Privman, Vladimir; Libert, Sergiy

    2017-11-01

    Apoptosis is essential for numerous processes, such as development, resistance to infections, and suppression of tumorigenesis. Here, we investigate the influence of the nutrient sensing and longevity-assuring enzyme SIRT6 on the dynamics of apoptosis triggered by serum starvation. Specifically, we characterize the progression of apoptosis in wild type and SIRT6 deficient mouse embryonic fibroblasts using time-lapse flow cytometry and computational modelling based on rate-equations and cell distribution analysis. We find that SIRT6 deficient cells resist apoptosis by delaying its initiation. Interestingly, once apoptosis is initiated, the rate of its progression is higher in SIRT6 null cells compared to identically cultured wild type cells. However, SIRT6 null cells succumb to apoptosis more slowly, not only in response to nutrient deprivation but also in response to other stresses. Our data suggest that SIRT6 plays a role in several distinct steps of apoptosis. Overall, we demonstrate the utility of our computational model to describe stages of apoptosis progression and the integrity of the cellular membrane. Such measurements will be useful in a broad range of biological applications.
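
    The rate-equation modelling mentioned here can be pictured with a minimal three-compartment sketch: live cells pass through an "initiated" state into a committed apoptotic state, and the two rate constants separate initiation delay from progression speed, the distinction drawn for the SIRT6-deficient cells. The structure and constants below are illustrative assumptions, not the fitted model from the paper.

        import numpy as np
        from scipy.integrate import odeint

        def apoptosis_ode(y, t, k_init, k_prog):
            live, initiated, apoptotic = y
            return [-k_init * live,
                    k_init * live - k_prog * initiated,
                    k_prog * initiated]

        t = np.linspace(0.0, 48.0, 200)   # hours after serum starvation
        y0 = [1.0, 0.0, 0.0]
        wt = odeint(apoptosis_ode, y0, t, args=(0.10, 0.20))
        ko = odeint(apoptosis_ode, y0, t, args=(0.05, 0.30))  # slower initiation, faster progression

        i24 = t.searchsorted(24.0)
        print("WT apoptotic fraction at 24 h:", round(float(wt[i24, 2]), 2))
        print("KO apoptotic fraction at 24 h:", round(float(ko[i24, 2]), 2))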

  8. Towards a cyberinfrastructure for the biological sciences: progress, visions and challenges.

    Science.gov (United States)

    Stein, Lincoln D

    2008-09-01

    Biology is an information-driven science. Large-scale data sets from genomics, physiology, population genetics and imaging are driving research at a dizzying rate. Simultaneously, interdisciplinary collaborations among experimental biologists, theorists, statisticians and computer scientists have become the key to making effective use of these data sets. However, too many biologists have trouble accessing and using these electronic data sets and tools effectively. A 'cyberinfrastructure' is a combination of databases, network protocols and computational services that brings people, information and computational tools together to perform science in this information-driven world. This article reviews the components of a biological cyberinfrastructure, discusses current and pending implementations, and notes the many challenges that lie ahead.

  9. The challenge of networked enterprises for cloud computing interoperability

    OpenAIRE

    Mezgár, István; Rauschecker, Ursula

    2014-01-01

    Manufacturing enterprises have to organize themselves into effective system architectures forming different types of Networked Enterprises (NE) to match fast-changing market demands. Cloud Computing (CC) is an important up-to-date computing concept for NE, as it offers significant financial and technical advantages besides high-level collaboration possibilities. As cloud computing is a new concept the solutions for handling interoperability, portability, security, privacy and standardization c...

  10. Exploring the marketing challenges faced by assembled computer dealers

    OpenAIRE

    Kallimani, Rashmi

    2010-01-01

    There has been great competition in the computer market these days for obtaining higher market share. The computer market, consisting of many branded and non-branded players, has been using various methods for matching supply and demand in the best possible way to attain market dominance. Branded companies are seen to be investing large amounts in aggressive marketing techniques for reaching customers and obtaining higher market share. Due to this many small companies and non-branded computer...

  11. Progress and challenges in utilization of palm oil biomass as fuel for decentralized electricity generation

    Energy Technology Data Exchange (ETDEWEB)

    Bazmi, Aqeel Ahmed [Process Systems Engineering Centre (PROSPECT), Department of Chemical Engineering, Faculty of Chemical and Natural Resources Engineering, University Technology Malaysia, Skudai 81310, Johor Bahru, JB (Malaysia); Biomass Conversion Research Center (BCRC), Department of Chemical Engineering, COMSATS Institute of Information Technology, Lahore (Pakistan); Zahedi, Gholamreza; Hashim, Haslenda [Process Systems Engineering Centre (PROSPECT), Department of Chemical Engineering, Faculty of Chemical and Natural Resources Engineering, University Technology Malaysia, Skudai 81310, Johor Bahru, JB (Malaysia)

    2011-01-15

    It has been broadly accepted worldwide that global warming is, indeed, the greatest threat of our time to the environment. Renewable energy (RE) is expected to be a key solution to reduce global warming and to promote sustainable development. The progressive release of greenhouse gases (GHG) from increasingly energy-intensive industries has ultimately caused human civilization to suffer. Realizing the exigency of reducing emissions while simultaneously catering to the needs of industry, researchers foresee RE as the ideal candidate to overcome these challenges. RE provides an effective option for the provision of energy services from the technical point of view, while biomass, a major source of energy in the world until fossil fuels became dominant with industrialization, appears to be an important renewable source of energy, and research has repeatedly proven its viability for large-scale production. Being a widely distributed source, biomass lends itself to decentralized electricity generation, which is gaining importance in liberalized electricity markets. Decentralized power is characterized by generation of electricity nearer to the demand centers, meeting local energy needs. Researchers envisage an increasing decentralization of power supply, which is expected to make a particular contribution to climate protection. This article investigates the progress and challenges of decentralized electricity generation from palm oil biomass according to the overall concept of sustainable development. (author)

  12. Progress and challenges in utilization of palm oil biomass as fuel for decentralized electricity generation

    International Nuclear Information System (INIS)

    Bazmi, Aqeel Ahmed; Zahedi, Gholamreza; Hashim, Haslenda

    2011-01-01

    It has been broadly accepted worldwide that global warming is, indeed, the greatest threat of our time to the environment. Renewable energy (RE) is expected to be a key solution to reduce global warming and to promote sustainable development. The progressive release of greenhouse gases (GHG) from increasingly energy-intensive industries has ultimately caused human civilization to suffer. Realizing the exigency of reducing emissions while simultaneously catering to the needs of industry, researchers foresee RE as the ideal candidate to overcome these challenges. RE provides an effective option for the provision of energy services from the technical point of view, while biomass, a major source of energy in the world until fossil fuels became dominant with industrialization, appears to be an important renewable source of energy, and research has repeatedly proven its viability for large-scale production. Being a widely distributed source, biomass lends itself to decentralized electricity generation, which is gaining importance in liberalized electricity markets. Decentralized power is characterized by generation of electricity nearer to the demand centers, meeting local energy needs. Researchers envisage an increasing decentralization of power supply, which is expected to make a particular contribution to climate protection. This article investigates the progress and challenges of decentralized electricity generation from palm oil biomass according to the overall concept of sustainable development. (author)

  13. The Challenge '88 Project: Interfacing of Chemical Instruments to Computers.

    Science.gov (United States)

    Lyons, Jim; Verghese, Manoj

    The main part of this project involved using a computer, either an Apple or an IBM, as a chart recorder for the infrared (IR) and nuclear magnetic resonance (NMR) spectrophotometers. The computer "reads" these machines and displays spectra on its monitor. The graphs can then be stored for future reference and manipulation. The program to…

  14. Gender Digital Divide and Challenges in Undergraduate Computer Science Programs

    Science.gov (United States)

    Stoilescu, Dorian; McDougall, Douglas

    2011-01-01

    Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…

  15. A computational method for computing an Alzheimer’s Disease Progression Score; experiments and validation with the ADNI dataset

    Science.gov (United States)

    Jedynak, Bruno M.; Liu, Bo; Lang, Andrew; Gel, Yulia; Prince, Jerry L.

    2014-01-01

    Understanding the time-dependent changes of biomarkers related to Alzheimer’s disease (AD) is a key to assessing disease progression and to measuring the outcomes of disease-modifying therapies. In this paper, we validate an Alzheimer’s disease progression score model which uses multiple biomarkers to quantify the AD progression of subjects following three assumptions: (1) there is a unique disease progression for all subjects, (2) each subject has a different age of onset and rate of progression, and (3) each biomarker is sigmoidal as a function of disease progression. Fitting the parameters of this model is a challenging problem which we approach using an alternating least squares optimization algorithm. In order to validate this optimization scheme under realistic conditions, we use the Alzheimer’s Disease Neuroimaging Initiative (ADNI) cohort. With the help of Monte Carlo simulations, we show that most of the global parameters of the model are tightly estimated, thus enabling an ordering of the biomarkers that fit the model well, ordered as: the Rey auditory verbal learning test with 30 minutes delay, the sum of the two lateral hippocampal volumes divided by the intra-cranial volume, followed by (the clinical dementia rating sum of boxes score and the mini mental state examination score) in no particular order and lastly the Alzheimer’s disease assessment scale-cognitive subscale. PMID:25444605
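
    The fitting problem has a simple alternating structure that a short sketch can convey: holding subject parameters fixed, each biomarker's sigmoid is refit; holding the sigmoids fixed, each subject's (rate, onset) pair is refit. The data layout, initial guesses and parameterization below are assumptions for illustration, not the authors' implementation.

        import numpy as np
        from scipy.optimize import least_squares

        def sigmoid(s, lo, hi, center, width):
            return lo + (hi - lo) / (1.0 + np.exp(-(s - center) / width))

        def fit_progression(data, n_biomarkers, n_iters=10):
            """data[j] = list of (age, biomarker_index, value) tuples for subject j."""
            n_subj = len(data)
            subj = np.tile([1.0, 0.0], (n_subj, 1))                  # (rate a_j, onset b_j)
            bio = np.tile([0.0, 1.0, 70.0, 5.0], (n_biomarkers, 1))  # guesses near the data scale

            for _ in range(n_iters):
                # Step 1: fix subject parameters, refit each biomarker's sigmoid.
                for k in range(n_biomarkers):
                    s = np.array([subj[j, 0] * t + subj[j, 1]
                                  for j, obs in enumerate(data) for (t, kk, v) in obs if kk == k])
                    y = np.array([v for obs in data for (t, kk, v) in obs if kk == k])
                    bio[k] = least_squares(lambda p: sigmoid(s, *p) - y, bio[k]).x
                # Step 2: fix the sigmoids, refit each subject's (rate, onset).
                for j, obs in enumerate(data):
                    def resid(p):
                        return np.array([sigmoid(p[0] * t + p[1], *bio[kk]) - v
                                         for (t, kk, v) in obs])
                    subj[j] = least_squares(resid, subj[j]).x
            return subj, bio

        # Tiny synthetic demo: one biomarker, two subjects, a few visits each.
        demo = [[(70.0, 0, 0.2), (75.0, 0, 0.6)], [(68.0, 0, 0.1), (76.0, 0, 0.7)]]
        subj, bio = fit_progression(demo, n_biomarkers=1)
        print("per-subject (rate, onset):", subj.round(2))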

  16. Progress of the Computer-Aided Engineering of Electric Drive Vehicle Batteries (CAEBAT) (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A. A.; Han, T.; Hartridge, S.; Shaffer, C.; Kim, G. H.; Pannala, S.

    2013-06-01

    This presentation, Progress of the Computer-Aided Engineering of Electric Drive Vehicle Batteries (CAEBAT), concerns simulation and computer-aided engineering (CAE) tools that are widely used to speed up the research and development cycle and reduce the number of build-and-break steps, particularly in the automotive industry. Realizing this, DOE's Vehicle Technologies Program initiated the CAEBAT project in April 2010 to develop a suite of software tools for designing batteries.

  17. Computer Graphics Research Laboratory Quarterly Progress Report Number 49, July-September 1993

    Science.gov (United States)

    1993-11-22

    Contents include: Texture Sampling and Strength Guided Motion (Jeffry S. Nimeroff); Radiosity (Min-Zhi Shao); Blended Shape Primitives (Douglas DeCarlo) ... placement; extensions of radiosity rendering; a discussion of blended shape primitives and their applications in computer vision ... Radiosity: an improved version of the radiosity renderer is included. This version uses a fast over-relaxation progressive refinement algorithm

  18. Multicore Challenges and Benefits for High Performance Scientific Computing

    Directory of Open Access Journals (Sweden)

    Ida M.B. Nielsen

    2008-01-01

    Full Text Available Until recently, performance gains in processors were achieved largely by improvements in clock speeds and instruction level parallelism. Thus, applications could obtain performance increases with relatively minor changes by upgrading to the latest generation of computing hardware. Currently, however, processor performance improvements are realized by using multicore technology and hardware support for multiple threads within each core, and taking full advantage of this technology to improve the performance of applications requires exposure of extreme levels of software parallelism. We will here discuss the architecture of parallel computers constructed from many multicore chips as well as techniques for managing the complexity of programming such computers, including the hybrid message-passing/multi-threading programming model. We will illustrate these ideas with a hybrid distributed memory matrix multiply and a quantum chemistry algorithm for energy computation using Møller–Plesset perturbation theory.
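
    The hybrid model described above can be sketched in a few lines: message passing distributes blocks of the matrix across nodes, while a threaded BLAS supplies the multicore parallelism within each rank. The sketch below uses mpi4py and numpy under assumed sizes; it illustrates the idea and is not the algorithm benchmarked in the article.

        # Run with, e.g.:  mpirun -n 4 python hybrid_matmul.py
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        n = 512                       # global dimension, assumed divisible by size
        rows = n // size

        A_local = np.random.rand(rows, n)   # each rank owns a block of rows of A
        B = np.empty((n, n))
        if rank == 0:
            B[:] = np.random.rand(n, n)
        comm.Bcast(B, root=0)         # message-passing step: replicate B

        C_local = A_local @ B         # multithreaded step: BLAS uses the node's cores

        C = np.empty((n, n)) if rank == 0 else None
        comm.Gather(C_local, C, root=0)
        if rank == 0:
            print("global C computed:", C.shape)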

  19. Lattice QCD - a challenge in large scale computing

    International Nuclear Information System (INIS)

    Schilling, K.

    1987-01-01

    The computation of the hadron spectrum within the framework of lattice QCD sets a demanding goal for the application of supercomputers in basic science. It requires both big computer capacities and clever algorithms to fight all the numerical evils that one encounters in the Euclidean space-time world. The talk attempts to introduce the present state of the art of spectrum calculations by lattice simulations. (orig.)

  20. Annual progress report FY 1977. [Computer calculations of light water reactor dynamics and safety

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, K.F.; Henry, A.F.

    1977-07-01

    Progress is summarized in a project directed toward development of numerical methods suitable for the computer solution of problems in reactor dynamics and safety. Specific areas of research include methods of integration of the time-dependent diffusion equations by finite difference and finite element methods; representation of reactor properties by various homogenization procedures; application of synthesis methods; and development of response matrix techniques.
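
    As a concrete, deliberately simplified illustration of one method class the report names, the following sketch integrates the one-dimensional time-dependent diffusion equation ∂u/∂t = D ∂²u/∂x² with explicit finite differences; it is our toy example, not the project's code.

```python
# Explicit finite-difference integration of 1-D diffusion, with the
# stability condition D*dt/dx**2 <= 0.5 respected by construction.
import numpy as np

D, L, nx, nt = 1.0, 1.0, 101, 500
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D                      # inside the explicit stability limit

u = np.zeros(nx)
u[nx // 2] = 1.0 / dx                     # approximate delta-function source
for _ in range(nt):
    u[1:-1] += D * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    u[0] = u[-1] = 0.0                    # zero boundary values
```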

  1. Fault tolerance in computational grids: perspectives, challenges, and issues.

    Science.gov (United States)

    Haider, Sajjad; Nazir, Babar

    2016-01-01

    Computational grids are established with the intention of providing shared access to hardware- and software-based resources, with special reference to increased computational capabilities. Fault tolerance is one of the most important issues faced by computational grids. The main contribution of this survey is the creation of an extended classification of the problems that occur in computational grid environments. The proposed classification will help researchers, developers, and maintainers of grids understand the types of issues to be anticipated. Moreover, different types of problems, such as omission, interaction, and timing-related problems, have been identified that need to be handled on various layers of the computational grid. In this survey, an analysis and examination is also performed pertaining to fault tolerance and fault detection mechanisms. Our conclusion is that a dependable and reliable grid can only be established when more emphasis is placed on fault identification. Moreover, our survey reveals that adaptive and intelligent fault identification and tolerance techniques can improve the dependability of grid working environments.

  2. Candidiasis: a fungal infection--current challenges and progress in prevention and treatment.

    Science.gov (United States)

    Hani, Umme; Shivakumar, Hosakote G; Vaghela, Rudra; Osmani, Riyaz Ali M; Shrivastava, Atul

    2015-01-01

    Despite therapeutic advances, candidiasis remains a common fungal infection, most frequently caused by C. albicans, and may occur as vulvovaginal candidiasis or thrush, a mucocutaneous candidiasis. Candidiasis frequently occurs in newborns, in immune-deficient people such as AIDS patients, and in people being treated with broad-spectrum antibiotics. It is mainly due to C. albicans, while other species such as C. tropicalis, C. glabrata, C. parapsilosis and C. krusei are increasingly isolated. Over-the-counter antifungal dosage forms such as creams and gels can be used for effective treatment of local candidiasis, whereas antifungal chemotherapy is preferred for preventing spread of the disease to deeper vital organs. The use of probiotics and the development of novel vaccines are advanced approaches for the prevention of candidiasis. The present review summarizes the diagnosis, current status, and challenges in the treatment and prevention of candidiasis, with prime focus on host defense against candidiasis, advancements in diagnosis, the role of probiotics, and recent progress in the development of vaccines against candidiasis.

  3. Progress and Challenges in Developing Aptamer-Functionalized Targeted Drug Delivery Systems

    Directory of Open Access Journals (Sweden)

    Feng Jiang

    2015-10-01

    Full Text Available Aptamers, which can be screened via systematic evolution of ligands by exponential enrichment (SELEX, are superior ligands for molecular recognition due to their high selectivity and affinity. The interest in the use of aptamers as ligands for targeted drug delivery has been increasing due to their unique advantages. Based on their different compositions and preparation methods, aptamer-functionalized targeted drug delivery systems can be divided into two main categories: aptamer-small molecule conjugated systems and aptamer-nanomaterial conjugated systems. In this review, we not only summarize recent progress in aptamer selection and the application of aptamers in these targeted drug delivery systems but also discuss the advantages, challenges and new perspectives associated with these delivery systems.

  4. Computational pan-genomics: status, promises and challenges

    NARCIS (Netherlands)

    The Computational Pan-Genomics Consortium; T. Marschall (Tobias); M. Marz (Manja); T. Abeel (Thomas); L.J. Dijkstra (Louis); B.E. Dutilh (Bas); A. Ghaffaari (Ali); P. Kersey (Paul); W.P. Kloosterman (Wigard); V. Mäkinen (Veli); A.M. Novak (Adam); B. Paten (Benedict); D. Porubsky (David); E. Rivals (Eric); C. Alkan (Can); J.A. Baaijens (Jasmijn); P.I.W. de Bakker (Paul); V. Boeva (Valentina); R.J.P. Bonnal (Raoul); F. Chiaromonte (Francesca); R. Chikhi (Rayan); F.D. Ciccarelli (Francesca); C.P. Cijvat (Robin); E. Datema (Erwin); C.M. van Duijn (Cornelia); E.E. Eichler (Evan); C. Ernst (Corinna); E. Eskin (Eleazar); E. Garrison (Erik); M. El-Kebir (Mohammed); G.W. Klau (Gunnar); J.O. Korbel (Jan); E.-W. Lameijer (Eric-Wubbo); B. Langmead (Benjamin); M. Martin; P. Medvedev (Paul); J.C. Mu (John); P.B.T. Neerincx (Pieter); K. Ouwens (Klaasjan); P. Peterlongo (Pierre); N. Pisanti (Nadia); S. Rahmann (Sven); B.J. Raphael (Benjamin); K. Reinert (Knut); D. de Ridder (Dick); J. de Ridder (Jeroen); M. Schlesner (Matthias); O. Schulz-Trieglaff (Ole); A.D. Sanders (Ashley); S. Sheikhizadeh (Siavash); C. Shneider (Carl); S. Smit (Sandra); D. Valenzuela (Daniel); J. Wang (Jiayin); L.F.A. Wessels (Lodewyk); Y. Zhang (Ying); V. Guryev (Victor); F. Vandin (Fabio); K. Ye (Kai); A. Schönhuth (Alexander)

    2018-01-01

    Many disciplines, from human genetics and oncology to plant breeding, microbiology and virology, commonly face the challenge of analyzing rapidly increasing numbers of genomes. In the case of Homo sapiens, the number of sequenced genomes will approach hundreds of thousands in the next few years.

  5. The Office of Site Closure: Progress in the Face of Challenges

    International Nuclear Information System (INIS)

    Fiore, J. J.; Murphie, W. E.; Meador, S. W.

    2002-01-01

    The Office of Site Closure (OSC) was formed in November 1999 when the Department of Energy's (DOE's) Office of Environmental Management (EM) reorganized to focus specifically on site cleanup and closure. OSC's objective is to achieve safe and cost-effective cleanups and closures that are protective of our workers, the public, and the environment, now and in the future. Since its inception, OSC has focused on implementing a culture of safe closure, with emphasis in three primary areas: complete our responsibility for the Closure Sites: Rocky Flats, Mound, Fernald, Ashtabula, and Weldon Spring; complete our responsibility for cleanup at sites where the DOE mission has been completed (examples include Battelle King Avenue and Battelle West Jefferson in Columbus, and General Atomics) or where other Departmental organizations have an ongoing mission (examples include the Brookhaven, Livermore, or Los Alamos National Laboratories, and the Nevada Test Site); and create a framework and develop specific business closure tools that will help sites close, such as guidance for and decisions on post-contract benefit liabilities, records retention, and Federal employee incentives for site closure. This paper discusses OSC's 2001 progress in achieving site cleanups, moving towards site closure, and developing specific business closure tools to support site closure. It describes the tools used to achieve progress towards cleanup and closure, such as the application of new technologies, changes in contracting approaches, and the development of agreements between sites and with host states. The paper also identifies upcoming challenges and explores options for how Headquarters and the sites can work together to address these challenges. Finally, it articulates OSC's new focus on oversight of Field Offices to ensure they have the systems in place to oversee contractor activities resulting in site cleanups and closures.

  6. The Glen Canyon Dam adaptive management program: progress and immediate challenges

    Science.gov (United States)

    Hamill, John F.; Melis, Theodore S.; Boon, Philip J.; Raven, Paul J.

    2012-01-01

    Adaptive management emerged as an important resource management strategy for major river systems in the United States (US) in the early 1990s. The Glen Canyon Dam Adaptive Management Program (‘the Program’) was formally established in 1997 to fulfill a statutory requirement in the 1992 Grand Canyon Protection Act (GCPA). The GCPA aimed to improve natural resource conditions in the Colorado River corridor in the Glen Canyon National Recreation Area and Grand Canyon National Park, Arizona, that were affected by the Glen Canyon dam. The Program achieves this by using science and a variety of stakeholder perspectives to inform decisions about dam operations. Since the Program started, the ecosystem has become much better understood and several biological and physical improvements have been achieved. These improvements include: (i) an estimated 50% increase in the adult population of endangered humpback chub (Gila cypha) between 2001 and 2008, following previous decline; (ii) a 90% decrease in non-native rainbow trout (Oncorhynchus mykiss), which are known to compete with and prey on native fish, as a result of removal experiments; and (iii) the widespread reappearance of sandbars in response to an experimental high-flow release of dam water in March 2008. Although substantial progress has been made, the Program faces several immediate challenges. These include: (i) defining specific, measurable objectives and desired future conditions for important natural, cultural and recreational attributes to inform science and management decisions; (ii) implementing structural and operational changes to improve collaboration among stakeholders; (iii) establishing a long-term experimental programme and management plan; and (iv) securing long-term funding for monitoring programmes to assess ecosystem and other responses to management actions. Addressing these challenges and building on recent progress will require strong and consistent leadership from the US Department of the Interior.

  7. Use of computers for meeting future challenges in traffic management

    International Nuclear Information System (INIS)

    Ford, C.L. Jr.

    1983-01-01

    Since overall distribution costs, including transportation, are approaching 10% of the material expense dollar, new strategies are emerging in distribution management. Innovative technology utilizing computer hardware and software is an aid to solving inventory control and distribution problems. This paper discusses the information needs of the shipper traffic manager and the role of computers in meeting this need. DOE, in association with Union Carbide's Nuclear Division, has utilized database technology to collect and report transportation statistics for a variety of management information needs.

  8. Remarkable Computing - the Challenge of Designing for the Home

    DEFF Research Database (Denmark)

    Petersen, Marianne Graves

    2004-01-01

    The vision of ubiquitous computing is floating into the domain of the household, despite arguments that lessons from the design of workplace artefacts cannot be blindly transferred into the domain of the household. This paper discusses why the ideal of unremarkable or ubiquitous computing is too narrow with respect to the household. It points out how understanding technology use is a matter of looking into the process of use, and how the specific context of the home, in several ways, calls for technology to be remarkable rather than unremarkable.

  9. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    Science.gov (United States)

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
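
    The core idea of element-based absolute quantification reduces to a stoichiometric ratio. A toy illustration with made-up numbers (ours, not a published protocol):

```python
# Absolute protein quantification via ICPMS rests on a fixed
# element:protein stoichiometry (e.g. sulfur from Cys/Met residues).
def protein_concentration(element_molar, atoms_per_protein):
    """Protein molarity from measured element molarity and stoichiometry."""
    return element_molar / atoms_per_protein

# A protein with 12 sulfur atoms and a measured S concentration of 2.4 uM:
print(protein_concentration(2.4e-6, 12))   # -> 2.0e-07 M protein
```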

  10. Label-free SERS in biological and biomedical applications: Recent progress, current challenges and opportunities

    Science.gov (United States)

    Zheng, Xiao-Shan; Jahn, Izabella Jolan; Weber, Karina; Cialla-May, Dana; Popp, Jürgen

    2018-05-01

    To achieve an insightful look within biomolecular processes on the cellular level, the development of diseases as well as the reliable detection of metabolites and pathogens, a modern analytical tool is needed that is highly sensitive, molecular-specific and exhibits fast detection. Surface-enhanced Raman spectroscopy (SERS) is known to meet these requirements and, within this review article, the recent progress of label-free SERS in biological and biomedical applications is summarized and discussed. This includes the detection of biomolecules such as metabolites, nucleic acids and proteins. Further, the characterization and identification of microorganisms has been achieved by label-free SERS-based approaches. Eukaryotic cells can be characterized by SERS in order to gain information about the outer cell wall or to detect intracellular molecules and metabolites. The potential of SERS for medically relevant detection schemes is emphasized by the label-free detection of tissue, the investigation of body fluids as well as applications for therapeutic and illicit drug monitoring. The review article is concluded with an evaluation of the recent progress and current challenges in order to highlight the direction of label-free SERS in the future.

  11. IAEA Mission Sees Significant Progress in Georgia’s Regulatory Framework, Challenges Ahead

    International Nuclear Information System (INIS)

    2018-01-01

    An International Atomic Energy Agency (IAEA) team of experts said Georgia has made significant progress in strengthening its regulatory framework for nuclear and radiation safety. The team also pointed to challenges ahead as Georgia seeks to achieve further progress. The Integrated Regulatory Review Service (IRRS) team concluded a 10-day mission on 28 February to assess the regulatory safety framework in Georgia. The mission was conducted at the request of the Government and hosted by the Agency of Nuclear and Radiation Safety (ANRS), which is responsible for regulatory oversight in the country. IRRS missions are designed to strengthen the effectiveness of the national safety regulatory infrastructure, while recognizing the responsibility of each State to ensure nuclear and radiation safety. Georgia uses radioactive sources in medicine and industry and operates radioactive waste management facilities. It has decommissioned its only research reactor and has no nuclear power plants. In recent years, the Government and ANRS, with assistance from the IAEA, introduced new safety regulations and increased the number of regulatory inspections.

  12. Progression of vestibular schwannoma after GammaKnife radiosurgery: A challenge for microsurgical resection.

    Science.gov (United States)

    Aboukaïs, Rabih; Bonne, Nicolas-Xavier; Touzet, Gustavo; Vincent, Christophe; Reyns, Nicolas; Lejeune, Jean-Paul

    2018-05-01

    We aimed to evaluate the outcome of patients who underwent salvage microsurgery for vestibular schwannoma (VS) that failed primary Gamma Knife radiosurgery (GKS). Among the 1098 patients who received GKS for the treatment of VS in our center between January 2004 and December 2012, follow-up was organized in our institution for 290 patients who lived in our recruitment area. Tumor progression was noted in 23 patients. Salvage microsurgical resection was performed in 11 patients, who were included in our study. Grading of facial function was done according to the House-Brackmann scale. The mean age at diagnosis was 50.2 years (19-68 years) and the mean follow-up was 9.4 years (4-13 years). The mean dose was 11.8 Gy (11-12 Gy) and the mean tumor volume was 922 mm3 (208-2500 mm3). The mean period between GKS and diagnosis of tumor progression was 32 months (18-72 months). Concerning salvage microsurgery, complete resection was obtained in 8 patients. A small residual tumor on the facial nerve was deliberately left in 3 patients, and no tumor progression was noted over a mean follow-up of 26 months. At last follow-up, facial nerve function was grade 1 in 4 patients, grade 2 in 3 patients, grade 3 in 1 patient and grade 4 in 3 patients. Salvage surgery for recurrent vestibular schwannoma after failed initial GKS remains a good treatment. However, facial nerve preservation is more challenging in this setting, and a small tumor remnant may sometimes be deliberately left. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Algebraic Functions, Computer Programming, and the Challenge of Transfer

    Science.gov (United States)

    Schanzer, Emmanuel Tanenbaum

    2015-01-01

    Students' struggles with algebra are well documented. Prior to the introduction of functions, mathematics is typically focused on applying a set of arithmetic operations to compute an answer. The introduction of functions, however, marks the point at which mathematics begins to focus on building up abstractions as a way to solve complex problems.…

  14. Challenges in scaling NLO generators to leadership computers

    Science.gov (United States)

    Benjamin, D.; Childers, JT; Hoeche, S.; LeCompte, T.; Uram, T.

    2017-10-01

    Exascale computing resources are roughly a decade away and will be capable of 100 times more computing than current supercomputers. In the last year, Energy Frontier experiments crossed a milestone of 100 million core-hours used at the Argonne Leadership Computing Facility, Oak Ridge Leadership Computing Facility, and NERSC. The Fortran-based leading-order parton generator called Alpgen was successfully scaled to millions of threads to achieve this level of usage on Mira. Sherpa and MadGraph are next-to-leading-order generators used heavily by LHC experiments for simulation. Integration times for high-multiplicity or rare processes can take a week or more on standard Grid machines, even using all 16 cores. We will describe our ongoing work to scale the Sherpa generator to thousands of threads on leadership-class machines and reduce run-times to less than a day. This work allows the experiments to leverage large-scale parallel supercomputers for event generation today, freeing tens of millions of grid hours for other work, and paving the way for future applications (simulation, reconstruction) on these and future supercomputers.

  15. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01

    [The abstract of this 1989 report did not survive scanning; legible fragments refer to intelligent support systems for human-computer interaction and cite work by S.K. Card and T.P. Moran.]

  16. The challenge of quantum computer simulations of physical phenomena

    International Nuclear Information System (INIS)

    Ortiz, G.; Knill, E.; Gubernatis, J.E.

    2002-01-01

    The goal of physics simulation using controllable quantum systems ('physics imitation') is to exploit quantum laws to advantage, and thus accomplish efficient simulation of physical phenomena. In this Note, we discuss the fundamental concepts behind this paradigm of information processing, such as the connection between models of computation and physical systems. The experimental simulation of a toy quantum many-body problem is described
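
    The cost argument behind such 'physics imitation' is easy to see classically: exact emulation of n spins needs 2^n amplitudes. The sketch below is our toy example, not from the Note; it evolves a two-spin transverse-field Ising system exactly, and adding one more spin doubles the state vector.

```python
# Toy classical emulation of quantum dynamics: exact unitary evolution
# of a two-spin transverse-field Ising Hamiltonian.
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

J, h, t = 1.0, 0.5, 1.0
H = -J * np.kron(sz, sz) - h * (np.kron(sx, I2) + np.kron(I2, sx))
U = expm(-1j * H * t)                    # exact propagator exp(-iHt)

psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0                            # start in |up, up>
psi_t = U @ psi0
print(np.abs(psi_t) ** 2)                # outcome probabilities at time t
```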

  17. Petroleum industry's current progress and challenges in dealing with the year 2000 problem

    International Nuclear Information System (INIS)

    Kraus, D.I.; Radu, C.G.; McKenzie, S.; Stuart, K.

    1998-01-01

    The steps that some major oil companies in Canada have taken to prepare their computers and automated equipment for the year 2000 (Y2K) are described. It is acknowledged that with 1700 retail service stations, over 300 wholesale operations, and 26 terminals, the extent of the problem is great. In addition, some 38 upstream and 65 downstream applications have been identified as mission critical, not counting the roughly 30 mission-critical control systems in the upstream field operations and approximately the same number in the downstream refinery. The good news is that remediation and testing are well underway nationally, and the Calgary test laboratory will be commercially available in 1999. Getting management onside, selling the positive aspects of Y2K, making good use of reputable consulting companies, and keeping employees properly informed of problems and progress are some of the key criteria for solving Y2K problems successfully.

  18. Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Johnson, Gary M.; Washington, Warren M.

    2009-07-02

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) in partnership with the Office of Advanced Scientific Computing Research (ASCR) held a workshop on the challenges in climate change science and the role of computing at the extreme scale, November 6-7, 2008, in Bethesda, Maryland. At the workshop, participants identified the scientific challenges facing the field of climate science and outlined the research directions of highest priority that should be pursued to meet these challenges. Representatives from the national and international climate change research community as well as representatives from the high-performance computing community attended the workshop. This group represented a broad mix of expertise. Of the 99 participants, 6 were from international institutions. Before the workshop, each of the four panels prepared a white paper, which provided the starting place for the workshop discussions. These four panels of workshop attendees devoted their efforts to the following themes: Model Development and Integrated Assessment; Algorithms and Computational Environment; Decadal Predictability and Prediction; Data, Visualization, and Computing Productivity. The recommendations of the panels are summarized in the body of this report.

  19. Single-Cell Transcriptomics Bioinformatics and Computational Challenges

    Directory of Open Access Journals (Sweden)

    Lana Garmire

    2016-09-01

    Full Text Available The emerging single-cell RNA-Seq (scRNA-Seq) technology holds the promise to revolutionize our understanding of diseases and associated biological processes at an unprecedented resolution. It opens the door to revealing intercellular heterogeneity and has been employed in a variety of applications, ranging from characterizing cancer cell subpopulations to elucidating tumor resistance mechanisms. Parallel to improving experimental protocols to deal with technological issues, deriving new analytical methods to reveal the complexity in scRNA-Seq data is just as challenging. Here we review the current state-of-the-art bioinformatics tools and methods for scRNA-Seq analysis, and address some critical analytical challenges that the field faces.
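
    To make the kind of workflow under review concrete, here is a hedged sketch of a common preprocessing-to-clustering pipeline using one widely used toolkit (Scanpy); the input path is hypothetical and the parameter choices are conventional defaults, not recommendations from the article.

```python
# A typical scRNA-Seq analysis skeleton (one of many possible toolchains).
import scanpy as sc

adata = sc.read_10x_mtx("filtered_matrices/")   # hypothetical 10x output path
sc.pp.filter_cells(adata, min_genes=200)        # drop empty/low-quality cells
sc.pp.filter_genes(adata, min_cells=3)          # drop rarely detected genes
sc.pp.normalize_total(adata, target_sum=1e4)    # sequencing-depth normalization
sc.pp.log1p(adata)
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
adata = adata[:, adata.var.highly_variable]
sc.tl.pca(adata)
sc.pp.neighbors(adata)                          # kNN graph in PCA space
sc.tl.leiden(adata)                             # cluster cells (needs leidenalg)
sc.tl.umap(adata)                               # 2-D embedding for inspection
```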

  20. Nanoparticle-Based Drug Delivery for Therapy of Lung Cancer: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Anish Babu

    2013-01-01

    Full Text Available The last decade has witnessed enormous advances in the development and application of nanotechnology in cancer detection, diagnosis, and therapy, culminating in the development of the nascent field of “cancer nanomedicine.” A nanoparticle, as per the National Institutes of Health (NIH) guidelines, is any material that is used in the formulation of a drug resulting in a final product smaller than 1 micron in size. Nanoparticle-based therapeutic systems have gained immense popularity due to their ability to overcome biological barriers, effectively deliver hydrophobic therapies, and preferentially target disease sites. Currently, many formulations of nanocarriers are utilized, including lipid-based, polymeric and branched polymeric, metal-based, magnetic, and mesoporous silica. Innovative strategies have been employed to exploit multicomponent, three-dimensional constructs imparting multifunctional capabilities. Engineering such designs allows simultaneous drug delivery of chemotherapeutics and anticancer gene therapies to site-specific targets. In lung cancer, nanoparticle-based therapeutics are paving the way in the diagnosis, imaging, screening, and treatment of primary and metastatic tumors. However, translating such advances from the bench to the bedside has been severely hampered by challenges encountered in the areas of pharmacology, toxicology, immunology, large-scale manufacturing, and regulatory issues. This review summarizes current progress and challenges in nanoparticle-based drug delivery systems, citing recent examples targeted at lung cancer treatment.

  1. Progress and challenges of engineering a biophysical CO2-concentrating mechanism into higher plants.

    Science.gov (United States)

    Rae, Benjamin D; Long, Benedict M; Förster, Britta; Nguyen, Nghiem D; Velanis, Christos N; Atkinson, Nicky; Hee, Wei Yih; Mukherjee, Bratati; Price, G Dean; McCormick, Alistair J

    2017-06-01

    Growth and productivity in important crop plants is limited by the inefficiencies of the C3 photosynthetic pathway. Introducing CO2-concentrating mechanisms (CCMs) into C3 plants could overcome these limitations and lead to increased yields. Many unicellular microautotrophs, such as cyanobacteria and green algae, possess highly efficient biophysical CCMs that increase CO2 concentrations around the primary carboxylase enzyme, Rubisco, to enhance CO2 assimilation rates. Algal and cyanobacterial CCMs utilize distinct molecular components, but share several functional commonalities. Here we outline the recent progress and current challenges of engineering biophysical CCMs into C3 plants. We review the predicted requirements for a functional biophysical CCM based on current knowledge of cyanobacterial and algal CCMs, the molecular engineering tools and research pipelines required to translate our theoretical knowledge into practice, and the current challenges to achieving these goals. © The Author 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  2. Transformation of the education of health professionals in China: progress and challenges.

    Science.gov (United States)

    Hou, Jianlin; Michaud, Catherine; Li, Zhihui; Dong, Zhe; Sun, Baozhi; Zhang, Junhua; Cao, Depin; Wan, Xuehong; Zeng, Cheng; Wei, Bo; Tao, Lijian; Li, Xiaosong; Wang, Weimin; Lu, Yingqing; Xia, Xiulong; Guo, Guifang; Zhang, Zhiyong; Cao, Yunfei; Guan, Yuanzhi; Meng, Qingyue; Wang, Qing; Zhao, Yuhong; Liu, Huaping; Lin, Huiqing; Ke, Yang; Chen, Lincoln

    2014-08-30

    In this Review we examine the progress and challenges of China's ambitious 1998 reform of the world's largest health professional educational system. The reforms merged training institutions into universities and greatly expanded enrolment of health professionals. Positive achievements include an increase in the number of graduates to address human resources shortages, acceleration of the production of diploma nurses to correct skill-mix imbalance, and priority for general practitioner training, especially of rural primary care workers. These developments have been accompanied by concerns: rapid expansion of student numbers without commensurate faculty strengthening, worries about a dilution effect on quality, outdated curricular content, and ethical professionalism challenged by narrow technical training and the growing admission of students who did not express medicine as their first career choice. In this Review we underscore the importance of rebalancing the roles of health sciences institutions and government in educational policies and implementation. The imperative for reform is shown by a looming crisis of violence against health workers, hypothesised to result from many factors including deficient educational preparation and harmful profit-driven clinical practices. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Challenges and Recent Progress in the Development of a Closed-loop Artificial Pancreas.

    Science.gov (United States)

    Bequette, B Wayne

    2012-12-01

    Pursuit of a closed-loop artificial pancreas that automatically controls the blood glucose of individuals with type 1 diabetes has intensified during the past six years. Here we discuss the recent progress and challenges in the major steps towards a closed-loop system. Continuous insulin infusion pumps have been widely available for over two decades, but "smart pump" technology has made the devices easier to use and more powerful. Continuous glucose monitoring (CGM) technology has improved and the devices are more widely available. A number of approaches are currently under study for fully closed-loop systems; most manipulate only insulin, while others manipulate insulin and glucagon. Algorithms include on-off (for prevention of overnight hypoglycemia), proportional-integral-derivative (PID), model predictive control (MPC) and fuzzy logic based learning control. Meals cause a major "disturbance" to blood glucose, and we discuss techniques that our group has developed to predict when a meal is likely to be consumed and its effect. We further examine both physiology and device-related challenges, including insulin infusion set failure and sensor signal attenuation. Finally, we discuss the next steps required to make a closed-loop artificial pancreas a commercial reality.
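
    Of the control algorithms listed, PID is the simplest to show. The following is an illustrative toy only, not a medical dosing algorithm; the gains and the 120 mg/dL target are placeholders.

```python
# Discrete PID law computing an insulin infusion rate from glucose error.
class PID:
    def __init__(self, kp, ki, kd, setpoint=120.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev = None

    def insulin_rate(self, glucose, dt):
        """Return a non-negative infusion rate from the current CGM reading."""
        err = glucose - self.setpoint            # positive when above target
        self.integral += err * dt
        deriv = 0.0 if self.prev is None else (glucose - self.prev) / dt
        self.prev = glucose
        u = self.kp * err + self.ki * self.integral + self.kd * deriv
        return max(0.0, u)                       # pumps cannot infuse negatively

controller = PID(kp=0.02, ki=0.001, kd=0.05)
print(controller.insulin_rate(180.0, dt=5.0))    # one 5-minute CGM step
```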

  4. Operating the worldwide LHC computing grid: current and future challenges

    International Nuclear Information System (INIS)

    Molina, J Flix; Forti, A; Girone, M; Sciaba, A

    2014-01-01

    The Worldwide LHC Computing Grid project (WLCG) provides the computing and storage resources required by the LHC collaborations to store, process and analyse their data. It includes almost 200,000 CPU cores, 200 PB of disk storage and 200 PB of tape storage distributed among more than 150 sites. The WLCG operations team is responsible for several essential tasks, such as the coordination of testing and deployment of Grid middleware and services, communication with the experiments and the sites, follow-up and resolution of operational issues, and medium/long-term planning. In 2012 WLCG critically reviewed all operational procedures and restructured the organisation of the operations team as a more coherent effort in order to improve its efficiency. In this paper we describe how the new organisation works, its recent successes and the changes to be implemented during the long LHC shutdown in preparation for LHC Run 2.

  5. "Tennis elbow". A challenging call for computation and medicine

    Science.gov (United States)

    Sfetsioris, D.; Bontioti, E. N.

    2014-10-01

    An attempt to give insight into the features composing this musculotendinous disorder. We address the issues of definition, pathophysiology and the mechanism underlying the onset and occurrence of the disease, diagnosis and diagnostic tools, as well as the methods of treatment. We focus mostly on conservative treatment protocols and recognize the need for a more thorough investigation with the aid of computation.

  6. Computational brain models: Advances from system biology and future challenges

    Directory of Open Access Journals (Sweden)

    George E. Barreto

    2015-02-01

    Full Text Available Computational brain models focused on the interactions between neurons and astrocytes, modeled via metabolic reconstructions, are reviewed. The large body of experimental data provided by the -omics techniques and the advance and application of computational and data-management tools have been fundamental, for instance, in the understanding of the crosstalk between these cells, the key neuroprotective mechanisms mediated by astrocytes in specific metabolic scenarios (1), and the identification of biomarkers for neurodegenerative diseases (2,3). However, the modeling of these interactions demands a clear view of the metabolic and signaling pathways implicated, but most of them are controversial and still under evaluation (4). Hence, to gain insight into the complexity of these interactions, a current view of the main pathways implicated in neuron-astrocyte communication processes has been compiled from recent experimental reports and reviews. Furthermore, target problems, limitations, and main conclusions have been identified from metabolic models of the brain reported since 2010. Finally, key aspects for the development of a computational model of the brain and topics that could be approached from a systems biology perspective in future research are highlighted.

  7. Factors associated with coronary artery disease progression assessed by serial coronary computed tomography angiography

    International Nuclear Information System (INIS)

    Camargo, Gabriel Cordeiro; Gottlieb, Ilan; Rothstein, Tamara; Derenne, Maria Eduarda; Sabioni, Leticia; Lima, Ronaldo de Souza Leão; Lima, João A. C.

    2017-01-01

    Background: Coronary computed tomography angiography (CCTA) allows for noninvasive coronary artery disease (CAD) phenotyping. Factors related to CAD progression are epidemiologically valuable. Objective: To identify factors associated with CAD progression in patients undergoing sequential CCTA testing. Methods: We retrospectively analyzed 384 consecutive patients who had at least two CCTA studies between December 2005 and March 2013. Due to limitations in the quantification of CAD progression, we excluded patients who had undergone surgical revascularization previously or percutaneous coronary intervention (PCI) between studies. CAD progression was defined as any increase in the adapted segment stenosis score (calculated using the number of diseased segments and stenosis severity) in all coronary segments without stent (in-stent restenosis was excluded from the analysis). Stepwise logistic regression was used to assess variables associated with CAD progression. Results: From a final population of 234 patients, a total of 117 (50%) had CAD progression. In a model accounting for major CAD risk factors and other baseline characteristics, only age (odds ratio [OR] 1.04, 95% confidence interval [95%CI] 1.01–1.07), interstudy interval (OR 1.03, 95%CI 1.01–1.04), and past PCI (OR 3.66, 95%CI 1.77–7.55) showed an independent relationship with CAD progression. Conclusions: A history of PCI with stent placement was independently associated with a 3.7-fold increase in the odds of CAD progression, excluding in-stent restenosis. Age and interstudy interval were also independent predictors of progression. (author)
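
    A stepwise logistic regression of the kind reported can be sketched as forward selection by p-value. This is an illustrative reconstruction, not the authors' analysis; the column names and the entry threshold are hypothetical stand-ins for the predictors named in the abstract.

```python
# Forward stepwise logistic regression: greedily add the predictor with
# the smallest p-value until none pass the entry threshold.
import statsmodels.api as sm

def forward_stepwise(df, outcome, candidates, p_enter=0.05):
    selected = []
    while True:
        pvals = {}
        for var in (v for v in candidates if v not in selected):
            X = sm.add_constant(df[selected + [var]])
            fit = sm.Logit(df[outcome], X).fit(disp=0)
            pvals[var] = fit.pvalues[var]
        if not pvals:
            return selected
        best = min(pvals, key=pvals.get)
        if pvals[best] >= p_enter:
            return selected
        selected.append(best)

# df: a pandas DataFrame with hypothetical columns mirroring the abstract:
# forward_stepwise(df, "progression", ["age", "interstudy_interval", "past_pci"])
```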

  9. Center for computation and visualization of geometric structures. [Annual], Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-02-12

    The mission of the Center is to establish a unified environment promoting research, education, and software and tool development. The work is centered on computing, interpreted in a broad sense to include the relevant theory, development of algorithms, and actual implementation. The research aspects of the Center are focused on geometry; correspondingly, the computational aspects are focused on three- (and higher-) dimensional visualization. The educational aspects are likewise centered on computing and focused on geometry. A broader term than education is 'communication', which encompasses the challenge of explaining to the world current research in mathematics, and specifically geometry.

  10. Computational Science And Engineering Software Sustainability And Productivity (CSESSP) Challenges Workshop Report

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This report details the challenges and opportunities discussed at the NITRD sponsored multi-agency workshop on Computational Science and Engineering Software...

  11. New challenges in grid generation and adaptivity for scientific computing

    CERN Document Server

    Formaggia, Luca

    2015-01-01

    This volume collects selected contributions from the “Fourth Tetrahedron Workshop on Grid Generation for Numerical Computations”, which was held in Verbania, Italy in July 2013. The previous editions of this Workshop were hosted by the Weierstrass Institute in Berlin (2005), by INRIA Rocquencourt in Paris (2007), and by Swansea University (2010). This book covers different, though related, aspects of the field: the generation of quality grids for complex three-dimensional geometries; parallel mesh generation algorithms; mesh adaptation, including both theoretical and implementation aspects; grid generation and adaptation on surfaces – all with an interesting mix of numerical analysis, computer science and strongly application-oriented problems.

  12. Association between Smoking and the Progression of Computed Tomography Findings in Chronic Pancreatitis.

    Science.gov (United States)

    Lee, Jeong Woo; Kim, Ho Gak; Lee, Dong Wook; Han, Jimin; Kwon, Hyuk Yong; Seo, Chang Jin; Oh, Ji Hye; Lee, Joo Hyoung; Jung, Jin Tae; Kwon, Joong Goo; Kim, Eun Young

    2016-05-23

    Smoking and alcohol intake are two well-known risk factors for chronic pancreatitis. However, there are few studies examining the association between smoking and changes in computed tomography (CT) findings in chronic pancreatitis. The authors evaluated associations between smoking, drinking, and the progression of calcification on CT in chronic pancreatitis. In this retrospective study, 59 patients with chronic pancreatitis who had undergone initial and follow-up CT between January 2002 and September 2010 were included. Progression of calcification on CT was compared according to the amount of alcohol intake and smoking. The median duration of follow-up was 51.6 months (range, 17.1 to 112.7 months). On initial CT, there was pancreatic calcification in 35 patients (59.3%). On follow-up CT, progression of calcification was observed in 37 patients (62.7%). Progression of calcification was more common in smokers according to the multivariate analysis (odds ratio [OR], 9.987; p=0.006). The amount of smoking was a significant predictor of progression of calcification in the multivariate analysis (OR, 6.051 in less than 1 pack per day smokers; OR, 36.562 in more than 1 pack per day smokers; p=0.008). Continued smoking accelerates pancreatic calcification, and the amount of smoking is associated with the progression of calcification in chronic pancreatitis.

  13. Measles and rubella elimination in the WHO Region for Europe: progress and challenges.

    Science.gov (United States)

    O'Connor, P; Jankovic, D; Muscat, M; Ben-Mamou, M; Reef, S; Papania, M; Singh, S; Kaloumenos, T; Butler, R; Datta, S

    2017-08-01

    Globally, measles remains one of the leading causes of death among young children, even though a safe and cost-effective vaccine is available. The World Health Organization (WHO) European Region has seen a decline in measles and rubella cases in recent years. The recent outbreaks have primarily affected adolescents and young adults with no vaccination or an incomplete vaccination history. Eliminating measles and rubella is one of the top immunization priorities of the European Region as outlined in the European Vaccine Action Plan 2015-2020. Following the 2010 decision by the Member States in the Region to initiate the process of verifying elimination, the European Regional Verification Commission for Measles and Rubella Elimination (RVC) was established in 2011. The RVC meets every year to evaluate the status of measles and rubella elimination in the Region based on documentation submitted by each country's National Verification Committees. The verification process was however modified in late 2014 to assess the elimination status at the individual country level instead of at regional level. The WHO European Region has made substantial progress towards measles and rubella elimination over the past 5 years. The RVC's conclusion in 2016 that 70% and 66% of the 53 Member States in the Region had interrupted the endemic transmission of measles and rubella, respectively, by 2015 is a testament to this progress. Nevertheless, where measles and rubella remain endemic, challenges in vaccination service delivery and disease surveillance will need to be addressed through focused technical assistance from WHO and development partners. Copyright © 2017 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  15. Progress in Harmonizing Tiered HIV Laboratory Systems: Challenges and Opportunities in 8 African Countries.

    Science.gov (United States)

    Williams, Jason; Umaru, Farouk; Edgil, Dianna; Kuritsky, Joel

    2016-09-28

    In 2014, the Joint United Nations Programme on HIV/AIDS released its 90-90-90 targets, which make laboratory diagnostics a cornerstone for measuring efforts toward the epidemic control of HIV. A data-driven laboratory harmonization and standardization approach is one way to create efficiencies and ensure optimal laboratory procurements. Following the 2008 "Maputo Declaration on Strengthening of Laboratory Systems"-a call for government leadership in harmonizing tiered laboratory networks and standardizing testing services-several national ministries of health requested that the United States Government and in-country partners help implement the recommendations by facilitating laboratory harmonization and standardization workshops, with a primary focus on improving HIV laboratory service delivery. Between 2007 and 2015, harmonization and standardization workshops were held in 8 African countries. This article reviews progress in the harmonization of laboratory systems in these 8 countries. We examined agreed-upon instrument lists established at the workshops and compared them against instrument data from laboratory quantification exercises over time. We used this measure as an indicator of adherence to national procurement policies. We found high levels of diversity across laboratories' diagnostic instruments, equipment, and services. This diversity contributes to different levels of compliance with expected service delivery standards. We believe the following challenges to be the most important to address: (1) lack of adherence to procurement policies, (2) absence or limited influence of a coordinating body to fully implement harmonization proposals, and (3) misalignment of laboratory policies with minimum packages of care and with national HIV care and treatment guidelines. Overall, the effort to implement the recommendations from the Maputo Declaration has had mixed success and is a work in progress. Program managers should continue efforts to advance the

  16. Recent progress and new challenges in isospin physics with heavy-ion reactions

    Energy Technology Data Exchange (ETDEWEB)

    Li Baoan [Department of Physics, Texas A and M University-Commerce, Commerce, TX 75429-3011 (United States)], E-mail: Bao-An_Li@Tamu-Commerce.edu; Chen Liewen [Institute of Theoretical Physics, Shanghai Jiao Tong University, Shanghai 200240 (China)], E-mail: Lwchen@Sjtu.edu.cn; Ko, Che Ming [Cyclotron Institute and Physics Department, Texas A and M University, College Station, TX 77843-3366 (United States)], E-mail: Ko@Comp.tamu.edu

    2008-08-15

    The ultimate goal of studying isospin physics via heavy-ion reactions with neutron-rich, stable and/or radioactive nuclei is to explore the isospin dependence of in-medium nuclear effective interactions and the equation of state of neutron-rich nuclear matter, particularly the isospin-dependent term in the equation of state, i.e., the density dependence of the symmetry energy. Because of its great importance for understanding many phenomena in both nuclear physics and astrophysics, the study of the density dependence of the nuclear symmetry energy has been the main focus of the intermediate-energy heavy-ion physics community during the last decade, and significant progress has been achieved both experimentally and theoretically. In particular, a number of phenomena or observables have been identified as sensitive probes to the density dependence of nuclear symmetry energy. Experimental studies have confirmed some of these interesting isospin-dependent effects and allowed us to constrain relatively stringently the symmetry energy at sub-saturation densities. The impact of this constrained density dependence of the symmetry energy on the properties of neutron stars have also been studied, and they were found to be very useful for the astrophysical community. With new opportunities provided by the various radioactive beam facilities being constructed around the world, the study of isospin physics is expected to remain one of the forefront research areas in nuclear physics. In this report, we review the major progress achieved during the last decade in isospin physics with heavy ion reactions and discuss future challenges to the most important issues in this field.
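
    For orientation, the 'density dependence of the symmetry energy' refers to E_sym(rho) in the standard parabolic expansion of the energy per nucleon of asymmetric nuclear matter (textbook form, not specific to this report):

```latex
E(\rho,\delta) \simeq E_0(\rho) + E_{\mathrm{sym}}(\rho)\,\delta^{2},
\qquad \delta = \frac{\rho_n - \rho_p}{\rho}
```

    Constraining how E_sym varies with the density rho, especially below saturation density, is what the heavy-ion observables reviewed here are designed to probe.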

  17. Progress and challenges in improving the nutritional quality of rice (Oryza sativa L.).

    Science.gov (United States)

    Birla, Deep Shikha; Malik, Kapil; Sainger, Manish; Chaudhary, Darshna; Jaiwal, Ranjana; Jaiwal, Pawan K

    2017-07-24

    Rice is a staple food for more than 3 billion people in more than 100 countries of the world but ironically it is deficient in many bioavailable vitamins, minerals, essential amino- and fatty-acids and phytochemicals that prevent chronic diseases like type 2 diabetes, heart disease, cancers, and obesity. To enhance the nutritional and other quality aspects of rice, a better understanding of the regulation of the processes involved in the synthesis, uptake, transport, and metabolism of macro- (starch, seed storage protein and lipid) and micronutrients (vitamins, minerals and phytochemicals) is required. With the publication of the high-quality genomic sequence of rice, significant progress has been made in the identification, isolation, and characterization of novel genes and their regulation for the nutritional and quality enhancement of rice. During the last decade, numerous efforts have been made to refine the nutritional and other quality traits, either by using traditional breeding with high-throughput technologies such as marker-assisted selection and breeding, or by adopting the transgenic approach. A significant improvement in vitamin (A, folate, and E), mineral (iron), essential amino acid (lysine), and flavonoid levels has been achieved in the edible part of rice, i.e., the endosperm (biofortification), to meet the daily dietary allowance. However, studies on the bioavailability and allergenicity of biofortified rice are still required. Despite the numerous efforts, the commercialization of biofortified rice has not yet been achieved. The present review summarizes the progress and challenges of genetic engineering and/or metabolic engineering technologies to improve rice grain quality, and presents the future prospects in developing nutrient-dense rice to save the ever-increasing population, that depends solely on rice as the staple food, from widespread nutritional deficiencies.

  18. Fifth annual progress report for Canada's climate change voluntary challenge and registry program

    International Nuclear Information System (INIS)

    1999-10-01

    Suncor Energy is a growing Canada-based integrated energy company comprising a corporate group and four operating businesses including: Oil Sands, with a mine and upgrading facility at Fort McMurray, AB; Exploration and Production, with conventional and heavy oil business in Western Canada; a Sunoco refining and marketing operation; and the Stuart Oil Shale Development Project in Queensland, Australia. While the emphasis is laid on technical and economic advances made by the company, the environmental tradeoffs, namely greater greenhouse gas emissions and the need to reduce them, are noted. The most important positive item in the report is the incredible transformation occurring in Suncor's business operations. The company has begun a $2 billion expansion in its Oil Sands business that will more than double production of crude oil and fuel products by 2002. The expansion initiative provides a wonderful opportunity to demonstrate the huge leaps in performance that can be implemented at the time of capital stock turnover. The new expansion facilities are designed to be twice as energy efficient as the existing plant. Equally dramatic, and hard won, are the multitude of incremental improvements achieved in existing facilities. Through energy management systems and operating practices and procedures, Exploration and Production is reversing the trend of rising greenhouse gas (GHG) emission intensity associated with mature conventional reservoirs, and Suncor achieved its best ever operating performance in 1998. However, the volume of Suncor greenhouse gas emissions remains on an upward trend, which is a challenge for the future. As part of its mission to become a sustainable energy company, Suncor will continue to attempt to limit its net volume contribution of GHGs to the atmosphere to 1990 levels by pursuing domestic and international offsets and the development of alternative and renewable sources of energy. Progress towards sustainability for both Suncor and Canada

  19. Leaderboard Now Open: CPTAC’s DREAM Proteogenomics Computational Challenge | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The National Cancer Institute’s Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the opening of the leaderboard for its Proteogenomics Computational DREAM Challenge. The leaderboard remains open for submissions from September 25, 2017 through October 8, 2017, with the Challenge expected to run until November 17, 2017.

  20. Spectroscopy, modeling and computation of metal chelate solubility in supercritical CO2. 1998 annual progress report

    International Nuclear Information System (INIS)

    Brennecke, J.F.; Chateauneuf, J.E.; Stadtherr, M.A.

    1998-01-01

    This report summarizes work after 1 year and 8 months (9/15/96-5/14/98) of a 3-year project. Thus far, progress has been made in: (1) the measurement of the solubility of metal chelates in supercritical (SC) CO₂ with and without added cosolvents, (2) the spectroscopic determination of preferential solvation of metal chelates by cosolvents in SC CO₂ solutions, and (3) the development of a totally reliable computational technique for phase equilibrium computations. An important factor in the removal of metals from solid matrices with CO₂/chelate mixtures is the equilibrium solubility of the metal chelate complex in the CO₂.
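
    For orientation, the equilibrium solubility the authors compute is commonly estimated from the solute's sublimation pressure, a Poynting correction, and a fugacity coefficient for the solute in the supercritical phase. The Python sketch below evaluates that standard enhancement-factor relation; it is not the project's method, and every numerical value is an illustrative assumption.

    # Minimal sketch: solid solubility in SC CO2 from the standard relation
    # y2 = (Psub / P) * exp(v_s * (P - Psub) / (R * T)) / phi2.
    # All parameter values below are illustrative assumptions.
    import math

    R = 8.314        # gas constant, J/(mol K)
    T = 318.15       # K, assumed extraction temperature
    P = 20.0e6       # Pa, assumed CO2 pressure (200 bar)
    P_sub = 1.0e-3   # Pa, assumed sublimation pressure of the chelate
    v_s = 2.0e-4     # m^3/mol, assumed molar volume of the solid
    phi2 = 0.05      # assumed fugacity coefficient of the solute in SC CO2

    poynting = math.exp(v_s * (P - P_sub) / (R * T))
    y2 = (P_sub / P) * poynting / phi2   # equilibrium mole fraction
    print(f"estimated solubility y2 = {y2:.3e} (mole fraction)")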

  1. Cone beam computed tomographic imaging: perspective, challenges, and the impact of near-trend future applications.

    Science.gov (United States)

    Cavalcanti, Marcelo Gusmão Paraiso

    2012-01-01

    Cone beam computed tomography (CBCT) can be considered a valuable imaging modality for improving diagnosis and treatment planning, providing true guidance for several craniofacial surgical interventions. The interactive imaging workflow it enables is highlighted as a new concept and perspective in medical informatics. The aim of this article was to present, in a short literature review, the usefulness of CBCT technology as an important alternative imaging modality, highlighting current practices and near-term future applications for craniofacial surgical assessment. The article describes the state of the art of CBCT improvements, medical workstations, and the prospects for the dedicated hardware and software that can be used with the CBCT source. In conclusion, CBCT technology is developing rapidly, and many advances are on the horizon. Further progress in medical workstations, engineering capabilities, and independent software (some of it open source) should be pursued with this new imaging method. The perspectives, challenges, and pitfalls of CBCT are delineated and evaluated alongside these technological developments.

  2. Computer Security: Join the CERN WhiteHat Challenge!

    CERN Multimedia

    Computer Security Team

    2014-01-01

    Over the past couple of months, several CERN users have reported vulnerabilities they have found in computing services and servers running at CERN. All were relevant, many were interesting and a few even surprising. Spotting weaknesses and areas for improvement before malicious people can exploit them is paramount: it helps protect the operation of our accelerators and experiments as well as the reputation of the Organization. Therefore, we would like to express our gratitude to those people for having reported these weaknesses! Great job and well done! Seizing the opportunity, we would like to reopen the hunt for bugs, vulnerabilities and insecure configurations of CERN applications, websites and devices. You might recall we ran a similar initiative (“Hide & Seek”) in 2012 where we asked you to sift through CERN’s webpages and send us those that hold sensitive and confidential information. Quite a number of juicy documents were found and subsequently removed.

  3. Recent Progress in First-Principles Methods for Computing the Electronic Structure of Correlated Materials

    Directory of Open Access Journals (Sweden)

    Fredrik Nilsson

    2018-03-01

    Substantial progress has been achieved in the last couple of decades in computing the electronic structure of correlated materials from first principles. This progress has been driven by parallel developments in theory and numerical algorithms. Theoretical development in combining ab initio approaches and many-body methods is particularly promising. A crucial role is also played by a systematic method for deriving a low-energy model, which bridges the gap between real and model systems. In this article, an overview is given tracing the development from LDA+U to the latest progress in combining the GW method and extended dynamical mean-field theory (GW+EDMFT). The emphasis is on conceptual and theoretical aspects rather than technical ones.
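
    As a schematic reminder of the notation (a textbook form, not a result from the article), the GW approximation builds the self-energy from the Green's function G and the screened interaction W, and GW+EDMFT replaces its local part with an EDMFT impurity self-energy:

    \Sigma^{GW}(\mathbf{r},\mathbf{r}';\omega)
      = \frac{i}{2\pi} \int d\omega'\, e^{i\omega' 0^{+}}
        G(\mathbf{r},\mathbf{r}';\omega+\omega')\, W(\mathbf{r},\mathbf{r}';\omega'),
    \qquad W = v + v P W, \quad P = -i G G,

    where v is the bare Coulomb interaction and P the irreducible polarization.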

  4. Developing E-Governance in the Eurasian Economic Union: Progress, Challenges and Prospects

    Directory of Open Access Journals (Sweden)

    Lyudmila Vidiasova

    2017-03-01

    The article provides an overview of e-governance development in the members of the Eurasian Economic Union (EEU). There is a lack of integrated research on e-governance in the EEU countries, although, given the strengthening of this regional bloc, new information and communication technologies (ICT) could serve as a significant growth driver. Given the history and specifics of regional cooperation in the post-Soviet space and international best practices in ICT use in regional blocs, this article reviews the development of e-governance in the EEU members. The research methodology was based on a three-stage concept of regionalism [Van Langenhove, Costea, 2005]. The study examines three key components: progress in developing e-governance, barriers to that development, and future prospects. It used qualitative and quantitative methods. Data sources included the results of the United Nations E-Government rating, EEU countries’ regulations based on 3,200 documents, and the authors’ expert survey, in which 18 experts (12 EEU representatives and six international experts) participated. The study revealed the progress made by EEU countries in improving technological development and reducing human capital development indicators. The survey identified key barriers to e-governance development in the EEU: low motivation and information technology skills among civil servants, and citizens’ low computer literacy. The analysis of EEU members’ national economic priorities revealed a common focus on ICT development. The authors concluded that prospects for e-governance in the EEU were associated with strengthening regional cooperation in standardizing information systems, implementing one-stop-shop services, managing electronic documents, and expanding online services. The authors presented two areas for developing e-governance within the EEU. The first is external integration, which, if strengthened, would affect the economy positively and optimize business processes

  5. Computed tomographic findings of progressive supranuclear palsy compared with Parkinson's disease

    Energy Technology Data Exchange (ETDEWEB)

    Yuki, Nobuhiro; Sato, Shuzo; Yuasa, Tatsuhiko; Ito, Jusuke; Miyatake, Tadashi [Niigata Univ. (Japan). School of Dentistry]

    1990-10-01

    We investigated computed tomographic (CT) films of 4 pathologically documented cases of progressive supranuclear palsy (PSP) in which the clinical presentations were atypical and compared the findings with those of 15 patients with Parkinson's disease (PD). Dilatation of the third ventricle, atrophy of the midbrain tegmentum, and enlargement of the interpeduncular cistern toward the aqueduct were found to be the characteristic findings in PSP. Thus, radiological findings can be useful when the differential diagnosis between PSP and PD is clinically difficult. (author).

  6. The progress and challenges of implementation of the Framework Convention on Tobacco Control (WHO FCTC) in Kyrgyz Republic

    OpenAIRE

    Chinara Bekbasarova

    2018-01-01

    Background and challenges to implementation: The Kyrgyz Republic has been a Party to the WHO FCTC since August 23, 2006. This abstract analyzes progress and challenges during 10 years of implementation of the WHO FCTC. Intervention or response: The national Tobacco Control (TC) Law was adopted on August 21, 2006, entered into force on December 19, 2006, and was amended and supplemented twice during those 10 years. TC measures were included, as one of the main priorities, in the National Program on Heal...

  7. Tobacco Control Policies in Vietnam: Review on MPOWER Implementation Progress and Challenges.

    Science.gov (United States)

    Minh, Hoang Van; Ngan, Tran Thu; Mai, Vu Quynh; My, Nguyen Thi Tuyet; Chung, Le Hong; Kien, Vu Duy; Anh, Tran Tuan; Ngoc, Nguyen Bao; Giap, Vu Van; Cuong, Nguyen Manh; Manh, Pham Duc; Giang, Kim Bao

    2016-01-01

    In Vietnam, the WHO Framework Convention on Tobacco Control (WHO FCTC) took effect in March 2005, while MPOWER has been implemented since 2008. This paper describes the progress and challenges of implementing the MPOWER package in Vietnam. In terms of monitoring, Vietnam is very active in the Global Tobacco Surveillance System, having completed two rounds of the Global Adult Tobacco Survey (GATS) and three rounds of the Global Youth Tobacco Survey (GYTS). To protect people from tobacco smoke, Vietnam has issued and enforced a law requiring comprehensive smoking bans at workplaces and public places since 2013. Tobacco advertising and promotion are also prohibited, with the exception of point-of-sale displays of tobacco products. Violations come in the form of promotion girls, corporate social responsibility activities by tobacco manufacturers, and packages displayed by retail vendors. Vietnam is one of the 77 countries that require pictorial health warnings to be printed on cigarette packages to warn about the dangers of tobacco, and the warnings have been implemented effectively. Cigarette tax is 70% of the factory price, which is equal to less than 45% of the retail price and much lower than the WHO recommendation. However, Vietnam is one of the very few countries that require manufacturers and importers to make "compulsory contributions" of 1-2% of the factory price of cigarettes sold in Vietnam for the establishment of a Tobacco Control Fund (TCF). The TCF is operating well: in 2015, 67 units in 63 provinces/cities, 22 ministries and political-social organizations, and 6 hospitals received funding from the TCF to implement a wide range of tobacco control activities. Cessation services have started with a toll-free quitline but need to be further strengthened. In conclusion, Vietnam has constantly put effort into tobacco control, with high commitment from the government, scientists, and activists. Though several remarkable achievements

  8. Recent progress and future challenges in algal biofuel production [version 1; referees: 4 approved

    Directory of Open Access Journals (Sweden)

    Jonathan B. Shurin

    2016-10-01

    Modern society is fueled by fossil energy produced millions of years ago by photosynthetic organisms. Cultivating contemporary photosynthetic producers to generate energy and capture carbon from the atmosphere is one potential approach to sustaining society without disrupting the climate. Algae, photosynthetic aquatic microorganisms, are the fastest growing primary producers in the world and can therefore produce more energy with less land, water, and nutrients than terrestrial plant crops. We review recent progress and challenges in developing bioenergy technology based on algae. A variety of high-value products in addition to biofuels can be harvested from algal biomass, and these may be key to developing algal biotechnology and realizing the commercial potential of these organisms. Aspects of algal biology that differentiate them from plants demand an integrative approach based on genetics, cell biology, ecology, and evolution. We call for a systems approach to research on algal biotechnology rooted in understanding their biology, from the level of genes to ecosystems, and integrating perspectives from physical, chemical, and social sciences to solve one of the most critical outstanding technological problems.

  9. Urban growth and water access in sub-Saharan Africa: Progress, challenges, and emerging research directions.

    Science.gov (United States)

    Dos Santos, S; Adams, E A; Neville, G; Wada, Y; de Sherbinin, A; Mullin Bernhardt, E; Adamo, S B

    2017-12-31

    For the next decade, the global water crisis remains the risk of highest concern, and ranks ahead of climate change, extreme weather events, food crises and social instability. Across the globe, nearly one in ten people is without access to an improved drinking water source. Least Developed Countries (LDCs) especially in sub-Saharan Africa (SSA) are the most affected, having disproportionately more of the global population without access to clean water than other major regions. Population growth, changing lifestyles, increasing pollution and accelerating urbanization will continue to widen the gap between the demand for water and available supply especially in urban areas, and disproportionately affect informal settlements, where the majority of SSA's urban population resides. Distribution and allocation of water will be affected by climate-induced water stresses, poor institutions, ineffective governance, and weak political will to address scarcity and mediate uncertainties in future supply. While attempts have been made by many scientists to examine different dimensions of water scarcity and urban population dynamics, there are few comprehensive reviews, especially focused on the particular situation in Sub-Saharan Africa. This paper contributes to interdisciplinary understanding of urban water supply by distilling and integrating relevant empirical knowledge on urban dynamics and water issues in SSA, focusing on progress made and associated challenges. It then points out future research directions including the need to understand how alternatives to centralized water policies may help deliver sustainable water supply to cities and informal settlements in the region. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Safety risk management of underground engineering in China: Progress, challenges and strategies

    Directory of Open Access Journals (Sweden)

    Qihu Qian

    2016-08-01

    Underground construction in China is characterized by large scale, high speed, long construction periods, complex operations and challenging conditions regarding project safety. Various accidents have been reported from time to time, resulting in serious social impact and huge economic loss. This paper presents the main progress in the safety risk management of underground engineering in China over the last decade, i.e. (1) establishment of laws and regulations for safety risk management of underground engineering, (2) implementation of the safety risk management plan, (3) establishment of a decision support system for risk management and early warning based on information technology, and (4) strengthening the study of safety risk management, prediction and prevention. Based on the analysis of typical accidents in China in the last decade, the new challenges in safety risk management for underground engineering are identified as follows: (1) control of unsafe human behaviors; (2) technological innovation in safety risk management; and (3) design of safety risk management regulations. Finally, strategies for the safety risk management of underground engineering in China are proposed in six aspects: the safety risk management system and policy, law, administration, economy, education and technology.

  11. P300 brain computer interface: current challenges and emerging trends

    Science.gov (United States)

    Fazel-Rezai, Reza; Allison, Brendan Z.; Guger, Christoph; Sellers, Eric W.; Kleih, Sonja C.; Kübler, Andrea

    2012-01-01

    A brain-computer interface (BCI) enables communication without movement based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), steady state visual evoked potential (SSVEP), or event related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other components of the ERP, based on an oddball paradigm presented to the subject. In this paper, we overview the current status of P300 BCI technology, and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility. PMID:22822397
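
    To make the oddball logic concrete, the following self-contained Python sketch (ours, not the authors') simulates stimulus-locked EEG epochs and shows the core P300 processing step: averaging across trials so the response to rare targets rises above the background noise. Sampling rate, amplitudes and trial counts are illustrative assumptions.

    import numpy as np

    fs = 256                         # sampling rate (Hz), assumed
    t = np.arange(0, 0.8, 1 / fs)    # one 800 ms epoch
    rng = np.random.default_rng(0)

    def simulate_epoch(is_target):
        eeg = rng.normal(0.0, 5.0, t.size)   # ~5 uV background noise
        if is_target:
            # P300: positive deflection peaking ~300 ms post-stimulus
            eeg += 8.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
        return eeg

    targets = np.array([simulate_epoch(True) for _ in range(40)])
    nontargets = np.array([simulate_epoch(False) for _ in range(200)])

    # Averaging suppresses noise roughly as 1/sqrt(N); the ERP survives.
    erp_t, erp_n = targets.mean(axis=0), nontargets.mean(axis=0)
    print(f"target peak {erp_t.max():.1f} uV at "
          f"{1000 * t[np.argmax(erp_t)]:.0f} ms vs "
          f"non-target {erp_n.max():.1f} uV")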

  12. Computational challenges in magnetic-confinement fusion physics

    Science.gov (United States)

    Fasoli, A.; Brunner, S.; Cooper, W. A.; Graves, J. P.; Ricci, P.; Sauter, O.; Villard, L.

    2016-05-01

    Magnetic-fusion plasmas are complex self-organized systems with an extremely wide range of spatial and temporal scales, from the electron-orbit scales (~10⁻¹¹ s, ~10⁻⁵ m) to the diffusion time of electrical current through the plasma (~10² s) and the distance along the magnetic field between two solid surfaces in the region that determines the plasma-wall interactions (~100 m). The description of the individual phenomena and of the nonlinear coupling between them involves a hierarchy of models, which, when applied to realistic configurations, require the most advanced numerical techniques and algorithms and the use of state-of-the-art high-performance computers. The common thread of such models resides in the fact that the plasma components are at the same time sources of electromagnetic fields, via the charge and current densities that they generate, and subject to the action of electromagnetic fields. This leads to a wide variety of plasma modes of oscillation that resonate with the particle or fluid motion and makes the plasma dynamics much richer than that of conventional, neutral fluids.
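
    That two-way coupling can be written compactly in the kinetic description (standard Vlasov-Maxwell form, quoted here for orientation rather than taken from the article): each species s evolves under the Lorentz force, while its velocity moments source the fields,

    \frac{\partial f_s}{\partial t} + \mathbf{v}\cdot\nabla f_s
      + \frac{q_s}{m_s}\left(\mathbf{E} + \mathbf{v}\times\mathbf{B}\right)\cdot\nabla_{\mathbf{v}} f_s = 0,
    \qquad
    \rho = \sum_s q_s \int f_s\, d^3v, \quad
    \mathbf{J} = \sum_s q_s \int \mathbf{v}\, f_s\, d^3v,

    with \rho and \mathbf{J} closing the loop through Maxwell's equations.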

  13. Computer Adaptive Multistage Testing: Practical Issues, Challenges and Principles

    Directory of Open Access Journals (Sweden)

    Halil Ibrahim SARI

    2016-12-01

    The purpose of many tests in educational and psychological measurement is to estimate test takers’ latent trait scores from the responses given to a set of items. Over the years, this has been done with traditional methods (paper-and-pencil tests). However, compared to other test administration models (e.g., adaptive testing), traditional methods are extensively criticized for producing low measurement accuracy and long test lengths. Adaptive testing has been proposed to overcome these problems. There are two popular adaptive testing approaches: computerized adaptive testing (CAT) and computer adaptive multistage testing (ca-MST). The former is a well-known approach that has been predominantly used in this field. We believe that researchers and practitioners are fairly familiar with many aspects of CAT because it has more than a hundred years of history. However, the same is not true for ca-MST: since it is relatively new, many researchers are not familiar with its features. The purpose of this study is to closely examine the characteristics of ca-MST, including its working principle, the adaptation procedure called the routing method, test assembly, and scoring, and to provide an overview for researchers, with the aim of drawing their attention to ca-MST and encouraging them to contribute to research in this area. Books, software and future work for ca-MST are also discussed.
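
    A minimal Python sketch of the routing idea (our illustration; the cut scores, module difficulties and 1PL response model are assumptions, not values from the article): every examinee takes the same routing module, and the raw score decides which second-stage module follows.

    import numpy as np

    rng = np.random.default_rng(1)

    def administer(theta, difficulties):
        """Simulate responses under a 1PL (Rasch) model."""
        p = 1.0 / (1.0 + np.exp(-(theta - difficulties)))
        return int((rng.random(difficulties.size) < p).sum())

    routing_items = np.linspace(-1.0, 1.0, 10)   # 10 medium-difficulty items

    def route(raw_score, n_items=10, low_cut=0.4, high_cut=0.7):
        frac = raw_score / n_items
        if frac < low_cut:
            return "easy"
        return "hard" if frac >= high_cut else "medium"

    theta = 0.8                                  # assumed examinee ability
    score = administer(theta, routing_items)
    print(f"routing score {score}/10 -> {route(score)} second-stage module")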

  14. Coupled Atmosphere-Wave-Ocean Modeling of Tropical Cyclones: Progress, Challenges, and Ways Forward

    Science.gov (United States)

    Chen, Shuyi

    2015-04-01

    It is found that the air-sea fluxes are quite asymmetric around a storm, with complex features representing various air-sea interaction processes in TCs. A unique observation in Typhoon Fanapi is the development of a stable boundary layer in the near-storm cold-wake region, which has a direct impact on TC inner-core structure and intensity. Despite this progress, challenges remain. Air-sea momentum exchange at wind speeds greater than 30-40 m/s is largely unresolved. Directional wind-wave stress and wave-current stress are difficult to determine from observations. Effects of sea spray on the air-sea fluxes are still not well understood. This talk will provide an overview of progress made in recent years, the challenges we are facing, and ways forward. An integrated coupled observational and atmosphere-wave-ocean modeling system is urgently needed, in which coupled model development and targeted observations from field campaigns and lab measurements together form the core of the research and prediction system. Another important aspect is that fully coupled models provide explicit, integrated impact forecasts of wind, rain, waves, ocean currents and surges in TCs and winter storms, which are missing in most current NWP models. This requires a new strategy for model development, evaluation, and verification. Ensemble forecasts using high-resolution coupled atmosphere-wave-ocean models can provide probabilistic forecasts and quantitative uncertainty estimates, which also allow us to explore new methodologies to verify probabilistic impact forecasts and evaluate model physics using a stochastic approach. Examples of such an approach in TCs, including Superstorm Sandy, will be presented.
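
    For reference, the air-sea momentum flux under discussion is conventionally parameterized with a bulk formula (standard notation, not taken from the abstract),

    \boldsymbol{\tau} = \rho_a\, C_d\, |\mathbf{U}_{10}|\, \mathbf{U}_{10},

    where \rho_a is the air density and \mathbf{U}_{10} the 10 m wind; the drag coefficient C_d at wind speeds above roughly 30-40 m/s is precisely the quantity described above as unresolved, with observations suggesting it levels off or even decreases there.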

  15. 3rd International Symposium on Big Data and Cloud Computing Challenges

    CERN Document Server

    Neelanarayanan, V

    2016-01-01

    This proceedings volume contains selected papers that were presented at the 3rd International Symposium on Big Data and Cloud Computing Challenges, held at VIT University, India, on March 10 and 11, 2016. New research issues, challenges and opportunities shaping the future agenda in the field of Big Data and Cloud Computing are identified and presented throughout the book, which is intended for researchers, scholars, students, software developers and practitioners working at the forefront of their fields. The book acts as a platform for exchanging ideas, setting questions for discussion, and sharing experience in the Big Data and Cloud Computing domain.

  16. Radioactive waste management in Canada: progress and challenges 15 years after the policy framework

    International Nuclear Information System (INIS)

    McCauley, D.

    2011-01-01

    from that development - the establishment of the Nuclear Waste Management Organization, the study of options for the long-term management of nuclear fuel waste, the Government's decision on the options, the agreement on a funding formula for nuclear fuel waste management, and the launch of the NWMO's siting process. In this same period, we have also witnessed progress on a long-term waste management facility for low- and intermediate-level radioactive waste in Ontario, including an agreement with the hosting community. In addition, there has been further advancement in the management of uranium tailings, notably the launch of cleanup efforts at the Gunnar mine in northern Saskatchewan. Finally, the federal government has established robust programs for the management of historic and legacy wastes across the country. In terms of historic wastes, the Port Hope Area Initiative has advanced to the point where critical decisions will be made in 2011 on the launch of the implementation phase of that project, and the Low-Level Radioactive Waste Management Office continues to manage historic wastes at other sites across the country. As for legacy wastes, decisions are expected before the end of 2010 on the continuation of the Nuclear Legacy Liabilities Program, which addresses decommissioning and radioactive waste liabilities at AECL sites in Manitoba, Ontario, Quebec, and Nova Scotia. The coming years will see the further advancement of these initiatives, all of which will face their own challenges. Nevertheless, there is generally a defined strategy or path, and the appropriate elements are in place to achieve success. Despite these initiatives, there remain gaps in Canada's approach to radioactive waste management. In particular, while there has been progress on the management of low- and intermediate-level radioactive waste in Ontario to address wastes from Ontario Power Generation's facilities, there is, as yet, no long-term management approach defined for

  17. The Awareness and Challenges of Cloud Computing Adoption on Tertiary Education in Malaysia

    Science.gov (United States)

    Hazreeni Hamzah, Nor; Mahmud, Maziah; Zukri, Shamsunarnie Mohamed; Yaacob, Wan Fairos Wan; Yacob, Jusoh

    2017-09-01

    This preliminary study aims to investigate awareness of the adoption of cloud computing among academicians in tertiary education in Malaysia. The study also explores the possible challenges faced by academicians in adopting this new technology. The pilot study was conducted with 40 lecturers at Universiti Teknologi MARA Kampus Kota Bharu (UiTMKB) using a self-administered questionnaire. The results found that almost half (40 percent) were not aware of the existence of cloud computing in the teaching and learning (T&L) process. The challenges confronting the adoption of cloud computing are data insecurity, unsolicited advertisement, lock-in, reluctance to eliminate staff positions, privacy concerns, reliability, regulatory compliance concerns/user control, and institutional culture/resistance to technological change. These possible challenges can be grouped into two major factors: a security and dependency factor and a user control and mentality factor (see the sketch below).
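
    For readers unfamiliar with how such factors are extracted, the Python sketch below runs an exploratory factor analysis on synthetic Likert-style responses; the item labels, sample size and loadings are illustrative assumptions, not the study's data.

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(2)
    n = 40                                  # pilot-sized sample, assumed
    security = rng.normal(size=n)           # latent: security and dependency
    control = rng.normal(size=n)            # latent: user control and mentality

    items = np.column_stack([
        security + rng.normal(0, 0.5, n),   # data insecurity
        security + rng.normal(0, 0.5, n),   # privacy concerns
        security + rng.normal(0, 0.5, n),   # lock-in
        control + rng.normal(0, 0.5, n),    # computer literacy
        control + rng.normal(0, 0.5, n),    # resistance to change
    ])

    fa = FactorAnalysis(n_components=2, random_state=0).fit(items)
    print(np.round(fa.components_, 2))      # items should load on two factors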

  18. Building a Grad Nation: Progress and Challenge in Ending the High School Dropout Epidemic. Annual Update, 2010-2011

    Science.gov (United States)

    Balfanz, Robert; Bridgeland, John M.; Fox, Joanna Hornig; Moore, Laura A.

    2011-01-01

    America continues to make progress in meeting its high school dropout challenge. Leaders in education, government, nonprofits and business have awakened to the individual, social and economic costs of the dropout crisis and are working together to solve it. This year, all states, districts, and schools are required by law to calculate high school…

  19. Delaware: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  20. Oklahoma: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  1. Arkansas: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  2. Mississippi: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  3. North Carolina: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  4. Texas: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  5. West Virginia: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  6. Georgia: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  7. Maryland: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  8. Alabama: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP, high school graduation rates, and access…

  9. South Carolina: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  10. Virginia: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP, high school graduation rates, and access…

  11. Louisiana: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  12. Tennessee: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  13. Florida: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP, high school graduation rates, and access…

  14. Kentucky: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  15. Comparison of progressive addition lenses for general purpose and for computer vision: an office field study.

    Science.gov (United States)

    Jaschinski, Wolfgang; König, Mirjam; Mekontso, Tiofil M; Ohlendorf, Arne; Welscher, Monique

    2015-05-01

    Two types of progressive addition lenses (PALs) were compared in an office field study: 1. General purpose PALs with continuous clear vision between infinity and near reading distances and 2. Computer vision PALs with a wider zone of clear vision at the monitor and in near vision but no clear distance vision. Twenty-three presbyopic participants wore each type of lens for two weeks in a double-masked four-week quasi-experimental procedure that included an adaptation phase (Weeks 1 and 2) and a test phase (Weeks 3 and 4). Questionnaires on visual and musculoskeletal conditions as well as preferences regarding the type of lenses were administered. After eight more weeks of free use of the spectacles, the preferences were assessed again. The ergonomic conditions were analysed from photographs. Head inclination when looking at the monitor was significantly lower by 2.3 degrees with the computer vision PALs than with the general purpose PALs. Vision at the monitor was judged significantly better with computer PALs, while distance vision was judged better with general purpose PALs; however, the reported advantage of computer vision PALs differed in extent between participants. Accordingly, 61 per cent of the participants preferred the computer vision PALs, when asked without information about lens design. After full information about lens characteristics and additional eight weeks of free spectacle use, 44 per cent preferred the computer vision PALs. On average, computer vision PALs were rated significantly better with respect to vision at the monitor during the experimental part of the study. In the final forced-choice ratings, approximately half of the participants preferred either the computer vision PAL or the general purpose PAL. Individual factors seem to play a role in this preference and in the rated advantage of computer vision PALs. © 2015 The Authors. Clinical and Experimental Optometry © 2015 Optometry Australia.

  16. BigData and computing challenges in high energy and nuclear physics

    International Nuclear Information System (INIS)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-01-01

    In this contribution we discuss the various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve in the future when moving from the LHC to the HL-LHC in ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new super-computing facilities, cloud computing and volunteer computing in the future is a big challenge, which we are successfully mastering with a considerable contribution from many super-computing centres around the world and from academic and commercial cloud providers. We also discuss R and D computing projects started recently in the National Research Center "Kurchatov Institute"

  17. BigData and computing challenges in high energy and nuclear physics

    Science.gov (United States)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-06-01

    In this contribution we discuss the various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve in the future when moving from the LHC to the HL-LHC in ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new super-computing facilities, cloud computing and volunteer computing in the future is a big challenge, which we are successfully mastering with a considerable contribution from many super-computing centres around the world and from academic and commercial cloud providers. We also discuss R&D computing projects started recently in the National Research Center "Kurchatov Institute"

  18. Characteristics detected on computed tomography angiography predict coronary artery plaque progression in non-culprit lesions

    Energy Technology Data Exchange (ETDEWEB)

    Tan, Ya Hang; Zhou, Jia Zhou; Zhou, Ying; Yang, Xiaobo; Yang, Jun Jie; Chen, Yun Dai [Dept. of Cardiology, Chinese PLA General Hospital, Beijing (China)]

    2017-06-15

    This study sought to determine whether variables detected on coronary computed tomography angiography (CCTA) would predict plaque progression in non-culprit lesions (NCL). In this single-center trial, we analyzed 103 consecutive patients who were undergoing CCTA and percutaneous coronary intervention (PCI) for culprit lesions. Follow-up CCTA was scheduled 12 months after the PCI, and all patients were followed for 3 years after their second CCTA examination. High-risk plaque features and epicardial adipose tissue (EAT) volume were assessed by CCTA. Each NCL stenosis grade was compared visually between two CCTA scans to detect plaque progression, and patients were stratified into two groups based on this. Logistic regression analysis was used to evaluate the factors that were independently associated with plaque progression in NCLs. Time-to-event curves were compared using the log-rank statistic. Overall, 34 of 103 patients exhibited NCL plaque progression (33%). Logistic regression analyses showed that the NCL progression was associated with a history of ST-elevated myocardial infarction (odds ratio [OR] = 5.855, 95% confidence interval [CI] = 1.391–24.635, p = 0.016), follow-up low-density lipoprotein cholesterol level (OR = 6.832, 95% CI = 2.103–22.200, p = 0.001), baseline low-attenuation plaque (OR = 7.311, 95% CI = 1.242–43.028, p = 0.028) and EAT (OR = 1.015, 95% CI = 1.000–1.029, p = 0.044). Following the second CCTA examination, major adverse cardiac events (MACEs) were observed in 12 patients, and NCL plaque progression was significantly associated with future MACEs (log rank p = 0.006). Noninvasive assessment of NCLs by CCTA has potential prognostic value.
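
    To illustrate the kind of model reported above, here is a Python sketch of a logistic regression relating CCTA-derived predictors to NCL progression and converting coefficients to odds ratios. The data are synthetic and the coefficients are arbitrary; only the variable list mirrors the abstract.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n = 103                               # cohort size from the abstract
    X = np.column_stack([
        rng.integers(0, 2, n),            # history of STEMI (0/1)
        rng.normal(2.5, 0.8, n),          # follow-up LDL-C, assumed scale
        rng.integers(0, 2, n),            # baseline low-attenuation plaque
        rng.normal(120.0, 40.0, n),       # EAT volume (cm^3), assumed scale
    ])
    # Synthetic outcome loosely tied to the predictors, for illustration only.
    logit = -5.0 + 1.2 * X[:, 0] + 0.8 * X[:, 1] + 1.4 * X[:, 2] + 0.01 * X[:, 3]
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    model = LogisticRegression().fit(X, y)
    print(np.round(np.exp(model.coef_[0]), 2))   # exp(beta) = odds ratios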

  19. Progress and challenges in the development of a cell-based therapy for hemophilia A.

    Science.gov (United States)

    Fomin, M E; Togarrati, P P; Muench, M O

    2014-12-01

    Hemophilia A results from an insufficiency of factor VIII (FVIII). Although replacement therapy with plasma-derived or recombinant FVIII is a life-saving therapy for hemophilia A patients, such therapy is a life-long treatment rather than a cure for the disease. In this review, we discuss the possibilities, progress, and challenges that remain in the development of a cell-based cure for hemophilia A. The success of cell therapy depends on the type and availability of donor cells, the age of the host and method of transplantation, and the levels of engraftment and production of FVIII by the graft. Early therapy, possibly even prenatal transplantation, may yield the highest levels of engraftment by avoiding immunological rejection of the graft. Potential cell sources of FVIII include a specialized subset of endothelial cells known as liver sinusoidal endothelial cells (LSECs) present in the adult and fetal liver, or patient-specific endothelial cells derived from induced pluripotent stem cells that have undergone gene editing to produce FVIII. Achieving sufficient engraftment of transplanted LSECs is one of the obstacles to successful cell therapy for hemophilia A. We discuss recent results from transplants performed in animals that show production of functional and clinically relevant levels of FVIII obtained from donor LSECs. Hence, the possibility of treating hemophilia A can be envisioned through persistent production of FVIII from transplanted donor cells derived from a number of potential cell sources or through creation of donor endothelial cells from patient-specific induced pluripotent stem cells. © 2014 International Society on Thrombosis and Haemostasis.

  20. Nationwide Natural Resource Inventory of the Philippines Using Lidar: Strategies, Progress, and Challenges

    Science.gov (United States)

    Blanco, A. C.; Tamondong, A.; Perez, A. M.; Ang, M. R. C.; Paringit, E.; Alberto, R.; Alibuyog, N.; Aquino, D.; Ballado, A.; Garcia, P.; Japitana, M.; Ignacio, M. T.; Macandog, D.; Novero, A.; Otadoy, R. E.; Regis, E.; Rodriguez, M.; Silapan, J.; Villar, R.

    2016-06-01

    The Philippines has embarked on a detailed nationwide natural resource inventory using LiDAR through the Phil-LiDAR 2 Program. This 3-year program has developed, and has been implementing, mapping methodologies and protocols to produce high-resolution maps of agricultural, forest, and coastal marine resources, hydrological features, and renewable energy resources. The Program has adopted strategies for system and process development, capacity building and enhancement, and expanding its network of collaborations. These strategies include training programs (on point cloud and image processing, GIS, and field surveys), workshops, forums, and colloquiums (program-wide, cluster-based, and project-based), and collaboration with partner national government agencies and other organizations. In place is a cycle of training, implementation, and feedback in order to continually improve the system and processes. To date, the Program has achieved progress in the development of workflows and in rolling out products such as resource maps and GIS data layers, which are indispensable in planning and decision-making. Challenges remain in speeding up output production (including quality checks) and in ensuring sustainability given the short duration of the program. Enhancements to the workflows and protocols have been incorporated to address data quality and data availability issues. More trainings have been conducted for project staff hired to address human resource gaps. Collaborative arrangements with more partners are being established. To attain sustainability, the Program is developing and instituting a system of training, data updating and sharing, information utilization, and feedback. This requires the collaboration and cooperation of government agencies, LGUs, universities, other organizations, and the communities.

  1. Cloud ice: A climate model challenge with signs and expectations of progress

    Science.gov (United States)

    Waliser, Duane E.; Li, Jui-Lin F.; Woods, Christopher P.; Austin, Richard T.; Bacmeister, Julio; Chern, Jiundar; Del Genio, Anthony; Jiang, Jonathan H.; Kuang, Zhiming; Meng, Huan; Minnis, Patrick; Platnick, Steve; Rossow, William B.; Stephens, Graeme L.; Sun-Mack, Szedung; Tao, Wei-Kuo; Tompkins, Adrian M.; Vane, Deborah G.; Walker, Christopher; Wu, Dong

    2009-04-01

    Present-day shortcomings in the representation of upper-tropospheric ice clouds in general circulation models (GCMs) lead to errors in weather and climate forecasts and are a source of uncertainty in climate change projections. An ongoing challenge in rectifying these shortcomings has been the availability of adequate, high-quality, global observations targeting ice clouds and related precipitating hydrometeors. In addition, the inadequacy of the modeled physics and the often disjointed nature between model representation and the characteristics of the retrieved/observed values have hampered GCM development and validation efforts from making effective use of the measurements that have been available. Thus, even though parameterizations in GCMs accounting for cloud ice processes have, in some cases, become more sophisticated in recent years, this development has largely occurred independently of the global-scale measurements. With the relatively recent addition of satellite-derived products from Aura/Microwave Limb Sounder (MLS) and CloudSat, there are now considerably more resources with new and unique capabilities to evaluate GCMs. In this article, we illustrate the shortcomings evident in model representations of cloud ice through a comparison of the simulations assessed in the Intergovernmental Panel on Climate Change Fourth Assessment Report, briefly discuss the range of global observational resources that are available, and describe the essential components of the model parameterizations that characterize their "cloud" ice and related fields. Using this information as background, we (1) discuss some of the main considerations and cautions that must be taken into account in making model-data comparisons related to cloud ice, (2) illustrate present progress and uncertainties in applying satellite cloud ice data (namely from MLS and CloudSat) to model diagnosis, (3) show some indications of model improvements, and finally (4) discuss a number of

  2. Canada's climate change voluntary challenge and registry program : 6. annual progress report

    International Nuclear Information System (INIS)

    2000-10-01

    A Canadian integrated energy company, Suncor Energy Inc. comprises a corporate group, three operating business units, and two emerging businesses. This annual progress report for Canada's Climate Change Voluntary Challenge and Registry (VCR) Program is the company's sixth. Suncor is committed to sustainable development. Initiatives undertaken in 1999 by Suncor included Oil Sands Project Millennium, which will more than double production of crude oil and fuel products by 2002. Suncor is divesting conventional oil properties in order to concentrate on exploration and production of natural gas. Alternative and renewable energy will see an investment of $100 million over the next five years, allocated to research and development, the production of fuels from biomass, and the conversion of municipal solid waste to energy through the recovery of methane from landfills. Carbon dioxide emissions have been reduced to 14 per cent below 1990 levels, a reduction of 622,000 tonnes of greenhouse gases. A comprehensive tracking, reporting, and management system for greenhouse gases was implemented. Ongoing improvements in quality and comprehensiveness have validated the methodology used to monitor emissions inventories and sources. Initiatives in internal and external greenhouse gas education and awareness were implemented, such as speaking engagements at climate change activities, the retrofit of schools with advanced energy-efficient technology, education programs, and employee suggestion programs. Collaboration with external partners on research and development projects represents a major building block in this approach. Some of the research and development projects involve the development of advanced carbon dioxide capture and geologic sequestration technologies, work on the production of alternative and renewable energy from Canadian municipal landfills, and the study of a new process to extract heavy

  3. Progress and challenges in the development of a cell-based therapy for hemophilia A

    Science.gov (United States)

    Fomin, Marina E.; Togarrati, Padma Priya; Muench, Marcus O.

    2015-01-01

    Hemophilia A results from an insufficiency of factor VIII (FVIII). Although replacement therapy with plasma-derived or recombinant FVIII is a life-saving therapy for hemophilia A patients, such therapy is a life-long treatment rather than a cure for the disease. In this review we discuss the possibilities, progress and challenges that remain in the development of a cell-based cure for hemophilia A. The success of cell therapy depends on the type and availability of donor cells, the age of the host and method of transplantation, and the levels of engraftment and production of FVIII by the graft. Early therapy, possibly even prenatal transplantation, may yield the highest levels of engraftment by avoiding immunological rejection of the graft. Potential cell sources of FVIII include a specialized subset of endothelial cells known as liver sinusoidal endothelial cells (LSECs) present in the adult and fetal liver, or patient-specific endothelial cells derived from induced pluripotent stem cells (iPSCs) that have undergone gene editing to produce FVIII. Achieving sufficient engraftment of transplanted LSECs is one of the obstacles to successful cell therapy for hemophilia A. We discuss recent results from transplants performed in animals that show production of functional and clinically relevant levels of FVIII obtained from donor LSECs. Hence, the possibility of treating hemophilia A can be envisioned through persistent production of FVIII from transplanted donor cells derived from a number of potential cell sources or through creation of donor endothelial cells from patient-specific iPSCs. PMID:25297648

  4. Catalysis Research of Relevance to Carbon Management: Progress, Challenges, and Opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Arakawa, Hironori; Aresta, Michele; Armor, John; Barteau, Mark; Beckman, Eric J.; Bell, Alexis T.; Bercaw, John E.; Creutz, Carol; Dinjus, Eckhard; Dixon, David A.; Domen, Kazunari; Dubois, Daniel L.; Eckert, Juergen; Fujita, Etsuko; Gibson, Dorothy H.; Goddard, William A.; Goodman, Wayne D.; Keller, Jay; Kubas, Gregory J.; Kung, Harold H.; Lyons, James E.; Manzer, Leo; Marks, Tobin J.; Morokuma, Keiji; Nicholas, Kenneth M.; Periana, Roy; Que, Lawrence; Rostrup-Nielsen, Jens; Sachtler, Wolfgang M. H.; Schmidt, Lanny D.; Sen, Ayusman; Somorjai, Gabor A.; Stair, Peter C.; Stults, Bailey R.; Tumas, William

    2001-04-11

    The goal of the 'Opportunities for Catalysis Research in Carbon Management' workshop was to review within the context of greenhouse gas/carbon issues the current state of knowledge, barriers to further scientific and technological progress, and basic scientific research needs in the areas of H₂ generation and utilization, light hydrocarbon activation and utilization, carbon dioxide activation, utilization, and sequestration, emerging techniques and research directions in relevant catalysis research, and in catalysis for more efficient transportation engines. Several overarching themes emerge from this review. First and foremost, there is a pressing need to better understand in detail the catalytic mechanisms involved in almost every process area mentioned above. This includes the structures, energetics, lifetimes, and reactivities of the species thought to be important in the key catalytic cycles. As much of this type of information as is possible to acquire would also greatly aid in better understanding perplexing, incomplete/inefficient catalytic cycles and in inventing new, efficient ones. The most productive way to attack such problems must include long-term, in-depth fundamental studies of both commercial and model processes, by conventional research techniques and, importantly, by applying various promising new physicochemical and computational approaches which would allow incisive, in situ elucidation of reaction pathways. There is also a consensus that more exploratory experiments, especially high-risk, unconventional catalytic and model studies, should be undertaken. Such an effort will likely require specialized equipment, instrumentation, and computational facilities. The most expeditious and cost-effective means to carry out this research would be by close coupling of academic, industrial, and national laboratory catalysis efforts worldwide. Completely new research approaches should be vigorously explored, ranging from novel compositions

  5. Progression Analysis and Stage Discovery in Continuous Physiological Processes Using Image Computing

    Directory of Open Access Journals (Sweden)

    Ferrucci Luigi

    2010-01-01

    We propose an image computing-based method for quantitative analysis of continuous physiological processes that can be sensed by medical imaging and demonstrate its application to the analysis of morphological alterations of the bone structure, which correlate with the progression of osteoarthritis (OA). The purpose of the analysis is to quantitatively estimate OA progression in a fashion that can assist in understanding the pathophysiology of the disease. Ultimately, the texture analysis will be able to provide an alternative OA scoring method, which can potentially reflect the progression of the disease in a more direct fashion compared to the existing clinically utilized classification schemes based on radiology. This method can be useful not just for studying the nature of OA, but also for developing and testing the effect of drugs and treatments. While in this paper we demonstrate the application of the method to osteoarthritis, its generality makes it suitable for the analysis of other progressive clinical conditions that can be diagnosed and prognosed by using medical imaging.
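
    A minimal Python sketch of the texture-analysis step (our illustration, with a synthetic patch standing in for a bone-region image): grey-level co-occurrence matrix (GLCM) features are computed per visit and tracked as a candidate progression marker.

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(4)
    patch = (rng.random((64, 64)) * 32).astype(np.uint8)  # stand-in image

    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=32, symmetric=True, normed=True)
    features = {prop: float(graycoprops(glcm, prop).mean())
                for prop in ("contrast", "homogeneity", "correlation", "energy")}
    print(features)   # one feature vector per visit for progression analysis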

  6. Computed tomography of skeletal muscles in childhood spinal progressive muscular atrophies

    International Nuclear Information System (INIS)

    Arai, Yumi; Osawa, Makiko; Sumida, Sawako; Shishikura, Keiko; Suzuki, Haruko; Fukuyama, Yukio; Kohno, Atsushi

    1992-01-01

    Computed tomographic (CT) scanning of skeletal muscles was performed in patients with type 1 and type 2 spinal progressive muscular atrophy (SPMA) and Kugelberg-Welander disease (K-W) to delineate the characteristic CT features of each category. Marked muscular atrophy was observed in type 1 SPMA, and both muscular atrophy and intramuscular low density areas in type 2 SPMA, changes being more pronounced in older patients. In contrast, in K-W, muscular atrophy was slight, and intramuscular low density areas constituted the most prominent findings. These observations indicate that SPMA and K-W are each characterized by distinct CT findings. (author)

  7. High Performance Numerical Computing for High Energy Physics: A New Challenge for Big Data Science

    International Nuclear Information System (INIS)

    Pop, Florin

    2014-01-01

    Modern physics is based on both theoretical analysis and experimental validation. Complex scenarios like subatomic dimensions, high energy, and low absolute temperature are frontiers for many theoretical models. Simulation with stable numerical methods represents an excellent instrument for high-accuracy analysis, experimental validation, and visualization. High performance computing makes large-scale, parallel simulations possible, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high energy physics (HEP), analyzed from two perspectives: numerical methods and high performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory used in the analysis of particle spectra. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
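
    As a concrete illustration of the simplest member of the Monte Carlo family listed above, the following minimal sketch (not taken from the paper; the integrand and sample counts are arbitrary illustrative choices) estimates a one-dimensional integral and exposes the 1/sqrt(n) statistical error that drives HEP simulations toward very large sample counts and high performance computing:

      # Minimal Monte Carlo integration sketch (illustrative only, not from the cited paper).
      # Estimates I = integral of f over [0, 1] by averaging f at uniform random samples;
      # the statistical error shrinks as 1/sqrt(n), which is why HEP simulations need
      # very large sample counts and hence high performance computing.
      import numpy as np

      def mc_integrate(f, n_samples, seed=0):
          rng = np.random.default_rng(seed)
          x = rng.uniform(0.0, 1.0, n_samples)             # uniform samples on [0, 1]
          fx = f(x)
          estimate = fx.mean()                             # sample mean approximates the integral
          std_error = fx.std(ddof=1) / np.sqrt(n_samples)  # 1/sqrt(n) statistical error
          return estimate, std_error

      if __name__ == "__main__":
          # Example integrand with a known answer: integral of x^2 over [0, 1] is 1/3.
          for n in (10**3, 10**5, 10**7):
              est, err = mc_integrate(lambda x: x**2, n)
              print(f"n={n:>9}: {est:.6f} +/- {err:.6f}")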

  8. Relationship of computed tomography perfusion and positron emission tomography to tumour progression in malignant glioma

    Energy Technology Data Exchange (ETDEWEB)

    Yeung, Timothy P C [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Robarts Research Institute, The University of Western Ontario, Ontario, Canada, N6A 5B7 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Yartsev, Slav [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Lee, Ting-Yim [Robarts Research Institute, The University of Western Ontario, Ontario, Canada, N6A 5B7 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Department of Medical Imaging, The University of Western Ontario, London Health Sciences Centre, Victoria Hospital, Ontario, Canada, N6A 5W9 (Canada); Lawson Health Research Institute, St. Joseph's Health Care London, Ontario, Canada, N6A 4V2 (Canada); Wong, Eugene [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Department of Physics and Astronomy, The University of Western Ontario, Ontario, Canada, N6A 3K7 (Canada); He, Wenqing [Department of Statistical and Actuarial Sciences, The University of Western Ontario, Ontario, Canada, N6A 5B7 (Canada); Fisher, Barbara; VanderSpek, Lauren L [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Macdonald, David [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Department of Clinical Neurological Sciences, The University of Western Ontario, London Health Sciences Centre, University Hospital, Ontario, Canada, N6A 5A5 (Canada); Bauman, Glenn, E-mail: glenn.bauman@lhsc.on.ca [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada)

    2014-02-15

    Introduction: This study aimed to explore the potential for computed tomography (CT) perfusion and 18-Fluorodeoxyglucose positron emission tomography (FDG-PET) in predicting sites of future progressive tumour on a voxel-by-voxel basis after radiotherapy and chemotherapy. Methods: Ten patients underwent pre-radiotherapy magnetic resonance (MR), FDG-PET and CT perfusion near the end of radiotherapy and repeated post-radiotherapy follow-up MR scans. The relationships between these images and tumour progression were assessed using logistic regression. Cross-validation with receiver operating characteristic (ROC) analysis was used to assess the value of these images in predicting sites of tumour progression. Results: Pre-radiotherapy MR-defined gross tumour; near-end-of-radiotherapy CT-defined enhancing lesion; CT perfusion blood flow (BF), blood volume (BV) and permeability-surface area (PS) product; FDG-PET standard uptake value (SUV); and SUV:BF showed significant associations with tumour progression on follow-up MR imaging (P < 0.0001). The mean sensitivity (±standard deviation), specificity and area under the ROC curve (AUC) of PS were 0.64 ± 0.15, 0.74 ± 0.07 and 0.72 ± 0.12 respectively. This mean AUC was higher than that of the pre-radiotherapy MR-defined gross tumour and near-end-of-radiotherapy CT-defined enhancing lesion (both AUCs = 0.6 ± 0.1, P ≤ 0.03). The multivariate model using BF, BV, PS and SUV had a mean AUC of 0.8 ± 0.1, but this was not significantly higher than the PS only model. Conclusion: PS is the single best predictor of tumour progression when compared to other parameters, but voxel-based prediction based on logistic regression had modest sensitivity and specificity.
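
    The analysis pattern described here, voxel-wise logistic regression of progression on imaging parameters with cross-validated ROC scoring, can be sketched in a few lines. The sketch below is illustrative only: it is not the authors' code, and the data are synthetic stand-ins for the BF, BV, PS, and SUV maps.

      # Illustrative sketch (not the authors' analysis code) of voxel-wise logistic
      # regression of tumour progression on imaging parameters, evaluated with a
      # cross-validated ROC AUC. All data below are synthetic.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(42)
      n_voxels = 5000
      # Hypothetical per-voxel predictors: blood flow (BF), blood volume (BV),
      # permeability-surface area product (PS), and FDG-PET SUV.
      X = rng.normal(size=(n_voxels, 4))
      # Synthetic ground truth: progression loosely driven by PS (column 2).
      logits = 1.5 * X[:, 2] + 0.3 * X[:, 0] - 2.0
      y = rng.uniform(size=n_voxels) < 1 / (1 + np.exp(-logits))

      model = LogisticRegression()
      # Out-of-fold predicted probabilities, so the reported AUC is cross-validated.
      p = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
      print(f"cross-validated AUC: {roc_auc_score(y, p):.3f}")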

  9. Relationship of computed tomography perfusion and positron emission tomography to tumour progression in malignant glioma

    International Nuclear Information System (INIS)

    Yeung, Timothy P C; Yartsev, Slav; Lee, Ting-Yim; Wong, Eugene; He, Wenqing; Fisher, Barbara; VanderSpek, Lauren L; Macdonald, David; Bauman, Glenn

    2014-01-01

    Introduction: This study aimed to explore the potential for computed tomography (CT) perfusion and 18-Fluorodeoxyglucose positron emission tomography (FDG-PET) in predicting sites of future progressive tumour on a voxel-by-voxel basis after radiotherapy and chemotherapy. Methods: Ten patients underwent pre-radiotherapy magnetic resonance (MR), FDG-PET and CT perfusion near the end of radiotherapy and repeated post-radiotherapy follow-up MR scans. The relationships between these images and tumour progression were assessed using logistic regression. Cross-validation with receiver operating characteristic (ROC) analysis was used to assess the value of these images in predicting sites of tumour progression. Results: Pre-radiotherapy MR-defined gross tumour; near-end-of-radiotherapy CT-defined enhancing lesion; CT perfusion blood flow (BF), blood volume (BV) and permeability-surface area (PS) product; FDG-PET standard uptake value (SUV); and SUV:BF showed significant associations with tumour progression on follow-up MR imaging (P < 0.0001). The mean sensitivity (±standard deviation), specificity and area under the ROC curve (AUC) of PS were 0.64 ± 0.15, 0.74 ± 0.07 and 0.72 ± 0.12 respectively. This mean AUC was higher than that of the pre-radiotherapy MR-defined gross tumour and near-end-of-radiotherapy CT-defined enhancing lesion (both AUCs = 0.6 ± 0.1, P ≤ 0.03). The multivariate model using BF, BV, PS and SUV had a mean AUC of 0.8 ± 0.1, but this was not significantly higher than the PS only model. Conclusion: PS is the single best predictor of tumour progression when compared to other parameters, but voxel-based prediction based on logistic regression had modest sensitivity and specificity

  10. Aquatic Toxic Analysis by Monitoring Fish Behavior Using Computer Vision: A Recent Progress

    Directory of Open Access Journals (Sweden)

    Chunlei Xia

    2018-01-01

    Video-tracking-based biological early warning systems have made great progress with advanced computer vision and machine learning methods, and the ability to track multiple biological organisms on video has improved substantially in recent years. Video-based behavioral monitoring has become a common tool for acquiring quantified behavioral data for aquatic risk assessment, and the investigation of behavioral responses under chemical and environmental stress has been boosted by rapidly developing machine learning and artificial intelligence. In this paper, we introduce the fundamentals of video tracking and present the pioneering works in precise tracking of groups of individuals in 2D and 3D space. Technical and practical issues encountered in video tracking are explained. Subsequently, toxicity analysis based on fish behavioral data is summarized. Frequently used computational and machine learning methods are explained together with their applications in aquatic toxicity detection and abnormal pattern analysis. Finally, the advantages of recently developed deep learning approaches for toxicity prediction are presented.
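
    The core of such video tracking, background subtraction followed by a centroid estimate of the detected organism, can be illustrated with a minimal sketch (the frames here are synthetic arrays; real systems add filtering, multi-target data association, and 2D/3D calibration):

      # Minimal illustrative sketch of the video-tracking core: background
      # subtraction followed by a centroid estimate of the foreground (the fish).
      import numpy as np

      def track_centroid(frame, background, threshold=30):
          # Foreground mask: pixels that differ strongly from the static background.
          mask = np.abs(frame.astype(int) - background.astype(int)) > threshold
          ys, xs = np.nonzero(mask)
          if xs.size == 0:
              return None                      # no target detected in this frame
          return float(xs.mean()), float(ys.mean())

      # Synthetic 64x64 grayscale scene with a bright 'fish' blob centred at (40, 20).
      background = np.zeros((64, 64), dtype=np.uint8)
      frame = background.copy()
      frame[18:23, 38:43] = 200
      print(track_centroid(frame, background))   # approximately (40.0, 20.0)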

  11. Looking from Within: Prospects and Challenges for Progressive Education in Indonesia

    Science.gov (United States)

    Zulfikar, Teuku

    2013-01-01

    Many Indonesian scholars (Azra, 2002; Darmaningtyas, 2004; Yunus, 2004), have attempted to bring progressive education to their country. They believe that progressive practices such as critical thinking, critical dialogue and child-centered instruction will help students learn better. However, this implementation is resisted because of cultural…

  12. Progress and challenges in maternal health in western China: a Countdown to 2015 national case study

    Directory of Open Access Journals (Sweden)

    Yanqiu Gao, PhD

    2017-05-01

    Summary: Background: China is one of the few Countdown countries to have achieved Millennium Development Goal 5 (a 75% reduction in the maternal mortality ratio between 1990 and 2015). We aimed to examine the health systems and contextual factors that might have contributed to the substantial decline in maternal mortality between 1997 and 2014. We chose to focus on western China because poverty, ethnic diversity, and geographical access represent particular challenges to ensuring universal access to maternal care in the region. Methods: In this systematic assessment, we used data from national census reports, National Statistical Yearbooks, the National Maternal and Child Health Routine Reporting System, the China National Health Accounts report, and National Health Statistical Yearbooks to describe changes in policies, health financing, health workforce, health infrastructure, coverage of maternal care, and maternal mortality by region between 1997 and 2014. We used a multivariate linear regression model to examine which contextual and health systems factors contributed to the regional variation in the maternal mortality ratio in the same period. Using data from a cross-sectional survey in 2011, we also examined equity in access to maternity care in 42 poor counties in western China. Findings: Maternal mortality declined by 8·9% per year between 1997 and 2014 (geometric mean ratio for each year 0·91, 95% CI 0·91–0·92). After adjusting for GDP per capita, length of highways, female illiteracy, the number of licensed doctors per 1000 population, and the proportion of ethnic minorities, the maternal mortality ratio was 118% higher in the western region (2·18, 1·44–3·28) and 41% higher in the central region (1·41, 0·99–2·01) than in the eastern region. In the rural western region, the proportion of births in health facilities rose from 41·9% in 1997 to 98·4% in 2014. Underpinning such progress was the Government's strong commitment to long

  13. Key challenges and recent progress in batteries, fuel cells, and hydrogen storage for clean energy systems

    Science.gov (United States)

    Chalk, Steven G.; Miller, James F.

    Reducing or eliminating the dependency of transportation systems on petroleum is a major element of US energy research activities. Batteries are a key enabling technology for the development of clean, fuel-efficient vehicles and are key to making today's hybrid electric vehicles a success. Fuel cells are the key enabling technology for a future hydrogen economy and have the potential to revolutionize the way we generate power, offering cleaner, more efficient alternatives to today's technology. Fuel cells are also significantly more energy efficient than combustion-based power generation technologies; they are projected to have twice the energy efficiency of internal combustion engines. However, before fuel cells can realize their potential, significant challenges remain. The two most important are cost and durability, for both automotive and stationary applications. Recent electrocatalyst developments have shown that Pt-alloy catalysts have higher activity and greater durability than Pt catalysts. The durability of conventional fluorocarbon membranes is improving, and hydrocarbon-based membranes have shown promise of equaling the performance of fluorocarbon membranes at lower cost. Recent announcements have also provided indications that fuel cells can start from freezing conditions without significant deterioration. Hydrogen storage systems for vehicles are still inadequate to meet customer driving-range expectations (>300 miles, or 500 km) without intrusion into vehicle cargo or passenger space. The United States Department of Energy (DOE) has established three Centers of Excellence for hydrogen storage materials development. The centers are focused on complex metal hydrides that can be regenerated onboard a vehicle, chemical hydrides that require off-board reprocessing, and carbon-based storage materials. Recent developments have shown progress toward the 2010 DOE targets. In addition, DOE has established an independent storage material testing center

  14. Safety regulation of geological disposal of radioactive waste: progress since Cordoba and remaining challenges

    International Nuclear Information System (INIS)

    Duncan, A.; Pescatore, C.

    2010-01-01

    Claudio Pescatore, Deputy Division Head (NEA), presented a paper whose purpose was to recall where we stood at the time of the Cordoba Workshop (1997) on the regulation of disposal of long-lived radioactive waste, to review developments since then, to present the key existing issues, and to reflect on the remaining challenges and possible responses. The overview study on progress in regulation for geological disposal since the Cordoba workshop [NEA/RWMC/RF(2008)6] provides a good list of references regarding the first two issues. The presentation of the existing issues draws on a synthesis of responses to a questionnaire completed by the regulatory organisations in preparation for this workshop. It warns regulators and implementers that international work to date seems to have created an expectation, in the minds of the public and of some organisations, that nothing less than a guarantee by the regulator of maintaining current levels of protection of both individuals and populations practically forever is needed, regardless of the impracticality of this. This expectation needs to be replaced with a carefully and clearly explained understanding of the choices involved in dealing with long-lived radioactive waste, against a background of our responsibilities to both current and future generations and our practical capacity to deliver on them. Concerning the current major challenges faced in regulation, the paper comes back to the issue of the 'guarantee' by the regulator and observes that there is no doubt of a willingness to do the best to comply with the principle of protection, and that we are broadly convinced that current concepts for geological disposal, supported by multiple lines of reasoning and the application of best available techniques (BAT), will meet that principle. However, we do not have the capacity to prove or guarantee this, nor do we believe that it is possible in practice. Although we are advised that it is neither

  15. Computation-aware algorithm selection approach for interlaced-to-progressive conversion

    Science.gov (United States)

    Park, Sang-Jun; Jeon, Gwanggil; Jeong, Jechang

    2010-05-01

    We discuss deinterlacing results in a computationally constrained and varied environment. The proposed computation-aware algorithm selection approach (CASA) for fast interlaced-to-progressive conversion consists of three methods: the line-averaging (LA) method for plain regions, the modified edge-based line-averaging (MELA) method for medium regions, and the proposed covariance-based adaptive deinterlacing (CAD) method for complex regions. CASA uses two criteria, mean-squared error (MSE) and CPU time, for assigning a method to each region. The principal idea of CAD is the correspondence between high- and low-resolution covariances: we estimate the local covariance coefficients from an interlaced image using Wiener filtering theory and then use these optimal minimum-MSE interpolation coefficients to obtain a deinterlaced image. The CAD method, though more robust than most known methods, is not especially fast. To alleviate this issue, we propose an adaptive selection approach that switches between the fast conventional schemes (LA and MELA) and CAD, reducing the overall computational load; a reliable switching condition was derived from a wide set of initial training processes. The results of computer simulations showed that the proposed methods outperform a number of methods presented in the literature.
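
    The switching idea can be sketched as follows. This is an illustration, not the paper's implementation: the thresholds and the local-variance complexity measure stand in for the trained switching condition, and the expensive CAD branch is stubbed with the edge-based method.

      # Sketch of computation-aware method switching for deinterlacing a missing
      # field line (illustrative only; thresholds and the complexity measure are
      # stand-ins for the paper's trained switching condition).
      import numpy as np

      def line_average(above, below):
          # LA: cheap vertical average of the lines above and below the missing line.
          return 0.5 * (above + below)

      def edge_based_line_average(above, below):
          # MELA-like: per pixel, average along the direction (left-diagonal,
          # vertical, right-diagonal) with the smallest luminance difference.
          diffs, values = [], []
          for shift in (-1, 0, 1):
              a, b = np.roll(above, shift), np.roll(below, -shift)
              diffs.append(np.abs(a - b))
              values.append(0.5 * (a + b))
          best = np.stack(diffs).argmin(axis=0)
          return np.stack(values)[best, np.arange(above.size)]

      def covariance_adaptive(above, below):
          # Stand-in for the covariance-based adaptive (CAD) method; a real
          # implementation would derive minimum-MSE Wiener interpolation weights
          # from local covariance estimates.
          return edge_based_line_average(above, below)

      def interpolate_missing_line(above, below, t_plain=10.0, t_medium=100.0):
          activity = np.var(above) + np.var(below)          # crude complexity measure
          if activity < t_plain:
              return line_average(above, below)             # plain region
          if activity < t_medium:
              return edge_based_line_average(above, below)  # medium region
          return covariance_adaptive(above, below)          # complex region

      above = np.array([10., 10., 200., 200., 10., 10.])
      below = np.array([10., 200., 200., 10., 10., 10.])
      print(interpolate_missing_line(above, below))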

  16. Surgical treatment of progressive ethmoidal hematoma aided by computed tomography in a foal

    International Nuclear Information System (INIS)

    Colbourne, C.M.; Rosenstein, D.S.; Steficek, B.A.; Yovich, J.V.; Stick, J.A.

    1997-01-01

    A progressive ethmoidal hematoma (PEH) was treated successfully in a 4-week-old Belgian filly by surgical removal, using a frontonasal bone flap. The filly had respiratory stridor, epistaxis, and facial enlargement over the left paranasal sinuses, which had progressively increased in size since birth. Computed tomographic images of the head obtained with the foal under general anesthesia were useful in determining the extent and nature of the soft-tissue mass and in planning surgical intervention. On the basis of the histologic appearance of the mass, a diagnosis of PEH was made. Twelve months after surgery, the facial appearance was normal and the abnormal appearance of the ethmoid region on endoscopic evaluation was less obvious, with return of the nasal septum to a normal position. Progressive ethmoidal hematoma is uncommon and, to our knowledge, has not been reported in a neonate. Clinical signs of PEH in this foal were atypical because of the rapid enlargement of the mass, the extent of facial deformity, and the minimal epistaxis and intraoperative hemorrhage

  17. Computational Challenge of Fractional Differential Equations and the Potential Solutions: A Survey

    Directory of Open Access Journals (Sweden)

    Chunye Gong

    2015-01-01

    We present a survey of fractional differential equations, and in particular of the computational cost of their numerical solution from the viewpoint of computer science. The computational complexities of time-fractional, space-fractional, and space-time fractional equations are O(N²M), O(NM²), and O(NM(M + N)), respectively, compared with O(MN) for classical partial differential equations with finite difference methods, where M and N are the numbers of space grid points and time steps. The potential solutions for this challenge include, but are not limited to, parallel computing, memory access optimization (fractional precomputing operators), the short memory principle, fast Fourier transform (FFT)-based solutions, the alternating direction implicit method, multigrid methods, and preconditioner technology. The relationships among these solutions for both the space fractional derivative and the time fractional derivative are discussed. The authors point out that parallel computing should be regarded as a basic method to overcome this challenge, and that attention should be paid to fractional killer applications, high performance iteration methods, high order schemes, and Monte Carlo methods. Since the computation of fractional equations with high dimension and variable order is even heavier, researchers from mathematics and computer science have the opportunity to lay cornerstones in the area of fractional calculus.
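
    The source of the extra cost in the time-fractional case, and one of the remedies listed above, can be made concrete with a small sketch (the order alpha and the test function are arbitrary illustrative choices): the Grünwald-Letnikov approximation of a fractional derivative is a convolution over the entire history, which an FFT-based convolution evaluates in O(N log N) rather than O(N²):

      # Sketch: the Grünwald-Letnikov (GL) approximation of a time-fractional
      # derivative is a causal convolution over the *entire* history -- the origin
      # of the extra O(N) factor per step -- and FFT-based convolution is one of
      # the listed remedies. Illustrative only; alpha and f are arbitrary.
      import numpy as np

      def gl_weights(alpha, n):
          # Recursive GL binomial weights: w_0 = 1, w_k = w_{k-1} * (1 - (alpha+1)/k).
          w = np.empty(n)
          w[0] = 1.0
          for k in range(1, n):
              w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
          return w

      def gl_fractional_derivative(f, h, alpha):
          # D^alpha f(t_n) ~ h^(-alpha) * sum_{k=0..n} w_k f(t_{n-k}): a causal
          # convolution of f with the weights, done here in O(N log N) via FFT.
          n = f.size
          w = gl_weights(alpha, n)
          m = 1 << (2 * n - 1).bit_length()   # FFT size covering the full linear convolution
          conv = np.fft.irfft(np.fft.rfft(w, m) * np.fft.rfft(f, m), m)[:n]
          return conv / h**alpha

      if __name__ == "__main__":
          # Check against the known result: the 0.5-order derivative of t is 2*sqrt(t/pi).
          h = 1e-3
          t = np.arange(1, 2001) * h
          approx = gl_fractional_derivative(t, h, 0.5)
          exact = 2.0 * np.sqrt(t / np.pi)
          print("max abs error:", np.abs(approx - exact)[100:].max())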

  18. Evaluating a multi-player brain-computer interface game: challenge versus co-experience

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Volpe, G; Reidsma, Dennis; Poel, Mannes; Camurri, A.; Obbink, Michel; Nijholt, Antinus

    2013-01-01

    Brain–computer interfaces (BCIs) have started to be considered as game controllers. The low level of control they provide precludes precise control but allows the design of challenging games that players can nevertheless enjoy. Evaluation of enjoyment, or user experience (UX), is

  19. EPA and GSA Webinar: E Scrap Management, Computers for Learning and the Federal Green Challenge

    Science.gov (United States)

    EPA and the General Services Administration (GSA) are hosting a webinar on May 2, 2018. Topics will include policies and procedures on E Scrap management, a review of the Computers for Learning Program, and the benefits of joining the Federal Green Challenge.

  20. Computing in the Curriculum: Challenges and Strategies from a Teacher's Perspective

    Science.gov (United States)

    Sentance, Sue; Csizmadia, Andrew

    2017-01-01

    Computing is being introduced into the curriculum in many countries. Teachers' perspectives enable us to discover what challenges this presents, and also the strategies teachers claim to be using successfully in teaching the subject across primary and secondary education. The study described in this paper was carried out in the UK in 2014 where…

  1. Computing in research and development in Africa benefits, trends, challenges and solutions

    CERN Document Server

    2015-01-01

    This book describes the trends, challenges and solutions in computing use for scientific research and development within different domains in Africa, such as health, agriculture, environment, economy, energy, education and engineering. The benefits expected are discussed by a number of recognized, domain-specific experts, with a common theme being computing as solution enabler. This book is the first document providing such a representative up-to-date view on this topic at the continent level.   • Discusses computing for scientific research and development on the African continent, addressing domains such as engineering, health, agriculture, environment, economy, energy, and education; • Describes the state-of-the-art in usage of computing to address problems in developing countries pertaining to health, productivity, economic growth, and renewable energy; • Offers insights applicable to all developing countries on the use of computing technologies to address a variety of societal issues.

  2. Progress report for a research program in computational physics: Progress report, January 1, 1988-December 31, 1988

    International Nuclear Information System (INIS)

    Guralnik, G.S.

    1988-01-01

    The projects in this progress report are all ultimately concerned with various aspects of numerical simulations of lattice gauge theories. These aspects include algorithms, machines, theoretical studies, and actual simulations. We made progress in four general areas: studies of new algorithms, determination of the SU(3) β-function, studies of the finite temperature QCD phase transition in the presence of fermions, and the calculation of hadronic matrix elements. We describe each of these in turn. 7 refs
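
    For readers unfamiliar with the algorithmic core of such simulations, the following toy sketch shows a generic Metropolis Monte Carlo update on a 1D Ising chain. It is a deliberately lightweight stand-in, not lattice gauge theory: the far heavier SU(3) link updates belong to the same accept/reject family but operate on group-valued link variables.

      # Generic Metropolis Monte Carlo sketch on a 1D Ising chain -- a toy stand-in
      # for the vastly heavier SU(3) link updates of lattice gauge simulations,
      # illustrating only the shared accept/reject logic.
      import numpy as np

      rng = np.random.default_rng(0)

      def metropolis_sweep(spins, beta):
          n = spins.size
          for i in rng.integers(0, n, n):      # n random single-site proposals
              # Energy change from flipping spin i (periodic boundary conditions).
              dE = 2.0 * spins[i] * (spins[(i - 1) % n] + spins[(i + 1) % n])
              if dE <= 0 or rng.random() < np.exp(-beta * dE):
                  spins[i] = -spins[i]         # accept the flip
          return spins

      spins = rng.choice([-1, 1], size=200)
      for _ in range(500):
          metropolis_sweep(spins, beta=0.8)
      print("mean magnetization:", spins.mean())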

  3. A Novel Quantitative Computed Tomographic Analysis Suggests How Sirolimus Stabilizes Progressive Air Trapping in Lymphangioleiomyomatosis.

    Science.gov (United States)

    Argula, Rahul G; Kokosi, Maria; Lo, Pechin; Kim, Hyun J; Ravenel, James G; Meyer, Cristopher; Goldin, Jonathan; Lee, Hye-Seung; Strange, Charlie; McCormack, Francis X

    2016-03-01

    The Multicenter International Lymphangioleiomyomatosis Efficacy and Safety of Sirolimus (MILES) trial demonstrated that sirolimus stabilized lung function and improved measures of functional performance and quality of life in patients with lymphangioleiomyomatosis. The physiologic mechanisms of these beneficial actions of sirolimus are incompletely understood. To prospectively determine the longitudinal computed tomographic lung imaging correlates of lung function change in MILES patients treated with placebo or sirolimus, we determined the baseline to 12-month change in computed tomographic image-derived lung volumes and the volume of the lung occupied by cysts in the 31 MILES participants (17 in the sirolimus group, 14 in the placebo group) with baseline and 12-month scans. There was a trend toward an increase in median expiratory cyst volume percentage in the placebo group and a reduction in the sirolimus group (+2.68% vs. +0.97%, respectively; P = 0.10). The computed tomographic image-derived residual volume and the ratio of residual volume to total lung capacity increased more in the placebo group than in the sirolimus group (+214.4 ml vs. +2.9 ml [P = 0.054] and +0.05 vs. -0.01 [P = 0.0498], respectively). A Markov transition chain analysis of respiratory cycle cyst volume changes revealed greater dynamic variation in the sirolimus group than in the placebo group at the 12-month time point. Collectively, these data suggest that sirolimus attenuates progressive gas trapping in lymphangioleiomyomatosis, consistent with a beneficial effect of the drug on airflow obstruction. We speculate that a reduction in lymphangioleiomyomatosis cell burden around small airways and cyst walls alleviates progressive airflow limitation and facilitates cyst emptying.
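
    A Markov transition chain analysis of the kind mentioned can be sketched as estimating a row-stochastic transition matrix from a sequence of discretized cyst-volume states; the code below is illustrative only (hypothetical states, not the MILES data):

      # Illustrative sketch (not the MILES analysis code): estimating a Markov
      # transition matrix from a sequence of discretized cyst-volume states, the
      # kind of object a "Markov transition chain analysis" compares between arms.
      import numpy as np

      def transition_matrix(states, n_states):
          # Count transitions state[t] -> state[t+1], then row-normalize.
          counts = np.zeros((n_states, n_states))
          for a, b in zip(states[:-1], states[1:]):
              counts[a, b] += 1
          rows = counts.sum(axis=1, keepdims=True)
          return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

      # Hypothetical sequence of low/medium/high cyst-volume states over a scan series.
      seq = [0, 1, 1, 2, 1, 0, 1, 2, 2, 1]
      print(transition_matrix(seq, 3))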

  4. [Facing the challenges of ubiquitous computing in the health care sector].

    Science.gov (United States)

    Georgieff, Peter; Friedewald, Michael

    2010-01-01

    The steady progress of microelectronics, communications, and information technology will enable the realisation of the vision of "ubiquitous computing", in which the Internet extends into the real world, embracing everyday objects. The necessary technical basis is already in place. Due to their diminishing size, constantly falling price, and declining energy consumption, processors, communications modules, and sensors are being increasingly integrated into everyday objects today. This development is opening up huge opportunities for both the economy and individuals. In the present paper we discuss possible applications, but also the technical, social, and economic barriers to a widespread use of ubiquitous computing in the health care sector.

  5. Progresses in application of computational fluid dynamic methods to large scale wind turbine aerodynamics

    Institute of Scientific and Technical Information of China (English)

    Zhenyu ZHANG; Ning ZHAO; Wei ZHONG; Long WANG; Bofeng XU

    2016-01-01

    The computational fluid dynamics (CFD) methods are applied to aerodynamic problems for large scale wind turbines. The progress achieved in the aerodynamic analysis of wind turbine profiles, numerical flow simulation of wind turbine blades, evaluation of aerodynamic performance, and multi-objective blade optimization is discussed. Based on the CFD methods, significant improvements are obtained in predicting the two/three-dimensional aerodynamic characteristics of wind turbine airfoils and blades, and the vortical structure in their wake flows is accurately captured. Combined with a multi-objective genetic algorithm, a 1.5 MW NH-1500 optimized blade is designed with high efficiency in wind energy conversion.

  6. High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations

    Science.gov (United States)

    Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.

    2003-01-01

    Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.

  7. Opportunities and Challenges of Cloud Computing to Improve Health Care Services

    Science.gov (United States)

    2011-01-01

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed. PMID:21937354

  8. Opportunities and challenges of cloud computing to improve health care services.

    Science.gov (United States)

    Kuo, Alex Mu-Hsing

    2011-09-21

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed.

  9. Building a Grad Nation: Progress and Challenge in Raising High School Graduation Rates. Annual Update 2016

    Science.gov (United States)

    DePaoli, Jennifer L.; Balfanz, Robert; Bridgeland, John

    2016-01-01

    The nation has achieved an 82.3 percent high school graduation rate--a record high. Graduation rates rose for all student subgroups, and the number of low-graduation-rate high schools and students enrolled in them dropped again, indicating that progress has had far-reaching benefits for all students. This report is the first to analyze 2014…

  10. New Data, Old Tensions: Big Data, Personalized Learning, and the Challenges of Progressive Education

    Science.gov (United States)

    Dishon, Gideon

    2017-01-01

    Personalized learning has become the most notable application of big data in primary and secondary schools in the United States. The combination of big data and adaptive technological platforms is heralded as a revolution that could transform education, overcoming the outdated classroom model, and realizing the progressive vision of…

  11. Building a Grad Nation: Progress and Challenge in Ending the High School Dropout Epidemic

    Science.gov (United States)

    Balfanz, Robert; Bridgeland, John M.; Moore, Laura A.; Fox, Joanna Hornig

    2010-01-01

    The central message of this report is that some states and school districts are raising their high school graduation rates with scalable solutions in the public schools, showing the nation they can end the high school dropout crisis. America made progress not only in suburbs and towns, but also in urban districts and in states across the South.…

  12. Some computational challenges of developing efficient parallel algorithms for data-dependent computations in thermal-hydraulics supercomputer applications

    International Nuclear Information System (INIS)

    Woodruff, S.B.

    1994-01-01

    The Transient Reactor Analysis Code (TRAC), which features a two-fluid treatment of thermal-hydraulics, is designed to model transients in water reactors and related facilities. One of the major computational costs associated with TRAC and similar codes is calculating constitutive coefficients. Although the formulations for these coefficients are local, the costs are flow-regime- or data-dependent; i.e., the computations needed for a given spatial node often vary widely as a function of time. Consequently, a fixed, uniform assignment of nodes to parallel processors will result in degraded computational efficiency due to the poor load balancing. A standard method for treating data-dependent models on vector architectures has been to use gather operations (or indirect addressing) to sort the nodes into subsets that (temporarily) share a common computational model. However, this method is not effective on distributed-memory data-parallel architectures, where indirect addressing involves expensive communication overhead. Another serious problem with this method involves software engineering challenges in the areas of maintainability and extensibility. For example, an implementation that was hand-tuned to achieve good computational efficiency would have to be rewritten whenever the decision tree governing the sorting was modified. Using an example based on the calculation of the wall-to-liquid and wall-to-vapor heat-transfer coefficients for three nonboiling flow regimes, we describe how the Fortran 90 WHERE construct and automatic inlining of functions can be used to ameliorate this problem while improving both efficiency and software engineering. Unfortunately, a general automatic solution to the load-balancing problem associated with data-dependent computations is not yet available for massively parallel architectures. We discuss why developers should either wait for such solutions or consider alternative numerical algorithms, such as a neural network
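
    The masked-selection idea can be illustrated with a NumPy analogue of the Fortran 90 WHERE construct (illustrative only; the regime codes and correlations below are placeholders, not TRAC's constitutive models):

      # NumPy analogue (illustrative, not TRAC code) of the Fortran 90 WHERE idea:
      # evaluate a different (hypothetical) heat-transfer-coefficient model per flow
      # regime with boolean masks instead of gather/scatter index lists, keeping the
      # decision logic in one place while the per-node work stays data-parallel.
      import numpy as np

      def wall_htc(regime, velocity, void_fraction):
          # 'regime' codes three nonboiling flow regimes: 0=bubbly, 1=slug, 2=annular.
          h = np.empty_like(velocity)
          bubbly = regime == 0
          slug = regime == 1
          annular = regime == 2
          # Placeholder correlations; real codes use regime-specific constitutive models.
          h[bubbly] = 1000.0 + 50.0 * velocity[bubbly]
          h[slug] = 800.0 + 30.0 * velocity[slug] * (1 - void_fraction[slug])
          h[annular] = 500.0 + 80.0 * np.sqrt(np.abs(velocity[annular]))
          return h

      rng = np.random.default_rng(1)
      n = 8
      regime = rng.integers(0, 3, n)
      print(wall_htc(regime, rng.uniform(0.1, 5.0, n), rng.uniform(0.0, 1.0, n)))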

  13. Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling

    Science.gov (United States)

    Ormsbee, L.; Tufail, M.

    2005-12-01

    The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in water resources and environmental systems, ranging from data-driven models to optimal control strategies that support a more informed and intelligent decision-making process. Availability of data is critical to such applications: scarce data may lead to models that do not represent the response function over the entire domain, while too much data tends to over-constrain the problem. This paper describes the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The applications include the development of inductive models as substitutes for more complex process-based models to predict the water quality of key constituents (such as dissolved oxygen) and their use in an optimization framework for optimal load reductions. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. The challenges faced in this application include (1) uncertainty in data sets, (2) model application, and (3) development of cause-and-effect relationships between water quality constituents and watershed parameters through the use of inductive models. The paper discusses these challenges and how they affect the desired goals of the project.
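
    The described pattern, a cheap inductive surrogate searched by an evolutionary algorithm, can be sketched as follows. The surrogate here is a hypothetical stand-in for a trained neural network, and all coefficients are illustrative:

      # Skeleton (illustrative; the true surrogate would be a neural network trained
      # on process-model runs) of the described pattern: search pollutant load
      # reductions with a genetic algorithm against a cheap surrogate predicting a
      # water-quality response such as dissolved oxygen.
      import numpy as np

      rng = np.random.default_rng(0)

      def surrogate_objective(reductions):
          # Hypothetical stand-in: water-quality benefit rises with reductions,
          # implementation cost rises too; the GA minimizes the returned value.
          benefit = 1.0 - np.exp(-3.0 * reductions.mean(axis=-1))
          cost = 0.4 * reductions.mean(axis=-1)
          return -(benefit - cost)

      def genetic_search(fitness, n_vars, pop=40, gens=100, mut=0.1):
          population = rng.uniform(0, 1, (pop, n_vars))      # reductions in [0, 1]
          for _ in range(gens):
              scores = fitness(population)
              order = np.argsort(scores)
              parents = population[order[: pop // 2]]         # truncation selection
              cut = rng.integers(1, n_vars, pop // 2)         # one-point crossover
              mates = parents[rng.permutation(pop // 2)]
              children = np.where(np.arange(n_vars) < cut[:, None], parents, mates)
              children += rng.normal(0, mut, children.shape)  # Gaussian mutation
              population = np.clip(np.vstack([parents, children]), 0, 1)
          return population[np.argmin(fitness(population))]

      best = genetic_search(surrogate_objective, n_vars=5)
      print("best load reductions per source:", best.round(2))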

  14. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    International Nuclear Information System (INIS)

    Khaleel, Mohammad A.

    2009-01-01

    This report is an account of the deliberations and conclusions of the workshop on 'Forefront Questions in Nuclear Science and the Role of High Performance Computing' held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to (1) identify forefront scientific challenges in nuclear physics and then determine which, if any, of these could be aided by high performance computing at the extreme scale; (2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; (3) provide nuclear physicists the opportunity to influence the development of high performance computing; and (4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  15. Computer-assisted learning and simulation systems in dentistry--a challenge to society.

    Science.gov (United States)

    Welk, A; Splieth, Ch; Wierinck, E; Gilpatrick, R O; Meyer, G

    2006-07-01

    Computer technology is increasingly used in practical training at universities. However, in spite of their potential, computer-assisted learning (CAL) and computer-assisted simulation (CAS) systems still appear to be underutilized in dental education. The advantages, challenges, problems, and solutions of computer-assisted learning and simulation in dentistry are discussed by means of MEDLINE and open Internet platform searches and the key results of a study among German dental schools. The advantages of computer-assisted learning are seen, for example, in self-paced and self-directed learning and increased motivation. It is useful for both objective theoretical and practical tests and for training students to handle complex cases. CAL can lead to more structured learning and can support training in evidence-based decision-making. The reasons for the still relatively rare implementation of CAL/CAS systems in dental education include financing difficulties, a lack of studies of CAL/CAS, and the effort required to integrate CAL/CAS systems into the curriculum. To overcome the reasons for the relatively low degree of computer technology use, we should strive for multicenter research and development projects monitored by the appropriate national and international scientific societies, so that the potential of computer technology can be fully realized in graduate, postgraduate, and continuing dental education.

  16. Ex Machina: Analytical platforms, Law and the Challenges of Computational Legal Science

    Directory of Open Access Journals (Sweden)

    Nicola Lettieri

    2018-04-01

    Over the years, computation has become a fundamental part of scientific practice in many research fields, going far beyond the boundaries of the natural sciences. Data mining, machine learning, simulations, and other computational methods lie today at the heart of the scientific endeavour in a growing number of social research areas, from anthropology to economics. In this scenario, an increasingly important role is played by analytical platforms: integrated environments allowing researchers to experiment with cutting-edge data-driven and computation-intensive analyses. The paper discusses the appearance of such tools in the emerging field of computational legal science. After a general introduction to the impact of computational methods on both the natural and social sciences, we describe the concept and features of an analytical platform exploring innovative cross-methodological approaches to the academic and investigative study of crime. Stemming from an ongoing project involving researchers from law, computer science, and bioinformatics, the initiative is presented and discussed as an opportunity to raise a debate about the future of legal scholarship and, within it, about the challenges of computational legal science.

  17. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2009-10-01

    This report is an account of the deliberations and conclusions of the workshop on "Forefront Questions in Nuclear Science and the Role of High Performance Computing" held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to 1) identify forefront scientific challenges in nuclear physics and then determine which, if any, of these could be aided by high performance computing at the extreme scale; 2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; 3) provide nuclear physicists the opportunity to influence the development of high performance computing; and 4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  18. Assessing the Progress of Trapped-Ion Processors Towards Fault-Tolerant Quantum Computation

    Science.gov (United States)

    Bermudez, A.; Xu, X.; Nigmatullin, R.; O'Gorman, J.; Negnevitsky, V.; Schindler, P.; Monz, T.; Poschinger, U. G.; Hempel, C.; Home, J.; Schmidt-Kaler, F.; Biercuk, M.; Blatt, R.; Benjamin, S.; Müller, M.

    2017-10-01

    A quantitative assessment of the progress of small prototype quantum processors towards fault-tolerant quantum computation is a problem of current interest in experimental and theoretical quantum information science. We introduce a necessary and fair criterion for quantum error correction (QEC), which must be achieved in the development of these quantum processors before their sizes are sufficiently big to consider the well-known QEC threshold. We apply this criterion to benchmark the ongoing effort in implementing QEC with topological color codes using trapped-ion quantum processors and, more importantly, to guide the future hardware developments that will be required in order to demonstrate beneficial QEC with small topological quantum codes. In doing so, we present a thorough description of a realistic trapped-ion toolbox for QEC and a physically motivated error model that goes beyond standard simplifications in the QEC literature. We focus on laser-based quantum gates realized in two-species trapped-ion crystals in high-optical aperture segmented traps. Our large-scale numerical analysis shows that, with the foreseen technological improvements described here, this platform is a very promising candidate for fault-tolerant quantum computation.

  19. Progress in computer aided diagnosis for medical images by information technology

    International Nuclear Information System (INIS)

    Mekada, Yoshito

    2007-01-01

    This paper describes the history, present state, and future of computer-aided diagnosis (CAD) based on the processing, recognition, and visualization of chest and abdominal images. Primitive forms of CAD for lung cancer detection appeared as early as the 1960s. Today, multi-dimensional medical imaging by CT, MRI, single photon emission computed tomography (SPECT), and positron emission tomography (PET) requires doctors to read vast amounts of information, making the need for CAD evident. At present, simultaneous CAD for multiple organs and multiple diseases is in progress; the interaction between images and medical doctors is leading to newer systems such as virtual endoscopy; and objective evaluation of CAD systems, using receiver operating characteristic analysis, is necessary for approval by authorities such as the U.S. Food and Drug Administration (FDA). Cooperation between the medical and technological fields is thus more and more important. In the future, CAD should account for individual differences and for changes in disease state, be usable across time and space, have its importance better recognized by doctors, and participate more usefully in therapeutic practice. (R.T.)

  20. A community computational challenge to predict the activity of pairs of compounds.

    Science.gov (United States)

    Bansal, Mukesh; Yang, Jichen; Karan, Charles; Menden, Michael P; Costello, James C; Tang, Hao; Xiao, Guanghua; Li, Yajuan; Allen, Jeffrey; Zhong, Rui; Chen, Beibei; Kim, Minsoo; Wang, Tao; Heiser, Laura M; Realubit, Ronald; Mattioli, Michela; Alvarez, Mariano J; Shen, Yao; Gallahan, Daniel; Singer, Dinah; Saez-Rodriguez, Julio; Xie, Yang; Stolovitzky, Gustavo; Califano, Andrea

    2014-12-01

    Recent therapeutic successes have renewed interest in drug combinations, but experimental screening approaches are costly and often identify only small numbers of synergistic combinations. The DREAM consortium launched an open challenge to foster the development of in silico methods to computationally rank 91 compound pairs, from the most synergistic to the most antagonistic, based on gene-expression profiles of human B cells treated with individual compounds at multiple time points and concentrations. Using scoring metrics based on experimental dose-response curves, we assessed 32 methods (31 community-generated approaches and SynGen), four of which performed significantly better than random guessing. We highlight similarities between the methods. Although the accuracy of predictions was not optimal, we find that computational prediction of compound-pair activity is possible, and that community challenges can be useful to advance the field of in silico compound-synergy prediction.
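
    One common baseline for ranking compound pairs, shown below as an illustrative sketch (this is not the DREAM scoring metric, and the effects are synthetic), is the Bliss-independence excess: the observed combination effect minus the effect expected if the two compounds acted independently.

      # Illustrative baseline (not the DREAM scoring metric): rank compound pairs by
      # their Bliss-independence excess, i.e. observed combination effect minus the
      # effect expected if the two compounds acted independently. Data are synthetic.
      def bliss_excess(e_a, e_b, e_ab):
          # e_* are fractional effects in [0, 1]; expected = e_a + e_b - e_a * e_b.
          return e_ab - (e_a + e_b - e_a * e_b)

      pairs = [("A", "B"), ("A", "C"), ("B", "C")]
      single = {"A": 0.30, "B": 0.45, "C": 0.20}                    # hypothetical single effects
      combo = {("A", "B"): 0.75, ("A", "C"): 0.40, ("B", "C"): 0.55}

      scores = {p: bliss_excess(single[p[0]], single[p[1]], combo[p]) for p in pairs}
      # Most synergistic (largest excess) first, most antagonistic last.
      for p, s in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(p, round(s, 3))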

  1. Challenges in Assessing Progress in Multifunctional Operations: Experiences from a Provincial Reconstruction Team in Afghanistan

    Science.gov (United States)

    2011-06-01

    these measures. Assessment of progress can thus be seen as a process consisting of monitoring and evaluation activities (Sida, 2007). Input... limited integration and understanding between the Swedish Armed Forces and SIDA at the domestic interagency level. Four participants said that the... military and SIDA personnel had been sent to the PRT with different mandates, objectives and cultures, without practical instructions on how to

  2. Exploring the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms: A Formative Analysis

    OpenAIRE

    Robin H. Kay; Sharon Lauricella

    2011-01-01

    Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess beneficial and challenging laptop behaviours in higher education classrooms. Both quantitative and qualitative data were collected from 177 undergrad...

  3. Scientific Grand Challenges: Crosscutting Technologies for Computing at the Exascale - February 2-4, 2010, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2011-02-06

    The goal of the "Scientific Grand Challenges - Crosscutting Technologies for Computing at the Exascale" workshop in February 2010, jointly sponsored by the U.S. Department of Energy’s Office of Advanced Scientific Computing Research and the National Nuclear Security Administration, was to identify the elements of a research and development agenda that will address these challenges and create a comprehensive exascale computing environment. This exascale computing environment will enable the science applications identified in the eight previously held Scientific Grand Challenges Workshop Series.

  4. Research Progress of Global Land Domain Service Computing: Take GlobeLand 30 as an Example

    Directory of Open Access Journals (Sweden)

    CHEN Jun

    2017-10-01

    Combining service-computing technology with domain requirements is one of the important development directions of geographic information under Internet+, providing highly efficient technical means for information sharing and collaborative services. Using GlobeLand 30 as an example, this paper analyzes the basic problems of integrating land cover information processing with service computing, and introduces the latest research progress in domain service modeling, online computing methods, dynamic service technology, and the GlobeLand 30 information service platform. The paper also discusses further development directions of GlobeLand 30 domain service computing.

  5. Bringing high-performance computing to the biologist's workbench: approaches, applications, and challenges

    International Nuclear Information System (INIS)

    Oehmen, C S; Cannon, W R

    2008-01-01

    Data-intensive and high-performance computing are poised to significantly impact the future of biological research, which is increasingly driven by the prevalence of high-throughput experimental methodologies for genome sequencing, transcriptomics, proteomics, and other areas. Large centers (such as NIH's National Center for Biotechnology Information, The Institute for Genomic Research, and the DOE's Joint Genome Institute) have made extensive use of multiprocessor architectures to deal with some of the challenges of processing, storing, and curating exponentially growing genomic and proteomic datasets, thus enabling users to rapidly access a growing public data source, as well as to use analysis tools transparently on high-performance computing resources. Applying this computational power to single-investigator analysis, however, often relies on users to provide their own computational resources, forcing them to endure the learning curve of porting, building, and running software on multiprocessor architectures. Solving the next generation of large-scale biology challenges using multiprocessor machines, from small clusters to emerging petascale machines, can most practically be realized if this learning curve can be minimized through a combination of workflow management, data management, and resource allocation, as well as intuitive interfaces and compatibility with existing common data formats

  6. The challenges of developing computational physics: the case of South Africa

    International Nuclear Information System (INIS)

    Salagaram, T; Chetty, N

    2013-01-01

    Most modern scientific research problems are complex and interdisciplinary in nature. It is impossible to study such problems in detail without the use of computation in addition to theory and experiment. Although it is widely agreed that students should be introduced to computational methods at the undergraduate level, it remains a challenge to do this within a full traditional undergraduate curriculum. In this paper, we report on a survey that we conducted of undergraduate physics curricula in South Africa to determine the content of, and the approach taken in, the teaching of computational physics. We also considered the pedagogy of computational physics at the postgraduate and research levels at various South African universities, research facilities, and institutions. We conclude that the state of computational physics training in South Africa, especially at the undergraduate teaching level, is generally weak and needs to be given more attention at all universities. Failure to do so will impact negatively on the country's capacity to grow its endeavours in the computational sciences, with negative impacts on research and on commerce and industry

  7. Computational intelligence in wireless sensor networks recent advances and future challenges

    CERN Document Server

    Falcon, Rafael; Koeppen, Mario

    2017-01-01

    This book emphasizes the increasingly important role that Computational Intelligence (CI) methods are playing in solving a myriad of entangled Wireless Sensor Networks (WSN) related problems. The book serves as a guide for surveying several state-of-the-art WSN scenarios in which CI approaches have been employed. The reader finds in this book how CI has contributed to solve a wide range of challenging problems, ranging from balancing the cost and accuracy of heterogeneous sensor deployments to recovering from real-time sensor failures to detecting attacks launched by malicious sensor nodes and enacting CI-based security schemes. Network managers, industry experts, academicians and practitioners alike (mostly in computer engineering, computer science or applied mathematics) benefit from the spectrum of successful applications reported in this book. Senior undergraduate or graduate students may discover in this book some problems well suited for their own research endeavors. USP: Presents recent advances and fu...

  8. New Challenges for Design Participation in the Era of Ubiquitous Computing

    DEFF Research Database (Denmark)

    Brereton, Margot; Buur, Jacob

    2008-01-01

    Since the advent of participatory design in the work democracy projects of the 1970s and 1980s in Scandinavia, computing technology and people's engagement with it have undergone fundamental changes. Although participatory design continues to be a precondition for designing computing that aligns with human practices, the motivations to engage in participatory design have changed, and the new era requires formats that are different from the original ones. Through the analysis of three case studies this paper seeks to explain why participatory design must be brought to bear on the field of ubiquitous computing, and how this challenges the original participatory design thinking. In particular we argue that more casual, exploratory formats of engagement with people are required, and rather than planning the all-encompassing systems development project, participatory design needs to move towards

  9. Geant4 Hadronic Cascade Models and CMS Data Analysis : Computational Challenges in the LHC era

    CERN Document Server

    Heikkinen, Aatos

    This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis work to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, the computer technology of clusters, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions, up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in Geant4 simulation tool, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we es...

  10. Getting ready for REDD+ in Tanzania: a case study of progress and challenges

    DEFF Research Database (Denmark)

    Burgess, Neil David; Bahane, Bruno; Clairs, Tim

    2010-01-01

    the Norwegian, Finnish and German governments and is a participant in the World Bank’s Forest Carbon Partnership Facility. In combination, these interventions aim to mitigate greenhouse gas emissions, provide an income to rural communities and conserve biodiversity. The establishment of the UN-REDD Programme...... in Tanzania illustrates real-world challenges in a developing country. These include currently inadequate baseline forestry data sets (needed to calculate reference emission levels), inadequate government capacity and insufficient experience of implementing REDD+-type measures at operational levels....... Additionally, for REDD+ to succeed, current users of forest resources must adopt new practices, including the equitable sharing of benefits that accrue from REDD+ implementation. These challenges are being addressed by combined donor support to implement a national forest inventory, remote sensing of forest...

  11. Seismic damage estimation for buried pipelines - challenges after three decades of progress

    Energy Technology Data Exchange (ETDEWEB)

    Pineda-Porras, Omar Andrey [Los Alamos National Laboratory]; Najafi, Mohammad [University of Texas]

    2009-01-01

    This paper analyzes the evolution over the past three decades of seismic damage estimation for buried pipelines and identifies some challenges for future research studies on the subject. The first section of this paper presents a chronological description of the evolution since the mid-1970s of pipeline fragility relations - the most common tool for pipeline damage estimation - and follows with a careful analysis of the use of several ground motion parameters as pipeline damage indicators. In the second section of the paper, four gaps on the subject are identified and proposed as challenges for future research studies. The main conclusion of this work is that enhanced fragility relations must be developed for improving pipeline damage estimation, which must consider relevant parameters that could influence the seismic response of pipelines.

  12. National rural drinking water monitoring: progress and challenges with India's IMIS database

    OpenAIRE

    Wescoat, James; Fletcher, Sarah Marie; Novellino, Marianna

    2015-01-01

    National drinking water programs seek to address monitoring challenges that include self-reporting, data sampling, data consistency and quality, and sufficient frequency to assess the sustainability of water systems. India stands out for its comprehensive rural water database known as Integrated Management Information System (IMIS), which conducts annual monitoring of drinking water coverage, water quality, and related program components from the habitation level to the district, state, and n...

  13. Brazil and UN Security Council Resolution 1325: Progress and Challenges of the Implementation Process

    Science.gov (United States)

    2016-03-01

    Haiti, was a cultural trait, and as such very hard to address. This counterpoint affects their perception of the relationship between a lasting...challenge requires a long-term process of cultural change, not only in the military, but also more broadly. A generalized reconstruction of the...Brazil, Bolivia, Chile, Peru, and Uruguay.

  14. Challenges and considerations for the design and production of a purpose-optimized body-worn wrist-watch computer

    Science.gov (United States)

    Narayanaswami, Chandra; Raghunath, Mandayam T.

    2004-09-01

    We outline a collection of technological challenges in the design of wearable computers with a focus on one of the most desirable form-factors, the wrist watch. We describe our experience with building three generations of wrist watch computers. We built these research prototypes as platforms to investigate the fundamental limitations of wearable computing. Results of our investigations are presented in the form of challenges that have been overcome and those that still remain.

  15. Progress and Challenges in the Design and Clinical Development of Antibodies for Cancer Therapy

    Directory of Open Access Journals (Sweden)

    Juan C. Almagro

    2018-01-01

    The remarkable progress in engineering and clinical development of therapeutic antibodies in the last 40 years, after the seminal work by Köhler and Milstein, has led to the approval by the United States Food and Drug Administration (FDA) of 21 antibodies for cancer immunotherapy. We review here these approved antibodies, with emphasis on the methods used for their discovery, engineering, and optimization for therapeutic settings. These methods include antibody engineering via chimerization and humanization of non-human antibodies, as well as selection and further optimization of fully human antibodies isolated from human antibody phage-displayed libraries and immunization of transgenic mice capable of generating human antibodies. These technology platforms have progressively led to the development of therapeutic antibodies with higher human content and, thus, less immunogenicity. We also discuss the genetic engineering approaches that have allowed isotype switching and Fc modifications to modulate effector functions and bioavailability (half-life), which, together with the technologies for engineering the Fv fragment, have been pivotal in generating more efficacious and better tolerated therapeutic antibodies to treat cancer.

  16. IMPLEMENTING THE COMPUTER-BASED NATIONAL EXAMINATION IN INDONESIAN SCHOOLS: THE CHALLENGES AND STRATEGIES

    Directory of Open Access Journals (Sweden)

    Heri Retnawati

    2017-12-01

    In line with technological development, the computer-based national examination (CBNE) has become an urgent matter, as its implementation faces various challenges, especially in developing countries. Strategies for implementing CBNE are thus needed to face these challenges. The aim of this research was to analyse the challenges and strategies of Indonesian schools in implementing CBNE. This research was qualitative and phenomenological in nature. The data were collected through a questionnaire and a focus group discussion. The research participants were teachers who were test supervisors and technicians at junior high schools and senior high schools (i.e. Levels 1 and 2) and vocational high schools implementing CBNE in Yogyakarta, Indonesia. The data were analysed using the Bogdan and Biklen model. The results indicate that (1) in implementing CBNE, the schools should initially make efforts to provide the electronic equipment supporting it; (2) the implementation of CBNE is challenged by problems concerning the Internet and the electricity supply; (3) the test supervisors have to learn their duties by themselves; and (4) the students are not yet familiar with the beneficial use of information technology. To deal with such challenges, the schools employed strategies of providing the standard electronic equipment through collaboration with the students’ parents and improving the curriculum content by adding information technology as a school subject.

  17. The Challenges and Benefits of Using Computer Technology for Communication and Teaching in the Geosciences

    Science.gov (United States)

    Fairley, J. P.; Hinds, J. J.

    2003-12-01

    The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, which has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault-controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.
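
    As an aside for readers who wish to reproduce such classroom material, the short sketch below evaluates the classic Theis (1935) drawdown solution, the standard analytical model behind "water table response to well pumping" animations of the kind described above. It is a hypothetical illustration, not the authors' code, and the pumping rate, transmissivity and storativity are placeholder values.

    ```python
    # Theis (1935) drawdown around a pumping well; a minimal sketch with
    # illustrative parameters (not taken from the paper described above).
    import numpy as np
    from scipy.special import exp1   # exponential integral E1 = well function W(u)

    def theis_drawdown(r, t, Q=0.01, T=1e-3, S=1e-4):
        """Drawdown s (m) at radius r (m) and time t (s), for pumping rate Q
        (m^3/s), transmissivity T (m^2/s), and storativity S (dimensionless)."""
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    # drawdown 50 m from the well after one day of pumping
    print(theis_drawdown(r=50.0, t=86400.0))
    ```

    Evaluating this function over a grid of radii and times reproduces the expanding cone of depression that such an animation visualizes.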

  18. Current progress and challenges in engineering viable artificial leaf for solar water splitting

    Directory of Open Access Journals (Sweden)

    Phuc D. Nguyen

    2017-12-01

    Large-scale production of H2, a clean fuel, can be realized from just water and solar light energy by employing a viable energy conversion device called an artificial leaf. In this tutorial review, we discuss recent advances and the technical challenges that remain toward the creation of such a leaf. We highlight the development of key components, such as catalysts for the water electrolysis process and light harvesters for capturing solar energy, as well as strategies being developed for assembling these components into a complete artificial leaf.

  19. Tissue Engineering of Blood Vessels: Functional Requirements, Progress, and Future Challenges.

    Science.gov (United States)

    Kumar, Vivek A; Brewster, Luke P; Caves, Jeffrey M; Chaikof, Elliot L

    2011-09-01

    Vascular disease results in the decreased utility and decreased availability of autologous vascular tissue for small diameter applications. Developing feasible solutions with the requisite mechanical support, a non-fouling surface for blood flow, and tissue regeneration requires combined approaches from biomaterials science, cell biology, and translational medicine. Over the past two decades interest in blood vessel tissue engineering has soared on a global scale, resulting in the first clinical implants of multiple technologies, steady progress with several other systems, and critical lessons learned. This review will highlight the current inadequacies of autologous and synthetic grafts, the engineering requirements for implantation of tissue-engineered grafts, and the current status of tissue-engineered blood vessel research.

  20. Fundamental challenges in mechanistic enzymology: progress toward understanding the rate enhancements of enzymes.

    Science.gov (United States)

    Herschlag, Daniel; Natarajan, Aditya

    2013-03-26

    Enzymes are remarkable catalysts that lie at the heart of biology, accelerating chemical reactions to an astounding extent with extraordinary specificity. Enormous progress in understanding the chemical basis of enzymatic transformations and the basic mechanisms underlying rate enhancements over the past decades is apparent. Nevertheless, it has been difficult to achieve a quantitative understanding of how the underlying mechanisms account for the energetics of catalysis, because of the complexity of enzyme systems and the absence of underlying energetic additivity. We review case studies from our own work that illustrate the power of precisely defined and clearly articulated questions when dealing with such complex and multifaceted systems, and we also use this approach to evaluate our current ability to design enzymes. We close by highlighting a series of questions that help frame some of what remains to be understood, and we encourage the reader to define additional questions and directions that will deepen and broaden our understanding of enzymes and their catalysis.

  1. Progress and challenges in the prevention and control of nonalcoholic fatty liver disease.

    Science.gov (United States)

    Cai, Jingjing; Zhang, Xiao-Jing; Li, Hongliang

    2018-05-30

    Nonalcoholic fatty liver disease (NAFLD) is rapidly becoming the most common liver disease worldwide. Individuals with NAFLD have a high frequency of developing progressive liver disease and metabolism-related comorbidities, which result from a lack of awareness and poor surveillance of the disease and a paucity of approved and effective therapies. Managing the complications of NAFLD has already begun to place a tremendous burden on health-care systems. Although efforts to identify effective therapies are underway, the lack of validated preclinical NAFLD models that represent the biology and outcomes of human disease remains a major barrier. This review summarizes the characteristics and prevalence of the disease and the status of our understanding of its mechanisms and potential therapeutic targets. © 2018 Wiley Periodicals, Inc.

  2. A Survey on Underwater Wireless Sensor Networks: Progresses, Applications, and Challenges

    Directory of Open Access Journals (Sweden)

    Premalatha J.

    2016-01-01

    Endangered underwater species have always drawn the attention of the scientific community, since their disappearance would cause irreplaceable loss. Asia is home to some of the most diverse habitats on Earth, but it is estimated that more than one in four species is endangered. Underwater, many factors are putting marine life under immense pressure. Extreme population pressure is leading to pollution, over-fishing and the devastation of crucial habitats. Consequently, the numbers of almost all fish are declining and many are already endangered. To help these species survive, their habitat should be strictly protected, which can be achieved by strict monitoring. In this effort, several parameters and constraints concerning the species and their environments are of interest. Advances in sensor technology now facilitate the monitoring of species and their habitat at lower cost. Indeed, the increasing sophistication of underwater wireless sensors opens opportunities and new challenges in many areas, such as surveillance. This paper endorses the use of sensors for monitoring endangered underwater species in their habitat. It further examines the key approaches and challenges in the design and implementation of underwater wireless sensor networks. We summarize major applications and the main phenomena related to acoustic propagation, and discuss how they affect the design and operation of communication systems and networking protocols at various layers.
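
    To make the acoustic-propagation constraint concrete, the sketch below evaluates Thorp's empirical formula for seawater absorption; attenuation grows rapidly with frequency, which is one reason underwater sensor networks must operate at low acoustic frequencies over long links. This is an illustrative aside, not code from the paper, and the sample frequencies are arbitrary.

    ```python
    # Thorp's empirical approximation of acoustic absorption in seawater.
    def thorp_absorption_db_per_km(f_khz):
        """Absorption in dB/km for frequency f in kHz."""
        f2 = f_khz ** 2
        return (0.11 * f2 / (1.0 + f2)
                + 44.0 * f2 / (4100.0 + f2)
                + 2.75e-4 * f2
                + 0.003)

    for f in (1.0, 10.0, 50.0, 100.0):   # illustrative frequencies
        print(f"{f:6.1f} kHz -> {thorp_absorption_db_per_km(f):7.2f} dB/km")
    ```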

  3. Deep vadose zone remediation: technical and policy challenges, opportunities, and progress in achieving cleanup endpoints

    International Nuclear Information System (INIS)

    Wellman, D.M.; Freshley, M.D.; Truex, M.J.; Lee, M.H.

    2013-01-01

    Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable the establishment of a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered around developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches. (authors)

  4. Alternate Endpoints for Deep Vadose Zone Environments: Challenges, Opportunities, and Progress - 13036

    International Nuclear Information System (INIS)

    Wellman, Dawn M.; Freshley, Mark D.; Truex, Michael J.; Lee, M. Hope

    2013-01-01

    Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical to achieve. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable establishing a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered around developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches. (authors)

  5. Alternate Endpoints for Deep Vadose Zone Environments: Challenges, Opportunities, and Progress - 13036

    Energy Technology Data Exchange (ETDEWEB)

    Wellman, Dawn M.; Freshley, Mark D.; Truex, Michael J.; Lee, M. Hope [Pacific Northwest National Laboratory, 902 Battelle Blvd, Richland, WA, 99352 (United States)

    2013-07-01

    Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical to achieve. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable establishing a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered around developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches. (authors)

  6. Deep vadose zone remediation: technical and policy challenges, opportunities, and progress in achieving cleanup endpoints

    Energy Technology Data Exchange (ETDEWEB)

    Wellman, D.M.; Freshley, M.D.; Truex, M.J.; Lee, M.H. [Pacific Northwest National Laboratory, Richland, Washington (United States)

    2013-07-01

    Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable the establishment of a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered around developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches. (authors)

  7. Recent progress and challenges in nanotechnology for biomedical applications: an insight into the analysis of neurotransmitters.

    Science.gov (United States)

    Shankaran, Dhesingh Ravi; Miura, Norio

    2007-01-01

    Nanotechnology offers exciting opportunities and unprecedented compatibilities in manipulating chemical and biological materials at the atomic or molecular scale for the development of novel functional materials with enhanced capabilities. It plays a central role in the recent technological advances in biomedical technology, especially in the areas of disease diagnosis, drug design and drug delivery. In this review, we present the recent trends and challenges in the development of nanomaterials for biomedical applications with a special emphasis on the analysis of neurotransmitters. Neurotransmitters are the chemical messengers which transmit information and signals all over the body. They play a prime role in the functioning of the central nervous system (CNS) and govern most of the metabolic functions including movement, pleasure, pain, mood, emotion, thinking, digestion, sleep, addiction, fear, anxiety and depression. Thus, the development of high-performance and user-friendly analytical methods for ultra-sensitive detection of neurotransmitters remains a major challenge in modern biomedical analysis. Nanostructured materials are emerging as a powerful means for the diagnosis of CNS disorders because of their unique optical, size and surface characteristics. This review provides a brief outline of the basic concepts and recent advancements of nanotechnology for biomedical applications, especially in the analysis of neurotransmitters. A brief introduction to the nanomaterials, bionanotechnology and neurotransmitters is also included along with discussions of most of the patents published in these areas.

  8. Science for Today's Energy Challenges: Accelerating Progress for a Sustainable Energy Future

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-07-01

    With a growing population and energy demand in the world, there is a pressing need for research to create secure and accessible energy options with greatly reduced emissions of greenhouse gases. While we work to deploy the clean and efficient technologies that we already have--which will be urgent for the coming decades--we must also work to develop the science for the technologies of the future. This brochure gives examples of some of the most promising developments, and it provides 'snapshots' of cutting-edge work by scientists in the field. The areas of greatest promise include biochemistry, nanotechnology, superconductivity, electrophysics and computing. There are many others.

  9. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity.

    Science.gov (United States)

    Liang, Jie; Qian, Hong

    2010-01-01

    Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge from understanding macromolecular dynamics has led the way for computations to be part of the tool set to study molecular biology. Twenty-five years ago, the demand from genome science has inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand "complex behavior" and complexity theory, and from which important biological insight can be gained.
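
    The stochastic trajectories mentioned above are typically generated with the Gillespie algorithm. The sketch below is a minimal self-contained implementation for a hypothetical reversible isomerization A <-> B; the rate constants and initial counts are illustrative and do not come from the paper.

    ```python
    # Minimal Gillespie stochastic simulation of A <-> B (illustrative example).
    import random

    def gillespie(n_a, n_b, k_f, k_r, t_end):
        """One exact sample path of the CME for A <-> B until t_end."""
        t, trajectory = 0.0, [(0.0, n_a, n_b)]
        while t < t_end:
            a1, a2 = k_f * n_a, k_r * n_b   # mass-action propensities
            a0 = a1 + a2
            if a0 == 0.0:
                break                        # no reaction can fire
            t += random.expovariate(a0)      # exponentially distributed waiting time
            if random.random() * a0 < a1:    # choose which reaction fires
                n_a, n_b = n_a - 1, n_b + 1
            else:
                n_a, n_b = n_a + 1, n_b - 1
            trajectory.append((t, n_a, n_b))
        return trajectory

    traj = gillespie(n_a=100, n_b=0, k_f=1.0, k_r=0.5, t_end=5.0)
    print(traj[-1])   # final (time, #A, #B) of one sample path
    ```

    Averaging many such sample paths approximates the probability landscape that the CME describes exactly.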

  10. A Step Towards A Computing Grid For The LHC Experiments ATLAS Data Challenge 1

    CERN Document Server

    Sturrock, R; Epp, B; Ghete, V M; Kuhn, D; Mello, A G; Caron, B; Vetterli, M C; Karapetian, G V; Martens, K; Agarwal, A; Poffenberger, P R; McPherson, R A; Sobie, R J; Amstrong, S; Benekos, N C; Boisvert, V; Boonekamp, M; Brandt, S; Casado, M P; Elsing, M; Gianotti, F; Goossens, L; Grote, M; Hansen, J B; Mair, K; Nairz, A; Padilla, C; Poppleton, A; Poulard, G; Richter-Was, Elzbieta; Rosati, S; Schörner-Sadenius, T; Wengler, T; Xu, G F; Ping, J L; Chudoba, J; Kosina, J; Lokajícek, M; Svec, J; Tas, P; Hansen, J R; Lytken, E; Nielsen, J L; Wäänänen, A; Tapprogge, Stefan; Calvet, D; Albrand, S; Collot, J; Fulachier, J; Ledroit-Guillon, F; Ohlsson-Malek, F; Viret, S; Wielers, M; Bernardet, K; Corréard, S; Rozanov, A; De Vivie de Régie, J B; Arnault, C; Bourdarios, C; Hrivnác, J; Lechowski, M; Parrour, G; Perus, A; Rousseau, D; Schaffer, A; Unal, G; Derue, F; Chevalier, L; Hassani, S; Laporte, J F; Nicolaidou, R; Pomarède, D; Virchaux, M; Nesvadba, N; Baranov, S; Putzer, A; Khonich, A; Duckeck, G; Schieferdecker, P; Kiryunin, A E; Schieck, J; Lagouri, T; Duchovni, E; Levinson, L; Schrager, D; Negri, G; Bilokon, H; Spogli, L; Barberis, D; Parodi, F; Cataldi, G; Gorini, E; Primavera, M; Spagnolo, S; Cavalli, D; Heldmann, M; Lari, T; Perini, L; Rebatto, D; Resconi, S; Tatarelli, F; Vaccarossa, L; Biglietti, M; Carlino, G; Conventi, F; Doria, A; Merola, L; Polesello, G; Vercesi, V; De Salvo, A; Di Mattia, A; Luminari, L; Nisati, A; Reale, M; Testa, M; Farilla, A; Verducci, M; Cobal, M; Santi, L; Hasegawa, Y; Ishino, M; Mashimo, T; Matsumoto, H; Sakamoto, H; Tanaka, J; Ueda, I; Bentvelsen, Stanislaus Cornelius Maria; Fornaini, A; Gorfine, G; Groep, D; Templon, J; Köster, L J; Konstantinov, A; Myklebust, T; Ould-Saada, F; Bold, T; Kaczmarska, A; Malecki, P; Szymocha, T; Turala, M; Kulchitskii, Yu A; Khoreauli, G; Gromova, N; Tsulaia, V; Minaenko, A A; Rudenko, R; Slabospitskaya, E; Solodkov, A; Gavrilenko, I; Nikitine, N; Sivoklokov, S Yu; Toms, K; Zalite, A; Zalite, Yu; Kervesan, B; Bosman, M; González, S; Sánchez, J; Salt, J; Andersson, N; Nixon, L; Eerola, Paule Anna Mari; Kónya, B; Smirnova, O G; Sandgren, A; Ekelöf, T J C; Ellert, M; Gollub, N; Hellman, S; Lipniacka, A; Corso-Radu, A; Pérez-Réale, V; Lee, S C; CLin, S C; Ren, Z L; Teng, P K; Faulkner, P J W; O'Neale, S W; Watson, A; Brochu, F; Lester, C; Thompson, S; Kennedy, J; Bouhova-Thacker, E; Henderson, R; Jones, R; Kartvelishvili, V G; Smizanska, M; Washbrook, A J; Drohan, J; Konstantinidis, N P; Moyse, E; Salih, S; Loken, J; Baines, J T M; Candlin, D; Candlin, R; Clifft, R; Li, W; McCubbin, N A; George, S; Lowe, A; Buttar, C; Dawson, I; Moraes, A; Tovey, Daniel R; Gieraltowski, J; Malon, D; May, E; LeCompte, T J; Vaniachine, A; Adams, D L; Assamagan, Ketevi A; Baker, R; Deng, W; Fine, V; Fisyak, Yu; Gibbard, B; Ma, H; Nevski, P; Paige, F; Rajagopalan, S; Smith, J; Undrus, A; Wenaus, T; Yu, D; Calafiura, P; Canon, S; Costanzo, D; Hinchliffe, Ian; Lavrijsen, W; Leggett, C; Marino, M; Quarrie, D R; Sakrejda, I; Stravopoulos, G; Tull, C; Loch, P; Youssef, S; Shank, J T; Engh, D; Frank, E; Sen-Gupta, A; Gardner, R; Meritt, F; Smirnov, Y; Huth, J; Grundhoefer, L; Luehring, F C; Goldfarb, S; Severini, H; Skubic, P L; Gao, Y; Ryan, T; De, K; Sosebee, M; McGuigan, P; Ozturk, N

    2004-01-01

    The ATLAS Collaboration at CERN is preparing for the data taking and analysis at the LHC that will start in 2007. Therefore, a series of Data Challenges was started in 2002 whose goals are the validation of the Computing Model, of the complete software suite, of the data model, and to ensure the correctness of the technical choices to be made for the final offline computing environment. A major feature of the first Data Challenge (DC1) was the preparation and the deployment of the software required for the production of large event samples as a worldwide distributed activity. It should be noted that it was not an option to "run the complete production at CERN" even if we had wanted to; the resources were not available at CERN to carry out the production on a reasonable time-scale. The great challenge of organising and carrying out this large-scale production at a significant number of sites around the world had therefore to be faced. However, the benefits of this are manifold: apart from realising the require...

  11. Quantification of progression and regression of descending thoracic aortic wall thickness by enhanced computed tomography

    International Nuclear Information System (INIS)

    Yokoyama, Kenichi; Takasu, Junichiro; Yamamoto, Rie; Taguchi, Rie; Itani, Yasutaka; Ito, Yuichi; Watanabe, Shigeru; Masuda, Yoshiaki

    2001-01-01

    The purpose of this study is to verify the usefulness of the quantification of aortic wall involvement by enhanced computed tomography (CT). One hundred thirteen Japanese patients underwent two enhanced CT examinations of the descending thoracic aorta at intervals. We sliced the descending thoracic aorta continuously from the level of the tracheal bifurcation at 1 cm intervals, and we defined aortic wall volume (AWV) (cm³) as the sum of the areas of 7 slices of aortic wall involving calcification. The average AWV increased from 7.95±2.92 cm³ to 8.70±2.98 cm³. The rate of development of AWV (ΔAWV) was 0.270±0.281 cm³/year. ΔAWV did not have a significant correlation with any risk factor at baseline. ΔAWV had significant correlations with total cholesterol, low-density lipoprotein cholesterol (LDL-C) and the LDL-C/high-density lipoprotein cholesterol (HDL-C) ratio at follow-up, and by multivariate analysis with only the LDL-C/HDL-C ratio. ΔAWV was not correlated with the intake status of hypoglycemic, antihypertensive or lipid-lowering drugs. The cut-off level of total cholesterol with the most significant odds ratio for progression of the aortic wall was 190 mg/dl, and that of LDL-C was 130 mg/dl. This method proved to be useful for the non-invasive assessment of aortic wall thickness. (author)
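
    A minimal sketch of the quantification just described: AWV is the sum of seven 1 cm slice areas of calcified wall, and ΔAWV is the change between two scans divided by the scan interval. The areas and interval below are hypothetical, not patient data from the study.

    ```python
    # Aortic wall volume (AWV) and its progression rate, per the method above.
    def aortic_wall_volume(slice_areas_cm2, slice_thickness_cm=1.0):
        """AWV (cm^3) from 7 contiguous slice areas (cm^2)."""
        assert len(slice_areas_cm2) == 7, "the method uses 7 slices"
        return sum(a * slice_thickness_cm for a in slice_areas_cm2)

    def progression_rate(awv_baseline, awv_followup, interval_years):
        """Delta-AWV in cm^3/year between two CT examinations."""
        return (awv_followup - awv_baseline) / interval_years

    baseline = aortic_wall_volume([1.1, 1.2, 1.1, 1.0, 1.2, 1.3, 1.1])  # hypothetical
    followup = aortic_wall_volume([1.2, 1.3, 1.2, 1.1, 1.3, 1.4, 1.2])  # hypothetical
    print(progression_rate(baseline, followup, interval_years=2.5))     # cm^3/year
    ```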

  12. Progress in Cell Marking for Synchrotron X-ray Computed Tomography

    Science.gov (United States)

    Hall, Christopher; Sturm, Erica; Schultke, Elisabeth; Arfelli, Fulvia; Menk, Ralf-Hendrik; Astolfo, Alberto; Juurlink, Bernhard H. J.

    2010-07-01

    Recently there has been an increase in research activity into finding ways of marking cells in live animals for pre-clinical trials. Development of certain drugs and other therapies crucially depend on tracking particular cells or cell types in living systems. Therefore cell marking techniques are required which will enable longitudinal studies, where individuals can be examined several times over the course of a therapy or study. The benefits of being able to study both disease and therapy progression in individuals, rather than cohorts are clear. The need for high contrast 3-D imaging, without harming or altering the biological system requires a non-invasive yet penetrating imaging technique. The technique will also have to provide an appropriate spatial and contrast resolution. X-ray computed tomography offers rapid acquisition of 3-D images and is set to become one of the principal imaging techniques in this area. Work by our group over the last few years has shown that marking cells with gold nano-particles (GNP) is an effective means of visualising marked cells in-vivo using x-ray CT. Here we report the latest results from these studies. Synchrotron X-ray CT images of brain lesions in rats taken using the SYRMEP facility at the Elettra synchrotron in 2009 have been compared with histological examination of the tissues. Some deductions are drawn about the visibility of the gold loaded cells in both light microscopy and x-ray imaging.

  13. Automated blood glucose control in type 1 diabetes: A review of progress and challenges.

    Science.gov (United States)

    Bertachi, Arthur; Ramkissoon, Charrise M; Bondia, Jorge; Vehí, Josep

    2018-03-01

    Since the 2000s, research teams worldwide have been working to develop closed-loop (CL) systems able to automatically control blood glucose (BG) levels in patients with type 1 diabetes. This emerging technology is known as the artificial pancreas (AP), and its first commercial version has just arrived on the market. The main objective of this paper is to present an extensive review of the clinical trials conducted since 2011, which tested various implementations of the AP for different durations under varying conditions. A comprehensive table that contains key information from the selected publications is provided, and the main challenges in AP development and the mitigation strategies used are discussed. The development timelines for different AP systems are also included, highlighting the main evolutions over the clinical trials for each system. Copyright © 2017 SEEN y SED. Publicado por Elsevier España, S.L.U. All rights reserved.
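
    To make the closed-loop idea concrete, here is a toy feedback loop adjusting insulin from glucose readings. The paper does not prescribe a controller; the PID scheme, the one-compartment glucose model, and all gains below are hypothetical illustrations (commercial AP systems typically use richer model-predictive or hybrid algorithms).

    ```python
    # Toy PID closed-loop glucose control; all dynamics and gains are invented.
    def simulate_pid_loop(g0=180.0, target=110.0, steps=120, dt=5.0):
        """Simulate `steps` CGM samples taken every `dt` minutes."""
        kp, ki, kd = 0.02, 0.0005, 0.05   # illustrative PID gains
        glucose, integral, prev_err = g0, 0.0, g0 - target
        for _ in range(steps):
            err = glucose - target
            integral += err * dt
            derivative = (err - prev_err) / dt
            insulin = max(0.0, kp * err + ki * integral + kd * derivative)
            # toy dynamics: slow endogenous rise, insulin pushes glucose down
            glucose += dt * (0.3 - 2.0 * insulin)
            prev_err = err
        return glucose

    print(round(simulate_pid_loop(), 1))  # glucose (mg/dL) after 10 simulated hours
    ```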

  14. The development of capability measures in health economics: opportunities, challenges and progress.

    Science.gov (United States)

    Coast, Joanna; Kinghorn, Philip; Mitchell, Paul

    2015-04-01

    Recent years have seen increased engagement amongst health economists with the capability approach developed by Amartya Sen and others. This paper focuses on the capability approach in relation to the evaluative space used for analysis within health economics. It considers the opportunities that the capability approach offers in extending this space, but also the methodological challenges associated with moving from the theoretical concepts to practical empirical applications. The paper then examines three 'families' of measures, Oxford Capability instruments (OxCap), Adult Social Care Outcome Toolkit (ASCOT) and ICEpop CAPability (ICECAP), in terms of the methodological choices made in each case. The paper concludes by discussing some of the broader issues involved in making use of the capability approach in health economics. It also suggests that continued exploration of the impact of different methodological choices will be important in moving forward.

  15. Technology 2.0: A Commentary on Progress, Challenges, and Next Steps.

    Science.gov (United States)

    Jones, Deborah J

    2017-11-01

    This commentary highlights the importance and promise of the innovative strategies described in the Child Maltreatment special issue on "Technology 2.0: A Focus on the Newest Technological Advances in Child Maltreatment Research." The commentary first highlights the collective advancements reflected in the articles in the special issue, with a primary focus on how the authors' work addresses a general challenge in services research that is perhaps nowhere more problematic than in the field of maltreatment. Next, the commentary extends the discussion of these articles to raise remaining gaps in our knowledge, theory, and methodology, which must be the focus of ongoing research if the true potential of technology as a service delivery vehicle is to be realized. Finally, the commentary concludes with a call for subsequent research which will be inspired by the articles in this special issue.

  16. Suncor Energy Inc. seventh annual progress report : Canada's climate change voluntary challenge and registry program

    International Nuclear Information System (INIS)

    2001-10-01

    This document details the various initiatives implemented by Suncor Energy Inc. under the Climate Change Voluntary Challenge and Registry (VCR) Program. Project Millennium, which represents a 3.25 billion dollar expansion expected to lead to increased production capacity for Oil Sands operations, was consolidated during 2000, along with the completion of restructuring, which led to the divestiture of conventional oil properties and the joint venture interest held by Suncor in the Stuart Oil Shale Project. In addition, some improvements were made to the greenhouse gas management and reporting systems. Suncor is expected to invest funding in the order of 100 million dollars for the period 2000-2005 in the field of alternative and renewable energy. The reductions in greenhouse gas emissions achieved for the year 2000 were 404,000 tonnes of carbon dioxide equivalent. Each of these major endeavours is discussed in the document. tabs

  17. Gender and leadership in healthcare administration: 21st century progress and challenges.

    Science.gov (United States)

    Lantz, Paula M

    2008-01-01

    The need for strong leadership and increased diversity is a prominent issue in today's health services workforce. This article reviews the latest literature, including research and proposed agendas, regarding women in executive healthcare leadership. Data suggest that the number of women in leadership roles is increasing, but women remain underrepresented in the top echelons of healthcare leadership, and gender differences exist in the types of leadership roles women do attain. Salary disparity prevails, even when controlling for gender differences in educational attainment, age, and experience. Despite widespread awareness of these problems in the field, current action and policy recommendations are severely lacking. Along with the challenges of cost, quality, and an aging population, the time has come for a more thoughtful, policy-focused approach to amend the discrepancy between gender and leadership in healthcare administration.

  18. Development of scan analysis techniques employing a small computer. Progress report, August 1, 1974--July 31, 1975

    International Nuclear Information System (INIS)

    Kuhl, D.E.

    1975-01-01

    Progress is reported in the development of equipment and counting techniques for transverse section scanning of the brain following the administration of radiopharmaceuticals to evaluate regional blood flow. The scanning instrument has an array of 32 scintillation detectors that surround the head and scan data are analyzed using a small computer. (U.S.)

  19. Technological progress and the energy challenges. The role of natural gas

    International Nuclear Information System (INIS)

    Rasmusen, H.J.

    1999-01-01

    Since the beginning of the industrial revolution, progress in technology development for the energy industry has been guided by economy and choice of fuel. For the last decades, 'Energy Crisis' and 'Greenhouse Effect' issues have supplemented these driving forces. 'Improved Efficiency' is not one of the strongest marketing issues when dealing with appliances for energy conversion. The trends of development today are towards smaller decentralized units and mass production. This is in contradiction to the conventional wisdom of minimizing costs by use of centralized large-scale units. The future of energy conversion for power and heat production will be dominated by small-scale units which produce heat and power simultaneously. Lower energy prices will slow down the transition to more efficient conversion technologies, but in the open and de-regulated market this will be opposed by competition between companies. To gain market share and retain customers, energy companies will have to use 'efficient appliances' as a market parameter. Use of more efficient technology always improves environmental efficiency, but conversion to natural gas from another fossil fuel will by itself lead to radical environmental improvements. (author)

  20. Third annual progress report for Canada's Climate Change Voluntary Challenge and Registry Program

    International Nuclear Information System (INIS)

    1997-08-01

    Examples of how greenhouse gas issues are being integrated into management processes within Suncor Energy Inc. were presented. Progress reports for Suncor's three operating businesses (oil sands, exploration and production, and Sunoco) are provided. Of the three business units, oil sands plants were the largest source of greenhouse gas emissions, accounting for 2/3 of the total. Carbon dioxide emissions accounted for 96 per cent of total emissions. Actual targeted volumes of production and greenhouse gas emissions for the period 1990 to 2000 were described. Suncor's production volumes were predicted to increase by 64 per cent by the year 2000. Greenhouse gas emissions are expected to rise by 12 per cent during the same period. Suncor's target for greenhouse gas emissions per unit of production represents a 32 per cent improvement over the 1990 to 2000 time period. Further performance improvements are being pursued. Additional oil sands expansion projects beyond the year 2000 are in the early stages of development, and greenhouse gas management initiatives will be integrated into these projects. Suncor Energy's Oil Sands Operations are committed to reducing projected year 2000 total carbon equivalent emissions to 1990 levels by 2001. tabs., figs

  1. Mapping cancer cell metabolism with 13C flux analysis: Recent progress and future challenges

    Directory of Open Access Journals (Sweden)

    Casey Scott Duckwall

    2013-01-01

    The reprogramming of energy metabolism is emerging as an important molecular hallmark of cancer cells. Recent discoveries linking specific metabolic alterations to cancer development have strengthened the idea that altered metabolism is more than a side effect of malignant transformation, but may in fact be a functional driver of tumor growth and progression in some cancers. As a result, dysregulated metabolic pathways have become attractive targets for cancer therapeutics. This review highlights the application of 13C metabolic flux analysis (MFA) to map the flow of carbon through intracellular biochemical pathways of cancer cells. We summarize several recent applications of MFA that have identified novel biosynthetic pathways involved in cancer cell proliferation and shed light on the role of specific oncogenes in regulating these pathways. Through such studies, it has become apparent that the metabolic phenotypes of cancer cells are not as homogeneous as once thought, but instead depend strongly on the molecular alterations and environmental factors at play in each case.
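
    As a schematic illustration of the flux-estimation step underlying MFA, the sketch below solves a three-reaction toy network for steady-state fluxes by least squares. The network and "measured" values are invented for illustration; real 13C MFA additionally fits isotope labeling patterns, which this sketch omits.

    ```python
    # Toy flux estimation: satisfy steady-state stoichiometry S @ v = 0 while
    # matching measured fluxes in a least-squares sense (hypothetical network).
    import numpy as np

    # One internal metabolite M: v1 (uptake -> M), v2 (M -> biomass), v3 (M -> secretion)
    S = np.array([[1.0, -1.0, -1.0]])   # rows: metabolites, columns: reactions

    measured_idx = [0, 2]               # suppose v1 and v3 were measured
    measured_val = [10.0, 3.2]          # hypothetical measurements

    # Stack heavily weighted stoichiometric constraints with the measurements.
    A = np.vstack([1e3 * S, np.eye(3)[measured_idx]])
    b = np.concatenate([np.zeros(S.shape[0]), measured_val])

    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(v)   # estimated fluxes; v2 ~ 6.8 follows from the mass balance
    ```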

  2. Exploring the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms: A Formative Analysis

    Directory of Open Access Journals (Sweden)

    Robin H. Kay

    2011-04-01

    Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess beneficial and challenging laptop behaviours in higher education classrooms. Both quantitative and qualitative data were collected from 177 undergraduate university students (89 males, 88 females). Key benefits observed include note-taking activities, in-class laptop-based academic tasks, collaboration, increased focus, improved organization and efficiency, and addressing special needs. Key challenges noted include other students’ distracting laptop behaviours, instant messaging, surfing the web, playing games, watching movies, and decreased focus. Nearly three-quarters of the students claimed that laptops were useful in supporting their academic experience. Twice as many benefits were reported compared to challenges. It is speculated that the integration of meaningful laptop activities is a critical determinant of the benefits and challenges experienced in higher education classrooms.

  3. Computational analyses of ancient pathogen DNA from herbarium samples: challenges and prospects.

    Science.gov (United States)

    Yoshida, Kentaro; Sasaki, Eriko; Kamoun, Sophien

    2015-01-01

    The application of DNA sequencing technology to the study of ancient DNA has enabled the reconstruction of past epidemics from genomes of historically important plant-associated microbes. Recently, the genome sequences of the potato late blight pathogen Phytophthora infestans were analyzed from 19th century herbarium specimens. These herbarium samples originated from infected potatoes collected during and after the Irish potato famine. Herbaria have therefore great potential to help elucidate past epidemics of crops, date the emergence of pathogens, and inform about past pathogen population dynamics. DNA preservation in herbarium samples was unexpectedly good, raising the possibility of a whole new research area in plant and microbial genomics. However, the recovered DNA can be extremely fragmented resulting in specific challenges in reconstructing genome sequences. Here we review some of the challenges in computational analyses of ancient DNA from herbarium samples. We also applied the recently developed linkage method to haplotype reconstruction of diploid or polyploid genomes from fragmented ancient DNA.

  4. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Energy Technology Data Exchange (ETDEWEB)

    King, W. E., E-mail: weking@llnl.gov [Physical and Life Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A. [Engineering Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Kamath, C. [Computation Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Rubenchik, A. M. [NIF and Photon Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States)

    2015-12-15

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  5. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    International Nuclear Information System (INIS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A.; Kamath, C.; Rubenchik, A. M.

    2015-01-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process

  6. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Science.gov (United States)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.
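
    As a token of the multiscale challenge described above, the sketch below solves the smallest possible modeling ingredient, one-dimensional transient heat conduction under an absorbed laser flux, with an explicit finite-difference scheme. The material constants and flux are generic illustrative values; production simulations must additionally couple melting, fluid flow, evaporation and powder-scale geometry.

    ```python
    # 1-D explicit finite-difference heat conduction under a surface laser flux;
    # a toy sketch with generic steel-like constants (illustrative only).
    import numpy as np

    k, rho, cp = 20.0, 7800.0, 500.0    # W/(m K), kg/m^3, J/(kg K)
    alpha = k / (rho * cp)              # thermal diffusivity, m^2/s
    dx, nx = 1e-6, 200                  # 1 micron cells, 200 um deep domain
    dt = 0.4 * dx**2 / alpha            # stable explicit time step
    q = 1e9                             # absorbed laser flux, W/m^2 (illustrative)

    T = np.full(nx, 300.0)              # initial temperature, K; far end stays 300 K
    steps = 2000
    for _ in range(steps):
        Tn = T.copy()
        # interior nodes: second-difference conduction update
        T[1:-1] += dt * alpha * (Tn[2:] - 2.0 * Tn[1:-1] + Tn[:-2]) / dx**2
        # surface node: conduction into the bulk plus absorbed laser energy
        T[0] += dt * (alpha * (Tn[1] - Tn[0]) / dx**2 + q / (rho * cp * dx))

    print(f"surface temperature after {steps * dt * 1e6:.0f} us: {T[0]:.0f} K")
    ```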

  7. Progress in Application of the Neurosciences to an Understanding of Human Learning: The Challenge of Finding a Middle-Ground Neuroeducational Theory

    Science.gov (United States)

    Anderson, O. Roger

    2014-01-01

    Modern neuroscientific research has substantially enhanced our understanding of the human brain. However, many challenges remain in developing a strong, brain-based theory of human learning, especially in complex environments such as educational settings. Some of the current issues and challenges in our progress toward developing comprehensive…

  8. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Wirth, Brian D., E-mail: bdwirth@utk.edu [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Nuclear Science and Engineering Directorate, Oak Ridge National Laboratory, Oak Ridge, TN (United States); Hammond, K.D. [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Krasheninnikov, S.I. [University of California, San Diego, La Jolla, CA (United States); Maroudas, D. [University of Massachusetts, Amherst, Amherst, MA 01003 (United States)

    2015-08-15

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma surface interactions are among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including such processes as helium clustering, which serve as nuclei for gas bubbles; and trap mutation, dislocation loop punching and bubble bursting; which together initiate surface morphological modification.

  9. Challenges for the computational fluid dynamics codes in the nineties. Various examples of application

    International Nuclear Information System (INIS)

    Chabard, J.P.; Viollet, P.L.

    1991-08-01

    Most of the computational fluid dynamics applications encountered at the Research and Development Division of EDF (RDD) deal with thermal exchanges. The development of numerical tools for the simulation of flows, devoted to this class of application, has been under way for 15 years. At the beginning this work was mainly concerned with a good simulation of the dynamics of the flow. Now these tools can be used to compute flows with thermal exchanges. The presentation will be limited to incompressible, single-phase flows. First, the software developed at RDD will be presented. Then some applications of these tools to flows with thermal exchanges will be discussed. To conclude, the paper will treat the general case of the CFD codes. The challenges for the coming years will be detailed in order to make these tools available for users involved in complex physical modeling.

  10. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    International Nuclear Information System (INIS)

    Wirth, Brian D.; Hammond, K.D.; Krasheninnikov, S.I.; Maroudas, D.

    2015-01-01

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and the formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma–surface interactions is among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including helium clustering, which creates the nuclei for gas bubbles, as well as trap mutation, dislocation loop punching, and bubble bursting, which together initiate surface morphological modification.

  11. Directional backlight liquid crystal autostereoscopic display: technical challenges, research progress, and prospect (Conference Presentation)

    Science.gov (United States)

    Fan, Hang; Li, Kunyang; Zhou, Yangui; Liang, Haowen; Wang, Jiahui; Zhou, Jianying

    2016-09-01

    The recent upsurge in virtual and augmented reality (VR and AR) has re-ignited interest in immersive display technology. VR/AR technology based on stereoscopic display is believed to be in its early stage, as glasses-free (autostereoscopic) display will ultimately be adopted for viewing convenience, visual comfort, and multi-viewer use. In recent years, however, autostereoscopic displays have not received a positive market response, nor have stereoscopic displays using shutter or polarized glasses. We present an analysis of real-world applications, firm user demand, and the drawbacks of existing barrier- and lenticular-lens-based LCD autostereoscopy. We emphasize emerging autostereoscopic displays, notably directional-backlight LCD technology using a hybrid spatial- and temporal-control scenario. We report the numerical simulation of a display system using a Monte Carlo ray-tracing method with the human retina as the real image receiver. The system performance is optimized using a newly developed figure of merit for system design. We highlight the reduced crosstalk of the autostereoscopic system and the enhanced display quality, including the high resolution received by the retina and the display homogeneity free of Moiré and defect patterns. Recent research progress is introduced, including a novel scheme for diffraction-free backlight illumination, an expanded viewing zone for autostereoscopic display, and a novel Fresnel lens array to achieve a near-perfect display in 2D/3D mode. We conclude with an experimental demonstration of an autostereoscopic display with the highest resolution, low crosstalk, and freedom from Moiré and defect patterns.
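
    To illustrate the Monte Carlo flavour of the simulation described above, here is a minimal crosstalk estimate for a directional backlight: rays aimed at one eye are given angular scatter, and crosstalk is the fraction collected by the other eye. The geometry, scatter width, and acceptance aperture are assumptions for illustration, not the authors' system or their figure of merit.

    ```python
    import numpy as np

    # Monte Carlo sketch of directional-backlight crosstalk. Rays intended for
    # the left-eye viewing window are sampled with Gaussian angular scatter;
    # crosstalk is the fraction landing in the right-eye window.
    rng = np.random.default_rng(0)

    VIEW_DIST = 0.5    # viewer distance from screen, m (assumption)
    EYE_SEP = 0.065    # interpupillary distance, m (assumption)
    SIGMA = 0.05       # angular scatter of backlight rays, rad (assumption)
    PUPIL = 0.012      # acceptance half-width per eye, m (assumption)

    n_rays = 100_000
    target = np.arctan(-EYE_SEP / 2 / VIEW_DIST)       # aim at the left eye
    angles = target + rng.normal(0.0, SIGMA, n_rays)   # scattered ray angles
    x_hit = VIEW_DIST * np.tan(angles)                 # landing position at viewer

    left = np.abs(x_hit + EYE_SEP / 2) < PUPIL
    right = np.abs(x_hit - EYE_SEP / 2) < PUPIL
    print(f"crosstalk ~ {right.sum() / left.sum():.3%}")
    ```

    Tightening SIGMA, i.e. making the backlight more directional, drives the crosstalk estimate toward zero, which is exactly the design lever the abstract's figure-of-merit optimization would exercise over a full retinal-image model.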

  12. Canada's climate change voluntary challenge and registry program : Suncor Energy Inc. eighth annual progress report

    International Nuclear Information System (INIS)

    2002-10-01

    A corporate profile of Suncor Energy, a Canadian integrated energy company focused on the development of the Athabasca oil sands in northern Alberta, is provided. A message from the president reiterates the company's commitment to improving both environmental and economic performance through innovative policies and strategic management plans. A sustainable approach to climate change has meant an effort toward reducing greenhouse gas emissions and improving energy use. In 2001, Suncor's greenhouse gas emission intensity was 11 per cent below 1990 levels, and total reductions of 12.9 million tonnes were achieved over the period 1990-2001. Total absolute emissions remain above 1990 levels, which is explained by tremendous production growth at Suncor Energy. Suncor has developed a seven-point plan to address climate change: manage its greenhouse gas emissions, develop renewable sources of energy, invest in environmental and economic research, use domestic and foreign offsets, collaborate with governments and other stakeholder groups on policy development, educate its employees and the public on ways to respond to the risk posed by climate change, and measure and report its progress. The document is divided into sections: section one provides an organization profile; section two discusses senior management support; section three reviews base year methodology and quantification; section four presents projections; section five covers target setting; section six deals with measures to achieve targets; section seven highlights the results achieved; section eight addresses education, training and awareness; and the final section contains the statistical summary. tabs., figs
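
    The report's central point, that emission intensity fell 11 per cent below 1990 levels while absolute emissions rose with production growth, can be checked with a few lines of arithmetic. Only the 11 per cent figure comes from the record; the baseline and growth numbers below are illustrative assumptions.

    ```python
    # Worked example: intensity can fall while absolute emissions rise
    # whenever production grows faster than intensity declines.
    base_production = 100.0   # 1990 output, arbitrary units (assumption)
    base_emissions = 100.0    # 1990 GHG emissions, arbitrary units (assumption)
    base_intensity = base_emissions / base_production

    intensity_2001 = base_intensity * (1 - 0.11)   # 11% below 1990 levels
    production_2001 = base_production * 1.5        # assumed strong growth
    emissions_2001 = intensity_2001 * production_2001

    print(f"intensity change: {intensity_2001 / base_intensity - 1:+.0%}")
    print(f"absolute change:  {emissions_2001 / base_emissions - 1:+.0%}")
    # Intensity is down 11% while absolute emissions are up by about a third,
    # matching the report's qualitative point.
    ```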

  13. Orphan drugs in development for primary biliary cirrhosis: challenges and progress

    Directory of Open Access Journals (Sweden)

    Ali AH

    2015-09-01

    Full Text Available Ahmad H Ali,1 Thomas J Byrne,1 Keith D Lindor1,2; 1Division of Gastroenterology and Hepatology, Mayo Clinic; 2College of Health Solutions, Arizona State University, Phoenix, AZ, USA. Abstract: Primary biliary cirrhosis (PBC) is a chronic progressive liver disease that often leads to fibrosis, cirrhosis, and end-stage liver disease. The diagnosis is made when there is evidence of cholestasis and reactivity to the antimitochondrial antibody. The etiology of PBC is poorly understood; however, several lines of evidence suggest an environmental factor that triggers a series of immune-mediated inflammatory reactions in the bile ducts in a genetically susceptible individual. Fatigue and pruritus are the most common symptoms of PBC; however, many patients are diagnosed with PBC based only on laboratory abnormalities. The only pharmacological treatment approved for PBC is ursodeoxycholic acid (UDCA). Several controlled studies have shown that UDCA improves liver biochemistries and prolongs transplant-free survival in PBC patients. Nearly 40% of PBC patients do not respond to UDCA, and those patients are at high risk of serious adverse events, such as the development of liver failure. Therefore, newer alternative therapeutic options for PBC are needed. Obeticholic acid is a first-in-class farnesoid X receptor agonist that has recently been evaluated in PBC patients with inadequate response to UDCA, and demonstrated beneficial results in improving liver biochemistries. Several other agents (fibrates and glucocorticoids) have previously been examined in PBC patients with inadequate response to UDCA, and preliminary results showed biochemical improvement. However, large-scale controlled clinical trials are needed to determine the long-term effects of fibrates and glucocorticoids on the clinical outcomes of PBC. Clinical trials of NGM282 (a fibroblast growth factor-19 analog) and Abatacept (a fusion protein composed of the Fc portion of immunoglobulin G1 fused to…

  14. Protein Analysis in Human Cerebrospinal Fluid: Physiological Aspects, Current Progress and Future Challenges

    Directory of Open Access Journals (Sweden)

    Andreas F. Hühmer

    2006-01-01

    Full Text Available The introduction of lumbar puncture into clinical medicine over 100 years ago marks the beginning of the study of central nervous system diseases using the human cerebrospinal fluid (CSF). Ever since, CSF has been analyzed extensively to elucidate the physiological and biochemical bases of neurological disease. The proximity of CSF to the brain makes it a good target for studying the pathophysiology of brain functions, but the barrier function of the CSF also limits its diagnostic value. Today, measurements of alterations in the composition of CSF are central in the differential diagnosis of specific diseases of the central nervous system (CNS). In particular, analysis of the CSF protein composition provides crucial information in the diagnosis of CNS diseases, enabling assessment of the physiology of the blood-CSF barrier and of the immunology of intrathecal responses. Besides those routine measurements, protein compositional studies of CSF have recently been extended to many other proteins in the expectation that comprehensive analysis of lower-abundance CSF proteins will lead to the discovery of new disease markers. Disease marker discovery by molecular profiling of CSF has the enormous potential of providing many new disease-relevant molecules. New developments in protein profiling techniques hold promise for the discovery and validation of relevant disease markers. In this review, we summarize the current efforts and progress in CSF protein profiling measurements using conventional and current protein analysis tools. We also discuss necessary developments in methodology in order to have the highest impact on the study of the molecular composition of CSF proteins.

  15. Progress and challenges of the ITER TBM Program from the IO perspective

    International Nuclear Information System (INIS)

    Giancarli, L.M.; Barabash, V.; Campbell, D.J.; Chiocchio, S.; Cordier, J.-J.; Dammann, A.; Dell’Orco, G.; Elbez-Uzan, J.; Fourneron, J.M.; Friconneau, J.P.; Gasparotto, M.; Iseli, M.; Jung, C.-Y.; Kim, B.-Y.; Lazarov, D.; Levesy, B.; Loughlin, M.; Merola, M.; Nevière, J.-C.; Pascal, R.

    2016-01-01

    The paper describes the organization of the Test Blanket Module (TBM) program, its overall objective and schedule and the status of the technical activities within the ITER Organization-Central Team (IO-CT). The latter include the design integration of the Test Blanket Systems (TBSs) into the nuclear buildings, ensuring all interfaces with other ITER systems, the design of the common TBS components such as the TBM Frames, the Dummy TBMs, and the TBS maintenance tools and equipment in the TBM Port Cell as well as in the Hot Cell building, the design of the TBS connection pipes and the definition of the required maintenance operations and associated R&D. The paper also discusses the major challenges that the TBM Program will be facing in ITER such as the potential impact of the TBMs ferritic/martensitic structures on plasma operations, the approaches to tritium and contamination confinement, the required mitigation and recovery actions in case of accidents, and the assessment of the reliability aspects that could have an impact on ITER availability.

  16. The State of Intimate Partner Violence Intervention: Progress and Continuing Challenges.

    Science.gov (United States)

    Messing, Jill Theresa; Ward-Lasher, Allison; Thaller, Jonel; Bagwell-Gray, Meredith E

    2015-10-01

    Over the past 40 years, intimate partner violence (IPV) has evolved from an emerging social problem to a socially unacceptable crime. The Violence Against Women Act of 1994 encourages state policies that focus on criminal justice intervention, including mandatory arrest and prosecution. Services offered to victim-survivors of IPV are often tied to criminal justice intervention, or otherwise encourage separation. These interventions have been seen as effectively using the authority of the state to enhance women's power relative to that of abusive men. However, these interventions do not serve the needs of women who, for cultural or personal reasons, want to remain in their relationship, or marginalized women who fear the power of the state due to institutionalized violence, heterosexism, and racism. The one-size-fits-all approach that encourages prosecution and batterer intervention programs for offenders and shelter and advocacy for victim-survivors fails to adhere to the social work value of client self-determination and the practice principle of meeting clients where they are. It is imperative that social workers in all areas of practice are aware of IPV policies, services, and laws. Social workers' challenge moving forward is to develop innovative and evidence-based interventions that serve all victim-survivors of IPV.

  17. Mobil Oil Canada annual progress report to the Voluntary Challenge and Registry Program 1998

    International Nuclear Information System (INIS)

    1998-10-01

    Mobil Oil Canada has prepared estimates of its emissions of major greenhouse gases for the years 1994, 1995, 1996 and 1997. While the Voluntary Challenge and Registry (VCR) encouraged an inventory of emissions from 1990 to the present, data are not available for years before 1994. An inventory summary outlines the quantity of emissions from the following sources for each facility: combustion of fuel gas for heaters, boilers and compressors; flaring and venting of natural gas; and consumption of coal-generated electricity. For all Mobil-operated facilities in 1997, GHG emissions broke down as follows: 43% from fuel gas use, 42% from gas flaring and venting, and 15% from electricity consumption. In 1996, emissions were more evenly distributed among the three major sources. Emissions produced and energy used by each process type are discussed, with data shown in tables, covering heavy oil facilities, light oil facilities, sulfur recovery facilities, a sour gas facility, a sweet gas facility, and a thermal heavy oil facility. Various projects were undertaken at each Mobil-operated facility to increase energy efficiency and reduce emissions. A summary of the fuel gas, electricity and emission savings for each facility is provided, along with the main actions responsible for each saving, covering light oil facilities, sulfur recovery facilities, a thermal heavy oil facility, a conventional heavy oil facility, a sweet gas facility, and a sour gas facility. 5 tabs., 9 figs

  18. PanCanadian Energy Corporation 2001 progress report : Voluntary challenge and registry Inc

    International Nuclear Information System (INIS)

    2001-10-01

    With extensive exploration and production activities stretching across Canada and reaching into the Gulf of Mexico, PanCanadian Energy Corporation is one of Canada's largest producers and marketers of crude oil, natural gas and natural gas liquids. PanCanadian is a committed supporter of the Climate Change Voluntary Challenge and Registry (VCR) program, whose aim is the reduction of greenhouse gas emissions. Through geological sequestration, improved operational efficiencies, research, public policy input, employee education, and regular reporting to external stakeholders, PanCanadian remains committed to greenhouse gas management. To date, the reductions amount to 2.5 million tonnes per year plus 103,000 net tonnes injected into the Weyburn project during 2000. The start-up of the Weyburn carbon dioxide injection project was the major focus of the efforts in 2000, along with improvements in the measurement processes used to prepare the reports. Assistance in the formulation of provincial and national strategies was provided. In section 1 of the document, a statement concerning senior management support was provided, and section 2 detailed the base year quantification. In section 3, the projection was discussed, followed by the targets in section 4. The measures to achieve targets were reviewed in section 5, and the results achieved were examined in section 6. Education, training and awareness were dealt with in section 7. 8 tabs., 3 figs

  19. Progress and challenges of the ITER TBM Program from the IO perspective

    Energy Technology Data Exchange (ETDEWEB)

    Giancarli, L.M., E-mail: luciano.giancarli@iter.org [ITER Organization, Route de Vinon-sur-Verdon, CS 90 046, 13067 St Paul Lez Durance (France); Barabash, V.; Campbell, D.J.; Chiocchio, S.; Cordier, J.-J.; Dammann, A.; Dell’Orco, G.; Elbez-Uzan, J.; Fourneron, J.M.; Friconneau, J.P. [ITER Organization, Route de Vinon-sur-Verdon, CS 90 046, 13067 St Paul Lez Durance (France); Gasparotto, M. [Max-Planck-Institut für Plasmaphysik, Wendelsteinstraße 1, 17491 Greifswald (Germany); Iseli, M.; Jung, C.-Y.; Kim, B.-Y.; Lazarov, D.; Levesy, B.; Loughlin, M.; Merola, M. [ITER Organization, Route de Vinon-sur-Verdon, CS 90 046, 13067 St Paul Lez Durance (France); Nevière, J.-C. [Comex-Nucleaire, 13115 Saint Paul Lez Durance (France); Pascal, R. [ITER Organization, Route de Vinon-sur-Verdon, CS 90 046, 13067 St Paul Lez Durance (France); and others

    2016-11-01

    The paper describes the organization of the Test Blanket Module (TBM) program, its overall objective and schedule and the status of the technical activities within the ITER Organization-Central Team (IO-CT). The latter include the design integration of the Test Blanket Systems (TBSs) into the nuclear buildings, ensuring all interfaces with other ITER systems, the design of the common TBS components such as the TBM Frames, the Dummy TBMs, and the TBS maintenance tools and equipment in the TBM Port Cell as well as in the Hot Cell building, the design of the TBS connection pipes and the definition of the required maintenance operations and associated R&D. The paper also discusses the major challenges that the TBM Program will be facing in ITER such as the potential impact of the TBMs ferritic/martensitic structures on plasma operations, the approaches to tritium and contamination confinement, the required mitigation and recovery actions in case of accidents, and the assessment of the reliability aspects that could have an impact on ITER availability.

  20. RETRIEVING SUSPECT TRANSURANIC WASTE FROM THE HANFORD BURIAL GROUNDS PROGRESS PLANS AND CHALLENGES

    International Nuclear Information System (INIS)

    FRENCH, M.S.

    2006-01-01

    This paper describes the scope and status of the program for retrieval of suspect transuranic (TRU) waste stored in the Hanford Site low-level burial grounds. Beginning in 1970 and continuing until the late 1980s, waste suspected of containing significant quantities of transuranic isotopes was placed in "retrievable" storage in designated modules in the Hanford burial grounds, with the intent that the waste would be retrieved when a national repository for disposal of such waste became operational. Approximately 15,000 cubic meters of suspect TRU waste was placed in storage modules in four burial grounds. With the availability of the national repository (the Waste Isolation Pilot Plant), retrieval of the suspect TRU waste is now underway. Retrieval efforts to date have been conducted in storage modules containing waste that is, in general, contact-handled, relatively new (1980s and later), stacked in neat, engineered configurations, and relatively well documented. Even under these optimum conditions, retrieval personnel have had to deal with a large number of structurally degraded containers, radioactive contamination issues, and industrial hazards (including organic vapors). Future retrieval efforts in older, less engineered modules are expected to present additional hazards and difficult challenges.

  1. Violence and injuries in Brazil: the effect, progress made, and challenges ahead.

    Science.gov (United States)

    Reichenheim, Michael Eduardo; de Souza, Edinilsa Ramos; Moraes, Claudia Leite; de Mello Jorge, Maria Helena Prado; da Silva, Cosme Marcelo Furtado Passos; de Souza Minayo, Maria Cecília

    2011-06-04

    Although there are signs of decline, homicides and traffic-related injuries and deaths in Brazil account for almost two-thirds of all deaths from external causes. In 2007, the homicide rate was 26·8 per 100,000 people and traffic-related mortality was 23·5 per 100,000. Domestic violence might not lead to as many deaths, but its share of violence-related morbidity is large. These are important public health problems that lead to enormous individual and collective costs. Young, black, and poor men are the main victims and perpetrators of community violence, whereas poor black women and children are the main victims of domestic violence. Regional differentials are also substantial. Besides the sociocultural determinants, much of the violence in Brazil has been associated with the misuse of alcohol and illicit drugs, and the wide availability of firearms. The high traffic-related morbidity and mortality in Brazil have been linked to the chosen model for the transport system that has given priority to roads and private-car use without offering adequate infrastructure. The system is often poorly equipped to deal with violations of traffic rules. In response to the major problems of violence and injuries, Brazil has greatly advanced in terms of legislation and action plans. The main challenge is to assess these advances to identify, extend, integrate, and continue the successful ones. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Gene Editing of Microalgae: Scientific Progress and Regulatory Challenges in Europe.

    Science.gov (United States)

    Spicer, Andrew; Molnar, Attila

    2018-03-06

    It is abundantly clear that the development of gene editing technologies represents a potentially powerful force for good with regard to human and animal health and the challenges we continue to face with a growing global population. This now includes approaches to modify microalgal strains for potential improvements in productivity, robustness, harvestability, processability, nutritional composition, and application. The rapid emergence of and ongoing developments in this area demand a timely review and revision of the current definitions and regulations around genetically modified organisms (GMOs), particularly within Europe. Current practices within the EU provide exemptions from the GMO directives for organisms, including crop plants and micro-organisms, that are produced through chemical or UV/radiation mutagenesis. However, organisms generated through gene editing, including microalgae, in which only genetic changes in native genes are made, currently remain under the GMO umbrella; as such, they are excluded from practical and commercial opportunities in the EU. Here we review the advances being made in gene editing of microalgae and the impact of regulation on commercial progress in this area, with consideration of the current regulatory framework as it relates to GMOs, including GM microalgae, in Europe.

  3. Canadian Association of Petroleum Producers voluntary challenge action plans - 1996 progress report

    International Nuclear Information System (INIS)

    1996-01-01

    The Canadian Association of Petroleum Producers (CAPP) has helped 85 of its 170 member companies to develop climate change management policies. CAPP believes that participation through a voluntary approach allows for the development of creative, cost-effective solutions without the costs that regulatory measures impose on government and industry. Industry efforts to reduce greenhouse gases have focused primarily on five areas: (1) energy efficiency, (2) methane capture and recovery, (3) acid gas injection, (4) co-generation, and (5) other actions. Petroleum industry accomplishments in 1996 are reported. In terms of future plans, CAPP member companies will continue to broaden and deepen their commitment to the voluntary challenge. Technological enhancements that increase production efficiency also have the potential to reduce greenhouse gas emissions, and for this reason CAPP will undertake an assessment of their greenhouse gas emission potential. Further, greenhouse gas (GHG) emissions from the upstream petroleum industry will likely increase because overall production is expected to increase through the year 2000. However, much of this increased production will be exported to the United States, helping that country reduce its carbon and greenhouse gas emissions. Since climate change is a global issue requiring global solutions, increasing production efficiency may be viewed as an appropriate response to the climate change issue. Statistical information regarding Canada's natural gas and crude oil production, and the impact the VCR program has had on the industry to date, is reviewed. 13 tabs., 7 figs

  4. The nature of the (visualization) game: Challenges and opportunities from computational geophysics

    Science.gov (United States)

    Kellogg, L. H.

    2016-12-01

    As the geosciences enter the era of big data, modeling and visualization become increasingly vital tools for discovery, understanding, education, and communication. Here, we focus on modeling and visualization of the structure and dynamics of the Earth's surface and interior. The past decade has seen accelerated data acquisition, including higher-resolution imaging and modeling of Earth's deep interior, complex models of geodynamics, and high-resolution topographic imaging of the changing surface, with an associated acceleration of computational modeling through better scientific software, increased computing capability, and the use of innovative methods of scientific visualization. The role of modeling is to describe a system, answer scientific questions, and test hypotheses; the term "model" encompasses mathematical models, computational models, physical models, conceptual models, statistical models, and visual models of a structure or process. These different uses of the term require thoughtful communication to avoid confusion. Scientific visualization is integral to every aspect of modeling. Not merely a means of communicating results, the best uses of visualization enable scientists to interact with their data, revealing the characteristics of the data and models to enable better interpretation and inform the direction of future investigation. Innovative immersive technologies such as virtual reality, augmented reality, and remote collaboration techniques are being adopted more widely and are a magnet for students. Time-varying or transient phenomena are especially challenging to model and to visualize; researchers and students may need to investigate the role of initial conditions in driving phenomena, while nonlinearities in the governing equations of many Earth systems make the computations and resulting visualization especially challenging. Training students how to use, design, build, and interpret scientific modeling and visualization tools prepares them

  5. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which can take conventional computers 10 to 120 hours of computation when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
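
    The scale of the computational burden quoted above is easy to reproduce with back-of-envelope arithmetic. The per-solve cost below is an assumption for illustration; actual solve times depend on the feeder size and the power-flow engine.

    ```python
    # Back-of-envelope estimate of the QSTS burden described above: a year-long
    # simulation at 1-second resolution requires one sequential power flow per
    # time step. The per-solve time is an illustrative assumption.
    SECONDS_PER_YEAR = 365 * 24 * 3600   # number of 1-second time steps
    ms_per_solve = 5.0                   # assumed cost of one power flow, ms

    n_power_flows = SECONDS_PER_YEAR
    hours = n_power_flows * ms_per_solve / 1000 / 3600
    print(f"{n_power_flows:,} sequential power flows -> ~{hours:.0f} h")
    # ~31.5 million solves; at 5 ms each that is roughly 44 h, inside the
    # report's 10-120 h range for an actual unbalanced feeder.
    ```

    Because each step's controller states depend on the previous step, the solves cannot simply be parallelized across time, which is why the report lists time dependence between time steps among the core challenges.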

  6. Transplantation Tolerance through Hematopoietic Chimerism: Progress and Challenges for Clinical Translation

    Directory of Open Access Journals (Sweden)

    Benedikt Mahr

    2017-12-01

    -term patients and progress in pre-clinical research provide encouraging evidence that deliberately inducing tolerance through hematopoietic chimerism might eventually make it from dream to reality.

  7. Decommissioning of Kozloduy NPP units 1÷4 progress and challenges

    International Nuclear Information System (INIS)

    Kazakov, Momchil

    2016-01-01

    The process of decommissioning Units 1 to 4 is under implementation according to the approved schedule, and dismantling work in the TH is expected to be completed on time, i.e., by the end of 2018. Management of dismantled materials is difficult due to the lack of licensed sites for managing materials from decommissioning activities and due to the long free-release procedures. To solve these issues, measures have been taken concerning the design and construction of sites for managing materials from decommissioning activities and the release of material from regulatory control. The preparation of the CA and auxiliary buildings for dismantling has started on schedule, as has the dismantling of potentially contaminated equipment. Management and treatment of decommissioning RAM and RAW will be assisted by putting into operation the Size Reduction and Decontamination Workshop (SRDW) and the Plasma Melting Facility (PMF), scheduled for 2017. Management of RAW from the mortuaries in the CA is another challenge for SERAW; in that regard, a feasibility study for the management of the “Mogilnik” storages of KNPP Units 1-4 is planned first, after which a management approach will be selected. Regarding dismantling in the CA, SERAW is in the process of elaborating a design for dismantling of equipment in the controlled areas of KNPP Units 1-4. Based on the selected option for dismantling, particularly of the reactor pressure vessel (RPV), reactor internals and the remaining activated components, the consultant shall justify by relevant analyses the requirement for temporary storage areas for activated equipment, in line with the best international practices.

  8. MicroRNAs as potential biomarkers in adrenocortical cancer: progress and challenges

    Directory of Open Access Journals (Sweden)

    Nadia Cherradi

    2016-01-01

    Full Text Available Adrenocortical carcinoma is a rare malignancy with poor prognosis and limited therapeutic options. Over the last decade, pan-genomic analyses of genetic and epigenetic alterations and genome-wide expression profile studies allowed major advances in the understanding of the molecular genetics of adrenocortical carcinoma. Besides the well-known dysfunctional molecular pathways in adrenocortical tumors such as the IGF2 pathway, the Wnt pathway and TP53, high-throughput technologies enabled a more comprehensive genomic characterization of adrenocortical cancer. Integration of expression profile data with exome sequencing, SNP array analysis, methylation and microRNA profiling led to the identification of subgroups of malignant tumors with distinct molecular alterations and clinical outcomes. MicroRNAs post-transcriptionally silence their target gene expression either by degrading mRNA or by inhibiting translation. Although our knowledge of the contribution of deregulated microRNAs to the pathogenesis of adrenocortical carcinoma is still in its infancy, recent studies support their relevance in gene expression alterations in these tumors. Some microRNAs have been shown to carry potential diagnostic and prognostic values while others may be good candidates for therapeutic interventions. With the emergence of disease-specific blood-borne microRNA signatures, analyses of small cohorts of patients with adrenocortical carcinoma suggest that circulating microRNAs represent promising non-invasive biomarkers of malignancy or recurrence. However, some technical challenges still remain, and most of the microRNAs reported in the literature have not yet been validated in sufficiently powered and longitudinal studies. In this review, we discuss the current knowledge regarding the deregulation of tumor-associated and circulating microRNAs in adrenocortical carcinoma patients, while emphasizing their potential significance in adrenocortical carcinoma pathogenesis.

  9. Progresses and challenges in supporting activities toward a license to operate European TBM systems in ITER

    Energy Technology Data Exchange (ETDEWEB)

    Poitevin, Y., E-mail: yves.poitevin@f4e.europa.eu [Fusion for Energy, Barcelona (Spain); Ricapito, I.; Zmitko, M. [Fusion for Energy, Barcelona (Spain); Tavassoli, F. [CEA, DEN, F-91191 Gif-sur-Yvette (France); Thomas, N. [ATMOSTAT, F-94815 Villejuif (France); De Dinechin, G. [CEA, DEN, F-91191 Gif-sur-Yvette (France); Bucci, Ph. [CEA DRT, 38000 Grenoble (France); Rey, J. [Karlsruhe Institute of Technology, Postfach 3640, Karlsruhe (Germany); Ibarra, A. [CIEMAT, Madrid (Spain); Panayotov, D. [Fusion for Energy, Barcelona (Spain); Giancarli, L. [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul-lez-Durance (France); Calderoni, P.; Galabert, J.; Vallory, J. [Fusion for Energy, Barcelona (Spain); Aiello, A. [C.R. ENEA Brasimone, I-40032 Camugnano (Italy)

    2014-10-15

    Highlights: • First EUROFER steel design limits have been introduced in RCC-MRx. • Preliminary fabrication/welding procedure specifications for the European TBM box are under standardization. • Post-irradiation examination (PIE) of beryllium pebbles irradiated to 17% of the DEMO fluence target has been completed. • Dynamic modeling of the TBM Systems with EcosimPro® has been developed. Abstract: Licensing pressurized nuclear equipment like the European Test Blanket Module (TBM) Systems and, in the longer term, breeder blankets of a fusion demonstration reactor (DEMO) will require presenting to the Regulator and the Agreed Notified Body, along with design and safety analyses, supporting data such as consolidated materials data and design limits, qualified fabrication procedure specifications, and validated modeling tools that often go beyond today's state of the art in the nuclear industry. TBM systems indeed feature a newly developed structural material and advanced fabrication processes not previously referenced in any nuclear construction code, new types of functional materials, complex structure geometries, and many interconnected sub-systems exchanging tritium by permeation or fluid mass transfer. For many years now, Europe has structured its TBM System development activities toward the preparation of licensing. The first tangible results are now arriving: the EUROFER structural material has been introduced in the RCC-MRx nuclear code, supported by a database of several thousand test records; TBM box fabrication procedure specifications are being standardized by industry in view of their qualification; and a modeling tool for accurate simulation of tritium transport in TBM systems has been developed, with a view to refining the conservative inventory data published in preliminary safety reports and optimizing waste management. Remaining challenges are identified and discussed.

  10. Progress and challenges associated with digitizing and serving up Hawaii's geothermal data

    Science.gov (United States)

    Thomas, D. M.; Lautze, N. C.; Abdullah, M.

    2012-12-01

    This presentation reports on the status of our effort to digitize and serve Hawaii's geothermal information, an undertaking that commenced in 2011 and will continue through at least 2013. This work is part of a national project funded by the Department of Energy and managed by the Arizona Geological Survey (AZGS). The data submitted to AZGS are being entered into the National Geothermal Data System (see http://www.stategeothermaldata.org/overview). We are also planning to host the information locally. The main facets of this project are to: digitize and generate metadata for non-published geothermal documents relevant to the State of Hawaii; digitize ~100 years of paper records relevant to well permitting and water resources development and serve up information on the ~4500 water wells in the state; digitize, organize, and serve up information on research and geothermal exploratory drilling conducted from the 1980s to the present; and work with AZGS and OneGeology to contribute a geologic map for Hawaii that integrates geologic and geothermal resource data. By December 2012, we anticipate that the majority of the digitization will be complete, the geologic map will be approved, and over 1000 documents will be hosted online through the University of Hawaii's library system (in the "Geothermal Collection" within the "Scholar Space" repository; see http://scholarspace.manoa.hawaii.edu/handle/10125/21320). Developing a user-friendly web interface for the water well and drilling data will be a main task in the coming year. Challenges we have faced and anticipate include: 1) ensuring that no personally identifiable information (e.g., SSN, private telephone numbers, bank or credit account) is contained in the geothermal documents and well files; 2) Homeland Security regulations regarding release of information on critical infrastructure related to municipal water supply systems; and 3) maintenance of the well database as future well data are developed with

  11. Combining Brain–Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges

    Science.gov (United States)

    Millán, J. d. R.; Rupp, R.; Müller-Putz, G. R.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kübler, A.; Leeb, R.; Neuper, C.; Müller, K.-R.; Mattia, D.

    2010-01-01

    In recent years, new research has brought the field of electroencephalogram (EEG)-based brain–computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user–machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human–computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices. PMID:20877434
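
    As a concrete, if greatly simplified, illustration of the signal processing underlying the motor-imagery BCIs this review discusses, the sketch below computes mu-band (8-12 Hz) power on synthetic EEG epochs; imagery is detected as a drop in mu power (event-related desynchronization). The sampling rate, signal model, and implicit threshold are assumptions for illustration, not any particular system from the paper.

    ```python
    import numpy as np

    # Toy motor-imagery BCI feature: mu-band (8-12 Hz) power of one EEG channel,
    # computed via FFT on synthetic 2-second epochs. Parameters are assumptions.
    rng = np.random.default_rng(1)
    fs = 250                      # sampling rate, Hz (assumption)
    t = np.arange(fs * 2) / fs    # 2-second epoch

    def mu_band_power(x):
        """Mean power spectral magnitude in the 8-12 Hz band."""
        spec = np.abs(np.fft.rfft(x)) ** 2
        freqs = np.fft.rfftfreq(len(x), 1 / fs)
        band = (freqs >= 8) & (freqs <= 12)
        return spec[band].mean()

    # Synthetic data: motor imagery suppresses the 10 Hz mu rhythm
    # (event-related desynchronization), so its mu power is lower.
    rest = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))
    imagery = 0.3 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))

    for label, x in [("rest", rest), ("imagery", imagery)]:
        print(label, "mu power:", round(mu_band_power(x), 1))
    ```

    A deployed system would calibrate subject-specific thresholds or train a classifier over many such features, which is where the adaptation algorithms and reliability measures named in the review come in.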

  12. Water Travel Time Distributions in Permafrost-affected Catchments: Challenges, Progress and Implications

    Science.gov (United States)

    Smith, A. A.; Piovano, T. I.; Tetzlaff, D.; Ala-aho, P. O. A.; Wookey, P. A.; Soulsby, C.

    2017-12-01

    Characterising the travel times of water has been a major research focus in catchment science over the past decade. Use of isotopes to quantify the temporal dynamics of the transformation of precipitation into runoff has revealed fundamental new insights into catchment flow paths and mixing processes that influence biogeochemical transport. However, permafrost-affected catchments have received little attention, despite their global importance in terms of rapid environmental change. Such places have limited access for data collection during critical periods (e.g. early phases of snowmelt), temporal and spatially variable freeze-thaw cycles, and the development of the active layer has a time variant influence on catchment hydrology. All of these characteristics make the application of traditional transit time estimation approaches challenging. This contribution describes an isotope-based study undertaken to provide a preliminary assessment of travel times at SikSik Creek in the Canadian Arctic. We adopted a model-data fusion approach to estimate the volumes and isotopic characteristics of snowpack and meltwater. Using sampling in the spring/summer we characterise the isotopic composition of summer rainfall, melt from residual snow, soil water and stream water. In addition, soil moisture dynamics and the temporal evolution of the active layer profile were also monitored. Transit times were estimated for soil and stream water compositions using lumped convolution integral models and temporally variable inputs including snowmelt, ice thaw, and summer rainfall. Comparing transit time estimates using a variety of inputs reveals transit time is best estimated using all available inflows (i.e. snowmelt, ice thaw, and rainfall). Early spring transit times are short, dominated by snowmelt and ice thaw and limited catchment storage when soils are predominantly frozen. However, significant and increasing mixing with water in the active layer during the summer results in more
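
    The lumped convolution-integral approach named above can be sketched in a few lines: the stream's tracer signal is the input signal convolved with a transit-time distribution h(τ). The gamma-distribution parameters and synthetic isotope input below are illustrative assumptions, not the SikSik Creek calibration.

    ```python
    import numpy as np
    from math import gamma as gamma_fn

    # Lumped convolution-integral sketch: c_out(t) = integral of h(tau) *
    # c_in(t - tau) dtau, with h a gamma-shaped transit-time distribution.
    dt = 1.0                            # days per step
    tau = np.arange(1, 365) * dt
    alpha, beta = 1.5, 30.0             # shape and scale, days (assumptions)
    h = tau**(alpha - 1) * np.exp(-tau / beta) / (gamma_fn(alpha) * beta**alpha)
    h /= h.sum() * dt                   # normalize to unit area

    # Synthetic input: seasonal cycle in precipitation delta-18O plus noise.
    rng = np.random.default_rng(2)
    days = np.arange(2 * 365)
    c_in = -15 + 5 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 1, len(days))

    c_out = np.convolve(c_in, h * dt)[: len(days)]   # damped, lagged stream signal
    print("input range:", round(float(np.ptp(c_in)), 1),
          "output range:", round(float(np.ptp(c_out)), 1))
    ```

    The damping of the seasonal amplitude between input and output is what transit-time studies exploit; the permafrost complication stressed in the abstract is that the inputs themselves (snowmelt, ice thaw, rainfall) and the active-layer flow paths vary through the season.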

  13. Evolution of the health sector response to HIV in Myanmar: progress, challenges and the way forward.

    Science.gov (United States)

    Oo, Htun Nyunt; Hone, San; Fujita, Masami; Maw-Naing, Amaya; Boonto, Krittayawan; Jacobs, Marjolein; Phyu, Sabe; Bollen, Phavady; Cheung, Jacquie; Aung, Htin; Aung Sang, May Thu; Myat Soe, Aye; Pendse, Razia; Murphy, Eamonn

    2016-11-28

    Critical building blocks for the response to HIV were put in place up to 2012, despite a series of political, social and financial challenges. A rapid increase in HIV service coverage was observed from 2012 to 2015 through the collaborative efforts of government and non-governmental organisations (NGOs). Government facilities, in particular, demonstrated their capacity to expand services for antiretroviral therapy (ART), prevention of mother-to-child transmission (PMTCT) of HIV, tuberculosis and HIV co-infection, and methadone-maintenance therapy (MMT). Nearly three decades into the response to HIV, Myanmar has adopted strategies to provide the right interventions to the right people in the right places to maximise impact and cost efficiency. In particular, the country is now using strategic information to classify areas into high-, medium- and low-HIV burden and risk of new infections for geographical prioritisation, as HIV remains concentrated among key population (KP) groups in specific geographical areas. Ways forward include: • addressing structural barriers to KPs accessing services, and identifying and targeting KPs at higher risk; • strengthening the network of public facilities, NGOs and general practitioners and introducing a case management approach to assist KPs and other clients with unknown HIV status, HIV-negative clients and newly diagnosed clients to access health services across the continuum, to increase the number of people testing for HIV and to reduce loss to follow-up in both prevention and treatment; • increasing the availability of HIV testing and counselling services for KPs, clients of female sex workers (FSW) and other populations at risk, and raising the demand for timely testing, including expansion of outreach and client-initiated voluntary counselling and testing (VCT) services; • monitoring and maximising retention from HIV diagnosis to ART initiation and expanding quality HIV laboratory services, especially viral load

  14. Progress and Challenges in Developing Reference Data Layers for Human Population Distribution and Built Infrastructure

    Science.gov (United States)

    Chen, R. S.; Yetman, G.; de Sherbinin, A. M.

    2015-12-01

    Understanding the interactions between environmental and human systems, and in particular supporting the applications of Earth science data and knowledge in place-based decision making, requires systematic assessment of the distribution and dynamics of human population and the built human infrastructure in conjunction with environmental variability and change. The NASA Socioeconomic Data and Applications Center (SEDAC) operated by the Center for International Earth Science Information Network (CIESIN) at Columbia University has had a long track record in developing reference data layers for human population and settlements and is expanding its efforts on topics such as intercity roads, reservoirs and dams, and energy infrastructure. SEDAC has set as a strategic priority the acquisition, development, and dissemination of data resources derived from remote sensing and socioeconomic data on urban land use change, including temporally and spatially disaggregated data on urban change and rates of change, the built infrastructure, and critical facilities. We report here on a range of past and ongoing activities, including the Global Human Settlements Layer effort led by the European Commission's Joint Research Centre (JRC), the Global Exposure Database for the Global Earthquake Model (GED4GEM) project, the Global Roads Open Access Data Working Group (gROADS) of the Committee on Data for Science and Technology (CODATA), and recent work with ImageCat, Inc. to improve estimates of the exposure and fragility of buildings, road and rail infrastructure, and other facilities with respect to selected natural hazards. New efforts such as the proposed Global Human Settlement indicators initiative of the Group on Earth Observations (GEO) could help fill critical gaps and link potential reference data layers with user needs. We highlight key sectors and themes that require further attention, and the many significant challenges that remain in developing comprehensive, high quality

  15. Fundamental challenging problems for developing new nuclear safety standard computer codes

    International Nuclear Information System (INIS)

    Wong, P.K.; Wong, A.E.; Wong, A.

    2005-01-01

    Based on the claims of US patents 5,084,232, 5,848,377 and 6,430,516 (retrievable by entering the patent numbers at http://164.195.100.11/netahtml/srchnum.htm) and the associated technical papers presented and published at international conferences over the preceding three years, all of which were sent to the US-NRC by e-mail on March 26, 2003, three fundamental challenging problems for developing new nuclear safety standard computer codes were presented at US-NRC RIC2003 Session W4 (2:15-3:15 PM, April 16, 2003, Presidential Ballroom, Capital Hilton Hotel, Washington, DC) before more than 800 nuclear professionals from many countries. The objective and scope of this paper are to invite all nuclear professionals to examine and evaluate the computer codes currently used in their own countries by comparing numerical data for these three openly challenging fundamental problems, in order to establish a global safety standard for all nuclear power plants worldwide. (authors)

  16. IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.

    Science.gov (United States)

    Chen, Ying; Elenee Argentinis, J D; Weber, Griff

    2016-04-01

    Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Best Performers Announced for the NCI-CPTAC DREAM Proteogenomics Computational Challenge | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The National Cancer Institute (NCI) Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the teams led by Jaewoo Kang (Korea University) and by Yuanfang Guan with Hongyang Li (University of Michigan) as the best performers of the NCI-CPTAC DREAM Proteogenomics Computational Challenge. Over 500 participants from 20 countries registered for the Challenge, which offered $25,000 in cash awards contributed by the NVIDIA Foundation through its Compute the Cure initiative.

  18. Tracking progress towards equitable child survival in a Nicaraguan community: neonatal mortality challenges to meet the MDG 4

    Directory of Open Access Journals (Sweden)

    Persson Lars-Åke

    2011-06-01

    Full Text Available Abstract Background: Nicaragua has made progress in the reduction of under-five mortality since the 1980s. National trend data indicate that this poor Central American country is on track to reach Millennium Development Goal 4 by 2015. Despite this progress, neonatal mortality has not shown the same improvement. The aim of this study is to analyse trends and social differentials in neonatal and under-five mortality in a Nicaraguan community from 1970 to 2005. Methods: Two linked community-based reproductive surveys in 1993 and 2002 were followed by a health and demographic surveillance system providing information on all births and child deaths in urban and rural areas of León municipality, Nicaragua. A total of 49,972 live births were registered. Results: A rapid reduction in under-five mortality was observed during the late 1970s and the 1980s (from 103 deaths/1000 live births), followed by a gradual decline to 23 deaths/1000 live births in 2005. This community is on track for Millennium Development Goal 4 for improved child survival. However, neonatal mortality has increased lately in spite of good coverage of skilled assistance at delivery. After some years in the 1990s with a very small gap in neonatal survival between children of mothers of different educational levels, this divide is increasing. Conclusions: After the reduction of high under-five mortality, which coincided with improved equity in survival in this Nicaraguan community, the current challenge is neonatal mortality, where questions of equitable, good-quality perinatal care must be addressed.

  19. Tackling some of the most intricate geophysical challenges via high-performance computing

    Science.gov (United States)

    Khosronejad, A.

    2016-12-01

    Recently, the world has been witnessing significant enhancements in the computing power of supercomputers. Computer clusters, in conjunction with advanced mathematical algorithms, have set the stage for developing and applying powerful numerical tools to tackle some of the most intricate geophysical challenges that today's engineers face. One such challenge is to understand how turbulent flows, in real-world settings, interact with (a) rigid and/or mobile complex bed bathymetry of waterways and sea-beds in coastal areas; (b) objects with complex geometry that are fully or partially immersed; and (c) the free surface of waterways and water-surface waves in the coastal area. This understanding is especially important because turbulent flows in real-world environments are often bounded by geometrically complex boundaries, which dynamically deform and give rise to multi-scale and multi-physics transport phenomena, and are characterized by multi-lateral interactions among various phases (e.g. air/water/sediment). Herein, I present some of the multi-scale and multi-physics geophysical fluid mechanics processes that I have attempted to study using an in-house high-performance computational model, the so-called VFS-Geophysics. More specifically, I present simulation results of turbulence/sediment/solute/turbine interactions in real-world settings. Some of the simulations are performed to gain scientific insight into processes such as sand wave formation (A. Khosronejad and F. Sotiropoulos (2014), Numerical simulation of sand waves in a turbulent open channel flow, Journal of Fluid Mechanics, 753:150-216), while others are carried out to predict the effects of climate change and large flood events on societal infrastructure (A. Khosronejad et al. (2016), Large eddy simulation of turbulence and solute transport in a forested headwater stream, Journal of Geophysical Research, doi: 10.1002/2014JF003423).

  20. Experimental and computational studies of poly-L-lactic acid for cardiovascular applications: recent progress

    Science.gov (United States)

    Naseem, Raasti; Zhao, Liguo; Liu, Yang; Silberschmidt, Vadim V.

    2017-12-01

    Stents are commonly used in medical procedures to alleviate the symptoms of coronary heart disease, a prevalent disease of modern society. These structures are employed to maintain vessel patency and restore blood flow. Traditionally, stents are made of metals such as stainless steel or cobalt chromium; however, these scaffolds have known disadvantages. Transient scaffolds are gaining popularity: the structure is engaged for a required period whilst healing of the diseased arterial wall occurs. Polymers dominate the medical-device sector, incorporated in sutures, scaffolds and screws thanks to their good mechanical and biological properties and their ability to degrade naturally. Polylactic acid is an extremely versatile polymer, with properties easily tailored to applications. Its dominance in the stenting field increases continually, with the first polymer scaffold gaining FDA approval in 2016. Still, some challenges with PLLA bioresorbable materials remain, especially with regard to understanding their mechanical response, assessing how that response changes with degradation, and comparing their performance with that of metallic drug-eluting stents. Currently, there is a lack of work evaluating both the pre-degradation properties and the degradation performance of these scaffolds. Additionally, there are no established material models incorporating the non-linear viscoelastic behaviour of PLLA and its evolution with in-service degradation. Assessing these features through experimental analysis accompanied by analytical and numerical studies will provide powerful tools for the design and optimisation of these structures, endorsing their broader use in stenting. This overview assesses recent studies investigating the mechanical and computational performance of poly(l-lactic) acid and its use in stenting applications.
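
    As a starting point for the viscoelastic material models the overview calls for, here is a linear (Prony-series) relaxation modulus of the kind commonly fitted to polymers. The coefficients are illustrative assumptions, not measured PLLA data, and a full scaffold model would add the non-linearity and degradation dependence discussed above.

    ```python
    import numpy as np

    # Prony-series relaxation modulus (generalized Maxwell model):
    #   E(t) = E_inf + sum_i E_i * exp(-t / tau_i)
    # Moduli and relaxation times below are illustrative assumptions.
    E_inf = 1.0e9                      # long-term modulus, Pa (assumption)
    E_i = np.array([1.5e9, 0.8e9])     # Prony moduli, Pa (assumptions)
    tau_i = np.array([10.0, 1000.0])   # relaxation times, s (assumptions)

    def relaxation_modulus(t):
        """Evaluate E(t) for scalar or array t (seconds)."""
        t = np.asarray(t, dtype=float)[..., None]
        return E_inf + (E_i * np.exp(-t / tau_i)).sum(axis=-1)

    for t in (0.0, 10.0, 1e4):
        print(f"E({t:g} s) = {relaxation_modulus(t) / 1e9:.2f} GPa")
    ```

    The instantaneous modulus (all branches active) relaxes toward E_inf over the branch time scales; degradation could then be represented, for example, by letting the coefficients evolve with hydrolysis time, which is precisely the kind of model the overview notes is not yet established for PLLA.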

  1. Multiscale Mechanics of Articular Cartilage: Potentials and Challenges of Coupling Musculoskeletal, Joint, and Microscale Computational Models

    Science.gov (United States)

    Halloran, J. P.; Sibole, S.; van Donkelaar, C. C.; van Turnhout, M. C.; Oomens, C. W. J.; Weiss, J. A.; Guilak, F.; Erdemir, A.

    2012-01-01

    Articular cartilage experiences significant mechanical loads during daily activities. Healthy cartilage provides the capacity for load bearing and regulates the mechanobiological processes for tissue development, maintenance, and repair. Experimental studies at multiple scales have provided a fundamental understanding of macroscopic mechanical function, evaluation of the micromechanical environment of chondrocytes, and the foundations for mechanobiological response. In addition, computational models of cartilage have offered a concise description of experimental data at many spatial levels under healthy and diseased conditions, and have served to generate hypotheses about mechanical and biological function. Further, modeling and simulation provide a platform for predictive risk assessment, management of dysfunction, as well as a means to relate multiple spatial scales. Simulation-based investigation of cartilage comes with many challenges, including the computational burden and the often insufficient availability of data for model development and validation. This review outlines recent modeling and simulation approaches to understand cartilage function from a mechanical systems perspective, and illustrates pathways to associate mechanics with biological function. Computational representations at single scales are provided from the body down to the microstructure, along with attempts to explore multiscale mechanisms of load sharing that dictate the mechanical environment of the cartilage and chondrocytes. PMID:22648577

  2. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    Science.gov (United States)

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-01-01

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients’ electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While working to promote the application of information technology in telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology. PMID:24232290

  3. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    Directory of Open Access Journals (Sweden)

    Chung-Chi Yang

    2013-11-01

    Full Text Available Many studies have indicated that computing technology can enable off-site cardiologists to read patients’ electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While working to promote the application of information technology in telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology.

  4. Mobile, cloud, and big data computing: contributions, challenges, and new directions in telecardiology.

    Science.gov (United States)

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-11-13

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients' electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While working to promote the application of information technology in telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology.

  5. Research in progress at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1987-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1987 through October 1, 1987.

  6. Review of The SIAM 100-Digit Challenge: A Study in High-Accuracy Numerical Computing

    International Nuclear Information System (INIS)

    Bailey, David

    2005-01-01

    In the January 2002 edition of SIAM News, Nick Trefethen announced the '$100, 100-Digit Challenge'. In this note he presented ten easy-to-state but hard-to-solve problems of numerical analysis, and challenged readers to find each answer to ten-digit accuracy. Trefethen closed with the enticing comment: 'Hint: They're hard. If anyone gets 50 digits in total, I will be impressed.' This challenge obviously struck a chord in hundreds of numerical mathematicians worldwide, as 94 teams from 25 nations later submitted entries. Many of these submissions exceeded the target of 50 correct digits; in fact, 20 teams achieved a perfect score of 100 correct digits. Trefethen had offered $100 for the best submission. Given the overwhelming response, a generous donor (William Browning, founder of Applied Mathematics, Inc.) provided additional funds for a $100 award to each of the 20 winning teams. Soon after the results were out, four participants, each from a winning team, got together and agreed to write a book about the problems and their solutions. The team is truly international: Bornemann is from Germany, Laurie is from South Africa, Wagon is from the USA, and Waldvogel is from Switzerland. This book provides some mathematical background for each problem, and then shows in detail how each of them can be solved. In fact, multiple solution techniques are mentioned in each case. The book describes how to extend these solutions to much larger problems and much higher numeric precision (hundreds or thousands of digits of accuracy). The authors also show how to compute error bounds for the results, so that one can say with confidence that one's results are accurate to the level stated. Numerous numerical software tools are demonstrated in the process, including the commercial products Mathematica, Maple and Matlab. Computer programs that perform many of the algorithms mentioned in the book are provided, both in an appendix to the book and on a website. In the process, the
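
    To give the flavour of the computing the book demonstrates, here is a small high-precision quadrature example using the mpmath Python library (an illustrative integral with a closed form, not one of Trefethen's ten problems):

```python
from mpmath import mp, quad, sin, cos, exp

mp.dps = 60  # carry 60 decimal digits throughout

# Numerically integrate f on [0, 1] and check against the closed form.
f = lambda x: exp(-x) * cos(10 * x)
numeric = quad(f, [0, 1])
exact = (1 + exp(-1) * (10 * sin(10) - cos(10))) / 101

print(numeric)               # ~60 correct digits
print(abs(numeric - exact))  # error near 10^-60
```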

  7. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC year 1 quarter 4 progress report.

    Energy Technology Data Exchange (ETDEWEB)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C. (Energy Systems)

    2011-12-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability.

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  9. Algorithms for limited-view computed tomography: an annotated bibliography and a challenge

    International Nuclear Information System (INIS)

    Rangayyan, R.; Dhawan, A.P.; Gordon, R.

    1985-01-01

    In many applications of computed tomography, it may not be possible to acquire projection data at all angles, as required by the most commonly used algorithm of convolution backprojection. In such a limited-data situation, we face an ill-posed problem in attempting to reconstruct an image from an incomplete set of projections. Many techniques have been proposed to tackle this situation, employing diverse theories such as signal recovery, image restoration, constrained deconvolution, and constrained optimization, as well as novel schemes such as iterative object-dependent algorithms incorporating a priori knowledge and use of multispectral radiation. The authors present an overview of such techniques and offer a challenge to all readers to reconstruct images from a set of limited-view data provided here
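
    Many of the iterative, constraint-driven reconstruction methods this bibliography surveys build on row-action updates such as the Kaczmarz/ART step. Here is a toy sketch on a random underdetermined system standing in for a real projection matrix and sinogram (real CT would add constraints such as non-negativity and support, as the survey describes):

```python
import numpy as np

# Kaczmarz/ART for an underdetermined system A @ x = b: fewer measurements
# than unknowns, i.e. the limited-view situation. A and b are synthetic.
rng = np.random.default_rng(0)
n, m = 64, 40                       # unknowns (pixels), measurements (rays)
x_true = rng.random(n)
A = rng.random((m, n))
b = A @ x_true

x = np.zeros(n)                     # start from a blank image
for sweep in range(200):            # cycle through the rows (projections)
    for i in range(m):
        a = A[i]
        x += (b[i] - a @ x) / (a @ a) * a   # project onto the i-th hyperplane

print(f"projection residual after ART sweeps: {np.linalg.norm(A @ x - b):.2e}")
```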

  10. Copy-Right for Software and Computer Games: Strategies and Challenges

    Directory of Open Access Journals (Sweden)

    Hojatollah Ayoubi

    2009-11-01

    Full Text Available Copyright was initially used in the cultural and art industries. Since then there have been two different approaches to the matter: the commercial-economic approach, which is concerned with the rights of suppliers and investors; and the cultural approach, which is especially concerned with the rights of the author. The first approach is rooted in Anglo-American countries, while the second is originally French. The expansion of the computer market, and the separation of the software and hardware markets, led to the so-called 'velvet robbery', that is, illegal reproduction in the market. Consequently, there have been efforts all over the world to protect the rights of producers. Besides the domestic and international difficulties these strategies encounter, this article reviews the different strategies for facing this challenge.

  11. Applied & Computational Mathematics Challenges for the Design and Control of Dynamic Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; Burns, J A; Collis, S; Grosh, J; Jacobson, C A; Johansen, H; Mezic, I; Narayanan, S; Wetter, M

    2011-03-10

    The Energy Independence and Security Act of 2007 (EISA) was passed with the goal 'to move the United States toward greater energy independence and security.' Energy security and independence cannot be achieved unless the United States addresses the issue of energy consumption in the building sector and significantly reduces energy consumption in buildings. Commercial and residential buildings account for approximately 40% of U.S. energy consumption and emit 50% of CO2 emissions in the U.S., which is more than twice the total energy consumption of the entire U.S. automobile and light truck fleet. A 50%-80% improvement in building energy efficiency in both new construction and in retrofitting existing buildings could significantly reduce U.S. energy consumption and mitigate climate change. Reaching these aggressive building efficiency goals will not happen without significant Federal investments in areas of computational and mathematical sciences. Applied and computational mathematics are required to enable the development of algorithms and tools to design, control and optimize energy efficient buildings. The challenge has been issued by the U.S. Secretary of Energy, Dr. Steven Chu (emphasis added): 'We need to do more transformational research at DOE including computer design tools for commercial and residential buildings that enable reductions in energy consumption of up to 80 percent with investments that will pay for themselves in less than 10 years.' On July 8-9, 2010 a team of technical experts from industry, government and academia were assembled in Arlington, Virginia to identify the challenges associated with developing and deploying new computational methodologies and tools that will address building energy efficiency. These experts concluded that investments in fundamental applied and computational mathematics will be required to build enabling technology that can be used to realize the target of 80% reductions in energy
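
    A tiny example of the kind of building model such design and control tools rest on is a one-zone resistor-capacitor (RC) energy balance with a thermostat; all parameter values below are illustrative assumptions, not calibrated building data:

```python
import numpy as np

# One-zone RC thermal model: the zone exchanges heat with outdoors through a
# single thermal resistance and stores heat in a single capacitance.
R = 0.005      # thermal resistance to outdoors (K/W)
C = 2.0e6      # zone thermal capacitance (J/K)
dt = 60.0      # time step (s)
T_set = 21.0   # thermostat set point (C)

T = 15.0                                    # initial indoor temperature (C)
for step in range(24 * 60):                 # simulate one day, minute by minute
    hour = (step * dt / 3600.0) % 24
    T_out = 5.0 + 10.0 * np.sin(2 * np.pi * (hour - 9) / 24)  # outdoor swing
    heater = 5000.0 if T < T_set else 0.0   # bang-bang thermostat control (W)
    dT = ((T_out - T) / R + heater) / C     # energy balance on the zone
    T += dt * dT

print(f"indoor temperature after 24 h: {T:.1f} C")
```

    Replacing the bang-bang controller with an optimizer over heater schedules is the simplest instance of the design-and-control mathematics the workshop called for.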

  12. Addressing current challenges in cancer immunotherapy with mathematical and computational modelling.

    Science.gov (United States)

    Konstorum, Anna; Vella, Anthony T; Adler, Adam J; Laubenbacher, Reinhard C

    2017-06-01

    The goal of cancer immunotherapy is to boost a patient's immune response to a tumour. Yet, the design of an effective immunotherapy is complicated by various factors, including a potentially immunosuppressive tumour microenvironment, immune-modulating effects of conventional treatments and therapy-related toxicities. These complexities can be incorporated into mathematical and computational models of cancer immunotherapy that can then be used to aid in rational therapy design. In this review, we survey modelling approaches under the umbrella of the major challenges facing immunotherapy development, which encompass tumour classification, optimal treatment scheduling and combination therapy design. Although overlapping, each challenge has presented unique opportunities for modellers to make contributions using analytical and numerical analysis of model outcomes, as well as optimization algorithms. We discuss several examples of models that have grown in complexity as more biological information has become available, showcasing how model development is a dynamic process interlinked with the rapid advances in tumour-immune biology. We conclude the review with recommendations for modellers both with respect to methodology and biological direction that might help keep modellers at the forefront of cancer immunotherapy development. © 2017 The Author(s).
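
    A minimal example of the kind of model the review surveys is a two-compartment tumour-effector ODE system; the sketch below uses generic, made-up parameters rather than any specific model from the review:

```python
from scipy.integrate import solve_ivp

# Generic tumour-effector model: logistic tumour growth, mass-action immune
# kill, saturating immune stimulation, constant recruitment, natural decay.
def tumour_immune(t, y, a=0.5, b=1e-9, c=1e-7, d=0.1, s=1e4):
    T, E = y                                   # tumour cells, effector cells
    dT = a * T * (1 - b * T) - c * E * T
    dE = s + 0.01 * T * E / (1e5 + T) - d * E
    return [dT, dE]

sol = solve_ivp(tumour_immune, (0, 200), [1e6, 1e5], rtol=1e-6)
T_end, E_end = sol.y[:, -1]
print(f"after 200 days: tumour {T_end:.3e} cells, effectors {E_end:.3e}")
```

    An immunotherapy term (e.g. a scheduled effector infusion) would enter as a time-dependent source in dE, which is where the scheduling and combination questions discussed in the review become optimization problems.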

  13. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    Science.gov (United States)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  14. Research in progress and other activities of the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1993-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  15. Challenges in computational fluid dynamics simulation for the nineties. Various examples of application

    International Nuclear Information System (INIS)

    Chabard, J.P.; Viollet, P.L.

    1991-01-01

    Most of the computational fluid dynamics applications encountered at the Research Branch of EDF (DER) deal with thermal exchanges. The development of numerical tools for the simulation of flows, devoted to this class of application, has been under way for 15 years. At the beginning, this work was mainly concerned with accurate simulation of the dynamics of the flow. Now these tools can be used to compute flows with thermal exchanges. The presentation is limited to incompressible, single-phase flows (the DER developments on two-phase flows are discussed in the paper by MM. Hery, Boivin and Viollet in the present magazine). First, the software packages developed at DER are presented. Then some applications of these tools to flows with thermal exchanges are discussed. To conclude, the paper treats the general case of CFD codes, and the challenges for the coming years are detailed in order to make these tools available for users involved in complex physical modeling. [fr]

  16. Taking Action on Air Pollution Control in the Beijing-Tianjin-Hebei (BTH) Region: Progress, Challenges and Opportunities

    Science.gov (United States)

    Wang, Li; Zhang, Fengying; Pilot, Eva; Yu, Jie; Holdaway, Jennifer; Yang, Linsheng; Li, Yonghua; Wang, Wuyi; Vardoulakis, Sotiris; Krafft, Thomas

    2018-01-01

    Due to rapid urbanization, industrialization and motorization, a large number of Chinese cities are affected by heavy air pollution. In order to explore progress, remaining challenges, and sustainability of air pollution control in the Beijing-Tianjin-Hebei (BTH) region after 2013, a mixed method analysis was undertaken. The quantitative analysis comprised an overview of air quality management in the BTH region. Semi-structured expert interviews were conducted with 12 stakeholders from various levels of government and research institutions who played substantial roles either in decision-making or in research and advising on air pollution control in the BTH region. The results indicated that with the stringent air pollution control policies, the air quality in BTH meets the targets of the Air Pollution Prevention and Control Action Plan. However, improvements vary across the region and for different pollutants. Although implementation has been decisive and was at least in parts effectively enforced, significant challenges remained with regard to industrial and traffic emission control, national air quality limits continued to be significantly exceeded, and competing development interests remained largely unresolved. There were also concerns about the sustainability of the current air pollution control measures, especially for industries, due to the top-down enforcement and the associated large burden of social cost, including unemployment and social inequity resulting from industrial restructuring. Better mechanisms for ensuring cross-sectoral coordination and for improved central-local government communication were suggested. Further suggestions were provided to improve the conceptual design and effective implementation of the respective air pollution control strategies in BTH. Our study highlights some of the major hurdles that need to be addressed to succeed with comprehensive air pollution control management for the Chinese mega-urban agglomerations. PMID:29425189

  17. Taking Action on Air Pollution Control in the Beijing-Tianjin-Hebei (BTH) Region: Progress, Challenges and Opportunities.

    Science.gov (United States)

    Wang, Li; Zhang, Fengying; Pilot, Eva; Yu, Jie; Nie, Chengjing; Holdaway, Jennifer; Yang, Linsheng; Li, Yonghua; Wang, Wuyi; Vardoulakis, Sotiris; Krafft, Thomas

    2018-02-09

    Due to rapid urbanization, industrialization and motorization, a large number of Chinese cities are affected by heavy air pollution. In order to explore progress, remaining challenges, and sustainability of air pollution control in the Beijing-Tianjin-Hebei (BTH) region after 2013, a mixed method analysis was undertaken. The quantitative analysis comprised an overview of air quality management in the BTH region. Semi-structured expert interviews were conducted with 12 stakeholders from various levels of government and research institutions who played substantial roles either in decision-making or in research and advising on air pollution control in the BTH region. The results indicated that with the stringent air pollution control policies, the air quality in BTH meets the targets of the Air Pollution Prevention and Control Action Plan. However, improvements vary across the region and for different pollutants. Although implementation has been decisive and was at least in parts effectively enforced, significant challenges remained with regard to industrial and traffic emission control, national air quality limits continued to be significantly exceeded, and competing development interests remained largely unresolved. There were also concerns about the sustainability of the current air pollution control measures, especially for industries, due to the top-down enforcement and the associated large burden of social cost, including unemployment and social inequity resulting from industrial restructuring. Better mechanisms for ensuring cross-sectoral coordination and for improved central-local government communication were suggested. Further suggestions were provided to improve the conceptual design and effective implementation of the respective air pollution control strategies in BTH. Our study highlights some of the major hurdles that need to be addressed to succeed with comprehensive air pollution control management for the Chinese mega-urban agglomerations.

  18. Taking Action on Air Pollution Control in the Beijing-Tianjin-Hebei (BTH) Region: Progress, Challenges and Opportunities

    Directory of Open Access Journals (Sweden)

    Li Wang

    2018-02-01

    Full Text Available Due to rapid urbanization, industrialization and motorization, a large number of Chinese cities are affected by heavy air pollution. In order to explore progress, remaining challenges, and sustainability of air pollution control in the Beijing-Tianjin-Hebei (BTH) region after 2013, a mixed method analysis was undertaken. The quantitative analysis comprised an overview of air quality management in the BTH region. Semi-structured expert interviews were conducted with 12 stakeholders from various levels of government and research institutions who played substantial roles either in decision-making or in research and advising on air pollution control in the BTH region. The results indicated that with the stringent air pollution control policies, the air quality in BTH meets the targets of the Air Pollution Prevention and Control Action Plan. However, improvements vary across the region and for different pollutants. Although implementation has been decisive and was at least in parts effectively enforced, significant challenges remained with regard to industrial and traffic emission control, national air quality limits continued to be significantly exceeded, and competing development interests remained largely unresolved. There were also concerns about the sustainability of the current air pollution control measures, especially for industries, due to the top-down enforcement and the associated large burden of social cost, including unemployment and social inequity resulting from industrial restructuring. Better mechanisms for ensuring cross-sectoral coordination and for improved central-local government communication were suggested. Further suggestions were provided to improve the conceptual design and effective implementation of the respective air pollution control strategies in BTH. Our study highlights some of the major hurdles that need to be addressed to succeed with comprehensive air pollution control management for the Chinese mega-urban agglomerations.

  19. Assessing the Global Development Agenda (Goal 1) in Uganda: The Progress Made and the Challenges that Persist

    Directory of Open Access Journals (Sweden)

    E. A. Ndaguba

    2016-12-01

    Full Text Available The international development agenda (2000-2015) that was hailed in Uganda proved unsuccessful in elevating individuals and groups to a place of comfort through the achievement of the MDGs. Hence, according to a survey of the Directorate of Social Protection in 2012, 67% of the citizens of Uganda are either poor or highly vulnerable to remaining in poverty. This study therefore assesses the gains of the global development agenda (2000-2015) in Uganda. The study relies heavily on review papers, secondary datasets and materials, and a quasi-quantitative method in analyzing the research aim. Results show that the ambiguous and unrealistic targets of the MDGs did not take into account the structures, institutions, interacting systems, and governance issues in Uganda. The gains were further undermined by drought, floods, and high commodity prices caused by low farm production in most (rural) areas of Uganda. In addition to drought and the negative effects of climate change, other challenges, such as deficient access to markets and marketplaces, the lack of motorized and non-motorized load-carrying wheeled vehicles, the lack of capacity and infrastructure, the lack of mechanized farming implements, and the lack of access to credit, reduced the potency of the achievement of most of the goals. Nevertheless, significant strides were made and the country achieved several targets, which are worth celebrating. The study contends that the realization of the SDGs will remain wishful thinking if the challenges of rural poverty, governance, and institutions are not addressed. Shared progress and prosperity, as acclaimed by the World Bank, will otherwise never be visible in Uganda.

  20. The Security Challenges in the IoT Enabled Cyber-Physical Systems and Opportunities for Evolutionary Computing & Other Computational Intelligence

    OpenAIRE

    He, H.; Maple, C.; Watson, T.; Tiwari, A.; Mehnen, J.; Jin, Y.; Gabrys, Bogdan

    2016-01-01

    Internet of Things (IoT) has given rise to the fourth industrial revolution (Industrie 4.0), and it brings great benefits by connecting people, processes and data. However, cybersecurity has become a critical challenge in IoT enabled cyber-physical systems, from connected supply chains and the Big Data produced by huge numbers of IoT devices to industry control systems. Evolutionary computation, combined with other computational intelligence techniques, will play an important role in cybersecurity, such as ...

  1. Computational algebraic geometry for statistical modeling FY09Q2 progress.

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, David C.; Rojas, Joseph Maurice; Pebay, Philippe Pierre

    2009-03-01

    This is a progress report on polynomial system solving for statistical modeling. This quarter we have developed our first model of shock response data and an algorithm for identifying the chamber cone containing a polynomial system in n variables with n+k terms within polynomial time - a significant improvement over previous algorithms, all having exponential worst-case complexity. We have implemented and verified the chamber cone algorithm for n+3 and are working to extend the implementation to handle arbitrary k. Later sections of this report explain chamber cones in more detail; the next section provides an overview of the project and how the current progress fits into it.
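
    For readers unfamiliar with the underlying machinery, here is a tiny, generic illustration of exact polynomial system solving (unrelated to the report's shock-response model or its chamber-cone algorithm): a Groebner basis triangularizes a small system so it can be solved exactly.

```python
import sympy as sp

# A generic 2-variable polynomial system, solved exactly.
x, y = sp.symbols("x y")
system = [x**2 + y**2 - 4, x * y - 1]

# The lex-order Groebner basis eliminates variables, enabling back-substitution.
gb = sp.groebner(system, x, y, order="lex")
print("Groebner basis:", list(gb))

solutions = sp.solve(system, [x, y], dict=True)
print(f"{len(solutions)} exact solutions; one of them: {solutions[0]}")
```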

  2. Progress and Challenges of Protecting North American Ash Trees from the Emerald Ash Borer Using Biological Control

    Directory of Open Access Journals (Sweden)

    Jian J. Duan

    2018-03-01

    Full Text Available After emerald ash borer (EAB), Agrilus planipennis Fairmaire, was discovered in the United States, a classical biological control program was initiated against this destructive pest of ash trees (Fraxinus spp.). This biocontrol program began in 2007 after federal regulatory agencies and the state of Michigan approved release of three EAB parasitoid species from China: Tetrastichus planipennisi Yang (Eulophidae), Spathius agrili Yang (Braconidae), and Oobius agrili Zhang and Huang (Encyrtidae). A fourth EAB parasitoid, Spathius galinae Belokobylskij (Braconidae) from Russia, was approved for release in 2015. We review the rationale and ecological premises of the EAB biocontrol program, and then report on progress in North American ash recovery in southern Michigan, where the parasitoids were first released. We also identify challenges to conserving native Fraxinus using biocontrol in the aftermath of the EAB invasion, and provide suggestions for program improvements as EAB spreads throughout North America. We conclude that more work is needed to: (1) evaluate the establishment and impact of biocontrol agents in different climate zones; (2) determine the combined effect of EAB biocontrol and host plant resistance or tolerance on the regeneration of North American ash species; and (3) expand foreign exploration for EAB natural enemies throughout Asia.

  3. Clay-Inspired MXene-Based Electrochemical Devices and Photo-Electrocatalyst: State-of-the-Art Progresses and Challenges.

    Science.gov (United States)

    Wang, Hou; Wu, Yan; Yuan, Xingzhong; Zeng, Guangming; Zhou, Jin; Wang, Xin; Chew, Jia Wei

    2018-03-01

    MXene, an important and increasingly popular category of postgraphene 2D nanomaterials, has been rigorously investigated since early 2011 because of advantages including flexible tunability in element composition, hydrophobicity, metallic nature, unique in-plane anisotropic structure, high charge-carrier mobility, tunable band gap, and favorable optical and mechanical properties. To fully exploit this potential and further expand beyond the existing boundaries, novel functional nanostructures spanning monolayers, multilayers, nanoparticles, and composites have been developed by means of intercalation, delamination, functionalization, and hybridization, among others. Undeniably, the cutting-edge developments and applications of the clay-inspired 2D MXene platform as an electrochemical electrode or photo-electrocatalyst have conferred superior performance and have made a significant impact in the field of energy and advanced catalysis. This review provides an overview of the fundamental properties and synthesis routes of pure MXene, functionalized MXene and their hybrids, highlights the state-of-the-art progress of MXene-based applications with respect to supercapacitors, batteries, electrocatalysis and photocatalysis, and presents the challenges and prospects in this burgeoning field. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Qualitative Assessment of Academic Radiation Oncology Department Chairs' Insights on Diversity, Equity, and Inclusion: Progress, Challenges, and Future Aspirations.

    Science.gov (United States)

    Jones, Rochelle D; Chapman, Christina H; Holliday, Emma B; Lalani, Nafisha; Wilson, Emily; Bonner, James A; Movsas, Benjamin; Kalnicki, Shalom; Formenti, Silvia C; Thomas, Charles R; Hahn, Stephen M; Liu, Fei-Fei; Jagsi, Reshma

    2018-05-01

    A lack of diversity has been observed in radiation oncology (RO), with women and certain racial/ethnic groups underrepresented as trainees, faculty, and practicing physicians. We sought to gain a nuanced understanding of how to best promote diversity, equity, and inclusion (DEI) based on the insights of RO department chairs, with particular attention given to the experiences of the few women and underrepresented minorities (URMs) in these influential positions. From March to June 2016, we conducted telephone interviews with 24 RO department chairs (of 27 invited). Purposive sampling was used to invite all chairs who were women (n = 13) or URMs (n = 3) and 11 male chairs who were not URMs. Multiple analysts coded the verbatim transcripts. Five themes were identified: (1) commitment to DEI promotes quality health care and innovation; (2) gaps remain despite some progress with promoting diversity in RO; (3) women and URM faculty continue to experience challenges in various career domains; (4) solutions to DEI issues would be facilitated by acknowledging realities of gender and race; and (5) expansion of the career pipeline is needed. The chairs' insights had policy-relevant implications. Bias training should broach tokenism, blindness, and intersectionality. Efforts to recruit and support diverse talent should be deliberate and proactive. Bridge programs could engage students before their application to medical school. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Developmental Systems Toxicology: computer simulation in a ‘Virtual Embryo’ prototype (SEURAT-1 Progress Meeting)

    Science.gov (United States)

    Evaluating and assessing impacts to development is an Agency priority (EPA’s Children’s Environmental Health Research Roadmap); however, the quantity of chemicals needing assessment and challenges of species extrapolation require alternative approaches to traditional animal studi...

  6. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher order data products, as well as user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high performance disk storage (SSD) for the hot areas and less expensive slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute intensive workloads like parallel computation of hydrologic routing on high resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user
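
    The hot/cold analysis the abstract describes boils down to ranking parts of the archive by access frequency and assigning storage tiers accordingly; a toy sketch with an invented log format and threshold:

```python
from collections import Counter

# Count accesses per spatial tile of the archive and assign storage tiers by
# access frequency. Log records and the threshold are illustrative only.
access_log = [
    ("tileA", "2014-06-01"), ("tileA", "2014-06-02"), ("tileB", "2014-06-02"),
    ("tileA", "2014-06-03"), ("tileC", "2014-06-04"), ("tileA", "2014-06-05"),
]

counts = Counter(tile for tile, _ in access_log)
HOT_THRESHOLD = 3   # tiles accessed at least this often go to fast SSD storage

placement = {
    tile: ("SSD (hot)" if n >= HOT_THRESHOLD else "spinning disk (cold)")
    for tile, n in counts.items()
}
for tile, tier in sorted(placement.items()):
    print(f"{tile}: {counts[tile]} accesses -> {tier}")
```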

  7. Using a process computer for the direct acquisition and processing of radiation protection data

    International Nuclear Information System (INIS)

    Barz, H.G.; Borchardt, K.D.; Hacke, J.; Kirschfeld, K.E.; Kluppak, B.

    1976-01-01

    A process computer will be used at the Hahn-Meitner-Institute to rationalize radiation protection measures. Approximately 150 transmitters are to be connected to this computer, in particular the radiation measuring devices of a nuclear reactor, of hot cells, and of a heavy-ion accelerator, as well as the emission and environmental monitoring systems. The advantages of this method are described: central data acquisition, central alarm and stoppage information, processing of certain measurement values, and the possibility of quick disturbance analysis. Furthermore, the authors report on the preparations already completed, particularly the transmission of digital and analog values to the computer. (orig./HP) [de]

  8. Computer aided surface representation. Progress report, June 1, 1989--May 31, 1990

    Energy Technology Data Exchange (ETDEWEB)

    Barnhill, R.E.

    1990-02-19

    The central research problem of this project is the effective representation, computation, and display of surfaces interpolating to information in three or more dimensions. If the given information is located on another surface, then the problem is to construct a "surface defined on a surface". Sometimes properties of an already defined surface are desired, which is "geometry processing". Visualization of multivariate surfaces is possible by means of contouring higher dimensional surfaces. These problems and more are discussed below. The broad sweep from constructive mathematics through computational algorithms to computer graphics illustrations is utilized in this research. The breadth and depth of this research activity makes this research project unique.

  9. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  11. Quantitative computed tomography for objectifying disseminated skeletal muscle alterations in female carriers of progressive muscular dystrophy

    International Nuclear Information System (INIS)

    Huppert, P.

    1987-01-01

    The detection of early morphologic changes, such as circumscribed infiltrations of adipose connective tissue into the muscles of female carriers of progressive muscular dystrophy, requires quantitative planimetric methods. For a reliable interpretation of the results, the dependence of the fat content of the musculature on the age and physical constitution of the patient must be taken into consideration in each individual case. (author)

  12. Computational methods for the nuclear and neutron matter problems. Progress report

    International Nuclear Information System (INIS)

    Kalos, M.H.

    1979-01-01

    A brief report is given of progress on the development of Monte Carlo methods for the treatment of both simplified and realistic models of extensive neutron and nuclear matter and, eventually, of finite nuclei. A wide class of algorithms was devised that may allow efficient sampling of the integrands required to calculate energy expectations with useful trial wave functions.

  13. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  14. Radiation doses in pediatric computed tomography procedures: challenges facing new technologies

    International Nuclear Information System (INIS)

    Cotelo, E.; Padilla, M.; Dibarboure, L.

    2008-01-01

    Despite the fact that in recent years an increasing number of radiologists and radiological technologists have been applying radiation dose optimization techniques in paediatric Computed Tomography (CT) examinations, dual and multi-slice CT (MSCT) scanners present a new challenge in Radiation Protection (RP). While on one hand these scanners are provided with Automatic Exposure Control (AEC) devices, dose reduction modes and dose estimation software, on the other hand Quality Control (QC) tests, CT Kerma Index (C) measurements and patient dose estimation present specific difficulties and require changes or adaptations of traditional QC protocols. This poses a major challenge in most developing countries, where Quality Assurance Programmes (QAP) have not yet been implemented and there is a shortage of medical physicists. This paper analyses clinical and technical protocols as well as patient doses in 204 CT body procedures performed on 154 children. The investigation was carried out in a paediatric reference hospital of Uruguay, where an average of 450 paediatric CT examinations per month are performed on a single dual-slice CT scanner. In addition, the C_VOL reported on the scanner display was recorded so that it could be compared with the same dosimetric quantity derived from technical parameters and tabulated C values. Results showed that not all radiologists applied the same protocol in similar clinical situations, delivering unnecessary patient dose with no significant differences in image quality. Moreover, it was found that dose reduction modes make it difficult to estimate patient dose when the mA changes according to tissue attenuation, in most cases within each rotation. The study concluded on the importance of QAPs, which must include education of radiologists and technologists in RP, as well as on the need for medical physicists to perform QC tests and patient dose estimations and measurements. (author)
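
    For readers unfamiliar with the display quantity the paper audits, the standard dose arithmetic is a two-line computation: the weighted CT kerma index combines centre and periphery phantom measurements, and the volume index divides by pitch. The numbers below are illustrative, not measurements from the study:

```python
# Weighted CT kerma index: one third centre, two thirds periphery (standard
# definition); volume index: weighted index divided by helical pitch.
c_center = 10.0      # CT kerma index at phantom centre (mGy)
c_periphery = 14.0   # CT kerma index at phantom periphery (mGy)
pitch = 1.375        # table feed per rotation / total beam width

c_w = c_center / 3 + 2 * c_periphery / 3   # weighted index (mGy)
c_vol = c_w / pitch                        # volume index (mGy)

print(f"C_w = {c_w:.2f} mGy, C_vol = {c_vol:.2f} mGy")
```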

  15. Advanced methods for the computation of particle beam transport and the computation of electromagnetic fields and beam-cavity interactions. Progress report, July 1993--August 1994

    International Nuclear Information System (INIS)

    Dragt, A.J.; Gluckstern, R.L.

    1994-08-01

    The University of Maryland Dynamical Systems and Accelerator Theory Group has been carrying out long-term research work in the general area of Dynamical Systems with a particular emphasis on applications to Accelerator Physics. This work is broadly divided into two tasks: the computation of charged particle beam transport and the computation of electromagnetic fields and beam-cavity interactions. Each of these tasks is described briefly. Work is devoted both to the development of new methods and the application of these methods to problems of current interest in accelerator physics including the theoretical performance of present and proposed high energy machines. In addition to its research effort, the Dynamical Systems and Accelerator Theory Group is actively engaged in the education of students and postdoctoral research associates. Substantial progress in research has been made during the past year. These achievements are summarized in the following report

  16. Progress and challenges in implementing HIV care and treatment policies in Latin America following the treatment 2.0 initiative.

    Science.gov (United States)

    Perez, Freddy; Gomez, Bertha; Ravasi, Giovanni; Ghidinelli, Massimo

    2015-12-19

    the use of the WHO preferred first-line regimen, 51% increase in the use of WHO-recommended second-line regimens, and a significant reduction in the use of obsolete drugs in first- and second-line regimens (respectively 1 and 9% of regimens in 2013). A relatively good level of progress was perceived in the recommendations related to optimization of ART regimens. Challenges remain on the improvement of recommendations related to health system strengthening and the promotion and support aimed at community-based organizations as part of the response to HIV/AIDS in Latin America. The JRMs are a useful mechanism for providing coherent technical support to guide countries in the pursuit of a comprehensive response to HIV/AIDS in the Latin American region.

  17. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    Science.gov (United States)

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurements of mathematics, collected once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM: AIMSweb Math Computation, AIMSweb Math Concepts/Applications), were obtained for a maximum of 250 third, fourth, and fifth grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained at a single point of assessment just prior to the administration of the state assessment. (c) 2015 APA, all rights reserved.
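
    The slope question at the heart of the study can be sketched numerically: fit a per-student growth slope to the monthly probes, then ask how much variance the slope explains beyond the final screening score. The data below are synthetic stand-ins, deliberately constructed so the slope adds little, mirroring the reported finding:

```python
import numpy as np

rng = np.random.default_rng(1)
n_students, n_months = 100, 7
# Synthetic monthly scores: student-specific intercept plus linear growth.
start = rng.normal(500, 50, (n_students, 1))
growth = rng.normal(5, 2, (n_students, 1))
monthly = start + growth * np.arange(n_months)

months = np.arange(n_months)
slopes = np.polyfit(months, monthly.T, 1)[0]   # per-student OLS slope
last_score = monthly[:, -1]                    # single-point screener
outcome = 0.8 * last_score + 0.05 * slopes + rng.normal(0, 20, n_students)

def r_squared(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return 1 - ((y - X1 @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()

r2_point = r_squared(last_score[:, None], outcome)
r2_both = r_squared(np.column_stack([last_score, slopes]), outcome)
print(f"R^2, single point: {r2_point:.3f}; point + slope: {r2_both:.3f}")
```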

  18. Challenges in clinical applications of brain computer interfaces in individuals with spinal cord injury

    Directory of Open Access Journals (Sweden)

    Rüdiger eRupp

    2014-09-01

    Full Text Available Brain computer interfaces (BCIs) are devices that measure brain activities and translate them into control signals used for a variety of applications. Among them are systems for communication, environmental control, neuroprostheses, exoskeletons or restorative therapies. Over the last years the technology of BCIs has reached a level of maturity allowing them to be used not only in research experiments supervised by scientists, but also in clinical routine with patients with neurological impairments supervised by clinical personnel or caregivers. However, clinicians and patients face many challenges in the application of BCIs. This particularly applies to patients with high spinal cord injury, in whom artificial ventilation, autonomic dysfunctions, neuropathic pain or the inability to achieve a sufficient level of control during short-term training may limit the successful use of a BCI. Additionally, spasmolytic medication and the acute stress reaction with associated episodes of depression may have a negative influence on the modulation of brain waves and therefore the ability to concentrate over an extended period of time. Although BCIs seem to be a promising assistive technology for individuals with high spinal cord injury, systematic investigations are highly needed to obtain realistic estimates of the percentage of users that, for any reason, may not be able to operate a BCI in a clinical setting.

  19. [Geometry, analysis, and computation in mathematics and applied science]. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, D.

    1994-02-01

    The principal investigators' work on a variety of pure and applied problems in Differential Geometry, Calculus of Variations and Mathematical Physics has been done in a computational laboratory and been based on interactive scientific computer graphics and high speed computation created by the principal investigators to study geometric interface problems in the physical sciences. We have developed software to simulate various physical phenomena from constrained plasma flow to the electron microscope imaging of the microstructure of compound materials, techniques for the visualization of geometric structures that have been used to make significant breakthroughs in the global theory of minimal surfaces, and graphics tools to study evolution processes, such as flow by mean curvature, while simultaneously developing the mathematical foundation of the subject. An increasingly important activity of the laboratory is to extend this environment in order to support and enhance scientific collaboration with researchers at other locations. Toward this end, the Center developed the GANGVideo distributed video software system and software methods for running lab-developed programs simultaneously on remote and local machines. Further, the Center operates a broadcast video network, running in parallel with the Center's data networks, over which researchers can access stored video materials or view ongoing computations. The graphical front-end to GANGVideo can be used to make "multi-media mail" from both "live" computing sessions and stored materials without video editing. Currently, videotape is used as the delivery medium, but GANGVideo is compatible with future "all-digital" distribution systems. Thus as a byproduct of mathematical research, we are developing methods for scientific communication. But, most important, our research focuses on important scientific problems; the parallel development of computational and graphical tools is driven by scientific needs.

  20. Computer simulation of kinetic properties of plasmas. Progress report, October 1, 1977--September 30, 1978

    International Nuclear Information System (INIS)

    Denavit, J.

    1978-01-01

    The research is directed toward the development and testing of new numerical methods for particle and hybrid simulation of plasmas and their application to physical problems of current significance to Magnetic Fusion Energy. During the past year, research on the project has been concerned with the following specific problems: (1) analysis and computer simulations of the dissipative trapped-electron instability in tokamaks; (2) long-time-scale algorithms for numerical solutions of the drift-kinetic equation; and (3) computer simulation of field-reversed ion ring stability

  1. Computer simulation of kinetic properties of plasmas. Progress report, October 1, 1978-June 30, 1979

    International Nuclear Information System (INIS)

    Denavit, J.

    1979-01-01

    The research is directed toward the development and testing of new numerical methods for particle and hybrid simulation of plasmas, and their application to physical problems of current significance to Magnetic Fusion Energy. During the present period, research on the project has been concerned with the following specific problems: (1) Computer simulations of drift and dissipative trapped-electron instabilities in tokamaks, including radial dependence and shear stabilization. (2) Long-time-scale algorithms for numerical solutions of the drift-kinetic equation. (3) Computer simulation of field-reversed ion ring stability. (4) Nonlinear, single-mode saturation of the bump-on-tail instability

  2. Computational complexity in multidimensional neutron transport theory calculations. Progress report, September 1, 1975--August 31, 1976

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1976-05-01

    The objectives of the work are to develop mathematically and computationally founded criteria for the design of highly efficient and reliable multidimensional neutron transport codes to solve a variety of neutron migration and radiation problems, and to analyze existing and new methods for performance. As new analytical insights are gained, new numerical methods are developed and tested. Significant results obtained include implementation of the integer-preserving Gaussian elimination method (two-step method) in a CDC 6400 computer code, modes analysis for one-dimensional transport solutions, and a new method for solving the 1-T transport equation. Some of the work dealt with the interface and corner problem in diffusion theory

  3. Trends and Progress in Reducing Teen Birth Rates and the Persisting Challenge of Eliminating Racial/Ethnic Disparities.

    Science.gov (United States)

    Ngui, Emmanuel M; Greer, Danielle M; Bridgewater, Farrin D; Salm Ward, Trina C; Cisler, Ron A

    2017-08-01

    We examined progress made by the Milwaukee community toward achieving the Milwaukee Teen Pregnancy Prevention Initiative's aggressive 2008 goal of reducing the teen birth rate to 30 live births/1000 females aged 15-17 years by 2015. We further examined differential teen birth rates in disparate racial and ethnic groups. We analyzed teen birth count data from the Wisconsin Interactive Statistics on Health system and demographic data from the US Census Bureau. We computed annual 2003-2014 teen birth rates for the city and four racial/ethnic groups within the city (white non-Hispanic, black non-Hispanic, Hispanic/Latina, Asian non-Hispanic). To compare birth rates from before (2003-2008) and after (2009-2014) goal setting, we used a single-system design employing two time-series analysis approaches: the celeration line and three-standard-deviation (3SD) bands. Milwaukee's teen birth rate dropped 54% from 54.3 births/1000 females in 2003 to 23.7 births/1000 females in 2014, surpassing the goal of 30 births/1000 females 3 years ahead of schedule. Rate reduction following goal setting was statistically significant, as five of the six post-goal data points were located below the celeration line and points for six consecutive years (2010-2014) fell below the 3SD band. All racial/ethnic groups demonstrated significant reductions through at least one of the two time-series approaches. The gap between white and both black and Hispanic/Latina teens widened. Significant reduction has occurred in the overall teen birth rate of Milwaukee. Achieving an aggressive reduction in teen births highlights the importance of collaborative community partnerships in setting and tracking public health goals.
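
    The two time-series tests are simple to reproduce. Below is a minimal sketch in Python, using an ordinary least-squares trend as a stand-in for the traditional split-middle celeration fit; the rate values are illustrative placeholders, not the study's data.

        import numpy as np

        baseline = np.array([54.3, 52.0, 50.1, 47.8, 44.5, 41.9])   # 2003-2008, hypothetical
        follow_up = np.array([38.2, 33.6, 30.0, 27.7, 25.1, 23.7])  # 2009-2014, hypothetical

        # Celeration line: trend fitted to the baseline years, extrapolated forward.
        t_base = np.arange(len(baseline))
        slope, intercept = np.polyfit(t_base, baseline, 1)
        t_post = np.arange(len(baseline), len(baseline) + len(follow_up))
        predicted = slope * t_post + intercept
        print(f"{(follow_up < predicted).sum()}/{len(follow_up)} post-goal points below trend")

        # 3SD band: post-goal points compared against baseline mean minus 3 SD.
        lower = baseline.mean() - 3 * baseline.std(ddof=1)
        print("points below the 3SD band:", (follow_up < lower).tolist())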

  4. Research in progress in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1990-01-01

    Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  5. Computational complexity in multidimensional neutron transport theory calculations. Progress report, September 1, 1974--August 31, 1975

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1975-01-01

    The objectives of the research remain the same as outlined in the original proposal. They are, in short, as follows: develop mathematically and computationally founded criteria for the design of highly efficient and reliable multi-dimensional neutron transport codes to solve a variety of neutron migration and radiation problems, and analyze existing and new methods for performance. (U.S.)

  6. Progress in analysis of computed tomography (CT) images of hardwood logs for defect detection

    Science.gov (United States)

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt

    2003-01-01

    This paper addresses the problem of automatically detecting internal defects in logs using computed tomography (CT) images. The overall purpose is to assist in breakdown optimization. Several studies have shown that the commercial value of resulting boards can be increased substantially if defect locations are known in advance, and if this information is used to make...

  7. High Performance Parallel Processing Project: Industrial computing initiative. Progress reports for fiscal year 1995

    Energy Technology Data Exchange (ETDEWEB)

    Koniges, A.

    1996-02-09

    This project is a package of 11 individual CRADAs plus hardware. This innovative project established a three-year multi-party collaboration that is significantly accelerating the availability of commercial massively parallel processing computing software technology to U.S. government, academic, and industrial end-users. This report contains individual presentations from nine principal investigators along with overall program information.

  8. Computational complexity in multidimensional neutron transport theory calculations. Progress report, September 1976--November 30, 1977

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-08-01

    The objectives of this research are to develop mathematically and computationally founded criteria for the design of highly efficient and reliable multidimensional neutron transport codes to solve a variety of neutron migration and radiation problems, and to analyze existing and new methods for performance

  9. Computer Literacy of Iranian Teachers of English as a Foreign Language: Challenges and Obstacles

    Science.gov (United States)

    Dashtestani, Reza

    2014-01-01

    Basically, one of the requirements for the implementation of computer-assisted language learning (CALL) is English as a foreign language (EFL) teachers' ability to use computers effectively. Educational authorities and planners should identify EFL teachers' computer literacy levels and make attempts to improve the teachers' computer competence.…

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  11. Impacts of Mothers’ Occupation Status and Parenting Styles on Levels of Self-Control, Addiction to Computer Games, and Educational Progress of Adolescents

    OpenAIRE

    Abedini, Yasamin; Zamani, Bibi Eshrat; Kheradmand, Ali; Rajabizadeh, Ghodratollah

    2012-01-01

    Background Addiction to computer (video) games in adolescents and its relationship with educational progress has recently attracted the attention of rearing and education experts as well as organizations and institutes involved in physical and mental health. The current research attempted to propose a structural model of the relationships between parenting styles, mothers’ occupation status, and addiction to computer games, self-control, and educational progress of secondary school students. ...

  12. Exercising CMS dataflows and workflows in computing challenges at the Spanish Tier-1 and Tier-2 sites

    Energy Technology Data Exchange (ETDEWEB)

    Caballero, J; Colino, N; Peris, A D; G-Abia, P; Hernandez, J M; R-Calonge, F J [CIEMAT, Madrid (Spain); Cabrillo, I; Caballero, I G; Marco, R; Matorras, F [IFCA, Santander (Spain); Flix, J; Merino, G [PIC, Barcelona (Spain)], E-mail: jose.hernandez@ciemat.es

    2008-07-15

    An overview of the data transfer, processing and analysis operations conducted at the Spanish Tier-1 (PIC, Barcelona) and Tier-2 (CIEMAT-Madrid and IFCA-Santander federation) centres during the past CMS CSA06 Computing, Software and Analysis challenge and in preparation for CSA07 is presented.

  13. Exercising CMS dataflows and workflows in computing challenges at the Spanish Tier-1 and Tier-2 sites

    International Nuclear Information System (INIS)

    Caballero, J; Colino, N; Peris, A D; G-Abia, P; Hernandez, J M; R-Calonge, F J; Cabrillo, I; Caballero, I G; Marco, R; Matorras, F; Flix, J; Merino, G

    2008-01-01

    An overview of the data transfer, processing and analysis operations conducted at the Spanish Tier-1 (PIC, Barcelona) and Tier-2 (CIEMAT-Madrid and IFCA-Santander federation) centres during the past CMS CSA06 Computing, Software and Analysis challenge and in preparation for CSA07 is presented

  14. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  15. Recent progress in orbital-free density functional theory (recent advances in computational chemistry)

    CERN Document Server

    Wesolowski, Tomasz A

    2013-01-01

    This is a comprehensive overview of state-of-the-art computational methods based on orbital-free formulation of density functional theory completed by the most recent developments concerning the exact properties, approximations, and interpretations of the relevant quantities in density functional theory. The book is a compilation of contributions stemming from a series of workshops which had been taking place since 2002. It not only chronicles many of the latest developments but also summarises some of the more significant ones. The chapters are mainly reviews of sub-domains but also include original research. Readership: Graduate students, academics and researchers in computational chemistry. Atomic & molecular physicists, theoretical physicists, theoretical chemists, physical chemists and chemical physicists.

  16. The Berlin Brain-Computer Interface: Progress Beyond Communication and Control.

    Science.gov (United States)

    Blankertz, Benjamin; Acqualagna, Laura; Dähne, Sven; Haufe, Stefan; Schultze-Kraft, Matthias; Sturm, Irene; Ušćumlic, Marija; Wenzel, Markus A; Curio, Gabriel; Müller, Klaus-Robert

    2016-01-01

    The combined effect of fundamental results about neurocognitive processes and advancements in decoding mental states from ongoing brain signals has brought forth a whole range of potential neurotechnological applications. In this article, we review our developments in this area and put them into perspective. These examples cover a wide range of maturity levels with respect to their applicability. While we assume we are still a long way away from integrating Brain-Computer Interface (BCI) technology in general interaction with computers, or from implementing neurotechnological measures in safety-critical workplaces, results have already been obtained involving a BCI as a research tool. In this article, we discuss the reasons why, in some of the prospective application domains, considerable effort is still required to make the systems ready to deal with the full complexity of the real world.

  17. The Berlin Brain-Computer Interface: Progress Beyond Communication and Control

    Directory of Open Access Journals (Sweden)

    Benjamin Blankertz

    2016-11-01

    Full Text Available The combined effect of fundamental results about neurocognitive processes and advancements in decoding mental states from ongoing brain signals has brought forth a whole range of potential neurotechnological applications. In this article, we review our developments in this area and put them into perspective. These examples cover a wide range of maturity levels with respect to their applicability. While we assume we are still a long way away from integrating Brain-Computer Interface (BCI) technology in general interaction with computers, or from implementing neurotechnological measures in safety-critical workplaces, results have already been obtained involving a BCI as a research tool. In this article, we discuss the reasons why, in some of the prospective application domains, considerable effort is still required to make the systems ready to deal with the full complexity of the real world.

  18. MIT Laboratory for Computer Science Progress Report No. 23, July 1985-June 1986

    Science.gov (United States)

    1986-06-01

    Includes "An Expert System for Diagnosing Gait in Cerebral Palsy Patients," S.M. thesis, MIT Department of Electrical Engineering and Computer Science, Cambridge.

  19. Progress of computer-aided detection/diagnosis (CAD) in dentistry

    Directory of Open Access Journals (Sweden)

    Akitoshi Katsumata

    2014-08-01

    CAD is also useful in the detection and evaluation of dental and maxillofacial lesions. Identifying alveolar bone resorption due to periodontitis and radiolucent jaw lesions (such as radicular and dentigerous cysts) are important goals for CAD. CAD can be applied not only to panoramic radiography but also to dental cone-beam computed tomography (CBCT) images. Linking of CAD and teleradiology will be an important issue.

  20. Computational intelligence in gait research: a perspective on current applications and future challenges.

    Science.gov (United States)

    Lai, Daniel T H; Begg, Rezaul K; Palaniswami, Marimuthu

    2009-09-01

    Our mobility is an important daily requirement so much so that any disruption to it severely degrades our perceived quality of life. Studies in gait and human movement sciences, therefore, play a significant role in maintaining the well-being of our mobility. Current gait analysis involves numerous interdependent gait parameters that are difficult to adequately interpret due to the large volume of recorded data and lengthy assessment times in gait laboratories. A proposed solution to these problems is computational intelligence (CI), which is an emerging paradigm in biomedical engineering most notably in pathology detection and prosthesis design. The integration of CI technology in gait systems facilitates studies in disorders caused by lower limb defects, cerebral disorders, and aging effects by learning data relationships through a combination of signal processing and machine learning techniques. Learning paradigms, such as supervised learning, unsupervised learning, and fuzzy and evolutionary algorithms, provide advanced modeling capabilities for biomechanical systems that in the past have relied heavily on statistical analysis. CI offers the ability to investigate nonlinear data relationships, enhance data interpretation, design more efficient diagnostic methods, and extrapolate model functionality. These are envisioned to result in more cost-effective, efficient, and easy-to-use systems, which would address global shortages in medical personnel and rising medical costs. This paper surveys current signal processing and CI methodologies followed by gait applications ranging from normal gait studies and disorder detection to artificial gait simulation. We review recent systems focusing on the existing challenges and issues involved in making them successful. We also examine new research in sensor technologies for gait that could be combined with these intelligent systems to develop more effective healthcare solutions.
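
    As a concrete illustration of the supervised-learning paradigm surveyed above, the following sketch classifies gait trials from a small feature vector. The features (stride length, cadence, minimum toe clearance) and all numbers are assumptions made for this example; real studies would use measured gait parameters.

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        # Synthetic trials: [stride length (m), cadence (steps/min), toe clearance (cm)]
        healthy = rng.normal([1.40, 110.0, 1.5], [0.10, 5.0, 0.2], size=(50, 3))
        impaired = rng.normal([1.10, 95.0, 1.1], [0.10, 5.0, 0.2], size=(50, 3))
        X = np.vstack([healthy, impaired])
        y = np.array([0] * 50 + [1] * 50)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())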

  1. Computer simulation for the effect of coherent strain on the precipitation progress of binary alloy

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Based on the microscopic elasticity theory and microscopic diffusion equation, the precipitation progress of binary alloys including coherent strain energy was studied. The results show that coherent strain has an obvious effect on the coherent two-phase morphology and precipitation mechanism. With the increase of coherent strain energy, the particle shape changes from randomly distributed equiaxed particles to elliptical precipitate shapes, and their orientational alignment increases; in the late stage of precipitation, the particle arrangement presents obvious directionality along the [10] and [01] directions, and the precipitation mechanism of the alloy changes from a typical spinodal decomposition mechanism to a mixed process which possesses the characteristics of both non-classical nucleation-growth and spinodal decomposition mechanisms.
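
    The "microscopic diffusion equation" family of models can be illustrated with a bare-bones Cahn-Hilliard simulation of spinodal decomposition. This sketch omits the coherent strain energy term that is the paper's actual subject, and all parameters are illustrative.

        import numpy as np

        def lap(f):  # periodic 5-point Laplacian, grid spacing 1
            return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                    np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f)

        n, dt, kappa = 128, 0.01, 1.0
        rng = np.random.default_rng(1)
        c = 0.02 * rng.standard_normal((n, n))      # small fluctuation about c = 0

        for _ in range(5000):
            mu = c**3 - c - kappa * lap(c)          # double-well chemical potential
            c += dt * lap(mu)                       # conserved Cahn-Hilliard dynamics

        print("composition range after coarsening:", float(c.min()), float(c.max()))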

  2. Augmentation of spelling therapy with transcranial direct current stimulation in primary progressive aphasia: Preliminary results and challenges.

    Science.gov (United States)

    Tsapkini, Kyrana; Frangakis, Constantine; Gomez, Yessenia; Davis, Cameron; Hillis, Argye E

    Primary progressive aphasia (PPA) is a neurodegenerative disease that primarily affects language functions and often begins in the fifth or sixth decade of life. The devastating effects on work and family life call for the investigation of treatment alternatives. In this article, we present new data indicating that neuromodulatory treatment, using transcranial direct current stimulation (tDCS) combined with a spelling intervention, shows some promise for maintaining or even improving language, at least temporarily, in PPA. The main aim of the present article is to determine whether tDCS plus spelling intervention is more effective than spelling intervention alone in treating written language in PPA. We also asked whether the effects of tDCS are sustained longer than the effects of spelling intervention alone. We present data from six PPA participants who underwent anodal tDCS or sham plus spelling intervention in a within-subject crossover design. Each stimulation condition lasted 3 weeks or a total of 15 sessions with a 2-month interval in between. Participants were evaluated on treatment tasks as well as on other language and cognitive tasks at 2-week and 2-month follow-up intervals after each stimulation condition. All participants showed improvement in spelling (with sham or tDCS). There was no difference in the treated items between the two conditions. There was, however, consistent and significant improvement for untrained items only in the tDCS plus spelling intervention condition. Furthermore, the improvement lasted longer in the tDCS plus spelling intervention condition compared to sham plus spelling intervention condition. Neuromodulation with tDCS offers promise as a means of augmenting language therapy to improve written language function at least temporarily in PPA. The consistent finding of generalisation of treatment benefits to untreated items and the superior sustainability of treatment effects with tDCS justifies further investigations. However

  3. Some computational challenges of developing efficient parallel algorithms for data-dependent computations in thermal-hydraulics supercomputer applications

    International Nuclear Information System (INIS)

    Woodruff, S.B.

    1992-01-01

    The Transient Reactor Analysis Code (TRAC), which features a two-fluid treatment of thermal-hydraulics, is designed to model transients in water reactors and related facilities. One of the major computational costs associated with TRAC and similar codes is calculating constitutive coefficients. Although the formulations for these coefficients are local, the costs are flow-regime- or data-dependent; i.e., the computations needed for a given spatial node often vary widely as a function of time. Consequently, poor load balancing will degrade efficiency on either vector or data parallel architectures when the data are organized according to spatial location. Unfortunately, a general automatic solution to the load-balancing problem associated with data-dependent computations is not yet available for massively parallel architectures. This document discusses why developers should consider algorithms, such as a neural net representation, that do not exhibit load-balancing problems
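
    The imbalance described above is easy to demonstrate. In the toy sketch below, per-node cost depends on a randomly assigned "flow regime"; a contiguous spatial partition then carries very uneven work, while a cost-aware greedy assignment evens it out. All costs are made-up illustrations, not TRAC measurements.

        import numpy as np

        rng = np.random.default_rng(2)
        regime = rng.choice([0, 1, 2], size=1024, p=[0.6, 0.3, 0.1])
        cost = np.array([1.0, 4.0, 20.0])[regime]     # hypothetical cost per spatial node

        # Static partition: 8 contiguous spatial blocks of nodes.
        blocks = cost.reshape(8, -1).sum(axis=1)
        print("static imbalance (max/mean):", blocks.max() / blocks.mean())

        # Cost-aware greedy assignment (longest-processing-time heuristic).
        bins = np.zeros(8)
        for c in np.sort(cost)[::-1]:
            bins[bins.argmin()] += c
        print("greedy imbalance (max/mean):", bins.max() / bins.mean())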

  4. FY 1992 Blue Book: Grand Challenges: High Performance Computing and Communications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — High performance computing and computer communications networks are becoming increasingly important to scientific advancement, economic competition, and national...

  5. FY 1993 Blue Book: Grand Challenges 1993: High Performance Computing and Communications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — High performance computing and computer communications networks are becoming increasingly important to scientific advancement, economic competition, and national...

  6. Biomedical Computing Technology Information Center (BCTIC): Final progress report, March 1, 1986-September 30, 1986

    International Nuclear Information System (INIS)

    1987-01-01

    During this time, BCTIC packaged and disseminated computing technology and honored all requests made before September 1, 1986. The final month of operation was devoted to completing code requests, returning submitted codes, and sending out notices of BCTIC's termination of services on September 30th. Final BCTIC library listings were distributed to members of the active mailing list. Also included in the library listing are names and addresses of program authors and contributors in order that users may have continued support of their programs. The BCTIC library list is attached

  7. Detailed calculations of fire progression in the Almaraz NPP using the Fire Dynamics Simulator computational code

    International Nuclear Information System (INIS)

    Villar Sanchez, T.

    2012-01-01

    The Fire Dynamics Simulator (FDS) is an advanced computational fire simulation model that numerically solves the Navier-Stokes equations in each cell of the mesh at each time interval, and is capable of accurately calculating those fire parameters for which NUREG-1805 has only a limited capacity. The objective of the analysis is to compare the results obtained with FDS with those obtained from the NUREG-1805 spreadsheets, and to provide a broad and realistic study of the propagation of a fire in different areas of the Almaraz NPP.

  8. Challenges in the twentieth century and beyond: Computer codes and data

    International Nuclear Information System (INIS)

    Kirk, B.L.

    1995-01-01

    The second half of the twentieth century has seen major changes in computer architecture. From the early fifties to the early seventies, the word "computer" demanded reverence, respect, and even fear. Computers, then, were almost "untouchable." Computers have become the mainstream of communication on rapidly expanding communication highways. They have become necessities of life. This report describes computer codes and packaging, as well as compilers and operating systems

  9. Migrating Educational Data and Services to Cloud Computing: Exploring Benefits and Challenges

    Science.gov (United States)

    Lahiri, Minakshi; Moseley, James L.

    2013-01-01

    "Cloud computing" is currently the "buzzword" in the Information Technology field. Cloud computing facilitates convenient access to information and software resources as well as easy storage and sharing of files and data, without the end users being aware of the details of the computing technology behind the process. This…

  10. Challenges in computational materials science: Multiple scales, multi-physics and evolving discontinuities

    NARCIS (Netherlands)

    Borst, de R.

    2008-01-01

    Novel experimental possibilities, together with improvements in computer hardware as well as new concepts in computational mathematics and mechanics, in particular multiscale methods, are now, in principle, making it possible to derive and compute phenomena and material parameters at a macroscopic

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  12. A Computational Model of Peripheral Photocoagulation for the Prevention of Progressive Diabetic Capillary Occlusion

    Directory of Open Access Journals (Sweden)

    Thomas J. Gast

    2016-01-01

    Full Text Available We developed a computational model of the propagation of retinal ischemia in diabetic retinopathy and analyzed the consequences of various patterns and sizes of burns in peripheral retinal photocoagulation. The model addresses retinal ischemia as a phenomenon of adverse local feedback in which, once a capillary is occluded, there is an elevated probability of occlusion of adjacent capillaries, resulting in enlarging areas of retinal ischemia, as is commonly seen clinically. Retinal burns of different sizes and patterns, treated as local oxygen sources, are predicted to have different effects on the propagation of retinal ischemia. The patterns of retinal burns are optimized with regard to minimization of the sum of the photocoagulated retina and computer-predicted ischemic retina. Our simulations show that certain patterns of retinal burns are effective at preventing the spatial spread of ischemia by creating oxygenated boundaries across which the ischemia does not propagate. This model makes no statement about current PRP treatment of avascular peripheral retina, and notes that the usual spot sizes used in PRP will not prevent ischemic propagation in still-vascularized retinal areas. The model suggests that a properly patterned laser treatment of still-vascularized peripheral retina may be able to prevent, or at least constrain, the propagation of diabetic retinal ischemia in those retinal areas with intact capillaries.
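
    A toy version of the model's core mechanism can be written in a few lines: capillary sites on a grid occlude with a probability that grows with the number of occluded neighbours, while photocoagulated sites act as oxygen sources that cannot propagate occlusion. The probabilities and the striped burn pattern below are illustrative assumptions, not the authors' calibrated values.

        import numpy as np

        n, steps = 100, 200
        rng = np.random.default_rng(3)
        occluded = np.zeros((n, n), dtype=bool)
        burn = np.zeros((n, n), dtype=bool)
        burn[::10, :] = True                      # hypothetical striped burn pattern

        p_base, p_neigh = 1e-4, 0.02
        for _ in range(steps):
            neighbours = sum(np.roll(occluded, s, axis=a) for s in (1, -1) for a in (0, 1))
            p = p_base + p_neigh * neighbours     # adverse local feedback
            occluded |= (rng.random((n, n)) < p) & ~burn   # burns stay oxygenated
        print("final ischaemic fraction:", occluded.mean())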

  13. Computer Games in Pre-School Settings: Didactical Challenges when Commercial Educational Computer Games Are Implemented in Kindergartens

    Science.gov (United States)

    Vangsnes, Vigdis; Gram Okland, Nils Tore; Krumsvik, Rune

    2012-01-01

    This article focuses on the didactical implications when commercial educational computer games are used in Norwegian kindergartens by analysing the dramaturgy and the didactics of one particular game and the game in use in a pedagogical context. Our justification for analysing the game by using dramaturgic theory is that we consider the game to be…

  14. Impacts of mothers' occupation status and parenting styles on levels of self-control, addiction to computer games, and educational progress of adolescents.

    Science.gov (United States)

    Abedini, Yasamin; Zamani, Bibi Eshrat; Kheradmand, Ali; Rajabizadeh, Ghodratollah

    2012-01-01

    Addiction to computer (video) games in adolescents and its relationship with educational progress has recently attracted the attention of rearing and education experts as well as organizations and institutes involved in physical and mental health. The current research attempted to propose a structural model of the relationships between parenting styles, mothers' occupation status, addiction to computer games, self-control, and educational progress of secondary school students. Using multistage cluster random sampling, 500 female and male secondary school students in Kerman (Iran) were selected and studied. The research tools included self-control, parenting styles, and addiction to computer games questionnaires and a self-made questionnaire containing demographic details. The data were analyzed using exploratory factor analysis, Cronbach's alpha coefficient and path analysis (in LISREL). We found self-control to have a linking role in the relationship between the four parenting styles and educational progress. Mothers' occupation status was directly and significantly correlated with addiction to computer games. Although the four parenting styles directly and significantly affected addiction to computer games, the findings did not support the linking role of addiction to computer games in the relationship between the four parenting styles and educational progress. In agreement with previous studies, the current research reflected the impact of the four parenting styles on self-control, addiction to computer games, and educational progress of students. Among the parenting styles, the authoritative style can affect the severity of addiction to computer games through self-control development. It can thus indirectly influence the educational progress of students. Parents are recommended to use an authoritative parenting style to help both self-management and psychological health of their children. The employed mothers are also recommended to have more supervision and control on the degree
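
    One of the reliability statistics named above, Cronbach's alpha, is straightforward to compute from an item-score matrix. A minimal sketch with synthetic questionnaire data (rows are respondents, columns are items); the latent-trait setup is purely illustrative.

        import numpy as np

        def cronbach_alpha(items):
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        rng = np.random.default_rng(8)
        trait = rng.normal(size=500)                          # latent score per respondent
        items = trait[:, None] + rng.normal(0, 0.8, size=(500, 10))
        print("Cronbach's alpha:", round(cronbach_alpha(items), 3))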

  15. Impacts of Mothers’ Occupation Status and Parenting Styles on Levels of Self-Control, Addiction to Computer Games, and Educational Progress of Adolescents

    Science.gov (United States)

    Abedini, Yasamin; Zamani, Bibi Eshrat; Kheradmand, Ali; Rajabizadeh, Ghodratollah

    2012-01-01

    Background Addiction to computer (video) games in adolescents and its relationship with educational progress has recently attracted the attention of rearing and education experts as well as organizations and institutes involved in physical and mental health. The current research attempted to propose a structural model of the relationships between parenting styles, mothers' occupation status, addiction to computer games, self-control, and educational progress of secondary school students. Methods Using multistage cluster random sampling, 500 female and male secondary school students in Kerman (Iran) were selected and studied. The research tools included self-control, parenting styles, and addiction to computer games questionnaires and a self-made questionnaire containing demographic details. The data were analyzed using exploratory factor analysis, Cronbach's alpha coefficient and path analysis (in LISREL). Findings We found self-control to have a linking role in the relationship between the four parenting styles and educational progress. Mothers' occupation status was directly and significantly correlated with addiction to computer games. Although the four parenting styles directly and significantly affected addiction to computer games, the findings did not support the linking role of addiction to computer games in the relationship between the four parenting styles and educational progress. Conclusion In agreement with previous studies, the current research reflected the impact of the four parenting styles on self-control, addiction to computer games, and educational progress of students. Among the parenting styles, the authoritative style can affect the severity of addiction to computer games through self-control development. It can thus indirectly influence the educational progress of students. Parents are recommended to use an authoritative parenting style to help both self-management and psychological health of their children. The employed mothers are also recommended to

  16. Progression criteria for cancer antigen 15.3 and carcinoembryonic antigen in metastatic breast cancer compared by computer simulation of marker data

    DEFF Research Database (Denmark)

    Sölétormos, G; Hyltoft Petersen, P; Dombernowsky, P

    2000-01-01

    BACKGROUND: We investigated the utility of computer simulation models for performance comparisons of different tumor marker assessment criteria to define progression or nonprogression of metastatic breast cancer. METHODS: Clinically relevant values for progressive cancer antigen 15.3 and carcinoembryonic antigen concentrations were combined with representative values for background variations in a computer simulation model. Fifteen criteria for assessment of longitudinal tumor marker data were obtained from the literature and computerized. Altogether, 7200 different patients, each based on 50... of progression. CONCLUSIONS: The computer simulation model is a fast, effective, and inexpensive approach for comparing the diagnostic potential of assessment criteria during clinically relevant conditions of steady-state and progressive disease. The model systems can be used to generate tumor marker assessment...
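
    The simulation approach can be sketched compactly: generate marker series with log-normal background variation under steady-state or progressive kinetics, then score them with a candidate criterion. The criterion below (two consecutive values more than 25% above the running minimum) and all parameters are assumptions for illustration, not the fifteen criteria evaluated in the study.

        import numpy as np

        rng = np.random.default_rng(4)

        def marker_series(progressive, n=50, cv=0.15, growth=0.02):
            t = np.arange(n)
            level = np.exp(growth * t) if progressive else np.ones(n)
            return level * rng.lognormal(mean=0.0, sigma=cv, size=n)

        def flags_progression(x, rise=1.25):
            running_min = np.minimum.accumulate(x)
            above = x > rise * running_min
            return bool(np.any(above[:-1] & above[1:]))   # two consecutive elevated values

        pairs = [(marker_series(True), marker_series(False)) for _ in range(1000)]
        sensitivity = np.mean([flags_progression(p) for p, s in pairs])
        specificity = np.mean([not flags_progression(s) for p, s in pairs])
        print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}")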

  17. Relearning and Retaining Personally-Relevant Words using Computer-Based Flashcard Software in Primary Progressive Aphasia

    Directory of Open Access Journals (Sweden)

    William Streicher Evans

    2016-11-01

    Full Text Available Although anomia treatments have often focused on training small sets of words in the hopes of promoting generalization to untrained items, an alternative is to directly train a larger set of words more efficiently. The current case study reports on a novel treatment for a patient with semantic variant Primary Progressive Aphasia (svPPA, in which the patient was taught to make and practice flashcards for personally-relevant words using an open-source computer program (Anki. Results show that the patient was able to relearn and retain a large subset of her studied words over a 20-month period. At the end of treatment, she showed good retention for 139 studied words, far more than the number typically treated in svPPA studies. Furthermore, she showed evidence of stimulus generalization to confrontation-naming tasks for studied items, and of relearning forgotten items with additional practice. This case represents a successful example of patient-centered computer-based asynchronous telepractice. It also illustrates how data captured from computer-based treatments can provide powerful practice-based evidence, obtained during routine clinical care.
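
    Anki's scheduler descends from the SM-2 spaced-repetition algorithm. A simplified SM-2-style update is sketched below to show the mechanism; it is not Anki's exact implementation.

        def sm2_update(interval_days, ease, quality):
            """quality: 0-5 self-rated recall; returns (next_interval_days, new_ease)."""
            if quality < 3:                     # failed recall: relearn from the start
                return 1, ease
            ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
            if interval_days == 0:
                return 1, ease
            if interval_days == 1:
                return 6, ease
            return round(interval_days * ease), ease

        interval, ease = 0, 2.5
        for q in [5, 4, 5, 3, 5]:               # five successive reviews of one card
            interval, ease = sm2_update(interval, ease, q)
            print(f"next review in {interval} days (ease {ease:.2f})")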

  18. Progressive systemic sclerosis: high-resolution computed tomography findings; Esclerose sistemica progressiva: aspectos na tomografia computadorizada de alta resolucao

    Energy Technology Data Exchange (ETDEWEB)

    Gasparetto, Emerson L.; Pimenta, Rodrigo; Ono, Sergio E.; Escuissato, Dante L. [Parana Univ., Curitiba, PR (Brazil). Hospital de Clinicas. Servico de Radiologia Medica]. E-mail: dante.luiz@onda.com.br; Inoue, Cesar [Parana Univ., Curitiba, PR (Brazil). Faculdade de Medicina

    2005-09-15

    Objective: To describe the high-resolution computed tomography findings in the lung of patients with systemic sclerosis, independently of the respiratory symptoms. Materials and methods: Seventy-three high-resolution computed tomography scans of 44 patients with clinical diagnosis of systemic sclerosis were reviewed and defined by the consensus of two radiologists. Results: Abnormalities were seen in 91.8% (n = 67) of the scans. The most frequent findings were reticular pattern (90.4%), ground-glass opacities (63%), traction bronchiectasis and bronchiolectasis (56.2%), esophageal dilatation (46.6%), honeycombing pattern (28.8%) and signs of pulmonary hypertension (15.6%). In most cases the lesions were bilateral (89%) and symmetrical (58.5%). The lesions were predominantly located in the basal (91.2%) and peripheral (92.2%) regions. Conclusion: In the majority of the patients, progressive systemic sclerosis can cause pulmonary fibrosis mainly characterized by reticular pattern with basal and peripheral distribution on high-resolution computed tomography. (author)

  19. Computer usage among nurses in rural health-care facilities in South Africa: obstacles and challenges.

    Science.gov (United States)

    Asah, Flora

    2013-04-01

    This study discusses factors inhibiting computer usage for work-related tasks among computer-literate professional nurses within rural healthcare facilities in South Africa. In the past two decades computer literacy courses have not been part of the nursing curricula. Computer courses are offered by the State Information Technology Agency. Despite this, there seems to be limited use of computers by professional nurses in the rural context. Focus group interviews were held with 40 professional nurses from three government hospitals in northern KwaZulu-Natal. Contributing factors were found to be lack of information technology infrastructure, restricted access to computers and deficits in technical and nursing management support. The physical location of computers within the health-care facilities and lack of relevant software emerged as specific obstacles to usage. Provision of continuous and active support from nursing management could positively influence computer usage among professional nurses. A closer integration of information technology and computer literacy skills into existing nursing curricula would foster a positive attitude towards computer usage through early exposure. Responses indicated that a change of mindset may be needed on the part of nursing management so that they begin to actively promote ready access to computers as a means of creating greater professionalism and collegiality. © 2011 Blackwell Publishing Ltd.

  20. Research Progress in Mathematical Analysis of Map Projection by Computer Algebra

    Directory of Open Access Journals (Sweden)

    BIAN Shaofeng

    2017-10-01

    Full Text Available Map projection is an important component of modern cartography, and involves many intricate mathematical analysis processes, such as the power series expansions of elliptical functions, differentiation of complex and implicit functions, elliptical integrals and the manipulation of complex numbers. The derivation of these results by hand not only consumes much time and energy but is also error-prone, and sometimes cannot be carried out at all because of the prohibitive complexity. The research achievements in mathematical analysis of map projection by computer algebra are systematically reviewed in five aspects, i.e., the symbolic expressions of forward and inverse solutions of ellipsoidal latitudes, the direct transformations between map projections with different distortion properties, expressions of the Gauss projection by complex functions, mathematical analysis of the oblique Mercator projection, and the polar chart projection with its transformations. Main problems that need to be further solved in this research field are analyzed. This review should help to promote the development of map projection.
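
    As a small example of the kind of derivation involved, the meridian-arc integrand can be expanded in the eccentricity and integrated term by term with a computer algebra system. The sketch below uses sympy as an illustrative stand-in for the tools used in the reviewed work.

        import sympy as sp

        phi, e = sp.symbols('phi e', positive=True)
        integrand = (1 - e**2 * sp.sin(phi)**2) ** sp.Rational(-3, 2)
        approx = sp.series(integrand, e, 0, 6).removeO()   # power series in e
        arc = sp.integrate(sp.expand(approx), phi)         # termwise antiderivative
        print(sp.simplify(arc))                            # meridian arc / (a(1 - e^2))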

  1. Pervasive Computing Technologies to Continuously Assess Alzheimer's Disease Progression and Intervention Efficacy.

    Science.gov (United States)

    Lyons, Bayard E; Austin, Daniel; Seelye, Adriana; Petersen, Johanna; Yeargers, Jonathan; Riley, Thomas; Sharma, Nicole; Mattek, Nora; Wild, Katherine; Dodge, Hiroko; Kaye, Jeffrey A

    2015-01-01

    Traditionally, assessment of functional and cognitive status of individuals with dementia occurs in brief clinic visits during which time clinicians extract a snapshot of recent changes in individuals' health. Conventionally, this is done using various clinical assessment tools applied at the point of care and relies on patients' and caregivers' ability to accurately recall daily activity and trends in personal health. These practices suffer from the infrequency and generally short durations of visits. Since 2004, researchers at the Oregon Center for Aging and Technology (ORCATECH) at the Oregon Health and Science University have been working on developing technologies to transform this model. ORCATECH researchers have developed a system of continuous in-home monitoring using pervasive computing technologies that make it possible to more accurately track activities and behaviors and measure relevant intra-individual changes. We have installed a system of strategically placed sensors in over 480 homes and have been collecting data for up to 8 years. Using this continuous in-home monitoring system, ORCATECH researchers have collected data on multiple behaviors such as gait and mobility, sleep and activity patterns, medication adherence, and computer use. Patterns of intra-individual variation detected in each of these areas are used to predict outcomes such as low mood, loneliness, and cognitive function. These methods have the potential to improve the quality of patient health data and, in turn, patient care, especially related to cognitive decline. Furthermore, the continuous real-world nature of the data may improve the efficiency and ecological validity of clinical intervention studies.
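
    A minimal sketch of the intra-individual-variability idea: compute weekly variability of a daily in-home measure (here a synthetic walking-speed series) and flag weeks that drift outside the person's own baseline range. All data and thresholds are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(5)
        days = 52 * 7
        speed = rng.normal(1.0, 0.05, size=days)              # m/s, hypothetical daily measure
        speed[300:] += rng.normal(0, 0.10, size=days - 300)   # later-onset instability

        weeks = speed.reshape(52, 7)
        weekly_cv = weeks.std(axis=1) / weeks.mean(axis=1)    # intra-individual variability
        baseline = weekly_cv[:26]
        flags = weekly_cv > baseline.mean() + 2 * baseline.std()
        print("weeks flagged for unusual variability:", np.flatnonzero(flags))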

  2. Quality Assurance Challenges for Motion-Adaptive Radiation Therapy: Gating, Breath Holding, and Four-Dimensional Computed Tomography

    International Nuclear Information System (INIS)

    Jiang, Steve B.; Wolfgang, John; Mageras, Gig S.

    2008-01-01

    Compared with conventional three-dimensional (3D) conformal radiation therapy and intensity-modulated radiation therapy treatments, quality assurance (QA) for motion-adaptive radiation therapy involves various challenges because of the added temporal dimension. Here we discuss those challenges for three specific techniques related to motion-adaptive therapy: namely respiratory gating, breath holding, and four-dimensional computed tomography. Similar to the introduction of any other new technologies in clinical practice, typical QA measures should be taken for these techniques also, including initial testing of equipment and clinical procedures, as well as frequent QA examinations during the early stage of implementation. Here, rather than covering every QA aspect in depth, we focus on some major QA challenges. The biggest QA challenge for gating and breath holding is how to ensure treatment accuracy when internal target position is predicted using external surrogates. Recommended QA measures for each component of treatment, including simulation, planning, patient positioning, and treatment delivery and verification, are discussed. For four-dimensional computed tomography, some major QA challenges have also been discussed

  3. Benefits and Challenges of the Adoption of Cloud Computing in Business

    OpenAIRE

    Colin Ting Si Xue; Felicia Tiong Wee Xin

    2016-01-01

    The loss of business and economic downturns occur almost every day; thus, technology is needed in every organization. Cloud computing has played a major role in solving the inefficiency problem in organizations and increasing the growth of business, thus helping organizations to stay competitive. It is required to improve and automate the traditional ways of doing business. Cloud computing has been considered an innovative way to improve business. Overall, cloud computing enables the org...

  4. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    OpenAIRE

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-01-01

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients' electrocardiograms (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better stora...

  5. A REVIEW ON SECURITY ISSUES AND CHALLENGES IN CLOUD COMPUTING MODEL OF RESOURCE MANAGEMENT

    OpenAIRE

    T. Vaikunth Pai; Dr. P. S. Aithal

    2017-01-01

    Cloud computing services refer to a set of IT-enabled services delivered to a customer as services over the Internet on a leased basis, with the capability to scale service requirements up or down according to need. Usually, cloud computing services are delivered by third-party vendors who own the infrastructure. Its advantages include scalability, elasticity, flexibility, efficiency and the outsourcing of non-core activities of an organization. Cloud computing offers an innovative busines...

  6. The United States Department of Energy, Office of Environmental Management's Progress and Challenges in Environmental Remediation and Decommissioning

    International Nuclear Information System (INIS)

    Szilagyi, A.; Collazo, Y.

    2008-01-01

    The United States Department of Energy Environmental Management Program (EM) is responsible for managing the world's largest environmental cleanup program, one of unparalleled scope, complexity, diversity of facilities and contaminants, and technical challenges. Established in 1989, EM's mission is the safe and successful cleanup of the Cold War legacy brought about by five decades of nuclear weapons development and government-sponsored nuclear energy research. Within this mission, EM is responsible for radioactive liquid wastes, spent nuclear fuel, nuclear materials, solid radioactive waste, contaminated soils and groundwater, and contaminated facilities located in 14 States, on over 2,000,000 acres of land, with over 4500 facilities requiring decommissioning. Since 1989 EM has evolved, and continues to evolve, into a true project-management-oriented organization with world-class engineering and technology capabilities; as the National Academy of Public Administration has concluded, with the changes underway, EM is on a solid path to becoming a high-performing organization. Not only has EM grown and matured as a functional organization, but it has also achieved some remarkable on-the-ground accomplishments in environmental remediation, deactivation and decommissioning, and waste management/nuclear material stabilization. These accomplishments have been made within a context of having to work with some of the most dangerous substances known to humanity; of having to perform first-of-a-kind tasks in highly hazardous environments; and of having to design, construct and operate first-of-a-kind technology and facilities to solve problems that once seemed unsolvable. In addition, EM's accomplishments have been made with the highest priority and focus given to safety and risk reduction. In October 2006, and with a life cycle cost of $6.7 Billion, cleanup/D and D was completed at the 800+ facility, 6200-acre former nuclear weapons complex at Rocky Flats (Denver, Colorado). Today

  7. Increasing high school girls' exposure to computing activities with e-textiles: challenges and lessons learned

    DEFF Research Database (Denmark)

    Borsotti, Valeria

    2017-01-01

    The number of female students in computer science degrees has been rapidly declining in Denmark in the past 40 years, as in many other European and North-American countries. The main reasons behind this phenomenon are widespread gender stereotypes about who is best suited to pursue a career in CS; stereotypes about computing as a 'male' domain; widespread lack of pre-college CS education and perceptions of computing as not socially relevant. STEAM activities have often been used to bridge the gender gap and to broaden the appeal of computing among children and youth. This contribution examines a STEAM...

  8. Thallium-201 single photon emission computed tomography (SPECT) in patients with Duchenne's progressive muscular dystrophy. A histopathologic correlation study

    International Nuclear Information System (INIS)

    Nishimura, Toru; Yanagisawa, Atsuo; Sakata, Konomi; Shimoyama, Katsuya; Yoshino, Hideaki; Ishikawa, Kyozo; Sakata, Hitomi; Ishihara, Tadayuki

    2001-01-01

    The pathomorphologic mechanism responsible for abnormal perfusion imaging during thallium-201 myocardial single photon emission computed tomography (201Tl-SPECT) in patients with Duchenne's progressive muscular dystrophy (DMD) was investigated. Hearts from 7 patients with DMD were evaluated histopathologically at autopsy and the results correlated with findings on initial and delayed resting 201Tl-SPECT images. The location of segments with perfusion defects correlated with the histopathologically abnormal segments in the hearts. Both the extent and degree of myocardial fibrosis were severe, especially in the posterolateral segment of the left ventricle. Severe transmural fibrosis and severe fatty infiltration were common in segments with perfusion defects. In areas of redistribution, the degree of fibrosis appeared to be greater than in areas of normal perfusion, and intermuscular edema was prominent. Thus, the degree and extent of perfusion defects detected by 201Tl-SPECT were compatible with the histopathology. The presence of the redistribution phenomenon may indicate ongoing fibrosis. Initial and delayed resting 201Tl-SPECT images can predict the site and progress of myocardial degeneration in patients with DMD. (author)

  9. Computational methods for the nuclear and neutron matter problems: Progress report

    International Nuclear Information System (INIS)

    Kalos, M.H.

    1989-01-01

    This proposal is concerned with the use of Monte Carlo methods as a numerical technique in the study of nuclear structure. The straightforward use of Monte Carlo in nuclear physics has been impeded by certain technical difficulties. Foremost among them is the fact that numerical integration of the Schrödinger equation, by now straightforward for the ground state of boson systems, is substantially more difficult for many-fermion systems. The first part of this proposal outlines a synthesis of several advances into a single experimental algorithm. The proposed work is to implement and study the properties of the algorithm with simple models of few-body nuclei as the physical system to be investigated. Variational Monte Carlo remains an extremely powerful and useful method. Its application to nuclear structure physics presents unique difficulties. The varieties of interactions in the phenomenological potentials must be reflected in a corresponding richness of the correlations in accurate trial wave functions. The sheer number of terms in such trial functions, written as a product of pairs, then presents specific difficulties. We have had good success in our first experiments on a random field method that decouples the interactions, and propose to extend our research to 16O and to p-shell nuclei. Spin-orbit terms present special problems as well, because the implied gradient operators must be applied repeatedly. We propose to treat them in first order only, for now, and to calculate the result in three- and four-body nuclei. We propose a new Monte Carlo method for computing the amplitude of deuteron components in trial functions for heavier nuclei (here, specifically for 6Li). The method is an extension of that used for off-diagonal matrix elements in quantum fluids
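
    For readers unfamiliar with the method, a variational Monte Carlo calculation in its simplest form is sketched below: Metropolis sampling of |psi|^2 for a one-dimensional harmonic oscillator trial function (with hbar = m = omega = 1). The nuclear problem adds the spin/isospin and correlation structure discussed above, far beyond this toy.

        import numpy as np

        def local_energy(x, alpha):
            # For psi = exp(-alpha x^2):  E_L = alpha + x^2 (1/2 - 2 alpha^2)
            return alpha + x**2 * (0.5 - 2.0 * alpha**2)

        rng = np.random.default_rng(6)
        for alpha in (0.3, 0.5, 0.7):
            x, energies = 0.0, []
            for step in range(20000):
                trial = x + rng.uniform(-1.0, 1.0)
                # Metropolis acceptance with ratio |psi(trial)/psi(x)|^2
                if rng.random() < np.exp(-2.0 * alpha * (trial**2 - x**2)):
                    x = trial
                if step > 2000:                   # discard equilibration steps
                    energies.append(local_energy(x, alpha))
            print(f"alpha={alpha}: <E> ~ {np.mean(energies):.3f}")  # exact minimum 0.5 at alpha=0.5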

  10. Service-oriented computing : State of the art and research challenges

    NARCIS (Netherlands)

    Papazoglou, Michael P.; Traverso, Paolo; Dustdar, Schahram; Leymann, Frank

    2007-01-01

    Service-oriented computing promotes the idea of assembling application components into a network of services that can be loosely coupled to create flexible, dynamic business processes and agile applications that span organizations and computing platforms. An SOC research road map provides a context

  11. 3 Ways that Web-Based Computing Will Change Colleges--And Challenge Them

    Science.gov (United States)

    Young, Jeffrey R.

    2008-01-01

    Cloud computing, one of the latest technology buzzwords, is so hard to explain that Google drove a bus from campus to campus to walk students through the company's vision of it. After students sat through a demo at computers set up nearby, they boarded the bus and got free T-shirts. The bus only stopped at colleges that had already agreed to hand…

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transfer¬ring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  13. Perspectives on Games, Computers, and Mental Health: Questions about Paradoxes, Evidences, and Challenges

    OpenAIRE

    Desseilles, Martin

    2016-01-01

    In the field of mental health, games and computerized games present questions about paradoxes, evidence, and challenges. This perspective article offers perspectives and personal opinions about these questions, evidence, and challenges, with the objective of presenting several ideas and issues in this rapidly developing field. First, games raise some questions in the sense of the paradox between a game and an issue, as well as the paradox of using an amusing game to treat a serious pathology. ...

  14. Computer-aided detection systems to improve lung cancer early diagnosis: state-of-the-art and challenges

    International Nuclear Information System (INIS)

    Traverso, A; Lopez Torres, E; Cerello, P; Fantacci, M E

    2017-01-01

    Lung cancer is one of the most lethal types of cancer, largely because it is rarely diagnosed early enough. The detection of pulmonary nodules, potential lung cancers, in Computed Tomography scans is a very challenging and time-consuming task for radiologists. To support radiologists, researchers have developed Computer-Aided Diagnosis (CAD) systems for the automated detection of pulmonary nodules in chest Computed Tomography scans. Despite the high level of technological development and the proven benefits for overall detection performance, the use of Computer-Aided Diagnosis in clinical practice is far from being a common procedure. In this paper we investigate the causes underlying this discrepancy and present a solution to tackle it: the M5L Web- and Cloud-based on-demand Computer-Aided Diagnosis. In addition, we show how combining traditional image-processing techniques with state-of-the-art classification algorithms allows us to build a system whose performance could be far better than that of any Computer-Aided Diagnosis developed so far. This outcome opens the possibility of using CAD as clinical decision support for radiologists. (paper)

  15. Computer-aided detection systems to improve lung cancer early diagnosis: state-of-the-art and challenges

    Science.gov (United States)

    Traverso, A.; Lopez Torres, E.; Fantacci, M. E.; Cerello, P.

    2017-05-01

    Lung cancer is one of the most lethal types of cancer, largely because it is rarely diagnosed early enough. The detection of pulmonary nodules, potential lung cancers, in Computed Tomography scans is a very challenging and time-consuming task for radiologists. To support radiologists, researchers have developed Computer-Aided Diagnosis (CAD) systems for the automated detection of pulmonary nodules in chest Computed Tomography scans. Despite the high level of technological development and the proven benefits for overall detection performance, the use of Computer-Aided Diagnosis in clinical practice is far from being a common procedure. In this paper we investigate the causes underlying this discrepancy and present a solution to tackle it: the M5L Web- and Cloud-based on-demand Computer-Aided Diagnosis. In addition, we show how combining traditional image-processing techniques with state-of-the-art classification algorithms allows us to build a system whose performance could be far better than that of any Computer-Aided Diagnosis developed so far. This outcome opens the possibility of using CAD as clinical decision support for radiologists.
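
    To make the two-stage structure these records describe concrete, the sketch below pairs classical image processing (thresholding plus connected components) to propose nodule candidates with a trained classifier to prune false positives, using scikit-image and scikit-learn. The thresholds, features, synthetic data, and classifier choice are illustrative assumptions, not the M5L algorithms.

        import numpy as np
        from skimage import measure
        from sklearn.ensemble import RandomForestClassifier

        # Stage 1: candidate detection on a (synthetic) CT slice via intensity
        # thresholding and connected components. A real CAD system works in 3D
        # with calibrated HU thresholds; the values here are illustrative.
        rng = np.random.default_rng(0)
        slice_hu = rng.normal(-800, 50, size=(128, 128))   # lung parenchyma
        slice_hu[60:66, 60:66] = -100                      # fake nodule-like blob

        mask = slice_hu > -400                             # crude density threshold
        labels = measure.label(mask)
        candidates = [r for r in measure.regionprops(labels, intensity_image=slice_hu)
                      if 4 <= r.area <= 200]

        # Stage 2: false-positive reduction with a classifier on simple
        # shape/intensity features (area, eccentricity, mean HU). The training
        # data is faked here; a real system would use annotated scans.
        X_train = rng.normal(size=(100, 3))
        y_train = rng.integers(0, 2, size=100)
        clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

        X_cand = np.array([[r.area, r.eccentricity, r.mean_intensity] for r in candidates])
        if len(X_cand):
            scores = clf.predict_proba(X_cand)[:, 1]
            for r, s in zip(candidates, scores):
                centre = tuple(int(c) for c in r.centroid)
                print(f"candidate at {centre}: p(nodule) = {s:.2f}")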

  16. Building a Grad Nation: Progress and Challenge in Ending the High School Dropout Epidemic. Annual Update, 2012

    Science.gov (United States)

    Balfanz, Robert; Bridgeland, John M.; Bruce, Mary; Fox, Joanna Hornig

    2012-01-01

    In 2010, the authors shared a Civic Marshall Plan to create a Grad Nation. Through that first report and subsequent update, they saw hopeful signs of progress in boosting high school graduation rates in communities across the country. This 2012 report shows that high school graduation rates continue to improve nationally and across many states and…

  17. Challenge for knowledge information processing systems (preliminary report on Fifth Generation Computer Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Moto-oka, T

    1982-01-01

    The author explains the reasons, aims, and strategies behind the Fifth Generation Computer Project in Japan, which aims to introduce a radically new breed of computer by 1990. The article outlines the economic and social reasons for the project and describes the impacts and effects these computers are expected to have. It highlights the areas of technology that will form the content of the research and development: VLSI technology, speech and image understanding systems, artificial intelligence, and advanced architecture design. Finally, a research schedule is given, aiming for completion of the project by 1990.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource-constrained on all tiers of the computing system in 2012 and are working to ensure that the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then, in 2012, on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure, where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  19. Cloud Computing and its Challenges and Benefits in the Bank System

    Directory of Open Access Journals (Sweden)

    Bogdan NEDELCU

    2015-07-01

    The purpose of this article is to highlight the current situation of Cloud Computing systems. There is a tendency for enterprises and banks to seek out such systems, so the article tries to answer the question: "Is Cloud Computing safe?" Answering this question requires an analysis of the security system (its strengths and weaknesses), accompanied by arguments for and against this trend and suggestions for improvement that can increase customers' confidence in the future.

  20. Qualitative Computing and Qualitative Research: Addressing the Challenges of Technology and Globalization

    Directory of Open Access Journals (Sweden)

    César A. Cisneros Puebla

    2012-05-01

    Qualitative computing has been part of our lives for thirty years. Today, we urgently call for an evaluation of its international impact on qualitative research. Evaluating the international impact of qualitative research and qualitative computing requires a consideration of the vast amount of qualitative research produced over the last decades, as well as thoughtfulness about the uneven and unequal way in which qualitative research and qualitative computing are present in different fields of study and geographical regions. Understanding the international impact of qualitative computing also requires an evaluation of the digital divide and the huge differences between center and peripheries. The international impact of qualitative research, and in particular of qualitative computing, is the question at the heart of this array of selected papers from the "Qualitative Computing: Diverse Worlds and Research Practices" conference. In this article, we introduce the reader to the goals, motivation, and atmosphere of the conference, held in Istanbul, Turkey, in 2011. The dialogue generated there is still in the air, and this introduction is a call to spread that voice. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1202285