WorldWideScience

Sample records for computing challenges progress

  1. Progress and Challenges in High Performance Computer Technology

    Institute of Scientific and Technical Information of China (English)

    Xue-Jun Yang; Yong Dou; Qing-Feng Hu

    2006-01-01

    High performance computers provide strategic computing power for the construction of the national economy and defense, and have become one of the symbols of a country's overall strength. Over the past 30 years, with government support, high performance computer technology has developed rapidly: computing performance has increased nearly 3 million times and the number of processors has grown by a factor of more than a million. To solve the critical issues of parallel efficiency and scalability, researchers have pursued extensive theoretical studies and technical innovations. This paper briefly reviews the course of building high performance computer systems in China and abroad, and summarizes the significant breakthroughs in international high performance computer technology. We also survey China's progress in parallel computer architecture, parallel operating systems and resource management, parallel compilers and performance optimization, parallel programming environments, and network computing. Finally, we examine the challenging issues of the "memory wall", system scalability and the "power wall", and discuss high productivity computing, the trend in building next-generation high performance computers.
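
    As a quick back-of-envelope check of the growth figures quoted above (an illustration added here, not part of the original record), a 3-million-fold performance increase over roughly 30 years corresponds to about a 1.64x compound annual growth factor, i.e. a doubling time of roughly 1.4 years:

        import math

        # Growth figures quoted in the abstract: ~3,000,000x performance
        # increase over ~30 years.
        perf_gain = 3_000_000
        years = 30

        annual = perf_gain ** (1 / years)                    # compound annual growth factor
        doubling_years = math.log(2) / math.log(annual)
        print(f"annual growth factor: {annual:.2f}x")        # ~1.64x per year
        print(f"doubling time: {doubling_years:.1f} years")  # ~1.4 years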

  2. Towards brain-computer music interfaces: progress and challenges

    OpenAIRE

    Miranda, E. R.; Durrant, Simon; Anders, T.

    2008-01-01

    Brain-Computer Music Interface (BCMI) is a new research area that is emerging at the crossroads of neurobiology, engineering sciences and music. This research involves three major challenging problems: the extraction of meaningful control information from signals emanating directly from the brain, the design of generative music techniques that respond to such information, and the training of subjects to use the system. We have implemented a proof-of-concept BCMI system that is able to use ...

  3. Ubiquitous Wireless Computing: Current Research Progress, Challenging, and Future Directions

    OpenAIRE

    Elyas, Palantei

    2014-01-01

    The aggressive research activities and extensive studies on ubiquitous mobile computing carried out during the last two decades have yielded tremendous outcomes that apply across broad areas of modern life. In the near future, computing technology may well emerge as the dominant method of connecting objects to the global ICT infrastructure, the internet. This talk mainly discusses several R&D achievements performed during the last five yea...

  4. Robust computing with nano-scale devices progresses and challenges

    CERN Document Server

    Huang, Chao

    2010-01-01

    The focus of this book is on various issues of robust nano-computing and defect-tolerant design for nanotechnology at different design abstraction levels. It addresses both redundancy- and configuration-based methods as well as fault-detection techniques.

  5. Computational chemistry for graphene-based energy applications: progress and challenges.

    Science.gov (United States)

    Hughes, Zak E; Walsh, Tiffany R

    2015-04-28

    Research in graphene-based energy materials is a rapidly growing area. Many graphene-based energy applications involve interfacial processes. To enable advances in the design of these energy materials, such that their operation, economy, efficiency and durability are at least comparable with fossil-fuel based alternatives, connections between the molecular-scale structure and function of these interfaces are needed. While it is experimentally challenging to resolve this interfacial structure, molecular simulation and computational chemistry can help bridge these gaps. In this Review, we summarise recent progress in the application of computational chemistry to graphene-based materials for fuel cells, batteries, photovoltaics and supercapacitors. We also outline both the bright prospects and emerging challenges these techniques face for application to graphene-based energy materials in the future.

  6. Computational chemistry for graphene-based energy applications: progress and challenges

    Science.gov (United States)

    Hughes, Zak E.; Walsh, Tiffany R.

    2015-04-01

    Research in graphene-based energy materials is a rapidly growing area. Many graphene-based energy applications involve interfacial processes. To enable advances in the design of these energy materials, such that their operation, economy, efficiency and durability are at least comparable with fossil-fuel based alternatives, connections between the molecular-scale structure and function of these interfaces are needed. While it is experimentally challenging to resolve this interfacial structure, molecular simulation and computational chemistry can help bridge these gaps. In this Review, we summarise recent progress in the application of computational chemistry to graphene-based materials for fuel cells, batteries, photovoltaics and supercapacitors. We also outline both the bright prospects and emerging challenges these techniques face for application to graphene-based energy materials in the future.

  7. Recent progress and modern challenges in applied mathematics, modeling and computational science

    CERN Document Server

    Makarov, Roman; Belair, Jacques

    2017-01-01

    This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science. The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modeling, and computational sciences, and is ideal for interdisciplinary collaborations.

  8. Recent progress and challenges in exploiting graphics processors in computational fluid dynamics

    CERN Document Server

    Niemeyer, Kyle E

    2014-01-01

    The progress made in accelerating simulations of fluid flow using GPUs, and the challenges that remain, are surveyed. The review first provides an introduction to GPU computing and programming, and discusses various considerations for improved performance. Case studies comparing the performance of CPU- and GPU-based solvers for the Laplace and incompressible Navier-Stokes equations are performed in order to demonstrate the potential improvement even with simple codes. Recent efforts to accelerate CFD simulations using GPUs are reviewed for laminar, turbulent, and reactive flow solvers. Also, GPU implementations of the lattice Boltzmann method are reviewed. Finally, recommendations for implementing CFD codes on GPUs are given and remaining challenges are discussed, such as the need to develop new strategies and redesign algorithms to enable GPU acceleration.
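
    The Laplace-equation case study mentioned above is typically a Jacobi-style stencil iteration. The NumPy sketch below (an illustration under assumed grid size and tolerance, not code from the review) shows why such solvers are natural GPU targets: every interior point is updated independently of all the others.

        import numpy as np

        # Minimal Jacobi iteration for the 2-D Laplace equation. The 4-point
        # stencil update is data-parallel over grid points, which is exactly
        # the structure that maps well onto GPUs.
        def jacobi_laplace(u, tol=1e-5, max_iter=10_000):
            for it in range(max_iter):
                u_new = u.copy()
                u_new[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                            u[1:-1, :-2] + u[1:-1, 2:])
                if np.max(np.abs(u_new - u)) < tol:
                    return u_new, it
                u = u_new
            return u, max_iter

        u0 = np.zeros((64, 64))
        u0[0, :] = 1.0                     # fixed boundary value on one edge
        solution, iters = jacobi_laplace(u0)
        print(f"stopped after {iters} iterations")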

  9. Quantum computing with atomic qubits and Rydberg interactions: Progress and challenges

    CERN Document Server

    Saffman, Mark

    2016-01-01

    We present a review of quantum computation with neutral atom qubits. After an overview of architectural options we examine Rydberg mediated gate protocols and fidelity for two- and multi-qubit interactions. We conclude with a summary of the current status and give an outlook for future progress.

  10. Quantum computing with atomic qubits and Rydberg interactions: progress and challenges

    Science.gov (United States)

    Saffman, M.

    2016-10-01

    We present a review of quantum computation with neutral atom qubits. After an overview of architectural options and approaches to preparing large qubit arrays we examine Rydberg mediated gate protocols and fidelity for two- and multi-qubit interactions. Quantum simulation and Rydberg dressing are alternatives to circuit based quantum computing for exploring many body quantum dynamics. We review the properties of the dressing interaction and provide a quantitative figure of merit for the complexity of the coherent dynamics that can be accessed with dressing. We conclude with a summary of the current status and an outlook for future progress.
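
    To make the blockade mechanism behind these gate protocols concrete, the sketch below estimates the blockade shift B = C6/R^6 and the intrinsic blockade error, which scales roughly as (Omega/B)^2 in standard analyses. The numerical values are illustrative magnitudes for high-lying alkali Rydberg states, not figures taken from this paper.

        import math

        C6_over_h = 56e12 * 1e-36      # van der Waals coefficient in Hz*m^6
                                       # (magnitude typical of high-n Rb states)
        R = 5e-6                       # interatomic spacing, m
        Omega = 2 * math.pi * 1e6      # two-photon Rabi frequency, rad/s

        B = 2 * math.pi * C6_over_h / R**6     # blockade shift, rad/s
        print(f"blockade shift / Rabi frequency: {B / Omega:.0f}")

        # In the blockaded regime B >> Omega, the error from imperfect
        # blockade scales roughly as (Omega / B)^2.
        print(f"approximate blockade error: {(Omega / B)**2:.1e}")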

  11. Photons, Photosynthesis, and High-Performance Computing: Challenges, Progress, and Promise of Modeling Metabolism in Green Algae

    Energy Technology Data Exchange (ETDEWEB)

    Chang, C. H.; Graf, P.; Alber, D. M.; Kim, K.; Murray, G.; Posewitz, M.; Seibert, M.

    2008-01-01

    The complexity associated with biological metabolism considered at a kinetic level presents a challenge to quantitative modeling. In particular, the relatively sparse knowledge of parameters for enzymes with known kinetic responses is problematic. The possible space of these parameters is high-dimensional, and sampling such a space entails a combinatorial explosion of possible dynamic states. However, with sufficient quantitative transcriptomics, proteomics, and metabolomics data at hand, these challenges could be met by high-performance software with sampling, fitting, and optimization capabilities. With this in mind, we present the High-Performance Systems Biology Toolkit HiPer SBTK, an evolving software package to simulate, fit, and optimize metabolite concentrations and fluxes within the space of rate and binding parameters associated with detailed enzyme kinetic models. We present our chosen modeling paradigm for the formulation of metabolic pathway models, the means to address the challenge of representing such models in a precise and persistent fashion using the standardized Systems Biology Markup Language, and our second-generation model of H2-associated Chlamydomonas metabolism. Processing of such models for hierarchically parallelized simulation and optimization, job specification by the user through a GUI, software capabilities and initial scaling data, and the mapping of the computation to biological questions are also discussed. Moreover, we present near-term future software and model development goals.
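
    The combinatorial sampling problem described above can be felt even in a toy setting. The sketch below draws random kinetic parameters for a single Michaelis-Menten reaction and integrates it with a clamped forward-Euler step; the parameter ranges are hypothetical and nothing here comes from HiPer SBTK itself.

        import random

        # Toy Michaelis-Menten reaction S -> P under randomly sampled
        # kinetic parameters (hypothetical ranges, for illustration only).
        def simulate(vmax, km, s0=1.0, dt=1e-3, t_end=2.0):
            s = s0
            for _ in range(int(t_end / dt)):
                s = max(0.0, s - dt * vmax * s / (km + s))  # clamped Euler step
            return s                       # substrate remaining at t_end

        random.seed(0)
        samples = [(random.uniform(0.1, 10.0),   # vmax
                    random.uniform(0.01, 1.0))   # km
                   for _ in range(500)]
        final = [simulate(vmax, km) for vmax, km in samples]
        print(f"substrate remaining: min={min(final):.3g}, max={max(final):.3g}")
        # Even one reaction spans orders of magnitude in outcome; with dozens
        # of coupled enzymes the sampled space explodes combinatorially, which
        # is what motivates parallel sampling, fitting, and optimization.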

  12. Silicon spintronics: Progress and challenges

    Science.gov (United States)

    Sverdlov, Viktor; Selberherr, Siegfried

    2015-07-01

    Electron spin attracts much attention as an alternative to the electron charge degree of freedom for low-power reprogrammable logic and non-volatile memory applications. Silicon appears to be the perfect material for spin-driven applications. Recent progress and challenges regarding spin-based devices are reviewed. An order of magnitude enhancement of the electron spin lifetime in silicon thin films by shear strain is predicted and its impact on spin transport in SpinFETs is discussed. A relatively weak coupling between spin and effective electric field in silicon allows magnetoresistance modulation at room temperature, although only for long channel lengths. Due to tunneling magnetoresistance and spin transfer torque effects, a much stronger coupling between the spin (magnetization) orientation and charge current is achieved in magnetic tunnel junctions. Magnetic random access memory (MRAM) built on magnetic tunnel junctions is CMOS compatible and possesses all properties needed for future universal memory. Designs of spin-based non-volatile MRAM cells are presented. By means of micromagnetic simulations it is demonstrated that a substantial reduction of the switching time can be achieved. Finally, it is shown that any two arbitrary memory cells from an MRAM array can be used to perform a logic operation. Thus, an intrinsic non-volatile logic-in-memory architecture can be realized.

  13. Silicon spintronics: Progress and challenges

    Energy Technology Data Exchange (ETDEWEB)

    Sverdlov, Viktor; Selberherr, Siegfried, E-mail: Selberherr@TUWien.ac.at

    2015-07-14

    Electron spin attracts much attention as an alternative to the electron charge degree of freedom for low-power reprogrammable logic and non-volatile memory applications. Silicon appears to be the perfect material for spin-driven applications. Recent progress and challenges regarding spin-based devices are reviewed. An order of magnitude enhancement of the electron spin lifetime in silicon thin films by shear strain is predicted and its impact on spin transport in SpinFETs is discussed. A relatively weak coupling between spin and effective electric field in silicon allows magnetoresistance modulation at room temperature, although only for long channel lengths. Due to tunneling magnetoresistance and spin transfer torque effects, a much stronger coupling between the spin (magnetization) orientation and charge current is achieved in magnetic tunnel junctions. Magnetic random access memory (MRAM) built on magnetic tunnel junctions is CMOS compatible and possesses all properties needed for future universal memory. Designs of spin-based non-volatile MRAM cells are presented. By means of micromagnetic simulations it is demonstrated that a substantial reduction of the switching time can be achieved. Finally, it is shown that any two arbitrary memory cells from an MRAM array can be used to perform a logic operation. Thus, an intrinsic non-volatile logic-in-memory architecture can be realized.

  14. Progress and challenges in the computational prediction of gene function using networks [v1; ref status: indexed, http://f1000r.es/SqmJUM]

    Directory of Open Access Journals (Sweden)

    Paul Pavlidis

    2012-09-01

    In this opinion piece, we attempt to unify recent arguments we have made that serious confounds affect the use of network data to predict and characterize gene function. The development of computational approaches to determine gene function is a major strand of computational genomics research. However, progress beyond using BLAST to transfer annotations has been surprisingly slow. We have previously argued that a large part of the reported success in using "guilt by association" in network data is due to the tendency of methods to simply assign new functions to already well-annotated genes. While such predictions will tend to be correct, they are generic; it is true, but not very helpful, that a gene with many functions is more likely to have any function. We have also presented evidence that much of the remaining performance in cross-validation cannot be usefully generalized to new predictions, making progressive improvement in analysis difficult to engineer. Here we summarize our findings about how these problems will affect network analysis, discuss some ongoing responses within the field to these issues, and consolidate some recommendations and speculation, which we hope will modestly increase the reliability and specificity of gene function prediction.
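
    The "generic prediction" confound described above is easy to reproduce. In the toy sketch below (a random graph invented for illustration, not the authors' data), genes are scored for a hypothetical new function by summing their neighbours' existing annotation counts; already well-annotated hub genes then dominate the ranking no matter which function is being predicted.

        import random

        random.seed(1)
        n = 200
        # most genes get degree 2; a minority are well-connected hubs
        degree = [random.choice([2, 2, 2, 20]) for _ in range(n)]
        edges = []
        for g, d in enumerate(degree):
            for _ in range(d):
                edges.append((g, random.randrange(n)))

        # hubs are also already well annotated (5 known functions vs 1)
        annotations = {g: (5 if degree[g] == 20 else 1) for g in range(n)}

        # "guilt by association": score each gene by its neighbours' annotations
        score = [0.0] * n
        for a, b in edges:
            score[a] += annotations[b]
            score[b] += annotations[a]

        top = sorted(range(n), key=lambda g: -score[g])[:10]
        print("hub fraction in top 10:", sum(degree[g] == 20 for g in top) / 10)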

  15. Big Computing in Astronomy: Perspectives and Challenges

    Science.gov (United States)

    Pankratius, Victor

    2014-06-01

    Hardware progress in recent years has led to astronomical instruments gathering large volumes of data. In radio astronomy for instance, the current generation of antenna arrays produces data at Tbits per second, and forthcoming instruments will expand these rates much further. As instruments are increasingly becoming software-based, astronomers will get more exposed to computer science. This talk therefore outlines key challenges that arise at the intersection of computer science and astronomy and presents perspectives on how both communities can collaborate to overcome these challenges. Major problems are emerging due to increases in data rates that are much larger than in storage and transmission capacity, as well as humans being cognitively overwhelmed when attempting to opportunistically scan through Big Data. As a consequence, the generation of scientific insight will become more dependent on automation and algorithmic instrument control. Intelligent data reduction will have to be considered across the entire acquisition pipeline. In this context, the presentation will outline the enabling role of machine learning and parallel computing. Bio: Victor Pankratius is a computer scientist who joined MIT Haystack Observatory following his passion for astronomy. He is currently leading efforts to advance astronomy through cutting-edge computer science and parallel computing. Victor is also involved in projects such as ALMA Phasing to enhance the ALMA Observatory with Very-Long Baseline Interferometry capabilities, the Event Horizon Telescope, as well as in the Radio Array of Portable Interferometric Detectors (RAPID) to create an analysis environment using parallel computing in the cloud. He has an extensive track record of research in parallel multicore systems and software engineering, with contributions to auto-tuning, debugging, and empirical experiments studying programmers. Victor has worked with major industry partners such as Intel, Sun Labs, and Oracle. He holds ...
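
    The storage pressure described in this talk follows from one line of arithmetic (the 1 Tbit/s figure below is simply the rate quoted above, used for illustration):

        # An instrument producing 1 Tbit/s cannot simply archive its output.
        bytes_per_s = 1e12 / 8                 # 1 Tbit/s in bytes/s
        bytes_per_day = bytes_per_s * 86_400
        print(f"raw output: {bytes_per_day / 1e15:.0f} PB/day")   # ~11 PB/day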

  16. Progress and Challenge of Artificial Intelligence

    Institute of Scientific and Technical Information of China (English)

    Zhong-Zhi Shi; Nan-Ning Zheng

    2006-01-01

    Artificial Intelligence (AI) is generally considered to be a subfield of computer science concerned with the simulation, extension and expansion of human intelligence. Artificial intelligence has enjoyed tremendous success over the last fifty years. In this paper we focus only on visual perception, granular computing, agent computing, and the semantic grid. Human-level intelligence is the long-term goal of artificial intelligence. We should pursue joint research on the basic theory and technology of intelligence across brain science, cognitive science, artificial intelligence and other disciplines. A new cross-disciplinary intelligence science is undergoing rapid development. Future challenges are given in the final section.

  17. Challenges and reflections on exascale computing

    Institute of Scientific and Technical Information of China (English)

    Yang Xuejun

    2014-01-01

    This paper introduces the development of exascale (10^18 operations per second) computing. Though exascale computing is a hot research direction worldwide, we face many challenges in the areas of the memory wall, communication wall, reliability wall, power wall, and the scalability of parallel computing. In light of these challenges, some thoughts and strategies are proposed.
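
    The power wall in particular reduces to a single division (the 20 MW facility budget below is a commonly cited planning figure, not a number from this paper):

        flops = 1e18      # operations per second at exascale
        power = 20e6      # assumed facility power budget, watts
        # At 20 MW, an exaflop machine may spend at most ~20 pJ per operation.
        print(f"energy budget: {power / flops * 1e12:.0f} pJ per operation")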

  18. Ubiquitous Computing: Potentials and Challenges

    OpenAIRE

    Sen, Jaydip

    2010-01-01

    The world is witnessing the birth of a revolutionary computing paradigm that promises to have a profound effect on the way we interact with computers, devices, physical spaces, and other people. This new technology, called ubiquitous computing, envisions a world where embedded processors, computers, sensors, and digital communications are inexpensive commodities that are available everywhere. This paper presents a comprehensive discussion on the central trends in ubiquitous computing consider...

  19. Granular computing: perspectives and challenges.

    Science.gov (United States)

    Yao, JingTao; Vasilakos, Athanasios V; Pedrycz, Witold

    2013-12-01

    Granular computing, as a new and rapidly growing paradigm of information processing, has attracted many researchers and practitioners. Granular computing is an umbrella term to cover any theories, methodologies, techniques, and tools that make use of information granules in complex problem solving. The aim of this paper is to review foundations and schools of research and to elaborate on current developments in granular computing research. We first review some basic notions of granular computing. Classification and descriptions of various schools of research in granular computing are given. We also present and identify some research directions in granular computing.

  1. Ubiquitous Computing: Potentials and Challenges

    CERN Document Server

    Sen, Jaydip

    2010-01-01

    The world is witnessing the birth of a revolutionary computing paradigm that promises to have a profound effect on the way we interact with computers, devices, physical spaces, and other people. This new technology, called ubiquitous computing, envisions a world where embedded processors, computers, sensors, and digital communications are inexpensive commodities that are available everywhere. Ubiquitous computing will surround users with a comfortable and convenient information environment that merges physical and computational infrastructures into an integrated habitat. This habitat will feature a proliferation of hundreds or thousands of computing devices and sensors that will provide new functionality, offer specialized services, and boost productivity and interaction. This paper presents a comprehensive discussion on the central trends in ubiquitous computing, considering them from technical, social and economic perspectives. It clearly identifies different application areas and sectors that will benefit f...

  2. Progress in theoretical quantum computing

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Computing is perhaps one of the most distinguished features that differentiate humans from animals. Aside from counting with fingers and toes, the abacus was the first great computing machine of human civilization.

  3. Research Challenges for Enterprise Cloud Computing

    CERN Document Server

    Khajeh-Hosseini, Ali; Sriram, Ilango

    2010-01-01

    Cloud computing represents a shift away from computing as a product that is purchased, to computing as a service that is delivered to consumers over the internet from large-scale data centers - or "clouds". This paper discusses some of the research challenges for cloud computing from an enterprise or organizational perspective, and puts them in context by reviewing the existing body of literature in cloud computing. Various research challenges relating to the following topics are discussed: the organizational changes brought about by cloud computing; the economic and organizational implications of its utility billing model; the security, legal and privacy issues that cloud computing raises. It is important to highlight these research challenges because cloud computing is not simply about a technological improvement of data centers but a fundamental change in how IT is provisioned and used. This type of research has the potential to influence wider adoption of cloud computing in enterprise, and in the consumer...

  4. Achieving efficient RNAi therapy: progress and challenges

    Directory of Open Access Journals (Sweden)

    Kun Gao

    2013-07-01

    RNA interference (RNAi) has been harnessed to produce a new class of drugs for treatment of various diseases. This review summarizes the most important parameters that govern the silencing efficiency and duration of the RNAi effect, such as small interfering RNA (siRNA) stability and modification, the type of delivery system, and particle sizing methods. It also discusses the predominant barriers to siRNA delivery, such as off-target effects, and introduces internalization, endosomal escape and mathematical modeling in RNAi therapy, as well as combinatorial RNAi. At present, effective delivery of RNAi therapeutics in vivo remains a challenge although significant progress has been made in this field.

  5. EUV lithography: progress, challenges, and outlook

    Science.gov (United States)

    Wurm, S.

    2014-10-01

    Extreme Ultraviolet Lithography (EUVL) has been in the making for more than a quarter century. The first EUVL production tools have been delivered over the past year and chip manufacturers and suppliers are maturing the technology in pilot line mode to prepare for high volume manufacturing (HVM). While excellent progress has been made in many technical and business areas to prepare EUVL for HVM introduction, there are still critical technical and business challenges to be addressed before the industry will be able to use EUVL in HVM.

  6. Progress in Computational Complexity Theory

    Institute of Scientific and Technical Information of China (English)

    Jin-Yi Cai; Hong Zhu

    2005-01-01

    We briefly survey a number of important recent achievements in Theoretical Computer Science (TCS), especially Computational Complexity Theory. We will discuss the PCP Theorem and its implications for the inapproximability of combinatorial optimization problems; space-bounded computations, especially the deterministic logspace algorithm for the undirected graph connectivity problem; the deterministic polynomial-time primality test; lattice complexity and worst-case to average-case reductions; pseudorandomness and extractor constructions; and Valiant's new theory of holographic algorithms and reductions.

  7. Chips challenging champions games, computers and artificial intelligence

    CERN Document Server

    Schaeffer, J

    2002-01-01

    One of the earliest dreams of the fledgling field of artificial intelligence (AI) was to build computer programs that could play games as well as or better than the best human players. Despite early optimism in the field, the challenge proved to be surprisingly difficult. However, the 1990s saw amazing progress. Computers are now better than humans in checkers, Othello and Scrabble; are at least as good as the best humans in backgammon and chess; and are rapidly improving at hex, go, poker, and shogi. This book documents the progress made in computers playing games and puzzles. The book is the
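
    Most of the game-playing progress surveyed above rests on minimax search with alpha-beta pruning. The sketch below is the generic textbook version applied to a toy take-away game (players alternately remove 1 or 2 tokens; taking the last token wins); it is not code from any particular champion program.

        def moves(s):
            return [m for m in (1, 2) if m <= s]

        def alphabeta(s, alpha, beta, maximizing):
            if not moves(s):
                # no tokens left: the previous player took the last one and won
                return -1 if maximizing else +1
            if maximizing:
                value = -2
                for m in moves(s):
                    value = max(value, alphabeta(s - m, alpha, beta, False))
                    alpha = max(alpha, value)
                    if alpha >= beta:
                        break          # beta cutoff: opponent avoids this line
                return value
            else:
                value = +2
                for m in moves(s):
                    value = min(value, alphabeta(s - m, alpha, beta, True))
                    beta = min(beta, value)
                    if beta <= alpha:
                        break          # alpha cutoff
                return value

        # 10 tokens is not a multiple of 3, so the first player should win (+1).
        print(alphabeta(10, -2, +2, True))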

  8. Computational medicine tools and challenges

    CERN Document Server

    Trajanoski, Zlatko

    2014-01-01

    This book covers a number of contemporary computational medicine topics spanning scales from molecular to cell to organ and organism, presenting a state-of-the-art IT infrastructure, and reviewing four hierarchical scales.

  9. Challenging Issues and Limitations of Mobile Computing

    Directory of Open Access Journals (Sweden)

    Kusuma Kumari B.M

    2014-02-01

    Mobile computing is becoming increasingly important due to the rise in the number of portable computers and the desire to have continuous network connectivity to the Internet irrespective of the physical location of the node. Mobile computing has fast become an important new paradigm in today's world of networked computing systems. Ranging from wireless laptops to cellular phones and WiFi/Bluetooth-enabled PDAs to wireless sensor networks, mobile computing has become ubiquitous in its impact on people's daily lives. The goal of this paper is to point out some of the limitations, constraints, mobility issues, challenges and applications of mobile computing.

  10. CHALLENGING ISSUES AND LIMITATIONS OF MOBILE COMPUTING

    Directory of Open Access Journals (Sweden)

    Kusuma Kumari B.M

    2015-11-01

    Mobile computing is becoming increasingly important due to the rise in the number of portable computers and the desire to have continuous network connectivity to the Internet irrespective of the physical location of the node. Mobile computing has fast become an important new paradigm in today's world of networked computing systems. Ranging from wireless laptops to cellular phones and WiFi/Bluetooth-enabled PDAs to wireless sensor networks, mobile computing has become ubiquitous in its impact on people's daily lives. The goal of this paper is to point out some of the limitations, constraints, mobility issues, challenges and applications of mobile computing.

  11. Beyond Moore computing research challenge workshop report.

    Energy Technology Data Exchange (ETDEWEB)

    Huey, Mark C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aidun, John Bahram [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-10-01

    We summarize the presentations and breakout session discussions from the in-house workshop that was held on 11 July 2013 to acquaint a wider group of Sandians with the Beyond Moore Computing research challenge.

  12. Challenger of the Computer Market

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    DESPITE having conceived, designed, planned and organised the '98 Chengdu Computer Festival, when it came time to make the opening address, Hu Xinling's hands and feet were icy cold. It was the first time she had spoken in front of so many people and she couldn't help wondering if she could go through with it. Even with two microphones her voice was still low, but she finally completed the task of hosting and spent the whole next day closeted at home with a good book.

  13. Challenges and solutions in enterprise computing

    NARCIS (Netherlands)

    van Sinderen, Marten J.

    2008-01-01

    The emergence of the networked enterprise has a profound effect on enterprise computing. This introduction discusses some important challenges in enterprise computing, which are the result of the mentioned networking trend, and positions the articles of this special issue with respect to these chall

  14. Biomimetic photo-actuation: progress and challenges

    Science.gov (United States)

    Dicker, Michael P. M.; Weaver, Paul M.; Rossiter, Jonathan M.; Bond, Ian P.; Faul, Charl F. J.

    2016-04-01

    Photo-actuation, such as that observed in the reversible sun-tracking movements of heliotropic plants, is produced by a complex, yet elegant series of processes. In the heliotropic leaf movements of the Cornish Mallow, photo-actuation involves the generation, transport and manipulation of chemical signals from a distributed network of sensors in the leaf veins to a specialized osmosis-driven actuation region in the leaf stem. It is theorized that such an arrangement is both efficient in terms of materials use and operational energy conversion, as well as being highly robust. We concern ourselves with understanding and mimicking these light-driven, chemically controlled actuating systems with the aim of generating intelligent structures which share the properties of efficiency and robustness that are so important to survival in Nature. In this work we present recent progress in mimicking these photo-actuating systems through remote light exposure of a metastable state photoacid and the resulting signal and energy transfer through solution to a pH-responsive hydrogel actuator. Reversible actuation strains of 20% were achieved from this arrangement, with modelling then employed to reveal the critical influence that hydrogel pKa has on this result. Although the strong actuation achieved highlights the progress that has been made in replicating the principles of biomimetic photo-actuation, challenges such as photoacid degradation were also revealed. It is anticipated that current work can directly lead to the development of high-performance and low-cost solar trackers for increased photovoltaic energy capture and to the creation of new types of intelligent structures employing chemical control systems.
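
    The pKa sensitivity that the modelling highlights can be seen in the Henderson-Hasselbalch relation: the ionized fraction of the gel's acid groups, which drives swelling, switches sharply around pH = pKa. The sketch below is only this textbook relation, not the authors' actuation model.

        def ionized_fraction(pH, pKa):
            # Henderson-Hasselbalch: fraction of acid groups deprotonated
            return 1.0 / (1.0 + 10 ** (pKa - pH))

        for pKa in (4.0, 5.0, 6.0):
            row = ", ".join(f"pH {pH}: {ionized_fraction(pH, pKa):.2f}"
                            for pH in (3, 4, 5, 6, 7))
            print(f"pKa {pKa} -> {row}")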

  15. Gaucher disease: Progress and ongoing challenges.

    Science.gov (United States)

    Mistry, Pramod K; Lopez, Grisel; Schiffmann, Raphael; Barton, Norman W; Weinreb, Neal J; Sidransky, Ellen

    Over the past decades, tremendous progress has been made in the field of Gaucher disease, the inherited deficiency of the lysosomal enzyme glucocerebrosidase. Many of the colossal achievements took place during the course of the sixty-year tenure of Dr. Roscoe Brady at the National Institutes of Health. These include the recognition of the enzymatic defect involved, the isolation and characterization of the protein, the localization and characterization of the gene and its nearby pseudogene, as well as the identification of the first mutant alleles in patients. The first treatment for Gaucher disease, enzyme replacement therapy, was conceived of, developed and tested at the Clinical Center of the National Institutes of Health. Advances including recombinant production of the enzyme, the development of mouse models, pioneering gene therapy experiments, high throughput screens of small molecules and the generation of induced pluripotent stem cell models have all helped to catapult research in Gaucher disease into the twenty-first century. The appreciation that mutations in the glucocerebrosidase gene are an important risk factor for parkinsonism further expands the impact of this work. However, major challenges still remain, some of which are described here, that will provide opportunities, excitement and discovery for the next generations of Gaucher investigators.

  16. Biomolecular computing systems: principles, progress and potential.

    Science.gov (United States)

    Benenson, Yaakov

    2012-06-12

    The task of information processing, or computation, can be performed by natural and man-made 'devices'. Man-made computers are made from silicon chips, whereas natural 'computers', such as the brain, use cells and molecules. Computation also occurs on a much smaller scale in regulatory and signalling pathways in individual cells and even within single biomolecules. Indeed, much of what we recognize as life results from the remarkable capacity of biological building blocks to compute in highly sophisticated ways. Rational design and engineering of biological computing systems can greatly enhance our ability to study and to control biological systems. Potential applications include tissue engineering and regeneration and medical treatments. This Review introduces key concepts and discusses recent progress that has been made in biomolecular computing.

  17. Smart garments in chronic disease management: progress and challenges

    Science.gov (United States)

    Khosla, Ajit

    2012-10-01

    This paper presents the progress made over the last 10 years in the area of smart garments for chronic disease management. A large number of health monitoring smart garments and wearable sensors have been developed to monitor patients' physiological parameters, such as electrocardiogram, blood pressure, body temperature, heart rate and oxygen saturation, while the patient is not in hospital. In the last few years, with the advancement of smartphones and cloud computing, it has become possible to send the measured physiological data to any desired location. However, there are many challenges in the development of smart garment systems. The two major challenges are the development of new lightweight power sources and the need for global standardization and a road map for the development of smart garments. In this paper we will discuss current state-of-the-art smart garments and wearable sensor systems. Also discussed will be the new emerging trends in smart garment research and development.

  18. COMPLEX NETWORKS IN CLIMATE SCIENCE: PROGRESS, OPPORTUNITIES AND CHALLENGES

    Data.gov (United States)

    National Aeronautics and Space Administration — Complex Networks in Climate Science: Progress, Opportunities and Challenges. Karsten Steinhaeuser, Nitesh V. Chawla, and Auroop R. Ganguly. Abstract: Networks have...

  19. Multiagent Work Practice Simulation: Progress and Challenges

    Science.gov (United States)

    Clancey, William J.; Sierhuis, Maarten; Shaffe, Michael G. (Technical Monitor)

    2001-01-01

    Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and a computer system. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3D space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).

  20. Advances and Challenges in Computational Plasma Science

    Energy Technology Data Exchange (ETDEWEB)

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  1. Progress and Current Challenges in Modeling Large RNAs.

    Science.gov (United States)

    Somarowthu, Srinivas

    2016-02-27

    Recent breakthroughs in next-generation sequencing technologies have led to the discovery of several classes of non-coding RNAs (ncRNAs). It is now apparent that RNA molecules are not only just carriers of genetic information but also key players in many cellular processes. While there has been a rapid increase in the number of ncRNA sequences deposited in various databases over the past decade, the biological functions of these ncRNAs are largely not well understood. Similar to proteins, RNA molecules carry out a function by forming specific three-dimensional structures. Understanding the function of a particular RNA therefore requires a detailed knowledge of its structure. However, determining experimental structures of RNA is extremely challenging. In fact, RNA-only structures represent just 1% of the total structures deposited in the PDB. Thus, computational methods that predict three-dimensional RNA structures are in high demand. Computational models can provide valuable insights into structure-function relationships in ncRNAs and can aid in the development of functional hypotheses and experimental designs. In recent years, a set of diverse RNA structure prediction tools have become available, which differ in computational time, input data and accuracy. This review discusses the recent progress and challenges in RNA structure prediction methods.
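
    The simplest of the prediction ideas referenced above is base-pair maximization by dynamic programming (the classic Nussinov algorithm). The sketch below implements only that textbook recursion; the modern tools discussed in the review build thermodynamic and statistical models on top of such ideas.

        # Nussinov DP: maximize complementary base pairs in an RNA sequence.
        PAIRS = {("A", "U"), ("U", "A"), ("G", "C"),
                 ("C", "G"), ("G", "U"), ("U", "G")}

        def nussinov(seq, min_loop=3):
            n = len(seq)
            dp = [[0] * n for _ in range(n)]
            for span in range(min_loop + 1, n):
                for i in range(n - span):
                    j = i + span
                    best = dp[i][j - 1]                  # j left unpaired
                    for k in range(i, j - min_loop):     # try pairing k with j
                        if (seq[k], seq[j]) in PAIRS:
                            left = dp[i][k - 1] if k > i else 0
                            best = max(best, left + 1 + dp[k + 1][j - 1])
                    dp[i][j] = best
            return dp[0][n - 1]

        print(nussinov("GGGAAAUCC"))   # toy hairpin: 3 pairs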

  2. India's Computational Biology Growth and Challenges.

    Science.gov (United States)

    Chakraborty, Chiranjib; Bandyopadhyay, Sanghamitra; Agoramoorthy, Govindasamy

    2016-09-01

    India's computational science is growing swiftly due to the outburst of internet and information technology services. The bioinformatics sector of India has been transforming rapidly, creating a competitive position in the global bioinformatics market. Bioinformatics is widely used across India to address a wide range of biological issues. Recently, computational researchers and biologists have been collaborating in projects such as database development, sequence analysis, genomic prospecting and algorithm generation. In this paper, we present the Indian computational biology scenario, highlighting bioinformatics-related educational activities, manpower development, the internet boom, the service industry, research activities, conferences and trainings undertaken by the corporate and government sectors. Nonetheless, this new field of science faces many challenges.

  3. New challenges in computational collective intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ngoc Thanh; Katarzyniak, Radoslaw Piotr [Wroclaw Univ. of Technology (Poland). Inst. of Informatics; Janiak, Adam (eds.) [Wroclaw Univ. of Technology (Poland). Inst. of Computer Engineering, Control and Robotics

    2009-07-01

    The book consists of 29 chapters which have been selected and invited from the submissions to the 1st International Conference on Computational Collective Intelligence - Semantic Web, Social Networks and Multiagent Systems (ICCCI 2009). All chapters in the book discuss various examples of applications of computational collective intelligence and related technologies to such fields as the semantic web, information systems ontologies, social networks, and agent and multiagent systems. The editors hope that the book can be useful for graduate and Ph.D. students in Computer Science, in particular participants in courses on Soft Computing, Multi-Agent Systems and Robotics. This book can also be useful for researchers working on the concept of computational collective intelligence in artificial populations. It is the hope of the editors that readers of this volume can find many inspiring ideas and use them to create new cases of intelligent collectives. Many such challenges are suggested by particular approaches and models presented in particular chapters of this book.

  4. The LHCb computing data challenge DC06

    Energy Technology Data Exchange (ETDEWEB)

    Nandakumar, R [Science and Technology Facilities Council (United Kingdom); Jimenez, S G [University Rovira i Virgili (Spain); Adinolfi, M [H. H. Wills Physics Laboratory, Bristol (United Kingdom); Bernet, R [Universitat Zurich (Switzerland); Blouw, J [Physikalisches Institut, Heidelberg (Germany); Bortolotti, D; Carbone, A; M' Charek, B [Universita and INFN, Bologna (Italy); Perego, D L [INFN sez. Milano-Bicocca (Italy); Pickford, A [University of Glasgow (United Kingdom); Potterat, C [LPHE-IPEP, Lausanne (Switzerland); Miguelez, M S [Universidad de Santiago de Compostela (Spain); Bargiotti, M; Castellani, G; Charpentier, P; Closier, J [CERN (Switzerland); Brook, N [University of Bristol (United Kingdom); Casajus, A; Diaz, R Graciani [Universitat de Barcelona (Spain); Cioffi, C [University of Oxford (United Kingdom)], E-mail: r.nandakumar@rl.ac.uk (and others)

    2008-07-15

    The worldwide computing grid is essential to the LHC experiments in analysing the data collected by the detectors. Within LHCb, the computing model aims to simulate data at Tier-2 grid sites as well as on non-grid resources. The reconstruction, stripping and analysis of the produced LHCb data will primarily take place at the Tier-1 centres. The computing data challenge DC06 started in May 2006 with the primary aims of exercising the LHCb computing model and producing events which will be used for analyses in the forthcoming LHCb physics book. This paper gives an overview of the LHCb computing model and addresses the challenges and experiences during DC06. The management of the production of Monte Carlo data on the LCG was done using the DIRAC workload management system, which in turn uses the WLCG infrastructure and middleware. We report on the amount of data simulated during DC06, including the performance of the sites used. The paper also summarises the experience gained during DC06, in particular the distribution of data to the Tier-1 sites and the access to this data.

  5. Computational Psychiatry and the Challenge of Schizophrenia.

    Science.gov (United States)

    Krystal, John H; Murray, John D; Chekroud, Adam M; Corlett, Philip R; Yang, Genevieve; Wang, Xiao-Jing; Anticevic, Alan

    2017-05-01

    Schizophrenia research is plagued by enormous challenges in integrating and analyzing large datasets and difficulties developing formal theories related to the etiology, pathophysiology, and treatment of this disorder. Computational psychiatry provides a path to enhance analyses of these large and complex datasets and to promote the development and refinement of formal models for features of this disorder. This presentation introduces the reader to the notion of computational psychiatry and describes discovery-oriented and theory-driven applications to schizophrenia involving machine learning, reinforcement learning theory, and biophysically-informed neural circuit models.
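
    Of the theory-driven tools named here, reinforcement learning enters such models most often as a simple reward-prediction-error update. The sketch below is the generic Rescorla-Wagner/TD(0)-style rule on a made-up 70%-rewarded cue, not a schizophrenia model from the article.

        import random

        random.seed(0)
        alpha = 0.1                  # learning rate
        value = 0.0                  # learned value of a cue
        for trial in range(200):
            reward = 1.0 if random.random() < 0.7 else 0.0
            delta = reward - value   # prediction error
            value += alpha * delta   # Rescorla-Wagner / TD(0)-style update
        print(f"learned value after 200 trials: {value:.2f} (true mean 0.70)")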

  6. The Glass Ceiling: Progress and Persistent Challenges

    Science.gov (United States)

    McLlwain, Wendy M.

    2012-01-01

    It has been written that since 2001, there has not been any significant progress and the glass ceiling is still intact. Women are still underrepresented in top positions (Anonymous, 2004). If this is true, the glass ceiling presents a major barrier between women and their desire to advance into executive or senior management positions. In addition…

  7. Progress and challenges in cleaning up Hanford

    Energy Technology Data Exchange (ETDEWEB)

    Wagoner, J.D. [Dept. of Energy, Richland, WA (United States)

    1997-08-01

    This paper presents captioned viewgraphs which briefly summarize cleanup efforts at the Hanford Site. Underground waste tank and spent nuclear fuel issues are described. Progress is reported for the Plutonium Finishing Plant, PUREX plant, B-Plant/Waste Encapsulation Storage Facility, and Fast Flux Test Facility. A very brief overview of costs and number of sites remediated and/or decommissioned is given.

  8. The CMS Computing System: Successes and Challenges

    CERN Document Server

    Bloom, Kenneth

    2009-01-01

    Each LHC experiment will produce datasets with sizes of order one petabyte per year. All of this data must be stored, processed, transferred, simulated and analyzed, which requires a computing system of a larger scale than ever mounted for any particle physics experiment, and possibly for any enterprise in the world. I discuss how CMS has chosen to address these challenges, focusing on recent tests of the system that demonstrate the experiment's readiness for producing physics results with the first LHC data.

  9. Egyptian women in physics: Progress and challenges

    Science.gov (United States)

    Mohsen, M.; Hosni, Hala; Mohamed, Hadeer; Gadalla, Afaf; Kahil, Heba; Hashem, Hassan

    2015-12-01

    The present study shows a progressive increase in the number of female physicists as undergraduates and postgraduates in several governmental universities. For instance, in Ain Shams University, the percentage of women who selected physics as a major course of study increased from 7.2% in 2012 to 10.8% in 2013 and 15.7% in 2014. The study also provides the current gender distribution in the various positions among the teaching staff in seven governmental universities. The data support the fact that the number of female teaching assistants in these universities is increasing.

  10. Human rights in Japan: progress and challenges

    Directory of Open Access Journals (Sweden)

    Yolanda Muñoz González

    2007-11-01

    The aim of this paper is to present an overview of the improvements and challenges that Japan has been facing between 1983 and 2007. The paper explores the interaction among the different stakeholders (the Japanese Government, international organizations and civil society) in advancing full access to citizenship regarding gender equality; the elimination of social and physical barriers to the inclusion of people with disabilities and elderly persons; ethnic minorities, specifically the situation of the Ainu people and the Buraku community; and the persons considered as "foreigners" living in Japan.

  11. Progress and Challenges in Infectious Disease Cartography.

    Science.gov (United States)

    Kraemer, Moritz U G; Hay, Simon I; Pigott, David M; Smith, David L; Wint, G R William; Golding, Nick

    2016-01-01

    Quantitatively mapping the spatial distributions of infectious diseases is key to both investigating their epidemiology and identifying populations at risk of infection. Important advances in data quality and methodologies have allowed for better investigation of disease risk and its association with environmental factors. However, incorporating dynamic human behavioural processes in disease mapping remains challenging. For example, connectivity among human populations, a key driver of pathogen dispersal, has increased sharply over the past century, along with the availability of data derived from mobile phones and other dynamic data sources. Future work must be targeted towards the rapid updating and dissemination of appropriately designed disease maps to guide the public health community in reducing the global burden of infectious disease.

  12. Ovarian cancer immunotherapy: opportunities, progresses and challenges

    Directory of Open Access Journals (Sweden)

    Stevens Richard

    2010-02-01

    Due to the low survival rates from invasive ovarian cancer, new effective treatment modalities are urgently needed. Compelling evidence indicates that the immune response against ovarian cancer may play an important role in controlling this disease. We herein summarize multiple immune-based strategies that have been proposed and tested for potential therapeutic benefit against advanced stage ovarian cancer. We will examine the evidence for the premise that an effective therapeutic vaccine against ovarian cancer is useful not only for inducing remission of the disease but also for preventing disease relapse. We will also highlight the questions and challenges in the development of ovarian cancer vaccines, and critically discuss the limitations of some of the existing immunotherapeutic strategies. Finally, we will summarize our own experience on the use of patient-specific tumor-derived heat shock protein-peptide complex for the treatment of advanced ovarian cancer.

  13. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    This chapter considers the opportunities that are present in developing, extending and applying aspects of computer-aided modelling principles and practice. What are the best tasks to be done by modellers and what needs the application of CAPE tools? How do we efficiently develop model-based solutions to significant problems? The important issues of workflow and data flow are discussed together with fit-for-purpose model development. As well, the lack of tools around multiscale modelling provides opportunities for the development of efficient tools to address such challenges. The ability ... and opportunities are discussed for such systems.

  14. Legume proteomics: Progress, prospects, and challenges.

    Science.gov (United States)

    Rathi, Divya; Gayen, Dipak; Gayali, Saurabh; Chakraborty, Subhra; Chakraborty, Niranjan

    2016-01-01

    Legumes are the major sources of food and fodder with strong commercial relevance, and are essential components of agricultural ecosystems owing to their ability to carry out endosymbiotic nitrogen fixation. In recent years, legumes have become one of the major choices of plant research. The legume proteomics is currently represented by more than 100 reference maps and an equal number of stress-responsive proteomes. Among the 48 legumes in the protein databases, most proteomic studies have been accomplished in two model legumes, soybean, and barrel medic. This review highlights recent contributions in the field of legume proteomics to comprehend the defence and regulatory mechanisms during development and adaptation to climatic changes. Here, we attempted to provide a concise overview of the progress in legume proteomics and discuss future developments in three broad perspectives: (i) proteome of organs/tissues; (ii) subcellular compartments; and (iii) spatiotemporal changes in response to stress. Such data mining may aid in discovering potential biomarkers for plant growth, in general, apart from essential components involved in stress tolerance. The prospect of integrating proteome data with genome information from legumes will provide exciting opportunities for plant biologists to achieve long-term goals of crop improvement and sustainable agriculture.

  15. Childhood Obesity – 2010: Progress and Challenges

    Science.gov (United States)

    Han, Joan C.; Lawlor, Debbie A.; Kimm, Sue Y.S.

    2010-01-01

    The worldwide prevalence of childhood obesity has increased greatly over the past 3 decades. The increasing occurrence in children of disorders, such as type 2 diabetes, is believed to be a consequence of this obesity epidemic. Much progress has been made in understanding the genetics and physiology of appetite control and from this, the elucidation of the causes of some rare obesity syndromes. However, these rare disorders have so far taught us only limited lessons on how to prevent or reverse obesity in most children. Calorie intake and activity recommendations need to be re-assessed and better quantified, on a population level, given the more sedentary life of children today. For individual treatment, the currently recommended calorie prescriptions may be too conservative given the evolving insight on the “energy gap.” Whilst quality of research in both prevention and treatment has improved, there is still a need for high-quality multi-centre trials with long-term follow-up. Meanwhile, prevention and treatment approaches that aim to increase energy expenditure and decrease intake need to continue. Most recently, the spiralling increase in obesity prevalence may be abating for children. Thus, even greater efforts need to be made on all fronts to continue this potentially exciting trend. PMID:20451244

  16. Statistical and computational challenges in physical mapping

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, D.O.; Speed, T.P.

    1994-06-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like Huntington's disease, cystic fibrosis, and myotonic dystrophy. Instrumental in these efforts has been the construction of so-called "physical maps" of large regions of human chromosomes. Constructing a physical map of a chromosome presents a number of interesting challenges to the computational statistician. In addition to the general ill-posedness of the problem, complications include the size of the data sets, computational complexity, and the pervasiveness of experimental error. The nature of the problem and the presence of many levels of experimental uncertainty make statistical approaches to map construction appealing. Simultaneously, however, the size and combinatorial complexity of the problem make such approaches computationally demanding. In this paper we discuss what physical maps are and describe three different kinds of physical maps, outlining issues which arise in constructing them. In addition, we describe our experience with powerful, interactive statistical computing environments. We found that the ability to create high-level specifications of proposed algorithms which could then be directly executed provided a flexible rapid prototyping facility for developing new statistical models and methods. The ability to check the implementation of an algorithm by comparing its results to that of an executable specification enabled us to rapidly debug both specification and implementation in an environment of changing needs.
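
    A classic piece of the statistics involved is Lander-Waterman-style coverage arithmetic for the clone libraries behind such maps. The formulas below are the standard textbook ones and the numbers are illustrative; neither is taken from this paper.

        import math

        G = 100e6     # region size in bases (illustrative)
        L = 150e3     # clone insert length
        N = 2000      # number of clones fingerprinted

        c = N * L / G                   # redundancy of coverage
        p_uncovered = math.exp(-c)      # expected fraction of bases in no clone
        islands = N * math.exp(-c)      # expected number of contigs ("islands")
        print(f"coverage c = {c:.1f}")
        print(f"fraction uncovered ~ {p_uncovered:.3f}")
        print(f"expected islands ~ {islands:.0f}")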

  18. Lattice QCD computations: Recent progress with modern Krylov subspace methods

    Energy Technology Data Exchange (ETDEWEB)

    Frommer, A. [Bergische Universitaet GH Wuppertal (Germany)]

    1996-12-31

    Quantum chromodynamics (QCD) is the fundamental theory of the strong interaction of matter. In order to compare the theory with results from experimental physics, the theory has to be reformulated as a discrete problem of lattice gauge theory using stochastic simulations. The computational challenge consists in solving several hundreds of very large linear systems with several right hand sides. A considerable part of the world's supercomputer time is spent in such QCD calculations. This paper presents results on solving systems for the Wilson fermions. Recent progress is reviewed on algorithms obtained in cooperation with partners from theoretical physics.
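
    The computational kernel this record describes is the repeated solution of large sparse linear systems. As a point of reference, below is a minimal Krylov subspace solver (conjugate gradient) in NumPy; this is an illustrative sketch with a small dense stand-in matrix, not the Wilson-fermion solvers the paper reviews. Since the Wilson operator is non-Hermitian, production codes typically use variants such as BiCGStab.

```python
# Minimal Krylov subspace solver (conjugate gradient) for symmetric
# positive-definite A. Illustrative only: lattice QCD operators are
# sparse and non-Hermitian, with many right-hand sides per matrix.
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=500):
    x = np.zeros_like(b)
    r = b - A @ x                      # initial residual
    p = r.copy()                       # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:      # converged
            break
        p = r + (rs_new / rs_old) * p  # new A-conjugate direction
        rs_old = rs_new
    return x

# Small SPD test system standing in for a (much larger) lattice operator.
rng = np.random.default_rng(0)
M = rng.standard_normal((100, 100))
A = M @ M.T + 100 * np.eye(100)
b = rng.standard_normal(100)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))       # small residual norm
```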

  19. Enamel Regeneration - Current Progress and Challenges

    Science.gov (United States)

    Baswaraj; H.K, Navin; K.B, Prasanna

    2014-01-01

    Dental enamel is the outermost covering of teeth and the hardest mineralized tissue in the human body. Enamel faces the challenge of maintaining its integrity amid constant demineralization and remineralization in the oral environment, and it is vulnerable to wear, damage, and decay. It cannot regenerate itself, because it is formed by a layer of cells that are lost after tooth eruption. Conventional treatment relies on synthetic materials to restore lost enamel, but these cannot mimic natural enamel. Advances in materials science and in the understanding of the basic principles of organic-matrix-mediated mineralization pave the way for the formation of synthetic enamel. Knowledge of enamel formation, understanding of protein interactions and the function of their gene products, the isolation of postnatal stem cells from various sources in the oral cavity, and the development of smart materials for cell and growth factor delivery together make biologically based enamel regeneration a possibility. This article reviews recent endeavours in biomimetic synthesis and cell-based strategies for enamel regeneration. PMID:25386548

  20. Progress and challenges of the bioartificial pancreas

    Science.gov (United States)

    Hwang, Patrick T. J.; Shah, Dishant K.; Garcia, Jacob A.; Bae, Chae Yun; Lim, Dong-Jin; Huiszoon, Ryan C.; Alexander, Grant C.; Jun, Ho-Wook

    2016-11-01

    Pancreatic islet transplantation has been validated as a treatment for type 1 diabetes, since it maintains consistent and sustained reversal of the disease. However, one of the major challenges in pancreatic islet transplantation is the body's natural immune response to the implanted islets. Immunosuppressive drug treatment is the most popular immunomodulatory approach for islet graft survival. However, administration of immunosuppressive drugs gives rise to negative side effects, and the long-term effects are not clearly understood. A bioartificial pancreas is a therapeutic approach that enables pancreatic islet transplantation with minimal or no immunosuppression. The bioartificial pancreas encapsulates the pancreatic islets in a semi-permeable environment that protects the islets from the body's immune responses while allowing the permeation of insulin, oxygen, nutrients, and waste. Many groups have developed various types of bioartificial pancreas and tested their efficacy in animal models. However, the clinical application of the bioartificial pancreas still requires further investigation. In this review, we discuss several types of bioartificial pancreas and address their advantages and limitations. We also discuss recent advances in bioartificial pancreas applications with microfluidic or micropatterning technology.

  1. Gene therapy for hemoglobinopathies: progress and challenges.

    Science.gov (United States)

    Dong, Alisa; Rivella, Stefano; Breda, Laura

    2013-04-01

    Hemoglobinopathies are genetic inherited conditions that originate from the lack or malfunction of the hemoglobin (Hb) protein. Sickle cell disease (SCD) and thalassemia are the most common forms of these conditions. The severe anemia combined with complications that arise in the most affected patients raises the necessity for a cure to restore hemoglobin function. The current routine therapies for these conditions, namely transfusion and iron chelation, have significantly improved the quality of life in patients over the years, but still fail to address the underlying cause of the diseases. A curative option, allogeneic bone marrow transplantation is available, but limited by the availability of suitable donors and graft-vs-host disease. Gene therapy offers an alternative approach to cure patients with hemoglobinopathies and aims at the direct recovery of the hemoglobin function via globin gene transfer. In the last 2 decades, gene transfer tools based on lentiviral vector development have been significantly improved and proven curative in several animal models for SCD and thalassemia. As a result, clinical trials are in progress and 1 patient has been successfully treated with this approach. However, there are still frontiers to explore that might improve this approach: the stoichiometry between the transgenic hemoglobin and endogenous hemoglobin with respect to the different globin genetic mutations; donor cell sourcing, such as the use of induced pluripotent stem cells (iPSCs); and the use of safer gene insertion methods to prevent oncogenesis. With this review we will provide insights about (1) the different lentiviral gene therapy approaches in mouse models and human cells; (2) current and planned clinical trials; (3) hurdles to overcome for clinical trials, such as myeloablation toxicity, insertional oncogenesis, and high vector expression; and (4) future perspectives for gene therapy, including safe harbors and iPSCs technology.

  2. Space Solar Power Demonstrations: Challenges and Progress

    Science.gov (United States)

    Howell, Joe T.; Mankins, John C.; Lavoie, Anthony R. (Technical Monitor)

    2002-01-01

    The prospect of using electrical power beamed from space is coming closer to reality with the continued pursuit of, and improvements in, the supporting space solar research and technology. Space Solar Power (SSP) has been explored off and on for approximately three decades as a viable alternative and clean energy source. Results produced through the more recent Space Solar Power Exploratory Research and Technology (SERT) program, which involved extensive participation by industry, universities, and government, have provided a sound technical basis for believing that the technology can be improved to the extent that SSP systems can be built economically and successfully deployed in space. Considerable advancements have been made in conceptual designs and supporting technologies, including solar power generation, wireless power transmission, power management and distribution, thermal management and materials, and integrated systems engineering assessments. Basic technologies have progressed to the point where the next logical step is to formulate and conduct sophisticated demonstrations involving prototype hardware as final proofs of concept and to establish high technology readiness levels in preparation for full-scale SSP system designs. In addition to continued technical development issues, environmental and safety issues must be addressed, and appropriate actions must be taken to reassure the public and prepare them for the future use of this alternative renewable energy resource. Accomplishing these objectives will allow informed future decisions regarding further SSP and related R&D investments by both NASA management and prospective external partners. In particular, it will also guide further definition of SSP and related technology roadmaps, including performance objectives, resources and schedules, as well as 'multi-purpose' applications (terrestrial markets, science, commercial development of space, and other government missions).

  3. Achievements and Challenges in Computational Protein Design.

    Science.gov (United States)

    Samish, Ilan

    2017-01-01

    Computational protein design (CPD), a still-evolving field, includes computer-aided engineering for partial or full de novo design of proteins of interest. Designs are defined by a requested structure, function, or working environment. This chapter describes the birth and maturation of the field by presenting 101 CPD examples in chronological order, emphasizing achievements and pending challenges. Integrating these aspects presents the plethora of CPD approaches with the hope of providing a "CPD 101". These reflect on the broader structural bioinformatics and computational biophysics field and include: (1) integration of knowledge-based and energy-based methods, (2) a hierarchical designated approach towards local, regional, and global motifs and the integration of high- and low-resolution design schemes that fit each such region, (3) systematic differential approaches towards different protein regions, (4) identification of key hot-spot residues and the relative effect of remote regions, (5) assessment of shape complementarity, electrostatics and solvation effects, (6) integration of thermal plasticity and functional dynamics, (7) negative design, (8) systematic integration of experimental approaches, (9) objective cross-assessment of methods, and (10) successful ranking of potential designs. Future challenges also include dissemination of CPD software for general use by life-sciences researchers and an emphasis on success within an in vivo milieu. CPD increases our understanding of protein structure and function and the relationships between the two, along with the application of such know-how for the benefit of mankind. Applied aspects range from biological drugs, via healthier and tastier food products, to nanotechnology and environmentally friendly enzymes replacing toxic chemicals utilized in industry.

  4. Computer Network Security- The Challenges of Securing a Computer Network

    Science.gov (United States)

    Scotti, Vincent, Jr.

    2011-01-01

    This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.

  5. Mathematical challenges from theoretical/computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-31

    The committee believes that this report has relevance and potentially valuable suggestions for a wide range of readers. Target audiences include: graduate departments in the mathematical and chemical sciences; federal and private agencies that fund research in the mathematical and chemical sciences; selected industrial and government research and development laboratories; developers of software and hardware for computational chemistry; and selected individual researchers. Chapter 2 of this report covers some history of computational chemistry for the nonspecialist, while Chapter 3 illustrates the fruits of some past successful cross-fertilization between mathematical scientists and computational/theoretical chemists. In Chapter 4 the committee has assembled a representative, but not exhaustive, survey of research opportunities. Most of these are descriptions of important open problems in computational/theoretical chemistry that could gain much from the efforts of innovative mathematical scientists, written so as to be accessible introductions to the nonspecialist. Chapter 5 is an assessment, necessarily subjective, of cultural differences that must be overcome if collaborative work is to be encouraged between the mathematical and the chemical communities. Finally, the report ends with a brief list of conclusions and recommendations that, if followed, could promote accelerated progress at this interface. Recognizing that bothersome language issues can inhibit prospects for collaborative research at the interface between distinctive disciplines, the committee has attempted throughout to maintain an accessible style, in part by using illustrative boxes, and has included at the end of the report a glossary of technical terms that may be familiar to only a subset of the target audiences listed above.

  6. Computational challenges of structure-based approaches applied to HIV.

    Science.gov (United States)

    Forli, Stefano; Olson, Arthur J

    2015-01-01

    Here, we review some of the opportunities and challenges that we face in computational modeling of HIV therapeutic targets and structural biology, both in terms of methodology development and structure-based drug design (SBDD). Computational methods have provided fundamental support to HIV research since the initial structural studies, helping to unravel details of HIV biology. Computational models have proved to be a powerful tool to analyze and understand the impact of mutations and to overcome their structural and functional influence in drug resistance. With the availability of structural data, in silico experiments have been instrumental in exploiting and improving interactions between drugs and viral targets, such as HIV protease, reverse transcriptase, and integrase. Issues such as viral target dynamics and mutational variability, as well as the role of water and estimates of binding free energy in characterizing ligand interactions, are areas of active computational research. Ever-increasing computational resources and theoretical and algorithmic advances have played a significant role in progress to date, and we envision a continually expanding role for computational methods in our understanding of HIV biology and SBDD in the future.

  7. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    Science.gov (United States)

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists are facing the challenge of gaining profound insight into the deepest biological functions from big biological data. This in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are urgently needed, as are efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study comparing the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
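
    The case study in this record benchmarks platforms on the classical sequence alignment problem. As a reference point for the kernel being accelerated, here is a minimal Needleman-Wunsch global alignment score in Python; an illustrative sketch with arbitrary scoring parameters, not the optimized HPC implementations the paper compares.

```python
# Minimal Needleman-Wunsch global alignment score: the dynamic-programming
# kernel that HPC alignment codes parallelize and vectorize. Scoring
# parameters here are arbitrary, chosen for illustration.
import numpy as np

def needleman_wunsch_score(a, b, match=1, mismatch=-1, gap=-2):
    m, n = len(a), len(b)
    F = np.zeros((m + 1, n + 1))
    F[:, 0] = gap * np.arange(m + 1)   # all-gap prefixes of a
    F[0, :] = gap * np.arange(n + 1)   # all-gap prefixes of b
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            F[i, j] = max(F[i - 1, j - 1] + s,   # (mis)match
                          F[i - 1, j] + gap,     # gap in b
                          F[i, j - 1] + gap)     # gap in a
    return F[m, n]

print(needleman_wunsch_score("GATTACA", "GCATGCT"))
```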

  8. Imaging in Colorectal Cancer: Progress and Challenges for the Clinicians

    Directory of Open Access Journals (Sweden)

    Eric Van Cutsem

    2016-08-01

    The use of imaging in colorectal cancer (CRC) has significantly evolved over the last twenty years, establishing important roles in surveillance, diagnosis, staging, treatment selection and follow up. The range of modalities has broadened with the development of novel tracer and contrast agents, and the fusion of technologies such as positron emission tomography (PET) and computed tomography (CT). Traditionally, the most widely used modality for assessing treatment response in metastasised colon and rectal tumours is CT, combined with use of the RECIST guidelines. However, a growing body of evidence suggests that tumour size does not always adequately correlate with clinical outcomes. Magnetic resonance imaging (MRI) is a more versatile technique, and dynamic contrast-enhanced (DCE-) MRI and diffusion-weighted (DW-) MRI may be used to evaluate biological and functional effects of treatment. Integrated fluorodeoxyglucose (FDG-) PET/CT combines metabolic and anatomical imaging to improve sensitivity and specificity of tumour detection, and a number of studies have demonstrated improved diagnostic accuracy of this modality in a variety of tumour types, including CRC. These developments have enabled the progression of treatment strategies in rectal cancer and improved the detection of hepatic metastatic disease, yet are not without their limitations. These include technical, economical and logistical challenges, along with a lack of robust evidence for standardisation and formal guidance. In order to successfully apply these novel imaging techniques and utilise their benefit to provide truly personalised cancer care, advances need to be clinically realised in a routine and robust manner.

  9. Computational Challenges in Nuclear Weapons Simulation

    Energy Technology Data Exchange (ETDEWEB)

    McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

    2003-08-29

    After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

  10. Opportunities and challenges of high-performance computing in chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Guest, M.F.; Kendall, R.A.; Nichols, J.A. [eds.] [and others]

    1995-06-01

    The field of high-performance computing is developing at an extremely rapid pace. Massively parallel computers offering orders of magnitude increase in performance are under development by all the major computer vendors. Many sites now have production facilities that include massively parallel hardware. Molecular modeling methodologies (both quantum and classical) are also advancing at a brisk pace. The transition of molecular modeling software to a massively parallel computing environment offers many exciting opportunities, such as the accurate treatment of larger, more complex molecular systems in routine fashion, and a viable, cost-effective route to study physical, biological, and chemical 'grand challenge' problems that are impractical on traditional vector supercomputers. This will have a broad effect on all areas of basic chemical science at academic research institutions and chemical, petroleum, and pharmaceutical industries in the United States, as well as chemical waste and environmental remediation processes. But, this transition also poses significant challenges: architectural issues (SIMD, MIMD, local memory, global memory, etc.) remain poorly understood and software development tools (compilers, debuggers, performance monitors, etc.) are not well developed. In addition, researchers that understand and wish to pursue the benefits offered by massively parallel computing are often hindered by lack of expertise, hardware, and/or information at their site. A conference and workshop organized to focus on these issues was held at the National Institute of Health, Bethesda, Maryland (February 1993). This report is the culmination of the organized workshop. The main conclusion: a drastic acceleration in the present rate of progress is required for the chemistry community to be positioned to exploit fully the emerging class of Teraflop computers, even allowing for the significant work to date by the community in developing software for parallel architectures.

  11. Implementation of STEM Education Policy: Challenges, Progress, and Lessons Learned

    Science.gov (United States)

    Johnson, Carla C.

    2012-01-01

    This is a case study of the implementation of state STEM (science, technology, engineering, and mathematics) policy over the period of the first 18 months of building a regional STEM partnership. Fullan's change theory is the framework used to determine progress and associated challenges with building a regional STEM educational partnership and…

  12. Video-Conferenced Music Teaching: Challenges and Progress

    Science.gov (United States)

    Riley, Patricia E.

    2009-01-01

    This article reports on a study that aimed to explore general classroom music teaching and learning via video-conferencing between pre-service music teachers in the USA, and students at an elementary school for underprivileged children in Mexico. This study examines the challenges, progress and lessons learned as interactions within this…

  13. Swallowable Wireless Capsule Endoscopy: Progress and Technical Challenges

    Directory of Open Access Journals (Sweden)

    Guobing Pan

    2012-01-01

    Wireless capsule endoscopy (WCE) offers a feasible noninvasive way to examine the whole gastrointestinal (GI) tract and has revolutionized diagnostic technology. However, compared with wired endoscopies, the limited working time, low frame rate, and low image resolution restrict wider application. The progress of this new technology is reviewed in this paper, and the evolutionary tendencies are analyzed to be high image resolution, high frame rate, and long working time. Unfortunately, the power supply of the capsule endoscope (CE) is the bottleneck. Wireless power transmission (WPT) is the promising solution to this problem, but it is also a technical challenge. The active CE is another tendency and will be the next generation of the WCE. Nevertheless, it will not become reality soon unless a practical locomotion mechanism for the active CE in the GI tract is achieved. The locomotion mechanism is the other technical challenge, besides WPT. Progress on WPT and active capsule technology is reviewed.

  14. Progress in silicon-based quantum computing.

    Science.gov (United States)

    Clark, R G; Brenner, R; Buehler, T M; Chan, V; Curson, N J; Dzurak, A S; Gauja, E; Goan, H S; Greentree, A D; Hallam, T; Hamilton, A R; Hollenberg, L C L; Jamieson, D N; McCallum, J C; Milburn, G J; O'Brien, J L; Oberbeck, L; Pakes, C I; Prawer, S D; Reilly, D J; Ruess, F J; Schofield, S R; Simmons, M Y; Stanley, F E; Starrett, R P; Wellard, C; Yang, C

    2003-07-15

    We review progress at the Australian Centre for Quantum Computer Technology towards the fabrication and demonstration of spin qubits and charge qubits based on phosphorus donor atoms embedded in intrinsic silicon. Fabrication is being pursued via two complementary pathways: a 'top-down' approach for near-term production of few-qubit demonstration devices and a 'bottom-up' approach for large-scale qubit arrays with sub-nanometre precision. The 'top-down' approach employs a low-energy (keV) ion beam to implant the phosphorus atoms. Single-atom control during implantation is achieved by monitoring on-chip detector electrodes, integrated within the device structure. In contrast, the 'bottom-up' approach uses scanning tunnelling microscope lithography and epitaxial silicon overgrowth to construct devices at an atomic scale. In both cases, surface electrodes control the qubit using voltage pulses, and dual single-electron transistors operating near the quantum limit provide fast read-out with spurious-signal rejection.

  15. Computing Challenges in Coded Mask Imaging

    Science.gov (United States)

    Skinner, Gerald

    2009-01-01

    This slide presentation reviews the complications and challenges in developing computer systems for coded mask imaging telescopes. The coded mask technique is used when there is no other way to build the telescope (i.e., when wide fields of view, energies too high for focusing optics or too low for Compton/tracker techniques, and very good angular resolution are required). The coded mask telescope is described, and the mask is reviewed. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, and a chart showing the types of position-sensitive detectors used for coded mask telescopes is also reviewed. Slides describe the mechanism of recovering an image from the masked pattern by correlation with the mask pattern. The matrix approach is reviewed, and other approaches to image reconstruction are described. Included in the presentation is a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, a comparison of the EXIST/HET with the SWIFT/BAT, and details of the design of the EXIST/HET.
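
    The decoding step mentioned in the slides (recovering an image by correlation with the mask pattern) can be illustrated with a fabricated one-dimensional toy; the mask, source position and sizes below are invented for the example.

```python
# Toy 1-D coded-mask decode by correlation. Real instruments use 2-D
# masks (often uniformly redundant arrays, whose cyclic autocorrelation
# has flat sidelobes); this fabricated example just shows the principle.
import numpy as np

rng = np.random.default_rng(0)
N = 64
mask = rng.integers(0, 2, N)            # open (1) / opaque (0) elements
sky = np.zeros(N); sky[20] = 100.0      # a single point source at 20

# Each detector offset s sees the sky through a shifted mask pattern.
shadow = np.array([np.dot(np.roll(mask, -s), sky) for s in range(N)])

# Decoding: correlate the shadowgram with the mask pattern; the mask's
# autocorrelation peak recovers the source position (up to sidelobes).
decoded = np.array([np.dot(np.roll(mask, -t), shadow) for t in range(N)])
print(int(np.argmax(decoded)))          # -> 20
```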

  16. "Defining Computer 'Speed': An Unsolved Challenge"

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Abstract: The reason we use computers is their speed, and the reason we use parallel computers is that they're faster than single-processor computers. Yet, after 70 years of electronic digital computing, we still do not have a solid definition of what computer 'speed' means, or even what it means to be 'faster'. Unlike measures in physics, where the definition of speed is rigorous and unequivocal, in computing there is no definition of speed that is universally accepted. As a result, computer customers have made purchases misguided by dubious information, computer designers have optimized their designs for the wrong goals, and computer programmers have chosen methods that optimize the wrong things. This talk describes why some of the obvious and historical ways of defining 'speed' haven't served us well, and the things we've learned in the struggle to find a definition that works. Biography: Dr. John Gustafson is a Director ...

  17. Recent Progress and New Challenges in Quantum Fluids and Solids

    Science.gov (United States)

    Lee, Y.; Halperin, W. P.

    2017-10-01

    Quantum fluids and solids have been a primary topic in low temperature physics. While many physical concepts and experimental techniques developed in this field have made impacts on other related areas in the last century, recent research activities demonstrate active infusion of techniques and physical ideas developed in other fields. In this paper, we present a short review of the recent progress and new challenges in this field, focusing on the superfluid phases of 3He and 4He.

  18. Global progress against cancer - challenges and opportunities

    Institute of Scientific and Technical Information of China (English)

    Frédéric Biemar; Margaret Foti

    2013-01-01

    The last ten years have seen remarkable progress in cancer research. However, despite significant breakthroughs in the understanding, prevention, and treatment of cancer, the disease continues to affect millions of people worldwide. Cancer’s complexity, compounded with financial, policy, and regulatory roadblocks, has slowed the rate of progress being made against cancer. In this paper, we review a few of the most recent breakthroughs that are fueling medical advances and bringing new hope for patients affected by this devastating disease. We also address the challenges facing us and the opportunities to accelerate future progress against cancer. The efforts of the American Association for Cancer Research (AACR) to address the cancer burden already extend beyond the borders of the United States of America. The AACR is committed to increasing its efforts to stem the tide of cancer worldwide by promoting innovative programs, strategies, and initiatives for cancer researchers and all those engaged in cancer-related biomedical sciences around the world.

  19. Molecular Mechanisms of Bipolar Disorder: Progress Made and Future Challenges

    Science.gov (United States)

    Kim, Yeni; Santos, Renata; Gage, Fred H.; Marchetto, Maria C.

    2017-01-01

    Bipolar disorder (BD) is a chronic and progressive psychiatric illness characterized by mood oscillations, with episodes of mania and depression. The impact of BD on patients can be devastating, with up to 15% of patients committing suicide. The disorder is associated with psychiatric and medical comorbidities, and patients are at high risk of drug abuse, metabolic and endocrine disorders, and vascular disease. Current knowledge of the pathophysiology and molecular mechanisms causing BD is still modest. With no clear biological markers available, early diagnosis is a great challenge to clinicians without previous knowledge of the longitudinal progress of the illness. Moreover, despite recommendations from evidence-based guidelines, polypharmacy is still common in the clinical treatment of BD, reflecting the gap between research and clinical practice. A major challenge in BD is the development of effective drugs with low toxicity for patients. In this review article, we focus on the progress made and the future challenges we face in determining the pathophysiology and molecular pathways involved in BD, such as circadian and metabolic perturbations, mitochondrial and endoplasmic reticulum (ER) dysfunction, autophagy and glutamatergic neurotransmission, which may lead to the development of new drugs. PMID:28261061

  20. Homogeneous Buchberger algorithms and Sullivant's computational commutative algebra challenge

    DEFF Research Database (Denmark)

    Lauritzen, Niels

    2005-01-01

    We give a variant of the homogeneous Buchberger algorithm for positively graded lattice ideals. Using this algorithm we solve the Sullivant computational commutative algebra challenge.
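
    Since the record is terse, a generic illustration may help: Buchberger's algorithm computes a Groebner basis of a polynomial ideal, and lattice ideals are generated by binomials. The sketch below runs sympy's general-purpose routine on a small homogeneous binomial ideal; the paper's contribution is a specialized homogeneous variant for positively graded lattice ideals, which this sketch does not implement.

```python
# Generic Buchberger run (via sympy) on a small homogeneous binomial
# (lattice) ideal: the toric ideal of the twisted cubic. Illustrative
# only; not the paper's specialized homogeneous variant.
from sympy import groebner, symbols

x, y, z, w = symbols('x y z w')

# Binomial generators; lattice ideals are generated by binomials.
gens = [x*w - y*z, x*z - y**2, y*w - z**2]

G = groebner(gens, x, y, z, w, order='grevlex')
print(G)   # the reduced Groebner basis under graded reverse lex order
```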

  1. Computational Intelligence and Games: Challenges and Opportunities

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The last few decades have seen a phenomenal increase in the quality, diversity and pervasiveness of computer games. The worldwide computer games market is estimated to be worth around USD 21bn annually, and is predicted to continue to grow rapidly. This paper reviews some of the recent developments in applying computational intelligence (CI) methods to games, points out some of the potential pitfalls, and suggests some fruitful directions for future research.

  2. The Computational Challenges of Medical Imaging

    Science.gov (United States)

    2004-02-01

    JASON will undertake a study for the DOE and the NIH National Institute for Biomedical Imaging and Bioengineering on the role of computation (broadly defined to include raw computational capabilities, mass storage needs, and connectivity) for medical imaging. This study will address the

  3. The Risk and Challenges of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Usman Namadi Inuwa

    2015-12-01

    Cloud computing is a computing technology that aims to share storage, computation, and services transparently among massive numbers of users. Current cloud computing systems pose serious limitations to protecting the confidentiality of user data. Since the data shared and stored are presented in unencrypted form to remote machines owned and operated by third-party service providers despite their sensitivity (for example, contact addresses and mails), the risk of service providers disclosing user confidential data may be quite high, and the risk of attacks on cloud storage by third parties is also increasing. The purpose of this study is to review research done on this technology, identify the security risks and explore some techniques for protecting users' data from attackers in the cloud.

  4. Editorial: Modelling and computational challenges in granular materials

    NARCIS (Netherlands)

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss

  5. Petaflops Computing: The Key Algorithmic Challenges

    Science.gov (United States)

    Bailey, David H.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    The prospect of petaflops-class computers brings to the fore some important algorithmic issues that have been considered in the high performance computing community for several years. Key among them are (1) concurrency (whether the fundamental concurrency of an algorithm is sufficient to keep thousands of processors productively busy); (2) data locality; (3) latency tolerance; and (4) memory and operation count scaling. This introductory presentation will give an overview of these issues.

  6. Challenges in computational statistics and data mining

    CERN Document Server

    Mielniczuk, Jan

    2016-01-01

    This volume contains nineteen research papers belonging to the areas of computational statistics, data mining, and their applications. Those papers, all written specifically for this volume, are their authors’ contributions to honour and celebrate Professor Jacek Koronacki on the occasion of his 70th birthday. The book’s related and often interconnected topics represent Jacek Koronacki’s research interests and their evolution. They also clearly indicate how close the areas of computational statistics and data mining are.

  7. Progress and challenges in bioinformatics approaches for enhancer identification

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2017-02-03

    Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration.
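
    As a heavily simplified illustration of the supervised prediction methods surveyed here, the sketch below trains a classifier on k-mer frequency features; the sequences, labels and the GC-content signal are all synthetic stand-ins for experimentally labelled enhancer data.

```python
# Synthetic sketch of a supervised enhancer classifier on k-mer features.
# Everything here (sequences, labels, GC-content signal) is fabricated for
# illustration; real methods train on experimentally labelled regions.
from itertools import product
import numpy as np
from sklearn.linear_model import LogisticRegression

K = 3
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]
IDX = {k: i for i, k in enumerate(KMERS)}

def kmer_features(seq):
    """Normalized overlapping k-mer counts of a DNA sequence."""
    counts = np.zeros(len(KMERS))
    for i in range(len(seq) - K + 1):
        counts[IDX[seq[i:i + K]]] += 1
    return counts / counts.sum()

rng = np.random.default_rng(1)

def random_seq(n=200, gc=0.5):
    """Random sequence with a given GC content (A, C, G, T order)."""
    p = [(1 - gc) / 2, gc / 2, gc / 2, (1 - gc) / 2]
    return "".join(rng.choice(list("ACGT"), size=n, p=p))

# Toy labels: "enhancer-like" sequences are simply GC-richer here.
X = np.array([kmer_features(random_seq(gc=0.6)) for _ in range(200)]
             + [kmer_features(random_seq(gc=0.4)) for _ in range(200)])
y = np.array([1] * 200 + [0] * 200)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```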

  8. Computational science: Emerging opportunities and challenges

    Science.gov (United States)

    Hendrickson, Bruce

    2009-07-01

    In the past two decades, computational methods have emerged as an essential component of the scientific and engineering enterprise. A diverse assortment of scientific applications has been simulated and explored via advanced computational techniques. Computer vendors have built enormous parallel machines to support these activities, and the research community has developed new algorithms and codes, and agreed on standards to facilitate ever more ambitious computations. However, this track record of success will be increasingly hard to sustain in coming years. Power limitations constrain processor clock speeds, so further performance improvements will need to come from ever more parallelism. This higher degree of parallelism will require new thinking about algorithms, programming models, and architectural resilience. Simultaneously, cutting edge science increasingly requires more complex simulations with unstructured and adaptive grids, and multi-scale and multi-physics phenomena. These new codes will push existing parallelization strategies to their limits and beyond. Emerging data-rich scientific applications are also in need of high performance computing, but their complex spatial and temporal data access patterns do not perform well on existing machines. These interacting forces will reshape high performance computing in the coming years.

  9. Emerging nanomedicine applications and manufacturing: progress and challenges.

    Science.gov (United States)

    Sartain, Felicity; Greco, Francesca; Hill, Kathryn; Rannard, Steve; Owen, Andrew

    2016-03-01

    APS 6th International PharmSci Conference 2015, 7-9 September 2015, East Midlands Conference Centre, University of Nottingham, Nottingham, UK. As part of the 6th APS International PharmSci Conference, a nanomedicine session was organised to address challenges and share experiences in this field. Topics ranged from reports of the latest results and advances in the development of targeted therapeutics to the needs the community faces in progressing these exciting proof-of-concept results into products. Here we provide an overview of the discussion and highlight some of the initiatives that have recently been established to support the translation of nanomedicines into the clinic.

  10. Impedimetric biosensors for medical applications current progress and challenges

    CERN Document Server

    Rushworth, Jo V; Goode, Jack A; Pike, Douglas J; Ahmed, Asif; Millner, Paul

    2014-01-01

    In this monograph, the authors discuss the current progress in the medical application of impedimetric biosensors, along with the key challenges in the field. First, a general overview of biosensor development, structure and function is presented, followed by a detailed discussion of impedimetric biosensors and the principles of electrochemical impedance spectroscopy. Next, the current state-of-the art in terms of the science and technology underpinning impedance-based biosensors is reviewed in detail. The layer-by-layer construction of impedimetric sensors is described, including the design of electrodes, their nano-modification, transducer surface functionalization and the attachment of different bioreceptors. The current challenges of translating lab-based biosensor platforms into commercially-available devices that function with real patient samples at the POC are presented; this includes a consideration of systems integration, microfluidics and biosensor regeneration. The final section of this monograph ...

  11. Merging Library and Computing Services at Kenyon College: A Progress Report.

    Science.gov (United States)

    Oden, Robert A., Jr.; Temple, Daniel B.; Cottrell, Janet R.; Griggs, Ronald K.; Turney, Glen W.; Wojcik, Frank M.

    2001-01-01

    Describes the evolution and progress toward a uniquely integrated library and computer services unit at Kenyon College. Discusses its focus on constituencies; merging of the divisions; benefits for students, faculty, administrative units, and the institution; meeting challenges; and generalizing from the model. (EV)

  12. 2016 Institutional Computing Progress Report for w14_firetec

    Energy Technology Data Exchange (ETDEWEB)

    White, Judith W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Linn, Rodman [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-14

    This is a computing progress report for w14_firetec. FIRETEC simulations will explore prescribed-fire ignition methods that achieve burning objectives (understory reduction and ecosystem health) while minimizing the risk of escaped fire.

  13. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    2011-01-01

    This chapter considers the opportunities that are present in developing, extending and applying aspects of computer-aided modelling principles and practice. What are the best tasks to be done by modellers and what needs the application of CAPE tools? How do we efficiently develop model-based solu…

  14. EDOC: meeting the challenges of enterprise computing

    NARCIS (Netherlands)

    Hung, P.C.K.; Sinderen, van M.J.

    2005-01-01

    An increasing demand for interoperable applications exists, sparking the real-time exchange of data across borders, applications, and IT platforms. To perform these tasks, enterprise computing now encompasses a new class of groundbreaking technologies such as Web services and service-oriented architectures.

  15. EDOC: meeting the challenges of enterprise computing

    NARCIS (Netherlands)

    Hung, P.C.K.; van Sinderen, Marten J.

    An increasing demand for interoperable applications exists, sparking the real-time exchange of data across borders, applications, and IT platforms. To perform these tasks, enterprise computing now encompasses a new class of groundbreaking technologies such as Web services and service-oriented architectures.

  16. Biomedical Visual Computing: Case Studies and Challenges

    KAUST Repository

    Johnson, Christopher

    2012-01-01

    Advances in computational geometric modeling, imaging, and simulation let researchers build and test models of increasing complexity, generating unprecedented amounts of data. As recent research in biomedical applications illustrates, visualization will be critical in making this vast amount of data usable; it's also fundamental to understanding models of complex phenomena. © 2012 IEEE.

  17. Current Computational Challenges for CMC Processes, Properties, and Structures

    Science.gov (United States)

    DiCarlo, James

    2008-01-01

    In comparison to current state-of-the-art metallic alloys, ceramic matrix composites (CMC) offer a variety of performance advantages, such as higher temperature capability (greater than the approx. 2100 F capability of the best metallic alloys), lower density (approx. 30-50% of metal density), and lower thermal expansion. In comparison to other competing high-temperature materials, CMC are also capable of providing significantly better static and dynamic toughness than un-reinforced monolithic ceramics and significantly better environmental resistance than carbon-fiber reinforced composites. Because of these advantages, NASA, the Air Force, and other U.S. government agencies and industries are currently seeking to implement these advanced materials into hot-section components of gas turbine engines for both propulsion and power generation. For applications such as these, CMC are expected to result in many important performance benefits, such as reduced component cooling air requirements, simpler component design, reduced weight, improved fuel efficiency, reduced emissions, higher blade frequencies, reduced blade clearances, and higher thrust. Although much progress has been made recently in the development of CMC constituent materials and fabrication processes, major challenges still remain for implementation of these advanced composite materials into viable engine components. The objective of this presentation is to briefly review some of those challenges, which are generally related to the need to develop physics-based computational approaches that allow CMC fabricators and designers to model (1) CMC processes for fiber architecture formation and matrix infiltration, (2) CMC properties of high technical interest such as multidirectional creep, thermal conductivity, matrix cracking stress, damage accumulation, and degradation effects in aggressive environments, and (3) CMC component lifetimes when all of these effects are interacting in a complex stress and service

  18. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    Energy Technology Data Exchange (ETDEWEB)

    William M. Tang

    2011-02-09

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  19. Security, Privacy and Trust Challenges in Cloud Computing and Solutions

    Directory of Open Access Journals (Sweden)

    Seyyed Yasser hashemi

    2014-07-01

    Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. Cloud computing recently emerged as a promising solution for information technology (IT) management. IT managers look to cloud computing as a means to maintain a flexible and scalable IT infrastructure that enables business agility. As much as the technological benefits, cloud computing also has risks involved. In this paper, cloud computing security challenges are discussed and many new recommendations are proposed to increase security and trust while also maintaining privacy.

  20. Addressing Security Challenges in Pervasive Computing Applications

    Science.gov (United States)

    2010-10-10

    Conference on Engineering of Complex Computer Systems, Auckland, New Zealand, July 2007. 5. Kyriakos Anastasakis, Behzad Bordbar, Geri Georg and...tending Database Technology, Saint-Petersburg, Russia, March 2009. 24. Geri Georg, Indrakshi Ray, Kyriakos Anastasakis, Behzad Bordbar, Manachai...and Behzad Bordbar, "Ensuring Spatio-Temporal Access Control for Real-World Applications", Proceedings of the 14th ACM Symposium on Access

  1. Health impact assessment in China: Emergence, progress and challenges

    Energy Technology Data Exchange (ETDEWEB)

    Huang Zheng, E-mail: huangzhg@mails.tjmu.edu.cn

    2012-01-15

    The values, concepts and approaches of health impact assessment (HIA) were outlined in the Gothenburg consensus paper, and some industrialized countries have implemented HIA for many years. HIA has played an important role in environmental protection in China; however, the emergence, progress and challenges of HIA in China have not been well described. In this paper, the evolution of HIA in China is analyzed and the challenges of HIA are presented based on the author's experiences. HIA contributed to decision-making for large capital construction projects, such as the Three Gorges Dam project, in its emergence stage. Increasing attention has been given to HIA in recent years due to supportive policies underpinning development of the draft HIA guidelines in 2008. However, enormous challenges lie ahead in ensuring the institutionalization of HIA into project, program and policy decision-making processes due to limited scope, immature tools and insufficient professionals in HIA practice. HIA should broaden its horizons by encompassing physical, chemical, biological and socio-economic aspects, and constant attempts should be made to integrate HIA into the decision-making process, not only for projects and programs but also for policies.

  2. Cloud computing challenges, limitations and R&D solutions

    CERN Document Server

    Mahmood, Zaigham

    2014-01-01

    This important text/reference reviews the challenging issues that present barriers to greater implementation of the cloud computing paradigm, together with the latest research into developing potential solutions. Exploring the strengths and vulnerabilities of cloud provision and cloud environments, Cloud Computing: Challenges, Limitations and R&D Solutions provides case studies from a diverse selection of researchers and practitioners of international repute. The implications of emerging cloud technologies are also analyzed from the perspective of consumers. Topics and features: presents

  3. Modelling Hydrological Consequences of Climate Change-Progress and Challenges

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The simulation of hydrological consequences of climate change has received increasing attention from the hydrology and land-surface modelling communities. There have been many studies of climate-change effects on hydrology and water resources which usually consist of three steps: (1) use of general circulation models (GCMs) to provide future global climate scenarios under the effect of increasing greenhouse gases,(2) use of downscaling techniques (both nested regional climate models, RCMs, and statistical methods)for "downscaling" the GCM output to the scales compatible with hydrological models, and (3) use of hydrologic models to simulate the effects of climate change on hydrological regimes at various scales.Great progress has been achieved in all three steps during the past few years, however, large uncertainties still exist in every stage of such study. This paper first reviews the present achievements in this field and then discusses the challenges for future studies of the hydrological impacts of climate change.

  4. Progress and challenges of carbon nanotube membrane in water treatment

    KAUST Repository

    Lee, Jieun

    2016-05-25

    The potential of the carbon nanotube (CNT) membrane has been greatly strengthened in water treatment during the last decade. According to work published to date, the unique and excellent characteristics of CNTs outperform conventional polymer membranes. Such achievements of CNT membranes are greatly dependent on their fabrication methods. Further, the intrinsic properties of CNTs could be a critical factor in their applicability to membrane processes. This article provides an explicit and systematic review of the progress of CNT membranes, addressing two current questions: whether (i) CNT membranes could tackle current challenges in pressure- or thermally driven membrane processes and (ii) CNT hybrid nanocomposites, as a new generation of materials, could complement current CNT-enhanced membranes. © 2016 Taylor & Francis Group, LLC.

  5. Recent progress and challenges of organometal halide perovskite solar cells

    Science.gov (United States)

    Yang, Liyan; Barrows, Alexander T.; Lidzey, David G.; Wang, Tao

    2016-02-01

    We review recent progress in the development of organometal halide perovskite solar cells. We discuss different compounds used to construct perovskite photoactive layers, as well as the optoelectronic properties of this system. The factors that affect the morphology of the perovskite active layer are explored, e.g. material composition, film deposition methods, casting solvent and various post-treatments. Different strategies are reviewed that have recently emerged to prepare high performing perovskite films, creating polycrystalline films having either large or small grain size. Devices that are constructed using meso-superstructured and planar architectures are summarized and the impact of the fabrication process on operational efficiency is discussed. Finally, important research challenges (hysteresis, thermal and moisture instability, mechanical flexibility, as well as the development of lead-free materials) in the development of perovskite solar cells are outlined and their potential solutions are discussed.

  6. LHC Computing Centres Join Forces for Global Grid Challenge

    CERN Multimedia

    2005-01-01

    Today, in a significant milestone for scientific grid computing, eight major computing centres successfully completed a challenge to sustain a continuous data flow of 600 megabytes per second (MB/s) on average for 10 days from CERN in Geneva, Switzerland to seven sites in Europe and the US
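
    For scale, a quick back-of-envelope calculation (ours, not from the announcement) of the total volume implied by the sustained rate:

```python
# Back-of-envelope check of the data volume implied by sustaining
# 600 MB/s for 10 days, as reported for the service challenge.
rate_mb_per_s = 600
seconds = 10 * 24 * 3600
total_tb = rate_mb_per_s * seconds / 1e6   # MB -> TB
print(f"~{total_tb:.0f} TB moved over the 10 days")   # ~518 TB
```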

  7. Aneesur Rahman Prize for Computational Physics Lecture: Addressing Dirac's Challenge

    Science.gov (United States)

    Chelikowsky, James

    2013-03-01

    After the invention of quantum mechanics, P. A. M. Dirac made the following observation: "The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble. It therefore becomes desirable that approximate practical methods of applying quantum mechanics should be developed, which can lead to an explanation of the main features of complex atomic systems..." The creation of "approximate practical methods" in response to Dirac's challenge has included the one-electron picture, density functional theory and the pseudopotential concept. The combination of such methods in conjunction with contemporary computational platforms and new algorithms offers the possibility of predicting properties of materials solely from knowledge of the atomic species present. I will give an overview of progress in this field with an emphasis on materials at the nanoscale. Support from the Department of Energy and the National Science Foundation is acknowledged.
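
    The "approximate practical methods" theme can be given a minimal concrete flavour: the sketch below solves the one-dimensional Schrodinger equation for a harmonic well by finite differences. This discretize-then-diagonalize pattern underlies the far more sophisticated density functional and pseudopotential machinery the talk concerns; the grid sizes and potential are chosen arbitrarily for illustration.

```python
# Discretize-then-diagonalize: the simplest flavour of an "approximate
# practical method" for quantum mechanics. Solve the 1-D Schrodinger
# equation for a harmonic well by finite differences (hbar = m = 1).
import numpy as np

n, L = 1000, 20.0
x = np.linspace(-L / 2, L / 2, n)
h = x[1] - x[0]

# Kinetic energy: -1/2 d^2/dx^2 as a second-difference matrix.
lap = (np.diag(np.ones(n - 1), -1) - 2.0 * np.eye(n)
       + np.diag(np.ones(n - 1), 1)) / h**2
H = -0.5 * lap + np.diag(0.5 * x**2)   # Hamiltonian with V(x) = x^2 / 2

print(np.linalg.eigvalsh(H)[:4])       # ~ [0.5, 1.5, 2.5, 3.5]
```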

  8. Does HEP still hold challenges for computer science

    Energy Technology Data Exchange (ETDEWEB)

    Hertzberger, L.O. (Amsterdam Univ. (Netherlands). Computer Systems Group)

    1989-12-01

    The characteristics of High Energy Physics (HEP) as well as Computer Science (CS) are changing. In HEP the ever larger scale of experimentation results in a dramatic increase in the amount of data that has to be handled. Consequently, computing techniques have to be found to keep the data manageable. Computer science has become more mature, realizing that it has to develop models and techniques applicable to a wide range of problems. Moreover, its interest has shifted to computing problems in everyday life, where the emphasis is more on symbolic than numeric computation. It will be illustrated that HEP still offers challenges to CS, but that the nature of the collaboration has changed considerably. (orig.).

  9. Inclusive Education in Georgia: Current Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Nikoloz Kavelashvili

    2017-05-01

    Full Text Available Purpose and Originality: The paper provides a realistic picture of how the implementation of inclusive education in Georgia is developing, of the problems encountered, and of what needs must be fulfilled to stimulate the process. Today’s challenge in the country is to make inclusive practices available to everybody, everywhere and all the time. This article discusses the status of the efforts being made to meet this challenge. In the course of that discussion, some comprehensive changes are described that systemic school-improvement efforts must achieve to continue making progress towards fully inclusive learning. Method: The study was conducted in Georgia. A qualitative research design was employed along with closed-ended and open-ended questionnaires, which allowed participants to express their points of view, skills and knowledge. Two data collection methods were applied: semi-structured interviews and observation of respondents. Results: The study uncovers the challenges that obstruct the implementation process: indifferent attitudes of teachers and parents towards inclusion, lack of awareness of the issue amongst educators, minimal involvement of parents, and the need for infrastructural development. Society: The results should raise the awareness of the population of Georgia as well as increase understanding of the problem. Limitations / further research: There were enough informants at the school level (special teachers, principals); however, there are still many other possible respondents who could add something valuable to a better understanding of the process of inclusion at schools. The theoretical approach employed in the study and the empirical research could be validated.

  10. Challenges and promises for translating computational tools into clinical practice.

    Science.gov (United States)

    Ahn, Woo-Young; Busemeyer, Jerome R

    2016-10-01

    Computational modeling and associated methods have greatly advanced our understanding of cognition and neurobiology underlying complex behaviors and psychiatric conditions. Yet, no computational methods have been successfully translated into clinical settings. This review discusses three major methodological and practical challenges (A. precise characterization of latent neurocognitive processes, B. developing optimal assays, C. developing large-scale longitudinal studies and generating predictions from multi-modal data) and potential promises and tools that have been developed in various fields including mathematical psychology, computational neuroscience, computer science, and statistics. We conclude by highlighting a strong need to communicate and collaborate across multiple disciplines.

  11. PROGRESS & CHALLENGES IN CLEANUP OF HANFORD'S TANK WASTES

    Energy Technology Data Exchange (ETDEWEB)

    HEWITT, W.M.; SCHEPENS, R.

    2006-01-23

    The River Protection Project (RPP), which is managed by the Department of Energy (DOE) Office of River Protection (ORP), is highly complex from technical, regulatory, legal, political, and logistical perspectives and is the largest ongoing environmental cleanup project in the world. Over the past three years, ORP has made significant advances in its planning and execution of the cleanup of the Hanford tank wastes. The 149 single-shell tanks (SSTs), 28 double-shell tanks (DSTs), and 60 miscellaneous underground storage tanks (MUSTs) at Hanford contain approximately 200,000 m³ (53 million gallons) of mixed radioactive wastes, some of which date back to the first days of the Manhattan Project. The plan for treating and disposing of the waste stored in large underground tanks is to: (1) retrieve the waste, (2) treat the waste to separate it into high-level (sludge) and low-activity (supernatant) fractions, (3) remove key radionuclides (e.g., Cs-137, Sr-90, actinides) from the low-activity fraction to the maximum extent technically and economically practical, (4) immobilize both the high-level and low-activity waste fractions by vitrification, (5) interim-store the high-level waste fraction for ultimate disposal off-site at the federal HLW repository, (6) dispose of the low-activity fraction on-site in the Integrated Disposal Facility (IDF), and (7) close the waste management areas consisting of tanks, ancillary equipment, soils, and facilities. Design and construction of the Waste Treatment and Immobilization Plant (WTP), the cornerstone of the RPP, has progressed substantially despite challenges arising from new seismic information for the WTP site. We have looked closely at the waste and aligned our treatment and disposal approaches with the waste characteristics. For example, approximately 11,000 m³ (2-3 million gallons) of metal sludges in twenty tanks were not created during spent nuclear fuel reprocessing and have low fission product concentrations. We

  12. School Librarianship and Evidence Based Practice: Progress, Perspectives, and Challenges

    Directory of Open Access Journals (Sweden)

    Ross J. Todd

    2009-06-01

    Full Text Available Objective – This paper provides an overview of progress and developments surrounding evidence based practice in school librarianship, and seeks to provide a picture of current thinking about evidence based practice as it relates to the field. It addresses current issues and challenges facing the adoption of evidence based practice in school librarianship. Methods – The paper is based on a narrative review of a small but growing body of literature on evidence based practice in school librarianship, set within a broader perspective of evidence based education. In addition, it presents the outcomes of a collaborative process of input from 200 school library leaders collected at a School Library Summit in 2007 specifically to address the emerging arena of evidence based practice in this field. Results – A holistic model of evidence based practice for school libraries is presented, centering on three integrated dimensions of evidence: evidence for practice, evidence in practice, and evidence of practice. Conclusion – The paper identifies key challenges ahead if evidence based school librarianship is to develop further. These include: building research credibility within the broader educational environment; the need for ongoing review and evaluation of the diverse body of research in education, librarianship and allied fields to make quality evidence available in ways that enable practicing school librarians to build a culture of evidence based practice; development of tools, strategies, and exemplars to facilitate evidence based decision-making; and ensuring that the many and diverse advances in education and librarianship become part of the practice of school librarianship.

  13. Geothermal Energy Development in Indonesia: Progress, Challenges and Prospect

    Directory of Open Access Journals (Sweden)

    Hadi Setiawan

    2014-01-01

    Full Text Available One environmentally friendly renewable energy with huge potential in Indonesia is geothermal. Indonesia has the largest geothermal potential in the world, reaching up to 40% of world reserves, or about 27,000 MW to 29,000 MW. However, current geothermal development amounts to only about 4.2% (1,226 MW) of the existing reserves. The government of Indonesia has issued both fiscal and non-fiscal incentives to encourage geothermal development, including establishing Fast Track Program II in 2010 to procure 17,918 MW, of which 28% is geothermal. But the amount of electricity that can currently be supplied from geothermal is only about 2.7% of total installed generation capacity in Indonesia. This paper presents the progress of geothermal development in Indonesia and the role of the government, including the policy, regulatory framework, and government incentives. It also identifies the challenges of geothermal development, as well as its prospects in the future. The methodology used in this research is a qualitative-descriptive method focused on literature review to obtain secondary data.

  14. Progress, Understanding and Challenges in the Field of Nanodielectrics

    Science.gov (United States)

    Fréchette, Michel F.; Reed, Clive W.; Sedding, Howard

    The field of nanotechnology has emerged as one of the most active technological areas worldwide, and interest in nanodielectrics has grown rapidly as a potential new generation of HV insulating materials with unique properties. Experimental progress in this field and the challenges facing practical implementation will be commented upon. A wide range of materials (largely nanofillers in a polymer matrix, or intercalated or exfoliated layered natural or synthetic inorganics within a polymer matrix) are being evaluated by universities and industry, and several significant improvements in important electrical, mechanical, physical, and thermal properties have been confirmed. This suggests that a number of HV insulation applications could benefit from such materials. However, some limitations have been identified which need to be understood and corrected or accommodated. The role of the nano-inorganic-polymer interface (dielectric and electronic properties, space charge mitigation, band-gap and charge injection effects, and morphology effects) will be considered. Based on this analysis, the ultimate potential that might be realized via nanodielectrics will be envisaged.

  15. Entropy noise: A review of theory, progress and challenges

    Directory of Open Access Journals (Sweden)

    Aimee S Morgans

    2016-12-01

    Full Text Available Combustion noise comprises two components: direct combustion noise and indirect combustion noise. The latter is the lesser studied, with entropy noise believed to be its main component. Entropy noise is generated via a sequence involving diverse flow physics. It has enjoyed a resurgence of interest over recent years, because of its increasing importance to aero-engine exhaust noise and a recognition that it can affect gas turbine combustion instabilities. Entropy noise occurs when unsteady heat release rate generates temperature fluctuations (entropy waves), and these subsequently undergo acceleration. Five stages of flow physics have been identified as being important, these being (a) generation of entropy waves by unsteady heat release rate; (b) advection of entropy waves through the combustor; (c) acceleration of entropy waves through either a nozzle or blade row, to generate entropy noise; (d) passage of entropy noise through a succession of turbine blade rows to appear at the turbine exit; and (e) reflection of entropy noise back into the combustor, where it may further perturb the flame, influencing the combustor thermoacoustics. This article reviews the underlying theory, recent progress and outstanding challenges pertaining to each of these stages.
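
    Stage (b) admits a simple closed-form illustration: treating the normalized entropy fluctuation \sigma = s'/c_p as a passive scalar advected by a uniform mean flow \bar{u} (a minimal sketch that ignores the dispersion and dissipation a real combustor would add), the wave obeys

        \frac{\partial \sigma}{\partial t} + \bar{u}\,\frac{\partial \sigma}{\partial x} = 0
        \quad\Longrightarrow\quad
        \sigma(x,t) = \sigma(0,\, t - x/\bar{u}),

    so a wave generated at the flame reaches a nozzle a distance L downstream after a delay \tau = L/\bar{u}, where its acceleration (stage (c)) converts part of it into sound.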

  16. Alkaline polymer electrolyte fuel cells: Principle, challenges, and recent progress

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Polymer electrolyte membrane fuel cells (PEMFC) have been recognized as a significant power source in future energy systems based on hydrogen. The current PEMFC technology features the employment of acidic polymer electrolytes which, albeit superior to electrolyte solutions, have intrinsically limited the catalysts to noble metals, fundamentally preventing PEMFC from widespread deployment. An effective solution to this problem is to develop fuel cells based on alkaline polymer electrolytes (APEFC), which not only enable the use of non-precious metal catalysts but also avoid the carbonate-precipitate issue that has troubled conventional alkaline fuel cells (AFC). This feature article introduces the principle of APEFC, the challenges, and our research progress, and focuses on strategies for developing key materials, including high-performance alkaline polyelectrolytes and stable non-precious metal catalysts. For alkaline polymer electrolytes, high ionic conductivity and satisfactory mechanical properties are difficult to balance; therefore, polymer cross-linking is the ultimate strategy. For non-precious metal catalysts, it is urgent to improve catalytic activity and stability. New materials, such as transition-metal complexes, nitrogen-doped carbon nanotubes, and metal carbides, may become applicable in APEFC.

  17. Identifying Key Challenges in Performance Issues in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ashraf Zia

    2012-10-01

    Full Text Available Cloud computing is a harbinger of a new era in the field of computing, where distributed and centralized services are used in a unique way. In cloud computing, the computational resources of different vendors and IT service providers are managed to provide an enormous and scalable computing services platform that offers efficient data processing coupled with better QoS at a lower cost. On-demand, dynamic, and scalable resource allocation is the main motive behind the development and deployment of cloud computing. The potential growth in this area and the presence of some dominant organizations with abundant resources (like Google, Amazon, Salesforce, Rackspace, Azure, GoGrid) make the field of cloud computing more fascinating. All cloud computing processes need to work in unison to deliver better QoS, i.e., to provide better software functionality, meet the tenants’ requirements for their desired processing power, and exploit elevated bandwidth. However, several technical and functional issues, e.g., pervasive access to resources, dynamic discovery, and on-the-fly access and composition of resources, pose serious challenges for cloud computing. In this study, the performance issues in cloud computing are discussed. A number of schemes pertaining to QoS issues are critically analyzed to point out their strengths and weaknesses. Some of the performance parameters at the three basic layers of the cloud — Infrastructure as a Service, Platform as a Service and Software as a Service — are also discussed in this paper.

  18. Malaria: Global progress 2000 - 2015 and future challenges.

    Science.gov (United States)

    Cibulskis, Richard E; Alonso, Pedro; Aponte, John; Aregawi, Maru; Barrette, Amy; Bergeron, Laurent; Fergus, Cristin A; Knox, Tessa; Lynch, Michael; Patouillard, Edith; Schwarte, Silvia; Stewart, Saira; Williams, Ryan

    2016-06-09

    2015 was the target year for malaria goals set by the World Health Assembly and other international institutions to reduce malaria incidence and mortality. A review of progress indicates that malaria programme financing and coverage have been transformed since the beginning of the millennium, and have contributed to substantial reductions in the burden of disease. Investments in malaria programmes increased by more than 2.5 times between 2005 and 2014, from US$ 960 million to US$ 2.5 billion, allowing an expansion in malaria prevention, diagnostic testing and treatment programmes. In 2015 more than half of the population of sub-Saharan Africa slept under insecticide-treated mosquito nets, compared to just 2% in 2000. Increased availability of rapid diagnostic tests and antimalarial medicines has allowed many more people to access timely and appropriate treatment. Malaria incidence rates have decreased by 37% globally and mortality rates by 60% since 2000. It is estimated that 70% of the reductions in numbers of cases in sub-Saharan Africa can be attributed to malaria interventions. Reductions in malaria incidence and mortality rates have been made in every WHO region and almost every country. However, decreases in malaria case incidence and mortality rates were slowest in countries that had the largest numbers of malaria cases and deaths in 2000; reductions in incidence need to be greatly accelerated in these countries to achieve future malaria targets. Progress is challenging because malaria is concentrated in countries and areas with the least-resourced health systems and the least ability to pay for system improvements. Malaria interventions are nevertheless highly cost-effective and have not only led to significant reductions in the incidence of the disease but are estimated to have saved about US$ 900 million in malaria case management costs to public providers in sub-Saharan Africa between 2000 and 2014. Investments in malaria programmes can not

  19. Business aspects of cardiovascular computed tomography: tackling the challenges.

    Science.gov (United States)

    Bateman, Timothy M

    2008-01-01

    The purpose of this article is to provide a comprehensive understanding of the business issues surrounding provision of dedicated cardiovascular computed tomographic imaging. Some of the challenges include high up-front costs, current low utilization relative to scanner capability, and inadequate payments. Cardiovascular computed tomographic imaging is a valuable clinical modality that should be offered by cardiovascular centers-of-excellence. With careful consideration of the business aspects, moderate-to-large size cardiology programs should be able to implement an economically viable cardiovascular computed tomographic service.

  20. HIV and Hepatitis Testing: Global Progress, Challenges, and Future Directions.

    Science.gov (United States)

    Easterbrook, Philippa; Johnson, Cheryl; Figueroa, Carmen; Baggaley, Rachel

    2016-01-01

    HIV infection and viral hepatitis due to HBV and HCV infection are major causes of chronic disease worldwide, and share some common routes of transmission, epidemiology, initial barriers faced in treatment access, and strategies for a global public health response. Testing and diagnosis of HIV, HBV, and HCV infection is the gateway for access to both care and treatment and prevention services, and crucial for an effective HIV and hepatitis epidemic response. In this review article, we first summarize the common goals and guiding principles in a public health approach to HIV and hepatitis testing. We summarize the impressive global progress in HIV testing scale-up and evolution of approaches, with expansion of provider-initiated testing and counseling in clinical settings (particularly antenatal and tuberculosis clinics), the introduction of more community-based testing services, and use of rapid diagnostic tests enabling provision of same-day test results. However, 46% of all people living with HIV are still unaware of their serostatus, and many continue to be diagnosed and start antiretroviral therapy late. As testing and treatment scale-up accelerates towards a "treat all" approach, other challenges to address include how to better focus testing and reach those yet undiagnosed and most at risk, especially key populations, men, adolescents, and children. We summarize future directions in HIV testing to speed scale-up and close gaps that are addressed in the WHO 2015 consolidated HIV testing guidelines. In contrast to HIV, action in hepatitis testing and treatment has been fragmented and limited to a few countries, and there remains a large burden of undiagnosed cases globally. We summarize key challenges in the hepatitis testing response, including lack of simple, reliable, and low-cost diagnostic tests, laboratory capacity, and testing facilities; inadequate data to guide country-specific hepatitis testing approaches and who to screen; stigmatization and social

  1. Child-Computer Interaction SIG: New Challenges and Opportunities

    DEFF Research Database (Denmark)

    Hourcade, Juan Pablo; Iversen, Ole Sejer; Revelle, Glenda

    2016-01-01

    This SIG will provide child-computer interaction researchers and practitioners an opportunity to discuss four topics that represent new challenges and opportunities for the community. The four areas are: interactive technologies for children under the age of five, technology for inclusion, privacy...

  2. Theoretical models for coronary vascular biomechanics: Progress & challenges

    Science.gov (United States)

    Waters, Sarah L.; Alastruey, Jordi; Beard, Daniel A.; Bovendeerd, Peter H.M.; Davies, Peter F.; Jayaraman, Girija; Jensen, Oliver E.; Lee, Jack; Parker, Kim H.; Popel, Aleksander S.; Secomb, Timothy W.; Siebes, Maria; Sherwin, Spencer J.; Shipley, Rebecca J.; Smith, Nicolas P.; van de Vosse, Frans N.

    2013-01-01

    A key aim of the cardiac Physiome Project is to develop theoretical models to simulate the functional behaviour of the heart under physiological and pathophysiological conditions. Heart function is critically dependent on the delivery of an adequate blood supply to the myocardium via the coronary vasculature. Key to this critical function of the coronary vasculature is system dynamics that emerge via the interactions of the numerous constituent components at a range of spatial and temporal scales. Here, we focus on several components for which theoretical approaches can be applied, including vascular structure and mechanics, blood flow and mass transport, flow regulation, angiogenesis and vascular remodelling, and vascular cellular mechanics. For each component, we summarise the current state of the art in model development, and discuss areas requiring further research. We highlight the major challenges associated with integrating the component models to develop a computational tool that can ultimately be used to simulate the responses of the coronary vascular system to changing demands and to diseases and therapies. PMID:21040741
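
    As a worked note on the blood-flow component above (a classical textbook relation, not a result of this paper): for steady laminar flow of a Newtonian fluid of viscosity \mu through a rigid vessel segment of radius r and length L under a pressure drop \Delta p, Poiseuille's law gives

        Q = \frac{\pi r^{4} \,\Delta p}{8 \mu L},

    and the strong r^4 dependence is one reason modest coronary lumen narrowing so sharply limits myocardial blood supply.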

  3. Graphene as Cancer Theranostic Tool: Progress and Future Challenges

    Science.gov (United States)

    Orecchioni, Marco; Cabizza, Roberto; Bianco, Alberto; Delogu, Lucia Gemma

    2015-01-01

    Nowadays cancer remains one of the main causes of death in the world. Current diagnostic techniques need to be improved to provide earlier diagnosis and treatment. Traditional therapy approaches to cancer are limited by lack of specificity and systemic toxicity. In this scenario nanomaterials could be good allies to give more specific cancer treatment effectively reducing undesired side effects and giving at the same time accurate diagnosis and successful therapy. In this context, thanks to its unique physical and chemical properties, graphene, graphene oxide (GO) and reduced graphene (rGO) have recently attracted tremendous interest in biomedicine including cancer therapy. Herein we analyzed all studies presented in literature related to cancer fight using graphene and graphene-based conjugates. In this context, we aimed at the full picture of the state of the art providing new inputs for future strategies in the cancer theranostic by using of graphene. We found an impressive increasing interest in the material for cancer therapy and/or diagnosis. The majority of the works (73%) have been carried out on drug and gene delivery applications, following by photothermal therapy (32%), imaging (31%) and photodynamic therapy (10%). A 27% of the studies focused on theranostic applications. Part of the works here discussed contribute to the growth of the theranostic field covering the use of imaging (i.e. ultrasonography, positron electron tomography, and fluorescent imaging) combined to one or more therapeutic modalities. We found that the use of graphene in cancer theranostics is still in an early but rapidly growing stage of investigation. Any technology based on nanomaterials can significantly enhance their possibility to became the real revolution in medicine if combines diagnosis and therapy at the same time. We performed a comprehensive summary of the latest progress of graphene cancer fight and highlighted the future challenges and the innovative possible

  4. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  5. Challenges for the CMS Computing Model in the First Year

    CERN Document Server

    Fisk, Ian

    2009-01-01

    CMS is in the process of commissioning a complex detector and a globally distributed computing infrastructure simultaneously. This represents a unique challenge. Even at the beginning there is not sufficient analysis or organized processing resources at CERN alone. In this presentation we discuss the unique computing challenges CMS expects to face during the first year of running and how they influence the baseline computing model decisions. During the early accelerator commissioning periods, CMS will attempt to collect as many events as possible when the beam is on in order to provide adequate early commissioning data. Some of these plans involve overdriving the Tier-0 infrastructure during data collection with recovery when the beam is off. In addition to the larger number of triggered events, there will be pressure in the first year to collect and analyze more complete data formats as the summarized formats mature. The large event formats impact the required storage, bandwidth, and processing capacity across all the computing centers.

  6. Mathematical and Computational Challenges in Population Biology and Ecosystems Science

    Science.gov (United States)

    Levin, Simon A.; Grenfell, Bryan; Hastings, Alan; Perelson, Alan S.

    1997-01-01

    Mathematical and computational approaches provide powerful tools in the study of problems in population biology and ecosystems science. The subject has a rich history intertwined with the development of statistics and dynamical systems theory, but recent analytical advances, coupled with the enhanced potential of high-speed computation, have opened up new vistas and presented new challenges. Key challenges involve ways to deal with the collective dynamics of heterogeneous ensembles of individuals, and to scale from small spatial regions to large ones. The central issues (understanding how detail at one scale makes its signature felt at other scales, and how to relate phenomena across scales) cut across scientific disciplines and go to the heart of algorithmic development of approaches to high-speed computation. Examples are given from ecology, genetics, epidemiology, and immunology.

  7. Challenges and opportunities of cloud computing for atmospheric sciences

    Science.gov (United States)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is that a project no longer depends on access to a large local cyberinfrastructure to be funded or performed. Cloud computing can avoid the maintenance expenses of large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections, and the two are closely related: uncertainty can usually be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources to climate modeling, using the cloud infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and research.

  8. Scenario-Based Digital Forensics Challenges in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Erik Miranda Lopez

    2016-10-01

    Full Text Available The aim of digital forensics is to extract information to answer the 5Ws (Why, When, Where, What, and Who) from the data extracted from the evidence. In order to achieve this, most digital forensic processes assume absolute control of digital evidence. However, in a cloud environment forensic investigation, this is not always possible. Additionally, the unique characteristics of cloud computing create new technical, legal and architectural challenges when conducting a forensic investigation. We propose a hypothetical scenario to uncover and explain the challenges forensic practitioners face during cloud investigations. Additionally, we also provide solutions to address the challenges. Our hypothetical case scenario has shown that, in the long run, better live forensic tools, new methods tailored for cloud investigations, and new procedures and standards are indeed needed. Furthermore, we have come to the conclusion that the biggest challenge in forensic investigations is not technical but legal.

  9. The WLCG common computing readiness challenge CCRC’08

    CERN Document Server

    Mendez Lorenzo, Patricia; Campana, Simone; Santinelli, Roberto; Lamanna, Massimo; Lanciotti, Elisa; Di Girolamo, Alessandro; Magini, Nicolo; Miccio, Enzo; Shiers, Jamie; Renshall, Harry

    2008-01-01

    The world’s biggest machine - the Large Hadron Collider (LHC) at CERN, Geneva, Switzerland - will enter operation in 2008. Using the Grid infrastructure provided mostly by EGEE and OSG, the WLCG project has been chosen to provide the computational and storage resource needs for the 4 experiments of the LHC. The goal of the Common Computing Readiness Challenge (CCRC’08) is to demonstrate that these computing facilities can be used to satisfy the needs of the experiments. The LHC machine will produce some 15 PB of data per year. The management and the analysis of these data rely on a worldwide production Grid service involving hundreds of sites from EGEE and collaborating Grids. One significant challenge remains: to demonstrate that these computing facilities can be used to satisfy simultaneously the needs of the 4 major experiments of the LHC at full 2008 rates. During CCRC’08 we will demonstrate precisely this. Given the importance of the challenge, two phases are foreseen: an initial run in February, w...

  10. Computational challenges in modeling and simulating living matter

    Science.gov (United States)

    Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.; de Castro, Maria Clicia Stelling

    2016-12-01

    Computational modeling has been successfully used to help scientists understand physical and biological phenomena. Recent technological advances allow the simulation of larger systems with greater accuracy. However, devising those systems requires new approaches and novel architectures, such as the use of parallel programming, so that applications can run in the new high-performance environments, which are often computer clusters composed of different computation devices, such as traditional CPUs, GPGPUs, Xeon Phis and even FPGAs. It is expected that scientists will take advantage of the increasing computational power to model and simulate more complex structures and even merge different models into larger and more extensive ones. This paper aims at discussing the challenges of using those devices to simulate such complex systems.

  11. Maternal and child health in Brazil: progress and challenges.

    Science.gov (United States)

    Victora, Cesar G; Aquino, Estela M L; do Carmo Leal, Maria; Monteiro, Carlos Augusto; Barros, Fernando C; Szwarcwald, Celia L

    2011-05-28

    In the past three decades, Brazil has undergone rapid changes in major social determinants of health and in the organisation of health services. In this report, we examine how these changes have affected indicators of maternal health, child health, and child nutrition. We use data from vital statistics, population censuses, demographic and health surveys, and published reports. In the past three decades, infant mortality rates have reduced substantially, decreasing by 5·5% a year in the 1980s and 1990s, and by 4·4% a year since 2000 to reach 20 deaths per 1000 livebirths in 2008. Neonatal deaths account for 68% of infant deaths. Stunting prevalence among children younger than 5 years decreased from 37% in 1974-75 to 7% in 2006-07. Regional differences in stunting and child mortality also decreased. Access to most maternal-health and child-health interventions increased sharply to almost universal coverage, and regional and socioeconomic inequalities in access to such interventions were notably reduced. The median duration of breastfeeding increased from 2·5 months in the 1970s to 14 months by 2006-07. Official statistics show stable maternal mortality ratios during the past 10 years, but modelled data indicate a yearly decrease of 4%, a trend which might not have been noticeable in official reports because of improvements in death registration and the increased number of investigations into deaths of women of reproductive age. The reasons behind Brazil's progress include: socioeconomic and demographic changes (economic growth, reduction in income disparities between the poorest and wealthiest populations, urbanisation, improved education of women, and decreased fertility rates), interventions outside the health sector (a conditional cash transfer programme and improvements in water and sanitation), vertical health programmes in the 1980s (promotion of breastfeeding, oral rehydration, and immunisations), creation of a tax-funded national health service in 1988

  12. Precision Medicine and PET/Computed Tomography: Challenges and Implementation.

    Science.gov (United States)

    Subramaniam, Rathan M

    2017-01-01

    Precision Medicine is about selecting the right therapy for the right patient, at the right time, specific to the molecular targets expressed by disease or tumors, in the context of patient's environment and lifestyle. Some of the challenges for delivery of precision medicine in oncology include biomarkers for patient selection for enrichment-precision diagnostics, mapping out tumor heterogeneity that contributes to therapy failures, and early therapy assessment to identify resistance to therapies. PET/computed tomography offers solutions in these important areas of challenges and facilitates implementation of precision medicine. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Challenges and possible approaches: towards the petaflops computers

    Institute of Scientific and Technical Information of China (English)

    Depei QIAN; Danfeng ZHU

    2009-01-01

    In parallel with the R&D efforts in the USA and Europe, China's National High-tech R&D Program has set its goal of developing petaflops computers. Researchers and engineers worldwide are looking for appropriate methods and technologies to achieve a petaflops computer system. Based on a discussion of important design issues in developing the petaflops computer, this paper raises the major technological challenges, including the memory wall, low-power system design, interconnects, and programming support. Current efforts in addressing some of these challenges and in pursuing possible solutions for developing petaflops systems are presented. Several existing systems are briefly introduced as examples, including Roadrunner, Cray XT5 Jaguar, Dawning 5000A/6000, and Lenovo DeepComp 7000. Architectures proposed by Chinese researchers for implementing the petaflops computer are also introduced. Advantages of the architecture as well as the difficulties in its implementation are discussed. Finally, future research directions in the development of high-productivity computing systems are discussed.

  14. Challenges for the CMS computing model in the first year

    Energy Technology Data Exchange (ETDEWEB)

    Fisk, I, E-mail: ifisk@fnal.go [Fermi National Accelerator Laboratory (United States)

    2010-04-01

    CMS is in the process of commissioning a complex detector and a globally distributed computing infrastructure simultaneously. This represents a unique challenge. Even at the beginning there is not sufficient analysis or organized processing resources at CERN alone. In this presentation we discuss the unique computing challenges CMS expects to face during the first year of running and how they influence the baseline computing model decisions. During the early accelerator commissioning periods, CMS will attempt to collect as many events as possible when the beam is on in order to provide adequate early commissioning data. Some of these plans involve overdriving the Tier-0 infrastructure during data collection with recovery when the beam is off. In addition to the larger number of triggered events, there will be pressure in the first year to collect and analyze more complete data formats as the summarized formats mature. The large event formats impact the required storage, bandwidth, and processing capacity across all the computing centers. While the understanding of the detector and the event selections is being improved, there will likely be a larger number of reconstruction passes and skims performed by both central operations and individual users. We discuss how these additional stresses impact the allocation of resources and the changes from the baseline computing model.

  16. Reviews on Security Issues and Challenges in Cloud Computing

    Science.gov (United States)

    An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.

    2016-11-01

    Cloud computing is an Internet-based computing service provided by third parties, allowing the sharing of resources and data among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes the way the Information Technology (IT) of an organization is organized and managed. It provides many benefits, such as simplicity and lower costs, almost unlimited storage, minimal maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment and a lower barrier to entry. While the use of cloud computing services increases in this new era, their security issues become a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first outlines the architecture of cloud computing, then discusses the most common security issues of using the cloud and some solutions to them, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.

  17. Pluripotent stem cells for Parkinson's disease: progress and challenges

    National Research Council Canada - National Science Library

    Zeng, Xianmin; Couture, Larry A

    2013-01-01

    Parkinson's disease (PD) is a common debilitating neurodegenerative disease. The motor symptoms of PD are caused mainly by a progressive loss of dopaminergic neurons from the substania nigra, resulting in a loss of dopamine production...

  18. Computational challenges in atomic, molecular and optical physics.

    Science.gov (United States)

    Taylor, Kenneth T

    2002-06-15

    Six challenges are discussed. These are the laser-driven helium atom; the laser-driven hydrogen molecule and hydrogen molecular ion; electron scattering (with ionization) from one-electron atoms; the vibrational and rotational structure of molecules such as H₃⁺ and water at their dissociation limits; laser-heated clusters; and quantum degeneracy and Bose-Einstein condensation. The first four concern fundamental few-body systems where use of high-performance computing (HPC) is currently making possible accurate modelling from first principles. This leads to reliable predictions and support for laboratory experiment as well as true understanding of the dynamics. Important aspects of these challenges addressable only via a terascale facility are set out. Such a facility makes the last two challenges in the above list meaningfully accessible for the first time, and the scientific interest together with the prospective role for HPC in these is emphasized.

  19. Computed Optical Interferometric Imaging: Methods, Achievements, and Challenges.

    Science.gov (United States)

    South, Fredrick A; Liu, Yuan-Zhi; Carney, P Scott; Boppart, Stephen A

    2016-01-01

    Three-dimensional high-resolution optical imaging systems are generally restricted by the trade-off between resolution and depth-of-field as well as imperfections in the imaging system or sample. Computed optical interferometric imaging is able to overcome these longstanding limitations using methods such as interferometric synthetic aperture microscopy (ISAM) and computational adaptive optics (CAO) which manipulate the complex interferometric data. These techniques correct for limited depth-of-field and optical aberrations without the need for additional hardware. This paper aims to outline these computational methods, making them readily available to the research community. Achievements of the techniques will be highlighted, along with past and present challenges in implementing the techniques. Challenges such as phase instability and determination of the appropriate aberration correction have been largely overcome so that imaging of living tissues using ISAM and CAO is now possible. Computed imaging in optics is becoming a mature technology poised to make a significant impact in medicine and biology.
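
    The aberration correction at the heart of CAO can be sketched as a phase filter applied to the complex data in the spatial-frequency domain. The snippet below is a minimal illustration of that idea, not the authors' implementation; the aberration phase phi is a hypothetical input that in practice would be estimated from the data (e.g., by optimizing an image sharpness metric):

        import numpy as np

        def correct_aberration(field, phi):
            """Correct one en-face plane of a complex interferometric volume.

            field : 2D complex array (reconstructed complex field)
            phi   : 2D real array, estimated aberration phase in radians,
                    sampled on the centered spatial-frequency grid
            """
            spectrum = np.fft.fftshift(np.fft.fft2(field))    # centered frequency-domain view
            corrected = spectrum * np.exp(-1j * phi)          # cancel the aberration phase
            return np.fft.ifft2(np.fft.ifftshift(corrected))  # back to the image domain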

  20. The ATLAS computing challenge for HL-LHC

    CERN Document Server

    Campana, Simone; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment successfully commissioned a software and computing infrastructure to support the physics program during LHC Run 2. The next phases of the accelerator upgrade will present new challenges in the offline area. In particular, at the High Luminosity LHC (also known as Run 4) the data-taking conditions will be very demanding in terms of computing resources: between 5 and 10 kHz of event rate from the HLT to be reconstructed (and possibly further reprocessed) with an average pile-up of up to 200 events per collision, and an equivalent number of simulated samples to be produced. The same parameters for the current run are lower by up to an order of magnitude. While processing and storage resources would need to scale accordingly, the funding situation allows one at best to consider a flat budget over the next few years for offline computing needs. In this paper we present a study quantifying the challenge in terms of computing resources for HL-LHC and present ideas about the possible evolution of the ...

  1. Conceptual challenges and computational progress in X-ray simulation

    CERN Document Server

    Pia, Maria Grazia; Begalli, Marcia; Kim, Chan-Hyeung; Quintieri, Lina; Saracco, Paolo; Seo, Hee; Sudhakar, Manju; Weidenspointner, Georg; Zoglauer, Andreas

    2010-01-01

    Recent developments and validation tests related to the simulation of X-ray fluorescence and PIXE with Geant4 are reviewed. They concern new models for PIXE, which have enabled the first Geant4-based simulation of PIXE in a concrete experimental application, and the experimental validation of the content of the EADL data library relevant to the simulation of X-ray fluorescence. Achievements and open issues in this domain are discussed.

  2. The (WLCG) Common Computing Readiness Challenge (CCRC'08)

    CERN Document Server

    CERN. Geneva

    2008-01-01

    For several years the four LHC experiments have been running periodic stress tests of their planned computing operations for when LHC data arrive. They have written design documents which describe their models for how they will acquire, distribute and process their data. The models differ in detail but share a common base: a hierarchy of levels, or 'Tiers', of resources. A weakness of all the tests done so far is that they have never overlapped all the experiments together, so they never reached the full rates expected at CERN and the Tier sites, and never had to cope with the lack of homogeneity of having the 4 different experiments' data management and batch processing models busy at the same time. The CCRC’08 (Common Computing Readiness Challenge) tests address this by deliberately exercising all experiments' computing at the target rates at the same time.

  3. ATLAS computing challenges before the next LHC run

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2014-01-01

    ATLAS software and computing is in a period of intensive evolution. The current long shutdown presents an opportunity to assimilate lessons from the very successful Run 1 (2009-2013) and to prepare for the substantially increased computing requirements for Run 2 (from spring 2015). Run 2 will bring a near doubling of the energy and the data rate, high event pile-up levels, and higher event complexity from detector upgrades, meaning the number and complexity of events to be analyzed will increase dramatically. At the same time operational loads must be reduced through greater automation, a wider array of opportunistic resources must be supported, costly storage must be used with greater efficiency, a sophisticated new analysis model must be integrated, and concurrency features of new processors must be exploited. This paper surveys the distributed computing aspects of the upgrade program and the plans for 2014 to exercise the new capabilities in a large scale Data Challenge.

  4. Achievements and challenges in structural bioinformatics and computational biophysics.

    Science.gov (United States)

    Samish, Ilan; Bourne, Philip E; Najmanovich, Rafael J

    2015-01-01

    The field of structural bioinformatics and computational biophysics has undergone a revolution in the last 10 years; these developments are captured annually through the 3DSIG meeting, upon which this article reflects. An increase in accessible data, computational resources and methodology has resulted in an increase in the size and resolution of studied systems and in the complexity of the questions amenable to research. Concomitantly, the parameterization and efficiency of the methods have markedly improved, along with their cross-validation with other computational and experimental results. The field exhibits an ever-increasing integration with biochemistry, biophysics and other disciplines. In this article, we discuss recent achievements along with current challenges within the field. © The Author 2014. Published by Oxford University Press.

  5. Images as drivers of progress in cardiac computational modelling.

    Science.gov (United States)

    Lamata, Pablo; Casero, Ramón; Carapella, Valentina; Niederer, Steve A; Bishop, Martin J; Schneider, Jürgen E; Kohl, Peter; Grau, Vicente

    2014-08-01

    Computational models have become a fundamental tool in cardiac research. Models are evolving to cover multiple scales and physical mechanisms. They are moving towards mechanistic descriptions of personalised structure and function, including effects of natural variability. These developments are underpinned to a large extent by advances in imaging technologies. This article reviews how novel imaging technologies, or the innovative use and extension of established ones, integrate with computational models and drive novel insights into cardiac biophysics. In terms of structural characterization, we discuss how imaging is allowing a wide range of scales to be considered, from cellular levels to whole organs. We analyse how the evolution from structural to functional imaging is opening new avenues for computational models, and in this respect we review methods for measurement of electrical activity, mechanics and flow. Finally, we consider ways in which combined imaging and modelling research is likely to continue advancing cardiac research, and identify some of the main challenges that remain to be solved.

  6. Stem cell models of Alzheimer's disease: progress and challenges

    National Research Council Canada - National Science Library

    Charles Arber; Christopher Lovejoy; Selina Wray

    2017-01-01

    .... Induced pluripotent stem cell (iPSC) technology, together with advances in 2D and 3D neuronal differentiation, offers a unique opportunity to overcome this challenge and generate a limitless supply of human neurons for in vitro studies...

  7. Computational challenges in the analyses of petrophysics using microtomography and upscaling: A review

    Science.gov (United States)

    Liu, Jie; Pereira, Gerald G.; Liu, Qingbin; Regenauer-Lieb, Klaus

    2016-04-01

    Microtomography provides detailed 3D internal structures of materials at micrometer to tens-of-nanometer resolution and is quickly turning into a new technology for studying the petrophysical properties of rocks. An important step is the upscaling of these properties, as micron or sub-micron resolution can only be achieved on a sample scale of millimeters or even less than a millimeter. We have developed a computational workflow for the analysis of microstructures, including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at micro to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. In this paper we discuss the computational challenges arising from the workflow, which include: 1) characterization of microtomography for extremely large data sets; 2) computational fluid dynamics simulations at pore scale for permeability estimation; 3) solid mechanical computations at pore scale for estimating elasto-plastic properties; and 4) extracting critical exponents from derivative models for scaling laws. We conclude that significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big data tool for multi-scale scientific and engineering problems.
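
    For challenge 2), the step from a pore-scale flow simulation to a sample-scale permeability is Darcy's law. A minimal sketch with hypothetical values (in practice the volumetric flux Q would come from the CFD solution; SI units assumed):

        def darcy_permeability(Q, mu, L, A, dp):
            """Permeability k from Darcy's law, Q = k * A * dp / (mu * L)."""
            return Q * mu * L / (A * dp)

        k = darcy_permeability(Q=1.0e-12,   # m^3/s, simulated volumetric flux (hypothetical)
                               mu=1.0e-3,   # Pa*s, viscosity of water
                               L=1.0e-3,    # m, sample length (~1 mm)
                               A=1.0e-6,    # m^2, cross-sectional area
                               dp=1.0e3)    # Pa, imposed pressure drop
        print(k, "m^2")                     # 1e-15 m^2, roughly one millidarcy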

  8. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    Science.gov (United States)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks at micrometer to tens-of-nanometer resolution and is quickly turning into a new technology for studying petrophysical properties of materials. An important step is the upscaling of these properties, as micron or sub-micron resolution can only be achieved on a sample scale of millimeters or even less than a millimeter. We present here a recently developed computational workflow for the analysis of microstructures including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at micro to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples, biological and food science materials. We have also applied the technique on high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big data problem of analyzing petrophysical properties and its subsequent upscaling. We discuss the following challenges: 1) Characterization of microtomography for extremely large data sets - our current capability. 2) Computational fluid dynamics simulations at pore-scale for permeability estimation - methods, computing cost and accuracy. 3) Solid mechanical computations at pore-scale for estimating elasto-plastic properties - computational stability, cost, and efficiency. 4) Extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big data tool for multi-scale scientific and engineering problems.

  9. Science Education Reform in Qatar: Progress and Challenges

    Science.gov (United States)

    Said, Ziad

    2016-01-01

    Science education reform in Qatar has had limited success. In the Trends in International Mathematics and Science Study (TIMSS), Qatari 4th and 8th grade students have shown progress in science achievement, but they remain significantly below the international average. Also, in the Program for International Student Assessment (PISA), Qatari…

  10. Brain-computer interface systems: progress and prospects.

    Science.gov (United States)

    Allison, Brendan Z; Wolpaw, Elizabeth Winter; Wolpaw, Jonathan R

    2007-07-01

    Brain-computer interface (BCI) systems support communication through direct measures of neural activity without muscle activity. BCIs may provide the best and sometimes the only communication option for users disabled by the most severe neuromuscular disorders and may eventually become useful to less severely disabled and/or healthy individuals across a wide range of applications. This review discusses the structure and functions of BCI systems, clarifies terminology and addresses practical applications. Progress and opportunities in the field are also identified and explicated.

  11. Computational challenges of sequence classification in microbiomic data.

    Science.gov (United States)

    Ribeca, Paolo; Valiente, Gabriel

    2011-11-01

    Next-generation sequencing technologies have opened up an unprecedented opportunity for microbiology by enabling the culture-independent genetic study of complex microbial communities, which were until recently largely unknown. The analysis of metagenomic data is challenging: potentially, one is faced with a sample containing a mixture of many different bacterial species, whose genomes have not necessarily been sequenced beforehand. In the simpler case of the analysis of 16S ribosomal RNA metagenomic data, for which databases of reference sequences are known, we survey the computational challenges to be solved in order to characterize and quantify a sample. In particular, we examine two aspects: how the necessary adoption of new tools geared towards high-throughput analysis impacts the quality of the results, and how well various established methods perform in assigning sequence reads to microbial species, with and without taking taxonomic information into account.
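
    As a toy illustration of the read-assignment task examined in the survey, the sketch below (ours) assigns a read to whichever reference sequence shares the most k-mers with it. The reference dictionary, the sequences, and the fixed k are invented placeholders; production classifiers use compressed indexes, error models, and taxonomy-aware assignment.

```python
# A toy illustration of reference-based read assignment, assuming a
# small dict of (hypothetical) 16S reference sequences.
from typing import Dict, Set

def kmers(seq: str, k: int = 8) -> Set[str]:
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

references: Dict[str, str] = {  # hypothetical reference sequences
    "Escherichia_coli":  "ACGTACGTGGCCTTAAGGCCACGT",
    "Bacillus_subtilis": "TTGACCAAGGTTCCAACGGTTGAC",
}
ref_index = {name: kmers(seq) for name, seq in references.items()}

def classify(read: str, k: int = 8) -> str:
    """Assign a read to the reference sharing the most k-mers."""
    scores = {n: len(kmers(read, k) & idx) for n, idx in ref_index.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unclassified"

print(classify("ACGTACGTGGCCTTAA"))  # -> Escherichia_coli
```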

  12. Computational Challenges of the AARTFAAC All-sky Monitor

    Science.gov (United States)

    Huizinga, Folkert

    2014-04-01

    The AARTFAAC project will provide the LOFAR telescope with a fully commensal, continuously operational, all-sky transient monitoring system. This is achieved by real-time correlation of up to 288 wide-field antennae from the LOFAR core, followed by a high-performance calibration and imaging pipeline which feeds results to the existing LOFAR transient detection system. This poses formidable computational challenges, which have been addressed by the development of a heterogeneous system including FPGAs, GPUs and CPUs. I will describe the system architecture with a particular emphasis on the implementation of, and first performance results from, the calibration and imaging pipeline.
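
    The real-time correlation step can be pictured as an FX correlator: Fourier-transform each antenna stream into channels (F), then cross-multiply and accumulate every antenna pair per channel (X). The NumPy sketch below (ours) shows the data flow only; the array sizes and random samples are placeholders, and the actual AARTFAAC pipeline runs on FPGAs and GPUs at vastly higher rates.

```python
# Minimal FX-correlator sketch: FFT per antenna (F), then accumulate
# cross-products of all antenna pairs per channel (X). Sizes are toy.
import numpy as np

n_ant, n_blocks, n_chan = 8, 100, 64
rng = np.random.default_rng(1)
# complex baseband samples: (antennas, blocks, samples per block)
x = (rng.standard_normal((n_ant, n_blocks, n_chan))
     + 1j * rng.standard_normal((n_ant, n_blocks, n_chan)))

spectra = np.fft.fft(x, axis=-1)                 # F step: channelize
vis = np.zeros((n_ant, n_ant, n_chan), complex)  # visibility matrix
for b in range(n_blocks):                        # X step: accumulate
    s = spectra[:, b, :]
    vis += np.einsum('ic,jc->ijc', s, s.conj())  # all pairs, per channel
vis /= n_blocks
print(vis.shape)  # (8, 8, 64): every antenna pair in every channel
```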

  13. Computational and Theoretical Challenges on Counting Solid Standard Young Tableaux

    CERN Document Server

    Ekhad, Shalosh B

    2012-01-01

    In how many ways can you place n chocolate pieces all of different sizes in an n by n chocolate box, in such a way that when you go from left to right and from top to bottom, there are no gaps AND the sizes increase along each row and each column? The answer is the well-known OEIS Sequence Number 85. To our amazement, the analogous sequence for a three-dimensional chocolate box was not there. Here we fill this gap, and more importantly, offer some computational and theoretical challenges about enumerating families of Solid Standard Young Tableaux.
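
    For reference, the two-dimensional count the abstract alludes to (which we read as OEIS A000085) can be reproduced by summing the hook length formula over all partitions of n, as in the sketch below (ours); the paper's point is that no such product formula is known for the three-dimensional, "solid" case.

```python
# Count all standard Young tableaux with n cells: sum, over partitions
# of n, the hook length formula n! / (product of hook lengths).
from math import factorial

def partitions(n, maximum=None):
    if n == 0:
        yield ()
        return
    maximum = maximum or n
    for first in range(min(n, maximum), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

def n_syt(shape):
    """Number of standard Young tableaux of a given shape."""
    hooks = 1
    for i, row in enumerate(shape):
        for j in range(row):
            arm = row - j - 1
            leg = sum(1 for r in shape[i + 1:] if r > j)
            hooks *= arm + leg + 1
    return factorial(sum(shape)) // hooks

for n in range(1, 8):
    print(n, sum(n_syt(p) for p in partitions(n)))
# prints 1, 2, 4, 10, 26, 76, 232 -- matching OEIS A000085
```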

  14. Computational and Theoretical Challenges on Counting Solid Standard Young Tableaux

    OpenAIRE

    Ekhad, Shalosh B.; Zeilberger, Doron

    2012-01-01

    In how many ways can you place n chocolate pieces all of different sizes in an n by n chocolate box, in such a way that when you go from left to right and from top to bottom, there are no gaps AND the sizes increase along each row and each column? The answer is the well-known OEIS Sequence Number 85. To our amazement, the analogous sequence for a three-dimensional chocolate box was not there. Here we fill this gap, and more importantly, offer some computational and theoretical challenges abou...

  15. Sustainable Nanotechnology: Opportunities and Challenges for Theoretical/Computational Studies.

    Science.gov (United States)

    Cui, Qiang; Hernandez, Rigoberto; Mason, Sara E; Frauenheim, Thomas; Pedersen, Joel A; Geiger, Franz

    2016-08-04

    For assistance in the design of the next generation of nanomaterials that are functional and have minimal health and safety concerns, it is imperative to establish causality, rather than correlations, in how properties of nanomaterials determine biological and environmental outcomes. Due to the vast design space available and the complexity of nano/bio interfaces, theoretical and computational studies are expected to play a major role in this context. In this minireview, we highlight opportunities and pressing challenges for theoretical and computational chemistry approaches to explore the relevant physicochemical processes that span broad length and time scales. We focus discussions on a bottom-up framework that relies on the determination of correct intermolecular forces, accurate molecular dynamics, and coarse-graining procedures to systematically bridge the scales, although top-down approaches are also effective at providing insights for many problems such as the effects of nanoparticles on biological membranes.

  16. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.
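
    As a flavor of how a sliding failure probability of the kind studied in the 2011 Benchmark Workshop can be estimated, the Monte Carlo sketch below (ours) samples uncertain uplift, water thrust and friction, and counts the cases where the driving force exceeds the frictional resistance. All loads, distributions and parameter values are invented for illustration and are not taken from the workshop.

```python
# Hedged Monte Carlo sketch of the sliding failure mode of a gravity
# dam: failure when resisting friction force < driving water thrust.
# All numbers below are assumed, illustrative values.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
weight = 120e6                     # dam self-weight [N], deterministic
uplift = rng.normal(30e6, 5e6, n)  # uplift force [N], uncertain
thrust = rng.normal(45e6, 6e6, n)  # horizontal water thrust [N]
tan_phi = rng.normal(0.7, 0.1, n)  # friction coefficient, uncertain

resisting = (weight - uplift) * tan_phi
p_failure = np.mean(resisting < thrust)
print(f"estimated probability of sliding: {p_failure:.4f}")
```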

  17. Novel spintronics devices for memory and logic: prospects and challenges for room temperature all spin computing

    Science.gov (United States)

    Wang, Jian-Ping

    An energy-efficient memory and logic device for the post-CMOS era has been the goal of a variety of research fields. The limits of scaling, which we expect to reach by the year 2025, demand that future advances in computational power be realized not from ever-shrinking device sizes, but rather from innovative designs and new materials and physics. Magnetoresistance-based devices have been a promising candidate for future integrated magnetic computation because of their unique non-volatility and functionalities. The application of perpendicular magnetic anisotropy for potential STT-RAM applications was demonstrated and has since been intensively investigated by both academic and industry groups, but there is no clear pathway for how scaling will eventually work for both memory and logic applications. One of the main reasons is that no material stack candidate has been demonstrated that could lead to a scaling scheme down to sub-10 nm. Another challenge for the use of magnetoresistance-based devices in logic applications is their available switching speed and writing energy. Although good progress has been made in demonstrating fast switching of a thermally stable magnetic tunnel junction (MTJ) down to 165 ps, this is still several times slower than the CMOS counterpart. In this talk, I will review the recent progress by my research group and my C-SPIN colleagues, then discuss the opportunities, challenges and some potential pathways for magnetoresistance-based devices for memory and logic applications and their integration into a room-temperature all-spin computing system.
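
    The sub-10 nm scaling difficulty can be made quantitative with the standard thermal stability argument: the stability factor Delta = K_u V / (k_B T) shrinks with junction volume, and the expected retention time falls off exponentially (Arrhenius law). The material numbers in the sketch below (ours) are assumed, order-of-magnitude values, not measurements from the talk.

```python
# Back-of-envelope MTJ scaling sketch: thermal stability factor
# Delta = Ku*V/(kB*T) and Arrhenius retention tau0*exp(Delta).
# Ku, t and tau0 are assumed, illustrative values.
import math

kB = 1.380649e-23          # Boltzmann constant [J/K]
T = 300.0                  # room temperature [K]
Ku = 3e5                   # anisotropy energy density [J/m^3], assumed
t = 1.5e-9                 # free-layer thickness [m], assumed
tau0 = 1e-9                # attempt time [s], assumed

for d_nm in (40, 20, 10):  # junction diameter [nm]
    V = math.pi * (d_nm * 1e-9 / 2) ** 2 * t
    delta = Ku * V / (kB * T)
    retention = tau0 * math.exp(delta)
    print(f"d = {d_nm:2d} nm: Delta = {delta:6.1f}, "
          f"retention ~ {retention:.2e} s")
# Delta (and hence retention) collapses as the diameter shrinks,
# which is why sub-10 nm stacks are the bottleneck named above.
```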

  18. Progress in Parallel Schur Complement Preconditioning for Computational Fluid Dynamics

    Science.gov (United States)

    Barth, Timothy J.; Chan, Tony F.; Tang, Wei-Pai; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    We consider preconditioning methods for non-self-adjoint advective-diffusive systems based on a non-overlapping Schur complement procedure for arbitrary triangulated domains. The ultimate goal of this research is to develop scalable preconditioning algorithms for fluid flow discretizations on parallel computing architectures. In our implementation of the Schur complement preconditioning technique, the triangulation is first partitioned into a number of subdomains using the METIS multi-level k-way partitioning code. This partitioning induces a natural 2x2 partitioning of the p.d.e. discretization matrix. By considering various inverse approximations of the 2x2 system, we have developed a family of robust preconditioning techniques. A computer code based on these ideas has been developed and tested on the IBM SP2 and the SGI Power Challenge array using the MPI message-passing protocol. A number of example CFD calculations will be presented to illustrate and assess various Schur complement approximations.
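
    The induced 2x2 block structure can be illustrated with a dense toy example (ours): order the unknowns into interior (I) and interface (G) sets and apply the preconditioner through the block factorization. The sketch uses the exact Schur complement, so it reproduces a direct solve; the paper's preconditioners replace the inner inverses with cheap approximations.

```python
# Dense toy version of a 2x2 Schur complement preconditioner:
# partition unknowns into interior (I) and interface (G) sets.
# A real solver would use sparse approximate inverses instead.
import numpy as np

rng = np.random.default_rng(3)
nI, nG = 8, 4
A = rng.standard_normal((nI + nG, nI + nG)) + (nI + nG) * np.eye(nI + nG)
AII, AIG = A[:nI, :nI], A[:nI, nI:]
AGI, AGG = A[nI:, :nI], A[nI:, nI:]

S = AGG - AGI @ np.linalg.solve(AII, AIG)   # Schur complement

def apply_preconditioner(r):
    """Solve M z = r via the block LDU factorization of A (exact here)."""
    rI, rG = r[:nI], r[nI:]
    zG = np.linalg.solve(S, rG - AGI @ np.linalg.solve(AII, rI))
    zI = np.linalg.solve(AII, rI - AIG @ zG)
    return np.concatenate([zI, zG])

b = rng.standard_normal(nI + nG)
print(np.allclose(apply_preconditioner(b), np.linalg.solve(A, b)))  # True
```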

  19. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  20. Progress and Challenges in Short to Medium Range Coupled Prediction

    Science.gov (United States)

    Brassington, G. B.; Martin, M. J.; Tolman, H. L.; Akella, Santha; Balmeseda, M.; Chambers, C. R. S.; Cummings, J. A.; Drillet, Y.; Jansen, P. A. E. M.; Laloyaux, P.

    2014-01-01

    The availability of GODAE Oceanview-type ocean forecast systems provides the opportunity to develop high-resolution, short- to medium-range coupled prediction systems. Several groups have undertaken the first experiments based on relatively unsophisticated approaches. Progress is being driven at the institutional level targeting a range of applications that represent their respective national interests with clear overlaps and opportunities for information exchange and collaboration. These include general circulation, hurricanes, extra-tropical storms, high-latitude weather and sea-ice forecasting as well as coastal air-sea interaction. In some cases, research has moved beyond case and sensitivity studies to controlled experiments to obtain statistically significant metrics.

  1. Rethinking what is "developmentally appropriate" from a learning progression perspective: The power and the challenge

    Directory of Open Access Journals (Sweden)

    KATHLEEN METZ

    2009-01-01

    Full Text Available Learning progressions have recently become increasingly visible in studies of learning and instruction in science. In this essay, I explore the power and the considerable challenges of rethinking what may be developmentally appropriate for young children's learning of science from the perspective of learning progressions. In particular, I examine the issues of: (a) the design of promising learning progressions within the vast design space of potential progressions; (b) identification of cognitive resources relevant to a progression; (c) analysis of effort/payoff for particular competencies at different points in the progression; (d) attribution of cognitive limitations and achievements; (e) coordination and collaboration needed to support the design, utilization, and refinement of the learning progression; and (f) the absence of a straightforward correspondence between a learning progression and the trajectories of different children's knowledge development.

  2. U.S. Department of Energy Workplace Charging Challenge - Progress Update 2016: A New Sustainable Commute

    Energy Technology Data Exchange (ETDEWEB)

    2017-01-01

    In June 2016, the Workplace Charging Challenge distributed its third annual survey to 295 partners with the goal of tracking partners' progress and identifying trends in workplace charging. This document summarizes findings from the survey and highlights accomplishments of the EV Everywhere Workplace Charging Challenge.

  3. Philanthropy and disparities: progress, challenges, and unfinished business.

    Science.gov (United States)

    Mitchell, Faith; Sessions, Kathryn

    2011-10-01

    Philanthropy has invested millions of dollars to reduce disparities in health care and improve minority health. Grants to strengthen providers' cultural competence, diversify health professions, and collect data have improved understanding of and spurred action on disparities. The persistence of disparities in spite of these advances has shifted philanthropic attention toward strategies to change social, economic, and environmental conditions. We argue that these evolving perspectives, along with earlier groundwork, present new opportunities for funders, especially in combination with progress toward universal health coverage. This article looks at how philanthropy has addressed health disparities over the past decade, with a focus on accomplishments, the work remaining to be done, and how funders can help advance the disparities agenda.

  4. The genetics of susceptibility to tuberculosis: Progress and challenges

    Directory of Open Access Journals (Sweden)

    Alexey Anatolievich Rudko

    2016-09-01

    Full Text Available Tuberculosis is a pressing global healthcare issue in the modern world. Host genetics is an important modifier of disease risk. Genetic and genomic studies aim to reveal key inherited variants of the human genome associated with susceptibility to tuberculosis. Much attention is given to the study of differential genetic susceptibility to various stages of tuberculous infection, particularly latent tuberculosis, the detection of which is most challenging. Susceptibility genes have been identified, most of which exhibit a relatively small effect on disease risk. On the other hand, a proportion of children suffer from Mendelian susceptibility to tuberculosis, associated with rare mutations with a deterministic effect in genes encoding components of cellular immunity against intracellular infections. This review focuses on current achievements in genomic studies devoted to the identification of genes important for the implementation of the immune response and protection against the development of the infection in different populations of the world.

  5. EEG Derived Neuronal Dynamics during Meditation: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Chamandeep Kaur

    2015-01-01

    Full Text Available Meditation advances positivity, but how these behavioral and psychological changes are brought about can be explained by understanding the neurophysiological effects of meditation. In this paper, a broad spectrum of neural mechanisms under a variety of meditation styles is reviewed. The overall aim of this study is to review existing scientific studies and future challenges concerning meditation effects based on changing EEG brainwave patterns. Although existing research supports the efficacy of meditation in relieving anxiety and depression and producing psychological well-being, more rigorous studies with better designs are required, with randomized controlled trials and large sample sizes, and considering client variables such as personality characteristics to avoid negative effects. A greater number of clinical trials that concentrate on the use of meditation are required. Also, the controversial subject of epileptiform EEG changes and other adverse effects during meditation is raised.

  6. Microscale and Nanoscale Process Systems Engineering: Challenge and Progress

    Institute of Scientific and Technical Information of China (English)

    杨友麒

    2008-01-01

    This is an overview of the development of process systems engineering (PSE) in a smaller world. Two different spatio-temporal scopes are identified for microscale and nanoscale process systems. The features and challenges of each scale are reviewed, the different methodologies they use are discussed, and both new areas are compared with traditional process systems engineering. If microscale PSE can be considered an extension of traditional PSE, nanoscale PSE should be accepted as a new discipline with a looser connection to the extant core of chemical engineering. Since "molecular factories" are the next frontier of processing scale, nanoscale PSE will provide the new theory to handle the design, simulation and operation of such active processing systems.

  7. Dendritic cell targeted vaccines: Recent progresses and challenges.

    Science.gov (United States)

    Chen, Pengfei; Liu, Xinsheng; Sun, Yuefeng; Zhou, Peng; Wang, Yonglu; Zhang, Yongguang

    2016-03-01

    Dendritic cells (DCs) are a heterogeneous set of professional antigen-presenting cells (APCs), diverse in morphology, structure and function, and are the most potent APCs, able to capture, process and present antigens. As key regulators of innate and adaptive immune responses, DCs sit at the center of the immune system and are capable of interacting with both B cells and T cells, thereby shaping the humoral and cellular immune responses. DCs provide an essential link between innate and adaptive immunity, and the strong immune-activation function of DCs and their properties as natural adjuvants make them a valuable target for antigen delivery. Targeting antigens to DC-specific endocytic receptors, in combination with the relevant antibodies or ligands along with immunostimulatory adjuvants, has recently been recognized as a promising strategy for designing an effective vaccine that elicits a strong and durable T cell response against intracellular pathogens and cancer. This opinion article provides a brief summary of the rationales, advantages and challenges of existing DC-targeting approaches.

  8. Uterine sarcomas-Recent progress and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Seddon, Beatrice M., E-mail: beatrice.seddon@uclh.nhs.uk [London Sarcoma Service, Department of Oncology, University College Hospital, 1st Floor Central, 250 Euston Road, London, NW1 2PG (United Kingdom); Davda, Reena [London Sarcoma Service, Department of Oncology, University College Hospital, 1st Floor Central, 250 Euston Road, London, NW1 2PG (United Kingdom)

    2011-04-15

    Uterine sarcomas are a group of rare tumours that provide considerable challenges in their treatment. Radiological diagnosis prior to hysterectomy is difficult, with the diagnosis frequently made post-operatively. Current staging systems have been unsatisfactory, although a new FIGO staging system specifically for uterine sarcomas has now been introduced, and may allow better grouping of patients according to expected prognosis. While the mainstay of treatment of early disease is a total abdominal hysterectomy, it is less clear whether routine oophorectomy or lymphadenectomy is necessary. Adjuvant pelvic radiotherapy may improve local tumour control in high-risk patients, but is not associated with an overall survival benefit. Similarly, there is no good evidence for the routine use of adjuvant chemotherapy. For advanced leiomyosarcoma, newer chemotherapy agents, including gemcitabine and docetaxel, and trabectedin, offer some promise, while hormonal therapies appear to be more useful in endometrial stromal sarcoma. Novel targeted agents are now being introduced for sarcomas, including uterine sarcomas, and show some indications of activity. Non-pharmacological treatments, including surgical metastasectomy, radiofrequency ablation, and CyberKnife radiotherapy, are important additions to systemic therapy for advanced metastatic disease.

  9. Computational chemistry meets cultural heritage: challenges and perspectives.

    Science.gov (United States)

    Fantacci, Simona; Amat, Anna; Sgamellotti, Antonio

    2010-06-15

    Computational chemistry applied to cultural heritage can complement experimental investigations by establishing or rationalizing structure-property relations of the fundamental artwork components. These insights allow researchers to understand the interdependence of such components and eventually the composition of the artwork materials. As a perspective, we aim to extend the simulations to systems of increasing complexity that are similar to the realistic materials encountered in works of art. A challenge is the computational investigation of materials degradation and the associated reactive pathways; here the possible initial components, intermediates, final materials, and various deterioration mechanisms must all be simulated.

  10. Orphan drugs in development for Huntington's disease: challenges and progress

    Directory of Open Access Journals (Sweden)

    Burgunder JM

    2015-02-01

    Full Text Available Jean-Marc Burgunder (1-4): (1) Swiss Huntington's Disease Centre, Department of Neurology, University of Bern, Bern, Switzerland; (2) Department of Neurology, West China Hospital, Sichuan University, Chengdu; (3) Department of Neurology, Xiangya Hospital, Central South University, Changsha; (4) Department of Neurology, Sun Yat-sen University, Guangzhou, People's Republic of China. Abstract: Huntington's disease is a monogenic disorder encompassing a variable phenotype with progressive cognitive, psychiatric, and movement disorders. Knowledge of the mechanisms involved in this disorder has made substantial advances since the discovery of the gene mutation. The dynamic mutation is the expansion of a CAG (cytosine-adenine-guanine) repeat in the huntingtin (HTT) gene, which is transcribed into an abnormal protein with an elongated polyglutamine tract. Polyglutamine HTT accumulates and is changed in its function in multifaceted ways related to the numerous roles of the normal protein. The protein is expressed in numerous areas of the brain and also in other organs. The major brain region involved in the disease process is the striatum, but it is clear that other systems are involved as well. This accumulated knowledge has now led to the development of treatment strategies based on specific molecular pathways for symptomatic and disease course-modifying treatment. The most proximal way to handle the disturbed protein is to hinder the gene transcription, translation, and/or to increase protein clearance. Other mechanisms now being approached include modulation of energy and intracellular signaling, induction of factors potentially leading to neuroprotection, as well as modulation of glial function. Several clinical trials based on these approaches are now under way, and it is becoming clear that a future disease-modifying therapy will be a combination of several approaches harmonized with symptomatic treatments. In this review, some of the most promising and

  11. DISC: Deep Image Saliency Computing via Progressive Representation Learning.

    Science.gov (United States)

    Chen, Tianshui; Lin, Liang; Liu, Lingbo; Luo, Xiaonan; Li, Xuelong

    2016-06-01

    Salient object detection increasingly receives attention as an important component or step in several pattern recognition and image processing tasks. Although a variety of powerful saliency models have been intensively proposed, they usually involve heavy feature (or model) engineering based on priors (or assumptions) about the properties of objects and backgrounds. Inspired by the effectiveness of recently developed feature learning, we provide a novel deep image saliency computing (DISC) framework for fine-grained image saliency computing. We model the image saliency from both coarse- and fine-level observations, and utilize a deep convolutional neural network (CNN) to learn the saliency representation in a progressive manner. Specifically, our saliency model is built upon two stacked CNNs. The first CNN generates a coarse-level saliency map by taking the overall image as input, roughly identifying salient regions in the global context; we further integrate superpixel-based local context information in this first CNN to refine the coarse-level saliency map. Guided by the coarse saliency map, the second CNN focuses on the local context to produce a fine-grained and accurate saliency map while preserving object details. For a test image, the two CNNs collaboratively conduct the saliency computation in one shot. Our DISC framework is capable of uniformly highlighting the objects of interest from a complex background while preserving object details well. Extensive experiments on several standard benchmarks suggest that DISC outperforms other state-of-the-art methods and generalizes well across data sets without additional training. The executable version of DISC is available online: http://vision.sysu.edu.cn/projects/DISC.
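
    Structurally, the pipeline can be pictured as follows: a first stage produces a coarse, global saliency map from a downsampled image, and a second stage refines it with local detail under the guidance of the upsampled coarse map. In the sketch below (ours), the two "stages" are trivial hand-crafted stand-ins for the two CNNs, purely to show the coarse-to-fine data flow; nothing here is the paper's actual model.

```python
# Structural sketch of a two-stage coarse-to-fine saliency pipeline.
# The stand-in "stages" are crude contrast heuristics, NOT the CNNs.
import numpy as np

def coarse_stage(image):
    small = image[::4, ::4]                  # 4x downsampling
    return np.abs(small - small.mean())      # crude global contrast

def fine_stage(image, coarse_map):
    # upsample the coarse map and let it gate the fine-level detail
    guide = np.kron(coarse_map, np.ones((4, 4)))
    guide = guide[:image.shape[0], :image.shape[1]]
    local = np.abs(image - image.mean())     # crude local contrast
    fused = guide * local
    return fused / (fused.max() + 1e-9)

image = np.zeros((64, 64))
image[20:40, 24:44] = 1.0                    # a bright square "object"
saliency = fine_stage(image, coarse_stage(image))
print(saliency.shape, float(saliency.max()))  # (64, 64) 1.0
```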

  12. Geomorphology of ice stream beds: recent progress and future challenges

    Science.gov (United States)

    Stokes, Chris R.

    2016-04-01

    Ice sheets lose mass primarily by melting and discharge via rapidly-flowing ice streams. Surface and basal melting (e.g. of ice shelves) are closely linked to atmospheric and oceanic conditions, but the mechanisms that drive changes in ice stream discharge are more complex; and are influenced by conditions at their bed which can sustain, enhance or inhibit their motion. Although explicit comparisons are rare, the ice-bed interface is similar to the 'boundary layer' in fluvial and aeolian environments, where shear stresses (both basal and lateral in the case of ice streams) oppose the flow of the overlying medium. The analogy extends further because processes within the boundary layer create a distinctive geomorphology (and roughness) that is characterised by subglacial bedforms that resemble features in fluvial and aeolian environments. Their creation results from erosion, transport and deposition of sediment which is poorly constrained, but which is intimately linked to the mechanisms through which ice streams are able to flow rapidly. The study of ice stream geomorphology is, therefore, critical to our understanding of their dynamics. Despite difficulty in observing the subglacial environment of active ice streams, our understanding of their geomorphology has grown rapidly in the last three decades, from almost complete ignorance to a detailed knowledge of their geomorphological products. This has been brought about by two main approaches: (i) geophysical investigation of modern (active) ice streams, and (ii) sedimentological and geomorphological investigation of palaeo-ice stream beds. The aim of this paper is to review progress in these two areas, highlight the key questions that remain, and discuss the opportunities that are likely to arise that will enable them to be addressed. It is clear that whilst these two main approaches have led to important advances, they have often been viewed as separate sub-disciplines, with minimal cross-pollination of ideas and

  13. PREFACE: Scientific and Technical Challenges in the Well Drilling Progress

    Science.gov (United States)

    2015-02-01

    departments - Technologies in Mineral Exploration and Technologies in Mineral Exploration - were merged into one department. In 2003 the newly merged Department of Drilling was established within the Institute of Petroleum Engineering, now the Institute of Natural Resources, and is located in Building № 6, where it began its life. During these 60 years more than 3000 specialists have graduated from the Department of Drilling, many of whom are highly qualified and dedicated professionals. There is no doubt that this Conference addressed comprehensive, advanced engineering problems in drilling, as well as issues of relevant personnel training. It is extremely important to understand how the 60-year progress and contribution in the field of drilling has left its trace in the history of this Department; and now it is necessary to move further and seek ever new horizons in drilling.

  14. The face of an imposter: computer vision for deception detection research in progress

    NARCIS (Netherlands)

    Elkins, Aaron C.; Sun, Yijia; Zafeiriou, Stefanos; Pantic, Maja

    2013-01-01

    Using video analyzed from a novel deception experiment, this paper introduces computer vision research in progress that addresses two critical components to computational modeling of deceptive behavior: 1) individual nonverbal behavior differences, and 2) deceptive ground truth. Video interviews ana

  15. Nuclear Data Covariances in the Indian Context - Progress, Challenges, Excitement and Perspectives

    Science.gov (United States)

    Ganesan, S.

    2015-01-01

    We present a brief overview of progress, challenges, excitement and perspectives in developing nuclear data covariances in the Indian context in relation to target accuracies and sensitivity studies that are of great importance to Bhabha's 3-stage nuclear programme for energy and non-energy applications.
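
    The link between covariance data and the target accuracies mentioned above is usually expressed through the standard "sandwich rule" of uncertainty propagation, summarized below (notation ours, not specific to this abstract):

```latex
% Sandwich rule: the variance of an integral response R, with
% sensitivity vector S to nuclear data x having covariance matrix C, is
\[
  \operatorname{var}(R) \;=\; S^{\mathsf T} C\, S ,
  \qquad
  S_i = \frac{\partial R}{\partial x_i} .
\]
% Target accuracies on R translate into required accuracies on the
% covariance data C through this quadratic form.
```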

  16. Conference Scene: From innovative polymers to advanced nanomedicine: Key challenges, recent progress and future perspectives

    NARCIS (Netherlands)

    Feijen, J.; Hennink, W.E.; Zhong, Z.

    2013-01-01

    Recent developments in polymer-based controlled delivery systems have made a significant clinical impact. The second Symposium on Innovative Polymers for Controlled Delivery (SIPCD) was held in Suzhou, China to address the key challenges and provide up-to-date progress and future perspectives in the

  17. Conference Scene: From innovative polymers to advanced nanomedicine: Key challenges, recent progress and future perspectives

    NARCIS (Netherlands)

    Feijen, Jan; Hennink, W.E.; Zhong, Zhiyuan

    2013-01-01

    Recent developments in polymer-based controlled delivery systems have made a significant clinical impact. The second Symposium on Innovative Polymers for Controlled Delivery (SIPCD) was held in Suzhou, China to address the key challenges and provide up-to-date progress and future perspectives in the

  18. Security, Privacy and Trust Challenges in Cloud Computing and Solutions

    OpenAIRE

    Seyyed Yasser hashemi; Parisa Sheykhi Hesarlo

    2014-01-01

    Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. Cloud computing recently emerged as a promising solution to information technology (IT) management. IT managers look to cloud computing as a means to maintain a flexible and scalable IT infrastructure that enables business agility. As much as the technologic...

  19. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  20. The ATLAS Distributed Computing: the challenges of the future

    CERN Document Server

    Sakamoto, H; The ATLAS collaboration

    2013-01-01

    The ATLAS experiment has collected more than 25 fb-1 of data since the LHC started its operation in 2010. Tens of petabytes of collision events and Monte Carlo simulations are stored across more than 150 computing centers all over the world. The data processing is performed on grid sites providing more than 100,000 computing cores and orchestrated by the ATLAS in-house developed job and data management services. The discovery of the Higgs-like boson in 2012 would not have been possible without the excellent performance of the ATLAS Distributed Computing. The future ATLAS experiment operation with increased LHC beam energy and luminosity foreseen for 2014 imposes a significant increase in the computing demands that the ATLAS Distributed Computing needs to satisfy. Therefore, development of new data-processing, storage and data-distribution systems has been started, to use the computing resources efficiently by exploiting current and future technologies of distributed computing.

  1. Ion Trap Quantum Computers: Performance Limits and Experimental Progress

    Science.gov (United States)

    Hughes, Richard

    1998-03-01

    In a quantum computer information would be represented by the quantum mechanical states of suitable atomic-scale systems. (A single bit of information represented by a two-level quantum system is known as a qubit.) This notion leads to the possibility of computing with quantum mechanical superpositions of numbers ("quantum parallelism"), which for certain problems would make quantum computation very much more efficient than classical computation. The possibility of rapidly factoring the large integers used in public-key cryptography is an important example. (Public key cryptosystems derive their security from the difficulty of factoring, and similar problems, with conventional computers.) Quantum computational hardware development is in its infancy, but an experimental study of quantum computation with laser-cooled trapped calcium ions that is under way at Los Alamos will be described. One of the principal obstacles to practical quantum computation is the inevitable loss of quantum coherence of the complex quantum states involved. The results of a theoretical analysis showing that quantum factoring of small integers should be possible with trapped ions will be presented. The prospects for larger-scale computations will be discussed.
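
    The factoring example can be unpacked a little: factoring N reduces to finding the period r of a^x mod N, and that period-finding step is the only part a quantum computer would accelerate. The Python sketch below (ours) finds the period by brute force, so it only works for tiny N, but it shows the classical skeleton around the quantum subroutine.

```python
# Classical skeleton of the quantum factoring algorithm: everything
# except find_period() stays classical; only that step is quantum-fast.
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a^r = 1 (mod N) -- the quantum subroutine,
    done here by brute force (requires gcd(a, N) == 1)."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor(N, a=2):
    if gcd(a, N) != 1:
        return gcd(a, N)        # lucky: a shares a factor with N
    r = find_period(a, N)
    if r % 2:
        return None             # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None             # trivial square root: retry
    return gcd(y - 1, N)

print(factor(15, a=7))  # -> 3, since 7 has period 4 mod 15
```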

  2. Progress in the genetics of polygenic brain disorders: significant new challenges for neurobiology.

    Science.gov (United States)

    McCarroll, Steven A; Hyman, Steven E

    2013-10-30

    Advances in genome analysis, accompanied by the assembly of large patient cohorts, are making possible successful genetic analyses of polygenic brain disorders. If the resulting molecular clues, previously hidden in the genomes of affected individuals, are to yield useful information about pathogenesis and inform the discovery of new treatments, neurobiology will have to rise to many difficult challenges. Here we review the underlying logic of the genetic investigations, describe in more detail progress in schizophrenia and autism, and outline the challenges for neurobiology that lie ahead. We argue that technologies at the disposal of neuroscience are adequately advanced to begin to study the biology of common and devastating polygenic disorders.

  3. Gender Digital Divide and Challenges in Undergraduate Computer Science Programs

    Science.gov (United States)

    Stoilescu, Dorian; McDougall, Douglas

    2011-01-01

    Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…

  4. Turning Shortcomings into Challenges: Brain-Computer Interfaces for Games

    NARCIS (Netherlands)

    Nijholt, Anton; Reuderink, Boris; Oude Bos, Danny; Nijholt, A.; Reidsma, D.; Hondorp, G.H.W.

    2009-01-01

    In recent years we have seen a rising interest in brain-computer interfacing for human-computer interaction and potential game applications. Until now, however, we have almost only seen attempts where BCI is used to measure the affective state of the user or in neurofeedback games. There have hardl

  5. Turning Shortcomings into Challenges: Brain-Computer Interfaces for Games

    NARCIS (Netherlands)

    Nijholt, Anton; Oude Bos, Danny; Reuderink, Boris

    2009-01-01

    In recent years we have seen a rising interest in brain-computer interfacing for human-computer interaction and potential game applications. Until now, however, we have almost only seen proof-of-concepts where a single BCI paradigm is demonstrated to work as a simple control mechanism, as a measurem

  6. Progress Report 1 January 1983 - 31 December 1984. Computer Installation

    DEFF Research Database (Denmark)

    Risø National Laboratory, Roskilde

    This report describes selected parts of the activities at the Computer Installation of Risø National Laboratory in 1983 and 1984. Information given may be preliminary.

  7. Progresses and challenges in optimization of human pluripotent stem cell culture.

    Science.gov (United States)

    Lin, Ge; Xu, Ren-He

    2010-09-01

    The pressing demand to elucidate the biology of human embryonic stem (ES) cells and to realize their therapeutic potential has greatly promoted progress in the optimization of the culture systems used for this highly promising cell type. These advances include the characterization of exogenous regulators of pluripotency and differentiation, the development of animal-free, defined, and scalable culture systems, and some pioneering efforts to establish good manufacturing practice facilities to derive and expand clinical-grade human ES cells and their derivatives. All of these advances also appear to be applicable to the derivation and culture of human induced pluripotent stem cells, an ES-cell-like cell type derived from somatic cells via reprogramming. This review attempts to summarize these advances and discuss some of the remaining challenges.

  8. Computer Classification of Triangles and Quadrilaterals--A Challenging Application

    Science.gov (United States)

    Dennis, J. Richard

    1978-01-01

    Two computer exercises involving the classification of geometric figures are given. The mathematics required is relatively simple but comes from several areas--synthetic geometry, analytic geometry, and linear algebra. (MN)

  9. Multicore Challenges and Benefits for High Performance Scientific Computing

    Directory of Open Access Journals (Sweden)

    Ida M.B. Nielsen

    2008-01-01

    Full Text Available Until recently, performance gains in processors were achieved largely by improvements in clock speeds and instruction level parallelism. Thus, applications could obtain performance increases with relatively minor changes by upgrading to the latest generation of computing hardware. Currently, however, processor performance improvements are realized by using multicore technology and hardware support for multiple threads within each core, and taking full advantage of this technology to improve the performance of applications requires exposure of extreme levels of software parallelism. We will here discuss the architecture of parallel computers constructed from many multicore chips as well as techniques for managing the complexity of programming such computers, including the hybrid message-passing/multi-threading programming model. We will illustrate these ideas with a hybrid distributed memory matrix multiply and a quantum chemistry algorithm for energy computation using Møller–Plesset perturbation theory.
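
    A shared-memory slice of the hybrid message-passing/multi-threading model can be sketched as follows: each MPI rank would own a block of rows of C = A B, and within a rank a thread pool fills row panels concurrently. Only the threading half is shown below (in Python, where the NumPy matmul releases the interpreter lock, so the threads genuinely overlap); the message-passing layer and the quantum chemistry application are omitted. This is our illustration, not the article's code.

```python
# Threaded row-panel matrix multiply: the shared-memory half of a
# hybrid MPI/threading scheme (one "rank" shown, MPI layer omitted).
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def threaded_matmul(A, B, n_threads=4):
    m = A.shape[0]
    C = np.empty((m, B.shape[1]))
    panels = np.array_split(range(m), n_threads)  # one row panel per thread

    def work(rows):
        C[rows] = A[rows] @ B   # each thread fills its own panel of C

    with ThreadPoolExecutor(n_threads) as pool:
        list(pool.map(work, panels))
    return C

A = np.random.rand(512, 512)
B = np.random.rand(512, 512)
print(np.allclose(threaded_matmul(A, B), A @ B))  # True
```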

  10. The Challenges of Multidisciplinary Education in Computer Science

    Institute of Scientific and Technical Information of China (English)

    Fred S. Roberts

    2011-01-01

    Some of the most important problems facing the United States and China, indeed facing our entire planet, require approaches that are fundamentally multidisciplinary in nature. Many of those require skills in computer science (CS), basic understanding of another discipline, and the ability to apply the skills in one discipline to the problems of another. Modern training in computer science needs to prepare students to work in other disciplines or to work on multidisciplinary problems. What do we do to prepare them for a multidisciplinary world when there are already too many things we want to teach them about computer science? This paper describes successful examples of multidisciplinary education at the interface between CS and the biological sciences, as well as other examples involving CS and security, CS and sustainability, and CS and the social and economic sciences. It then discusses general principles for multidisciplinary education of computer scientists.

  11. Progress and challenges of disaster health management in China: a scoping review

    Science.gov (United States)

    Zhong, Shuang; Clark, Michele; Hou, Xiang-Yu; Zang, Yuli; FitzGerald, Gerard

    2014-01-01

    Background Despite the importance of an effective health system response to various disasters, relevant research is still in its infancy, especially in middle- and low-income countries. Objective This paper provides an overview of the status of disaster health management in China, with its aim to promote the effectiveness of the health response for reducing disaster-related mortality and morbidity. Design A scoping review method was used to address the recent progress of and challenges to disaster health management in China. Major health electronic databases were searched to identify English and Chinese literature that were relevant to the research aims. Results The review found that since 2003 considerable progress has been achieved in the health disaster response system in China. However, there remain challenges that hinder effective health disaster responses, including low standards of disaster-resistant infrastructure safety, the lack of specific disaster plans, poor emergency coordination between hospitals, lack of portable diagnostic equipment and underdeveloped triage skills, surge capacity, and psychological interventions. Additional challenges include the fragmentation of the emergency health service system, a lack of specific legislation for emergencies, disparities in the distribution of funding, and inadequate cost-effective considerations for disaster rescue. Conclusions One solution identified to address these challenges appears to be through corresponding policy strategies at multiple levels (e.g. community, hospital, and healthcare system level). PMID:25215910

  12. Progress and challenges of disaster health management in China: a scoping review

    Directory of Open Access Journals (Sweden)

    Shuang Zhong

    2014-09-01

    Full Text Available Background: Despite the importance of an effective health system response to various disasters, relevant research is still in its infancy, especially in middle- and low-income countries. Objective: This paper provides an overview of the status of disaster health management in China, with its aim to promote the effectiveness of the health response for reducing disaster-related mortality and morbidity. Design: A scoping review method was used to address the recent progress of and challenges to disaster health management in China. Major health electronic databases were searched to identify English and Chinese literature that were relevant to the research aims. Results: The review found that since 2003 considerable progress has been achieved in the health disaster response system in China. However, there remain challenges that hinder effective health disaster responses, including low standards of disaster-resistant infrastructure safety, the lack of specific disaster plans, poor emergency coordination between hospitals, lack of portable diagnostic equipment and underdeveloped triage skills, surge capacity, and psychological interventions. Additional challenges include the fragmentation of the emergency health service system, a lack of specific legislation for emergencies, disparities in the distribution of funding, and inadequate cost-effective considerations for disaster rescue. Conclusions: One solution identified to address these challenges appears to be through corresponding policy strategies at multiple levels (e.g. community, hospital, and healthcare system level).

  13. Progress and challenges of disaster health management in China: a scoping review.

    Science.gov (United States)

    Zhong, Shuang; Clark, Michele; Hou, Xiang-Yu; Zang, Yuli; FitzGerald, Gerard

    2014-01-01

    Despite the importance of an effective health system response to various disasters, relevant research is still in its infancy, especially in middle- and low-income countries. This paper provides an overview of the status of disaster health management in China, with its aim to promote the effectiveness of the health response for reducing disaster-related mortality and morbidity. A scoping review method was used to address the recent progress of and challenges to disaster health management in China. Major health electronic databases were searched to identify English and Chinese literature that were relevant to the research aims. The review found that since 2003 considerable progress has been achieved in the health disaster response system in China. However, there remain challenges that hinder effective health disaster responses, including low standards of disaster-resistant infrastructure safety, the lack of specific disaster plans, poor emergency coordination between hospitals, lack of portable diagnostic equipment and underdeveloped triage skills, surge capacity, and psychological interventions. Additional challenges include the fragmentation of the emergency health service system, a lack of specific legislation for emergencies, disparities in the distribution of funding, and inadequate cost-effective considerations for disaster rescue. One solution identified to address these challenges appears to be through corresponding policy strategies at multiple levels (e.g. community, hospital, and healthcare system level).

  14. Towards omnidirectional, large scale, full polarization, and broadband practical invisibility cloaks: challenges and progress

    Directory of Open Access Journals (Sweden)

    Yang Yihao

    2014-01-01

    Full Text Available Invisibility cloaks have experienced tremendous development in the past few years, but the current technologies for converting cloaks into practical applications still face numerous bottlenecks. In this paper, we provide a review of the challenges and recent progress in invisibility cloaks from a practical perspective. In particular, key challenges such as non-extreme parameters, homogeneity, omnidirectionality, full polarization, large scale and broad band are addressed. We analyze the physical mechanisms behind the challenges and evaluate the merits and defects of the recent solutions accordingly. We anticipate that some compromises on ideal cloaks will be required in order to achieve practical invisibility cloaks in the future.
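
    The "non-extreme parameters" challenge can be traced back to the textbook transformation-optics prescription for a cylindrical cloak, summarized below (this is the standard linear map from the literature, not a result specific to this paper):

```latex
% Cylindrical cloak with inner radius R_1 and outer radius R_2:
% the region r < R_2 is compressed into the shell R_1 < r' < R_2 by
\[
  r' = R_1 + \frac{R_2 - R_1}{R_2}\, r , \qquad
  \theta' = \theta , \qquad z' = z ,
\]
% which yields, for example, the radial material parameter
\[
  \varepsilon_{r'} = \mu_{r'} = \frac{r' - R_1}{r'} \;\to\; 0
  \quad \text{as } r' \to R_1 ,
\]
% i.e. the parameters become extreme at the inner boundary, the origin
% of the challenge discussed above.
```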

  15. Computational pan-genomics: status, promises and challenges

    NARCIS (Netherlands)

    Marschall, Tobias; Ridder, de D.; Sheikhizadeh Anari, S.; Smit, S.

    2016-01-01

    Many disciplines, from human genetics and oncology to plant breeding, microbiology and virology, commonly face the challenge of analyzing rapidly increasing numbers of genomes. In case of Homo sapiens, the number of sequenced genomes will approach hundreds of thousands in the next few years. Simply

  16. Computer Supported Content Analysis: Challenges, research and developments

    OpenAIRE

    2006-01-01

    This is a seminar organized to report on the research outcomes of work conducted under the HKU Strategic Research Theme on Information Technology, within the area of Applying Data Mining Techniques to Novel Applications. This seminar presents the work in progress by a collaborative team comprising researchers from the Centre for Knowledge Science & Engineering Research, Beijing Normal University (CKSER) at Beijing Normal University and the Centre for Information Technology in Education (CITE)...

  17. Remarkable Computing - the Challenge of Designing for the Home

    DEFF Research Database (Denmark)

    Petersen, Marianne Graves

    2004-01-01

    The vision of ubiquitous computing is floating into the domain of the household, despite arguments that lessons from the design of workplace artefacts cannot be blindly transferred into the domain of the household. This paper discusses why the ideal of unremarkable or ubiquitous computing is too narrow with respect to the household. It points out how understanding technology use is a matter of looking into the process of use, and how the specific context of the home, in several ways, calls for technology to be remarkable rather than unremarkable.

  18. Progress Towards Computational Method for Circulation Control Airfoils

    Science.gov (United States)

    Swanson, R. C.; Rumsey, C. L.; Anders, S. G.

    2005-01-01

    The compressible Reynolds-averaged Navier-Stokes equations are solved for circulation control airfoil flows. Numerical solutions are computed with both structured and unstructured grid solvers. Several turbulence models are considered, including the Spalart-Allmaras model with and without curvature corrections, the shear stress transport model of Menter, and the k-enstrophy model. Circulation control flows with jet momentum coefficients of 0.03, 0.10, and 0.226 are considered. Comparisons are made between computed and experimental pressure distributions, velocity profiles, Reynolds stress profiles, and streamline patterns. Including curvature effects yields the closest agreement with the measured data.
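
    For readers outside the CFD community, the jet momentum coefficients quoted above normalize the blowing strength by the free-stream dynamic pressure and a reference length, as conventionally defined in the circulation-control literature (notation below is ours):

```latex
% Jet momentum coefficient for a circulation control airfoil:
% mdot = jet mass flow rate (per unit span), V_jet = jet velocity,
% q_inf = free-stream dynamic pressure, c = airfoil chord.
\[
  C_\mu = \frac{\dot{m}\, V_{\mathrm{jet}}}{q_\infty\, c} ,
  \qquad
  q_\infty = \tfrac{1}{2}\, \rho_\infty U_\infty^{2} .
\]
```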

  19. Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Johnson, Gary M.; Washington, Warren M.

    2009-07-02

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) in partnership with the Office of Advanced Scientific Computing Research (ASCR) held a workshop on the challenges in climate change science and the role of computing at the extreme scale, November 6-7, 2008, in Bethesda, Maryland. At the workshop, participants identified the scientific challenges facing the field of climate science and outlined the research directions of highest priority that should be pursued to meet these challenges. Representatives from the national and international climate change research community as well as representatives from the high-performance computing community attended the workshop. This group represented a broad mix of expertise. Of the 99 participants, 6 were from international institutions. Before the workshop, each of the four panels prepared a white paper, which provided the starting place for the workshop discussions. The four panels of workshop attendees devoted their efforts to the following themes: Model Development and Integrated Assessment; Algorithms and Computational Environment; Decadal Predictability and Prediction; and Data, Visualization, and Computing Productivity. The recommendations of the panels are summarized in the body of this report.

  20. Bringing Vision-Based Measurements into our Daily Life: A Grand Challenge for Computer Vision Systems

    OpenAIRE

    Scharcanski, Jacob

    2016-01-01

    Bringing computer vision into our daily life has been challenging researchers in industry and in academia over the past decades. However, the continuous development of cameras and computing systems has turned computer vision-based measurements into a viable option, allowing new solutions to known problems. In this context, computer vision is a generic tool that can be used to measure and monitor phenomena in a wide range of fields. The idea of using vision-based measurements is appealing, since the...

  1. User Interface Improvements in Computer-Assisted Instruction, the Challenge.

    Science.gov (United States)

    Chalmers, P. A.

    2000-01-01

    Identifies user interface problems as they relate to computer-assisted instruction (CAI); reviews the learning theories and instructional theories related to CAI user interface; and presents potential CAI user interface improvements for research and development based on learning and instructional theory. Focuses on screen design improvements.…

  2. Traditional Host based Intrusion Detection Systems’ Challenges in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Masoudeh Keshavarzi

    Full Text Available Cloud computing is one of the hottest topics in IT today. It can help enterprises improve the creation and delivery of IT solutions by allowing them to access services more flexibly and cost-effectively. Security concerns in the cloud environment are the ...

  3. Challenges of high dam construction to computational mechanics

    Institute of Scientific and Technical Information of China (English)

    ZHANG Chuhan

    2007-01-01

    The current situation and growing prospects of China's hydro-power development and high dam construction are reviewed, with emphasis on key issues for the safety evaluation of large dams and hydro-power plants, especially those associated with the application of state-of-the-art computational mechanics. These include, but are not limited to: stress and stability analysis of dam foundations under external loads; earthquake behavior of dam-foundation-reservoir systems; mechanical properties of mass concrete for dams; high-velocity flow and energy dissipation for high dams; scientific and technical problems of hydro-power plants and underground structures; and newly developed dam types - Roller Compacted Concrete (RCC) dams and Concrete Face Rock-fill (CFR) dams. Some examples demonstrating successful utilization of computational mechanics in high dam engineering are given, including seismic nonlinear analysis of arch dam foundations, nonlinear fracture analysis of arch dams under reservoir loads, and failure analysis of arch dam foundations. To make greater use of computational mechanics in high dam engineering, much research into different computational methods, numerical models and solution schemes, together with verification through experimental tests and field measurements, will be necessary in the future.

  4. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01

  5. MIT Laboratory for Computer Science Progress Report 27

    Science.gov (United States)

    1990-06-01

  6. Challenges of Computational Processing of Code-Switching

    OpenAIRE

    Çetinoğlu, Özlem; Schulz, Sarah; Vu, Ngoc Thang

    2016-01-01

    This paper addresses challenges of Natural Language Processing (NLP) on non-canonical multilingual data in which two or more languages are mixed. It refers to code-switching, which has become more common in daily life and therefore receives an increasing amount of attention from the research community. We report our experience, which covers not only core NLP tasks such as normalisation, language identification, language modelling, part-of-speech tagging and dependency parsing but also more...

  7. A Comprehensive Study about Cloud Computing Security: Issues, Applications and Challenges

    Directory of Open Access Journals (Sweden)

    Sima Ghoflgary

    2014-11-01

    Full Text Available Cloud computing provides facilities for users to save their data or information on servers that are connected through the Internet or an intranet. Further, users can run their applications with the help of software provided by cloud computing servers without installing that software on their own personal computers. Since many users access cloud computing servers for various goals, one of the main problems in this regard is providing security for the access, usage and sharing of users' data and for the running of users' programs on cloud computing sources or servers. This paper attempts to study the security issues, applications and challenges of cloud computing.

  8. Issues and challenges of intelligent systems and computational intelligence

    CERN Document Server

    Pozna, Claudiu; Kacprzyk, Janusz

    2014-01-01

    This carefully edited book contains contributions of prominent and active researchers and scholars in the broadly perceived area of intelligent systems. The book is unique both with respect to the width of coverage of tools and techniques, and to the variety of problems that could be solved by the tools and techniques presented. The editors have been able to gather a very good collection of relevant and original papers by prominent representatives of many areas, relevant both to the theory and practice of intelligent systems, artificial intelligence, computational intelligence, soft computing, and the like. The contributions have been divided into 7 parts, presenting first the more fundamental and theoretical contributions, followed by applications in relevant areas.

  9. "Tennis elbow". A challenging call for computation and medicine

    Science.gov (United States)

    Sfetsioris, D.; Bontioti, E. N.

    2014-10-01

    This paper attempts to give insight into the features composing this musculotendinous disorder. We address the issues of definition, pathophysiology and the mechanism underlying the onset and occurrence of the disease, diagnosis and diagnostic tools, as well as methods of treatment. We focus mostly on conservative treatment protocols, and we recognize the need for a more thorough investigation with the aid of computation.

  10. Computational Science And Engineering Software Sustainability And Productivity (CSESSP) Challenges Workshop Report

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This report details the challenges and opportunities discussed at the NITRD sponsored multi-agency workshop on Computational Science and Engineering Software...

  11. Progress and challenges in using stable isotopes to trace plant carbon and water relations across scales

    Directory of Open Access Journals (Sweden)

    C. Werner

    2012-08-01

    Full Text Available Stable isotope analysis is a powerful tool for assessing plant carbon and water relations and their impact on biogeochemical processes at different scales. Our process-based understanding of stable isotope signals, as well as technological developments, has progressed significantly, opening new frontiers in ecological and interdisciplinary research. This has promoted the broad utilisation of carbon, oxygen and hydrogen isotope applications to gain insight into plant carbon and water cycling and their interaction with the atmosphere and pedosphere. Here, we highlight specific areas of recent progress and new research challenges in plant carbon and water relations, using selected examples covering scales from the leaf to the regional scale. Further, we discuss strengths and limitations of recent technological developments and approaches and highlight new opportunities arising from unprecedented temporal and spatial resolution of stable isotope measurements.

  12. Malaria control in South Sudan, 2006–2013: strategies, progress and challenges

    Science.gov (United States)

    2013-01-01

    Background South Sudan has borne the brunt of years of chronic warfare and probably has the highest malaria burden in sub-Saharan Africa. However, effective malaria control in post-conflict settings is hampered by a multiplicity of challenges. This manuscript reports on the strategies, progress and challenges of malaria control in South Sudan; it serves as an example for programmes operating in similar environments and provides a window for leveraging resources. Case description To evaluate progress and challenges of the national malaria control programme, an in-depth appraisal was undertaken according to the World Health Organization standard procedures for malaria programme performance review. Methodical analysis of published and unpublished documents on malaria control in South Sudan was conducted. To ensure completeness, findings of internal thematic desk assessments were triangulated in the field and updated by external review teams. Discussion and evaluation South Sudan has strived to make progress in implementing the WHO recommended malaria control interventions as set out in the 2006–2013 National Malaria Strategic Plan. The country has faced enormous programmatic constraints including infrastructure, human and financial resources and a weak health system, compounded by an increasing number of refugees, returnees and internally displaced people. The findings present a platform on which to tailor an evidence-based 2014–2018 national malaria strategic plan for the country and a unique opportunity for providing a model for countries in a post-conflict situation. Conclusions The prospects for effective malaria control and elimination are huge in South Sudan. Nevertheless, strengthened coordination, infrastructure and human resource capacity, and monitoring and evaluation are required. To achieve all this, allocation of adequate local funding will be critical. PMID:24160336

  13. CSA06 Computing, Software and Analysis challenge at the Spanish Tier-1 and Tier-2 sites

    CERN Document Server

    Alcaraz, J; Cabrillo, Iban Jose; Colino, Nicanor; Cuevas-Maestro, J; Delgado Peris, Antonio; Fernandez Menendez, Javier; Flix, Jose; García-Abia, Pablo; González-Caballero, I; Hernández, Jose M; Marco, Rafael; Martinez Ruiz del Arbol, Pablo; Matorras, Francisco; Merino, Gonzalo; Rodríguez-Calonge, F J; Vizan Garcia, Jesus Manuel

    2007-01-01

    This note describes the participation of the Spanish centres PIC, CIEMAT and IFCA as Tier-1 and Tier-2 sites in the CMS CSA06 Computing, Software and Analysis challenge. A number of the facilities, services and workflows were demonstrated at 25% of the scale expected for 2008 operations. Very valuable experience was gained running the complex computing system under realistic conditions at a significant scale. The focus of this note is on presenting achieved results, operational experience and lessons learnt during the challenge.

  14. RRAM-based hardware implementations of artificial neural networks: progress update and challenges ahead

    Science.gov (United States)

    Prezioso, M.; Merrikh-Bayat, F.; Chakrabarti, B.; Strukov, D.

    2016-02-01

    Artificial neural networks have been receiving increasing attention due to their superior performance in many information processing tasks. Typically, scaling up the size of the network results in better performance and richer functionality. However, large neural networks are challenging to implement in software, and customized hardware is generally required for their practical implementation. In this work, we will discuss our group's recent efforts on the development of such custom hardware circuits, based on hybrid CMOS/memristor circuits, in particular of the CMOL variety. We will start by reviewing the basics of memristive devices and of CMOL circuits. We will then discuss our recent progress towards the demonstration of hybrid circuits, focusing on experimental and theoretical results for artificial neural networks based on crossbar-integrated metal oxide memristors. We will conclude the presentation with a discussion of the remaining challenges and the most pressing research needs.

  15. Arabic Rule-Based Named Entity Recognition Systems Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Ramzi Esmail Salah

    2017-06-01

    Full Text Available Rule-based approaches use human-made rules to extract Named Entities (NEs); alongside machine learning, they are among the most widely used ways to extract NEs. The term Named Entity Recognition (NER) denotes the task of identifying personal names, locations, organizations and many other entities. In the Arabic language, Big Data challenges have made Arabic NER develop rapidly in order to extract useful information from texts. The current paper sheds some light on research progress in rule-based NER via a diagnostic comparison of linguistic resources, entity types, domains, and performance. We also highlight the challenges of processing Arabic NEs through rule-based systems. Good NER performance is expected to benefit other modern fields such as semantic web search, question answering, machine translation, information retrieval, and abstracting systems.
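
    To make the rule-based idea concrete, here is a minimal sketch (the gazetteer entries, the honorific pattern and the `rule_based_ner` helper are invented for illustration and are not taken from the paper): a small gazetteer plus one hand-written pattern already extracts simple entities.

```python
import re

# Tiny hand-made gazetteer and one rule -- purely illustrative.
GAZETTEER = {
    "Cairo": "LOCATION",
    "Riyadh": "LOCATION",
    "Al Jazeera": "ORGANIZATION",
}

# Person rule: an honorific followed by one or more capitalized tokens.
PERSON_RULE = re.compile(r"\b(?:Dr|Mr|Mrs|Sheikh)\.?\s([A-Z][a-z]+(?:\s[A-Z][a-z]+)*)")

def rule_based_ner(text):
    """Return (entity, label) pairs found by gazetteer lookup and rules."""
    entities = [(name, label) for name, label in GAZETTEER.items() if name in text]
    entities += [(m.group(1), "PERSON") for m in PERSON_RULE.finditer(text)]
    return entities

print(rule_based_ner("Dr. Ahmed Hassan met reporters from Al Jazeera in Cairo."))
# [('Cairo', 'LOCATION'), ('Al Jazeera', 'ORGANIZATION'), ('Ahmed Hassan', 'PERSON')]
```

    Real Arabic systems face the additional hurdles the paper discusses (no capitalization cues in Arabic script, rich morphology), which is precisely why curated linguistic resources matter so much in this setting.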

  16. Prevention and health promotion: decades of progress, new challenges, and an emerging agenda.

    Science.gov (United States)

    Smith, Timothy W; Orleans, C Tracy; Jenkins, C David

    2004-03-01

    Daily habits (e.g., smoking, diet, and exercise) and their immediate consequences (e.g., obesity) confer risk for most of the major health problems in industrialized nations. Hence, determinants of these behaviors and their modifications have been central topics in health psychology. Considerable scientific and applied progress has been made, but the field faces important challenges and opportunities in the future. These challenges and opportunities include changes in demographics and patterns of health, the need for a more comprehensive model of the domain of health behavior and prevention, the need to integrate behavioral and psychosocial risk and resilience, the incorporation of new technologies, and addressing a variety of professional and economic barriers to the implementation of prevention in health care.

  17. Addressing healthcare-associated infections and antimicrobial resistance from an organizational perspective: progress and challenges.

    Science.gov (United States)

    Murray, Eleanor; Holmes, Alison

    2012-07-01

    This paper explores the progress and challenges associated with the application of organizational factors and approaches to infection prevention and control (IPC) and antibiotic stewardship (AS) in England, many of which have been considered and supported by the Advisory Committee on Antimicrobial Resistance and Healthcare-associated Infections (ARHAI). An organizational perspective is described and the wider macro context and socio-political forces that shape an organizational approach are considered. Factors that drive organizational change in IPC and AS are discussed. The tensions, constraints and dilemmas that can occur are identified and outstanding challenges are debated. Some recommendations for the future direction of IPC and AS organizationally focused strategies and research are proposed.

  18. Computational Challenges of 3D Radiative Transfer in Atmospheric Models

    Science.gov (United States)

    Jakub, Fabian; Bernhard, Mayer

    2017-04-01

    The computation of radiative heating and cooling rates is one of the most expensive components in today's atmospheric models. The high computational cost stems not only from the laborious integration over a wide range of the electromagnetic spectrum but also from the fact that solving the integro-differential radiative transfer equation for monochromatic light is already rather involved. This has led to the advent of numerous approximations and parameterizations to reduce the cost of the solver. One of the most prominent is the so-called independent pixel approximation (IPA), in which horizontal energy transfer is neglected altogether and radiation may only propagate in the vertical direction (1D). Recent studies indicate that the IPA introduces significant errors in high-resolution simulations and affects the evolution and development of convective systems. However, using fully 3D solvers such as Monte Carlo methods is not feasible even on state-of-the-art supercomputers. The parallelization of atmospheric models is often realized by a horizontal domain decomposition, and hence horizontal transfer of energy necessitates communication. For example, a cloud at low solar elevation casts a long shadow that may have to be communicated across a multitude of processors, and light in the solar spectral range may generally travel long distances through the atmosphere. Concerning highly parallel simulations, it is therefore vital that 3D radiative transfer solvers put special emphasis on parallel scalability. We will present an introduction to the intricacies of computing 3D radiative heating and cooling rates, as well as report on the parallel performance of the TenStream solver. The TenStream is a 3D radiative transfer solver that uses the PETSc framework to iteratively solve a set of partial differential equations. We investigate two matrix preconditioners: (a) geometric algebraic multigrid preconditioning (MG+GAMG) and (b) block Jacobi incomplete LU (ILU) factorization.
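
    For readers unfamiliar with the solver machinery referenced above, the following is a generic sketch of an ILU-preconditioned iterative solve, using SciPy on a stand-in sparse system; the TenStream solver itself is built on PETSc and solves the actual transfer equations, so nothing here is its code.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Small 1D Poisson-like system as a stand-in for the discretized
# radiative transfer equations (illustrative only).
n = 200
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Incomplete LU factorization wrapped as a preconditioner operator.
ilu = spla.spilu(A, drop_tol=1e-4)
M = spla.LinearOperator((n, n), matvec=ilu.solve)

# Preconditioned GMRES typically needs far fewer iterations than plain GMRES.
x, info = spla.gmres(A, b, M=M)
print("converged:", info == 0, "residual:", np.linalg.norm(A @ x - b))
```

    The trade-off the abstract alludes to is that multigrid preconditioners scale well but cost more per iteration to set up, while block Jacobi ILU is cheap and communication-light, which matters once the domain is decomposed across many processors.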

  19. New challenges in grid generation and adaptivity for scientific computing

    CERN Document Server

    Formaggia, Luca

    2015-01-01

    This volume collects selected contributions from the “Fourth Tetrahedron Workshop on Grid Generation for Numerical Computations”, which was held in Verbania, Italy in July 2013. The previous editions of this Workshop were hosted by the Weierstrass Institute in Berlin (2005), by INRIA Rocquencourt in Paris (2007), and by Swansea University (2010). This book covers different, though related, aspects of the field: the generation of quality grids for complex three-dimensional geometries; parallel mesh generation algorithms; mesh adaptation, including both theoretical and implementation aspects; grid generation and adaptation on surfaces – all with an interesting mix of numerical analysis, computer science and strongly application-oriented problems.

  20. A Survey on Cloud Computing Security, Challenges and Threats

    Directory of Open Access Journals (Sweden)

    Rajnish Choubey,

    2011-03-01

    Full Text Available Cloud computing is an Internet-based model that enables convenient, on-demand and pay-per-use access to a pool of shared resources. It is a new technology that satisfies a user's requirement for computing resources like networks, storage, servers, services and applications, without physically acquiring them. It reduces the overhead of maintaining large systems for an organization, but it also carries associated risks and threats, which include security, data leakage, insecure interfaces, sharing of resources and insider attacks.

  1. Computer Security: Join the CERN WhiteHat Challenge!

    CERN Multimedia

    Computer Security Team

    2014-01-01

    Over the past couple of months, several CERN users have reported vulnerabilities they have found in computing services and servers running at CERN. All were relevant, many were interesting and a few even surprising. Spotting weaknesses and areas for improvement before malicious people can exploit them is paramount. It helps protect the operation of our accelerators and experiments as well as the reputation of the Organization. Therefore, we would like to express our gratitude to those people for having reported these weaknesses! Great job and well done!   Seizing the opportunity, we would like to reopen the hunt for bugs, vulnerabilities and insecure configurations of CERN applications, websites and devices. You might recall we ran a similar initiative (“Hide & Seek”) in 2012 where we asked you to sift through CERN’s webpages and send us those that hold sensitive and confidential information. Quite a number of juicy documents were found and subsequently remov...

  2. GIB: Imperfect Information in a Computationally Challenging Game

    CERN Document Server

    Ginsberg, M L

    2011-01-01

    This paper investigates the problems arising in the construction of a program to play the game of contract bridge. These problems include both the difficulty of solving the game's perfect information variant, and techniques needed to address the fact that bridge is not, in fact, a perfect information game. GIB, the program being described, involves five separate technical advances: partition search, the practical application of Monte Carlo techniques to realistic problems, a focus on achievable sets to solve problems inherent in the Monte Carlo approach, an extension of alpha-beta pruning from total orders to arbitrary distributive lattices, and the use of squeaky wheel optimization to find approximately optimal solutions to cardplay problems. GIB is currently believed to be of approximately expert caliber, and is currently the strongest computer bridge program in the world.
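
    The Monte Carlo element can be caricatured in a few lines (a toy sketch, not GIB's code: `sample_world` and `evaluate` are assumed, game-specific callables the caller must supply).

```python
import random

def monte_carlo_action(actions, sample_world, evaluate, n_samples=100):
    """Choose the action with the best average payoff over sampled worlds.

    sample_world() -> one hidden state consistent with everything we know
    evaluate(a, w) -> numeric payoff of action `a` if the world is `w`
    """
    worlds = [sample_world() for _ in range(n_samples)]
    return max(actions,
               key=lambda a: sum(evaluate(a, w) for w in worlds) / n_samples)

# Toy usage: "bet" pays the hidden bias w ~ U(0, 1); "pass" is a safe 0.5.
random.seed(1)
best = monte_carlo_action(["bet", "pass"], random.random,
                          lambda a, w: w if a == "bet" else 0.5)
print(best)
```

    As the abstract notes, naive averaging of this kind has inherent problems (for instance, it cannot value information-gathering plays), which is what GIB's focus on achievable sets is designed to address.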

  3. Calculating absorption shifts for retinal proteins: computational challenges.

    Science.gov (United States)

    Wanko, M; Hoffmann, M; Strodel, P; Koslowski, A; Thiel, W; Neese, F; Frauenheim, T; Elstner, M

    2005-03-01

    Rhodopsins can modulate the optical properties of their chromophores over a wide range of wavelengths. The mechanism for this spectral tuning is based on the response of the retinal chromophore to external stress and the interaction with the charged, polar, and polarizable amino acids of the protein environment and is connected to its large change in dipole moment upon excitation, its large electronic polarizability, and its structural flexibility. In this work, we investigate the accuracy of computational approaches for modeling changes in absorption energies with respect to changes in geometry and applied external electric fields. We illustrate the high sensitivity of absorption energies on the ground-state structure of retinal, which varies significantly with the computational method used for geometry optimization. The response to external fields, in particular to point charges which model the protein environment in combined quantum mechanical/molecular mechanical (QM/MM) applications, is a crucial feature, which is not properly represented by previously used methods, such as time-dependent density functional theory (TDDFT), complete active space self-consistent field (CASSCF), and Hartree-Fock (HF) or semiempirical configuration interaction singles (CIS). This is discussed in detail for bacteriorhodopsin (bR), a protein which blue-shifts retinal gas-phase excitation energy by about 0.5 eV. As a result of this study, we propose a procedure which combines structure optimization or molecular dynamics simulation using DFT methods with a semiempirical or ab initio multireference configuration interaction treatment of the excitation energies. Using a conventional QM/MM point charge representation of the protein environment, we obtain an absorption energy for bR of 2.34 eV. This result is already close to the experimental value of 2.18 eV, even without considering the effects of protein polarization, differential dispersion, and conformational sampling.

  4. Progress of solid-state quantum computers at NRIM

    Science.gov (United States)

    Kido, G.; Shinagawa, H.; Terai, K.; Hashi, K.; Goto, A.; Yakabe, T.; Takamasu, T.; Uji, S.; Shimizu, T.; Kitazawa, H.

    2001-04-01

    In the last five years, we have investigated quantum phenomena of low-dimensional materials and strongly correlated electron systems at high magnetic fields under the Center of Excellence Development Program (COE project) at the National Research Institute for Metals. The second stage, towards the realization of solid-state quantum devices and measurement of their quantum properties, began in April of this year. NMR spectra have been studied in CeP and various lithium fluoride crystals in anticipation of the crystal lattice quantum computer. The magneto-transport effect on tiny aluminum devices fabricated on semiconductors has been studied, and negative magnetoresistance has clearly been observed. An SPM which can be operated at various temperatures in the presence of high magnetic fields has been developed to construct a magnetic resonance force microscope. The magnetic field effect on the magnetic recording pattern of an HDD was clearly measured up to 7 T.

  5. Computer-Assisted Diagnostic Decision Support: History, Challenges, and Possible Paths Forward

    Science.gov (United States)

    Miller, Randolph A.

    2009-01-01

    This paper presents a brief history of computer-assisted diagnosis, including challenges and future directions. Some ideas presented in this article on computer-assisted diagnostic decision support systems (CDDSS) derive from prior work by the author and his colleagues (see list in Acknowledgments) on the INTERNIST-1 and QMR projects. References…

  7. The Twentieth-Century Sea Level Budget: Recent Progress and Challenges

    Science.gov (United States)

    Jevrejeva, S.; Matthews, A.; Slangen, A.

    2017-01-01

    For coastal areas, given the large and growing concentration of population and economic activity, as well as the importance of coastal ecosystems, sea level rise is one of the most damaging aspects of the warming climate. Huge progress in quantifying the causes of sea level rise and in closing the sea level budget for the period since the 1990s has been made, mainly due to the development of the global observing system for sea level components and total sea levels. We suggest that the large spread (1.2 ± 0.2-1.9 ± 0.3 mm year^-1) in estimates of sea level rise during the twentieth century from several reconstructions demonstrates the need for and importance of the rescue of historical observations from tide gauges, with a focus on the beginning of the twentieth century. Understanding the physical mechanisms contributing to sea level rise and controlling the variability of sea level over the past few hundred years is a challenging task. In this study, we provide an overview of the progress in understanding the causes of sea level rise during the twentieth century and highlight the main challenges facing the interdisciplinary sea level community in understanding the complex nature of sea level changes.

  8. Progress and challenges in utilization of palm oil biomass as fuel for decentralized electricity generation

    Energy Technology Data Exchange (ETDEWEB)

    Bazmi, Aqeel Ahmed [Process Systems Engineering Centre (PROSPECT), Department of Chemical Engineering, Faculty of Chemical and Natural Resources Engineering, University Technology Malaysia, Skudai 81310, Johor Bahru, JB (Malaysia); Biomass Conversion Research Center (BCRC), Department of Chemical Engineering, COMSATS Institute of Information Technology, Lahore (Pakistan); Zahedi, Gholamreza; Hashim, Haslenda [Process Systems Engineering Centre (PROSPECT), Department of Chemical Engineering, Faculty of Chemical and Natural Resources Engineering, University Technology Malaysia, Skudai 81310, Johor Bahru, JB (Malaysia)

    2011-01-15

    It has been broadly accepted worldwide that global warming is, indeed, the greatest current threat to the environment. Renewable energy (RE) is expected to be a key solution for reducing global warming and promoting sustainable development. The progressive release of greenhouse gases (GHG) from increasingly energy-intensive industries has eventually caused human civilization to suffer. Realizing the exigency of reducing emissions while simultaneously catering to the needs of industries, researchers see RE as the ideal candidate to overcome these challenges. From the technical point of view, RE provides an effective option for the provision of energy services, while biomass, a major source of energy in the world until fossil fuels became dominant with industrialization, appears to be an important renewable source of energy, and research has proven time and again its viability for large-scale production. Being a widely distributed resource, biomass lends itself to decentralized electricity generation, which is gaining importance in liberalized electricity markets. Decentralized power is characterized by the generation of electricity nearer to the demand centers, meeting local energy needs. Researchers envisage an increasing decentralization of power supply, which is expected to make a particular contribution to climate protection. This article investigates the progress and challenges of decentralized electricity generation from palm oil biomass according to the overall concept of sustainable development. (author)

  9. Computational Research Challenges and Opportunities for the Optimization of Fossil Energy Power Generation System

    Energy Technology Data Exchange (ETDEWEB)

    Zitney, S.E.

    2007-06-01

    Emerging fossil energy power generation systems must operate with unprecedented efficiency and near-zero emissions, while optimizing profitably amid cost fluctuations for raw materials, finished products, and energy. To help address these challenges, the fossil energy industry will have to rely increasingly on the use of advanced computational tools for modeling and simulating complex process systems. In this paper, we present the computational research challenges and opportunities for the optimization of fossil energy power generation systems across the plant lifecycle, from process synthesis and design to plant operations. We also look beyond the plant gates to discuss research challenges and opportunities for enterprise-wide optimization, including planning, scheduling, and supply chain technologies.

  10. 3rd International Symposium on Big Data and Cloud Computing Challenges

    CERN Document Server

    Neelanarayanan, V

    2016-01-01

    This proceedings volume contains selected papers that were presented at the 3rd International Symposium on Big Data and Cloud Computing Challenges, held at VIT University, India, on March 10 and 11, 2016. New research issues, challenges and opportunities shaping the future agenda in the field of Big Data and Cloud Computing are identified and presented throughout the book, which is intended for researchers, scholars, students, software developers and practitioners working at the forefront of their fields. The book acts as a platform for exchanging ideas, setting questions for discussion, and sharing experience in the Big Data and Cloud Computing domain.

  11. The Awareness and Challenges of Cloud Computing Adoption on Tertiary Education in Malaysia

    Science.gov (United States)

    Hazreeni Hamzah, Nor; Mahmud, Maziah; Zukri, Shamsunarnie Mohamed; Yaacob, Wan Fairos Wan; Yacob, Jusoh

    2017-09-01

    This preliminary study aims to investigate the awareness of the adoption of cloud computing among academicians in tertiary education in Malaysia, and to explore the possible challenges faced by academicians while adopting this new technology. The pilot study was done on 40 lecturers in Universiti Teknologi MARA Kampus Kota Bharu (UiTMKB) using a self-administered questionnaire. The results found that almost half (40 percent) were not aware of the existence of cloud computing in the teaching and learning (T&L) process. The challenges confronting the adoption of cloud computing are data insecurity, unsolicited advertisement, lock-in, reluctance to eliminate staff positions, privacy concerns, reliability challenges, regulatory compliance concerns/user control and institutional culture/resistance to change in technology. These possible challenges can be grouped into two major factors: a security and dependency factor, and a user control and mentality factor.

  12. Computational Prediction of Effector Proteins in Fungi: Opportunities and Challenges

    Directory of Open Access Journals (Sweden)

    Humira Sonah

    2016-02-01

    Full Text Available Effector proteins are mostly secretory proteins that stimulate plant infection by manipulating the host response. Identifying fungal effector proteins and understanding their function is of great importance in efforts to curb losses to plant diseases. Recent advances in high-throughput sequencing technologies have facilitated the availability of several fungal genomes and thousands of transcriptomes. As a result, the growing amount of genomic information has provided great opportunities to identify putative effector proteins in different fungal species. There is little consensus over the annotation and functionality of effector proteins, and mostly small secretory proteins are considered as effector proteins, a concept that tends to overestimate the number of proteins involved in a plant-pathogen interaction. With the characterization of Avr genes, criteria for computational prediction of effector proteins are becoming more efficient. There are hundreds of tools available for the identification of conserved motifs, signature sequences and structural features in proteins. Many pipelines and online servers, which combine several tools, are available to perform genome-wide identification of effector proteins. In this review, available tools and pipelines are discussed, along with their strengths and limitations for the effective identification of fungal effector proteins. We also present an exhaustive list of classically secreted proteins, along with their key conserved motifs, found in 12 common plant pathogens (11 fungi and one oomycete) through an analytical pipeline.
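
    As a schematic of the kind of filtering such pipelines perform (all thresholds and the crude signal-peptide proxy below are invented for illustration; real pipelines use dedicated predictors such as SignalP), candidate effectors are often shortlisted as small, cysteine-rich proteins with a hydrophobic N-terminus:

```python
# Toy shortlist of candidate effectors from protein sequences.
# Criteria are illustrative only, not those of any published tool.
HYDROPHOBIC = set("AILMFWVC")

def looks_secreted(seq, window=20, min_frac=0.5):
    """Crude signal-peptide proxy: hydrophobic-rich N-terminus."""
    head = seq[1:window + 1]  # skip the initiator methionine
    frac = sum(aa in HYDROPHOBIC for aa in head) / max(len(head), 1)
    return frac >= min_frac

def candidate_effector(seq, max_len=300, min_cys=4):
    """Small + cysteine-rich + putatively secreted."""
    return (len(seq) <= max_len
            and seq.count("C") >= min_cys
            and looks_secreted(seq))

proteins = {
    "p1": "MKLAILLASVVCAAA" + "GSTC" * 10,   # short, Cys-rich, hydrophobic head
    "p2": "MSTNPKPQRKTKRNTN" + "GDEK" * 60,  # hydrophilic, no Cys: rejected
}
for name, seq in proteins.items():
    print(name, candidate_effector(seq))  # p1 True, p2 False
```

    The overestimation problem the review raises is visible even in this toy: any small secreted protein passes the filter, whether or not it actually functions as an effector.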

  13. Computational challenges in magnetic-confinement fusion physics

    Science.gov (United States)

    Fasoli, A.; Brunner, S.; Cooper, W. A.; Graves, J. P.; Ricci, P.; Sauter, O.; Villard, L.

    2016-05-01

    Magnetic-fusion plasmas are complex self-organized systems with an extremely wide range of spatial and temporal scales, from the electron-orbit scales (~10^-11 s, ~10^-5 m) to the diffusion time of electrical current through the plasma (~10^2 s) and the distance along the magnetic field between two solid surfaces in the region that determines the plasma-wall interactions (~100 m). The description of the individual phenomena and of the nonlinear coupling between them involves a hierarchy of models, which, when applied to realistic configurations, require the most advanced numerical techniques and algorithms and the use of state-of-the-art high-performance computers. The common thread of such models resides in the fact that the plasma components are at the same time sources of electromagnetic fields, via the charge and current densities that they generate, and subject to the action of electromagnetic fields. This leads to a wide variety of plasma modes of oscillations that resonate with the particle or fluid motion and makes the plasma dynamics much richer than that of conventional, neutral fluids.

  14. Computer Adaptive Multistage Testing: Practical Issues, Challenges and Principles

    Directory of Open Access Journals (Sweden)

    Halil Ibrahim SARI

    2016-12-01

    Full Text Available The purpose of many tests in educational and psychological measurement is to measure test takers' latent trait scores from responses given to a set of items. Over the years, this has been done by traditional methods (paper and pencil tests). However, compared to other test administration models (e.g., adaptive testing), traditional methods are extensively criticized for producing low measurement accuracy and long test lengths. Adaptive testing has been proposed to overcome these problems. There are two popular adaptive testing approaches: computerized adaptive testing (CAT) and computer adaptive multistage testing (ca-MST). The former is a well-known approach that has been predominantly used in this field, and we believe that researchers and practitioners are fairly familiar with many aspects of CAT because it has more than a hundred years of history. The same is not true for the latter: since ca-MST is relatively new, many researchers are not familiar with its features. The purpose of this study is to closely examine the characteristics of ca-MST, including its working principle, the adaptation procedure called the routing method, test assembly, and scoring, and to provide an overview for researchers, with the aim of drawing researchers' attention to ca-MST and encouraging them to contribute to research in this area. Books, software and future work for ca-MST are also discussed.
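
    A cartoon of the routing idea (module names and cut-offs are invented for illustration; operational ca-MST designs route on estimated latent traits, not raw proportion correct): after each stage, the provisional score routes the test taker to an easier or harder module.

```python
# Toy two-stage multistage-testing router; cut-offs are invented.
MODULES = {
    "stage2_easy":   ["e1", "e2", "e3"],
    "stage2_medium": ["m1", "m2", "m3"],
    "stage2_hard":   ["h1", "h2", "h3"],
}

def route(stage1_correct, n_stage1=10):
    """Route on the proportion correct in the stage-1 routing module."""
    p = stage1_correct / n_stage1
    if p < 0.4:
        return "stage2_easy"
    elif p < 0.7:
        return "stage2_medium"
    return "stage2_hard"

print(route(3), route(6), route(9))
# stage2_easy stage2_medium stage2_hard
```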

  15. P300 brain computer interface: current challenges and emerging trends

    Directory of Open Access Journals (Sweden)

    Reza Fazel-Rezai

    2012-07-01

    Full Text Available A brain-computer interface (BCI) enables communication without movement based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), the steady-state visual evoked potential (SSVEP), or event-related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other components of the ERP, elicited by an oddball paradigm presented to the subject. In this paper, we overview the current status of P300 BCI technology, and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility.
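
    A minimal sketch of the classic analysis behind P300 detection (entirely synthetic data with invented amplitudes; real systems use calibrated classifiers rather than a fixed window comparison): average EEG epochs time-locked to target vs. non-target stimuli and compare the mean amplitude in a window around 300 ms.

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 250                       # sampling rate, Hz
t = np.arange(0, 0.8, 1 / fs)  # 0.8 s epochs, time-locked to stimulus onset

def make_epoch(is_target):
    """Synthetic epoch: noise, plus a positive bump near 300 ms for targets."""
    x = rng.normal(0, 2.0, t.size)
    if is_target:
        x += 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    return x

targets = np.mean([make_epoch(True) for _ in range(30)], axis=0)
nontargets = np.mean([make_epoch(False) for _ in range(30)], axis=0)

# Mean amplitude in a 250-400 ms window; targets should be clearly larger.
win = (t >= 0.25) & (t <= 0.40)
print(f"target: {targets[win].mean():.2f} uV, "
      f"non-target: {nontargets[win].mean():.2f} uV")
```

    Averaging over many epochs is what makes the rare (oddball) target response stand out against background EEG, which is why bit rate and the number of required repetitions are so tightly linked in P300 spellers.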

  16. Computational fluid dynamics challenges for hybrid air vehicle applications

    Science.gov (United States)

    Carrin, M.; Biava, M.; Steijl, R.; Barakos, G. N.; Stewart, D.

    2017-06-01

    This paper begins by comparing turbulence models for the prediction of hybrid air vehicle (HAV) flows. A 6:1 prolate spheroid is employed for validation of the computational fluid dynamics (CFD) method. An analysis of turbulent quantities is presented and the Shear Stress Transport (SST) k-ω model is compared against a k-ω Explicit Algebraic Stress Model (EASM) within the unsteady Reynolds-Averaged Navier-Stokes (RANS) framework. Further comparisons involve Scale Adaptive Simulation models and a local transition transport model. The results show that the flow around the vehicle at low pitch angles is sensitive to transition effects. At high pitch angles, the vortices generated on the suction side provide substantial lift augmentation and are better resolved by EASMs. The validated CFD method is employed for the flow around a shape similar to the Airlander aircraft of Hybrid Air Vehicles Ltd. The sensitivity of the transition location to the Reynolds number is demonstrated and the role of each of the vehicle's components is analyzed. It was found that the fins contributed the most to the increase in lift and drag.

  17. Computer Vision Malaria Diagnostic Systems—Progress and Prospects

    Directory of Open Access Journals (Sweden)

    Joseph Joel Pollak

    2017-08-01

    Full Text Available Accurate malaria diagnosis is critical to prevent malaria fatalities, curb overuse of antimalarial drugs, and promote appropriate management of other causes of fever. While several diagnostic tests exist, the need for a rapid and highly accurate malaria assay remains. Microscopy and rapid diagnostic tests are the main diagnostic modalities available, yet they can demonstrate poor performance and accuracy. Automated microscopy platforms have the potential to significantly improve and standardize malaria diagnosis. Based on image recognition and machine learning algorithms, these systems maintain the benefits of light microscopy and provide improvements such as quicker scanning time, greater scanning area, and increased consistency brought by automation. While these applications have been in development for over a decade, recently several commercial platforms have emerged. In this review, we discuss the most advanced computer vision malaria diagnostic technologies and investigate several of their features which are central to field use. Additionally, we discuss the technological and policy barriers to implementing these technologies in low-resource settings world-wide.
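
    To illustrate the flavor of the image-analysis step (a deliberately simplified sketch on synthetic data; the commercial platforms reviewed use far more sophisticated machine-learning pipelines), dark stained objects can be segmented by thresholding and counted via connected-component labeling:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(7)

# Synthetic "smear": bright background with a few dark blobs as stand-ins
# for stained parasites (values and positions are invented).
img = rng.normal(200, 5, (64, 64))
for cy, cx in [(15, 20), (40, 45), (50, 12)]:
    yy, xx = np.ogrid[:64, :64]
    img[(yy - cy) ** 2 + (xx - cx) ** 2 <= 9] = 60  # dark disc

# Threshold dark pixels, then label connected components.
mask = img < 120
labels, n_objects = ndimage.label(mask)
print("objects found:", n_objects)  # expect 3
```

    Automating exactly this kind of counting, but robustly across stains, cameras and species, is what separates a classroom exercise from the field-ready platforms the review surveys.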

  18. CNS drug development in Europe--past progress and future challenges.

    Science.gov (United States)

    Nutt, David J; Attridge, Jim

    2014-01-01

    Despite enormous progress in defining, diagnosing and treating mental disorders, EU health systems face a mounting challenge in responding to 'unmet need'. Mental illnesses produce a societal burden that exceeds that for either cancers or cardiovascular conditions. Leveraging advances in science and medicine to make available new innovative medicines is a key component in responding to this challenge. The dominant paradigm has been, is and will continue to be one of incremental progress. Better medicines for depression, anxiety and psychoses in the working-age population would add great value to patients and improve labour productivity. But psychotropic medicines face exceptional challenges in demonstrating their added value, due to uncertainty in patient diagnosis, selecting treatments and ensuring adherence. Also, there are major difficulties in estimating costs. Advances in understanding brain processes, identifying biomarkers and neuro-imaging techniques promise far more effective 'diagnostic-therapeutic' treatments and improved patient outcomes in the future. Currently there are valuable incremental innovations in late development, which may well fail to recover their R&D costs because of very low reimbursed prices. This will send a signal to innovators not to persist with product development in this area. Recently several leading companies have withdrawn from R&D in these mental disorders. This is a worrying development, since building the capabilities to succeed in any disease sector takes many years and, once dismantled, they cannot easily be re-established. Three policy interventions could improve innovation incentives. Further 'push' incentives under (i) and streamlining under (ii) alone will not reverse the decline in investment incentives; an EU consensus, based upon an innovation model which encompasses the Research, Development and Market phases as a single cyclical process and which addresses the weak 'market pull' incentives under (iii), is needed.

  19. Computer vision challenges and technologies for agile manufacturing

    Science.gov (United States)

    Molley, Perry A.

    1996-02-01

    applicable to commercial production processes and applications. Computer vision will play a critical role in the new agile production environment for automation of processes such as inspection, assembly, welding, material dispensing and other process control tasks. Although there are many academic and commercial solutions that have been developed, none have had widespread adoption considering the huge potential number of applications that could benefit from this technology. The reason for this slow adoption is that the advantages of computer vision for automation can be a double-edged sword. The benefits can be lost if the vision system requires an inordinate amount of time for reprogramming by a skilled operator to account for different parts, changes in lighting conditions, background clutter, changes in optics, etc. Commercially available solutions typically require an operator to manually program the vision system with features used for the recognition. In a recent survey, we asked a number of commercial manufacturers and machine vision companies the question, 'What prevents machine vision systems from being more useful in factories?' The number one (and unanimous) response was that vision systems require too much skill to set up and program to be cost effective.

  20. Progress and Challenges in Developing Aptamer-Functionalized Targeted Drug Delivery Systems

    Directory of Open Access Journals (Sweden)

    Feng Jiang

    2015-10-01

    Full Text Available Aptamers, which can be screened via systematic evolution of ligands by exponential enrichment (SELEX), are superior ligands for molecular recognition due to their high selectivity and affinity. The interest in the use of aptamers as ligands for targeted drug delivery has been increasing due to their unique advantages. Based on their different compositions and preparation methods, aptamer-functionalized targeted drug delivery systems can be divided into two main categories: aptamer-small molecule conjugated systems and aptamer-nanomaterial conjugated systems. In this review, we not only summarize recent progress in aptamer selection and the application of aptamers in these targeted drug delivery systems but also discuss the advantages, challenges and new perspectives associated with these delivery systems.

  1. Candidiasis: a fungal infection--current challenges and progress in prevention and treatment.

    Science.gov (United States)

    Hani, Umme; Shivakumar, Hosakote G; Vaghela, Rudra; Osmani, Riyaz Ali M; Shrivastava, Atul

    2015-01-01

    Despite therapeutic advances, candidiasis remains a common fungal infection, most frequently caused by C. albicans; it may occur as vulvovaginal candidiasis or as thrush, a mucocutaneous candidiasis. Candidiasis frequently occurs in newborns, in immune-deficient people such as AIDS patients, and in people being treated with broad-spectrum antibiotics. It is mainly due to C. albicans, while other species such as C. tropicalis, C. glabrata, C. parapsilosis and C. krusei are increasingly isolated. OTC antifungal dosage forms such as creams and gels can be used for effective treatment of local candidiasis, whereas antifungal chemotherapy is preferred for preventing spread of the disease to deeper vital organs. Use of probiotics and the development of novel vaccines are advanced approaches for the prevention of candidiasis. The present review summarizes the diagnosis, current status and challenges in the treatment and prevention of candidiasis, with prime focus on host defense against candidiasis, advancements in diagnosis, the role of probiotics, and recent progress in the development of vaccines against candidiasis.

  2. Progress and Challenges in Developing Aptamer-Functionalized Targeted Drug Delivery Systems.

    Science.gov (United States)

    Jiang, Feng; Liu, Biao; Lu, Jun; Li, Fangfei; Li, Defang; Liang, Chao; Dang, Lei; Liu, Jin; He, Bing; Badshah, Shaikh Atik; Lu, Cheng; He, Xiaojuan; Guo, Baosheng; Zhang, Xiao-Bing; Tan, Weihong; Lu, Aiping; Zhang, Ge

    2015-01-01

    Aptamers, which can be screened via systematic evolution of ligands by exponential enrichment (SELEX), are superior ligands for molecular recognition due to their high selectivity and affinity. The interest in the use of aptamers as ligands for targeted drug delivery has been increasing due to their unique advantages. Based on their different compositions and preparation methods, aptamer-functionalized targeted drug delivery systems can be divided into two main categories: aptamer-small molecule conjugated systems and aptamer-nanomaterial conjugated systems. In this review, we not only summarize recent progress in aptamer selection and the application of aptamers in these targeted drug delivery systems but also discuss the advantages, challenges and new perspectives associated with these delivery systems.

  3. Challenges, Progress, and Opportunities: Proceedings of the Filovirus Medical Countermeasures Workshop

    Directory of Open Access Journals (Sweden)

    Rona Hirschberg

    2014-07-01

    Full Text Available On August 22–23, 2013, agencies within the United States Department of Defense (DoD) and the Department of Health and Human Services (HHS) sponsored the Filovirus Medical Countermeasures (MCMs) Workshop as an extension of the activities of the Filovirus Animal Non-clinical Group (FANG). The FANG is a federally-recognized multi-agency group established in 2011 to coordinate and facilitate U.S. government (USG) efforts to develop filovirus MCMs. The workshop brought together government, academic and industry experts to consider the needs for filovirus MCMs and evaluate the status of the product development pipeline. This report summarizes speaker presentations and highlights progress and challenges remaining in the field.

  4. Advances in calculations of thermal elastic properties: progress and challenges in the interpretation of seismic tomography

    Science.gov (United States)

    Wentzcovitch, R. M.; Wu, Z.; Cococcioni, M.; Umemoto, K.

    2012-12-01

    First principles calculations in mineral physics have contributed decisively to understanding numerous mineral properties of interest in geophysics. Thermal elasticity is the essential property that allows for direct insights into the origin of seismic observables such as velocity discontinuities, gradients, anisotropies, and heterogeneities. Earth materials are challenging for calculations because they contain iron, a strongly correlated element that may undergo spin crossover under pressure; hydrogen, which even in very small amounts can produce anelastic effects; and phases that are stabilized only by anharmonic fluctuations, e.g., CaSiO3-perovskite. I will describe recent progress in methodology that is enabling calculations of thermal elastic properties with unprecedented ease. I will exemplify its power on iron-bearing phases, including some undergoing spin crossovers, and provide an updated view of lower mantle velocities.

  5. Progress of the Computer-Aided Engineering of Electric Drive Vehicle Batteries (CAEBAT) (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A. A.; Han, T.; Hartridge, S.; Shaffer, C.; Kim, G. H.; Pannala, S.

    2013-06-01

    This presentation, Progress of Computer-Aided Engineering of Electric Drive Vehicle Batteries (CAEBAT), is about simulation and computer-aided engineering (CAE) tools that are widely used to speed up the research and development cycle and reduce the number of build-and-break steps, particularly in the automotive industry. Realizing this, DOE's Vehicle Technologies Program initiated the CAEBAT project in April 2010 to develop a suite of software tools for designing batteries.

  6. Establishing and sustaining a biorepository network in Israel: challenges and progress.

    Science.gov (United States)

    Cohen, Yehudit; Almog, Ronit; Onn, Amir; Itzhaki-Alfia, Ayelet; Meir, Karen

    2013-12-01

    Over the past 5 years, using European and North American biobanks as models, the grass-roots establishment of independently operating biobanks has occurred virtually simultaneously in large Israeli teaching hospitals. The process of establishing a national biorepository network in Israel has progressed slowly, sustained mainly by a few proponents working together on a personal level. Slow progress has been due to limited funding and the lack of a legal framework specific to biobanking activities. Recently, due to increasing pressure from the scientific community, the government has earmarked funds for a national biorepository network, and the structure is now being established. In forming a network, Israel's biobanks face certain difficulties, particularly lack of support. Additional challenges include harmonization of standard operating procedures, database centralization, and use of a common informed consent form. In this article, we highlight some of the issues faced by Israel's biobank managers in establishing and sustaining a functional biobank network, information that could provide guidance for other small countries with limited resources.

  7. Progress and challenges of engineering a biophysical carbon dioxide-concentrating mechanism into higher plants.

    Science.gov (United States)

    Rae, Benjamin D; Long, Benedict M; Förster, Britta; Nguyen, Nghiem D; Velanis, Christos N; Atkinson, Nicky; Hee, Wei Yih; Mukherjee, Bratati; Price, G Dean; McCormick, Alistair J

    2017-04-24

    Growth and productivity in important crop plants is limited by the inefficiencies of the C3 photosynthetic pathway. Introducing CO2-concentrating mechanisms (CCMs) into C3 plants could overcome these limitations and lead to increased yields. Many unicellular microautotrophs, such as cyanobacteria and green algae, possess highly efficient biophysical CCMs that increase CO2 concentrations around the primary carboxylase enzyme, Rubisco, to enhance CO2 assimilation rates. Algal and cyanobacterial CCMs utilize distinct molecular components, but share several functional commonalities. Here we outline the recent progress and current challenges of engineering biophysical CCMs into C3 plants. We review the predicted requirements for a functional biophysical CCM based on current knowledge of cyanobacterial and algal CCMs, the molecular engineering tools and research pipelines required to translate our theoretical knowledge into practice, and the current challenges to achieving these goals.

  8. Nanoparticle-Based Drug Delivery for Therapy of Lung Cancer: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Anish Babu

    2013-01-01

    Full Text Available The last decade has witnessed enormous advances in the development and application of nanotechnology in cancer detection, diagnosis, and therapy culminating in the development of the nascent field of “cancer nanomedicine.” A nanoparticle as per the National Institutes of Health (NIH guidelines is any material that is used in the formulation of a drug resulting in a final product smaller than 1 micron in size. Nanoparticle-based therapeutic systems have gained immense popularity due to their ability to overcome biological barriers, effectively deliver hydrophobic therapies, and preferentially target disease sites. Currently, many formulations of nanocarriers are utilized including lipid-based, polymeric and branched polymeric, metal-based, magnetic, and mesoporous silica. Innovative strategies have been employed to exploit the multicomponent, three-dimensional constructs imparting multifunctional capabilities. Engineering such designs allows simultaneous drug delivery of chemotherapeutics and anticancer gene therapies to site-specific targets. In lung cancer, nanoparticle-based therapeutics is paving the way in the diagnosis, imaging, screening, and treatment of primary and metastatic tumors. However, translating such advances from the bench to the bedside has been severely hampered by challenges encountered in the areas of pharmacology, toxicology, immunology, large-scale manufacturing, and regulatory issues. This review summarizes current progress and challenges in nanoparticle-based drug delivery systems, citing recent examples targeted at lung cancer treatment.

  9. Mathematics applied to the climate system: outstanding challenges and recent progress

    Science.gov (United States)

    Williams, Paul D.; Cullen, Michael J. P.; Davey, Michael K.; Huthnance, John M.

    2013-01-01

    The societal need for reliable climate predictions and a proper assessment of their uncertainties is pressing. Uncertainties arise not only from initial conditions and forcing scenarios, but also from model formulation. Here, we identify and document three broad classes of problems, each representing what we regard to be an outstanding challenge in the area of mathematics applied to the climate system. First, there is the problem of the development and evaluation of simple physically based models of the global climate. Second, there is the problem of the development and evaluation of the components of complex models such as general circulation models. Third, there is the problem of the development and evaluation of appropriate statistical frameworks. We discuss these problems in turn, emphasizing the recent progress made by the papers presented in this Theme Issue. Many pressing challenges in climate science require closer collaboration between climate scientists, mathematicians and statisticians. We hope the papers contained in this Theme Issue will act as inspiration for such collaborations and for setting future research directions. PMID:23588054

  10. High Performance Numerical Computing for High Energy Physics: A New Challenge for Big Data Science

    Directory of Open Access Journals (Sweden)

    Florin Pop

    2014-01-01

    Full Text Available Modern physics is based on both theoretical analysis and experimental validation. Complex scenarios like subatomic dimensions, high energy, and low absolute temperature are frontiers for many theoretical models. Simulation with stable numerical methods represents an excellent instrument for high-accuracy analysis, experimental validation, and visualization. High performance computing support offers the possibility of making simulations at large scale and in parallel, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high energy physics (HEP), analyzed from two perspectives: numerical methods and high performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory used in the analysis of particle spectra. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
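
    As a reminder of why Monte Carlo methods dominate here (a generic sketch, not from the paper): the statistical error of an MC estimate scales like 1/sqrt(N) regardless of dimension, which is what makes high-dimensional phase-space integrals in HEP tractable.

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_integrate(f, dim, n):
    """Monte Carlo estimate of the integral of f over the unit hypercube."""
    x = rng.random((n, dim))
    vals = f(x)
    est = vals.mean()
    err = vals.std(ddof=1) / np.sqrt(n)  # 1/sqrt(N) statistical error
    return est, err

# Example: integrate f(x) = prod(2*x_i) over [0,1]^6; the exact value is 1.
f = lambda x: np.prod(2 * x, axis=1)
for n in (1_000, 100_000):
    est, err = mc_integrate(f, dim=6, n=n)
    print(f"n={n:>7}: {est:.4f} +/- {err:.4f}")
```

    Quadrupling the accuracy requires sixteen times the samples, which is exactly why such methods become data-intensive at HEP scales and push on storage and systems architecture as the paper argues.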

  11. SLA-Oriented Resource Provisioning for Cloud Computing: Challenges, Architecture, and Solutions

    CERN Document Server

    Buyya, Rajkumar; Calheiros, Rodrigo N

    2012-01-01

    Cloud computing systems promise to offer subscription-oriented, enterprise-quality computing services to users worldwide. With the increased demand for delivering services to a large number of users, they need to offer differentiated services to users and meet their quality expectations. Existing resource management systems in data centers are yet to support Service Level Agreement (SLA)-oriented resource allocation, and thus need to be enhanced to realize cloud computing and utility computing. In addition, no work has been done to collectively incorporate customer-driven service management, computational risk management, and autonomic resource management into a market-based resource management system to target the rapidly changing enterprise requirements of Cloud computing. This paper presents vision, challenges, and architectural elements of SLA-oriented resource management. The proposed architecture supports integration of marketbased provisioning policies and virtualisation technologies for flexible alloc...
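
    A toy rendering of the SLA-aware idea (all figures and the admission rule are invented for illustration; the paper describes a full market-based architecture, not this simple policy): accept a request only if its expected revenue beats the expected SLA penalty.

```python
# Toy SLA-aware admission control; all numbers are invented.
def admit(request, free_capacity):
    """Accept if expected profit (revenue minus expected SLA penalty) > 0."""
    if request["vms"] > free_capacity:
        return False
    expected_penalty = request["penalty"] * request["violation_risk"]
    return request["revenue"] - expected_penalty > 0

requests = [
    {"vms": 4, "revenue": 100.0, "penalty": 300.0, "violation_risk": 0.10},
    {"vms": 2, "revenue": 40.0,  "penalty": 500.0, "violation_risk": 0.20},
]
capacity = 8
for r in requests:
    if admit(r, capacity):
        capacity -= r["vms"]
        print("accepted", r["vms"], "VMs; remaining capacity", capacity)
    else:
        print("rejected")  # second request: 40 - 100 expected penalty < 0
```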

  12. SECURITY ISSUES AND CHALLENGES IN MOBILE COMPUTING AND M-COMMERCE

    OpenAIRE

    Krishna Prakash; Balachandra

    2015-01-01

    Mobile computing and mobile commerce are highly popular nowadays because of the services offered during mobility. Mobile computing has become a reality today rather than a luxury. The mobile wireless market is growing by leaps and bounds. The quality and speeds available in the mobile environment must match those of fixed networks if the convergence of mobile wireless and fixed communication networks is to happen in a real sense. The challenge for mobile networks lies in provid...

  13. Computing in the curriculum: challenges and strategies from a teacher's perspective

    OpenAIRE

    Sentance, Susan Elizabeth; Csizmadia, Andrew

    2016-01-01

    Computing is being introduced into the curriculum in many countries. Teachers’ perspectives enable us to discover the challenges this presents, and also the strategies teachers claim to be using successfully in teaching the subject across primary and secondary education. The study described in this paper was carried out in the UK in 2014 where teachers were preparing for the mandatory inclusion of Computing into the curriculum. A survey was conducted of over 300 teachers who were currently te...

  14. A New Finite Element Approach for Prediction of Aerothermal Loads: Progress in Inviscid Flow Computations

    Science.gov (United States)

    Bey, K. S.; Thornton, E. A.; Dechaumphai, P.; Ramakrishnan, R.

    1985-01-01

    Recent progress in the development of finite element methodology for the prediction of aerothermal loads is described. Two dimensional, inviscid computations are presented, but emphasis is placed on development of an approach extendable to three dimensional viscous flows. Research progress is described for: (1) utilization of a commercially available program to construct flow solution domains and display computational results, (2) development of an explicit Taylor-Galerkin solution algorithm, (3) closed form evaluation of finite element matrices, (4) vector computer programming strategies, and (5) validation of solutions. Two test problems of interest to NASA Langley aerothermal research are studied. Comparisons of finite element solutions for Mach 6 flow with other solution methods and experimental data validate fundamental capabilities of the approach for analyzing high speed inviscid compressible flows.
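
    To illustrate the kind of scheme this record describes, here is a minimal Taylor-Galerkin solver (ours, not the NASA code) for the one-dimensional linear advection equation u_t + c·u_x = 0 with linear finite elements on a uniform periodic mesh; the mesh size, Courant number, and initial profile are illustrative assumptions.

    ```python
    import numpy as np

    def taylor_galerkin_advection(n=200, courant=0.5, steps=400):
        """Taylor-Galerkin scheme for u_t + c u_x = 0 with linear elements on
        a uniform periodic mesh. After `steps` steps the profile has travelled
        steps * courant / n periods (here exactly one full period)."""
        h = 1.0 / n
        x = np.arange(n) * h
        u = np.exp(-200.0 * (x - 0.25) ** 2)   # Gaussian initial profile
        nu = courant                            # Courant number c*dt/h

        # Consistent mass matrix: rows (1/6)[1, 4, 1] with periodic wrap-around.
        M = np.zeros((n, n))
        idx = np.arange(n)
        M[idx, idx] = 4.0 / 6.0
        M[idx, (idx + 1) % n] = 1.0 / 6.0
        M[idx, (idx - 1) % n] = 1.0 / 6.0

        for _ in range(steps):
            up, um = np.roll(u, -1), np.roll(u, 1)   # u_{i+1}, u_{i-1}
            # Galerkin advection term plus the second-order Taylor correction
            # (c^2 dt^2 / 2) u_xx that stabilises the explicit step.
            rhs = -0.5 * nu * (up - um) + 0.5 * nu**2 * (up - 2.0 * u + um)
            u = u + np.linalg.solve(M, rhs)
        return x, u

    if __name__ == "__main__":
        x, u = taylor_galerkin_advection()
        print(f"peak after one period at x = {x[np.argmax(u)]:.3f} (started at 0.250)")
    ```

    The textbook stability limit for this one-step variant with linear elements is a Courant number of 1/√3, which the default of 0.5 respects; the consistent mass matrix is what distinguishes the finite element scheme from its finite difference cousin, Lax-Wendroff.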

  16. New Challenges for Design Participation in the Era of Ubiquitous Computing

    DEFF Research Database (Denmark)

    Brereton, Margot; Buur, Jacob

    2008-01-01

    Since the event of participatory design in the work democracy projects of the 1970’s and 1980’s in Scandinavia, computing technology and people’s engagement with it have undergone fundamental changes. Although participatory design continues to be a precondition for designing computing that aligns … computing, and how this challenges the original participatory design thinking. In particular we will argue that more casual, exploratory formats of engagement with people are required, and rather than planning the all-encompassing systems development project, participatory design needs to move towards …

  17. Center for computation and visualization of geometric structures. Annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-02-12

    The mission of the Center is to establish a unified environment promoting research, education, and software and tool development. The work is centered on computing, interpreted in a broad sense to include the relevant theory, development of algorithms, and actual implementation. The research aspects of the Center are focused on geometry; correspondingly, the computational aspects are focused on three- (and higher-) dimensional visualization. The educational aspects are likewise centered on computing and focused on geometry. A broader term than education is "communication", which encompasses the challenge of explaining to the world current research in mathematics, and specifically geometry.

  18. Evaluating a multi-player brain-computer interface game: challenge versus co-experience

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Nijholt, Anton; Poel, Mannes; Obbink, Michel; Volpe, G.; Reidsma, D.; Camurri, A.; Nijholt, A.

    2013-01-01

    Brain–computer interfaces (BCIs) have started to be considered as game controllers. The low level of control they provide rules out perfect control, but allows the design of challenging games which can be enjoyed by players. Evaluation of enjoyment, or user experience (UX), is rather…

  19. Challenges and Opportunities of Information Technology in the 90s. Track VIII: Managing Distributed Computing Services.

    Science.gov (United States)

    CAUSE, Boulder, CO.

    Six papers from the 1990 CAUSE Conference Track VIII: Managing Distributed Computing are presented. Authors discuss the challenges and opportunities involved in providing user managers with direct access to institutional databases to support their decision making and planning activities. Papers and their authors are as follows: "Rendering an…

  20. Challenges of Teaching Computer Science in Transition Countries: Albanian University Case

    Science.gov (United States)

    Sotirofski, Kseanela; Kukeli, Agim; Kalemi, Edlira

    2010-01-01

    The main objective of our study is to determine the challenges faced during the process of teaching Computer Science in a university of a country in transition and make suggestions to improve this teaching process by perfecting the necessary conditions. Our survey builds on the thesis that we live in an information age; information technology is…

  1. Investigating the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms

    Science.gov (United States)

    Kay, Robin Holding; Lauricella, Sharon

    2014-01-01

    The purpose of this study was to investigate the benefits and challenges of using laptop computers (hereafter referred to as laptops) inside and outside higher education classrooms. Quantitative and qualitative data were collected from 156 university students (54 males, 102 females) enrolled in either education or communication studies. Benefits of…

  2. Challenges in Integrating a Complex Systems Computer Simulation in Class: An Educational Design Research

    Science.gov (United States)

    Loke, Swee-Kin; Al-Sallami, Hesham S.; Wright, Daniel F. B.; McDonald, Jenny; Jadhav, Sheetal; Duffull, Stephen B.

    2012-01-01

    Complex systems are typically difficult for students to understand and computer simulations offer a promising way forward. However, integrating such simulations into conventional classes presents numerous challenges. Framed within an educational design research, we studied the use of an in-house built simulation of the coagulation network in four…

  3. Computational Challenge of Fractional Differential Equations and the Potential Solutions: A Survey

    Directory of Open Access Journals (Sweden)

    Chunye Gong

    2015-01-01

    Full Text Available We present a survey of fractional differential equations and in particular of the computational cost of their numerical solution from the view of computer science. The computational complexities of time fractional, space fractional, and space-time fractional equations are O(N²M), O(NM²), and O(NM(M + N)), compared with O(MN) for the classical partial differential equations with finite difference methods, where M and N are the numbers of space grid points and time steps. The potential solutions for this challenge include, but are not limited to, parallel computing, memory access optimization (fractional precomputing operator), the short memory principle, fast Fourier transform (FFT)-based solutions, the alternating direction implicit method, multigrid methods, and preconditioner technology. The relationships of these solutions for both space fractional and time fractional derivatives are discussed. The authors point out that the technologies of parallel computing should be regarded as a basic method to overcome this challenge, and some attention should be paid to fractional killer applications, high performance iteration methods, high order schemes, and Monte Carlo methods. Since the computation of fractional equations with high dimension and variable order is even heavier, researchers from the areas of mathematics and computer science have the opportunity to invent cornerstones in the area of fractional calculus.
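
    The quadratic cost in the time dimension comes from the non-locality of the fractional derivative: every new time step must revisit the whole solution history. The sketch below (ours; the scheme and parameter values are illustrative, and it glosses over Riemann-Liouville versus Caputo initial-condition details) time-steps D^α u = -u with Grünwald-Letnikov weights and counts the history operations, with an optional window implementing the short memory principle the survey mentions.

    ```python
    def gl_weights(alpha, n):
        """Grünwald-Letnikov weights w_k = (-1)^k * binom(alpha, k), by recurrence."""
        w = [1.0]
        for k in range(1, n + 1):
            w.append(w[-1] * (k - 1 - alpha) / k)
        return w

    def fractional_relaxation(alpha=0.5, n_steps=2000, dt=1e-3, memory=None):
        """Implicit time-stepping of D^alpha u = -u with u(0) = 1. Step n needs
        a convolution over all n previous values, so N steps cost O(N^2) history
        operations; a `memory` window of length L cuts this to about N*L."""
        w = gl_weights(alpha, n_steps)
        u = [1.0]
        ops = 0
        for n in range(1, n_steps + 1):
            lo = 0 if memory is None else max(0, n - memory)
            hist = sum(w[k] * u[n - k] for k in range(1, n - lo + 1))
            ops += n - lo
            u.append(-hist / (1.0 + dt**alpha))   # solve step n for u_n
        return u, ops

    if __name__ == "__main__":
        _, full = fractional_relaxation()              # about N^2 / 2 operations
        _, short = fractional_relaxation(memory=100)   # about N * 100 operations
        print(f"history ops, full memory:   {full:,}")
        print(f"history ops, window of 100: {short:,}")
    ```

    In a space-fractional problem the same non-locality appears in space, turning banded matrices dense, which is why the survey couples these equations to parallel computing and FFT-based fast solvers.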

  4. Progress in Harmonizing Tiered HIV Laboratory Systems: Challenges and Opportunities in 8 African Countries.

    Science.gov (United States)

    Williams, Jason; Umaru, Farouk; Edgil, Dianna; Kuritsky, Joel

    2016-09-28

    In 2014, the Joint United Nations Programme on HIV/AIDS released its 90-90-90 targets, which make laboratory diagnostics a cornerstone for measuring efforts toward the epidemic control of HIV. A data-driven laboratory harmonization and standardization approach is one way to create efficiencies and ensure optimal laboratory procurements. Following the 2008 "Maputo Declaration on Strengthening of Laboratory Systems" (a call for government leadership in harmonizing tiered laboratory networks and standardizing testing services), several national ministries of health requested that the United States Government and in-country partners help implement the recommendations by facilitating laboratory harmonization and standardization workshops, with a primary focus on improving HIV laboratory service delivery. Between 2007 and 2015, harmonization and standardization workshops were held in 8 African countries. This article reviews progress in the harmonization of laboratory systems in these 8 countries. We examined agreed-upon instrument lists established at the workshops and compared them against instrument data from laboratory quantification exercises over time. We used this measure as an indicator of adherence to national procurement policies. We found high levels of diversity across laboratories' diagnostic instruments, equipment, and services. This diversity contributes to different levels of compliance with expected service delivery standards. We believe the following challenges to be the most important to address: (1) lack of adherence to procurement policies, (2) absence or limited influence of a coordinating body to fully implement harmonization proposals, and (3) misalignment of laboratory policies with minimum packages of care and with national HIV care and treatment guidelines. Overall, the effort to implement the recommendations from the Maputo Declaration has had mixed success and is a work in progress. Program managers should continue efforts to advance the

  5. Progress and Challenges in Improving the Nutritional Quality of Rice (Oryza sativa L.).

    Science.gov (United States)

    Birla, Deep Shikha; Malik, Kapil; Sainger, Manish; Chaudhary, Darshna; Jaiwal, Ranjana; Jaiwal, Pawan K

    2015-10-29

    Rice is a staple food for more than 3 billion people in more than 100 countries of the world, but ironically it is deficient in many bioavailable vitamins, minerals, essential amino and fatty acids, and phytochemicals that prevent chronic diseases like type 2 diabetes, heart disease, cancers and obesity. To enhance the nutritional and other quality aspects of rice, a better understanding of the regulation of the processes involved in the synthesis, uptake, transport and metabolism of macro- (starch, seed storage protein and lipid) and micronutrients (vitamins, minerals and phytochemicals) is required. With the publication of the high-quality genomic sequence of rice, significant progress has been made in the identification, isolation and characterization of novel genes and their regulation for the nutritional and quality enhancement of rice. During the last decade, numerous efforts have been made to refine the nutritional and other quality traits, either by using traditional breeding with high-throughput technologies such as marker-assisted selection and breeding, or by adopting the transgenic approach. A significant improvement in vitamin (A, folate and E), mineral (iron), essential amino acid (lysine) and flavonoid levels has been achieved in the edible part of rice, i.e. the endosperm (biofortification), to meet the daily dietary allowance. However, studies on the bioavailability and allergenicity of biofortified rice are still required. Despite the numerous efforts, the commercialization of biofortified rice has not yet been achieved. The present review summarizes the progress and challenges of genetic engineering and/or metabolic engineering technologies to improve rice grain quality, and presents the future prospects in developing nutrient-dense rice to save the ever-increasing population that depends solely on rice as a staple food from widespread nutritional deficiencies.

  6. Computing in research and development in Africa benefits, trends, challenges and solutions

    CERN Document Server

    2015-01-01

    This book describes the trends, challenges and solutions in computing use for scientific research and development within different domains in Africa, such as health, agriculture, environment, economy, energy, education and engineering. The benefits expected are discussed by a number of recognized, domain-specific experts, with a common theme being computing as solution enabler. This book is the first document providing such a representative up-to-date view on this topic at the continent level.   • Discusses computing for scientific research and development on the African continent, addressing domains such as engineering, health, agriculture, environment, economy, energy, and education; • Describes the state-of-the-art in usage of computing to address problems in developing countries pertaining to health, productivity, economic growth, and renewable energy; • Offers insights applicable to all developing countries on the use of computing technologies to address a variety of societal issues.

  7. E-Learning Based on Cloud Computing: Requirements, Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Alireza Mohammadrezaei

    2014-07-01

    Full Text Available Cloud computing technology has changed how applications are accessed and developed. By providing the necessary infrastructure, this technology runs applications as services on the net via web browsers. E-learning can utilize cloud computing in order to fulfill its infrastructure requirements and to provide improved performance, scalability, and increased availability. Using a descriptive-analytical approach, this study presents concepts such as e-learning and cloud computing and investigates the use of cloud computing in e-learning. By outlining the advantages, it also indicates the significance and necessity of e-learning based upon cloud computing. Finally, the challenges of this model and their solutions are presented as well.

  8. Development of a vaccine for Chlamydia trachomatis: challenges and current progress

    Directory of Open Access Journals (Sweden)

    Hafner LM

    2015-08-01

    Full Text Available Louise M Hafner,1 Peter Timms2 1School of Biomedical Sciences, Institute of Health and Biomedical Innovation, Faculty of Health, Queensland University of Technology, Brisbane, 2Faculty of Science, Health, Education and Engineering, University of the Sunshine Coast, Maroochydore DC, QLD, Australia Abstract: Chlamydia trachomatis remains an enigmatic bacterial pathogen with no vaccine yet available to treat human ocular and genital tract infections caused by tissue-tropic serovars of the organism. Globally, it is the leading cause of preventable blindness as well as the leading cause of bacterial sexually transmitted infections. The pathogen has a range of virulence factors that enable it to successfully evade both the innate and adaptive immune system of the host. The host immune system, although protective, paradoxically is also associated closely with the pathologies of trachoma and pelvic inflammatory disease – disease sequelae of some chlamydial infections and reinfections in some genetically susceptible hosts. In this review, we focus on what is known currently about the pathogenesis of ocular and genital infections caused by this mucosal pathogen. We also discuss novel insights into the pathogenesis of infections caused by the genital and ocular serovars of C. trachomatis, including a discussion of both pathogen and host factors, such as the human microbiota at these mucosal sites, as well as the current immunological challenges facing vaccine development. Finally, we discuss the current progress toward development of a vaccine against C. trachomatis. A wide range of recombinant protein antigens are being identified and, hence, are available for vaccine trials. A plasmid-free live strain has recently been produced and evaluated in the mouse (Chlamydia muridarum) and monkey (C. trachomatis) models. The data for ocular infections in the monkey model were particularly encouraging, although the path to regulatory approval of a live vaccine is

  9. B cells moderate inflammatory progression and enhance bacterial containment upon pulmonary challenge with Mycobacterium tuberculosis.

    Science.gov (United States)

    Maglione, Paul J; Xu, Jiayong; Chan, John

    2007-06-01

    Though much is known about the function of T lymphocytes in the adaptive immune response against Mycobacterium tuberculosis, comparably little is understood regarding the corresponding role of B lymphocytes. Indicating B cells as components of lymphoid neogenesis during pulmonary tuberculosis, we have identified ectopic germinal centers (GCs) in the lungs of infected mice. B cells in these pulmonary lymphoid aggregates express peanut agglutinin and GL7, two markers of GC B cells, as well as CXCR5, and migrate in response to the lymphoid-associated chemokine CXCL13 ex vivo. CXCL13 is negatively regulated by the presence of B cells, as its production is elevated in lungs of B cell-deficient (B cell(-/-)) mice. Upon aerosol with 100 CFU of M. tuberculosis Erdman, B cell(-/-) mice have exacerbated immunopathology corresponding with elevated pulmonary recruitment of neutrophils. Infected B cell(-/-) mice show increased production of IL-10 in the lungs, whereas IFN-gamma, TNF-alpha, and IL-10R remain unchanged from wild type. B cell(-/-) mice have enhanced susceptibility to infection when aerogenically challenged with 300 CFU of M. tuberculosis corresponding with elevated bacterial burden in the lungs but not in the spleen or liver. Adoptive transfer of B cells complements the phenotypes of B cell(-/-) mice, confirming a role for B cells in both modulation of the host response and optimal containment of the tubercle bacillus. As components of ectopic GCs, moderators of inflammatory progression, and enhancers of local immunity against bacterial challenge, B cells may have a greater role in the host defense against M. tuberculosis than previously thought.

  10. Developing E-Governance in the Eurasian Economic Union: Progress, Challenges and Prospects

    Directory of Open Access Journals (Sweden)

    Lyudmila Vidiasova

    2017-03-01

    Full Text Available The article provides an overview of e-governance development in the members of the Eurasian Economic Union (EEU). There is a lack of integrated research on e-governance in the EEU countries, although given the strengthening of this regional bloc, new information and communication technologies (ICT) could serve as a significant growth driver. Given the history and specifics of regional cooperation in the post-Soviet space and international best practices in ICT use in regional blocs, this article reviews the development of e-governance in the EEU members. The research methodology was based on a three-stage concept of regionalism [Van Langenhov, Coste, 2005]. The study examines three key components: progress in developing e-governance, barriers to that development and future prospects. It used qualitative and quantitative methods. Data sources included the results of the United Nations E-Government rating, EEU countries’ regulations based on 3,200 documents and the authors’ expert survey, in which 18 experts (12 EEU representatives and six international experts) participated. The study revealed the progress made by EEU countries in improving technological development and reducing human capital development indicators. The survey identified key barriers to e-governance development in the EEU: low motivation and information technology skills among civil servants, and citizens’ low computer literacy. The analysis of EEU members’ national economic priorities revealed a common focus on ICT development. The authors concluded that prospects for e-governance in the EEU were associated with strengthening regional cooperation in standardizing information systems, implementing one-stop-shop services, managing electronic documents and expanding online services. The authors presented two areas for developing e-governance within the EEU. The first is external integration, which, if strengthened, would affect the economy positively and optimize business processes

  11. Advanced Scientific Computing Environment Team new scientific database management task. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Church, J.P.; Roberts, J.C.; Sims, R.N.; Smetana, A.O.; Westmoreland, B.W.

    1991-06-01

    The mission of the ASCENT Team is to continually keep pace with, evaluate, and select emerging computing technologies to define and implement prototypic scientific environments that maximize the ability of scientists and engineers to manage scientific data. These environments are to be implemented in a manner consistent with the site computing architecture and standards and NRTSC/SCS strategic plans for scientific computing. The major trends in computing hardware and software technology clearly indicate that the future "computer" will be a network environment that comprises supercomputers, graphics boxes, mainframes, clusters, workstations, terminals, and microcomputers. This "network computer" will have an architecturally transparent operating system allowing the applications code to run on any box supplying the required computing resources. The environment will include a distributed database and database managing system(s) that permits use of relational, hierarchical, object-oriented, GIS, and other databases. To reach this goal requires a stepwise progression from the present assemblage of monolithic applications codes running on disparate hardware platforms and operating systems. The first steps include converting from the existing JOSHUA system to a new J80 system that complies with modern language standards; development of a new J90 prototype to provide JOSHUA capabilities on Unix platforms; development of portable graphics tools to greatly facilitate preparation of input and interpretation of output; and extension of "Jvv" concepts and capabilities to distributed and/or parallel computing environments.

  12. Progress of anti-vascular endothelial growth factor therapy for ocular neovascular disease: benefits and challenges

    Institute of Scientific and Technical Information of China (English)

    Xu Jianjiang; Li Yimin; Hong Jiaxu

    2014-01-01

    Objective: This review aims to summarize the progress of current clinical studies in ocular angiogenesis treated with anti-vascular endothelial growth factor (VEGF) therapy and to discuss the benefits and challenges of the treatment. Data sources: PubMed, Embase and the Cochrane Library were searched with no limitations of language and year of publication. Study selection: Clinical trials and case studies presented at medical conferences and published in peer-reviewed literature in the past decade were reviewed. Results: Anti-VEGF agents have manifested great potential and promising outcomes in treating ocular neovascularization, though some of them are still used as off-label drugs. Intravitreal injection of anti-VEGF agents can be accompanied by devastating ocular or systemic complications, and close monitoring in both adult and pediatric populations is warranted. Future directions should focus on carrying out more well-designed large-scale controlled trials, promoting sustained duration of action, and developing a safer and more efficient generation of anti-VEGF agents. Conclusions: Anti-VEGF treatment has proved to be beneficial in treating both anterior and posterior neovascular ocular diseases. However, safer and more affordable antiangiogenic agents and regimens remain to be explored.

  13. Using chemical biology to assess and modulate mitochondria: progress and challenges

    Science.gov (United States)

    Murphy, Michael P.

    2017-01-01

    Our understanding of the role of mitochondria in biomedical sciences has expanded considerably over the past decade. In addition to their well-known metabolic roles, mitochondria are also central to signalling for various processes through the generation of signals such as ROS and metabolites that affect cellular homeostasis, as well as other processes such as cell death and inflammation. Thus, mitochondrial function and dysfunction are central to the health and fate of the cell. Consequently, there is considerable interest in better understanding and assessing the many roles of mitochondria. Furthermore, there is also a growing realization that mitochondria are a promising drug target in a wide range of pathologies. The application of interdisciplinary approaches at the interface between chemistry and biology is opening up new opportunities to understand mitochondrial function and to assess the role of the organelle in biology. This work and the experience thus gained are leading to the development of new classes of therapies. Here, we overview the progress that has been made to date in exploring the chemical biology of the organelle and then focus on future challenges and opportunities that face this rapidly developing field. PMID:28382206

  14. Safety risk management of underground engineering in China:Progress, challenges and strategies

    Institute of Scientific and Technical Information of China (English)

    Qihu Qian; Peng Lin

    2016-01-01

    Underground construction in China is featured by large scale, high speed, long construction period, complex operation and frustrating situations regarding project safety. Various accidents have been reported from time to time, resulting in serious social impact and huge economic loss. This paper presents the main progress in the safety risk management of underground engineering in China over the last decade, i.e. (1) establishment of laws and regulations for safety risk management of underground engineering, (2) implementation of the safety risk management plan, (3) establishment of decision support system for risk management and early-warning based on information technology, and (4) strengthening the study on safety risk management, prediction and prevention. Based on the analysis of the typical accidents in China in the last decade, the new challenges in the safety risk management for underground engineering are identified as follows: (1) control of unsafe human behaviors; (2) technological innovation in safety risk management; and (3) design of safety risk management regulations. Finally, the strategies for safety risk management of underground engineering in China are proposed in six aspects, i.e. the safety risk management system and policy, law, administration, economy, education and technology.

  15. Recent progress and future challenges in algal biofuel production [version 1; referees: 4 approved]

    Directory of Open Access Journals (Sweden)

    Jonathan B. Shurin

    2016-10-01

    Full Text Available Modern society is fueled by fossil energy produced millions of years ago by photosynthetic organisms. Cultivating contemporary photosynthetic producers to generate energy and capture carbon from the atmosphere is one potential approach to sustaining society without disrupting the climate. Algae, photosynthetic aquatic microorganisms, are the fastest growing primary producers in the world and can therefore produce more energy with less land, water, and nutrients than terrestrial plant crops. We review recent progress and challenges in developing bioenergy technology based on algae. A variety of high-value products in addition to biofuels can be harvested from algal biomass, and these may be key to developing algal biotechnology and realizing the commercial potential of these organisms. Aspects of algal biology that differentiate them from plants demand an integrative approach based on genetics, cell biology, ecology, and evolution. We call for a systems approach to research on algal biotechnology rooted in understanding their biology, from the level of genes to ecosystems, and integrating perspectives from physical, chemical, and social sciences to solve one of the most critical outstanding technological problems.

  16. Progress in Overcoming Materials Challenges with S-CO2 RCBCs

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Matthew S. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Materials Chemistry Group; Kruizenga, Alan Michael [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Materials Chemistry Group; Weck, Philippe F. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Storage and Transportation Technologies Group; Withey, Elizabeth Ann [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Materials Chemistry Group; Fleming, Darryn D. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Advanced Nuclear Concepts Group

    2016-09-01

    The supercritical carbon dioxide (S-CO2) Brayton Cycle has gained significant attention in the last decade as an advanced power cycle capable of achieving high efficiency power conversion. Sandia National Laboratories, with support from the U.S. Department of Energy Office of Nuclear Energy (US DOE-NE), has been conducting research and development in order to deliver a technology that is ready for commercialization. There are a wide range of materials-related challenges that must be overcome to ensure the success of this technology. At Sandia, recent work has focused on the following main areas: (1) investigating the potential for system cost reduction through the introduction of low-cost alloys in low temperature loop sections; (2) identifying material options for 10 MW RCBC systems; (3) understanding and resolving turbine degradation; (4) identifying gas foil bearing behavior in CO2; and (5) identifying the influence of gas chemistry on alloy corrosion. Progress in each of these areas is detailed in this report.

  17. Safety risk management of underground engineering in China: Progress, challenges and strategies

    Directory of Open Access Journals (Sweden)

    Qihu Qian

    2016-08-01

    Full Text Available Underground construction in China is featured by large scale, high speed, long construction period, complex operation and frustrating situations regarding project safety. Various accidents have been reported from time to time, resulting in serious social impact and huge economic loss. This paper presents the main progress in the safety risk management of underground engineering in China over the last decade, i.e. (1) establishment of laws and regulations for safety risk management of underground engineering, (2) implementation of the safety risk management plan, (3) establishment of decision support system for risk management and early-warning based on information technology, and (4) strengthening the study on safety risk management, prediction and prevention. Based on the analysis of the typical accidents in China in the last decade, the new challenges in the safety risk management for underground engineering are identified as follows: (1) control of unsafe human behaviors; (2) technological innovation in safety risk management; and (3) design of safety risk management regulations. Finally, the strategies for safety risk management of underground engineering in China are proposed in six aspects, i.e. the safety risk management system and policy, law, administration, economy, education and technology.

  18. Exsolution microstructures in ultrahigh-pressure rocks:Progress, controversies and challenges

    Institute of Scientific and Technical Information of China (English)

    LIU Liang; YANG JiaXi; ZHANG JunFeng; CHEN DanLing; WANG Chao; YANG WenQiang

    2009-01-01

    Exsolution microstructures in minerals of rocks from orogenic belts played an important role in the recognition of ultrahigh-pressure (UHP) metamorphism in their host rocks by defining the subduction depth and improving our understanding of the dynamics during the subduction and exhumation of UHP rocks. However, it is a challenging scientific topic to distinguish the 'exsolution microstructures' from the 'non-exsolution microstructures' and decipher their geological implications. This paper describes the subtle differences between the 'exsolution microstructures' and the 'non-exsolution microstructures' and summarizes the progress in studies of exsolution microstructures from UHP rocks and mantle rocks of ultra-deep origin. We emphasize distinguishing the 'exsolution microstructures' from the 'non-exsolution microstructures' based on their geometric topotaxy and chemistry. In order to decipher correctly the exsolution microstructures, it is crucial to understand the changes in the chemistry and habits of host minerals with pressure and temperature. Therefore, it is important to combine observations of exsolution microstructures in natural rocks with experimental results at high pressure and temperature and results of micro-scale analyses. Such studies will improve our understanding of UHP metamorphism and cast new light on solid geoscience issues such as deep subduction of continental crusts and crust-mantle interactions.

  19. Coupled Atmosphere-Wave-Ocean Modeling of Tropical Cyclones: Progress, Challenges, and Ways Forward

    Science.gov (United States)

    Chen, Shuyi

    2015-04-01

    … It is found that the air-sea fluxes are quite asymmetric around a storm, with complex features representing various air-sea interaction processes in TCs. A unique observation in Typhoon Fanapi is the development of a stable boundary layer in the near-storm cold wake region, which has a direct impact on TC inner core structure and intensity. Despite this progress, challenges remain. Air-sea momentum exchange at wind speeds greater than 30-40 m/s is largely unresolved. Directional wind-wave stress and wave-current stress are difficult to determine from observations. Effects of sea spray on the air-sea fluxes are still not well understood. This talk will provide an overview of progress made in recent years, the challenges we are facing, and ways forward. An integrated coupled observational and atmosphere-wave-ocean modeling system is urgently needed, in which coupled model development and targeted observations from field campaigns and lab measurements together form the core of the research and prediction system. Another important aspect is that fully coupled models provide explicit, integrated impact forecasts of wind, rain, waves, ocean currents and surges in TCs and winter storms, which are missing in most current NWP models. This requires a new strategy for model development, evaluation, and verification. Ensemble forecasts using high-resolution coupled atmosphere-wave-ocean models can provide probabilistic forecasts and quantitative uncertainty estimates, which also allow us to explore new methodologies to verify probabilistic impact forecasts and evaluate model physics using a stochastic approach. Examples of such an approach in TCs, including Superstorm Sandy, will be presented.

  20. South Carolina: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  1. Texas: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  2. Delaware: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  3. Kentucky: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  4. Florida: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP, high school graduation rates, and access…

  5. Virginia: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP, high school graduation rates, and access…

  6. Mississippi: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  7. Alabama: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP, high school graduation rates, and access…

  8. Arkansas: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  9. Oklahoma: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  10. North Carolina: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  11. Louisiana: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  12. Tennessee: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  13. Maryland: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  14. Georgia: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  15. West Virginia: Taking Stock and Pushing Forward. 2014 State Progress Report on the Challenge to Lead 2020, Goals for Education

    Science.gov (United States)

    Southern Regional Education Board (SREB), 2014

    2014-01-01

    "Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…

  16. Progress and challenges in electrically pumped GaN-based VCSELs

    Science.gov (United States)

    Haglund, A.; Hashemi, E.; Bengtsson, J.; Gustavsson, J.; Stattin, M.; Calciati, M.; Goano, M.

    2016-04-01

    The Vertical-Cavity Surface-Emitting Laser (VCSEL) is an established optical source in short-distance optical communication links, computer mice and tailored infrared power heating systems. Its low power consumption, easy integration into two-dimensional arrays, and low-cost manufacturing also make this type of semiconductor laser suitable for application in areas such as high-resolution printing, medical applications, and general lighting. However, these applications require emission wavelengths in the blue-UV instead of the established infrared regime, which can be achieved by using GaN-based instead of GaAs-based materials. The development of GaN-based VCSELs is challenging, but during recent years several groups have managed to demonstrate electrically pumped GaN-based VCSELs with close to 1 mW of optical output power and threshold current densities between 3-16 kA/cm2. The performance is limited by challenges such as achieving high-reflectivity mirrors, vertical and lateral carrier confinement, efficient lateral current spreading, accurate cavity length control and lateral optical mode confinement. This paper summarizes different strategies to solve these issues in electrically pumped GaN-VCSELs together with state-of-the-art results. We will highlight our work on combined transverse current and optical mode confinement, where we show that many structures used for current confinement result in unintentionally optically anti-guided resonators. Such resonators can have a very high optical loss, which easily doubles the threshold gain for lasing. We will also present an alternative to the use of distributed Bragg reflectors as high-reflectivity mirrors, namely TiO2/air high contrast gratings (HCGs). Fabricated HCGs of this type show a high reflectivity (>95%) over a 25 nm wavelength span.

  17. Opportunities and challenges of cloud computing to improve health care services.

    Science.gov (United States)

    Kuo, Alex Mu-Hsing

    2011-09-21

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed.

  18. High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations

    Science.gov (United States)

    Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.

    2003-01-01

    Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission-critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.

  20. Drugging specific conformational states of GPCRs: challenges and opportunities for computational chemistry.

    Science.gov (United States)

    Martí-Solano, Maria; Schmidt, Denis; Kolb, Peter; Selent, Jana

    2016-04-01

    Current advances in structural biology for membrane proteins support the existence of multiple G protein-coupled receptor (GPCR) conformations. These conformations can be associated with particular receptor states with definite coupling and signaling capacities. Drugging such receptor states represents an opportunity to discover a new generation of GPCR drugs with unprecedented specificity. However, exploiting recently available structural information to develop these drugs is still challenging. In this context, computational structure-based approaches can inform such drug development. In this review, we examine the potential of these approaches and the challenges they will need to overcome to guide the rational discovery of drugs targeting specific GPCR states.

  1. Medical image computing and computer-aided medical interventions applied to soft tissues. Work in progress in urology

    CERN Document Server

    Troccaz, Jocelyne; Berkelman, Peter; Cinquin, Philippe; Daanen, Vincent; Leroy, Antoine; Marchal, Maud; Payan, Yohan; Promayon, Emmanuel; Voros, Sandrine; Bart, Stéphane; Bolla, Michel; Chartier-Kastler, Emmanuel; Descotes, Jean-Luc; Dusserre, Andrée; Giraud, Jean-Yves; Long, Jean-Alexandre; Moalic, Ronan; Mozer, Pierre

    2006-01-01

    Until recently, Computer-Aided Medical Interventions (CAMI) and Medical Robotics have focused on rigid and non-deformable anatomical structures. Nowadays, special attention is paid to soft tissues, raising complex issues due to their mobility and deformation. Minimally invasive digestive surgery was probably one of the first fields where soft tissues were handled, through the development of simulators, tracking of anatomical structures and specific assistance robots. However, other clinical domains, for instance urology, are concerned. Indeed, laparoscopic surgery, new tumour destruction techniques (e.g. HIFU, radiofrequency, or cryoablation), increasingly early detection of cancer, and the use of interventional and diagnostic imaging modalities have recently opened new challenges to the urologist and to scientists involved in CAMI. This has resulted over the last five years in a very significant increase in research and development of computer-aided urology systems. In this paper, we propose a description of the main problems rel...

  2. Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling

    Science.gov (United States)

    Ormsbee, L.; Tufail, M.

    2005-12-01

    The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in the area of water resources and environmental systems. These range from the development of data-driven models to optimal control strategies that assist in a more informed and intelligent decision-making process. Availability of data is critical to such applications, and scarce data may lead to models that do not represent the response function over the entire domain, while too much data has a tendency to over-constrain the problem. This paper will describe the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The applications include the development of inductive models as substitutes for more complex process-based models to predict the water quality of key constituents (such as dissolved oxygen) and their use in an optimization framework for optimal load reductions; a minimal sketch of this coupling appears below. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. Some of the challenges faced in this application include 1) uncertainty in data sets, 2) model application, and 3) development of cause-and-effect relationships between water quality constituents and watershed parameters through the use of inductive models. The paper will discuss these challenges and how they affect the desired goals of the project.
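
    As a sketch of how two of these constituents can be combined (our illustration; the quadratic surrogate below is a hypothetical stand-in for a model trained on watershed monitoring data, and the dissolved oxygen target and GA settings are assumptions), a genetic algorithm searches for the smallest load reduction whose predicted dissolved oxygen still meets a target:

    ```python
    import random

    def surrogate_do(load_reduction):
        """Hypothetical stand-in for a trained data-driven model (e.g., a neural
        network) predicting dissolved oxygen (mg/L) from a watershed load
        reduction fraction in [0, 1]. Purely illustrative coefficients."""
        return 4.0 + 3.5 * load_reduction - 1.0 * load_reduction ** 2

    def fitness(x, do_target=5.0):
        """Prefer the smallest load reduction that still meets the DO target;
        penalise constraint violations instead of discarding candidates."""
        penalty = max(0.0, do_target - surrogate_do(x)) * 100.0
        return x + penalty                      # lower is better

    def genetic_search(pop_size=30, generations=60, mut_sd=0.05, seed=1):
        rng = random.Random(seed)
        pop = [rng.random() for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness)
            survivors = pop[: pop_size // 2]    # truncation selection
            children = []
            while len(survivors) + len(children) < pop_size:
                a, b = rng.sample(survivors, 2)
                child = 0.5 * (a + b)           # arithmetic crossover
                child += rng.gauss(0.0, mut_sd) # Gaussian mutation
                children.append(min(1.0, max(0.0, child)))
            pop = survivors + children
        return min(pop, key=fitness)

    if __name__ == "__main__":
        best = genetic_search()
        print(f"smallest reduction meeting target: {best:.3f}")
        print(f"predicted DO at that reduction:    {surrogate_do(best):.2f} mg/L")
    ```

    The cheap surrogate is what makes the thousands of fitness evaluations in the GA loop affordable; a real application would tune the operators and validate the surrogate against the process-based model it replaces.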

  3. The DOE Accelerated Strategic Computing Initiative: Challenges and opportunities for predictive materials simulation capabilities

    Science.gov (United States)

    Mailhiot, Christian

    1998-05-01

    In response to the unprecedented national security challenges emerging from the end of nuclear testing, the Defense Programs of the Department of Energy has developed a long-term strategic plan based on a vigorous Science-Based Stockpile Stewardship (SBSS) program. The main objective of the SBSS program is to ensure confidence in the performance, safety, and reliability of the stockpile on the basis of a fundamental science-based approach. A central element of this approach is the development of predictive, ‘full-physics’, full-scale computer simulation tools. As a critical component of the SBSS program, the Accelerated Strategic Computing Initiative (ASCI) was established to provide the required advances in computer platforms and to enable predictive, physics-based simulation capabilities. In order to achieve the ASCI goals, fundamental problems in the fields of computer and physical sciences of great significance to the entire scientific community must be successfully solved. Foremost among the key elements needed to develop predictive simulation capabilities, the development of improved physics-based materials models is a cornerstone. We indicate some of the materials theory, modeling, and simulation challenges and illustrate how the ASCI program will enable both the hardware and the software tools necessary to advance the state-of-the-art in the field of computational condensed matter and materials physics.

  4. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2009-10-01

    This report is an account of the deliberations and conclusions of the workshop on "Forefront Questions in Nuclear Science and the Role of High Performance Computing" held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing Research (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to 1) identify forefront scientific challenges in nuclear physics and then determine which, if any, of these could be aided by high performance computing at the extreme scale; 2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; 3) provide nuclear physicists the opportunity to influence the development of high performance computing; and 4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  5. Autism research and services for young children: history, progress and challenges.

    Science.gov (United States)

    Thompson, Travis

    2013-03-01

    For three decades after Leo Kanner's first clinical description, research progress in understanding and treating autism was minimal, but since the late 1960s the growth of autism discoveries has been exponential, with a remarkable number of new findings published over the past two decades in particular. These advances were made possible first by the discovery and dissemination of early intensive behavioural intervention (EIBI) for young children with autism, which created the impetus for earlier accurate diagnosis. Other factors influencing the rapid growth in autism research were the first widely accepted diagnostic instruments for autism, the Autism Diagnostic Interview (ADI) and the Autism Diagnostic Observation Schedule (ADOS). Developments in brain imaging and genetic technology combined to create a fuller understanding of the heterogeneity of autism, its multiple aetiologies, very early onset and course, and strategies for treatment. For a significant proportion of children with autism, it appears EIBI may be capable of promoting brain connectivity in specific cerebral areas, which is one of autism's underlying challenges. Disagreements about the most appropriate early intervention approach between developmental and behavioural psychologists have been unproductive and have not contributed to advancing the field. Naturalistic behavioural and structured discrete trial methods are being integrated with developmental strategies with promising outcomes. Over these past 30 years, young people with autism have gone from receiving essentially no proactive treatment, resulting in lives languishing in institutions, to today, when half of children receiving EIBI treatment subsequently participate in regular classrooms alongside their peers. The future has entirely changed for young people with autism. Autism has become an eminently treatable condition. The time is overdue to set aside philosophical quarrels regarding theories of child development and apply what we know for the benefit of children with autism.

  6. Progress and challenges in the development of a cell-based therapy for hemophilia A.

    Science.gov (United States)

    Fomin, M E; Togarrati, P P; Muench, M O

    2014-12-01

    Hemophilia A results from an insufficiency of factor VIII (FVIII). Although replacement therapy with plasma-derived or recombinant FVIII is a life-saving therapy for hemophilia A patients, such therapy is a life-long treatment rather than a cure for the disease. In this review, we discuss the possibilities, progress, and challenges that remain in the development of a cell-based cure for hemophilia A. The success of cell therapy depends on the type and availability of donor cells, the age of the host and method of transplantation, and the levels of engraftment and production of FVIII by the graft. Early therapy, possibly even prenatal transplantation, may yield the highest levels of engraftment by avoiding immunological rejection of the graft. Potential cell sources of FVIII include a specialized subset of endothelial cells known as liver sinusoidal endothelial cells (LSECs) present in the adult and fetal liver, or patient-specific endothelial cells derived from induced pluripotent stem cells that have undergone gene editing to produce FVIII. Achieving sufficient engraftment of transplanted LSECs is one of the obstacles to successful cell therapy for hemophilia A. We discuss recent results from transplants performed in animals that show production of functional and clinically relevant levels of FVIII obtained from donor LSECs. Hence, the possibility of treating hemophilia A can be envisioned through persistent production of FVIII from transplanted donor cells derived from a number of potential cell sources or through creation of donor endothelial cells from patient-specific induced pluripotent stem cells. © 2014 International Society on Thrombosis and Haemostasis.

  7. NATIONWIDE NATURAL RESOURCE INVENTORY OF THE PHILIPPINES USING LIDAR: STRATEGIES, PROGRESS, AND CHALLENGES

    Directory of Open Access Journals (Sweden)

    A. C. Blanco

    2016-06-01

    The Philippines has embarked on a detailed nationwide natural resource inventory using LiDAR through the Phil-LiDAR 2 Program. This 3-year program has developed and has been implementing mapping methodologies and protocols to produce high-resolution maps of agricultural, forest, coastal marine, hydrological features, and renewable energy resources. The Program has adopted strategies on system and process development, capacity building and enhancement, and expanding the network of collaborations. These strategies include training programs (on point cloud and image processing, GIS, and field surveys), workshops, forums, and colloquiums (program-wide, cluster-based, and project-based), and collaboration with partner national government agencies and other organizations. In place is a cycle of training, implementation, and feedback in order to continually improve the system and processes. To date, the Program has achieved progress in the development of workflows and in rolling out products such as resource maps and GIS data layers, which are indispensable in planning and decision-making. Challenges remain in speeding up output production (including quality checks) and in ensuring sustainability considering the short duration of the program. Enhancements in the workflows and protocols have been incorporated to address data quality and data availability issues. More trainings have been conducted for project staff hired to address human resource gaps. Collaborative arrangements with more partners are being established. To attain sustainability, the Program is developing and instituting a system of training, data updating and sharing, information utilization, and feedback. This requires collaboration and cooperation of the government agencies, LGUs, universities, other organizations, and the communities.

  8. Nationwide Natural Resource Inventory of the Philippines Using Lidar: Strategies, Progress, and Challenges

    Science.gov (United States)

    Blanco, A. C.; Tamondong, A.; Perez, A. M.; Ang, M. R. C.; Paringit, E.; Alberto, R.; Alibuyog, N.; Aquino, D.; Ballado, A.; Garcia, P.; Japitana, M.; Ignacio, M. T.; Macandog, D.; Novero, A.; Otadoy, R. E.; Regis, E.; Rodriguez, M.; Silapan, J.; Villar, R.

    2016-06-01

    The Philippines has embarked on a detailed nationwide natural resource inventory using LiDAR through the Phil-LiDAR 2 Program. This 3-year program has developed and has been implementing mapping methodologies and protocols to produce high-resolution maps of agricultural, forest, coastal marine, hydrological features, and renewable energy resources. The Program has adopted strategies on system and process development, capacity building and enhancement, and expanding the network of collaborations. These strategies include training programs (on point cloud and image processing, GIS, and field surveys), workshops, forums, and colloquiums (program-wide, cluster-based, and project-based), and collaboration with partner national government agencies and other organizations. In place is a cycle of training, implementation, and feedback in order to continually improve the system and processes. To date, the Program has achieved progress in the development of workflows and in rolling out products such as resource maps and GIS data layers, which are indispensable in planning and decision-making. Challenges remain in speeding up output production (including quality checks) and in ensuring sustainability considering the short duration of the program. Enhancements in the workflows and protocols have been incorporated to address data quality and data availability issues. More trainings have been conducted for project staff hired to address human resource gaps. Collaborative arrangements with more partners are being established. To attain sustainability, the Program is developing and instituting a system of training, data updating and sharing, information utilization, and feedback. This requires collaboration and cooperation of the government agencies, LGUs, universities, other organizations, and the communities.

  9. High-performance computing, high-speed networks, and configurable computing environments: progress toward fully distributed computing.

    Science.gov (United States)

    Johnston, W E; Jacobson, V L; Loken, S C; Robertson, D W; Tierney, B L

    1992-01-01

    The next several years will see the maturing of a collection of technologies that will enable fully and transparently distributed computing environments. Networks will be used to configure independent computing, storage, and I/O elements into "virtual systems" that are optimal for solving a particular problem. This environment will make the most powerful computing systems those that are logically assembled from network-based components and will also make those systems available to a widespread audience. Anticipating that the necessary technology and communications infrastructure will be available in the next 3 to 5 years, we are developing and demonstrating prototype applications that test and exercise the currently available elements of this configurable environment. The Lawrence Berkeley Laboratory (LBL) Information and Computing Sciences and Research Medicine Divisions have collaborated with the Pittsburgh Supercomputer Center to demonstrate one distributed application that illuminates the issues and potential of using networks to configure virtual systems. This application allows the interactive visualization of large three-dimensional (3D) scalar fields (voxel data sets) by using a network-based configuration of heterogeneous supercomputers and workstations. The specific test case is visualization of 3D magnetic resonance imaging (MRI) data. The virtual system architecture consists of a Connection Machine-2 (CM-2) that performs surface reconstruction from the voxel data, a Cray Y-MP that renders the resulting geometric data into an image, and a workstation that provides the display of the image and the user interface for specifying the parameters for the geometry generation and 3D viewing. These three elements are configured into a virtual system by using several different network technologies. This paper reviews the current status of the software, hardware, and communications technologies that are needed to enable this configurable environment. These

  10. Bayesian approaches to spatial inference: Modelling and computational challenges and solutions

    Science.gov (United States)

    Moores, Matthew; Mengersen, Kerrie

    2014-12-01

    We discuss a range of Bayesian modelling approaches for spatial data and investigate some of the associated computational challenges. This paper commences with a brief review of Bayesian mixture models and Markov random fields, with enabling computational algorithms including Markov chain Monte Carlo (MCMC) and integrated nested Laplace approximation (INLA). Following this, we focus on the Potts model as a canonical approach, and discuss the challenge of estimating the inverse temperature parameter that controls the degree of spatial smoothing. We compare three approaches to addressing the doubly intractable nature of the likelihood, namely pseudo-likelihood, path sampling and the exchange algorithm. These techniques are applied to satellite data used to analyse water quality in the Great Barrier Reef.
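
    As an illustration of the pseudo-likelihood approach mentioned above (one of the three alternatives discussed, alongside path sampling and the exchange algorithm), the sketch below grid-searches the Potts inverse temperature that maximizes the sum of log conditional probabilities over a labelled lattice with a 4-neighbour structure. The segmentation and parameter values are invented; this is a minimal sketch, not the authors' implementation.

```python
# Hedged sketch: maximum pseudo-likelihood estimation of the Potts inverse
# temperature `beta` from a labelled image, via a simple grid search.
import numpy as np

def log_pseudo_likelihood(labels, beta, k):
    """Sum over pixels of log p(z_i | neighbours) under the Potts model."""
    h, w = labels.shape
    lpl = 0.0
    for i in range(h):
        for j in range(w):
            nbrs = []
            if i > 0:     nbrs.append(labels[i - 1, j])
            if i < h - 1: nbrs.append(labels[i + 1, j])
            if j > 0:     nbrs.append(labels[i, j - 1])
            if j < w - 1: nbrs.append(labels[i, j + 1])
            # sufficient statistic: number of neighbours sharing each label
            counts = np.bincount(nbrs, minlength=k)
            lpl += beta * counts[labels[i, j]] - np.log(np.exp(beta * counts).sum())
    return lpl

rng = np.random.default_rng(0)
k = 3                                          # number of mixture components
labels = rng.integers(0, k, size=(30, 30))     # stand-in for a real segmentation
betas = np.linspace(0.0, 2.0, 41)
beta_hat = max(betas, key=lambda b: log_pseudo_likelihood(labels, b, k))
print("pseudo-likelihood estimate of beta:", beta_hat)
```

    In a real analysis the labels would come from the fitted mixture model rather than being random, and the pseudo-likelihood estimate would be compared against the path-sampling and exchange-algorithm estimates.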

  11. Comparison of progressive addition lenses for general purpose and for computer vision: an office field study.

    Science.gov (United States)

    Jaschinski, Wolfgang; König, Mirjam; Mekontso, Tiofil M; Ohlendorf, Arne; Welscher, Monique

    2015-05-01

    Two types of progressive addition lenses (PALs) were compared in an office field study: 1. General purpose PALs with continuous clear vision between infinity and near reading distances and 2. Computer vision PALs with a wider zone of clear vision at the monitor and in near vision but no clear distance vision. Twenty-three presbyopic participants wore each type of lens for two weeks in a double-masked four-week quasi-experimental procedure that included an adaptation phase (Weeks 1 and 2) and a test phase (Weeks 3 and 4). Questionnaires on visual and musculoskeletal conditions as well as preferences regarding the type of lenses were administered. After eight more weeks of free use of the spectacles, the preferences were assessed again. The ergonomic conditions were analysed from photographs. Head inclination when looking at the monitor was significantly lower by 2.3 degrees with the computer vision PALs than with the general purpose PALs. Vision at the monitor was judged significantly better with computer PALs, while distance vision was judged better with general purpose PALs; however, the reported advantage of computer vision PALs differed in extent between participants. Accordingly, 61 per cent of the participants preferred the computer vision PALs, when asked without information about lens design. After full information about lens characteristics and additional eight weeks of free spectacle use, 44 per cent preferred the computer vision PALs. On average, computer vision PALs were rated significantly better with respect to vision at the monitor during the experimental part of the study. In the final forced-choice ratings, approximately half of the participants preferred either the computer vision PAL or the general purpose PAL. Individual factors seem to play a role in this preference and in the rated advantage of computer vision PALs. © 2015 The Authors. Clinical and Experimental Optometry © 2015 Optometry Australia.

  12. Experimental Actinobacillus pleuropneumoniae challenge in swine: comparison of computed tomographic and radiographic findings during disease.

    Science.gov (United States)

    Brauer, Carsten; Hennig-Pauka, Isabel; Hoeltig, Doris; Buettner, Falk F R; Beyerbach, Martin; Gasse, Hagen; Gerlach, Gerald-F; Waldmann, Karl-H

    2012-04-30

    In pigs, diseases of the respiratory tract like pleuropneumonia due to Actinobacillus pleuropneumoniae (App) infection have led to high economic losses for decades. Further research on disease pathogenesis, pathogen-host-interactions and new prophylactic and therapeutic approaches are needed. In most studies, a large number of experimental animals are required to assess lung alterations at different stages of the disease. In order to reduce the required number of animals but nevertheless gather information on the nature and extent of lung alterations in living pigs, a computed tomographic scoring system for quantifying gross pathological findings was developed. In this study, five healthy pigs served as control animals while 24 pigs were infected with App, the causative agent of pleuropneumonia in pigs, in an established model for respiratory tract disease. Computed tomographic (CT) findings during the course of App challenge were verified by radiological imaging, clinical, serological, gross pathology and histological examinations. Findings from clinical examinations and both CT and radiological imaging were recorded on day 7 and day 21 after challenge. Clinical signs after experimental App challenge were indicative of acute to chronic disease. Lung CT findings of infected pigs comprised ground-glass opacities and consolidation. On day 7 and 21 the clinical scores significantly correlated with the scores of both imaging techniques. At day 21, significant correlations were found between clinical scores, CT scores and lung lesion scores. In 19 out of 22 challenged pigs the determined disease grades (not affected, slightly affected, moderately affected, severely affected) from CT and gross pathological examination were in accordance. Disease classification by radiography and gross pathology agreed in 11 out of 24 pigs. High-resolution, high-contrast CT examination with no overlapping of organs is superior to radiography in the assessment of pneumonic lung lesions.

  13. Catalysis Research of Relevance to Carbon Management: Progress, Challenges, and Opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Arakawa, Hironori; Aresta, Michele; Armor, John; Barteau, Mark; Beckman, Eric J.; Bell, Alexis T.; Bercaw, John E.; Creutz, Carol; Dinjus, Eckhard; Dixon, David A.; Domen, Kazunari; Dubois, Daniel L.; Eckert, Juergen; Fujita, Etsuko; Gibson, Dorothy H.; Goddard, William A.; Goodman, Wayne D.; Keller, Jay; Kubas, Gregory J.; Kung, Harold H.; Lyons, James E.; Manzer, Leo; Marks, Tobin J.; Morokuma, Keiji; Nicholas, Kenneth M.; Periana, Roy; Que, Lawrence; Rostrup-Nielson, Jens; Sachtler, Woflgang M H.; Schmidt, Lanny D.; Sen, Ayusman; Somorjai, Gabor A.; Stair, Peter C.; Stults, Bailey R.; Tumas, William

    2001-04-11

    The goal of the 'Opportunities for Catalysis Research in Carbon Management' workshop was to review within the context of greenhouse gas/carbon issues the current state of knowledge, barriers to further scientific and technological progress, and basic scientific research needs in the areas of H₂ generation and utilization, light hydrocarbon activation and utilization, carbon dioxide activation, utilization, and sequestration, emerging techniques and research directions in relevant catalysis research, and in catalysis for more efficient transportation engines. Several overarching themes emerge from this review. First and foremost, there is a pressing need to better understand in detail the catalytic mechanisms involved in almost every process area mentioned above. This includes the structures, energetics, lifetimes, and reactivities of the species thought to be important in the key catalytic cycles. As much of this type of information as is possible to acquire would also greatly aid in better understanding perplexing, incomplete/inefficient catalytic cycles and in inventing new, efficient ones. The most productive way to attack such problems must include long-term, in-depth fundamental studies of both commercial and model processes, by conventional research techniques and, importantly, by applying various promising new physicochemical and computational approaches which would allow incisive, in situ elucidation of reaction pathways. There is also a consensus that more exploratory experiments, especially high-risk, unconventional catalytic and model studies, should be undertaken. Such an effort will likely require specialized equipment, instrumentation, and computational facilities. The most expeditious and cost-effective means to carry out this research would be by close coupling of academic, industrial, and national laboratory catalysis efforts worldwide. Completely new research approaches should be vigorously explored, ranging from novel compositions

  14. Scientific Grand Challenges: Crosscutting Technologies for Computing at the Exascale - February 2-4, 2010, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2011-02-06

    The goal of the "Scientific Grand Challenges - Crosscutting Technologies for Computing at the Exascale" workshop in February 2010, jointly sponsored by the U.S. Department of Energy’s Office of Advanced Scientific Computing Research and the National Nuclear Security Administration, was to identify the elements of a research and development agenda that will address these challenges and create a comprehensive exascale computing environment. This exascale computing environment will enable the science applications identified in the eight previously held Scientific Grand Challenges Workshop Series.

  15. Geant4 Hadronic Cascade Models and CMS Data Analysis: Computational Challenges in the LHC era

    CERN Document Server

    Heikkinen, Aatos

    This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis work to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, the computer technology of clusters, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions, up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in Geant4 simulation tool, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we es...

  16. Computational intelligence in wireless sensor networks recent advances and future challenges

    CERN Document Server

    Falcon, Rafael; Koeppen, Mario

    2017-01-01

    This book emphasizes the increasingly important role that Computational Intelligence (CI) methods are playing in solving a myriad of entangled Wireless Sensor Networks (WSN) related problems. The book serves as a guide for surveying several state-of-the-art WSN scenarios in which CI approaches have been employed. The reader finds in this book how CI has contributed to solving a wide range of challenging problems, ranging from balancing the cost and accuracy of heterogeneous sensor deployments to recovering from real-time sensor failures to detecting attacks launched by malicious sensor nodes and enacting CI-based security schemes. Network managers, industry experts, academicians and practitioners alike (mostly in computer engineering, computer science or applied mathematics) benefit from the spectrum of successful applications reported in this book. Senior undergraduate or graduate students may discover in this book some problems well suited for their own research endeavors.

  17. Characteristics detected on computed tomography angiography predict coronary artery plaque progression in non-culprit lesions

    Energy Technology Data Exchange (ETDEWEB)

    Tan, Ya Hang; Zhou, Jia Zhou; Zhou, Ying; Yang, Xiaobo; Yang, Jun Jie; Chen, Yun Dai [Dept. of Cardiology, Chinese PLA General Hospital, Beijing (China)

    2017-06-15

    This study sought to determine whether variables detected on coronary computed tomography angiography (CCTA) would predict plaque progression in non-culprit lesions (NCL). In this single-center trial, we analyzed 103 consecutive patients who were undergoing CCTA and percutaneous coronary intervention (PCI) for culprit lesions. Follow-up CCTA was scheduled 12 months after the PCI, and all patients were followed for 3 years after their second CCTA examination. High-risk plaque features and epicardial adipose tissue (EAT) volume were assessed by CCTA. Each NCL stenosis grade was compared visually between two CCTA scans to detect plaque progression, and patients were stratified into two groups based on this. Logistic regression analysis was used to evaluate the factors that were independently associated with plaque progression in NCLs. Time-to-event curves were compared using the log-rank statistic. Overall, 34 of 103 patients exhibited NCL plaque progression (33%). Logistic regression analyses showed that the NCL progression was associated with a history of ST-elevated myocardial infarction (odds ratio [OR] = 5.855, 95% confidence interval [CI] = 1.391–24.635, p = 0.016), follow-up low-density lipoprotein cholesterol level (OR = 6.832, 95% CI = 2.103–22.200, p = 0.001), baseline low-attenuation plaque (OR = 7.311, 95% CI = 1.242–43.028, p = 0.028) and EAT (OR = 1.015, 95% CI = 1.000–1.029, p = 0.044). Following the second CCTA examination, major adverse cardiac events (MACEs) were observed in 12 patients, and NCL plaque progression was significantly associated with future MACEs (log rank p = 0.006). Noninvasive assessment of NCLs by CCTA has potential prognostic value.
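
    For readers unfamiliar with how reported odds ratios of this kind arise, the sketch below fits a logistic regression by Newton's method on synthetic data loosely shaped like the study's predictors and reads odds ratios as exponentiated coefficients. The data, effect sizes and variable ordering are invented for illustration; this is not the study's analysis code.

```python
# Hedged sketch: logistic regression via Newton-Raphson on synthetic data,
# with odds ratios obtained as exp(coefficients).
import numpy as np

rng = np.random.default_rng(1)
n = 200
# hypothetical predictors: prior STEMI (0/1), follow-up LDL-C, low-attenuation
# plaque (0/1), epicardial adipose tissue volume
X = np.column_stack([
    rng.integers(0, 2, n),
    rng.normal(2.5, 0.6, n),
    rng.integers(0, 2, n),
    rng.normal(120, 30, n),
])
true_beta = np.array([1.2, 0.9, 1.4, 0.01])          # invented effect sizes
logits = -4.0 + X @ true_beta
y = rng.random(n) < 1 / (1 + np.exp(-logits))        # simulated progression labels

Xd = np.column_stack([np.ones(n), X])                # add intercept column
beta = np.zeros(Xd.shape[1])
for _ in range(25):                                  # Newton-Raphson iterations
    p = 1 / (1 + np.exp(-Xd @ beta))
    W = p * (1 - p)
    grad = Xd.T @ (y - p)                            # score vector
    hess = Xd.T @ (Xd * W[:, None])                  # observed information
    beta += np.linalg.solve(hess, grad)

odds_ratios = np.exp(beta[1:])                       # skip the intercept
print("odds ratios:", np.round(odds_ratios, 3))
```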

  18. Progression Analysis and Stage Discovery in Continuous Physiological Processes Using Image Computing

    Directory of Open Access Journals (Sweden)

    Ferrucci Luigi

    2010-01-01

    We propose an image computing-based method for quantitative analysis of continuous physiological processes that can be sensed by medical imaging and demonstrate its application to the analysis of morphological alterations of the bone structure, which correlate with the progression of osteoarthritis (OA). The purpose of the analysis is to quantitatively estimate OA progression in a fashion that can assist in understanding the pathophysiology of the disease. Ultimately, the texture analysis will be able to provide an alternative OA scoring method, which can potentially reflect the progression of the disease in a more direct fashion compared to the existing clinically utilized classification schemes based on radiology. This method can be useful not just for studying the nature of OA, but also for developing and testing the effect of drugs and treatments. While in this paper we demonstrate the application of the method to osteoarthritis, its generality makes it suitable for the analysis of other progressive clinical conditions that can be diagnosed and prognosed by using medical imaging.

  19. Diagnosis and Progression of Sacroiliitis in Repeated Sacroiliac Joint Computed Tomography

    Directory of Open Access Journals (Sweden)

    Mats Geijer

    2013-01-01

    Objective. To assess the clinical utility of repeat sacroiliac joint computed tomography (CT) in sacroiliitis by assessing the proportion of patients changing from normal to pathologic at CT and to which degree there is progression of established sacroiliitis at repeat CT. Methods. In a retrospective analysis of 334 patients (median age 34 years) with symptoms suggestive of inflammatory back pain, CT had been performed twice, in 47 of these thrice, and in eight patients four times. The studies were scored as normal, equivocal, unilateral sacroiliitis, or bilateral sacroiliitis. Results. There was no change in 331 of 389 repeat examinations. Ten patients (3.0%) had progressed from normal or equivocal to unilateral or bilateral sacroiliitis. Of 43 cases with sacroiliitis on the first study, 36 (83.7%) progressed markedly. Two normal cases had changed to equivocal. Eight equivocal cases were classified as normal on the repeat study. In two further patients, only small changes within the scoring grade equivocal were detected. Conclusions. CT is a valuable examination for diagnosis of sacroiliitis, but a repeated examination detects only a few additional cases of sacroiliitis. Most cases with already established sacroiliitis showed progression of disease.

  20. Key challenges and recent progress in batteries, fuel cells, and hydrogen storage for clean energy systems

    Science.gov (United States)

    Chalk, Steven G.; Miller, James F.

    Reducing or eliminating the dependency on petroleum of transportation systems is a major element of US energy research activities. Batteries are a key enabling technology for the development of clean, fuel-efficient vehicles and are key to making today's hybrid electric vehicles a success. Fuel cells are the key enabling technology for a future hydrogen economy and have the potential to revolutionize the way we power our nations, offering cleaner, more efficient alternatives to today's technology. Additionally, fuel cells are significantly more energy efficient than combustion-based power generation technologies. Fuel cells are projected to have energy efficiency twice that of internal combustion engines. However, before fuel cells can realize their potential, significant challenges remain. The two most important are cost and durability for both automotive and stationary applications. Recent electrocatalyst developments have shown that Pt alloy catalysts have increased activity and greater durability than Pt catalysts. The durability of conventional fluorocarbon membranes is improving, and hydrocarbon-based membranes have also shown promise of equaling the performance of fluorocarbon membranes at lower cost. Recent announcements have also provided indications that fuel cells can start from freezing conditions without significant deterioration. Hydrogen storage systems for vehicles are inadequate to meet customer driving range expectations (>300 miles or 500 km) without intrusion into vehicle cargo or passenger space. The United States Department of Energy has established three Centers of Excellence for hydrogen storage materials development. The centers are focused on complex metal hydrides that can be regenerated onboard a vehicle, chemical hydrides that require off-board reprocessing, and carbon-based storage materials. Recent developments have shown progress toward the 2010 DOE targets. In addition, DOE has established an independent storage material testing center

  1. Looking from Within: Prospects and Challenges for Progressive Education in Indonesia

    Science.gov (United States)

    Zulfikar, Teuku

    2013-01-01

    Many Indonesian scholars (Azra, 2002; Darmaningtyas, 2004; Yunus, 2004), have attempted to bring progressive education to their country. They believe that progressive practices such as critical thinking, critical dialogue and child-centered instruction will help students learn better. However, this implementation is resisted because of cultural…

  2. The Challenges and Benefits of Using Computer Technology for Communication and Teaching in the Geosciences

    Science.gov (United States)

    Fairley, J. P.; Hinds, J. J.

    2003-12-01

    The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, which has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.

  3. Transactivation of Epidermal Growth Factor Receptor by G Protein-Coupled Receptors: Recent Progress, Challenges and Future Research

    Directory of Open Access Journals (Sweden)

    Zhixiang Wang

    2016-01-01

    Both G protein-coupled receptors (GPCRs) and receptor tyrosine kinases (RTKs) regulate large signaling networks, control multiple cell functions and are implicated in many diseases including various cancers. Both of them are also top therapeutic targets for disease treatment. The discovery of the cross-talk between GPCRs and RTKs connects these two vast signaling networks and complicates the already complicated signaling networks that regulate cell signaling and function. In this review, we focus on the transactivation of the epidermal growth factor receptor (EGFR), a subfamily of RTKs, by GPCRs. Since the first report of EGFR transactivation by a GPCR, significant progress has been made, including the elucidation of the mechanisms underlying the transactivation. Here, we first provide a basic picture of GPCRs, EGFR and EGFR transactivation by GPCRs. We then discuss the progress made in the last five years and finally provide our view of the future challenges and the future research needed to overcome them.

  4. A Step Towards A Computing Grid For The LHC Experiments: ATLAS Data Challenge 1

    CERN Document Server

    Sturrock, R; Epp, B; Ghete, V M; Kuhn, D; Mello, A G; Caron, B; Vetterli, M C; Karapetian, G V; Martens, K; Agarwal, A; Poffenberger, P R; McPherson, R A; Sobie, R J; Amstrong, S; Benekos, N C; Boisvert, V; Boonekamp, M; Brandt, S; Casado, M P; Elsing, M; Gianotti, F; Goossens, L; Grote, M; Hansen, J B; Mair, K; Nairz, A; Padilla, C; Poppleton, A; Poulard, G; Richter-Was, Elzbieta; Rosati, S; Schörner-Sadenius, T; Wengler, T; Xu, G F; Ping, J L; Chudoba, J; Kosina, J; Lokajícek, M; Svec, J; Tas, P; Hansen, J R; Lytken, E; Nielsen, J L; Wäänänen, A; Tapprogge, Stefan; Calvet, D; Albrand, S; Collot, J; Fulachier, J; Ledroit-Guillon, F; Ohlsson-Malek, F; Viret, S; Wielers, M; Bernardet, K; Corréard, S; Rozanov, A; De Vivie de Régie, J B; Arnault, C; Bourdarios, C; Hrivnác, J; Lechowski, M; Parrour, G; Perus, A; Rousseau, D; Schaffer, A; Unal, G; Derue, F; Chevalier, L; Hassani, S; Laporte, J F; Nicolaidou, R; Pomarède, D; Virchaux, M; Nesvadba, N; Baranov, S; Putzer, A; Khonich, A; Duckeck, G; Schieferdecker, P; Kiryunin, A E; Schieck, J; Lagouri, T; Duchovni, E; Levinson, L; Schrager, D; Negri, G; Bilokon, H; Spogli, L; Barberis, D; Parodi, F; Cataldi, G; Gorini, E; Primavera, M; Spagnolo, S; Cavalli, D; Heldmann, M; Lari, T; Perini, L; Rebatto, D; Resconi, S; Tatarelli, F; Vaccarossa, L; Biglietti, M; Carlino, G; Conventi, F; Doria, A; Merola, L; Polesello, G; Vercesi, V; De Salvo, A; Di Mattia, A; Luminari, L; Nisati, A; Reale, M; Testa, M; Farilla, A; Verducci, M; Cobal, M; Santi, L; Hasegawa, Y; Ishino, M; Mashimo, T; Matsumoto, H; Sakamoto, H; Tanaka, J; Ueda, I; Bentvelsen, Stanislaus Cornelius Maria; Fornaini, A; Gorfine, G; Groep, D; Templon, J; Köster, L J; Konstantinov, A; Myklebust, T; Ould-Saada, F; Bold, T; Kaczmarska, A; Malecki, P; Szymocha, T; Turala, M; Kulchitskii, Yu A; Khoreauli, G; Gromova, N; Tsulaia, V; Minaenko, A A; Rudenko, R; Slabospitskaya, E; Solodkov, A; Gavrilenko, I; Nikitine, N; Sivoklokov, S Yu; Toms, K; Zalite, A; Zalite, Yu; Kervesan, B; Bosman, M; González, S; Sánchez, J; Salt, J; Andersson, N; Nixon, L; Eerola, Paule Anna Mari; Kónya, B; Smirnova, O G; Sandgren, A; Ekelöf, T J C; Ellert, M; Gollub, N; Hellman, S; Lipniacka, A; Corso-Radu, A; Pérez-Réale, V; Lee, S C; CLin, S C; Ren, Z L; Teng, P K; Faulkner, P J W; O'Neale, S W; Watson, A; Brochu, F; Lester, C; Thompson, S; Kennedy, J; Bouhova-Thacker, E; Henderson, R; Jones, R; Kartvelishvili, V G; Smizanska, M; Washbrook, A J; Drohan, J; Konstantinidis, N P; Moyse, E; Salih, S; Loken, J; Baines, J T M; Candlin, D; Candlin, R; Clifft, R; Li, W; McCubbin, N A; George, S; Lowe, A; Buttar, C; Dawson, I; Moraes, A; Tovey, Daniel R; Gieraltowski, J; Malon, D; May, E; LeCompte, T J; Vaniachine, A; Adams, D L; Assamagan, Ketevi A; Baker, R; Deng, W; Fine, V; Fisyak, Yu; Gibbard, B; Ma, H; Nevski, P; Paige, F; Rajagopalan, S; Smith, J; Undrus, A; Wenaus, T; Yu, D; Calafiura, P; Canon, S; Costanzo, D; Hinchliffe, Ian; Lavrijsen, W; Leggett, C; Marino, M; Quarrie, D R; Sakrejda, I; Stravopoulos, G; Tull, C; Loch, P; Youssef, S; Shank, J T; Engh, D; Frank, E; Sen-Gupta, A; Gardner, R; Meritt, F; Smirnov, Y; Huth, J; Grundhoefer, L; Luehring, F C; Goldfarb, S; Severini, H; Skubic, P L; Gao, Y; Ryan, T; De, K; Sosebee, M; McGuigan, P; Ozturk, N

    2004-01-01

    The ATLAS Collaboration at CERN is preparing for data taking and analysis at the LHC that will start in 2007. Therefore, a series of Data Challenges was started in 2002, whose goals are to validate the Computing Model, the complete software suite and the data model, and to ensure the correctness of the technical choices to be made for the final offline computing environment. A major feature of the first Data Challenge (DC1) was the preparation and deployment of the software required for the production of large event samples as a worldwide distributed activity. It should be noted that it was not an option to "run the complete production at CERN" even if we had wanted to; the resources were not available at CERN to carry out the production on a reasonable time-scale. The great challenge of organising and carrying out this large-scale production at a significant number of sites around the world had therefore to be faced. However, the benefits of this are manifold: apart from realising the require...

  5. The Impact and Challenges of Cloud Computing Adoption on Public Universities in Southwestern Nigeria

    Directory of Open Access Journals (Sweden)

    Oyeleye Christopher Akin

    2014-08-01

    This study investigates the impact and challenges of the adoption of cloud computing by public universities in the Southwestern part of Nigeria. A sample of 100 IT staff, 50 para-IT staff and 50 students was selected in each university using stratified sampling techniques, with the aid of well-structured questionnaires. Microsoft Excel was used to capture the data, while frequency and percentage distributions were used to analyze it. In all, 2,000 copies of the questionnaire were administered to the ten (10) public universities in the southwestern part of Nigeria, and 1,742 copies were returned, representing a response rate of 87.1%. The result of the findings revealed that the adoption of cloud computing has a significant impact on cost effectiveness, enhanced availability, low environmental impact, reduced IT complexities, mobility, scalability, increased operability and reduced investment in physical assets. However, the major challenges confronting the adoption of the cloud are data insecurity, regulatory compliance concerns, lock-in and privacy concerns. This paper concludes by recommending strategies to manage the identified challenges in the study area.

  6. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity.

    Science.gov (United States)

    Liang, Jie; Qian, Hong

    2010-01-01

    Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge of understanding macromolecular dynamics led the way for computations to be part of the tool set to study molecular biology. Twenty-five years ago, the demand from genome science inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand "complex behavior" and complexity theory, and from which important biological insight can be gained.
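
    The Gillespie algorithm mentioned above can be stated in a few lines. The sketch below simulates a toy birth-death system with the direct method, drawing an exponential waiting time from the total propensity and picking a reaction in proportion to its share; the rate constants are invented for demonstration and the system is deliberately minimal.

```python
# Hedged sketch: Gillespie stochastic simulation (direct method) for a toy
# birth-death process, the kind of CME trajectory sampler the review describes.
import random

def gillespie(x0, k_birth, k_death, t_end):
    """Simulate X(t): birth at constant rate k_birth, death at rate k_death * x."""
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while t < t_end:
        a1 = k_birth                    # propensity of the birth reaction
        a2 = k_death * x                # propensity of the death reaction
        a0 = a1 + a2
        if a0 == 0:                     # absorbing state: nothing can fire
            break
        t += random.expovariate(a0)     # exponential waiting time to next event
        x += 1 if random.random() < a1 / a0 else -1   # choose which reaction fires
        trajectory.append((t, x))
    return trajectory

traj = gillespie(x0=10, k_birth=5.0, k_death=0.5, t_end=20.0)
print("final (time, state):", traj[-1])
```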

  7. Relationship of computed tomography perfusion and positron emission tomography to tumour progression in malignant glioma

    Energy Technology Data Exchange (ETDEWEB)

    Yeung, Timothy P C [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Robarts Research Institute, The University of Western Ontario, Ontario, Canada, N6A 5B7 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Yartsev, Slav [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Lee, Ting-Yim [Robarts Research Institute, The University of Western Ontario, Ontario, Canada, N6A 5B7 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Department of Medical Imaging, The University of Western Ontario, London Health Sciences Centre, Victoria Hospital, Ontario, Canada, N6A 5W9 (Australia); Lawson Health Research Institute, St. Joseph' s Health Care London, Ontario, Canada, N6A 4V2 (Canada); Wong, Eugene [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Department of Physics and Astronomy, The University of Western Ontario, Ontario, Canada, N6A 3K7 (Canada); He, Wenqing [Department of Statistical and Actuarial Sciences, The University of Western Ontario, Ontario, Canada, N6A 5B7 (Canada); Fisher, Barbara; VanderSpek, Lauren L [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Macdonald, David [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada); Department of Clinical Neurological Sciences, The University of Western Ontario, London Health Sciences Centre, University Hospital, Ontario, Canada, N6A 5A5 (Canada); Bauman, Glenn, E-mail: glenn.bauman@lhsc.on.ca [London Regional Cancer Program, London Health Sciences Centre, Ontario, Canada, N6A 4L6 (Canada); Department of Medical Biophysics, The University of Western Ontario, Ontario, Canada, N6A 5C1 (Canada); Department of Oncology, The University of Western Ontario, London Health Sciences Centre, London Regional Cancer Program, Ontario, Canada, N6A 4L6 (Canada)

    2014-02-15

    Introduction: This study aimed to explore the potential for computed tomography (CT) perfusion and 18-Fluorodeoxyglucose positron emission tomography (FDG-PET) in predicting sites of future progressive tumour on a voxel-by-voxel basis after radiotherapy and chemotherapy. Methods: Ten patients underwent pre-radiotherapy magnetic resonance (MR), FDG-PET and CT perfusion near the end of radiotherapy and repeated post-radiotherapy follow-up MR scans. The relationships between these images and tumour progression were assessed using logistic regression. Cross-validation with receiver operating characteristic (ROC) analysis was used to assess the value of these images in predicting sites of tumour progression. Results: Pre-radiotherapy MR-defined gross tumour; near-end-of-radiotherapy CT-defined enhancing lesion; CT perfusion blood flow (BF), blood volume (BV) and permeability-surface area (PS) product; FDG-PET standard uptake value (SUV); and SUV:BF showed significant associations with tumour progression on follow-up MR imaging (P < 0.0001). The mean sensitivity (±standard deviation), specificity and area under the ROC curve (AUC) of PS were 0.64 ± 0.15, 0.74 ± 0.07 and 0.72 ± 0.12 respectively. This mean AUC was higher than that of the pre-radiotherapy MR-defined gross tumour and near-end-of-radiotherapy CT-defined enhancing lesion (both AUCs = 0.6 ± 0.1, P ≤ 0.03). The multivariate model using BF, BV, PS and SUV had a mean AUC of 0.8 ± 0.1, but this was not significantly higher than the PS only model. Conclusion: PS is the single best predictor of tumour progression when compared to other parameters, but voxel-based prediction based on logistic regression had modest sensitivity and specificity.
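
    As a reminder of how the reported AUC values are computed, the sketch below evaluates the area under the ROC curve for a voxel-wise predictor via the rank-sum (Mann-Whitney) identity. The scores and labels are synthetic stand-ins for the perfusion parameters and progression masks; this is not the study's analysis code.

```python
# Hedged sketch: AUC as the probability that a randomly chosen positive voxel
# outranks a randomly chosen negative voxel (Mann-Whitney identity).
import numpy as np

def roc_auc(scores, labels):
    """AUC = P(score of a positive voxel > score of a negative voxel)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)   # 1-based ranks (no tie handling)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    u = ranks[pos].sum() - n_pos * (n_pos + 1) / 2  # Mann-Whitney U statistic
    return u / (n_pos * n_neg)

rng = np.random.default_rng(2)
labels = rng.integers(0, 2, 500)                   # 1 = voxel later progressed
scores = rng.normal(0, 1, 500) + 0.8 * labels      # e.g. permeability-surface values
print("AUC: %.3f" % roc_auc(scores, labels))
```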

  8. Challenges to Computational Aerothermodynamic Simulation and Validation for Planetary Entry Vehicle Analysis

    Science.gov (United States)

    Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2010-01-01

    Challenges to computational aerothermodynamic (CA) simulation and validation of hypersonic flow over planetary entry vehicles are discussed. Entry, descent, and landing (EDL) of high mass to Mars is a significant driver of new simulation requirements. These requirements include simulation of large deployable, flexible structures and interactions with reaction control system (RCS) and retro-thruster jets. Simulation of radiation and ablation coupled to the flow solver continues to be a high priority for planetary entry analyses, especially for return to Earth and outer planet missions. Three research areas addressing these challenges are emphasized. The first addresses the need to obtain accurate heating on unstructured tetrahedral grid systems to take advantage of flexibility in grid generation and grid adaptation. A multi-dimensional inviscid flux reconstruction algorithm is defined that is oriented with local flow topology as opposed to grid. The second addresses coupling of radiation and ablation to the hypersonic flow solver--flight- and ground-based data are used to provide limited validation of these multi-physics simulations. The third addresses the challenges of retro-propulsion simulation and the criticality of grid adaptation in this application. The evolution of CA to become a tool for innovation of EDL systems requires a successful resolution of these challenges.

  9. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Energy Technology Data Exchange (ETDEWEB)

    King, W. E., E-mail: weking@llnl.gov [Physical and Life Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A. [Engineering Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Kamath, C. [Computation Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Rubenchik, A. M. [NIF and Photon Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States)

    2015-12-15

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  10. Modelling of Multi-Agent Systems: Experiences with Membrane Computing and Future Challenges

    CERN Document Server

    Kefalas, Petros; 10.4204/EPTCS.33.5

    2010-01-01

    Formal modelling of Multi-Agent Systems (MAS) is a challenging task due to high complexity, interaction, parallelism and continuous change of roles and organisation between agents. In this paper we record our research experience on formal modelling of MAS. We review our research throughout the last decade, by describing the problems we have encountered and the decisions we have made towards resolving them and providing solutions. Much of this work involved membrane computing and classes of P Systems, such as Tissue and Population P Systems, targeted to the modelling of MAS whose dynamic structure is a prominent characteristic. More particularly, social insects (such as colonies of ants, bees, etc.), biology inspired swarms and systems with emergent behaviour are indicative examples for which we developed formal MAS models. Here, we aim to review our work and disseminate our findings to fellow researchers who might face similar challenges and, furthermore, to discuss important issues for advancing research on ...

  11. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Science.gov (United States)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  12. Computational analyses of ancient pathogen DNA from herbarium samples: challenges and prospects

    Science.gov (United States)

    Yoshida, Kentaro; Sasaki, Eriko; Kamoun, Sophien

    2015-01-01

    The application of DNA sequencing technology to the study of ancient DNA has enabled the reconstruction of past epidemics from genomes of historically important plant-associated microbes. Recently, the genome sequences of the potato late blight pathogen Phytophthora infestans were analyzed from 19th century herbarium specimens. These herbarium samples originated from infected potatoes collected during and after the Irish potato famine. Herbaria therefore have great potential to help elucidate past epidemics of crops, date the emergence of pathogens, and provide information about past pathogen population dynamics. DNA preservation in herbarium samples was unexpectedly good, raising the possibility of a whole new research area in plant and microbial genomics. However, the recovered DNA can be extremely fragmented, resulting in specific challenges in reconstructing genome sequences. Here we review some of the challenges in computational analyses of ancient DNA from herbarium samples. We also applied the recently developed linkage method to haplotype reconstruction of diploid or polyploid genomes from fragmented ancient DNA. PMID:26442080

  13. Experimental Actinobacillus pleuropneumoniae challenge in swine: Comparison of computed tomographic and radiographic findings during disease

    Directory of Open Access Journals (Sweden)

    Brauer Carsten

    2012-04-01

    Background: In pigs, diseases of the respiratory tract like pleuropneumonia due to Actinobacillus pleuropneumoniae (App) infection have led to high economic losses for decades. Further research on disease pathogenesis, pathogen-host-interactions and new prophylactic and therapeutic approaches are needed. In most studies, a large number of experimental animals are required to assess lung alterations at different stages of the disease. In order to reduce the required number of animals but nevertheless gather information on the nature and extent of lung alterations in living pigs, a computed tomographic scoring system for quantifying gross pathological findings was developed. In this study, five healthy pigs served as control animals while 24 pigs were infected with App, the causative agent of pleuropneumonia in pigs, in an established model for respiratory tract disease. Results: Computed tomographic (CT) findings during the course of App challenge were verified by radiological imaging, clinical, serological, gross pathology and histological examinations. Findings from clinical examinations and both CT and radiological imaging were recorded on day 7 and day 21 after challenge. Clinical signs after experimental App challenge were indicative of acute to chronic disease. Lung CT findings of infected pigs comprised ground-glass opacities and consolidation. On day 7 and 21 the clinical scores significantly correlated with the scores of both imaging techniques. At day 21, significant correlations were found between clinical scores, CT scores and lung lesion scores. In 19 out of 22 challenged pigs the determined disease grades (not affected, slightly affected, moderately affected, severely affected) from CT and gross pathological examination were in accordance. Disease classification by radiography and gross pathology agreed in 11 out of 24 pigs. Conclusions: High-resolution, high-contrast CT examination with no overlapping of organs is superior to

  14. The nature of the (visualization) game: Challenges and opportunities from computational geophysics

    Science.gov (United States)

    Kellogg, L. H.

    2016-12-01

    As the geosciences enters the era of big data, modeling and visualization become increasingly vital tools for discovery, understanding, education, and communication. Here, we focus on modeling and visualization of the structure and dynamics of the Earth's surface and interior. The past decade has seen accelerated data acquisition, including higher resolution imaging and modeling of Earth's deep interior, complex models of geodynamics, and high resolution topographic imaging of the changing surface, with an associated acceleration of computational modeling through better scientific software, increased computing capability, and the use of innovative methods of scientific visualization. The role of modeling is to describe a system, answer scientific questions, and test hypotheses; the term "model" encompasses mathematical models, computational models, physical models, conceptual models, statistical models, and visual models of a structure or process. These different uses of the term require thoughtful communication to avoid confusion. Scientific visualization is integral to every aspect of modeling. Not merely a means of communicating results, the best uses of visualization enable scientists to interact with their data, revealing the characteristics of the data and models to enable better interpretation and inform the direction of future investigation. Innovative immersive technologies like virtual reality, augmented reality, and remote collaboration techniques, are being adapted more widely and are a magnet for students. Time-varying or transient phenomena are especially challenging to model and to visualize; researchers and students may need to investigate the role of initial conditions in driving phenomena, while nonlinearities in the governing equations of many Earth systems make the computations and resulting visualization especially challenging. Training students how to use, design, build, and interpret scientific modeling and visualization tools prepares them

  15. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
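
    A quick back-of-envelope calculation makes the scale of this burden concrete. In the Python sketch below, only the yearlong horizon and 1-second resolution come from the report; the per-solve time is an assumed illustrative value.

```python
# Back-of-envelope estimate of the QSTS burden described above.
# Assumed figure (illustrative): 50 ms per sequential power flow solve on an
# unbalanced feeder; the 1-s resolution and yearlong horizon are the
# report's numbers.

SECONDS_PER_YEAR = 365 * 24 * 3600        # 31,536,000 time steps at 1-s resolution
SOLVE_TIME_S = 0.050                      # assumed time per power flow solve

total_hours = SECONDS_PER_YEAR * SOLVE_TIME_S / 3600
print(f"Power flows to solve: {SECONDS_PER_YEAR:,}")
print(f"Estimated wall time:  {total_hours:,.0f} hours (~{total_hours / 24:.0f} days)")
# ~438 hours at 50 ms/solve; per-solve times of roughly 1-14 ms reproduce
# the report's quoted 10-120 hour range.
```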

  16. Addressing capability computing challenges of high-resolution global climate modelling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin

    2014-05-01

    During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership class resources requires careful planning and preparation. The application software, such as CESM, needs to be ported, optimized and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system, capable of a theoretical peak performance of over 27 PFlop/s; it consists of 18,688 compute nodes, with an NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU in every node, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560
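
    Some quick arithmetic on the volumes quoted above illustrates why I/O strategy matters at this scale; the one-month production window in the sketch is an assumption, not a figure from the abstract.

```python
# Illustrative arithmetic on the quoted data volumes: ~1 PB in ~100,000 files.
PB = 1e15            # bytes
N_FILES = 100_000

print(f"Average file size: ~{PB / N_FILES / 1e9:.0f} GB")   # ~10 GB, consistent
                                                            # with the 20 MB-100 GB range
# If the petabyte were produced over one month of continuous running (assumed):
MONTH_S = 30 * 86400
print(f"Sustained output:  ~{PB / MONTH_S / 1e6:.0f} MB/s")  # ~386 MB/s aggregate
```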

  17. Micro-computed tomographic analysis of progression of artificial enamel lesions in primary and permanent teeth after resin infiltration.

    Science.gov (United States)

    Ozgul, Betul Memis; Orhan, Kaan; Oz, Firdevs Tulga

    2015-09-01

    We investigated inhibition of lesion progression in artificial enamel lesions. Lesions were created on primary and permanent anterior teeth (n = 10 each) and were divided randomly into two groups with two windows: Group 1 (window A: resin infiltration; window B: negative control) and Group 2 (window A: resin infiltration + fluoride varnish; window B: fluoride varnish). After pH cycling, micro-computed tomography was used to analyze progression of lesion depth and changes in mineral density. Resin infiltration and resin infiltration + fluoride varnish significantly inhibited progression of lesion depth in primary teeth (P < 0.05). Resin infiltration is a promising method of inhibiting progression of caries lesions.

  18. IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.

    Science.gov (United States)

    Chen, Ying; Elenee Argentinis, J D; Weber, Griff

    2016-04-01

    Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data.

  19. Combining Brain-Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges

    Directory of Open Access Journals (Sweden)

    José del R. Millán

    2010-09-01

    Full Text Available In recent years, new research has brought the field of EEG-based Brain-Computer Interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication & Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user-machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human-computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices.

  20. Combining Brain-Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges.

    Science.gov (United States)

    Millán, J D R; Rupp, R; Müller-Putz, G R; Murray-Smith, R; Giugliemma, C; Tangermann, M; Vidaurre, C; Cincotti, F; Kübler, A; Leeb, R; Neuper, C; Müller, K-R; Mattia, D

    2010-01-01

    In recent years, new research has brought the field of electroencephalogram (EEG)-based brain-computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, "Communication and Control", "Motor Substitution", "Entertainment", and "Motor Recovery". We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user-machine adaptation algorithms, the exploitation of users' mental states for BCI reliability and confidence measures, the incorporation of principles in human-computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices.

  1. Tough Acts to Follow: The Challenges to Science Teachers Presented by Biotechnological Progress

    Science.gov (United States)

    Bryce, Tom; Gray, Donald

    2004-01-01

    The public controversies associated with biotechnological progress (genetic modification, cloning, and so forth) increasingly impact upon biology teaching in school; teachers find themselves engaged in discussions with pupils on value-laden issues deriving from the social and ethical implications of the 'new science'. The research described in…

  3. Computational Challenges and Collaborative Projects in the NCI Quantitative Imaging Network.

    Science.gov (United States)

    Farahani, Keyvan; Kalpathy-Cramer, Jayashree; Chenevert, Thomas L; Rubin, Daniel L; Sunderland, John J; Nordstrom, Robert J; Buatti, John; Hylton, Nola

    2016-12-01

    The Quantitative Imaging Network (QIN) of the National Cancer Institute (NCI) conducts research in development and validation of imaging tools and methods for predicting and evaluating clinical response to cancer therapy. Members of the network are involved in examining various imaging and image assessment parameters through network-wide cooperative projects. To more effectively use the cooperative power of the network in conducting computational challenges in benchmarking of tools and methods and collaborative projects in analytical assessment of imaging technologies, the QIN Challenge Task Force has developed policies and procedures to enhance the value of these activities by developing guidelines and leveraging NCI resources to help administer them and manage the dissemination of results. Challenges and Collaborative Projects (CCPs) are further divided into technical and clinical CCPs. As the first NCI network to engage in CCPs, we anticipate a variety of CCPs to be conducted by QIN teams in the coming years. These will aim to benchmark advanced software tools for clinical decision support, explore new imaging biomarkers for therapeutic assessment, and establish consensus on a range of methods and protocols in support of the use of quantitative imaging to predict and assess response to cancer therapy.

  4. Tackling some of the most intricate geophysical challenges via high-performance computing

    Science.gov (United States)

    Khosronejad, A.

    2016-12-01

    Recently, the world has been witnessing significant enhancements in the computing power of supercomputers. Computer clusters, in conjunction with advanced mathematical algorithms, have set the stage for developing and applying powerful numerical tools to tackle some of the most intricate geophysical challenges that today's engineers face. One such challenge is to understand how turbulent flows, in real-world settings, interact with (a) rigid and/or mobile complex bed bathymetry of waterways and sea-beds in the coastal areas; (b) objects with complex geometry that are fully or partially immersed; and (c) the free-surface of waterways and water surface waves in the coastal area. This understanding is especially important because the turbulent flows in real-world environments are often bounded by geometrically complex boundaries, which dynamically deform and give rise to multi-scale and multi-physics transport phenomena, and are characterized by multi-lateral interactions among various phases (e.g. air/water/sediment phases). Herein, I present some of the multi-scale and multi-physics geophysical fluid mechanics processes that I have attempted to study using an in-house high-performance computational model, the so-called VFS-Geophysics. More specifically, I will present the simulation results of turbulence/sediment/solute/turbine interactions in real-world settings. Parts of the simulations I present are performed to gain scientific insights into processes such as sand wave formation (A. Khosronejad and F. Sotiropoulos (2014), Numerical simulation of sand waves in a turbulent open channel flow, Journal of Fluid Mechanics, 753:150-216), while others are carried out to predict the effects of climate change and large flood events on societal infrastructures (A. Khosronejad et al. (2016), Large eddy simulation of turbulence and solute transport in a forested headwater stream, Journal of Geophysical Research, doi: 10.1002/2014JF003423).

  5. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    Directory of Open Access Journals (Sweden)

    Chung-Chi Yang

    2013-11-01

    Full Text Available Many studies have indicated that computing technology can enable off-site cardiologists to read patients’ electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents the unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While being devoted to promote the application of information technology onto telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology.

  6. Mobile, cloud, and big data computing: contributions, challenges, and new directions in telecardiology.

    Science.gov (United States)

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-11-13

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients' electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents the unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While being devoted to promote the application of information technology onto telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology.

  7. Progress and Challenges in Graduate Education in Gerontology: The U.S. Experience

    Science.gov (United States)

    Haley, William E.; Zelinski, Elizabeth

    2007-01-01

    The history and current status of graduate programs in gerontology in the United States are reviewed. Master's degree programs began in 1967, and currently exist at 57 universities in the United States. Challenges for these programs include maintaining enrollment and identifying employment for program graduates, given competition from graduates…

  8. Advanced Development of the rF1V and rBV A/B Vaccines: Progress and Challenges

    Directory of Open Access Journals (Sweden)

    Mary Kate Hart

    2012-01-01

    Full Text Available The development of vaccines for microorganisms and bacterial toxins with the potential to be used as biowarfare and bioterrorism agents is an important component of the US biodefense program. DVC is developing two vaccines, one against inhalational exposure to botulinum neurotoxins A1 and B1 and a second for Yersinia pestis, with the ultimate goal of licensure by the FDA under the Animal Rule. Progress has been made in all technical areas, including manufacturing, nonclinical, and clinical development and testing of the vaccines, and in assay development. The current status of development of these vaccines and the remaining challenges are described in this chapter.

  9. COMPUTER AIDED DESIGN OF THE STRIP LAYOUT IN PROGRESSIVE CUTTING DIE

    Directory of Open Access Journals (Sweden)

    Faruk MENDİ

    2000-02-01

    Full Text Available Progressive cutting dies play a very important role in the series manufacture of sheet-metal parts in chipless (no-machining) production systems. With classical methods, designing the dies and choosing suitable die parts are generally time-consuming and costly for the designer. In this study, we developed a QBASIC program (approximately 1 MB) that can design the strip layout. Drawing on its database, the program can also perform force analysis, determine the required force capacity of the press, and locate the center of the cutting forces, in addition to the strip layout design. Besides supporting the design of non-standard die parts, it also allows the selection of standard die parts. With this computer program for die design, more precise, sufficiently flexible and fast results can be obtained.
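
    Two of the calculations mentioned above, the cutting-force analysis and the location of the force center, follow standard die-design formulas (blanking force F = perimeter x thickness x shear strength; force center = force-weighted centroid). The sketch below illustrates them; the punch geometry and material values are hypothetical, not taken from the paper.

```python
# Minimal sketch of two strip-layout calculations: blanking force per punch
# and the resultant center of pressure. All numbers are hypothetical.

punches = [
    # (cut perimeter [mm], punch center x [mm], punch center y [mm])
    (120.0, 25.0, 15.0),
    (80.0, 60.0, 15.0),
]
t = 1.5      # sheet thickness [mm] (assumed)
tau = 320.0  # shear strength of the sheet material [N/mm^2] (assumed)

forces = [(perim * t * tau, x, y) for perim, x, y in punches]  # F = L * t * tau
F_total = sum(f for f, _, _ in forces)

# Center of pressure = force-weighted centroid; the press ram should act
# here to avoid tipping moments on the die set.
x_c = sum(f * x for f, x, _ in forces) / F_total
y_c = sum(f * y for f, _, y in forces) / F_total

print(f"Total cutting force: {F_total / 1000:.1f} kN")
print(f"Center of pressure:  ({x_c:.1f}, {y_c:.1f}) mm")
```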

  10. The brain-computer: origin of the idea and progress in its realization.

    Science.gov (United States)

    Ichikawa, Michinori; Matsumoto, Gen

    2004-06-01

    The Brain-Computer is a physical analogue of a real organism which uses both a brain-inspired memory-based architecture and an output-driven learning algorithm. This system can be realized by creating a scaled-down model car that learns how to drive by heuristically connecting image processing with behavior control. This study proves that learning efficiency progresses rapidly when the acquired behaviors are prioritized. We develop a small real-world device that moves about purposefully in an artificial environment. The robot uses imaging information acquired through its random actions to make a mental map. This map, then, provides the cognitive structure for acquiring necessary information for autonomous behavior.

  11. Computing the numerical solution to functional differential equations: some recent progresses towards E. Hopf's 1952 dream

    Science.gov (United States)

    Venturi, Daniele

    2016-11-01

    The fundamental importance of functional differential equations has been recognized in many areas of mathematical physics, such as fluid dynamics, quantum field theory and statistical physics. For example, in the context of fluid dynamics, the Hopf characteristic functional equation was deemed by Monin and Yaglom to be "the most compact formulation of the turbulence problem", which is the problem of determining the statistical properties of the velocity and pressure fields of the Navier-Stokes equations given statistical information on the initial state. However, no effective numerical method has yet been developed to compute the solution to functional differential equations. In this talk I will provide a new perspective on this general problem, and discuss recent progress in approximation theory for nonlinear functionals and functional equations. The proposed methods will be demonstrated through various examples.
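
    For reference, the Hopf characteristic functional mentioned above has the following standard form (a sketch in the conventions of Monin and Yaglom; the notation is assumed, not taken from the talk):

```latex
% Hopf characteristic functional of the random velocity field u(x,t):
\Phi[\boldsymbol{\theta}, t] =
  \Big\langle \exp\Big( i \int_{\Omega} \mathbf{u}(\mathbf{x},t)
    \cdot \boldsymbol{\theta}(\mathbf{x}) \, d\mathbf{x} \Big) \Big\rangle
% All statistical moments of u follow from functional derivatives of Phi,
% e.g. the mean field:
\langle u_j(\mathbf{x},t) \rangle =
  \frac{1}{i} \left. \frac{\delta \Phi}{\delta \theta_j(\mathbf{x})}
  \right|_{\boldsymbol{\theta} = 0}
```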

  12. From the First Stars and Galaxies to the Epoch of Reionization: 20 Years of Computational Progress

    Science.gov (United States)

    Norman, Michael L.

    2016-06-01

    I give a progress report on computational efforts to reconstruct the first billion years of cosmic evolution beginning with the formation of the first generation of stars and galaxies, culminating in the complete photoionization of the intergalactic medium. After 20 years of intense effort, the picture is falling into place through the development and application of complex multiphysics numerical simulations of increasing physical complexity and scale on the most powerful supercomputers. I describe the processes that govern the formation of the first generation of stars, the transition to the second generation of stars, the assembly of the first galaxies, and finally the reionization of the universe. I discuss how these simulations guide the interpretation of relevant observations of the high redshift (and local) universe, along with some observational predictions of these simulations which will be tested with the next generation observatories.

  13. Tissue Engineering of Blood Vessels: Functional Requirements, Progress, and Future Challenges.

    Science.gov (United States)

    Kumar, Vivek A; Brewster, Luke P; Caves, Jeffrey M; Chaikof, Elliot L

    2011-09-01

    Vascular disease results in the decreased utility and decreased availability of autologous vascular tissue for small diameter (<6 mm) applications. Tissue-engineered replacement vessels represent an ideal solution to this clinical problem. Ongoing progress requires combined approaches from biomaterials science, cell biology, and translational medicine to develop feasible solutions with the requisite mechanical support, a non-fouling surface for blood flow, and tissue regeneration. Over the past two decades interest in blood vessel tissue engineering has soared on a global scale, resulting in the first clinical implants of multiple technologies, steady progress with several other systems, and critical lessons learned. This review will highlight the current inadequacies of autologous and synthetic grafts, the engineering requirements for implantation of tissue-engineered grafts, and the current status of tissue-engineered blood vessel research.

  14. Pseudoprogression, radionecrosis, inflammation or true tumor progression? challenges associated with glioblastoma response assessment in an evolving therapeutic landscape.

    Science.gov (United States)

    Ellingson, Benjamin M; Chung, Caroline; Pope, Whitney B; Boxerman, Jerrold L; Kaufmann, Timothy J

    2017-04-05

    The wide variety of treatment options that exist for glioblastoma, including surgery, ionizing radiation, anti-neoplastic chemotherapies, anti-angiogenic therapies, and active or passive immunotherapies, all may alter aspects of vascular permeability within the tumor and/or normal parenchyma. These alterations manifest as changes in the degree of contrast enhancement or T2-weighted signal hyperintensity on standard anatomic MRI scans, posing a potential challenge for accurate radiographic response assessment for identifying anti-tumor effects. The current review highlights the challenges that remain in differentiating true disease progression from changes due to radiation therapy, including pseudoprogression and radionecrosis, as well as immune or inflammatory changes that may occur as either an undesired result of cytotoxic therapy or as a desired consequence of immunotherapies.

  15. Challenges in Assessing Progress in Multifunctional Operations: Experiences from a Provincial Reconstruction Team in Afghanistan

    Science.gov (United States)

    2011-06-01

    these measures. Assessment of progress can thus be seen as a process consisting of monitoring and evaluation activities (Sida, 2007). Input ... limited integration and understanding between the Swedish Armed Forces and SIDA at the domestic interagency level. Four participants said that the ... military and SIDA personnel had been sent to the PRT with different mandates, objectives and cultures, without practical instructions on how to

  16. Informing efficient randomised controlled trials: exploration of challenges in developing progression criteria for internal pilot studies

    Science.gov (United States)

    Williamson, Paula R; Gamble, Carrol; O'Connell Francischetto, Elaine; Metcalfe, Chris; Davidson, Peter; Williams, Hywel; Blazeby, Jane M

    2017-01-01

    Objectives Designing studies with an internal pilot phase may optimise the use of pilot work to inform more efficient randomised controlled trials (RCTs). Careful selection of preagreed decision or ‘progression’ criteria at the juncture between the internal pilot and main trial phases provides a valuable opportunity to evaluate the likely success of the main trial and optimise its design or, if necessary, to make the decision not to proceed with the main trial. Guidance on the appropriate selection and application of progression criteria is, however, lacking. This paper outlines the key issues to consider in the optimal development and review of operational progression criteria for RCTs with an internal pilot phase. Design A structured literature review and exploration of stakeholders' opinions at a Medical Research Council (MRC) Hubs for Trials Methodology Research workshop. Key stakeholders included triallists, methodologists, statisticians and funders. Results There is considerable variation in the use of progression criteria for RCTs with an internal pilot phase, although 3 common issues predominate: trial recruitment, protocol adherence and outcome data. Detailed and systematic reporting around the decision-making process for stopping, amending or proceeding to a main trial is uncommon, which may hamper understanding in the research community about the appropriate and optimal use of RCTs with an internal pilot phase. 10 top tips for the development, use and reporting of progression criteria for internal pilot studies are presented. Conclusions Systematic and transparent reporting of the design, results and evaluation of internal pilot trials in the literature should be encouraged in order to facilitate understanding in the research community and to inform future trials. PMID:28213598
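
    To make the idea of preagreed progression criteria concrete, the sketch below encodes a simple 'traffic light' rule over the three common issues the review identifies. The thresholds are hypothetical illustrations, not recommendations from the paper.

```python
# Minimal sketch of a preagreed "traffic light" progression rule over the
# three common criteria (recruitment, protocol adherence, outcome data
# completeness). All thresholds are hypothetical placeholders.

def progression_decision(recruitment: float, adherence: float,
                         completeness: float) -> str:
    """Return 'go', 'amend', or 'stop' from observed pilot-phase rates (0-1)."""
    if recruitment >= 0.9 and adherence >= 0.8 and completeness >= 0.95:
        return "go"     # proceed to the main trial unchanged
    if recruitment < 0.5 or adherence < 0.5 or completeness < 0.7:
        return "stop"   # do not proceed to the main trial
    return "amend"      # proceed only with remedial changes to the design

print(progression_decision(recruitment=0.75, adherence=0.85, completeness=0.90))
# -> 'amend'
```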

  17. Progress and Challenges in Developing Metabolic Footprints from Diet in Human Gut Microbial Cometabolism

    OpenAIRE

    Duffy, Linda C; Raiten, Daniel J; Hubbard, S.; Starke-Reed, Pamela

    2015-01-01

    Homo sapiens harbor trillions of microbes, whose microbial metagenome (collective genome of a microbial community) using omic validation interrogation tools is estimated to be at least 100-fold that of human cells, which comprise 23,000 genes. This article highlights some of the current progress and open questions in nutrition-related areas of microbiome research. It also underscores the metabolic capabilities of microbial fermentation on nutritional substrates that require further mechanisti...

  18. The Role of Pulmonary Veins in Cancer Progression from a Computed Tomography Viewpoint

    Science.gov (United States)

    Chang, Hung; Liao, Tzu-Yao; Wen, Ming-Sheng; Yu, Chih-Teng

    2016-01-01

    Background. We studied the role of pulmonary veins in cancer progression using computed tomography (CT) scans. Methods. We obtained data from 260 patients with pulmonary vein obstruction syndrome (PVOS). We used CT scans to investigate pulmonary lesions in relation to pulmonary veins. We divided the lesions into central and peripheral lesions by their anatomical location: in the lung parenchymal tissue or pulmonary vein; in the superior or inferior pulmonary vein; and by unilateral or bilateral presence in the lungs. Results. Of the 260 PVOS patients, 226 (87%) had central lesions, 231 (89%) had peripheral lesions, and 190 (75%) had mixed central and peripheral lesions. Among the 226 central lesions, 93% had lesions within the superior pulmonary vein, either bilaterally or unilaterally. Among the 231 peripheral lesions, 65% involved bilateral lungs, 70% involved lesions within the inferior pulmonary veins, and 23% had obvious metastatic extensions into the left atrium. All patients exhibited nodules within their pulmonary veins. The predeath status included respiratory failure (40%) and loss of consciousness (60%). Conclusion. CT scans play an important role in following tumor progression within pulmonary veins. Besides respiratory distress, PVOS cancer cells entering centrally can result in cardiac and cerebral events and loss of consciousness or can metastasize peripherally from the pulmonary veins to the lungs.

  19. Seismic damage estimation for buried pipelines - challenges after three decades of progress

    Energy Technology Data Exchange (ETDEWEB)

    Pineda-Porras, Omar Andrey [Los Alamos National Laboratory]; Najafi, Mohammad [U. of Texas]

    2009-01-01

    This paper analyzes the evolution over the past three decades of seismic damage estimation for buried pipelines and identifies some challenges for future research studies on the subject. The first section of this paper presents a chronological description of the evolution since the mid-1970s of pipeline fragility relations - the most common tool for pipeline damage estimation - and follows with a careful analysis of the use of several ground motion parameters as pipeline damage indicators. In the second section of the paper, four gaps on the subject are identified and proposed as challenges for future research studies. The main conclusion of this work is that enhanced fragility relations must be developed for improving pipeline damage estimation, which must consider relevant parameters that could influence the seismic response of pipelines.
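
    Fragility relations of the kind analyzed above are commonly expressed as a power law linking a repair rate to a ground-motion parameter such as peak ground velocity (PGV). The sketch below uses that generic form; the coefficients are illustrative placeholders, not values from this paper, and published relations typically add material and diameter correction factors.

```python
# Generic power-law pipeline fragility relation: RR = a * PGV**b, where RR is
# the expected number of repairs per km of pipe and PGV is peak ground
# velocity in cm/s. Coefficients are illustrative placeholders only.

def repair_rate(pgv_cm_s: float, a: float = 0.002, b: float = 1.3) -> float:
    """Expected repairs per km of pipe for a given PGV."""
    return a * pgv_cm_s ** b

for pgv in (10, 30, 60):
    rr = repair_rate(pgv)
    print(f"PGV = {pgv:3d} cm/s -> {rr:.3f} repairs/km "
          f"(~{rr * 10:.1f} expected repairs over a 10 km segment)")
```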

  20. Accessibility and use of essential medicines in health care: Current progress and challenges in India.

    Science.gov (United States)

    Bansal, Dipika; Purohit, Vilok K

    2013-01-01

    The Essential Medicines Concept, a major breakthrough in health care, started in 1977 when the World Health Organization (WHO) published its first list. Appropriate use of essential medicines is one of the most cost-effective components of modern health care. The selection process has evolved from expert evaluation to evidence-based selection. The first Indian list was published in 1996 and the recent revision with 348 medicines was published in 2011 after 8 years. Health expenditure is less in India as compared to developed countries. India faces a major challenge in providing access to medicines for its 1.2 billion people by focusing on providing essential medicines. In the future, countries will face challenges in selecting high-cost medicines for oncology, orphan diseases and other conditions. There is a need to develop strategies to improve affordable access to essential medicines under the current health care reform.

  1. A review about the development of fucoidan in antitumor activity: Progress and challenges.

    Science.gov (United States)

    Wu, Lei; Sun, Jing; Su, Xitong; Yu, Qiuli; Yu, Qiuyang; Zhang, Peng

    2016-12-10

    Fucoidan is composed of l-fucose, sulfate groups and small proportions of d-xylose, d-mannose, d-galactose, l-rhamnose, arabinose, glucose, d-glucuronic acid and acetyl groups, in different kinds of brown seaweeds. Many reports have demonstrated that fucoidan has antitumor activities against various cancers. However, until now, few reviews have discussed the antitumor activity of fucoidan, and few reports have summarized the detailed molecular mechanisms of its actions and the remaining challenges for fucoidan as an antitumor agent. In this review, the antitumor signaling pathway mechanisms related to fucoidan are elucidated in as much detail as possible. Besides, the factors affecting the anticancer effects of fucoidan, the structural characteristics of fucoidan with anticancer activities and the challenges for the further development of fucoidan are also summarized and evaluated. Existing points of agreement and disagreement are summarized in an attempt to provide guidelines for further research and, ultimately, to help fucoidan enter the market as a chemotherapeutic.

  2. In Vivo Delivery of CRISPR/Cas9 for Therapeutic Gene Editing: Progress and Challenges.

    Science.gov (United States)

    Mout, Rubul; Ray, Moumita; Lee, Yi-Wei; Scaletti, Federica; Rotello, Vincent M

    2017-03-17

    The successful use of clustered regularly interspaced short palindromic repeat (CRISPR)/Cas9-based gene editing for therapeutics requires efficient in vivo delivery of the CRISPR components. There are, however, major challenges on the delivery front. In this Topical Review, we will highlight recent developments in CRISPR delivery, and we will present hurdles that still need to be overcome to achieve effective in vivo editing.

  3. Tensions and Challenges: Interrelationships between Social Movements and Progressive Institutional Politics in Latin America

    Directory of Open Access Journals (Sweden)

    Lázaro M. Bacallao-Pino

    2014-01-01

    Full Text Available Latin America is the scenario of both significant counterhegemonic social movements and allegedly progressive (or even anti-capitalist) governments. The article aims to analyse the interrelationships between those collective agents and institutional politics in that scenario. Based on a general approach to some relevant social movements from the region, the positions of some particular Latin American governments and their leaders, as well as the examination of secondary sources, the text examines three main aspects that mediate the interrelationships between social movements and progressive institutional politics: the singular way in which social movements understand the sense of “politics”, the postures assumed by those governments with respect to those collective agents, and the importance of autonomy for social movements. Social movements understand politics not as a separate dimension, but as a process of accumulation from sociability, in a continuity between social and political dimensions based on the everyday experience of life, including in this way social practices traditionally located outside established political institutions. Autonomy is a central value for those social actors, defining their position with regard to political parties, labour unions, churches and other traditional organisations. It is a value that crosses all their practices and the possibility of articulation with projects developed from governments, from the local level to the Latin American one. Against this, the vision of allegedly progressive (or even anti-capitalist) governments on social movements is mediated by the purpose of understanding them from the point of view of traditional political rules, and two significant attitudes towards those social agents are attempts at criminalisation and co-optation.

  4. Progressively invalidating orthostatic hypotension: A common symptom for a challenging diagnosis

    Directory of Open Access Journals (Sweden)

    Serena Pelusi

    2016-01-01

    Full Text Available We discuss here an uncommon condition of neurogenic hypotension in the context of immunoglobulin light chain (amyloid light-chain) amyloidosis. The most serious feature was autonomic nervous system impairment, mainly characterized by severe refractory orthostatic hypotension, which became progressively invalidating, forcing the patient to bed. Moreover, given the systemic involvement of the disease, the patient also presented diarrhea, dysphagia, asthenia, and peripheral edema because of gastrointestinal and kidney dysfunction. Eventually, the massive myocardial depression and infiltration led to a fatal outcome.

  5. Getting ready for REDD+ in Tanzania: a case study of progress and challenges

    DEFF Research Database (Denmark)

    Burgess, Neil David; Bahane, Bruno; Clairs, Tim;

    2010-01-01

    … the Norwegian, Finnish and German governments and is a participant in the World Bank’s Forest Carbon Partnership Facility. In combination, these interventions aim to mitigate greenhouse gas emissions, provide an income to rural communities and conserve biodiversity. The establishment of the UN-REDD Programme … cover, enhanced capacity for measuring, reporting and verification, and pilot projects to test REDD+ implementation linked to the existing Participatory Forest Management Programme. Our conclusion is that even in a country with considerable donor support, progressive forest policies, laws…

  6. Applied & Computational Mathematics Challenges for the Design and Control of Dynamic Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; Burns, J A; Collis, S; Grosh, J; Jacobson, C A; Johansen, H; Mezic, I; Narayanan, S; Wetter, M

    2011-03-10

    The Energy Independence and Security Act of 2007 (EISA) was passed with the goal 'to move the United States toward greater energy independence and security.' Energy security and independence cannot be achieved unless the United States addresses the issue of energy consumption in the building sector and significantly reduces energy consumption in buildings. Commercial and residential buildings account for approximately 40% of the U.S. energy consumption and emit 50% of CO2 emissions in the U.S., which is more than twice the total energy consumption of the entire U.S. automobile and light truck fleet. A 50%-80% improvement in building energy efficiency in both new construction and in retrofitting existing buildings could significantly reduce U.S. energy consumption and mitigate climate change. Reaching these aggressive building efficiency goals will not happen without significant Federal investments in areas of computational and mathematical sciences. Applied and computational mathematics are required to enable the development of algorithms and tools to design, control and optimize energy efficient buildings. The challenge has been issued by the U.S. Secretary of Energy, Dr. Steven Chu (emphasis added): 'We need to do more transformational research at DOE including computer design tools for commercial and residential buildings that enable reductions in energy consumption of up to 80 percent with investments that will pay for themselves in less than 10 years.' On July 8-9, 2010 a team of technical experts from industry, government and academia were assembled in Arlington, Virginia to identify the challenges associated with developing and deploying new computational methodologies and tools that will address building energy efficiency. These experts concluded that investments in fundamental applied and computational mathematics will be required to build enabling technology that can be used to realize the target of 80% reductions in energy

  7. Computational analyses of ancient pathogen DNA from herbarium samples: challenges and prospects

    Directory of Open Access Journals (Sweden)

    Kentaro eYoshida

    2015-09-01

    Full Text Available The application of DNA sequencing technology to the study of ancient DNA has enabled the reconstruction of past epidemics from genomes of historically important plant-associated microbes. Recently, the genome sequences of the potato late blight pathogen Phytophthora infestans were analyzed from 19th century herbarium specimens. These herbarium samples originated from infected potatoes collected during and after the Irish potato famine. Herbaria have therefore great potential to help elucidate past epidemics of crops, date the emergence of pathogens, and inform about past pathogen population dynamics. DNA preservation in herbarium samples was unexpectedly good, raising the possibility of a whole new research area in plant and microbial genomics. However, the recovered DNA can be extremely fragmented resulting in specific challenges in reconstructing genome sequences. Here we review some of the challenges in computational analyses of ancient DNA from herbarium samples. We also applied the recently developed linkage method to haplotype reconstruction of diploid or polyploid genomes from fragmented ancient DNA.

  8. Modelling impacts of climate change on arable crop diseases: progress, challenges and applications.

    Science.gov (United States)

    Newbery, Fay; Qi, Aiming; Fitt, Bruce DL

    2016-08-01

    Combining climate change, crop growth and crop disease models to predict impacts of climate change on crop diseases can guide planning of climate change adaptation strategies to ensure future food security. This review summarises recent developments in modelling climate change impacts on crop diseases, emphasises some major challenges and highlights recent trends. The use of multi-model ensembles in climate change modelling and crop modelling is contributing towards measures of uncertainty in climate change impact projections but other aspects of uncertainty remain largely unexplored. Impact assessments are still concentrated on few crops and few diseases but are beginning to investigate arable crop disease dynamics at the landscape level.

  9. Teacher-student relationships and school adjustment: progress and remaining challenges.

    Science.gov (United States)

    Hughes, Jan N

    2012-01-01

    This commentary highlights the ways in which the articles in this special issue contribute to the second generation of research on teacher-student relationships. Second generation research aims to increase our understanding of the development of these relationships, and the processes responsible for their effects, as well as to evaluate theoretically-informed interventions designed to enhance teacher-student interactions. Despite unanswered questions and challenges that confront this field of inquiry, the current state of knowledge is adequate to apply the knowledge gained to the task of increasing teachers' abilities to provide positive social and emotional learning environments, thereby improving students' learning and behavioral adjustment.

  10. Progress and challenges in the vaccine-based treatment of head and neck cancers

    Directory of Open Access Journals (Sweden)

    Venuti Aldo

    2009-05-01

    Full Text Available Abstract Head and neck (HN) cancer represents one of the most challenging diseases because the mortality remains high despite advances in early diagnosis and treatment. Although vaccine-based approaches for the treatment of advanced squamous cell carcinoma of the head and neck have achieved limited clinical success, advances in cancer immunology provide a strong foundation and powerful new tools to guide current attempts to develop effective cancer vaccines. This article reviews what has to be done, rather than what has been done, in the field for the development of future vaccines against HN tumours.

  11. Progress and Challenges of Neutrino-Nuclear Cross Sections in the GeV Regime

    Science.gov (United States)

    Mahn, Kendall

    2017-01-01

    Interactions of neutrinos and antineutrinos with nuclear material are an essential ingredient in measurements of neutrino oscillation. As future experiments aim at unprecedented precision of the parameters which govern neutrino mixing, neutrino-nuclear interactions have come under intense scrutiny and interest. This talk will describe the needs of future experiments and the unique challenges of neutrino interaction physics, and summarize recent results from a suite of experiments worldwide. The speaker would like to acknowledge support from the Department of Energy and the Alfred P. Sloan Foundation.

  12. Reporter gene imaging of immune responses to cancer: progress and challenges.

    Science.gov (United States)

    Dubey, Purnima

    2012-01-01

    Immune responses to cancer are dynamic processes which take place through the concerted activity of innate and adaptive cell populations. In order to fully understand the efficacy of immune therapies for cancer, it is critical to understand how the treatment modulates the function of each cell type involved in the anti-tumor immune response. Molecular imaging is a versatile method for longitudinal studies of cellular localization and function. The development of reporter genes for tracking cell movement and function was a powerful addition to the immunologist's toolbox. This review will highlight the advances and challenges in the use of reporter gene imaging to track immune cell localization and function in cancer.

  13. MALDI imaging mass spectrometry: statistical data analysis and current computational challenges

    Directory of Open Access Journals (Sweden)

    Alexandrov Theodore

    2012-11-01

    Full Text Available Abstract Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) imaging mass spectrometry, also called MALDI-imaging, is a label-free bioanalytical technique used for spatially-resolved chemical analysis of a sample. Usually, MALDI-imaging is exploited for the analysis of a specially prepared tissue section thaw-mounted onto a glass slide. A tremendous development of the MALDI-imaging technique has been observed during the last decade. Currently, it is one of the most promising innovative measurement techniques in biochemistry and a powerful and versatile tool for spatially-resolved chemical analysis of diverse sample types ranging from biological and plant tissues to bio and polymer thin films. In this paper, we outline computational methods for analyzing MALDI-imaging data with the emphasis on multivariate statistical methods, discuss their pros and cons, and give recommendations on their application. The methods of unsupervised data mining as well as supervised classification methods for biomarker discovery are elucidated. We also present a high-throughput computational pipeline for interpretation of MALDI-imaging data using spatial segmentation. Finally, we discuss current challenges associated with the statistical analysis of MALDI-imaging data.
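
    As a concrete instance of the spatial segmentation step mentioned above, the sketch below clusters pixel spectra with k-means on synthetic data (NumPy and scikit-learn assumed). It is a minimal illustration; real MALDI-imaging pipelines add baseline removal, peak picking, and spatial denoising first.

```python
# Minimal sketch: spatial segmentation of a MALDI-imaging dataset by
# clustering pixel spectra. Data here are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
nx, ny, n_mz = 40, 40, 200                   # image grid and m/z channels

# Synthetic dataset: two "tissue regions" with different mean spectra.
spectra = rng.poisson(5.0, size=(nx * ny, n_mz)).astype(float)
left_half = (np.arange(nx * ny) % ny) < ny // 2
spectra[left_half, :50] += 20.0              # region-specific peaks

# Normalize each pixel spectrum to its total ion count (TIC).
spectra /= spectra.sum(axis=1, keepdims=True)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spectra)
segmentation_map = labels.reshape(nx, ny)    # spatial map of spectrally
print(segmentation_map)                      # similar regions
```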

  14. Computational methods and challenges in hydrogen/deuterium exchange mass spectrometry.

    Science.gov (United States)

    Claesen, Jürgen; Burzykowski, Tomasz

    2017-09-01

    Hydrogen/Deuterium exchange (HDX) has been applied, since the 1930s, as an analytical tool to study the structure and dynamics of (small) biomolecules. The popularity of using HDX to study proteins increased drastically in the last two decades due to the successful combination with mass spectrometry (MS). Together with this growth in popularity, several technological advances have been made, such as improved quenching and fragmentation. As a consequence of these experimental improvements and the increased use of protein-HDXMS, large amounts of complex data are generated, which require appropriate analysis. Computational analysis of HDXMS requires several steps. A typical workflow for proteins consists of identification of (non-)deuterated peptides or fragments of the protein under study (local analysis), or identification of the deuterated protein as a whole (global analysis); determination of the deuteration level; estimation of the protection extent or exchange rates of the labile backbone amide hydrogen atoms; and a statistically sound interpretation of the estimated protection extent or exchange rates. Several algorithms, specifically designed for HDX analysis, have been proposed. They range from procedures that focus on one specific step in the analysis of HDX data to complete HDX workflow analysis tools. In this review, we provide an overview of the computational methods and discuss outstanding challenges. © 2016 Wiley Periodicals, Inc. Mass Spec Rev 36:649-667, 2017.
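
    The kinetic model underlying much of this analysis can be stated compactly: each exchangeable backbone amide exchanges with its own first-order rate, so the deuterium uptake of a peptide is a sum of exponentials, D(t) = sum_i (1 - exp(-k_i t)). A minimal sketch with illustrative rates (not data from the review):

```python
# Deuterium uptake of a peptide with n exchangeable backbone amides, each
# exchanging with first-order rate k_i: D(t) = sum_i (1 - exp(-k_i * t)).
# Rates below are illustrative, spanning fast to strongly protected amides.
import numpy as np

k = np.array([5.0, 0.5, 0.05, 0.005])        # per-amide exchange rates [1/min]
t = np.array([0.5, 1.0, 10.0, 60.0, 240.0])  # labeling times [min]

uptake = (1.0 - np.exp(-np.outer(t, k))).sum(axis=1)
for ti, di in zip(t, uptake):
    print(f"t = {ti:6.1f} min -> {di:.2f} deuterons (of {k.size})")
# Fitting measured uptake curves for the k_i (or for protection factors
# P_i = k_intrinsic / k_observed) is one core computational task reviewed here.
```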

  15. The Present and Future Challenges of Distributed Computing in the ATLAS experiment

    CERN Document Server

    Ueda, I; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment has collected more than 5 fb-1 of data in 2011 at an energy of 7 TeV. Several billion events have been promptly reconstructed and stored in the ATLAS remote data centers, spanning tens of petabytes of disk and tape storage. In addition, a similar amount of data has been simulated on the Grid to study the detector performance and efficiencies. The data processing and distribution on the Grid sites, with more than 100,000 computing cores, is centrally controlled by the system developed by ATLAS, managing coherent data processing and analysis of almost one million jobs daily. An increased collision energy of 8 TeV in 2012 and a much larger expected data collection rate due to improved LHC operation impose new requirements on the system and suggest a further evolution of the computing model to be able to meet the new challenges in the future. The experience of large-scale data processing and analysis on the Grid is presented through the evolving model and organization of the ATLAS Distribu...

  16. A Survey on Underwater Wireless Sensor Networks: Progresses, Applications, and Challenges

    Directory of Open Access Journals (Sweden)

    Premalatha J.

    2016-01-01

    Full Text Available Endangered underwater species have always drawn the attention of the scientific community, since their disappearance would cause irreplaceable loss. Asia is home to some of the most diverse habitats on earth, but it is estimated that more than one in four species is endangered. Underwater, many factors are putting marine life under immense pressure. Extreme population pressure is leading to pollution, over-fishing and the devastation of crucial habitats. Consequently, the numbers of almost all fish are declining and many are already endangered. To help these species survive, their habitat should be strictly protected, which can be achieved by closely monitoring them. In this course, several parameters and constraints concerning the species and their environments are considered. Advances in sensor technology now facilitate the monitoring of species and their habitat at lower cost. Indeed, the increasing sophistication of underwater wireless sensors opens opportunities that pose new challenges in many areas, such as surveillance. This paper endorses the use of sensors for monitoring endangered underwater species in their habitat. It further examines the key approaches and challenges in the design and implementation of underwater wireless sensor networks. We summarize major applications and the main phenomena related to acoustic propagation, and discuss how they affect the design and operation of communication systems and networking protocols at various layers.

  17. Fundamental challenges in mechanistic enzymology: progress toward understanding the rate enhancements of enzymes.

    Science.gov (United States)

    Herschlag, Daniel; Natarajan, Aditya

    2013-03-26

    Enzymes are remarkable catalysts that lie at the heart of biology, accelerating chemical reactions to an astounding extent with extraordinary specificity. Enormous progress in understanding the chemical basis of enzymatic transformations and the basic mechanisms underlying rate enhancements over the past decades is apparent. Nevertheless, it has been difficult to achieve a quantitative understanding of how the underlying mechanisms account for the energetics of catalysis, because of the complexity of enzyme systems and the absence of underlying energetic additivity. We review case studies from our own work that illustrate the power of precisely defined and clearly articulated questions when dealing with such complex and multifaceted systems, and we also use this approach to evaluate our current ability to design enzymes. We close by highlighting a series of questions that help frame some of what remains to be understood, and we encourage the reader to define additional questions and directions that will deepen and broaden our understanding of enzymes and their catalysis.

  18. How Novel Algorithms and Access to High Performance Computing Platforms are Enabling Scientific Progress in Atomic and Molecular Physics

    Science.gov (United States)

    Schneider, Barry I.

    2016-10-01

    Over the past 40 years there has been remarkable progress in the quantitative treatment of complex many-body problems in atomic and molecular physics (AMP). This has happened as a consequence of the development of new and powerful numerical methods, the translation of these algorithms into practical software, and the associated evolution of powerful computing platforms ranging from desktops to high performance computational instruments capable of massively parallel computation. We are taking the opportunity afforded by this CCP2015 to review computational progress in scattering theory and the interaction of strong electromagnetic fields with atomic and molecular systems from the early 1960s until the present time, to show how these advances have revealed a remarkable array of interesting and in many cases unexpected features. The article is by no means complete and certainly reflects the views and experiences of the author.

  19. Progress and Challenges in Developing Metabolic Footprints from Diet in Human Gut Microbial Cometabolism

    Science.gov (United States)

    Duffy, Linda C; Raiten, Daniel J; Hubbard, Van S; Starke-Reed, Pamela

    2015-01-01

    Homo sapiens harbor trillions of microbes, whose microbial metagenome (collective genome of a microbial community) using omic validation interrogation tools is estimated to be at least 100-fold that of human cells, which comprise 23,000 genes. This article highlights some of the current progress and open questions in nutrition-related areas of microbiome research. It also underscores the metabolic capabilities of microbial fermentation on nutritional substrates that require further mechanistic understanding and systems biology approaches of studying functional interactions between diet composition, gut microbiota, and host metabolism. Questions surrounding bacterial fermentation and degradation of dietary constituents (particularly by Firmicutes and Bacteroidetes) and deciphering how microbial encoding of enzymes and derived metabolites affect recovery of dietary energy by the host are more complex than previously thought. Moreover, it is essential to understand to what extent the intestinal microbiota is subject to dietary control and to integrate these data with functional metabolic signatures and biomarkers. Many lines of research have demonstrated the significant role of the gut microbiota in human physiology and disease. Probiotic and prebiotic products are proliferating in the market in response to consumer demand, and the science and technology around these products are progressing rapidly. With high-throughput molecular technologies driving the science, studying the bidirectional interactions of host-microbial cometabolism, epithelial cell maturation, shaping of innate immune development, normal vs. dysfunctional nutrient absorption and processing, and the complex signaling pathways involved is now possible. Substantiating the safety and mechanisms of action of probiotic/prebiotic formulations is critical. Beneficial modulation of the human microbiota by using these nutritional and biotherapeutic strategies holds considerable promise as next

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs submitted by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  1. Science for Today's Energy Challenges: Accelerating Progress for a Sustainable Energy Future

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-07-01

    With a growing population and energy demand in the world, there is a pressing need for research to create secure and accessible energy options with greatly reduced emissions of greenhouse gases. While we work to deploy the clean and efficient technologies that we already have--which will be urgent for the coming decades--we must also work to develop the science for the technologies of the future. This brochure gives examples of some of the most promising developments, and it provides 'snapshots' of cutting-edge work by scientists in the field. The areas of greatest promise include biochemistry, nanotechnology, superconductivity, electrophysics and computing. There are many others.

  2. New strategies for the development of H5N1 subtype influenza vaccines: progress and challenges.

    Science.gov (United States)

    Steel, John

    2011-10-01

    The emergence and spread of highly pathogenic avian influenza (H5N1) viruses among poultry in Asia, the Middle East, and Africa have fueled concerns of a possible human pandemic, and spurred efforts towards developing vaccines against H5N1 influenza viruses, as well as improving vaccine production methods. In recent years, promising experimental reverse genetics-derived H5N1 live attenuated vaccines have been generated and characterized, including vaccines that are attenuated through temperature-sensitive mutation, modulation of the interferon antagonist protein, or disruption of the M2 protein. Live attenuated influenza virus vaccines based on each of these modalities have conferred protection against homologous and heterologous challenge in animal models of influenza virus infection. Alternative vaccine strategies that do not require the use of live virus, such as virus-like particle (VLP) and DNA-based vaccines, have also been vigorously pursued in recent years. Studies have demonstrated that influenza VLP vaccination can confer homologous and heterologous protection from lethal challenge in a mouse model of infection. There have also been improvements in the formulation and production of vaccines following concerns over the threat of H5N1 influenza viruses. The use of novel substrates for the growth of vaccine virus stocks has been intensively researched in recent years, and several candidate cell culture-based systems for vaccine amplification have emerged, including production systems based on Madin-Darby canine kidney, Vero, and PerC6 cell lines. Such systems promise increased scalability of product, and reduced reliance on embryonated chicken eggs as a growth substrate. Studies into the use of adjuvants have shown that oil-in-water-based adjuvants can improve the immunogenicity of inactivated influenza vaccines and conserve antigen in such formulations. Finally, efforts to develop more broadly cross-protective immunization strategies through the inclusion

  3. The development of myelin repair agents for treatment of multiple sclerosis: progress and challenges.

    Science.gov (United States)

    Murphy, Robert P; Murphy, Keith J; Pickering, Mark

    2013-01-01

    Multiple Sclerosis (MS) is an inflammatory demyelinating disorder which affects the central nervous system. Multiple sclerosis treatment has traditionally focused on preventing inflammatory damage to the myelin sheath. Indeed, all currently available disease modifying agents are immunomodulators. However, the limitations of this approach are becoming increasingly clear, leading to the exploration of other potential therapeutic strategies. In particular, targeting the endogenous remyelination system to promote replacement of the lost myelin sheath has shown much promise. As our understanding of remyelination biology advances, the realization of a remyelinating therapeutic comes closer to fruition. In our review, we aim to summarize the limitations of the current immune focused treatment strategy and discuss the potential of remyelination as a new treatment method. Finally, we aim to highlight the challenges in the identification and development of such therapeutics.

  4. Sustainable engineered processes to mitigate the global arsenic crisis in drinking water: challenges and progress.

    Science.gov (United States)

    Sarkar, Sudipta; Greenleaf, John E; Gupta, Anirban; Uy, Davin; Sengupta, Arup K

    2012-01-01

    Millions of people around the world are currently living under the threat of developing serious health problems owing to ingestion of dangerous concentrations of arsenic through their drinking water. In many places, treatment of arsenic-contaminated water is an urgent necessity owing to a lack of safe alternative sources. Sustainable production of arsenic-safe water from an arsenic-contaminated raw water source is currently a challenge. Despite the successful development in the laboratory of technologies for arsenic remediation, few have been successful in the field. A sustainable arsenic-remediation technology should be robust, composed of local resources, and user-friendly as well as must attach special consideration to the social, economic, cultural, traditional, and environmental aspects of the target community. One such technology is in operation on the Indian subcontinent. Wide-scale replication of this technology with adequate improvisation can solve the arsenic crisis prevalent in the developing world.

  5. CRISPR/Cas9 for genome editing: progress, implications and challenges.

    Science.gov (United States)

    Zhang, Feng; Wen, Yan; Guo, Xiong

    2014-09-15

    The clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated (Cas) protein 9 system provides a robust and multiplexable genome editing tool, enabling researchers to precisely manipulate specific genomic elements and facilitating the elucidation of target gene function in biology and diseases. CRISPR/Cas9 comprises a nonspecific Cas9 nuclease and a set of programmable sequence-specific CRISPR RNAs (crRNA), which guide Cas9 to cleave DNA and generate double-strand breaks at target sites. The subsequent cellular DNA repair process leads to desired insertions, deletions or substitutions at target sites. The specificity of CRISPR/Cas9-mediated DNA cleavage requires target sequences matching the crRNA and a protospacer adjacent motif (PAM) located downstream of the target sequences. Here, we review the molecular mechanism, applications and challenges of CRISPR/Cas9-mediated genome editing and the clinical therapeutic potential of CRISPR/Cas9 in the future.
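
    To make the targeting rule concrete, the sketch below scans a DNA string for candidate SpCas9 sites: a 20-nt protospacer followed immediately by an NGG PAM. It is a minimal illustration of the matching requirement described above, not a guide-design tool; the function name and the toy sequence are invented for the example.

```python
import re

def find_cas9_sites(seq, guide_len=20):
    """Forward-strand candidate SpCas9 target sites: a protospacer of
    guide_len nucleotides followed immediately by an NGG PAM."""
    pattern = rf"(?=([ACGT]{{{guide_len}}}[ACGT]GG))"  # lookahead permits overlapping sites
    sites = []
    for m in re.finditer(pattern, seq):
        window = m.group(1)
        sites.append((m.start(), window[:guide_len], window[guide_len:]))
    return sites

dna = "TTGACCTGAAGGCATTCGAATCGATCGGTAGCTAGG"
for pos, protospacer, pam in find_cas9_sites(dna):
    print(pos, protospacer, pam)  # position, 20-nt protospacer, PAM
```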

  6. Architecture and signal transduction mechanism of the bacterial chemosensory array: progress, controversies, and challenges.

    Science.gov (United States)

    Falke, Joseph J; Piasta, Kene N

    2014-12-01

    Recent research has deepened our understanding of the ancient, conserved chemosensory array that detects small molecule attractants and repellents, and directs the chemotaxis of bacterial and archaeal cells towards an optimal chemical environment. Here we review advances towards a molecular description of the ultrastable lattice architecture and ultrasensitive signal transduction mechanism of the chemosensory array, as well as controversies and challenges requiring further research. Ultimately, a full molecular understanding of array structure and on-off switching will foster (i) the design of novel therapies that block pathogenic wound seeking and infection, (ii) the development of highly specific, sensitive, stable biosensors, and (iii) the elucidation of general functional principles shared by receptor patches in all branches of life.

  7. Gender and leadership in healthcare administration: 21st century progress and challenges.

    Science.gov (United States)

    Lantz, Paula M

    2008-01-01

    The need for strong leadership and increased diversity is a prominent issue in today's health services workforce. This article reviews the latest literature, including research and proposed agendas, regarding women in executive healthcare leadership. Data suggest that the number of women in leadership roles is increasing, but women remain underrepresented in the top echelons of healthcare leadership, and gender differences exist in the types of leadership roles women do attain. Salary disparity prevails, even when controlling for gender differences in educational attainment, age, and experience. Despite widespread awareness of these problems in the field, current action and policy recommendations are severely lacking. Along with the challenges of cost, quality, and an aging population, the time has come for a more thoughtful, policy-focused approach to amend the discrepancy between gender and leadership in healthcare administration.

  8. Radiotherapy studies and extra-nodal non-Hodgkin lymphomas, progress and challenges

    DEFF Research Database (Denmark)

    Specht, L

    2012-01-01

    for the more common extra-nodal organs, e.g. stomach, Waldeyer's ring, skin and brain, are fairly well known and show significant variation. A few randomised trials have been carried out testing the role of radiotherapy in these lymphomas. However, for most extra-nodal lymphomas, randomised trials have not been carried out, and treatment decisions are made on small patient series and extrapolations from nodal lymphomas. Hopefully, wide international collaboration will make controlled clinical trials possible in the less common extra-nodal lymphomas. Modern highly conformal radiotherapy allows better coverage of extra-nodal lymphomatous involvement with better sparing of normal tissues. The necessary radiation doses and volumes need to be defined for the different extra-nodal lymphoma entities. The challenge is to optimise the use of radiotherapy in the modern multimodality treatment of extra-nodal lymphomas.

  9. The development of capability measures in health economics: opportunities, challenges and progress.

    Science.gov (United States)

    Coast, Joanna; Kinghorn, Philip; Mitchell, Paul

    2015-04-01

    Recent years have seen increased engagement amongst health economists with the capability approach developed by Amartya Sen and others. This paper focuses on the capability approach in relation to the evaluative space used for analysis within health economics. It considers the opportunities that the capability approach offers in extending this space, but also the methodological challenges associated with moving from the theoretical concepts to practical empirical applications. The paper then examines three 'families' of measures, Oxford Capability instruments (OxCap), Adult Social Care Outcome Toolkit (ASCOT) and ICEpop CAPability (ICECAP), in terms of the methodological choices made in each case. The paper concludes by discussing some of the broader issues involved in making use of the capability approach in health economics. It also suggests that continued exploration of the impact of different methodological choices will be important in moving forward.

  10. Modelling gravitational waves from precessing black-hole binaries: Progress, challenges and prospects

    CERN Document Server

    Hannam, Mark

    2013-01-01

    The inspiral and merger of two orbiting black holes is among the most promising sources for the first (hopefully imminent) direct detection of gravitational waves (GWs), and measurements of these signals could provide a wealth of information about astrophysics, fundamental physics and cosmology. Detection and measurement require a theoretical description of the GW signals from all possible black-hole-binary configurations, which can include complicated precession effects due to the black-hole spins. Modelling the GW signal from generic precessing binaries is therefore one of the most urgent theoretical challenges facing GW astronomy. This article briefly reviews the phenomenology of generic-binary dynamics and waveforms, and recent advances in modelling them.

  11. Correlating structure and function of drug-metabolizing enzymes: progress and ongoing challenges.

    Science.gov (United States)

    Johnson, Eric F; Connick, J Patrick; Reed, James R; Backes, Wayne L; Desai, Manoj C; Xu, Lianhong; Estrada, D Fernando; Laurence, Jennifer S; Scott, Emily E

    2014-01-01

    This report summarizes a symposium sponsored by the American Society for Pharmacology and Experimental Therapeutics at Experimental Biology held April 20-24 in Boston, MA. Presentations discussed the status of cytochrome P450 (P450) knowledge, emphasizing advances and challenges in relating structure with function and in applying this information to drug design. First, at least one structure of most major human drug-metabolizing P450 enzymes is known. However, the flexibility of these active sites can limit the predictive value of one structure for other ligands. A second limitation is our coarse-grained understanding of P450 interactions with membranes, other P450 enzymes, NADPH-cytochrome P450 reductase, and cytochrome b5. Recent work has examined differential P450 interactions with reductase in mixed P450 systems and P450:P450 complexes in reconstituted systems and cells, suggesting another level of functional control. In addition, protein nuclear magnetic resonance is a new approach to probe these protein/protein interactions, identifying interacting b5 and P450 surfaces, showing that b5 and reductase binding are mutually exclusive, and demonstrating ligand modulation of CYP17A1/b5 interactions. One desired outcome is the application of such information to control drug metabolism and/or design selective P450 inhibitors. A final presentation highlighted development of a CYP3A4 inhibitor that slows clearance of human immunodeficiency virus drugs otherwise rapidly metabolized by CYP3A4. Although understanding P450 structure/function relationships is an ongoing challenge, translational advances will benefit from continued integration of existing and new biophysical approaches.

  12. Advancing Data Assimilation in Operational Hydrologic Forecasting: Progresses, Challenges, and Emerging Opportunities

    Science.gov (United States)

    Liu, Yuqiong; Weerts, A.; Clark, M.; Hendricks Franssen, H.-J; Kumar, S.; Moradkhani, H.; Seo, D.-J.; Schwanenberg, D.; Smith, P.; van Dijk, A. I. J. M.; hide

    2012-01-01

    Data assimilation (DA) holds considerable potential for improving hydrologic predictions as demonstrated in numerous research studies. However, advances in hydrologic DA research have not been adequately or promptly implemented in operational forecast systems to improve the skill of forecasts for better informed real-world decision making. This is due in part to a lack of mechanisms to properly quantify the uncertainty in observations and forecast models in real-time forecasting situations and to conduct the merging of data and models in a way that is adequately efficient and transparent to operational forecasters. The need for effective DA of useful hydrologic data into the forecast process has become increasingly recognized in recent years. This motivated a hydrologic DA workshop in Delft, the Netherlands in November 2010, which focused on advancing DA in operational hydrologic forecasting and water resources management. As an outcome of the workshop, this paper reviews, in relevant detail, the current status of DA applications in both hydrologic research and operational practices, and discusses the existing or potential hurdles and challenges in transitioning hydrologic DA research into cost-effective operational forecasting tools, as well as the potential pathways and newly emerging opportunities for overcoming these challenges. Several related aspects are discussed, including (1) theoretical or mathematical aspects in DA algorithms, (2) the estimation of different types of uncertainty, (3) new observations and their objective use in hydrologic DA, (4) the use of DA for real-time control of water resources systems, and (5) the development of community-based, generic DA tools for hydrologic applications. It is recommended that cost-effective transition of hydrologic DA from research to operations should be helped by developing community-based, generic modeling and DA tools or frameworks, and through fostering collaborative efforts among hydrologic modellers, DA
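
    As a concrete illustration of the data-model merging that DA performs, and of why the uncertainty estimates discussed above matter, the following is a minimal scalar Kalman analysis step. It is a textbook sketch rather than any of the operational algorithms reviewed in the paper, and the streamflow numbers are invented.

```python
def kalman_update(x_prior, P_prior, y_obs, R, H=1.0):
    """One scalar Kalman analysis step: merge a model forecast x_prior
    (error variance P_prior) with an observation y_obs (error variance R).
    The gain K weights the observation by the relative uncertainties."""
    K = P_prior * H / (H * P_prior * H + R)
    x_post = x_prior + K * (y_obs - H * x_prior)
    P_post = (1.0 - K * H) * P_prior
    return x_post, P_post

# A streamflow forecast of 120 m3/s (variance 400) updated with a gauge
# observation of 100 m3/s (variance 100) moves most of the way toward
# the more certain observation:
print(kalman_update(120.0, 400.0, 100.0, 100.0))  # -> (104.0, 80.0)
```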

  13. Advancing data assimilation in operational hydrologic forecasting: progresses, challenges, and emerging opportunities

    Directory of Open Access Journals (Sweden)

    Y. Liu

    2012-03-01

    Full Text Available Data assimilation (DA) holds considerable potential for improving hydrologic predictions as demonstrated in numerous research studies. However, advances in hydrologic DA research have not been adequately or promptly implemented into operational forecast systems to improve the skill of forecasts to better inform real-world decision making. This is due in part to a lack of mechanisms to properly quantify the uncertainty in observations and forecast models in real-time forecasting situations and to conduct the merging of data and models in a way that is adequately efficient and transparent to operational forecasters.

    The need for effective DA of useful hydrologic data into the forecast process has become increasingly recognized in recent years. This motivated a hydrologic DA workshop in Delft, The Netherlands in November 2010, which focused on advancing DA in operational hydrologic forecasting and water resources management. As an outcome of the workshop, this paper reviews, in relevant detail, the current status of DA applications in both hydrologic research and operational practices, and discusses the existing or potential hurdles and challenges in transitioning hydrologic DA research into cost-effective operational forecasting tools, as well as the potential pathways and newly emerging opportunities for overcoming these challenges. Several related aspects are discussed, including (1) theoretical or mathematical considerations in DA algorithms, (2) the estimation of different types of uncertainty, (3) new observations and their objective use in hydrologic DA, (4) the use of DA for real-time control of water resources systems, and (5) the development of community-based, generic DA tools for hydrologic applications. It is recommended that cost-effective transition of hydrologic DA from research to operations should be helped by developing community-based, generic modelling and DA tools or frameworks, and through fostering collaborative efforts

  15. Evolution of health coverage in Mexico: evidence of progress and challenges in the Mexican health system.

    Science.gov (United States)

    Urquieta-Salomón, José E; Villarreal, Héctor J

    2016-02-01

    Consolidating effective and efficient universal health care coverage requires a deep understanding of the challenges the health care system faces in providing the services that populations in need demand. This study analyses the dynamics of health insurance coverage and effective access coverage to some health interventions in Mexico. It examines the evolution of inequalities and the heterogeneous performance of the insurance subsystems incorporated under the Mexican health care system. Two types of coverage indicators were selected: health insurance and effective access to preventive health interventions intended for the normative population. Data were drawn from the National Health and Nutrition Surveys 2006 and 2012. Economic inequality was estimated using the Standardized Concentration Index, with household per capita consumption expenditure as the socioeconomic-status indicator. Approximately 75% of the population reported being covered by one of the existing insurance schemes, representing a huge step forward from 2006, when as much as 51.62% of the population had no health insurance. About 87% of this growth was attributable to the expansion of Non Contributory Health Insurance, whereas 7% emanated from the Social Security subsystem. The results revealed that inequality in access to health insurance was virtually eradicated; however, traces of unequal access persisted in some subpopulation groups. Coverage indicators of effective access showed a slight improvement in the period analysed, but prenatal care and interventions to prevent chronic disease still presented a serious shortage. Furthermore, there was no evidence that inequities in coverage of these interventions have decreased in recent years. The results provided a mixed picture, generalizable to the system as a whole: the expansion of insurance status represents one of the most remarkable advances, but it has not been accompanied by a significant improvement in effective access. In addition, existing inequalities are

  16. Progress towards computer simulation of NiH2 battery performance over life

    Science.gov (United States)

    Zimmerman, Albert H.; Quinzio, M. V.

    1995-01-01

    The long-term performance of rechargeable battery cells has traditionally been verified through life-testing, a procedure that generally requires significant commitments of funding and test resources. For nickel hydrogen battery cells, which have the capability of providing extremely long cycle life, the time and cost required to conduct even accelerated testing have become a serious impediment to transitioning technology improvements into spacecraft applications. The utilization of computer simulations to indicate the changes in performance to be expected in response to design or operating changes in nickel hydrogen cells is therefore a particularly attractive tool in advanced battery development, as well as for verifying performance in different applications. Computer-based simulations of the long-term performance of rechargeable battery cells have typically had very limited success in the past. There are a number of reasons for the lack of progress in this area. First, and probably most important, all battery cells are relatively complex electrochemical systems, in which performance is dictated by a large number of interacting physical and chemical processes. While the complexity alone is a significant part of the problem, in many instances the fundamental chemical and physical processes underlying long-term degradation and its effects on performance have not even been understood. Second, while specific chemical and physical changes within cell components have been associated with degradation, there has been no generalized simulation architecture that enables the chemical and physical structure (and changes therein) to be translated into cell performance. For the nickel hydrogen battery cell, our knowledge of the underlying reactions that control the performance of this cell has progressed to where it clearly is possible to model them. The recent development of a relatively generalized cell modelling approach provides the framework for translating
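
    The generalized, physics-based simulation architecture discussed above aims to go beyond purely empirical life models. For contrast, a toy empirical capacity-fade law of the kind such first-principles simulations seek to replace might look like the sketch below; the parameters are invented for illustration, not fitted to nickel hydrogen test data.

```python
import math

def capacity_after(n_cycles, c0=1.0, dod=0.35, k=2.0e-5):
    """Toy first-order fade law: usable capacity (as a fraction of the
    initial capacity c0) after n charge/discharge cycles, with the fade
    rate scaled by depth of discharge (DOD). Illustrative values only."""
    return c0 * math.exp(-k * dod * n_cycles)

# e.g. a LEO-like profile at 35% DOD over 30,000 cycles:
print(f"{capacity_after(30000):.2f} of initial capacity")  # ~0.81
```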

  17. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web-based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data volume, higher-order data products, and user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high-performance disk storage (SSD) for the hot areas and less expensive, slower disk for the cold ones, thereby optimizing price-to-performance. From a compute perspective, OT is looking at cloud-based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute-intensive workloads like parallel computation of hydrologic routing on high-resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user
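
    A minimal sketch of the hot/cold tiering idea described above: bin access requests into spatial tiles, rank tiles by request count, and assign the most active fraction to the fast SSD tier. This is a hypothetical illustration of the approach, not OpenTopography's actual implementation.

```python
from collections import Counter

def classify_tiles(access_log, hot_fraction=0.2):
    """Split 1-degree spatial tiles into a 'hot' set (frequently requested,
    kept on fast SSD) and a 'cold' set (rarely requested, kept on cheaper,
    slower disk). access_log is an iterable of (lon, lat) request points."""
    counts = Counter((int(lon), int(lat)) for lon, lat in access_log)
    ranked = counts.most_common()
    n_hot = max(1, int(len(ranked) * hot_fraction))
    hot = {tile for tile, _ in ranked[:n_hot]}
    cold = {tile for tile, _ in ranked[n_hot:]}
    return hot, cold

log = [(-117.2, 32.7)] * 50 + [(-111.9, 33.4)] * 3 + [(-105.3, 40.0)] * 2
hot, cold = classify_tiles(log)
print("SSD tier:", hot, "| slow-disk tier:", cold)
```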

  18. Computer aided surface representation. Progress report, June 1, 1988--May 31, 1989

    Energy Technology Data Exchange (ETDEWEB)

    Barnhill, R.E.

    1989-02-09

    The central research problem of this project is the effective representation and display of surfaces, interpolating to given information, in three or more dimensions. In a typical problem, we wish to create a surface from some discrete information. If this information is itself on another surface, the problem is to determine a "surface defined on a surface," which is discussed below. Often, properties of an already constructed surface are desired: such "geometry processing" is described below. The Summary of Proposed Research from our original proposal describes the aims of this research project. This Summary and the Table of Contents from the original proposal are enclosed as an Appendix to this Progress Report. The broad sweep from constructive mathematics through algorithms and computer graphics displays is utilized in the research. The wide range of activity, directed in both theory and applications, makes this project unique. Last month the first Ardent Titan delivered in the State of Arizona came to our group, funded by the DOE and Arizona State University. Although the Titan is a commercial product, its newness requires our close collaboration with Ardent to maximize results. During the past year, four faculty members and several graduate research assistants have worked on this DOE project. Training new professionals is an important aspect of this project. A listing of the students and their topics is given in the Appendix. The most significant publication during the past year is the book, Curves and Surfaces for Computer Aided Geometric Design, by Dr. Gerald Farin. This 300-page volume helps fill a considerable gap in the subject and includes many new results on Bernstein-Bezier curves and surfaces.
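
    As a small example of the constructive methods this project builds on, the sketch below evaluates a tensor-product Bernstein-Bezier patch by running the de Casteljau algorithm along each row of a control grid and then once more across the results. It is a standard textbook construction, not code from the project itself.

```python
def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t by repeated linear
    interpolation of its control points (given as coordinate tuples)."""
    pts = list(points)
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

def bezier_patch(grid, u, v):
    """Tensor-product Bernstein-Bezier surface: reduce each row of the
    control grid at u, then reduce the resulting column at v."""
    return de_casteljau([de_casteljau(row, u) for row in grid], v)

grid = [[(0, 0, 0), (1, 0, 1), (2, 0, 0)],   # 3x3 control grid of a
        [(0, 1, 1), (1, 1, 2), (2, 1, 1)],   # simple biquadratic patch
        [(0, 2, 0), (1, 2, 1), (2, 2, 0)]]
print(bezier_patch(grid, 0.5, 0.5))  # centre of the patch -> (1.0, 1.0, 1.0)
```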

  19. Progress and challenges for cost effective kerfless Silicon crystal growth for PV application

    Science.gov (United States)

    Serra, J. M.; Alves, J. Maia; Vallera, A. M.

    2017-06-01

    The major barrier to PV penetration is cost, and the single most important cost factor in silicon technology is the wafer (≈35% of the module cost). Although tremendous progress on cell processing has been reported in recent years, a much smaller evolution is seen in what should be the key point to address - the wafer. The ingot-slicing process is reaching its limits as the wafer thickness is reduced in an effort to lower material costs. Kerf losses of ≈50% and an increase in breakage of a high-value-added material are putting a lower bound on this approach. New ideas are therefore needed for producing wafers in a way that overcomes these limitations. In this paper we present three new concepts being developed in our laboratory that have one thing in common: they are all zero-kerf-loss processes, aiming at significant reductions in material loss. One explores the concept of exfoliation; the other two aim at growing silicon directly into ribbons. These were conceived as continuous processes, based on a floating molten zone concept, to avoid impurity contamination during crystallization.
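
    The ≈50% kerf figure quoted above follows from simple sawing geometry: each wafer consumes its own thickness plus one kerf width of the ingot. A short worked example with illustrative (not vendor-specific) numbers of a 180 um wafer and a 150 um kerf:

```python
def sawing_yield(ingot_mm, wafer_um, kerf_um):
    """Number of wafers sawn from an ingot and the fraction of the
    silicon that ends up in wafers; the remainder is kerf loss."""
    pitch = wafer_um + kerf_um            # silicon consumed per wafer
    n = int(ingot_mm * 1000 // pitch)
    return n, n * wafer_um / (ingot_mm * 1000)

n, kept = sawing_yield(ingot_mm=500, wafer_um=180, kerf_um=150)
print(n, f"{kept:.0%} kept, {1 - kept:.0%} lost as sawdust")  # 1515, 55% / 45%
```

    A zero-kerf process, by contrast, would convert essentially all of the grown silicon into wafers or ribbons.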

  20. Mapping cancer cell metabolism with 13C flux analysis: Recent progress and future challenges

    Directory of Open Access Journals (Sweden)

    Casey Scott Duckwall

    2013-01-01

    Full Text Available The reprogramming of energy metabolism is emerging as an important molecular hallmark of cancer cells. Recent discoveries linking specific metabolic alterations to cancer development have strengthened the idea that altered metabolism is more than a side effect of malignant transformation, but may in fact be a functional driver of tumor growth and progression in some cancers. As a result, dysregulated metabolic pathways have become attractive targets for cancer therapeutics. This review highlights the application of 13C metabolic flux analysis (MFA) to map the flow of carbon through intracellular biochemical pathways of cancer cells. We summarize several recent applications of MFA that have identified novel biosynthetic pathways involved in cancer cell proliferation and shed light on the role of specific oncogenes in regulating these pathways. Through such studies, it has become apparent that the metabolic phenotypes of cancer cells are not as homogeneous as once thought, but instead depend strongly on the molecular alterations and environmental factors at play in each case.
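
    Full 13C MFA fits fluxes to isotope-labeling data; at its core, however, sits the metabolic steady-state balance S·v = 0 combined with measured fluxes. The sketch below solves that inference for a hypothetical three-reaction network (stoichiometry and numbers invented) to show how an unmeasured internal flux is recovered.

```python
import numpy as np

# Toy network: A -> B (v1), B -> C (v2), B -> D (v3).
# Steady state on the internal metabolite B gives v1 - v2 - v3 = 0.
S = np.array([[1.0, -1.0, -1.0]])
meas = np.array([[1.0, 0.0, 0.0],   # v1 is measured
                 [0.0, 0.0, 1.0]])  # v3 is measured
y = np.array([10.0, 4.0])           # measured flux values

# Stack the balance and measurement constraints, solve by least squares:
A = np.vstack([S, meas])
b = np.concatenate([np.zeros(1), y])
v, *_ = np.linalg.lstsq(A, b, rcond=None)
print(v)  # -> [10.  6.  4.]: the unmeasured v2 is inferred
```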

  1. [Ribozyme riboswitch based gene expression regulation systems for gene therapy applications: progress and challenges].

    Science.gov (United States)

    Feng, Jing-Xian; Wang, Jia-wen; Lin, Jun-sheng; Diao, Yong

    2014-11-01

    Robust and efficient control of therapeutic gene expression is needed for the timing and dosing of gene therapy drugs in clinical applications. The ribozyme riboswitch provides a promising building block for ligand-controlled gene-regulatory systems, owing to its tunable gene regulation, design modularity, and target specificity. Ribozyme riboswitches can be used in various gene delivery vectors. In recent years, there have been breakthroughs in extending the ribozyme riboswitch's application from gene-expression control to cellular function and fate control. High-throughput screening platforms were established that allow not only rapid optimization of ribozyme riboswitches in a microbial host, but also straightforward transfer of selected devices exhibiting desired activities to mammalian cell lines in a predictable manner. Mathematical models have been employed successfully to explore the performance of ribozyme riboswitches quantitatively and to guide their rational design predictably. However, to progress toward gene therapy relevant applications, both precise rational design of regulatory circuits and the biocompatibility of regulatory ligands are still of crucial importance.
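
    The kind of quantitative model mentioned above can be as simple as a Hill function linking ligand concentration to gene output. Below is a toy model of an OFF-type switch, in which ligand binding promotes self-cleavage of the transcript and so lowers expression; all parameter values are invented for illustration.

```python
def riboswitch_output(ligand, k_max=1.0, k_min=0.05, K=1.0, n=2.0):
    """Hill-type dose-response of a hypothetical OFF-type ribozyme
    riboswitch: output falls from k_max toward k_min as the ligand
    concentration rises past the half-saturation constant K."""
    occupancy = ligand**n / (K**n + ligand**n)
    return k_max - (k_max - k_min) * occupancy

for conc in (0.0, 0.5, 1.0, 2.0, 10.0):
    print(conc, round(riboswitch_output(conc), 3))
```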

  2. Mapping the functional neuroanatomy of spatial neglect and human parietal lobe functions: progress and challenges.

    Science.gov (United States)

    Vuilleumier, Patrik

    2013-08-01

    Spatial neglect is generally defined by various deficits in processing information from one (e.g., left) side of space contralateral to focal (e.g., right) hemisphere damage. Although classically associated with parietal lobe functions, there is now compelling evidence that neglect can follow lesions in many different cortical and subcortical sites, suggesting a dysfunction in distributed brain networks. In addition, neglect is likely to result from a combination of distinct deficits that co-occur due to concomitant damage affecting juxtaposed brain areas and their connections, but the exact nature of core deficits and their neural substrates still remains unclear. The present review describes recent progress in identifying functional components of the neglect syndrome and relating them to distinct subregions of parietal cortex. A comprehensive understanding of spatial neglect will require a more precise definition of cognitive processes implicated in different behavioral manifestations, as well as meticulous mapping of these processes onto specific brain circuits, while taking into account functional changes in activity that may arise in structurally intact areas subsequent to damage in distant portions of the relevant networks. © 2013 New York Academy of Sciences.

  3. [The challenge of education at the IMSS: how to become the advanced guard of institutional progress].

    Science.gov (United States)

    Viniegra Velázquez, Leonardo

    2005-01-01

    This article presents an overview of educational activities at the Instituto Mexicano del Seguro Social and discusses why they have not had a significant impact on health care services. To explore whether the explanation lies in the working environments into which graduates of educational programs are placed, it analyzes the nature of the state, the organizational complexity of its institutions, and how that complexity is legitimated. Organizational complexity is understood to comprise the institution itself, its environments, groups, individuals, and activities, together with the type of interaction fostered by institutional management: management focused on control versus management centered on participation, the latter of which can shape working environments so as to act as a powerful antidote against degradation. The article describes the course of participative education at the IMSS, and how the consolidation of participative education and the operation of the institutional teaching career may become the most powerful lever for improving institutional performance, by contributing to the emergence of stimulating working environments in which activities acquire growing effectiveness and progressively broader reach.

  4. Towards An Oceanographic Component Of A Global Earth Observation System Of Systems: Progress And Challenges

    Science.gov (United States)

    Ackleson, S. G.

    2012-12-01

    Ocean observatories (systems of coordinated sensors and platforms providing real-time in situ observations across multiple temporal and spatial scales) have advanced rapidly during the past several decades with the integration of novel hardware, development of advanced cyber-infrastructures and data management software, and the formation of researcher networks employing fixed, drifting, and mobile assets. These advances have provided persistent, real-time, multi-disciplinary observations representing even the most extreme environmental conditions, enabled unique and informative views of complicated ocean processes, and aided in the development of more accurate and higher fidelity ocean models. Combined with traditional ship-based and remotely sensed observations, ocean observatories have yielded new knowledge across a broad spectrum of earth-ocean scales that would likely not exist otherwise. These developments come at a critical time in human history when the demands of global population growth are creating unprecedented societal challenges associated with rapid climatic change and unsustainable consumption of key ocean resources. Successfully meeting and overcoming these challenges and avoiding the ultimate tragedy of the commons will require greater knowledge of environmental processes than currently exists, including interactions between the ocean, the overlying atmosphere, and the adjacent land, and the synthesis of new knowledge into effective policy and management structures. To achieve this, researchers must have free and ready access to comprehensive data streams (oceanic, atmospheric, and terrestrial), regardless of location and collection system. While the precedent for the concept of free and open access to environmental data is not new (it traces back to the International Geophysical Year, 1957), implementing procedures and standards on a global scale is proving to be difficult, both logistically and politically. Observatories have been implemented in many

  5. Progress and Challenges toward the Development of Vaccines against Avian Infectious Bronchitis

    Science.gov (United States)

    Bande, Faruku; Arshad, Siti Suri; Hair Bejo, Mohd; Moeini, Hassan; Omar, Abdul Rahman

    2015-01-01

    Avian infectious bronchitis (IB) is a widely distributed poultry disease that has a huge economic impact on the poultry industry. The continuous emergence of new IBV genotypes and the lack of cross protection among different IBV genotypes have posed important challenges. Although live attenuated IB vaccines remarkably induce a potent immune response, the potential risk of reversion to virulence, neutralization by maternal antibodies, and recombination and mutation events are important concerns regarding their usage. On the other hand, inactivated vaccines induce a weaker immune response and may require multiple dosing and/or the use of adjuvants, which carry potential safety risks and increased economic burdens. Consequently, alternative IB vaccines are widely sought. Recent advances in recombinant DNA technology have resulted in experimental IB vaccines that show promise in antibody and T-cell responses, comparable to live attenuated vaccines. Recombinant DNA vaccines have also been enhanced to target multiple serotypes, and their efficacy has been improved using delivery vectors, nanoadjuvants, and in ovo vaccination approaches. Although most recombinant IB DNA vaccines are yet to be licensed, it is expected that these types of vaccines may hold sway as future vaccines for inducing cross protection against multiple IBV serotypes. PMID:25954763

  6. Lake Winnipeg Basin: Advocacy, challenges and progress for sustainable phosphorus and eutrophication control.

    Science.gov (United States)

    Ulrich, Andrea E; Malley, Diane F; Watts, Paul D

    2016-01-15

    Intensification of agricultural production worldwide has altered cycles of phosphorus (P) and water. In particular, loading of P on land in fertilizer applications is a global water quality concern. The Lake Winnipeg Basin (LWB) is a major agricultural area displaying extreme eutrophication. We examined the eutrophication problem in the context of the reemerging global concern about future accessibility of phosphate rock for fertilizer production and sustainable phosphorus management. An exploratory action research participatory design was applied to study options for proactivity within the LWB. The multiple methods, including stakeholder interviews and surveys, demonstrate emerging synergies between the goals of reversing eutrophication and promoting food security. Furthermore, shifting the prevalent pollutant-driven eutrophication management paradigm in the basin toward a systemic, holistic and ecocentric approach, integrating global resource challenges, requires a mutual learning process among stakeholders in the basin to act on and adapt to ecosystem vulnerabilities. It is suggested to continue aspects of this research in a transdisciplinary format, i.e., science with society, in response to globally-expanding needs and concerns, with a possible focus on enhanced engagement of indigenous peoples and elders.

  7. Progress and challenges in predictive modeling of runaway electron generation in ITER

    Science.gov (United States)

    Brennan, Dylan; Hirvijoki, Eero; Liu, Chang; Bhattacharjee, Amitava; Boozer, Allen

    2016-10-01

    Among the most important questions raised by a thermal collapse event in ITER are how many seed electrons are available for runaway acceleration and the avalanche process, how collisional and radiative mechanisms will affect the electron acceleration, and which mitigation techniques will be effective. In this study, we use the kinetic equation for electrons and ions to investigate how different cooling scenarios lead to different seed distributions. Given any initial distribution, we study the subsequent avalanche and acceleration to runaway with adjoint and test-particle methods. This method gives an accurate calculation of the runaway threshold by including the collisional drag of background electrons (assuming they are Maxwellian), pitch-angle scattering, and synchrotron and bremsstrahlung radiation. This effort is part of a new large collaboration in the US which promises to contribute substantially to our understanding of these issues. This talk will briefly review how this work contributes to the collaboration, and in particular discuss the technical challenges and open questions that stand in the way of quantitative, predictive modeling of runaway generation in ITER, and how we plan to address them.
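
    A useful scale for the runaway threshold discussed above is the classic Connor-Hastie critical field, at which collisional drag on a relativistic electron balances acceleration by the electric field; the fuller calculation in the study adds pitch-angle scattering and radiation losses on top of this. A quick evaluation, with the density chosen purely for illustration:

```python
import math

e    = 1.602176634e-19    # elementary charge [C]
eps0 = 8.8541878128e-12   # vacuum permittivity [F/m]
m_e  = 9.1093837015e-31   # electron mass [kg]
c    = 2.99792458e8       # speed of light [m/s]

def critical_field(n_e, ln_lambda=15.0):
    """Connor-Hastie critical field E_c = n_e e^3 lnL / (4 pi eps0^2 m_e c^2):
    below E_c, collisional drag prevents any electron from running away."""
    return n_e * e**3 * ln_lambda / (4 * math.pi * eps0**2 * m_e * c**2)

# At n_e = 1e20 m^-3, E_c is only ~0.08 V/m, so even modest induced
# fields after a thermal collapse can exceed the runaway threshold.
print(f"E_c = {critical_field(1e20):.3f} V/m")
```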

  8. RETRIEVING SUSPECT TRANSURANIC (TRU) WASTE FROM THE HANFORD BURIAL GROUNDS: PROGRESS, PLANS & CHALLENGES

    Energy Technology Data Exchange (ETDEWEB)

    FRENCH, M.S.

    2006-02-01

    This paper describes the scope and status of the program for retrieval of suspect transuranic (TRU) waste stored in the Hanford Site low-level burial grounds. Beginning in 1970 and continuing until the late 1980s, waste suspected of containing significant quantities of transuranic isotopes was placed in "retrievable" storage in designated modules in the Hanford burial grounds, with the intent that the waste would be retrieved when a national repository for disposal of such waste became operational. Approximately 15,000 cubic meters of waste, suspected of being TRU, was placed in storage modules in four burial grounds. With the availability of the national repository (the Waste Isolation Pilot Plant), retrieval of the suspect TRU waste is now underway. Retrieval efforts, to date, have been conducted in storage modules that contain waste which is, in general, contact-handled, relatively new (1980s and later), stacked in neat, engineered configurations, and covered by a relatively good record of waste characteristics. Even with these optimum conditions, retrieval personnel have had to deal with a large number of structurally degraded containers, radioactive contamination issues, and industrial hazards (including organic vapors). Future retrieval efforts in older, less engineered modules are expected to present additional hazards and difficult challenges.

  9. Adsorptive removal of antibiotics from water and wastewater: Progress and challenges.

    Science.gov (United States)

    Ahmed, Mohammad Boshir; Zhou, John L; Ngo, Huu Hao; Guo, Wenshan

    2015-11-01

    Antibiotics as emerging contaminants are of global concern due to the development of antibiotic resistance genes potentially causing superbugs. Current wastewater treatment technology cannot sufficiently remove antibiotics from sewage, hence new and low-cost technology is needed. Adsorptive materials have been extensively used for the conditioning, remediation and removal of inorganic and organic hazardous materials, although their application for removing antibiotics has been reported for only ~30 out of 250 antibiotics so far. The literature on the adsorptive removal of antibiotics using different adsorptive materials is summarized and critically reviewed by comparing adsorbents with varying physicochemical characteristics. The efficiency of different adsorbents for removing antibiotics from water and wastewater has been evaluated by examining their adsorption coefficient (Kd) values. For sulfamethoxazole the adsorbents followed the trend: biochar (BC) > multi-walled carbon nanotubes (MWCNTs) > graphite = clay minerals, and for tetracycline they followed the trend: single-walled carbon nanotubes (SWCNTs) > graphite > MWCNTs = activated carbon (AC) > bentonite = humic substances = clay minerals. The underlying controlling parameters for the adsorption technology have been examined. In addition, the cost of preparing adsorbents has been estimated, which followed the order BCs < ACs < ion exchange resins < MWCNTs < SWCNTs. The future research challenges on process integration, production and modification of low-cost adsorbents are elaborated.
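
    The Kd values used for the comparison above come from batch experiments: Kd = q_e / C_e, the ratio of the adsorbed loading to the aqueous concentration remaining at equilibrium. A minimal worked example with invented numbers:

```python
def distribution_coefficient(c0, ce, volume_l, mass_g):
    """Kd from a batch adsorption test: q_e (mg adsorbed per g of
    adsorbent) divided by the equilibrium concentration C_e (mg/L)."""
    q_e = (c0 - ce) * volume_l / mass_g
    return q_e / ce  # L/g

# Hypothetical test: 10 mg/L sulfamethoxazole, 0.1 g biochar in 0.05 L,
# with 2 mg/L left in solution at equilibrium:
kd = distribution_coefficient(c0=10.0, ce=2.0, volume_l=0.05, mass_g=0.1)
print(f"Kd = {kd:.1f} L/g")  # -> 2.0 L/g
```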

  10. Towards a predictive systems-level model of the human microbiome: progress, challenges, and opportunities.

    Science.gov (United States)

    Greenblum, Sharon; Chiu, Hsuan-Chao; Levy, Roie; Carr, Rogan; Borenstein, Elhanan

    2013-08-01

    The human microbiome represents a vastly complex ecosystem that is tightly linked to our development, physiology, and health. Our increased capacity to generate multiple channels of omic data from this system, brought about by recent advances in high throughput molecular technologies, calls for the development of systems-level methods and models that take into account not only the composition of genes and species in a microbiome but also the interactions between these components. Such models should aim to study the microbiome as a community of species whose metabolisms are tightly intertwined with each other and with that of the host, and should be developed with a view towards an integrated, comprehensive, and predictive modeling framework. Here, we review recent work specifically in metabolic modeling of the human microbiome, highlighting both novel methodologies and pressing challenges. We discuss various modeling approaches that lay the foundation for a full-scale predictive model, focusing on models of interactions between microbial species, metagenome-scale models of community-level metabolism, and models of the interaction between the microbiome and the host. Continued development of such models and of their integration into a multi-scale model of the microbiome will lead to a deeper mechanistic understanding of how variation in the microbiome impacts the host, and will promote the discovery of clinically relevant and ecologically relevant insights from the rich trove of data now available. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Global land cover mapping using Earth observation satellite data: Recent progresses and challenges

    Science.gov (United States)

    Ban, Yifang; Gong, Peng; Giri, Chandra

    2015-05-01

    Land cover is an important variable for many studies involving the Earth surface, such as climate, food security, hydrology, soil erosion, atmospheric quality, conservation biology, and plant functioning. Land cover changes not only with human-caused land use changes, but also with nature. Therefore, the state of land cover is highly dynamic. In winter, snow conceals various other land cover types at higher latitudes. Floods may persist for a long period of the year over lowland areas in the tropical and subtropical regions. Forest may be burnt or clear-cut in a few days and change to bare land. Within several months, the coverage of crops may vary from bare land to nearly 100% crops and then back to bare land following harvest. The highly dynamic nature of land cover creates a challenge in mapping and monitoring which remains to be adequately addressed. As economic globalization continues to intensify, there is an increasing trend of land cover/land use change, environmental pollution, land degradation, and biodiversity loss at the global scale; timely and reliable information on global land cover and its changes is therefore urgently needed to mitigate the negative impact of global environmental change.

  12. Carbon nanotube high-performance logic technology - challenges and current progress

    Science.gov (United States)

    Han, Shu-Jen

    2015-03-01

    In the last four decades, we have witnessed a tremendous information technology revolution originating from the relentless scaling of Si complementary metal-oxide semiconductor (CMOS) devices. CMOS scaling provides ever-improved transistor performance, density, power and cost, and will continue to bring new applications and functions to our daily life. However, the conventional homogeneous scaling of silicon devices has become very difficult, firstly due to the unsatisfactory electrostatic control from the gate dielectric. In addition, as we look forward to technology nodes with sub-10 nm channel length, non-Si channel materials will be required to provide continued carrier velocity enhancement when the conventional strained-Si techniques run out of steam. Single-walled carbon nanotubes are promising candidates to replace silicon as the channel material for high-performance electronics near the end of the silicon scaling roadmap, with their superb electrical properties, intrinsically ultrathin body, and nearly transparent contacts with certain metals. This talk discusses recent advances in modeling and experimental work that reveal the properties and potential of ultra-scaled nanotube transistors, separation and assembly techniques for forming nanotube arrays with high semiconducting-nanotube purity and tight pitch separation, and engineering aspects of their implementation in integrated circuits and functional systems. A concluding discussion highlights the most significant challenges from a technology point of view, and provides perspectives on the future of carbon-nanotube-based nanoelectronics.

  13. Recent progressions in stem cell research: breakthroughs achieved and challenges faced.

    Science.gov (United States)

    Tani, Jowy; Umbas, Rainy

    2009-01-01

    Stem cell studies have been conducted to characterize stem cells, to develop better techniques for generating patient-specific stem cell lines, and to explore the therapeutic potential of stem cells. Techniques that enable efficient generation of new stem cell lines would facilitate research and allow generation of patient-specific stem cell lines for transplantation therapy. Somatic-cell nuclear transfer (SCNT), which involves injection of a donor cell nucleus into an enucleated ovum, is the standard technique for generating new embryonic stem (ES) cell lines; presently its efficiency is low. A newer technique, pluripotent stem cell induction, reprograms somatic cells into induced pluripotent stem (iPS) cells by introducing certain factors into somatic cells. While certain adult stem cell treatments have been investigated in human participants, most ES cell or iPS cell treatments are still being tested in animal models. Recently, the therapeutic potential of stem cells for several disorders was demonstrated. Researchers demonstrated stem cells' potential for treating hematologic disorders by correcting sickle cell anemia in a rat model with iPS cells. Their potential role in treating cardiovascular disorders was demonstrated when injection of damaged rat hearts with human ES cell-derived cardiomyocytes plus a "prosurvival" cocktail improved heart function. They might also treat nervous system disorders: injected into the brain, ES cell-derived neurons replaced some lost cells in stroke-model rats, and iPS cell-derived neurons improved parkinsonian syndrome in rats. Progress was also seen in other aspects of regenerative medicine. To overcome controversies caused by embryo destruction for obtaining ES cells, single-blastomere stem cell derivation, Cdx2 inactivation, and parthenogenesis have been proposed. All ES cell, iPS cell, and adult stem cell research should be continued with support from all sides.

  14. Directional backlight liquid crystal autostereoscopic display: technical challenges, research progress, and prospect (Conference Presentation)

    Science.gov (United States)

    Fan, Hang; Li, Kunyang; Zhou, Yangui; Liang, Haowen; Wang, Jiahui; Zhou, Jianying

    2016-09-01

    The recent upsurge in virtual and augmented reality (VR and AR) has re-ignited interest in immersive display technology. VR/AR technology based on stereoscopic display is believed to be in its early stage, as glasses-free, or autostereoscopic, display will ultimately be adopted for viewing convenience, visual comfort and multi-viewer purposes. On the other hand, autostereoscopic displays have not received a positive market response in past years, nor have stereoscopic displays using shutter or polarized glasses. We shall present an analysis of real-world applications and rigid user demands, and of the drawbacks of existing barrier- and lenticular-lens-based LCD autostereoscopy. We shall emphasize the emerging autostereoscopic displays, and notably directional backlight LCD technology using a hybrid spatial- and temporal-control scenario. We report the numerical simulation of a display system using the Monte-Carlo ray-tracing method with the human retina as the real image receiver. The system performance is optimized using a newly developed figure of merit for system design. The reduced crosstalk in an autostereoscopic system and the enhanced display quality, including the high resolution received by the retina and display homogeneity free of Moiré and defect patterns, will be highlighted. Recent research progress, including a novel scheme for diffraction-free backlight illumination, an expanded viewing zone for autostereoscopic display, and a novel Fresnel lens array to achieve a near-perfect display in 2D/3D mode, will be introduced. An experimental demonstration will be presented of autostereoscopic display with the highest resolution, low crosstalk, and freedom from Moiré and defect patterns.
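
    Crosstalk, the key figure optimized above, is conventionally defined as the luminance leaking into the unintended eye normalized by the intended-eye signal, both corrected for the display's black level. A one-line sketch with illustrative luminance values:

```python
def crosstalk(lum_intended, lum_leak, lum_black=0.0):
    """Standard 3D-display crosstalk: leakage luminance seen by the
    unintended eye over the intended-eye signal, black-level corrected."""
    return (lum_leak - lum_black) / (lum_intended - lum_black)

print(f"{crosstalk(200.0, 6.0, 1.0):.1%}")  # -> 2.5% crosstalk
```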

  15. Orphan drugs in development for primary biliary cirrhosis: challenges and progress

    Directory of Open Access Journals (Sweden)

    Ali AH

    2015-09-01

    Full Text Available Ahmad H Ali (1), Thomas J Byrne (1), Keith D Lindor (1,2); 1: Division of Gastroenterology and Hepatology, Mayo Clinic; 2: College of Health Solutions, Arizona State University, Phoenix, AZ, USA. Abstract: Primary biliary cirrhosis (PBC) is a chronic progressive liver disease that often leads to fibrosis, cirrhosis, and end-stage liver disease. The diagnosis is made when there is evidence of cholestasis and reactivity to the antimitochondrial antibody. The etiology of PBC is poorly understood; however, several lines of evidence suggest an environmental factor that triggers a series of immune-mediated inflammatory reactions in the bile ducts in a genetically susceptible individual. Fatigue and pruritus are the most common symptoms of PBC; however, many patients are diagnosed with PBC based only on laboratory abnormalities. The only pharmacological treatment approved for PBC is ursodeoxycholic acid (UDCA). Several controlled studies have shown that UDCA improves liver biochemistries and prolongs transplant-free survival in PBC patients. Nearly 40% of PBC patients do not respond to UDCA, and those patients are at high risk of serious adverse events, such as the development of liver failure. Therefore, newer alternative therapeutic options for PBC are needed. Obeticholic acid is a first-in-class farnesoid X receptor agonist that has been recently evaluated in PBC patients with inadequate response to UDCA, and demonstrated beneficial results in improving liver biochemistries. Several other agents (fibrates and glucocorticoids) have been previously examined in PBC patients with inadequate response to UDCA, and preliminary results showed biochemical improvement. However, large-scale controlled clinical trials are needed to determine the long-term effects of fibrates and glucocorticoids on the clinical outcomes of PBC. Clinical trials of NGM282 (a fibroblast growth factor-19 analog) and abatacept (a fusion protein composed of the Fc portion of immunoglobulin G1 fused to

  16. Flood risk assessment at the regional scale: Computational challenges and the monster of uncertainty

    Science.gov (United States)

    Efstratiadis, Andreas; Papalexiou, Simon-Michael; Markonis, Yiannis; Koukouvinos, Antonis; Vasiliades, Lampros; Papaioannou, George; Loukas, Athanasios

    2016-04-01

    We present a methodological framework for flood risk assessment at the regional scale, developed within the implementation of the EU Directive 2007/60 in Greece. This comprises three phases: (a) statistical analysis of extreme rainfall data, resulting in spatially-distributed parameters of intensity-duration-frequency (IDF) relationships and their confidence intervals, (b) hydrological simulations, using event-based semi-distributed rainfall-runoff approaches, and (c) hydraulic simulations, employing the propagation of flood hydrographs across the river network and the mapping of inundated areas. The flood risk assessment procedure is employed over the River Basin District of Thessaly, Greece, which requires schematization and modelling of hundreds of sub-catchments, each one examined for several risk scenarios. This is a challenging task, involving multiple computational issues, such as the organization, control and processing of huge amounts of hydrometeorological and geographical data, the configuration of model inputs and outputs, and the co-operation of several software tools. In this context, we have developed supporting applications allowing massive data processing and effective model coupling, thus drastically reducing the need for manual interventions and, consequently, the time of the study. Within the flood risk computations we also account for three major sources of uncertainty, in an attempt to provide upper and lower confidence bounds of flood maps: (a) statistical uncertainty of IDF curves, (b) structural uncertainty of hydrological models, due to varying antecedent soil moisture conditions, and (c) parameter uncertainty of hydraulic models, with emphasis on roughness coefficients. Our investigations indicate that the combined effect of the above uncertainties (which are certainly not the only ones) results in extremely large bounds of potential inundation, thus raising many questions about the interpretation and usefulness of current flood
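
    As an illustration of phase (a), the sketch below fits a generalized extreme value (GEV) distribution to hypothetical annual rainfall maxima and bootstraps rough confidence bounds on a 100-year return level. The data, sample size, and confidence levels are invented for illustration; the study's actual IDF estimation is more elaborate.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(42)

        # Hypothetical annual-maximum 24 h rainfall depths (mm) for one station.
        annual_max = rng.gumbel(loc=60.0, scale=15.0, size=40)

        def return_level(sample, T=100):
            """Fit a GEV and return the T-year return level."""
            shape, loc, scale = genextreme.fit(sample)
            return genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

        best = return_level(annual_max)

        # Nonparametric bootstrap for rough confidence bounds.
        boot = np.array([
            return_level(rng.choice(annual_max, size=annual_max.size, replace=True))
            for _ in range(500)
        ])
        lo, hi = np.percentile(boot, [5, 95])
        print(f"100-year rainfall: {best:.1f} mm (90% CI: {lo:.1f}-{hi:.1f} mm)")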

  17. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  18. Improving the Performance of Phase-Change Perfluorocarbon Droplets for Medical Ultrasonography: Current Progress, Challenges, and Prospects

    Directory of Open Access Journals (Sweden)

    Paul S. Sheeran

    2014-01-01

    Full Text Available Over the past two decades, perfluorocarbon (PFC) droplets have been investigated for biomedical applications across a wide range of imaging modalities. More recently, interest has increased in “phase-change” PFC droplets (or “phase-change” contrast agents), which can convert from liquid to gas with an external energy input. In the field of ultrasound, phase-change droplets present an attractive alternative to traditional microbubble agents for many diagnostic and therapeutic applications. Despite the progress, phase-change PFC droplets remain far from clinical implementation due to a number of challenges. In this review, we survey our recent work to enhance the performance of phase-change agents for ultrasound through a variety of techniques, in order to provide increased efficacy in therapeutic applications of ultrasound and enable previously unexplored applications in diagnostic and molecular imaging.

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  20. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  1. Development and implementation of energy efficiency standards and labeling programs in China: Progress and challenges

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Nan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Khanna, Nina Zheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fridley, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Romankiewicz, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-01-31

    relatively simple techno-economic analyses used to determine its efficiency standards levels rather than the specific sets of analyses and tools used internationally. Based on international experience, inclusion of more detailed energy consumption surveys in the Chinese national census surveys and statistical reporting systems could help provide the necessary data for more comprehensive standard-setting analyses. In terms of stakeholder participation in the standards development process, stakeholder participation in China is limited to membership on the technical committees responsible for developing or revising standards, and generally does not include environmental groups, consumer associations, utilities and other NGOs. Broadening stakeholder involvement to wider interest groups could help garner more support and feedback in the S&L implementation process. China has emerged as a leader with a national verification testing scheme and complementary pilot check-testing projects, but it still faces challenges: insufficient funding, low awareness among some local regulatory agencies, resistance to check-testing by some manufacturers, limited product sampling scope, and inconsistency and incomparability of testing results. Thus, further financial and staff resources and capacity building will be needed to overcome these remaining challenges and to expand impact evaluations to assess the actual effectiveness of implementation and enforcement.

  2. Progress and challenges associated with digitizing and serving up Hawaii's geothermal data

    Science.gov (United States)

    Thomas, D. M.; Lautze, N. C.; Abdullah, M.

    2012-12-01

    This presentation will report on the status of our effort to digitize and serve up Hawaii's geothermal information, an undertaking that commenced in 2011 and will continue through at least 2013. This work is part of a national project that is funded by the Department of Energy and managed by the Arizona State Geology Survey (AZGS). The data submitted to AZGS is being entered into the National Geothermal Data System (see http://www.stategeothermaldata.org/overview). We are also planning to host the information locally. The main facets of this project are to:
    - digitize and generate metadata for non-published geothermal documents relevant to the State of Hawaii;
    - digitize ~100 years of paper records relevant to well permitting and water resources development and serve up information on the ~4500 water wells in the state;
    - digitize, organize, and serve up information on research and geothermal exploratory drilling conducted from the 1980s to the present;
    - work with AZGS and OneGeology to contribute a geologic map for Hawaii that integrates geologic and geothermal resource data.
    By December 2012, we anticipate that the majority of the digitization will be complete, the geologic map will be approved, and over 1000 documents will be hosted online through the University of Hawaii's library system (in the "Geothermal Collection" within the "Scholar Space" repository, see http://scholarspace.manoa.hawaii.edu/handle/10125/21320). Developing a user-friendly web interface for the water well and drilling data will be a main task in the coming year. Challenges we have faced and anticipate include: 1) ensuring that no personally identifiable information (e.g. SSN, private telephone numbers, bank or credit account) is contained in the geothermal documents and well files; 2) Homeland Security regulations regarding release of information on critical infrastructure related to municipal water supply systems; 3) maintenance of the well database as future well data are developed with

  3. microRNAs as Potential Biomarkers in Adrenocortical Cancer: Progress and Challenges

    Science.gov (United States)

    Cherradi, Nadia

    2016-01-01

    Adrenocortical carcinoma (ACC) is a rare malignancy with poor prognosis and limited therapeutic options. Over the last decade, pan-genomic analyses of genetic and epigenetic alterations and genome-wide expression profile studies allowed major advances in the understanding of the molecular genetics of ACC. Besides the well-known dysfunctional molecular pathways in adrenocortical tumors, such as the IGF2 pathway, the Wnt pathway, and TP53, high-throughput technologies enabled a more comprehensive genomic characterization of adrenocortical cancer. Integration of expression profile data with exome sequencing, SNP array analysis, methylation, and microRNA (miRNA) profiling led to the identification of subgroups of malignant tumors with distinct molecular alterations and clinical outcomes. miRNAs post-transcriptionally silence their target gene expression either by degrading mRNA or by inhibiting translation. Although our knowledge of the contribution of deregulated miRNAs to the pathogenesis of ACC is still in its infancy, recent studies support their relevance in gene expression alterations in these tumors. Some miRNAs have been shown to carry potential diagnostic and prognostic values, while others may be good candidates for therapeutic interventions. With the emergence of disease-specific blood-borne miRNAs signatures, analyses of small cohorts of patients with ACC suggest that circulating miRNAs represent promising non-invasive biomarkers of malignancy or recurrence. However, some technical challenges still remain, and most of the miRNAs reported in the literature have not yet been validated in sufficiently powered and longitudinal studies. In this review, we discuss the current knowledge regarding the deregulation of tumor-associated and circulating miRNAs in ACC patients, while emphasizing their potential significance in pathogenic pathways in light of recent insights into the role of miRNAs in shaping the tumor microenvironment. PMID:26834703

  4. MicroRNAs as potential biomarkers in adrenocortical cancer: progress and challenges

    Directory of Open Access Journals (Sweden)

    Nadia eCHERRADI

    2016-01-01

    Full Text Available Adrenocortical carcinoma is a rare malignancy with poor prognosis and limited therapeutic options. Over the last decade, pan-genomic analyses of genetic and epigenetic alterations and genome-wide expression profile studies allowed major advances in the understanding of the molecular genetics of adrenocortical carcinoma. Besides the well-known dysfunctional molecular pathways in adrenocortical tumors such as the IGF2 pathway, the Wnt pathway and TP53, high-throughput technologies enabled a more comprehensive genomic characterization of adrenocortical cancer. Integration of expression profile data with exome sequencing, SNP array analysis, methylation and microRNA profiling led to the identification of subgroups of malignant tumors with distinct molecular alterations and clinical outcomes. MicroRNAs post-transcriptionally silence their target gene expression either by degrading mRNA or by inhibiting translation. Although our knowledge of the contribution of deregulated microRNAs to the pathogenesis of adrenocortical carcinoma is still in its infancy, recent studies support their relevance in gene expression alterations in these tumors. Some microRNAs have been shown to carry potential diagnostic and prognostic values while others may be good candidates for therapeutic interventions. With the emergence of disease-specific blood-borne microRNAs signatures, analyses of small cohorts of patients with adrenocortical carcinoma suggest that circulating microRNAs represent promising non-invasive biomarkers of malignancy or recurrence. However, some technical challenges still remain, and most of the microRNAs reported in the literature have not yet been validated in sufficiently powered and longitudinal studies. In this review, we discuss the current knowledge regarding the deregulation of tumor-associated and circulating microRNAs in adrenocortical carcinoma patients, while emphasizing their potential significance in adrenocortical carcinoma pathogenic

  5. Progress and Future Challenges of Human Induced Pluripotent Stem Cells in Regenerative Medicine

    Directory of Open Access Journals (Sweden)

    Anna Meiliana

    2011-08-01

    Full Text Available BACKGROUND: Less than a decade ago the prospect of reprogramming the human somatic cell looked bleak at best. It seemed that the only methods at our disposal for the generation of human isogenic pluripotent cells would have to involve somatic cell nuclear transfer (SCNT). Then, in August 2006, Shinya Yamanaka's publication in Cell promised to change everything by showing that it was apparently very simple to revert the phenotype of a differentiated cell to a pluripotent one by overexpressing four transcription factors in murine fibroblasts. CONTENT: Mouse and human somatic cells can be genetically reprogrammed into induced pluripotent stem cells (iPSCs) by the expression of a defined set of factors (Oct4, Sox2, c-Myc, and Klf4, as well as Nanog and LIN28). iPSCs have been generated from mouse and human fibroblasts, as well as from mouse liver, stomach, and pancreatic cells, neural stem cells, and keratinocytes. The similarity of iPSCs and embryonic stem cells (ESCs) has been demonstrated in their morphology, global expression profiles, and epigenetic status, as well as in the in vitro and in vivo differentiation potential of both mouse and human cells. Many techniques for human iPSC (hiPSC) derivation have been developed in recent years, utilizing different starting cell types, vector delivery systems, and culture conditions. A refined or perfected combination of these techniques might prove to be the key to generating clinically applicable hiPSCs. SUMMARY: iPSCs are a revolutionary tool for generating in vitro models of human diseases and may help us to understand the molecular basis of epigenetic reprogramming. Progress over the last four years has been truly amazing, almost verging on science fiction, but if we can learn to produce such cells cheaply and easily, and control their differentiation, our efforts to understand and fight disease will become more accessible, controllable and tailored. The ability to safely and efficiently derive hiPSCs may be of decisive importance to

  6. MIT Laboratory for Computer Science Progress Report 26. Final technical report, July 1988-June 1989

    Energy Technology Data Exchange (ETDEWEB)

    Dertouzos, M.L.

    1989-06-01

    Contents: advanced network architecture; clinical decision making; computer architecture group; computation structures; information mechanics; mercury; parallel processing; programming methodology; programming systems research; spoken language systems; systematic program development; theory of computation; theory of distributed systems.

  7. Laboratory for Computer Science progress report 24, July 1986-June 1987

    Energy Technology Data Exchange (ETDEWEB)

    1987-06-01

    The work reported here was carried out within the Laboratory for Computer Science, an MIT interdepartmental laboratory. Partial contents include: Clinical Decision Making; An Artificial Intelligence Approach to Clinical Decision Making; A Program for the Management of Heart Failure; Computation Structures; Computer Languages and Systems; Computer Applications; Distributed Computing; Bulk Data Transfer Protocol; Information Mechanics; The CAM-7 Multiprocessor; Fluid Dynamics; Texture-Locked Loops; Mercury; Parallel Processing; Computer Programming Methodology; Real Time Systems. A list of publications follows this report.

  8. Security in Cloud Computing For Service Delivery Models: Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Preeti Barrow

    2016-04-01

    Full Text Available Cloud computing, undoubtedly, is a path to expand limits or add powerful capabilities on demand with almost no investment in new infrastructure, training of new staff, or licensing of new software. Though today everyone is talking about the cloud, organizations are still in a dilemma about whether it is safe to deploy their business on the cloud. The reason behind this is nothing but security. No cloud service provider provides a 100% security assurance to its customers, and therefore businesses are hesitant to accept the cloud and the vast benefits that come along with it. The absence of proper security controls delimits the benefits of the cloud. In this paper, a review of different cloud service models and a survey of the security challenges and issues in providing services in the cloud are presented. The paper focuses on the security issues specific to the service delivery models (SaaS, IaaS and PaaS) of the cloud environment. It also explores the various security solutions currently being applied to protect the cloud from various kinds of intruders.

  9. Challenges in clinical applications of brain computer interfaces in individuals with spinal cord injury

    Directory of Open Access Journals (Sweden)

    Rüdiger eRupp

    2014-09-01

    Full Text Available Brain computer interfaces (BCIs) are devices that measure brain activity and translate it into control signals used for a variety of applications. Among them are systems for communication, environmental control, neuroprostheses, exoskeletons or restorative therapies. Over the last years the technology of BCIs has reached a level of maturity allowing them to be used not only in research experiments supervised by scientists, but also in clinical routine with patients with neurological impairments, supervised by clinical personnel or caregivers. However, clinicians and patients face many challenges in the application of BCIs. This particularly applies to patients with high spinal cord injury, in whom artificial ventilation, autonomic dysfunctions, neuropathic pain or the inability to achieve a sufficient level of control during short-term training may limit the successful use of a BCI. Additionally, spasmolytic medication and the acute stress reaction, with associated episodes of depression, may have a negative influence on the modulation of brain waves and therefore on the ability to concentrate over an extended period of time. Although BCIs seem to be a promising assistive technology for individuals with high spinal cord injury, systematic investigations are highly needed to obtain realistic estimates of the percentage of users that, for any reason, may not be able to operate a BCI in a clinical setting.

  10. Challenges in clinical applications of brain computer interfaces in individuals with spinal cord injury.

    Science.gov (United States)

    Rupp, Rüdiger

    2014-01-01

    Brain computer interfaces (BCIs) are devices that measure brain activity and translate it into control signals used for a variety of applications. Among them are systems for communication, environmental control, neuroprostheses, exoskeletons, or restorative therapies. Over the last years the technology of BCIs has reached a level of maturity allowing them to be used not only in research experiments supervised by scientists, but also in clinical routine with patients with neurological impairments, supervised by clinical personnel or caregivers. However, clinicians and patients face many challenges in the application of BCIs. This particularly applies to patients with high spinal cord injury, in whom artificial ventilation, autonomic dysfunctions, neuropathic pain, or the inability to achieve a sufficient level of control during short-term training may limit the successful use of a BCI. Additionally, spasmolytic medication and the acute stress reaction, with associated episodes of depression, may have a negative influence on the modulation of brain waves and therefore on the ability to concentrate over an extended period of time. Although BCIs seem to be a promising assistive technology for individuals with high spinal cord injury, systematic investigations are highly needed to obtain realistic estimates of the percentage of users that, for any reason, may not be able to operate a BCI in a clinical setting.
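
    Many noninvasive BCIs of the kind described here derive control signals from band-limited EEG power. The sketch below is a minimal, hypothetical illustration (synthetic data, a single channel, an invented threshold), not a clinical pipeline: it estimates 8-12 Hz mu-band power with Welch's method and maps it to a binary command.

        import numpy as np
        from scipy.signal import welch

        FS = 250  # sampling rate (Hz), typical for EEG amplifiers

        def mu_band_power(eeg_window):
            """Estimate 8-12 Hz (mu) band power of one EEG channel via Welch's method."""
            freqs, psd = welch(eeg_window, fs=FS, nperseg=FS)
            band = (freqs >= 8) & (freqs <= 12)
            return np.trapz(psd[band], freqs[band])

        # Hypothetical 2 s window of single-channel EEG (noise stands in for real data).
        rng = np.random.default_rng(1)
        window = rng.standard_normal(2 * FS)

        # A motor-imagery BCI might map suppressed mu power to a binary command;
        # the threshold would be calibrated per user in practice.
        THRESHOLD = 0.5
        command = "move" if mu_band_power(window) < THRESHOLD else "rest"
        print(command)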

  11. Evolution of the health sector response to HIV in Myanmar: progress, challenges and the way forward.

    Science.gov (United States)

    Oo, Htun Nyunt; Hone, San; Fujita, Masami; Maw-Naing, Amaya; Boonto, Krittayawan; Jacobs, Marjolein; Phyu, Sabe; Bollen, Phavady; Cheung, Jacquie; Aung, Htin; Aung Sang, May Thu; Myat Soe, Aye; Pendse, Razia; Murphy, Eamonn

    2016-11-28

    Critical building blocks for the response to HIV were put in place up to 2012, despite a series of political, social and financial challenges. A rapid increase in HIV service coverage was observed from 2012 to 2015 through the collaborative efforts of government and non-governmental organisations (NGOs). Government facilities, in particular, demonstrated their capacity to expand services for antiretroviral therapy (ART), prevention of mother-to-child transmission (PMTCT) of HIV, tuberculosis and HIV co-infection, and methadone-maintenance therapy (MMT). Nearly three decades into the response to HIV, Myanmar has adopted strategies to provide the right interventions to the right people in the right places to maximise impact and cost efficiency. In particular, the country is now using strategic information to classify areas into high-, medium- and low-HIV burden and risk of new infections for geographical prioritisation, as HIV remains concentrated among key population (KP) groups in specific geographical areas. Ways forward include:
    • Addressing structural barriers for KPs to access services, and identifying and targeting KPs at higher risk;
    • Strengthening the network of public facilities, NGOs and general practitioners, and introducing a case management approach to assist KPs and other clients with unknown HIV status, HIV-negative clients and newly diagnosed clients to access health services across the continuum, to increase the number of people testing for HIV and to reduce loss to follow-up in both prevention and treatment;
    • Increasing the availability of HIV testing and counselling services for KPs, clients of female sex workers (FSW) and other populations at risk, and raising the demand for timely testing, including expansion of outreach and client-initiated voluntary counselling and testing (VCT) services;
    • Monitoring and maximising retention from HIV diagnosis to ART initiation, and expanding quality HIV laboratory services, especially viral load

  12. Progress and Challenges in Developing Reference Data Layers for Human Population Distribution and Built Infrastructure

    Science.gov (United States)

    Chen, R. S.; Yetman, G.; de Sherbinin, A. M.

    2015-12-01

    Understanding the interactions between environmental and human systems, and in particular supporting the applications of Earth science data and knowledge in place-based decision making, requires systematic assessment of the distribution and dynamics of human population and the built human infrastructure in conjunction with environmental variability and change. The NASA Socioeconomic Data and Applications Center (SEDAC) operated by the Center for International Earth Science Information Network (CIESIN) at Columbia University has had a long track record in developing reference data layers for human population and settlements and is expanding its efforts on topics such as intercity roads, reservoirs and dams, and energy infrastructure. SEDAC has set as a strategic priority the acquisition, development, and dissemination of data resources derived from remote sensing and socioeconomic data on urban land use change, including temporally and spatially disaggregated data on urban change and rates of change, the built infrastructure, and critical facilities. We report here on a range of past and ongoing activities, including the Global Human Settlements Layer effort led by the European Commission's Joint Research Centre (JRC), the Global Exposure Database for the Global Earthquake Model (GED4GEM) project, the Global Roads Open Access Data Working Group (gROADS) of the Committee on Data for Science and Technology (CODATA), and recent work with ImageCat, Inc. to improve estimates of the exposure and fragility of buildings, road and rail infrastructure, and other facilities with respect to selected natural hazards. New efforts such as the proposed Global Human Settlement indicators initiative of the Group on Earth Observations (GEO) could help fill critical gaps and link potential reference data layers with user needs. We highlight key sectors and themes that require further attention, and the many significant challenges that remain in developing comprehensive, high quality

  13. OneGeology-Europe - The Challenges and progress of implementing a basic geological infrastructure for Europe

    Science.gov (United States)

    Asch, Kristine; Tellez-Arenas, Agnes

    2010-05-01

    OneGeology-Europe is making geological spatial data held by the geological surveys of Europe more easily discoverable and accessible via the internet. This will provide a fundamental scientific layer for the European Plate Observation System. Rich geological data assets exist in the geological survey of each individual EC Member State, but they are difficult to discover and are not interoperable. For those outside the geological surveys they are not easy to obtain, to understand or to use. Geological spatial data is essential to the prediction and mitigation of landslides, subsidence, earthquakes, flooding and pollution. These issues are global in nature, and their profile has also been raised by the OneGeology global initiative for the International Year of Planet Earth 2008. Geology is also a key dataset in the EC INSPIRE Directive, where it is fundamental to the themes of natural risk zones, energy and mineral resources. The OneGeology-Europe project is delivering a web-accessible, interoperable geological spatial dataset for the whole of Europe at the 1:1 million scale, based on existing data held by the European geological surveys. Proof of concept will be applied to key areas at a higher resolution, and some geological surveys will deliver their data at high resolution. An important part of the project is developing a European specification for basic geological map data and making significant progress towards harmonising the dataset (an essential first step to addressing harmonisation at higher data resolutions). The project is accelerating the development and deployment of a nascent international interchange standard for geological data, GeoSciML, which will enable the sharing and exchange of the data within and beyond the geological community in Europe and globally. The geological dataset for the whole of Europe is not a centralized database but a distributed system. Each geological survey implements and hosts an interoperable web service, delivering their national harmonized
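
    Since each survey exposes its harmonized data through an interoperable web service, a client can assemble the distributed dataset with standard OGC requests. The sketch below issues a plain WMS 1.1.1 GetMap request with Python's requests library; the endpoint URL and layer name are invented placeholders, not real OneGeology-Europe services.

        import requests

        # Hypothetical endpoint: each national survey hosts its own WMS; this URL
        # and layer name are placeholders for illustration only.
        WMS_URL = "https://example-survey.eu/geoserver/wms"

        params = {
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "geology_1M",        # placeholder layer name
            "SRS": "EPSG:4326",
            "BBOX": "5.0,47.0,15.0,55.0",  # lon/lat bounding box
            "WIDTH": "800",
            "HEIGHT": "640",
            "FORMAT": "image/png",
        }

        resp = requests.get(WMS_URL, params=params, timeout=30)
        resp.raise_for_status()
        with open("geology.png", "wb") as f:
            f.write(resp.content)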

  14. A challenging case of rapid progressive Kaposi sarcoma after renal transplantation: diagnostics by FDG PET/CT.

    Science.gov (United States)

    Reuter, Stefan; Vrachimis, Alexis; Huss, Sebastian; Wardelmann, Eva; Weckesser, Mathias; Pavenstädt, Hermann

    2014-09-01

    De-novo malignancy is a serious posttransplant complication. While the incidence of Kaposi sarcoma (KS) is low, it is typically diagnosed early after renal transplantation, usually from the classical skin lesions. We herein report an unusual case of rapidly progressive KS without skin lesions in a 52-year-old patient, leading to death within 8 months after kidney transplantation. This striking case illustrates the usefulness of [18F]2-fluoro-2-deoxy-D-glucose positron emission tomography/computed tomography for demonstrating the cause of an unexplained deterioration of a patient's condition. Early identification of KS is critical because early (modification of) therapy can substantially improve the patient's prognosis.

  15. Computer Literacy of Iranian Teachers of English as a Foreign Language: Challenges and Obstacles

    Science.gov (United States)

    Dashtestani, Reza

    2014-01-01

    Basically, one of the requirements for the implementation of computer-assisted language learning (CALL) is English as a foreign language (EFL) teachers' ability to use computers effectively. Educational authorities and planners should identify EFL teachers' computer literacy levels and make attempts to improve the teachers' computer competence.…

  16. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  17. Quantification of progression and regression of descending thoracic aortic wall thickness by enhanced computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Yokoyama, Kenichi; Takasu, Junichiro; Yamamoto, Rie; Taguchi, Rie; Itani, Yasutaka; Ito, Yuichi; Watanabe, Shigeru; Masuda, Yoshiaki [Chiba Univ. (Japan). School of Medicine

    2001-04-01

    The purpose of this study was to verify the usefulness of quantifying aortic wall involvement by enhanced computed tomography (CT). One hundred thirteen Japanese patients underwent two enhanced CT examinations of the descending thoracic aorta at an interval. We sliced the descending thoracic aorta continuously from the level of the tracheal bifurcation at 1 cm intervals, and we defined aortic wall volume (AWV) (cm³) as the sum of the aortic wall area, including calcification, over seven slices. The average AWV increased from 7.95±2.92 cm³ to 8.70±2.98 cm³. The rate of change of AWV (ΔAWV) was 0.270±0.281 cm³/year. ΔAWV did not have a significant correlation with any risk factor at baseline. ΔAWV had a significant correlation with total cholesterol, low-density lipoprotein cholesterol (LDL-C) and the LDL-C/high-density lipoprotein cholesterol (HDL-C) ratio at follow-up, and by multivariate analysis with only the LDL-C/HDL-C ratio. ΔAWV was not correlated with the intake of hypoglycemic, antihypertensive or lipid-lowering drugs. The cut-off level of total cholesterol with the most significant odds ratio for progression of the aortic wall was 190 mg/dl, and that of LDL-C was 130 mg/dl. This method proved to be useful for the non-invasive assessment of aortic wall thickness. (author)
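
    The AWV measure reduces to simple arithmetic over segmented slice areas. The sketch below (with invented example values; real inputs would come from CT segmentation) computes AWV from seven 1 cm slices and the annualized progression rate ΔAWV between two examinations, following the definition above.

        import numpy as np

        CM_SLICE_SPACING = 1.0  # slices taken at 1 cm intervals

        def aortic_wall_volume(slice_areas_cm2):
            """AWV (cm^3): sum of wall areas over the seven contiguous slices."""
            areas = np.asarray(slice_areas_cm2, dtype=float)
            assert areas.size == 7, "method uses seven 1 cm slices below the bifurcation"
            return areas.sum() * CM_SLICE_SPACING

        def delta_awv_per_year(awv_baseline, awv_followup, years_between):
            """Annualized progression rate of aortic wall volume."""
            return (awv_followup - awv_baseline) / years_between

        # Hypothetical example: baseline vs. 3-year follow-up wall areas (cm^2).
        baseline = aortic_wall_volume([1.1, 1.2, 1.1, 1.0, 1.2, 1.2, 1.2])
        followup = aortic_wall_volume([1.2, 1.3, 1.2, 1.1, 1.3, 1.3, 1.3])
        print(f"dAWV = {delta_awv_per_year(baseline, followup, 3.0):.3f} cm^3/year")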

  18. Computational intelligence in gait research: a perspective on current applications and future challenges.

    Science.gov (United States)

    Lai, Daniel T H; Begg, Rezaul K; Palaniswami, Marimuthu

    2009-09-01

    Our mobility is an important daily requirement so much so that any disruption to it severely degrades our perceived quality of life. Studies in gait and human movement sciences, therefore, play a significant role in maintaining the well-being of our mobility. Current gait analysis involves numerous interdependent gait parameters that are difficult to adequately interpret due to the large volume of recorded data and lengthy assessment times in gait laboratories. A proposed solution to these problems is computational intelligence (CI), which is an emerging paradigm in biomedical engineering most notably in pathology detection and prosthesis design. The integration of CI technology in gait systems facilitates studies in disorders caused by lower limb defects, cerebral disorders, and aging effects by learning data relationships through a combination of signal processing and machine learning techniques. Learning paradigms, such as supervised learning, unsupervised learning, and fuzzy and evolutionary algorithms, provide advanced modeling capabilities for biomechanical systems that in the past have relied heavily on statistical analysis. CI offers the ability to investigate nonlinear data relationships, enhance data interpretation, design more efficient diagnostic methods, and extrapolate model functionality. These are envisioned to result in more cost-effective, efficient, and easy-to-use systems, which would address global shortages in medical personnel and rising medical costs. This paper surveys current signal processing and CI methodologies followed by gait applications ranging from normal gait studies and disorder detection to artificial gait simulation. We review recent systems focusing on the existing challenges and issues involved in making them successful. We also examine new research in sensor technologies for gait that could be combined with these intelligent systems to develop more effective healthcare solutions.
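
    As a concrete instance of the supervised-learning paradigm surveyed here, the sketch below trains a support vector machine on synthetic gait features with scikit-learn. The feature names, group means, and sample sizes are invented for illustration and are not drawn from any of the surveyed studies.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(7)

        # Hypothetical feature matrix: rows are trials, columns are gait parameters
        # (e.g. minimum toe clearance, stride interval, stance/swing ratio).
        n = 200
        X_healthy = rng.normal([1.5, 1.05, 0.65], 0.1, size=(n, 3))
        X_at_risk = rng.normal([1.1, 1.15, 0.72], 0.1, size=(n, 3))
        X = np.vstack([X_healthy, X_at_risk])
        y = np.array([0] * n + [1] * n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        clf.fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")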

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing, with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  20. FY 1992 Blue Book: Grand Challenges: High Performance Computing and Communications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — High performance computing and computer communications networks are becoming increasingly important to scientific advancement, economic competition, and national...

  1. FY 1993 Blue Book: Grand Challenges 1993: High Performance Computing and Communications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — High performance computing and computer communications networks are becoming increasingly important to scientific advancement, economic competition, and national...

  2. Research in progress at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1987-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1987 through October 1, 1987.

  3. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 1 progress report.

    Energy Technology Data Exchange (ETDEWEB)

    Lottes, S.A.; Bojanowski, C.; Shen, J.; Xie, Z.; Zhai, Y. (Energy Systems); (Turner-Fairbank Highway Research Center)

    2012-04-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of October through

  4. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 2 progress report

    Energy Technology Data Exchange (ETDEWEB)

    Lottes, S.A.; Bojanowski, C.; Shen, J.; Xie, Z.; Zhai, Y. (Energy Systems); (Turner-Fairbank Highway Research Center)

    2012-06-28

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of January through

  5. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC year 1 quarter 4 progress report.

    Energy Technology Data Exchange (ETDEWEB)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C. (Energy Systems)

    2011-12-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability

  6. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 1 quarter 3 progress report.

    Energy Technology Data Exchange (ETDEWEB)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C. (Energy Systems)

    2011-08-26

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water loads on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project

  7. Migrating Educational Data and Services to Cloud Computing: Exploring Benefits and Challenges

    Science.gov (United States)

    Lahiri, Minakshi; Moseley, James L.

    2013-01-01

    "Cloud computing" is currently the "buzzword" in the Information Technology field. Cloud computing facilitates convenient access to information and software resources as well as easy storage and sharing of files and data, without the end users being aware of the details of the computing technology behind the process. This…

  8. Computer usage among nurses in rural health-care facilities in South Africa: obstacles and challenges.

    Science.gov (United States)

    Asah, Flora

    2013-04-01

    This study discusses factors inhibiting computer usage for work-related tasks among computer-literate professional nurses within rural health-care facilities in South Africa. In the past two decades computer literacy courses have not been part of the nursing curricula. Computer courses are offered by the State Information Technology Agency. Despite this, there seems to be limited use of computers by professional nurses in the rural context. Focus group interviews were held with 40 professional nurses from three government hospitals in northern KwaZulu-Natal. Contributing factors were found to be a lack of information technology infrastructure, restricted access to computers, and deficits in technical and nursing management support. The physical location of computers within the health-care facilities and the lack of relevant software emerged as specific obstacles to usage. Provision of continuous and active support from nursing management could positively influence computer usage among professional nurses. A closer integration of information technology and computer literacy skills into existing nursing curricula would foster a positive attitude towards computer usage through early exposure. Responses indicated that a change of mindset may be needed on the part of nursing management so that they begin to actively promote ready access to computers as a means of creating greater professionalism and collegiality. © 2011 Blackwell Publishing Ltd.

  9. microRNAs with different functions and roles in disease development and as potential biomarkers of diabetes: progress and challenges.

    Science.gov (United States)

    Seyhan, Attila A

    2015-05-01

    Biomarkers provide information for the early detection of diseases, for determining individuals at risk of developing complications, or for subtyping individuals by disease phenotype. In addition, biomarkers may lead to better treatment strategies, personalized therapy, and improved outcomes. A major gap in the field of biomarker development is that we have not identified appropriate (minimally invasive, lifestyle-independent and informative) biomarkers for the underlying disease process(es) that can be measured in readily accessible samples (e.g. serum, plasma, blood, urine). miRNAs function as regulators of wide-ranging cellular and physiological functions and also participate in many physiopathological processes, and thus have been linked to many diseases including diabetes, metabolic and cardiovascular diseases, cancer, neurodegenerative diseases, and autoimmunity. Many miRNAs have been shown to have predictive value as potential biomarkers in a variety of diseases including diabetes, and are detectable in some instances many years before the manifestation of disease. Although some technical challenges still remain, owing to their availability in the circulation, relative stability, and ease of detection, miRNAs have emerged as a promising new class of biomarkers to provide information on early detection of disease, to monitor disease progression, to determine an individual's risk of developing complications, to subtype individuals by disease phenotype, and to monitor response to therapeutic interventions. As a final note, most of the miRNAs reported in the literature have not yet been validated in sufficiently powered and longitudinal studies for specificity for a particular disease.
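
    To ground the notion of "predictive value", the sketch below scores a single hypothetical circulating marker with the area under the ROC curve. The expression distributions and cohort sizes are synthetic stand-ins; a validation study of the kind the review calls for would also report confidence intervals and prospective performance.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)

        # Hypothetical circulating-miRNA expression levels (arbitrary units):
        # controls vs. patients, with partially overlapping distributions.
        controls = rng.normal(1.0, 0.4, size=120)
        patients = rng.normal(1.6, 0.5, size=80)

        levels = np.concatenate([controls, patients])
        labels = np.concatenate([np.zeros_like(controls), np.ones_like(patients)])

        # AUC quantifies how well the single marker separates the two groups.
        auc = roc_auc_score(labels, levels)
        print(f"single-marker AUC: {auc:.2f}")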

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  11. Study on the Challenges Facing Computer Education and Reform Measures under the Background of Informatization

    Institute of Scientific and Technical Information of China (English)

    蔡丽霞

    2014-01-01

    With the continuous development and progress of informatization, people's clothing, food, housing, and transportation are all closely tied to computers. Against this background, computer education faces major challenges: the traditional mode of computer education can no longer meet teaching needs and must be reformed. Starting from the perspective of computer education in the information age, this paper first analyzes the challenges facing computer education and then, on that basis, proposes corresponding reform measures aimed at raising the level of computer education.

  12. Error suppression and error correction in adiabatic quantum computation I: techniques and challenges

    OpenAIRE

    Young, Kevin C.; Sarovar, Mohan; Blume-Kohout, Robin

    2013-01-01

    Adiabatic quantum computation (AQC) is known to possess some intrinsic robustness, though it is likely that some form of error correction will be necessary for large scale computations. Error handling routines developed for circuit-model quantum computation do not transfer easily to the AQC model since these routines typically require high-quality quantum gates, a resource not generally allowed in AQC. There are two main techniques known to suppress errors during an AQC implementation: energy...

  13. Research in progress and other activities of the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1993-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  14. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    Science.gov (United States)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  15. Progress and challenges in implementing HIV care and treatment policies in Latin America following the treatment 2.0 initiative.

    Science.gov (United States)

    Perez, Freddy; Gomez, Bertha; Ravasi, Giovanni; Ghidinelli, Massimo

    2015-12-19

    the use of the WHO-preferred first-line regimen, a 51% increase in the use of WHO-recommended second-line regimens, and a significant reduction in the use of obsolete drugs in first- and second-line regimens (1% and 9% of regimens, respectively, in 2013). A relatively good level of progress was perceived on the recommendations related to optimization of ART regimens. Challenges remain in improving recommendations related to health-system strengthening and in the promotion and support of community-based organizations as part of the response to HIV/AIDS in Latin America. The JRMs are a useful mechanism for providing coherent technical support to guide countries in the pursuit of a comprehensive response to HIV/AIDS in the Latin American region.

  16. Service-oriented computing : State of the art and research challenges

    NARCIS (Netherlands)

    Papazoglou, Michael P.; Traverso, Paolo; Dustdar, Schahram; Leymann, Frank

    2007-01-01

    Service-oriented computing promotes the idea of assembling application components into a network of services that can be loosely coupled to create flexible, dynamic business processes and agile applications that span organizations and computing platforms. An SOC research road map provides a context

  17. Developmental Systems Toxicology: computer simulation in a ‘Virtual Embryo’ prototype (SEURAT-1 Progress Meeting)

    Science.gov (United States)

    Evaluating and assessing impacts to development is an Agency priority (EPA’s Children’s Environmental Health Research Roadmap); however, the quantity of chemicals needing assessment and challenges of species extrapolation require alternative approaches to traditional animal studi...

  18. Advanced methods for the computation of particle beam transport and the computation of electromagnetic fields and beam-cavity interactions. Progress report, July 1993--August 1994

    Energy Technology Data Exchange (ETDEWEB)

    Dragt, A.J.; Gluckstern, R.L.

    1994-08-01

    The University of Maryland Dynamical Systems and Accelerator Theory Group has been carrying out long-term research work in the general area of Dynamical Systems with a particular emphasis on applications to Accelerator Physics. This work is broadly divided into two tasks: the computation of charged particle beam transport and the computation of electromagnetic fields and beam-cavity interactions. Each of these tasks is described briefly. Work is devoted both to the development of new methods and the application of these methods to problems of current interest in accelerator physics including the theoretical performance of present and proposed high energy machines. In addition to its research effort, the Dynamical Systems and Accelerator Theory Group is actively engaged in the education of students and postdoctoral research associates. Substantial progress in research has been made during the past year. These achievements are summarized in the following report.

  19. Computer-aided detection systems to improve lung cancer early diagnosis: state-of-the-art and challenges

    Science.gov (United States)

    Traverso, A.; Lopez Torres, E.; Fantacci, M. E.; Cerello, P.

    2017-05-01

    Lung cancer is one of the most lethal types of cancer, largely because it is too rarely diagnosed early. In fact, the detection of pulmonary nodules, potential lung cancers, in Computed Tomography scans is a very challenging and time-consuming task for radiologists. To support radiologists, researchers have developed Computer-Aided Diagnosis (CAD) systems for the automated detection of pulmonary nodules in chest Computed Tomography scans. Despite the high level of technological development and the proven benefits for overall detection performance, the use of Computer-Aided Diagnosis in clinical practice is far from being a common procedure. In this paper we investigate the causes underlying this discrepancy and present a solution to tackle it: the M5L Web- and Cloud-based on-demand Computer-Aided Diagnosis. In addition, we show how the combination of traditional image-processing techniques with state-of-the-art classification algorithms allows a system to be built whose performance can far exceed that of any Computer-Aided Diagnosis system developed so far. This outcome opens the possibility of using CAD as clinical decision support for radiologists.
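
    The abstract gives no implementation details; purely as an illustration of the two-stage pattern it describes (candidate detection by classical image processing, followed by a trained classifier), a minimal sketch might look like the following. The volume is synthetic and every name and threshold is hypothetical; none of this comes from M5L.

```python
# Hypothetical two-stage CAD sketch: classical candidate detection
# followed by a statistical classifier. Not the M5L implementation.
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

def detect_candidates(volume, hu_threshold=-400):
    """Stage 1: threshold the CT volume and label connected components."""
    mask = volume > hu_threshold
    labels, n = ndimage.label(mask)
    feats = []
    for region in range(1, n + 1):
        voxels = (labels == region)
        feats.append([voxels.sum(), volume[voxels].mean()])  # size, mean HU
    return np.array(feats)

# Synthetic stand-in for a chest CT volume (air ~ -1000 HU).
rng = np.random.default_rng(0)
volume = rng.normal(-900, 50, size=(32, 32, 32))
volume[10:13, 10:13, 10:13] = 40.0    # a dense fake nodule
volume[20:22, 5:8, 5:8] = -200.0      # a fainter, vessel-like structure

X = detect_candidates(volume)
y = (X[:, 1] > -100).astype(int)      # toy labels for illustration only
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(X))                 # Stage 2: classify the candidates
```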

  20. Computational algebraic geometry for statistical modeling FY09Q2 progress.

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, David C.; Rojas, Joseph Maurice; Pebay, Philippe Pierre

    2009-03-01

    This is a progress report on polynomial system solving for statistical modeling. This quarter we have developed our first model of shock response data and an algorithm for identifying the chamber cone containing a polynomial system in n variables with n+k terms within polynomial time - a significant improvement over previous algorithms, all having exponential worst-case complexity. We have implemented and verified the chamber cone algorithm for n+3 and are working to extend the implementation to handle arbitrary k. Later sections of this report explain chamber cones in more detail; the next section provides an overview of the project and how the current progress fits into it.

  1. Energy-Efficient Management of Data Center Resources for Cloud Computing: A Vision, Architectural Elements, and Open Challenges

    CERN Document Server

    Buyya, Rajkumar; Abawajy, Jemal

    2010-01-01

    Cloud computing is offering utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables hosting of pervasive applications from consumer, scientific, and business domains. However, data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and a large carbon footprint. Therefore, we need Green Cloud computing solutions that not only save energy for the environment but also reduce operational costs. This paper presents the vision, challenges, and architectural elements for energy-efficient management of Cloud computing environments. We focus on the development of dynamic resource provisioning and allocation algorithms that consider the synergy between the various data center infrastructures (i.e., hardware, power units, cooling, and software) and work holistically to boost data center energy efficiency and performance. In particular, this paper proposes (a) architectural principles for energy-efficient management of ...
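
    The paper's provisioning algorithms are not spelled out in the abstract. As a minimal sketch of the kind of energy-aware heuristic it alludes to, the best-fit-decreasing placement below consolidates VMs onto as few hosts as possible so that idle hosts can be powered down; the capacity model and all numbers are assumptions, not the authors' algorithm.

```python
# Sketch of an energy-aware VM placement heuristic (best-fit decreasing):
# pack VMs onto as few hosts as possible so idle hosts can be powered
# down. Hypothetical model, not the algorithm from the paper.
from dataclasses import dataclass, field

@dataclass
class Host:
    capacity_mips: float
    used_mips: float = 0.0
    vms: list = field(default_factory=list)

def place_vms(vm_demands, hosts):
    """Assign each VM to the active host with the least remaining headroom
    that still fits it; spill onto a fresh host only when necessary."""
    for demand in sorted(vm_demands, reverse=True):
        candidates = [h for h in hosts if h.capacity_mips - h.used_mips >= demand]
        if not candidates:
            raise RuntimeError("insufficient capacity")
        best = min(candidates, key=lambda h: h.capacity_mips - h.used_mips)
        best.used_mips += demand
        best.vms.append(demand)
    return [h for h in hosts if h.vms]   # hosts that must stay powered on

hosts = [Host(capacity_mips=1000) for _ in range(5)]
active = place_vms([300, 700, 200, 450, 250], hosts)
print(f"{len(active)} of {len(hosts)} hosts active")   # 2 of 5 here
```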

  2. Modeling and Simulation of Scalable Cloud Computing Environments and the CloudSim Toolkit: Challenges and Opportunities

    CERN Document Server

    Buyya, Rajkumar; Calheiros, Rodrigo N

    2009-01-01

    Cloud computing aims to power the next generation of data centers and enables application service providers to lease data center capabilities for deploying applications depending on user QoS (Quality of Service) requirements. Cloud applications have different composition, configuration, and deployment requirements. Quantifying the performance of resource allocation policies and application scheduling algorithms at a fine level of detail in Cloud computing environments, for different application and service models, under varying load, energy performance (power consumption, heat dissipation), and system size, is a challenging problem to tackle. To simplify this process, in this paper we propose CloudSim: an extensible simulation toolkit that enables modelling and simulation of Cloud computing environments. The CloudSim toolkit supports modelling and creation of one or more virtual machines (VMs) on a simulated node of a Data Center, jobs, and their mapping to suitable VMs. It also allows simulation of multiple Data Centers to...
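
    CloudSim itself is a Java toolkit; the toy Python model below only illustrates the kind of entities a cloud simulator tracks (VMs on a simulated node, and jobs mapped to suitable VMs). All class and method names here are hypothetical and do not reflect CloudSim's actual API.

```python
# Toy model of the entities a cloud simulator tracks: VMs provisioned in a
# data center and cloudlets (jobs) mapped to them. Hypothetical names;
# this is not CloudSim's Java API.
class Vm:
    def __init__(self, vm_id, mips):
        self.vm_id, self.mips, self.queue = vm_id, mips, []

class Datacenter:
    def __init__(self, vms):
        self.vms = vms

    def submit(self, cloudlet_length):
        # Map the cloudlet to the least-loaded VM (simple load balancing).
        vm = min(self.vms, key=lambda v: sum(v.queue) / v.mips)
        vm.queue.append(cloudlet_length)

    def finish_times(self):
        # Each VM executes its queue sequentially at `mips` instructions/s.
        return {v.vm_id: sum(v.queue) / v.mips for v in self.vms}

dc = Datacenter([Vm(0, mips=1000), Vm(1, mips=500)])
for length in [40000, 20000, 30000, 10000]:
    dc.submit(length)
print(dc.finish_times())   # simulated completion time per VM, in seconds
```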

  3. Complex Brain Networks: Progress and Challenges

    Institute of Scientific and Technical Information of China (English)

    张方风; 郑志刚

    2012-01-01

    Progress in the study of complex networks and its application to brain networks is reviewed, covering the construction of brain networks, structural networks, functional networks, and the relationship between brain structure and function. Based on complex-network theory, analyses of anatomical and functional brain networks have revealed several important topological properties, such as small-worldness, scale-freeness, modularity, and the distribution of hub regions. New findings also link cognitive function and neuropsychiatric disorders to abnormalities in functional connectivity and to changes in topological structure. The review summarizes research achievements at home and abroad in this field, raises several challenges and key issues that should be addressed, and outlines future directions.
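
    As a concrete illustration of the small-world and hub analyses mentioned above, the sketch below compares clustering and characteristic path length against a size-matched random graph and ranks nodes by betweenness centrality. It runs on a synthetic graph, not real connectivity data, and the sigma index used here is one common small-world measure, not necessarily the one used in the reviewed studies.

```python
# Small-world check of the kind applied to brain connectivity graphs:
# compare clustering and characteristic path length against a
# size-matched random graph, then rank candidate hub nodes.
# Synthetic graph, not real imaging data.
import networkx as nx

g = nx.watts_strogatz_graph(n=90, k=6, p=0.1, seed=1)   # stand-in "brain" network
rand = nx.gnm_random_graph(g.number_of_nodes(), g.number_of_edges(), seed=1)

def char_path_length(graph):
    # use the largest connected component in case the graph is disconnected
    comp = graph.subgraph(max(nx.connected_components(graph), key=len))
    return nx.average_shortest_path_length(comp)

C, C_rand = nx.average_clustering(g), nx.average_clustering(rand)
L, L_rand = char_path_length(g), char_path_length(rand)
sigma = (C / C_rand) / (L / L_rand)    # sigma >> 1 suggests small-worldness
print(f"C={C:.3f} vs {C_rand:.3f}, L={L:.2f} vs {L_rand:.2f}, sigma={sigma:.2f}")

bc = nx.betweenness_centrality(g)      # hubs: high-centrality nodes
hubs = sorted(bc, key=bc.get, reverse=True)[:5]
print("candidate hub nodes:", hubs)
```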

  4. Progress in Blind Quantum Computation

    Institute of Scientific and Technical Information of China (English)

    王帮海; 徐海茹

    2015-01-01

    Blind quantum computation, which combines notions from quantum cryptography and quantum computation, allows a client with limited or even no quantum computational power to carry out a quantum computation with the help of an untrusted quantum server while keeping the client's algorithm and data private. This article reviews the principles and unconditional security of blind quantum computation, surveys several universal blind quantum computation protocols and analyzes their efficiency, introduces the physical implementation of blind quantum computation based on measurement-based computation techniques, and finally discusses the future development and applications of blind quantum computation.
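
    As a toy illustration of the angle-blinding idea behind measurement-based blind quantum computation protocols (in the spirit of the BFK protocol), the sketch below shows, with classical arithmetic only, that the measurement angle announced to the server is uniformly distributed whatever the client's secret angle; no quantum simulation is attempted, and the discretization into eighths of a turn is the usual convention, not a detail taken from this review.

```python
# Toy illustration of measurement-angle blinding in measurement-based
# blind quantum computation: the client announces
# delta = phi + theta + r*pi, where the preparation angle theta and the
# bit r are secret and random, so delta alone reveals nothing about the
# true computation angle phi. Angles are in units of pi/4 (0..7).
import random
from collections import Counter

def blinded_angle(phi_eighths):
    theta = random.randrange(8)    # hidden in the client's prepared qubit
    r = random.randint(0, 1)       # hides the measurement outcome
    return (phi_eighths + theta + 4 * r) % 8

for phi in (0, 3):                 # two different secret computation angles
    dist = Counter(blinded_angle(phi) for _ in range(80000))
    print(sorted(dist.values()))   # ~10000 per angle: uniform either way
```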

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  6. Computer aided surface representation. Progress report, June 1, 1989--May 31, 1990

    Energy Technology Data Exchange (ETDEWEB)

    Barnhill, R.E.

    1990-02-19

    The central research problem of this project is the effective representation, computation, and display of surfaces interpolating to information in three or more dimensions. If the given information is located on another surface, then the problem is to construct a "surface defined on a surface". Sometimes properties of an already defined surface are desired, which is "geometry processing". Visualization of multivariate surfaces is possible by means of contouring higher dimensional surfaces. These problems and more are discussed below. The broad sweep from constructive mathematics through computational algorithms to computer graphics illustrations is utilized in this research. The breadth and depth of this research activity makes this research project unique.

  7. The vertical monitor position for presbyopic computer users with progressive lenses: how to reach clear vision and comfortable head posture.

    Science.gov (United States)

    Weidling, Patrick; Jaschinski, Wolfgang

    2015-01-01

    When presbyopic employees wear general-purpose progressive lenses, they have clear vision of the computer monitor only at a lower gaze inclination, given that the head assumes a comfortable inclination. Therefore, in the present intervention field study the monitor position was lowered, also with the aim of reducing musculoskeletal symptoms. A comparison group comprised users of lenses that do not restrict the field of clear vision. The lower monitor positions led the participants to lower their head inclination, which was linearly associated with a significant reduction in musculoskeletal symptoms. However, for progressive lenses a lower head inclination means a lower zone of clear vision, so clear vision of the complete monitor was not achieved; the monitor should have been placed even lower. The procedures of this study may be useful for optimising the individual monitor position depending on the comfortable head and gaze inclination and the vertical zone of clear vision of progressive lenses. For users of general-purpose progressive lenses, it is suggested that low monitor positions allow both clear vision at the monitor and a physiologically favourable head inclination.

  8. Computational Fluid Dynamics in Solid Earth Sciences-a HPC challenge

    OpenAIRE

    Vlad Constantin Manea; Marina Manea; Mihai Pomeran; Lucian Besutiu; Luminita Zlagnean

    2012-01-01

    The Solid Earth Sciences have recently begun moving towards implementing High Performance Computing (HPC) research facilities. One of the key tenets of HPC is performance, which strongly depends on the interaction between software and hardware. This paper presents benchmark results from two HPC systems. Testing a Computational Fluid Dynamics (CFD) code specific to the Solid Earth Sciences, the HPC system Horus, based on Gigabit Ethernet, performed reasonably well compared with...

  9. E-Mail Systems In Cloud Computing Environment Privacy,Trust And Security Challenges

    Directory of Open Access Journals (Sweden)

    Maha Attia

    2016-07-01

    Full Text Available In this paper, SMCSaaS is proposed to secure e-mail systems based on a Web Service and Cloud Computing model. The model offers the end-to-end security, privacy, and non-repudiation of PKI without the associated infrastructure complexity. The proposed model controls risks in Cloud Computing such as insecure application programming interfaces, malicious insiders, data loss or leakage, shared technology vulnerabilities, account, service, and traffic hijacking, and unknown risk profile.

  10. Trends and Progress in Reducing Teen Birth Rates and the Persisting Challenge of Eliminating Racial/Ethnic Disparities.

    Science.gov (United States)

    Ngui, Emmanuel M; Greer, Danielle M; Bridgewater, Farrin D; Salm Ward, Trina C; Cisler, Ron A

    2017-08-01

    We examined progress made by the Milwaukee community toward achieving the Milwaukee Teen Pregnancy Prevention Initiative's aggressive 2008 goal of reducing the teen birth rate to 30 live births/1000 females aged 15-17 years by 2015. We further examined differential teen birth rates in disparate racial and ethnic groups. We analyzed teen birth count data from the Wisconsin Interactive Statistics on Health system and demographic data from the US Census Bureau. We computed annual 2003-2014 teen birth rates for the city and four racial/ethnic groups within the city (white non-Hispanic, black non-Hispanic, Hispanic/Latina, Asian non-Hispanic). To compare birth rates from before (2003-2008) and after (2009-2014) goal setting, we used a single-system design with two time-series analysis approaches: the celeration line and three-standard-deviation (3SD) bands. Milwaukee's teen birth rate dropped 54% from 54.3 in 2003 to 23.7 births/1000 females in 2014, surpassing the goal of 30 births/1000 females 3 years ahead of schedule. Rate reduction following goal setting was statistically significant, as five of the six post-goal data points were located below the celeration line and the points for six consecutive years (2010-2014) fell below the 3SD band. All racial/ethnic groups demonstrated significant reductions through at least one of the two time-series approaches. The gap between white and both black and Hispanic/Latina teens widened. A significant reduction has occurred in the overall teen birth rate of Milwaukee. Achieving an aggressive reduction in teen births highlights the importance of collaborative community partnerships in setting and tracking public health goals.
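
    For readers unfamiliar with the single-system tests named above, the sketch below implements the 3SD-band check on made-up rates: the band is derived from the pre-goal (2003-2008) series, and post-goal points falling below it count as significant change. The numbers are illustrative, not the Milwaukee data.

```python
# Sketch of the 3SD-band test used in single-system designs: compute the
# mean and SD of the pre-intervention series, then flag post-intervention
# points falling below mean - 3*SD. Rates below are invented.
import statistics

pre  = [54.3, 52.1, 50.8, 49.9, 48.5, 47.2]   # births/1000, 2003-2008 (made up)
post = [44.0, 39.5, 35.2, 31.0, 27.4, 23.7]   # births/1000, 2009-2014 (made up)

mean, sd = statistics.mean(pre), statistics.stdev(pre)
lower_band = mean - 3 * sd
below = [rate for rate in post if rate < lower_band]
print(f"lower 3SD band = {lower_band:.1f}; "
      f"{len(below)} of {len(post)} post-goal points fall below it")
```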

  11. Perspectives on Games, Computers, and Mental Health: Questions about Paradoxes, Evidences, and Challenges.

    Science.gov (United States)

    Desseilles, Martin

    2016-01-01

    In the field of mental health, games and computerized games raise questions about paradoxes, evidence, and challenges. This perspective article offers personal perspectives and opinions on these questions, with the objective of presenting several ideas and issues in this rapidly developing field. First, games raise questions in the sense of the paradox between a game and an issue, as well as the paradox of using an amusing game to treat a serious pathology. Second, games also present evidence in the sense that they involve relationships with others, as well as learning, communication, language, emotional regulation, and hedonism. Third, games present challenges, such as the risk of abuse, the critical temporal period that may be limited to childhood, their important influence on sociocognitive learning and the establishment of social norms, and the risk of misuse of games.

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  13. Using inverse electrocardiography to image myocardial infarction--reflecting on the 2007 PhysioNet/Computers in Cardiology Challenge.

    Science.gov (United States)

    Dawoud, Fady; Wagner, Galen S; Moody, George; Horácek, B Milan

    2008-01-01

    The goal of the 2007 PhysioNet/Computers in Cardiology Challenge was to establish how well the location and extent of old myocardial infarcts can be characterized using electrocardiographic evidence supplemented by anatomical imaging information. A brief overview of the challenge and of how different challengers approached the competition is provided, followed by a detailed account of the first author's approach to integrating electrophysiologic and anatomical data. The first author used the provided 120-electrode body-surface potential mapping data and magnetic resonance heart and torso images to calculate epicardial potentials on customized ventricular geometries. A method was developed to define the location and extent of scar tissue based on the morphology of the computed epicardial electrograms. A negative Q-wave deflection followed by an R-wave on the left ventricular surface appeared to correspond with the location of the scar as determined by the gadolinium-enhanced magnetic resonance imaging gold standard in the supplied data sets. The method shows promising results as a noninvasive imaging tool for quantitatively characterizing chronic infarcts and warrants further investigation on a larger patient data set.
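
    As an illustration of the morphology rule described above (a negative Q-wave deflection followed by an R-wave on a computed epicardial electrogram), the sketch below flags that pattern on synthetic signals; the window fraction and amplitude threshold are assumptions, not values from the challenge entry.

```python
# Sketch of the scar-detection rule described above: flag electrograms
# whose initial deflection is negative (Q wave) and is followed by a
# positive R wave. Synthetic signals; thresholds are illustrative.
import numpy as np

def has_qr_morphology(egm, q_frac=0.3, amp_thresh=0.1):
    """True if the earliest `q_frac` of the trace dips below -amp_thresh
    and a later sample rises above +amp_thresh."""
    split = int(len(egm) * q_frac)
    early, late = egm[:split], egm[split:]
    return early.min() < -amp_thresh and late.max() > amp_thresh

t = np.linspace(0, 1, 200)
scar_like = (-0.8 * np.exp(-((t - 0.1) / 0.05) ** 2)   # negative Q deflection
             + 0.5 * np.exp(-((t - 0.5) / 0.05) ** 2)) # later R wave
normal = (1.0 * np.exp(-((t - 0.3) / 0.05) ** 2)       # dominant positive wave
          - 0.2 * np.exp(-((t - 0.6) / 0.1) ** 2))
print(has_qr_morphology(scar_like), has_qr_morphology(normal))  # True False
```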

  14. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    Science.gov (United States)

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurement of mathematics once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM, AIMSweb Math Computation, AIMSweb Math Concepts/Applications) was collected for a maximum total of 250 third, fourth, and fifth grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained on a single point of assessment just prior to the administration of the state assessment.
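
    The analysis pattern described above (per-student slopes from monthly probes, then the incremental variance explained beyond a single screening point) can be sketched as follows on simulated scores; the data-generating numbers are invented, and significance testing is omitted for brevity.

```python
# Sketch of the progress-monitoring analysis: fit a per-student slope
# across 7 monthly probes, then ask whether slope adds predictive value
# for the state-test score beyond the final screening score.
# All data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(42)
n_students, months = 250, np.arange(7)

true_level = rng.normal(500, 50, n_students)
true_growth = rng.normal(5, 2, n_students)
scores = (true_level[:, None] + true_growth[:, None] * months
          + rng.normal(0, 10, (n_students, 7)))
state_test = true_level + 7 * true_growth + rng.normal(0, 20, n_students)

slopes = np.polyfit(months, scores.T, deg=1)[0]   # per-student OLS slope
last_point = scores[:, -1]                        # final single-point screen

def r2(X, y):
    X = np.column_stack([np.ones(len(y)), X])     # add intercept
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - resid.var() / y.var()

base = r2(last_point, state_test)
full = r2(np.column_stack([last_point, slopes]), state_test)
print(f"R2 single point = {base:.3f}; added by slope = {full - base:.3f}")
```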

  15. Research and Progress in Computational Thinking

    Institute of Scientific and Technical Information of China (English)

    牟琴; 谭良

    2011-01-01

    Computational thinking is an important concept receiving wide attention in the international computing community, and a key topic for research in computer education. Scholars in computer science, sociology, and philosophy, both at home and abroad, have studied and discussed this subject extensively; some positive results have been achieved, but disagreements remain. Taking time as the main thread and drawing on the characteristics of how computational thinking has formed, this paper first proposes a division into three stages (the embryonic, foundational, and chaotic periods), comprehensively analyzes the formation and development of computational thinking, and gives a panoramic view of its evolution. It then reviews the key content of computational thinking, including its concept, its principles, and its application in teaching, and provides a summary. Finally, building on existing research results, future research directions for computational thinking and the challenges it faces are discussed.

  16. Computing supersonic non-premixed turbulent combustion by an SMLD flamelet progress variable model

    CERN Document Server

    Coclite, A; Gurtner, M; De Palma, P; Haidnd, O J; Pascazio, G

    2015-01-01

    This paper describes the numerical simulation of the NASA Langley Research Center supersonic H2-air combustion chamber performed using two approaches to model the presumed probability density function (PDF) in the flamelet progress variable (FPV) framework. The first is a standard FPV model, built by presuming the functional shape of the PDFs of the mixture fraction, Z, and of the progress parameter, {\Lambda}. In order to enhance the prediction capabilities of such a model in high-speed reacting flows, a second approach is proposed, employing the statistically most likely distribution (SMLD) technique to presume the joint PDF of Z and {\Lambda} without any assumption about their behaviour. Both the standard and FPV-SMLD models have been developed under the low-Mach-number assumption. In both cases, the temperature is evaluated by solving the total-energy conservation equation, providing a more suitable approach for the simulation of supersonic combustion. By comparison with experimental data, the proposed SMLD...
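
    The paper's exact closures are not given in the abstract; as a hedged sketch of the two presumed-PDF routes it contrasts, using forms common in the flamelet literature rather than the authors' formulation: standard FPV models typically presume a beta distribution for the mixture fraction, parameterized by its resolved mean and variance, while the SMLD route maximizes entropy subject to the known moments.

```latex
% Common presumed-PDF forms from the flamelet literature (illustrative;
% not necessarily the exact closures used in the paper).
% Standard FPV: beta PDF for the mixture fraction Z:
\[
\widetilde{P}(Z) = \frac{Z^{a-1}(1-Z)^{b-1}}{B(a,b)}, \qquad
a = \widetilde{Z}\,\gamma, \quad
b = \bigl(1-\widetilde{Z}\bigr)\gamma, \quad
\gamma = \frac{\widetilde{Z}\bigl(1-\widetilde{Z}\bigr)}{\widetilde{Z''^{2}}} - 1 .
\]
% SMLD: maximum-entropy form constrained by the first N moments, with
% Lagrange multipliers \lambda_n and an a priori density Q:
\[
P_{\mathrm{SML}}(\psi) \;\propto\; Q(\psi)\,
\exp\!\Bigl(\textstyle\sum_{n=0}^{N} \lambda_n \psi^{n}\Bigr).
\]
```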

  17. Computational Challenges in Characterization of Bacteria and Bacteria-Host Interactions Based on Genomic Data

    Institute of Scientific and Technical Information of China (English)

    Chao Zhang; Guo-lu Zheng; Shun-Fu Xu; Dong Xu

    2012-01-01

    With the rapid development of next-generation sequencing technologies, bacterial identification becomes a very important and essential step in processing genomic data, especially for metagenomic data. Many computational methods have been developed and some of them are widely used to address the problems in bacterial identification. In this article we review the algorithms of these methods, discuss their drawbacks, and propose future computational methods that use genomic data to characterize bacteria. In addition, we tackle two specific computational problems in bacterial identification, namely, the detection of host-specific bacteria and the detection of disease-associated bacteria, by offering potential solutions as a starting point for those who are interested in the area.
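
    As a toy example of one family of methods in this area, the sketch below matches a read's k-mer profile against reference k-mer sets, the basic operation behind many composition-based bacterial identification tools; the reference sequences, k, and scoring are invented for illustration and are not taken from the article.

```python
# Toy k-mer profile matcher of the kind used for bacterial identification
# in metagenomic reads: score a read against reference k-mer sets and
# report the best-matching genome. Reference sequences are made up.
def kmers(seq, k=4):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

references = {                      # hypothetical stand-ins for genomes
    "genome_A": "ATGGCGTACGTTAGCATCGATCGGATATTGCGC",
    "genome_B": "TTACGGCCAATGGCCTTAAGGCCGGTTACGAAT",
}
ref_kmers = {name: kmers(seq) for name, seq in references.items()}

def classify(read, k=4):
    """Fraction of the read's k-mers found in each reference genome."""
    rk = kmers(read, k)
    return {name: len(rk & refs) / len(rk) for name, refs in ref_kmers.items()}

read = "GCGTACGTTAGCATC"            # a short read drawn from genome_A
print(classify(read))               # genome_A should score highest
```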

  18. MIT Laboratory for Computer Science Progress Report, July 1984-June 1985

    Science.gov (United States)

    1985-06-01

    transcription and translation, according to the operon theory of procaryotic genetics. The first program, GENEX I, used a procedural encoding of how to ... goals 1 and 2, are critically needed components of most medical consulting and critiquing programs. Within our own research laboratory, results from ... RESEARCH PROGRESS In the past year of work on the clinical cognition subproject, we have collected transcripts from eight expert physicians solving

  19. [Geometry, analysis, and computation in mathematics and applied science]. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, D.

    1994-02-01

    The principal investigators' work on a variety of pure and applied problems in Differential Geometry, Calculus of Variations and Mathematical Physics has been done in a computational laboratory and been based on interactive scientific computer graphics and high speed computation created by the principal investigators to study geometric interface problems in the physical sciences. We have developed software to simulate various physical phenomena from constrained plasma flow to the electron microscope imaging of the microstructure of compound materials, techniques for the visualization of geometric structures that have been used to make significant breakthroughs in the global theory of minimal surfaces, and graphics tools to study evolution processes, such as flow by mean curvature, while simultaneously developing the mathematical foundation of the subject. An increasingly important activity of the laboratory is to extend this environment in order to support and enhance scientific collaboration with researchers at other locations. Toward this end, the Center developed the GANGVideo distributed video software system and software methods for running lab-developed programs simultaneously on remote and local machines. Further, the Center operates a broadcast video network, running in parallel with the Center's data networks, over which researchers can access stored video materials or view ongoing computations. The graphical front-end to GANGVideo can be used to make "multi-media mail" from both "live" computing sessions and stored materials without video editing. Currently, videotape is used as the delivery medium, but GANGVideo is compatible with future "all-digital" distribution systems. Thus, as a byproduct of mathematical research, we are developing methods for scientific communication. But, most important, our research focuses on important scientific problems; the parallel development of computational and graphical tools is driven by scientific needs.

  20. Laboratory for Computer Science Progress Report 19, 1 July 1981-30 June 1982.

    Science.gov (United States)

    1984-05-01

    Contract No. N00014-75-C-06. Views and conclusions contained in this report are those of the authors and should not be interpreted as representing the ... Environment, S. Thesis, EE & CS Dept., September 1976, AD A030-402 TM-78 Benjamin, Arthur J. Improving Information Storage Reliability Using a Data Network ... TR-17 Samuel, Arthur L. Time-Sharing on a Multiconsole Computer, March 1965, AD 462-158 TR-18 Scherr, Allan Lee An Analysis of Time-Shared Computer

  1. SPECT-CT and PET-CT progress in the research of computer analysis method

    Directory of Open Access Journals (Sweden)

    Jingang Guo

    2015-04-01

    Full Text Available The development of multimodal imaging equipment is a milestone in the development of medical imaging. Following PET-CT, the American GE company launched the Discovery 670 NM/CT, which organically integrates SPECT and CT to form a new molecular medical imaging modality, SPECT-CT. As SPECT-CT, like PET-CT, attracts increasingly wide attention and application, many computer methods for SPECT-CT and PET-CT image analysis have emerged and are receiving growing attention in clinical practice and imaging science. This paper describes these computer analysis methods for SPECT-CT and PET-CT in detail.

  2. Progress in the computation of flows in combustion in controlled ignition engines

    Energy Technology Data Exchange (ETDEWEB)

    Naji, H.; Borghi, R.; Souhaite, P.; Argueyrolles, B.

    1987-01-01

    The computational characterization of reactive flows in combustion chambers is now used directly in the design of piston engines. This computation requires numerical programs taking into account fluid mechanics, heat transfer, combustion and turbulence. For these flows, turbulence and combustion cannot be dissociated; moreover, the latter should be represented by a process with a finite speed. The existing program has been improved by incorporating two combustion models. The turbulent fluctuations are taken into account by modelling their probability densities. Numerical results show that the two models give realistic values of the burnt fraction and of the average pressure during the combustion phase.

  3. Progress of projection computed tomography by upgrading of the beamline 37XU of SPring-8

    Science.gov (United States)

    Terada, Yasuko; Suzuki, Yoshio; Uesugi, Kentaro; Miura, Keiko

    2016-01-01

    Beamline 37XU at SPring-8 has been upgraded for nano-focusing applications. The length of the beamline has been extended to 80 m. By utilizing this length, the beamline has advantages for experiments such as X-ray focusing, X-ray microscopic imaging and X-ray computed tomography. Projection computed tomography measurements were carried out at experimental hutch 3 located 80 m from the light source. CT images of a microcapsule have been successfully obtained with a wide X-ray energy range.

  4. Progress of projection computed tomography by upgrading of the beamline 37XU of SPring-8

    Energy Technology Data Exchange (ETDEWEB)

    Terada, Yasuko, E-mail: yterada@spring8.or.jp; Suzuki, Yoshio; Uesugi, Kentaro; Miura, Keiko [Japan Synchrotron Radiation Research Institute, SPring-8, 1-1-1 Koto, Sayo, Hyogo 679-5198 (Japan)

    2016-01-28

    Beamline 37XU at SPring-8 has been upgraded for nano-focusing applications. The length of the beamline has been extended to 80 m. By utilizing this length, the beamline has advantages for experiments such as X-ray focusing, X-ray microscopic imaging and X-ray computed tomography. Projection computed tomography measurements were carried out at experimental hutch 3 located 80 m from the light source. CT images of a microcapsule have been successfully obtained with a wide X-ray energy range.

  5. Augmentation of spelling therapy with transcranial direct current stimulation in primary progressive aphasia: Preliminary results and challenges.

    Science.gov (United States)

    Tsapkini, Kyrana; Frangakis, Constantine; Gomez, Yessenia; Davis, Cameron; Hillis, Argye E

    Primary progressive aphasia (PPA) is a neurodegenerative disease that primarily affects language functions and often begins in the fifth or sixth decade of life. The devastating effects on work and family life call for the investigation of treatment alternatives. In this article, we present new data indicating that neuromodulatory treatment, using transcranial direct current stimulation (tDCS) combined with a spelling intervention, shows some promise for maintaining or even improving language, at least temporarily, in PPA. The main aim of the present article is to determine whether tDCS plus spelling intervention is more effective than spelling intervention alone in treating written language in PPA. We also asked whether the effects of tDCS are sustained longer than the effects of spelling intervention alone. We present data from six PPA participants who underwent anodal tDCS or sham plus spelling intervention in a within-subject crossover design. Each stimulation condition lasted 3 weeks or a total of 15 sessions with a 2-month interval in between. Participants were evaluated on treatment tasks as well as on other language and cognitive tasks at 2-week and 2-month follow-up intervals after each stimulation condition. All participants showed improvement in spelling (with sham or tDCS). There was no difference in the treated items between the two conditions. There was, however, consistent and significant improvement for untrained items only in the tDCS plus spelling intervention condition. Furthermore, the improvement lasted longer in the tDCS plus spelling intervention condition compared to sham plus spelling intervention condition. Neuromodulation with tDCS offers promise as a means of augmenting language therapy to improve written language function at least temporarily in PPA. The consistent finding of generalisation of treatment benefits to untreated items and the superior sustainability of treatment effects with tDCS justifies further investigations. However

  6. The Computer-Mediated Communication (CMC) Classroom: A Challenge of Medium, Presence, Interaction, Identity, and Relationship

    Science.gov (United States)

    Sherblom, John C.

    2010-01-01

    There is a "prevalence of computer-mediated communication (CMC) in education," and a concern for its negative psychosocial consequences and lack of effectiveness as an instructional tool. This essay identifies five variables in the CMC research literature and shows their moderating effect on the psychosocial, instructional expevrience of the CMC…

  7. Expanding Computer Science Education in Schools: Understanding Teacher Experiences and Challenges

    Science.gov (United States)

    Yadav, Aman; Gretter, Sarah; Hambrusch, Susanne; Sands, Phil

    2017-01-01

    The increased push for teaching computer science (CS) in schools in the United States requires training a large number of new K-12 teachers. The current efforts to increase the number of CS teachers have predominantly focused on training teachers from other content areas. In order to support these beginning CS teachers, we need to better…

  8. Fledgling CPRI (Computer-Based Patient Record Institute) faces difficult challenges as legislative clock ticks.

    Science.gov (United States)

    Laughlin, M L

    1992-09-01

    A diverse group of users, vendors, employers, insurers and government officials met in Washington in July for the Computer-Based Patient Record Institute's First Annual Meeting. Deemed "the focal point" of legislation demanding automated patient records, their task was to overcome a myriad of differences and form a true coalition that can meet an ambitious, and some say unrealistic, deadline.

  9. High Performance Parallel Processing Project: Industrial computing initiative. Progress reports for fiscal year 1995

    Energy Technology Data Exchange (ETDEWEB)

    Koniges, A.

    1996-02-09

    This project is a package of 11 individual CRADAs plus hardware. This innovative project established a three-year multi-party collaboration that is significantly accelerating the availability of commercial massively parallel processing computing software technology to U.S. government, academic, and industrial end-users. This report contains individual presentations from nine principal investigators along with overall program information.

  10. Research in progress in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1990-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem-solving capabilities in science and engineering, particularly in aeronautics and space.

  11. Comparing Computer Adaptive and Curriculum-Based Measures of Math in Progress Monitoring

    Science.gov (United States)

    Shapiro, Edward S.; Dennis, Minyi Shih; Fu, Qiong

    2015-01-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening…

  12. Laboratory for Computer Science Progress Report 16, 1 July 1978 - 30 June 1979,

    Science.gov (United States)

    1980-08-01

    Language May 1975 TM-62 Patil, Suhas S. An Asynchronous Logic Array May 1975 TM-63 Pless, Vera Encryption Schemes for Computer Confidentiality May ... Processor, M.S. Thesis, EE Dept. June 1970 AD 710-479 TR-72 Patil, Suhas S. Coordination of Asynchronous Events, Sc.D. Thesis, EE Dept. June 1970 AD

  13. Next-generation sequence assembly: four stages of data processing and computational challenges.

    Directory of Open Access Journals (Sweden)

    Sara El-Metwally

    Full Text Available Decoding DNA symbols using next-generation sequencers was a major breakthrough in genomic research. Despite the many advantages of next-generation sequencers, e.g., the high-throughput sequencing rate and relatively low cost of sequencing, the assembly of the reads produced by these sequencers still remains a major challenge. In this review, we address the basic framework of next-generation genome sequence assemblers, which comprises four basic stages: preprocessing filtering, a graph construction process, a graph simplification process, and postprocessing filtering. Here we discuss them as a framework of four stages for data analysis and processing and survey variety of techniques, algorithms, and software tools used during each stage. We also discuss the challenges that face current assemblers in the next-generation environment to determine the current state-of-the-art. We recommend a layered architecture approach for constructing a general assembler that can handle the sequences generated by different sequencing platforms.

  14. Next-generation sequence assembly: four stages of data processing and computational challenges.

    Science.gov (United States)

    El-Metwally, Sara; Hamza, Taher; Zakaria, Magdi; Helmy, Mohamed

    2013-01-01

    Decoding DNA symbols using next-generation sequencers was a major breakthrough in genomic research. Despite the many advantages of next-generation sequencers, e.g., the high-throughput sequencing rate and relatively low cost of sequencing, the assembly of the reads produced by these sequencers still remains a major challenge. In this review, we address the basic framework of next-generation genome sequence assemblers, which comprises four basic stages: preprocessing filtering, a graph construction process, a graph simplification process, and postprocessing filtering. Here we discuss them as a framework of four stages for data analysis and processing and survey variety of techniques, algorithms, and software tools used during each stage. We also discuss the challenges that face current assemblers in the next-generation environment to determine the current state-of-the-art. We recommend a layered architecture approach for constructing a general assembler that can handle the sequences generated by different sequencing platforms.
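
    As a minimal illustration of the graph-construction stage in the four-stage framework above, the sketch below builds a de Bruijn graph from overlapping reads and walks an unambiguous path into a contig; real assemblers add the pre- and post-processing filtering stages the review describes. The reads and k are invented.

```python
# Minimal sketch of the graph-construction stage of a de Bruijn style
# assembler: break reads into k-mers, link (k-1)-mer prefixes to
# suffixes, then walk an unambiguous path into a contig.
from collections import defaultdict

def build_graph(reads, k=5):
    graph = defaultdict(set)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].add(kmer[1:])     # edge: prefix -> suffix
    return graph

def walk(graph, start, max_steps=100):
    """Extend the contig while exactly one outgoing edge exists."""
    contig, node = start, start
    for _ in range(max_steps):                 # guard against cycles
        nexts = graph.get(node, set())
        if len(nexts) != 1:
            break
        node = next(iter(nexts))
        contig += node[-1]
    return contig

reads = ["ATGGCGTGCA", "GTGCAATTCG", "ATTCGGAC"]
g = build_graph(reads)
print(walk(g, "ATGG"))   # -> ATGGCGTGCAATTCGGAC, spanning all three reads
```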

  15. Computer-aided diagnosis in radiological imaging: current status and future challenges

    Science.gov (United States)

    Doi, Kunio

    2009-10-01

    Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. Many different types of CAD schemes are being developed for detection and/or characterization of various lesions in medical imaging, including conventional projection radiography, CT, MRI, and ultrasound imaging. Commercial systems for detection of breast lesions on mammograms have been developed and have received FDA approval for clinical use. CAD may be defined as a diagnosis made by a physician who takes into account the computer output as a "second opinion". The purpose of CAD is to improve the quality and productivity of physicians in their interpretation of radiologic images. The quality of their work can be improved in terms of the accuracy and consistency of their radiologic diagnoses. In addition, the productivity of radiologists is expected to be improved by a reduction in the time required for their image readings. The computer output is derived from quantitative analysis of radiologic images by use of various methods and techniques in computer vision, artificial intelligence, and artificial neural networks (ANNs). The computer output may indicate a number of important parameters, for example, the locations of potential lesions such as lung cancer and breast cancer, the likelihood of malignancy of detected lesions, and the likelihood of various diseases based on differential diagnosis in a given image and clinical parameters. In this review article, the basic concept of CAD is first defined, and the current status of CAD research is then described. In addition, the potential of CAD in the future is discussed and predicted.

  16. Towards large-scale data analysis: challenges in the design of portable systems and use of Cloud computing.

    Science.gov (United States)

    Diaz, Javier; Arrizabalaga, Saioa; Bustamante, Paul; Mesa, Iker; Añorga, Javier; Goya, Jon

    2013-01-01

    Portable systems and global communications open a broad spectrum for new health applications. In the framework of electrophysiological applications, several challenges are faced when developing portable systems embedded in Cloud computing services. To help new developers in this area, and based on our experience, this paper presents five areas of interest where strategies can be applied to improve the performance of portable systems: transducers and conditioning, processing, wireless communications, battery, and power management. Likewise, for Cloud services, scalability, portability, privacy, and security guidelines are highlighted.

  17. Interactive effects of explicit emergent structure: a major challenge for cognitive computational modeling.

    Science.gov (United States)

    French, Robert M; Thomas, Elizabeth

    2015-04-01

    David Marr's (1982) three-level analysis of computational cognition argues for three distinct levels of cognitive information processing, namely the computational, representational, and implementational levels. But Marr's levels are, and were meant to be, descriptive rather than interactive and dynamic. For this reason, we suggest that, had Marr been writing today, he might well have gone even farther in his analysis, including the emergence of structure (in particular, explicit structure at the conceptual level) from lower levels, and the effect of explicit emergent structures on the level (or levels) that gave rise to them. The message is that today's cognitive scientists need not only to understand how emergent structures (in particular, explicit emergent structures at the cognitive level) develop, but also to understand how they feed back on the sub-structures from which they emerged.

  18. Computational Fluid Dynamics in Solid Earth Sciences–a HPC challenge

    Directory of Open Access Journals (Sweden)

    Luminita Zlagnean

    2012-11-01

    Full Text Available The Solid Earth Sciences have recently begun moving towards implementing High Performance Computing (HPC) research facilities. One of the key tenets of HPC is performance, which strongly depends on the interaction between software and hardware. This paper presents benchmark results from two HPC systems. Testing a Computational Fluid Dynamics (CFD) code specific to the Solid Earth Sciences, the HPC system Horus, based on Gigabit Ethernet, performed reasonably well compared with its counterpart CyberDyn, based on Infiniband QDR fabric. However, the HPCC CyberDyn, based on a low-latency high-speed QDR network dedicated to MPI traffic, outperformed the HPCC Horus. Given the high-resolution simulations involved in geodynamic research studies, HPC facilities used in Earth Sciences should benefit from larger up-front investment in future systems based on high-speed interconnects.

  19. Analysis on Digital Crime and Emergence of Technical Challenges with Cloud Computing

    Directory of Open Access Journals (Sweden)

    Divya avanthi

    2015-09-01

    Full Text Available Recent technology reports from Forbes and Cisco reveal the adoption of cloud computing on a global scale; as for storage, the average data storage consumed per household is estimated to grow from 464 gigabytes. One key effect of the cloud for digital marketing is that it has allowed marketers to approach potential customers who used to be out of reach, for example someone watching TV while texting on the couch. Organizations deploying infrastructure into remote virtualized environments hosted by third parties face significant implications for forensic investigators, vendors, law enforcement, and corporate compliance and audit departments. Digital forensics assumes control and management of assets during the conduct of an investigation; this paper gives an overview of cloud computing and of how established digital forensic procedures can be validated in such investigations.

  20. The Berlin Brain-Computer Interface: Progress Beyond Communication and Control

    Directory of Open Access Journals (Sweden)

    Benjamin Blankertz

    2016-11-01

    Full Text Available The combined effect of fundamental results about neurocognitive processes and advancements in decoding mental states from ongoing brain signals has brought forth a whole range of potential neurotechnological applications. In this article, we review our developments in this area and put them into perspective. These examples cover a wide range of maturity levels with respect to their applicability. While we are likely still a long way from integrating Brain-Computer Interface (BCI) technology into general interaction with computers, or from implementing neurotechnological measures in safety-critical workplaces, results have already been obtained involving a BCI as a research tool. In this article, we discuss the reasons why, in some of the prospective application domains, considerable effort is still required to make the systems ready to deal with the full complexity of the real world.