WorldWideScience

Sample records for computing readiness challenge

  1. CMS results in the Combined Computing Readiness Challenge CCRC'08

    International Nuclear Information System (INIS)

    Bonacorsi, D.; Bauerdick, L.

    2009-01-01

    During February and May 2008, CMS participated in the Combined Computing Readiness Challenge (CCRC'08) together with all other LHC experiments. The purpose of this worldwide exercise was to check the readiness of the computing infrastructure for LHC data taking. Another set of major CMS tests called the Computing, Software and Analysis challenge (CSA'08) - as well as CMS cosmic runs - were also running at the same time: CCRC augmented the load on computing with additional tests to validate and stress-test all CMS computing workflows at full data-taking scale, also extending this to the global WLCG community. CMS exercised most aspects of the CMS computing model, with very comprehensive tests. During May 2008, CMS moved more than 3.6 Petabytes among more than 300 links in the complex Grid topology. CMS demonstrated that it is able to safely move data out of CERN to the Tier-1 sites, sustaining more than 600 MB/s as a daily average for more than seven days in a row, with enough headroom and with hourly peaks of up to 1.7 GB/s. CMS ran hundreds of simultaneous jobs at each Tier-1 site, re-reconstructing and skimming hundreds of millions of events. After re-reconstruction the fresh AOD (Analysis Object Data) has to be synchronized between Tier-1 centers: CMS demonstrated that the required inter-Tier-1 transfers are achievable within a few days. CMS also showed that skimmed analysis data sets can be transferred to Tier-2 sites for analysis at sufficient rate, regionally as well as inter-regionally, achieving all goals in about 90% of >200 links. Simultaneously, CMS also ran a large Tier-2 analysis exercise, where realistic analysis jobs were submitted to a large set of Tier-2 sites by a large number of people to produce a chaotic workload across the systems, with more than 400 analysis users in May. Taken all together, CMS routinely achieved submissions of 100k jobs/day, with peaks up to 200k jobs/day. The achieved results in CCRC'08 - focusing on the distributed
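
    The transfer figures quoted above can be sanity-checked with a short back-of-the-envelope script; the rates are taken from the abstract, while the decimal unit conversions (1 MB = 1e6 B, 1 PB = 1e15 B) and the 31-day month are assumptions of this illustrative sketch.

        # Sanity-check of the CCRC'08 transfer figures quoted in the abstract.
        # Decimal units assumed: 1 MB = 1e6 B, 1 GB = 1e9 B, 1 PB = 1e15 B.
        daily_avg_mb_s = 600               # sustained CERN -> Tier-1 export, MB/s
        days_sustained = 7
        volume_tb = daily_avg_mb_s * 1e6 * 86400 * days_sustained / 1e12
        print(f"7 days at 600 MB/s ~ {volume_tb:.0f} TB")            # ~363 TB

        # The month-long total of 3.6 PB over >300 links corresponds to a
        # continuous average rate of:
        month_avg_gb_s = 3.6e15 / (31 * 86400) / 1e9
        print(f"3.6 PB in May ~ {month_avg_gb_s:.2f} GB/s sustained")  # ~1.34 GB/s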

  2. Ready To Buy a Computer?

    Science.gov (United States)

    Rourke, Martha; Rourke, Patrick

    1974-01-01

    The school district business manager can make sound, cost-conscious decisions in the purchase of computer equipment by developing a list of cost-justified applications for automation, considering the software, writing performance specifications for bidding or negotiating a contract, and choosing the vendor wisely prior to the purchase; and by…

  3. Computer Skills Training and Readiness to Work with Computers

    Directory of Open Access Journals (Sweden)

    Arnon Hershkovitz

    2016-05-01

    Full Text Available In today’s job market, computer skills are part of the prerequisites for many jobs. In this paper, we report on a study of readiness to work with computers (the dependent variable) among unemployed women (N=54) after participating in a unique, web-supported training focused on computer skills and empowerment. Overall, the level of participants’ readiness to work with computers was much higher at the end of the course than it was at its beginning. During the analysis, we explored associations between this variable and variables from four categories: log-based (describing the online activity); computer literacy and experience; job-seeking motivation and practice; and training satisfaction. Only two variables were associated with the dependent variable: knowledge post-test duration and satisfaction with content. After building a prediction model for the dependent variable, another log-based variable was highlighted: the total number of actions on the course website over the duration of the course. Overall, our analyses shed light on the predominance of log-based variables over variables from other categories. These findings might hint at the need for developing new assessment tools for learners and trainees that take human-computer interaction into consideration when measuring self-efficacy variables.
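
    A prediction model of the kind the study describes can be sketched minimally as follows; the feature names echo the abstract (total actions, post-test duration, content satisfaction), but the data and the choice of a linear model are hypothetical, not the study's.

        # Illustrative only: a minimal prediction model of the kind described
        # above. The data are invented, not the study's dataset.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Hypothetical predictors: total actions on the course website,
        # knowledge post-test duration (minutes), satisfaction with content (1-5).
        X = np.array([[320, 14, 4], [120, 25, 3], [510, 11, 5], [210, 19, 4]])
        y = np.array([4.2, 3.1, 4.8, 3.9])   # readiness to work with computers

        model = LinearRegression().fit(X, y)
        print(model.coef_, model.intercept_)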

  4. CBI students: ready for new challenges

    CERN Multimedia

    Antonella Del Rosso

    2015-01-01

    Twenty-seven students from four universities and over ten countries gathered at IdeaSquare to start their Challenge-Based Innovation (CBI) course. Labour mobility, food safety, literacy in the developing world and water safety are the four projects that the students will work on now that they are back at their home institutions. The final ideas and prototypes will be presented at CERN in December.   The CBI students enjoy some training sessions at IdeaSquare. (Image: Joona Kurikka for IdeaSquare). The intensive first week of the four-month CBI Mediterranean course took place from 14 to 18 September. The students, from four universities – ESADE, IED and UPC in Barcelona and UNIMORE in Italy – gathered at CERN to meet researchers and carry out need-finding and benchmarking studies. “The idea of CBI courses is to get multidisciplinary student teams and their instructors to collaborate with researchers at CERN to develop novel solutions that meet societal need...

  5. Challenges and insights for situated language processing: Comment on "Towards a computational comparative neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    Science.gov (United States)

    Knoeferle, Pia

    2016-03-01

    In his review article [19], Arbib outlines an ambitious research agenda: to accommodate within a unified framework the evolution, the development, and the processing of language in natural settings (implicating other systems such as vision). He does so with neuro-computationally explicit modeling in mind [1,2], inspired by research on the mirror neuron system in primates. Similar research questions have also received substantial attention among other scientists [3,4,12].

  6. The Challenge of Computers.

    Science.gov (United States)

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  7. The challenge of computer mathematics.

    Science.gov (United States)

    Barendregt, Henk; Wiedijk, Freek

    2005-10-15

    Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations deal with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. There are also very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.
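
    The division of labour described above (the human supplies definitions and proofs, the machine checks them) can be illustrated with a tiny fragment in Lean, one such interactive system; the example is ours, not the authors'.

        -- The human states the theorem and names the proof;
        -- the Lean kernel checks every step mechanically.
        theorem add_comm_example (m n : Nat) : m + n = n + m :=
          Nat.add_comm m n

        -- Computing and proving live in the same language:
        #eval 2 + 3                 -- 5
        example : 2 + 3 = 5 := rfl  -- checked by computation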

  8. CMS Software and Computing Ready for Run 2

    CERN Document Server

    Bloom, Kenneth

    2015-01-01

    In Run 1 of the Large Hadron Collider, software and computing was a strategic strength of the Compact Muon Solenoid experiment. The timely processing of data and simulation samples and the excellent performance of the reconstruction algorithms played an important role in the preparation of the full suite of searches used for the observation of the Higgs boson in 2012. In Run 2, the LHC will run at higher intensities and CMS will record data at a higher trigger rate. These new running conditions will provide new challenges for the software and computing systems. Over the two years of Long Shutdown 1, CMS has built upon the successes of Run 1 to improve the software and computing to meet these challenges. In this presentation we will describe the new features in software and computing that will once again put CMS in a position of physics leadership.

  9. Computer-Based Assessment of School Readiness and Early Reasoning

    Science.gov (United States)

    Csapó, Beno; Molnár, Gyöngyvér; Nagy, József

    2014-01-01

    This study explores the potential of using online tests for the assessment of school readiness and for monitoring early reasoning. Four tests of a face-to-face-administered school readiness test battery (speech sound discrimination, relational reasoning, counting and basic numeracy, and deductive reasoning) and a paper-and-pencil inductive…

  10. Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain

    Science.gov (United States)

    Arbib, Michael A.

    2016-03-01

    We make the case for developing a Computational Comparative Neuroprimatology to inform the analysis of the function and evolution of the human brain. First, we update the mirror system hypothesis on the evolution of the language-ready brain by (i) modeling action and action recognition and opportunistic scheduling of macaque brains to hypothesize the nature of the last common ancestor of macaque and human (LCA-m); and then we (ii) introduce dynamic brain modeling to show how apes could acquire gesture through ontogenetic ritualization, hypothesizing the nature of evolution from LCA-m to the last common ancestor of chimpanzee and human (LCA-c). We then (iii) hypothesize the role of imitation, pantomime, protosign and protospeech in biological and cultural evolution from LCA-c to Homo sapiens with a language-ready brain. Second, we suggest how cultural evolution in Homo sapiens led from protolanguages to full languages with grammar and compositional semantics. Third, we assess the similarities and differences between the dorsal and ventral streams in audition and vision as the basis for presenting and comparing two models of language processing in the human brain: A model of (i) the auditory dorsal and ventral streams in sentence comprehension; and (ii) the visual dorsal and ventral streams in defining "what language is about" in both production and perception of utterances related to visual scenes provide the basis for (iii) a first step towards a synthesis and a look at challenges for further research.

  11. Global Health Governance Challenges 2016 - Are We Ready?

    Science.gov (United States)

    Kickbusch, Ilona

    2016-02-29

    The year 2016 could turn out to be a turning point for global health: new political realities and global insecurities will test governance and financing mechanisms in relation to both people and planet. But most importantly, political factors such as the global power shift and "the rise of the rest" will define the future of global health. A new mix of health inequity and security challenges has emerged, and the 2015 humanitarian and health crises have shown the limits of existing systems. The global health as well as the humanitarian system will have to prove their capacity to respond and reform. The challenge ahead is deeply political, especially for the rising political actors. They are confronted with the consequences of a model of development that has neglected sustainability and equity, and was built on their exploitation. Some direction has been given by the path-breaking international conferences in 2015. In particular, the agreement on the Sustainable Development Goals (SDGs) and the Paris agreement on climate change will shape action. Conceptually, we will need a different understanding of global health and its ultimate goals: the health of people can no longer be seen as separate from the health of the planet, and wealth measured by parameters of growth will no longer ensure health. © 2016 by Kerman University of Medical Sciences.

  12. New LHCb Management readies for run 2 challenges

    CERN Multimedia

    Antonella Del Rosso

    2014-01-01

    As of 1 July, LHCb, one of the four biggest experiments at the LHC, will have a new Management. Ahead are the huge challenges of run 2 and the following long technical shutdown during which LHCb will undergo a major upgrade. In the meantime, the discovery of new physics could be a dream within reach…   New LHCb Spokesperson, Guy Wilkinson.   “We have to make sure that the detector wakes up after its long hibernation and goes back to data taking in the most efficient way and that we are able to process all these data to produce high-quality physics results,” says Guy Wilkinson, new Spokesperson of the LHCb collaboration. Although this already sounds like a considerable “to-do” list for the coming months, it’s just the beginning of a much longer and ambitious plan. “The previous management has done an excellent job in analysing the data we took during run 1. They also put on a very sound footing the LHCb upgrade, whi...

  13. Granular computing: perspectives and challenges.

    Science.gov (United States)

    Yao, JingTao; Vasilakos, Athanasios V; Pedrycz, Witold

    2013-12-01

    Granular computing, as a new and rapidly growing paradigm of information processing, has attracted many researchers and practitioners. Granular computing is an umbrella term to cover any theories, methodologies, techniques, and tools that make use of information granules in complex problem solving. The aim of this paper is to review foundations and schools of research and to elaborate on current developments in granular computing research. We first review some basic notions of granular computing. Classification and descriptions of various schools of research in granular computing are given. We also present and identify some research directions in granular computing.
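
    As a concrete taste of information granules, a minimal sketch of one formalism covered by such reviews, rough-set approximation over a toy universe, is given below; the universe and granulation are invented for illustration.

        # Rough-set approximation: granules are equivalence classes induced by
        # an attribute; a target concept is bracketed by the granules surely
        # inside it (lower) and possibly inside it (upper).
        universe = {1, 2, 3, 4, 5, 6}
        granules = [{1, 2}, {3, 4}, {5, 6}]   # a partition of the universe
        target   = {1, 2, 3}                  # concept to approximate

        lower = set().union(*(g for g in granules if g <= target))
        upper = set().union(*(g for g in granules if g & target))
        print("lower:", lower)                # {1, 2}
        print("upper:", upper)                # {1, 2, 3, 4}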

  14. Using a Computer Simulation to Improve Psychological Readiness for Job Interviewing in Unemployed Individuals of Pre-Retirement Age.

    Science.gov (United States)

    Aysina, Rimma M; Efremova, Galina I; Maksimenko, Zhanna A; Nikiforov, Mikhail V

    2017-05-01

    Unemployed individuals of pre-retirement age face significant challenges in finding a new job. This may be partly due to their lack of psychological readiness to go through a job interview. We view psychological readiness as one of the components of psychological attitude. It is an active, conscious readiness to interact with a certain aspect of reality, based on previously acquired experience. It includes a person's special competence to manage their activities and cope with anxiety. We created Job Interview Simulation Training (JIST) - a computer-based simulator which allowed unemployed job seekers to practice interviewing repeatedly in a stress-free environment. We hypothesized that completion of JIST would be related to an increase in pre-retirement job seekers' psychological readiness for job interviewing in real life. Participants were randomized into control (n = 18) and experimental (n = 21) conditions. Both groups completed pre- and post-intervention job interview role-plays and self-report forms of psychological readiness for job interviewing. JIST consisted of 5 sessions of a simulated job interview, and the experimental group found it easy to use and navigate as well as helpful for preparing for interviewing. After finishing the JIST sessions, the experimental group had a significant decrease in heart rate during the post-intervention role-play and demonstrated a significant increase in their self-rated psychological readiness, whereas the control group did not show changes in these variables. Future research may help clarify whether JIST is related to an increase in re-employment of pre-retirement job seekers.
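
    The within-group pre/post comparison reported above follows a standard paired design; a minimal sketch of such an analysis is below, with invented scores rather than the study's data.

        # Paired pre/post comparison of self-rated readiness (invented data).
        from scipy import stats

        pre  = [3.1, 2.8, 3.5, 2.9, 3.2, 3.0]   # before the JIST sessions
        post = [3.9, 3.4, 4.1, 3.6, 3.8, 3.7]   # after the JIST sessions

        t, p = stats.ttest_rel(post, pre)
        print(f"t = {t:.2f}, p = {p:.4f}")      # increase significant if p < 0.05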

  15. Security and Privacy in Fog Computing: Challenges

    OpenAIRE

    Mukherjee, Mithun; Matam, Rakesh; Shu, Lei; Maglaras, Leandros; Ferrag, Mohamed Amine; Choudhry, Nikumani; Kumar, Vikas

    2017-01-01

    open access article The fog computing paradigm extends the storage, networking, and computing facilities of cloud computing toward the edge of the network, offloading the cloud data centers and reducing service latency to the end users. However, the characteristics of fog computing give rise to new security and privacy challenges. The existing security and privacy measures for cloud computing cannot be directly applied to fog computing due to its features, such as mobility, heteroge...

  16. Internet Shop Users: Computer Practices and Its Relationship to E-Learning Readiness

    OpenAIRE

    Jasper Vincent Q. Alontaga

    2018-01-01

    Access to computer technology is essential in developing 21st century skills. One venue that serves to bridge the gap in terms of access is internet shops (also known as cybercafés or internet cafés). As such, it is important to examine the type of activities internet shop users engage in and how they develop and relate to their e-learning readiness. This study examined the profile, computer practices and e-learning readiness of seventy-one (71) internet shop users. A researcher-made internet sh...

  17. Internet ware cloud computing: Challenges

    OpenAIRE

    Qamar, S; Lal, Niranjan; Singh, Mrityunjay

    2010-01-01

    After decades of engineering development and infrastructural investment, Internet connections have become a commodity product in many countries, and Internet-scale "cloud computing" has started to compete with the traditional software business through its technological advantages and economy of scale. Cloud computing is a promising enabling technology of Internetware. Cloud computing is termed the next big thing in the modern corporate world. Apart from the present day software and technologies,...

  18. Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain.

    Science.gov (United States)

    Arbib, Michael A

    2016-03-01

    We make the case for developing a Computational Comparative Neuroprimatology to inform the analysis of the function and evolution of the human brain. First, we update the mirror system hypothesis on the evolution of the language-ready brain by (i) modeling action and action recognition and opportunistic scheduling of macaque brains to hypothesize the nature of the last common ancestor of macaque and human (LCA-m); and then we (ii) introduce dynamic brain modeling to show how apes could acquire gesture through ontogenetic ritualization, hypothesizing the nature of evolution from LCA-m to the last common ancestor of chimpanzee and human (LCA-c). We then (iii) hypothesize the role of imitation, pantomime, protosign and protospeech in biological and cultural evolution from LCA-c to Homo sapiens with a language-ready brain. Second, we suggest how cultural evolution in Homo sapiens led from protolanguages to full languages with grammar and compositional semantics. Third, we assess the similarities and differences between the dorsal and ventral streams in audition and vision as the basis for presenting and comparing two models of language processing in the human brain: A model of (i) the auditory dorsal and ventral streams in sentence comprehension; and (ii) the visual dorsal and ventral streams in defining "what language is about" in both production and perception of utterances related to visual scenes provide the basis for (iii) a first step towards a synthesis and a look at challenges for further research. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Challenges and Security in Cloud Computing

    Science.gov (United States)

    Chang, Hyokyung; Choi, Euiin

    People want to solve problems as they arise. Ubiquitous computing is an IT technology intended to make such situations easier to handle, and cloud computing is a technology that makes it even better and more powerful. Cloud computing, however, is at the beginning stage of implementation and use, and it faces many challenges in technical matters and security issues. This paper looks at cloud computing security.

  20. Computing challenges of the CMS experiment

    International Nuclear Information System (INIS)

    Krammer, N.; Liko, D.

    2017-01-01

    The success of the LHC experiments is due to the magnificent performance of the detector systems and the excellent operation of the computing systems. The CMS offline software and computing system is successfully fulfilling the LHC Run 2 requirements. For the increased data rate of future LHC operation, together with high-pileup interactions, improvements in the usage of the current computing facilities and new technologies have become necessary. Especially for the challenge of the future HL-LHC, a more flexible and sophisticated computing model is needed. In this presentation, I will discuss the current computing system used in the LHC Run 2 and future computing facilities for the HL-LHC runs using flexible computing technologies like commercial and academic computing clouds. The cloud resources are highly virtualized and can be deployed for a variety of computing tasks, providing the capacity for the increasing needs of large-scale scientific computing.

  1. Computational challenges in modeling gene regulatory events.

    Science.gov (United States)

    Pataskar, Abhijeet; Tiwari, Vijay K

    2016-10-19

    Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.

  2. Challenges and solutions in enterprise computing

    NARCIS (Netherlands)

    van Sinderen, Marten J.

    2008-01-01

    The emergence of the networked enterprise has a profound effect on enterprise computing. This introduction discusses some important challenges in enterprise computing, which are the result of the mentioned networking trend, and positions the articles of this special issue with respect to these

  3. Characterization of a new computer-ready photon counting system

    Science.gov (United States)

    Andor, Gyorgy

    1998-08-01

    The photon-counting system seems to be the best solution for extremely low optical power measurements. The Hamamatsu HC135 photon-counting module has a built-in high-voltage power supply, amplifier, discriminator, and micro-controller with an RS232 serial output. It requires only a +5V supply voltage and an IBM PC or compatible computer to run. The system is supplied with application software. This talk is about the testing of the device.
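
    A module of this kind reports counts over its RS232 serial output; a hypothetical read-out loop using pyserial is sketched below. The port name, baud rate and line format are assumptions, not the HC135's documented protocol.

        # Hypothetical serial read-out of a photon-counting module (pyserial).
        # Port, baud rate and report format are assumptions; consult the
        # module's manual for the actual protocol.
        import serial

        with serial.Serial("/dev/ttyS0", 9600, timeout=1.0) as port:
            for _ in range(10):               # read ten count reports
                line = port.readline().decode("ascii", errors="replace").strip()
                if line:
                    print("counts:", line)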

  4. Microbiological challenge testing for Listeria monocytogenes in ready-to-eat food: a practical approach

    Directory of Open Access Journals (Sweden)

    Carlo Spanu

    2014-12-01

    Full Text Available Food business operators (FBOs) are primarily responsible for the safety of the food they place on the market. The definition and validation of the product’s shelf-life is an essential part of ensuring the microbiological safety of food and the health of consumers. In the frame of Regulation (EC) No 2073/2005 on microbiological criteria for foodstuffs, FBOs shall conduct shelf-life studies in order to ensure that their food does not exceed the food safety criteria throughout the defined shelf-life. In particular, this is required for ready-to-eat (RTE) food that supports the growth of Listeria monocytogenes. Among other studies, FBOs can rely on the conclusions drawn by microbiological challenge tests. A microbiological challenge test consists of the artificial contamination of a food with a pathogenic microorganism and aims at simulating its behaviour during processing and distribution under the foreseen storage and handling conditions. A number of documents published by international health authorities and research institutions describe how to conduct challenge studies. The authors reviewed the existing literature and described the methodology for implementing such laboratory studies. All the main aspects of the conduct of L. monocytogenes microbiological challenge tests were considered, from the selection of the strains, preparation and choice of the inoculum level and method of contamination, to the experimental design and data interpretation. The objective of the present document is to provide an exhaustive and practical guideline for laboratories that want to implement L. monocytogenes challenge testing on RTE food.

  5. Microbiological Challenge Testing for Listeria Monocytogenes in Ready-to-Eat Food: A Practical Approach.

    Science.gov (United States)

    Spanu, Carlo; Scarano, Christian; Ibba, Michela; Pala, Carlo; Spanu, Vincenzo; De Santis, Enrico Pietro Luigi

    2014-12-09

    Food business operators (FBOs) are primarily responsible for the safety of the food they place on the market. The definition and validation of the product's shelf-life is an essential part of ensuring the microbiological safety of food and the health of consumers. In the frame of Regulation (EC) No 2073/2005 on microbiological criteria for foodstuffs, FBOs shall conduct shelf-life studies in order to ensure that their food does not exceed the food safety criteria throughout the defined shelf-life. In particular, this is required for ready-to-eat (RTE) food that supports the growth of Listeria monocytogenes. Among other studies, FBOs can rely on the conclusions drawn by microbiological challenge tests. A microbiological challenge test consists of the artificial contamination of a food with a pathogenic microorganism and aims at simulating its behaviour during processing and distribution under the foreseen storage and handling conditions. A number of documents published by international health authorities and research institutions describe how to conduct challenge studies. The authors reviewed the existing literature and described the methodology for implementing such laboratory studies. All the main aspects of the conduct of L. monocytogenes microbiological challenge tests were considered, from the selection of the strains, preparation and choice of the inoculum level and method of contamination, to the experimental design and data interpretation. The objective of the present document is to provide an exhaustive and practical guideline for laboratories that want to implement L. monocytogenes challenge testing on RTE food.
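
    The outcome of such a challenge test is commonly summarised as a growth potential: the difference between the log10 counts at the end and at the start of the shelf-life, with a value above 0.5 log10 cfu/g usually taken (following the EU technical guidance) to mean the food supports growth. A minimal sketch with invented counts:

        # Growth potential (delta) from challenge-test counts (invented data).
        # The delta > 0.5 log10 cfu/g criterion follows the EU technical
        # guidance on L. monocytogenes challenge studies.
        import math

        cfu_start = 100      # cfu/g just after inoculation
        cfu_end   = 2500     # cfu/g at the end of the defined shelf-life

        delta = math.log10(cfu_end) - math.log10(cfu_start)
        print(f"growth potential = {delta:.2f} log10 cfu/g")        # 1.40
        print("supports growth" if delta > 0.5 else "no significant growth")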

  6. Research on Digital Forensic Readiness Design in a Cloud Computing-Based Smart Work Environment

    Directory of Open Access Journals (Sweden)

    Sangho Park

    2018-04-01

    Full Text Available Recently, the work environments of organizations have been transitioning into smart work environments by applying cloud computing technology to the existing work environment. The smart work environment is characterized by the ability to access information assets inside the company from outside the company through cloud computing technology, to share information without restrictions on location by using mobile terminals, and to provide a work environment where work can be conducted effectively in various locations and mobile environments. Thus, in the cloud computing-based smart work environment, changes are occurring in terms of security risks, such as an increase in the leakage risk of an organization’s information assets through mobile terminals, which have a high risk of loss and theft, and an increased hacking risk of wireless networks in mobile environments. According to these changes in security risk, the reactive digital forensic method, which investigates digital evidence after the occurrence of security incidents, appears to have limits, and this has led to a rise in the need for proactive digital forensic approaches wherein security incidents can be addressed preemptively. Accordingly, in this research, we design a digital forensic readiness model at the level of preemptive prevention by considering changes in the cloud computing-based smart work environment. Firstly, we investigate previous research related to the cloud computing-based smart work environment and digital forensic readiness and analyze a total of 50 components of digital forensic readiness. In addition, through the analysis of the corresponding preceding research, we design seven detailed areas, namely, outside the organization environment, within the organization guideline, system information, terminal information, user information, usage information, and additional function. Then, we design a draft of the digital forensic readiness model in the cloud

  7. Cloud Computing: Opportunities and Challenges for Businesses

    Directory of Open Access Journals (Sweden)

    İbrahim Halil Seyrek

    2011-12-01

    Full Text Available Cloud computing represents a new approach for supplying and using information technology services. Considering its benefits for firms and the potential of the changes that it may lead to, it is envisioned that cloud computing can be the most important innovation in information technology since the development of the internet. In this study, firstly, the development of cloud computing and related technologies is explained and classified by giving current application examples. Then the benefits of this new computing model for businesses are elaborated, especially in terms of cost, flexibility and service quality. In spite of its benefits, cloud computing also poses some risks for firms, of which security is one of the most important, and there are some challenges in its implementation. This study points out the risks that companies should be wary of and some legal challenges related to cloud computing. Lastly, approaches that companies may take toward cloud computing and the different strategies that they may adopt are discussed and some recommendations are made.

  8. Taking the Challenge at Singer Village. A Cold Climate Zero Energy Ready Home

    Energy Technology Data Exchange (ETDEWEB)

    Puttagunta, S. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States); Faakye, O. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2014-10-01

    After progressively incorporating ENERGY STAR® for Homes Versions 1, 2, and 3 into its standard practices over the years, this builder, Brookside Development, was seeking to build an even more sustainable product that would further increase energy efficiency, while also addressing indoor air quality, water conservation, renewable-ready, and resiliency. These objectives align with the framework of the DOE Challenge Home program, which "builds upon the comprehensive building science requirements of ENERGY STAR for Homes Version 3, along with proven Building America innovations and best practices. Other special attribute programs are incorporated to help builders reach unparalleled levels of performance with homes designed to last hundreds of years." Consortium for Advanced Residential Buildings (CARB) partnered with Brookside Development on the design optimization and construction of the first home in a small development of seven planned new homes being built on the old Singer Estate in Derby, CT.

  9. Taking the Challenge at Singer Village--A Cold Climate Zero Energy Ready Home

    Energy Technology Data Exchange (ETDEWEB)

    Puttagunta, S.; Faakye, O.

    2014-10-01

    After progressively incorporating ENERGY STAR(R) for Homes Versions 1, 2, and 3 into its standard practices over the years, this builder, Brookside Development, was seeking to build an even more sustainable product that would further increase energy efficiency, while also addressing indoor air quality, water conservation, renewable-ready, and resiliency. These objectives align with the framework of the DOE Challenge Home program, which 'builds upon the comprehensive building science requirements of ENERGY STAR for Homes Version 3, along with proven Building America innovations and best practices. Other special attribute programs are incorporated to help builders reach unparalleled levels of performance with homes designed to last hundreds of years.' CARB partnered with Brookside Development on the design optimization and construction of the first home in a small development of seven planned new homes being built on the old Singer Estate in Derby, CT.

  10. New challenges in computational collective intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ngoc Thanh; Katarzyniak, Radoslaw Piotr [Wroclaw Univ. of Technology (Poland). Inst. of Informatics; Janiak, Adam (eds.) [Wroclaw Univ. of Technology (Poland). Inst. of Computer Engineering, Control and Robotics

    2009-07-01

    The book consists of 29 chapters which have been selected and invited from the submissions to the 1st International Conference on Computational Collective Intelligence - Semantic Web, Social Networks and Multiagent Systems (ICCCI 2009). All chapters in the book discuss various examples of applications of computational collective intelligence and related technologies to such fields as the semantic web, information systems, ontologies, social networks, and agent and multiagent systems. The editors hope that the book can be useful for graduate and Ph.D. students in Computer Science, in particular participants in courses on Soft Computing, Multi-Agent Systems and Robotics. This book can also be useful for researchers working on the concept of computational collective intelligence in artificial populations. It is the hope of the editors that readers of this volume can find many inspiring ideas and use them to create new cases of intelligent collectives. Many such challenges are suggested by the particular approaches and models presented in the individual chapters of this book. (orig.)

  11. Computational Psychiatry and the Challenge of Schizophrenia

    Science.gov (United States)

    Murray, John D.; Chekroud, Adam M.; Corlett, Philip R.; Yang, Genevieve; Wang, Xiao-Jing; Anticevic, Alan

    2017-01-01

    Schizophrenia research is plagued by enormous challenges in integrating and analyzing large datasets and by difficulties developing formal theories related to the etiology, pathophysiology, and treatment of this disorder. Computational psychiatry provides a path to enhance analyses of these large and complex datasets and to promote the development and refinement of formal models for features of this disorder. This presentation introduces the reader to the notion of computational psychiatry and describes discovery-oriented and theory-driven applications to schizophrenia involving machine learning, reinforcement learning theory, and biophysically-informed neural circuit models. PMID:28338845

  12. Are we ready to accept the challenge? Addressing the shortcomings of contemporary qualitative health research.

    Science.gov (United States)

    Lau, Sofie Rosenlund; Traulsen, Janine M

    Qualitative approaches represent an important contributor to health care research. However, several researchers argue that contemporary qualitative research does not live up to its full potential. By presenting a snapshot of contemporary qualitative research in the field of social and administrative pharmacy, this study challenges contributors to the field by asking: Are we ready to accept the challenge and take qualitative research one step further? The purpose of this study was to initiate a constructive dialogue on the need for increased transparency in qualitative data analysis, including explicitly reflecting upon the theoretical perspectives affecting the research process. Content analysis was used to evaluate levels of theoretical visibility and analysis transparency in selected qualitative research articles published in Research in Social and Administrative Pharmacy between January 2014 and January 2015. In 14 out of 21 assessed papers, the use of theory was found to be Seemingly Absent (the lowest level of theory use), and the data analyses did not include any interpretive endeavors. Only two papers consistently applied theory throughout the entire study and clearly took the data analyses from a descriptive to an interpretive level. It was found that the aim of the majority of the assessed papers was to change or modify a given practice, which, however, resulted in a lack of both theoretical underpinnings and analysis transparency. This study takes the standpoint that theory and high-quality analysis go hand-in-hand. Based on the content analysis, articles that were deemed to be high in quality were explicit about the theoretical framework of their study and transparent in how they analyzed their data. It was found that theory contributed to the transparency of how the data were analyzed and interpreted. Two ways of improving contemporary qualitative research in the field of social and administrative pharmacy are discussed: engaging with social theory and establishing

  13. Computer Network Security- The Challenges of Securing a Computer Network

    Science.gov (United States)

    Scotti, Vincent, Jr.

    2011-01-01

    This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.

  14. Achievements and Challenges in Computational Protein Design.

    Science.gov (United States)

    Samish, Ilan

    2017-01-01

    Computational protein design (CPD), a yet evolving field, includes computer-aided engineering for partial or full de novo designs of proteins of interest. Designs are defined by a requested structure, function, or working environment. This chapter describes the birth and maturation of the field by presenting 101 CPD examples in a chronological order emphasizing achievements and pending challenges. Integrating these aspects presents the plethora of CPD approaches with the hope of providing a "CPD 101". These reflect on the broader structural bioinformatics and computational biophysics field and include: (1) integration of knowledge-based and energy-based methods, (2) hierarchical designated approach towards local, regional, and global motifs and the integration of high- and low-resolution design schemes that fit each such region, (3) systematic differential approaches towards different protein regions, (4) identification of key hot-spot residues and the relative effect of remote regions, (5) assessment of shape-complementarity, electrostatics and solvation effects, (6) integration of thermal plasticity and functional dynamics, (7) negative design, (8) systematic integration of experimental approaches, (9) objective cross-assessment of methods, and (10) successful ranking of potential designs. Future challenges also include dissemination of CPD software to the general use of life-sciences researchers and the emphasis of success within an in vivo milieu. CPD increases our understanding of protein structure and function and the relationships between the two along with the application of such know-how for the benefit of mankind. Applied aspects range from biological drugs, via healthier and tastier food products to nanotechnology and environmentally friendly enzymes replacing toxic chemicals utilized in the industry.

  15. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, like the prompt reconstruction, the data streaming, iterative calibration and alignment executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also exercised to gain access to the data and the resources, by providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  16. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, like the prompt reconstruction, the data streaming, iterative calibration and alignment executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also exercised to gain access to the data and the resources, by providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  17. Assessing Learners' Perceived Readiness for Computer-Supported Collaborative Learning (CSCL): A Study on Initial Development and Validation

    Science.gov (United States)

    Xiong, Yao; So, Hyo-Jeong; Toh, Yancy

    2015-01-01

    The main purpose of this study was to develop an instrument that assesses university students' perceived readiness for computer-supported collaborative learning (CSCL). Assessment in CSCL research had predominantly focused on measuring "after-collaboration" outcomes and "during-collaboration" behaviors while…

  18. The Coming Challenge: Are Community Colleges Ready for the New Wave of Contextual Learners?

    Science.gov (United States)

    Hull, Dan; Souders, John C., Jr.

    1996-01-01

    Defines contextual learning, or presenting new information to students in familiar contexts. Argues that community colleges must be ready for an anticipated increase in contextual learners due to its use in tech prep programs. Describes elements of contextual learning, its application in the classroom, and ways that colleges can prepare for…

  19. Salt as a public health challenge in continental European convenience and ready meals.

    Science.gov (United States)

    Kanzler, Sonja; Hartmann, Christina; Gruber, Anita; Lammer, Guido; Wagner, Karl-Heinz

    2014-11-01

    To assess the salt content of continental European convenience and ready meals. A multistage study in which, after laboratory analysis of the products' salt contents (n 32), new salt-reduced meals were developed through food reformulation. Additionally, a comprehensive survey of convenience meals from the Austrian market (n 572) was conducted to evaluate the salt contents of a wider product range. Six continental European countries participated. No subjects enrolled. The salt contents of continental European convenience and ready meals mostly exceeded 1·8 g/100 g, which is 30 % of the targeted daily intake level; some contained even more than the recommended daily intake of 6 g. The highest salt contents were found in pizzas and pasta dishes, the lowest ones in sweet meals. Large variations in salt levels were found not only between and within meal-type categories, but also between similar meals from different producers. In addition, our approach to developing new salt-reduced meals showed that a stepwise reduction of the ready meals' salt contents is possible without compromising sensory quality. To address the problem of hypertension and increased risk of CVD through high salt intake, a reduction of the salt levels in continental European convenience and ready meals is urgently needed, since they provide a major part of the daily salt intake. Successful nation-wide salt reduction strategies in the UK and Finland have already demonstrated the public health impact of this setting.
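
    The figures quoted above are easy to verify: per 100 g, 1.8 g of salt is 30 % of a 6 g/day target, and a ready-meal portion is typically several times 100 g. A worked check follows; the 400 g portion weight is our assumption.

        # Worked check of the salt figures quoted in the abstract.
        salt_per_100g = 1.8          # g salt per 100 g of product
        daily_target  = 6.0          # g salt per day (recommended maximum)

        share = salt_per_100g / daily_target * 100
        print(f"{share:.0f}% of the daily target per 100 g")          # 30%

        meal_weight = 400            # g, an assumed ready-meal portion
        print(f"{meal_weight * salt_per_100g / 100:.1f} g per meal")  # 7.2 g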

  20. Developing MOOCs to Narrow the College Readiness Gap: Challenges and Recommendations for a Writing Course

    Science.gov (United States)

    Bandi-Rao, Shoba; Devers, Christopher J.

    2015-01-01

    Massive Open Online Courses (MOOCs) have demonstrated the potential to deliver quality and cost effective course materials to large numbers of students. Approximately 60% of first-year students at community colleges are underprepared for college-level coursework. One reason for low graduation rates is the lack of the overall college readiness.…

  1. Computing Challenges in Coded Mask Imaging

    Science.gov (United States)

    Skinner, Gerald

    2009-01-01

    This slide presentation reviews the complications and challenges in developing computer systems for coded mask imaging telescopes. The coded mask technique is used when there is no other way to create the telescope (i.e., when there are wide fields of view, energies too high for focusing optics or too low for Compton/tracker techniques, and very good angular resolution is required). The coded mask telescope is described, and the mask is reviewed. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, and a chart showing the types of position-sensitive detectors used for coded mask telescopes is also reviewed. Slides describe the mechanism of recovering an image from the masked pattern. The correlation with the mask pattern is described. The matrix approach is reviewed, and other approaches to image reconstruction are described. Included in the presentation is a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, a comparison of the EXIST/HET with the SWIFT/BAT, and details of the design of the EXIST/HET.
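
    The correlation step mentioned above can be sketched in a one-dimensional toy: the detector shadowgram is cross-correlated with a balanced decoding array derived from the mask. Real instruments use 2-D URA/MURA patterns; this is an illustration, not EXIST/HET code.

        # 1-D toy of coded-mask image recovery by correlation (illustrative).
        import numpy as np

        rng = np.random.default_rng(0)
        mask = rng.integers(0, 2, 64)          # 1 = open element, 0 = opaque

        # Shadowgram cast by a single point source at sky position 20:
        shadow = 1000.0 * np.roll(mask, 20)

        # Balanced decoding array (+1 open, -1 closed) suppresses the flat
        # background term in the correlation:
        decoder = 2 * mask - 1
        image = np.array([np.dot(np.roll(decoder, s), shadow) for s in range(64)])
        print("reconstructed source position:", image.argmax())     # 20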

  2. "Defining Computer 'Speed': An Unsolved Challenge"

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Abstract: The reason we use computers is their speed, and the reason we use parallel computers is that they're faster than single-processor computers. Yet, after 70 years of electronic digital computing, we still do not have a solid definition of what computer 'speed' means, or even what it means to be 'faster'. Unlike measures in physics, where the definition of speed is rigorous and unequivocal, in computing there is no definition of speed that is universally accepted. As a result, computer customers have made purchases misguided by dubious information, computer designers have optimized their designs for the wrong goals, and computer programmers have chosen methods that optimize the wrong things. This talk describes why some of the obvious and historical ways of defining 'speed' haven't served us well, and the things we've learned in the struggle to find a definition that works. Biography: Dr. John Gustafson is a Director ...
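
    The talk's point is easy to reproduce in a toy comparison: two machines can each be "fastest" depending on whether peak rate or time to solution is the metric. All numbers below are invented.

        # Two machines, two defensible "speed" rankings (invented numbers).
        machines = {
            "A": {"peak_flops": 9.0e12, "time_to_solution_s": 120.0},
            "B": {"peak_flops": 6.0e12, "time_to_solution_s":  90.0},
        }
        by_flops = max(machines, key=lambda m: machines[m]["peak_flops"])
        by_time  = min(machines, key=lambda m: machines[m]["time_to_solution_s"])
        print("fastest by peak FLOPS:      ", by_flops)   # A
        print("fastest by time to solution:", by_time)    # B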

  3. CLOUD COMPUTING OVERVIEW AND CHALLENGES: A REVIEW PAPER

    OpenAIRE

    Satish Kumar*, Vishal Thakur, Payal Thakur, Ashok Kumar Kashyap

    2017-01-01

    The cloud computing era is the most resourceful, elastic and scalable period of internet technology, making it possible to use computing resources over the internet successfully. Cloud computing provides not only speed, accuracy, storage capacity and efficiency for computing, but it has also led to the propagation of green computing and resource utilization. In this research paper, a brief description of cloud computing, cloud services and cloud security challenges is given. Also the literature review o...

  4. Ready…, Set, Go! Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    Science.gov (United States)

    Iriki, Atsushi

    2016-03-01

    "Language-READY brain" in the title of this article [1] seems to be the expression that the author prefers to use to illustrate his theoretical framework. The usage of the term "READY" appears to be of extremely deep connotation, for three reasons. Firstly, of course it needs a "principle" - the depth and the width of the computational theory depicted here is as expected from the author's reputation. However, "readiness" implies that it is much more than just "a theory". That is, such a principle is not static, but it rather has dynamic properties, which are ready to gradually proceed to flourish once brains are put in adequate conditions to make time progressions - namely, evolution and development. So the second major connotation is that this article brought in the perspectives of the comparative primatology as a tool to relativise the language-realizing human brains among other animal species, primates in particular, in the context of evolutionary time scale. The tertiary connotation lies in the context of the developmental time scale. The author claims that it is the interaction of the newborn with its care takers, namely its mother and other family or social members in its ecological conditions, that brings the brain mechanism subserving language faculty to really mature to its final completion. Taken together, this article proposes computational theories and mechanisms of Evo-Devo-Eco interactions for language acquisition in the human brains.

  5. Cloud Computing Security Issues and Challenges

    OpenAIRE

    Kuyoro S. O.; Ibikunle F; Awodele O

    2011-01-01

    Cloud computing is a set of IT services that are provided to a customer over a network on a leased basis and with the ability to scale up or down their service requirements. Usually cloud computing services are delivered by a third-party provider who owns the infrastructure. Its advantages, to mention but a few, include scalability, resilience, flexibility, efficiency and the outsourcing of non-core activities. Cloud computing offers an innovative business model for organizations to adopt IT services w...

  6. Cloud Computing Security Issues - Challenges and Opportunities

    OpenAIRE

    Vaikunth, Pai T.; Aithal, P. S.

    2017-01-01

    Cloud computing services, enabled through information communication technology and delivered to a customer as services over the Internet on a leased basis, have the capability to scale up or down their service requirements or needs. In this model, the infrastructure is owned by a third-party vendor and the cloud computing services are delivered to the requesting customers. The cloud computing model has many advantages including scalability, flexibility, elasticity, efficiency, and supports outsourcing ...

  7. Homogeneous Buchberger algorithms and Sullivant's computational commutative algebra challenge

    DEFF Research Database (Denmark)

    Lauritzen, Niels

    2005-01-01

    We give a variant of the homogeneous Buchberger algorithm for positively graded lattice ideals. Using this algorithm we solve the Sullivant computational commutative algebra challenge.
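
    The operation at the heart of this entry, computing a Gröbner basis of a (binomial) ideal, can be illustrated with sympy's generic routine; this shows the computation itself, not the paper's specialized homogeneous Buchberger variant, and the example ideal is our own.

        # Generic Groebner-basis computation with sympy (illustration only;
        # not the paper's homogeneous Buchberger variant for lattice ideals).
        from sympy import groebner, symbols

        x, y, z = symbols("x y z")
        G = groebner([x*y - z**2, x**2 - y*z], x, y, z, order="grevlex")
        print(G)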

  8. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Full Text Available Despite the evolution and enhancements that mobile devices have experienced, they are still considered limited computing devices. Today, users have become more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles limitations of Smart Mobile Devices (SMDs), such as limited battery lifetime, limited processing capabilities, and limited storage capacity, by offloading the execution and workload to other, richer systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores different important parameters based on which the frameworks are implemented, such as the offloading method and the level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.
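
    The trade-off these frameworks evaluate can be reduced to a classical decision rule: offload when transfer time plus remote execution beats local execution. A minimal sketch, with illustrative parameter values:

        # Classical offloading decision rule (parameter values are illustrative).
        def should_offload(cycles, data_bits, f_local, f_remote, bandwidth_bps):
            """True if sending the job and running it remotely is faster."""
            t_local = cycles / f_local
            t_offload = data_bits / bandwidth_bps + cycles / f_remote
            return t_offload < t_local

        # 2e9 CPU cycles, 4 MB of input, 1 GHz phone vs 10x faster server,
        # 20 Mbit/s uplink:
        print(should_offload(2e9, 4 * 8e6, 1e9, 10e9, 20e6))   # True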

  9. Technology Assessment of the Inspection Readiness Plan in Chemical Weapons Convention Challenge Inspections

    National Research Council Canada - National Science Library

    Woodley, Anthony

    1998-01-01

    ...) Challenge Inspections. The CWC provides for intrusive inspections. The Challenge Inspection allows for a team of international inspectors to inspect, on very short notice, a naval facility suspected of violating the CWC...

  10. Process Improvement to the Inspection Readiness Plan in Chemical Weapons Convention Challenge Inspections

    National Research Council Canada - National Science Library

    Triplett, William

    1997-01-01

    ...) Challenge Inspection. The CWC provides for intrusive inspections. The Challenge Inspection allows for a team of international inspectors to inspect a naval facility suspected of violating the CWC on very short notice...

  11. Advances and Challenges in Computational Plasma Science

    International Nuclear Information System (INIS)

    Tang, W.M.; Chan, V.S.

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology

  12. Advances and challenges in computational plasma science

    International Nuclear Information System (INIS)

    Tang, W M; Chan, V S

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This

  13. Challenges in computational statistics and data mining

    CERN Document Server

    Mielniczuk, Jan

    2016-01-01

    This volume contains nineteen research papers belonging to the areas of computational statistics, data mining, and their applications. Those papers, all written specifically for this volume, are their authors’ contributions to honour and celebrate Professor Jacek Koronacki on the occasion of his 70th birthday. The book’s related and often interconnected topics represent Jacek Koronacki’s research interests and their evolution. They also clearly indicate how close the areas of computational statistics and data mining are.

  14. Investigation of Cloud Computing: Applications and Challenges

    OpenAIRE

    Amid Khatibi Bardsiri; Anis Vosoogh; Fatemeh Ahoojoosh

    2014-01-01

    Cloud computing is a model for storing data or knowledge on distant servers through the Internet. It can save the required memory space and reduce the cost of extending memory capacity in users’ own machines. Therefore, cloud computing has several benefits for individuals as well as organizations. It provides protection for personal and organizational data. Further, with the help of cloud services, a business owner, organization manager or service provider will be able to make privacy an...

  15. Computational science: Emerging opportunities and challenges

    International Nuclear Information System (INIS)

    Hendrickson, Bruce

    2009-01-01

    In the past two decades, computational methods have emerged as an essential component of the scientific and engineering enterprise. A diverse assortment of scientific applications has been simulated and explored via advanced computational techniques. Computer vendors have built enormous parallel machines to support these activities, and the research community has developed new algorithms and codes, and agreed on standards to facilitate ever more ambitious computations. However, this track record of success will be increasingly hard to sustain in coming years. Power limitations constrain processor clock speeds, so further performance improvements will need to come from ever more parallelism. This higher degree of parallelism will require new thinking about algorithms, programming models, and architectural resilience. Simultaneously, cutting edge science increasingly requires more complex simulations with unstructured and adaptive grids, and multi-scale and multi-physics phenomena. These new codes will push existing parallelization strategies to their limits and beyond. Emerging data-rich scientific applications are also in need of high performance computing, but their complex spatial and temporal data access patterns do not perform well on existing machines. These interacting forces will reshape high performance computing in the coming years.

  16. Assessing self-efficacy and college readiness level among new undergraduate students in computer science using metacognitive awareness inventory (MAI)

    Science.gov (United States)

    Othman, Wan Nor Afiqah Wan; Abdullah, Aziman

    2018-04-01

    This preliminary study was conducted to address the issue of academic planning skills among new university students. Due to the lack of a proper mechanism for measuring awareness and readiness among students, this study proposes the Metacognitive Awareness Inventory (MAI) to assess the connection between student self-efficacy and college readiness. Qualitative and quantitative approaches were used: an online self-assessment was provided to new students of the Faculty of Computer Systems & Software Engineering (FSKKP), and the resulting data were analysed. The possible relationships between MAI and the College Readiness Item (CRI) in the self-assessment were evaluated. A sample of 368 respondents from UMP completed the online self-assessment. The initial finding shows that most respondents (71%) lack planning skills. The Pearson product-moment correlation coefficient was used to test the relationship between MAI and CRI, and we found sufficient evidence that College Readiness correlates significantly with most MAI items. The findings also indicated little difference between genders in terms of self-efficacy level. This paper suggests that the MAI and CRI are reliable and valid scales for responding to the planning-skills issue among new university students.
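
    For readers unfamiliar with the statistic used in this record, the Pearson product-moment correlation can be computed in a few lines. The sketch below is illustrative only: the scores are hypothetical stand-ins, not the study's data (the actual sample had 368 respondents).

        import numpy as np
        from scipy import stats

        # Hypothetical per-student mean scores on a 1-5 Likert scale.
        mai = np.array([3.2, 4.1, 2.8, 3.9, 4.5, 3.0, 2.5, 4.0])  # metacognitive awareness
        cri = np.array([3.0, 4.3, 2.6, 3.7, 4.4, 3.2, 2.4, 4.1])  # college readiness

        # Pearson r and its two-sided p-value; a small p with positive r would
        # support a significant positive MAI-CRI relationship.
        r, p = stats.pearsonr(mai, cri)
        print(f"Pearson r = {r:.3f}, p = {p:.4f}")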

  17. Editorial: Modelling and computational challenges in granular materials

    NARCIS (Netherlands)

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss

  18. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    Science.gov (United States)

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  19. Female challenges in acquiring computer education at the federal ...

    African Journals Online (AJOL)

    Computer education and the application of computer skills in the knowledge-based society are ever increasing. It is in recognition of this that this study determined the challenges of female students in the acquisition of computer education, using the Federal Polytechnic, Idah as a case study. The data were obtained from 72 female ...

  20. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    -based solutions to significant problems? The important issues of workflow and data flow are discussed together with fit-for-purpose model development. As well, the lack of tools around multiscale modelling provides opportunities for the development of efficient tools to address such challenges. The ability...

  1. Computing challenges in HEP for WLHC grid

    CERN Document Server

    Muralidharan, Servesh

    2017-01-01

    As CERN moves towards preparation for increasing the luminosity of the particle beam for the HL-LHC, predictions show that computing demand would outgrow our conservative scaling estimates by over ten times. Fortunately, we are talking about a time scale of roughly ten years in which to develop new techniques and novel solutions to address this gap in compute resources. Experiments at CERN face a unique scenario wherein they need to scale both latency-sensitive workloads, such as data acquisition from the detectors, and throughput-based ones, such as simulations and reconstruction of high-level events and physics processes. In this talk we cover some of the ongoing research at Tier-0 at CERN which investigates several aspects of throughput-sensitive workloads that consume significant compute cycles.

  2. Biomedical Visual Computing: Case Studies and Challenges

    KAUST Repository

    Johnson, Christopher

    2012-01-01

    Advances in computational geometric modeling, imaging, and simulation let researchers build and test models of increasing complexity, generating unprecedented amounts of data. As recent research in biomedical applications illustrates, visualization will be critical in making this vast amount of data usable; it's also fundamental to understanding models of complex phenomena. © 2012 IEEE.

  3. Biomedical Visual Computing: Case Studies and Challenges

    KAUST Repository

    Johnson, Christopher

    2012-01-01

    Advances in computational geometric modeling, imaging, and simulation let researchers build and test models of increasing complexity, generating unprecedented amounts of data. As recent research in biomedical applications illustrates, visualization will be critical in making this vast amount of data usable; it's also fundamental to understanding models of complex phenomena. © 2012 IEEE.

  4. Editorial: Modelling and computational challenges in granular materials

    OpenAIRE

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss the current progress and latest advancements in the field of advanced numerical methods and modelling of granular materials. The focus will be on computational methods, improved algorithms and the m...

  5. Mathematical challenges from theoretical/computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-31

    The committee believes that this report has relevance and potentially valuable suggestions for a wide range of readers. Target audiences include: graduate departments in the mathematical and chemical sciences; federal and private agencies that fund research in the mathematical and chemical sciences; selected industrial and government research and development laboratories; developers of software and hardware for computational chemistry; and selected individual researchers. Chapter 2 of this report covers some history of computational chemistry for the nonspecialist, while Chapter 3 illustrates the fruits of some past successful cross-fertilization between mathematical scientists and computational/theoretical chemists. In Chapter 4 the committee has assembled a representative, but not exhaustive, survey of research opportunities. Most of these are descriptions of important open problems in computational/theoretical chemistry that could gain much from the efforts of innovative mathematical scientists, written so as to be accessible introductions to the nonspecialist. Chapter 5 is an assessment, necessarily subjective, of cultural differences that must be overcome if collaborative work is to be encouraged between the mathematical and the chemical communities. Finally, the report ends with a brief list of conclusions and recommendations that, if followed, could promote accelerated progress at this interface. Recognizing that bothersome language issues can inhibit prospects for collaborative research at the interface between distinctive disciplines, the committee has attempted throughout to maintain an accessible style, in part by using illustrative boxes, and has included at the end of the report a glossary of technical terms that may be familiar to only a subset of the target audiences listed above.

  6. Challenges & Roadmap for Beyond CMOS Computing Simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Arun F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Frank, Michael P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    Simulating HPC systems is a difficult task and the emergence of “Beyond CMOS” architectures and execution models will increase that difficulty. This document presents a “tutorial” on some of the simulation challenges faced by conventional and non-conventional architectures (Section 1) and goals and requirements for simulating Beyond CMOS systems (Section 2). These provide background for proposed short- and long-term roadmaps for simulation efforts at Sandia (Sections 3 and 4). Additionally, a brief explanation of a proof-of-concept integration of a Beyond CMOS architectural simulator is presented (Section 2.3).

  7. Risk management is every manager's responsibility: Are HR practitioners ready for the challenge?

    Directory of Open Access Journals (Sweden)

    Pascal Siphelele Zulu

    2014-07-01

    Full Text Available Risk and enterprise risk management have become a strategic imperative in most organisations and government departments over the years. Most company boards and government entities in South Africa have adopted various corporate governance frameworks as a mechanism to direct and control the operations of their organisations. As a result, risk management and enterprise risk management have become every manager’s responsibility. The key question that the study investigates is whether HR managers are aware of this strategic imperative and ready to be risk champions in their environment. Data were collected from forty-eight (48) HR managers and practitioners from private companies and sixty-eight (68) HR managers and practitioners from government departments and government companies in Durban, KwaZulu-Natal and Cape Town, Western Cape, using both personal interviews and questionnaires, which were distributed to one hundred and fifty (150) employees; one hundred and sixteen (116) questionnaires were completed (return rate 77.3%). The results of this paper indicate that, in general and across all sectors, HR practitioners’ levels of understanding of corporate governance and risk management are limited

  8. Cloud computing challenges, limitations and R&D solutions

    CERN Document Server

    Mahmood, Zaigham

    2014-01-01

    This important text/reference reviews the challenging issues that present barriers to greater implementation of the cloud computing paradigm, together with the latest research into developing potential solutions. Exploring the strengths and vulnerabilities of cloud provision and cloud environments, Cloud Computing: Challenges, Limitations and R&D Solutions provides case studies from a diverse selection of researchers and practitioners of international repute. The implications of emerging cloud technologies are also analyzed from the perspective of consumers. Topics and features: presents

  9. Computer graphics visions and challenges: a European perspective.

    Science.gov (United States)

    Encarnação, José L

    2006-01-01

    I have briefly described important visions and challenges in computer graphics. They are a personal and therefore subjective selection. But most of these issues have to be addressed and solved--no matter if we call them visions or challenges or something else--if we want to make and further develop computer graphics into a key enabling technology for our IT-based society.

  10. Corporate Social Responsibility in the Financial Sector: Are Financial Cooperatives Ready to the Challenge?

    OpenAIRE

    Élias Rizkallah; Inmaculada Buendía Martínez

    2011-01-01

    After the crash of financial institutions and the negative effects of the financial crisis, financial service cooperatives (FSCs) emerged as good performers compared to commercial banks. But this condition will not be enough to face the challenges that the new financial panorama will bring to the banking arena. Among them, challenges related to the corporate social responsibility (CSR) sphere will play a special role. In Canada, the financial regulatory framework forces some federal institutio...

  11. Multimodal Challenge: Analytics Beyond User-computer Interaction Data

    NARCIS (Netherlands)

    Di Mitri, Daniele; Schneider, Jan; Specht, Marcus; Drachsler, Hendrik

    2018-01-01

    This contribution describes one of the challenges explored in the Fourth LAK Hackathon. This challenge aims at shifting the focus from learning situations which can be easily traced through user-computer interaction data and concentrating more on user-world interaction events, typical of co-located and

  12. Addressing Cloud Computing in Enterprise Architecture: Issues and Challenges

    OpenAIRE

    Khan, Khaled; Gangavarapu, Narendra

    2009-01-01

    This article discusses how the characteristics of cloud computing affect the enterprise architecture in four domains: business, data, application and technology. The ownership and control of architectural components are shifted from organisational perimeters to cloud providers. It argues that although cloud computing promises numerous benefits to enterprises, the shift of control over architectural components from enterprises to cloud providers introduces several architectural challenges. The d...

  13. Getting ready for REDD+ in Tanzania: a case study of progress and challenges

    DEFF Research Database (Denmark)

    Burgess, Neil David; Bahane, Bruno; Clairs, Tim

    2010-01-01

    the Norwegian, Finnish and German governments and is a participant in the World Bank’s Forest Carbon Partnership Facility. In combination, these interventions aim to mitigate greenhouse gas emissions, provide an income to rural communities and conserve biodiversity. The establishment of the UN-REDD Programme...... in Tanzania illustrates real-world challenges in a developing country. These include currently inadequate baseline forestry data sets (needed to calculate reference emission levels), inadequate government capacity and insufficient experience of implementing REDD+-type measures at operational levels....... Additionally, for REDD+ to succeed, current users of forest resources must adopt new practices, including the equitable sharing of benefits that accrue from REDD+ implementation. These challenges are being addressed by combined donor support to implement a national forest inventory, remote sensing of forest...

  14. Military Readiness: Progress and Challenges in Implementing the Navy’s Optimized Fleet Response Plan

    Science.gov (United States)

    2016-05-02

    command and control under the OFRP contributes to wide swings in port workload, which in turn can have a negative effect on the private-sector industrial...for 53 percent of all private-sector aircraft carrier maintenance contracts and 70 percent of cruiser and destroyer contracts from fiscal years...their impact on the Navy; (2) the Navy’s goals and progress in implementing the OFRP; and (3) challenges faced by public and private shipyards

  15. Mobile Computing: The Emerging Technology, Sensing, Challenges and Applications

    International Nuclear Information System (INIS)

    Bezboruah, T.

    2010-12-01

    Mobile computing is a computing system in which a computer and all necessary accessories, like files and software, are taken out into the field. It is a system of computing through which one is able to use a computing device even while mobile and therefore changing location. Portability is one of the important aspects of mobile computing. Mobile phones are being used to gather scientific data from remote and isolated places that could not be retrieved by other means. Scientists are starting to use mobile devices and web-based applications to systematically explore interesting scientific aspects of their surroundings, ranging from climate change and environmental pollution to earthquake monitoring. This mobile revolution enables new ideas and innovations to spread out more quickly and efficiently. Here we discuss in brief the mobile computing technology, its sensing, challenges and applications. (author)

  16. The computational challenges of Earth-system science.

    Science.gov (United States)

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.

  17. Opinions differ on whether nuclear energy industry is ready for cyber-challenges

    International Nuclear Information System (INIS)

    Dalton, David

    2017-01-01

    In October 2015 the UK's respected Chatham House think-tank published a report that drew some worrying conclusions about the civil nuclear industry. It said many in the sector do not fully understand the risks posed by hackers and the industry needs to be "more robust" on taking the initiative in cyberspace and funding effective responses to the challenge. The industry does not seem to be prepared for a large-scale cyber security emergency and needs to invest in counter-measures and response plans, the report said. It warned that developing countries are "particularly vulnerable" to cyber-attacks at nuclear facilities. The industry should develop guidelines to measure cyber security risk, including an integrated risk assessment that takes both security and safety measures into account. All countries with nuclear facilities should adopt an effective regulatory approach to cyber security e.g. on the basis of IAEA guidance.

  18. Effects of Home and School Computer Use on School Readiness and Cognitive Development among Head Start Children: A Randomized Controlled Pilot Trial

    Science.gov (United States)

    Li, Xiaoming; Atkins, Melissa S.; Stanton, Bonita

    2006-01-01

    Data from 122 Head Start children were analyzed to examine the impact of computer use on school readiness and psychomotor skills. Children in the experimental group were given the opportunity to work on a computer for 15-20 minutes per day with their choice of developmentally appropriate educational software, while the control group received a…

  19. High-End Computing Challenges in Aerospace Design and Engineering

    Science.gov (United States)

    Bailey, F. Ronald

    2004-01-01

    High-End Computing (HEC) has had significant impact on aerospace design and engineering and is poised to make even more in the future. In this paper we describe four aerospace design and engineering challenges: Digital Flight, Launch Simulation, Rocket Fuel System and Digital Astronaut. The paper discusses modeling capabilities needed for each challenge and presents projections of future near and far-term HEC computing requirements. NASA's HEC Project Columbia is described and programming strategies presented that are necessary to achieve high real performance.

  20. Opinions differ on whether nuclear energy industry is ready for cyber-challenges

    Energy Technology Data Exchange (ETDEWEB)

    Dalton, David [NucNet, Brussels (Belgium)

    2017-05-15

    In October 2015 the UK's respected Chatham House think-tank published a report that drew some worrying conclusions about the civil nuclear industry. It said many in the sector do not fully understand the risks posed by hackers and the industry needs to be "more robust" on taking the initiative in cyberspace and funding effective responses to the challenge. The industry does not seem to be prepared for a large-scale cyber security emergency and needs to invest in counter-measures and response plans, the report said. It warned that developing countries are "particularly vulnerable" to cyber-attacks at nuclear facilities. The industry should develop guidelines to measure cyber security risk, including an integrated risk assessment that takes both security and safety measures into account. All countries with nuclear facilities should adopt an effective regulatory approach to cyber security e.g. on the basis of IAEA guidance.

  1. New Challenges for Computing in High Energy Physics

    International Nuclear Information System (INIS)

    Santoro, Alberto

    2003-01-01

    In view of the new scientific programs established for the LHC (Large Hadron Collider) era, the way to face the technological challenges in computing was to develop a new concept of GRID computing. We show some examples and, in particular, a proposal for high energy physicists in countries like Brazil. Due to the big amount of data and the need for close collaboration, it will be impossible to work in research centers and universities very far from Fermilab or CERN unless a GRID architecture is built. An important effort is being made by the international community to bring their computing infrastructures and networks up to date

  2. Chips challenging champions games, computers and artificial intelligence

    CERN Document Server

    Schaeffer, J

    2002-01-01

    One of the earliest dreams of the fledgling field of artificial intelligence (AI) was to build computer programs that could play games as well as or better than the best human players. Despite early optimism in the field, the challenge proved to be surprisingly difficult. However, the 1990s saw amazing progress. Computers are now better than humans in checkers, Othello and Scrabble; are at least as good as the best humans in backgammon and chess; and are rapidly improving at hex, go, poker, and shogi. This book documents the progress made in computers playing games and puzzles. The book is the

  3. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  4. Challenges and opportunities of cloud computing for atmospheric sciences

    Science.gov (United States)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from having access to a large cyberinfrastructure in order to fund or perform a research project. Cloud computing can avoid maintenance expenses for large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. Both problems are closely related: usually, uncertainty can be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources to climate modeling, using the cloud computing infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and research.

  5. Scenario-Based Digital Forensics Challenges in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Erik Miranda Lopez

    2016-10-01

    Full Text Available The aim of digital forensics is to extract information to answer the 5Ws (Why, When, Where, What, and Who) from the data extracted from the evidence. In order to achieve this, most digital forensic processes assume absolute control of digital evidence. However, in a cloud environment forensic investigation, this is not always possible. Additionally, the unique characteristics of cloud computing create new technical, legal and architectural challenges when conducting a forensic investigation. We propose a hypothetical scenario to uncover and explain the challenges forensic practitioners face during cloud investigations. Additionally, we also provide solutions to address the challenges. Our hypothetical case scenario has shown that, in the long run, better live forensic tools, the development of new methods tailored for cloud investigations, and new procedures and standards are indeed needed. Furthermore, we have come to the conclusion that the biggest challenge in forensic investigations is not technical but legal.

  6. Precision Medicine and PET/Computed Tomography: Challenges and Implementation.

    Science.gov (United States)

    Subramaniam, Rathan M

    2017-01-01

    Precision Medicine is about selecting the right therapy for the right patient, at the right time, specific to the molecular targets expressed by disease or tumors, in the context of patient's environment and lifestyle. Some of the challenges for delivery of precision medicine in oncology include biomarkers for patient selection for enrichment-precision diagnostics, mapping out tumor heterogeneity that contributes to therapy failures, and early therapy assessment to identify resistance to therapies. PET/computed tomography offers solutions in these important areas of challenges and facilitates implementation of precision medicine. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Reviews on Security Issues and Challenges in Cloud Computing

    Science.gov (United States)

    An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.

    2016-11-01

    Cloud computing is an Internet-based computing service provided by a third party, allowing the sharing of resources and data among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes how the Information Technology (IT) of an organization is organized and managed. It provides many benefits, such as simplicity and lower costs, almost unlimited storage, minimal maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment and a lower barrier to entry. While the use of cloud computing services increases in this new era, their security becomes a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first lists the architecture of cloud computing, then discusses the most common security issues of using the cloud and some solutions to them, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.

  8. Challenges to Software/Computing for Experimentation at the LHC

    Science.gov (United States)

    Banerjee, Sunanda

    The demands of future high energy physics experiments on software and computing have led the experiments to plan the related activities as a full-fledged project and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently put together in an LHC-wide framework based on Grid technology. The current status and understanding are broadly outlined.

  9. Towards Cloud Computing SLA Risk Management: Issues and Challenges

    OpenAIRE

    Morin, Jean-Henry; Aubert, Jocelyn; Gateau, Benjamin

    2012-01-01

    Cloud Computing has become mainstream technology offering a commoditized approach to software, platform and infrastructure as a service over the Internet on a global scale. This raises important new security issues beyond traditional perimeter-based approaches. This paper attempts to identify these issues and their corresponding challenges, proposing to use risk and Service Level Agreement (SLA) management as the basis for a service level framework to improve governance, risk and compliance i...

  10. The emergence of grammar in a language-ready brain. Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    Science.gov (United States)

    Hawkins, John A.

    2016-03-01

    Arbib makes the interesting proposal [3, §1.6] that the first Homo sapiens could have been "language-ready", without possessing the kind of rich lexicon, grammar and compositional semantics that we see in the world's languages today. This early language readiness would have consisted of a set of "protolanguage" abilities, which he enumerates (1-7 in §1.6), supported by brain mechanisms unique to humans. The transition to full "language" (properties 8-11 in §1.6 and §3) would have required no changes in the genome, he argues, but could have resulted from cultural evolution plus some measure of Baldwinian evolution favoring offspring with greater linguistic skill. The full picture is set out in [1].

  11. Radiation doses to patients in computed tomography including a ready reckoner for dose estimation

    International Nuclear Information System (INIS)

    Szendroe, G.; Axelsson, B.; Leitz, W.

    1995-11-01

    The radiation burden from CT examinations is still growing in most countries and now accounts for a considerable part of the total from medical diagnostic x-ray procedures. Efforts to avoid excess radiation doses are therefore especially well motivated in this field. A survey of CT examination techniques practised in Sweden showed that standard settings for the exposure variables are used for the vast majority of examinations. Virtually no adjustments to differences in patient anatomy have been made; even for infants and children, on average the same settings have been used. Adjusting the exposure variables to the individual anatomy offers a large potential for dose savings. Among the imaging parameters, a change in radiation dose will primarily influence the noise. As a starting point it is assumed that, irrespective of the patient's anatomy, the same level of noise can be accepted for a certain diagnostic task. To a large extent the noise level is determined by the number of photons registered in the detector. Hence, for different patient sizes and anatomies, the exposure should be adjusted so that the same transmitted photon fluence is achieved. An appendix with a ready reckoner for dose estimation for CT scanners used in Sweden is attached. 7 refs, 5 figs, 8 tabs
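
    The adjustment rule stated above (keep the transmitted photon fluence, and hence the noise, constant across patient sizes) follows from Beer-Lambert attenuation, N = N0·exp(-μd). A minimal sketch of the resulting exposure scaling is given below; the attenuation coefficient and thicknesses are assumed illustrative values, and this is not the report's actual ready reckoner.

        import math

        MU = 0.02  # assumed effective linear attenuation coefficient (per mm, soft tissue)

        def mas_factor(reference_mm: float, patient_mm: float) -> float:
            """Factor by which the tube current-time product (mAs) should change
            so that the transmitted fluence N0 * exp(-MU * d) stays constant."""
            return math.exp(MU * (patient_mm - reference_mm))

        # A patient 40 mm thinner than the reference anatomy needs under half the
        # standard mAs for the same detector fluence:
        print(f"{mas_factor(300, 260):.2f}")  # -> 0.45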

  12. Computational challenges in atomic, molecular and optical physics.

    Science.gov (United States)

    Taylor, Kenneth T

    2002-06-15

    Six challenges are discussed. These are the laser-driven helium atom; the laser-driven hydrogen molecule and hydrogen molecular ion; electron scattering (with ionization) from one-electron atoms; the vibrational and rotational structure of molecules such as H₃⁺ and water at their dissociation limits; laser-heated clusters; and quantum degeneracy and Bose-Einstein condensation. The first four concern fundamental few-body systems where use of high-performance computing (HPC) is currently making possible accurate modelling from first principles. This leads to reliable predictions and support for laboratory experiment as well as true understanding of the dynamics. Important aspects of these challenges addressable only via a terascale facility are set out. Such a facility makes the last two challenges in the above list meaningfully accessible for the first time, and the scientific interest together with the prospective role for HPC in these is emphasized.

  13. The ATLAS computing challenge for HL-LHC

    CERN Document Server

    Campana, Simone; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment successfully commissioned a software and computing infrastructure to support the physics program during LHC Run 2. The next phases of the accelerator upgrade will present new challenges in the offline area. In particular, at the High Luminosity LHC (also known as Run 4) the data taking conditions will be very demanding in terms of computing resources: between 5 and 10 kHz of event rate from the HLT to be reconstructed (and possibly further reprocessed) with an average pile-up of up to 200 events per collision, and an equivalent number of simulated samples to be produced. The same parameters for the current run are lower by up to an order of magnitude. While processing and storage resources would need to scale accordingly, the funding situation allows one at best to consider a flat budget over the next few years for offline computing needs. In this paper we present a study quantifying the challenge in terms of computing resources for HL-LHC and present ideas about the possible evolution of the ...

  14. Undergraduate students’ challenges with computational modelling in physics

    Directory of Open Access Journals (Sweden)

    Simen A. Sørby

    2012-12-01

    Full Text Available In later years, computational perspectives have become essential parts of several of the University of Oslo’s natural science studies. In this paper we discuss some main findings from a qualitative study of the computational perspectives’ impact on the students’ work with their first course in physics – mechanics – and their learning and meaning making of its contents. Discussions of the students’ learning of physics are based on sociocultural theory, which originates in Vygotsky and Bakhtin, and subsequent physics education research. Results imply that the greatest challenge for students when working with computational assignments is to combine knowledge from previously known, but separate contexts. Integrating knowledge of informatics, numerical and analytical mathematics and conceptual understanding of physics appears as a clear challenge for the students. We also observe a lack of awareness concerning the limitations of physical modelling. The students need help with identifying the appropriate knowledge system or “tool set” for the different tasks at hand; they need help to create a plan for their modelling and to become aware of its limits. In light of this, we propose that an instructive and dialogic text as a basis for the exercises, in which the emphasis is on specification, clarification and elaboration, would potentially be of great aid for students who are new to computational modelling.

  15. Static Load Balancing Algorithms in Cloud Computing: Challenges & Solutions

    Directory of Open Access Journals (Sweden)

    Nadeem Shah

    2015-08-01

    Full Text Available Cloud computing provides on-demand hosted computing resources and services over the Internet on a pay-per-use basis. It is currently becoming the favored method of communication and computation over scalable networks due to numerous attractive attributes such as high availability, scalability, fault tolerance, simplicity of management and low cost of ownership. Due to the huge demand for cloud computing, efficient load balancing becomes critical to ensure that computational tasks are evenly distributed across servers to prevent bottlenecks. The aim of this review paper is to understand the current challenges in cloud computing, primarily in cloud load balancing using static algorithms, and to find gaps to bridge for more efficient static cloud load balancing in the future. We believe the ideas suggested as new solutions will allow researchers to redesign better algorithms for better functionalities and improved user experiences in simple cloud systems. This could assist small businesses that cannot afford infrastructure that supports complex and dynamic load balancing algorithms.
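
    To make the 'static' in static load balancing concrete: a static algorithm fixes the assignment policy in advance and uses no runtime load information. The sketch below shows two classic static strategies; the server names and weights are hypothetical, and the review itself does not prescribe this code.

        import itertools
        import random

        SERVERS = ["srv-a", "srv-b", "srv-c"]           # hypothetical back-ends
        WEIGHTS = {"srv-a": 5, "srv-b": 3, "srv-c": 2}  # capacities fixed in advance

        # Round-robin: the choice depends only on request order, never on current load.
        _cycle = itertools.cycle(SERVERS)

        def round_robin() -> str:
            return next(_cycle)

        # Weighted random: servers are picked in proportion to their static weights.
        def weighted_random() -> str:
            return random.choices(SERVERS, weights=[WEIGHTS[s] for s in SERVERS], k=1)[0]

        for i in range(6):
            print(i, round_robin(), weighted_random())

    Both policies are cheap and simple, which is why they suit the "simple cloud systems" the review targets; their weakness is that a skewed or bursty workload can still overload one server, since no feedback is used.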

  16. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    Science.gov (United States)

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists are facing the challenge of gaining a profound insight into the deepest biological functions from big biological data. This in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are highly needed, as well as efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study to compare the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
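
    As a pointer for readers new to the case-study problem mentioned above, classical pairwise sequence alignment is solved with dynamic programming. Below is a minimal global-alignment (Needleman-Wunsch) scoring sketch, kept deliberately simple; the HPC platforms surveyed in the paper run heavily optimized variants of this same recurrence.

        def needleman_wunsch(a: str, b: str, match=1, mismatch=-1, gap=-2) -> int:
            """Return the optimal global alignment score of sequences a and b."""
            n, m = len(a), len(b)
            # dp[i][j] = best score aligning the prefixes a[:i] and b[:j]
            dp = [[0] * (m + 1) for _ in range(n + 1)]
            for i in range(1, n + 1):
                dp[i][0] = i * gap
            for j in range(1, m + 1):
                dp[0][j] = j * gap
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    diag = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                    dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
            return dp[n][m]

        print(needleman_wunsch("GATTACA", "GCATGCT"))  # small illustrative DNA strings

    The O(n·m) cost of this recurrence, applied across millions of sequences, is what drives the need for the parallel platforms and accelerators the survey compares.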

  17. Achievements and challenges in structural bioinformatics and computational biophysics.

    Science.gov (United States)

    Samish, Ilan; Bourne, Philip E; Najmanovich, Rafael J

    2015-01-01

    The field of structural bioinformatics and computational biophysics has undergone a revolution in the last 10 years, developments that are captured annually through the 3DSIG meeting, upon which this article reflects. An increase in accessible data, computational resources and methodology has resulted in an increase in the size and resolution of studied systems and in the complexity of the questions amenable to research. Concomitantly, the parameterization and efficiency of the methods have markedly improved, along with their cross-validation with other computational and experimental results. The field exhibits an ever-increasing integration with biochemistry, biophysics and other disciplines. In this article, we discuss recent achievements along with current challenges within the field. © The Author 2014. Published by Oxford University Press.

  18. Multiscale methods in turbulent combustion: strategies and computational challenges

    International Nuclear Information System (INIS)

    Echekki, Tarek

    2009-01-01

    A principal challenge in modeling turbulent combustion flows is associated with their complex, multiscale nature. Traditional paradigms in the modeling of these flows have attempted to address this nature through different strategies, including exploiting the separation of turbulence and combustion scales and a reduced description of the composition space. The resulting moment-based methods often yield reasonable predictions of flow and reactive scalars' statistics under certain conditions. However, these methods must constantly evolve to address combustion at different regimes, modes or with dominant chemistries. In recent years, alternative multiscale strategies have emerged, which although in part inspired by the traditional approaches, also draw upon basic tools from computational science, applied mathematics and the increasing availability of powerful computational resources. This review presents a general overview of different strategies adopted for multiscale solutions of turbulent combustion flows. Within these strategies, some specific models are discussed or outlined to illustrate their capabilities and underlying assumptions. These strategies may be classified under four different classes, including (i) closure models for atomistic processes, (ii) multigrid and multiresolution strategies, (iii) flame-embedding strategies and (iv) hybrid large-eddy simulation-low-dimensional strategies. A combination of these strategies and models can potentially represent a robust alternative strategy to moment-based models; but a significant challenge remains in the development of computational frameworks for these approaches as well as their underlying theories. (topical review)

  19. US DOE Grand Challenge in Computational Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, R.; Habib, S.; Qiang, J.; Ko, K.; Li, Z.; McCandless, B.; Mi, W.; Ng, C.; Saparov, M.; Srinivas, V.; Sun, Y.; Zhan, X.; Decyk, V.; Golub, G.

    1998-01-01

    Particle accelerators are playing an increasingly important role in basic and applied science, and are enabling new accelerator-driven technologies. But the design of next-generation accelerators, such as linear colliders and high intensity linacs, will require a major advance in numerical modeling capability due to extremely stringent beam control and beam loss requirements, and the presence of highly complex three-dimensional accelerator components. To address this situation, the U.S. Department of Energy has approved a "Grand Challenge" in Computational Accelerator Physics, whose primary goal is to develop a parallel modeling capability that will enable high performance, large scale simulations for the design, optimization, and numerical validation of next-generation accelerators. In this paper we report on the status of the Grand Challenge

  20. Computer and internet use in vascular outpatients--ready for interactive applications?

    Science.gov (United States)

    Richter, J G; Schneider, M; Klein-Weigel, P

    2009-11-01

    Exploring patients' computer and internet use, their expectations and attitudes is mandatory for the successful introduction of interactive online health-care applications in angiology. We included 165 outpatients suffering from peripheral arterial disease (PAD; n = 62) and chronic venous and/or lymphatic disease (CVLD; n = 103) in a cross-sectional study. Patients answered a paper-based questionnaire. Patients were predominantly female (54.5%). 142 (86.1%) reported regular computer use for 9.7 +/- 5.8 years and 134 (81.2%) used the internet for 6.2 +/- 3.6 years. CVLD patients and internet users were younger and higher educated, resulting in a significant difference in computer and internet use between the disease groups (p ...). Internet users otherwise showed no significant differences between the groups. The topics retrieved from the internet covered a wide spectrum, and searches for health information were mentioned by 41.2%. Although confidence in the internet (3.3 +/- 1.1 on a 1-6 Likert scale) and the reliability of information retrieved from the internet (3.1 +/- 1.1) were rated relatively low, health-related issues were of high actual and future interest. 42.8% of the patients were even interested in interactive applications like health educational programs, 37.4% in self-reported assessments and outcome questionnaires and 26.9% in chat forums; 50% demanded access to their medical data on an Internet server. Compared to older participants, younger ones used the internet more often for shopping, chatting and e-mailing, but not for health information retrieval and interactive applications. Computers are commonly used and the internet has been adopted as an important source of information by patients suffering from PAD and CVLD. Moreover, the internet offers great potential and new opportunities for interactive disease (self-)management in angiology. To increase confidence and reliability in the medium, a careful introduction and evaluation of these new online applications is mandatory.

  1. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.
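
    To give a flavour of what "estimation of the probability of failure of a gravity dam for the sliding failure mode" involves computationally, here is a deliberately simplified Monte Carlo sketch. The limit-state form, units and all parameter distributions are assumptions chosen for illustration, not the benchmark's specification.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 1_000_000  # Monte Carlo samples

        # Assumed random variables (illustrative, per metre of dam length):
        tan_phi = rng.normal(0.75, 0.07, N)    # friction coefficient at the dam-rock interface
        cohesion = rng.lognormal(0.0, 0.4, N)  # cohesion contribution, MN
        weight = 50.0                          # effective vertical load, MN (deterministic here)
        h_load = rng.normal(25.0, 2.5, N)      # horizontal hydrostatic load, MN

        # Sliding limit state: failure when the driving force exceeds the resistance.
        resistance = cohesion + weight * tan_phi
        p_failure = np.mean(resistance < h_load)
        print(f"Estimated P(sliding failure) ~ {p_failure:.1e}")

    Direct sampling like this becomes expensive when each resistance evaluation requires a full structural model, which is precisely the computational aspect the workshops examined.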

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files with a high writing speed to tapes. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1s: response time, data transfer rate and success rate for tape-to-buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  3. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    International Nuclear Information System (INIS)

    Tang, William M.

    2011-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP), a major national initiative in the United States, with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  4. Scientific and computational challenges of the fusion simulation program (FSP)

    International Nuclear Information System (INIS)

    Tang, William M.

    2011-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) - a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  5. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    Tang, W M

    2008-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08 (CCRC'08). CCRC'08 is being run by the WLCG collaboration in two phases, involving the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives, as well as network links, are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  7. Community Mobilization and Readiness: Planning Flaws which Challenge Effective Implementation of 'Communities that Care' (CTC) Prevention System.

    Science.gov (United States)

    Basic, Josipa

    2015-01-01

    This article reviews the experience of implementing a community approach to drug use and youth delinquency prevention based on the 'Communities that Care' (CTC) system in one Croatian county consisting of 12 communities, from 2002 to 2013 (Hawkins, 1999; Hawkins & Catalano, 2004). This overview explores selected critical issues which are often not considered in substance use(r) community intervention planning and implementation, as well as in associated process and outcome assessments. These issues include, among others, mobilizing an adequate representation of people; involving relevant key individual and organizational stakeholders; and being aware of the stakeholders' willingness to participate in the prevention process. In addition, it is important to be aware of the stakeholders' knowledge and perceptions about the 'problems' of drug use and youth delinquency in their communities, as well as the characteristics of the targeted population(s). Sometimes there are community members and stakeholders who block needed change, and therefore prevention process enablers and 'bridges' should be involved in moving prevention programming forward. Another barrier that is often overlooked in prevention planning is community readiness to change, along with a realistic assessment of the available and accessible resources for initiating the planned change(s) and sustaining them. All of these issues have been found to be potentially related to intervention success. At the end of this article, I summarize perspectives from prevention scientists and practitioners and lessons learned from community readiness research and practice in Croatia that have international relevance.

  8. Cognitive Readiness

    National Research Council Canada - National Science Library

    Morrison, John

    2002-01-01

    Cognitive readiness is described as the mental preparation an individual needs to establish and sustain competent performance in the complex and unpredictable environment of modern military operations...

  9. Parameterized algorithmics for computational social choice : nine research challenges

    NARCIS (Netherlands)

    Bredereck, R.; Chen, J.; Faliszewski, P.; Guo, J.; Niedermeier, R.; Woeginger, G.J.

    2014-01-01

    Computational Social Choice is an interdisciplinary research area involving Economics, Political Science, and Social Science on the one side, and Mathematics and Computer Science (including Artificial Intelligence and Multiagent Systems) on the other side. Typical computational problems studied in

  10. Computational brain connectivity mapping: A core health and scientific challenge.

    Science.gov (United States)

    Deriche, Rachid

    2016-10-01

    One third of the burden of all diseases in Europe is due to problems caused by diseases affecting the brain. Although exceptional progress has been made in exploring the brain during the past decades, it remains terra incognita and calls for specific research efforts to better understand its architecture and functioning. To take up this great challenge of modern science and to overcome the limited view of the brain provided by any single imaging modality, this article advocates the idea, developed in my research group, of a global approach involving a new generation of models for brain connectivity mapping and strong interactions between structural and functional connectivities. Capitalizing on the strengths of integrated and complementary non-invasive imaging modalities such as diffusion Magnetic Resonance Imaging (dMRI) and Electro- & Magneto-Encephalography (EEG & MEG) will contribute to achieving new frontiers for identifying and characterizing structural and functional brain connectivities and to providing a detailed mapping of brain connectivity, both in space and time, thus leading to added clinical value for high-impact diseases and new perspectives in computational neuro-imaging and cognitive neuroscience. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. GIS Readiness Survey 2014

    DEFF Research Database (Denmark)

    Schrøder, Lise; Hvingel, Line Træholt; Hansen, Henning Sten

    2014-01-01

    The GIS Readiness Survey 2014 is a follow-up to the corresponding survey that was carried out among public institutions in Denmark in 2009. The present survey thus provides an updated image of status and challenges in relation to the use of spatial information, the construction of the common...

  12. A heterogeneous computing environment to solve the 768-bit RSA challenge

    OpenAIRE

    Kleinjung, Thorsten; Bos, Joppe Willem; Lenstra, Arjen K.; Osvik, Dag Arne; Aoki, Kazumaro; Contini, Scott; Franke, Jens; Thomé, Emmanuel; Jermini, Pascal; Thiémard, Michela; Leyland, Paul; Montgomery, Peter L.; Timofeev, Andrey; Stockinger, Heinz

    2010-01-01

    In December 2009 the 768-bit, 232-digit number RSA-768 was factored using the number field sieve. Overall, the computational challenge would have taken more than 1700 years on a single, standard core. In this article we present the heterogeneous computing approach, involving different compute clusters and Grid computing environments, used to solve this problem.

  13. Computer usage among nurses in rural health-care facilities in South Africa: obstacles and challenges.

    Science.gov (United States)

    Asah, Flora

    2013-04-01

    This study discusses factors inhibiting computer usage for work-related tasks among computer-literate professional nurses within rural health-care facilities in South Africa. In the past two decades computer literacy courses have not been part of the nursing curricula. Computer courses are offered by the State Information Technology Agency. Despite this, there seems to be limited use of computers by professional nurses in the rural context. Focus group interviews were held with 40 professional nurses from three government hospitals in northern KwaZulu-Natal. Contributing factors were found to be a lack of information technology infrastructure, restricted access to computers, and deficits in technical and nursing management support. The physical location of computers within the health-care facilities and the lack of relevant software emerged as specific obstacles to usage. Provision of continuous and active support from nursing management could positively influence computer usage among professional nurses. A closer integration of information technology and computer literacy skills into existing nursing curricula would foster a positive attitude towards computer usage through early exposure. Responses indicated that a change of mindset may be needed on the part of nursing management so that they begin to actively promote ready access to computers as a means of creating greater professionalism and collegiality. © 2011 Blackwell Publishing Ltd.

  14. Evaluating mobile centric readiness of students: A case of computer science students in open-distance learning

    CSIR Research Space (South Africa)

    Chipangura, B

    2015-07-01

    Keywords: mobile centric services, mobile information access, mobile readiness. As the mobile phone market matures in terms of penetration rate, subscription rate, handset functionality and mobile centric services..., this reflects a ratio of one mobile phone per person. High mobile phone penetration has made it possible for digitally alienated communities in developing countries to have improved access to business, health, education and social services. Indeed, this has...

  15. Y2K compliance readiness and contingency planning.

    Science.gov (United States)

    Stahl, S; Cohan, D

    1999-09-01

    As the millennium approaches, discussion of "Y2K compliance" will shift to discussion of "Y2K readiness." While "compliance" focuses on the technological functioning of one's own computers, "readiness" focuses on the operational planning required in a world of interdependence, in which the functionality of one's own computers is only part of the story. "Readiness" includes the ability to cope with potential Y2K failures of vendors, suppliers, staff, banks, utility companies, and others. Administrators must apply their traditional skills of analysis, inquiry and diligence to the manifold imaginable challenges which Y2K will thrust upon their facilities. The SPICE template can be used as a systematic tool to guide planning for this historic event.

  16. The challenge of networked enterprises for cloud computing interoperability

    OpenAIRE

    Mezgár, István; Rauschecker, Ursula

    2014-01-01

    Manufacturing enterprises have to organize themselves into effective system architectures forming different types of Networked Enterprises (NE) to match fast-changing market demands. Cloud Computing (CC) is an important, up-to-date computing concept for NEs, as it offers significant financial and technical advantages besides high-level collaboration possibilities. As cloud computing is a new concept, the solutions for handling interoperability, portability, security, privacy and standardization c...

  17. Exploring the marketing challenges faced by assembled computer dealers

    OpenAIRE

    Kallimani, Rashmi

    2010-01-01

    There is intense competition in the computer market these days for obtaining a higher market share. The computer market, consisting of many branded and non-branded players, has been using various methods for matching supply and demand in the best possible way to attain market dominance. Branded companies are seen to be investing large amounts in aggressive marketing techniques for reaching customers and obtaining a higher market share. Due to this, many small companies and non-branded computer...

  18. Integrated challenge test: a new approach evaluating quantitative risk assessment of Listeria in ready to eat foods

    Directory of Open Access Journals (Sweden)

    Paolo Matteini

    2012-02-01

    The study aimed to predict the maximum concentration of Listeria monocytogenes during the shelf life of chicken liver paté. The prediction was performed using the integrated challenge test: a test based on the interaction between the indigenous lactic flora and L. monocytogenes and their growth parameters. Two different approaches were investigated: the former is based on the time difference between the onset of the L. monocytogenes and lactic flora stationary phases, while the latter is based on the lactic flora concentration capable of inducing the stationary phase of L. monocytogenes. Three different strains of L. monocytogenes, isolated from meat products, were used to perform three challenge tests. Triplicate samples from three different batches of liver paté were inoculated with a single-strain inoculum of 1.8 Log CFU/g. Samples were then stored at 4°C, 8°C and 12°C. Lactobacillus spp. (ISO 15214:1998) and L. monocytogenes (UNI EN ISO 11290-02:2005) plate counts were performed daily on each sample until the stationary phase was reached by both populations. The challenge test results were input into the ComBase software to determine the growth parameters, later used for the calculation method. Predictive data were then statistically assessed against the results of two additional challenge tests using triplicate samples from two different batches, the same strains and the same single-strain inoculum. Samples from the first batch were stored for 5 days at 4°C + 5 days at 8°C + 5 days at 12°C; samples from the second batch were stored for 3 days at 4°C + 3 days at 8°C + 4 days at 12°C. The results showed that both approaches provided predictions very close to reality. The integrated challenge test is therefore useful for determining the maximum concentration of L. monocytogenes, by simply knowing the concentrations of the concerned microbial populations at a given time.
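    The mechanism this record relies on, where the pathogen's growth halts once the competing lactic flora reaches its stationary phase (the Jameson effect), is easy to illustrate numerically. Below is a minimal Python sketch of that interaction; the growth rates, initial counts and stationary levels are invented placeholders rather than the paper's fitted ComBase parameters, and the lag phase is deliberately omitted.

        # Minimal sketch of the Jameson-effect idea behind the integrated
        # challenge test: both populations grow log-linearly (log10 CFU/g),
        # and L. monocytogenes stops growing once the lactic flora reaches
        # its stationary phase. All parameters are illustrative assumptions.
        dt, days = 0.1, 30.0
        steps = int(days / dt)

        lab, lmo = 3.0, 1.8            # day-0 counts, log10 CFU/g
        mu_lab, k_lab = 0.5, 8.5       # growth rate (log10/day), stationary level
        mu_lmo, k_lmo = 0.25, 7.0

        for _ in range(steps):
            lab_stationary = lab >= k_lab
            lab = min(lab + dt * mu_lab, k_lab)
            if not lab_stationary:     # Jameson effect: growth halts otherwise
                lmo = min(lmo + dt * mu_lmo, k_lmo)

        print(f"predicted maximum L. monocytogenes: {lmo:.2f} log10 CFU/g")

    The predicted maximum is simply the pathogen count at the moment the competing flora saturates, which is the quantity the integrated challenge test estimates from plate-count data.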

  19. The Challenge '88 Project: Interfacing of Chemical Instruments to Computers.

    Science.gov (United States)

    Lyons, Jim; Verghese, Manoj

    The main part of this project involved using a computer, either an Apple or an IBM, as a chart recorder for the infrared (IR) and nuclear magnetic resonance (NMR) spectrophotometers. The computer "reads" these machines and displays spectra on its monitor. The graphs can then be stored for future reference and manipulation. The program to…

  20. Gender Digital Divide and Challenges in Undergraduate Computer Science Programs

    Science.gov (United States)

    Stoilescu, Dorian; McDougall, Douglas

    2011-01-01

    Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…

  1. Multicore Challenges and Benefits for High Performance Scientific Computing

    Directory of Open Access Journals (Sweden)

    Ida M.B. Nielsen

    2008-01-01

    Until recently, performance gains in processors were achieved largely by improvements in clock speeds and instruction-level parallelism. Thus, applications could obtain performance increases with relatively minor changes by upgrading to the latest generation of computing hardware. Currently, however, processor performance improvements are realized by using multicore technology and hardware support for multiple threads within each core, and taking full advantage of this technology to improve the performance of applications requires exposing extreme levels of software parallelism. We discuss here the architecture of parallel computers constructed from many multicore chips, as well as techniques for managing the complexity of programming such computers, including the hybrid message-passing/multi-threading programming model. We illustrate these ideas with a hybrid distributed-memory matrix multiply and a quantum chemistry algorithm for energy computation using Møller–Plesset perturbation theory.
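    The hybrid message-passing/multi-threading model this abstract describes decomposes work twice: coarse blocks are distributed across processes (message passing), and each process subdivides its block among threads. The Python sketch below shows that two-level decomposition for the matrix-multiply example; for illustration both levels run in one process, with a thread pool standing in for the inner level, whereas a real hybrid code would distribute the outer loop across MPI ranks (e.g. via mpi4py). Matrix size and block size are arbitrary assumptions.

        import numpy as np
        from concurrent.futures import ThreadPoolExecutor

        N, BLOCK = 512, 128

        def multiply_row_block(A_rows, B):
            # Work owned by one "rank": its strip of C = A_rows @ B, computed
            # column-block by column-block by a local thread pool.
            def col_block(j):
                return A_rows @ B[:, j:j + BLOCK]
            with ThreadPoolExecutor() as pool:
                return np.hstack(list(pool.map(col_block, range(0, N, BLOCK))))

        rng = np.random.default_rng(0)
        A, B = rng.random((N, N)), rng.random((N, N))
        # Outer level: in a real hybrid code each row strip would live on a
        # different MPI rank; here they are processed in turn.
        C = np.vstack([multiply_row_block(A[i:i + BLOCK], B)
                       for i in range(0, N, BLOCK)])
        assert np.allclose(C, A @ B)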

  2. Lattice QCD - a challenge in large scale computing

    International Nuclear Information System (INIS)

    Schilling, K.

    1987-01-01

    The computation of the hadron spectrum within the framework of lattice QCD sets a demanding goal for the application of supercomputers in basic science. It requires both big computer capacities and clever algorithms to fight all the numerical evils that one encounters in the Euclidean space-time world. The talk attempts to introduce the present state of the art of spectrum calculations by lattice simulations. (orig.)

  3. Information Assurance and Forensic Readiness

    Science.gov (United States)

    Pangalos, Georgios; Katos, Vasilios

    Egalitarianism and justice are amongst the core attributes of a democratic regime and should also be secured in an e-democratic setting. As such, the rise of computer-related offenses poses a threat to the fundamental aspects of e-democracy and e-governance. Digital forensics is a key component for protecting and enabling the underlying (e-)democratic values, and therefore forensic readiness should be considered in an e-democratic setting. This position paper commences from the observation that the density of compliance and potential litigation activities is monotonically increasing in modern organizations, as rules, legislative regulations and policies are constantly added to the corporate environment. Forensic practices seem to be departing from the niche of law enforcement and are becoming a business function and infrastructural component, posing new challenges to security professionals. Having no a priori knowledge of whether a security-related event or corporate policy violation will lead to litigation, we advocate that computer forensics should be applied to all investigatory, monitoring and auditing activities. This results in an inflation of the responsibilities of the Information Security Officer. After exploring some commonalities and differences between IS audit and computer forensics, we present a list of strategic challenges that the organization and, in effect, the IS security and audit practitioner will face.

  4. Fault tolerance in computational grids: perspectives, challenges, and issues.

    Science.gov (United States)

    Haider, Sajjad; Nazir, Babar

    2016-01-01

    Computational grids are established with the intention of providing shared access to hardware- and software-based resources, with special reference to increased computational capabilities. Fault tolerance is one of the most important issues faced by computational grids. The main contribution of this survey is the creation of an extended classification of problems that occur in computational grid environments. The proposed classification will help researchers, developers, and maintainers of grids to understand the types of issues to be anticipated. Moreover, different types of problems, such as omission, interaction, and timing-related problems, have been identified that need to be handled on various layers of the computational grid. In this survey, we also analyze and examine fault tolerance and fault detection mechanisms. Our conclusion is that a dependable and reliable grid can only be established when more emphasis is placed on fault identification. Moreover, our survey reveals that adaptive and intelligent fault identification and tolerance techniques can improve the dependability of grid working environments.

  5. Are project managers ready for the 21st century challenges? A review of problem structuring methods for decision support

    Directory of Open Access Journals (Sweden)

    José Mateo

    2017-01-01

    Numerous contemporary problems that project managers face today can be considered unstructured decision problems characterized by multiple actors and perspectives, incommensurable and/or conflicting objectives, and important intangibles. This work environment demands that project managers possess not only hard skills but also soft skills, with the ability to take a management perspective and, above all, develop real leadership capabilities. In this paper, a family of problem structuring methods for decision support, aimed at assisting project managers in tackling complex problems, is presented. Problem structuring methods are a family of soft operations research methods for decision support that assist groups of diverse composition to agree on a problem focus and make commitments to consequential action. Project management programs are challenged to implement these methodologies in such a way that training is organized around the key competences that a project manager needs in order to be more effective, work efficiently as a member of interdisciplinary teams, and successfully execute even a small project.

  6. Computational pan-genomics: status, promises and challenges

    NARCIS (Netherlands)

    The Computational Pan-Genomics Consortium; T. Marschall (Tobias); M. Marz (Manja); T. Abeel (Thomas); L.J. Dijkstra (Louis); B.E. Dutilh (Bas); A. Ghaffaari (Ali); P. Kersey (Paul); W.P. Kloosterman (Wigard); V. Mäkinen (Veli); A.M. Novak (Adam); B. Paten (Benedict); D. Porubsky (David); E. Rivals (Eric); C. Alkan (Can); J.A. Baaijens (Jasmijn); P.I.W. de Bakker (Paul); V. Boeva (Valentina); R.J.P. Bonnal (Raoul); F. Chiaromonte (Francesca); R. Chikhi (Rayan); F.D. Ciccarelli (Francesca); C.P. Cijvat (Robin); E. Datema (Erwin); C.M. van Duijn (Cornelia); E.E. Eichler (Evan); C. Ernst (Corinna); E. Eskin (Eleazar); E. Garrison (Erik); M. El-Kebir (Mohammed); G.W. Klau (Gunnar); J.O. Korbel (Jan); E.-W. Lameijer (Eric-Wubbo); B. Langmead (Benjamin); M. Martin; P. Medvedev (Paul); J.C. Mu (John); P.B.T. Neerincx (Pieter); K. Ouwens (Klaasjan); P. Peterlongo (Pierre); N. Pisanti (Nadia); S. Rahmann (Sven); B.J. Raphael (Benjamin); K. Reinert (Knut); D. de Ridder (Dick); J. de Ridder (Jeroen); M. Schlesner (Matthias); O. Schulz-Trieglaff (Ole); A.D. Sanders (Ashley); S. Sheikhizadeh (Siavash); C. Shneider (Carl); S. Smit (Sandra); D. Valenzuela (Daniel); J. Wang (Jiayin); L.F.A. Wessels (Lodewyk); Y. Zhang (Ying); V. Guryev (Victor); F. Vandin (Fabio); K. Ye (Kai); A. Schönhuth (Alexander)

    2018-01-01

    Many disciplines, from human genetics and oncology to plant breeding, microbiology and virology, commonly face the challenge of analyzing rapidly increasing numbers of genomes. In the case of Homo sapiens, the number of sequenced genomes will approach hundreds of thousands in the next few

  7. Use of computers for meeting future challenges in traffic management

    International Nuclear Information System (INIS)

    Ford, C.L. Jr.

    1983-01-01

    Since overall distribution costs, including transportation, are approaching 10% of the material expense dollar, new strategies are emerging in distribution management. Innovative technology utilizing computer hardware and software is an aid to solving inventory control and distribution problems. This paper discusses the information needs of the shipper traffic manager and the role of computers in meeting those needs. DOE, in association with Union Carbide's Nuclear Division, has utilized database technology to collect and report transportation statistics for a variety of management information needs.

  8. Remarkable Computing - the Challenge of Designing for the Home

    DEFF Research Database (Denmark)

    Petersen, Marianne Graves

    2004-01-01

    The vision of ubiquitous computing is floating into the domain of the household, despite arguments that lessons from the design of workplace artefacts cannot be blindly transferred into the domain of the household. This paper discusses why the ideal of unremarkable or ubiquitous computing is too narrow with respect to the household. It points out how understanding technology use is a matter of looking into the process of use, and how the specific context of the home in several ways calls for technology to be remarkable rather than unremarkable.

  9. Algebraic Functions, Computer Programming, and the Challenge of Transfer

    Science.gov (United States)

    Schanzer, Emmanuel Tanenbaum

    2015-01-01

    Students' struggles with algebra are well documented. Prior to the introduction of functions, mathematics is typically focused on applying a set of arithmetic operations to compute an answer. The introduction of functions, however, marks the point at which mathematics begins to focus on building up abstractions as a way to solve complex problems.…

  10. Challenges in scaling NLO generators to leadership computers

    Science.gov (United States)

    Benjamin, D.; Childers, JT; Hoeche, S.; LeCompte, T.; Uram, T.

    2017-10-01

    Exascale computing resources are roughly a decade away and will be capable of 100 times more computing than current supercomputers. In the last year, Energy Frontier experiments crossed a milestone of 100 million core-hours used at the Argonne Leadership Computing Facility, Oak Ridge Leadership Computing Facility, and NERSC. The Fortran-based leading-order parton generator called Alpgen was successfully scaled to millions of threads to achieve this level of usage on Mira. Sherpa and MadGraph are next-to-leading-order generators used heavily by LHC experiments for simulation. Integration times for high-multiplicity or rare processes can take a week or more on standard Grid machines, even using all 16 cores. We describe our ongoing work to scale the Sherpa generator to thousands of threads on leadership-class machines and reduce run times to less than a day. This work allows the experiments to leverage large-scale parallel supercomputers for event generation today, freeing tens of millions of grid hours for other work, and paving the way for future applications (simulation, reconstruction) on these and future supercomputers.

  11. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01

    Users familiar with problem domains but inexperienced with computers ... intelligent support systems ... Creating better HCI software will have a ... (the remainder of the abstract is garbled in the source record).

  12. The challenge of quantum computer simulations of physical phenomena

    International Nuclear Information System (INIS)

    Ortiz, G.; Knill, E.; Gubernatis, J.E.

    2002-01-01

    The goal of physics simulation using controllable quantum systems ('physics imitation') is to exploit quantum laws to advantage, and thus accomplish efficient simulation of physical phenomena. In this Note, we discuss the fundamental concepts behind this paradigm of information processing, such as the connection between models of computation and physical systems. The experimental simulation of a toy quantum many-body problem is described.

  13. Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Johnson, Gary M.; Washington, Warren M.

    2009-07-02

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER), in partnership with the Office of Advanced Scientific Computing Research (ASCR), held a workshop on the challenges in climate change science and the role of computing at the extreme scale, November 6-7, 2008, in Bethesda, Maryland. At the workshop, participants identified the scientific challenges facing the field of climate science and outlined the research directions of highest priority that should be pursued to meet these challenges. Representatives from the national and international climate change research community, as well as representatives from the high-performance computing community, attended the workshop. This group represented a broad mix of expertise. Of the 99 participants, 6 were from international institutions. Before the workshop, each of the four panels prepared a white paper, which provided the starting place for the workshop discussions. The four panels devoted their efforts to the following themes: Model Development and Integrated Assessment; Algorithms and Computational Environment; Decadal Predictability and Prediction; and Data, Visualization, and Computing Productivity. The recommendations of the panels are summarized in the body of this report.

  14. Single-Cell Transcriptomics Bioinformatics and Computational Challenges

    Directory of Open Access Journals (Sweden)

    Lana Garmire

    2016-09-01

    The emerging single-cell RNA-Seq (scRNA-Seq) technology holds the promise to revolutionize our understanding of diseases and associated biological processes at an unprecedented resolution. It opens the door to revealing intercellular heterogeneity and has been employed in a variety of applications, ranging from characterizing cancer cell subpopulations to elucidating tumor resistance mechanisms. Parallel to improving experimental protocols to deal with technological issues, deriving new analytical methods to reveal the complexity in scRNA-Seq data is just as challenging. Here we review the current state-of-the-art bioinformatics tools and methods for scRNA-Seq analysis, and address some critical analytical challenges that the field faces.

  15. Operating the worldwide LHC computing grid: current and future challenges

    International Nuclear Information System (INIS)

    Molina, J Flix; Forti, A; Girone, M; Sciaba, A

    2014-01-01

    The Worldwide LHC Computing Grid project (WLCG) provides the computing and storage resources required by the LHC collaborations to store, process and analyse their data. It includes almost 200,000 CPU cores, 200 PB of disk storage and 200 PB of tape storage distributed among more than 150 sites. The WLCG operations team is responsible for several essential tasks, such as the coordination of testing and deployment of Grid middleware and services, communication with the experiments and the sites, follow-up and resolution of operational issues, and medium/long-term planning. In 2012 WLCG critically reviewed all operational procedures and restructured the organisation of the operations team as a more coherent effort in order to improve its efficiency. In this paper we describe how the new organisation works, its recent successes and the changes to be implemented during the long LHC shutdown in preparation for LHC Run 2.

  16. "Tennis elbow". A challenging call for computation and medicine

    Science.gov (United States)

    Sfetsioris, D.; Bontioti, E. N.

    2014-10-01

    An attempt to give an insight into the features composing this musculotendinous disorder. We address the issues of definition, pathophysiology and the mechanism underlying the onset and occurrence of the disease, diagnosis and diagnostic tools, as well as the methods of treatment. We focus mostly on conservative treatment protocols, and we recognize the need for a more thorough investigation with the aid of computation.

  17. Computational brain models: Advances from system biology and future challenges

    Directory of Open Access Journals (Sweden)

    George E. Barreto

    2015-02-01

    Computational brain models focused on the interactions between neurons and astrocytes, modeled via metabolic reconstructions, are reviewed. The large source of experimental data provided by the -omics techniques and the advance and application of computational and data-management tools have been fundamental, for instance, in understanding the crosstalk between these cells, the key neuroprotective mechanisms mediated by astrocytes in specific metabolic scenarios (1), and the identification of biomarkers for neurodegenerative diseases (2,3). However, the modeling of these interactions demands a clear view of the metabolic and signaling pathways implicated, but most of them are controversial and still under evaluation (4). Hence, to gain insight into the complexity of these interactions, a current view of the main pathways implicated in neuron-astrocyte communication processes has been assembled from recent experimental reports and reviews. Furthermore, target problems, limitations and main conclusions have been identified from metabolic models of the brain reported since 2010. Finally, key aspects to take into account in the development of a computational model of the brain, and topics that could be approached from a systems biology perspective in future research, are highlighted.

  18. Computational Science And Engineering Software Sustainability And Productivity (CSESSP) Challenges Workshop Report

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This report details the challenges and opportunities discussed at the NITRD sponsored multi-agency workshop on Computational Science and Engineering Software...

  19. New challenges in grid generation and adaptivity for scientific computing

    CERN Document Server

    Formaggia, Luca

    2015-01-01

    This volume collects selected contributions from the “Fourth Tetrahedron Workshop on Grid Generation for Numerical Computations”, which was held in Verbania, Italy in July 2013. The previous editions of this Workshop were hosted by the Weierstrass Institute in Berlin (2005), by INRIA Rocquencourt in Paris (2007), and by Swansea University (2010). This book covers different, though related, aspects of the field: the generation of quality grids for complex three-dimensional geometries; parallel mesh generation algorithms; mesh adaptation, including both theoretical and implementation aspects; grid generation and adaptation on surfaces – all with an interesting mix of numerical analysis, computer science and strongly application-oriented problems.

  20. Leaderboard Now Open: CPTAC’s DREAM Proteogenomics Computational Challenge | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The National Cancer Institute’s Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the opening of the leaderboard for its Proteogenomics Computational DREAM Challenge. The leaderboard remains open for submissions from September 25, 2017 through October 8, 2017, with the Challenge expected to run until November 17, 2017.

  1. Computer Security: Join the CERN WhiteHat Challenge!

    CERN Multimedia

    Computer Security Team

    2014-01-01

    Over the past couple of months, several CERN users have reported vulnerabilities they have found in computing services and servers running at CERN. All were relevant, many were interesting and a few even surprising. Spotting weaknesses and areas for improvement before malicious people can exploit them is paramount. It helps protect the operation of our accelerators and experiments as well as the reputation of the Organization. Therefore, we would like to express our gratitude to those people for having reported these weaknesses! Great job and well done!   Seizing the opportunity, we would like to reopen the hunt for bugs, vulnerabilities and insecure configurations of CERN applications, websites and devices. You might recall we ran a similar initiative (“Hide & Seek”) in 2012 where we asked you to sift through CERN’s webpages and send us those that hold sensitive and confidential information. Quite a number of juicy documents were found and subsequently remov...

  2. P300 brain computer interface: current challenges and emerging trends

    Science.gov (United States)

    Fazel-Rezai, Reza; Allison, Brendan Z.; Guger, Christoph; Sellers, Eric W.; Kleih, Sonja C.; Kübler, Andrea

    2012-01-01

    A brain-computer interface (BCI) enables communication without movement based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), steady state visual evoked potential (SSVEP), or event related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other components of the ERP, based on an oddball paradigm presented to the subject. In this paper, we overview the current status of P300 BCI technology, and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility. PMID:22822397
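    The oddball paradigm this record refers to works because the rare (target) stimulus evokes a positive deflection roughly 300 ms after onset, and averaging over repeated epochs suppresses the background EEG. The Python sketch below illustrates that averaging step on synthetic data; the sampling rate, epoch count, amplitudes and noise level are illustrative assumptions, not values from the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        fs, n_epochs, n_samples = 250, 60, 200           # 0.8 s epochs at 250 Hz
        t = np.arange(n_samples) / fs
        p300 = 4e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # ~300 ms bump

        noise = lambda: 10e-6 * rng.standard_normal((n_epochs, n_samples))
        target = noise() + p300        # rare stimuli: ERP buried in EEG noise
        nontarget = noise()            # frequent stimuli: noise only

        # Averaging across epochs suppresses the noise by ~sqrt(n_epochs)
        window = (t > 0.25) & (t < 0.35)
        print("250-350 ms mean, target:    %5.2f uV" % (1e6 * target.mean(0)[window].mean()))
        print("250-350 ms mean, nontarget: %5.2f uV" % (1e6 * nontarget.mean(0)[window].mean()))

    A P300 speller repeats this comparison across rows and columns of a letter grid and picks the row/column pair whose averaged response contains the largest deflection.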

  3. Computational challenges in magnetic-confinement fusion physics

    Science.gov (United States)

    Fasoli, A.; Brunner, S.; Cooper, W. A.; Graves, J. P.; Ricci, P.; Sauter, O.; Villard, L.

    2016-05-01

    Magnetic-fusion plasmas are complex self-organized systems with an extremely wide range of spatial and temporal scales, from the electron-orbit scales (~10⁻¹¹ s, ~10⁻⁵ m) to the diffusion time of electrical current through the plasma (~10² s) and the distance along the magnetic field between two solid surfaces in the region that determines the plasma-wall interactions (~100 m). The description of the individual phenomena and of the nonlinear coupling between them involves a hierarchy of models, which, when applied to realistic configurations, require the most advanced numerical techniques and algorithms and the use of state-of-the-art high-performance computers. The common thread of such models resides in the fact that the plasma components are at the same time sources of electromagnetic fields, via the charge and current densities that they generate, and subject to the action of electromagnetic fields. This leads to a wide variety of plasma modes of oscillation that resonate with the particle or fluid motion and makes the plasma dynamics much richer than that of conventional, neutral fluids.

  4. Computer Adaptive Multistage Testing: Practical Issues, Challenges and Principles

    Directory of Open Access Journals (Sweden)

    Halil Ibrahim SARI

    2016-12-01

    The purpose of many tests in educational and psychological measurement is to measure test takers' latent trait scores from responses given to a set of items. Over the years, this has been done by traditional methods (paper-and-pencil tests). However, compared to other test administration models (e.g., adaptive testing), traditional methods are extensively criticized for producing low measurement accuracy and long test lengths. Adaptive testing has been proposed to overcome these problems. There are two popular adaptive testing approaches: computerized adaptive testing (CAT) and computer adaptive multistage testing (ca-MST). The former is a well-known approach that has been predominantly used in this field. We believe that researchers and practitioners are fairly familiar with many aspects of CAT because it has more than a hundred years of history. However, the same is not true for the latter. Since ca-MST is relatively new, many researchers are not familiar with its features. The purpose of this study is to closely examine the characteristics of ca-MST, including its working principle, the adaptation procedure called the routing method, test assembly, and scoring, and to provide an overview to researchers, with the aim of drawing researchers' attention to ca-MST and encouraging them to contribute to research in this area. Books, software and future work for ca-MST are also discussed.
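    The routing method mentioned above is what makes a multistage test adaptive: every examinee takes a common first-stage module, and the score on it selects an easier or harder second-stage module. A minimal Python sketch of number-correct routing with two cut scores follows; the cut scores and module labels are invented for illustration, and an operational ca-MST would assemble modules from a calibrated item pool and score with IRT rather than raw counts.

        # Route an examinee to a stage-2 module from a 5-item stage-1 score.
        def route(stage1_score, cuts=(2, 4)):
            if stage1_score < cuts[0]:      # weak performance -> easier items
                return "easy"
            elif stage1_score < cuts[1]:    # middling performance
                return "medium"
            return "hard"                   # strong performance -> harder items

        for score in range(6):
            print(f"{score} correct in stage 1 -> stage-2 module: {route(score)}")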

  5. 3rd International Symposium on Big Data and Cloud Computing Challenges

    CERN Document Server

    Neelanarayanan, V

    2016-01-01

    This proceedings volume contains selected papers that were presented in the 3rd International Symposium on Big data and Cloud Computing Challenges, 2016 held at VIT University, India on March 10 and 11. New research issues, challenges and opportunities shaping the future agenda in the field of Big Data and Cloud Computing are identified and presented throughout the book, which is intended for researchers, scholars, students, software developers and practitioners working at the forefront in their field. This book acts as a platform for exchanging ideas, setting questions for discussion, and sharing the experience in Big Data and Cloud Computing domain.

  6. The Awareness and Challenges of Cloud Computing Adoption on Tertiary Education in Malaysia

    Science.gov (United States)

    Hazreeni Hamzah, Nor; Mahmud, Maziah; Zukri, Shamsunarnie Mohamed; Yaacob, Wan Fairos Wan; Yacob, Jusoh

    2017-09-01

    This preliminary study aims to investigate awareness of the adoption of cloud computing among academicians in tertiary education in Malaysia. Besides, this study also wants to explore the possible challenges faced by academicians while adopting this new technology. The pilot study was done on 40 lecturers in Universiti Teknologi MARA Kampus Kota Bharu (UiTMKB) using a self-administered questionnaire. The results found that almost half (40 percent) were not aware of the existence of cloud computing in the teaching and learning (T&L) process. The challenges confronting the adoption of cloud computing are data insecurity, unsolicited advertisement, lock-in, reluctance to eliminate staff positions, privacy concerns, reliability challenges, regulatory compliance concerns/user control and institutional culture/resistance to change in technology. These possible challenges can be grouped into two major factors: a security and dependency factor, and a user control and mentality factor.

  7. BigData and computing challenges in high energy and nuclear physics

    International Nuclear Information System (INIS)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-01-01

    In this contribution we discuss various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. This will evolve in the future when moving from LHC to HL-LHC ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new super-computing facilities, cloud computing and volunteer computing for the future is a big challenge, which we are successfully mastering with a considerable contribution from many super-computing centres around the world, academic and commercial cloud providers. We also discuss R and D computing projects started recently in the National Research Center 'Kurchatov Institute'.

  8. BigData and computing challenges in high energy and nuclear physics

    Science.gov (United States)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-06-01

    In this contribution we discuss various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. This will evolve in the future when moving from LHC to HL-LHC ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new super-computing facilities, cloud computing and volunteer computing for the future is a big challenge, which we are successfully mastering with a considerable contribution from many super-computing centres around the world, academic and commercial cloud providers. We also discuss R&D computing projects started recently in the National Research Center "Kurchatov Institute".

  9. CERN readies world's biggest science grid The computing network now encompasses more than 100 sites in 31 countries

    CERN Multimedia

    Niccolai, James

    2005-01-01

    If the Large Hadron Collider (LHC) at CERN is to yield miraculous discoveries in particle physics, it may also require a small miracle in grid computing. Faced with a lack of suitable tools from commercial vendors, engineers at the famed Geneva laboratory are hard at work building a giant grid to store and process the vast amount of data the collider is expected to produce when it begins operations in mid-2007 (2 pages)

  10. High Performance Numerical Computing for High Energy Physics: A New Challenge for Big Data Science

    International Nuclear Information System (INIS)

    Pop, Florin

    2014-01-01

    Modern physics is based on both theoretical analysis and experimental validation. Complex scenarios like subatomic dimensions, high energies, and low absolute temperatures are frontiers for many theoretical models. Simulation with stable numerical methods represents an excellent instrument for high-accuracy analysis, experimental validation, and visualization. High-performance computing support offers the possibility of carrying out simulations at large scale, in parallel, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high energy physics (HEP), analyzed from two perspectives: numerical methods and high-performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory used in the analysis of particle spectra. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
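    Common to the methods listed above is plain Monte Carlo estimation, whose statistical error shrinks only as 1/sqrt(n), which is precisely why HEP simulations are so compute- and data-hungry. The short Python sketch below shows that scaling on a stand-in integrand; the integrand and sample sizes are arbitrary assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(42)
        f = lambda x: np.exp(-x ** 2)        # stand-in for a physics kernel

        for n in (10**3, 10**5, 10**7):
            x = rng.uniform(0.0, 1.0, n)     # uniform samples on [0, 1]
            est = f(x).mean()                # E[f] = integral of f over [0, 1]
            err = f(x).std() / np.sqrt(n)    # statistical error ~ 1/sqrt(n)
            print(f"n={n:>8}: integral ~ {est:.5f} +/- {err:.5f}")

    Each factor of 100 in samples buys only one extra decimal digit of accuracy, which is the scaling behind the core-hour budgets discussed in records of this kind.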

  11. Computational Challenge of Fractional Differential Equations and the Potential Solutions: A Survey

    Directory of Open Access Journals (Sweden)

    Chunye Gong

    2015-01-01

    We present a survey of fractional differential equations and, in particular, of the computational cost of their numerical solution from the viewpoint of computer science. The computational complexities of time-fractional, space-fractional, and space-time fractional equations are O(N²M), O(NM²), and O(NM(M + N)), compared with O(MN) for classical partial differential equations with finite difference methods, where M and N are the numbers of space grid points and time steps. The potential solutions to this challenge include, but are not limited to, parallel computing, memory access optimization (fractional precomputing operator), the short memory principle, fast Fourier transform (FFT) based solutions, the alternating direction implicit method, the multigrid method, and preconditioner technology. The relationships of these solutions for both space fractional derivatives and time fractional derivatives are discussed. The authors point out that parallel computing technologies should be regarded as a basic method to overcome this challenge, and that some attention should be paid to fractional killer applications, high-performance iteration methods, high-order schemes, and Monte Carlo methods. Since the computation of fractional equations with high dimension and variable order is even heavier, researchers from the areas of mathematics and computer science have the opportunity to invent cornerstones in the area of fractional calculus.
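    The extra cost in time comes from the non-local memory of fractional derivatives: the Grünwald–Letnikov approximation at time step n sums over the whole history 0..n, unlike the O(1) stencil of an integer-order derivative. The Python sketch below makes this concrete and also shows the short memory principle mentioned above, which truncates the history to the last L steps, trading accuracy for O(NL) total cost. The order alpha, step size and test function are illustrative assumptions.

        import numpy as np

        alpha, h, N = 0.5, 0.01, 1000
        t = np.arange(N) * h
        f = np.sin(t)                       # test function sampled on the grid

        # Grunwald-Letnikov weights w_k = (-1)^k * binom(alpha, k),
        # generated by the standard stable recurrence.
        w = np.empty(N)
        w[0] = 1.0
        for k in range(1, N):
            w[k] = w[k - 1] * (k - 1 - alpha) / k

        def gl_derivative(n, memory=None):
            """Fractional derivative at step n; 'memory' truncates the history."""
            kmax = n + 1 if memory is None else min(n + 1, memory)
            return h ** (-alpha) * np.dot(w[:kmax], f[n::-1][:kmax])

        n = N - 1
        print("full history (O(n) per step):", gl_derivative(n))
        print("short memory (O(L) per step):", gl_derivative(n, memory=100))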

  12. Evaluating a multi-player brain-computer interface game: challenge versus co-experience

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Volpe, G; Reidsma, Dennis; Poel, Mannes; Camurri, A.; Obbink, Michel; Nijholt, Antinus

    2013-01-01

    Brain–computer interfaces (BCIs) have started to be considered as game controllers. The low level of control they provide precludes perfect control, but allows the design of challenging games which can be enjoyed by players. Evaluation of enjoyment, or user experience (UX), is

  13. EPA and GSA Webinar: E Scrap Management, Computers for Learning and the Federal Green Challenge

    Science.gov (United States)

    EPA and the General Services Administration (GSA) are hosting a webinar on May 2, 2018. Topics will include policies and procedures on E-Scrap management, a review of the Computers for Learning Program, and the benefits of joining the Federal Green Challenge.

  14. Computing in the Curriculum: Challenges and Strategies from a Teacher's Perspective

    Science.gov (United States)

    Sentance, Sue; Csizmadia, Andrew

    2017-01-01

    Computing is being introduced into the curriculum in many countries. Teachers' perspectives enable us to discover what challenges this presents, and also the strategies teachers claim to be using successfully in teaching the subject across primary and secondary education. The study described in this paper was carried out in the UK in 2014 where…

  15. Computing in research and development in Africa benefits, trends, challenges and solutions

    CERN Document Server

    2015-01-01

    This book describes the trends, challenges and solutions in computing use for scientific research and development within different domains in Africa, such as health, agriculture, environment, economy, energy, education and engineering. The benefits expected are discussed by a number of recognized, domain-specific experts, with a common theme being computing as solution enabler. This book is the first document providing such a representative up-to-date view on this topic at the continent level.   • Discusses computing for scientific research and development on the African continent, addressing domains such as engineering, health, agriculture, environment, economy, energy, and education; • Describes the state-of-the-art in usage of computing to address problems in developing countries pertaining to health, productivity, economic growth, and renewable energy; • Offers insights applicable to all developing countries on the use of computing technologies to address a variety of societal issues.

  16. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    Science.gov (United States)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks at micrometer to tens-of-nanometers resolution and is quickly turning into a new technology for studying the petrophysical properties of materials. An important step is the upscaling of these properties, as imaging at micron or sub-micron resolution can only be done on samples of millimeters or less in size. We present here a recently developed computational workflow for the analysis of microstructures, including the upscaling of material properties. Properties are first computed using conventional material science simulations at the micro- to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples, and biological and food science materials. We have also applied the technique to high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big-data problem of analyzing petrophysical properties and the subsequent upscaling. We discuss the following challenges: 1) Characterization of microtomography for extremely large data sets - our current capability. 2) Computational fluid dynamics simulations at pore scale for permeability estimation - methods, computing cost and accuracy. 3) Solid mechanical computations at pore scale for estimating elasto-plastic properties - computational stability, cost, and efficiency. 4) Extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big-data tool for multi-scale scientific and engineering problems.
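    The renormalization step can be pictured as repeated coarse-graining of the segmented voxel volume. The Python sketch below applies a majority rule to 2x2x2 blocks of a synthetic pore/solid image and tracks how porosity drifts under coarse-graining; percolation-based upscaling of the kind described above uses spanning-cluster rules and critical exponents rather than this simplified majority rule, and the random input volume is an invented stand-in for a segmented scan.

        import numpy as np

        rng = np.random.default_rng(7)
        porosity = 0.4
        vol = rng.random((64, 64, 64)) < porosity   # synthetic segmented volume

        def renormalize(v):
            # Group voxels into 2x2x2 blocks; a block becomes "pore" if
            # pore voxels dominate (majority rule).
            n = v.shape[0] // 2
            b = v.reshape(n, 2, n, 2, n, 2)
            return b.mean(axis=(1, 3, 5)) > 0.5

        level = vol
        while level.shape[0] >= 2:
            print(f"grid {level.shape[0]:>3}^3  porosity {level.mean():.3f}")
            level = renormalize(level)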

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  18. Readiness for Solving Story Problems.

    Science.gov (United States)

    Dunlap, William F.

    1982-01-01

    Readiness activities are described which are designed to help learning disabled (LD) students learn to perform computations in story problems. Activities proceed from concrete objects to numbers and involve the students in devising story problems. The language experience approach is incorporated with the enactive, iconic, and symbolic levels of…

  19. High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations

    Science.gov (United States)

    Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.

    2003-01-01

    Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission-critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost-effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high-end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.

  20. Opportunities and Challenges of Cloud Computing to Improve Health Care Services

    Science.gov (United States)

    2011-01-01

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed. PMID:21937354

  1. Opportunities and challenges of cloud computing to improve health care services.

    Science.gov (United States)

    Kuo, Alex Mu-Hsing

    2011-09-21

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed.

  2. Primates, computation, and the path to language. Reply to comments on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain"

    Science.gov (United States)

    Arbib, Michael A.

    2016-03-01

    The target article [6], henceforth TA, had as its main title Towards a Computational Comparative Neuroprimatology. This unpacks into three claims: Comparative Primatology: If one wishes to understand the behavior of any one primate species (whether monkey, ape or human - TA did not discuss, e.g., lemurs but that study could well be of interest), one will gain new insight by comparing behaviors across species, sharpening one's analysis of one class of behaviors by analyzing similarities and differences between two or more species.

  3. Computational Replication of the Primary Isotope Dependence of Secondary Kinetic Isotope Effects in Solution Hydride-Transfer Reactions: Supporting the Isotopically Different Tunneling Ready State Conformations.

    Science.gov (United States)

    Derakhshani-Molayousefi, Mortaza; Kashefolgheta, Sadra; Eilers, James E; Lu, Yun

    2016-06-30

    We recently reported a study of the steric effect on the 1° isotope dependence of 2° KIEs for several hydride-transfer reactions in solution (J. Am. Chem. Soc. 2015, 137, 6653). The unusual 2° KIEs decrease as the 1° isotope changes from H to D, and more so in the sterically hindered systems. These observations were explained in terms of a more crowded tunneling ready state (TRS) conformation in D-tunneling, which has a shorter donor-acceptor distance (DAD) than in H-tunneling. To examine the isotopic DAD difference explanation, in this paper, following an activated motion-assisted H-tunneling model that requires a shorter DAD in a heavier isotope transfer process, we computed the 2° KIEs at various H/D positions at different DADs (2.9 Å to 3.5 Å) for the hydride-transfer reactions from 2-propanol to the xanthylium and thioxanthylium ions (Xn(+) and TXn(+)) and their 9-phenyl substituted derivatives (Ph(T)Xn(+)). The calculated 2° KIEs match the experiments, and the calculated DAD effect on the 2° KIEs fits the observed 1° isotope effect on the 2° KIEs. These results support the motion-assisted H-tunneling model and the isotopically different TRS conformations. Furthermore, it was found that the TRS of the sterically hindered Ph(T)Xn(+) system does not possess a longer DAD than that of the (T)Xn(+) system. This predicts that the 1° KIE in the former system should be no larger than in the latter. The observed 1° KIE order is, however, contrary to this prediction, implicating stronger DAD-compression vibrations coupled to the reaction coordinate in the bulky Ph(T)Xn(+) system.

  4. Some computational challenges of developing efficient parallel algorithms for data-dependent computations in thermal-hydraulics supercomputer applications

    International Nuclear Information System (INIS)

    Woodruff, S.B.

    1994-01-01

    The Transient Reactor Analysis Code (TRAC), which features a two-fluid treatment of thermal-hydraulics, is designed to model transients in water reactors and related facilities. One of the major computational costs associated with TRAC and similar codes is calculating constitutive coefficients. Although the formulations for these coefficients are local, the costs are flow-regime- or data-dependent; i.e., the computations needed for a given spatial node often vary widely as a function of time. Consequently, a fixed, uniform assignment of nodes to parallel processors will result in degraded computational efficiency due to poor load balancing. A standard method for treating data-dependent models on vector architectures has been to use gather operations (or indirect addressing) to sort the nodes into subsets that (temporarily) share a common computational model. However, this method is not effective on distributed-memory data-parallel architectures, where indirect addressing involves expensive communication overhead. Another serious problem with this method involves software engineering challenges in the areas of maintainability and extensibility. For example, an implementation that was hand-tuned to achieve good computational efficiency would have to be rewritten whenever the decision tree governing the sorting was modified. Using an example based on the calculation of the wall-to-liquid and wall-to-vapor heat-transfer coefficients for three nonboiling flow regimes, we describe how the Fortran 90 WHERE construct and automatic inlining of functions can be used to ameliorate this problem while improving both efficiency and software engineering. Unfortunately, a general automatic solution to the load-balancing problem associated with data-dependent computations is not yet available for massively parallel architectures. We discuss why developers should either wait for such solutions or consider alternative numerical algorithms, such as a neural network
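
    To make the masked-evaluation idea concrete, the following minimal NumPy sketch (not TRAC code; the regime formulas and constants are invented for illustration) evaluates a regime-dependent coefficient over all nodes with boolean masks, in the spirit of the Fortran 90 WHERE construct:

    ```python
    import numpy as np

    # Toy illustration (not TRAC): evaluate a flow-regime-dependent
    # heat-transfer coefficient over all nodes at once. A boolean mask picks
    # the nodes in each regime, in the spirit of Fortran 90's WHERE construct,
    # so no gather/scatter sorting of nodes is needed.

    def heat_transfer_coeff(temp, regime):
        """temp: node temperatures; regime: integer flow-regime id per node."""
        h = np.empty_like(temp)
        liquid, transition, vapor = (regime == 0), (regime == 1), (regime == 2)
        h[liquid] = 5.0 + 0.10 * temp[liquid]           # invented formula
        h[transition] = 20.0 + 0.05 * temp[transition]  # invented formula
        h[vapor] = 50.0 * np.sqrt(temp[vapor])          # invented formula
        return h

    temp = np.array([300.0, 450.0, 600.0, 350.0])
    regime = np.array([0, 1, 2, 0])
    print(heat_transfer_coeff(temp, regime))
    ```

    Because each mask simply selects the nodes currently in a given regime, changing the decision logic means editing the masks rather than rewriting a hand-tuned gather/scatter sort.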

  5. Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling

    Science.gov (United States)

    Ormsbee, L.; Tufail, M.

    2005-12-01

    The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in the area of water resources and environmental systems, ranging from the development of data-driven models to optimal control strategies that assist in a more informed and intelligent decision-making process. The availability of data is critical to such applications, and scarce data may lead to models that do not represent the response function over the entire domain. At the same time, too much data has a tendency to lead to over-constraining of the problem. This paper will describe the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The applications include the development of inductive models as substitutes for more complex process-based models to predict the water quality of key constituents (such as dissolved oxygen) and their use in an optimization framework for optimal load reductions. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. Some of the challenges faced in this application include 1) uncertainty in data sets, 2) model application, and 3) development of cause-and-effect relationships between water quality constituents and watershed parameters through the use of inductive models. The paper will discuss these challenges and how they affect the desired goals of the project.
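
    As a concrete illustration of the kind of coupling described above, the sketch below pairs a stand-in inductive surrogate (a placeholder for a trained neural network; the objective, rates, and the 6 mg/L dissolved-oxygen target are all invented for illustration, not the Beargrass Creek models) with a simple genetic algorithm searching for the cheapest load reductions that meet the predicted water-quality target:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for a trained inductive model (e.g. a neural network) mapping
    # three sub-watershed load reductions in [0, 1] to predicted dissolved
    # oxygen in mg/L. The functional form is invented for illustration.
    def surrogate_do(reductions):
        return 4.0 + 3.0 * (1.0 - np.exp(-2.0 * reductions.mean(axis=-1)))

    def penalized_cost(pop):
        # Abatement cost plus a penalty when predicted DO misses 6 mg/L.
        shortfall = np.maximum(0.0, 6.0 - surrogate_do(pop))
        return pop.sum(axis=-1) + 100.0 * shortfall

    pop = rng.random((50, 3))  # 50 candidate reduction plans
    for _ in range(100):
        elite = pop[np.argsort(penalized_cost(pop))[:25]]  # cheapest feasible
        children = elite[rng.integers(0, 25, size=25)]     # clone parents
        children = np.clip(children + rng.normal(0.0, 0.05, children.shape), 0.0, 1.0)
        pop = np.vstack([elite, children])                 # next generation

    best = pop[np.argmin(penalized_cost(pop))]
    print("reductions:", best.round(2), "predicted DO:", float(surrogate_do(best)))
    ```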

  6. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  7. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    International Nuclear Information System (INIS)

    Khaleel, Mohammad A.

    2009-01-01

    This report is an account of the deliberations and conclusions of the workshop on 'Forefront Questions in Nuclear Science and the Role of High Performance Computing' held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to (1) identify forefront scientific challenges in nuclear physics and then determine which, if any, of these could be aided by high performance computing at the extreme scale; (2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; (3) provide nuclear physicists the opportunity to influence the development of high performance computing; and (4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  8. Computer-assisted learning and simulation systems in dentistry--a challenge to society.

    Science.gov (United States)

    Welk, A; Splieth, Ch; Wierinck, E; Gilpatrick, R O; Meyer, G

    2006-07-01

    Computer technology is increasingly used in practical training at universities. However, in spite of their potential, computer-assisted learning (CAL) and computer-assisted simulation (CAS) systems still appear to be underutilized in dental education. Advantages, challenges, problems, and solutions of computer-assisted learning and simulation in dentistry are discussed by means of MEDLINE and open Internet platform searches, and key results of a study among German dental schools. The advantages of computer-assisted learning are seen, for example, in self-paced and self-directed learning and increased motivation. It is useful for both objective theoretical and practical tests and for training students to handle complex cases. CAL can lead to more structured learning and can support training in evidence-based decision-making. The reasons for the still relatively rare implementation of CAL/CAS systems in dental education include an inability to finance, a lack of studies of CAL/CAS, and the large effort required to integrate CAL/CAS systems into the curriculum. To overcome the reasons for the relatively low degree of computer technology use, we should strive for multicenter research and development projects monitored by the appropriate national and international scientific societies, so that the potential of computer technology can be fully realized in graduate, postgraduate, and continuing dental education.

  9. Ex Machina: Analytical platforms, Law and the Challenges of Computational Legal Science

    Directory of Open Access Journals (Sweden)

    Nicola Lettieri

    2018-04-01

    Full Text Available Over the years, computation has become a fundamental part of scientific practice in several research fields, going far beyond the boundaries of the natural sciences. Data mining, machine learning, simulations and other computational methods lie today at the heart of the scientific endeavour in a growing number of social research areas, from anthropology to economics. In this scenario, an increasingly important role is played by analytical platforms: integrated environments allowing researchers to experiment with cutting-edge data-driven and computation-intensive analyses. The paper discusses the appearance of such tools in the emerging field of computational legal science. After a general introduction to the impact of computational methods on both natural and social sciences, we describe the concept and the features of an analytical platform exploring innovative cross-methodological approaches to the academic and investigative study of crime. Stemming from an ongoing project involving researchers from law, computer science and bioinformatics, the initiative is presented and discussed as an opportunity to raise a debate about the future of legal scholarship and, within it, about the challenges of computational legal science.

  10. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2009-10-01

    This report is an account of the deliberations and conclusions of the workshop on "Forefront Questions in Nuclear Science and the Role of High Performance Computing" held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to 1) identify forefront scientific challenges in nuclear physics and then determine which, if any, of these could be aided by high performance computing at the extreme scale; 2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; 3) provide nuclear physicists the opportunity to influence the development of high performance computing; and 4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  11. A community computational challenge to predict the activity of pairs of compounds.

    Science.gov (United States)

    Bansal, Mukesh; Yang, Jichen; Karan, Charles; Menden, Michael P; Costello, James C; Tang, Hao; Xiao, Guanghua; Li, Yajuan; Allen, Jeffrey; Zhong, Rui; Chen, Beibei; Kim, Minsoo; Wang, Tao; Heiser, Laura M; Realubit, Ronald; Mattioli, Michela; Alvarez, Mariano J; Shen, Yao; Gallahan, Daniel; Singer, Dinah; Saez-Rodriguez, Julio; Xie, Yang; Stolovitzky, Gustavo; Califano, Andrea

    2014-12-01

    Recent therapeutic successes have renewed interest in drug combinations, but experimental screening approaches are costly and often identify only small numbers of synergistic combinations. The DREAM consortium launched an open challenge to foster the development of in silico methods to computationally rank 91 compound pairs, from the most synergistic to the most antagonistic, based on gene-expression profiles of human B cells treated with individual compounds at multiple time points and concentrations. Using scoring metrics based on experimental dose-response curves, we assessed 32 methods (31 community-generated approaches and SynGen), four of which performed significantly better than random guessing. We highlight similarities between the methods. Although the accuracy of predictions was not optimal, we find that computational prediction of compound-pair activity is possible, and that community challenges can be useful to advance the field of in silico compound-synergy prediction.
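
    The challenge scored predictions against experimental dose-response curves; as a simpler, self-contained illustration of ranking compound pairs from most synergistic to most antagonistic, the sketch below uses the standard Bliss-independence excess on hypothetical single-agent and combination effects (this is not the DREAM scoring metric):

    ```python
    # Toy ranking of compound pairs by Bliss-independence excess (not the
    # DREAM challenge's dose-response-based metric). Effects are fractional
    # inhibitions in [0, 1]; values below are hypothetical.

    def bliss_excess(e_a, e_b, e_ab):
        expected = e_a + e_b - e_a * e_b  # Bliss expectation for independence
        return e_ab - expected            # > 0 synergy, < 0 antagonism

    pairs = {
        ("drugA", "drugB"): bliss_excess(0.30, 0.40, 0.70),
        ("drugA", "drugC"): bliss_excess(0.30, 0.20, 0.40),
        ("drugB", "drugC"): bliss_excess(0.40, 0.20, 0.45),
    }
    # Rank from most synergistic to most antagonistic, as in the challenge.
    for pair, score in sorted(pairs.items(), key=lambda kv: -kv[1]):
        print(pair, round(score, 3))
    ```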

  12. Exploring the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms: A Formative Analysis

    OpenAIRE

    Robin H. Kay; Sharon Lauricella

    2011-01-01

    Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess beneficial and challenging laptop behaviours in higher education classrooms. Both quantitative and qualitative data were collected from 177 undergrad...

  13. Scientific Grand Challenges: Crosscutting Technologies for Computing at the Exascale - February 2-4, 2010, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2011-02-06

    The goal of the "Scientific Grand Challenges - Crosscutting Technologies for Computing at the Exascale" workshop in February 2010, jointly sponsored by the U.S. Department of Energy’s Office of Advanced Scientific Computing Research and the National Nuclear Security Administration, was to identify the elements of a research and development agenda that will address the challenges of computing at the exascale and create a comprehensive exascale computing environment. This exascale computing environment will enable the science applications identified in the eight previously held workshops of the Scientific Grand Challenges Workshop Series.

  14. Bringing high-performance computing to the biologist's workbench: approaches, applications, and challenges

    International Nuclear Information System (INIS)

    Oehmen, C S; Cannon, W R

    2008-01-01

    Data-intensive and high-performance computing are poised to significantly impact the future of biological research, which is increasingly driven by the prevalence of high-throughput experimental methodologies for genome sequencing, transcriptomics, proteomics, and other areas. Large centers (such as NIH's National Center for Biotechnology Information, The Institute for Genomic Research, and the DOE's Joint Genome Institute) have made extensive use of multiprocessor architectures to deal with some of the challenges of processing, storing and curating exponentially growing genomic and proteomic datasets, thus enabling users to rapidly access a growing public data source, as well as use analysis tools transparently on high-performance computing resources. Applying this computational power to single-investigator analysis, however, often relies on users to provide their own computational resources, forcing them to endure the learning curve of porting, building, and running software on multiprocessor architectures. Solving the next generation of large-scale biology challenges using multiprocessor machines (from small clusters to emerging petascale machines) can most practically be realized if this learning curve can be minimized through a combination of workflow management, data management and resource allocation, as well as intuitive interfaces and compatibility with existing common data formats.

  15. The challenges of developing computational physics: the case of South Africa

    International Nuclear Information System (INIS)

    Salagaram, T; Chetty, N

    2013-01-01

    Most modern scientific research problems are complex and interdisciplinary in nature. It is impossible to study such problems in detail without the use of computation in addition to theory and experiment. Although it is widely agreed that students should be introduced to computational methods at the undergraduate level, it remains a challenge to do this in a full traditional undergraduate curriculum. In this paper, we report on a survey that we conducted of undergraduate physics curricula in South Africa to determine the content and the approach taken in the teaching of computational physics. We also considered the pedagogy of computational physics at the postgraduate and research levels at various South African universities, research facilities and institutions. We conclude that the state of computational physics training in South Africa, especially at the undergraduate teaching level, is generally weak and needs to be given more attention at all universities. Failure to do so will impact negatively on the country's capacity to grow its endeavours in the computational sciences, with negative consequences for research, commerce and industry.

  16. Computational intelligence in wireless sensor networks recent advances and future challenges

    CERN Document Server

    Falcon, Rafael; Koeppen, Mario

    2017-01-01

    This book emphasizes the increasingly important role that Computational Intelligence (CI) methods are playing in solving a myriad of entangled Wireless Sensor Networks (WSN) related problems. The book serves as a guide for surveying several state-of-the-art WSN scenarios in which CI approaches have been employed. The reader finds in this book how CI has contributed to solving a wide range of challenging problems, ranging from balancing the cost and accuracy of heterogeneous sensor deployments to recovering from real-time sensor failures to detecting attacks launched by malicious sensor nodes and enacting CI-based security schemes. Network managers, industry experts, academicians and practitioners alike (mostly in computer engineering, computer science or applied mathematics) benefit from the spectrum of successful applications reported in this book. Senior undergraduate or graduate students may discover in this book some problems well suited for their own research endeavors.

  17. New Challenges for Design Participation in the Era of Ubiquitous Computing

    DEFF Research Database (Denmark)

    Brereton, Margot; Buur, Jacob

    2008-01-01

    Since the advent of participatory design in the work democracy projects of the 1970’s and 1980’s in Scandinavia, computing technology and people’s engagement with it have undergone fundamental changes. Although participatory design continues to be a precondition for designing computing that aligns with human practices, the motivations to engage in participatory design have changed, and the new era requires formats that are different from the original ones. Through the analysis of three case studies this paper seeks to explain why participatory design must be brought to bear on the field of ubiquitous computing, and how this challenges the original participatory design thinking. In particular we will argue that more casual, exploratory formats of engagement with people are required, and rather than planning the all-encompassing systems development project, participatory design needs to move towards...

  18. Geant4 Hadronic Cascade Models and CMS Data Analysis : Computational Challenges in the LHC era

    CERN Document Server

    Heikkinen, Aatos

    This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, the computer technology of clusters, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions, up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation tool, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we es...

  19. Recent progress and modern challenges in applied mathematics, modeling and computational science

    CERN Document Server

    Makarov, Roman; Belair, Jacques

    2017-01-01

    This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science.  The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modelling, and computational sciences, and is ideal for interdisciplinary collaborations.

  20. Challenges and considerations for the design and production of a purpose-optimized body-worn wrist-watch computer

    Science.gov (United States)

    Narayanaswami, Chandra; Raghunath, Mandayam T.

    2004-09-01

    We outline a collection of technological challenges in the design of wearable computers with a focus on one of the most desirable form-factors, the wrist watch. We describe our experience with building three generations of wrist watch computers. We built these research prototypes as platforms to investigate the fundamental limitations of wearable computing. Results of our investigations are presented in the form of challenges that have been overcome and those that still remain.

  1. IMPLEMENTING THE COMPUTER-BASED NATIONAL EXAMINATION IN INDONESIAN SCHOOLS: THE CHALLENGES AND STRATEGIES

    Directory of Open Access Journals (Sweden)

    Heri Retnawati

    2017-12-01

    Full Text Available In line with technological development, the computer-based national examination (CBNE) has become an urgent matter as its implementation faces various challenges, especially in developing countries. Strategies in implementing CBNE are thus needed to face the challenges. The aim of this research was to analyse the challenges and strategies of Indonesian schools in implementing CBNE. This research was qualitative phenomenological in nature. The data were collected through a questionnaire and a focus group discussion. The research participants were teachers who were test supervisors and technicians at junior high schools and senior high schools (i.e. Levels 1 and 2) and vocational high schools implementing CBNE in Yogyakarta, Indonesia. The data were analysed using the Bogdan and Biklen model. The results indicate that (1) in implementing CBNE, the schools should initially make efforts to provide the electronic equipment supporting it; (2) the implementation of CBNE is challenged by problems concerning the Internet and the electricity supply; (3) the test supervisors have to learn their duties by themselves and (4) the students are not yet familiar with the beneficial use of information technology. To deal with such challenges, the schools employed strategies by making efforts to provide the standard electronic equipment through collaboration with the students’ parents and improving the curriculum content by adding information technology as a school subject.

  2. The Challenges and Benefits of Using Computer Technology for Communication and Teaching in the Geosciences

    Science.gov (United States)

    Fairley, J. P.; Hinds, J. J.

    2003-12-01

    The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, which has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.

  3. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity.

    Science.gov (United States)

    Liang, Jie; Qian, Hong

    2010-01-01

    Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge of understanding macromolecular dynamics led the way for computations to become part of the tool set for studying molecular biology. Twenty-five years ago, the demand from genome science inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand "complex behavior" and complexity theory, and from which important biological insight can be gained.
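
    As a minimal illustration of the Gillespie algorithm mentioned above, the sketch below samples one stochastic trajectory of a toy birth-death system whose master equation has steady-state mean k_birth/k_death (the rates and system are illustrative, not taken from the paper):

    ```python
    import random

    # Minimal Gillespie SSA for a toy birth-death process (illustrative rates,
    # not from the paper): birth at constant rate, death proportional to copy
    # number. Each trajectory is one sample path of the underlying CME.

    def gillespie(x0=10, k_birth=5.0, k_death=0.5, t_end=50.0, seed=1):
        random.seed(seed)
        t, x = 0.0, x0
        trajectory = [(t, x)]
        while t < t_end:
            rates = [k_birth, k_death * x]  # reaction propensities
            total = sum(rates)
            t += random.expovariate(total)  # waiting time to the next event
            if random.random() < rates[0] / total:
                x += 1  # birth fires
            else:
                x -= 1  # death fires
            trajectory.append((t, x))
        return trajectory

    traj = gillespie()
    print(traj[-1])  # long-run copy number fluctuates around k_birth/k_death
    ```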

  4. A Step Towards A Computing Grid For The LHC Experiments ATLAS Data Challenge 1

    CERN Document Server

    Sturrock, R; Epp, B; Ghete, V M; Kuhn, D; Mello, A G; Caron, B; Vetterli, M C; Karapetian, G V; Martens, K; Agarwal, A; Poffenberger, P R; McPherson, R A; Sobie, R J; Amstrong, S; Benekos, N C; Boisvert, V; Boonekamp, M; Brandt, S; Casado, M P; Elsing, M; Gianotti, F; Goossens, L; Grote, M; Hansen, J B; Mair, K; Nairz, A; Padilla, C; Poppleton, A; Poulard, G; Richter-Was, Elzbieta; Rosati, S; Schörner-Sadenius, T; Wengler, T; Xu, G F; Ping, J L; Chudoba, J; Kosina, J; Lokajícek, M; Svec, J; Tas, P; Hansen, J R; Lytken, E; Nielsen, J L; Wäänänen, A; Tapprogge, Stefan; Calvet, D; Albrand, S; Collot, J; Fulachier, J; Ledroit-Guillon, F; Ohlsson-Malek, F; Viret, S; Wielers, M; Bernardet, K; Corréard, S; Rozanov, A; De Vivie de Régie, J B; Arnault, C; Bourdarios, C; Hrivnác, J; Lechowski, M; Parrour, G; Perus, A; Rousseau, D; Schaffer, A; Unal, G; Derue, F; Chevalier, L; Hassani, S; Laporte, J F; Nicolaidou, R; Pomarède, D; Virchaux, M; Nesvadba, N; Baranov, S; Putzer, A; Khonich, A; Duckeck, G; Schieferdecker, P; Kiryunin, A E; Schieck, J; Lagouri, T; Duchovni, E; Levinson, L; Schrager, D; Negri, G; Bilokon, H; Spogli, L; Barberis, D; Parodi, F; Cataldi, G; Gorini, E; Primavera, M; Spagnolo, S; Cavalli, D; Heldmann, M; Lari, T; Perini, L; Rebatto, D; Resconi, S; Tatarelli, F; Vaccarossa, L; Biglietti, M; Carlino, G; Conventi, F; Doria, A; Merola, L; Polesello, G; Vercesi, V; De Salvo, A; Di Mattia, A; Luminari, L; Nisati, A; Reale, M; Testa, M; Farilla, A; Verducci, M; Cobal, M; Santi, L; Hasegawa, Y; Ishino, M; Mashimo, T; Matsumoto, H; Sakamoto, H; Tanaka, J; Ueda, I; Bentvelsen, Stanislaus Cornelius Maria; Fornaini, A; Gorfine, G; Groep, D; Templon, J; Köster, L J; Konstantinov, A; Myklebust, T; Ould-Saada, F; Bold, T; Kaczmarska, A; Malecki, P; Szymocha, T; Turala, M; Kulchitskii, Yu A; Khoreauli, G; Gromova, N; Tsulaia, V; Minaenko, A A; Rudenko, R; Slabospitskaya, E; Solodkov, A; Gavrilenko, I; Nikitine, N; Sivoklokov, S Yu; Toms, K; Zalite, A; Zalite, Yu; Kervesan, B; Bosman, M; González, S; Sánchez, J; Salt, J; Andersson, N; Nixon, L; Eerola, Paule Anna Mari; Kónya, B; Smirnova, O G; Sandgren, A; Ekelöf, T J C; Ellert, M; Gollub, N; Hellman, S; Lipniacka, A; Corso-Radu, A; Pérez-Réale, V; Lee, S C; CLin, S C; Ren, Z L; Teng, P K; Faulkner, P J W; O'Neale, S W; Watson, A; Brochu, F; Lester, C; Thompson, S; Kennedy, J; Bouhova-Thacker, E; Henderson, R; Jones, R; Kartvelishvili, V G; Smizanska, M; Washbrook, A J; Drohan, J; Konstantinidis, N P; Moyse, E; Salih, S; Loken, J; Baines, J T M; Candlin, D; Candlin, R; Clifft, R; Li, W; McCubbin, N A; George, S; Lowe, A; Buttar, C; Dawson, I; Moraes, A; Tovey, Daniel R; Gieraltowski, J; Malon, D; May, E; LeCompte, T J; Vaniachine, A; Adams, D L; Assamagan, Ketevi A; Baker, R; Deng, W; Fine, V; Fisyak, Yu; Gibbard, B; Ma, H; Nevski, P; Paige, F; Rajagopalan, S; Smith, J; Undrus, A; Wenaus, T; Yu, D; Calafiura, P; Canon, S; Costanzo, D; Hinchliffe, Ian; Lavrijsen, W; Leggett, C; Marino, M; Quarrie, D R; Sakrejda, I; Stravopoulos, G; Tull, C; Loch, P; Youssef, S; Shank, J T; Engh, D; Frank, E; Sen-Gupta, A; Gardner, R; Meritt, F; Smirnov, Y; Huth, J; Grundhoefer, L; Luehring, F C; Goldfarb, S; Severini, H; Skubic, P L; Gao, Y; Ryan, T; De, K; Sosebee, M; McGuigan, P; Ozturk, N

    2004-01-01

    The ATLAS Collaboration at CERN is preparing for the data taking and analysis at the LHC that will start in 2007. Therefore, a series of Data Challenges was started in 2002 whose goals are the validation of the Computing Model, of the complete software suite, of the data model, and to ensure the correctness of the technical choices to be made for the final offline computing environment. A major feature of the first Data Challenge (DC1) was the preparation and the deployment of the software required for the production of large event samples as a worldwide distributed activity. It should be noted that it was not an option to "run the complete production at CERN" even if we had wanted to; the resources were not available at CERN to carry out the production on a reasonable time-scale. The great challenge of organising and carrying out this large-scale production at a significant number of sites around the world had therefore to be faced. However, the benefits of this are manifold: apart from realising the require...

  5. Novel spintronics devices for memory and logic: prospects and challenges for room temperature all spin computing

    Science.gov (United States)

    Wang, Jian-Ping

    An energy-efficient memory and logic device for the post-CMOS era has been the goal of a variety of research fields. The limits of scaling, which we expect to reach by the year 2025, demand that future advances in computational power be realized not from ever-shrinking device sizes, but rather from innovative designs and new materials and physics. Magnetoresistive devices have been promising candidates for future integrated magnetic computation because of their unique non-volatility and functionalities. The application of perpendicular magnetic anisotropy for potential STT-RAM applications was demonstrated and has since been intensively investigated by both academia and industry, but there is no clear pathway for how scaling will eventually work for both memory and logic applications. One of the main reasons is that no material stack candidate has been demonstrated that could lead to a scaling scheme down to sub-10 nm. Another challenge for the use of magnetoresistive devices in logic applications is the available switching speed and writing energy. Although good progress has been made in demonstrating fast switching of a thermally stable magnetic tunnel junction (MTJ) down to 165 ps, this is still several times slower than its CMOS counterpart. In this talk, I will review the recent progress by my research group and my C-SPIN colleagues, then discuss the opportunities, challenges and some potential pathways for magnetoresistive devices for memory and logic applications and their integration into a room temperature all-spin computing system.

  6. Exploring the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms: A Formative Analysis

    Directory of Open Access Journals (Sweden)

    Robin H. Kay

    2011-04-01

    Full Text Available Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess beneficial and challenging laptop behaviours in higher education classrooms. Both quantitative and qualitative data were collected from 177 undergraduate university students (89 males, 88 females). Key benefits observed include note-taking activities, in-class laptop-based academic tasks, collaboration, increased focus, improved organization and efficiency, and addressing special needs. Key challenges noted include other students’ distracting laptop behaviours, instant messaging, surfing the web, playing games, watching movies, and decreased focus. Nearly three-quarters of the students claimed that laptops were useful in supporting their academic experience. Twice as many benefits were reported compared to challenges. It is speculated that the integration of meaningful laptop activities is a critical determinant of benefits and challenges experienced in higher education classrooms.

  7. Computational analyses of ancient pathogen DNA from herbarium samples: challenges and prospects.

    Science.gov (United States)

    Yoshida, Kentaro; Sasaki, Eriko; Kamoun, Sophien

    2015-01-01

    The application of DNA sequencing technology to the study of ancient DNA has enabled the reconstruction of past epidemics from genomes of historically important plant-associated microbes. Recently, the genome sequences of the potato late blight pathogen Phytophthora infestans were analyzed from 19th century herbarium specimens. These herbarium samples originated from infected potatoes collected during and after the Irish potato famine. Herbaria therefore have great potential to help elucidate past epidemics of crops, date the emergence of pathogens, and inform about past pathogen population dynamics. DNA preservation in herbarium samples was unexpectedly good, raising the possibility of a whole new research area in plant and microbial genomics. However, the recovered DNA can be extremely fragmented, resulting in specific challenges in reconstructing genome sequences. Here we review some of the challenges in computational analyses of ancient DNA from herbarium samples. We also applied the recently developed linkage method to haplotype reconstruction of diploid or polyploid genomes from fragmented ancient DNA.

  8. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Energy Technology Data Exchange (ETDEWEB)

    King, W. E., E-mail: weking@llnl.gov [Physical and Life Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States)]; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A. [Engineering Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States)]; Kamath, C. [Computation Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States)]; Rubenchik, A. M. [NIF and Photon Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States)]

    2015-12-15

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  9. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    International Nuclear Information System (INIS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A.; Kamath, C.; Rubenchik, A. M.

    2015-01-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  10. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Science.gov (United States)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  13. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Wirth, Brian D., E-mail: bdwirth@utk.edu [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Nuclear Science and Engineering Directorate, Oak Ridge National Laboratory, Oak Ridge, TN (United States)]; Hammond, K.D. [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States)]; Krasheninnikov, S.I. [University of California, San Diego, La Jolla, CA (United States)]; Maroudas, D. [University of Massachusetts, Amherst, Amherst, MA 01003 (United States)]

    2015-08-15

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma surface interactions are among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including such processes as helium clustering, which serve as nuclei for gas bubbles; and trap mutation, dislocation loop punching and bubble bursting; which together initiate surface morphological modification.

  14. Challenges for the computational fluid dynamics codes in the nineties. Various examples of application

    International Nuclear Information System (INIS)

    Chabard, J.P.; Viollet, P.L.

    1991-08-01

    Most of the computational fluid dynamics applications encountered at the Research and Development Division of EDF (RDD) deal with thermal exchanges. The development of numerical tools for the simulation of flows, devoted to this class of application, has been under way for 15 years. At the beginning this work was mainly concerned with a good simulation of the dynamics of the flow. Now these tools can be used to compute flows with thermal exchanges. The presentation will be limited to incompressible, one-phase flows. First, the software developed at RDD will be presented. Then some applications of these tools to flows with thermal exchanges will be discussed. To conclude, the paper will treat the general case of CFD codes. The challenges for the coming years will be detailed in order to make these tools available to users involved in complex physical modeling

  15. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    International Nuclear Information System (INIS)

    Wirth, Brian D.; Hammond, K.D.; Krasheninnikov, S.I.; Maroudas, D.

    2015-01-01

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma surface interactions are among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including such processes as helium clustering, which serve as nuclei for gas bubbles; and trap mutation, dislocation loop punching and bubble bursting; which together initiate surface morphological modification

  16. The nature of the (visualization) game: Challenges and opportunities from computational geophysics

    Science.gov (United States)

    Kellogg, L. H.

    2016-12-01

    As the geosciences enters the era of big data, modeling and visualization become increasingly vital tools for discovery, understanding, education, and communication. Here, we focus on modeling and visualization of the structure and dynamics of the Earth's surface and interior. The past decade has seen accelerated data acquisition, including higher resolution imaging and modeling of Earth's deep interior, complex models of geodynamics, and high resolution topographic imaging of the changing surface, with an associated acceleration of computational modeling through better scientific software, increased computing capability, and the use of innovative methods of scientific visualization. The role of modeling is to describe a system, answer scientific questions, and test hypotheses; the term "model" encompasses mathematical models, computational models, physical models, conceptual models, statistical models, and visual models of a structure or process. These different uses of the term require thoughtful communication to avoid confusion. Scientific visualization is integral to every aspect of modeling. Not merely a means of communicating results, the best uses of visualization enable scientists to interact with their data, revealing the characteristics of the data and models to enable better interpretation and inform the direction of future investigation. Innovative immersive technologies like virtual reality, augmented reality, and remote collaboration techniques, are being adapted more widely and are a magnet for students. Time-varying or transient phenomena are especially challenging to model and to visualize; researchers and students may need to investigate the role of initial conditions in driving phenomena, while nonlinearities in the governing equations of many Earth systems make the computations and resulting visualization especially challenging. Training students how to use, design, build, and interpret scientific modeling and visualization tools prepares them

  17. ceRNAs in plants: computational approaches and associated challenges for target mimic research.

    Science.gov (United States)

    Paschoal, Alexandre Rossi; Lozada-Chávez, Irma; Domingues, Douglas Silva; Stadler, Peter F

    2017-05-30

    The competing endogenous RNA hypothesis has gained increasing attention as a potential global regulatory mechanism of microRNAs (miRNAs), and as a powerful tool to predict the function of many noncoding RNAs, including miRNAs themselves. Most studies have focused on animals, although target mimic (TM) discovery, as well as important computational and experimental advances, has developed in plants over the past decade. Thus, our contribution summarizes recent progress in computational approaches for research on miRNA:TM interactions. We divided this article into three main contributions. First, a general overview of research on TMs in plants is presented, with practical descriptions of the available literature, tools, data, databases and computational reports. Second, we describe a common protocol for the computational and experimental analyses of TMs. Third, we provide a bioinformatics approach for the prediction of TM motifs potentially cross-targeting members within the same or from different miRNA families, based on the identification of consensus miRNA-binding sites from known TMs across sequenced genomes, transcriptomes and known miRNAs. This computational approach is promising because, in contrast to animals, miRNA families in plants are large, with identical or similar members, several of which are also highly conserved. Of the three consensus TM motifs found with our approach (MIM166, MIM171 and MIM159/319), the last has found strong support in the recent experimental work by Reichel and Millar [Specificity of plant microRNA TMs: cross-targeting of mir159 and mir319. J Plant Physiol 2015;180:45-8]. Finally, we stress the discussion on the major computational and associated experimental challenges that have to be faced in future ceRNA studies.
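
    As a toy illustration of motif-based TM prediction, the sketch below builds a regular expression for a near-complementary site with a 3-nt bulge opposite miRNA positions 10-11 (the hallmark of known plant target mimics such as IPS1) and scans a hypothetical transcript fragment; the pattern construction is a simplified assumption, not the authors' pipeline:

    ```python
    import re

    # Toy consensus target-mimic scan (simplified assumption, not the authors'
    # pipeline): a TM site is modelled as the reverse complement of the miRNA
    # with a 3-nt bulge opposite miRNA positions 10-11, as in known plant eTMs.

    def revcomp(seq):
        return seq.translate(str.maketrans("ACGU", "UGCA"))[::-1]

    def tm_pattern(mirna):
        rc = revcomp(mirna)
        cut = len(rc) - 10  # bulge sits between pairing of miRNA nt 11 and 10
        return re.compile(rc[:cut] + "[ACGU]{3}" + rc[cut:])

    mir166 = "UCGGACCAGGCUUCAUUCCCC"  # miR166 used as an example sequence
    transcript = "AAGGGGAAUGAAGCUACCUGGUCCGAAA"  # hypothetical fragment
    match = tm_pattern(mir166).search(transcript)
    print(match.group(0) if match else "no TM site found")
    ```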

  18. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States)]; Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States)]; Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States)]; Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)]

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
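
    To see why the sheer number of power flows dominates, a quick back-of-the-envelope (our arithmetic, not the report's) counts the solves in a yearlong 1-second QSTS run and brackets the runtime with two assumed per-solve times:

    ```python
    # Back-of-the-envelope for the QSTS power-flow count (our arithmetic,
    # not the report's): one power-flow solve per 1-second time step.
    steps = 365 * 24 * 3600  # 31,536,000 solves per simulated year

    for ms_per_solve in (1.0, 10.0):  # assumed per-solve times for a feeder
        hours = steps * ms_per_solve / 1000.0 / 3600.0
        print(f"{ms_per_solve:>4} ms/solve -> {hours:5.1f} h")
    # ~8.8 h at 1 ms/solve and ~87.6 h at 10 ms/solve, consistent in order of
    # magnitude with the reported 10 to 120 hours on conventional computers.
    ```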

  19. Fundamental challenging problems for developing new nuclear safety standard computer codes

    International Nuclear Information System (INIS)

    Wong, P.K.; Wong, A.E.; Wong, A.

    2005-01-01

    Based on the claims of US patents 5,084,232, 5,848,377 and 6,430,516 (retrievable by entering the patent numbers at the Web site http://164.195.100.11/netahtml/srchnum.htm) and of the associated technical papers presented and published at international conferences over the preceding three years, all of which were sent to the US-NRC by e-mail on March 26, 2003 at 2:46 PM, three fundamental challenging problems for developing new nuclear safety standard computer codes were presented at the US-NRC RIC2003 Session W4, 2:15-3:15 PM, at the Washington D.C. Capital Hilton Hotel, Presidential Ballroom, on April 16, 2003, before more than 800 nuclear professionals from many countries worldwide. The objective and scope of this paper is to invite all nuclear professionals to examine and evaluate the computer codes currently used in their own countries by comparing numerical data for these three specific, openly challenging fundamental problems, in order to set up a global safety standard for all nuclear power plants in the world. (authors)

  20. IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.

    Science.gov (United States)

    Chen, Ying; Elenee Argentinis, J D; Weber, Griff

    2016-04-01

    Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  1. Best Performers Announced for the NCI-CPTAC DREAM Proteogenomics Computational Challenge | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The National Cancer Institute (NCI) Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce that teams led by Jaewoo Kang (Korea University) and by Yuanfang Guan with Hongyang Li (University of Michigan) are the best performers of the NCI-CPTAC DREAM Proteogenomics Computational Challenge. Over 500 participants from 20 countries registered for the Challenge, which offered $25,000 in cash awards contributed by the NVIDIA Foundation through its Compute the Cure initiative.

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  5. Tackling some of the most intricate geophysical challenges via high-performance computing

    Science.gov (United States)

    Khosronejad, A.

    2016-12-01

    Recently, the world has been witnessing significant enhancements in the computing power of supercomputers. Computer clusters, in conjunction with advanced mathematical algorithms, have set the stage for developing and applying powerful numerical tools to tackle some of the most intricate geophysical challenges that today's engineers face. One such challenge is to understand how turbulent flows, in real-world settings, interact with (a) rigid and/or mobile complex bed bathymetry of waterways and sea-beds in coastal areas; (b) objects with complex geometry that are fully or partially immersed; and (c) the free surface of waterways and water-surface waves in coastal areas. This understanding is especially important because turbulent flows in real-world environments are often bounded by geometrically complex boundaries, which dynamically deform and give rise to multi-scale and multi-physics transport phenomena, and are characterized by multi-lateral interactions among various phases (e.g. air/water/sediment phases). Herein, I present some of the multi-scale and multi-physics geophysical fluid mechanics processes that I have attempted to study using an in-house high-performance computational model, the so-called VFS-Geophysics. More specifically, I will present simulation results of turbulence/sediment/solute/turbine interactions in real-world settings. Some of the simulations I present were performed to gain scientific insights into processes such as sand wave formation (A. Khosronejad and F. Sotiropoulos (2014), Numerical simulation of sand waves in a turbulent open channel flow, Journal of Fluid Mechanics, 753:150-216), while others were carried out to predict the effects of climate change and large flood events on societal infrastructure (A. Khosronejad et al. (2016), Large eddy simulation of turbulence and solute transport in a forested headwater stream, Journal of Geophysical Research, doi: 10.1002/2014JF003423).

  6. Deep Space Exploration: Will We Be Ready? Infectious Diseases, Microgravity and Other Forces Affecting Health Pose Challenges for Humans Planning to Explore Space

    Science.gov (United States)

    LaRocco, Mark T.; Pierson, Duane L.

    1999-01-01

    In contemplating space travel beyond earth orbits, we humans face significant barriers and major challenges. Although researchers involved in several scientific subdisciplines, including space medicine and space life sciences, may provide insights to help overcome those barriers, their efforts are at an early stage of development, leaving open many questions of potentially major consequence.

  7. Multiscale Mechanics of Articular Cartilage: Potentials and Challenges of Coupling Musculoskeletal, Joint, and Microscale Computational Models

    Science.gov (United States)

    Halloran, J. P.; Sibole, S.; van Donkelaar, C. C.; van Turnhout, M. C.; Oomens, C. W. J.; Weiss, J. A.; Guilak, F.; Erdemir, A.

    2012-01-01

    Articular cartilage experiences significant mechanical loads during daily activities. Healthy cartilage provides the capacity for load bearing and regulates the mechanobiological processes for tissue development, maintenance, and repair. Experimental studies at multiple scales have provided a fundamental understanding of macroscopic mechanical function, evaluation of the micromechanical environment of chondrocytes, and the foundations for mechanobiological response. In addition, computational models of cartilage have offered a concise description of experimental data at many spatial levels under healthy and diseased conditions, and have served to generate hypotheses for the mechanical and biological function. Further, modeling and simulation provides a platform for predictive risk assessment, management of dysfunction, as well as a means to relate multiple spatial scales. Simulation-based investigation of cartilage comes with many challenges including both the computational burden and often insufficient availability of data for model development and validation. This review outlines recent modeling and simulation approaches to understand cartilage function from a mechanical systems perspective, and illustrates pathways to associate mechanics with biological function. Computational representations at single scales are provided from the body down to the microstructure, along with attempts to explore multiscale mechanisms of load sharing that dictate the mechanical environment of the cartilage and chondrocytes. PMID:22648577

  8. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    Science.gov (United States)

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-01-01

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients’ electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While working to promote the application of information technology in telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology. PMID:24232290

  9. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    Directory of Open Access Journals (Sweden)

    Chung-Chi Yang

    2013-11-01

    Full Text Available Many studies have indicated that computing technology can enable off-site cardiologists to read patients’ electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While working to promote the application of information technology in telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology.

  10. Mobile, cloud, and big data computing: contributions, challenges, and new directions in telecardiology.

    Science.gov (United States)

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-11-13

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients' electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While working to promote the application of information technology in telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology.

  11. Review of The SIAM 100-Digit Challenge: A Study in High-Accuracy Numerical Computing

    International Nuclear Information System (INIS)

    Bailey, David

    2005-01-01

    In the January 2002 edition of SIAM News, Nick Trefethen announced the '$100, 100-Digit Challenge'. In this note he presented ten easy-to-state but hard-to-solve problems of numerical analysis, and challenged readers to find each answer to ten-digit accuracy. Trefethen closed with the enticing comment: 'Hint: They're hard. If anyone gets 50 digits in total, I will be impressed.' This challenge obviously struck a chord in hundreds of numerical mathematicians worldwide, as 94 teams from 25 nations later submitted entries. Many of these submissions exceeded the target of 50 correct digits; in fact, 20 teams achieved a perfect score of 100 correct digits. Trefethen had offered $100 for the best submission. Given the overwhelming response, a generous donor (William Browning, founder of Applied Mathematics, Inc.) provided additional funds so that each of the 20 winning teams received a $100 award. Soon after the results were out, four participants, each from a winning team, got together and agreed to write a book about the problems and their solutions. The team is truly international: Bornemann is from Germany, Laurie is from South Africa, Wagon is from the USA, and Waldvogel is from Switzerland. This book provides some mathematical background for each problem, and then shows in detail how each of them can be solved. In fact, multiple solution techniques are mentioned in each case. The book describes how to extend these solutions to much larger problems and much higher numeric precision (hundreds or thousands of digits of accuracy). The authors also show how to compute error bounds for the results, so that one can say with confidence that one's results are accurate to the level stated. Numerous numerical software tools are demonstrated in the process, including the commercial products Mathematica, Maple and Matlab. Computer programs that perform many of the algorithms mentioned in the book are provided, both in an appendix to the book and on a website. In the process, the
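    For a flavour of the extended-precision computing the book teaches, the sketch below evaluates a simple series to 100 significant digits using the free mpmath library. This is an illustrative exercise in the same spirit, not one of the ten Challenge problems, and mpmath is not among the commercial tools named in the review.

```python
# Compute sum 1/n^2 to ~100 digits and check it against the closed form
# pi^2/6.  mpmath's nsum accelerates the slowly converging series.

from mpmath import mp, nsum, mpf, pi

mp.dps = 105                                   # ~105 significant digits

s = nsum(lambda n: 1 / mpf(n) ** 2, [1, mp.inf])
exact = pi ** 2 / 6

print(mp.nstr(s, 100))                         # 1.6449340668...
print(mp.nstr(abs(s - exact), 5))              # residual ~1e-100
```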

  12. Algorithms for limited-view computed tomography: an annotated bibliography and a challenge

    International Nuclear Information System (INIS)

    Rangayyan, R.; Dhawan, A.P.; Gordon, R.

    1985-01-01

    In many applications of computed tomography, it may not be possible to acquire projection data at all angles, as required by the most commonly used algorithm of convolution backprojection. In such a limited-data situation, we face an ill-posed problem in attempting to reconstruct an image from an incomplete set of projections. Many techniques have been proposed to tackle this situation, employing diverse theories such as signal recovery, image restoration, constrained deconvolution, and constrained optimization, as well as novel schemes such as iterative object-dependent algorithms incorporating a priori knowledge and use of multispectral radiation. The authors present an overview of such techniques and offer a challenge to all readers to reconstruct images from a set of limited-view data provided here
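    For readers taking up this challenge, the sketch below shows the skeleton of one technique family mentioned above: an algebraic (Kaczmarz/ART-style) iteration with nonnegativity as a priori knowledge, applied to a toy underdetermined system standing in for limited-view projection data. The dimensions and relaxation factor are illustrative choices, not values from the bibliography.

```python
# Toy ART reconstruction for an underdetermined ("limited-view") system
# A x = b, where each row of A stands in for one projection ray.

import numpy as np

def art_reconstruct(A, b, n_sweeps=50, relax=1.0):
    """Kaczmarz sweeps: project the estimate onto each ray equation in turn,
    then clip to enforce the prior that attenuation is nonnegative."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, b_i in zip(A, b):
            x += relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
            x = np.maximum(x, 0.0)
    return x

rng = np.random.default_rng(0)
x_true = rng.random(16)                 # toy 4x4 image, flattened
A = rng.random((10, 16))                # only 10 rays: limited view
b = A @ x_true
x_rec = art_reconstruct(A, b)
print(np.linalg.norm(A @ x_rec - b))    # data residual driven near zero
```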

  13. Copy-Right for Software and Computer Games: Strategies and Challenges

    Directory of Open Access Journals (Sweden)

    Hojatollah Ayoubi

    2009-11-01

    Full Text Available Copy-right was initially used in the cultural and art industries. Since then there have been two different approaches to the matter: the commercial-economic approach, which is concerned with the rights of suppliers and investors, and the cultural approach, which is especially concerned with the rights of the author. The first approach is rooted in Anglo-American countries, while the other is originally French. Expansion of the computer market, and the separation of the software and hardware markets, led to the so-called 'velvet robbery', which refers to illegal reproduction in the market. Therefore, there have been struggles all over the world to protect the rights of producers. The present study reviews different strategies to face this challenge, alongside the domestic and international difficulties these strategies would encounter.

  14. Applied & Computational Mathematics Challenges for the Design and Control of Dynamic Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; Burns, J A; Collis, S; Grosh, J; Jacobson, C A; Johansen, H; Mezic, I; Narayanan, S; Wetter, M

    2011-03-10

    The Energy Independence and Security Act of 2007 (EISA) was passed with the goal 'to move the United States toward greater energy independence and security.' Energy security and independence cannot be achieved unless the United States addresses the issue of energy consumption in the building sector and significantly reduces energy consumption in buildings. Commercial and residential buildings account for approximately 40% of U.S. energy consumption and emit 50% of U.S. CO2 emissions; this is more than twice the total energy consumption of the entire U.S. automobile and light truck fleet. A 50%-80% improvement in building energy efficiency, in both new construction and in retrofitting existing buildings, could significantly reduce U.S. energy consumption and mitigate climate change. Reaching these aggressive building efficiency goals will not happen without significant Federal investments in the computational and mathematical sciences. Applied and computational mathematics are required to enable the development of algorithms and tools to design, control and optimize energy efficient buildings. The challenge has been issued by the U.S. Secretary of Energy, Dr. Steven Chu (emphasis added): 'We need to do more transformational research at DOE including computer design tools for commercial and residential buildings that enable reductions in energy consumption of up to 80 percent with investments that will pay for themselves in less than 10 years.' On July 8-9, 2010 a team of technical experts from industry, government and academia was assembled in Arlington, Virginia to identify the challenges associated with developing and deploying new computational methodologies and tools that will address building energy efficiency. These experts concluded that investments in fundamental applied and computational mathematics will be required to build enabling technology that can be used to realize the target of 80% reductions in energy

  15. Addressing current challenges in cancer immunotherapy with mathematical and computational modelling.

    Science.gov (United States)

    Konstorum, Anna; Vella, Anthony T; Adler, Adam J; Laubenbacher, Reinhard C

    2017-06-01

    The goal of cancer immunotherapy is to boost a patient's immune response to a tumour. Yet, the design of an effective immunotherapy is complicated by various factors, including a potentially immunosuppressive tumour microenvironment, immune-modulating effects of conventional treatments and therapy-related toxicities. These complexities can be incorporated into mathematical and computational models of cancer immunotherapy that can then be used to aid in rational therapy design. In this review, we survey modelling approaches under the umbrella of the major challenges facing immunotherapy development, which encompass tumour classification, optimal treatment scheduling and combination therapy design. Although overlapping, each challenge has presented unique opportunities for modellers to make contributions using analytical and numerical analysis of model outcomes, as well as optimization algorithms. We discuss several examples of models that have grown in complexity as more biological information has become available, showcasing how model development is a dynamic process interlinked with the rapid advances in tumour-immune biology. We conclude the review with recommendations for modellers both with respect to methodology and biological direction that might help keep modellers at the forefront of cancer immunotherapy development. © 2017 The Author(s).
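    As one concrete instance of the modelling approaches surveyed, the sketch below integrates a minimal Kuznetsov-style tumour-immune ODE system with a constant immunotherapy source term s. The functional form is standard in this literature, but the parameter values here are illustrative placeholders, not fitted to data.

```python
# Minimal tumour-immune ODE model: effector cells E are recruited at rate s
# (immunotherapy) and by tumour contact, and kill tumour cells T, which
# otherwise grow logistically.  All parameter values are illustrative.

from scipy.integrate import solve_ivp

def tumour_immune(t, y, s=0.1, g=1.0, m=0.05, d=0.2, r=0.4, K=1.0, k=0.5):
    E, T = y
    dE = s + g * E * T / (1 + T) - m * E * T - d * E
    dT = r * T * (1 - T / K) - k * E * T
    return [dE, dT]

sol = solve_ivp(tumour_immune, (0.0, 100.0), [0.3, 0.2])
print(f"tumour burden at t=100: {sol.y[1, -1]:.4f}")
```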

  16. Challenges in computational fluid dynamics simulation for the nineties. Various examples of application

    International Nuclear Information System (INIS)

    Chabard, J.P.; Viollet, P.L.

    1991-01-01

    Most of the computational fluid dynamics applications encountered at the Research Branch of EDF (DER) deal with thermal exchanges. The development of numerical tools for the simulation of flows, devoted to this class of application, has been under way for 15 years. At the beginning this work was mainly concerned with a good simulation of the dynamics of the flow. Now these tools can be used to compute flows with thermal exchanges. The presentation will be limited to incompressible, single-phase flows (the DER developments on two-phase flows are discussed in the paper by Hery, Boivin and Viollet in the present magazine). First the software packages developed at DER will be presented. Then some applications of these tools to flows with thermal exchanges will be discussed. To conclude, the paper will treat the general case of CFD codes; the challenges for the coming years will be detailed in order to make these tools available to users involved in complex physical modeling. [fr]

  17. The Security Challenges in the IoT Enabled Cyber-Physical Systems and Opportunities for Evolutionary Computing & Other Computational Intelligence

    OpenAIRE

    He, H.; Maple, C.; Watson, T.; Tiwari, A.; Mehnen, J.; Jin, Y.; Gabrys, Bogdan

    2016-01-01

    The Internet of Things (IoT) has given rise to the fourth industrial revolution (Industrie 4.0), and it brings great benefits by connecting people, processes and data. However, cybersecurity has become a critical challenge in IoT-enabled cyber-physical systems, from connected supply chains and the Big Data produced by huge numbers of IoT devices to industrial control systems. Evolutionary computation, combined with other computational intelligence, will play an important role in cybersecurity, such as ...

  18. Readiness for Living Technology

    DEFF Research Database (Denmark)

    Peronard, Jean-Paul

    2013-01-01

    This paper is a comparative analysis of healthcare workers with high and low degrees of readiness for living technology such as robotics. To explore the differences among workers’ readiness for robotics in healthcare, statistical analysis was conducted on the data set obtained from 200...

  19. A New Gilded Age, and What It Means for Global Health Comment on "Global Health Governance Challenges 2016 - Are We Ready?"

    Science.gov (United States)

    Schrecker, Ted

    2016-08-17

    New contours of global inequality present new challenges for global health, and require that we consider new kinds of health issues as global. I provide a number of illustrations, arguing the need for a political science of health that goes beyond conventional preoccupations with formal institutional and inter-state interactions and takes into account how globalization has affected the health policy landscape and restructured the distribution of economic and political power not only among countries, but also within them. © 2017 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  20. A Federal Vision for Future Computing: A Nanotechnology-Inspired Grand Challenge

    Science.gov (United States)

    2016-07-29

    ...fault-tolerant system that consumes less power than an incandescent light bulb. Recent progress in developing novel, low-power methods of sensing and ... computation—including neuromorphic, magneto-electronic, and analog systems—combined with dramatic advances in neuroscience and cognitive sciences ... enable ready-to-fabricate designs and specifications. 4. Brain-Inspired Approaches: Neuroscience research suggests that the brain is a complex, high...

  1. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  2. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher order data products, as well as user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high performance disk storage (SSD) for the hot areas and less expensive slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute intensive workloads like parallel computation of hydrologic routing on high resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user
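    The access-driven tiering described above reduces, at its core, to ranking spatial tiles by access count and pinning the hottest fraction to fast storage, as in the sketch below. The tile identifiers, access log and 20% cut-off are hypothetical stand-ins for OT's actual metrics.

```python
# Assign dataset tiles to storage tiers from an access log: the most
# frequently requested fraction goes to SSD, the rest to slower disk.

from collections import Counter

access_log = ["tile_007", "tile_007", "tile_012", "tile_007",
              "tile_003", "tile_012", "tile_044"]        # hypothetical requests

counts = Counter(access_log)
ranked = [tile for tile, _ in counts.most_common()]
n_hot = max(1, int(0.2 * len(ranked)))                   # hottest 20% -> SSD

tiers = {tile: ("ssd" if i < n_hot else "disk") for i, tile in enumerate(ranked)}
print(tiers)   # {'tile_007': 'ssd', 'tile_012': 'disk', ...}
```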

  3. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was drawn up at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  5. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  6. Radiation doses in pediatric computed tomography procedures: challenges facing new technologies

    International Nuclear Information System (INIS)

    Cotelo, E.; Padilla, M.; Dibarboure, L.

    2008-01-01

    Despite the fact that in recent years an increasing number of radiologists and radiological technologists have been applying radiation dose optimization techniques in paediatric Computed Tomography (CT) examinations, dual- and multi-slice CT (MSCT) scanners present a new challenge in Radiation Protection (RP). While on one hand these scanners are provided with Automatic Exposure Control (AEC) devices, dose reduction modes and dose estimation software, on the other hand Quality Control (QC) tests, CT Kerma Index (C) measurements and patient dose estimation present specific difficulties and require changes or adaptations of traditional QC protocols. This implies a major challenge in most developing countries, where Quality Assurance Programmes (QAP) have not been implemented yet and there is a shortage of medical physicists. This paper analyses clinical and technical protocols as well as patient doses in 204 CT body procedures performed in 154 children. The investigation was carried out in a paediatric reference hospital of Uruguay, where an average of 450 paediatric CT examinations per month are performed on a single dual-slice CT scanner. In addition, the C_VOL reported on the scanner display was recorded so that it could be compared with the same dosimetric quantity derived from technical parameters and from tabulated C values. Results showed that not all radiologists applied the same protocol in similar clinical situations, delivering unnecessary patient dose with no significant difference in image quality. Moreover, it was found that dose reduction modes complicate patient dose estimation, because the mA changes according to tissue attenuation, in most cases within each rotation. The study concluded on the importance of QAPs, which must include education on RP for radiologists and technologists, and on the need for medical physicists to perform QC tests and patient dose estimations and measurements. (author)
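    For reference, the sketch below works through the standard weighted and volume CT kerma index bookkeeping that underlies such patient dose estimates (C_w = C_center/3 + 2*C_periphery/3, C_vol = C_w/pitch, DLP = C_vol x scan length). The phantom readings, pitch and scan length are hypothetical numbers, not values from the study.

```python
# Standard CT dosimetry bookkeeping; all input values are hypothetical.

c_center, c_periphery = 3.2, 4.1      # mGy, phantom measurements
pitch, scan_length_cm = 1.2, 25.0

c_w = c_center / 3 + 2 * c_periphery / 3   # weighted CT kerma index
c_vol = c_w / pitch                        # volume index (helical scan)
dlp = c_vol * scan_length_cm               # dose-length product

print(f"C_w = {c_w:.2f} mGy, C_vol = {c_vol:.2f} mGy, DLP = {dlp:.1f} mGy*cm")
```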

  7. Data Challenges

    CERN Multimedia

    McCubbin, N A

    Some two years ago we planned a series of Data Challenges starting at the end of 2001. At the time, that seemed to be comfortingly far in the future... Well, as the saying goes, doesn't time fly when you are having fun! ATLAS Computing is now deep in the throes of getting the first Data Challenge (DC0) up and running. One of the main aims of DC0 is to have a software 'release' in which we can generate full physics events, track all particles through the detector, simulate the detector response, reconstruct the event, and study it, with appropriate data storage en route. As all software is "always 95% ready" (!), we have been able to do most of this, more or less, for some time. But DC0 forces us to have everything working, together, at the same time: a reality check. DC0 should finish early next year, and it will be followed almost immediately afterwards by DC1 (DC0 was foreseen as the 'check' for DC1). DC1 will last into the middle of 2002, and has two major goals. The first is generation, simulation, and r...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4-times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  9. Cone beam computed tomographic imaging: perspective, challenges, and the impact of near-trend future applications.

    Science.gov (United States)

    Cavalcanti, Marcelo Gusmão Paraiso

    2012-01-01

    Cone beam computed tomography (CBCT) can be considered a valuable imaging modality for improving diagnosis and treatment planning and for achieving true guidance in several craniofacial surgical interventions. A highlight of current discussion in medical informatics, and a new concept and perspective, is the new interactive imaging workflow. The aim of this article was to present, in a short literature review, the usefulness of CBCT technology as an important alternative imaging modality, highlighting current practices and near-term future applications from cutting-edge, thought-provoking perspectives for craniofacial surgical assessment. This article explains the state of the art of CBCT improvements, the medical workstation, and perspectives on the dedicated unique hardware and software that can be used with the CBCT source. In conclusion, CBCT technology is developing rapidly, and many advances are on the horizon. Further progress in medical workstations, engineering capabilities, and independent software (some of it open source) should be pursued with this new imaging method. The perspectives, challenges, and pitfalls of CBCT are delineated and evaluated along with the technological developments.

  10. Challenges in clinical applications of brain computer interfaces in individuals with spinal cord injury

    Directory of Open Access Journals (Sweden)

    Rüdiger eRupp

    2014-09-01

    Full Text Available Brain computer interfaces (BCIs) are devices that measure brain activities and translate them into control signals used for a variety of applications. Among them are systems for communication, environmental control, neuroprostheses, exoskeletons and restorative therapies. Over the last years the technology of BCIs has reached a level of maturity allowing them to be used not only in research experiments supervised by scientists, but also in clinical routine with patients with neurological impairments, supervised by clinical personnel or caregivers. However, clinicians and patients face many challenges in the application of BCIs. This particularly applies to patients with high spinal cord injuries, in whom artificial ventilation, autonomic dysfunctions, neuropathic pain or the inability to achieve a sufficient level of control during short-term training may limit the successful use of a BCI. Additionally, spasmolytic medication and the acute stress reaction with associated episodes of depression may have a negative influence on the modulation of brain waves and therefore on the ability to concentrate over an extended period of time. Although BCIs seem to be a promising assistive technology for individuals with high spinal cord injury, systematic investigations are highly needed to obtain realistic estimates of the percentage of users that, for any reason, may not be able to operate a BCI in a clinical setting.

  11. Technology Readiness Level Guidebook

    Science.gov (United States)

    2017-09-01

    This guidebook provides the necessary information for conducting a Technology Readiness Level (TRL) Assessment. TRL Assessments are a tool for determining the maturity of technologies and identifying next steps in the research process. This guidebook...

  12. Computer Literacy of Iranian Teachers of English as a Foreign Language: Challenges and Obstacles

    Science.gov (United States)

    Dashtestani, Reza

    2014-01-01

    Basically, one of the requirements for the implementation of computer-assisted language learning (CALL) is English as a foreign language (EFL) teachers' ability to use computers effectively. Educational authorities and planners should identify EFL teachers' computer literacy levels and make attempts to improve the teachers' computer competence.…

  13. Exercising CMS dataflows and workflows in computing challenges at the Spanish Tier-1 and Tier-2 sites

    Energy Technology Data Exchange (ETDEWEB)

    Caballero, J; Colino, N; Peris, A D; G-Abia, P; Hernandez, J M; R-Calonge, F J [CIEMAT, Madrid (Spain); Cabrillo, I; Caballero, I G; Marco, R; Matorras, F [IFCA, Santander (Spain); Flix, J; Merino, G [PIC, Barcelona (Spain)], E-mail: jose.hernandez@ciemat.es

    2008-07-15

    An overview of the data transfer, processing and analysis operations conducted at the Spanish Tier-1 (PIC, Barcelona) and Tier-2 (CIEMAT-Madrid and IFCA-Santander federation) centres during the past CMS CSA06 Computing, Software and Analysis challenge and in preparation for CSA07 is presented.

  14. Exercising CMS dataflows and workflows in computing challenges at the Spanish Tier-1 and Tier-2 sites

    International Nuclear Information System (INIS)

    Caballero, J; Colino, N; Peris, A D; G-Abia, P; Hernandez, J M; R-Calonge, F J; Cabrillo, I; Caballero, I G; Marco, R; Matorras, F; Flix, J; Merino, G

    2008-01-01

    An overview of the data transfer, processing and analysis operations conducted at the Spanish Tier-1 (PIC, Barcelona) and Tier-2 (CIEMAT-Madrid and IFCA-Santander federation) centres during the past CMS CSA06 Computing, Software and Analysis challenge and in preparation for CSA07 is presented

  15. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  16. Human Resource Information System implementation readiness in the Ethiopian health sector: a cross-sectional study.

    Science.gov (United States)

    Dilu, Eyilachew; Gebreslassie, Measho; Kebede, Mihiretu

    2017-12-20

    Health workforce information systems in low-income countries tend to be defective, with poor links to their information sources. The Human Resource Information System (HRIS) is currently in a pilot implementation phase in the Federal Ministry of Health and Regional Health Bureaus of Ethiopia. Before scaling up the implementation, it is important to understand the implementation readiness of hospitals and health departments. The aims of this study were to assess readiness for HRIS implementation, identify associated factors, and explore the implementation challenges in public hospitals and health departments of the Amhara National Regional State, Ethiopia. An institution-based cross-sectional study supplemented with a qualitative study was conducted from the 15th of February to the 30th of March 2016 in 19 public hospitals and health departments of the Amhara National Regional State, Ethiopia. A self-administered questionnaire was used to collect the data. The questionnaire includes items on socio-demographic characteristics and questions measuring technical, personal, and organizational factors adapted from the 32-item questionnaire of the Management Science for Health (MSH) HRIS readiness assessment tool. The data were entered and analyzed with statistical software. Descriptive statistics and bivariate and multivariable logistic regression analyses were performed. Odds ratios with 95% confidence intervals were computed to identify the factors statistically associated with readiness for HRIS implementation. In-depth interviews and observation checklists were used to collect qualitative data. Thematic content analysis was used to analyze the qualitative data. A total of 246 human resource (HR) employees and 16 key informants were included in the study. The HR employees' level of readiness for HRIS implementation in this study was 35.8%. Employees' Internet access (AOR = 2.59, 95%CI = 1.19, 5.62), availability of a separate HR section (AOR = 8.08, 95%CI
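    The sketch below shows, on synthetic data, how adjusted odds ratios of the kind reported above (e.g. AOR = 2.59 for Internet access) are typically obtained: fit a logistic regression and exponentiate its coefficients and confidence bounds. The data-generating numbers are invented for illustration and do not reproduce the study's results.

```python
# Odds ratio with 95% CI from a logistic regression on synthetic data.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
internet = rng.integers(0, 2, 246)                    # predictor: Internet access
logit_p = -1.0 + 0.95 * internet                      # synthetic true log-odds
ready = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))   # outcome: HRIS readiness

X = sm.add_constant(internet.astype(float))
fit = sm.Logit(ready, X).fit(disp=0)

or_est = np.exp(fit.params[1])
ci_low, ci_high = np.exp(fit.conf_int()[1])
print(f"OR = {or_est:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```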

  17. Computational intelligence in gait research: a perspective on current applications and future challenges.

    Science.gov (United States)

    Lai, Daniel T H; Begg, Rezaul K; Palaniswami, Marimuthu

    2009-09-01

    Our mobility is an important daily requirement so much so that any disruption to it severely degrades our perceived quality of life. Studies in gait and human movement sciences, therefore, play a significant role in maintaining the well-being of our mobility. Current gait analysis involves numerous interdependent gait parameters that are difficult to adequately interpret due to the large volume of recorded data and lengthy assessment times in gait laboratories. A proposed solution to these problems is computational intelligence (CI), which is an emerging paradigm in biomedical engineering most notably in pathology detection and prosthesis design. The integration of CI technology in gait systems facilitates studies in disorders caused by lower limb defects, cerebral disorders, and aging effects by learning data relationships through a combination of signal processing and machine learning techniques. Learning paradigms, such as supervised learning, unsupervised learning, and fuzzy and evolutionary algorithms, provide advanced modeling capabilities for biomechanical systems that in the past have relied heavily on statistical analysis. CI offers the ability to investigate nonlinear data relationships, enhance data interpretation, design more efficient diagnostic methods, and extrapolate model functionality. These are envisioned to result in more cost-effective, efficient, and easy-to-use systems, which would address global shortages in medical personnel and rising medical costs. This paper surveys current signal processing and CI methodologies followed by gait applications ranging from normal gait studies and disorder detection to artificial gait simulation. We review recent systems focusing on the existing challenges and issues involved in making them successful. We also examine new research in sensor technologies for gait that could be combined with these intelligent systems to develop more effective healthcare solutions.
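    As a minimal illustration of the supervised-learning paradigm discussed above, the sketch below trains a support-vector classifier to separate two synthetic gait classes described by a two-dimensional feature vector (for example, stride-interval and toe-clearance statistics). The feature values are fabricated for the example and carry no clinical meaning.

```python
# Toy supervised gait classification with an RBF-kernel SVM.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
normal   = rng.normal(loc=[1.5, 0.20], scale=0.1, size=(50, 2))  # synthetic features
impaired = rng.normal(loc=[1.2, 0.35], scale=0.1, size=(50, 2))

X = np.vstack([normal, impaired])
y = np.array([0] * 50 + [1] * 50)         # 0 = normal gait, 1 = impaired gait

clf = SVC(kernel="rbf")
print(cross_val_score(clf, X, y, cv=5).mean())   # high accuracy on this toy set
```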

  18. Some computational challenges of developing efficient parallel algorithms for data-dependent computations in thermal-hydraulics supercomputer applications

    International Nuclear Information System (INIS)

    Woodruff, S.B.

    1992-01-01

    The Transient Reactor Analysis Code (TRAC), which features a two-fluid treatment of thermal-hydraulics, is designed to model transients in water reactors and related facilities. One of the major computational costs associated with TRAC and similar codes is calculating constitutive coefficients. Although the formulations for these coefficients are local, the costs are flow-regime- or data-dependent; i.e., the computations needed for a given spatial node often vary widely as a function of time. Consequently, poor load balancing will degrade efficiency on either vector or data-parallel architectures when the data are organized according to spatial location. Unfortunately, a general automatic solution to the load-balancing problem associated with data-dependent computations is not yet available for massively parallel architectures. This document discusses approaches, such as a neural net representation, that do not exhibit load-balancing problems.
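    To make the load-balancing issue concrete, the sketch below assigns nodes whose (invented) constitutive-law costs vary with the local flow regime to workers using a greedy longest-processing-time heuristic. This is a generic illustration of the problem, not TRAC's actual partitioning scheme.

```python
# Greedy longest-processing-time assignment of regime-dependent node costs.

import heapq

node_costs = [5, 5, 5, 1, 1, 1, 9, 9, 2, 2, 7, 1]   # hypothetical per-node costs

def greedy_balance(costs, n_workers):
    """Assign the heaviest remaining node to the currently lightest worker."""
    heap = [(0, w) for w in range(n_workers)]        # (load, worker id)
    loads = [0] * n_workers
    for c in sorted(costs, reverse=True):
        load, w = heapq.heappop(heap)
        loads[w] = load + c
        heapq.heappush(heap, (loads[w], w))
    return loads

print(greedy_balance(node_costs, 3))   # near-equal loads: [16, 16, 16]
```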

  19. FY 1992 Blue Book: Grand Challenges: High Performance Computing and Communications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — High performance computing and computer communications networks are becoming increasingly important to scientific advancement, economic competition, and national...

  20. FY 1993 Blue Book: Grand Challenges 1993: High Performance Computing and Communications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — High performance computing and computer communications networks are becoming increasingly important to scientific advancement, economic competition, and national...

  1. First ALMA Transporter Ready for Challenging Duty

    Science.gov (United States)

    2008-07-01

    The first of two ALMA transporters -- unique vehicles designed to move high-tech radio-telescope antennas in the harsh, high-altitude environment of the Atacama Large Millimeter/submillimeter Array -- has been completed and passed its initial operational tests. The 130-ton machine moves on 28 wheels and will be able to transport a 115-ton antenna and set it down on a concrete pad within millimeters of a prescribed position. [Image: The ALMA Transporter on a Test Run. Credit: ESO] The ALMA transporter rolled out of its hangar and underwent the tests at the Scheuerle Fahrzeugfabrik company site near Nuremberg, Germany. The machine is scheduled for delivery at the ALMA site in Chile by the end of 2007, and a second vehicle will follow about three months later. ALMA is a giant, international observatory under construction in the Atacama Desert of northern Chile at an elevation of 16,500 feet. Using at least 66 high-precision antennas, with the possibility of increasing the number in the future, ALMA will provide astronomers with an unprecedented ability to explore the Universe as seen at wavelengths of a few millimeters to less than a millimeter. By moving the antennas from configurations as compact as 150 meters to as wide as 15 kilometers, the system will provide a zoom-lens ability for scientists. "The ability to move antennas to reconfigure the array is vital to fulfilling ALMA's scientific mission. The operations plan calls for moving antennas on a daily basis to provide the flexibility that will be such a big part of ALMA's scientific value. That's why the transporters are so important and why this is such a significant milestone," said Adrian Russell, North American Project Manager for ALMA. "The ALMA antennas will be assembled and their functionality will be verified at a base camp, located at an altitude of 2900 meters (9500 feet), and the transporters will in a first step bring the telescopes up to the 5000-meter (16,500 feet) high observatory," explained Hans Rykaczewski, the European ALMA Project Manager. "There, the transporters will move the antennas from the compact configuration to any extended configuration, which could stretch up to 15 kilometers." To do their job for ALMA, the transporters will have to climb a 17-mile, high-altitude road with an average grade of 7 percent. Carrying an antenna, they can move about 7 mph; when empty, they can travel about 12 mph. The trip from the base camp to the high observing site will take about three hours. A special brake system allows them to safely make the downhill trip. The machines also incorporate a number of redundant safety devices to protect both the personnel and the valuable antennas. "In order to operate the transporter at the ALMA site, two engines with a total of about 1400 horsepower are installed, and all the components have been checked to meet the requirements of these extreme conditions," says Andreas Kohler, Vice President for Research and Development at Scheuerle Fahrzeugfabrik, the company which built the transporters under contract to ESO. "The human factor was also considered. For example, the backrests of the driver seats are shaped to allow the driver to wear his oxygen tank while driving." At the high elevation of 16,500 feet, the transporter engines will only provide about half their rated power, because of the lowered amount of available oxygen.
The ALMA project is a partnership between Europe, Japan and North America in cooperation with the Republic of Chile. ALMA is funded in Europe by ESO, in Japan by the National Institutes of Natural Sciences in cooperation with the Academia Sinica in Taiwan and in North America by the U.S. National Science Foundation in cooperation with the National Research Council of Canada. ALMA construction and operations are led on behalf of Europe by ESO, on behalf of Japan by the National Astronomical Observatory of Japan and on behalf of North America by the National Radio Astronomy Observatory, which is managed by Associated Universities, Inc.

  2. Are we ready to accept the challenge?

    DEFF Research Database (Denmark)

    Lau, Sofie Rosenlund; Traulsen, Janine M

    2017-01-01

    … including explicitly reflecting upon theoretical perspectives affecting the research process. METHODS: Content analysis was used to evaluate levels of theoretical visibility and analysis transparency in selected qualitative research articles published in Research in Social and Administrative Pharmacy … the standpoint that theory and high-quality analysis go hand-in-hand. Based on the content analysis, articles that were deemed to be high in quality were explicit about the theoretical framework of their study and transparent in how they analyzed their data. It was found that theory contributed … to the transparency of how the data were analyzed and interpreted. Two ways of improving contemporary qualitative research in the field of social and administrative pharmacy are discussed: engaging with social theory and establishing close collaboration with social scientists.

  3. Challenges in the twentieth century and beyond: Computer codes and data

    International Nuclear Information System (INIS)

    Kirk, B.L.

    1995-01-01

    The second half of the twentieth century has seen major changes in computer architecture. From the early fifties to the early seventies, the word "computer" demanded reverence, respect, and even fear. Computers, then, were almost "untouchable." Since then, computers have become the mainstream of communication on rapidly expanding communication highways. They have become necessities of life. This report describes computer codes and packaging, as well as compilers and operating systems.

  4. Migrating Educational Data and Services to Cloud Computing: Exploring Benefits and Challenges

    Science.gov (United States)

    Lahiri, Minakshi; Moseley, James L.

    2013-01-01

    "Cloud computing" is currently the "buzzword" in the Information Technology field. Cloud computing facilitates convenient access to information and software resources as well as easy storage and sharing of files and data, without the end users being aware of the details of the computing technology behind the process. This…

  5. Challenges in computational materials science: Multiple scales, multi-physics and evolving discontinuities

    NARCIS (Netherlands)

    Borst, de R.

    2008-01-01

    Novel experimental possibilities, together with improvements in computer hardware as well as new concepts in computational mathematics and mechanics, in particular multiscale methods, are now, in principle, making it possible to derive and compute phenomena and material parameters at a macroscopic scale…

  6. Computer Games in Pre-School Settings: Didactical Challenges when Commercial Educational Computer Games Are Implemented in Kindergartens

    Science.gov (United States)

    Vangsnes, Vigdis; Gram Okland, Nils Tore; Krumsvik, Rune

    2012-01-01

    This article focuses on the didactical implications when commercial educational computer games are used in Norwegian kindergartens by analysing the dramaturgy and the didactics of one particular game and the game in use in a pedagogical context. Our justification for analysing the game by using dramaturgic theory is that we consider the game to be…

  7. Quality Assurance Challenges for Motion-Adaptive Radiation Therapy: Gating, Breath Holding, and Four-Dimensional Computed Tomography

    International Nuclear Information System (INIS)

    Jiang, Steve B.; Wolfgang, John; Mageras, Gig S.

    2008-01-01

    Compared with conventional three-dimensional (3D) conformal radiation therapy and intensity-modulated radiation therapy treatments, quality assurance (QA) for motion-adaptive radiation therapy involves various challenges because of the added temporal dimension. Here we discuss those challenges for three specific techniques related to motion-adaptive therapy: namely respiratory gating, breath holding, and four-dimensional computed tomography. Similar to the introduction of any other new technologies in clinical practice, typical QA measures should be taken for these techniques also, including initial testing of equipment and clinical procedures, as well as frequent QA examinations during the early stage of implementation. Here, rather than covering every QA aspect in depth, we focus on some major QA challenges. The biggest QA challenge for gating and breath holding is how to ensure treatment accuracy when internal target position is predicted using external surrogates. Recommended QA measures for each component of treatment, including simulation, planning, patient positioning, and treatment delivery and verification, are discussed. For four-dimensional computed tomography, some major QA challenges have also been discussed.
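
    The surrogate-based prediction problem named above as the biggest QA challenge can be illustrated with a toy check: fit an external-to-internal correlation model on one breathing interval and measure its residual error on later breaths, when the correlation may have drifted. This is our own minimal sketch, not a procedure from the paper; all signals, names and tolerances are synthetic.

```python
# Toy QA check: how well does an external surrogate predict internal
# target position? Fit on the first 30 s, test on the next 30 s.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 60.0, 1200)                    # 60 s sampled at 20 Hz
surrogate = np.sin(2 * np.pi * t / 4.0)             # external marker (a.u.)
target = 8.0 * surrogate + 2.0 + rng.normal(0.0, 0.5, t.size)  # tumour pos (mm)

train = t < 30.0
slope, intercept = np.polyfit(surrogate[train], target[train], 1)

pred = slope * surrogate[~train] + intercept
rmse = np.sqrt(np.mean((pred - target[~train]) ** 2))
print(f"fit: target = {slope:.2f}*surrogate + {intercept:.2f}; "
      f"hold-out RMSE = {rmse:.2f} mm")
# A QA procedure would flag the gate if this residual error exceeded
# a clinically chosen tolerance.
```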

  8. Is the "Net Generation" Ready for Digital Citizenship? Perspectives from the IEA International Computer and Information Literacy Study 2013. Policy Brief No. 6

    Science.gov (United States)

    Watkins, Ryan; Engel, Laura C.; Hastedt, Dirk

    2015-01-01

    The rise of digital information and communication technologies (ICT) has made the acquisition of computer and information literacy (CIL) a leading factor in creating an engaged, informed, and employable citizenry. However, are young people, often described as "digital natives" or the "net generation," developing the necessary…

  9. Capture ready study

    Energy Technology Data Exchange (ETDEWEB)

    Minchener, A.

    2007-07-15

    There are a large number of ways in which the capture of carbon as carbon dioxide (CO2) can be integrated into fossil fuel power stations, most being applicable for both gas and coal feedstocks. Added to this choice of technology is the question of whether an existing plant should be retrofitted for capture, or whether it is more attractive to build entirely anew. This miscellany of choices adds considerably to the commercial risk of investing in a large power station. An intermediate stage between the non-capture and full-capture states would be advantageous in helping to determine the best way forward and hence reduce those risks. In recent years the term 'carbon capture ready' or 'capture ready' has been coined to describe such an intermediate-stage plant and is now widely used. However, a detailed and all-encompassing definition of this term has never been published. All fossil fuel consuming plant produces a carbon dioxide gas byproduct, and there is the possibility of scrubbing it with an appropriate CO2 solvent. Hence it could be said that all fossil fuel plant is in a condition for removal of its CO2 effluent and therefore already in a 'capture ready' state. Evidently, the practical reality is that solvent scrubbing could cost more than the rewards offered by schemes such as the ETS (European Trading Scheme), in which case it can be said that although the possibility of capturing CO2 exists, it is not a commercially viable option and therefore the plant could not be described as ready for CO2 capture. The boundary between a capture ready and a non-capture ready condition using this definition cannot be determined in an objective and therefore universally acceptable way, and criteria must be found which are less onerous and less potentially contentious to assess. 16 refs., 2 annexes.

  10. Benefits and Challenges of the Adoption of Cloud Computing in Business

    OpenAIRE

    Colin Ting Si Xue; Felicia Tiong Wee Xin

    2016-01-01

    Business losses and economic downturns occur almost every day, so technology is needed in every organization. Cloud computing has played a major role in solving the inefficiency problems of organizations and in increasing the growth of business, thus helping organizations to stay competitive. It is required to improve and automate the traditional ways of doing business. Cloud computing has been considered as an innovative way to improve business. Overall, cloud computing enables the org...

  11. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    OpenAIRE

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-01-01

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients’ electrocardiograms (ECG), echocardiograms (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better stora...

  12. A REVIEW ON SECURITY ISSUES AND CHALLENGES IN CLOUD COMPUTING MODEL OF RESOURCE MANAGEMENT

    OpenAIRE

    T. Vaikunth Pai; Dr. P. S. Aithal

    2017-01-01

    Cloud computing services refer to a set of IT-enabled services delivered to a customer as services over the Internet on a leased basis, with the capability to scale service requirements up or down according to need. Usually, cloud computing services are delivered by third-party vendors who own the infrastructure. The advantages include scalability, elasticity, flexibility, efficiency and the outsourcing of an organization's non-core activities. Cloud computing offers an innovative busines...

  13. Increasing high school girls' exposure to computing activities with e-textiles: challenges and lessons learned

    DEFF Research Database (Denmark)

    Borsotti, Valeria

    2017-01-01

    The number of female students in computer science degrees has been rapidly declining in Denmark in the past 40 years, as in many other European and North-American countries. The main reasons behind this phenomenon are widespread gender stereotypes about who is best suited to pursue a career in CS; stereotypes about computing as a ‘male’ domain; widespread lack of pre-college CS education and perceptions of computing as not socially relevant. STEAM activities have often been used to bridge the gender gap and to broaden the appeal of computing among children and youth. This contribution examines a STEAM…

  14. Academic Training: QCD: are we ready for the LHC

    CERN Multimedia

    2006-01-01

    2006-2007 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 4, 5, 6, 7 December, from 11:00 to 12:00 4, 5, 6 December - Main Auditorium, bldg. 500, 7 December - TH Auditorium, bldg. 4 - 3-006 QCD: are we ready for the LHC S. FRIXIONE / INFN, Genoa, Italy The LHC energy regime poses a serious challenge to our capability of predicting QCD reactions to the level of accuracy necessary for a successful programme of searches for physics beyond the Standard Model. In these lectures, I'll introduce basic concepts in QCD, and present techniques based on perturbation theory, such as fixed-order and resummed computations, and Monte Carlo simulations. I'll discuss applications of these techniques to hadron-hadron processes, concentrating on recent trends in perturbative QCD aimed at improving our understanding of LHC phenomenology.
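
    As a flavour of the fixed-order perturbative machinery mentioned in this announcement, the one-loop running of the strong coupling is easy to evaluate numerically. This is a standard textbook formula, not material from the lectures themselves; nf = 5 and alpha_s(MZ) = 0.118 are assumed inputs.

```python
# One-loop running of the strong coupling:
#   alpha_s(Q) = alpha_s(MZ) / (1 + b0 * alpha_s(MZ) * ln(Q^2 / MZ^2)),
# with b0 = (33 - 2*nf) / (12*pi).
import math

MZ = 91.1876          # Z boson mass in GeV
ALPHA_S_MZ = 0.118    # assumed reference value of alpha_s at the Z mass

def alpha_s_one_loop(q_gev: float, nf: int = 5) -> float:
    b0 = (33 - 2 * nf) / (12 * math.pi)
    return ALPHA_S_MZ / (1 + b0 * ALPHA_S_MZ * math.log(q_gev**2 / MZ**2))

for q in (10.0, 91.1876, 1000.0, 7000.0):   # up to LHC-like scales
    print(f"alpha_s({q:7.1f} GeV) = {alpha_s_one_loop(q):.4f}")
# The coupling shrinks logarithmically with scale, which is why
# perturbation theory is usable at LHC energies at all.
```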

  15. Service-oriented computing : State of the art and research challenges

    NARCIS (Netherlands)

    Papazoglou, Michael P.; Traverso, Paolo; Dustdar, Schahram; Leymann, Frank

    2007-01-01

    Service-oriented computing promotes the idea of assembling application components into a network of services that can be loosely coupled to create flexible, dynamic business processes and agile applications that span organizations and computing platforms. An SOC research road map provides a context

  16. 3 Ways that Web-Based Computing Will Change Colleges--And Challenge Them

    Science.gov (United States)

    Young, Jeffrey R.

    2008-01-01

    Cloud computing, one of the latest technology buzzwords, is so hard to explain that Google drove a bus from campus to campus to walk students through the company's vision of it. After students sat through a demo at computers set up nearby, they boarded the bus and got free T-shirts. The bus only stopped at colleges that had already agreed to hand…

  17. Perspectives on Games, Computers, and Mental Health: Questions about Paradoxes, Evidences, and Challenges

    OpenAIRE

    Desseilles, Martin

    2016-01-01

    In the field of mental health, games and computerized games present questions about paradoxes, evidences, and challenges. This perspective article offers perspectives and personal opinion about these questions, evidences, and challenges with an objective of presenting several ideas and issues in this rapidly developing field. First, games raise some questions in the sense of the paradox between a game and an issue, as well as the paradox of using an amusing game to treat a serious pathology. ...

  18. School Readiness Factor Analyzed.

    Science.gov (United States)

    Brenner, Anton; Scott, Leland H.

    This paper is an empirical statistical analysis and interpretation of data relating to school readiness previously examined and reported on a theoretical basis. A total of 118 white, middle class children from six consecutive kindergarten groups in Dearborn, Michigan were tested with seven instruments, evaluated in terms of achievement, ability,…

  19. Computer-aided detection systems to improve lung cancer early diagnosis: state-of-the-art and challenges

    International Nuclear Information System (INIS)

    Traverso, A; Lopez Torres, E; Cerello, P; Fantacci, M E

    2017-01-01

    Lung cancer is one of the most lethal types of cancer because its early diagnosis is still not good enough. In fact, the detection of pulmonary nodules, potential lung cancers, in Computed Tomography scans is a very challenging and time-consuming task for radiologists. To support radiologists, researchers have developed Computer-Aided Diagnosis (CAD) systems for the automated detection of pulmonary nodules in chest Computed Tomography scans. Despite the high level of technological development and the proven benefits on overall detection performance, the use of Computer-Aided Diagnosis in clinical practice is far from being a common procedure. In this paper we investigate the causes underlying this discrepancy and present a solution to tackle it: the M5L WEB- and Cloud-based on-demand Computer-Aided Diagnosis. In addition, we show how the combination of traditional image processing techniques with state-of-the-art advanced classification algorithms allows one to build a system whose performance could be much better than that of any Computer-Aided Diagnosis developed so far. This outcome opens the possibility of using the CAD as clinical decision support for radiologists. (paper)

  20. Computer-aided detection systems to improve lung cancer early diagnosis: state-of-the-art and challenges

    Science.gov (United States)

    Traverso, A.; Lopez Torres, E.; Fantacci, M. E.; Cerello, P.

    2017-05-01

    Lung cancer is one of the most lethal types of cancer because its early diagnosis is still not good enough. In fact, the detection of pulmonary nodules, potential lung cancers, in Computed Tomography scans is a very challenging and time-consuming task for radiologists. To support radiologists, researchers have developed Computer-Aided Diagnosis (CAD) systems for the automated detection of pulmonary nodules in chest Computed Tomography scans. Despite the high level of technological development and the proven benefits on overall detection performance, the use of Computer-Aided Diagnosis in clinical practice is far from being a common procedure. In this paper we investigate the causes underlying this discrepancy and present a solution to tackle it: the M5L WEB- and Cloud-based on-demand Computer-Aided Diagnosis. In addition, we show how the combination of traditional image processing techniques with state-of-the-art advanced classification algorithms allows one to build a system whose performance could be much better than that of any Computer-Aided Diagnosis developed so far. This outcome opens the possibility of using the CAD as clinical decision support for radiologists.

  1. Organizational readiness in specialty mental health care.

    Science.gov (United States)

    Hamilton, Alison B; Cohen, Amy N; Young, Alexander S

    2010-01-01

    Implementing quality improvement efforts in clinics is challenging. Assessment of organizational "readiness" for change can set the stage for implementation by providing information regarding existing strengths and deficiencies, thereby increasing the chance of a successful improvement effort. This paper discusses organizational assessment in specialty mental health, in preparation for improving care for individuals with schizophrenia. To assess organizational readiness for change in specialty mental health in order to facilitate locally tailored implementation strategies. EQUIP-2 is a site-level controlled trial at nine VA medical centers (four intervention, five control). Providers at all sites completed an organizational readiness for change (ORC) measure, and key stakeholders at the intervention sites completed a semi-structured interview at baseline. At the four intervention sites, 16 administrators and 43 clinical staff completed the ORC, and 38 key stakeholders were interviewed. The readiness domains of training needs, communication, and change were the domains with lower mean scores (i.e., potential deficiencies) ranging from a low of 23.8 to a high of 36.2 on a scale of 10-50, while staff attributes of growth and adaptability had higher mean scores (i.e., potential strengths) ranging from a low of 35.4 to a high of 41.1. Semi-structured interviews revealed that staff perceptions and experiences of change and decision-making are affected by larger structural factors such as change mandates from VA headquarters. Motivation for change, organizational climate, staff perceptions and beliefs, and prior experience with change efforts contribute to readiness for change in specialty mental health. Sites with less readiness for change may require more flexibility in the implementation of a quality improvement intervention. We suggest that uptake of evidence-based practices can be enhanced by tailoring implementation efforts to the strengths and deficiencies of the…

  2. Challenge for knowledge information processing systems (preliminary report on Fifth Generation Computer Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Moto-oka, T

    1982-01-01

    The author explains the reasons, aims and strategies for the Fifth Generation Computer Project in Japan. The project aims to introduce a radical new breed of computer by 1990. This article outlines the economic and social reasons for the project. It describes the impacts and effects that these computers are expected to have. The areas of technology which will form the contents of the research and development are highlighted. These are areas such as VLSI technology, speech and image understanding systems, artificial intelligence and advanced architecture design. Finally a schedule for completion of research is given which aims for a completed project by 1990.

  3. The ontogeny of great ape gesture - not a simple story. Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    Science.gov (United States)

    Liebal, Katja

    2016-03-01

    Although there is an increasing number of studies investigating gestural communication in primates other than humans in both natural and captive settings [1], very little is known about how they acquire their gestures. Different mechanisms have been proposed, including genetic transmission [2], social learning [3], or ontogenetic ritualization [4]. This latter mechanism is central to Arbib's paper [5], because he uses dyadic brain modeling - that is, "modeling the brains of two creatures as they interact with each other, so that the action of one affects the perception of the other and so the cycle of interactions continues, with both brains changing in the process" - to explain how gestures might emerge in ontogeny from previously non-communicative behaviors over the course of repeated and increasingly abbreviated and thus ritualized interactions. The aim of my comment is to discuss the current evidence from primate gesture research with regard to the different mechanisms proposed for gesture acquisition and how this might confirm or challenge Arbib's approach.

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  5. Toward the Language-Ready Brain: Biological Evolution and Primate Comparisons.

    Science.gov (United States)

    Arbib, Michael A

    2017-02-01

    The approach to language evolution suggested here focuses on three questions: How did the human brain evolve so that humans can develop, use, and acquire languages? How can the evolutionary quest be informed by studying brain, behavior, and social interaction in monkeys, apes, and humans? How can computational modeling advance these studies? I hypothesize that the brain is language ready in that the earliest humans had protolanguages but not languages (i.e., communication systems endowed with rich and open-ended lexicons and grammars supporting a compositional semantics), and that it took cultural evolution to yield societies (a cultural constructed niche) in which language-ready brains could become language-using brains. The mirror system hypothesis is a well-developed example of this approach, but I offer it here not as a closed theory but as an evolving framework for the development and analysis of conflicting subhypotheses in the hope of their eventual integration. I also stress that computational modeling helps us understand the evolving role of mirror neurons, not in and of themselves, but only in their interaction with systems "beyond the mirror." Because a theory of evolution needs a clear characterization of what it is that evolved, I also outline ideas for research in neurolinguistics to complement studies of the evolution of the language-ready brain. A clear challenge is to go beyond models of speech comprehension to include sign language and models of production, and to link language to visuomotor interaction with the physical and social world.

  6. Cloud Computing and its Challenges and Benefits in the Bank System

    Directory of Open Access Journals (Sweden)

    Bogdan NEDELCU

    2015-07-01

    The purpose of this article is to highlight the current situation of Cloud Computing systems. There is a tendency for enterprises and banks to seek such databases, so the article tries to answer the question: "Is Cloud Computing safe?" Answering this question requires an analysis of the security system (strengths and weaknesses), accompanied by arguments for and against this trend and suggestions for improvement that can increase customers' confidence in the future.

  7. Qualitative Computing and Qualitative Research: Addressing the Challenges of Technology and Globalization

    Directory of Open Access Journals (Sweden)

    César A. Cisneros Puebla

    2012-05-01

    Qualitative computing has been part of our lives for thirty years. Today, we urgently call for an evaluation of its international impact on qualitative research. Evaluating the international impact of qualitative research and qualitative computing requires a consideration of the vast amount of qualitative research over the last decades, as well as thoughtfulness about the uneven and unequal way in which qualitative research and qualitative computing are present in different fields of study and geographical regions. To understand the international impact of qualitative computing requires evaluation of the digital divide and the huge differences between center and peripheries. The international impact of qualitative research, and, in particular qualitative computing, is the question at the heart of this array of selected papers from the "Qualitative Computing: Diverse Worlds and Research Practices Conference." In this article, we introduce the reader to the goals, motivation, and atmosphere at the conference, taking place in Istanbul, Turkey, in 2011. The dialogue generated there is still in the air, and this introduction is a call to spread that voice. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1202285

  8. Ready for the plunge!

    CERN Multimedia

    2007-01-01

    [Image: Herman Ten Kate, Project Leader for the ATLAS magnet system, standing in front of the truck transporting the magnet across the Route de Meyrin.] Every time any part of the ATLAS detector is moved, it’s quite a spectacle! On Tuesday 29 May, the first end-cap of the ATLAS toroid magnet left Building 180, bound for Point 1. The 240-ton behemoth covered the two short kilometres in no less than five hours. Traffic was interrupted on the Route de Meyrin while the exceptional load was wheeled to its final destination. One of the technical challenges was to keep the magnet horizontal throughout the operation and, to achieve this, computers permanently monitored the magnet’s angles of displacement and hydraulic jacks rectified any tilt. But the most hazardous part of the operation remains the 80-m plunge into the ATLAS cavern.

  9. An EGS4-ready tomographic computational model of a 14-year-old female torso for calculating organ doses from CT examinations

    International Nuclear Information System (INIS)

    Caon, M.; School of Physics and Electronic Systems Engineering, University of South Australia, The Levels Campus, Mawson Lakes, South Australia, 5095; Pattison, J.

    1999-01-01

    Fifty-four consecutive CT scans have been used to construct a tomographic computational model of a 14-year-old female torso suitable for the determination of organ doses from CT. The model, known as ADELAIDE, is in the form of an input file compatible with user codes based on XYZDOS.MOR from the readily available EGS4 Monte Carlo radiation transport code. ADELAIDE's dimensions are close to the Australian averages for her age, so the model is representative of a 14-year-old girl. The realistic anatomy in the model differs considerably from that in Cristy's 15-year-old mathematical computational model by having realistically shaped organs that are appropriately located within a real external contour. Average absorbed dose to organs from simulated CT examinations of the chest and abdomen has been calculated for ADELAIDE using EGS4 within a geometry specific to the General Electric Hi-Speed Advantage CT scanner and using an x-ray spectrum calculated using data from the scanner's x-ray tube. The simulations include the scanner's beam shaping filter and patient table. It is suggested that the resulting values have fewer possible sources of uncertainty than organ doses derived from dose coefficients calculated for a MIRD style model with mathematical anatomy and a spectrum that may not match that of the scanner. The organ doses were normalized using the scanner's CTDI measured free-in-air and an EGS4 simulation of the CTDI measurement. Effective dose to the torso from 26-slice chest and 24-slice abdomen examinations (at 120 kV, 200 mAs, 7 mm slices) is 4.6±0.1 mSv and 4.3±0.1 mSv respectively. (author)
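
    The CTDI-based normalisation described in this record amounts to a single scaling step: the Monte Carlo organ dose is divided by the simulated CTDI and multiplied by the CTDI measured on the real scanner. A minimal sketch, assuming the Monte Carlo code reports doses per source particle; all names and numbers below are illustrative, not values from the paper.

```python
# Sketch of the CTDI normalisation step: scale a simulated organ dose
# (per source particle) to an absolute dose using the free-in-air CTDI
# measured on the real scanner and the same quantity from the simulation.

def absolute_organ_dose(organ_dose_mc: float,
                        ctdi_mc: float,
                        ctdi_measured: float) -> float:
    """Absolute organ dose = (MC organ dose / MC CTDI) * measured CTDI."""
    return organ_dose_mc / ctdi_mc * ctdi_measured

# Illustrative values only (not from the study):
dose_per_particle = 3.0e-15   # Gy per source particle to some organ (MC)
ctdi_per_particle = 1.1e-13   # Gy per source particle, simulated CTDI
ctdi_free_in_air = 0.018      # Gy, measured free-in-air CTDI for the protocol

dose = absolute_organ_dose(dose_per_particle, ctdi_per_particle, ctdi_free_in_air)
print(f"absolute organ dose: {dose:.2e} Gy")
```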

  10. Perspectives on Games, Computers, and Mental Health: Questions about Paradoxes, Evidences, and Challenges.

    Science.gov (United States)

    Desseilles, Martin

    2016-01-01

    In the field of mental health, games and computerized games present questions about paradoxes, evidences, and challenges. This perspective article offers perspectives and personal opinion about these questions, evidences, and challenges with an objective of presenting several ideas and issues in this rapidly developing field. First, games raise some questions in the sense of the paradox between a game and an issue, as well as the paradox of using an amusing game to treat a serious pathology. Second, games also present evidence in the sense that they involve relationships with others, as well as learning, communication, language, emotional regulation, and hedonism. Third, games present challenges, such as the risk of abuse, the critical temporal period that may be limited to childhood, their important influence on sociocognitive learning and the establishment of social norms, and the risk of misuse of games.

  11. Exploring Cloud Computing Tools to Enhance Team-Based Problem Solving for Challenging Behavior

    Science.gov (United States)

    Johnson, LeAnne D.

    2017-01-01

    Data-driven decision making is central to improving success of children. Actualizing the use of data is challenging when addressing the social, emotional, and behavioral needs of children across different types of early childhood programs (i.e., early childhood special education, early childhood family education, Head Start, and childcare).…

  12. [Facing the challenges of ubiquitous computing in the health care sector].

    Science.gov (United States)

    Georgieff, Peter; Friedewald, Michael

    2010-01-01

    The steady progress of microelectronics, communications and information technology will enable the realisation of the vision for "ubiquitous computing", where the Internet extends into the real world embracing everyday objects. The necessary technical basis is already in place. Due to their diminishing size, constantly falling price and declining energy consumption, processors, communications modules and sensors are being increasingly integrated into everyday objects today. This development is opening up huge opportunities for both the economy and individuals. In the present paper we discuss possible applications, but also technical, social and economic barriers to a wide-spread use of ubiquitous computing in the health care sector.

  13. Benefits and Challenges in Using Computers and the Internet with Adult English Learners.

    Science.gov (United States)

    Terrill, Lynda

    Although resources and training vary from program to program, adult English as a Second or Other Language (ESOL) teachers and English learners across the country are integrating computers and Internet use with ESOL instruction. This can be seen in the growing number of ESOL resources available on the World Wide Web. There are very good reasons for…

  14. Expanding Computer Science Education in Schools: Understanding Teacher Experiences and Challenges

    Science.gov (United States)

    Yadav, Aman; Gretter, Sarah; Hambrusch, Susanne; Sands, Phil

    2017-01-01

    The increased push for teaching computer science (CS) in schools in the United States requires training a large number of new K-12 teachers. The current efforts to increase the number of CS teachers have predominantly focused on training teachers from other content areas. In order to support these beginning CS teachers, we need to better…

  15. Exploring the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms: A Formative Analysis

    Science.gov (United States)

    Kay, Robin H.; Lauricella, Sharon

    2011-01-01

    Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess…

  16. Review of Affective Computing in Education/Learning: Trends and Challenges

    Science.gov (United States)

    Wu, Chih-Hung; Huang, Yueh-Min; Hwang, Jan-Pan

    2016-01-01

    Affect can significantly influence education/learning. Thus, understanding a learner's affect throughout the learning process is crucial for understanding motivation. In conventional education/learning research, learner motivation can be known through postevent self-reported questionnaires. With the advance of affective computing technology,…

  17. The challenge associated with the robust computation of meteor velocities from video and photographic records

    Science.gov (United States)

    Egal, A.; Gural, P. S.; Vaubaillon, J.; Colas, F.; Thuillot, W.

    2017-09-01

    The CABERNET project was designed to push the limits for obtaining accurate measurements of meteoroid orbits from photographic and video meteor camera recordings. The discrepancy between the measured and theoretical orbits of these objects depends heavily on the semi-major axis determination, and thus on the reliability of the pre-atmospheric velocity computation. With a spatial resolution of 0.01° per pixel and a temporal resolution of up to 10 ms, CABERNET should be able to provide accurate measurements of velocities and trajectories of meteors. To achieve this, it is necessary to improve the precision of the data reduction processes, and especially the determination of the meteor's velocity. In this work, most of the steps of the velocity computation are thoroughly investigated in order to reduce the uncertainties and error contributions at each stage of the reduction process. The accuracy of the measurement of meteor centroids is established and results in a precision of 0.09 pixels for CABERNET, which corresponds to 3.24 arcsec. Several methods to compute the velocity were investigated, based on the trajectory determination algorithms described in Ceplecha (1987) and Borovicka (1990), as well as the multi-parameter fitting (MPF) method proposed by Gural (2012). In the case of the MPF, many optimization methods were implemented in order to find the most efficient and robust technique to solve the minimization problem. The entire data reduction process is assessed using simulated meteors with different geometrical configurations and deceleration behaviors. It is shown that the multi-parameter fitting method proposed by Gural (2012) is the most accurate method to compute the pre-atmospheric velocity in all circumstances. Many techniques that assume constant velocity at the beginning of the path, as derived from the trajectory determination using Ceplecha (1987) or Borovicka (1990), can lead to large errors for decelerating meteors. The MPF technique also allows one to…
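
    The multi-parameter fitting approach favoured by this study can be illustrated with a toy fit: instead of differencing noisy centroids, a parametric propagation model is fitted to the whole trail at once. The exponential deceleration form below is one common parameterisation in the spirit of Gural (2012); the data, noise level and starting values are synthetic, not taken from CABERNET.

```python
# Toy MPF-style fit: recover the pre-atmospheric velocity v0 by fitting a
# decelerating propagation model to distances measured along the trail.
import numpy as np
from scipy.optimize import curve_fit

def mpf_exponential(t, d0, v0, a1, a2):
    """Distance along the trail: offset d0, constant velocity v0, minus an
    exponentially growing deceleration term |a1| * exp(a2 * t)."""
    return d0 + v0 * t - np.abs(a1) * np.exp(a2 * t)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 0.5, 26)                       # 26 frames over 0.5 s
true = mpf_exponential(t, 0.0, 42.0, 0.005, 9.0)    # km, v0 = 42 km/s
d_meas = true + rng.normal(0.0, 0.02, t.size)       # ~20 m of centroid noise

popt, _ = curve_fit(mpf_exponential, t, d_meas,
                    p0=[0.0, 40.0, 0.01, 8.0], maxfev=10000)
print(f"fitted pre-atmospheric velocity: {popt[1]:.2f} km/s (true: 42.00)")
# Differencing consecutive noisy centroids would amplify the noise; fitting
# the whole trail uses every frame to constrain v0.
```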

  18. Managing Military Readiness

    Science.gov (United States)

    2017-02-01

    These metrics contain critical information and have their place in readiness management. However, they have never been sufficient to fully… demand signals along with simultaneity assumptions form the essence of the operational requirements in national strategy. This section briefly… places demands on the capability and capacity of the Air Force that consume its resources in today's fight and exceed our capacity to address…

  19. K-Reactor readiness

    International Nuclear Information System (INIS)

    Rice, P.D.

    1991-01-01

    This document describes some of the more significant accomplishments in the reactor restart program and details the magnitude and extent of the work completed to bring K-Reactor to a state of restart readiness. The discussion of restart achievements is organized into the three major categories of personnel, programs, and plant. Also presented is information on the scope and extent of internal and external oversight of the efforts, as well as some details on the startup plan

  20. Computer-aided diagnosis in radiological imaging: current status and future challenges

    Science.gov (United States)

    Doi, Kunio

    2009-10-01

    Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. Many different types of CAD schemes are being developed for detection and/or characterization of various lesions in medical imaging, including conventional projection radiography, CT, MRI, and ultrasound imaging. Commercial systems for detection of breast lesions on mammograms have been developed and have received FDA approval for clinical use. CAD may be defined as a diagnosis made by a physician who takes into account the computer output as a "second opinion". The purpose of CAD is to improve the quality and productivity of physicians in their interpretation of radiologic images. The quality of their work can be improved in terms of the accuracy and consistency of their radiologic diagnoses. In addition, the productivity of radiologists is expected to be improved by a reduction in the time required for their image readings. The computer output is derived from quantitative analysis of radiologic images by use of various methods and techniques in computer vision, artificial intelligence, and artificial neural networks (ANNs). The computer output may indicate a number of important parameters, for example, the locations of potential lesions such as lung cancer and breast cancer, the likelihood of malignancy of detected lesions, and the likelihood of various diseases based on differential diagnosis in a given image and clinical parameters. In this review article, the basic concept of CAD is first defined, and the current status of CAD research is then described. In addition, the potential of CAD in the future is discussed and predicted.

  1. Organisational readiness for introducing a performance management system

    Directory of Open Access Journals (Sweden)

    Michael Ochurub

    2012-09-01

    Orientation: The successful introduction of performance management systems to the public service requires careful measurement of readiness for change. Research purpose: This study investigated the extent to which employees were ready for change as an indication of whether their organisation was ready to introduce a performance management system (PMS). Motivation for the study: Introducing system changes in organisations depends on positive employee preconditions. There is some debate over whether organisations can facilitate these preconditions. This research investigates change readiness linked to the introduction of a PMS in a public sector organisation. The results add to the growing literature on levels of change readiness. Research design, approach and method: The researchers used a quantitative, questionnaire-based design. Because the organisation was large, the researchers used stratified sampling to select a sample from each population stratum. The sample size was 460, which constituted 26% of the total population. They used a South African change readiness questionnaire to elicit employee perceptions and opinions. Main findings: The researchers found that the organisation was not ready to introduce a PMS. The study identified various challenges and key factors that were negatively affecting the introduction of a PMS. Practical/managerial implications: The intention to develop and introduce performance management systems is generally to change the attitudes, values and approaches of managers and employees to the new strategies, processes and plans to improve productivity and performance. However, pre-existing conditions and attitudes could have an effect. It is essential to ensure that organisations are ready to introduce performance management systems and to provide sound change leadership to drive the process effectively. This study contributes to the body of knowledge about the challenges and factors organisations should consider when they…

  2. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  3. Computational fragment-based screening using RosettaLigand: the SAMPL3 challenge

    Science.gov (United States)

    Kumar, Ashutosh; Zhang, Kam Y. J.

    2012-05-01

    The SAMPL3 fragment-based virtual screening challenge provides a valuable opportunity for researchers to test their programs, methods and screening protocols in a blind testing environment. We participated in the SAMPL3 challenge and evaluated our virtual fragment screening protocol, which involves RosettaLigand as the core component, by screening a 500-fragment Maybridge library against bovine pancreatic trypsin. Our study reaffirmed that the real test for any virtual screening approach would be in a blind testing environment. The analyses presented in this paper also showed that virtual screening performance can be improved if a set of known active compounds is available and parameters and methods that yield better enrichment are selected. Our study also highlighted that to achieve accurate orientation and conformation of ligands within a binding site, selecting an appropriate method to calculate partial charges is important. Another finding is that using multiple receptor ensembles in docking does not always yield better enrichment than individual receptors. On the basis of our results and retrospective analyses from the SAMPL3 fragment screening challenge, we anticipate that the chances of success in a fragment screening process could be increased significantly with careful selection of receptor structures, protein flexibility, sufficient conformational sampling within the binding pocket and accurate assignment of ligand and protein partial charges.
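
    The enrichment figure of merit referred to in this abstract is simple to compute from a ranked screen: how over-represented known actives are in the top x% of the scored library. A minimal sketch; the ranking and the set of actives are invented, and only the 500-compound library size mirrors the Maybridge set.

```python
# Enrichment factor for a ranked virtual screen:
#   EF(x) = (hit rate in the top x% of the ranking) / (hit rate overall)

def enrichment_factor(ranked_ids, actives, fraction=0.05):
    """EF at the given top fraction of a score-sorted compound list."""
    n_top = max(1, int(len(ranked_ids) * fraction))
    top_hits = sum(1 for cid in ranked_ids[:n_top] if cid in actives)
    hit_rate_top = top_hits / n_top
    hit_rate_all = len(actives) / len(ranked_ids)
    return hit_rate_top / hit_rate_all

# 500-compound library with 10 hypothetical actives:
library = [f"frag{i:03d}" for i in range(500)]
actives = {"frag007", "frag012", "frag033", "frag101", "frag150",
           "frag222", "frag303", "frag404", "frag411", "frag499"}
ranking = sorted(library)  # stand-in for a list sorted by docking score
print(f"EF(5%) = {enrichment_factor(ranking, actives):.1f}")  # EF = 1 is random
```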

  4. Towards large-scale data analysis: challenges in the design of portable systems and use of Cloud computing.

    Science.gov (United States)

    Diaz, Javier; Arrizabalaga, Saioa; Bustamante, Paul; Mesa, Iker; Añorga, Javier; Goya, Jon

    2013-01-01

    Portable systems and global communications open a broad spectrum for new health applications. In the framework of electrophysiological applications, several challenges are faced when developing portable systems embedded in Cloud computing services. In order to help new developers in this area based on our experience, five areas of interest are presented in this paper where strategies can be applied for improving the performance of portable systems: transducer and conditioning, processing, wireless communications, battery, and power management. Likewise, for Cloud services, scalability, portability, privacy and security guidelines have been highlighted.

  5. Solving algebraic computational problems in geodesy and geoinformatics the answer to modern challenges

    CERN Document Server

    Awange, Joseph L

    2004-01-01

    While preparing and teaching 'Introduction to Geodesy I and II' to undergraduate students at Stuttgart University, we noticed a gap which motivated the writing of the present book: Almost every topic that we taught required some skills in algebra, and in particular, computer algebra! From positioning to transformation problems inherent in geodesy and geoinformatics, knowledge of algebra and application of computer algebra software were required. In preparing this book therefore, we have attempted to put together basic concepts of abstract algebra which underpin the techniques for solving algebraic problems. Algebraic computational algorithms useful for solving problems which require exact solutions to nonlinear systems of equations are presented and tested on various problems. Though the present book focuses mainly on the two fields, the concepts and techniques presented herein are nonetheless applicable to other fields where algebraic computational problems might be encountered. In Engineering for example, network densification and robo...

  6. Computational Modeling of Cobalt-based Water Oxidation: Current Status and Future Challenges

    Science.gov (United States)

    Schilling, Mauro; Luber, Sandra

    2018-04-01

    A lot of effort is nowadays put into the development of novel water oxidation catalysts. In this context, mechanistic studies are crucial in order to elucidate the reaction mechanisms governing this complex process and to derive new design paradigms and strategies for improving the stability and efficiency of those catalysts. This review is focused on recent theoretical mechanistic studies in the field of homogeneous cobalt-based water oxidation catalysts. In the first part, computational methodologies and protocols are summarized and evaluated on the basis of their applicability towards real catalytic or smaller model systems, whereby special emphasis is laid on the choice of an appropriate model system. In the second part, an overview of mechanistic studies is presented, from which conceptual guidelines are drawn on how to approach novel studies of catalysts and how to further develop the field of computational modeling of water oxidation reactions.

  7. Computational Modeling of Cobalt-Based Water Oxidation: Current Status and Future Challenges

    Directory of Open Access Journals (Sweden)

    Mauro Schilling

    2018-04-01

    A lot of effort is nowadays put into the development of novel water oxidation catalysts. In this context, mechanistic studies are crucial in order to elucidate the reaction mechanisms governing this complex process and to derive new design paradigms and strategies for improving the stability and efficiency of those catalysts. This review is focused on recent theoretical mechanistic studies in the field of homogeneous cobalt-based water oxidation catalysts. In the first part, computational methodologies and protocols are summarized and evaluated on the basis of their applicability toward real catalytic or smaller model systems, whereby special emphasis is laid on the choice of an appropriate model system. In the second part, an overview of mechanistic studies is presented, from which conceptual guidelines are drawn on how to approach novel studies of catalysts and how to further develop the field of computational modeling of water oxidation reactions.

  8. Computational enzyme design approaches with significant biological outcomes: progress and challenges

    OpenAIRE

    Li, Xiaoman; Zhang, Ziding; Song, Jiangning

    2012-01-01

    Enzymes are powerful biocatalysts, however, so far there is still a large gap between the number of enzyme-based practical applications and that of naturally occurring enzymes. Multiple experimental approaches have been applied to generate nearly all possible mutations of target enzymes, allowing the identification of desirable variants with improved properties to meet the practical needs. Meanwhile, an increasing number of computational methods have been developed to assist in the modificati...

  9. Security in Cloud Computing For Service Delivery Models: Challenges and Solutions

    OpenAIRE

    Preeti Barrow; Runni Kumari; Prof. Manjula R

    2016-01-01

    Cloud computing, undoubtedly, is a path to expand the limits or add powerful capabilities on demand with almost no investment in new infrastructure, training of new staff, or licensing of new software. Though today everyone is talking about the cloud, organizations are still in a dilemma about whether it is safe to deploy their business on the cloud. The reason behind this is nothing but security. No cloud service provider provides 100% security assurance to its customers and therefore, businesses are h...

  10. Challenges in clinical applications of brain computer interfaces in individuals with spinal cord injury

    OpenAIRE

    Rupp, Rüdiger

    2014-01-01

    Brain computer interfaces (BCIs) are devices that measure brain activities and translate them into control signals used for a variety of applications. Among them are systems for communication, environmental control, neuroprostheses, exoskeletons, or restorative therapies. Over the last years the technology of BCIs has reached a level of matureness allowing them to be used not only in research experiments supervised by scientists, but also in clinical routine with patients with neurological im...

  11. The NASA Computational Fluid Dynamics (CFD) program - Building technology to solve future challenges

    Science.gov (United States)

    Richardson, Pamela F.; Dwoyer, Douglas L.; Kutler, Paul; Povinelli, Louis A.

    1993-01-01

    This paper presents the NASA Computational Fluid Dynamics program in terms of a strategic vision and goals as well as NASA's financial commitment and personnel levels. The paper also identifies the CFD program customers and the support to those customers. In addition, the paper discusses technical emphasis and direction of the program and some recent achievements. NASA's Ames, Langley, and Lewis Research Centers are the research hubs of the CFD program while the NASA Headquarters Office of Aeronautics represents and advocates the program.

  12. Combining Brain–Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges

    Science.gov (United States)

    Millán, J. d. R.; Rupp, R.; Müller-Putz, G. R.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kübler, A.; Leeb, R.; Neuper, C.; Müller, K.-R.; Mattia, D.

    2010-01-01

    In recent years, new research has brought the field of electroencephalogram (EEG)-based brain–computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user–machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human–computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices. PMID:20877434

  13. Computational challenges of large-scale, long-time, first-principles molecular dynamics

    International Nuclear Information System (INIS)

    Kent, P R C

    2008-01-01

    Plane wave density functional calculations have traditionally been able to use the largest available supercomputing resources. We analyze the scalability of modern projector-augmented wave implementations to identify the challenges in performing molecular dynamics calculations of large systems containing many thousands of electrons. Benchmark calculations on the Cray XT4 demonstrate that global linear-algebra operations are the primary reason for limited parallel scalability. Plane-wave related operations can be made sufficiently scalable. Improving parallel linear-algebra performance is an essential step to reaching longer timescales in future large-scale molecular dynamics calculations
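
    A generic cost model, not taken from the paper, makes the scaling argument above concrete: per-step time with a perfectly parallel compute part plus a logarithmically growing collective (as in a global sum inside the linear algebra) stops improving, and eventually worsens, beyond some processor count. All constants below are invented for illustration.

```python
# Why a global linear-algebra step can cap strong scaling: the 1/p compute
# term shrinks while an allreduce-style collective grows roughly like log2(p).
import math

def step_time(p: int, t_compute: float = 1000.0, t_collective: float = 2.0) -> float:
    """Seconds per MD step: perfectly parallel work plus a log2(p) collective."""
    return t_compute / p + t_collective * math.log2(p)

base = step_time(64)
for p in (64, 512, 4096, 32768):
    print(f"p={p:6d}  t={step_time(p):7.2f} s  "
          f"speedup vs p=64: {base / step_time(p):4.2f}x")
# With these (made-up) constants the runtime bottoms out and then rises
# again at large p, mirroring the collective-dominated regime in the paper.
```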

  14. Portable non-invasive brain-computer interface: challenges and opportunities of optical modalities

    Science.gov (United States)

    Scholl, Clara A.; Hendrickson, Scott M.; Swett, Bruce A.; Fitch, Michael J.; Walter, Erich C.; McLoughlin, Michael P.; Chevillet, Mark A.; Blodgett, David W.; Hwang, Grace M.

    2017-05-01

    The development of portable non-invasive brain computer interface technologies with higher spatio-temporal resolution has been motivated by the tremendous success seen with implanted devices. This talk will discuss efforts to overcome several major obstacles to viability including approaches that promise to improve spatial and temporal resolution. Optical approaches in particular will be highlighted and the potential benefits of both Blood-Oxygen Level Dependent (BOLD) and Fast Optical Signal (FOS) will be discussed. Early-stage research into the correlations between neural activity and FOS will be explored.

  15. An Australian Perspective On The Challenges For Computer And Network Security For Novice End-Users

    Directory of Open Access Journals (Sweden)

    Patryk Szewczyk

    2012-12-01

    It is common for end-users to have difficulty in using computer or network security appropriately, and they have thus often been ridiculed when misinterpreting instructions or procedures. This discussion paper details the outcomes of research undertaken over the past six years on why security is overly complex for end-users. The results indicate that multiple issues may render end-users vulnerable to security threats and that there is no single solution to address these problems. Studies on a small group of senior citizens have shown that educational seminars can be beneficial in ensuring that simple security aspects are understood and used appropriately.

  16. Speed challenge: a case for hardware implementation in soft-computing

    Science.gov (United States)

    Daud, T.; Stoica, A.; Duong, T.; Keymeulen, D.; Zebulum, R.; Thomas, T.; Thakoor, A.

    2000-01-01

    For over a decade, JPL has been actively involved in soft computing research on theory, architecture, applications, and electronics hardware. The driving force in all our research activities, in addition to the potential enabling technology promise, has been creation of a niche that imparts orders of magnitude speed advantage by implementation in parallel processing hardware with algorithms made especially suitable for hardware implementation. We review our work on neural networks, fuzzy logic, and evolvable hardware with selected application examples requiring real time response capabilities.

  17. THE CHALLENGE OF THE PERFORMANCE CONCEPT WITHIN THE SUSTAINABILITY AND COMPUTATIONAL DESIGN FIELD

    Directory of Open Access Journals (Sweden)

    Marcio Nisenbaum

    2017-11-01

    This paper discusses the notion of performance and its appropriation within the research fields related to sustainability and computational design, focusing on the design processes of the architectural and urban fields. Recently, terms such as “performance oriented design” or “performance driven architecture”, especially when related to sustainability, have been used by many authors and professionals as an attempt to engender project guidelines based on simulation processes and systematic use of digital tools. In this context, the notion of performance has basically been understood as the way in which an action is fulfilled, in line with contemporary discourses of efficiency and optimization – in this circumstance it is considered that a building or urban area “performs” if it fulfills certain objective sustainability evaluation criteria, reduced to mathematical parameters. This paper intends to broaden this understanding by exploring new theoretical interpretations, referring to etymological investigation, historical research, and literature review, based on authors from different areas and on the case study of the solar houses academic competition, Solar Decathlon. This initial analysis is expected to contribute to the emergence of new forms of interpretation of the performance concept, relativizing the notion of the “body” that “performs” in different manners, thus enhancing its appropriation and use within the fields of sustainability and computational design.

  18. The Computational Fluid Dynamics Rupture Challenge 2013--Phase II: Variability of Hemodynamic Simulations in Two Intracranial Aneurysms.

    Science.gov (United States)

    Berg, Philipp; Roloff, Christoph; Beuing, Oliver; Voss, Samuel; Sugiyama, Shin-Ichiro; Aristokleous, Nicolas; Anayiotos, Andreas S; Ashton, Neil; Revell, Alistair; Bressloff, Neil W; Brown, Alistair G; Chung, Bong Jae; Cebral, Juan R; Copelli, Gabriele; Fu, Wenyu; Qiao, Aike; Geers, Arjan J; Hodis, Simona; Dragomir-Daescu, Dan; Nordahl, Emily; Bora Suzen, Yildirim; Owais Khan, Muhammad; Valen-Sendstad, Kristian; Kono, Kenichi; Menon, Prahlad G; Albal, Priti G; Mierka, Otto; Münster, Raphael; Morales, Hernán G; Bonnefous, Odile; Osman, Jan; Goubergrits, Leonid; Pallares, Jordi; Cito, Salvatore; Passalacqua, Alberto; Piskin, Senol; Pekkan, Kerem; Ramalho, Susana; Marques, Nelson; Sanchi, Stéphane; Schumacher, Kristopher R; Sturgeon, Jess; Švihlová, Helena; Hron, Jaroslav; Usera, Gabriel; Mendina, Mariana; Xiang, Jianping; Meng, Hui; Steinman, David A; Janiga, Gábor

    2015-12-01

    With the increased availability of computational resources, the past decade has seen a rise in the use of computational fluid dynamics (CFD) for medical applications. In particular, there has been an increase in the application of CFD to the prediction of rupture in intracranial aneurysms; however, while many hemodynamic parameters can be obtained from these computations, no consistent methodology for rupture prediction has been identified to date. One particular challenge for CFD is that many factors contribute to its accuracy; mesh resolution and spatial/temporal discretization alone can produce considerable variation. This failure to identify the importance of these factors and to establish a methodology for rupture prediction has limited the acceptance of CFD among physicians. The International CFD Rupture Challenge 2013 seeks to comment on the sensitivity of these various CFD assumptions by comparing the rupture and blood-flow predictions of a wide range of independent participants utilizing a range of CFD approaches. Twenty-six groups from 15 countries took part in the challenge. Participants were provided with surface models of two intracranial aneurysms and asked to carry out the corresponding hemodynamics simulations, free to choose their own mesh, solver, and temporal discretization. They were requested to submit velocity and pressure predictions along the centerline and on specified planes. The first phase of the challenge, described in a separate paper, was aimed at predicting which of the two aneurysms had previously ruptured and where the rupture site was located. The second phase, described in this paper, aims to assess the variability of the solutions and their sensitivity to the modeling assumptions. Participants were free to choose boundary conditions in the first phase, whereas these were prescribed in the second phase, but all other CFD modeling parameters were not.
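
    One way to quantify the solution variability examined in the second phase is the pointwise coefficient of variation across the submitted centerline velocities. A minimal sketch of that calculation (the values below are hypothetical placeholders, not challenge data):

        import numpy as np

        # Hypothetical submissions: one row per participating group, one
        # column per centerline sample point (velocity magnitude, m/s).
        predictions = np.array([
            [0.42, 0.55, 0.61, 0.48],   # group 1
            [0.40, 0.57, 0.66, 0.45],   # group 2
            [0.45, 0.52, 0.58, 0.50],   # group 3
        ])

        mean = predictions.mean(axis=0)          # ensemble mean per point
        std = predictions.std(axis=0, ddof=1)    # inter-group spread
        cov = std / mean                         # coefficient of variation

        print("pointwise CoV:", np.round(cov, 3))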

  19. Photons, photosynthesis, and high-performance computing: challenges, progress, and promise of modeling metabolism in green algae

    International Nuclear Information System (INIS)

    Chang, C H; Graf, P; Alber, D M; Kim, K; Murray, G; Posewitz, M; Seibert, M

    2008-01-01

    The complexity of biological metabolism considered at a kinetic level presents a challenge to quantitative modeling. In particular, the relatively sparse knowledge of parameters for enzymes with known kinetic responses is problematic. The possible space of these parameters is high-dimensional, and sampling such a space typifies a combinatorial explosion of possible dynamic states. However, with sufficient quantitative transcriptomics, proteomics, and metabolomics data at hand, these challenges could be met by high-performance software with sampling, fitting, and optimization capabilities. With this in mind, we present the High-Performance Systems Biology Toolkit (HiPer SBTK), an evolving software package to simulate, fit, and optimize metabolite concentrations and fluxes within the space of rate and binding parameters associated with detailed enzyme kinetic models. We present our chosen modeling paradigm for the formulation of metabolic pathway models, the means to address the challenge of representing such models in a precise and persistent fashion using the standardized Systems Biology Markup Language, and our second-generation model of H2-associated Chlamydomonas metabolism. Processing of such models for hierarchically parallelized simulation and optimization, job specification by the user through a GUI interface, software capabilities and initial scaling data, and the mapping of the computation to biological questions are also discussed.
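
    At the heart of such detailed enzyme kinetic models are rate laws such as Michaelis-Menten kinetics, integrated as ODEs over metabolite concentrations. A minimal single-reaction sketch (parameter values are assumed for illustration and are not taken from the toolkit):

        from scipy.integrate import solve_ivp

        # Michaelis-Menten kinetics for a single enzymatic step S -> P.
        # Vmax and Km are assumed illustrative values; fitting such rate
        # and binding parameters to omics data is the task the toolkit
        # parallelizes over many reactions at once.
        Vmax, Km = 1.0, 0.5   # mM/s, mM

        def rhs(t, y):
            s, p = y
            v = Vmax * s / (Km + s)   # reaction rate at substrate level s
            return [-v, v]

        sol = solve_ivp(rhs, (0.0, 20.0), [2.0, 0.0])   # 20 s, 2 mM substrate
        print("final [S], [P] (mM):", sol.y[:, -1])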

  20. Optical computed tomography in PRESAGE® three-dimensional dosimetry: Challenges and prospective.

    Science.gov (United States)

    Khezerloo, Davood; Nedaie, Hassan Ali; Farhood, Bagher; Zirak, Alireza; Takavar, Abbas; Banaee, Nooshin; Ahmadalidokht, Isa; Kron, Tomas

    2017-01-01

    With the advent of new complex but precise radiotherapy techniques, the demand for an accurate, feasible three-dimensional (3D) dosimetry system has increased. A 3D dosimetry system should not only give accurate and precise results but should also be feasible, inexpensive, and not unduly time-consuming. Recently, one of the new candidates for 3D dosimetry is optical computed tomography (CT) with a radiochromic dosimeter such as PRESAGE®. Several generations of optical CT scanners have been developed since the 1990s. At the same time, considerable effort has gone into developing robust dosimeters that are compatible with optical CT scanners. In 2004, the PRESAGE® dosimeter was introduced as a new radiochromic solid plastic dosimeter. Over the past decade, a large number of efforts have been made to enhance optical scanning methods. This article attempts to review and reflect on the results of these investigations.
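
    The reconstruction step in optical CT is mathematically the same as in X-ray CT: projections of optical attenuation are inverted, for example by filtered back-projection. A minimal sketch using a synthetic phantom in place of a scanned dosimeter slice (assumes scikit-image is available; 'filter_name' follows recent versions of its API):

        import numpy as np
        from skimage.transform import radon, iradon

        # Synthetic optical-density phantom standing in for one slice of
        # a scanned PRESAGE® dosimeter; a real scanner measures the
        # projections directly.
        phantom = np.zeros((128, 128))
        phantom[48:80, 48:80] = 1.0

        theta = np.linspace(0.0, 180.0, 120, endpoint=False)
        sinogram = radon(phantom, theta=theta)                     # forward model
        recon = iradon(sinogram, theta=theta, filter_name='ramp')  # FBP

        print("max |recon - phantom|:", float(np.abs(recon - phantom).max()))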

  1. Optical computed tomography in PRESAGE® three-dimensional dosimetry: Challenges and prospective

    Directory of Open Access Journals (Sweden)

    Davood Khezerloo

    2017-01-01

    Full Text Available With the advent of new complex but precise radiotherapy techniques, the demand for an accurate, feasible three-dimensional (3D) dosimetry system has increased. A 3D dosimetry system should not only give accurate and precise results but should also be feasible, inexpensive, and not unduly time-consuming. Recently, one of the new candidates for 3D dosimetry is optical computed tomography (CT) with a radiochromic dosimeter such as PRESAGE®. Several generations of optical CT scanners have been developed since the 1990s. At the same time, considerable effort has gone into developing robust dosimeters that are compatible with optical CT scanners. In 2004, the PRESAGE® dosimeter was introduced as a new radiochromic solid plastic dosimeter. Over the past decade, a large number of efforts have been made to enhance optical scanning methods. This article attempts to review and reflect on the results of these investigations.

  2. Career Readiness: Has Its Time Finally Come?

    Science.gov (United States)

    DeWitt, Stephen

    2012-01-01

    In 2010, the Association for Career and Technical Education (ACTE) released a "What Is Career Ready?" definition. As the career-readiness definition explains, there is much overlap between "college readiness" and "career readiness," but academic preparedness for college alone is not enough to be truly career-ready.…

  3. A challenge for computing in the 21. century: Radwaste knowledge management

    International Nuclear Information System (INIS)

    Umeki, H.

    2007-01-01

    Integrated nuclear waste management, including waste disposal, is a technical area characterised by a breadth of required multidisciplinary knowledge that is wider than in almost any other industry - covering geology to radiation physics, materials science to microbiology, archaeology to engineering, public communication to advanced IT. It also has an unparalleled depth in time, in terms of project implementation (around 100 years - matched maybe by some medieval cathedrals) and the associated safety case (millions of years - longer than the existence of modern man). If anything, this is even more critical in Japan, which depends on a major nuclear power industry, has complex (and dynamic) geology, and has a policy of repository siting based on solicitation of volunteer municipalities. The technical challenges of Knowledge Management in such an area, which is suffering more than most from the information explosion caused by the exponentially increasing capacities of modern technology, are truly daunting. In order to take control of the situation, the main R and D organisation in this area (Japan Atomic Energy Agency; JAEA) is planning to develop a Knowledge Management System (KMS) that will ride the wave of cutting-edge technology in: database development and management; search engines; expert systems; management support systems; security and archiving. This initiative will be complemented by a major reassessment of the modelling approach to performance assessment of repositories and of the technology for communicating such complex issues to a wide range of different audiences. The requirements as specified go beyond anything doable with existing technology, so an initial goal will be to build up a team capable of anticipating the areas of active technology development that will provide the required tools - and then tailoring them to provide the infrastructure needed for these very ambitious projects. Although this particular application is of limited scope, the general

  4. PV ready roofing systems

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The integration of PV technology into the roofs of houses has become very popular in the United States, Japan, Germany and The Netherlands. There could be a considerable market in the UK for these systems, given the large number of houses that are projected to be built in the next 10 years, and taking account of increased awareness of energy issues. A significant proportion of annually installed PV capacity is in solar PV systems installed in homes (currently 15%); this is expected to rise to 23% (900 MW) by 2010. Grid-connected roof and building-mounted facade systems represent the fastest-growing market for PV systems in Europe. In conclusion, therefore, innovative approaches for fixing PV technology onto roofs have been identified for both domestic roofs and the commercial sector. With reference to production methodologies within the roofing industry, both approaches should be capable of being designed with PV-ready connections suitable for fixing PV modules at a later date. This will help overcome the key barriers of installation cost, skills required and the lack of retrofit potential. Based on the results of this project, Sustainable Energy together with PV Systems are keen to take forward the full research and development of PV-ready systems for both the domestic and commercial sectors.

  5. Challenging data and workload management in CMS Computing with network-aware systems

    Science.gov (United States)

    D, Bonacorsi; T, Wildish

    2014-06-01

    After a successful first run at the LHC, and during the Long Shutdown (LS1) of the accelerator, the workload and data management sectors of the CMS Computing Model are entering an operational review phase in order to concretely assess areas of possible improvement and paths to exploit new promising technology trends. In particular, since the preparation activities for the LHC start, the Networks have constantly been of paramount importance for the execution of CMS workflows, exceeding the original expectations - as from the MONARC model - in terms of performance, stability and reliability. The low-latency transfer of PetaBytes of CMS data among dozens of WLCG Tiers worldwide using the PhEDEx dataset replication system is an example of the importance of reliable Networks. Another example is the exploitation of WAN data access over data federations in CMS. A new emerging area of work is the exploitation of Intelligent Network Services, including bandwidth-on-demand concepts. In this paper, we will review the work done in CMS on this, and the next steps.

  6. Challenging data and workload management in CMS Computing with network-aware systems

    International Nuclear Information System (INIS)

    Bonacorsi D; Wildish T

    2014-01-01

    After a successful first run at the LHC, and during the Long Shutdown (LS1) of the accelerator, the workload and data management sectors of the CMS Computing Model are entering an operational review phase in order to concretely assess areas of possible improvement and paths to exploit new promising technology trends. In particular, since the preparation activities for the LHC start, the Networks have constantly been of paramount importance for the execution of CMS workflows, exceeding the original expectations - as from the MONARC model - in terms of performance, stability and reliability. The low-latency transfer of PetaBytes of CMS data among dozens of WLCG Tiers worldwide using the PhEDEx dataset replication system is an example of the importance of reliable Networks. Another example is the exploitation of WAN data access over data federations in CMS. A new emerging area of work is the exploitation of Intelligent Network Services, including bandwidth-on-demand concepts. In this paper, we will review the work done in CMS on this, and the next steps.

  7. Challenging data and workload management in CMS Computing with network-aware systems

    CERN Document Server

    Wildish, Anthony

    2014-01-01

    After a successful first run at the LHC, and during the Long Shutdown (LS1) of the accelerator, the workload and data management sectors of the CMS Computing Model are entering an operational review phase in order to concretely assess areas of possible improvement and paths to exploit new promising technology trends. In particular, since the preparation activities for the LHC start, the Networks have constantly been of paramount importance for the execution of CMS workflows, exceeding the original expectations - as from the MONARC model - in terms of performance, stability and reliability. The low-latency transfer of PetaBytes of CMS data among dozens of WLCG Tiers worldwide using the PhEDEx dataset replication system is an example of the importance of reliable Networks. Another example is the exploitation of WAN data access over data federations in CMS. A new emerging area of work is the exploitation of "Intelligent Network Services", including bandwidth-on-demand concepts. In this paper, we will review the work done in CMS on this, and the next steps.

  8. Challenging Data Management in CMS Computing with Network-aware Systems

    CERN Document Server

    Bonacorsi, Daniele

    2013-01-01

    After a successful first run at the LHC, and during the Long Shutdown (LS1) of the accelerator, the workload and data management sectors of the CMS Computing Model are entering an operational review phase in order to concretely assess areas of possible improvement and paths to exploit new promising technology trends. In particular, since the preparation activities for the LHC start, the Networks have constantly been of paramount importance for the execution of CMS workflows, exceeding the original expectations - as from the MONARC model - in terms of performance, stability and reliability. The low-latency transfer of PetaBytes of CMS data among dozens of WLCG Tiers worldwide using the PhEDEx dataset replication system is an example of the importance of reliable Networks. Another example is the exploitation of WAN data access over data federations in CMS. A new emerging area of work is the exploitation of "Intelligent Network Services", including bandwidth-on-demand concepts. In this paper, we will review the work done in CMS on this, and the next steps.

  9. Promises and challenges for the implementation of computational medical imaging (radiomics) in oncology.

    Science.gov (United States)

    Limkin, E J; Sun, R; Dercle, L; Zacharaki, E I; Robert, C; Reuzé, S; Schernberg, A; Paragios, N; Deutsch, E; Ferté, C

    2017-06-01

    Medical image processing and analysis (also known as radiomics) is a rapidly growing discipline that maps digital medical images into quantitative data, with the end goal of generating imaging biomarkers as decision-support tools for clinical practice. The use of imaging data from routine clinical work-up has tremendous potential for improving cancer care by heightening understanding of tumor biology and aiding in the implementation of precision medicine. As a noninvasive method of assessing the tumor and its microenvironment in their entirety, radiomics allows the evaluation and monitoring of tumor characteristics such as temporal and spatial heterogeneity. One can observe a rapid increase in the number of computational medical imaging publications, milestones that have highlighted the utility of imaging biomarkers in oncology. Nevertheless, the use of radiomics as a clinical biomarker still necessitates amelioration and standardization in order to achieve routine clinical adoption. This Review addresses the critical issues to ensure the proper development of radiomics as a biomarker and facilitate its implementation in clinical practice.
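
    A concrete, minimal example of the quantitative data radiomics extracts is the set of first-order (histogram) statistics over a segmented tumor region. A sketch with a synthetic region of interest standing in for real image data:

        import numpy as np
        from scipy.stats import kurtosis, skew

        def first_order_features(roi):
            # First-order radiomic features of a region of interest;
            # zero voxels are assumed to mark background.
            voxels = roi[roi > 0].astype(float)
            counts, _ = np.histogram(voxels, bins=32)
            p = counts / counts.sum()
            p = p[p > 0]
            return {
                "mean": voxels.mean(),
                "std": voxels.std(),
                "skewness": skew(voxels),
                "kurtosis": kurtosis(voxels),
                "entropy": -(p * np.log2(p)).sum(),
            }

        roi = np.random.default_rng(0).gamma(2.0, 2.0, size=(16, 16, 8))
        print(first_order_features(roi))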

  10. Fibromuscular dysplasia in living renal donors: Still a challenge to computed tomographic angiography

    International Nuclear Information System (INIS)

    Blondin, D.; Lanzman, R.; Schellhammer, F.; Oels, M.; Grotemeyer, D.; Baldus, S.E.; Rump, L.C.; Sandmann, W.; Voiculescu, A.

    2010-01-01

    Background: Computed tomographic angiography (CTA) has become the standard method for evaluating potential living renal donors in most centers. Although the incidence of fibromuscular dysplasia (FMD) is low (3.5-6%), this pathology may be relevant to the success of renal transplantation. The aims of this study were to determine the incidence of FMD in our population of living renal donors (LRD) and the reliability of CTA for detecting vascular pathology. Materials and methods: 101 living renal donors, examined between 7/2004 and 9/2008 by CTA, were included in a retrospective evaluation. The examinations were carried out using a 64-row multi-detector CT scanner (Siemens Medical Solutions, Erlangen). The presence or absence of the characteristic signs of fibromuscular dysplasia, such as a 'string-of-beads' appearance, focal stenosis or aneurysms, was assessed and graded from mild (=1) to severe (=3). Furthermore, vascular anatomy and arterial stenosis were investigated in this study. Retrospective analyses of CTA and ultrasound were compared with operative and histological reports. Results: Four cases of fibromuscular dysplasia (incidence 3.9%) in 101 renal donors were diagnosed by the transplanting surgeons and histopathology, respectively. Three cases could be detected by CTA. In one donor even retrospective analysis of CTA was negative. Ten accessory arteries, 14 venous anomalies and 12 renal artery stenoses due to atherosclerosis were diagnosed by CTA and could be confirmed by the operative reports. Conclusion: CTA is sufficient for the detection of hemodynamically relevant stenosis and vascular anatomy. Only one patient, with a mild form of FMD, was underestimated. Therefore, if CTA shows the slightest irregularities that are not typical of atherosclerotic lesions, further diagnostic work-up by DSA might still be necessary.

  11. Fibromuscular dysplasia in living renal donors: Still a challenge to computed tomographic angiography

    Energy Technology Data Exchange (ETDEWEB)

    Blondin, D., E-mail: blondin@med.uni-duesseldorf.d [Institute of Radiology, University Hospital Duesseldorf, Moorenstr. 5, D-40225 Duesseldorf (Germany); Lanzman, R.; Schellhammer, F. [Institute of Radiology, University Hospital Duesseldorf, Moorenstr. 5, D-40225 Duesseldorf (Germany); Oels, M. [Department of Nephrology (Germany); Grotemeyer, D. [Department of Vascular Surgery and Renal Transplantation (Germany); Baldus, S.E. [Institute of Pathology (Germany); Rump, L.C. [Department of Nephrology (Germany); Sandmann, W. [Department of Vascular Surgery and Renal Transplantation (Germany); Voiculescu, A. [Department of Nephrology (Germany)

    2010-07-15

    Background: Computed tomographic angiography (CTA) has become the standard method for evaluating potential living renal donors in most centers. Although the incidence of fibromuscular dysplasia (FMD) is low (3.5-6%), this pathology may be relevant to the success of renal transplantation. The aims of this study were to determine the incidence of FMD in our population of living renal donors (LRD) and the reliability of CTA for detecting vascular pathology. Materials and methods: 101 living renal donors, examined between 7/2004 and 9/2008 by CTA, were included in a retrospective evaluation. The examinations were carried out using a 64-row multi-detector CT scanner (Siemens Medical Solutions, Erlangen). The presence or absence of the characteristic signs of fibromuscular dysplasia, such as a 'string-of-beads' appearance, focal stenosis or aneurysms, was assessed and graded from mild (=1) to severe (=3). Furthermore, vascular anatomy and arterial stenosis were investigated in this study. Retrospective analyses of CTA and ultrasound were compared with operative and histological reports. Results: Four cases of fibromuscular dysplasia (incidence 3.9%) in 101 renal donors were diagnosed by the transplanting surgeons and histopathology, respectively. Three cases could be detected by CTA. In one donor even retrospective analysis of CTA was negative. Ten accessory arteries, 14 venous anomalies and 12 renal artery stenoses due to atherosclerosis were diagnosed by CTA and could be confirmed by the operative reports. Conclusion: CTA is sufficient for the detection of hemodynamically relevant stenosis and vascular anatomy. Only one patient, with a mild form of FMD, was underestimated. Therefore, if CTA shows the slightest irregularities that are not typical of atherosclerotic lesions, further diagnostic work-up by DSA might still be necessary.

  12. Nonaneurysmal "Pseudo-Subarachnoid Hemorrhage" Computed Tomography Patterns: Challenges in an Acute Decision-Making Heuristics.

    Science.gov (United States)

    Hasan, Tasneem F; Duarte, Walter; Akinduro, Oluwaseun O; Goldstein, Eric D; Hurst, Rebecca; Haranhalli, Neil; Miller, David A; Wharen, Robert E; Tawk, Rabih G; Freeman, William D

    2018-06-05

    Acute aneurysmal subarachnoid hemorrhage (SAH) is a medical and neurosurgical emergency caused by a ruptured brain aneurysm. Aneurysmal SAH is identified on brain computed tomography (CT) as increased density of the basal cisterns and subarachnoid spaces from acute blood products. An aneurysmal SAH-like pattern on CT can appear as an optical-illusion effect of hypodense brain parenchyma and/or hyperdense surrounding cerebral cisterns and blood vessels, termed "pseudo-subarachnoid hemorrhage" (pseudo-SAH). We reviewed clinical, laboratory, and radiographic data of all SAH diagnoses between January 2013 and January 2018, and found subsets of nonaneurysmal SAH originally suspected to be aneurysmal in origin. We searched the National Library of Medicine using the terms "subarachnoid hemorrhage," "pseudo," and "non-aneurysmal subarachnoid hemorrhage," singly and in combination, to understand the sensitivity, specificity, and precision of pseudo-SAH. Over 5 years, 230 SAH cases were referred to our tertiary academic center and only 7 (3%) met the definition of pseudo-SAH. Searching the National Library of Medicine using "subarachnoid hemorrhage" yielded 27,402 results. When "subarachnoid hemorrhage" and "pseudo" were combined, this yielded 70 results and sensitivity was 50% (n = 35). Similarly, search precision was relatively low (26%), as only 18 results fit a clinical description similar to the 7 cases discussed in our series. The aneurysmal SAH pattern on CT is distinct from nonaneurysmal and pseudo-SAH patterns. The origin of the pseudo-SAH terminology appears mostly tied to comatose cardiac-arrest patients with diffusely dark brain Hounsfield units and cerebral edema, and it is a potential imaging pitfall in acute medical decision-making.

  13. SAMPL4, a blind challenge for computational solvation free energies: the compounds considered

    Science.gov (United States)

    Guthrie, J. Peter

    2014-03-01

    For the fifth time I have provided a set of solvation energies (1 M gas to 1 M aqueous) for a SAMPL challenge. In this set there are 23 blind compounds and 30 supplementary compounds of related structure to one of the blind sets, but for which the solvation energy is readily available. The best current values for each compound are presented along with complete documentation of the experimental origins of the solvation energies. The calculations needed to go from reported data to solvation energies are presented, with particular attention to aspects which are new to this set. For some compounds the vapor pressures (VP) were reported for the liquid compound, which is solid at room temperature. To correct from VP(subcooled liquid) to VP(sublimation) requires ΔS(fusion), which is only known for mannitol. Estimated values were used for the others, all but one of which were benzene derivatives and expected to have very similar values. The final compound for which ΔS(fusion) was estimated was menthol, which melts at 42 °C, so that modest errors in ΔS(fusion) will have little effect. It was also necessary to look into the effects of including estimated values of ΔCp on this correction. The approximate sizes of the effects of including ΔCp in the correction from VP(subcooled liquid) to VP(sublimation) were estimated, and it was noted that inclusion of ΔCp invariably makes ΔG(S) more positive. To extend the set of compounds for which the solvation energy could be calculated we explored the use of boiling point (b.p.) data from Reaxys/Beilstein as a substitute for studies of the VP as a function of temperature. B.p. data are not always reliable, so it was necessary to develop a criterion for rejecting outliers. For two compounds (chlorinated guaiacols) it became clear that inclusion represented overreach; for each there were only two independent pressure-temperature points, which is too little for a trustworthy extrapolation. For a number of compounds the extrapolation from lowest
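
    The subcooled-liquid correction described above can be written compactly. A sketch of the standard thermodynamic form, assuming a temperature-independent ΔS(fusion) and neglecting ΔCp (whose inclusion, as noted, makes ΔG(S) more positive):

        % Below the melting point T_m, the free energy of fusion drives
        % the gap between subcooled-liquid and sublimation vapor pressures:
        \[
          RT \ln\frac{P_{\text{subcooled liquid}}}{P_{\text{sublimation}}}
            = \Delta G_{\text{fus}}(T)
            \approx \Delta S_{\text{fus}}\,(T_m - T), \qquad T < T_m
        \]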

  14. Big data challenges in decoding cortical activity in a human with quadriplegia to inform a brain computer interface.

    Science.gov (United States)

    Friedenberg, David A; Bouton, Chad E; Annetta, Nicholas V; Skomrock, Nicholas; Mingming Zhang; Schwemmer, Michael; Bockbrader, Marcia A; Mysiw, W Jerry; Rezai, Ali R; Bresler, Herbert S; Sharma, Gaurav

    2016-08-01

    Recent advances in Brain Computer Interfaces (BCIs) have created hope that one day paralyzed patients will be able to regain control of their paralyzed limbs. As part of an ongoing clinical study, we have implanted a 96-electrode Utah array in the motor cortex of a paralyzed human. The array generates almost 3 million data points from the brain every second. This presents several big-data challenges for developing algorithms that must not only process the data in real time (for the BCI to be responsive) but also be robust to temporal variations and non-stationarities in the sensor data. We demonstrate an algorithmic approach to analyzing such data and present a novel method for evaluating such algorithms. We present our methodology with examples of decoding human brain data in real time to inform a BCI.
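
    For scale: 96 channels at a sampling rate of 30 kHz (an assumed typical figure; the abstract gives only the total) is roughly 2.9 million samples per second, and a responsive decoder must reduce each short window of that stream to a compact feature vector. A minimal sketch:

        import numpy as np

        FS = 30_000   # samples/s per channel (assumed typical for Utah arrays)
        N_CH = 96     # electrode channels
        BIN_S = 0.1   # decoder update interval: 100 ms

        rng = np.random.default_rng(1)
        # One 100 ms chunk: 96 x 3000 = 288,000 samples, i.e. a stream of
        # roughly 2.9 million samples arriving every second.
        chunk = rng.standard_normal((N_CH, int(FS * BIN_S)))

        # A simple real-time feature: RMS power per channel per bin,
        # reducing 288k raw samples to a 96-element decoder input.
        features = np.sqrt((chunk ** 2).mean(axis=1))
        print(features.shape)   # -> (96,)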

  15. Osteoid osteomas in common and in technically challenging locations treated with computed tomography-guided percutaneous radiofrequency ablation

    International Nuclear Information System (INIS)

    Mylona, Sophia; Patsoura, Sofia; Karapostolakis, Georgios; Galani, Panagiota; Pomoni, Anastasia; Thanos, Loukas

    2010-01-01

    To evaluate the efficacy of computed tomography (CT)-guided radiofrequency (RF) ablation for the treatment of osteoid osteomas in common and in technically challenging locations. Twenty-three patients with osteoid osteomas in common (nine cases) and technically challenging [14 cases: intra-articular (n = 7), spinal (n = 5), metaphyseal (n = 2)] positions were treated with CT-guided RF ablation. Therapy was performed under conscious sedation with a seven-array expandable RF electrode for 8-10 min at 80-110 °C and a power of 90-110 W. The patients were discharged home with instructions. A brief pain inventory (BPI) score was calculated before and after (1 day, 4 weeks, 6 months and 1 year) treatment. All procedures were technically successful. Primary clinical success was 91.3% (21 of the 23 patients), regardless of lesion location. The BPI score was dramatically reduced after the procedure, and the decrease was significant (P < 0.001, paired t-test, df = n-1 = 22) for all follow-up periods. Two patients had persistent pain after 1 month and were treated successfully with a second procedure (secondary success rate 100%). No immediate or delayed complications were observed. CT-guided RF ablation is safe and highly effective for the treatment of osteoid osteomas, even in technically difficult positions. (orig.)
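
    The significance test reported above compares each patient's follow-up score against their own baseline. A minimal sketch of that computation (the scores are hypothetical, not the study's data):

        import numpy as np
        from scipy.stats import ttest_rel

        # Hypothetical BPI scores (0-10) before and 4 weeks after
        # ablation; illustrative only -- the study enrolled 23 patients.
        before = np.array([8, 7, 9, 6, 8, 7])
        after = np.array([1, 0, 2, 1, 0, 1])

        t_stat, p_value = ttest_rel(before, after)   # paired t-test
        print(f"t = {t_stat:.2f}, p = {p_value:.4g}")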

  16. Process operational readiness and operational readiness follow-on

    International Nuclear Information System (INIS)

    Nertney, R.J.

    1992-11-01

    The first document in the System Safety Development Center (SSDC) series deals with the subject of Occupancy-Use Readiness. The material included in that manual provided the basis for development of the SSDC workshop in Operational Readiness. The original Occupancy Readiness Manual, however, deals only generally with the subject of process safety; i.e., the safety of overall "processes" such as solar collection systems, nuclear reactors, and coal-fired electrical plants. The manual also fails to detail the considerations involved in maintaining the state of readiness on a continuing basis. Both of the latter subjects are dealt with in some detail in the SSDC's Operational Readiness Workshop. The purpose of this document is to provide additional documentary material dealing with subjects introduced in SSDC-1, Occupancy-Use Readiness Manual, and SSDC-12, Safety Considerations in Evaluation of Maintenance Programs. In augmenting SSDC-1, Part I of this manual provides additional material related to process safety; in the case of SSDC-12, the subject of safety considerations in evaluation of maintenance programs is broadened in Part II to include maintenance of personnel systems and procedural systems as well as hardware. "Maintenance" is related more directly to the concept of operational readiness, and an alternative analytical tree is provided for hardware maintenance program evaluation.

  17. The use of the Climate-science Computational End Station (CCES) development and grand challenge team for the next IPCC assessment: an operational plan

    International Nuclear Information System (INIS)

    Washington, W M; Buja, L; Gent, P; Drake, J; Erickson, D; Anderson, D; Bader, D; Dickinson, R; Ghan, S; Jones, P; Jacob, R

    2008-01-01

    The grand challenge of climate change science is to predict future climates based on scenarios of anthropogenic emissions and other changes resulting from options in energy and development policies. Addressing this challenge requires a Climate Science Computational End Station consisting of a sustained climate model research, development, and application program combined with world-class DOE leadership computing resources to enable advanced computational simulation of the Earth system. This project provides the primary computer allocations for the DOE SciDAC and Climate Change Prediction Programs. It builds on the successful interagency collaboration of the National Science Foundation and the U.S. Department of Energy in developing and applying the Community Climate System Model (CCSM) for climate change science. It also includes collaboration with the National Aeronautics and Space Administration in carbon data assimilation, and university partners with expertise in high-end computational climate research.

  18. The Readiness of Sorsogon State College Faculty for Teaching with ICT: Basis for a Faculty Training Program

    Directory of Open Access Journals (Sweden)

    Catherine A. De Castro

    2016-02-01

    Full Text Available Information and communication technologies (ICT) such as computers, multimedia systems, productivity software, and the Internet have greatly improved the performance of different organizations and influenced higher learning institutions like Sorsogon State College (SSC) to develop and implement innovative teaching and learning methods. However, despite the many benefits of ICT when used in education, there are still faculty members who do not use these technologies for teaching. Hence, this research was conducted to assess their readiness for teaching with ICT. Findings revealed that most of the surveyed respondents were above forty-five years old, had 1-10 years of government service, and had specialization in the field of education. In terms of readiness to teach with ICT, the results disclosed that they were fairly ready in terms of human-resource readiness, ready in terms of technological-skill readiness, and much ready in terms of equipment readiness. Their age was not significantly related to their human-resource readiness but was significantly related to their technological-skill and equipment readiness. The respondents’ number of years in the government was significantly related to their readiness to teach with ICT in terms of human-resource, technological-skill, and equipment readiness. Their field of specialization was not significantly related to their readiness to teach with ICT. Among the most frequently identified reasons why some of them do not use ICT resources were the unavailability of ICT resources, lack of knowledge, and lack of familiarity with ICT. The output of this research is a faculty training program to enhance their know

  19. Measuring Africa's E-Readiness in the Global Networked Economy: A Nine-Country Data Analysis

    Science.gov (United States)

    Ifinedo, Princely

    2005-01-01

    This paper assesses the integration of Africa into the global economy by computing the e-readiness of nine African countries. The measuring tool used is simple and incorporates a variety of indicators used by comparable tools. Overall, the mean e-readiness of Africa is poor in comparison with other economies. In particular, Sub-Saharan Africa…
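
    Composite e-readiness indices of this kind typically combine normalized indicators as a weighted mean. A minimal sketch (indicator names, values, and the equal weighting are all illustrative assumptions, not the paper's tool):

        # Hypothetical indicators already normalized to a 0-1 scale.
        indicators = {
            "connectivity": 0.22,
            "ict_policy": 0.40,
            "human_capital": 0.35,
            "economic_climate": 0.30,
        }
        # Equal weights assumed; a real tool would justify its weighting.
        weights = {name: 1.0 / len(indicators) for name in indicators}

        score = sum(weights[n] * v for n, v in indicators.items())
        print(f"composite e-readiness score: {score:.3f}")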

  20. A practical implementation science heuristic for organizational readiness: R = MC2

    Science.gov (United States)

    Cook, Brittany S.; Lamont, Andrea; Wandersman, Abraham; Castellow, Jennifer; Katz, Jason; Beidas, Rinad S.

    2015-01-01

    There are many challenges when an innovation (i.e., a program, process, or policy that is new to an organization) is actively introduced into an organization. One critical component for successful implementation is the organization’s readiness for the innovation. In this article, we propose a practical implementation science heuristic, abbreviated as R = MC2. We propose that organizational readiness involves: 1) the motivation to implement an innovation, 2) the general capacities of an organization, and 3) the innovation-specific capacities needed for a particular innovation. Each of these components can be assessed independently and used formatively. The heuristic can be used by organizations to assess readiness to implement, and by training and technical assistance providers to help build organizational readiness. We present an illustration of the heuristic by showing how behavioral health organizations differ in readiness to implement a peer specialist initiative. Implications for research and practice of organizational readiness are discussed. PMID:26668443

  1. Policy, Institutional and Programme Readiness for Solar Energy ...

    African Journals Online (AJOL)

    South Africa has been facing challenges in terms of electricity supply. The increase in population and a growing economy have exacerbated electricity supply constraints. In response, policies and institutions have emerged to promote solar energy. This study investigates policy, institutional and programme readiness to ...

  2. LHCf: ready to go

    CERN Multimedia

    CERN Bulletin

    2015-01-01

    Reinstalled in the tunnel at the end of 2014, the two detectors of the LHCf experiment are now ready for operation. The first data should be taken in May.   LHCf’s Arm1 detector. The Large Hadron Collider forward (LHCf) experiment measures neutral particles emitted at nearly zero degrees from the proton beam direction. Because these "very forward" particles carry a large fraction of the collision energy, they are important for understanding the development of atmospheric air-shower phenomena produced by high-energy cosmic rays. To measure these particles, two detectors, Arm1 and Arm2, sit along the LHC beamline, at 140 metres either side of the ATLAS collision point. In July 2010, after a 9-month operation, the LHCf collaboration removed the two detectors from the tunnel to avoid severe radiation damage. The Arm2 detector was reinstalled in the tunnel for data-taking with proton–lead collisions in 2013, while Arm1 was being upgraded to be a radiation-ha...

  3. Change readiness research

    DEFF Research Database (Denmark)

    Høstgaard, Anna Marie Balling

    2006-01-01

    The Change readiness research method (CRR) has become a well-known method in Denmark to identify issues that need to be discussed on a hospital ward before implementation of a new IT system, and to start a dialogue. A precondition for a constructive dialogue, however, is a high degree of participation... the ”Basic Structure for The Electronic Health Record” (B-EHR) using prototypes (http://medinfo.dk/epj/proj/gepka/). In the Gepka project the participation varied from 33.3% to 78.9%. The objective of this study is to set out themes by which this variation can be studied. A qualitative explorative research... of participation - it is to suggest a qualitative relationship between the two. Neither does this study try to generalize the results, as further research on more wards would be needed to do so. This study does, however, set out themes that can be a useful tool in future CRR projects in order to maximize...

  4. Smoke Ready Toolbox for Wildfires

    Science.gov (United States)

    This site provides an online Smoke Ready Toolbox for Wildfires, which lists resources and tools that provide information on health impacts from smoke exposure, current fire conditions and forecasts, and strategies to reduce exposure to smoke.

  5. Checklist for clinical readiness published

    Science.gov (United States)

    Scientists from NCI, together with collaborators from outside academic centers, have developed a checklist of criteria to evaluate the readiness of complex molecular tests that will guide decisions made during clinical trials. The checklist focuses on tes

  6. Analysis, biomedicine, collaboration, and determinism challenges and guidance: wish list for biopharmaceuticals on the interface of computing and statistics.

    Science.gov (United States)

    Goodman, Arnold F

    2011-11-01

    I have personally witnessed processing advance from desk calculators and mainframes, through timesharing and PCs, to supercomputers and cloud computing. I have also witnessed resources grow from too little data into almost too much data, and from theory dominating data into data beginning to dominate theory while needing new theory. Finally, I have witnessed problems advance from simple in a lone discipline into becoming almost too complex in multiple disciplines, as well as approaches evolve from analysis driving solutions into solutions by data mining beginning to drive the analysis itself. How we do all of this has transitioned from competition overcoming collaboration into collaboration starting to overcome competition, as well as what is done being more important than how it is done has transitioned into how it is done becoming as important as what is done. In addition, what or how we do it being more important than what or how we should actually do it has shifted into what or how we should do it becoming just as important as what or how we do it, if not more so. Although we have come a long way in both our methodology and technology, are they sufficient for our current or future complex and multidisciplinary problems with their massive databases? Since the apparent answer is not a resounding yes, we are presented with tremendous challenges and opportunities. This personal perspective adapts my background and experience to be appropriate for biopharmaceuticals. In these times of exploding change, informed perspectives on what challenges should be explored with accompanying guidance may be even more valuable than the far more typical literature reviews in conferences and journals of what has already been accomplished without challenges or guidance. Would we believe that an architect who designs a skyscraper determines the skyscraper's exact exterior, interior and furnishings or only general characteristics? Why not increase dependability of conclusions in

  7. Welding. Module 8 of the Vocational Education Readiness Test (VERT).

    Science.gov (United States)

    Thomas, Edward L., Comp.

    Focusing on welding, this module is one of eight included in the Vocational Education Readiness Tests (VERT). The module begins by listing the objectives of the module and describing tools and equipment needed. The remainder of the module contains sections on manipulative skills, trade vocabulary, tool identification, trade computational skills,…

  8. Operational readiness review phase-1 final report for WRAP-1

    Energy Technology Data Exchange (ETDEWEB)

    Bowen, W., Westinghouse Hanford

    1996-12-27

    This report documents the Operational Readiness Review for WRAP-1 Phase-1 operations. The report includes all criteria and lines of inquiry with the resulting Findings and Observations. The review included assessing the operational capability of the organization and of the computer-controlled process and facility systems.

  9. Technical Challenges and Lessons from the Migration of the GLOBE Data and Information System to Utilize Cloud Computing Service

    Science.gov (United States)

    Moses, J. F.; Memarsadeghi, N.; Overoye, D.; Littlefield, B.

    2016-12-01

    The Global Learning and Observation to Benefit the Environment (GLOBE) Data and Information System supports an international science and education program with capabilities to accept local environment observations and to archive, display and visualize them along with global satellite observations. Since its inception twenty years ago, the Web and database system has been upgraded periodically to accommodate changes in technology and the steady growth of GLOBE's education community and collection of observations. Recently, near the end of life of the system hardware, new commercial computer platform options were explored and a decision was made to utilize Cloud services. The GLOBE DIS has now been fully deployed and maintained using Amazon Cloud services for over two years. This paper reviews the early risks, actual challenges, and some unexpected findings from the GLOBE DIS migration. We describe the plans, cost drivers and estimates, highlight adjustments that were made, and suggest improvements. We present the trade studies for provisioning, load balancing, networks, processing and storage, as well as for the production, staging and backup systems. We outline the migration team's skills and the required level of effort for the transition, and the resulting changes in the overall maintenance and operations activities. Examples include incremental adjustments to processing capacity and backup frequency, and efforts previously expended on hardware maintenance that were refocused onto application-specific enhancements.
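
    Trade studies like those described usually start from a simple cost model that compares candidate sizings. A back-of-the-envelope sketch (all rates and sizings are placeholders, not actual cloud prices or GLOBE DIS figures):

        # Monthly cost model of the kind used in provisioning trade studies.
        def monthly_cost(instances, hourly_rate, storage_tb, tb_month_rate,
                         egress_tb=0.0, egress_rate=0.0):
            compute = instances * hourly_rate * 24 * 30   # instance-hours
            storage = storage_tb * tb_month_rate          # persistent volumes
            egress = egress_tb * egress_rate              # data transfer out
            return compute + storage + egress

        # Compare a production sizing against a leaner staging sizing.
        print(monthly_cost(instances=4, hourly_rate=0.20,
                           storage_tb=5, tb_month_rate=25))
        print(monthly_cost(instances=1, hourly_rate=0.20,
                           storage_tb=5, tb_month_rate=25))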

  10. Technical Challenges and Lessons from the Migration of the GLOBE Data and Information System to Utilize Cloud Computing Service

    Science.gov (United States)

    Moses, John F.; Memarsadeghi, Nargess; Overoye, David; Littlefield, Brain

    2017-01-01

    The Global Learning and Observation to Benefit the Environment (GLOBE) Data and Information System supports an international science and education program with capabilities to accept local environment observations and to archive, display and visualize them along with global satellite observations. Since its inception twenty years ago, the Web and database system has been upgraded periodically to accommodate changes in technology and the steady growth of GLOBE's education community and collection of observations. Recently, near the end of life of the system hardware, new commercial computer platform options were explored and a decision was made to utilize Cloud services. The GLOBE DIS has now been fully deployed and maintained using Amazon Cloud services for over two years. This paper reviews the early risks, actual challenges, and some unexpected findings from the GLOBE DIS migration. We describe the plans, cost drivers and estimates, highlight adjustments that were made, and suggest improvements. We present the trade studies for provisioning, load balancing, networks, processing and storage, as well as for the production, staging and backup systems. We outline the migration team's skills and the required level of effort for the transition, and the resulting changes in the overall maintenance and operations activities. Examples include incremental adjustments to processing capacity and backup frequency, and efforts previously expended on hardware maintenance that were refocused onto application-specific enhancements.

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October, a model to account for service work at Tier-2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations: Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shift Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  12. A Proposed for Assessing Hotel E-Readiness for Tourism in Southern Thailand

    Directory of Open Access Journals (Sweden)

    Piman Sirot

    2016-01-01

    Full Text Available This article focuses only on an overview of the Hotel E-Readiness model and its design for tourism in Southern Thailand. The convergence of information technology (IT) and communications technology (CT) will be an important part of these technological innovations. The global economy has been turbulent during the last several years, and governments and enterprises are doing everything possible to inject momentum and effectuate sustainable growth. All member countries of the Association of Southeast Asian Nations (ASEAN), which aimed to form the ASEAN Economic Community (AEC) by December 2015, have come to realize that integrated ICT will enhance the competitiveness and creativity of their economies and fuel the sustainable growth of the global economy. The role that information and communication technologies (ICTs) can play in supporting economic growth, especially in tourism, has never drawn so much attention and research. According to the Networked Readiness Index (NRI), Thailand has made improvement, edging up from 77th to 74th place in 2013 and from 74th to 67th place in the latest measurement released by the World Economic Forum in 2014, ranking 3rd out of the 10 ASEAN member countries. Although serious challenges remain, the impact of ICTs on tourism has become more far-reaching as their transformational effects spread to several sectors of the economy and society via innovations. In this research we focus only on the hotel sector in Southern Thailand, because tourism in this area derives very high income from overseas and ASEAN visitors. We give an overview of the Hotel E-Readiness model and its impact on the tourism economy, with attention to computer networking infrastructures and communication technologies in Southern Thailand. Our model is described along four major dimensions: business environment, network readiness, network usage, and network impacts. It aims to explore the problems and obstacles for the improvement of computer networking infrastructure and

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  14. Academic training: QCD: are we ready for the LHC

    CERN Multimedia

    2006-01-01

    2006-2007 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 4, 5, 6, 7 December, from 11:00 to 12:00 4, 5, 6 December - Main Auditorium, bldg. 500, 7 December - TH Auditorium, bldg. 4 - 3-006 QCD: are we ready for the LHC S. FRIXIONE / INFN, Genoa, Italy The LHC energy regime poses a serious challenge to our capability of predicting QCD reactions to the level of accuracy necessary for a successful programme of searches for physics beyond the Standard Model. In these lectures, I'll introduce basic concepts in QCD and present techniques based on perturbation theory, such as fixed-order and resummed computations, and Monte Carlo simulations. I'll discuss applications of these techniques to hadron-hadron processes, concentrating on recent trends in perturbative QCD aimed at improving our understanding of LHC phenomenology. ENSEIGNEMENT ACADEMIQUE ACADEMIC TRAINING Françoise Benz 73127 academic.training@cern.ch If you wish to participate in one of the following courses, please tell your supervisor and apply ...

  15. Computational challenges and human factors influencing the design and use of clinical research participant eligibility pre-screening tools

    Directory of Open Access Journals (Sweden)

    Pressler Taylor R

    2012-05-01

    Full Text Available Abstract Background Clinical trials are the primary mechanism for advancing clinical care and evidence-based practice, yet challenges with the recruitment of participants for such trials are widely recognized as a major barrier to these types of studies. Data warehouses (DW) store large amounts of heterogeneous clinical data that can be used to enhance recruitment practices, but multiple challenges exist when using a data warehouse for such activities, due to the manner of collection, management, integration, analysis, and dissemination of the data. A critical step in leveraging the DW for recruitment purposes is being able to match trial eligibility criteria to discrete and semi-structured data types in the data warehouse, though trial eligibility criteria tend to be written without concern for their computability. We present the multi-modal evaluation of a web-based tool that can be used for pre-screening patients for clinical trial eligibility, and assess the ability of this tool to be practically used for clinical research pre-screening and recruitment. Methods The study used a validation study, usability testing, and a heuristic evaluation to evaluate and characterize the operational characteristics of the software as well as the human factors affecting its use. Results Clinical trials from the Division of Cardiology and the Department of Family Medicine were used for this multi-modal evaluation. From the results of the validation study, the software demonstrated positive predictive values (PPV) of 54.12% and 0.7%, respectively, and negative predictive values (NPV) of 73.3% and 87.5%, respectively, for the two types of clinical trials. Heuristic principles concerning error prevention and documentation were characterized as the major usability issues during the heuristic evaluation. Conclusions This software is intended to provide an initial list of eligible patients to a
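
    The reported predictive values follow directly from the pre-screening confusion matrix. A minimal sketch (the counts are hypothetical, chosen only so the first trial's reported PPV and NPV fall out of the arithmetic):

        # PPV/NPV from pre-screening outcome counts.
        def ppv_npv(tp, fp, tn, fn):
            ppv = tp / (tp + fp)   # "eligible" calls that were correct
            npv = tn / (tn + fn)   # "ineligible" calls that were correct
            return ppv, npv

        # Hypothetical counts: 46/85 = 54.12%, 66/90 = 73.3%.
        ppv, npv = ppv_npv(tp=46, fp=39, tn=66, fn=24)
        print(f"PPV = {ppv:.2%}, NPV = {npv:.2%}")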

  16. Factors of children's school readiness

    Directory of Open Access Journals (Sweden)

    Ljubica Marjanovič Umek

    2006-12-01

    Full Text Available The purpose of the study was to examine the effect of preschool on children's school readiness in connection with their intellectual abilities, language competence, and parents' education. The sample included 219 children who were 68 to 83 months old and attending the first year of primary school. Children were differentiated by whether or not they had attended preschool before starting school. Children's intellectual ability was determined using Raven's Coloured Progressive Matrices (CPM; Raven, Raven, & Court, 1999), language competence using the Lestvice splošnega govornega razvoja–LJ (LSGR–LJ, Scales of General Language Development; Marjanovič Umek, Kranjc, Fekonja, & Bajc, 2004), and school readiness with the Preizkus pripravljenosti za šolo (PPŠ, Test of School Readiness; Toličič, 1986). The results indicate that children's intellectual ability and language competence have a high predictive value for school readiness: they explained 51% of the variance in children's scores on the PPŠ. Preschool enrollment has a positive effect on school readiness for children whose parents have a low level of education, but not for those whose parents are highly educated.

  17. Predicting ready biodegradability of premanufacture notice chemicals.

    Science.gov (United States)

    Boethling, Robert S; Lynch, David G; Thom, Gary C

    2003-04-01

    Chemical substances other than pesticides, drugs, and food additives are regulated by the U.S. Environmental Protection Agency (U.S. EPA) under the Toxic Substances Control Act (TSCA), but the United States does not require that new substances be tested automatically for such critical properties as biodegradability. The resulting lack of submitted data has fostered the development of estimation methods, and the BioWIN models for predicting biodegradability from chemical structure have played a prominent role in premanufacture notice (PMN) review. Until now, validation efforts have used only the Japanese Ministry of International Trade and Industry (MITI) test data and have not included all of the models. To assess BioWIN performance on PMN substances, we assembled a database of PMNs for which ready-biodegradation data had been submitted over the period 1995 through 2001. The 305 PMN structures are highly varied and pose major challenges to chemical property estimation. Despite the variability of ready-biodegradation tests, the use of at least six different test methods, and the widely varying quality of submitted data, the accuracy of four of the six BioWIN models (MITI linear, MITI nonlinear, survey ultimate, survey primary) was in the 80+% range for predicting ready biodegradability. Greater accuracy (>90%) can be achieved by using model estimates only when the four models agree (true for three-quarters of the PMNs). The BioWIN linear and nonlinear probability models did not perform as well, even when classification criteria were optimized. The results suggest that the MITI and survey BioWIN models are suitable for use in screening-level applications.
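
    The consensus rule described above, accepting a classification only when all four models agree, is straightforward to express. A minimal sketch with boolean outputs standing in for the actual BioWIN predictions:

        # Make a screening-level call only when all four usable BioWIN
        # models agree (True = predicted readily biodegradable).
        def consensus(miti_linear, miti_nonlinear,
                      survey_ultimate, survey_primary):
            votes = [miti_linear, miti_nonlinear,
                     survey_ultimate, survey_primary]
            return votes[0] if all(v == votes[0] for v in votes) else None

        print(consensus(True, True, True, True))    # -> True: unanimous
        print(consensus(True, False, True, True))   # -> None: disagreement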

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team has successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operations. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

  19. Ready to work (yet)?

    DEFF Research Database (Denmark)

    Danneris, Sophie

    2018-01-01

    By analysing changes through time in vulnerable welfare recipients’ perception of their unemployment trajectories, the article revisits the assumption that unemployed clients’ trajectories are linear pathways following a clear and predictable line of causes and effects. The analysis is based on a...... of the clients, the study challenges current research attempts to find a quick fix or ‘one-size-fits-all’ intervention to improve the employability of vulnerable clients....

  20. Exploring English as a Foreign Language (EFL) Teacher Trainers' Perspectives on Challenges to Promoting Computer Literacy of EFL Teachers

    Science.gov (United States)

    Dashtestani, Reza

    2014-01-01

    Computer literacy is a significant component of language teachers' computer-assisted language learning (CALL) knowledge. Despite its importance, limited research has been undertaken to analyze the factors that might influence language teachers' computer literacy levels. This qualitative study explored the perspectives of 39 Iranian EFL teacher…

  1. Magnetic fusion: Environmental Readiness Document

    International Nuclear Information System (INIS)

    1981-03-01

    Environmental Readiness Documents are prepared periodically to review and evaluate the environmental status of an energy technology during the several phases of development of that technology. Through these documents, the Office of Environment within the Department of Energy provides an independent and objective assessment of the environmental risks and potential impacts associated with the progression of the technology to the next stage of development and with future extensive use of the technology. This Environmental Readiness Document was prepared to assist the Department of Energy in evaluating the readiness of magnetic fusion technology with respect to environmental issues. An effort has been made to identify potential environmental problems that may be encountered based upon current knowledge, proposed and possible new environmental regulations, and the uncertainties inherent in planned environmental research

  2. Computer based training for oil spill management

    International Nuclear Information System (INIS)

    Goodman, R.

    1993-01-01

    Large oil spills are infrequent occurrences, which poses a particular problem for training oil spill response staff and for maintaining a high level of response readiness. Conventional training methods involve table-top simulations to develop tactical and strategic response skills and boom-deployment exercises to maintain operational readiness. Both forms of training are quite effective, but they are very time-consuming to organize, are expensive to conduct, and tend to become repetitious. To provide a variety of response experiences, a computer-based system of oil spill response training has been developed which can supplement a table-top training program. Using a graphic interface, a realistic and challenging computerized oil spill response simulation has been produced. Integral to the system is a program editing tool which allows the teacher to develop a custom training exercise for the area of interest to the student. 1 ref

  3. Operational readiness of EFAD systems

    International Nuclear Information System (INIS)

    Kabat, M.J.

    1992-02-01

    An assessment of the operational readiness of the Emergency Filtered Air Discharge (EFAD) systems, installed in Canadian CANDU multi-unit nuclear power plants, was performed in this project. Relevant Canadian and foreign standards and regulatory requirements have been reviewed and documentation on EFAD system design, operation, testing and maintenance have been assessed to identify likely causes and potential failures of EFAD systems and their components under both standby and accident conditions. Recommendations have also been provided in this report for revisions which are needed to achieve and maintain appropriate operational readiness of EFAD systems

  4. Progression in work readiness

    DEFF Research Database (Denmark)

    Jensen, Sophie Danneris

    2013-01-01

    This paper is based partly on literature concerning the construction of identities in social work settings (especially Juhila & Abrams 2011, Eskelinen & Olesen 2010) and partly on literature that addresses the dilemmas and challenges in providing evidence about the effectiveness of interventions...... in social work programs (amongst others Boaz & Blewett 2010 and Koivisto 2008). Initially there will be a short presentation of the research topic of my Ph.D. and the central research question related to the project. Following this is a methodological discussion in two levels - the first discussion...... be understood through short narratives about work identity....

  5. Predicting implementation from organizational readiness for change: a study protocol

    Directory of Open Access Journals (Sweden)

    Kelly P Adam

    2011-07-01

    Full Text Available Abstract Background There is widespread interest in measuring organizational readiness to implement evidence-based practices in clinical care. However, there are a number of challenges to validating organizational measures, including inferential bias arising from the halo effect and method bias - two threats to validity that, while well-documented by organizational scholars, are often ignored in health services research. We describe a protocol to comprehensively assess the psychometric properties of a previously developed survey, the Organizational Readiness to Change Assessment. Objectives Our objective is to conduct a comprehensive assessment of the psychometric properties of the Organizational Readiness to Change Assessment, incorporating methods specifically to address threats from the halo effect and method bias. Methods and Design We will conduct three sets of analyses using longitudinal, secondary data from four partner projects, each testing interventions to improve the implementation of an evidence-based clinical practice. Partner projects field the Organizational Readiness to Change Assessment at baseline (n = 208 respondents; 53 facilities), and prospectively assess the degree to which the evidence-based practice is implemented. We will conduct predictive and concurrent validity analyses using hierarchical linear modeling and multivariate regression, respectively. For predictive validity, the outcome is the change from baseline to follow-up in the use of the evidence-based practice. We will use intra-class correlations derived from hierarchical linear models to assess inter-rater reliability. Two partner projects will also field measures of job satisfaction for convergent and discriminant validity analyses, and will field Organizational Readiness to Change Assessment measures at follow-up for concurrent validity (n = 158 respondents; 33 facilities). Convergent and discriminant validities will test associations between organizational readiness and…
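    The inter-rater reliability step, intra-class correlations derived from hierarchical linear models, amounts to a variance-components calculation: the ICC is the share of total score variance attributable to facilities. A minimal sketch using statsmodels' mixed-effects API follows; the column names ("orca_score", "facility") are hypothetical placeholders, not the study's actual variables.

```python
# Minimal sketch of deriving an intra-class correlation (ICC) from a
# random-intercept hierarchical linear model, as the protocol describes.
# Column names ("orca_score", "facility") are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

def facility_icc(df: pd.DataFrame) -> float:
    """ICC = between-facility variance / (between + within variance)."""
    model = smf.mixedlm("orca_score ~ 1", df, groups=df["facility"]).fit()
    between = float(model.cov_re.iloc[0, 0])  # random-intercept variance
    within = model.scale                      # residual (rater-level) variance
    return between / (between + within)

# df = pd.read_csv("orca_baseline.csv")  # one row per respondent
# print(f"ICC = {facility_icc(df):.2f}")
```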

  6. Identifying the readiness of patients in implementing telemedicine in northern Louisiana for an oncology practice.

    Science.gov (United States)

    Gurupur, Varadraj; Shettian, Kruparaj; Xu, Peixin; Hines, Scott; Desselles, Mitzi; Dhawan, Manish; Wan, Thomas Th; Raffenaud, Amanda; Anderson, Lindsey

    2017-09-01

    This study identified the readiness factors that may create challenges in the use of telemedicine among patients in northern Louisiana with cancer. To identify these readiness factors, the team of investigators developed 19 survey questions that were provided to the patients or to their caregivers. The team collected responses from 147 respondents from rural and urban residential backgrounds. These responses were used to identify the individuals' readiness for utilising telemedicine through factor analysis, Cronbach's alpha reliability test, analysis of variance and ordinary least squares regression. The analysis results indicated that the favourable factor (positive readiness item) had a mean value of 3.47, whereas the unfavourable factor (negative readiness item) had a mean value of 2.76. Cronbach's alpha reliability test provided an alpha value of 0.79. Overall, our study indicated a positive attitude towards the use of telemedicine in northern Louisiana.
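    The reliability figure quoted here (alpha = 0.79) is Cronbach's alpha, a standard internal-consistency statistic. A minimal sketch of the computation from a respondents-by-items matrix is below; the simulated responses are placeholders, not the study's survey data.

```python
# Hedged sketch: Cronbach's alpha for a set of readiness survey items.
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
# The matrix below is simulated; the study's 19-item responses are not public.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: shape (n_respondents, n_items), higher = more ready."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(147, 1))                     # shared readiness trait
fake_items = latent + rng.normal(scale=1.0, size=(147, 5))
print(f"alpha = {cronbach_alpha(fake_items):.2f}")
```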

  7. The Social Context of Readiness.

    Science.gov (United States)

    Nelson, Regena Fails

    This study examined how kindergarten teachers' views of readiness (maturational, learning, or school) are influenced by students from urban, suburban, and rural areas; by minority and non-minority students; and by students from lower and middle class backgrounds. The framework for the study was the social constructivist theory, the theory that all…

  8. Consequence Management - Ready or Not?

    Science.gov (United States)

    2003-04-07

    Defense will have sufficient capability and be ready to respond to a Weapons of Mass Destruction/Effects attack. An effective consequence management... Defense adopts the National Military Strategy and its consequence management approach, it must identify Weapons of Mass Destruction/Effects threats... that the Department of Defense: develop Weapons of Mass Destruction/Effects performance standards for response assets; implement a consequence…

  9. Onderzoek online readiness modezaken 2012

    NARCIS (Netherlands)

    Boels, Han; Weltevreden, Jesse

    2013-01-01

    This study mapped the online readiness of fashion stores in 2012. In total, 124 (mainly independent) fashion stores took part in the study. The study was carried out by the Online Entrepreneurship research group together with students of the Marketing Tomorrow minor of the Hogeschool van Amsterdam.

  10. Onderzoek online readiness rijscholen 2013

    NARCIS (Netherlands)

    Weltevreden, Jesse; Boels, Han

    2013-01-01

    This study mapped the online readiness of driving schools in 2013. In total, 115 driving schools took part in the study. The study was carried out by the Online Entrepreneurship research group together with students of the Marketing Tomorrow minor of the Hogeschool van Amsterdam.

  11. Workplace Readiness for Communicating Diversity.

    Science.gov (United States)

    Muir, Clive

    1996-01-01

    Proposes a model for communicating change about diversity using a workplace-readiness approach. Discusses ways organizational change agents can assess the company's current interpersonal and social dynamics, use appropriate influence strategies, and create effective messages that will appeal to employees and help to achieve the desired acceptance…

  12. Safe, Healthy and Ready to Succeed: Arizona School Readiness Key Performance Indicators

    Science.gov (United States)

    Migliore, Donna E.

    2006-01-01

    "Safe, Healthy and Ready to Succeed: Arizona School Readiness Key Performance Indicators" presents a set of baseline measurements that gauge how well a statewide system of school readiness supports is addressing issues that affect Arizona children's readiness for school. The Key Performance Indicators (KPIs) measure the system, rather…

  13. Solar Ready: An Overview of Implementation Practices

    Energy Technology Data Exchange (ETDEWEB)

    Watson, A.; Guidice, L.; Lisell, L.; Doris, L.; Busche, S.

    2012-01-01

    This report explores three mechanisms for encouraging solar ready building design and construction: solar ready legislation, certification programs for solar ready design and construction, and stakeholder education. These methods are not mutually exclusive, and all, if implemented well, could contribute to more solar ready construction. Solar ready itself does not reduce energy use or create clean energy. Nevertheless, solar ready building practices are needed to reach the full potential of solar deployment. Without forethought on incorporating solar into design, buildings may be incompatible with solar due to roof structure or excessive shading. In these cases, retrofitting the roof or removing shading elements is cost prohibitive. Furthermore, higher up-front costs due to structural adaptations and production losses caused by less than optimal roof orientation, roof equipment, or shading will lengthen payback periods, making solar more expensive. With millions of new buildings constructed each year in the United States, solar ready can remove installation barriers and increase the potential for widespread solar adoption. There are many approaches to promoting solar ready, including solar ready legislation, certification programs, and education of stakeholders. Federal, state, and local governments have the potential to implement programs that encourage solar ready and in turn reduce barriers to solar deployment. With the guidance in this document and the examples of jurisdictions and organizations already working to promote solar ready building practices, federal, state, and local governments can guide the market toward solar ready implementation.

  14. When Life and Learning Do Not Fit: Challenges of Workload and Communication in Introductory Computer Science Online

    Science.gov (United States)

    Benda, Klara; Bruckman, Amy; Guzdial, Mark

    2012-01-01

    We present the results of an interview study investigating student experiences in two online introductory computer science courses. Our theoretical approach is situated at the intersection of two research traditions: "distance and adult education research," which tends to be sociologically oriented, and "computer science education…

  15. Development of computational fluid dynamics--habitat suitability (CFD-HSI) models to identify potential passage--Challenge zones for migratory fishes in the Penobscot River

    Science.gov (United States)

    Haro, Alexander J.; Dudley, Robert W.; Chelminski, Michael

    2012-01-01

    A two-dimensional computational fluid dynamics-habitat suitability (CFD–HSI) model was developed to identify potential zones of shallow depth and high water velocity that may present passage challenges for five anadromous fish species in the Penobscot River, Maine, upstream from two existing dams and as a result of the proposed future removal of the dams. Potential depth-challenge zones were predicted for larger species at the lowest flow modeled in the dam-removal scenario. Increasing flows under both scenarios increased the number and size of potential velocity-challenge zones, especially for smaller species. This application of the two-dimensional CFD–HSI model demonstrated its capabilities to estimate the potential effects of flow and hydraulic alteration on the passage of migratory fish.

  16. Promoting community readiness for physical activity among older adults in Germany--protocol of the ready to change intervention trial.

    Science.gov (United States)

    Brand, Tilman; Gansefort, Dirk; Rothgang, Heinz; Röseler, Sabine; Meyer, Jochen; Zeeb, Hajo

    2016-02-01

    Healthy ageing is an important concern for many societies facing the challenge of an ageing population. Physical activity (PA) is a major contributor to healthy ageing; however, insufficient PA levels are prevalent in old age in Germany. Community capacity building and community involvement are often recommended as key strategies to improve equitable access to prevention and health promotion. However, evidence for the effectiveness of these strategies is scarce. This study aims to assess community readiness for PA promotion in local environments and to analyse the utility of strategies to increase community readiness for reaching vulnerable groups. We designed a mixed-method intervention trial comprising three study modules. The first module includes an assessment of community readiness for PA interventions in older adults. The assessment is carried out in a sample of 24 municipalities in the Northwest of Germany using structured key informant interviews. In the second module, eight municipalities with low community readiness are selected from the sample and randomly assigned to one of two study groups: active enhancement of community readiness (intervention) versus no enhancement (control). After enhancing community readiness in the active enhancement group, older adults in both study groups will be recruited for participation in a PA intervention. Participation rates are compared between the study groups to evaluate the effects of the intervention. In addition, a cost-effectiveness analysis is carried out, calculating recruitment costs per person reached in the two study groups. In the third module, qualitative interviews are conducted with participants and non-participants of the PA intervention, exploring reasons for participation or non-participation. This study offers the potential to contribute to the evidence base on reaching vulnerable older adults for PA interventions and provide ideas on how to reduce participation barriers. Its findings will inform…

  17. What are the characteristics of 'sexually ready' adolescents? Exploring the sexual readiness of youth in urban poor Accra.

    Science.gov (United States)

    Biney, Adriana A E; Dodoo, F Nii-Amoo

    2016-01-05

    Adolescent sexual activity, especially among the urban poor, remains a challenge. Despite numerous interventions and programs to address the negative consequences arising from early and frequent sexual activity among youth in sub-Saharan Africa, including Ghana, only slight progress has been made. A plausible explanation is that our understanding of what adolescents think about sex and about their own sexuality is poor. In that sense, examining how adolescents in urban poor communities think about their sexual readiness, and identifying characteristics associated with that sexual self-concept dimension, should deepen our understanding of this topical issue. A total of 196 male and female adolescents, ages 12 to 19, were surveyed in the 2011 RIPS Urban Health and Poverty Project in Accra, Ghana. The youth responded to three statements which determined their levels of sexual readiness. Other background characteristics were also obtained enabling the assessment of the correlates of their preparedness to engage in sex. The data were analyzed using ordered logistic regression models. Overall, the majority of respondents did not consider themselves ready for sex. Multivariate analyses indicated that sexual experience, exposure to pornographic movies, gender, ethnicity and household wealth were significantly linked to their readiness for sex. Sexual readiness is related to sexual activity as well as other characteristics of the adolescents, suggesting the need to consider these factors in the design of programs and interventions to curb early sex. The subject of sexual readiness has to be investigated further to ensure adolescents do not identify with any negative effects of this sexual self-view.
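    The analysis step named here, ordered logistic regression on an ordinal readiness outcome, can be sketched with statsmodels' OrderedModel. The column names and the subset of predictors below are hypothetical placeholders suggested by the covariates the abstract lists.

```python
# Hedged sketch of the ordered logistic regression described above.
# Column names are hypothetical; the RIPS survey data are not public.
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

def fit_readiness_model(df: pd.DataFrame):
    """Ordinal outcome: 0 = not ready, 1 = somewhat ready, 2 = ready."""
    exog = df[["sexually_experienced", "watched_porn", "male",
               "household_wealth"]]
    model = OrderedModel(df["sexual_readiness"], exog, distr="logit")
    return model.fit(method="bfgs", disp=False)

# df = pd.read_csv("rips_2011.csv")  # one row per adolescent respondent
# print(fit_readiness_model(df).summary())
```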

  18. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  19. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been at a lower level as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and on improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of the 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months. The tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could then be cleaned up...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will affect users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams have successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign, with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in the data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  2. The challenge of ubiquitous computing in health care: technology, concepts and solutions. Findings from the IMIA Yearbook of Medical Informatics 2005.

    Science.gov (United States)

    Bott, O J; Ammenwerth, E; Brigl, B; Knaup, P; Lang, E; Pilgram, R; Pfeifer, B; Ruderich, F; Wolff, A C; Haux, R; Kulikowski, C

    2005-01-01

    To review recent research efforts in the field of ubiquitous computing in health care. To identify current research trends and further challenges for medical informatics. Analysis of the contents of the Yearbook on Medical Informatics 2005 of the International Medical Informatics Association (IMIA). The Yearbook of Medical Informatics 2005 includes 34 original papers selected from 22 peer-reviewed scientific journals related to several distinct research areas: health and clinical management, patient records, health information systems, medical signal processing and biomedical imaging, decision support, knowledge representation and management, education and consumer informatics as well as bioinformatics. A special section on ubiquitous health care systems is devoted to recent developments in the application of ubiquitous computing in health care. Besides additional synoptical reviews of each of the sections the Yearbook includes invited reviews concerning E-Health strategies, primary care informatics and wearable healthcare. Several publications demonstrate the potential of ubiquitous computing to enhance effectiveness of health services delivery and organization. But ubiquitous computing is also a societal challenge, caused by the surrounding but unobtrusive character of this technology. Contributions from nearly all of the established sub-disciplines of medical informatics are demanded to turn the visions of this promising new research field into reality.

  3. Scientific Grand Challenges: Discovery In Basic Energy Sciences: The Role of Computing at the Extreme Scale - August 13-15, 2009, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Galli, Giulia [Univ. of California, Davis, CA (United States). Workshop Chair; Dunning, Thom [Univ. of Illinois, Urbana, IL (United States). Workshop Chair

    2009-08-13

    The U.S. Department of Energy’s (DOE) Office of Basic Energy Sciences (BES) and Office of Advanced Scientific Computing Research (ASCR) workshop in August 2009 on extreme-scale computing provided a forum for more than 130 researchers to explore the needs and opportunities that will arise due to expected dramatic advances in computing power over the next decade. This scientific community firmly believes that the development of advanced theoretical tools within chemistry, physics, and materials science—combined with the development of efficient computational techniques and algorithms—has the potential to revolutionize the discovery process for materials and molecules with desirable properties. Doing so is necessary to meet the energy and environmental challenges of the 21st century as described in various DOE BES Basic Research Needs reports. Furthermore, computational modeling and simulation are a crucial complement to experimental studies, particularly when quantum mechanical processes controlling energy production, transformations, and storage are not directly observable and/or controllable. Many processes related to the Earth’s climate and subsurface need better modeling capabilities at the molecular level, which will be enabled by extreme-scale computing.

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also installed at CERN, in addition to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  5. High Throughput Computing Impact on Metagenomics (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    Energy Technology Data Exchange (ETDEWEB)

    Gore, Brooklin

    2011-10-12

    This presentation includes a brief background on High Throughput Computing, correlating gene transcription factors, optical mapping, genotype to phenotype mapping via QTL analysis, and current work on next gen sequencing.

  6. Energy-Efficient Management of Data Center Resources for Cloud Computing: A Vision, Architectural Elements, and Open Challenges

    OpenAIRE

    Buyya, Rajkumar; Beloglazov, Anton; Abawajy, Jemal

    2010-01-01

    Cloud computing is offering utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables hosting of pervasive applications from consumer, scientific, and business domains. However, data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and carbon footprints to the environment. Therefore, we need Green Cloud computing solutions that can not only save energy for the environment but also reduce operational cos...

  7. Nuclear explosives testing readiness evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Valk, T.C.

    1993-09-01

    This readiness evaluation considers hole selection and characterization, verification, containment issues, nuclear explosive safety studies, test authorities, event operations planning, canister-rack preparation, site preparation, diagnostic equipment setup, device assembly facilities and processes, device delivery and insertion, emplacement, stemming, control room activities, readiness briefing, arming and firing, test execution, emergency response and reentry, and post-event analysis to include device diagnostics, nuclear chemistry, and containment. This survey concludes that the LLNL program and its supporting contractors could execute an event within six months of notification, and a second event within the following six months, given the NET group's evaluation and the following three restraints: (1) FY94 (and subsequent year) funding is essentially constant with FY93, (2) preliminary work for the initial event is completed to the historical six-month status, (3) critical personnel, currently working in dual-use technologies, would be recallable as needed.

  8. Teachers' Knowledge and Readiness towards Implementation of School Based Assessment in Secondary Schools

    Science.gov (United States)

    Veloo, Arsaythamby; Krishnasamy, Hariharan N.; Md-Ali, Ruzlan

    2015-01-01

    School-Based Assessment (SBA) was implemented in Malaysian secondary schools in 2012. Since its implementation, teachers have faced several challenges to meet the aims and objectives of the School-Based Assessment. Based on these challenges this study aims to find the level of teachers' knowledge and readiness towards the implementation of…

  9. GRENADA. Renewables Readiness Assessment 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    Grenada, like many Caribbean islands, is dependent on costly oil imports for its energy needs, including the generation of electricity. The transition to renewable energy could potentially support price reductions and improve the overall competitiveness of key sectors of the economy, particularly tourism. This report provides facts and analysis to support the country's discussion on ways to move forward with the renewable energy agenda. IRENA is ready to provide support in the implementation of the actions identified in this report.

  10. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    Science.gov (United States)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher-resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations including the Madden-Julian oscillation (MJO), merely as a case-study approach. Thanks to the big leap in the computational performance of the K computer, we could greatly increase the number of MJO events covered by numerical simulations, in addition to extending the integration time and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with that of the relatively coarser operational models currently in use. The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.

  11. Highly Parallel Computing Architectures by using Arrays of Quantum-dot Cellular Automata (QCA): Opportunities, Challenges, and Recent Results

    Science.gov (United States)

    Fijany, Amir; Toomarian, Benny N.

    2000-01-01

    There has been significant improvement in the performance of VLSI devices, in terms of size, power consumption, and speed, in recent years, and this trend may continue for the near future. However, it is a well-known fact that there are major obstacles, i.e., the physical limitation of feature size reduction and the ever increasing cost of foundries, that would prevent the long-term continuation of this trend. This has motivated the exploration of some fundamentally new technologies that are not dependent on the conventional feature-size approach. Such technologies are expected to enable scaling to continue to the ultimate level, i.e., molecular and atomistic size. Quantum computing, quantum dot-based computing, DNA-based computing, biologically inspired computing, etc., are examples of such new technologies. In particular, quantum dot-based computing using Quantum-dot Cellular Automata (QCA) has recently been intensely investigated as a promising new technology capable of offering significant improvement over conventional VLSI in terms of reduction of feature size (and hence increase in integration level), reduction of power consumption, and increase of switching speed. Quantum dot-based computing and memory in general, and QCA specifically, are intriguing to NASA due to their high packing density (10^11 - 10^12 devices per square cm), low power consumption (no transfer of current), and potentially higher radiation tolerance. Under the Revolutionary Computing Technology (RCT) Program at the NASA/JPL Center for Integrated Space Microelectronics (CISM), we have been investigating the potential applications of QCA for the space program. To this end, exploiting the intrinsic features of QCA, we have designed novel QCA-based circuits for co-planar (i.e., single-layer) and compact implementation of a class of data permutation matrices, a class of interconnection networks, and a bit-serial processor. Building upon these circuits, we have developed novel algorithms and QCA…
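    QCA logic is built from majority voting: the fundamental three-input QCA gate outputs the majority of its inputs, and AND/OR gates follow by pinning one input to logic 0 or 1. The sketch below models that gate behavior only; it is not a simulation of quantum-dot cell physics or of the specific circuits described in the abstract.

```python
# Behavioral sketch of the fundamental QCA logic primitive: the 3-input
# majority gate M(a, b, c) = ab + bc + ca. Pinning one input yields AND/OR.
# This models gate logic only, not quantum-dot cell physics.

def majority(a: int, b: int, c: int) -> int:
    """Output 1 iff at least two of the three inputs are 1."""
    return 1 if a + b + c >= 2 else 0

def qca_and(a: int, b: int) -> int:
    return majority(a, b, 0)  # third input pinned to logic 0

def qca_or(a: int, b: int) -> int:
    return majority(a, b, 1)  # third input pinned to logic 1

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", qca_and(a, b), "OR:", qca_or(a, b))
```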

  12. Evidence based practice readiness: A concept analysis.

    Science.gov (United States)

    Schaefer, Jessica D; Welton, John M

    2018-01-15

    To analyse and define the concept "evidence based practice readiness" in nurses. Evidence based practice readiness is a term commonly used in the health literature, but without a clear understanding of what readiness means. Concept analysis is needed to define the meaning of evidence based practice readiness. A concept analysis was conducted using Walker and Avant's method to clarify the defining attributes of evidence based practice readiness as well as its antecedents and consequences. A Boolean search of PubMed and the Cumulative Index for Nursing and Allied Health Literature was conducted and limited to articles published after the year 2000. Eleven articles met the inclusion criteria for this analysis. Evidence based practice readiness incorporates both personal and organisational readiness. Antecedents include the ability to recognize the need for evidence based practice, the ability to access and interpret evidence based practice, and a supportive environment. The concept analysis demonstrates the complexity of the concept and its implications for nursing practice. The four pillars of evidence based practice readiness (nursing, training, equipping, and leadership support) are necessary to achieve evidence based practice readiness. Nurse managers are in a position to address all elements of evidence based practice readiness. Creating an environment that fosters evidence based practice can improve patient outcomes, decrease health care costs, increase nurses' job satisfaction, and decrease nursing turnover.

  13. A qualitative readiness-requirements assessment model for enterprise big-data infrastructure investment

    Science.gov (United States)

    Olama, Mohammed M.; McNair, Allen W.; Sukumar, Sreenivas R.; Nutaro, James J.

    2014-05-01

    In the last three decades, there has been exponential growth in the area of information technology, providing for the information processing needs of data-driven businesses in government, science, and private industry in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that will help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in a big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating the disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association's Data Management Body of Knowledge (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.
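    At its core, an ILM-style assessment rates a data system on each data-management function and aggregates the ratings into a readiness score per integration level. A hedged sketch of one such aggregation is below; the function names, weights, and maturity scale are invented for illustration and are not the paper's actual scoring rubric.

```python
# Hedged sketch of an ILM-style readiness score: rate each data-management
# function on a maturity scale, then take a weighted average. Function names,
# weights, and the scale are invented for illustration only.

MATURITY_SCALE = range(1, 6)  # 1 = ad hoc ... 5 = optimized

def readiness_score(ratings: dict[str, int], weights: dict[str, float]) -> float:
    """Weighted-average maturity across data-management functions."""
    assert all(r in MATURITY_SCALE for r in ratings.values())
    total_weight = sum(weights[f] for f in ratings)
    return sum(ratings[f] * weights[f] for f in ratings) / total_weight

ratings = {"data governance": 3, "data quality": 2,
           "metadata management": 4, "data security": 3}
weights = {"data governance": 0.4, "data quality": 0.3,
           "metadata management": 0.2, "data security": 0.1}
score = readiness_score(ratings, weights)
print(f"readiness = {score:.2f} / 5")  # higher -> ready for deeper integration
```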

  14. The concept of readiness to change.

    Science.gov (United States)

    Dalton, Cindy C; Gottlieb, Laurie N

    2003-04-01

    Readiness is associated with change, yet there is little understanding of this construct. The purpose of this study was to examine readiness; its referents, associated factors and the resulting consequences. In the course of nursing five clients living with multiple sclerosis over a 7-month period using a Reflective Practice Model, data were systematically gathered using open-ended and then more focused questioning. Data collected during 42 client encounters (28 face-to-face encounters; 14 telephone contacts) were analysed using Chinn and Kramer's concept analysis technique. Findings. The concept of readiness was inductively derived. Readiness is both a state and a process. Before clients can create change they need to become ready to change. A number of factors trigger readiness. These include when: (a) clients perceive that a health concern is not going to resolve, (b) a change in a client's physical condition takes on new significance, (c) clients feel better able to manage their stress, (d) clients have sufficient energy, (e) clients perceive that they have adequate support in undertaking change. When one or more of these factors is present clients become ready to consider change. The process of readiness involves recognizing the need to change, weighing the costs and benefits and, when benefits outweigh costs, planning for change. The desire to change and to take action determines clients' degree of readiness. When they experience a high degree of readiness they report less anger, less depression, and view their condition in a more positive light. In contrast, when they experience a low degree of readiness they report feeling depressed, afraid and vulnerable in the face of change. Nursing has an important role to play in creating conditions to support change. To fulfil this role, nurses need to be able to assess readiness for change and the factors that enable it and then to intervene in ways that facilitate readiness.

  15. New challenges for HEP computing: RHIC [Relativistic Heavy Ion Collider] and CEBAF [Continuous Electron Beam Accelerator Facility

    International Nuclear Information System (INIS)

    LeVine, M.J.

    1990-01-01

    We will look at two facilities: RHIC and CEBAF. CEBAF is in the construction phase; RHIC is about to begin construction. For each of them, we examine the kinds of physics measurements that motivated their construction, and the implications of these experiments for computing. Emphasis will be on on-line requirements, driven by the data rates produced by these experiments.

  16. Computers that negotiate on our behalf: Major challenges for self-sufficient, self-directed, and interdependent negotiating agents

    NARCIS (Netherlands)

    T. Baarslag (Tim); M. Kaisers (Michael); E.H. Gerding (Enrico); C.M. Jonker (Catholijn); J. Gratch (Jonathan)

    2017-01-01

    textabstractComputers that negotiate on our behalf hold great promise for the future and will even become indispensable in emerging application domains such as the smart grid, autonomous driving, and the Internet of Things. Much research has thus been expended to create agents that are able to

  17. Computational fluid dynamics-habitat suitability index (CFD-HSI) modelling as an exploratory tool for assessing passability of riverine migratory challenge zones for fish

    Science.gov (United States)

    Haro, Alexander J.; Chelminski, Michael; Dudley, Robert W.

    2015-01-01

    We developed two-dimensional computational fluid dynamics-habitat suitability index (CFD-HSI) models to identify and qualitatively assess potential zones of shallow water depth and high water velocity that may present passage challenges for five major anadromous fish species in a 2.63-km reach of the main stem Penobscot River, Maine, as a result of a dam removal downstream of the reach. Suitability parameters were based on the distribution of fish lengths and body depths, transformed to cruising, maximum sustained, and sprint swimming speeds. Zones of potential depth and velocity challenges were calculated based on the hydraulic models; the ability of fish to pass a challenge zone was based on the percent of river channel that the contiguous zone spanned and its maximum along-current length. Three river flows (low: 99.1 m³/s; normal: 344.9 m³/s; and high: 792.9 m³/s) were modelled to simulate existing hydraulic conditions and hydraulic conditions following removal of a dam at the downstream boundary of the reach. Potential depth challenge zones were nonexistent for all low-flow simulations of existing conditions for deeper-bodied fishes. Increasing flows under both scenarios increased the number and size of potential velocity-challenge zones, with the effects being more pronounced for smaller species. The two-dimensional CFD-HSI model has utility in demonstrating the gross effects of flow and hydraulic alteration, but may not be as precise a predictive tool as a three-dimensional model. Passability of the potential challenge zones cannot be precisely quantified for two-dimensional or three-dimensional models due to untested assumptions and incomplete data on fish swimming performance and behaviours.
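    The screening logic underlying a CFD-HSI model is a thresholding of hydraulic model output: a grid cell becomes a potential depth challenge when modeled depth drops below what a fish's body depth requires, and a potential velocity challenge when modeled velocity exceeds the species' swimming speed. A simplified sketch on a rectangular grid follows; the arrays and thresholds are illustrative, not values from the Penobscot model.

```python
# Simplified sketch of CFD-HSI challenge-zone screening: flag grid cells whose
# modeled depth is too shallow or velocity too fast for a given species.
# Thresholds and arrays are illustrative, not Penobscot model output.
import numpy as np

def challenge_zones(depth: np.ndarray, velocity: np.ndarray,
                    min_depth: float, max_sustained_speed: float):
    """Boolean masks of potential depth- and velocity-challenge cells."""
    wet = depth > 0.0                       # ignore dry cells
    depth_challenge = wet & (depth < min_depth)
    velocity_challenge = wet & (velocity > max_sustained_speed)
    return depth_challenge, velocity_challenge

# Toy 2x4 reach: rows = cross-channel, cols = along-current direction.
depth = np.array([[0.3, 0.8, 1.2, 1.5],
                  [0.2, 0.5, 1.0, 1.4]])      # metres
velocity = np.array([[2.6, 1.9, 1.1, 0.8],
                     [2.9, 2.2, 1.3, 0.9]])   # m/s
d_mask, v_mask = challenge_zones(depth, velocity,
                                 min_depth=0.4, max_sustained_speed=2.0)
# A zone spanning the full channel width would block passage outright;
# otherwise its along-current length determines whether fish can sprint past.
print(d_mask.any(axis=0))  # columns with a depth challenge somewhere
print(v_mask.all(axis=0))  # columns where velocity blocks the whole width
```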

  18. Determining registered nurses' readiness for evidence-based practice.

    Science.gov (United States)

    Thiel, Linda; Ghosh, Yashowanto

    2008-01-01

    As health care systems worldwide move toward instituting evidence-based practice (EBP), its implementation can be challenging. Conducting a baseline assessment to determine nurses' readiness for EBP presents opportunities to plan strategies before implementation. Although a growing body of research literature is focused on implementing EBP, little attention has been paid to assessing nurses' readiness for EBP. The purpose of this study was to assess registered nurses' readiness for EBP in a moderate-sized acute care hospital in the Midwestern United States before implementation of a hospital-wide nursing EBP initiative. A descriptive cross-sectional survey design was used; 121 registered nurses completed the survey. The participants (n = 121) completed the 64-item Nurses' Readiness for Evidence-Based Practice Survey that allowed measurement of information needs, knowledge and skills, culture, and attitudes. Data were analyzed using descriptive statistics and a post hoc analysis. The majority (72.5%) of respondents indicated that when they needed information, they consulted colleagues and peers rather than using journals and books; 24% of nurses surveyed used the health database, Cumulative Index to Nursing & Allied Health Literature (CINAHL). The respondents perceived their EBP knowledge level as moderate. Cultural EBP scores were moderate, with unit scores being higher than organizational scores. The nurses' attitudes toward EBP were positive. The post hoc analysis showed many significant correlations. Nurses have access to technological resources and perceive that they have the ability to engage in basic information gathering but not in higher level evidence gathering. The elements important to EBP such as a workplace culture and positive attitudes are present and can be built upon. A "site-specific" baseline assessment provides direction in planning EBP initiatives. The Nurses' Readiness for EBP Survey is a streamlined tool with established reliability and

  19. Readiness to change criminal women and men

    Directory of Open Access Journals (Sweden)

    Krzysztof Biel

    2017-12-01

    Full Text Available The readiness of offenders for social rehabilitation is a new category in our country. Meanwhile, research conducted in many countries indicates its usefulness in the diagnosis and selection of participants for rehabilitation programmes. This entails more effective interaction with convicted persons and greater responsibility on the part of convicted people for their own social rehabilitation process. The aim of this article is to present the main assumptions and models of readiness for change and their usefulness in social rehabilitation practice, and to present pilot studies of readiness for change among criminal women and men in Kraków. Application of the Polish adaptation of the CVTRQ questionnaire made it possible to determine the level of convicted persons' readiness, taking into account deficits in particular scales of the questionnaire and variables differentiating the groups of ready and not-ready people. At the end, guidelines for further research are presented.

  20. Librarian readiness for research partnerships.

    Science.gov (United States)

    Mazure, Emily S; Alpi, Kristine M

    2015-04-01

    This study investigated health sciences librarians' knowledge and skill-based readiness to partner on sponsored research involving human participants. The authors developed and deployed, at two time points, a web-based survey on nine indicators of research activities with response choices reflecting the transtheoretical model of stages of behavior change. Librarians with research experience or membership in the Medical Library Association Research Section reported higher levels of having completed indicators. Our results suggest that creating awareness in precontemplation responders could encourage skill development. Mentoring and continuing education could support librarians who are contemplating or preparing to perform indicator activities.

  1. NASA Technology Readiness Level Definitions

    Science.gov (United States)

    Mcnamara, Karen M.

    2012-01-01

    This presentation will cover the basic Technology Readiness Level (TRL) definitions used by the National Aeronautics and Space Administration (NASA) and their specific wording. We will discuss how they are used in the NASA Project Life Cycle and their effectiveness in practice. We'll also discuss the recent efforts by the International Standards Organization (ISO) to develop a broadly acceptable set of TRL definitions for the international space community and some of the issues brought to light. This information will provide input for further discussion of the use of the TRL scale in manufacturing.

  2. Readiness for banking technologies in developing countries

    African Journals Online (AJOL)

    Professor in the Department of Marketing Management, University of Johannesburg. ... From the organisation's perspective, it has been suggested ... technological readiness of developing countries' consumers, in an urban environment ...

  3. Organizational factors associated with readiness for change in residential aged care settings.

    Science.gov (United States)

    von Treuer, Kathryn; Karantzas, Gery; McCabe, Marita; Mellor, David; Konis, Anastasia; Davison, Tanya E; O'Connor, Daniel

    2018-02-01

    Organizational change is inevitable in any workplace. Previous research has shown that leadership and a number of organizational climate and contextual variables can affect the adoption of change initiatives. The effect of these workplace variables is particularly important in stressful work sectors such as aged care where employees work with challenging older clients who frequently exhibit dementia and depression. This study sought to examine the effect of organizational climate and leadership variables on organizational readiness for change across 21 residential aged care facilities. Staff from each facility (N = 255) completed a self-report measure assessing organizational factors including organizational climate, leadership and readiness for change. A hierarchical regression model revealed that the organizational climate variables of work pressure, innovation, and transformational leadership were predictive of employee perceptions of organizational readiness for change. These findings suggest that within aged care facilities an organization's capacity to change their organizational climate and leadership practices may enhance an organization's readiness for change.

  4. Preparing Canada's power systems for transition to the year 2000 : Y2K readiness assessment results for Canadian electric utility companies : first quarter 1999

    International Nuclear Information System (INIS)

    1999-01-01

    The effort made by Canadian electric utilities to minimize any power disruptions during the year 2000 (Y2K) transition is discussed and the state of readiness of the electric power industry with respect to the Y2K computer challenge is outlined. Canadian utilities started addressing Y2K issues several years ago, and today, reports show that every major electric utility in Canada is either on, or ahead of schedule to meet the industry established milestones for Y2K readiness. This report includes the assessment of all of Canada's large electric utilities, plus about 95 per cent of Canada's small distribution utilities. On average, the bulk electric utilities in Canada expect to be Y2K ready by mid-June 1999. This means that equipment and systems will operate properly for all dates including Y2K, or that there will be an operating strategy in place to mitigate the effects of any improper operations of equipment or systems. In terms of overall preparations for Y2K, Canada is ahead of the North American averages. Bulk electric utilities for non-nuclear generation are now 100 per cent complete in the inventory phase, 99 per cent complete in the assessment phase, and 91 per cent complete in the remediation/testing phase. For nuclear generation, completion rates are the same except for the remediation/testing phase which is 97 per cent complete. 1 tab., 21 figs

  5. Ready or Not...? Teen Sexuality and the Troubling Discourse of Readiness

    Science.gov (United States)

    Ashcraft, Catherine

    2006-01-01

    In this article, I explore how talk about being "ready" or "not ready" for sex shapes teen and adult understandings of sexuality. I argue that this "discourse of readiness" poses serious threats to teens' identity development, sexual decision making, and educators efforts to help them through these processes. To illustrate, I draw from my…

  6. The Staff Council, ready for the challenges of 2015

    CERN Document Server

    Staff Association

    2015-01-01

    In order to fulfil its mission of representing CERN staff with the Management and the Member States in an optimal way, the Staff Council relies on the work of a number of commissions, amongst them employment conditions, pensions, legal matters, social security, health and safety and InFormAction (training, information and action). All of these commissions have as a goal to try and improve the employment conditions of CERN members of personnel. This is the case in particular in the context of the five-yearly review process, ending in December 2015 (5YR 2015). Let us recall that the objective of a five-yearly review is to ensure that the financial and social conditions offered by the Organisation favour recruitment from all Member States, and to retain and motivate staff necessary for the fulfilment of its mission. The convenor of each Commission reports regularly to the Staff Council and Executive Committee on the work performed in their group. The commissions are open to all members of the Staff Associati...

  7. The challenge of raising ethical awareness: a case-based aiding system for use by computing and ICT students.

    Science.gov (United States)

    Sherratt, Don; Rogerson, Simon; Ben Fairweather, N

    2005-04-01

    Students, the future Information and Communication Technology (ICT) professionals, are often perceived to have little understanding of the ethical issues associated with the use of ICTs. There is a growing recognition that the moral issues associated with the use of the new technologies should be brought to the attention of students. Furthermore, they should be encouraged to explore and think more deeply about the social and legal consequences of the use of ICTs. This paper describes the development of a tool designed to raise students' awareness of the social impact of ICTs. The tool offers guidance to students undertaking computing and computer-related courses when considering the social, legal and professional implications of the actions of participants in situations of ethical conflict. However, unlike previous work in this field, this tool is not based on an artificial intelligence paradigm. Aspects of the theoretical basis for the design of the tool and the tool's practical development are discussed. Preliminary results from the testing of the tool are also discussed.

  8. Conquer the FPSO (Floating Production Storage and Off loading) separation challenge using CFD (Computational Fluid Dynamics) and laboratory experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kristoffersen, Astrid R.; Hannisdal, Andreas; Amarzguioui, Morad; Wood, Deborah; Andersen, Tor [Aibel, Stavanger (Norway)]

    2008-07-01

    To have the necessary confidence in a separator's performance, the design must be based on more than simple design rules. A combination of separation testing, computer modelling, and general knowledge of the process is needed. In addition, new technologies can provide enhanced overall performance when it is required. This paper describes how all of these techniques can be combined to get the most out of separator design. We will describe how Aibel has used Computational Fluid Dynamics (CFD), together with laboratory testing, multi-disciplinary knowledge and new technology, in order to revolutionize the way we design separators. This paper will present a study of separation performance for one of our customers. A CFD simulation was performed to predict the internal waves inside a separator located on an FPSO, and how these affect separation phenomena. The performance of the theoretical CFD model was verified by laboratory wave experiments. Separation tests were performed to test new solutions which could increase the performance of the process. Based on the CFD simulations and the separation tests, a modification of the separator was proposed. (author)

  9. Future Research Challenges for a Computer-Based Interpretative 3D Reconstruction of Cultural Heritage - A German Community's View

    Science.gov (United States)

    Münster, S.; Kuroczyński, P.; Pfarr-Harfst, M.; Grellert, M.; Lengyel, D.

    2015-08-01

    The workgroup for Digital Reconstruction of the Digital Humanities in the German-speaking area association (Digital Humanities im deutschsprachigen Raum e.V.) was founded in 2014 as a cross-disciplinary scientific society dealing with all aspects of digital reconstruction of cultural heritage and currently involves more than 40 German researchers. Moreover, the workgroup is dedicated to synchronising and fostering methodological research on these topics. As one preliminary result, a memorandum was created to name urgent research challenges and prospects in condensed form and to assemble a research agenda proposing demands for further research and development activities over the next years. The version presented within this paper was originally created as a contribution to the so-called agenda development process initiated by the German Federal Ministry of Education and Research (BMBF) in 2014 and has been amended during a joint meeting of the digital reconstruction workgroup in November 2014.

  10. In silico regenerative medicine: how computational tools allow regulatory and financial challenges to be addressed in a volatile market.

    Science.gov (United States)

    Geris, L; Guyot, Y; Schrooten, J; Papantoniou, I

    2016-04-06

    The cell therapy market is a highly volatile one, due to the use of disruptive technologies, the current economic situation and the small size of the market. In such a market, companies as well as academic research institutes are in need of tools to advance their understanding and, at the same time, reduce their R&D costs, increase product quality and productivity, and reduce the time to market. An additional difficulty is the regulatory path that needs to be followed, which is challenging in the case of cell-based therapeutic products and should rely on the implementation of quality by design (QbD) principles. In silico modelling is a tool that allows the above-mentioned challenges to be addressed in the field of regenerative medicine. This review discusses such in silico models and focuses more specifically on the bioprocess. Three (clusters of) examples related to this subject are discussed. The first example comes from the pharmaceutical engineering field where QbD principles and their implementation through the use of in silico models are both a regulatory and economic necessity. The second example is related to the production of red blood cells. The described in silico model is mainly used to investigate the manufacturing process of the cell-therapeutic product, and pays special attention to the economic viability of the process. Finally, we describe the set-up of a model capturing essential events in the development of a tissue-engineered combination product in the context of bone tissue engineering. For each of the examples, a short introduction to some economic aspects is given, followed by a description of the in silico tool or tools that have been developed to allow the implementation of QbD principles and optimal design.

  11. Jean Claude Risset’s Duet for One Pianist: Challenges of a Real-Time Performance Interaction with a Computer-Controlled Acoustic Piano 16 Years Later

    Directory of Open Access Journals (Sweden)

    Sofia Lourenço

    2014-12-01

    This study discusses the work Duet for One Pianist (1989) by the French composer Jean-Claude Risset (b. 13 March 1938), analyzing the challenges of performing this Computer-Aided Composition work for Disklavier, which implies Human-Computer Interaction in performance. I was extremely honored to perform the revised version of the 8 Sketches for One Pianist and Disklavier within a research project of CITAR, as well as a new Sketch, Reflections (2012), by Jean-Claude Risset, dedicated to me and given its world premiere at the closing ceremony of the Black&White 2012 Film Festival promoted by the Catholic University of Portugal. Several issues in the performance of this work are analysed as a case study from the point of view of the performer, particularly the components of expressive performance in a real-time interaction between performer and computer. These components can serve as analysis criteria for a piano interpretation, in this case of a pianist-and-Disklavier interpretation.

  12. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    International Nuclear Information System (INIS)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources become available to the HEP community. The new cloud technologies also come with new challenges, one of which is the contextualization of computing resources with regard to the requirements of the user and their experiment. In particular, on Google's cloud platform Google Compute Engine (GCE), upload of users' own virtual machine images is not possible. This precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.
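
    To make the contextualization step concrete, the following minimal Python sketch shows one way a worker VM could be created with a boot-time Puppet run. It assumes the gcloud CLI is installed and authenticated; the instance name, project, zone, and manifest path are hypothetical illustrations, not the authors' actual setup.

        # Sketch: contextualizing a GCE worker at boot with Puppet (hypothetical
        # names; the paper's actual manifests and project settings are not shown).
        import subprocess
        import tempfile

        # The startup script runs on first boot: install Puppet, then apply a
        # manifest that configures the node (experiment software, mounts, etc.).
        STARTUP_SCRIPT = """#!/bin/bash
        apt-get update && apt-get install -y puppet
        puppet apply /opt/contextualization/site.pp   # hypothetical manifest path
        """

        def create_worker(name: str, project: str, zone: str) -> None:
            """Create one contextualized worker VM via the gcloud CLI."""
            with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
                f.write(STARTUP_SCRIPT)
                script_path = f.name
            subprocess.run(
                ["gcloud", "compute", "instances", "create", name,
                 "--project", project, "--zone", zone,
                 "--metadata-from-file", f"startup-script={script_path}"],
                check=True,
            )

        if __name__ == "__main__":
            # Dynamic resource scaling would loop over N such workers.
            create_worker("atlas-worker-0", "my-hep-project", "us-central1-a")

    Keeping all node configuration in a manifest rather than baked into an image is what makes this approach portable to clouds that do not accept user-supplied images.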

  13. From Readiness to Action: How Motivation Works

    Directory of Open Access Journals (Sweden)

    Kruglanski Arie W.

    2014-09-01

    We present a new theoretical construct labeled motivational readiness. It is defined as the inclination, whether or not ultimately implemented, to satisfy a desire. A general model of readiness is described which builds on the work of prior theories, including animal learning models and personality approaches, and which aims to integrate a variety of research findings across different domains of motivational research. Components of this model include the Want state (that is, an individual's currently active desire) and the Expectancy of being able to satisfy that Want. We maintain that the Want concept is the critical ingredient in motivational readiness: without it, readiness cannot exist. In contrast, some motivational readiness can exist without Expectancy. We also discuss the role of incentive in motivational readiness. Incentive is presently conceived of in terms of a Match between a Want and a Perceived Situational Affordance. Whereas in classic models incentive was portrayed as a first-order determinant of motivational readiness, here we describe it as a second-order factor which affects readiness by influencing Want, Expectancy, or both. The new model's relation to its theoretical predecessors, and its implications for future research, also are discussed.

  14. Universal School Readiness Screening at Kindergarten Entry

    Science.gov (United States)

    Quirk, Matthew; Dowdy, Erin; Dever, Bridget; Carnazzo, Katherine; Bolton, Courtney

    2018-01-01

    Researchers examined the concurrent and predictive validity of a brief (12-item) teacher-rated school readiness screener, the Kindergarten Student Entrance Profile (KSEP), using receiver operating characteristic (ROC) curve analysis to examine associations of children's (N = 78) social-emotional (SE) and cognitive (COG) readiness with…
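
    For readers unfamiliar with the ROC methodology named above, a generic Python sketch of the analysis pattern (synthetic data, not the actual KSEP sample):

        # Generic ROC analysis of a screening score against a later outcome
        # (synthetic data; not the study's dataset).
        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(0)
        n = 78
        outcome = rng.integers(0, 2, size=n)               # 1 = later difficulty
        score = outcome + rng.normal(0.0, 1.2, size=n)     # screener total

        fpr, tpr, thresholds = roc_curve(outcome, score)
        print(f"AUC = {roc_auc_score(outcome, score):.2f}")

        # Youden's J picks the cut point maximizing sensitivity + specificity - 1.
        j = tpr - fpr
        print(f"suggested cut point: {thresholds[np.argmax(j)]:.2f}")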

  15. Overview: Texas College and Career Readiness Standards

    Science.gov (United States)

    Texas Higher Education Coordinating Board, 2009

    2009-01-01

    The Texas College and Career Readiness Standards define what students should know and be able to accomplish in order to succeed in entry-level college courses or skilled workforce opportunities upon graduation from high school. This paper answers the following questions: (1) Who developed the Texas College and Career Readiness Standards?; (2) What…

  16. Understanding Early Educators' Readiness to Change

    Science.gov (United States)

    Peterson, Shira M.

    2012-01-01

    Researchers in the fields of humanistic psychology, counseling, organizational change, and implementation science have been asking a question that is at the heart of today's early care and education quality improvement efforts: When it comes to changing one's behavior, what makes a person ready to change? Although the concept of readiness to…

  17. Measuring the strategic readiness of intangible assets.

    Science.gov (United States)

    Kaplan, Robert S; Norton, David P

    2004-02-01

    Measuring the value of intangible assets such as company culture, knowledge management systems, and employees' skills is the holy grail of accounting. Executives know that these intangibles, being hard to imitate, are powerful sources of sustainable competitive advantage. If managers could measure them, they could manage the company's competitive position more easily and accurately. In one sense, the challenge is impossible. Intangible assets are unlike financial and physical resources in that their value depends on how well they serve the organizations that own them. But while this prevents an independent valuation of intangible assets, it also points to an altogether different approach for assessing their worth. In this article, the creators of the Balanced Scorecard draw on its tools and framework--in particular, a tool called the strategy map--to present a step-by-step way to determine "strategic readiness," which refers to the alignment of an organization's human, information, and organization capital with its strategy. In the method the authors describe, the firm identifies the processes most critical to creating and delivering its value proposition and determines the human, information, and organization capital the processes require. Some managers shy away from measuring intangible assets because they seem so subjective. But by using the systematic approaches set out in this article, companies can now measure what they want, rather than wanting only what they can currently measure.

  18. The WIPP transportation system: Demonstrated readiness

    International Nuclear Information System (INIS)

    Ward, T.R.; Spooner, R.

    1991-01-01

    The Department of Energy (DOE) has developed an integrated transportation system to transport transuranic (TRU) waste from ten widely-dispersed generator sites to the Waste Isolation Pilot Plant (WIPP). The system consists of a Type B container, a specially-designed trailer, a lightweight tractor, the DOE "TRANSCOM" vehicle tracking system, and uniquely qualified and highly-trained drivers. In June of 1989, the National Academy of Sciences reviewed the transportation system and concluded that: "The system proposed for transportation of TRU waste to WIPP is safer than that employed for any other hazardous material in the United States today and will reduce risk to very low levels" (emphasis added). The next challenge facing the DOE was demonstrating that this system was ready to transport the TRU waste to the WIPP site efficiently and in the safest manner possible. Not only did the DOE feel that it was necessary to convince itself that the system was safe, but also representatives of the 20 states through which it would travel

  19. The WIPP transportation system: Demonstrated readiness

    International Nuclear Information System (INIS)

    Ward, T.R.; Spooner, R.

    1991-01-01

    The Department of Energy (DOE) has developed an integrated transportation system to transport transuranic (TRU) waste from ten widely-dispersed generator sites to the Waste Isolation Pilot Plant (WIPP). The system consists of a Type B container, a specially-designed trailer, a lightweight tractor, the DOE "TRANSCOM" vehicle tracking system, and uniquely qualified and highly-trained drivers. In June of 1989, the National Academy of Sciences reviewed the transportation system and concluded that: "The system proposed for transportation of TRU waste to WIPP is safer than that employed for any other hazardous material in the United States today and will reduce risk to very low levels." The next challenge facing the DOE was demonstrating that this system was ready to transport the TRU waste to the WIPP site in the safest manner possible. Not only did the DOE feel that it was necessary to convince itself that the system was safe, but also representatives of the 23 states through which it traveled

  20. Differences in Readiness between Rural Hospitals and Primary Care Providers for Telemedicine Adoption and Implementation: Findings from a Statewide Telemedicine Survey

    Science.gov (United States)

    Martin, Amy Brock; Probst, Janice C.; Shah, Kyle; Chen, Zhimin; Garr, David

    2012-01-01

    Purpose: Published advantages of and challenges with telemedicine led us to examine the scope of telemedicine adoption, implementation readiness, and barriers in a southern state where adoption has been historically low. We hypothesized that rural hospitals and primary care providers (RPCPs) differ on adoption, readiness, and implementation…

  1. Lightning Arrestor Connectors Production Readiness

    Energy Technology Data Exchange (ETDEWEB)

    Marten, Steve; Linder, Kim; Emmons, Jim; Gomez, Antonio; Hasam, Dawud; Maurer, Michelle

    2008-10-20

    The Lightning Arrestor Connector (LAC), part “M”, presented opportunities to improve the processes used to fabricate LACs. The A## LACs were the first production LACs produced at the KCP after the product was transferred from Pinellas. The new LAC relied on the lessons learned from the A## LACs; however, additional improvements were needed to meet budget, yield, and schedule requirements. Improvement projects completed since 2001 include Hermetic Connector Sealing Improvement, Contact Assembly Molding Improvement, development of a second vendor for LAC shells, general process improvement, tooling improvement, reduction of the LAC production cycle time, and documentation of the LAC granule fabrication process. This report summarizes the accomplishments achieved in improving LAC production readiness.

  2. Teenage employment and career readiness.

    Science.gov (United States)

    Greene, Kaylin M; Staff, Jeremy

    2012-01-01

    Most American youth hold a job at some point during adolescence, but should they work? This article presents a broad overview of teenage employment in the United States. It begins by describing which teenagers work and for how long and then focuses attention on the consequences (both good and bad) of paid work in adolescence. It then presents recent nationally representative data from the Monitoring the Future Study suggesting that limited hours of paid work do not crowd out developmentally appropriate after-school activities. A review of the literature also supports the idea that employment for limited hours in good jobs can promote career readiness and positive development. The article concludes with a discussion of the implications of youth work for practitioners and policymakers who are delivering career-related programming. Copyright © 2012 Wiley Periodicals, Inc., A Wiley Company.

  3. Systems security and functional readiness

    International Nuclear Information System (INIS)

    Bruckner, D.G.

    1988-01-01

    In Protective Programming Planning, it is important that every facility or installation be configured to support the basic functions and mission of the using organization. This paper addresses the process of identifying the key functional operations of our facilities in Europe and providing the security necessary to keep them operating in natural and man-made threat environments. Functional Readiness is important since many of our existing facilities in Europe were not constructed to meet the demands of today's requirements. There are increased requirements for real-time systems with classified terminals, stringent access control, TEMPEST protection, and other electronic protection devices. One must prioritize the operations of these systems so that essential functions are provided even when the facilities are affected by overt or covert hostile activities

  4. MIBS breadboard ready for testing

    Science.gov (United States)

    Leijtens, Johan; de Goeij, Bryan; Boslooper, Erik

    2017-11-01

    MIBS is a spectrometer operating in the thermal infrared wavelength region, designed in the frame of the phase A study for the ESA EarthCARE mission as part of the multispectral imaging instrument MSI. It uses a 2D microbolometer array detector instead of the more common MCT detectors. Utilization of a microbolometer and use of an integrated calibration system result in a sensor with a size and mass reduction of at least an order of magnitude compared to currently flying instruments with similar spectral resolution. In order to demonstrate feasibility, a breadboard has been designed, which will be built and aligned in 2006 and will be ready for testing in the fourth quarter of 2006.

  5. Defense Treaty Inspection Readiness Program

    International Nuclear Information System (INIS)

    Cronin, J.J.; Kohen, M.D.; Rivers, J.D.

    1996-01-01

    The Defense Treaty Inspection Readiness Program (DTIRP) was established by the Department of Defense in 1990 to assist defense facilities in preparing for treaty verification activities. Led by the On-Site Inspection Agency (OSIA), an element of the Department of Defense, DTIRP's membership includes representatives from other Department of Defense agencies, the Department of Energy (DOE), the Central Intelligence Agency, the Federal Bureau of Investigation, the Department of Commerce, and others. The Office of Safeguards and Security has a significant interest in this program, due to the number of national defense facilities within its purview that are candidates for future inspections. As a result, the Office of Safeguards and Security has taken a very active role in DTIRP. This paper discusses the Office of Safeguards and Security's increasing involvement in various elements of the DTIRP, ranging from facility assessments to training development and implementation

  6. Utility shopping: are consumers ready?

    International Nuclear Information System (INIS)

    Barrados, A.

    1999-01-01

    This report provides an overview of public readiness to deal with deregulation of the electric power industry, based on an analysis of public reaction to the deregulation of the transportation, telecommunications and natural gas industries, which have already taken place. The report also examines the reasons why residential consumers have reason to be wary of deregulation. These include the likelihood of slow development of the intended competition, the consequent limits on consumer choices, the possibility of increased prices, decreased quality of service, and erosion of social values such as affordability and accessibility. The report concludes with a number of recommendations aimed at ensuring that workable competition exists for residential consumers, that reliable and meaningful information is available as competition in deregulated markets gets underway, that independent sources of information are widely available, and that basic consumer protection against deceptive and borderline marketing practices, a regulatory oversight mechanism, and public reporting mechanisms are in place before competition begins. 33 refs

  7. Design and preliminary evaluation of the FINGER rehabilitation robot: controlling challenge and quantifying finger individuation during musical computer game play.

    Science.gov (United States)

    Taheri, Hossein; Rowe, Justin B; Gardner, David; Chan, Vicki; Gray, Kyle; Bower, Curtis; Reinkensmeyer, David J; Wolbrecht, Eric T

    2014-02-04

    This paper describes the design and preliminary testing of FINGER (Finger Individuating Grasp Exercise Robot), a device for assisting in finger rehabilitation after neurologic injury. We developed FINGER to assist stroke patients in moving their fingers individually in a naturalistic curling motion while playing a game similar to Guitar Hero. The goal was to make FINGER capable of assisting with motions where precise timing is important. FINGER consists of a pair of stacked single degree-of-freedom 8-bar mechanisms, one for the index and one for the middle finger. Each 8-bar mechanism was designed to control the angle and position of the proximal phalanx and the position of the middle phalanx. Target positions for the mechanism optimization were determined from trajectory data collected from 7 healthy subjects using color-based motion capture. The resulting robotic device was built to accommodate multiple finger sizes and finger-to-finger widths. For initial evaluation, we asked individuals with a stroke (n = 16) and without impairment (n = 4) to play a game similar to Guitar Hero while connected to FINGER. Precision design, low friction bearings, and separate high speed linear actuators allowed FINGER to individually actuate the fingers with a high bandwidth of control (-3 dB at approximately 8 Hz). During the tests, we were able to modulate the subject's success rate at the game by automatically adjusting the controller gains of FINGER. We also used FINGER to measure subjects' effort and finger individuation while playing the game. Test results demonstrate the ability of FINGER to motivate subjects with an engaging game environment that challenges individuated control of the fingers, automatically control assistance levels, and quantify finger individuation after stroke.
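
    The gain-adaptation loop itself is not specified in the abstract; the Python sketch below illustrates one plausible success-rate-based scheme (illustrative only, not the authors' published FINGER control law).

        # Success-rate-adaptive assistance; a sketch, not the published controller.
        def update_gain(gain: float, success_rate: float, target: float = 0.7,
                        step: float = 0.05, lo: float = 0.0, hi: float = 1.0) -> float:
            """Raise robot assistance when the player falls below the target
            success rate; lower it when play is too easy."""
            gain += step if success_rate < target else -step
            return min(hi, max(lo, gain))

        # Example: adapting once per song section from hit/miss counts.
        gain = 0.5
        for hits, notes in [(12, 20), (17, 20), (20, 20)]:
            gain = update_gain(gain, hits / notes)
            print(f"success {hits}/{notes} -> assistance gain {gain:.2f}")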

  8. Solar Sail Propulsion Technology Readiness Level Database

    Science.gov (United States)

    Adams, Charles L.

    2004-01-01

    The NASA In-Space Propulsion Technology (ISPT) Projects Office has been sponsoring 2 solar sail system design and development hardware demonstration activities over the past 20 months. Able Engineering Company (AEC) of Goleta, CA, is leading one team and L Garde, Inc. of Tustin, CA, is leading the other team. Component, subsystem and system fabrication and testing have been completed successfully. The goal of these activities is to advance the technology readiness level (TRL) of solar sail propulsion from 3 towards 6 by 2006. These activities will culminate in the deployment and testing of 20-meter solar sail system ground demonstration hardware in the 30 meter diameter thermal-vacuum chamber at NASA Glenn Plum Brook in 2005. This paper will describe the features of a computer database system that documents the results of the solar sail development activities to date. Illustrations of the hardware components and systems, test results, analytical models, relevant space environment definition and current TRL assessment, as stored and manipulated within the database, are presented. This database could serve as a central repository for all data related to the advancement of solar sail technology sponsored by the ISPT, providing an up-to-date assessment of the TRL of this technology. Current plans are to eventually make the database available to the Solar Sail community through the Space Transportation Information Network (STIN).
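
    The database design itself is not described in detail; a minimal Python/SQLite sketch suggests the kind of schema such a TRL repository implies (hypothetical tables and fields, not the actual ISPT design):

        # Minimal sketch of a TRL-tracking database of the kind described
        # (hypothetical schema; the real ISPT database design is not shown here).
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE component (
            id INTEGER PRIMARY KEY,
            team TEXT NOT NULL,          -- e.g., 'AEC' or 'L Garde'
            name TEXT NOT NULL
        );
        CREATE TABLE trl_assessment (
            component_id INTEGER REFERENCES component(id),
            assessed_on TEXT NOT NULL,   -- ISO date
            trl INTEGER CHECK (trl BETWEEN 1 AND 9),
            evidence TEXT                -- test report, analysis model, etc.
        );
        """)
        conn.execute("INSERT INTO component (team, name) VALUES ('AEC', '20-m sail system')")
        conn.execute("INSERT INTO trl_assessment VALUES (1, '2004-06-01', 4, 'vacuum deployment test')")
        (trl,) = conn.execute(
            "SELECT MAX(trl) FROM trl_assessment WHERE component_id = 1").fetchone()
        print("current TRL:", trl)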

  9. Copernicus POD Service: Ready for Sentinel-3

    Science.gov (United States)

    Peter, H.; Fernández, J.; Escobar, D.; Féménias, P.; Flohrer, C.; Otten, M.

    2015-12-01

    The Copernicus POD Service is part of the Copernicus PDGS Ground Segment of the Sentinel missions. A GMV-led consortium is operating the Copernicus POD Service, being in charge of generating precise orbital products and auxiliary data files for their use as part of the processing chains of the respective Sentinel PDGS. The Sentinel-1, -2 & -3 missions have different but very demanding requirements in terms of orbital accuracy and timeliness. Orbital products in Near Real Time (latency: 30 min), Short Time Critical (1.5 days) and Non-Time Critical (20-30 days) are required. The accuracy requirements are very challenging, targeting 5 cm in 3D for Sentinel-1 and 2-3 cm in the radial direction for Sentinel-3. Sentinel-3A carries, in addition to a GPS receiver, a laser retroreflector and a DORIS receiver. On the one hand, the three different techniques GPS, SLR and DORIS make POD more complex; on the other hand, it is very helpful to have independent techniques available for validation of the orbit results. The successful POD processing for Sentinel-1A is good preparation for Sentinel-3A due to the similarly demanding orbit accuracy requirements. The Copernicus POD Service is ready for Sentinel-3A: the service will process GPS and SLR data routinely and has the capacity to process DORIS in NTC and reprocessing campaigns. The three independent orbit determination techniques on Sentinel-3 offer great potential for scientific exploitation. Carrying all three techniques together makes the satellite very useful, e.g., for combining the techniques at the observation level, as could until now only be done for Jason-2. The Sentinel POD Quality Working Group, strongly supporting the CPOD Service, delivers additional orbit solutions to validate the CPOD results independently. The recommendations from this body guarantee that the CPOD Service is updated following state-of-the-art algorithms, models and conventions. The QWG also focuses on the scientific exploitation of the

  10. Implementing a Zero Energy Ready Home Multifamily Project

    Energy Technology Data Exchange (ETDEWEB)

    Springer, David [Alliance for Residential Building Innovation, Davis, CA (United States); German, Alea [Alliance for Residential Building Innovation, Davis, CA (United States)

    2015-08-01

    An objective of this project was to gain a highly visible foothold for residential buildings built to the U.S. Department of Energy's Zero Energy Ready Home (ZERH) specification that can be used to encourage participation by other California builders. This report briefly describes two single-family homes that were ZERH-certified, and focuses on the experience of working with the developer Mutual Housing on a 62-unit multi-family community at the Spring Lake subdivision in Woodland, CA. The Spring Lake project is expected to be the first ZERH-certified multi-family project nationwide. This report discusses challenges encountered, lessons learned, and how obstacles were overcome.

  11. Employing Inquiry-Based Computer Simulations and Embedded Scientist Videos To Teach Challenging Climate Change and Nature of Science Concepts

    Science.gov (United States)

    Cohen, E.

    2013-12-01

    Design-based research was utilized to investigate how students use a greenhouse effect simulation, in order to derive best learning practices. During this process, students recognized the authentic scientific process involving computer simulations. The simulation used is embedded within an inquiry-based, technology-mediated science curriculum known as the Web-based Inquiry Science Environment (WISE). For this research, students from a suburban, diverse middle school setting used the simulation as part of a two-week-long class unit on climate change. A pilot study was conducted during phase one of the research that informed phase two, which encompasses the dissertation. During the pilot study, as students worked through the simulation, evidence of shifts in student motivation, understanding of science content, and ideas about the nature of science emerged from a combination of student interviews, focus groups, and students' conversations. Outcomes of the pilot study included improvements to the pedagogical approach: allowing students to do 'Extreme Testing' (e.g., making the world as hot or cold as possible) and increasing the time for free exploration of the simulation. In the dissertation (phase two of the research design) these findings were implemented in a new curriculum scaled for 85 new students from the same school during the next school year. The modifications included new components implementing simulations as an assessment tool for all students and embedded modeling tools. All students were asked to build pre- and post-models; however, due to technological constraints these were not an effective tool. A non-video group of 44 students was established, and another group of 41 video students had a WISE curriculum which included twelve minutes of scientists' conversational videos referencing explicit aspects of the nature of science, specifically the use of models and simulations in science
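
    To give a sense of the physics such greenhouse simulations embody, here is a zero-dimensional energy-balance calculation in Python (a standard textbook model, not the WISE simulation's code):

        # Zero-dimensional energy-balance model of the greenhouse effect.
        SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
        S0 = 1361.0       # solar constant, W m^-2
        ALBEDO = 0.3      # planetary albedo

        def surface_temp(emissivity: float) -> float:
            """Equilibrium surface temperature (K) under a single absorbing
            atmospheric layer of the given longwave emissivity."""
            absorbed = S0 * (1 - ALBEDO) / 4
            # The layer re-radiates half of its emission back downward:
            return (absorbed / (SIGMA * (1 - emissivity / 2))) ** 0.25

        print(f"no greenhouse    : {surface_temp(0.0):.0f} K")    # about 255 K
        print(f"strong greenhouse: {surface_temp(0.78):.0f} K")   # about 288 K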

  12. Are they ready? Organizational readiness for change among clinical teaching teams.

    Science.gov (United States)

    Bank, Lindsay; Jippes, Mariëlle; Leppink, Jimmie; Scherpbier, Albert Jja; den Rooyen, Corry; van Luijk, Scheltus J; Scheele, Fedde

    2017-01-01

    Curriculum change and innovation are inevitable parts of progress in postgraduate medical education (PGME). Although implementing change is known to be challenging, change management principles are rarely looked at for support. Change experts contend that organizational readiness for change (ORC) is a critical precursor for the successful implementation of change initiatives. Therefore, this study explores whether assessing ORC in clinical teaching teams could help to understand how curriculum change takes place in PGME. Clinical teaching teams in hospitals in the Netherlands were requested to complete the Specialty Training's Organizational Readiness for curriculum Change, a questionnaire to measure ORC in clinical teaching teams. In addition, change-related behavior was measured by using the "behavioral support-for-change" measure. A two-way analysis of variance was performed for all response variables of interest. In total, 836 clinical teaching team members were included in this study: 288 (34.4%) trainees, 307 (36.7%) clinical staff members, and 241 (28.8%) program directors. Overall, items regarding whether the program director has the authority to lead scored higher compared with the other items. At the other end, the subscales "management support and leadership," "project resources," and "implementation plan" had the lowest scores in all groups. The study brought to light that program directors are clearly in the lead when it comes to the implementation of educational innovation. Clinical teaching teams tend to work together as a team, sharing responsibilities in the implementation process. However, the results also reinforce the need for change management support in change processes in PGME.
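
    The two-way analysis of variance mentioned above follows a standard pattern; a generic Python sketch with synthetic data (not the STORC dataset from the study) is shown below.

        # Generic two-way ANOVA of a readiness score by respondent role and setting.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        rng = np.random.default_rng(1)
        n = 300
        df = pd.DataFrame({
            "role": rng.choice(["trainee", "staff", "director"], size=n),
            "setting": rng.choice(["academic", "general"], size=n),
        })
        # Synthetic effect: directors score slightly higher on the readiness scale.
        df["orc_score"] = 3.5 + 0.4 * (df["role"] == "director") + rng.normal(0, 0.6, n)

        model = smf.ols("orc_score ~ C(role) * C(setting)", data=df).fit()
        print(anova_lm(model, typ=2))   # main effects and their interaction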

  13. Readiness of communities to engage with childhood obesity prevention initiatives in disadvantaged areas of Victoria, Australia.

    Science.gov (United States)

    Cyril, Sheila; Polonsky, Michael; Green, Julie; Agho, Kingsley; Renzaho, Andre

    2017-07-01

    Objective: Disadvantaged communities bear a disproportionate burden of childhood obesity and show low participation in childhood obesity prevention initiatives. This study aims to examine the level of readiness of disadvantaged communities to engage with childhood obesity prevention initiatives. Methods: Using the community readiness model, 95 semi-structured interviews were conducted among communities in four disadvantaged areas of Victoria, Australia. Community readiness analysis and paired t-tests were performed to assess the readiness levels of disadvantaged communities to engage with childhood obesity prevention initiatives. Results: The results showed that disadvantaged communities demonstrated low levels of readiness (readiness score = 4/9, 44%) to engage with the existing childhood obesity prevention initiatives, lacked knowledge of childhood obesity and its prevention, and reported facing challenges in initiating and sustaining participation in obesity prevention initiatives. Conclusion: This study highlights the need to improve community readiness by addressing low obesity-related literacy levels among disadvantaged communities and by facilitating the capacity-building of bicultural workers to deliver obesity prevention messages to these communities. Integrating these needs into existing Australian health policy and practice is of paramount importance for reducing obesity-related disparities currently prevailing in Australia. What is known about the topic? Childhood obesity prevalence is plateauing in developed countries including Australia; however, obesity-related inequalities continue to exist in Australia, especially among communities living in disadvantaged areas, which experience poor engagement in childhood obesity prevention initiatives. Studies in the USA have found that assessing disadvantaged communities' readiness to participate in health programs is a critical initial step in reducing the disproportionate obesity burden among these communities

  14. Readiness for hospital discharge: A concept analysis.

    Science.gov (United States)

    Galvin, Eileen Catherine; Wills, Teresa; Coffey, Alice

    2017-11-01

    To report on an analysis of the concept of 'readiness for hospital discharge'. No uniform operational definition of 'readiness for hospital discharge' exists in the literature; therefore, a concept analysis is required to clarify the concept and identify an up-to-date understanding of readiness for hospital discharge. Clarifying the concept will identify all uses of the concept and provide conceptual clarity, an operational definition, and direction for further research. Literature review and concept analysis. A review of literature was conducted in 2016. Databases searched were: Academic Search Complete, CINAHL Plus with Full Text, PsycARTICLES, Psychology and Behavioural Sciences Collection, PsycINFO, Social Sciences Full Text (H.W. Wilson) and SocINDEX with Full Text. No date limits were applied. Identification of the attributes, antecedents and consequences of readiness for hospital discharge led to an operational definition of the concept. The following attributes belonging to 'readiness for hospital discharge' were extracted from the literature: physical stability, adequate support, psychological ability, and adequate information and knowledge. This analysis contributes to the advancement of knowledge in the area of hospital discharge, by proposing an operational definition of readiness for hospital discharge, derived from the literature. A better understanding of the phenomenon will assist healthcare professionals to recognize, measure and implement interventions where necessary, to ensure patients are ready for hospital discharge and assist in the advancement of knowledge for all professionals involved in patient discharge from hospital. © 2017 John Wiley & Sons Ltd.

  15. Maintenance-Ready Web Application Development

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2016-01-01

    The current paper tackles the subject of developing maintenance-ready web applications. Maintenance is presented as a core stage in a web application's lifecycle. The concept of maintenance-readiness is defined in the context of web application development. Types of web application maintenance tasks are enumerated, and suitable task types are identified for further analysis. The research hypothesis is formulated based on a direct link between tackling maintenance in the development stage and reducing overall maintenance costs. A live maintenance-ready web application is presented and maintenance-related aspects are highlighted. The web application features that render it maintenance-ready are emphasized. The costs of designing and building the web application to be maintenance-ready are disclosed. The savings in maintenance development effort facilitated by maintenance-ready features are also disclosed. Maintenance data were collected from 40 projects implemented by a web development company. The homogeneity and diversity of the collected data are evaluated. A data sample is presented, and the size and comprehensive nature of the entire dataset are depicted. The research hypothesis is validated and conclusions are formulated on the topic of developing maintenance-ready web applications. The limits of the research process which formed the basis for the current paper are enunciated. Future research topics are submitted for debate.

  16. Achieving Business Excellence by Optimizing Corporate Forensic Readiness

    Directory of Open Access Journals (Sweden)

    Gojko Grubor

    2017-02-01

    In order to improve their business excellence, all organizations, regardless of their size (small, medium or large), should manage their risk of fraud. Fraud, in today's world, is often committed by using computers and can only be revealed by a digital forensic investigator. Not even small or medium-sized companies are secure from fraud. In light of recent financial scandals that demolished not just the economies of specific countries but the entire world economy, we propose in this paper an optimal model of corporate computer incident digital forensic investigation (CCIDFI), using an adapted mathematical multi-criteria decision-making (MCDM) model and the Expert Choice software tool for multi-criteria optimization of CCIDFI readiness. The proposed model can, first of all, help managers of small and medium-sized companies to justify their decisions to employ digital forensic investigators and include them in their information security teams, in order to choose the optimal CCIDFI model and improve forensic readiness in the computer incident management process. This will result in the minimization of the company's potential future losses and improve its business quality.
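
    Expert Choice implements AHP-style pairwise comparison; the following Python sketch shows how priority weights and a consistency check are typically derived from such a matrix (illustrative judgments, not the paper's actual comparison data):

        # Deriving criterion weights from a pairwise-comparison matrix, as done in
        # AHP-style MCDM tools such as Expert Choice.
        import numpy as np

        # A[i, j] = how much more important criterion i is than criterion j
        # on the 1-9 Saaty scale; the matrix is reciprocal by construction.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                # principal-eigenvector priorities

        # Consistency ratio guards against contradictory judgments (RI = 0.58 for n = 3).
        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)
        print("weights:", np.round(weights, 3), " CR:", round(ci / 0.58, 3))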

  17. The clients’ readiness to use mental health care services: Experiences and perceptions from Iranian context

    Science.gov (United States)

    Alavi, Mousa; Irajpour, Alireza

    2013-01-01

    Background: Underutilization of mental health care services has been a challenge for health care providers for many years. This challenge could be met in part by improving the clients’ readiness to use such services. This study aimed to introduce the important aspects of the clients’ readiness to use mental health services in the Iranian context. Materials and Methods: A thematic analysis of in-depth interviews was undertaken using a constant comparative approach. Participants (11 health professionals consisting of 3 physicians, 7 nurses, 1 psychologist, and 5 patients/their family members) were recruited from educational hospitals affiliated with Isfahan University of Medical Sciences, Iran. The credibility and trustworthiness were grounded on four aspects: factual value, applicability, consistency, and neutrality. Results: The study findings uncovered two important aspects of the clients’ readiness for utilizing mental health care services. These are described through two themes and related sub-themes: “The clients’ awareness” implies the cognitive aspect of readiness and “the clients’ attitudes” implies the psychological aspect of readiness, both of which are perceived to cultivate a fertile context through which the clients could access and use the mental health services more easily. Conclusions: For the health care system in Isfahan, Iran to be successful in delivering mental health services, training programs directed to prepare service users should be considered. Improving the clients’ favorable attitudes and awareness should be considered. PMID:24554948

  18. The clients' readiness to use mental health care services: Experiences and perceptions from Iranian context.

    Science.gov (United States)

    Alavi, Mousa; Irajpour, Alireza

    2013-11-01

    Underutilization of mental health care services has been a challenge for health care providers for many years. This challenge could be met in part by improving the clients' readiness to use such services. This study aimed to introduce the important aspects of the clients' readiness to use mental health services in the Iranian context. A thematic analysis of in-depth interviews was undertaken using a constant comparative approach. Participants (11 health professionals consisting of 3 physicians, 7 nurses, 1 psychologist, and 5 patients/their family members) were recruited from educational hospitals affiliated with Isfahan University of Medical Sciences, Iran. The credibility and trustworthiness were grounded on four aspects: factual value, applicability, consistency, and neutrality. The study findings uncovered two important aspects of the clients' readiness for utilizing mental health care services. These are described through two themes and related sub-themes: "The clients' awareness" implies the cognitive aspect of readiness and "the clients' attitudes" implies the psychological aspect of readiness, both of which are perceived to cultivate a fertile context through which the clients could access and use the mental health services more easily. For the health care system in Isfahan, Iran to be successful in delivering mental health services, training programs directed to prepare service users should be considered. Improving the clients' favorable attitudes and awareness should be considered.

  19. A theory of organizational readiness for change

    Directory of Open Access Journals (Sweden)

    Weiner Bryan J

    2009-10-01

    Background: Change management experts have emphasized the importance of establishing organizational readiness for change and recommended various strategies for creating it. Although the advice seems reasonable, the scientific basis for it is limited. Unlike individual readiness for change, organizational readiness for change has not been subject to extensive theoretical development or empirical study. In this article, I conceptually define organizational readiness for change and develop a theory of its determinants and outcomes. I focus on the organizational level of analysis because many promising approaches to improving healthcare delivery entail collective behavior change in the form of systems redesign--that is, multiple, simultaneous changes in staffing, work flow, decision making, communication, and reward systems. Discussion: Organizational readiness for change is a multi-level, multi-faceted construct. As an organization-level construct, readiness for change refers to organizational members' shared resolve to implement a change (change commitment) and shared belief in their collective capability to do so (change efficacy). Organizational readiness for change varies as a function of how much organizational members value the change and how favorably they appraise three key determinants of implementation capability: task demands, resource availability, and situational factors. When organizational readiness for change is high, organizational members are more likely to initiate change, exert greater effort, exhibit greater persistence, and display more cooperative behavior. The result is more effective implementation. Summary: The theory described in this article treats organizational readiness as a shared psychological state in which organizational members feel committed to implementing an organizational change and confident in their collective abilities to do so. This way of thinking about organizational readiness is best suited for

  20. Uncovering University Students' Readiness through Their Assessment of Workplace Communication Skills

    Science.gov (United States)

    Magogwe, Joel M.; Nkosana, Leonard B. M.; Ntereke, Beauty B.

    2014-01-01

    Employers in today's competitive and challenging global world prefer employees who possess "soft skills" in addition to "hard skills" because they make an impact and create a good impression in the workplace. This study examined employment readiness of the University of Botswana (UB) students who took the Advanced Communication…

  1. Lessons Learned about Instruction from Inclusion of Students with Disabilities in College and Career Ready Assessments

    Science.gov (United States)

    Heritage, Margaret; Lazarus, Sheryl S.

    2016-01-01

    The new large-scale assessments rolled out by consortia and states are designed to measure student achievement of rigorous college- and career-ready (CCR) standards. Recent surveys of teachers in several states indicate that students with disabilities like many features of the new assessments, but that there also are challenges. This Brief was…

  2. En sus marcas--Listos--A leer! Para los cuidadores de ninos pequenos: Actividades de lenguaje para la primera infancia y ninez entre el nacimiento y los 5 anos. El reto: A leer, America! (Ready--Set--Read! For Caregivers: Early Childhood Language Activities for Children from Birth through Age Five. America Reads Challenge).

    Science.gov (United States)

    Department of Education, Washington, DC.

    This Ready--Set--Read Kit includes an activity guide for caregivers, a 1997-98 early childhood activity calendar, and an early childhood growth chart. The activity guide presents activities and ideas that caregivers (family child care providers and the teachers, staff, and volunteers in child development programs) can use to help young children…

  3. En sus marcas--Listos--A leer! Para las familias: Actividades de lenguaje para la primera infancia y ninez entre el nacimiento y los 5 anos. El reto: A leer, America! (Ready--Set--Read! For Families: Early Childhood Language Activities for Children from Birth through Age Five. America Reads Challenge).

    Science.gov (United States)

    Department of Education, Washington, DC.

    This Ready--Set--Read Kit includes an activity guide for families, a 1997-98 early childhood activity calendar, and an early childhood growth wallchart. The activity guide presents activities and ideas that families (adults who have nurturing relationships with a child--a mother, father, grandparent, other relative, or close friend) can use to…

  4. Readiness of healthcare providers for eHealth: the case from primary healthcare centers in Lebanon.

    Science.gov (United States)

    Saleh, Shadi; Khodor, Rawya; Alameddine, Mohamad; Baroud, Maysa

    2016-11-10

    eHealth can positively impact the efficiency and quality of healthcare services. Its potential benefits extend to the patient, healthcare provider, and organization. Primary healthcare (PHC) settings may particularly benefit from eHealth. In these settings, healthcare provider readiness is key to successful eHealth implementation. Accordingly, it is necessary to explore the potential readiness of providers to use eHealth tools. Therefore, the purpose of this study was to assess the readiness of healthcare providers working in PHC centers in Lebanon to use eHealth tools. A self-administered questionnaire was used to assess participants' socio-demographics, computer use, literacy, and access, and participants' readiness for eHealth implementation (appropriateness, management support, change efficacy, personal beneficence). The study included primary healthcare providers (physicians, nurses, other providers) working in 22 PHC centers distributed across Lebanon. Descriptive and bivariate analyses (ANOVA, independent t-test, Kruskal Wallis, Tamhane's T2) were used to compare participant characteristics to the level of readiness for the implementation of eHealth. Of the 541 questionnaires, 213 were completed (response rate: 39.4 %). The majority of participants were physicians (46.9 %), and nurses (26.8 %). Most physicians (54.0 %), nurses (61.4 %), and other providers (50.9 %) felt comfortable using computers, and had access to computers at their PHC center (physicians: 77.0 %, nurses: 87.7 %, others: 92.5 %). Frequency of computer use varied. The study found a significant difference for personal beneficence, management support, and change efficacy among different healthcare providers, and relative to participants' level of comfort using computers. There was a significant difference by level of comfort using computers and appropriateness. A significant difference was also found between those with access to computers in relation to personal beneficence and
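
    As a generic illustration of the bivariate tests listed above (synthetic data, not the Lebanese PHC dataset), in Python:

        # A readiness subscale compared across provider types with Kruskal-Wallis.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        physicians = rng.normal(3.8, 0.7, 100)   # e.g., personal-beneficence scores
        nurses = rng.normal(3.6, 0.7, 57)
        others = rng.normal(3.5, 0.8, 56)

        h, p = stats.kruskal(physicians, nurses, others)
        print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f}")

        # Tamhane's T2 is not in SciPy; with unequal variances, Welch's t-test is
        # a common pairwise follow-up.
        t, p2 = stats.ttest_ind(physicians, nurses, equal_var=False)
        print(f"physicians vs nurses: t = {t:.2f}, p = {p2:.3f}")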

  5. Military Readiness: DOD's Readiness Rebuilding Efforts May Be at Risk without a Comprehensive Plan

    Science.gov (United States)

    2016-09-01

    …specific elements that are to be in strategic plans. [Footnote: Chairman of the Joint Chiefs of Staff Guide 3401D, CJCS Guide to the Chairman's Readiness…] …all its major functions and operations. DOD strategic guidance makes it clear that rebuilding readiness is a priority that supports the… readiness recovery efforts. Evaluations of the plan to monitor goals and objectives; assessments, through objective measurement and systematic…

  6. Ready-to-use foods for management of moderate acute malnutrition: Considerations for scaling up production and use in programs

    Science.gov (United States)

    Ready-to-use foods are one of the available strategies for the treatment of moderate acute malnutrition (MAM), but challenges remain in the use of these products in programs at scale. This paper focuses on two challenges: the need for cheaper formulations using locally available ingredients that are...

  7. Electricity market readiness plan : Ontario Energy Board

    International Nuclear Information System (INIS)

    2001-03-01

    This document informs electric power market participants of the Ontario Energy Board's newly developed market readiness plan and target timelines that local distribution companies (LDCs) must meet for retail marketing. The Ontario Energy Board's plan incorporates relevant independent market operator (IMO)-administered market milestones with retail market readiness targeted for September 2001. The market readiness framework involves a self-certification process for LDCs by August 10, 2001, through which the Board will be able to monitor progress and assess the feasibility of meeting the target timelines. For retail market readiness, all LDCs will have to calculate settlement costs, produce unbundled bills, provide standard supply service, change suppliers and accommodate retail transactions. LDCs must be either authorized participants in the IMO-administered market or become retail customers of their host LDC. Unbundled bills will include itemized charges for energy price, transmission, distribution and debt retirement charge. 1 tab., 1 fig
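
    To illustrate what an unbundled bill computation involves, a small Python sketch with hypothetical rates (illustrative values only, not OEB tariffs):

        # Sketch of an unbundled residential bill with the itemized charges the
        # plan requires; rates are placeholders, not regulated values.
        def unbundled_bill(kwh: float) -> dict:
            rates = {                      # $/kWh, illustrative only
                "energy": 0.050,
                "transmission": 0.010,
                "distribution": 0.015,
                "debt_retirement": 0.007,
            }
            lines = {item: round(kwh * rate, 2) for item, rate in rates.items()}
            lines["total"] = round(sum(lines.values()), 2)
            return lines

        print(unbundled_bill(750))   # a typical month's consumption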

  8. Enhancing Mental Readiness in Military Personnel

    National Research Council Canada - National Science Library

    Thompson, Megan M; McCreary, Donald R

    2006-01-01

    In this paper we explore how the psychological literature on stress and coping might inform military training programs to enhance "mental readiness" as a method to develop the baseline psychological...

  9. Some Thoughts on Systematic Reading Readiness Instruction.

    Science.gov (United States)

    Palardy, J. Michael

    1984-01-01

    Examines four specific areas of reading readiness--visual discrimination, visual memory, auditory discrimination, and auditory comprehension--and reviews teaching strategies in each of the four areas. (FL)

  10. Readiness Assessment Plan, Hanford 200 areas treated effluent disposal facilities

    International Nuclear Information System (INIS)

    Ulmer, F.J.

    1995-01-01

    This Readiness Assessment Plan documents the Liquid Effluent Facilities review process used to establish the scope of review, documentation requirements, performance assessment, and plant readiness to begin operation of the Treated Effluent Disposal system, in accordance with DOE-RLID-5480.31, Startup and Restart of Facilities Operational Readiness Review and Readiness Assessments.

  11. University Research Initiative Program for Combat Readiness

    Science.gov (United States)

    1999-05-01

    …microscope image of one of the lenses. This array was selected for testing because it is fabricated in a relatively inexpensive polyacrylic material… potent analogues of the potassium-sparing diuretic, amiloride. However, our results… University Research Initiative for Combat Readiness, Annual Report for the period June 1, 1998 - June 30, 1999. Roger H. Sawyer, University of South Carolina, Columbia, SC 29208.

  12. Solar Training Network and Solar Ready Vets

    Energy Technology Data Exchange (ETDEWEB)

    Dalstrom, Tenley Ann

    2016-09-14

    In 2016, the White House announced that the Solar Ready Vets program, funded under DOE's SunShot Initiative, would be administered by The Solar Foundation to connect transitioning military personnel to solar training and employment as they separate from service. This presentation is geared toward informing and recruiting employer partners for the Solar Ready Vets program and the Solar Training Network. It describes the programs and the benefits to employers that choose to connect to them.

  13. Making Technology Ready: Integrated Systems Health Management

    Science.gov (United States)

    Malin, Jane T.; Oliver, Patrick J.

    2007-01-01

    This paper identifies work needed by developers to make integrated system health management (ISHM) technology ready and by programs to make mission infrastructure ready for this technology. This paper examines perceptions of ISHM technologies and experience in legacy programs. Study methods included literature review and interviews with representatives of stakeholder groups. Recommendations address 1) development of ISHM technology, 2) development of ISHM engineering processes and methods, and 3) program organization and infrastructure for ISHM technology evolution, infusion and migration.

  14. Year 2000 Readiness Kit: A Compilation of Y2K Resources for Schools, Colleges and Universities.

    Science.gov (United States)

    Department of Education, Washington, DC.

    This kit was developed to assist the postsecondary education community's efforts to resolve the Year 2000 (Y2K) computer problem. The kit includes a description of the Y2K problem, an assessment of the readiness of colleges and universities, a checklist for institutions, a Y2K communications strategy, articles on addressing the problem in academic…

  15. Measuring E-Learning Readiness among EFL Teachers in Intermediate Public Schools in Saudi Arabia

    Science.gov (United States)

    Al-Furaydi, Ahmed Ajab

    2013-01-01

    This study will determine EFL teachers' readiness level for e-learning in several aspects, such as attitude toward e-learning and computer literacy. The study also attempts to investigate the main barriers that EFL teachers have to overcome while incorporating e-learning into their teaching. The theory upon which the study was based is technology acceptance…

  16. ACR: Licensing and design readiness

    International Nuclear Information System (INIS)

    Alizadeh, A.

    2009-01-01

    Canadian nuclear technology has a long history dating back to the 1940s. In this regard, Canada is in a unique situation, shared by only a very few countries, where original nuclear power technology has been invented and further developed. The Canadian Nuclear Safety Commission (CNSC), then called the AECB, was established in 1946. The CNSC focuses on nuclear security and nuclear safety, establishes health and safety regulations, and has also played an instrumental role in the formation of the IAEA. The CNSC has provided assistance to the establishment of regulatory authorities in AECL's client countries such as Korea, Argentina, China and Romania. AECL has developed the Gen III+ ACR-1000 as an evolutionary advancement of the current CANDU 6 reactor. The ACR-1000 has evolved from AECL's in-depth experience with CANDU systems, components, and materials, as well as the feedback received from owners and operators of CANDU plants. The ACR-1000 design retains the proven strengths and features of CANDU reactors, while incorporating innovations and state-of-the-art technology. It also features major improvements in economics, inherent safety characteristics, and performance. The ACR-1000 has completed its basic engineering, has advanced in the licensing process in Canada, and is ready for deployment in Canadian and world markets. The EC6 is an evolution of the CANDU 6 and is a Gen III natural-uranium-fuelled reactor. Its medium size and potential for fuel localization and advanced fuel cycles make it an optimal strategic solution in many markets. AECL's reactor products are shown to be compliant with a variety of licensing and regulatory requirements, including the new CNSC RD-337, IAEA NS-R-1, and the EUR. This allows countries interested in CANDU reactor products to be confident of their licensing in their own regulatory regimes.

  17. The development of an online decision support tool for organizational readiness for change.

    Science.gov (United States)

    Khan, Sobia; Timmings, Caitlyn; Moore, Julia E; Marquez, Christine; Pyka, Kasha; Gheihman, Galina; Straus, Sharon E

    2014-05-10

    Much importance has been placed on assessing readiness for change as one of the earliest steps of implementation, but measuring it can be a complex and daunting task. Organizations and individuals struggle with how to reliably and accurately measure readiness for change. Several measures have been developed to help organizations assess readiness, but these are often underused due to the difficulty of selecting the right measure. In response to this challenge, we will develop and test a prototype of a decision support tool that is designed to guide individuals interested in implementation in the selection of an appropriate readiness assessment measure for their setting. A multi-phase approach will be used to develop the decision support tool. First, we will identify key measures for assessing organizational readiness for change from a recently completed systematic review. Included measures will be those developed for healthcare settings (e.g., acute care, public health, mental health) and that have been deemed valid and reliable. Second, study investigators and field experts will engage in a mapping exercise to categorize individual items of included measures according to key readiness constructs from an existing framework. Third, a stakeholder panel will be recruited and consulted to determine the feasibility and relevance of the selected measures using a modified Delphi process. Fourth, findings from the mapping exercise and stakeholder consultation will inform the development of a decision support tool that will guide users in appropriately selecting change readiness measures. Fifth, the tool will undergo usability testing. Our proposed decision support tool will address current challenges in the field of organizational change readiness by aiding individuals in selecting a valid and reliable assessment measure that is relevant to user needs and practice settings. We anticipate that implementers and researchers who use our tool will be more likely to conduct

  18. Beyond College Eligibility: A New Framework for Promoting College Readiness. College Readiness Indicator Systems Resource Series

    Science.gov (United States)

    Annenberg Institute for School Reform at Brown University, 2014

    2014-01-01

    The College Readiness Indicator Systems (CRIS) initiative was developed in response to a troubling pattern: More students than ever are enrolling in college after high school, but many of them are not college ready, as evidenced by persistently low rates of college completion. The sense of urgency to close the gap between college eligibility and…

  19. First Responder Readiness: A Systems Approach to Readiness Assessment Using Model Based Vulnerability Analysis Techniques

    Science.gov (United States)

    2005-09-01

    "Hamlet: . . . There is a special providence in the fall of a sparrow. If it be now, 'tis not to come—if it be not to come, it will be now—if it be not now, yet it will come—the readiness is all. . ." --- Shakespeare, Hamlet, 5.2.215-219

  20. Pathways to School Readiness: Executive Functioning Predicts Academic and Social-Emotional Aspects of School Readiness

    Science.gov (United States)

    Mann, Trisha D.; Hund, Alycia M.; Hesson-McInnis, Matthew S.; Roman, Zachary J.

    2017-01-01

    The current study specified the extent to which hot and cool aspects of executive functioning predicted academic and social-emotional indicators of school readiness. It was unique in focusing on positive aspects of social-emotional readiness, rather than problem behaviors. One hundred four 3-5-year-old children completed tasks measuring executive…

  1. Determining transition readiness in congenital heart disease: Assessing the utility of the Transition Readiness Questionnaire

    Science.gov (United States)

    The Transition Readiness Assessment Questionnaire (TRAQ) is a tool commonly used to assess transition readiness in adolescents with chronic diseases. It was previously validated in youth with special health care needs (YSHCN), but no patients with congenital heart disease (CHD) were included in the ...

  2. QCD: are we ready for the LHC?

    CERN Multimedia

    CERN. Geneva

    2006-01-01

    The LHC energy regime poses a serious challenge to our capability of predicting QCD reactions to the level of accuracy necessary for a successful programme of searches for physics beyond the Standard Model. In these lectures, I'll introduce basic concepts in QCD, and present techniques based on perturbation theory, such as fixed-order and resummed computations, and Monte Carlo simulations. I'll discuss applications of these techniques to hadron-hadron processes, concentrating on recent trends in perturbative QCD aimed at improving our understanding of LHC phenomenology.

  3. Methods and computing challenges of the realistic simulation of physics events in the presence of pile-up in the ATLAS experiment

    CERN Document Server

    Chapman, J D; The ATLAS collaboration

    2014-01-01

    We are now in a regime where we observe substantial numbers of multiple proton-proton collisions within each filled LHC bunch-crossing and also multiple filled bunch-crossings within the sensitive time window of the ATLAS detector. These effects will increase with increasing luminosity in the near future. Including them in Monte Carlo simulation poses significant computing challenges. We present a description of the standard approach used by the ATLAS experiment and details of how we manage the conflicting demands of keeping the background dataset size as small as possible while minimizing the effect of background event re-use. We also present details of the methods used to minimize the memory footprint of these digitization jobs, to keep them within the grid limit, despite combining the information from thousands of simulated events at once. We also describe an alternative approach, known as Overlay. Here, the actual detector conditions are sampled from raw data using a special zero-bias trigger, and the simulated physi...
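
    As a toy illustration of the re-use trade-off described above, the sketch below serves minimum-bias background events for pile-up overlay from a small cache, using each cached event once before any event is re-used. This is a hypothetical Python illustration of the idea, not ATLAS digitization code; all names and numbers are invented.

        import itertools
        import random

        def pileup_sampler(background_events, seed=0):
            """Cycle through a shuffled cache so every cached event is used
            once before any event is used twice (minimal re-use for a fixed
            cache size)."""
            rng = random.Random(seed)
            cache = list(background_events)
            rng.shuffle(cache)
            return itertools.cycle(cache)

        # A deliberately small background dataset (hypothetical identifiers).
        cache = [f"minbias_{i:04d}" for i in range(1000)]
        sampler = pileup_sampler(cache)

        # Overlay, say, 40 pile-up interactions on each hard-scatter event.
        for signal_event in range(3):
            overlay = [next(sampler) for _ in range(40)]
            print(signal_event, overlay[:2], "...")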

  4. Readiness to reconcile and post-traumatic distress in German survivors of wartime rapes in 1945.

    Science.gov (United States)

    Eichhorn, S; Stammel, N; Glaesmer, H; Klauer, T; Freyberger, H J; Knaevelsrud, C; Kuwert, P

    2015-05-01

    Sexual violence and wartime rapes are prevalent crimes in violent conflicts all over the world, and processes of reconciliation are growing challenges in post-conflict settings. Despite this, few studies have so far examined the psychological consequences and their mediating factors. Our study aimed at investigating the degree of long-term readiness to reconcile and its associations with post-traumatic distress within a sample of German women who experienced wartime rapes in 1945. A total of 23 wartime rape survivors were compared to age- and gender-matched controls with WWII-related non-sexual traumatic experiences. Readiness to reconcile was assessed with the Readiness to Reconcile Inventory (RRI-13). The German version of the Post-traumatic Diagnostic Scale (PDS) was used to assess post-traumatic stress disorder (PTSD) symptomatology. Readiness to reconcile in wartime rape survivors was higher in those women who reported less post-traumatic distress, and the subscale "openness to interaction" showed the strongest association with post-traumatic symptomatology. Moreover, wartime rape survivors reported fewer feelings of revenge than women who experienced other traumatization in WWII. Our results are in line with previous research indicating that readiness to reconcile affects healing processes in the context of conflict-related traumatic experiences. Given the long-lasting post-traumatic symptomatology we observed, our findings highlight the need for psychological treatment of wartime rape survivors worldwide, and future research should continue to focus on reconciliation within the therapeutic process.

  5. Are consumers ready for RFID?

    DEFF Research Database (Denmark)

    Aguiar, Luis Kluwe; Brofman, Freddy; de Barcellos, Marcia Dutra

    2010-01-01

    Marketing orientation is both the key objective of most food producers and their biggest challenge. Connecting food and agricultural production with the changing needs and aspirations of the customer provides the means to ensure competitive advantage, resilience and added value in what you produc...... organic foods to old world wines. All the chapters provide exceptional insight into understanding how market orientation can benefit food suppliers and how it is essential for long-term success....

  6. Concept of economic readiness levels assessment

    Science.gov (United States)

    Yuniaristanto, Sutopo, W.; Widiyanto, A.; Putri, A. S.

    2017-11-01

    This research aims to build a concept of Economic Readiness Level (ERL) assessment for incubation centers. The ERL concept is arranged by considering both market and business aspects. Each aspect is divided into four phases, and each phase consists of several indicators. The Analytic Hierarchy Process (AHP) is used to develop the ERL by calculating the weight of every aspect and indicator. An interval scale between 0 and 4 is applied in indicator assessment. To calculate the ERL, the score of every indicator and the weights of both the aspects and the indicators are considered. The ERL value shows in detail the readiness level of an innovative product from an economic standpoint, across the market and business aspects. There are four levels in the Economic Readiness Level scheme: investigation, feasibility, planning and introduction.
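
    To make the scoring scheme concrete, here is a minimal sketch of the weighted aggregation the abstract describes. All weights and scores are hypothetical placeholders on the stated 0-4 scale; none of the values come from the paper.

        # Indicator entries are (AHP weight, score on the 0-4 interval scale);
        # aspect weights also come from AHP. Everything here is illustrative.
        aspects = {
            "market": {
                "weight": 0.6,
                "indicators": {
                    "demand_validated": (0.7, 3.0),
                    "competitors_mapped": (0.3, 2.0),
                },
            },
            "business": {
                "weight": 0.4,
                "indicators": {
                    "revenue_model": (0.5, 4.0),
                    "cost_structure": (0.5, 1.0),
                },
            },
        }

        def erl_score(aspects):
            """Weighted sum of indicator scores, weighted again by aspect."""
            total = 0.0
            for aspect in aspects.values():
                aspect_score = sum(w * s for w, s in aspect["indicators"].values())
                total += aspect["weight"] * aspect_score
            return total  # stays in [0, 4] when each weight group sums to 1

        print(f"ERL value: {erl_score(aspects):.2f} / 4")  # -> ERL value: 2.62 / 4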

  7. Ready for kindergarten: Are intelligence skills enough?

    Directory of Open Access Journals (Sweden)

    Caroline Fitzpatrick

    2017-12-01

    This study investigated how different profiles of kindergarten readiness in terms of student intellectual ability, academic skills and classroom engagement relate to future academic performance. Participants are French-Canadian children followed in the context of the Quebec Longitudinal Study of Child Development (N = 670). Trained examiners measured number knowledge, receptive vocabulary and fluid intelligence when children were in kindergarten. Teachers rated kindergarten classroom engagement. Outcomes included fourth-grade teacher-rated achievement and directly assessed mathematical skills. Latent class analyses revealed three kindergarten readiness profiles: high (57%), moderate (34%) and low (9.3%) readiness. Using multiple regression, we found that a more favourable kindergarten profile predicted better fourth-grade academic performance. Identifying children at risk of academic difficulty is an important step for preventing underachievement and dropout. These results suggest the importance of promoting a variety of cognitive, academic and behavioural skills to enhance later achievement in at-risk learners.

  8. Organisational readiness: exploring the preconditions for success in organisation-wide patient safety improvement programmes.

    Science.gov (United States)

    Burnett, Susan; Benn, Jonathan; Pinto, Anna; Parand, Anam; Iskander, Sandra; Vincent, Charles

    2010-08-01

    Patient safety has been high on the agenda for more than a decade. Despite many national initiatives aimed at improving patient safety, the challenge remains to find coherent and sustainable organisation-wide safety-improvement programmes. In the UK, the Safer Patients' Initiative (SPI) was established to address this challenge. Important in the success of such an endeavour is understanding 'readiness' at the organisational level, identifying the preconditions for success in this type of programme. This article reports on a case study of the four NHS organisations participating in the first phase of SPI, examining perceptions of organisational readiness and the relationship of these factors with impact, as judged by those actively involved in the initiative. A mixed-methods design was used, involving a survey and semistructured interviews with senior executive leads, the principal SPI programme coordinator and the four operational leads in each of the SPI clinical work areas in all four organisations taking part in the first phase of SPI. This preliminary work suggests that, prior to the start of organisation-wide quality- and safety-improvement programmes, organisations would benefit from an assessment of readiness, with time spent in the preparation of the organisational infrastructure, processes and culture. Furthermore, a better understanding of the preconditions that mark an organisation as ready for improvement work would allow policymakers to set realistic expectations about the outcomes of safety campaigns.

  9. Bioprinting: an assessment based on manufacturing readiness levels.

    Science.gov (United States)

    Wu, Changsheng; Wang, Ben; Zhang, Chuck; Wysk, Richard A; Chen, Yi-Wen

    2017-05-01

    Over the last decade, bioprinting has emerged as a promising technology in the fields of tissue engineering and regenerative medicine. With recent advances in additive manufacturing, bioprinting is poised to provide patient-specific therapies and new approaches for tissue and organ studies, drug discoveries and even food manufacturing. Manufacturing Readiness Level (MRL) is a method that has been applied to assess manufacturing maturity and to identify risks and gaps in technology-manufacturing transitions. Technology Readiness Level (TRL) is used to evaluate the maturity of a technology. This paper reviews recent advances in bioprinting following the MRL scheme and addresses corresponding MRL levels of engineering challenges and gaps associated with the translation of bioprinting from lab-bench experiments to ultimate full-scale manufacturing of tissues and organs. According to our step-by-step TRL and MRL assessment, after years of rigorous investigation by the biotechnology community, bioprinting is on the cusp of entering the translational phase where laboratory research practices can be scaled up into manufacturing products specifically designed for individual patients.

  10. Validating Acquisition IS Integration Readiness with Drills

    DEFF Research Database (Denmark)

    Wynne, Peter J.

    2017-01-01

    To companies, mergers and acquisitions are important strategic tools, yet they often fail to deliver their expected value. Studies have shown the integration of information systems is a significant roadblock to the realisation of acquisition benefits, and for an IT department to be ready......), to understand how an IT department can use them to validate their integration plans. The paper presents a case study of two drills used to validate an IT department’s readiness to carry out acquisition IS integration, and suggests seven acquisition IS integration drill characteristics others could utilise when...

  11. NHI Component Technical Readiness Evaluation System

    International Nuclear Information System (INIS)

    Sherman, S.; Wilson, Dane F.; Pawel, Steven J.

    2007-01-01

    A decision process for evaluating the technical readiness or maturity of components (i.e., heat exchangers, chemical reactors, valves, etc.) for use by the U.S. DOE Nuclear Hydrogen Initiative is described. This system is used by the DOE NHI to assess individual components in relation to their readiness for pilot-scale and larger-scale deployment and to drive the research and development work needed to attain technical maturity. A description of the evaluation system is provided, and examples are given to illustrate how it is used to assist in component R and D decisions.

  12. Variability of computational fluid dynamics solutions for pressure and flow in a giant aneurysm: the ASME 2012 Summer Bioengineering Conference CFD Challenge.

    Science.gov (United States)

    Steinman, David A; Hoi, Yiemeng; Fahy, Paul; Morris, Liam; Walsh, Michael T; Aristokleous, Nicolas; Anayiotos, Andreas S; Papaharilaou, Yannis; Arzani, Amirhossein; Shadden, Shawn C; Berg, Philipp; Janiga, Gábor; Bols, Joris; Segers, Patrick; Bressloff, Neil W; Cibis, Merih; Gijsen, Frank H; Cito, Salvatore; Pallarés, Jordi; Browne, Leonard D; Costelloe, Jennifer A; Lynch, Adrian G; Degroote, Joris; Vierendeels, Jan; Fu, Wenyu; Qiao, Aike; Hodis, Simona; Kallmes, David F; Kalsi, Hardeep; Long, Quan; Kheyfets, Vitaly O; Finol, Ender A; Kono, Kenichi; Malek, Adel M; Lauric, Alexandra; Menon, Prahlad G; Pekkan, Kerem; Esmaily Moghadam, Mahdi; Marsden, Alison L; Oshima, Marie; Katagiri, Kengo; Peiffer, Véronique; Mohamied, Yumnah; Sherwin, Spencer J; Schaller, Jens; Goubergrits, Leonid; Usera, Gabriel; Mendina, Mariana; Valen-Sendstad, Kristian; Habets, Damiaan F; Xiang, Jianping; Meng, Hui; Yu, Yue; Karniadakis, George E; Shaffer, Nicholas; Loth, Francis

    2013-02-01

    Stimulated by a recent controversy regarding pressure drops predicted in a giant aneurysm with a proximal stenosis, the present study sought to assess variability in the prediction of pressures and flow by a wide variety of research groups. In phase I, lumen geometry, flow rates, and fluid properties were specified, leaving each research group to choose their solver, discretization, and solution strategies. Variability was assessed by having each group interpolate their results onto a standardized mesh and centerline. For phase II, a physical model of the geometry was constructed, from which pressure and flow rates were measured. Groups repeated their simulations using a geometry reconstructed from a micro-computed tomography (CT) scan of the physical model with the measured flow rates and fluid properties. Phase I results from 25 groups demonstrated remarkable consistency in the pressure patterns, with the majority predicting peak systolic pressure drops within 8% of each other. Aneurysm sac flow patterns were more variable with only a few groups reporting peak systolic flow instabilities owing to their use of high temporal resolutions. Variability for phase II was comparable, and the median predicted pressure drops were within a few millimeters of mercury of the measured values but only after accounting for submillimeter errors in the reconstruction of the life-sized flow model from micro-CT. In summary, pressure can be predicted with consistency by CFD across a wide range of solvers and solution strategies, but this may not hold true for specific flow patterns or derived quantities. Future challenges are needed and should focus on hemodynamic quantities thought to be of clinical interest.
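
    For a sense of how such a variability check can be scripted, the sketch below compares hypothetical per-group peak systolic pressure drops against their median, using an 8%-of-median band as a simplification of the abstract's "within 8% of each other". The numbers are invented, not the challenge data.

        import statistics

        # Hypothetical per-group peak systolic pressure drops (mmHg).
        pressure_drops = [21.0, 20.4, 22.1, 20.9, 19.8, 21.5, 23.0, 20.6]

        median_dp = statistics.median(pressure_drops)
        within_8pct = [dp for dp in pressure_drops
                       if abs(dp - median_dp) / median_dp <= 0.08]

        print(f"median peak systolic pressure drop: {median_dp:.2f} mmHg")
        print(f"{len(within_8pct)}/{len(pressure_drops)} groups within 8% of the median")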

  13. Preparing Canada's power systems for transition to the Year 2000: Y2K readiness assessment results for Canadian electric utility companies: second quarter 1999

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-08-25

    The report describes the state of readiness of Canadian electric utility companies with respect to the Year 2000 computer challenge. It complements the North American Electric Reliability Council report entitled 'Preparing the Electric Power Systems of North America for Transition to the Year 2000: A Status and Work Plan.' Two surveys were employed to gather information for this report. The first, a detailed survey prepared by NERC, was forwarded to all major electric utilities that comprise the Bulk Electricity System in North America. CEA has extracted the Canadian findings from the overall North American results and presented those findings in this report. The second was a shorter, more simplified study conducted by CEA and Natural Resources Canada. Whereas small companies involved only in the distribution aspect of the electricity business were not asked to complete the NERC assessment, all Canadian electric utility companies were part of the shorter survey. Chapter 2 covers specifically the readiness status and project management for non-nuclear generation, nuclear generation, energy management systems, telecommunications systems, substation controls, system protection and distribution systems, business information systems, and small distribution companies.

  14. Information Warfare: Issues Associated with the Defense of DOD Computers and Computer Networks

    National Research Council Canada - National Science Library

    Franklin, Derek

    2002-01-01

    ... that may threaten the critical information pathways of the armed forces. An analysis of the history of computer information warfare reveals that there was an embarrassing lack of readiness and defense...

  15. A Proposed Conceptual Model of Military Medical Readiness

    National Research Council Canada - National Science Library

    Van Hall, Brian M

    2007-01-01

    .... The purpose of this research is to consolidate existing literature on the latent variable of medical readiness, and to propose a composite theoretical model of medical readiness that may provide...

  16. Individual Ready Reserve: Its Relevance in Supporting the Long War

    National Research Council Canada - National Science Library

    Chisholm, Shelley A

    2008-01-01

    ... in sustaining personnel readiness while supporting ongoing operations. To meet these personnel readiness concerns, the Army Reserve will require the call-up of Soldiers currently serving in the IRR...

  17. e-Learning readiness amongst nursing students at the Durban ...

    African Journals Online (AJOL)

    e-Learning readiness amongst nursing students at the Durban University of ... make the shift from traditional learning to the technological culture of e-Learning at a ... equipment and technological readiness for the change in learning method.

  18. The Effect of Technology Readiness on the Acceptance of Computer Technology in SMEs in Yogyakarta (Pengaruh Technology Readiness terhadap Penerimaan Teknologi Komputer pada UMKM di Yogyakarta)

    Directory of Open Access Journals (Sweden)

    Mimin Nur Aisyah

    2014-10-01

    This research aims to explore the effect of technology readiness on the perceived usefulness and perceived ease of use of a system, and the influence of both of these perceptions on the intention to use computer technology to support business processes in SMEs in Yogyakarta. The sample consisted of 498 SMEs registered with Disperindagkop Yogyakarta, selected by simple random sampling. Data were obtained using a questionnaire, and data analysis and hypothesis testing used the Partial Least Squares (PLS) model. The research found that technology readiness affects both perceived usefulness and perceived ease of use, and that perceived usefulness and perceived ease of use in turn affect the intention to use computer technology to support business processes in SMEs in Yogyakarta. Keywords: technology readiness, perceived usefulness, perceived ease of use, behavioral intention, SMEs
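
    As an illustration of the hypothesized path structure (technology readiness -> perceived usefulness and perceived ease of use -> intention to use), the sketch below fits the paths with ordinary least squares on synthetic data. This is a simple stand-in for the paper's Partial Least Squares estimation, and every coefficient and data point in it is made up.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 498  # sample size reported in the abstract

        # Synthetic latent-variable scores; path coefficients are arbitrary.
        readiness = rng.normal(size=n)
        ease = 0.5 * readiness + rng.normal(scale=0.8, size=n)
        useful = 0.6 * readiness + rng.normal(scale=0.8, size=n)
        intention = 0.4 * useful + 0.3 * ease + rng.normal(scale=0.8, size=n)

        def path_coefs(y, *xs):
            """OLS coefficients of y on the predictors xs (intercept dropped)."""
            X = np.column_stack([np.ones(len(y)), *xs])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return beta[1:]

        print("readiness -> usefulness:", path_coefs(useful, readiness))
        print("readiness -> ease of use:", path_coefs(ease, readiness))
        print("usefulness, ease -> intention:", path_coefs(intention, useful, ease))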

  19. Ready a Commodore 64 retrospective

    CERN Document Server

    Dillon, Roberto

    2015-01-01

    How did the Commodore 64 conquer the hearts of millions and become a platform people still actively develop for even today? What made it so special? This book will appeal both to those who like tinkering with old technology as a hobby and to nostalgic readers who simply want to enjoy a trip down memory lane. It discusses in a concise but rigorous format the different areas of home gaming and personal computing where the C64 managed to innovate and push forward existing boundaries. Starting from Jack Tramiel's vision of designing computers "for the masses, not the classes," the book introduces the 6510, VIC-II and SID chips that made the C64 unique. It briefly discusses its BASIC programming language and then proceeds to illustrate not only many of the games that are still so fondly remembered but also the first generation of game engines that made game development more approachable - among other topics that are often neglected but are necessary to provide a comprehensive overview of how far reaching the C64 in...

  20. Career Readiness: Are We There Yet?

    Science.gov (United States)

    Guidry, Christopher

    2012-01-01

    ACT is committed to working with career and technical educators in order to prepare students to meet the standards of the high-performance workplace. In short, prepare them to be career- and job-ready. This commitment is a reflection of ACT's mission: "helping people achieve education and workplace success." After devoting more than two decades of…

  1. Weight Loss: Ready to Change Your Habits?

    Science.gov (United States)

    ... calories more than you consume each day — through diet and exercise. You might lose weight more quickly if you ... yourself with regular weigh-ins and tracking your diet and activity, which is ... don't have a positive attitude about losing weight, you might not be ready — ...

  2. Development toward School Readiness: A Holistic Model

    Science.gov (United States)

    Gaynor, Alan Kibbe

    2015-01-01

    A systemic analysis of early childhood development factors explains the variance in school readiness among representative U.S. 5-year-olds. The underlying theory incorporates a set of causally interactive endogenous variables that are hypothesized to be driven by the effects of three exogenous variables: parental education, immigrant status and…

  3. The Developmental Approach to School Readiness.

    Science.gov (United States)

    Ogletree, Earl J.

    In the United States, a psychometric psychology dominates the thinking of educators. For traditional, political, and social reasons, developmental psychology rarely informs educational practices. This is the case even though studies show that the inducing of cognitive learning before a child is ready will reduce the child's learning potential and…

  4. Ready Texas: Stakeholder Convening. Proceedings Report

    Science.gov (United States)

    Intercultural Development Research Association, 2016

    2016-01-01

    With the adoption of substantial changes to Texas high school curricula in 2013 (HB5), a central question for Texas policymakers, education and business leaders, families, and students is whether and how HB5 implementation impacts the state of college readiness and success in Texas. Comprehensive research is needed to understand the implications…

  5. Readiness of Teachers for Change in Schools

    Science.gov (United States)

    Kondakci, Yasar; Beycioglu, Kadir; Sincar, Mehmet; Ugurlu, Celal Teyyar

    2017-01-01

    Theorizing on the role of teacher attitudes in change effectiveness, this study examined the predictive value of context (trust), process (social interaction, participative management and knowledge sharing) and outcome (job satisfaction and workload perception) variables for cognitive, emotional and intentional readiness of teachers for change.…

  6. Child Physical Punishment, Parenting, and School Readiness

    Science.gov (United States)

    Weegar, Kelly; Guérin-Marion, Camille; Fréchette, Sabrina; Romano, Elisa

    2018-01-01

    This study explored how physical punishment (PP) and other parenting approaches may predict school readiness outcomes. Using Canada-wide representative data, 5,513 children were followed over a 2-year period. Caregivers reported on their use of PP and other parenting approaches (i.e., literacy and learning activities and other disciplinary…

  7. Birth Preparedness and Complication Readiness of Pregnant ...

    African Journals Online (AJOL)

    Birth Preparedness and Complication Readiness of Pregnant Women Attending the Three Levels of Health Facilities in Ife Central Local Government, Nigeria. ... Only 24 (6.0%) had adequate knowledge of obstetric danger signs without prompting. Three hundred and forty (84.8%) and 312 (78.3%) women respectively had ...

  8. Emotional Readiness and Music Therapeutic Activities

    Science.gov (United States)

    Drossinou-Korea, Maria; Fragkouli, Aspasia

    2016-01-01

    The purpose of this study is to understand the verbal and nonverbal communicative expression of children on the autistic spectrum. We study emotional readiness and music therapeutic activities that exploit the elements of music. The method followed focused on the research field of special needs education. Assumptions on the parameters…

  9. Readiness to proceed: Characterization planning basis

    International Nuclear Information System (INIS)

    Adams, M.R.

    1998-01-01

    This report summarizes characterization requirements, data availability, and data acquisition plans in support of the Phase 1 Waste Feed Readiness to Proceed Mid-Level Logic. It summarizes characterization requirements for the following program planning documents: Waste Feed Readiness Mid-Level Logic and Decomposition (in development); Master blueprint (not available); Tank Waste Remediation System (TWRS) Operations and Utilization Plan and Privatization Contract; Enabling assumptions (not available); Privatization low-activity waste (LAW) Data Quality Objective (DQO); Privatization high-level waste (HLW) DQO (draft); Problem-specific DQOs (in development); Interface control documents (draft). Section 2.0 defines the primary objectives for this report, Section 3.0 discusses the scope and assumptions, and Section 4.0 identifies general characterization needs and analyte-specific characterization needs or potential needs included in program documents and charts. Section 4.0 also shows the analyses that have been conducted and the archive samples that are available for additional analyses. Section 5.0 discusses current plans for obtaining additional samples and analyses to meet readiness-to-proceed requirements. Section 6.0 summarizes sampling needs based on preliminary requirements and discusses other potential characterization needs. Many requirements documents are preliminary. In many cases, problem-specific DQOs have not been drafted, and only general assumptions about the document contents could be obtained from the authors. As a result, the readiness-to-proceed characterization requirements provided in this document are evolving and may change

  10. College and Career Readiness in Elementary Schools

    Science.gov (United States)

    Pulliam, Nicole; Bartek, Samantha

    2018-01-01

    This conceptual article will provide an in-depth exploration of the relevant literature focused on college and career readiness interventions in elementary schools. Beginning with a theoretical framework, a rationale is provided for early intervention by elementary school counselors. While professional guidelines and standards exist supporting…

  11. Service Availability and Readiness Assessment of Maternal ...

    African Journals Online (AJOL)

    AJRH Managing Editor

    Keywords: Madagascar, Maternal and Child health services, Service availability and readiness assessment, Public health facilities. ... Table 2: Percentage of Health Facilities Equipped with Tracer Items for Antenatal Care Services Among Facilities Providing this ... 32 CSBs, due to its location in a tourist area.

  12. Remedial action and waste disposal project - ERDF readiness evaluation plan

    International Nuclear Information System (INIS)

    Casbon, M.A.

    1996-06-01

    This Readiness Evaluation Report presents the results of the project readiness evaluation for the Environmental Restoration and Disposal Facility. The evaluation was conducted at the conclusion of a series of readiness activities that began in January 1996. These activities included completion of the physical plant; preparation, review, and approval of operating procedures; definition and assembly of the necessary project and operational organizations; and activities leading to regulatory approval of the plant and operating plans

  13. Ready or Not: Namibia As a Potentially Successful Oil Producer

    Directory of Open Access Journals (Sweden)

    Andrzej Polus

    2015-01-01

    The primary objective of this paper is to assess whether Namibia is ready to become an oil producer. Geological estimates suggest that the country may possess the equivalent of as many as 11 billion barrels of crude oil. If the numbers are correct, Namibia would be sitting on the second-largest oil reserves in sub-Saharan Africa, and exploitation could start as soon as 2017. This clearly raises the question of whether Namibia is next in line to become a victim of the notorious "resource curse." On the basis of critical discourse analysis and findings from field research, the authors have selected six dimensions of the resource curse and contextualised them within the spheres of Namibian politics and economy. While Namibia still faces a number of important challenges, our findings offer little evidence that the oil will have particularly disruptive effects.

  14. A Study of Fleet Surgical Teams Readiness Posture in Amphibious Readiness Groups

    National Research Council Canada - National Science Library

    Tennyson, Ruby

    2000-01-01

    This thesis describes and evaluates Fleet Surgical Teams (FSTs). It examines how Navy Medicine adapted FSTs to changing support requirements associated with the Total Health Care Support Readiness Requirement (THCSRR...

  15. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  16. Category 3 investigation-derived waste Readiness Evaluation Plan

    International Nuclear Information System (INIS)

    Ludowise, J.D.

    1996-08-01

    This Readiness Evaluation Plan presents the methodology used to assess the readiness for loading investigation-derived waste (IDW) drums on trucks for transport to the Environmental Restoration Disposal Facility (ERDF). The scope of this Readiness Evaluation Plan includes an assessment of the organizations, procedures, and regulatory approvals necessary for the handling of IDW containers and the subsequent transportation of materials to ERDF

  17. What Are the ACT College Readiness Benchmarks? Information Brief

    Science.gov (United States)

    ACT, Inc., 2013

    2013-01-01

    The ACT College Readiness Benchmarks are the minimum ACT® college readiness assessment scores required for students to have a high probability of success in credit-bearing college courses--English Composition, social sciences courses, College Algebra, or Biology. This report identifies the College Readiness Benchmarks on the ACT Compass scale…

  18. Instructional Alignment of Workplace Readiness Skills in Marketing Education

    Science.gov (United States)

    Martin, Sarah J.; Reed, Philip A.

    2015-01-01

    This study examined high school marketing education teachers' knowledge of workplace readiness skills and whether that knowledge had an impact on student workplace readiness skill achievement. Further, this study examined the usage of Virginia's 13 Workplace Readiness Skills curriculum and identified the teaching methods and instructional…

  19. Diagnostics of children's school readiness in scientific studies abroad

    Directory of Open Access Journals (Sweden)

    Nazarenko V.V.

    2012-06-01

    The article considers the problem of children's school readiness as represented in contemporary studies by foreign scholars. It presents a variety of approaches to estimating school readiness, as well as ways of measuring levels of child development related to school readiness, in particular those in common practice in education.

  20. Public webinar: Wildland Fire Sensors Challenge

    Science.gov (United States)

    This multi-agency challenge seeks a field-ready prototype system capable of measuring constituents of smoke, including particulates, carbon monoxide, ozone, and carbon dioxide, over the wide range of levels expected during wildland fires. The prototype system should be accurate, ...

  1. Implementing a Zero Energy Ready Home Multifamily Project

    Energy Technology Data Exchange (ETDEWEB)

    Springer, David [National Renewable Energy Laboratory (NREL), Golden, CO (United States); German, Alea [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2015-08-17

    Building cost-effective, high-performance homes that provide superior comfort, health, and durability is the goal of the U.S. Department of Energy's (DOE's) Zero Energy Ready Home (ZERH) program. Building America research and other innovative programs throughout the country have addressed many of the technical challenges of building to the ZERH standard. The cost-effectiveness of measure packages that result in 30% source energy savings compared to a code-compliant home has been demonstrated. However, additional challenges remain, particularly with respect to convincing production builders of the strong business case for ZERH. The Alliance for Residential Building Innovation (ARBI) team believes that the keys to successfully engaging builders and developers in the California market are to help them leverage development agreement requirements, code compliance requirements, incentives, and the competitive market advantages of ZERH certification, and to help them navigate this process. A primary objective of this project was to gain a highly visible foothold for residential buildings built to the DOE ZERH specification, which can be used to encourage participation by other California builders. This report briefly describes two single-family homes that were ZERH certified and focuses on the experience of working with the developer Mutual Housing on a 62-unit multifamily community at the Spring Lake subdivision in Woodland, California. The Spring Lake project is expected to be the first ZERH-certified multifamily project in the country. This report discusses the challenges encountered, lessons learned, and how obstacles were overcome.

  2. Defining a New North Star: Aligning Local Control Accountability Plans to College and Career Readiness. Policy Brief

    Science.gov (United States)

    Beach, Paul; Thier, Michael; Lench, Sarah Collins; Coleman, Matt

    2015-01-01

    Educators have the increasingly difficult task of preparing students to live, learn, and work in the 21st century. Amidst those challenges, a growing body of research suggests that college and career readiness depends upon students' ability to think critically, learn how to learn, communicate, and collaborate. The No Child Left Behind (NCLB) era…

  3. Beyond a Logic of Quality: Opening Space for Material-Discursive Practices of "Readiness" in Early Years Education

    Science.gov (United States)

    Evans, Katherine

    2016-01-01

    This article is an exploration of the possibilities encountered through shifting from a "logic of quality" to a "space of meaning-making" within early years education. Focusing on ideas of "readiness", this discussion aims to challenge normative understandings that relate this concept to the predictable achievement of…

  4. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    It is crucial that gifted and talented students be supported by educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled "Computer Tree" serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since the lesson caters for the creative function of basic linguistic skills. The teaching technique is applied at the 9-11 age level. The lesson introduces an evaluation process covering the basic knowledge, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered a good sample of planning for any subject, given the unpredicted convergence of visual and technical abilities with linguistic abilities.

  5. A database application for the Naval Command Physical Readiness Testing Program

    OpenAIRE

    Quinones, Frances M.

    1998-01-01

    Approved for public release; distribution is unlimited. IT-21 envisions a Navy with standardized, state-of-the-art computer systems. Based on this vision, Naval database management systems will also need to become standardized among Naval commands. Today most commercial off-the-shelf (COTS) database management systems provide a graphical user interface. Among the many Naval database systems currently in use, the Navy's Physical Readiness Program database has continued to exist at the command leve...
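
    By way of illustration only, a command-level Physical Readiness Test database of the kind the thesis describes might start from a schema like the one below. Every table and column name here is invented for the sketch, not taken from the thesis.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE member (
            member_id INTEGER PRIMARY KEY,
            name      TEXT NOT NULL,
            command   TEXT NOT NULL
        );
        CREATE TABLE prt_result (
            result_id  INTEGER PRIMARY KEY,
            member_id  INTEGER NOT NULL REFERENCES member(member_id),
            cycle      TEXT NOT NULL,      -- e.g. '1998-1'
            pushups    INTEGER,
            situps     INTEGER,
            run_time_s INTEGER,
            passed     INTEGER NOT NULL    -- 0 or 1
        );
        """)
        conn.execute("INSERT INTO member VALUES (1, 'Doe, J.', 'NPS')")
        conn.execute("INSERT INTO prt_result VALUES (1, 1, '1998-1', 60, 80, 720, 1)")
        for row in conn.execute(
                "SELECT m.name, r.cycle, r.passed "
                "FROM member m JOIN prt_result r USING (member_id)"):
            print(row)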

  6. Challenging makerspaces

    DEFF Research Database (Denmark)

    Sandvik, Kjetil; Thestrup, Klaus

    This paper takes its point of departure in the EU project MakEY (Makerspaces in the Early Years: Enhancing Digital Literacy and Creativity), part of an H2020 RISE programme running January 2017 - June 2019. Here, the digital literacy and creative skills of young children between the ages of 3 and 8 will be developed through participation in creative activities in specially-designed spaces termed 'makerspaces'. This paper discusses, develops and challenges this term in relation to Danish pedagogical traditions, to expanding makerspaces onto the internet, and to how to combine narratives and construction.... The Danish part of the project will be undertaken by a small network of partners: DOKK1, a public library and open urban space in Aarhus that is experimenting with different kinds of makerspaces, spaces and encounters between people, and The LEGO-LAB, situated at Computer Science, Aarhus University, which has...

  7. STS-114: Discovery Launch Readiness Press Conference

    Science.gov (United States)

    2005-01-01

    Michael Griffin, NASA Administrator; Wayne Hale, Space Shuttle Deputy Program Manager; Mike Wetmore, Director of Shuttle Processing; and 1st Lieutenant Mindy Chavez, Launch Weather Officer, United States Air Force 45th Weather Squadron, are in attendance for this STS-114 Discovery launch readiness press conference. The discussion begins with Wayne Hale bringing to the table a low-level sensor device for everyone to view. He talks in detail about the extensive tests that were performed on these sensors and the completion of the ambient tests. Chavez presents her weather forecast for the launch day of July 26, 2005. Michael Griffin and Wayne Hale answer questions from the news media pertaining to the sensors and launch readiness. The video ends with footage of Pilot Jim Kelly and Commander Eileen Collins conducting test flights in a Shuttle Training Aircraft (STA) that simulates Space Shuttle landing.

  8. A Challenge to Watson

    Science.gov (United States)

    Detterman, Douglas K.

    2011-01-01

    Watson's Jeopardy victory raises the question of the similarity of artificial intelligence and human intelligence. Those of us who study human intelligence issue a challenge to the artificial intelligence community. We will construct a unique battery of tests for any computer that would provide an actual IQ score for the computer. This is the same…

  9. Vers un modele d'intervention precoce en lecture en actualisation linguistique (Towards a Model of Early Intervention in Reading Readiness).

    Science.gov (United States)

    Berger, Marie Josee

    1999-01-01

    Argues that in Ontario's French-medium schools, reading is often a challenge, particularly for those in readiness classes who speak little or no French. A model for early intervention in reading is recommended, combining reading and writing to address the linguistic challenges of students in a minority-language community. (Author/MSE)

  10. Medical Readiness of the Reserve Component

    Science.gov (United States)

    2012-01-01

    ...Health and Nutrition Examination Survey showed that from 1999 to 2002, 27 percent of those 20 to 39 years old and 21 percent of those 40 to 59 years old... readiness; these include tests for glucose-6-phosphate dehydrogenase or hemoglobin S (sickle cell disease), but they are not part of the DoD core...

  11. The Pediatrician's Role in Optimizing School Readiness.

    Science.gov (United States)

    2016-09-01

    School readiness includes not only the early academic skills of children but also their physical health, language skills, social and emotional development, motivation to learn, creativity, and general knowledge. Families and communities play a critical role in ensuring children's growth in all of these areas and thus their readiness for school. Schools must be prepared to teach all children when they reach the age of school entry, regardless of their degree of readiness. Research on early brain development emphasizes the effects of early experiences, relationships, and emotions on creating and reinforcing the neural connections that are the basis for learning. Pediatricians, by the nature of their relationships with families and children, may significantly influence school readiness. Pediatricians have a primary role in ensuring children's physical health through the provision of preventive care, treatment of illness, screening for sensory deficits, and monitoring nutrition and growth. They can promote and monitor the social-emotional development of children by providing anticipatory guidance on development and behavior, by encouraging positive parenting practices, by modeling reciprocal and respectful communication with adults and children, by identifying and addressing psychosocial risk factors, and by providing community-based resources and referrals when warranted. Cognitive and language skills are fostered through timely identification of developmental problems and appropriate referrals for services, including early intervention and special education services; guidance regarding safe and stimulating early education and child care programs; and promotion of early literacy by encouraging language-rich activities such as reading together, telling stories, and playing games. Pediatricians are also well positioned to advocate not only for children's access to health care but also for high-quality early childhood education and evidence-based family supports such as

  12. Ready for kindergarten: Are intelligence skills enough?

    OpenAIRE

    Caroline Fitzpatrick

    2017-01-01

    This study investigated how different profiles of kindergarten readiness in terms of student intellectual ability, academic skills and classroom engagement relate to future academic performance. Participants are French-Canadian children followed in the context of the Quebec Longitudinal Study of Child Development (N = 670). Trained examiners measured number knowledge, receptive vocabulary and fluid intelligence when children were in kindergarten. Teachers rated kindergarten classroom engageme...

  13. Technology Readiness for the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Kirkham, Harold; Marinovici, Maria C.; Fitzpatrick, G.; Lindsey, K.; McBride, James; Clark, G. L.

    2013-06-30

    Reluctance to adopt new technology into a utility application is understandable, given the critical nature of the infrastructure and the less-than-ideal experiences of some power companies. The authors of this paper have considered adapting the NASA approach of classifying technology readiness, but find it not quite appropriate because NASA was both the developer and the eventual user of the new technology it was evaluating, whereas a utility is ordinarily in the role of a customer, acquiring a new product from a manufacturer. Instead of a generic scale of technology readiness, a scale of readiness is proposed specifically for the smart grid, based on the many standards that exist for the relevant technologies. In this paper we present an overall structure for organizing those standards. The acceptance of new technology is organized into five SGL (Smart Grid Level) steps, numbered 5 through 9 to correspond approximately to the last five numbers of the NASA TRL scale. SGL 5 is a certification that the hardware and software of the technology are safe for the system into which it is intended to be placed. SGL 6 is documentation that the system is safe for itself and will have adequate reliability. The steps thus differ from NASA's TRL in that technology development is not required; the transition is more one of documenting already-existing system readiness. Since SGL 6 describes a system that is safe for the power system and for itself, it should not be restricted from use in a pilot-scale study, thereby achieving SGL 7. A larger-scale demonstration in a realistic environment will demonstrate interoperability and achieve SGL 8. Only when systems are installed and operating, and when disposal plans are in place, will the designation of fully operable at SGL 9 be granted.
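
    A minimal sketch of how the SGL scale could be encoded for a readiness checklist follows; the one-line descriptions paraphrase the levels above, and the cumulative-evidence rule is an assumption made for illustration.

        # One-line paraphrases of the SGL levels described in the abstract.
        SGL = {
            5: "hardware/software certified safe for the host power system",
            6: "documented as safe for itself, with adequate reliability",
            7: "demonstrated in a pilot-scale study",
            8: "larger-scale demonstration in a realistic environment",
            9: "installed, operating, disposal plans in place (fully operable)",
        }

        def highest_level_met(evidence):
            """Return the highest SGL whose prerequisites (levels 5..n) are
            all met; `evidence` maps level -> bool."""
            level = None
            for lvl in sorted(SGL):
                if evidence.get(lvl, False):
                    level = lvl
                else:
                    break
            return level

        print(highest_level_met({5: True, 6: True, 7: False}))  # -> 6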

  14. Shelf-Life Specifications for Mission Readiness

    Science.gov (United States)

    1993-03-01

    ...NSN: 7930009353794; Name: Polish, Plastic; Description: White lotion with a slight odor... Abstract: The Navy disposes of tons of hazardous material as hazardous waste due to the expiration of excessively conservative shelf-life terms. In order to reduce this occurrence, the...

  15. DOE Richland readiness review for PUREX

    International Nuclear Information System (INIS)

    Zamorski, M.J.

    1984-01-01

    For ten months prior to the November 1983 startup of the Plutonium and URanium EXtraction (PUREX) Plant, the Department of Energy's Richland Operations Office conducted an operational readiness review of the facility. This review was performed consistent with DOE and RL Order 5481.1 and in accordance with written plans prepared by the program and safety divisions. It involved personnel from five divisions within the office. The DOE review included two tasks: (1) overview and evaluation of the operating contractor's (Rockwell Hanford) readiness review for PUREX, and (2) independent assessment of 25 significant aspects of the startup effort. The RL reviews were coordinated by the program division and were phased in succession with the contractor's readiness review. As deficiencies or concerns were noted in the course of the review they were documented and required formal response from the contractor. Startup approval was given in three steps as the PUREX Plant began operation. A thorough review was performed and necessary documentation was prepared to support startup authorization in November 1983, before the scheduled startup date

  16. NGNP Infrastructure Readiness Assessment: Consolidation Report

    International Nuclear Information System (INIS)

    Castle, Brian K.

    2011-01-01

    The Next Generation Nuclear Plant (NGNP) project supports the development, demonstration, and deployment of high temperature gas-cooled reactors (HTGRs). The NGNP project is being reviewed by the Nuclear Energy Advisory Council (NEAC) to provide input to the DOE, who will make a recommendation to the Secretary of Energy on whether or not to continue with Phase 2 of the NGNP project. The NEAC review will be based, in part, on the infrastructure readiness assessment, which is an assessment of industry's current ability to provide specified components for the FOAK NGNP, meet quality assurance requirements, transport components, have the necessary workforce in place, and have the necessary construction capabilities. AREVA and Westinghouse were contracted to perform independent assessments of industry's capabilities because of their experience with nuclear supply chains, which is a result of their experiences with the EPR and AP-1000 reactors. Both vendors produced infrastructure readiness assessment reports that identified key components and categorized these components into three groups based on their ability to be deployed in the FOAK plant. The NGNP project has several programs that are developing key components and capabilities. For these components, the NGNP project has provided input to properly assess the infrastructure readiness for these components.

  17. Assessing students' readiness towards e-learning

    Science.gov (United States)

    Rahim, Nasrudin Md; Yusoff, Siti Hawa Mohd; Latif, Shahida Abd

    2014-07-01

    The usage of e-Learning methodology has become a new attraction for potential students, as shown by some higher learning institutions in Malaysia. As such, Universiti Selangor (Unisel) should be ready to embark on e-Learning teaching and learning in the near future. The purpose of the study is to gauge the readiness of Unisel's students for an e-Learning environment. A sample of 110 students was chosen to participate in this study, which was conducted in January 2013. This sample consisted of students from various levels of study: foundation, diploma and degree programs. Using a structured questionnaire, respondents were assessed on their basic Internet skills, access to the technology required for e-Learning, and their attitude towards the characteristics of a successful e-Learning student based on study habits, abilities, motivation and time management behaviour. The results showed that respondents did have access to the technology required for an e-Learning environment and were knowledgeable regarding basic Internet skills. The findings also showed that respondents' attitudes met all the characteristics of a successful e-Learning student. Further analysis showed no significant relationship between gender, level of study or faculty and those characteristics. In conclusion, the study shows that current Unisel students are ready to participate in an e-Learning environment if the institution decides to embark on an e-Learning methodology.
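
    The "no significant relationship" finding suggests tests of independence between demographic variables and readiness characteristics. Below is a minimal sketch of such a test on a made-up contingency table (not the study's data), assuming SciPy is available.

        from scipy.stats import chi2_contingency

        # Made-up counts: rows are gender, columns are ready / not ready.
        table = [[38, 17],   # male
                 [41, 14]]   # female

        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
        # p > 0.05 would be consistent with "no significant relationship".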

  18. NGNP Infrastructure Readiness Assessment: Consolidation Report

    Energy Technology Data Exchange (ETDEWEB)

    Brian K Castle

    2011-02-01

    The Next Generation Nuclear Plant (NGNP) project supports the development, demonstration, and deployment of high temperature gas-cooled reactors (HTGRs). The NGNP project is being reviewed by the Nuclear Energy Advisory Council (NEAC) to provide input to the DOE, who will make a recommendation to the Secretary of Energy on whether or not to continue with Phase 2 of the NGNP project. The NEAC review will be based, in part, on the infrastructure readiness assessment, which is an assessment of industry's current ability to provide specified components for the FOAK NGNP, meet quality assurance requirements, transport components, have the necessary workforce in place, and have the necessary construction capabilities. AREVA and Westinghouse were contracted to perform independent assessments of industry's capabilities because of their experience with nuclear supply chains, which is a result of their experiences with the EPR and AP-1000 reactors. Both vendors produced infrastructure readiness assessment reports that identified key components and categorized these components into three groups based on their ability to be deployed in the FOAK plant. The NGNP project has several programs that are developing key components and capabilities. For these components, the NGNP project has provided input to properly assess the infrastructure readiness for these components.

  19. Irradiation of ready-made meals - Lasagne

    International Nuclear Information System (INIS)

    Barkia, Ines

    2007-01-01

    The effect of ionizing radiation on the microbiological, nutritional, chemical and sensory quality of chilled ready-made meals was assessed. The ready meals used for this experimental work were lasagne. Following arrival at the semi-industrial cobalt-60 irradiation facility, the meals were either left unirradiated or irradiated with doses of 2 or 4 kGy, after which they were stored for up to 23 days at 3°C. Results showed that doses of 2 or 4 kGy of gamma irradiation decreased the total counts of mesophilic aerobic bacteria and increased the shelf-life of the lasagne. In terms of nutritional quality, it was found that losses of vitamins A and E due to the irradiation treatment were considerable at 4 kGy. Total acidity and pH were well within the acceptable limit for up to one week for ready meals treated with 2 and 4 kGy, whereas the peroxide index showed high values at 4 kGy. Sensory results showed no significant differences between the non-irradiated and irradiated meals at 2 kGy. However, the results were less promising at 4 kGy, since the differences were significant. (Author). 60 refs

  20. German contributions to the CMS computing infrastructure

    International Nuclear Information System (INIS)

    Scheurer, A

    2010-01-01

    The CMS computing model anticipates various hierarchically linked tier centres to counter the challenges provided by the enormous amounts of data which will be collected by the CMS detector at the Large Hadron Collider, LHC, at CERN. During the past years, various computing exercises were performed to test the readiness of the computing infrastructure, the Grid middleware and the experiment's software for the startup of the LHC which took place in September 2008. In Germany, several tier sites are set up to allow for an efficient and reliable way to simulate possible physics processes as well as to reprocess, analyse and interpret the numerous stored collision events of the experiment. It will be shown that the German computing sites played an important role during the experiment's preparation phase and during data-taking of CMS and, therefore, scientific groups in Germany will be ready to compete for discoveries in this new era of particle physics. This presentation focuses on the German Tier-1 centre GridKa, located at Forschungszentrum Karlsruhe, the German CMS Tier-2 federation DESY/RWTH with installations at the University of Aachen and the research centre DESY. In addition, various local computing resources in Aachen, Hamburg and Karlsruhe are briefly introduced as well. It will be shown that an excellent cooperation between the different German institutions and physicists led to well established computing sites which cover all parts of the CMS computing model. Therefore, the following topics are discussed and the achieved goals and the gained knowledge are depicted: data management and distribution among the different tier sites, Grid-based Monte Carlo production at the Tier-2 as well as Grid-based and locally submitted inhomogeneous user analyses at the Tier-3s. Another important task is to ensure a proper and reliable operation 24 hours a day, especially during the time of data-taking. For this purpose, the meta-monitoring tool 'HappyFace', which was