WorldWideScience

Sample records for computing readiness challenge

  1. Computer Skills Training and Readiness to Work with Computers

    Directory of Open Access Journals (Sweden)

    Arnon Hershkovitz

    2016-05-01

    Full Text Available In today’s job market, computer skills are part of the prerequisites for many jobs. In this paper, we report on a study of readiness to work with computers (the dependent variable) among unemployed women (N=54) after they participated in a unique, web-supported training focused on computer skills and empowerment. Overall, the level of participants’ readiness to work with computers was much higher at the end of the course than at its beginning. During the analysis, we explored associations between this variable and variables from four categories: log-based (describing the online activity); computer literacy and experience; job-seeking motivation and practice; and training satisfaction. Only two variables were associated with the dependent variable: knowledge post-test duration and satisfaction with content. After building a prediction model for the dependent variable, another log-based variable was highlighted: the total number of actions on the course website throughout the course. Overall, our analyses shed light on the predominance of log-based variables over variables from other categories. These findings may hint at the need to develop new assessment tools for learners and trainees that take human-computer interaction into consideration when measuring self-efficacy variables.

  2. CBI students: ready for new challenges

    CERN Multimedia

    Antonella Del Rosso

    2015-01-01

    Twenty-seven students from four universities and over ten countries gathered at IdeaSquare to start their Challenge-Based Innovation (CBI) course. Labour mobility, food safety, literacy in the developing world and water safety are the four projects that the students will work on now that they are back at their home institutions. The final ideas and prototypes will be presented at CERN in December.   The CBI students enjoy some training sessions at IdeaSquare. (Image: Joona Kurikka for IdeaSquare). The intensive first week of the four-month CBI Mediterranean course took place from 14 to 18 September. The students, from four universities – ESADE, IED and UPC in Barcelona and UNIMORE in Italy – gathered at CERN to meet researchers and carry out need-finding and benchmarking studies. “The idea of CBI courses is to get multidisciplinary student teams and their instructors to collaborate with researchers at CERN to develop novel solutions that meet societal need...

  3. Challenges and insights for situated language processing: Comment on "Towards a computational comparative neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    Science.gov (United States)

    Knoeferle, Pia

    2016-03-01

    In his review article [19], Arbib outlines an ambitious research agenda: to accommodate within a unified framework the evolution, the development, and the processing of language in natural settings (implicating other systems such as vision). He does so with neuro-computationally explicit modeling in mind [1,2], inspired by research on the mirror neuron system in primates. Similar research questions have also received substantial attention from other scientists [3,4,12].

  4. Challenges in Computational Commutative Algebra

    OpenAIRE

    Abbott, John

    2006-01-01

    In this paper we consider a number of challenges from the point of view of the CoCoA project, one of whose tasks is to develop software specialized for computations in commutative algebra. Some of the challenges extend considerably beyond the boundary of commutative algebra, and are addressed to the computer algebra community as a whole.

  5. Data challenges in ATLAS computing

    CERN Document Server

    Vaniachine, A

    2003-01-01

    ATLAS computing is steadily progressing towards a highly functional software suite, plus a worldwide computing model which gives all of ATLAS equal access, of equal quality, to ATLAS data. A key component in the period before the LHC is a series of Data Challenges of increasing scope and complexity. The goals of the ATLAS Data Challenges are to validate the computing model, the complete software suite and the data model, and to ensure the correctness of the technical choices to be made. We are committed to 'common solutions' and look forward to the LHC Computing Grid being the vehicle for providing these in an effective way. In close collaboration between the Grid and Data Challenge communities, ATLAS is testing large-scale testbed prototypes around the world, deploying prototype components to integrate and test Grid software in a production environment, and running DC1 production at 39 'tier' centers in 18 countries on four continents.

  6. The challenge of computer mathematics.

    Science.gov (United States)

    Barendregt, Henk; Wiedijk, Freek

    2005-10-15

    Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on, the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations deal with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. There are also very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.
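
    The division of labour described here, in which the human supplies definitions and proofs while the machine checks that they are well formed and correct, can be illustrated with a tiny example in a modern proof assistant (Lean 4 syntax; an illustration of the idea, not taken from the paper):

```lean
-- The human states a theorem and supplies a proof term;
-- the checker verifies that the proof is well formed and correct.
theorem add_comm' (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b

-- Computation and proof interact: the kernel evaluates both sides
-- and accepts `rfl` because they reduce to the same value.
example : 2 + 2 = 4 := rfl
```

    Libraries of such checked results are exactly the "mathematician-friendly" infrastructure the abstract calls for.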

  7. CMS Software and Computing Ready for Run 2

    CERN Document Server

    Bloom, Kenneth

    2015-01-01

    In Run 1 of the Large Hadron Collider, software and computing was a strategic strength of the Compact Muon Solenoid experiment. The timely processing of data and simulation samples and the excellent performance of the reconstruction algorithms played an important role in the preparation of the full suite of searches used for the observation of the Higgs boson in 2012. In Run 2, the LHC will run at higher intensities and CMS will record data at a higher trigger rate. These new running conditions will provide new challenges for the software and computing systems. Over the two years of Long Shutdown 1, CMS has built upon the successes of Run 1 to improve the software and computing to meet these challenges. In this presentation we will describe the new features in software and computing that will once again put CMS in a position of physics leadership.

  8. Professional Readiness of Teachers to Use Computer Visualization Tools: A Crucial Drive

    Directory of Open Access Journals (Sweden)

    Elena V. Semenikhina

    2016-12-01

    Full Text Available The training of teachers involves the formation of skills which are meant to be used in their future professional activities. Given the exponential increase in information content, there is a need to look into the levels and components of the professional readiness of teachers to use computer visualization tools. This article describes the four levels of teachers’ readiness [passive, basic, conscious, active] to use computer visualization tools. These levels are based on the proposed components of teachers’ readiness [motivational, cognitive, technological, reflexive] to use these tools.

  9. Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain

    Science.gov (United States)

    Arbib, Michael A.

    2016-03-01

    We make the case for developing a Computational Comparative Neuroprimatology to inform the analysis of the function and evolution of the human brain. First, we update the mirror system hypothesis on the evolution of the language-ready brain by (i) modeling action, action recognition and opportunistic scheduling in macaque brains to hypothesize the nature of the last common ancestor of macaque and human (LCA-m); we then (ii) introduce dynamic brain modeling to show how apes could acquire gesture through ontogenetic ritualization, hypothesizing the nature of evolution from LCA-m to the last common ancestor of chimpanzee and human (LCA-c). We then (iii) hypothesize the role of imitation, pantomime, protosign and protospeech in biological and cultural evolution from LCA-c to Homo sapiens with a language-ready brain. Second, we suggest how cultural evolution in Homo sapiens led from protolanguages to full languages with grammar and compositional semantics. Third, we assess the similarities and differences between the dorsal and ventral streams in audition and vision as the basis for presenting and comparing two models of language processing in the human brain: (i) a model of the auditory dorsal and ventral streams in sentence comprehension; and (ii) a model of the visual dorsal and ventral streams in defining "what language is about" in both production and perception of utterances related to visual scenes. Together these provide the basis for (iii) a first step towards a synthesis and a look at challenges for further research.

  10. Security and Privacy in Fog Computing: Challenges

    OpenAIRE

    Mukherjee, Mithun; Matam, Rakesh; Shu, Lei; Maglaras, Leandros; Ferrag, Mohamed Amine; Choudhry, Nikumani; Kumar, Vikas

    2017-01-01

    The fog computing paradigm extends the storage, networking, and computing facilities of cloud computing toward the edge of the network, offloading the cloud data centers and reducing service latency for end users. However, the characteristics of fog computing give rise to new security and privacy challenges. The existing security and privacy measures for cloud computing cannot be directly applied to fog computing due to its features, such as mobility, heteroge...

  11. Global Health Governance Challenges 2016 - Are We Ready?

    Science.gov (United States)

    Kickbusch, Ilona

    2016-02-29

    The year 2016 could turn out to be a turning point for global health: new political realities and global insecurities will test governance and financing mechanisms in relation to both people and planet. Most importantly, political factors such as the global power shift and "the rise of the rest" will define the future of global health. A new mix of health inequity and security challenges has emerged, and the 2015 humanitarian and health crises have shown the limits of existing systems. The global health system, as well as the humanitarian system, will have to prove its capacity to respond and reform. The challenge ahead is deeply political, especially for the rising political actors. They are confronted with the consequences of a model of development that has neglected sustainability and equity, and was built on their exploitation. Some direction has been given by the path-breaking international conferences of 2015. In particular, the agreement on the Sustainable Development Goals (SDGs) and the Paris agreement on climate change will shape action. Conceptually, we will need a different understanding of global health and its ultimate goals: the health of people can no longer be seen as separate from the health of the planet, and wealth measured by parameters of growth will no longer ensure health. © 2016 by Kerman University of Medical Sciences.

  12. Global Health Governance Challenges 2016 – Are We Ready?

    Directory of Open Access Journals (Sweden)

    Ilona Kickbusch

    2016-06-01

    Full Text Available The year 2016 could turn out to be a turning point for global health: new political realities and global insecurities will test governance and financing mechanisms in relation to both people and planet. Most importantly, political factors such as the global power shift and “the rise of the rest” will define the future of global health. A new mix of health inequity and security challenges has emerged, and the 2015 humanitarian and health crises have shown the limits of existing systems. The global health system, as well as the humanitarian system, will have to prove its capacity to respond and reform. The challenge ahead is deeply political, especially for the rising political actors. They are confronted with the consequences of a model of development that has neglected sustainability and equity, and was built on their exploitation. Some direction has been given by the path-breaking international conferences of 2015. In particular, the agreement on the Sustainable Development Goals (SDGs) and the Paris agreement on climate change will shape action. Conceptually, we will need a different understanding of global health and its ultimate goals: the health of people can no longer be seen as separate from the health of the planet, and wealth measured by parameters of growth will no longer ensure health.

  13. Using a Computer Simulation to Improve Psychological Readiness for Job Interviewing in Unemployed Individuals of Pre-Retirement Age.

    Science.gov (United States)

    Aysina, Rimma M; Efremova, Galina I; Maksimenko, Zhanna A; Nikiforov, Mikhail V

    2017-05-01

    Unemployed individuals of pre-retirement age face significant challenges in finding a new job. This may be partly due to their lack of psychological readiness to go through a job interview. We view psychological readiness as one component of a psychological attitude: an active, conscious readiness to interact with a certain aspect of reality, based on previously acquired experience. It includes a person's special competence to manage their activities and cope with anxiety. We created Job Interview Simulation Training (JIST), a computer-based simulator which allowed unemployed job seekers to practice interviewing repeatedly in a stress-free environment. We hypothesized that completion of JIST would be related to an increase in pre-retirement job seekers' psychological readiness for job interviewing in real life. Participants were randomized into control (n = 18) and experimental (n = 21) conditions. Both groups completed pre- and post-intervention job interview role-plays and self-report forms on psychological readiness for job interviewing. JIST consisted of five sessions of a simulated job interview, and the experimental group found it easy to use and navigate as well as helpful for preparing for interviewing. After finishing the JIST sessions, the experimental group showed a significant decrease in heart rate during the post-intervention role-play and a significant increase in self-rated psychological readiness, whereas the control group showed no changes in these variables. Future research may help clarify whether JIST is related to an increase in re-employment of pre-retirement job seekers.
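
    The evaluation described above is a randomized pre/post design: each group is measured before and after the intervention, and the within-subject change is compared across groups. A minimal sketch of that comparison, with made-up illustrative ratings (not the study's data; variable names are hypothetical):

```python
from statistics import mean

def mean_change(pre, post):
    """Mean within-subject change (post minus pre) across participants."""
    return mean(b - a for a, b in zip(pre, post))

# Hypothetical self-rated readiness scores on a 1-10 scale; NOT the published data.
experimental_pre  = [4, 5, 3, 6, 4]
experimental_post = [7, 7, 6, 8, 6]
control_pre  = [5, 4, 6, 5, 4]
control_post = [5, 5, 6, 4, 4]

# The intervention effect is suggested by a larger mean change in the
# experimental group than in the control group.
print(mean_change(experimental_pre, experimental_post))  # → 2.4
print(mean_change(control_pre, control_post))            # → 0
```

    A real analysis would add a significance test on the difference in changes between groups; this sketch only shows the quantity being compared.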

  14. New LHCb Management readies for run 2 challenges

    CERN Multimedia

    Antonella Del Rosso

    2014-01-01

    As of 1 July, LHCb, one of the four biggest experiments at the LHC, will have a new Management. Ahead are the huge challenges of run 2 and the following long technical shutdown during which LHCb will undergo a major upgrade. In the meantime, the discovery of new physics could be a dream within reach…   New LHCb Spokesperson, Guy Wilkinson.   “We have to make sure that the detector wakes up after its long hibernation and goes back to data taking in the most efficient way and that we are able to process all these data to produce high-quality physics results,” says Guy Wilkinson, new Spokesperson of the LHCb collaboration. Although this already sounds like a considerable “to-do” list for the coming months, it’s just the beginning of a much longer and ambitious plan. “The previous management has done an excellent job in analysing the data we took during run 1. They also put on a very sound footing the LHCb upgrade, whi...

  15. Challenges in Computer Security Education

    Science.gov (United States)

    1997-10-01

    Although awareness is increasing about the need for better computer security, to actually move in that direction we need people who know what they want... group met to discuss some of these issues at the First ACM Workshop on Education in Computer Security, held in Monterey, California. Representatives from

  16. Computational medicine tools and challenges

    CERN Document Server

    Trajanoski, Zlatko

    2014-01-01

    This book covers a number of contemporary computational medicine topics spanning scales from molecular to cell to organ and organism, presenting a state-of-the-art IT infrastructure, and reviewing four hierarchical scales.

  17. Computing challenges of the CMS experiment

    Science.gov (United States)

    Krammer, N.; Liko, D.

    2017-06-01

    The success of the LHC experiments is due to the magnificent performance of the detector systems and the excellent operation of the computing systems. The CMS offline software and computing system is successfully fulfilling the LHC Run 2 requirements. For the increased data rates of future LHC operation, together with high-pileup interactions, improvements in the usage of the current computing facilities and the adoption of new technologies have become necessary. For the challenge of the future HL-LHC in particular, a more flexible and sophisticated computing model is needed. In this presentation, I will discuss the current computing system used in LHC Run 2 and future computing facilities for the HL-LHC runs using flexible computing technologies such as commercial and academic computing clouds. The cloud resources are highly virtualized and can be deployed for a variety of computing tasks, providing the capacity for the increasing needs of large-scale scientific computing.

  18. Beyond Moore computing research challenge workshop report.

    Energy Technology Data Exchange (ETDEWEB)

    Huey, Mark C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aidun, John Bahram [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-10-01

    We summarize the presentations and breakout session discussions from the in-house workshop held on 11 July 2013 to acquaint a wider group of Sandians with the Beyond Moore Computing research challenge.

  19. Challenges and solutions in enterprise computing

    NARCIS (Netherlands)

    van Sinderen, Marten J.

    2008-01-01

    The emergence of the networked enterprise has a profound effect on enterprise computing. This introduction discusses some important challenges in enterprise computing, which are the result of the mentioned networking trend, and positions the articles of this special issue with respect to these challenges.

  20. Are Computer Science Students Ready for the Real World?

    Science.gov (United States)

    Elliot, Noreen

    The typical undergraduate program in computer science includes an introduction to hardware and operating systems, file processing and database organization, data communication and networking, and programming. However, many graduates may lack the ability to integrate the concepts "learned" into a skill set and pattern of approaching problems that…

  1. Cloud Computing: Opportunities and Challenges for Businesses

    Directory of Open Access Journals (Sweden)

    İbrahim Halil Seyrek

    2011-12-01

    Full Text Available Cloud computing represents a new approach to supplying and using information technology services. Considering its benefits for firms and the potential of the changes that it may lead to, it is envisioned that cloud computing can be the most important innovation in information technology since the development of the internet. In this study, firstly, the development of cloud computing and related technologies is explained and classified by giving current application examples. Then the benefits of this new computing model for businesses are elaborated, especially in terms of cost, flexibility and service quality. In spite of its benefits, cloud computing also poses some risks for firms, of which security is one of the most important, and there are some challenges in its implementation. This study points out the risks that companies should be wary of and some legal challenges related to cloud computing. Lastly, approaches that companies may take toward cloud computing and different strategies that they may adopt are discussed, and some recommendations are made.

  2. Microbiological challenge testing for Listeria monocytogenes in ready-to-eat food: a practical approach

    Directory of Open Access Journals (Sweden)

    Carlo Spanu

    2014-12-01

    Full Text Available Food business operators (FBOs) are primarily responsible for the safety of the food they place on the market. The definition and validation of a product’s shelf-life is an essential part of ensuring the microbiological safety of food and the health of consumers. In the frame of Regulation (EC) No 2073/2005 on microbiological criteria for foodstuffs, FBOs shall conduct shelf-life studies in order to assure that their food does not exceed the food safety criteria throughout the defined shelf-life. In particular, this is required for ready-to-eat (RTE) food that supports the growth of Listeria monocytogenes. Among other studies, FBOs can rely on the conclusions drawn from microbiological challenge tests. A microbiological challenge test consists of the artificial contamination of a food with a pathogenic microorganism and aims at simulating its behaviour during processing and distribution under the foreseen storage and handling conditions. A number of documents published by international health authorities and research institutions describe how to conduct challenge studies. The authors reviewed the existing literature and described the methodology for implementing such laboratory studies. All the main aspects of the conduct of L. monocytogenes microbiological challenge tests were considered, from the selection of the strains, the preparation and choice of the inoculum level and the method of contamination, to the experimental design and data interpretation. The objective of the present document is to provide an exhaustive and practical guideline for laboratories that want to implement L. monocytogenes challenge testing on RTE food.
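
    A core quantity in such challenge studies is the growth potential: the difference between the log10 count at the end of the shelf-life and the log10 count of the inoculum. Under the EU technical guidance commonly used alongside Regulation (EC) No 2073/2005 (an assumption here; the abstract does not state the criterion), a growth potential above 0.5 log10 CFU/g classifies the food as able to support growth of L. monocytogenes. A minimal sketch with hypothetical counts:

```python
import math

# Classification threshold from the EURL Lm technical guidance (assumed here).
GROWTH_THRESHOLD_LOG10 = 0.5

def growth_potential(cfu_start, cfu_end):
    """Growth potential: log10(end count) - log10(start count), in log10 CFU/g."""
    return math.log10(cfu_end) - math.log10(cfu_start)

def supports_growth(cfu_start, cfu_end, threshold=GROWTH_THRESHOLD_LOG10):
    """True if the observed increase exceeds the classification threshold."""
    return growth_potential(cfu_start, cfu_end) > threshold

# Hypothetical counts: inoculated at 100 CFU/g, 5000 CFU/g at end of shelf-life.
delta = growth_potential(100, 5000)
print(f"{delta:.2f}", supports_growth(100, 5000))  # → 1.70 True
```

    In practice the guidance works with replicate batches and medians rather than single counts; this sketch only shows the arithmetic of the criterion.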

  3. New challenges in computational collective intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ngoc Thanh; Katarzyniak, Radoslaw Piotr [Wroclaw Univ. of Technology (Poland). Inst. of Informatics; Janiak, Adam (eds.) [Wroclaw Univ. of Technology (Poland). Inst. of Computer Engineering, Control and Robotics

    2009-07-01

    The book consists of 29 chapters which have been selected and invited from the submissions to the 1st International Conference on Collective Intelligence - Semantic Web, Social Networks and Multiagent Systems (ICCCI 2009). All chapters in the book discuss various examples of applications of computational collective intelligence and related technologies to such fields as the semantic web, information systems ontologies, social networks, and agent and multiagent systems. The editors hope that the book will be useful for graduate and Ph.D. students in Computer Science, in particular participants in courses on Soft Computing, Multi-Agent Systems and Robotics. The book can also be useful for researchers working on the concept of computational collective intelligence in artificial populations. It is the hope of the editors that readers of this volume will find many inspiring ideas and use them to create new cases of intelligent collectives. Many such challenges are suggested by the particular approaches and models presented in the individual chapters of this book. (orig.)

  4. Taking the Challenge at Singer Village--A Cold Climate Zero Energy Ready Home

    Energy Technology Data Exchange (ETDEWEB)

    Puttagunta, S.; Faakye, O.

    2014-10-01

    After progressively incorporating ENERGY STAR® for Homes Versions 1, 2, and 3 into its standard practices over the years, this builder, Brookside Development, was seeking to build an even more sustainable product that would further increase energy efficiency while also addressing indoor air quality, water conservation, renewable readiness, and resiliency. These objectives align with the framework of the DOE Challenge Home program, which 'builds upon the comprehensive building science requirements of ENERGY STAR for Homes Version 3, along with proven Building America innovations and best practices. Other special attribute programs are incorporated to help builders reach unparalleled levels of performance with homes designed to last hundreds of years.' CARB (the Consortium for Advanced Residential Buildings) partnered with Brookside Development on the design optimization and construction of the first home in a small development of seven planned new homes being built on the old Singer Estate in Derby, CT.

  5. Taking the Challenge at Singer Village. A Cold Climate Zero Energy Ready Home

    Energy Technology Data Exchange (ETDEWEB)

    Puttagunta, S. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States); Faakye, O. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2014-10-01

    After progressively incorporating ENERGY STAR® for Homes Versions 1, 2, and 3 into its standard practices over the years, this builder, Brookside Development, was seeking to build an even more sustainable product that would further increase energy efficiency while also addressing indoor air quality, water conservation, renewable readiness, and resiliency. These objectives align with the framework of the DOE Challenge Home program, which "builds upon the comprehensive building science requirements of ENERGY STAR for Homes Version 3, along with proven Building America innovations and best practices. Other special attribute programs are incorporated to help builders reach unparalleled levels of performance with homes designed to last hundreds of years." The Consortium for Advanced Residential Buildings (CARB) partnered with Brookside Development on the design optimization and construction of the first home in a small development of seven planned new homes being built on the old Singer Estate in Derby, CT.

  6. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    2011-01-01

    This chapter considers the opportunities that are present in developing, extending and applying aspects of computer-aided modelling principles and practice. What are the best tasks to be done by modellers, and what needs the application of CAPE tools? How do we efficiently develop model-based solutions to significant problems? The important issues of workflow and data flow are discussed together with fit-for-purpose model development. As well, the lack of tools around multiscale modelling provides opportunities for the development of efficient tools to address such challenges. The ability to easily generate new models from underlying phenomena continues to be a challenge, especially in the face of time and cost constraints. Integrated frameworks that allow flexibility of model development and access to a range of embedded tools are central to future model developments. The challenges...

  7. The Challenge of Massively Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    WOMBLE,DAVID E.

    1999-11-03

    Since the mid-1980s, there have been a number of commercially available parallel computers with hundreds or thousands of processors. These machines have provided a new capability to the scientific community, and they have been used by scientists and engineers, although with varying degrees of success. One of the reasons for the limited success is the difficulty, or perceived difficulty, of developing code for these machines. In this paper we discuss many of the issues and challenges in developing scalable hardware, system software and algorithms for machines comprising hundreds or thousands of processors.

  8. Computer Network Security- The Challenges of Securing a Computer Network

    Science.gov (United States)

    Scotti, Vincent, Jr.

    2011-01-01

    This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.

  9. Readiness for Change: Case Studies of Young People with Challenging and Risky Behaviours

    Science.gov (United States)

    Carroll, Annemaree; Ashman, Adrian; Bower, Julie; Hemingway, Francene

    2013-01-01

    Readiness for change (or treatment readiness) is a core concept of many rehabilitation programs for adult and juvenile offenders. The present study examined the experiences of six young people aged 13 to 17 years who participated in Mindfields[R], a 6-week self-regulatory intervention aimed at enhancing life skills and goal setting among youths…

  10. Achievements and Challenges in Computational Protein Design.

    Science.gov (United States)

    Samish, Ilan

    2017-01-01

    Computational protein design (CPD), a still-evolving field, includes computer-aided engineering for partial or full de novo design of proteins of interest. Designs are defined by a requested structure, function, or working environment. This chapter describes the birth and maturation of the field by presenting 101 CPD examples in chronological order, emphasizing achievements and pending challenges. Integrating these aspects presents the plethora of CPD approaches with the hope of providing a "CPD 101". These reflect on the broader structural bioinformatics and computational biophysics field and include: (1) integration of knowledge-based and energy-based methods, (2) hierarchical designated approach towards local, regional, and global motifs and the integration of high- and low-resolution design schemes that fit each such region, (3) systematic differential approaches towards different protein regions, (4) identification of key hot-spot residues and the relative effect of remote regions, (5) assessment of shape complementarity, electrostatics and solvation effects, (6) integration of thermal plasticity and functional dynamics, (7) negative design, (8) systematic integration of experimental approaches, (9) objective cross-assessment of methods, and (10) successful ranking of potential designs. Future challenges also include dissemination of CPD software for the general use of life-sciences researchers and the emphasis of success within an in vivo milieu. CPD increases our understanding of protein structure and function and the relationships between the two, along with the application of such know-how for the benefit of mankind. Applied aspects range from biological drugs, via healthier and tastier food products, to nanotechnology and environmentally friendly enzymes replacing toxic chemicals utilized in industry.

  11. Are we ready to accept the challenge? Addressing the shortcomings of contemporary qualitative health research.

    Science.gov (United States)

    Lau, Sofie Rosenlund; Traulsen, Janine M

    Qualitative approaches represent an important contributor to health care research. However, several researchers argue that contemporary qualitative research does not live up to its full potential. By presenting a snapshot of contemporary qualitative research in the field of social and administrative pharmacy, this study challenges contributors to the field by asking: Are we ready to accept the challenge and take qualitative research one step further? The purpose of this study was to initiate a constructive dialogue on the need for increased transparency in qualitative data analysis, including explicitly reflecting upon theoretical perspectives affecting the research process. Content analysis was used to evaluate levels of theoretical visibility and analysis transparency in selected qualitative research articles published in Research in Social and Administrative Pharmacy between January 2014 and January 2015. In 14 out of 21 assessed papers, the use of theory was found to be Seemingly Absent (lowest level of theory use), and the data analyses did not include any interpretive endeavors. Only two papers consistently applied theory throughout the entire study and clearly took the data analyses from a descriptive to an interpretive level. It was found that the aim of the majority of assessed papers was to change or modify a given practice, which, however, resulted in a lack of both theoretical underpinnings and analysis transparency. This study takes the standpoint that theory and high-quality analysis go hand-in-hand. Based on the content analysis, articles that were deemed to be high in quality were explicit about the theoretical framework of their study and transparent in how they analyzed their data. It was found that theory contributed to the transparency of how the data were analyzed and interpreted.
Two ways of improving contemporary qualitative research in the field of social and administrative pharmacy are discussed: engaging with social theory and establishing

  12. Computational Challenges in Nuclear Weapons Simulation

    Energy Technology Data Exchange (ETDEWEB)

    McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

    2003-08-29

    After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

  13. CMS Computing Software and Analysis Challenge 2006

    Science.gov (United States)

    De Filippis, N.; CMS Collaboration

    2007-10-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. To this end, the Computing, Software and Analysis Challenge 2006 (CSA06) started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain: prompt reconstruction, data streaming, iterative calibration and alignment executions, data distribution to regional sites, up to end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, providing a user-friendly interface for the physicists submitting production and analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  14. Computational gene expression modeling identifies salivary biomarker analysis that predict oral feeding readiness in the newborn.

    Science.gov (United States)

    Maron, Jill L; Hwang, Jooyeon S; Pathak, Subash; Ruthazer, Robin; Russell, Ruby L; Alterovitz, Gil

    2015-02-01

    To combine mathematical modeling of salivary gene expression microarray data and systems biology annotation with reverse-transcription quantitative polymerase chain reaction amplification to identify (phase I) and validate (phase II) salivary biomarker analysis for the prediction of oral feeding readiness in preterm infants. Comparative whole-transcriptome microarray analysis from 12 preterm newborns pre- and post-oral feeding success was used for computational modeling and systems biology analysis to identify potential salivary transcripts associated with oral feeding success (phase I). Selected gene expression biomarkers (15 from computational modeling; 6 evidence-based; and 3 reference) were evaluated by reverse-transcription quantitative polymerase chain reaction amplification on 400 salivary samples from successful (n = 200) and unsuccessful (n = 200) oral feeders (phase II). Genes, alone and in combination, were evaluated by a multivariate analysis controlling for sex and postconceptional age (PCA) to determine the probability that newborns achieved successful oral feeding. Five genes, neuropeptide Y2 receptor (hunger signaling), adenosine monophosphate-activated protein kinase (energy homeostasis), plexin A1 (olfactory neurogenesis), nephronophthisis 4 (visual behavior), and wingless-type MMTV integration site family, member 3 (facial development), in addition to advancing PCA and sex, demonstrated good accuracy for determining feeding success (area under the receiver operating characteristic curve = 0.78). We have identified objective and biologically relevant salivary biomarkers that noninvasively assess a newborn's developing brain, sensory, and facial development as they relate to oral feeding success. Understanding the mechanisms that underlie the development of oral feeding readiness through translational and computational methods may improve clinical decision making while decreasing morbidities and health care costs. Copyright © 2015 Elsevier Inc
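    The panel's reported discrimination (area under the receiver operating characteristic curve = 0.78) can be illustrated with a minimal sketch of how such an AUC is computed; the labels and predicted probabilities below are invented toy values, not data from the study.

```python
# Sketch: area under the ROC curve (AUC) for a binary classifier,
# the metric behind the reported 0.78. All values below are invented
# toy data, not measurements from the study.

def roc_auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case outscores a randomly chosen negative
    case, counting ties as one half."""
    pos = [s for lab, s in zip(labels, scores) if lab == 1]
    neg = [s for lab, s in zip(labels, scores) if lab == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical predicted probabilities of successful oral feeding for
# five successful (label 1) and five unsuccessful (label 0) feeders.
labels = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.75, 0.6, 0.4, 0.7, 0.5, 0.35, 0.3, 0.2]
print(round(roc_auc(labels, scores), 2))  # prints 0.88
```

    An AUC of 0.5 corresponds to chance-level prediction and 1.0 to perfect separation, which puts the study's 0.78 in context.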

  15. Assessing Learners' Perceived Readiness for Computer-Supported Collaborative Learning (CSCL): A Study on Initial Development and Validation

    Science.gov (United States)

    Xiong, Yao; So, Hyo-Jeong; Toh, Yancy

    2015-01-01

    The main purpose of this study was to develop an instrument that assesses university students' perceived readiness for computer-supported collaborative learning (CSCL). Assessment in CSCL research has predominantly focused on measuring "after-collaboration" outcomes and "during-collaboration" behaviors while…

  16. "Defining Computer 'Speed': An Unsolved Challenge"

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Abstract: The reason we use computers is their speed, and the reason we use parallel computers is that they're faster than single-processor computers. Yet, after 70 years of electronic digital computing, we still do not have a solid definition of what computer 'speed' means, or even what it means to be 'faster'. Unlike measures in physics, where the definition of speed is rigorous and unequivocal, in computing there is no definition of speed that is universally accepted. As a result, computer customers have made purchases misguided by dubious information, computer designers have optimized their designs for the wrong goals, and computer programmers have chosen methods that optimize the wrong things. This talk describes why some of the obvious and historical ways of defining 'speed' haven't served us well, and the things we've learned in the struggle to find a definition that works. Biography: Dr. John Gustafson is a Director ...

  17. CLOUD COMPUTING OVERVIEW AND CHALLENGES: A REVIEW PAPER

    OpenAIRE

    Satish Kumar*, Vishal Thakur, Payal Thakur, Ashok Kumar Kashyap

    2017-01-01

    Cloud computing is the most resourceful, elastic and scalable era of internet technology, allowing computing resources to be used over the internet successfully. Cloud computing provides not only speed, accuracy, storage capacity and efficiency for computing, but has also helped propagate green computing and resource utilization. In this research paper, a brief description of cloud computing, cloud services and cloud security challenges is given. Also the literature review o...

  18. The Coming Challenge: Are Community Colleges Ready for the New Wave of Contextual Learners?

    Science.gov (United States)

    Hull, Dan; Souders, John C., Jr.

    1996-01-01

    Defines contextual learning, or presenting new information to students in familiar contexts. Argues that community colleges must be ready for an anticipated increase in contextual learners due to its use in tech prep programs. Describes elements of contextual learning, its application in the classroom, and ways that colleges can prepare for…

  19. Developing MOOCs to Narrow the College Readiness Gap: Challenges and Recommendations for a Writing Course

    Science.gov (United States)

    Bandi-Rao, Shoba; Devers, Christopher J.

    2015-01-01

    Massive Open Online Courses (MOOCs) have demonstrated the potential to deliver quality and cost effective course materials to large numbers of students. Approximately 60% of first-year students at community colleges are underprepared for college-level coursework. One reason for low graduation rates is the lack of the overall college readiness.…

  20. Female challenges in acquiring computer education at the federal ...

    African Journals Online (AJOL)

    These challenges were poor academic background, molestation of female students, negative perception of formal computer education, and gender discrimination, among others. To ameliorate these challenges, the study recommended introduction of computer science in all primary/secondary schools, campaign against female ...

  1. Homogeneous Buchberger algorithms and Sullivant's computational commutative algebra challenge

    DEFF Research Database (Denmark)

    Lauritzen, Niels

    2005-01-01

    We give a variant of the homogeneous Buchberger algorithm for positively graded lattice ideals. Using this algorithm we solve the Sullivant computational commutative algebra challenge.
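    For readers unfamiliar with the machinery behind such results, the core Buchberger loop (S-polynomials plus multivariate division) can be sketched in a few dozen lines. This is a plain generic Buchberger over the rationals with lex order, not the paper's homogeneous variant for positively graded lattice ideals, and the example ideal is invented for illustration.

```python
from fractions import Fraction

# Generic Buchberger sketch over Q, lex order. Monomials are exponent
# tuples; a polynomial is a dict {monomial: Fraction coefficient}.

def lead(p):
    """Leading monomial under lex order (tuples compare lexicographically)."""
    return max(p)

def sub(p, q, coeff, mono):
    """Return p - coeff * x^mono * q."""
    r = dict(p)
    for m, c in q.items():
        m2 = tuple(a + b for a, b in zip(m, mono))
        r[m2] = r.get(m2, Fraction(0)) - coeff * c
        if r[m2] == 0:
            del r[m2]
    return r

def reduce_poly(p, basis):
    """Top-reduce p by the basis until its leading term is irreducible."""
    p = dict(p)
    changed = True
    while p and changed:
        changed = False
        lm = lead(p)
        for g in basis:
            lg = lead(g)
            if all(a >= b for a, b in zip(lm, lg)):
                p = sub(p, g, p[lm] / g[lg], tuple(a - b for a, b in zip(lm, lg)))
                changed = True
                break
    return p

def s_poly(f, g):
    """S-polynomial: cancel the leading terms of f and g against their lcm."""
    lf, lg = lead(f), lead(g)
    lcm = tuple(max(a, b) for a, b in zip(lf, lg))
    p = sub({}, f, Fraction(-1) / f[lf], tuple(a - b for a, b in zip(lcm, lf)))
    return sub(p, g, Fraction(1) / g[lg], tuple(a - b for a, b in zip(lcm, lg)))

def buchberger(gens):
    """Complete gens to a Groebner basis by adding non-zero S-poly remainders."""
    basis = [dict(g) for g in gens if g]
    pairs = [(i, j) for i in range(len(basis)) for j in range(i + 1, len(basis))]
    while pairs:
        i, j = pairs.pop()
        r = reduce_poly(s_poly(basis[i], basis[j]), basis)
        if r:
            basis.append(r)
            pairs += [(k, len(basis) - 1) for k in range(len(basis) - 1)]
    return basis

# Toy ideal <x^2 - y, x^3 - z> in Q[x, y, z] (exponents ordered x, y, z).
f1 = {(2, 0, 0): Fraction(1), (0, 1, 0): Fraction(-1)}
f2 = {(3, 0, 0): Fraction(1), (0, 0, 1): Fraction(-1)}
G = buchberger([f1, f2])
target = {(0, 3, 0): Fraction(1), (0, 0, 2): Fraction(-1)}  # y^3 - z^2
print(reduce_poly(target, G) == {})  # prints True: y^3 - z^2 lies in the ideal
```

    The membership test at the end relies on the defining property of a Gröbner basis: a polynomial lies in the ideal exactly when its normal form under division by the basis is zero.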

  2. Ready…, Set, Go!. Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    Science.gov (United States)

    Iriki, Atsushi

    2016-03-01

    'Language-READY brain' in the title of this article [1] seems to be the expression that the author prefers to use to illustrate his theoretical framework. The usage of the term 'READY' appears to be of extremely deep connotation, for three reasons. Firstly, of course it needs a 'principle' - the depth and the width of the computational theory depicted here is as expected from the author's reputation. However, 'readiness' implies that it is much more than just 'a theory'. That is, such a principle is not static, but rather has dynamic properties, which are ready to gradually proceed to flourish once brains are put in adequate conditions to make time progressions - namely, evolution and development. So the second major connotation is that this article brought in the perspectives of comparative primatology as a tool to relativise the language-realizing human brain among other animal species, primates in particular, in the context of the evolutionary time scale. The third connotation lies in the context of the developmental time scale. The author claims that it is the interaction of the newborn with its caretakers, namely its mother and other family or social members in its ecological conditions, that brings the brain mechanism subserving the language faculty to fully mature to its final completion. Taken together, this article proposes computational theories and mechanisms of Evo-Devo-Eco interactions for language acquisition in the human brain.

  3. End-User Computing: A Multifaceted Challenge.

    Science.gov (United States)

    Barone, Carole A.

    1988-01-01

    Data custodians are described as the "owners" of the data, and administrative computing staff as the data processors. New hardware and software have put processing tools into the hands of anyone who can afford them. The term "end-user computing" refers to the processing of data in the administrative units. (MLW)

  4. Advances and Challenges in Computational Plasma Science

    Energy Technology Data Exchange (ETDEWEB)

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  5. Challenges in computational statistics and data mining

    CERN Document Server

    Mielniczuk, Jan

    2016-01-01

    This volume contains nineteen research papers belonging to the areas of computational statistics, data mining, and their applications. Those papers, all written specifically for this volume, are their authors' contributions to honour and celebrate Professor Jacek Koronacki on the occasion of his 70th birthday. The book's related and often interconnected topics represent Jacek Koronacki's research interests and their evolution. They also clearly indicate how close the areas of computational statistics and data mining are.

  6. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    Science.gov (United States)

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  7. Biomedical Visual Computing: Case Studies and Challenges

    KAUST Repository

    Johnson, Christopher

    2012-01-01

    Advances in computational geometric modeling, imaging, and simulation let researchers build and test models of increasing complexity, generating unprecedented amounts of data. As recent research in biomedical applications illustrates, visualization will be critical in making this vast amount of data usable; it's also fundamental to understanding models of complex phenomena. © 2012 IEEE.

  8. Mathematical challenges from theoretical/computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-31

    The committee believes that this report has relevance and potentially valuable suggestions for a wide range of readers. Target audiences include: graduate departments in the mathematical and chemical sciences; federal and private agencies that fund research in the mathematical and chemical sciences; selected industrial and government research and development laboratories; developers of software and hardware for computational chemistry; and selected individual researchers. Chapter 2 of this report covers some history of computational chemistry for the nonspecialist, while Chapter 3 illustrates the fruits of some past successful cross-fertilization between mathematical scientists and computational/theoretical chemists. In Chapter 4 the committee has assembled a representative, but not exhaustive, survey of research opportunities. Most of these are descriptions of important open problems in computational/theoretical chemistry that could gain much from the efforts of innovative mathematical scientists, written so as to be accessible introductions to the nonspecialist. Chapter 5 is an assessment, necessarily subjective, of cultural differences that must be overcome if collaborative work is to be encouraged between the mathematical and the chemical communities. Finally, the report ends with a brief list of conclusions and recommendations that, if followed, could promote accelerated progress at this interface. Recognizing that bothersome language issues can inhibit prospects for collaborative research at the interface between distinctive disciplines, the committee has attempted throughout to maintain an accessible style, in part by using illustrative boxes, and has included at the end of the report a glossary of technical terms that may be familiar to only a subset of the target audiences listed above.

  9. Challenges & Roadmap for Beyond CMOS Computing Simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Arun F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Frank, Michael P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    Simulating HPC systems is a difficult task and the emergence of “Beyond CMOS” architectures and execution models will increase that difficulty. This document presents a “tutorial” on some of the simulation challenges faced by conventional and non-conventional architectures (Section 1) and goals and requirements for simulating Beyond CMOS systems (Section 2). These provide background for proposed short- and long-term roadmaps for simulation efforts at Sandia (Sections 3 and 4). Additionally, a brief explanation of a proof-of-concept integration of a Beyond CMOS architectural simulator is presented (Section 2.3).

  10. Cloud computing challenges, limitations and R&D solutions

    CERN Document Server

    Mahmood, Zaigham

    2014-01-01

    This important text/reference reviews the challenging issues that present barriers to greater implementation of the cloud computing paradigm, together with the latest research into developing potential solutions. Exploring the strengths and vulnerabilities of cloud provision and cloud environments, Cloud Computing: Challenges, Limitations and R&D Solutions provides case studies from a diverse selection of researchers and practitioners of international repute. The implications of emerging cloud technologies are also analyzed from the perspective of consumers. Topics and features: presents

  11. Computer graphics visions and challenges: a European perspective.

    Science.gov (United States)

    Encarnação, José L

    2006-01-01

    I have briefly described important visions and challenges in computer graphics. They are a personal and therefore subjective selection. But most of these issues have to be addressed and solved--no matter if we call them visions or challenges or something else--if we want to make and further develop computer graphics into a key enabling technology for our IT-based society.

  12. Security Certification Challenges in a Cloud Computing Delivery Model

    Science.gov (United States)

    2010-04-27

    Identification and Authentication (IA) – LDAP and Active Directory integration issues; immature concepts. Access Control (AC) – Customer... © 2010 The MITRE Corporation. All Rights Reserved.

  13. U.S. Workforce and Educational Facilities' Readiness to Meet the Future Challenges of Nuclear Energy

    Science.gov (United States)

    Mtingwa, Sekazi

    2008-04-01

    Using nuclear energy to generate electricity continues to be a topic of considerable debate. Currently, 20% of the electricity in the U.S. comes from its fleet of 104 commercial nuclear reactors, and they annually displace on the order of one hundred million metric tons of carbon emissions. These reactors currently account for 70% of the non-carbon emitting electricity production in the United States. Due to the recent interest by the Federal government and others in expanding the nuclear energy option, the American Physical Society's Panel on Public Affairs sponsored a study of the U.S. workforce and educational facilities' readiness for three scenarios out to the year 2050. They range from maintaining the current number of nuclear reactors, although some may be retired and replaced by new ones; significantly increasing the number of reactors, to perhaps as high as 200 or more; up to significantly increasing the number of reactors while closing the fuel cycle by reprocessing and recycling spent fuel. This talk reports on the results of that study.

  14. A Nanotechnology-Ready Computing Scheme based on a Weakly Coupled Oscillator Network

    Science.gov (United States)

    Vodenicarevic, Damir; Locatelli, Nicolas; Abreu Araujo, Flavio; Grollier, Julie; Querlioz, Damien

    2017-03-01

    With conventional transistor technologies reaching their limits, alternative computing schemes based on novel technologies are currently gaining considerable interest. Notably, promising computing approaches have proposed to leverage the complex dynamics emerging in networks of coupled oscillators based on nanotechnologies. The physical implementation of such architectures remains a true challenge, however, as most proposed ideas are not robust to nanotechnology devices’ non-idealities. In this work, we propose and investigate the implementation of an oscillator-based architecture, which can be used to carry out pattern recognition tasks, and which is tailored to the specificities of nanotechnologies. This scheme relies on a weak coupling between oscillators, and does not require a fine tuning of the coupling values. After evaluating its reliability under the severe constraints associated to nanotechnologies, we explore the scalability of such an architecture, suggesting its potential to realize pattern recognition tasks using limited resources. We show that it is robust to issues like noise, variability and oscillator non-linearity. Defining network optimization design rules, we show that nano-oscillator networks could be used for efficient cognitive processing.
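    A minimal sketch of the kind of oscillator-network computation described above: phase oscillators with weak Hebbian couplings (K_ij proportional to ξ_i ξ_j for a stored ±1 pattern ξ) relax into an in-phase/anti-phase configuration, recovering the stored pattern from a corrupted input. This is a generic Kuramoto-style toy model, not the authors' specific nano-oscillator architecture, and all parameters are invented.

```python
import math
import random

# Toy weakly coupled oscillator network for pattern recall. A pattern of
# +/-1 bits is stored in Hebbian couplings K_ij = (k/N) * xi_i * xi_j;
# phases encoding a corrupted copy relax into an in-phase/anti-phase
# state from which the stored pattern can be read back. All parameters
# are invented for illustration.

def recall(pattern, corrupted, steps=2000, dt=0.02, k=1.0, seed=0):
    rng = random.Random(seed)
    n = len(pattern)
    # Encode bits as phases (+1 -> 0, -1 -> pi) with small random jitter.
    theta = [(0.0 if b == 1 else math.pi) + rng.uniform(-0.5, 0.5)
             for b in corrupted]
    K = [[k / n * pattern[i] * pattern[j] for j in range(n)] for i in range(n)]
    for _ in range(steps):  # forward-Euler integration of Kuramoto dynamics
        d = [sum(K[i][j] * math.sin(theta[j] - theta[i]) for j in range(n))
             for i in range(n)]
        theta = [t + dt * v for t, v in zip(theta, d)]
    # Read each bit from its phase relative to oscillator 0; the global
    # sign ambiguity is fixed using the first stored bit.
    return [pattern[0] * (1 if math.cos(t - theta[0]) > 0 else -1)
            for t in theta]

pattern = [1, -1, 1, 1, -1, -1, 1, -1, 1, 1, -1, 1]
corrupted = list(pattern)
corrupted[3] = -corrupted[3]  # flip two bits to corrupt the input
corrupted[8] = -corrupted[8]
print(recall(pattern, corrupted) == pattern)  # prints True: pattern recovered
```

    Note that only the signs and rough magnitudes of the couplings matter here, which hints at why such weakly coupled schemes can tolerate the device variability the abstract emphasizes.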

  15. E-records readiness in the ESARBICA region: Challenges and the ...

    African Journals Online (AJOL)

    Managing e-records is one area that has always challenged archivists and records managers, especially in the developing countries, partly because of the following reasons: • Their creation, use and preservation requires acquisition of costly hardware and software. • The archivist and records manager may be required to ...

  16. Chips challenging champions games, computers and artificial intelligence

    CERN Document Server

    Schaeffer, J

    2002-01-01

    One of the earliest dreams of the fledgling field of artificial intelligence (AI) was to build computer programs that could play games as well as or better than the best human players. Despite early optimism in the field, the challenge proved to be surprisingly difficult. However, the 1990s saw amazing progress. Computers are now better than humans in checkers, Othello and Scrabble; are at least as good as the best humans in backgammon and chess; and are rapidly improving at hex, go, poker, and shogi. This book documents the progress made in computers playing games and puzzles. The book is the

  17. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  18. Challenges for the CMS Computing Model in the First Year

    CERN Document Server

    Fisk, Ian

    2009-01-01

    CMS is in the process of commissioning a complex detector and a globally distributed computing infrastructure simultaneously. This represents a unique challenge. Even at the beginning there is not sufficient analysis or organized processing resources at CERN alone. In this presentation we discuss the unique computing challenges CMS expects to face during the first year of running and how they influence the baseline computing model decisions. During the early accelerator commissioning periods, CMS will attempt to collect as many events as possible when the beam is on in order to provide adequate early commissioning data. Some of these plans involve overdriving the Tier-0 infrastructure during data collection with recovery when the beam is off. In addition to the larger number of triggered events, there will be pressure in the first year to collect and analyze more complete data formats as the summarized formats mature. The large event formats impact the required storage, bandwidth, and processing capacity acro...

  19. Mobile computing in medical education: opportunities and challenges.

    Science.gov (United States)

    Chu, Larry F; Erlendson, Matthew J; Sun, John S; Alva, Heather L; Clemenson, Anna M

    2012-12-01

    Incorporating mobile computing into the academic medical environment is increasingly important. A growing majority of physicians, residents and medical students currently use mobile devices for education, access to clinical information and to facilitate bedside care. Therefore, it is important to assess the current opportunities and challenges in the use of mobile computing devices in the academic medical environment. Current research has found that a majority of physicians, residents and medical students either own or use mobile devices. In addition, studies have shown that these devices are effective as educational tools, resource guides and aids in patient care. Although there are opportunities for medical education, issues of deployment must still be addressed, such as privacy, connectivity, standardization and professionalism. Understanding the opportunities and challenges of using mobile computing devices in the academic medical environment can help determine the feasibility and benefits of their use for individuals and institutions.

  20. Implant maintenance for the prevention of biological complications: Are you ready for the next challenge?

    Science.gov (United States)

    Goh, Edwin X J; Lim, Lum Peng

    2017-11-01

    With increasing knowledge of wound biology and material sciences, the provision of dental implants as a treatment modality has become increasingly predictable and more commonly used to replace missing teeth. However, without appropriate follow up, peri-implant diseases could develop and affect the long-term success of implants. Currently, there is not enough focus on the prevention of peri-implant diseases, as compared to the definition of the disease, its prevalence, and treatment. In the present study, we aim to summarize various factors influencing the successful maintenance of dental implants and highlight current gaps in knowledge. Factors influencing the successful maintenance of dental implants can be divided into three categories: implant-, dentist-, and patient-related factors. Patients with dental implants are often more dentally aware, and this offers an advantage. Compared to gingiva, peri-implant mucosa responds at a different pace to the bacterial challenge. Dental practitioners should be aware of how treatment protocols affect long-term success, and be vigilant in detecting peri-implant diseases at an early stage. Compared to periodontal maintenance, less longitudinal studies on implant maintenance are available, and therefore, there is a tendency to rely heavily on information extrapolated from the periodontal literature. More studies on the significance of implant maintenance care are required. © 2016 John Wiley & Sons Australia, Ltd.

  1. Opinions differ on whether nuclear energy industry is ready for cyber-challenges

    Energy Technology Data Exchange (ETDEWEB)

    Dalton, David [NucNet, Brussels (Belgium)

    2017-05-15

    In October 2015 the UK's respected Chatham House think-tank published a report that drew some worrying conclusions about the civil nuclear industry. It said many in the sector do not fully understand the risks posed by hackers and the industry needs to be "more robust" on taking the initiative in cyberspace and funding effective responses to the challenge. The industry does not seem to be prepared for a large-scale cyber security emergency and needs to invest in counter-measures and response plans, the report said. It warned that developing countries are "particularly vulnerable" to cyber-attacks at nuclear facilities. The industry should develop guidelines to measure cyber security risk, including an integrated risk assessment that takes both security and safety measures into account. All countries with nuclear facilities should adopt an effective regulatory approach to cyber security, e.g. on the basis of IAEA guidance.

  2. Scenario-Based Digital Forensics Challenges in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Erik Miranda Lopez

    2016-10-01

    The aim of digital forensics is to extract information to answer the 5Ws (Why, When, Where, What, and Who) from the data extracted from the evidence. In order to achieve this, most digital forensic processes assume absolute control of digital evidence. However, in a cloud environment forensic investigation, this is not always possible. Additionally, the unique characteristics of cloud computing create new technical, legal and architectural challenges when conducting a forensic investigation. We propose a hypothetical scenario to uncover and explain the challenges forensic practitioners face during cloud investigations, and we also provide solutions to address the challenges. Our hypothetical case scenario has shown that, in the long run, better live forensic tools, development of new methods tailored for cloud investigations, and new procedures and standards are indeed needed. Furthermore, we have come to the conclusion that forensic investigation's biggest challenge is not technical but legal.

  3. Multiscale Modeling in Computational Biomechanics: Determining Computational Priorities and Addressing Current Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Tawhai, Merryn; Bischoff, Jeff; Einstein, Daniel R.; Erdemir, Ahmet; Guess, Trent; Reinbolt, Jeff

    2009-05-01

    Abstract In this article, we describe some current multiscale modeling issues in computational biomechanics from the perspective of the musculoskeletal and respiratory systems and mechanotransduction. First, we outline the necessity of multiscale simulations in these biological systems. Then we summarize challenges inherent to multiscale biomechanics modeling, regardless of the subdiscipline, followed by computational challenges that are system-specific. We discuss some of the current tools that have been utilized to aid research in multiscale mechanics simulations, and the priorities to further the field of multiscale biomechanics computation.

  4. A Study of Student-Teachers' Readiness to Use Computers in Teaching: An Empirical Study

    Science.gov (United States)

    Padmavathi, M.

    2016-01-01

    This study attempts to analyze student-teachers' attitude towards the use of computers for classroom teaching. Four dimensions of computer attitude on a Likert-type five-point scale were used: Affect (liking), Perceived usefulness, Perceived Control, and Behaviour Intention to use computers. The effect of student-teachers' subject area, years of…

  5. Precision Medicine and PET/Computed Tomography: Challenges and Implementation.

    Science.gov (United States)

    Subramaniam, Rathan M

    2017-01-01

    Precision Medicine is about selecting the right therapy for the right patient, at the right time, specific to the molecular targets expressed by disease or tumors, in the context of the patient's environment and lifestyle. Some of the challenges for delivery of precision medicine in oncology include biomarkers for patient selection for enrichment-precision diagnostics, mapping out tumor heterogeneity that contributes to therapy failures, and early therapy assessment to identify resistance to therapies. PET/computed tomography offers solutions in these important areas of challenges and facilitates implementation of precision medicine. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Challenges for the CMS computing model in the first year

    Energy Technology Data Exchange (ETDEWEB)

    Fisk, I.; /Fermilab

    2009-05-01

    CMS is in the process of commissioning a complex detector and a globally distributed computing infrastructure simultaneously. This represents a unique challenge. Even at the beginning there is not sufficient analysis or organized processing resources at CERN alone. In this presentation we discuss the unique computing challenges CMS expects to face during the first year of running and how they influence the baseline computing model decisions. During the early accelerator commissioning periods, CMS will attempt to collect as many events as possible when the beam is on in order to provide adequate early commissioning data. Some of these plans involve overdriving the Tier-0 infrastructure during data collection with recovery when the beam is off. In addition to the larger number of triggered events, there will be pressure in the first year to collect and analyze more complete data formats as the summarized formats mature. The large event formats impact the required storage, bandwidth, and processing capacity across all the computing centers. While the understanding of the detector and the event selections is being improved, there will likely be a larger number of reconstruction passes and skims performed by both central operations and individual users. We discuss how these additional stresses impact the allocation of resources and the changes from the baseline computing model.

  7. Challenges for the CMS computing model in the first year

    Energy Technology Data Exchange (ETDEWEB)

    Fisk, I, E-mail: ifisk@fnal.go [Fermi National Accelerator Laboratory (United States)

    2010-04-01

    CMS is in the process of commissioning a complex detector and a globally distributed computing infrastructure simultaneously. This represents a unique challenge. Even at the beginning there is not sufficient analysis or organized processing resources at CERN alone. In this presentation we discuss the unique computing challenges CMS expects to face during the first year of running and how they influence the baseline computing model decisions. During the early accelerator commissioning periods, CMS will attempt to collect as many events as possible when the beam is on in order to provide adequate early commissioning data. Some of these plans involve overdriving the Tier-0 infrastructure during data collection with recovery when the beam is off. In addition to the larger number of triggered events, there will be pressure in the first year to collect and analyze more complete data formats as the summarized formats mature. The large event formats impact the required storage, bandwidth, and processing capacity across all the computing centers. While the understanding of the detector and the event selections is being improved, there will likely be a larger number of reconstruction passes and skims performed by both central operations and individual users. We discuss how these additional stresses impact the allocation of resources and the changes from the baseline computing model.

  8. Enhancing the Cloud Computing Performance by Labeling the Free Node Services as Ready-To-Execute Tasks

    Directory of Open Access Journals (Sweden)

    Radwan S. Abujassar

    2017-01-01

    Full Text Available The combination of huge bandwidth and hardware capacity has driven vigorous development of the Internet. On the other hand, problems such as delay and node task load arise during network use; these problems degrade network performance and in turn affect the service delivered to users. In cloud computing, users expect a high level of service from the provider. In addition, cloud computing services facilitate the execution of complicated tasks that need large-scale storage for computation. In this paper, we implement a new technique that sustains the service by assigning tasks to the best available free node, already labeled by the manager node. The Cloud Computing Alarm (CCA) technique provides information about all service nodes and indicates which one is ready to receive a task from users. According to the simulation results, the CCA technique yields good enhancements to the QoS, which will increase the number of users of the service. Additionally, the results showed that the CCA technique improved the services without degrading network performance, completing each task in less time.
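The labeling scheme this abstract describes can be illustrated with a toy dispatcher: a manager keeps a ready-to-execute label for each node, and tasks are handed only to nodes carrying that label. This is a minimal sketch under assumed names and a simple round-robin policy, not the paper's actual CCA implementation.

```python
from itertools import cycle

def dispatch(tasks, node_labels):
    """Assign tasks round-robin among nodes the manager labeled ready.

    node_labels: {node_name: True if labeled ready-to-execute}
    Returns {node_name: [assigned tasks]} for ready nodes only.
    """
    ready = [n for n, ok in node_labels.items() if ok]
    if not ready:
        raise RuntimeError("no ready-to-execute nodes available")
    assignments = {n: [] for n in ready}
    for task, node in zip(tasks, cycle(ready)):
        assignments[node].append(task)
    return assignments

# n2 is not labeled ready, so it receives no work
labels = {"n1": True, "n2": False, "n3": True}
print(dispatch(["t1", "t2", "t3"], labels))
# → {'n1': ['t1', 't3'], 'n3': ['t2']}
```

A real scheduler would update labels on completion events rather than assume instant execution; the point here is only that unlabeled nodes never receive work.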

  9. Underwater Visual Computing: The Grand Challenge Just around the Corner.

    Science.gov (United States)

    von Lukas, Uwe Freiherr

    2016-01-01

    Visual computing technologies have traditionally been developed for conventional setups where air is the surrounding medium for the user, the display, and/or the camera. However, given mankind's increasing need to rely on the oceans to solve the problems of future generations (such as offshore oil and gas, renewable energies, and marine mineral resources), there is a growing need for mixed-reality applications for use in water. This article highlights the various research challenges when changing the medium from air to water, introduces the concept of underwater mixed environments, and presents recent developments in underwater visual computing applications.

  10. Reviews on Security Issues and Challenges in Cloud Computing

    Science.gov (United States)

    An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.

    2016-11-01

    Cloud computing is an Internet-based computing service provided by a third party allowing the sharing of resources and data among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes how the Information Technology (IT) of an organization is organized and managed. It provides many benefits, such as simplicity and lower costs, almost unlimited storage, minimal maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment and a lower barrier to entry. While the use of cloud computing services increases in this new era, their security becomes a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first lists the architecture of cloud computing, then discusses the most common security issues of using the cloud and some solutions to them, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.

  11. Towards brain-computer music interfaces: progress and challenges

    OpenAIRE

    Miranda, E. R.; Durrant, Simon; Anders, T.

    2008-01-01

    Brain-Computer Music Interface (BCMI) is a new research area that is emerging at the crossroads of neurobiology, engineering sciences and music. This research involves three major challenging problems: the extraction of meaningful control information from signals emanating directly from the brain, the design of generative music techniques that respond to such information, and the training of subjects to use the system. We have implemented a proof-of-concept BCMI system that is able to use ...

  12. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    Science.gov (United States)

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists face the challenge of gaining profound insight into the deepest biological functions from big biological data, which in turn requires massive computational resources. Therefore, high-performance computing (HPC) platforms are needed, as well as efficient and scalable algorithms that can take advantage of them. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study to compare the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
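The classical sequence-alignment problem used as the case study above is typically a dynamic-programming kernel such as Smith-Waterman local alignment. A minimal scoring-only sketch follows; the linear gap penalty and scoring parameters are illustrative assumptions, not the paper's benchmark configuration.

```python
def sw_score(a, b, match=2, mismatch=-1, gap=-2):
    """Smith-Waterman local alignment score with a linear gap penalty.

    Keeps only two DP rows, so memory is O(len(b)).
    """
    prev = [0] * (len(b) + 1)
    best = 0
    for ca in a:
        curr = [0]
        for j, cb in enumerate(b, 1):
            s = max(0,                                            # local: never negative
                    prev[j - 1] + (match if ca == cb else mismatch),  # diagonal
                    prev[j] + gap,                                # gap in b
                    curr[j - 1] + gap)                            # gap in a
            curr.append(s)
            best = max(best, s)
        prev = curr
    return best

print(sw_score("GATTACA", "GATTACA"))  # → 14 (7 matches × 2)
```

The quadratic cell-update loop is what gets parallelized (anti-diagonal wavefronts, SIMD, GPUs) when such kernels are mapped onto the HPC platforms the survey compares.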

  13. ATLAS computing challenges before the next LHC run

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2016-01-01

    ATLAS software and computing is in a period of intensive evolution. The current long shutdown presents an opportunity to assimilate lessons from the very successful Run 1 (2009-2013) and to prepare for the substantially increased computing requirements for Run 2 (from spring 2015). Run 2 will bring a near doubling of the energy and the data rate, high event pile-up levels, and higher event complexity from detector upgrades, meaning the number and complexity of events to be analyzed will increase dramatically. At the same time operational loads must be reduced through greater automation, a wider array of opportunistic resources must be supported, costly storage must be used with greater efficiency, a sophisticated new analysis model must be integrated, and concurrency features of new processors must be exploited. This paper surveys the distributed computing aspects of the upgrade program and the plans for 2014 to exercise the new capabilities in a large scale Data Challenge.

  14. ATLAS computing challenges before the next LHC run

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2014-01-01

    ATLAS software and computing is in a period of intensive evolution. The current long shutdown presents an opportunity to assimilate lessons from the very successful Run 1 (2009-2013) and to prepare for the substantially increased computing requirements for Run 2 (from spring 2015). Run 2 will bring a near doubling of the energy and the data rate, high event pile-up levels, and higher event complexity from detector upgrades, meaning the number and complexity of events to be analyzed will increase dramatically. At the same time operational loads must be reduced through greater automation, a wider array of opportunistic resources must be supported, costly storage must be used with greater efficiency, a sophisticated new analysis model must be integrated, and concurrency features of new processors must be exploited. This presentation will survey the distributed computing aspects of the upgrade program and the plans for 2014 to exercise the new capabilities in a large scale Data Challenge.

  15. Achievements and challenges in structural bioinformatics and computational biophysics.

    Science.gov (United States)

    Samish, Ilan; Bourne, Philip E; Najmanovich, Rafael J

    2015-01-01

    The field of structural bioinformatics and computational biophysics has undergone a revolution in the last 10 years. These developments are captured annually through the 3DSIG meeting, upon which this article reflects. An increase in accessible data, computational resources and methodology has resulted in an increase in the size and resolution of studied systems and in the complexity of the questions amenable to research. Concomitantly, the parameterization and efficiency of the methods have markedly improved, along with their cross-validation against other computational and experimental results. The field exhibits an ever-increasing integration with biochemistry, biophysics and other disciplines. In this article, we discuss recent achievements along with current challenges within the field. © The Author 2014. Published by Oxford University Press.

  16. Opportunities and challenges of high-performance computing in chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Guest, M.F.; Kendall, R.A.; Nichols, J.A. (eds.) [and others]

    1995-06-01

    The field of high-performance computing is developing at an extremely rapid pace. Massively parallel computers offering orders of magnitude increases in performance are under development by all the major computer vendors. Many sites now have production facilities that include massively parallel hardware. Molecular modeling methodologies (both quantum and classical) are also advancing at a brisk pace. The transition of molecular modeling software to a massively parallel computing environment offers many exciting opportunities, such as the accurate treatment of larger, more complex molecular systems in routine fashion, and a viable, cost-effective route to study physical, biological, and chemical "grand challenge" problems that are impractical on traditional vector supercomputers. This will have a broad effect on all areas of basic chemical science at academic research institutions and chemical, petroleum, and pharmaceutical industries in the United States, as well as chemical waste and environmental remediation processes. But this transition also poses significant challenges: architectural issues (SIMD, MIMD, local memory, global memory, etc.) remain poorly understood and software development tools (compilers, debuggers, performance monitors, etc.) are not well developed. In addition, researchers that understand and wish to pursue the benefits offered by massively parallel computing are often hindered by lack of expertise, hardware, and/or information at their site. A conference and workshop organized to focus on these issues was held at the National Institutes of Health, Bethesda, Maryland (February 1993). This report is the culmination of the organized workshop. The main conclusion: a drastic acceleration in the present rate of progress is required for the chemistry community to be positioned to exploit fully the emerging class of Teraflop computers, even allowing for the significant work to date by the community in developing software for parallel architectures.

  17. Ready, set, go . . . well maybe

    Energy Technology Data Exchange (ETDEWEB)

    Alexandre, Melanie M; Bartolome, Terri-Lynn C

    2011-02-28

    The agenda for this presentation is: (1) understand organizational readiness for changes; (2) review benefits and challenges of change; (3) share case studies of ergonomic programs that were 'not ready' and some that were 'ready'; and (4) provide some ideas for facilitating change.

  18. Addressing capability computing challenges of high-resolution global climate modelling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin

    2014-05-01

    During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership-class resources requires careful planning and preparation. The application software, such as CESM, needs to be ported, optimized and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource-intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system, capable of a theoretical peak performance of over 27 PFlop/s, and consists of 18,688 compute nodes, with a NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU in every node, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560

  19. The emergence of grammar in a language-ready brain. Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    Science.gov (United States)

    Hawkins, John A.

    2016-03-01

    Arbib makes the interesting proposal [3, §1.6] that the first Homo sapiens could have been "language-ready", without possessing the kind of rich lexicon, grammar and compositional semantics that we see in the world's languages today. This early language readiness would have consisted of a set of "protolanguage" abilities, which he enumerates (1-7 in §1.6), supported by brain mechanisms unique to humans. The transition to full "language" (properties 8-11 in §1.6 and §3) would have required no changes in the genome, he argues, but could have resulted from cultural evolution plus some measure of Baldwinian evolution favoring offspring with greater linguistic skill. The full picture is set out in [1].

  20. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.
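Probability-of-failure estimates of the kind discussed in these Benchmark Workshops are often obtained by Monte Carlo sampling of load and resistance variables. The following is a toy sketch of that idea for the sliding failure mode of a gravity dam; the normal distributions and their parameters are illustrative assumptions, not workshop data.

```python
import random

def sliding_failure_prob(n=100_000, seed=1):
    """Toy Monte Carlo estimate of P(sliding failure) for a gravity dam.

    Failure occurs when the driving force D exceeds the resisting force R.
    The distributions below are purely illustrative assumptions.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        R = rng.gauss(100.0, 15.0)   # resisting force (friction + cohesion)
        D = rng.gauss(70.0, 20.0)    # driving force (hydrostatic load)
        if D > R:
            failures += 1
    return failures / n

# For these parameters D - R ~ N(-30, 25), so analytically
# P(failure) = P(Z > 1.2) ≈ 0.115; the estimate should land near that.
print(sliding_failure_prob())
```

In practice the sampled quantities come from calibrated structural models rather than two independent normals, but the estimator structure (sample, evaluate a limit-state function, count exceedances) is the same.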

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier-0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier-0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for tape-to-buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  2. Challenges of technology integration and computer-assisted surgery.

    Science.gov (United States)

    Rivkin, Gurion; Liebergall, Meir

    2009-02-01

    The rapid progress of modern computerized capabilities has not been paralleled by a similar progress in the operating room setting and in operating techniques. The major advance in orthopaedic surgery during the past fifty years has been the introduction of intraoperative fluoroscopic imaging, while surgical techniques have remained mostly unchanged. Orthopaedic procedures dealing with bones--a nondeformable tissue--are suitable for computerized guidance based on preoperatively and intraoperatively obtained images. Computer-assisted surgery progressed from the first-generation systems of the 1990s to the present third-generation systems, enabling surgeons to implant a knee or hip prosthesis with high precision. However, most orthopaedic surgeons avoid using computer-navigation surgical techniques. Why has the implementation of computer-assisted surgery procedures met so many hurdles and obstacles? The factors that make up the answer to this question can be grouped into three categories: human, technological, and financial. Computer-assisted surgery has the potential to revolutionize orthopaedic surgery just as fluoroscopy did a few decades ago; however, its widespread use has been hampered by a lack of sufficient clinical data on the one hand and by a reluctance to use the technique and thereby collect and share data on the other. The challenge is to overcome the human, technological, and financial hurdles. Once these obstacles are addressed, we believe that computer-assisted surgery will set a new standard of care. Until that time, some will be willing to lead the revolution and pay the price of progress, and others will be reluctant to take part in this endeavor.

  3. Computer and internet use in vascular outpatients--ready for interactive applications?

    Science.gov (United States)

    Richter, J G; Schneider, M; Klein-Weigel, P

    2009-11-01

    Exploring patients' computer and internet use, their expectations and attitudes is mandatory for successful introduction of interactive online health-care applications in Angiology. We included 165 outpatients suffering from peripheral arterial disease (PAD; n = 62) and chronic venous and / or lymphatic disease (CVLD; n = 103) in a cross-sectional study. Patients answered a paper-based questionnaire. Patients were predominantly female (54.5%). 142 (86.1%) reported regular computer use for 9.7 +/- 5.8 years and 134 (81.2%) used the internet for 6.2 +/- 3.6 years. CVLD-patients and internet users were younger and higher educated, resulting in a significant difference in computer and internet use between the disease groups (p online summed up to 4.3 +/- 2.2 days per week and 1.44 +/- 1.2 hours per day for all internet users without significant differences between the groups. The topics retrieved from the internet covered a wide spectrum and searches for health information were mentioned by 41.2%. Although confidence in the internet (3.3 +/- 1.1 on a 1-6 Likert scale) and reliability in information retrieved from the internet (3.1 +/- 1.1) were relatively low, health-related issues were of high actual and future interest. 42.8% of the patients were even interested in interactive applications like health educational programs, 37.4% in self-reported assessments and outcome questionnaires and 26.9% in chatforums; 50% demanded access to their medical data on an Internetserver. Compared to older participants those shopping, chatting, and e-mailing, but not for health information retrieval and interactive applications. Computers are commonly used and the internet has been adopted as an important source of information by patients suffering from PAD and CVLD. Besides, the internet offers great potential and new opportunities for interactive disease (self-)management in angiology. To increase confidence and reliability in the medium a careful introduction and evaluation of

  4. In-stent restenosis and multislice computed tomography: is the method ready to start?

    Science.gov (United States)

    Martuscelli, Eugenio; Razzini, Cinzia; D'Eliseo, Alessia; Di Luozzo, Marco; Mauro, Borzi; Romeo, Francesco

    2007-05-01

    We present two patients revascularized by coronary stents and evaluated by multislice computed tomography (CT). In the first patient, angio-CT (16 slices/rotation scanner) detected a high-grade restenosis in the distal part of a drug-eluting stent; conventional coronary angiography confirmed the diagnosis. In the second patient, angio-CT (64 slices/rotation) showed a non-flow-limiting tissue proliferation in the proximal part of a bare metal stent; conventional angiography confirmed the diagnosis. Blooming effects and partial volume averaging still limit the widespread application of this method. New scanners and the use of a special convolution kernel are likely to improve the accuracy of CT angiography in patients with stents.

  5. Computational chemistry meets cultural heritage: challenges and perspectives.

    Science.gov (United States)

    Fantacci, Simona; Amat, Anna; Sgamellotti, Antonio

    2010-06-15

    heritage can complement experimental investigations by establishing or rationalizing structure-property relations of the fundamental artwork components. These insights allow researchers to understand the interdependence of such components and eventually the composition of the artwork materials. As a perspective, we aim to extend the simulations to systems of increasing complexity that are similar to the realistic materials encountered in works of art. A challenge is the computational investigation of materials degradation and their associated reactive pathways; here the possible initial components, intermediates, final materials, and various deterioration mechanisms must all be simulated.

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  7. Computer vision challenges and technologies for agile manufacturing

    Science.gov (United States)

    Molley, Perry A.

    1996-02-01

    Sandia National Laboratories, a Department of Energy laboratory, is responsible for maintaining the safety, security, reliability, and availability of the nuclear weapons stockpile for the United States. Because of the changing national and global political climates and inevitable budget cuts, Sandia is changing the methods and processes it has traditionally used in the product realization cycle for weapon components. Because of the increasing age of the nuclear stockpile, it is certain that the reliability of these weapons will degrade with time unless eventual action is taken to repair, requalify, or renew them. Furthermore, due to the downsizing of the DOE weapons production sites and loss of technical personnel, the new product realization process is being focused on developing and deploying advanced automation technologies in order to maintain the capability for producing new components. The goal of Sandia's technology development program is to create a product realization environment that is cost effective, has improved quality and reduced cycle time for small lot sizes. The new environment will rely less on the expertise of humans and more on intelligent systems and automation to perform the production processes. The systems will be robust in order to provide maximum flexibility and responsiveness for rapidly changing component or product mixes. An integrated enterprise will allow ready access to and use of information for effective and efficient product and process design. Concurrent engineering methods will allow a speedup of the product realization cycle, reduce costs, and dramatically lessen the dependency on creating and testing physical prototypes. Virtual manufacturing will allow production processes to be designed, integrated, and programmed off-line before a piece of hardware ever moves. The overriding goal is to be able to build a large variety of new weapons parts on short notice. Many of these technologies that are being developed are also

  8. A robust beamforming approach for early detection of readiness potential with application to brain-computer interface systems.

    Science.gov (United States)

    Mahmoodi, Maryam; Abadi, Bahador Makki; Khajepur, Hassan; Harirchian, Mohammad Hossein

    2017-07-01

    Early detection of the intention to move, during self-paced voluntary movements, from the activities of neural current sources on the motor cortex can be an effective approach for brain-computer interface (BCI) systems. Achieving high sensitivity and negative pre-movement latency is important for increasing the speed of BCI and other rehabilitation and neurofeedback systems used by disabled and stroke patients, and can help enhance their movement abilities. Developing high-performance extractors, or beamformers, is therefore a necessary task. In this paper, to improve the beamformer's reconstruction of the sources of the readiness potential related to hand movement, a surface spatial filter (spherical spline derivative on the electrode space) and the linearly constrained minimum variance (LCMV) beamformer are used jointly. Moreover, to achieve better results, a realistic head model of each subject was created from the individual's head MRI and used in the beamformer algorithm. A few optimizations were also performed on the reconstructed source signal powers to help the template-matching classifier detect movement onset for five healthy subjects. The classification results show an average true positive rate (TPR) of 77.1% and 73.1%, a false positive rate (FPR) of 28.96% and 28.74%, and a latency of -512.426 ± 396.7 ms and -360.29 ± 252.16 ms from the signals of motor cortex current sources and from sensor space, respectively. The proposed method thus shows reliable sensitivity, is faster at predicting movement onset, and is more suitable for online BCI use in the future.
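
    For orientation, the single-constraint LCMV weights mentioned above take the closed form w = R⁻¹l / (lᵀR⁻¹l): unit gain toward the source with leadfield l, minimal output variance from everything else. A minimal pure-Python sketch on a toy 2-channel system (all numbers and helper names are illustrative, not taken from the paper):

```python
# Toy single-constraint LCMV beamformer; R is the sensor covariance,
# l the leadfield (forward) vector of one cortical source.

def inv2(m):
    """Invert a 2x2 matrix given as [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(m, v):
    return [sum(mi * vi for mi, vi in zip(row, v)) for row in m]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def lcmv_weights(R, l):
    """w = R^-1 l / (l^T R^-1 l): unit gain toward the source,
    minimum output variance from all other directions."""
    Rinv_l = matvec(inv2(R), l)
    return [x / dot(l, Rinv_l) for x in Rinv_l]

R = [[2.0, 0.5], [0.5, 1.0]]   # toy sensor covariance
l = [1.0, 0.5]                 # toy leadfield
w = lcmv_weights(R, l)
# unit-gain constraint: w.l should equal 1
```

    In a real pipeline R is estimated from the EEG data and l comes from the subject's MRI-based head model; the constraint wᵀl = 1 holds by construction.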

  9. Community Mobilization and Readiness: Planning Flaws which Challenge Effective Implementation of 'Communities that Care' (CTC) Prevention System.

    Science.gov (United States)

    Basic, Josipa

    2015-01-01

    This article reviews the experience of implementing a community approach to drug use and youth delinquency prevention based on the 'Communities that Care' (CTC) system, implemented from 2002 to 2013 in one Croatian county consisting of 12 communities (Hawkins, 1999; Hawkins & Catalano, 2004). This overview explores selected critical issues that are often not considered in substance use(r) community intervention planning and implementation, or in the associated process and outcome assessments. These issues include, among others, mobilizing an adequate representation of people; involving relevant key individual and organizational stakeholders; and being aware of the stakeholders' willingness to participate in the prevention process. In addition, it is important to be aware of the stakeholders' knowledge and perceptions of the 'problems' of drug use and youth delinquency in their communities, as well as the characteristics of the targeted population(s). Sometimes community members and stakeholders block needed change, and therefore prevention-process enablers and 'bridges' should be involved in moving prevention programming forward. Other barriers often overlooked in prevention planning are community readiness to change and a realistic assessment of the available and accessible resources for initiating the planned change(s) and sustaining them. All of these issues have been found to be potentially related to intervention success. At the end of this article, I summarize perspectives from prevention scientists and practitioners and lessons learned from community readiness research and practice in Croatia that have international relevance.

  10. High performance computing and communications grand challenges program

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, J.E.; Barr, A.; Chandy, K.M.; Goddard, W.A., III; Kesselman, C.

    1994-10-01

    The so-called protein folding problem has numerous aspects; however, it is principally concerned with the de novo prediction of three-dimensional (3D) structure from the protein's primary amino acid sequence, and with the kinetics of the protein folding process. Our current project focuses on the 3D structure prediction problem, which has proved to be an elusive goal of molecular biology and biochemistry. The number of local energy minima is exponential in the number of amino acids in the protein. All current methods of 3D structure prediction attempt to alleviate this problem by imposing various constraints that effectively limit the volume of conformational space which must be searched. Our Grand Challenge project consists of two elements: (1) a hierarchical methodology for 3D protein structure prediction; and (2) development of a parallel computing environment, the Protein Folding Workbench, for carrying out a variety of protein structure prediction/modeling computations. During the first three years of this project, we are focusing on two proteins of known structure, selected from the Brookhaven Protein Data Bank (PDB), to validate our prediction algorithms and their software implementation, both serial and parallel. Both proteins, protein L from Peptostreptococcus magnus and streptococcal protein G, are known to bind to IgG, and both have an α + β sandwich conformation. Although both proteins bind to IgG, they do so at different sites on the immunoglobulin, and it is of considerable biological interest to understand structurally why this is so. 12 refs., 1 fig.

  11. Aneesur Rahman Prize for Computational Physics Lecture: Addressing Dirac's Challenge

    Science.gov (United States)

    Chelikowsky, James

    2013-03-01

    After the invention of quantum mechanics, P. A. M. Dirac made the following observation: "The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble. It therefore becomes desirable that approximate practical methods of applying quantum mechanics should be developed, which can lead to an explanation of the main features of complex atomic systems..." The creation of "approximate practical methods" in response to Dirac's challenge has included the one-electron picture, density functional theory, and the pseudopotential concept. The combination of such methods in conjunction with contemporary computational platforms and new algorithms offers the possibility of predicting properties of materials solely from knowledge of the atomic species present. I will give an overview of progress in this field with an emphasis on materials at the nanoscale. Support from the Department of Energy and the National Science Foundation is acknowledged.

  12. Towards a 'siliconeural computer': technological successes and challenges.

    Science.gov (United States)

    Hughes, Mark A; Shipston, Mike J; Murray, Alan F

    2015-07-28

    Electronic signals govern the function of both nervous systems and computers, albeit in different ways. As such, hybridizing both systems to create an iono-electric brain-computer interface is a realistic goal; and one that promises exciting advances in both heterotic computing and neuroprosthetics capable of circumventing devastating neuropathology. 'Neural networks' were, in the 1980s, viewed naively as a potential panacea for all computational problems that did not fit well with conventional computing. The field bifurcated during the 1990s into a highly successful and much more realistic machine learning community and an equally pragmatic, biologically oriented 'neuromorphic computing' community. Algorithms found in nature that use the non-synchronous, spiking nature of neuronal signals have been found to be (i) implementable efficiently in silicon and (ii) computationally useful. As a result, interest has grown in techniques that could create mixed 'siliconeural' computers. Here, we discuss potential approaches and focus on one particular platform using parylene-patterned silicon dioxide.

  13. School readiness.

    Science.gov (United States)

    High, Pamela C

    2008-04-01

    School readiness includes the readiness of the individual child, the school's readiness for children, and the ability of the family and community to support optimal early child development. It is the responsibility of schools to be ready for all children at all levels of readiness. Children's readiness for kindergarten should become an outcome measure for community-based programs, rather than an exclusion criterion at the beginning of the formal educational experience. Our new knowledge of early brain and child development has revealed that modifiable factors in a child's early experience can greatly affect that child's learning trajectory. Many US children enter kindergarten with limitations in their social, emotional, cognitive, and physical development that might have been significantly diminished or eliminated through early identification of and attention to child and family needs. Pediatricians have a role in promoting school readiness for all children, beginning at birth, through their practices and advocacy. The American Academy of Pediatrics affords pediatricians many opportunities to promote the physical, social-emotional, and educational health of young children, with other advocacy groups. This technical report supports American Academy of Pediatrics policy statements "Quality Early Education and Child Care From Birth to Kindergarten" and "The Inappropriate Use of School 'Readiness' Tests."

  14. Computer usage among nurses in rural health-care facilities in South Africa: obstacles and challenges.

    Science.gov (United States)

    Asah, Flora

    2013-04-01

    This study discusses factors inhibiting computer usage for work-related tasks among computer-literate professional nurses in rural health-care facilities in South Africa. In the past two decades computer literacy courses have not been part of the nursing curricula; computer courses are instead offered by the State Information Technology Agency. Despite this, there seems to be limited use of computers by professional nurses in the rural context. Focus group interviews were held with 40 professional nurses from three government hospitals in northern KwaZulu-Natal. Contributing factors were found to be a lack of information technology infrastructure, restricted access to computers, and deficits in technical and nursing management support. The physical location of computers within the health-care facilities and the lack of relevant software emerged as specific obstacles to usage. Provision of continuous and active support from nursing management could positively influence computer usage among professional nurses. A closer integration of information technology and computer literacy skills into existing nursing curricula would foster a positive attitude towards computer usage through early exposure. Responses indicated that a change of mindset may be needed on the part of nursing management so that they begin to actively promote ready access to computers as a means of creating greater professionalism and collegiality. © 2011 Blackwell Publishing Ltd.

  15. TCLOUD: Challenges and Best Practices for Cloud Computing

    OpenAIRE

    Ullah, Sultan; Xuefeng, Zheng; Feng, Zhou; Haichun, Zhao

    2013-01-01

    Cloud computing has achieved a remarkable adoption rate, but it is still in its infancy. It is an emerging paradigm that is rapidly gaining popularity, although the market share of applications provided through cloud computing still lags behind expectations. It offers organizations great potential to minimize cost and maximize the overall operating effectiveness of the computing they require. Despite its growing popularity, still it i...

  16. GIS Readiness Survey 2014

    DEFF Research Database (Denmark)

    Schrøder, Lise; Hvingel, Line Træholt; Hansen, Henning Sten

    2014-01-01

    The GIS Readiness Survey 2014 is a follow-up to the corresponding survey that was carried out among public institutions in Denmark in 2009. The present survey thus provides an updated image of status and challenges in relation to the use of spatial information, the construction of the common...

  17. The Challenge '88 Project: Interfacing of Chemical Instruments to Computers.

    Science.gov (United States)

    Lyons, Jim; Verghese, Manoj

    The main part of this project involved using a computer, either an Apple or an IBM, as a chart recorder for the infrared (IR) and nuclear magnetic resonance (NMR) spectrophotometers. The computer "reads" these machines and displays spectra on its monitor. The graphs can then be stored for future reference and manipulation. The program to…

  18. Gender Digital Divide and Challenges in Undergraduate Computer Science Programs

    Science.gov (United States)

    Stoilescu, Dorian; McDougall, Douglas

    2011-01-01

    Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…

  19. School Readiness

    OpenAIRE

    BENEŠOVÁ, Marcela

    2012-01-01

    This bachelor thesis, titled School Maturity, deals with development at preschool age. Its aim is to evaluate optimal maturity in the preschool-age child. The theoretical part describes the preschool-age child and his or her motor, cognitive, perceptual, emotional and social development. It defines the concepts of school maturity and its components, school readiness, and school immaturity, and describes measures for educating immature children. The practical part contains the results of investigations on a selected sample of chil...

  20. Multicore Challenges and Benefits for High Performance Scientific Computing

    Directory of Open Access Journals (Sweden)

    Ida M.B. Nielsen

    2008-01-01

    Full Text Available Until recently, performance gains in processors were achieved largely by improvements in clock speeds and instruction level parallelism. Thus, applications could obtain performance increases with relatively minor changes by upgrading to the latest generation of computing hardware. Currently, however, processor performance improvements are realized by using multicore technology and hardware support for multiple threads within each core, and taking full advantage of this technology to improve the performance of applications requires exposure of extreme levels of software parallelism. We will here discuss the architecture of parallel computers constructed from many multicore chips as well as techniques for managing the complexity of programming such computers, including the hybrid message-passing/multi-threading programming model. We will illustrate these ideas with a hybrid distributed memory matrix multiply and a quantum chemistry algorithm for energy computation using Møller–Plesset perturbation theory.
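
    The hybrid message-passing/multi-threading model mentioned above can be illustrated, for the threading layer only, with a row-blocked matrix multiply: each worker computes an interleaved block of rows of C = A·B. This is a hedged toy sketch (pure Python, illustrative names), not the article's implementation; in the full hybrid scheme each MPI rank would own such a block and spawn threads internally.

```python
from concurrent.futures import ThreadPoolExecutor

def matmul_rows(A, B, rows):
    """Compute the given rows of A @ B (matrices as lists of lists)."""
    n, k = len(B[0]), len(B)
    return [[sum(A[i][p] * B[p][j] for p in range(k)) for j in range(n)]
            for i in rows]

def threaded_matmul(A, B, nthreads=2):
    # interleave rows across workers, then reassemble C in order
    chunks = [range(i, len(A), nthreads) for i in range(nthreads)]
    with ThreadPoolExecutor(max_workers=nthreads) as ex:
        parts = list(ex.map(lambda r: (r, matmul_rows(A, B, r)), chunks))
    C = [None] * len(A)
    for rows, block in parts:
        for i, row in zip(rows, block):
            C[i] = row
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(threaded_matmul(A, B))   # [[19, 22], [43, 50]]
```

    For real performance each thread would call a compiled kernel; pure-Python arithmetic holds the GIL, so this sketch shows the structure of the decomposition, not the speedup.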

  1. Mastering the Challenge of High-Performance Computing.

    Science.gov (United States)

    Roach, Ronald

    2003-01-01

    Discusses how, just as all of higher education got serious with wiring individual campuses for the Internet, the nation's leading research institutions have initiated "high-performance computing." Describes several such initiatives involving historically black colleges and universities. (EV)

  2. Q. Is Internal Audit Ready for Blockchain?

    National Research Council Canada - National Science Library

    Hugh Rooney; Brian Aiken; Megan Rooney

    2017-01-01

    The question of whether internal audit is ready for blockchain is addressed. Blockchain technology offers the promise of "a safe, transparent, rapid and affordable digital solution to many government challenges...

  3. Technology Ready to be Launched, but is there a Payer? Challenges for Implementing eHealth in Sweden.

    Science.gov (United States)

    Hollmark, Malin; Lefevre Skjöldebrand, Anna; Andersson, Christoffer; Lindblad, Ragnar

    2015-01-01

    The development of sustainable, high-quality, affordable health care is today a high priority for many actors in society, to ensure that we can continue to afford care for the growing proportion of elderly people in our population. One solution is to enable the individual's power over her own health or illness, and participation in her own care. The rapid development of eHealth and wearable sensors evidently presents opportunities. Tracking and measuring vital data can help keep people out of the hospital. Large amounts of data are generated to help us understand disease and to provide early diagnostics and warnings, giving us the possibility to collect and capture the true health status of individuals. Successful technologies demonstrate savings, acceptance among users, and improved access to healthcare. But there are also challenges: implementing new technologies in health care is difficult. Researchers from around the world report similar problems, such as reimbursement, interoperability, usability and regulatory issues. This paper discusses a few of these implementation challenges as well as a few of the efforts in meeting them. To conclude, eHealth solutions can contribute to patient empowerment and a sustainable health care. Our assumption, however, is that as long as we do not face the implementation challenges and invest in overcoming the pressing obstacles, society will not be able, or willing, to pay for these solutions.

  4. Integrated challenge test: a new approach evaluating quantitative risk assessment of Listeria in ready to eat foods

    Directory of Open Access Journals (Sweden)

    Paolo Matteini

    2012-02-01

    Full Text Available The study aimed to predict the maximum concentration of Listeria monocytogenes during the shelf life of chicken liver pâté. The prediction was performed using the integrated challenge test: a test based on the interaction between the indigenous lactic flora and L. monocytogenes and on their growth parameters. Two different approaches were investigated: the first is based on the time difference between the onset of the L. monocytogenes and lactic flora stationary phases, while the second is based on the lactic flora concentration capable of inducing the stationary phase of L. monocytogenes. Three different strains of L. monocytogenes, isolated from meat products, were used to perform three challenge tests. Triplicate samples from three different batches of liver pâté were inoculated with a single-strain inoculum of 1.8 Log CFU/g. Samples were then stored at 4°C, 8°C and 12°C. Lactobacillus spp. (ISO 15214:1998) and L. monocytogenes (UNI EN ISO 11290-02:2005) plate counts were performed daily on each sample until the stationary phase was reached by both populations. The challenge test results were entered into the ComBase software to determine the growth parameters, which were later used in the calculation method. The predictions were then statistically assessed against the results of two additional challenge tests using triplicate samples from two further batches, the same strains and the same single-strain inoculum. Samples from the first batch were stored for 5 days at 4°C + 5 days at 8°C + 5 days at 12°C; samples from the second batch were stored for 3 days at 4°C + 3 days at 8°C + 4 days at 12°C. The results showed that both approaches provided predictions very close to reality. The integrated challenge test is therefore useful for determining the maximum concentration of L. monocytogenes, by simply knowing the concentrations of the microbial populations concerned at a given time.
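
    The second approach described above can be caricatured with a Jameson-effect style calculation: both populations grow log-linearly, and the pathogen is assumed to stop growing once the lactic flora reaches its stationary-phase density. This is a hedged sketch with illustrative rates and limits, not the study's ComBase-derived parameters:

```python
def max_listeria(lm0, lm_rate, lab0, lab_rate, lab_max):
    """Log10 CFU/g of L. monocytogenes at the moment Lactobacillus spp.
    reaches its stationary-phase density (pathogen growth assumed to
    halt at that point; all inputs in log10 units and log10/h rates)."""
    t_stat = (lab_max - lab0) / lab_rate   # hours until LAB stationary phase
    return lm0 + lm_rate * t_stat

# Illustrative numbers only: 1.8 log inoculum, slow pathogen growth at 4 °C.
print(max_listeria(lm0=1.8, lm_rate=0.01, lab0=3.0, lab_rate=0.05, lab_max=8.0))
```

    With these made-up rates the lactic flora hits 8 log after 100 h, capping the pathogen at 2.8 log CFU/g; the study instead fits both growth curves to daily plate counts before applying this kind of cap.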

  5. Computational pan-genomics: status, promises and challenges

    NARCIS (Netherlands)

    Marschall, Tobias; Ridder, de D.; Sheikhizadeh Anari, S.; Smit, S.

    2016-01-01

    Many disciplines, from human genetics and oncology to plant breeding, microbiology and virology, commonly face the challenge of analyzing rapidly increasing numbers of genomes. In case of Homo sapiens, the number of sequenced genomes will approach hundreds of thousands in the next few years. Simply

  6. Computational pan-genomics: status, promises and challenges

    NARCIS (Netherlands)

    The Computational Pan-Genomics Consortium; T. Marschall (Tobias); M. Marz (Manja); T. Abeel (Thomas); L.J. Dijkstra (Louis); B.E. Dutilh (Bas); A. Ghaffaari (Ali); P. Kersey (Paul); W.P. Kloosterman (Wigard); V. Mäkinen (Veli); A.M. Novak (Adam); B. Paten (Benedict); D. Porubsky (David); E. Rivals (Eric); C. Alkan (Can); J.A. Baaijens (Jasmijn); P.I.W. de Bakker (Paul); V. Boeva (Valentina); R.J.P. Bonnal (Raoul); F. Chiaromonte (Francesca); R. Chikhi (Rayan); F.D. Ciccarelli (Francesca); C.P. Cijvat (Robin); E. Datema (Erwin); C.M. van Duijn (Cornelia); E.E. Eichler (Evan); C. Ernst (Corinna); E. Eskin (Eleazar); E. Garrison (Erik); M. El-Kebir (Mohammed); G.W. Klau (Gunnar); J.O. Korbel (Jan); E.-W. Lameijer (Eric-Wubbo); B. Langmead (Benjamin); M. Martin; P. Medvedev (Paul); J.C. Mu (John); P.B.T. Neerincx (Pieter); K. Ouwens (Klaasjan); P. Peterlongo (Pierre); N. Pisanti (Nadia); S. Rahmann (Sven); B.J. Raphael (Benjamin); K. Reinert (Knut); D. de Ridder (Dick); J. de Ridder (Jeroen); M. Schlesner (Matthias); O. Schulz-Trieglaff (Ole); A.D. Sanders (Ashley); S. Sheikhizadeh (Siavash); C. Shneider (Carl); S. Smit (Sandra); D. Valenzuela (Daniel); J. Wang (Jiayin); L.F.A. Wessels (Lodewyk); Y. Zhang (Ying); V. Guryev (Victor); F. Vandin (Fabio); K. Ye (Kai); A. Schönhuth (Alexander)

    2016-01-01

    textabstractMany disciplines, from human genetics and oncology to plant breeding, microbiology and virology, commonly face the challenge of analyzing rapidly increasing numbers of genomes. In case of Homo sapiens, the number of sequenced genomes will approach hundreds of thousands in the next few

  7. Addressing the Challenges of Training Competent Trainers in Computer Literacy.

    Science.gov (United States)

    Stemmer, Paul M., Jr.; Carlson, Elizabeth Uzdavinis

    This report on the TMT (Training Modules for Trainers) Project, part of the Special Discretionary Grant Program developed by the Michigan Department of Education (MDE) in response to the need for coordinated training activities, begins with a discussion of the emerging problem of upgrading teachers' computer literacy skills. A description of the…

  8. User Interface Improvements in Computer-Assisted Instruction, the Challenge.

    Science.gov (United States)

    Chalmers, P. A.

    2000-01-01

    Identifies user interface problems as they relate to computer-assisted instruction (CAI); reviews the learning theories and instructional theories related to CAI user interface; and presents potential CAI user interface improvements for research and development based on learning and instructional theory. Focuses on screen design improvements.…

  9. Challenges in scaling NLO generators to leadership computers

    Science.gov (United States)

    Benjamin, D.; Childers, JT; Hoeche, S.; LeCompte, T.; Uram, T.

    2017-10-01

    Exascale computing resources are roughly a decade away and will be capable of 100 times more computing than current supercomputers. In the last year, Energy Frontier experiments crossed a milestone of 100 million core-hours used at the Argonne Leadership Computing Facility, Oak Ridge Leadership Computing Facility, and NERSC. The Fortran-based leading-order parton generator called Alpgen was successfully scaled to millions of threads to achieve this level of usage on Mira. Sherpa and MadGraph are next-to-leading order generators used heavily by LHC experiments for simulation. Integration times for high-multiplicity or rare processes can take a week or more on standard Grid machines, even using all 16 cores. We will describe our ongoing work to scale the Sherpa generator to thousands of threads on leadership-class machines and reduce run-times to less than a day. This work allows the experiments to leverage large-scale parallel supercomputers for event generation today, freeing tens of millions of grid hours for other work, and paving the way for future applications (simulation, reconstruction) on these and future supercomputers.

  10. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01

  11. Artificial Intelligence Methods: Challenge in Computer Based Polymer Design

    Science.gov (United States)

    Rusu, Teodora; Pinteala, Mariana; Cartwright, Hugh

    2009-08-01

    This paper deals with the use of Artificial Intelligence (AI) methods in the design of new molecules possessing desired physical, chemical and biological properties. This is an important and difficult problem in the chemical, materials and pharmaceutical industries. Traditional methods involve a laborious and expensive trial-and-error procedure, but computer-assisted approaches offer many advantages in the automation of molecular design.

  12. Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Johnson, Gary M.; Washington, Warren M.

    2009-07-02

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER), in partnership with the Office of Advanced Scientific Computing Research (ASCR), held a workshop on the challenges in climate change science and the role of computing at the extreme scale, November 6-7, 2008, in Bethesda, Maryland. At the workshop, participants identified the scientific challenges facing the field of climate science and outlined the research directions of highest priority that should be pursued to meet these challenges. Representatives from the national and international climate change research community as well as representatives from the high-performance computing community attended the workshop. This group represented a broad mix of expertise. Of the 99 participants, 6 were from international institutions. Before the workshop, each of the four panels prepared a white paper, which provided the starting place for the workshop discussions. These four panels of workshop attendees devoted their efforts to the following themes: Model Development and Integrated Assessment; Algorithms and Computational Environment; Decadal Predictability and Prediction; Data, Visualization, and Computing Productivity. The recommendations of the panels are summarized in the body of this report.

  13. Information Assurance and Forensic Readiness

    Science.gov (United States)

    Pangalos, Georgios; Katos, Vasilios

    Egalitarianism and justice are amongst the core attributes of a democratic regime and should also be secured in an e-democratic setting. As such, the rise of computer-related offenses poses a threat to the fundamental aspects of e-democracy and e-governance. Digital forensics is a key component for protecting and enabling the underlying (e-)democratic values, and therefore forensic readiness should be considered in an e-democratic setting. This position paper commences from the observation that the density of compliance and potential litigation activities is monotonically increasing in modern organizations, as rules, legislative regulations and policies are constantly added to the corporate environment. Forensic practices seem to be departing from the niche of law enforcement and are becoming a business function and infrastructural component, posing new challenges to security professionals. Having no a priori knowledge of whether a security-related event or corporate policy violation will lead to litigation, we advocate that computer forensics be applied to all investigatory, monitoring and auditing activities. This would result in an inflation of the responsibilities of the Information Security Officer. After exploring some commonalities and differences between IS audit and computer forensics, we present a list of strategic challenges that the organization and, in effect, the IS security and audit practitioner will face.

  14. Issues and challenges of intelligent systems and computational intelligence

    CERN Document Server

    Pozna, Claudiu; Kacprzyk, Janusz

    2014-01-01

    This carefully edited book contains contributions of prominent and active researchers and scholars in the broadly perceived area of intelligent systems. The book is unique both with respect to the width of coverage of tools and techniques, and to the variety of problems that could be solved by the tools and techniques presented. The editors have been able to gather a very good collection of relevant and original papers by prominent representatives of many areas, relevant both to the theory and practice of intelligent systems, artificial intelligence, computational intelligence, soft computing, and the like. The contributions have been divided into 7 parts presenting first more fundamental and theoretical contributions, and then applications in relevant areas.        

  15. Are project managers ready for 21st-century challenges? A review of problem structuring methods for decision support

    Directory of Open Access Journals (Sweden)

    José Mateo

    2017-01-01

    Full Text Available Numerous contemporary problems that project managers face today can be considered unstructured decision problems, characterized by multiple actors and perspectives, incommensurable and/or conflicting objectives, and important intangibles. This work environment demands that project managers possess not only hard skills but also soft skills, with the ability to take a management perspective and, above all, to develop real leadership capabilities. In this paper, a family of problem structuring methods for decision support, aimed at assisting project managers in tackling complex problems, is presented. Problem structuring methods are a family of soft operations research methods for decision support that help groups of diverse composition agree on a problem focus and make commitments to consequential action. Project management programs are challenged to implement these methodologies in a way that is organized around the key competences a project manager needs to be more effective, to work efficiently as a member of interdisciplinary teams, and to successfully execute even a small project.

  16. Computational Science And Engineering Software Sustainability And Productivity (CSESSP) Challenges Workshop Report

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This report details the challenges and opportunities discussed at the NITRD sponsored multi-agency workshop on Computational Science and Engineering Software...

  17. Computational Challenges of 3D Radiative Transfer in Atmospheric Models

    Science.gov (United States)

    Jakub, Fabian; Bernhard, Mayer

    2017-04-01

    The computation of radiative heating and cooling rates is one of the most expensive components in todays atmospheric models. The high computational cost stems not only from the laborious integration over a wide range of the electromagnetic spectrum but also from the fact that solving the integro-differential radiative transfer equation for monochromatic light is already rather involved. This lead to the advent of numerous approximations and parameterizations to reduce the cost of the solver. One of the most prominent one is the so called independent pixel approximations (IPA) where horizontal energy transfer is neglected whatsoever and radiation may only propagate in the vertical direction (1D). Recent studies implicate that the IPA introduces significant errors in high resolution simulations and affects the evolution and development of convective systems. However, using fully 3D solvers such as for example MonteCarlo methods is not even on state of the art supercomputers feasible. The parallelization of atmospheric models is often realized by a horizontal domain decomposition, and hence, horizontal transfer of energy necessitates communication. E.g. a cloud's shadow at a low zenith angle will cast a long shadow and potentially needs to communication through a multitude of processors. Especially light in the solar spectral range may travel long distances through the atmosphere. Concerning highly parallel simulations, it is vital that 3D radiative transfer solvers put a special emphasis on parallel scalability. We will present an introduction to intricacies computing 3D radiative heating and cooling rates as well as report on the parallel performance of the TenStream solver. The TenStream is a 3D radiative transfer solver using the PETSc framework to iteratively solve a set of partial differential equation. We investigate two matrix preconditioners, (a) geometric algebraic multigrid preconditioning(MG+GAMG) and (b) block Jacobi incomplete LU (ILU) factorization. 
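The abstract above contrasts multigrid and ILU preconditioning for an iteratively solved sparse PDE system. As a minimal, self-contained sketch of option (b) — entirely an illustration, using SciPy rather than PETSc and a small 2D Laplacian as a stand-in for the much larger TenStream matrices — an incomplete LU factorization can serve as a preconditioner for a Krylov solver like GMRES:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Assemble a small sparse system: a 2D 5-point Laplacian stands in for a
# discretized radiative transfer operator (illustrative only).
n = 50
I = sp.identity(n)
T = sp.diags([-1, 4, -1], [-1, 0, 1], shape=(n, n))
A = (sp.kron(I, T) + sp.diags([-1, -1], [-n, n], shape=(n * n, n * n))).tocsc()
b = np.ones(n * n)

# (b) Incomplete LU factorization, wrapped as a preconditioner for GMRES.
ilu = spla.spilu(A, drop_tol=1e-4)
M = spla.LinearOperator(A.shape, ilu.solve)

x, info = spla.gmres(A, b, M=M)
residual = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
print(info == 0 and residual < 1e-3)  # -> True
```

In PETSc itself the analogous choice is made via runtime options (e.g. selecting a block-Jacobi/ILU or GAMG preconditioner) rather than in code.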

  18. New challenges in grid generation and adaptivity for scientific computing

    CERN Document Server

    Formaggia, Luca

    2015-01-01

    This volume collects selected contributions from the “Fourth Tetrahedron Workshop on Grid Generation for Numerical Computations”, which was held in Verbania, Italy in July 2013. The previous editions of this Workshop were hosted by the Weierstrass Institute in Berlin (2005), by INRIA Rocquencourt in Paris (2007), and by Swansea University (2010). This book covers different, though related, aspects of the field: the generation of quality grids for complex three-dimensional geometries; parallel mesh generation algorithms; mesh adaptation, including both theoretical and implementation aspects; grid generation and adaptation on surfaces – all with an interesting mix of numerical analysis, computer science and strongly application-oriented problems.

  19. Algebraic Functions, Computer Programming, and the Challenge of Transfer

    OpenAIRE

    Schanzer, Emmanuel Tanenbaum

    2015-01-01

    Students' struggles with algebra are well documented. Prior to the introduction of functions, mathematics is typically focused on applying a set of arithmetic operations to compute an answer. The introduction of functions, however, marks the point at which mathematics begins to focus on building up abstractions as a way to solve complex problems. A common refrain about word problems is that “the equations are easy to solve - the hard part is setting them up!” A student of algebra is asked to ...

  20. CSA06 Computing, Software and Analysis challenge at the Spanish Tier-1 and Tier-2 sites

    CERN Document Server

    Alcaraz, J; Cabrillo, Iban Jose; Colino, Nicanor; Cuevas-Maestro, J; Delgado Peris, Antonio; Fernandez Menendez, Javier; Flix, Jose; García-Abia, Pablo; González-Caballero, I; Hernández, Jose M; Marco, Rafael; Martinez Ruiz del Arbol, Pablo; Matorras, Francisco; Merino, Gonzalo; Rodríguez-Calonge, F J; Vizan Garcia, Jesus Manuel

    2007-01-01

    This note describes the participation of the Spanish centres PIC, CIEMAT and IFCA as Tier-1 and Tier-2 sites in the CMS CSA06 Computing, Software and Analysis challenge. A number of facilities, services and workflows were demonstrated at 25% of the scale expected in 2008. Very valuable experience has been gained running the complex computing system under realistic conditions at a significant scale. The focus of this note is on presenting achieved results, operational experience and lessons learnt during the challenge.

  1. Violent Computer Games Pose a Challenge to Education Today

    Directory of Open Access Journals (Sweden)

    Stanko Gerjolj

    2011-09-01

    Full Text Available Violent computer games are becoming an increasingly common phenomenon of leisure activities among children and the young. Most researchers and practical educators consider them a dangerous phenomenon that encourages violence in everyday life. A kind of vicious circle begins with children who, due to a lack of sensitive communication, quickly feel certain tensions and quench them by resorting to media violence, where computer games take the lead in the modern environment. Educators suggest the creation of situations where children and adolescents can speak out and express their pain in different ways. An in-depth expression of children’s and adolescents’ experiences does not only change their feelings, but extends to changes at the level of neurobiological functioning. Adults, especially parents, help children most in overcoming violence if, in sensitive communication, they radiate happiness with their own lives and the ability to solve problems and give signs of unconditional acceptance and love. In such communication, children and young people re-experience their parents and other educators as strong personalities and moral authorities whom they love and respect.

  2. Two possible driving forces supporting the evolution of animal communication. Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    Science.gov (United States)

    Moulin-Frier, Clément; Verschure, Paul F. M. J.

    2016-03-01

    In the target paper [1], M.A. Arbib proposes a quite exhaustive review of the (often computational) models developed during the last decades that support his detailed scenario on language evolution (the Mirror System Hypothesis, MSH). The approach considers that language evolved from a mirror system for grasping already present in LCA-m (the last common ancestor of macaques and humans), to a simple imitation system for grasping present in LCA-c (the last common ancestor of chimpanzees and humans), to a complex imitation system for grasping that developed in the hominid line since that ancestor. MSH considers that this complex imitation system is a key evolutionary step for a language-ready brain, providing all the required elements for an open-ended gestural communication system. The transition from the gestural (bracchio-manual and visual) to the vocal (articulatory and auditory) domain is supposed to be a less important evolutionary step.

  3. Leaderboard Now Open: CPTAC’s DREAM Proteogenomics Computational Challenge | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The National Cancer Institute’s Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the opening of the leaderboard to its Proteogenomics Computational DREAM Challenge. The leaderboard remains open for submissions from September 25 through October 8, 2017, with the Challenge expected to run until November 17, 2017.

  4. Computer Security: Join the CERN WhiteHat Challenge!

    CERN Document Server

    Computer Security Team

    2014-01-01

    Over the past couple of months, several CERN users have reported vulnerabilities they have found in computing services and servers running at CERN. All were relevant, many were interesting and a few even surprising. Spotting weaknesses and areas for improvement before malicious people can exploit them is paramount. It helps protect the operation of our accelerators and experiments as well as the reputation of the Organization. Therefore, we would like to express our gratitude to those people for having reported these weaknesses! Great job and well done!   Seizing the opportunity, we would like to reopen the hunt for bugs, vulnerabilities and insecure configurations of CERN applications, websites and devices. You might recall we ran a similar initiative (“Hide & Seek”) in 2012 where we asked you to sift through CERN’s webpages and send us those that hold sensitive and confidential information. Quite a number of juicy documents were found and subsequently remov...

  5. Computational Social Science: Exciting Progress and Future Challenges

    Science.gov (United States)

    Watts, Duncan

    The past 15 years have witnessed a remarkable increase in both the scale and scope of social and behavioral data available to researchers, leading some to herald the emergence of a new field: ``computational social science.'' Against these exciting developments stands a stubborn fact: that in spite of many thousands of published papers, there has been surprisingly little progress on the ``big'' questions that motivated the field in the first place--questions concerning systemic risk in financial systems, problem solving in complex organizations, and the dynamics of epidemics or social movements, among others. In this talk I highlight some examples of research that would not have been possible just a handful of years ago and that illustrate the promise of CSS. At the same time, they illustrate its limitations. I then conclude with some thoughts on how CSS can bridge the gap between its current state and its potential.

  6. Piecemeal journey to "HALCYON" world of pervasive computing: from past progress to future challenges:

    OpenAIRE

    Seth, Rolly; Kapoor, Rishi; Al-Qaheri, Hameed; Sanyal, Sugata

    2010-01-01

    Although 'halcyon' means the serene environment that pervasive computing aims at, we have tried to present a different interpretation of the word. Through our approach, we look at it in the context of achieving future 'calm technology'. The paper gives a general overview of the state of pervasive computing today, proposes the 'HALCYON Model' and outlines the 'social' challenges faced by system designers.

  7. Computer-Assisted Diagnostic Decision Support: History, Challenges, and Possible Paths Forward

    Science.gov (United States)

    Miller, Randolph A.

    2009-01-01

    This paper presents a brief history of computer-assisted diagnosis, including challenges and future directions. Some ideas presented in this article on computer-assisted diagnostic decision support systems (CDDSS) derive from prior work by the author and his colleagues (see list in Acknowledgments) on the INTERNIST-1 and QMR projects. References…

  8. Computational Research Challenges and Opportunities for the Optimization of Fossil Energy Power Generation System

    Energy Technology Data Exchange (ETDEWEB)

    Zitney, S.E.

    2007-06-01

    Emerging fossil energy power generation systems must operate with unprecedented efficiency and near-zero emissions, while optimizing profitably amid cost fluctuations for raw materials, finished products, and energy. To help address these challenges, the fossil energy industry will have to rely increasingly on the use of advanced computational tools for modeling and simulating complex process systems. In this paper, we present the computational research challenges and opportunities for the optimization of fossil energy power generation systems across the plant lifecycle, from process synthesis and design to plant operations. We also look beyond the plant gates to discuss research challenges and opportunities for enterprise-wide optimization, including planning, scheduling, and supply chain technologies.

  9. Computational challenges in magnetic-confinement fusion physics

    Science.gov (United States)

    Fasoli, A.; Brunner, S.; Cooper, W. A.; Graves, J. P.; Ricci, P.; Sauter, O.; Villard, L.

    2016-05-01

    Magnetic-fusion plasmas are complex self-organized systems with an extremely wide range of spatial and temporal scales, from the electron-orbit scales (~10^-11 s, ~10^-5 m) to the diffusion time of electrical current through the plasma (~10^2 s) and the distance along the magnetic field between two solid surfaces in the region that determines the plasma-wall interactions (~100 m). The description of the individual phenomena and of the nonlinear coupling between them involves a hierarchy of models, which, when applied to realistic configurations, require the most advanced numerical techniques and algorithms and the use of state-of-the-art high-performance computers. The common thread of such models resides in the fact that the plasma components are at the same time sources of electromagnetic fields, via the charge and current densities that they generate, and subject to the action of electromagnetic fields. This leads to a wide variety of plasma modes of oscillations that resonate with the particle or fluid motion and makes the plasma dynamics much richer than that of conventional, neutral fluids.

  10. Computer Adaptive Multistage Testing: Practical Issues, Challenges and Principles

    Directory of Open Access Journals (Sweden)

    Halil Ibrahim SARI

    2016-12-01

    Full Text Available The purpose of many tests in educational and psychological measurement is to measure test takers’ latent trait scores from responses given to a set of items. Over the years, this has been done by traditional methods (paper-and-pencil tests). However, compared to other test administration models (e.g., adaptive testing), traditional methods are extensively criticized for producing low measurement accuracy and requiring long test lengths. Adaptive testing has been proposed to overcome these problems. There are two popular adaptive testing approaches: computerized adaptive testing (CAT) and computer adaptive multistage testing (ca-MST). The former is a well-known approach that has been predominantly used in this field. We believe that researchers and practitioners are fairly familiar with many aspects of CAT because it has more than a hundred years of history. However, the same is not true of ca-MST: since it is relatively new, many researchers are not familiar with its features. The purpose of this study is to closely examine the characteristics of ca-MST, including its working principle, the adaptation procedure (called the routing method), test assembly, and scoring, and to provide an overview for researchers, with the aim of drawing their attention to ca-MST and encouraging them to contribute to research in this area. Books, software and future work for ca-MST are also discussed.
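The routing method mentioned in the abstract can be sketched very simply. The following toy example — cut scores, module names and the number-correct rule are all invented for illustration, not taken from the article — routes an examinee after a first-stage module:

```python
# Minimal sketch of number-correct routing in a hypothetical ca-MST panel:
# examinees take a routing module, then are sent to an easy, medium or hard
# second-stage module. Cut scores below are illustrative assumptions.

def route(num_correct, num_items, cuts=(0.4, 0.7)):
    """Return the next-stage module for a number-correct routing score."""
    p = num_correct / num_items
    if p < cuts[0]:
        return "easy"
    if p < cuts[1]:
        return "medium"
    return "hard"

# An examinee answering 6 of 10 routing items correctly moves to "medium";
# in a real ca-MST the next module's items then match that difficulty level.
print(route(6, 10))  # -> medium
```

Real ca-MST designs use IRT-based scoring and pre-assembled modules, but the branching logic follows this shape.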

  11. Computational fluid dynamics challenges for hybrid air vehicle applications

    Science.gov (United States)

    Carrin, M.; Biava, M.; Steijl, R.; Barakos, G. N.; Stewart, D.

    2017-06-01

    This paper begins by comparing turbulence models for the prediction of hybrid air vehicle (HAV) flows. A 6 : 1 prolate spheroid is employed for validation of the computational fluid dynamics (CFD) method. An analysis of turbulent quantities is presented and the Shear Stress Transport (SST) k-ω model is compared against a k-ω Explicit Algebraic Stress Model (EASM) within the unsteady Reynolds-Averaged Navier-Stokes (RANS) framework. Further comparisons involve Scale Adaptive Simulation models and a local transition transport model. The results show that the flow around the vehicle at low pitch angles is sensitive to transition effects. At high pitch angles, the vortices generated on the suction side provide substantial lift augmentation and are better resolved by EASMs. The validated CFD method is employed for the flow around a shape similar to the Airlander aircraft of Hybrid Air Vehicles Ltd. The sensitivity of the transition location to the Reynolds number is demonstrated and the role of each of the vehicle's components is analyzed. It was found that the fins contributed the most to the increase in lift and drag.

  12. Computational Prediction of Effector Proteins in Fungi: Opportunities and Challenges

    Directory of Open Access Journals (Sweden)

    Humira Sonah

    2016-02-01

    Full Text Available Effector proteins are mostly secretory proteins that stimulate plant infection by manipulating the host response. Identifying fungal effector proteins and understanding their function is of great importance in efforts to curb losses to plant diseases. Recent advances in high-throughput sequencing technologies have facilitated the availability of several fungal genomes and thousands of transcriptomes. As a result, the growing amount of genomic information has provided great opportunities to identify putative effector proteins in different fungal species. There is little consensus over the annotation and functionality of effector proteins, and mostly small secretory proteins are considered as effector proteins, a concept that tends to overestimate the number of proteins involved in a plant-pathogen interaction. With the characterization of Avr genes, criteria for computational prediction of effector proteins are becoming more efficient. There are hundreds of tools available for the identification of conserved motifs, signature sequences and structural features in proteins. Many pipelines and online servers, which combine several tools, are made available to perform genome-wide identification of effector proteins. In this review, available tools and pipelines, and their strengths and limitations for effective identification of fungal effector proteins, are discussed. We also present an exhaustive list of classically secreted proteins along with their key conserved motifs found in 12 common plant pathogens (11 fungi and one oomycete) through an analytical pipeline.
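The kind of rule-based screen such pipelines apply can be sketched in a few lines. The thresholds and the RxLR-like motif below are illustrative assumptions, not the exact criteria of any tool in the review; real pipelines also run signal-peptide predictors such as SignalP before motif scanning:

```python
import re

# Illustrative candidate screen combining three common heuristics:
# small protein size, cysteine richness, and a conserved N-terminal
# motif (an RxLR-like pattern here). All cutoffs are assumptions.
def is_effector_candidate(seq, max_len=300, min_cys=4):
    small = len(seq) <= max_len
    cys_rich = seq.count("C") >= min_cys
    has_motif = re.search(r"R.LR", seq[:60]) is not None
    return small and cys_rich and has_motif

# A made-up secreted-protein-like sequence that passes all three checks.
candidate = "MKLFAVLALCASTALARSLR" + "ACDEFCGHCIKC" * 3
print(is_effector_candidate(candidate))  # -> True
```

Genome-wide runs simply map such a predicate over every protein in a FASTA file and pass survivors to more expensive structural or homology checks.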

  13. P300 brain computer interface: current challenges and emerging trends

    Directory of Open Access Journals (Sweden)

    Reza Fazel-Rezai

    2012-07-01

    Full Text Available A brain-computer interface (BCI) enables communication without movement based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), steady state visual evoked potential (SSVEP), or event related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other components of the event-related potential (ERP), based on an oddball paradigm presented to the subject. In this paper, we overview the current status of P300 BCI technology, and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility.
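The core signal-processing idea behind P300 detection — averaging epochs from an oddball paradigm until the event-related potential emerges from the noise — can be illustrated with simulated data. Everything below (sampling rate, amplitudes, latency) is a synthetic toy, not real EEG:

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n_trials, n_samp = 250, 200, 200      # 0.8 s epochs at 250 Hz
t = np.arange(n_samp) / fs

# Simulated oddball EEG: target epochs carry a P300-like positivity around
# 300 ms after the stimulus, non-targets do not; both are buried in noise.
p300 = 5e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
targets = rng.normal(0.0, 5e-6, (n_trials, n_samp)) + p300
nontargets = rng.normal(0.0, 5e-6, (n_trials, n_samp))

# Averaging across trials suppresses noise by ~1/sqrt(n_trials), so the
# ERP difference wave reveals the P300 latency.
diff = targets.mean(axis=0) - nontargets.mean(axis=0)
peak_time = t[np.argmax(diff)]
print(f"P300 peak near {peak_time:.2f} s")
```

Practical P300 spellers build on exactly this averaged target/non-target contrast, typically followed by a classifier such as stepwise LDA.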

  14. P300 brain computer interface: current challenges and emerging trends

    Science.gov (United States)

    Fazel-Rezai, Reza; Allison, Brendan Z.; Guger, Christoph; Sellers, Eric W.; Kleih, Sonja C.; Kübler, Andrea

    2012-01-01

    A brain-computer interface (BCI) enables communication without movement based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), steady state visual evoked potential (SSVEP), or event related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other components of the ERP, based on an oddball paradigm presented to the subject. In this paper, we overview the current status of P300 BCI technology, and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility. PMID:22822397

  15. 3rd International Symposium on Big Data and Cloud Computing Challenges

    CERN Document Server

    Neelanarayanan, V

    2016-01-01

    This proceedings volume contains selected papers that were presented at the 3rd International Symposium on Big Data and Cloud Computing Challenges, held at VIT University, India, on March 10 and 11, 2016. New research issues, challenges and opportunities shaping the future agenda in the field of Big Data and Cloud Computing are identified and presented throughout the book, which is intended for researchers, scholars, students, software developers and practitioners working at the forefront of their field. The book acts as a platform for exchanging ideas, setting questions for discussion, and sharing experience in the Big Data and Cloud Computing domain.

  16. The Awareness and Challenges of Cloud Computing Adoption on Tertiary Education in Malaysia

    Science.gov (United States)

    Hazreeni Hamzah, Nor; Mahmud, Maziah; Zukri, Shamsunarnie Mohamed; Yaacob, Wan Fairos Wan; Yacob, Jusoh

    2017-09-01

    This preliminary study aims to investigate awareness of the adoption of cloud computing among academicians in tertiary education in Malaysia. The study also explores the possible challenges faced by academicians while adopting this new technology. The pilot study was done on 40 lecturers in Universiti Teknologi MARA Kampus Kota Bharu (UiTMKB) using a self-administered questionnaire. The results found that almost half (40 percent) were not aware of the existence of cloud computing in the teaching and learning (T&L) process. The challenges confronting the adoption of cloud computing are data insecurity, unsolicited advertisement, lock-in, reluctance to eliminate staff positions, privacy concerns, reliability, regulatory compliance concerns/user control and institutional culture/resistance to change in technology. These possible challenges can be grouped into two major factors: security and dependency, and user control and mentality.

  17. Challenges to the programmatic implementation of ready to use infant formula in the post-earthquake response, Haiti, 2010: a program review.

    Science.gov (United States)

    Talley, Leisel E; Boyd, Erin

    2013-01-01

    Following the 2010 earthquake in Haiti, infant and young child feeding was identified as a priority nutrition intervention. A new approach to support breastfeeding mothers and distribute ready-to-use infant formula (RUIF) to infants unable to breastfeed was established. The objective of the evaluation was to assess the implementation of infant feeding programs using RUIF in displaced persons camps in Port-au-Prince, Haiti during the humanitarian response. A retrospective record review was conducted from April-July, 2010 to obtain data on infants receiving RUIF in 30 baby tents. A standardized data collection form was created based on data collected across baby tents and included: basic demographics, admission criteria, primary caretaker, feeding practices, and admission and follow-up anthropometrics. Orphans and abandoned infants were the most frequent enrollees (41%) in the program. While the program targeted these groups, it is unlikely that this is a true reflection of population demographics. Despite programmatic guidance, admission criteria were not consistently applied across programs. Thirty-four percent of infants were undernourished (weight for age Z score <-2) at the time of admission. Defaulting accounted for 50% of all program exits and there was no follow-up of these children. Low data quality was a significant barrier. The design, implementation and magnitude of the 'baby tents' using RUIF were novel in response to infant and young child feeding (IYCF) in emergencies and presented multiple challenges that should not be overlooked, including adherence to protocols and the adaptation of emergency programs to existing programs. The implementation of IYCF programs should be closely monitored to ensure that they achieve the objectives set by the humanitarian community and national government. IYCF is an often overlooked component of emergency preparedness; however to improve response, generic protocols and pre-emergency training and preparedness should be

  18. Challenges to the programmatic implementation of ready to use infant formula in the post-earthquake response, Haiti, 2010: a program review.

    Directory of Open Access Journals (Sweden)

    Leisel E Talley

    Full Text Available BACKGROUND AND OBJECTIVES: Following the 2010 earthquake in Haiti, infant and young child feeding was identified as a priority nutrition intervention. A new approach to support breastfeeding mothers and distribute ready-to-use infant formula (RUIF) to infants unable to breastfeed was established. The objective of the evaluation was to assess the implementation of infant feeding programs using RUIF in displaced persons camps in Port-au-Prince, Haiti during the humanitarian response. METHODS: A retrospective record review was conducted from April-July, 2010 to obtain data on infants receiving RUIF in 30 baby tents. A standardized data collection form was created based on data collected across baby tents and included: basic demographics, admission criteria, primary caretaker, feeding practices, and admission and follow-up anthropometrics. MAIN FINDINGS: Orphans and abandoned infants were the most frequent enrollees (41%) in the program. While the program targeted these groups, it is unlikely that this is a true reflection of population demographics. Despite programmatic guidance, admission criteria were not consistently applied across programs. Thirty-four percent of infants were undernourished (weight for age Z score <-2) at the time of admission. Defaulting accounted for 50% of all program exits and there was no follow-up of these children. Low data quality was a significant barrier. CONCLUSIONS: The design, implementation and magnitude of the 'baby tents' using RUIF were novel in response to infant and young child feeding (IYCF) in emergencies and presented multiple challenges that should not be overlooked, including adherence to protocols and the adaptation of emergency programs to existing programs. The implementation of IYCF programs should be closely monitored to ensure that they achieve the objectives set by the humanitarian community and national government. IYCF is an often overlooked component of emergency preparedness; however to improve

  19. School Readiness for Gifted Children: Considering the Issues

    Science.gov (United States)

    Porath, Marion

    2011-01-01

    This paper discusses issues relevant to gifted children's readiness for school. It raises a number of questions that challenge thinking about what is meant by school readiness. Gifted children can often be ready for school entrance before the age traditionally considered appropriate. Their complex developmental profiles challenge accepted notions…

  20. BigData and computing challenges in high energy and nuclear physics

    Science.gov (United States)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-06-01

    In this contribution we discuss various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve when moving from the LHC to the HL-LHC in ten years, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new supercomputing facilities, cloud computing and volunteer computing in the future is a big challenge, which we are successfully mastering with a considerable contribution from many supercomputing centres around the world and from academic and commercial cloud providers. We also discuss R&D computing projects started recently at the National Research Center ``Kurchatov Institute''

  1. Steel for Bodies: Ammunition Readiness During the Korean War

    National Research Council Canada - National Science Library

    Lane, Peter

    2003-01-01

    .... Deficiencies in the U.S. Army's ammunition readiness during the Korean War are illustrative of the many challenges faced in resourcing readiness in the face of competing domestic and military priorities. The U.S...

  2. Operational indicators for measuring organizational e-readiness based on fuzzy logic: A challenge in the Agricultural Organization of Guilan Province, Iran

    Directory of Open Access Journals (Sweden)

    Zahra Daghighi Masouleh

    2014-12-01

    Full Text Available Information and communications technology has grown exponentially in recent years and has played a significant role in organizational development. The main purpose of this study was to collect the data needed for the introduction of a new tool for assessing e-readiness in the Agricultural Organization of Guilan Province, Iran. The study population includes agricultural organization experts and researchers who were familiar with the concepts of IT and the organization's status. Based on the relevant literature review, the e-readiness indicators which had theoretically been proposed and practically used by researchers over the past 10 years were identified, and some parameters were introduced for examination and prioritization. These indicators represented the spatial and temporal factors as well as the condition of the Agricultural Organization of Guilan Province. The proposed structural model included seven factors (Infrastructural, Human, Educational, Government, Management, Socio-cultural and Legal) and 44 indicators. Then, based on the experts’ points of view, the coefficient of significance for each of the selected factors and indicators was measured using the Minkowski fuzzy screening method. The results obtained from structured questionnaires show that all seven main factors and 40 of the 44 indicators were appropriate for assessing electronic readiness and were retained in the final e-readiness assessment model. Furthermore, the results indicated that the most important factor in assessing e-readiness is the human factor, followed by the educational, infrastructural, management, government, legal and socio-cultural factors.
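The study's core computation is aggregating expert-weighted indicator ratings into factor scores. As a hedged stand-in for the Minkowski fuzzy screening method (the actual method differs; the weights, ratings and exponent below are invented), a weighted p-norm mean captures the idea:

```python
import numpy as np

# Generic weighted p-norm aggregation of expert indicator ratings.
# This is an illustrative sketch, not the study's exact procedure.
def factor_score(ratings, weights, p=2):
    r = np.asarray(ratings, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalize expert weights
    return float(np.sum(w * r ** p) ** (1.0 / p))

# Three hypothetical indicators of the "human" factor rated on a 0-1 scale,
# with the first indicator judged most important by the experts.
print(round(factor_score([0.8, 0.6, 0.9], [3, 1, 2]), 3))  # -> 0.806
```

Factor scores computed this way can then be ranked to reproduce the kind of ordering reported in the abstract (human factor first, and so on).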

  3. High Performance Numerical Computing for High Energy Physics: A New Challenge for Big Data Science

    Directory of Open Access Journals (Sweden)

    Florin Pop

    2014-01-01

    Full Text Available Modern physics is based on both theoretical analysis and experimental validation. Complex scenarios like subatomic dimensions, high energy, and low absolute temperature are frontiers for many theoretical models. Simulation with stable numerical methods represents an excellent instrument for high accuracy analysis, experimental validation, and visualization. High performance computing support offers the possibility to run simulations at large scale and in parallel, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high energy physics (HEP) analyzed from two perspectives: numerical methods and high performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory used in the analysis of particle spectra. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
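Monte Carlo methods of the kind surveyed are embarrassingly parallel: the sampling work splits into independent chunks with separate random seeds. The toy below estimates a simple one-dimensional integral as a stand-in for a HEP phase-space integral; the chunking mirrors the decomposition one would hand to MPI ranks or worker processes on an HPC system (the function names and chunk sizes are illustrative):

```python
import numpy as np

# Toy stand-in for a phase-space integral: I = ∫_0^1 exp(-x^2) dx,
# estimated by averaging Monte Carlo samples over independent chunks.
def partial_estimate(seed, n):
    rng = np.random.default_rng(seed)    # per-chunk, reproducible stream
    x = rng.random(n)
    return np.exp(-x * x).mean()

def mc_integral(n_chunks=8, n_per_chunk=125_000):
    # Each chunk is independent; here we loop, but on a cluster each
    # chunk would run on its own rank or process.
    parts = [partial_estimate(seed, n_per_chunk) for seed in range(n_chunks)]
    return float(np.mean(parts))

est = mc_integral()
print(round(est, 3))  # close to sqrt(pi)/2 * erf(1) ≈ 0.747
```

The statistical error shrinks as 1/sqrt(total samples), which is why such workloads scale well but become data-intensive at high precision.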

  4. Real-Time Software Vulnerabilities in Cloud Computing : Challenges and Mitigation Techniques

    OpenAIRE

    Okonoboh, Matthias Aifuobhokhan; Tekkali, Sudhakar

    2011-01-01

    Context: Cloud computing is rapidly emerging in the area of distributed computing. In the meantime, many organizations also consider the technology to be associated with several business risks which are yet to be resolved. These challenges include lack of adequate security, privacy and legal issues, resource allocation, control over data, system integrity, risk assessment, software vulnerabilities and so on, all of which have a compromising effect in cloud environments. Organizations based their w...

  5. CERN readies world's biggest science grid The computing network now encompasses more than 100 sites in 31 countries

    CERN Multimedia

    Niccolai, James

    2005-01-01

    If the Large Hadron Collider (LHC) at CERN is to yield miraculous discoveries in particle physics, it may also require a small miracle in grid computing. Owing to a lack of suitable tools from commercial vendors, engineers at the famed Geneva laboratory are hard at work building a giant grid to store and process the vast amount of data the collider is expected to produce when it begins operations in mid-2007 (2 pages)

  6. Computational Challenge of Fractional Differential Equations and the Potential Solutions: A Survey

    Directory of Open Access Journals (Sweden)

    Chunye Gong

    2015-01-01

    Full Text Available We present a survey of fractional differential equations, and in particular of the computational cost of their numerical solutions, from the viewpoint of computer science. The computational complexities of time-fractional, space-fractional, and space-time fractional equations are O(N^2M), O(NM^2), and O(NM(M + N)), compared with O(MN) for classical partial differential equations with finite difference methods, where M and N are the numbers of space grid points and time steps. The potential solutions to this challenge include, but are not limited to, parallel computing, memory access optimization (a fractional precomputing operator), the short memory principle, fast Fourier transform (FFT) based solutions, the alternating direction implicit method, the multigrid method, and preconditioner technology. The relationships between these solutions for both space and time fractional derivatives are discussed. We point out that parallel computing should be regarded as a basic method to overcome this challenge, and that some attention should be paid to fractional killer applications, high performance iteration methods, high order schemes, and Monte Carlo methods. Since the computation of fractional equations of high dimension and variable order is even heavier, researchers from mathematics and computer science have the opportunity to lay cornerstones in the area of fractional calculus.
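The O(N^2) cost in time that the survey highlights comes from the non-locality of fractional derivatives: every time step must revisit the entire solution history. A small sketch using the Grünwald-Letnikov (GL) approximation makes this concrete (the test function and step size are illustrative):

```python
import numpy as np
from math import gamma

# Grünwald-Letnikov approximation of a fractional derivative of order alpha:
#   D^alpha f(t_n) ≈ h^(-alpha) * sum_{k=0..n} w_k f(t_n - k h).
# The sum over the full history is what makes each step O(n) and the whole
# time march O(N^2); the short memory principle truncates this sum.
alpha, h, N = 0.5, 0.01, 400
w = np.empty(N + 1)
w[0] = 1.0
for k in range(1, N + 1):                 # recurrence for the GL weights
    w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)

t = np.arange(N + 1) * h
f = t ** 2                                # test function with a known result

# One evaluation at the final step already needs O(N) work.
approx = h ** (-alpha) * np.sum(w * f[::-1])

# Exact Riemann-Liouville derivative of t^2: 2 t^(2-alpha) / Gamma(3-alpha).
exact = 2.0 * t[-1] ** (2.0 - alpha) / gamma(3.0 - alpha)
print(abs(approx - exact) / exact < 0.01)  # first-order accurate -> True
```

Truncating the weight vector to its most recent L entries (the short memory principle) reduces the per-step cost to O(L) at the price of a controllable truncation error.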

  7. Computing in research and development in Africa benefits, trends, challenges and solutions

    CERN Document Server

    2015-01-01

    This book describes the trends, challenges and solutions in computing use for scientific research and development within different domains in Africa, such as health, agriculture, environment, economy, energy, education and engineering. The expected benefits are discussed by a number of recognized, domain-specific experts, with a common theme being computing as a solution enabler. This book is the first document providing such a representative, up-to-date view on this topic at the continent level.   • Discusses computing for scientific research and development on the African continent, addressing domains such as engineering, health, agriculture, environment, economy, energy, and education; • Describes the state-of-the-art in usage of computing to address problems in developing countries pertaining to health, productivity, economic growth, and renewable energy; • Offers insights applicable to all developing countries on the use of computing technologies to address a variety of societal issues.

  8. Evaluating a multi-player brain-computer interface game: challenge versus co-experience

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Volpe, G; Reidsma, Dennis; Poel, Mannes; Camurri, A.; Obbink, Michel; Nijholt, Antinus

    2013-01-01

    Brain–computer interfaces (BCIs) have started to be considered as game controllers. The low level of control they provide rules out perfect control but allows the design of challenging games that players can enjoy. Evaluation of enjoyment, or user experience (UX), is

  9. Investigating the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms

    Science.gov (United States)

    Kay, Robin Holding; Lauricella, Sharon

    2014-01-01

    The purpose of this study was to investigate the benefits and challenges of using laptop computers (hereafter referred to as laptops) inside and outside higher education classrooms. Quantitative and qualitative data were collected from 156 university students (54 males, 102 females) enrolled in either education or communication studies. Benefits of…

  10. Translation Challenges and Strategies: The ASL Translation of a Computer-Based, Psychiatric Diagnostic Interview

    Science.gov (United States)

    Montoya, Louise A.; Egnatovitch, Reginald; Eckhardt, Elizabeth; Goldstein, Marjorie; Goldstein, Richard A.; Steinberg, Annie G.

    2004-01-01

    This article describes the translation goals, challenges, strategies, and solutions employed in the development of a computer-based, self-administered, psychiatric diagnostic instrument, the Diagnostic Interview Schedule for the Deaf (D-DIS-IV) in American Sign Language (ASL) with English captions. The article analyzes the impact of the…

  11. Enhancing Competence and Autonomy in Computer-Based Instruction Using a Skill-Challenge Balancing Strategy

    Science.gov (United States)

    Kim, Jieun; Ryu, Hokyoung; Katuk, Norliza; Wang, Ruili; Choi, Gyunghyun

    2014-01-01

    The present study aims to show if a skill-challenge balancing (SCB) instruction strategy can assist learners to motivationally engage in computer-based learning. Csikszentmihalyi's flow theory (self-control, curiosity, focus of attention, and intrinsic interest) was applied to an account of the optimal learning experience in SCB-based learning…

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps needed to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  13. Challenges and Opportunities in Using Automatic Differentiation with Object-Oriented Toolkits for Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Hovland, P; Lee, S; McInnes, L; Norris, B; Smith, B

    2001-04-17

    The increased use of object-oriented toolkits in large-scale scientific simulation presents new opportunities and challenges for the use of automatic (or algorithmic) differentiation (AD) techniques, especially in the context of optimization. Because object-oriented toolkits use well-defined interfaces and data structures, there is potential for simplifying the AD process. Furthermore, derivative computation can be improved by exploiting high-level information about numerical and computational abstractions. However, challenges to the successful use of AD with these toolkits also exist. Among the greatest challenges is balancing the desire to limit the scope of the AD process with the desire to minimize the work required of a user. The authors discuss their experiences in integrating AD with the PETSc, PVODE, and TAO toolkits and their plans for future research and development in this area.
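    For readers unfamiliar with AD, its core idea can be shown in a few lines of forward-mode differentiation with dual numbers (a generic illustration of the technique, not the PETSc/PVODE/TAO integration the report describes):

```python
class Dual:
    """A value a + b*eps with eps**2 = 0; the eps coefficient carries df/dx."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        o = self._lift(other)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, other):
        o = self._lift(other)  # product rule applied to the derivative part
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def value_and_derivative(f, x):
    """Evaluate f(x) and f'(x) in a single pass, with no symbolic math."""
    y = f(Dual(x, 1.0))
    return y.val, y.der

# f(x) = 3x^2 + 2x  ->  f(2) = 16, f'(2) = 14
val, der = value_and_derivative(lambda x: 3 * x * x + 2 * x, 2.0)
```

    Because the chain rule is applied operation by operation, a toolkit with well-defined interfaces (as the abstract notes) exposes exactly the operations that need such overloads, which is part of what makes AD integration with object-oriented toolkits tractable.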

  14. A step towards a computing grid for the LHC experiments: ATLAS Data Challenge 1

    Energy Technology Data Exchange (ETDEWEB)

    Sturrock, R.; Bischof, R.; Epp, B.; Ghete, V.M.; Kuhn, D.; Mello, A.G.; Caron, B.; Vetterli, M.C.; Karapetian, G.; Martens, K.; Agarwal, A.; Poffenberger, P.; McPherson, R.A.; Sobie, R.J.; Armstrong, S.; Benekos, N.; Boisvert, V.; Boonekamp, M.; Brandt, S.; Casado, P.; Elsing, M.; Gianotti, F.; Goossens, L.; Grote, M.; Jansen, J.B.; Mair, K.; Nairz, A.; Padilla, C.; Poppleton, A.; Poulard, G.; Richter-Was, E.; Rosati, S.; Schoerner-Sadenius, T.; Wengler, T.; Xu, G.F.; Ping, J.L.; Chudoba, J.; Kosina, J.; Lokajicek, M.; Svec, J.; Tas, P.; Hansen, J.R.; Lytken, E.; Nielsen, J.L.; Waananen, A.; Tapprogge, S.; Calvet, D.; Albrand, S.; Collot, J.; Fulachier, J.; Ledroit-Guillon, F.; Ohlsson-Malek, S.; Viret, S.; Wielers, M.; Bernardet, K.; Correard, S.; Rozanov, A.; de Vivie de Regie, J-B.; Arnault, C.; Bourdarios, C.; Hrivnac, J.; Lechowski, M.; Parrour, G.; Perus, A.; Rousseau, D.; Schaffer, A.; Unal, G.; Derue, F.; Chevalier, L.; Hassani, S.; Laporte, J-F.; Nicolaidou, R.; Pomarede, D.; Virchaux, M.; Nesvadba, N.; Baranov, Sergei; Putzer, A.; Khonich, A.; Duckeck, G.; Schieferdecker, P.; Kiryunin, A.; Schieck, J.; Lagouri, Th.; Duchovni, E.; Levinson, L.; Schrager, D.; Negri, G.; Bilokon, H.; Spogli, L.; Barberis, D.; Parodi, F.; Cataldi, G.; Gorini, E.; Primavera, M.; Spagnolo, S.; Cavalli, D.; Heldmann, M.; Lari, T.; Perini, L.; Rebatto, D.; Resconi, S.; Tartarelli, F.; Vaccarossa, L.; Biglietti, M.; Carlino, G.; Conventi, F.; Doria, A.; Merola, L.; Polesello, G.; Vercesi, V.; De Salvo, A.; Di Mattia, A.; Luminari, L.; Nisati, A.; Reale, M.; Testa, M.; Farilla, A.; Verducci, M.; Cobal, M.; Santi, L.; Hasegawa, Y.; Ishino, M.; Mashimo, T.; Matsumoto, H.; Sakamoto, H.; Tanaka, J.; Ueda, I.; Bentvelsen, S.; Fornaini, A.; Gorfine, G.; Groep, D.; Templon, J.; Koster, J.; Konstantinov, A.; Myklebust, T.; Ould-Saada, F.; Bold, T.; Kaczmarska, A.; Malecki, P.; Szymocha, T.; Turala, M.; Kulchitsky, Y.; Khoreauli, G.; Gromova, N.; Tsulaia, V.; et al.

    2004-04-23

    The ATLAS Collaboration at CERN is preparing for the data taking and analysis at the LHC that will start in 2007. Therefore, a series of Data Challenges was started in 2002 whose goals are the validation of the Computing Model, of the complete software suite, of the data model, and to ensure the correctness of the technical choices to be made. A major feature of the first Data Challenge was the preparation and the deployment of the software required for the production of large event samples as a worldwide-distributed activity. It should be noted that it was not an option to "run everything at CERN" even if we had wanted to; the resources were not available at CERN to carry out the production on a reasonable time-scale. The great challenge of organizing and then carrying out this large-scale production at a significant number of sites around the world therefore had to be faced. However, the benefits of this are manifold: apart from realizing the required computing resources, this exercise created worldwide momentum for ATLAS computing as a whole. This report describes in detail the main steps carried out in DC1 and what has been learned from them as a step towards a computing Grid for the LHC experiments.

  15. Opportunities and Challenges of Cloud Computing to Improve Health Care Services

    Science.gov (United States)

    2011-01-01

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed. PMID:21937354

  16. High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations

    Science.gov (United States)

    Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.

    2003-01-01

    Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.

  17. Opportunities and challenges of cloud computing to improve health care services.

    Science.gov (United States)

    Kuo, Alex Mu-Hsing

    2011-09-21

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed.

  18. High performance computing and communications Grand Challenges program: Computational structural biology. Final report, August 15, 1992--January 14, 1997

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, J.E.

    1997-10-02

    The Grand Challenge project consists of two elements: (1) a hierarchical methodology for 3D protein structure prediction; and (2) development of a parallel computing environment, the Protein Folding Workbench, for carrying out a variety of protein structure prediction/modeling computations. During the first three years of this project the author focused on the use of selected proteins from the Brookhaven Protein Data Base (PDB) of known structures to provide validation of the prediction algorithms and their software implementation, both serial and parallel. Two proteins in particular have been selected to provide the project with direct interaction with experimental molecular biology. A variety of site-specific mutagenesis experiments are performed on these two proteins to explore the many-to-one mapping characteristics of sequence to structure.

  19. Q. Is Internal Audit Ready for Blockchain?

    National Research Council Canada - National Science Library

    Hugh Rooney; Brian Aiken; Megan Rooney

    2017-01-01

    The question of whether internal audit is ready for blockchain is addressed. Blockchain technology offers the promise of "a safe, transparent, rapid and affordable digital solution to many government challenges...

  20. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  1. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2009-10-01

    This report is an account of the deliberations and conclusions of the workshop on "Forefront Questions in Nuclear Science and the Role of High Performance Computing" held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to 1) identify forefront scientific challenges in nuclear physics and then determine which, if any, of these could be aided by high performance computing at the extreme scale; 2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; 3) provide nuclear physicists the opportunity to influence the development of high performance computing; and 4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  2. Primates, computation, and the path to language. Reply to comments on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain"

    Science.gov (United States)

    Arbib, Michael A.

    2016-03-01

    The target article [6], henceforth TA, had as its main title Towards a Computational Comparative Neuroprimatology. This unpacks into three claims: Comparative Primatology: If one wishes to understand the behavior of any one primate species (whether monkey, ape or human - TA did not discuss, e.g., lemurs but that study could well be of interest), one will gain new insight by comparing behaviors across species, sharpening one's analysis of one class of behaviors by analyzing similarities and differences between two or more species.

  3. User Identification Roadmap towards 2020 : A study of personal identification challenges for ubiquitous computing world

    OpenAIRE

    Pour, Shiva. Abdi Farzaneh

    2008-01-01

    This thesis is about personal identification challenges on the way to the Ubiquitous Computing world targeted for 2020. The study starts by defining the problem: the diversity of tools for personal identification, an always-foreground activity, stands in the way of identification becoming a pervasive interaction. The thesis is divided into three parts. Part one covers the introduction, background and related work. Part two describes the empirical study—Triangulation—that is supported by qualitative and quan...

  4. Scientific Grand Challenges: Crosscutting Technologies for Computing at the Exascale - February 2-4, 2010, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2011-02-06

    The goal of the "Scientific Grand Challenges - Crosscutting Technologies for Computing at the Exascale" workshop in February 2010, jointly sponsored by the U.S. Department of Energy’s Office of Advanced Scientific Computing Research and the National Nuclear Security Administration, was to identify the elements of a research and development agenda that will address these challenges and create a comprehensive exascale computing environment. This exascale computing environment will enable the science applications identified in the eight previously held Scientific Grand Challenges Workshop Series.

  5. Computational intelligence in wireless sensor networks recent advances and future challenges

    CERN Document Server

    Falcon, Rafael; Koeppen, Mario

    2017-01-01

    This book emphasizes the increasingly important role that Computational Intelligence (CI) methods are playing in solving a myriad of entangled Wireless Sensor Networks (WSN) related problems. The book serves as a guide for surveying several state-of-the-art WSN scenarios in which CI approaches have been employed. The reader finds in this book how CI has contributed to solving a wide range of challenging problems, ranging from balancing the cost and accuracy of heterogeneous sensor deployments to recovering from real-time sensor failures to detecting attacks launched by malicious sensor nodes and enacting CI-based security schemes. Network managers, industry experts, academicians and practitioners alike (mostly in computer engineering, computer science or applied mathematics) will benefit from the spectrum of successful applications reported in this book. Senior undergraduate or graduate students may discover in this book some problems well suited to their own research endeavors. USP: Presents recent advances and fu...

  6. Recent progress and modern challenges in applied mathematics, modeling and computational science

    CERN Document Server

    Makarov, Roman; Belair, Jacques

    2017-01-01

    This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science.  The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modelling, and computational sciences, and is ideal for interdisciplinary collaborations.

  7. Geant4 Hadronic Cascade Models and CMS Data Analysis : Computational Challenges in the LHC era

    CERN Document Server

    Heikkinen, Aatos

    This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis work to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, the computer technology of clusters, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions, up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in Geant4 simulation tool, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we es...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  10. A Step Towards A Computing Grid For The LHC Experiments ATLAS Data Challenge 1

    CERN Document Server

    Sturrock, R; Epp, B; Ghete, V M; Kuhn, D; Mello, A G; Caron, B; Vetterli, M C; Karapetian, G V; Martens, K; Agarwal, A; Poffenberger, P R; McPherson, R A; Sobie, R J; Amstrong, S; Benekos, N C; Boisvert, V; Boonekamp, M; Brandt, S; Casado, M P; Elsing, M; Gianotti, F; Goossens, L; Grote, M; Hansen, J B; Mair, K; Nairz, A; Padilla, C; Poppleton, A; Poulard, G; Richter-Was, Elzbieta; Rosati, S; Schörner-Sadenius, T; Wengler, T; Xu, G F; Ping, J L; Chudoba, J; Kosina, J; Lokajícek, M; Svec, J; Tas, P; Hansen, J R; Lytken, E; Nielsen, J L; Wäänänen, A; Tapprogge, Stefan; Calvet, D; Albrand, S; Collot, J; Fulachier, J; Ledroit-Guillon, F; Ohlsson-Malek, F; Viret, S; Wielers, M; Bernardet, K; Corréard, S; Rozanov, A; De Vivie de Régie, J B; Arnault, C; Bourdarios, C; Hrivnác, J; Lechowski, M; Parrour, G; Perus, A; Rousseau, D; Schaffer, A; Unal, G; Derue, F; Chevalier, L; Hassani, S; Laporte, J F; Nicolaidou, R; Pomarède, D; Virchaux, M; Nesvadba, N; Baranov, S; Putzer, A; Khonich, A; Duckeck, G; Schieferdecker, P; Kiryunin, A E; Schieck, J; Lagouri, T; Duchovni, E; Levinson, L; Schrager, D; Negri, G; Bilokon, H; Spogli, L; Barberis, D; Parodi, F; Cataldi, G; Gorini, E; Primavera, M; Spagnolo, S; Cavalli, D; Heldmann, M; Lari, T; Perini, L; Rebatto, D; Resconi, S; Tatarelli, F; Vaccarossa, L; Biglietti, M; Carlino, G; Conventi, F; Doria, A; Merola, L; Polesello, G; Vercesi, V; De Salvo, A; Di Mattia, A; Luminari, L; Nisati, A; Reale, M; Testa, M; Farilla, A; Verducci, M; Cobal, M; Santi, L; Hasegawa, Y; Ishino, M; Mashimo, T; Matsumoto, H; Sakamoto, H; Tanaka, J; Ueda, I; Bentvelsen, Stanislaus Cornelius Maria; Fornaini, A; Gorfine, G; Groep, D; Templon, J; Köster, L J; Konstantinov, A; Myklebust, T; Ould-Saada, F; Bold, T; Kaczmarska, A; Malecki, P; Szymocha, T; Turala, M; Kulchitskii, Yu A; Khoreauli, G; Gromova, N; Tsulaia, V; Minaenko, A A; Rudenko, R; Slabospitskaya, E; Solodkov, A; Gavrilenko, I; Nikitine, N; Sivoklokov, S Yu; Toms, K; Zalite, A; Zalite, Yu; 
Kervesan, B; Bosman, M; González, S; Sánchez, J; Salt, J; Andersson, N; Nixon, L; Eerola, Paule Anna Mari; Kónya, B; Smirnova, O G; Sandgren, A; Ekelöf, T J C; Ellert, M; Gollub, N; Hellman, S; Lipniacka, A; Corso-Radu, A; Pérez-Réale, V; Lee, S C; CLin, S C; Ren, Z L; Teng, P K; Faulkner, P J W; O'Neale, S W; Watson, A; Brochu, F; Lester, C; Thompson, S; Kennedy, J; Bouhova-Thacker, E; Henderson, R; Jones, R; Kartvelishvili, V G; Smizanska, M; Washbrook, A J; Drohan, J; Konstantinidis, N P; Moyse, E; Salih, S; Loken, J; Baines, J T M; Candlin, D; Candlin, R; Clifft, R; Li, W; McCubbin, N A; George, S; Lowe, A; Buttar, C; Dawson, I; Moraes, A; Tovey, Daniel R; Gieraltowski, J; Malon, D; May, E; LeCompte, T J; Vaniachine, A; Adams, D L; Assamagan, Ketevi A; Baker, R; Deng, W; Fine, V; Fisyak, Yu; Gibbard, B; Ma, H; Nevski, P; Paige, F; Rajagopalan, S; Smith, J; Undrus, A; Wenaus, T; Yu, D; Calafiura, P; Canon, S; Costanzo, D; Hinchliffe, Ian; Lavrijsen, W; Leggett, C; Marino, M; Quarrie, D R; Sakrejda, I; Stravopoulos, G; Tull, C; Loch, P; Youssef, S; Shank, J T; Engh, D; Frank, E; Sen-Gupta, A; Gardner, R; Meritt, F; Smirnov, Y; Huth, J; Grundhoefer, L; Luehring, F C; Goldfarb, S; Severini, H; Skubic, P L; Gao, Y; Ryan, T; De, K; Sosebee, M; McGuigan, P; Ozturk, N

    2004-01-01

    The ATLAS Collaboration at CERN is preparing for the data taking and analysis at the LHC that will start in 2007. Therefore, a series of Data Challenges was started in 2002 whose goals are the validation of the Computing Model, of the complete software suite, of the data model, and to ensure the correctness of the technical choices to be made for the final offline computing environment. A major feature of the first Data Challenge (DC1) was the preparation and the deployment of the software required for the production of large event samples as a worldwide distributed activity. It should be noted that it was not an option to "run the complete production at CERN" even if we had wanted to; the resources were not available at CERN to carry out the production on a reasonable time-scale. The great challenge of organising and carrying out this large-scale production at a significant number of sites around the world had therefore to be faced. However, the benefits of this are manifold: apart from realising the require...

  11. Challenges to Computational Aerothermodynamic Simulation and Validation for Planetary Entry Vehicle Analysis

    Science.gov (United States)

    Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2010-01-01

    Challenges to computational aerothermodynamic (CA) simulation and validation of hypersonic flow over planetary entry vehicles are discussed. Entry, descent, and landing (EDL) of high mass to Mars is a significant driver of new simulation requirements. These requirements include simulation of large deployable, flexible structures and interactions with reaction control system (RCS) and retro-thruster jets. Simulation of radiation and ablation coupled to the flow solver continues to be a high priority for planetary entry analyses, especially for return to Earth and outer planet missions. Three research areas addressing these challenges are emphasized. The first addresses the need to obtain accurate heating on unstructured tetrahedral grid systems to take advantage of flexibility in grid generation and grid adaptation. A multi-dimensional inviscid flux reconstruction algorithm is defined that is oriented with local flow topology as opposed to grid. The second addresses coupling of radiation and ablation to the hypersonic flow solver--flight- and ground-based data are used to provide limited validation of these multi-physics simulations. The third addresses the challenges of retro-propulsion simulation and the criticality of grid adaptation in this application. The evolution of CA to become a tool for innovation of EDL systems requires a successful resolution of these challenges.

  12. The nature of the (visualization) game: Challenges and opportunities from computational geophysics

    Science.gov (United States)

    Kellogg, L. H.

    2016-12-01

    As the geosciences enters the era of big data, modeling and visualization become increasingly vital tools for discovery, understanding, education, and communication. Here, we focus on modeling and visualization of the structure and dynamics of the Earth's surface and interior. The past decade has seen accelerated data acquisition, including higher resolution imaging and modeling of Earth's deep interior, complex models of geodynamics, and high resolution topographic imaging of the changing surface, with an associated acceleration of computational modeling through better scientific software, increased computing capability, and the use of innovative methods of scientific visualization. The role of modeling is to describe a system, answer scientific questions, and test hypotheses; the term "model" encompasses mathematical models, computational models, physical models, conceptual models, statistical models, and visual models of a structure or process. These different uses of the term require thoughtful communication to avoid confusion. Scientific visualization is integral to every aspect of modeling. Not merely a means of communicating results, the best uses of visualization enable scientists to interact with their data, revealing the characteristics of the data and models to enable better interpretation and inform the direction of future investigation. Innovative immersive technologies like virtual reality, augmented reality, and remote collaboration techniques, are being adapted more widely and are a magnet for students. Time-varying or transient phenomena are especially challenging to model and to visualize; researchers and students may need to investigate the role of initial conditions in driving phenomena, while nonlinearities in the governing equations of many Earth systems make the computations and resulting visualization especially challenging. 
Training students how to use, design, build, and interpret scientific modeling and visualization tools prepares them

  13. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
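    The yearlong-at-1-second requirement quoted above implies a very large number of sequential power-flow solves. A back-of-the-envelope sketch (the per-solve times below are hypothetical assumptions for illustration, not figures from the report):

```python
# Rough estimate of the QSTS computational burden described above.
# One power flow is solved per 1-second time step over a full year.
# Per-solve times are hypothetical; real values depend on feeder
# size, imbalance, and the power flow solver used.

SECONDS_PER_YEAR = 365 * 24 * 3600  # number of 1-s time steps

def qsts_runtime_hours(per_solve_seconds: float) -> float:
    """Total wall-clock hours for a yearlong 1-s-resolution QSTS run."""
    return SECONDS_PER_YEAR * per_solve_seconds / 3600.0

print(f"power flows to solve: {SECONDS_PER_YEAR:,}")
for t in (1e-3, 5e-3, 1.4e-2):  # assumed seconds per power flow
    print(f"{t * 1e3:6.1f} ms/solve -> {qsts_runtime_hours(t):6.1f} h")
```

    Even at one millisecond per power flow, the roughly 31.5 million solves take about nine hours; a per-solve time in the low tens of milliseconds already reaches the 120-hour figure cited in the abstract.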

  14. Transfer Readiness Pilot Study.

    Science.gov (United States)

    Scott-Skillman, Thelma; And Others

    The California Community Colleges (CCC) has implemented a prototype model for determining student transfer readiness as a primary means of assessing community college transfer effectiveness. This report provides definitions of transfer readiness and guidelines for colleges participating in the CCC transfer readiness study. First, a memorandum from…

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs submitted by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  18. Technology readiness assessments: A retrospective

    Science.gov (United States)

    Mankins, John C.

    2009-11-01

    The development of new system capabilities typically depends upon the prior success of advanced technology research and development efforts. These systems developments inevitably face the three major challenges of any project: performance, schedule and budget. Done well, advanced technology programs can substantially reduce the uncertainty in all three of these dimensions of project management. Done poorly, or not at all, and new system developments suffer from cost overruns, schedule delays and the steady erosion of initial performance objectives. It is often critical for senior management to be able to determine which of these two paths is more likely—and to respond accordingly. The challenge for system and technology managers is to be able to make clear, well-documented assessments of technology readiness and risks, and to do so at key points in the life cycle of the program. In the mid 1970s, the National Aeronautics and Space Administration (NASA) introduced the concept of "technology readiness levels" (TRLs) as a discipline-independent, programmatic figure of merit (FOM) to allow more effective assessment of, and communication regarding, the maturity of new technologies. In 1995, the TRL scale was further strengthened by the articulation of the first definitions of each level, along with examples (J. Mankins, Technology readiness levels, A White Paper, NASA, Washington, DC, 1995. [1]). Since then, TRLs have been embraced by the U.S. Congress' Government Accountability Office (GAO), adopted by the U.S. Department of Defense (DOD), and are being considered for use by numerous other organizations. Overall, the TRLs have proved to be highly effective in communicating the status of new technologies among sometimes diverse organizations. This paper will review the concept of "technology readiness assessments", and provide a retrospective on the history of "TRLs" during the past 30 years. The paper will conclude with observations concerning prospective future
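    The nine-level NASA scale discussed above can be captured as a small lookup table; the one-line summaries below are paraphrases for illustration, not the official 1995 definitions, and the `maturity_gap` helper is a hypothetical convenience:

```python
# NASA Technology Readiness Levels (TRL 1-9), with paraphrased
# one-line summaries; see Mankins (1995) for the official wording.
TRL = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Component validation in a laboratory environment",
    5: "Component validation in a relevant environment",
    6: "System/subsystem prototype demonstrated in a relevant environment",
    7: "System prototype demonstrated in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

def maturity_gap(current: int, required: int) -> int:
    """TRL levels still to be advanced before a technology is ready
    for the intended application (0 if already mature enough)."""
    return max(0, required - current)
```

    A readiness assessment at a project milestone then reduces to comparing the assessed TRL against the level the system development requires, e.g. `maturity_gap(4, 6)` reports two levels of maturation still outstanding.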

  19. IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.

    Science.gov (United States)

    Chen, Ying; Elenee Argentinis, J D; Weber, Griff

    2016-04-01

    Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Resource Readiness of Armored Units.

    Science.gov (United States)

    1979-11-01

    rate of 1350 gallons. At high attrition rates the TSAR/AURA simulation suggests that for the attack SOC support resources--spares and manpower--are...important to validate the data processed by the TSAR/AURA simulation model. The uses of readiness measures generally fall into two classes: near-term...This study could not have been done were it not for the assistance of Rand colleagues Don Emerson, who developed the TSAR computer model, and Milt

  1. Tackling some of the most intricate geophysical challenges via high-performance computing

    Science.gov (United States)

    Khosronejad, A.

    2016-12-01

    Recently, the world has been witnessing significant enhancements in the computing power of supercomputers. Computer clusters, in conjunction with advanced mathematical algorithms, have set the stage for developing and applying powerful numerical tools to tackle some of the most intricate geophysical challenges that today's engineers face. One such challenge is to understand how turbulent flows, in real-world settings, interact with (a) rigid and/or mobile complex bed bathymetry of waterways and sea-beds in coastal areas; (b) objects with complex geometry that are fully or partially immersed; and (c) the free surface of waterways and water surface waves in the coastal area. This understanding is especially important because turbulent flows in real-world environments are often bounded by geometrically complex boundaries, which dynamically deform and give rise to multi-scale and multi-physics transport phenomena, and are characterized by multi-lateral interactions among various phases (e.g. air/water/sediment phases). Herein, I present some of the multi-scale and multi-physics geophysical fluid mechanics processes that I have attempted to study using an in-house high-performance computational model, the so-called VFS-Geophysics. More specifically, I will present the simulation results of turbulence/sediment/solute/turbine interactions in real-world settings. Parts of the simulations I present are performed to gain scientific insights into processes such as sand wave formation (A. Khosronejad and F. Sotiropoulos (2014), Numerical simulation of sand waves in a turbulent open channel flow, Journal of Fluid Mechanics, 753:150-216), while others are carried out to predict the effects of climate change and large flood events on societal infrastructures (A. Khosronejad et al. (2016), Large eddy simulation of turbulence and solute transport in a forested headwater stream, Journal of Geophysical Research, doi: 10.1002/2014JF003423).

  2. Optimization in castings—An overview of relevant computational technologies and future challenges

    Science.gov (United States)

    Ransing, R. S.; Sood, M. P.

    2006-12-01

    The manufacture of defect-free components at low cost and high productivity is as important to the casting industry today as it was 30 years ago. In the past, experience was gained either by using a “trial and error” method or by undertaking expensive experiments. Many “dos” and “don’ts” have evolved in the casting process over a period of time. However, the important ones that come to mind are so fundamental that they challenge the “academic mind” to think all over again. The rules proposed by Professor John Campbell[1] are classic examples. The message is simple: mathematical complexity in computer models needs to go hand in hand with the rules derived from “first principles.” In the field of optimization, a variety of methods have been proposed over the years. At the start of an optimization study, the foundryman’s first choice is to use simple but well-established methods such as orthogonal arrays for the optimal design of process conditions or the famous “inscribed” or Heuvers’ circle method[2] for optimal feeding design. Computer simulation software has been based on a variety of computational methods, ranging from geometric reasoning techniques (the famous Chvorinov rule and its variants)[11,13,15,29-31] to solving complex partial differential equations using one of the numerical methods. Optimization methods based on solving partial differential equations were an active area of research in the mid-1990s.[6-10,17] This article reviews a variety of optimization methods including—probably for the first time—geometric reasoning methods. The contribution from various computational methodologies is highlighted with particular emphasis on characterizing “objective functions” and “constraints.” The article also raises some of the challenging issues that the optimization community is facing today for solving casting problems and reports on our recent work on linking geometric reasoning techniques with the finite
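    The Chvorinov rule named above relates solidification time to the casting modulus (volume-to-surface-area ratio). A minimal sketch, in which the mold constant `B` and exponent `n = 2` are illustrative assumptions (both are determined experimentally for a given alloy/mold pair):

```python
# Chvorinov's rule: t = B * (V / A)**n, where M = V / A is the
# casting modulus. B = 2.0e6 s/m^2 and n = 2 are illustrative
# assumptions here, not values from the article.

def modulus(volume: float, surface_area: float) -> float:
    """Casting modulus M = V / A (metres)."""
    return volume / surface_area

def solidification_time(volume: float, surface_area: float,
                        mold_constant: float = 2.0e6,
                        n: float = 2.0) -> float:
    """Chvorinov estimate of solidification time (seconds)."""
    return mold_constant * modulus(volume, surface_area) ** n

# Heuvers-style feeding check: a riser should solidify after the
# casting section it feeds, i.e. its modulus must be larger.
cube = (0.1 ** 3, 6 * 0.1 ** 2)     # 10 cm cube: M = 1/60 m
riser = (0.15 ** 3, 6 * 0.15 ** 2)  # 15 cm cube riser: M = 1/40 m
assert modulus(*riser) > modulus(*cube)
assert solidification_time(*riser) > solidification_time(*cube)
```

    This is the geometric-reasoning end of the spectrum the review describes: an optimal feeding design can be screened by comparing moduli alone, before any partial differential equation is solved.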

  3. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    Directory of Open Access Journals (Sweden)

    Chung-Chi Yang

    2013-11-01

    Full Text Available Many studies have indicated that computing technology can enable off-site cardiologists to read patients’ electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents the unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While being devoted to promote the application of information technology onto telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology.

  4. Mobile, cloud, and big data computing: contributions, challenges, and new directions in telecardiology.

    Science.gov (United States)

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-11-13

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients' electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents the unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. While being devoted to promote the application of information technology onto telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology.

  5. Multiscale Mechanics of Articular Cartilage: Potentials and Challenges of Coupling Musculoskeletal, Joint, and Microscale Computational Models

    Science.gov (United States)

    Halloran, J. P.; Sibole, S.; van Donkelaar, C. C.; van Turnhout, M. C.; Oomens, C. W. J.; Weiss, J. A.; Guilak, F.; Erdemir, A.

    2012-01-01

    Articular cartilage experiences significant mechanical loads during daily activities. Healthy cartilage provides the capacity for load bearing and regulates the mechanobiological processes for tissue development, maintenance, and repair. Experimental studies at multiple scales have provided a fundamental understanding of macroscopic mechanical function, evaluation of the micromechanical environment of chondrocytes, and the foundations for mechanobiological response. In addition, computational models of cartilage have offered a concise description of experimental data at many spatial levels under healthy and diseased conditions, and have served to generate hypotheses for the mechanical and biological function. Further, modeling and simulation provides a platform for predictive risk assessment, management of dysfunction, as well as a means to relate multiple spatial scales. Simulation-based investigation of cartilage comes with many challenges including both the computational burden and often insufficient availability of data for model development and validation. This review outlines recent modeling and simulation approaches to understand cartilage function from a mechanical systems perspective, and illustrates pathways to associate mechanics with biological function. Computational representations at single scales are provided from the body down to the microstructure, along with attempts to explore multiscale mechanisms of load sharing that dictate the mechanical environment of the cartilage and chondrocytes. PMID:22648577

  6. Applied & Computational MathematicsChallenges for the Design and Control of Dynamic Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; Burns, J A; Collis, S; Grosh, J; Jacobson, C A; Johansen, H; Mezic, I; Narayanan, S; Wetter, M

    2011-03-10

    The Energy Independence and Security Act of 2007 (EISA) was passed with the goal 'to move the United States toward greater energy independence and security.' Energy security and independence cannot be achieved unless the United States addresses the issue of energy consumption in the building sector and significantly reduces energy consumption in buildings. Commercial and residential buildings account for approximately 40% of the U.S. energy consumption and emit 50% of CO{sub 2} emissions in the U.S., which is more than twice the total energy consumption of the entire U.S. automobile and light truck fleet. A 50%-80% improvement in building energy efficiency in both new construction and in retrofitting existing buildings could significantly reduce U.S. energy consumption and mitigate climate change. Reaching these aggressive building efficiency goals will not happen without significant Federal investments in areas of computational and mathematical sciences. Applied and computational mathematics are required to enable the development of algorithms and tools to design, control and optimize energy efficient buildings. The challenge has been issued by the U.S. Secretary of Energy, Dr. Steven Chu (emphasis added): 'We need to do more transformational research at DOE including computer design tools for commercial and residential buildings that enable reductions in energy consumption of up to 80 percent with investments that will pay for themselves in less than 10 years.' On July 8-9, 2010 a team of technical experts from industry, government and academia were assembled in Arlington, Virginia to identify the challenges associated with developing and deploying new computational methodologies and tools that will address building energy efficiency. These experts concluded that investments in fundamental applied and computational mathematics will be required to build enabling technology that can be used to realize the target of 80% reductions in energy

  7. Ready for What? Constructing Meanings of Readiness for Kindergarten.

    Science.gov (United States)

    Graue, M. Elizabeth

    This book examines the issue of school readiness, focusing on children's readiness for entrance into kindergarten and promotion to first grade. Chapter 1 reviews the literature on school readiness, exploring trends in policy related to readiness and readiness as a child-centered characteristic. Chapter 2 examines various theoretical frameworks for…

  8. INFN Tier-1 experiences with Castor-2 in CMS computing challenges

    CERN Document Server

    AUTHOR|(CDS)2108873

    2007-01-01

    The CMS combined Computing, Software and Analysis challenge of 2006 (CSA06) is a 50 million event exercise to test the workflow and dataflow associated with the data handling model of CMS. It was designed to be a fully Grid-enabled, 25% capacity exercise of what is needed for CMS operations in 2008. All CMS Tier-1’s participated, and the INFN Tier-1 - located at CNAF, Bologna, Italy - joined with a production Castor-2 installation as a Hierarchical Storage Manager solution to address data storage, data access and custodial responsibility. After the prompt reconstruction phase at the Tier-0, the data was distributed to all participating Tier-1’s, and calibration/alignment, re-reconstruction and skimming jobs ran at the Tier-1’s. Output of the skimming jobs was propagated to the Tier-2’s, to allow physics analysis job submissions. The experience collected by the INFN Tier-1 storage group during the pre-challenge Monte Carlo production, the preparation and the running of the CSA06 exercise - as well as the Ti...

  9. Computer-Aided Diagnosis Systems for Lung Cancer: Challenges and Methodologies

    Science.gov (United States)

    El-Baz, Ayman; Beache, Garth M.; Gimel'farb, Georgy; Suzuki, Kenji; Okada, Kazunori; Elnakib, Ahmed; Soliman, Ahmed; Abdollahi, Behnoush

    2013-01-01

    This paper overviews one of the most important, interesting, and challenging problems in oncology, the problem of lung cancer diagnosis. Developing an effective computer-aided diagnosis (CAD) system for lung cancer is of great clinical importance and can increase the patient's chance of survival. For this reason, CAD systems for lung cancer have been investigated in a huge number of research studies. A typical CAD system for lung cancer diagnosis is composed of four main processing steps: segmentation of the lung fields, detection of nodules inside the lung fields, segmentation of the detected nodules, and diagnosis of the nodules as benign or malignant. This paper overviews the current state-of-the-art techniques that have been developed to implement each of these CAD processing steps. For each technique, various aspects of technical issues, implemented methodologies, training and testing databases, and validation methods, as well as achieved performances, are described. In addition, the paper addresses several challenges that researchers face in each implementation step and outlines the strengths and drawbacks of the existing approaches for lung cancer CAD systems. PMID:23431282

  10. Computational analyses of ancient pathogen DNA from herbarium samples: challenges and prospects

    Directory of Open Access Journals (Sweden)

    Kentaro Yoshida

    2015-09-01

    Full Text Available The application of DNA sequencing technology to the study of ancient DNA has enabled the reconstruction of past epidemics from genomes of historically important plant-associated microbes. Recently, the genome sequences of the potato late blight pathogen Phytophthora infestans were analyzed from 19th century herbarium specimens. These herbarium samples originated from infected potatoes collected during and after the Irish potato famine. Herbaria have therefore great potential to help elucidate past epidemics of crops, date the emergence of pathogens, and inform about past pathogen population dynamics. DNA preservation in herbarium samples was unexpectedly good, raising the possibility of a whole new research area in plant and microbial genomics. However, the recovered DNA can be extremely fragmented resulting in specific challenges in reconstructing genome sequences. Here we review some of the challenges in computational analyses of ancient DNA from herbarium samples. We also applied the recently developed linkage method to haplotype reconstruction of diploid or polyploid genomes from fragmented ancient DNA.

  11. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created to support the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  13. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  14. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization and conversion to RAW format, after which the samples were run through the High Level Trigger (HLT). The data were then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  15. Challenge.

    Directory of Open Access Journals (Sweden)

    René-Éric Dagorn

    2004-04-01

    Full Text Available Under the title "The Hispanic Challenge", Samuel P. Huntington offers, in the March-April 2004 issue of Foreign Policy, a new demonstration of the danger of his pseudo-theory of the "clash of civilizations". What, then, is this "challenge" that, according to Huntington, American society now faces? It is that of "Hispanic" immigration, which "threatens American identity, its values and its way of life" ...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental in site commissioning, increasing the number of sites that are available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  17. Readiness for Living Technology

    DEFF Research Database (Denmark)

    Peronard, Jean-Paul

    2013-01-01

    This paper is a comparative analysis of workers in healthcare with high and low degrees of readiness for living technology such as robotics. To explore the differences among workers’ readiness for robotics in healthcare, statistical analysis was conducted on the data set obtained from 200...

  18. Preschool Children's School Readiness

    Science.gov (United States)

    Pekdogan, Serpil; Akgül, Esra

    2017-01-01

    The purpose of this study is to examine preschool teachers' perspectives on children's school readiness. Qualitative and quantitative research methods were used in the study in a mixed-methods design. Data, in the quantitative aspects of the research, were collected through the use of the "School Readiness Form" developed by Boz (2004)…

  19. Data Challenges

    CERN Multimedia

    McCubbin, N A

    Some two years ago we planned a series of Data Challenges starting at the end of 2001. At the time, that seemed to be comfortingly far in the future... Well, as the saying goes, doesn't time fly when you are having fun! ATLAS Computing is now deep in the throes of getting the first Data Challenge (DC0) up and running. One of the main aims of DC0 is to have a software 'release' in which we can generate full physics events, track all particles through the detector, simulate the detector response, reconstruct the event, and study it, with appropriate data storage en route. As all software is "always 95% ready" (!), we have been able to do most of this, more or less, for some time. But DC0 forces us to have everything working, together, at the same time: a reality check. DC0 should finish early next year, and it will be followed almost immediately afterwards by DC1 (DC0 was foreseen as the 'check' for DC1). DC1 will last into the middle of 2002, and has two major goals. The first is generation, simulation, and r...

  20. Learner Readiness for Online Learning: Scale Development and Student Perceptions

    Science.gov (United States)

    Hung, Min-Ling; Chou, Chien; Chen, Chao-Hsiu; Own, Zang-Yuan

    2010-01-01

    The purpose of this study was to develop and validate a multidimensional instrument for college students' readiness for online learning. Through a confirmatory factor analysis, the Online Learning Readiness Scale (OLRS) was validated in five dimensions: self-directed learning, motivation for learning, computer/Internet self-efficacy, learner…

  1. Computer Literacy of Iranian Teachers of English as a Foreign Language: Challenges and Obstacles

    Science.gov (United States)

    Dashtestani, Reza

    2014-01-01

    Basically, one of the requirements for the implementation of computer-assisted language learning (CALL) is English as a foreign language (EFL) teachers' ability to use computers effectively. Educational authorities and planners should identify EFL teachers' computer literacy levels and make attempts to improve the teachers' computer competence.…

  2. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2, 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  3. Analysis of the United States Computer Emergency Readiness Team’s (U.S. CERT) EINSTEIN III Intrusion Detection System, and Its Impact on Privacy

    Science.gov (United States)

    2013-03-01

    Identifiable Information PIN Personal Identification Number PIV Personal Identity Verification RFID Radio Frequency Identification SECURE IT ... victims of computer hacking to request law enforcement assistance in monitoring the trespassers on their computers (Solove, 2006) (U.S. Department of ... Machine Readable Technology, such as Radio Frequency Identification (RFID) tags, would allow for routine tracking, monitoring and regulating of

  4. FY 1992 Blue Book: Grand Challenges: High Performance Computing and Communications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — High performance computing and computer communications networks are becoming increasingly important to scientific advancement, economic competition, and national...

  5. FY 1993 Blue Book: Grand Challenges 1993: High Performance Computing and Communications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — High performance computing and computer communications networks are becoming increasingly important to scientific advancement, economic competition, and national...

  6. Migrating Educational Data and Services to Cloud Computing: Exploring Benefits and Challenges

    Science.gov (United States)

    Lahiri, Minakshi; Moseley, James L.

    2013-01-01

    "Cloud computing" is currently the "buzzword" in the Information Technology field. Cloud computing facilitates convenient access to information and software resources as well as easy storage and sharing of files and data, without the end users being aware of the details of the computing technology behind the process. This…

  7. Computer Games in Pre-School Settings: Didactical Challenges when Commercial Educational Computer Games Are Implemented in Kindergartens

    Science.gov (United States)

    Vangsnes, Vigdis; Gram Okland, Nils Tore; Krumsvik, Rune

    2012-01-01

    This article focuses on the didactical implications when commercial educational computer games are used in Norwegian kindergartens by analysing the dramaturgy and the didactics of one particular game and the game in use in a pedagogical context. Our justification for analysing the game by using dramaturgic theory is that we consider the game to be…

  8. Technology Readiness Level Guidebook

    Science.gov (United States)

    2017-09-01

    This guidebook provides the necessary information for conducting a Technology Readiness Level (TRL) Assessment. TRL Assessments are a tool for determining the maturity of technologies and identifying next steps in the research process. This guidebook...

  9. Computer Literacy and Health Locus of Control as Determinants for Readiness and Acceptability of Telepractice in a Head and Neck Cancer Population.

    Science.gov (United States)

    Cartmill, Bena; Wall, Laurelie R; Ward, Elizabeth C; Hill, Anne J; Porceddu, Sandro V

    2016-01-01

    Understanding end-user populations is required in designing telepractice applications. This study explored computer literacy and health locus of control in head/neck cancer (HNC) patients to inform suitability for telerehabilitation. Sixty individuals with oropharyngeal cancer were recruited. Computer literacy was examined using a 10-question survey. The Multidimensional Health Locus of Control Scale Form C (MHLC-C) examined perceptions of health "control". Participants were mostly middle-aged males, from high socioeconomic backgrounds. Only 10% were non-computer users. Of the computer users, 91% reported daily use, 66% used multiple devices and over 75% rated themselves as "confident" users. More than half were open to using technology for health-related activities. High internal scores (MHLC-C) signified a belief that one's own behaviour influenced health status. HNC patients have high computer literacy and an internal health locus of control, both of which are positive factors to support telepractice models of care. This may include asynchronous models requiring heightened capacity for self-management.

  10. Computer Literacy and Health Locus of Control as Determinants for Readiness and Acceptability of Telepractice in a Head and Neck Cancer Population

    Directory of Open Access Journals (Sweden)

    Bena Cartmill

    2016-12-01

    Full Text Available Understanding end-user populations is required in designing telepractice applications. This study explored computer literacy and health locus of control in head/neck cancer (HNC) patients to inform suitability for telerehabilitation. Sixty individuals with oropharyngeal cancer were recruited. Computer literacy was examined using a 10-question survey. The Multidimensional Health Locus of Control Scale Form C (MHLC-C) examined perceptions of health “control”. Participants were mostly middle-aged males, from high socioeconomic backgrounds. Only 10% were non-computer users. Of the computer users, 91% reported daily use, 66% used multiple devices and over 75% rated themselves as “confident” users. More than half were open to using technology for health-related activities. High internal scores (MHLC-C) signified a belief that one's own behaviour influenced health status. HNC patients have high computer literacy and an internal health locus of control, both of which are positive factors to support telepractice models of care. This may include asynchronous models requiring heightened capacity for self-management.

  11. First ALMA Transporter Ready for Challenging Duty

    Science.gov (United States)

    2008-07-01

    The first of two ALMA transporters -- unique vehicles designed to move high-tech radio-telescope antennas in the harsh, high-altitude environment of the Atacama Large Millimeter/submillimeter Array -- has been completed and passed its initial operational tests. The 130-ton machine moves on 28 wheels and will be able to transport a 115-ton antenna and set it down on a concrete pad within millimeters of a prescribed position. (Image: The ALMA Transporter on a test run. Credit: ESO) The ALMA transporter rolled out of its hangar and underwent the tests at the Scheuerle Fahrzeugfabrik company site near Nuremberg, Germany. The machine is scheduled for delivery at the ALMA site in Chile by the end of 2007, and a second vehicle will follow about three months later. ALMA is a giant, international observatory under construction in the Atacama Desert of northern Chile at an elevation of 16,500 feet. Using at least 66 high-precision antennas, with the possibility of increasing the number in the future, ALMA will provide astronomers with an unprecedented ability to explore the Universe as seen at wavelengths of a few millimeters to less than a millimeter. By moving the antennas from configurations as compact as 150 meters to as wide as 15 kilometers, the system will provide a zoom-lens ability for scientists. "The ability to move antennas to reconfigure the array is vital to fulfilling ALMA's scientific mission. The operations plan calls for moving antennas on a daily basis to provide the flexibility that will be such a big part of ALMA's scientific value. That's why the transporters are so important and why this is such a significant milestone," said Adrian Russell, North American Project Manager for ALMA.
"The ALMA antennas will be assembled and their functionality will be verified at a base camp, located at an altitude of 2900 meters (9500 feet) and the transporters will in a first step bring the telescopes up to the 5000-meter (16,500 feet) high observatory," explained Hans Rykaczewski, the European ALMA Project Manager. "There, the transporters will move the antennas from the compact configuration to any extended configuration which could stretch up to 15 kilometers." To do their job for ALMA, the transporters will have to climb a 17-mile, high-altitude road with an average grade of 7 percent. Carrying an antenna, they can move about 7 mph; when empty, they can travel about 12 mph. The trip from the base camp to the high observing site will take about three hours. A special brake system allows them to safely make the downhill trip. The machines also incorporate a number of redundant safety devices to protect both the personnel and the valuable antennas. "In order to operate the transporter at the ALMA site, two engines with a total of about 1400 horsepower are installed and all the components have been checked to meet the requirements at this extreme conditions," says Andreas Kohler, Vice President for Research and Development at Scheuerle Fahrzeugfabrik, the company which built the transporters under contract to ESO. "The human factor was also considered. For example, the backrests of the driver seats are shaped to allow the driver to wear his oxygen tank while driving." At the high elevation of 16,500 feet, the transporter engines will only provide about half their rated power, because of the lowered amount of available oxygen. The ALMA project is a partnership between Europe, Japan and North America in cooperation with the Republic of Chile. ALMA is funded in Europe by ESO, in Japan by the National Institutes of Natural Sciences in cooperation with the Academia Sinica in Taiwan and in North America by the U.S. 
National Science Foundation in cooperation with the National Research Council of Canada. ALMA construction and operations are led on behalf of Europe by ESO, on behalf of Japan by the National Astronomical Observatory of Japan and on behalf of North America by the National Radio Astronomy Observatory, which is managed by Associated Universities, Inc.

  12. Are we ready to accept the challenge?

    DEFF Research Database (Denmark)

    Lau, Sofie Rosenlund; Traulsen, Janine M

    2017-01-01

    , including explicitly reflecting upon theoretical perspectives affecting the research process. METHODS: Content analysis was used to evaluate levels of theoretical visibility and analysis transparency in selected qualitative research articles published in Research in Social and Administrative Pharmacy...... the standpoint that theory and high-quality analysis go hand-in-hand. Based on the content analysis, articles that were deemed to be high in quality were explicit about the theoretical framework of their study and transparent in how they analyzed their data. It was found that theory contributed...... to the transparency of how the data were analyzed and interpreted. Two ways of improving contemporary qualitative research in the field of social and administrative pharmacy are discussed: engaging with social theory and establishing close collaboration with social scientists....

  13. Is the "Net Generation" Ready for Digital Citizenship? Perspectives from the IEA International Computer and Information Literacy Study 2013. Policy Brief No. 6

    Science.gov (United States)

    Watkins, Ryan; Engel, Laura C.; Hastedt, Dirk

    2015-01-01

    The rise of digital information and communication technologies (ICT) has made the acquisition of computer and information literacy (CIL) a leading factor in creating an engaged, informed, and employable citizenry. However, are young people, often described as "digital natives" or the "net generation," developing the necessary…

  14. Increasing high school girls' exposure to computing activities with e-textiles: challenges and lessons learned

    DEFF Research Database (Denmark)

    Borsotti, Valeria

    2017-01-01

    The number of female students in computer science degrees has been rapidly declining in Denmark in the past 40 years, as in many other European and North-American countries. The main reasons behind this phenomenon are widespread gender stereotypes about who is best suited to pursue a career in CS...... pilot workshop organized by the IT University of Copenhagen which targeted high school girls. The workshop aimed to introduce the girls to coding and computing through hands-on e-textiles activities realized with the Protosnap Lilypad Development board. This contribution discusses the advantages......; stereotypes about computing as a ‘male’ domain; widespread lack of pre-college CS education and perceptions of computing as not socially relevant. STEAM activities have often been used to bridge the gender gap and to broaden the appeal of computing among children and youth. This contribution examines a STEAM...

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  16. Computer-aided detection systems to improve lung cancer early diagnosis: state-of-the-art and challenges

    Science.gov (United States)

    Traverso, A.; Lopez Torres, E.; Fantacci, M. E.; Cerello, P.

    2017-05-01

    Lung cancer is one of the most lethal types of cancer, largely because its early diagnosis is not yet good enough. In fact, the detection of pulmonary nodules, potential lung cancers, in Computed Tomography scans is a very challenging and time-consuming task for radiologists. To support radiologists, researchers have developed Computer-Aided Diagnosis (CAD) systems for the automated detection of pulmonary nodules in chest Computed Tomography scans. Despite the high level of technological development and the proven benefits on overall detection performance, the use of Computer-Aided Diagnosis in clinical practice is far from being a common procedure. In this paper we investigate the causes underlying this discrepancy and present a solution to tackle it: the M5L web- and cloud-based on-demand Computer-Aided Diagnosis. In addition, we show how the combination of traditional image processing techniques with state-of-the-art classification algorithms allows one to build a system whose performance can be much better than that of any Computer-Aided Diagnosis developed so far. This outcome opens the possibility of using the CAD as a clinical decision support for radiologists.

  17. Academic Training: QCD: are we ready for the LHC

    CERN Multimedia

    2006-01-01

    2006-2007 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 4, 5, 6, 7 December, from 11:00 to 12:00 4, 5, 6 December - Main Auditorium, bldg. 500, 7 December - TH Auditorium, bldg. 4 - 3-006 QCD: are we ready for the LHC S. FRIXIONE / INFN, Genoa, Italy The LHC energy regime poses a serious challenge to our capability of predicting QCD reactions to the level of accuracy necessary for a successful programme of searches for physics beyond the Standard Model. In these lectures, I'll introduce basic concepts in QCD, and present techniques based on perturbation theory, such as fixed-order and resummed computations, and Monte Carlo simulations. I'll discuss applications of these techniques to hadron-hadron processes, concentrating on recent trends in perturbative QCD aimed at improving our understanding of LHC phenomenology.

  18. Challenges and Opportunities for Security in High-Performance Computing Environments

    OpenAIRE

    Peisert, S

    2017-01-01

    High-performance computing (HPC) environments have numerous distinctive elements that make securing them different than securing traditional computing systems. In some cases this is due to the way that HPC systems are implemented. In other cases, it is due to the way that HPC systems are used, or a combination of both issues. In this article, we discuss these distinctions and also discuss which security procedures and mechanisms are and are not appropriate in HPC environments, and where gaps ...

  19. Qualitative Computing and Qualitative Research: Addressing the Challenges of Technology and Globalization

    Directory of Open Access Journals (Sweden)

    César A. Cisneros Puebla

    2012-05-01

    Full Text Available Qualitative computing has been part of our lives for thirty years. Today, we urgently call for an evaluation of its international impact on qualitative research. Evaluating the international impact of qualitative research and qualitative computing requires a consideration of the vast amount of qualitative research over the last decades, as well as thoughtfulness about the uneven and unequal way in which qualitative research and qualitative computing are present in different fields of study and geographical regions. To understand the international impact of qualitative computing requires evaluation of the digital divide and the huge differences between center and peripheries. The international impact of qualitative research, and, in particular qualitative computing, is the question at the heart of this array of selected papers from the "Qualitative Computing: Diverse Worlds and Research Practices Conference." In this article, we introduce the reader to the goals, motivation, and atmosphere at the conference, taking place in Istanbul, Turkey, in 2011. The dialogue generated there is still in the air, and this introduction is a call to spread that voice. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1202285

  20. A Technical Guide to College Readiness Indicators. College Readiness Indicator Systems (CRIS) Resource Series

    Science.gov (United States)

    University of Chicago Consortium on Chicago School Research, 2014

    2014-01-01

    Districts now have access to a wealth of new information that can help target students with appropriate supports and bring focus and coherence to college readiness efforts. However, the abundance of data has brought its own challenges. Schools and school systems are often overwhelmed with the amount of data available. The capacity of districts to…

  1. Perspectives on Games, Computers, and Mental Health: Questions about Paradoxes, Evidences, and Challenges.

    Science.gov (United States)

    Desseilles, Martin

    2016-01-01

    In the field of mental health, games and computerized games present questions about paradoxes, evidences, and challenges. This perspective article offers perspectives and personal opinion about these questions, evidences, and challenges with an objective of presenting several ideas and issues in this rapidly developing field. First, games raise some questions in the sense of the paradox between a game and an issue, as well as the paradox of using an amusing game to treat a serious pathology. Second, games also present evidence in the sense that they involve relationships with others, as well as learning, communication, language, emotional regulation, and hedonism. Third, games present challenges, such as the risk of abuse, the critical temporal period that may be limited to childhood, their important influence on sociocognitive learning and the establishment of social norms, and the risk of misuse of games.

  2. Toward the Language-Ready Brain: Biological Evolution and Primate Comparisons.

    Science.gov (United States)

    Arbib, Michael A

    2017-02-01

    The approach to language evolution suggested here focuses on three questions: How did the human brain evolve so that humans can develop, use, and acquire languages? How can the evolutionary quest be informed by studying brain, behavior, and social interaction in monkeys, apes, and humans? How can computational modeling advance these studies? I hypothesize that the brain is language ready in that the earliest humans had protolanguages but not languages (i.e., communication systems endowed with rich and open-ended lexicons and grammars supporting a compositional semantics), and that it took cultural evolution to yield societies (a cultural constructed niche) in which language-ready brains could become language-using brains. The mirror system hypothesis is a well-developed example of this approach, but I offer it here not as a closed theory but as an evolving framework for the development and analysis of conflicting subhypotheses in the hope of their eventual integration. I also stress that computational modeling helps us understand the evolving role of mirror neurons, not in and of themselves, but only in their interaction with systems "beyond the mirror." Because a theory of evolution needs a clear characterization of what it is that evolved, I also outline ideas for research in neurolinguistics to complement studies of the evolution of the language-ready brain. A clear challenge is to go beyond models of speech comprehension to include sign language and models of production, and to link language to visuomotor interaction with the physical and social world.

  3. The LHCb software and computing upgrade for Run 3: opportunities and challenges

    Science.gov (United States)

    Bozzi, C.; Roiser, S.; LHCb Collaboration

    2017-10-01

    The LHCb detector will be upgraded for LHC Run 3 and will be read out at 30 MHz, corresponding to the full inelastic collision rate, with major implications for the full software trigger and offline computing. If the current computing model and software framework are kept, the data storage capacity and computing power required to process data at this rate, and to generate and reconstruct equivalent samples of simulated events, will exceed the current capacity by at least one order of magnitude. A redesign of the software framework, including scheduling, the event model, the detector description and the conditions database, is needed to fully exploit the computing power of multi- and many-core architectures and coprocessors. Data processing and the analysis model will also change towards an early streaming of different data types, in order to limit storage resources, with further implications for the data analysis workflows. Fast simulation options will make it possible to obtain a reasonable parameterization of the detector response in considerably less computing time. Finally, the upgrade of LHCb will be a good opportunity to review and implement changes in the domains of software design, testing and review, and analysis workflow and preservation. In this contribution, activities and recent results in all the above areas are presented.

  4. The ontogeny of great ape gesture - not a simple story. Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    Science.gov (United States)

    Liebal, Katja

    2016-03-01

    Although there is an increasing number of studies investigating gestural communication in primates other than humans in both natural and captive settings [1], very little is known about how they acquire their gestures. Different mechanisms have been proposed, including genetic transmission [2], social learning [3], and ontogenetic ritualization [4]. This latter mechanism is central to Arbib's paper [5], because he uses dyadic brain modeling, that is, "modeling the brains of two creatures as they interact with each other, so that the action of one affects the perception of the other and so the cycle of interactions continues, with both brains changing in the process", to explain how gestures might emerge in ontogeny from previously non-communicative behaviors over the course of repeated, increasingly abbreviated and thus ritualized interactions. The aim of my comment is to discuss the current evidence from primate gesture research with regard to the different mechanisms proposed for gesture acquisition, and how this might confirm or challenge Arbib's approach.

  5. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts, which monitor the services and infrastructure as well as interface to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  6. The modern era of research on language evolution: Moving forward. Comment on "Towards a computational comparative neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    Science.gov (United States)

    Stout, Dietrich

    2016-03-01

    Twenty-five years ago, Pinker and Bloom [1] helped reinvigorate research on language evolution by arguing that language "shows signs of complex design for the communication of propositional structures, and the only explanation for the origin of organs with complex design is the process of natural selection." Since then, empirical research has tested the assertions of (cross-cultural) universality, (cross-species) uniqueness, and (cross-domain) specificity underpinning this argument from design. Appearances aside, points of consensus have emerged. The existence of a core computational and neural substrate unique to language and/or humans is still debated, but it is widely agreed that: 1) human language performance overlaps with behaviors in other domains and species, and 2) such general, pre-existing capacities provided the context for language-specific evolution (e.g. [2]).

  7. Readiness System Management

    Science.gov (United States)

    1977-05-13

    Training rating. Training shortfall (3a above) is expressed as a readiness rating, 1 through 4, through a tabular conversion given in AR 220-1.

  8. Rethinking School Readiness

    Science.gov (United States)

    Farran, Dale C.

    2011-01-01

    In the United States, for typically developing children, age has historically been the most common factor determining when a child starts formal schooling. Recently, there has been increased emphasis on other indicators of being ready for school. Beginning with Head Start in 1965 and mushrooming into state-funded prekindergarten programs in most…

  9. Ethical and Legal Issues in Computer-Mediated Communications: The Educational Challenge.

    Science.gov (United States)

    Resta, Paul E.

    1994-01-01

    Discusses the ethical use of computer-mediated communication (CMC) and the lack of integration of information ethics into elementary, secondary, and higher education curricula. The development of information ethics instruction is proposed which would include such topics as intellectual property, destruction of digital property, confidentiality and…

  10. The Computer-Mediated Communication (CMC) Classroom: A Challenge of Medium, Presence, Interaction, Identity, and Relationship

    Science.gov (United States)

    Sherblom, John C.

    2010-01-01

    There is a "prevalence of computer-mediated communication (CMC) in education," and a concern for its negative psychosocial consequences and lack of effectiveness as an instructional tool. This essay identifies five variables in the CMC research literature and shows their moderating effect on the psychosocial, instructional experience of the CMC…

  11. Exploring the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms: A Formative Analysis

    Science.gov (United States)

    Kay, Robin H.; Lauricella, Sharon

    2011-01-01

    Because of decreased prices, increased convenience, and wireless access, an increasing number of college and university students are using laptop computers in their classrooms. This recent trend has forced instructors to address the educational consequences of using these mobile devices. The purpose of the current study was to analyze and assess…

  12. Expanding Computer Science Education in Schools: Understanding Teacher Experiences and Challenges

    Science.gov (United States)

    Yadav, Aman; Gretter, Sarah; Hambrusch, Susanne; Sands, Phil

    2017-01-01

    The increased push for teaching computer science (CS) in schools in the United States requires training a large number of new K-12 teachers. The current efforts to increase the number of CS teachers have predominantly focused on training teachers from other content areas. In order to support these beginning CS teachers, we need to better…

  13. The challenge associated with the robust computation of meteor velocities from video and photographic records

    Science.gov (United States)

    Egal, A.; Gural, P. S.; Vaubaillon, J.; Colas, F.; Thuillot, W.

    2017-09-01

    The CABERNET project was designed to push the limits for obtaining accurate measurements of meteoroid orbits from photographic and video meteor camera recordings. The discrepancy between the measured and theoretical orbits of these objects heavily depends on the semi-major axis determination, and thus on the reliability of the pre-atmospheric velocity computation. With a spatial resolution of 0.01° per pixel and a temporal resolution of up to 10 ms, CABERNET should be able to provide accurate measurements of velocities and trajectories of meteors. To achieve this, it is necessary to improve the precision of the data reduction processes, and especially the determination of the meteor's velocity. In this work, most of the steps of the velocity computation are thoroughly investigated in order to reduce the uncertainties and error contributions at each stage of the reduction process. The accuracy of the measurement of meteor centroids is established and results in a precision of 0.09 pixels for CABERNET, which corresponds to 3.24″. Several methods to compute the velocity were investigated based on the trajectory determination algorithms described in Ceplecha (1987) and Borovicka (1990), as well as the multi-parameter fitting (MPF) method proposed by Gural (2012). In the case of the MPF, many optimization methods were implemented in order to find the most efficient and robust technique to solve the minimization problem. The entire data reduction process is assessed using simulated meteors, with different geometrical configurations and deceleration behaviors. It is shown that the multi-parameter fitting method proposed by Gural (2012) is the most accurate method to compute the pre-atmospheric velocity in all circumstances. Many techniques that assume constant velocity at the beginning of the path as derived from the trajectory determination using Ceplecha (1987) or Borovicka (1990) can lead to large errors for decelerating meteors. The MPF technique also allows one to
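The multi-parameter fitting idea - fitting a velocity-plus-deceleration model directly to the observed positions instead of assuming constant velocity at the start of the path - can be illustrated with a minimal sketch. The model form d(t) = v0·t − a1·exp(a2·t) follows Gural (2012), but the synthetic data, noise level, and the grid-search strategy below are illustrative assumptions, not CABERNET's actual reduction code:

```python
import numpy as np

# Synthetic meteor trail: distance along the track vs. time, with an
# exponential deceleration term as in Gural's (2012) model:
#   d(t) = v0*t - a1*exp(a2*t)        (all values here are hypothetical)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.5, 50)                  # 50 frames over 0.5 s
v0_true, a1_true, a2_true = 40.0, 0.05, 8.0    # km/s, km, 1/s
d = v0_true * t - a1_true * np.exp(a2_true * t)
d += rng.normal(0.0, 0.005, t.size)            # hypothetical centroid noise

def fit_mpf(t, d, a2_grid):
    """For each candidate a2 the model is linear in (v0, a1), so solve
    that subproblem exactly and keep the a2 with the smallest residual."""
    best = None
    for a2 in a2_grid:
        A = np.column_stack([t, -np.exp(a2 * t)])
        coef, *_ = np.linalg.lstsq(A, d, rcond=None)
        rss = np.sum((A @ coef - d) ** 2)
        if best is None or rss < best[0]:
            best = (rss, coef[0], coef[1], a2)
    return best[1:]  # v0, a1, a2

v0, a1, a2 = fit_mpf(t, d, np.linspace(1.0, 15.0, 200))
print(f"fitted v0 = {v0:.2f} km/s (true {v0_true})")
```

Because the whole trail constrains the fit, the pre-atmospheric velocity v0 is recovered even though the later frames decelerate, which is exactly where constant-velocity estimates go wrong.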

  14. Computer-aided diagnosis in radiological imaging: current status and future challenges

    Science.gov (United States)

    Doi, Kunio

    2009-10-01

    Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. Many different types of CAD schemes are being developed for detection and/or characterization of various lesions in medical imaging, including conventional projection radiography, CT, MRI, and ultrasound imaging. Commercial systems for detection of breast lesions on mammograms have been developed and have received FDA approval for clinical use. CAD may be defined as a diagnosis made by a physician who takes into account the computer output as a "second opinion". The purpose of CAD is to improve the quality and productivity of physicians in their interpretation of radiologic images. The quality of their work can be improved in terms of the accuracy and consistency of their radiologic diagnoses. In addition, the productivity of radiologists is expected to be improved by a reduction in the time required for their image readings. The computer output is derived from quantitative analysis of radiologic images by use of various methods and techniques in computer vision, artificial intelligence, and artificial neural networks (ANNs). The computer output may indicate a number of important parameters, for example, the locations of potential lesions such as lung cancer and breast cancer, the likelihood of malignancy of detected lesions, and the likelihood of various diseases based on differential diagnosis in a given image and clinical parameters. In this review article, the basic concept of CAD is first defined, and the current status of CAD research is then described. In addition, the potential of CAD in the future is discussed and predicted.

  15. Ready for the plunge!

    CERN Document Server

    2007-01-01

    Herman Ten Kate, Project Leader for the ATLAS magnet system, standing in front of the truck transporting the magnet across the Route de Meyrin. Every time any part of the ATLAS detector is moved, it’s quite a spectacle! On Tuesday 29 May, the first end-cap of the ATLAS toroid magnet left Building 180, bound for Point 1. The 240-ton behemoth covered the two short kilometres in no less than five hours. Traffic was interrupted on the Route de Meyrin while the exceptional load was wheeled to its final destination. One of the technical challenges was to keep the magnet horizontal throughout the operation and, to achieve this, computers permanently monitored the magnet’s angles of displacement and hydraulic jacks rectified any tilt. But the most hazardous part of the operation remains the 80-m plunge into the ATLAS cavern.

  16. Towards large-scale data analysis: challenges in the design of portable systems and use of Cloud computing.

    Science.gov (United States)

    Diaz, Javier; Arrizabalaga, Saioa; Bustamante, Paul; Mesa, Iker; Añorga, Javier; Goya, Jon

    2013-01-01

    Portable systems and global communications open a broad spectrum for new health applications. In the framework of electrophysiological applications, several challenges are faced when developing portable systems embedded in Cloud computing services. To guide new developers in this area, and based on our experience, five areas of interest are presented in this paper where strategies can be applied for improving the performance of portable systems: transducer and conditioning, processing, wireless communications, battery, and power management. Likewise, for Cloud services, scalability, portability, privacy and security guidelines have been highlighted.

  17. Solving algebraic computational problems in geodesy and geoinformatics the answer to modern challenges

    CERN Document Server

    Awange, Joseph L

    2004-01-01

    While preparing and teaching 'Introduction to Geodesy I and II' to undergraduate students at Stuttgart University, we noticed a gap which motivated the writing of the present book: Almost every topic that we taught required some skills in algebra, and in particular, computer algebra! From positioning to transformation problems inherent in geodesy and geoinformatics, knowledge of algebra and application of computer algebra software were required. In preparing this book therefore, we have attempted to put together basic concepts of abstract algebra which underpin the techniques for solving algebraic problems. Algebraic computational algorithms useful for solving problems which require exact solutions to nonlinear systems of equations are presented and tested on various problems. Though the present book focuses mainly on the two fields, the concepts and techniques presented herein are nonetheless applicable to other fields where algebraic computational problems might be encountered. In Engineering for example, network densification and robo...

  18. High-Performance Computing Opportunities and Challenges for Army R&D

    Science.gov (United States)

    2006-01-01

    ...missing body organs). It includes the engineering (design, fabrication, and testing) of genetically modified organisms (GMOs). Molecular Modeling...accurate modeling of the physiological processes present in organisms able to down-regulate their metabolism. Research has shown that sleep deprivation is

  19. The NASA Computational Fluid Dynamics (CFD) program - Building technology to solve future challenges

    Science.gov (United States)

    Richardson, Pamela F.; Dwoyer, Douglas L.; Kutler, Paul; Povinelli, Louis A.

    1993-01-01

    This paper presents the NASA Computational Fluid Dynamics program in terms of a strategic vision and goals as well as NASA's financial commitment and personnel levels. The paper also identifies the CFD program customers and the support to those customers. In addition, the paper discusses technical emphasis and direction of the program and some recent achievements. NASA's Ames, Langley, and Lewis Research Centers are the research hubs of the CFD program while the NASA Headquarters Office of Aeronautics represents and advocates the program.

  20. Big Data Analytics-Enhanced Cloud Computing: Challenges, Architectural Elements, and Future Directions

    OpenAIRE

    Buyya, Rajkumar; Ramamohanarao, Kotagiri; Leckie, Chris; Calheiros, Rodrigo N.; Dastjerdi, Amir Vahid; Versteeg, Steve

    2015-01-01

    The emergence of cloud computing has enabled dynamic provisioning of elastic capacity to applications on demand. Cloud data centers contain thousands of physical servers hosting orders of magnitude more virtual machines that can be allocated on demand to users in a pay-as-you-go model. However, not all systems are able to scale up by just adding more virtual machines. Therefore, it is essential, even for scalable systems, to project workloads in advance rather than using a purely reactive approa...

  1. Combining Brain–Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges

    Science.gov (United States)

    Millán, J. d. R.; Rupp, R.; Müller-Putz, G. R.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kübler, A.; Leeb, R.; Neuper, C.; Müller, K.-R.; Mattia, D.

    2010-01-01

    In recent years, new research has brought the field of electroencephalogram (EEG)-based brain–computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user–machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human–computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices. PMID:20877434

  2. Combining Brain-Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges.

    Science.gov (United States)

    Millán, J D R; Rupp, R; Müller-Putz, G R; Murray-Smith, R; Giugliemma, C; Tangermann, M; Vidaurre, C; Cincotti, F; Kübler, A; Leeb, R; Neuper, C; Müller, K-R; Mattia, D

    2010-01-01

    In recent years, new research has brought the field of electroencephalogram (EEG)-based brain-computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, "Communication and Control", "Motor Substitution", "Entertainment", and "Motor Recovery". We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user-machine adaptation algorithms, the exploitation of users' mental states for BCI reliability and confidence measures, the incorporation of principles in human-computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices.

  3. Combining Brain-Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges

    Directory of Open Access Journals (Sweden)

    José del R. Millán

    2010-09-01

    Full Text Available In recent years, new research has brought the field of EEG-based Brain-Computer Interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication & Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user-machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human-computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices.

  4. Defining dimensions of research readiness: a conceptual model for primary care research networks.

    Science.gov (United States)

    Carr, Helen; de Lusignan, Simon; Liyanage, Harshana; Liaw, Siaw-Teng; Terry, Amanda; Rafi, Imran

    2014-11-26

    Recruitment to research studies in primary care is challenging despite widespread implementation of electronic patient record (EPR) systems which potentially make it easier to identify eligible cases. We reviewed the literature and applied the learning from a European research readiness assessment tool, the TRANSFoRm International Research Readiness instrument (TIRRE), to the context of the English NHS in order to develop a model to assess a practice's research readiness. Seven dimensions of research readiness were identified: (1) Data readiness: Is there good data quality in EPR systems; (2) Record readiness: Are EPR data able to identify eligible cases and other study data; (3) Organisational readiness: Are the health system and socio-cultural environment supportive; (4) Governance readiness: Does the study meet legal and local health system regulatory compliance; (5) Study-specific readiness; (6) Business process readiness: Are business processes tilted in favour of participation, including capacity and capability to take on extra work, financial incentives as well as intangibles such as social and intellectual capital; (7) Patient readiness: Are systems in place to recruit patients and obtain informed consent? The model might enable the development of interventions to increase participation in primary care-based research and become a tool to measure the progress of practice networks towards the most advanced state of readiness.
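At its simplest, a model like this can be operationalized as a checklist that flags which dimensions block participation. The dimension names below come from the abstract; the boolean scoring scheme is an assumption for illustration and is not part of the TIRRE instrument:

```python
# Toy readiness checklist over the seven dimensions (hypothetical scores
# for one practice; a real assessment would be far more granular).
dimensions = {
    "data": True,
    "record": True,
    "organisational": False,
    "governance": True,
    "study_specific": True,
    "business_process": False,
    "patient": True,
}

ready = all(dimensions.values())
gaps = sorted(name for name, ok in dimensions.items() if not ok)
print(f"practice ready: {ready}; gaps: {gaps}")
```

Listing the gaps, rather than just a yes/no answer, is what makes such a model useful for targeting interventions.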

  5. Organisational readiness for introducing a performance management system

    Directory of Open Access Journals (Sweden)

    Michael Ochurub

    2012-09-01

    Full Text Available Orientation: The successful introduction of performance management systems to the public service requires careful measurement of readiness for change. Research purpose: This study investigated the extent to which employees were ready for change as an indication of whether their organisation was ready to introduce a performance management system (PMS). Motivation for the study: Introducing system changes in organisations depends on positive employee preconditions. There is some debate over whether organisations can facilitate these preconditions. This research investigates change readiness linked to the introduction of a PMS in a public sector organisation. The results add to the growing literature on levels of change readiness. Research design, approach and method: The researchers used a quantitative, questionnaire-based design. Because the organisation was large, the researchers used stratified sampling to select a sample from each population stratum. The sample size was 460, which constituted 26% of the total population. They used a South African change readiness questionnaire to elicit employee perceptions and opinions. Main findings: The researchers found that the organisation was not ready to introduce a PMS. The study identified various challenges and key factors that were negatively affecting the introduction of a PMS. Practical/managerial implications: The intention to develop and introduce performance management systems is generally to change the attitudes, values and approaches of managers and employees to the new strategies, processes and plans to improve productivity and performance. However, pre-existing conditions and attitudes could have an effect. It is essential to ensure that organisations are ready to introduce performance management systems and to provide sound change leadership to drive the process effectively.
This study contributes to the body of knowledge about the challenges and factors organisations should consider when they

  6. Challenges in exome analysis by LifeScope and its alternative computational pipelines.

    Science.gov (United States)

    Pranckevičiene, Erinija; Rančelis, Tautvydas; Pranculis, Aidas; Kučinskas, Vaidutis

    2015-09-07

    Every next generation sequencing (NGS) platform relies on proprietary and open source computational tools to analyze sequencing data. NGS tools for Illumina platforms are well documented, which is not the case with AB SOLiD systems. We applied several computational and variant calling pipelines to analyse targeted exome sequencing data obtained using the AB SOLiD 5500 system. Our investigated tools comprised the proprietary LifeScope pipeline in combination with open source color-space competent mapping programs and a variant caller. We present instrumental details of the pipelines that were used and quantitative comparative analysis of variant lists generated by LifeScope's pipeline versus open source tools. Sufficient coverage of targeted regions was achieved by all investigated pipelines. High variability was observed in identities of variants across the mapping programs. We observed less than 50% concordance of variant lists produced by approaches based on different mapping algorithms. We summarized different approaches with regards to coverage (DP) and quality (QUAL) properties of the variants provided by GATK and found that LifeScope's computational pipeline is superior. Fusion of information on mapping profiles (pileup) at genomic positions of variants in several different alignments proved to be a useful strategy to assess questionable singleton variants. We quantitatively supported the conclusion that LifeScope's pipeline is superior for processing sequencing data obtained by the AB SOLiD 5500 system. Nevertheless the use of alternative pipelines is encouraged because aggregation of information from other mapping and variant calling approaches helps to resolve questionable calls and increases the confidence of the call. It was noted that a coverage threshold for a variant to be considered for further analysis has to be chosen in a data-driven way to prevent a loss of important information.
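The concordance comparison reported here reduces, at its simplest, to set operations over variants keyed by (chromosome, position, ref allele, alt allele). The toy call sets below are hypothetical; real pipelines would parse the VCF output of each caller:

```python
# Two variant call sets, keyed by (chrom, pos, ref, alt).
calls_pipeline_a = {
    ("chr1", 12345, "A", "G"),
    ("chr1", 22222, "C", "T"),
    ("chr2", 33333, "G", "A"),
    ("chr2", 44444, "T", "C"),
}
calls_pipeline_b = {
    ("chr1", 12345, "A", "G"),
    ("chr2", 33333, "G", "A"),
    ("chr3", 55555, "A", "T"),
}

shared = calls_pipeline_a & calls_pipeline_b
union = calls_pipeline_a | calls_pipeline_b
concordance = len(shared) / len(union)   # Jaccard-style concordance
print(f"shared={len(shared)} union={len(union)} concordance={concordance:.2f}")
```

Discordant calls (the symmetric difference) are the ones the abstract suggests re-examining against pileup information before accepting or rejecting them.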

  7. Portable non-invasive brain-computer interface: challenges and opportunities of optical modalities

    Science.gov (United States)

    Scholl, Clara A.; Hendrickson, Scott M.; Swett, Bruce A.; Fitch, Michael J.; Walter, Erich C.; McLoughlin, Michael P.; Chevillet, Mark A.; Blodgett, David W.; Hwang, Grace M.

    2017-05-01

    The development of portable non-invasive brain computer interface technologies with higher spatio-temporal resolution has been motivated by the tremendous success seen with implanted devices. This talk will discuss efforts to overcome several major obstacles to viability including approaches that promise to improve spatial and temporal resolution. Optical approaches in particular will be highlighted and the potential benefits of both Blood-Oxygen Level Dependent (BOLD) and Fast Optical Signal (FOS) will be discussed. Early-stage research into the correlations between neural activity and FOS will be explored.

  8. An Australian Perspective On The Challenges For Computer And Network Security For Novice End-Users

    Directory of Open Access Journals (Sweden)

    Patryk Szewczyk

    2012-12-01

    Full Text Available It is common for end-users to have difficulty in using computer or network security appropriately and thus they have often been ridiculed when misinterpreting instructions or procedures. This discussion paper details the outcomes of research undertaken over the past six years on why security is overly complex for end-users. The results indicate that multiple issues may render end-users vulnerable to security threats and that there is no single solution to address these problems. Studies on a small group of senior citizens have shown that educational seminars can be beneficial in ensuring that simple security aspects are understood and used appropriately.

  9. Managing Military Readiness

    Science.gov (United States)

    2017-02-01

    unemployment rate (ages 16 and older). They also acknowledge several external determinants that include the population of eligible youth (for example, those...requirements, are vetted by the Joint Staff and globally staffed and approved through the Secretary of Defense. Indirect effects: They are caused by...clear explanation of the causes of readiness degradations and options for how to mitigate them that can be traced to precise resource investments

  10. The Computational Fluid Dynamics Rupture Challenge 2013--Phase II: Variability of Hemodynamic Simulations in Two Intracranial Aneurysms.

    Science.gov (United States)

    Berg, Philipp; Roloff, Christoph; Beuing, Oliver; Voss, Samuel; Sugiyama, Shin-Ichiro; Aristokleous, Nicolas; Anayiotos, Andreas S; Ashton, Neil; Revell, Alistair; Bressloff, Neil W; Brown, Alistair G; Chung, Bong Jae; Cebral, Juan R; Copelli, Gabriele; Fu, Wenyu; Qiao, Aike; Geers, Arjan J; Hodis, Simona; Dragomir-Daescu, Dan; Nordahl, Emily; Bora Suzen, Yildirim; Owais Khan, Muhammad; Valen-Sendstad, Kristian; Kono, Kenichi; Menon, Prahlad G; Albal, Priti G; Mierka, Otto; Münster, Raphael; Morales, Hernán G; Bonnefous, Odile; Osman, Jan; Goubergrits, Leonid; Pallares, Jordi; Cito, Salvatore; Passalacqua, Alberto; Piskin, Senol; Pekkan, Kerem; Ramalho, Susana; Marques, Nelson; Sanchi, Stéphane; Schumacher, Kristopher R; Sturgeon, Jess; Švihlová, Helena; Hron, Jaroslav; Usera, Gabriel; Mendina, Mariana; Xiang, Jianping; Meng, Hui; Steinman, David A; Janiga, Gábor

    2015-12-01

    With the increased availability of computational resources, the past decade has seen a rise in the use of computational fluid dynamics (CFD) for medical applications. There has been an increase in the application of CFD to attempt to predict the rupture of intracranial aneurysms; however, while many hemodynamic parameters can be obtained from these computations, no consistent methodology for the prediction of rupture has, to date, been identified. One particular challenge for CFD is that many factors contribute to its accuracy; the mesh resolution and spatial/temporal discretization can alone contribute to a variation in accuracy. This failure to identify the importance of these factors and to identify a methodology for the prediction of ruptures has limited the acceptance of CFD among physicians for rupture prediction. The International CFD Rupture Challenge 2013 seeks to comment on the sensitivity of rupture prediction to these various CFD assumptions by undertaking a comparison of the rupture and blood-flow predictions from a wide range of independent participants utilizing a range of CFD approaches. Twenty-six groups from 15 countries took part in the challenge. Participants were provided with surface models of two intracranial aneurysms and asked to carry out the corresponding hemodynamics simulations, free to choose their own mesh, solver, and temporal discretization. They were requested to submit velocity and pressure predictions along the centerline and on specified planes. The first phase of the challenge, described in a separate paper, was aimed at predicting which of the two aneurysms had previously ruptured and where the rupture site was located. The second phase, described in this paper, aims to assess the variability of the solutions and the sensitivity to the modeling assumptions. Participants were free to choose boundary conditions in the first phase, whereas they were prescribed in the second phase but all other CFD modeling parameters were not
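Quantifying the variability of such a multi-participant comparison is straightforward once the predictions are sampled at common points. The sketch below uses hypothetical centerline velocity predictions from four imagined participant solutions, not the challenge's actual data:

```python
import numpy as np

# Hypothetical centerline velocity predictions (m/s) from four participant
# solutions at five common sampling points.
predictions = np.array([
    [0.52, 0.61, 0.70, 0.58, 0.33],
    [0.50, 0.63, 0.74, 0.55, 0.30],
    [0.55, 0.60, 0.68, 0.60, 0.36],
    [0.49, 0.66, 0.77, 0.52, 0.28],
])

mean = predictions.mean(axis=0)
std = predictions.std(axis=0, ddof=1)   # sample std across solutions
cv = std / mean                         # coefficient of variation per point
print(f"max inter-solution CV: {cv.max():.1%}")
```

A point-by-point coefficient of variation like this makes it easy to see where along the vessel the solutions disagree most, which is typically in low-velocity, recirculating regions.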

  11. Role of High-End Computing in Meeting NASA's Science and Engineering Challenges

    Science.gov (United States)

    Biswas, Rupak; Tu, Eugene L.; Van Dalsem, William R.

    2006-01-01

    Two years ago, NASA was on the verge of dramatically increasing its HEC capability and capacity. With the 10,240-processor supercomputer, Columbia, now in production for 18 months, HEC has an even greater impact within the Agency and extending to partner institutions. Advanced science and engineering simulations in space exploration, shuttle operations, Earth sciences, and fundamental aeronautics research are occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. This talk describes how the integrated production environment fostered at the NASA Advanced Supercomputing (NAS) facility at Ames Research Center is accelerating scientific discovery, achieving parametric analyses of multiple scenarios, and enhancing safety for NASA missions. We focus on Columbia's impact on two key engineering and science disciplines: Aerospace, and Climate. We also discuss future mission challenges and plans for NASA's next-generation HEC environment.

  12. The high-rate data challenge: computing for the CBM experiment

    Science.gov (United States)

    Friese, V.; CBM Collaboration

    2017-10-01

    The Compressed Baryonic Matter experiment (CBM) is a next-generation heavy-ion experiment to be operated at the FAIR facility, currently under construction in Darmstadt, Germany. A key feature of CBM is its very high interaction rate, exceeding those of contemporary nuclear collision experiments by several orders of magnitude. Such interaction rates forbid a conventional, hardware-triggered readout; instead, experiment data will be freely streaming from self-triggered front-end electronics. In order to reduce the huge raw data volume to a recordable rate, data will be selected exclusively on CPU, which necessitates partial event reconstruction in real-time. Consequently, the traditional segregation of online and offline software vanishes; an integrated on- and offline data processing concept is called for. In this paper, we will report on concepts and developments for computing for CBM as well as on the status of preparations for its first physics run.
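The trigger-less, streaming readout described here can be caricatured in a few lines: free-streaming, timestamped hits are grouped into time slices, and only slices passing a software selection are kept. Everything below - detector names, slice length, and the activity threshold - is a hypothetical illustration, not CBM's actual selection logic:

```python
from collections import defaultdict

# Free-streaming hits: (timestamp in ns, detector subsystem), hypothetical.
hits = [(12, "STS"), (13, "TOF"), (48, "STS"), (51, "STS"),
        (52, "TOF"), (53, "RICH"), (90, "TOF")]

SLICE_NS = 25  # group hits into fixed-length time slices

slices = defaultdict(list)
for t, det in hits:
    slices[t // SLICE_NS].append(det)

# Software "trigger": keep only slices with enough coincident activity
# to suggest a physics event worth reconstructing and recording.
selected = {k: v for k, v in slices.items() if len(v) >= 3}
print(selected)
```

The real system performs partial event reconstruction inside each slice before deciding, but the structural point is the same: the selection is a software decision on streamed data, not a hardware trigger.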

  13. Tailored Interactive Multimedia Computer Programs to Reduce Health Disparities: Opportunities and Challenges

    Science.gov (United States)

    Jerant, Anthony; Sohler, Nancy; Fiscella, Kevin; Franks, Becca; Franks, Peter

    2010-01-01

    Objective: To review the theory and research evidence suggesting that tailored interactive multimedia computer programs (IMCPs) aimed at optimizing patient health behaviors could lessen socio-demographic health disparities. Methods: Selective critical review of research regarding IMCPs tailored to psychological mediators of behavior and their effects on health behavior and outcomes among socio-demographically disadvantaged patients. Results: Tailored IMCPs can address patient factors (e.g. language barriers, low self-efficacy) and buffer provider (e.g. cognitive bias) and health system (e.g. office visit time constraints) factors that contribute to poor provider-patient communication and, thereby, suboptimal health behaviors. Research indicates disadvantaged individuals' interactions with providers are disproportionately affected by such factors, and that their behaviors respond favorably to tailored information, thus suggesting tailored IMCPs could mitigate disparities. However, no randomized controlled trials (RCTs) have examined this question. The optimal design and deployment of tailored IMCPs for disadvantaged patients also requires further study. Conclusion: Preliminary research suggests tailored IMCPs have the potential to reduce health disparities. RCTs designed expressly to examine this issue are warranted. Practice implications: Many socio-demographic health disparities exist, and there is a dearth of proven disparity-reducing interventions. Thus, if tailored IMCPs were shown to lessen disparities, the public health implications would be considerable. PMID:21146950

  14. Challenging Data Management in CMS Computing with Network-aware Systems

    CERN Document Server

    Bonacorsi, Daniele

    2013-01-01

    After a successful first run at the LHC, and during the Long Shutdown (LS1) of the accelerator, the workload and data management sectors of the CMS Computing Model are entering into an operational review phase in order to concretely assess areas of possible improvement and paths to exploit new promising technology trends. In particular, since the preparation activities for the LHC start, the Networks have constantly been of paramount importance for the execution of CMS workflows, exceeding the original expectations - as from the MONARC model - in terms of performance, stability and reliability. The low-latency transfers of PetaBytes of CMS data among dozens of WLCG Tiers worldwide using the PhEDEx dataset replication system is an example of the importance of reliable Networks. Another example is the exploitation of WAN data access over data federations in CMS. A new emerging area of work is the exploitation of "Intelligent Network Services", including also bandwidt...

  15. Challenging data and workload management in CMS Computing with network-aware systems

    CERN Document Server

    Wildish, Anthony

    2014-01-01

    After a successful first run at the LHC, and during the Long Shutdown (LS1) of the accelerator, the workload and data management sectors of the CMS Computing Model are entering into an operational review phase in order to concretely assess area of possible improvements and paths to exploit new promising technology trends. In particular, since the preparation activities for the LHC start, the Networks have constantly been of paramount importance for the execution of CMS workflows, exceeding the original expectations - as from the MONARC model - in terms of performance, stability and reliability. The low-latency transfers of PetaBytes of CMS data among dozens of WLCG Tiers worldwide using the PhEDEx dataset replication system is an example of the importance of reliable Networks. Another example is the exploitation of WAN data access over data federations in CMS. A new emerging area of work is the exploitation of "Intelligent Network Services", including also bandwidth on demand concepts. In this paper, we will ...

  16. Challenging data and workload management in CMS Computing with network-aware systems

    Science.gov (United States)

    D, Bonacorsi; T, Wildish

    2014-06-01

    After a successful first run at the LHC, and during the Long Shutdown (LS1) of the accelerator, the workload and data management sectors of the CMS Computing Model are entering into an operational review phase in order to concretely assess area of possible improvements and paths to exploit new promising technology trends. In particular, since the preparation activities for the LHC start, the Networks have constantly been of paramount importance for the execution of CMS workflows, exceeding the original expectations - as from the MONARC model - in terms of performance, stability and reliability. The low-latency transfers of PetaBytes of CMS data among dozens of WLCG Tiers worldwide using the PhEDEx dataset replication system is an example of the importance of reliable Networks. Another example is the exploitation of WAN data access over data federations in CMS. A new emerging area of work is the exploitation of Intelligent Network Services, including also bandwidth on demand concepts. In this paper, we will review the work done in CMS on this, and the next steps.

  17. Fibromuscular dysplasia in living renal donors: Still a challenge to computed tomographic angiography

    Energy Technology Data Exchange (ETDEWEB)

    Blondin, D., E-mail: blondin@med.uni-duesseldorf.d [Institute of Radiology, University Hospital Duesseldorf, Moorenstr. 5, D-40225 Duesseldorf (Germany); Lanzman, R.; Schellhammer, F. [Institute of Radiology, University Hospital Duesseldorf, Moorenstr. 5, D-40225 Duesseldorf (Germany); Oels, M. [Department of Nephrology (Germany); Grotemeyer, D. [Department of Vascular Surgery and Renal Transplantation (Germany); Baldus, S.E. [Institute of Pathology (Germany); Rump, L.C. [Department of Nephrology (Germany); Sandmann, W. [Department of Vascular Surgery and Renal Transplantation (Germany); Voiculescu, A. [Department of Nephrology (Germany)

    2010-07-15

    Background: Computed tomographic angiography (CTA) has become the standard method for evaluating potential living renal donors in most centers. Although the incidence of fibromuscular dysplasia (FMD) is low (3.5-6%), this pathology may be relevant for the success of renal transplantation. The aims of this study were the incidence of FMD in our population of living renal donors and the reliability of CTA for detecting vascular pathology. Materials and methods: 101 living renal donors, examined between 7/2004 and 9/2008 by CTA, were included in a retrospective evaluation. The examinations were carried out using a 64-row multi-detector CT (Siemens Medical Solutions, Erlangen). The presence or absence of the characteristic signs of fibromuscular dysplasia, such as a 'string-of-beads' appearance, focal stenosis or aneurysms, was assessed and graded from mild (=1) to severe (=3). Furthermore, vascular anatomy and arterial stenosis were investigated in this study. Retrospective analyses of CTA and ultrasound were compared with operative and histological reports. Results: Four cases of fibromuscular dysplasia (incidence 3.9%) in 101 renal donors were diagnosed by the transplanting surgeons and histopathology, respectively. Three cases could be detected by CTA; in one donor even retrospective analysis of CTA was negative. Ten accessory arteries, 14 venous anomalies and 12 renal artery stenoses due to atherosclerosis were diagnosed by CTA and could be confirmed by the operative report. Conclusion: CTA is sufficient for the detection of hemodynamically relevant stenoses and vascular anatomy. Only one patient, with a mild form of FMD, was underestimated. Therefore, if CTA shows the slightest irregularities which are not typical for atherosclerotic lesions, further diagnostic work-up by DSA might still be necessary.

  18. Big data challenges in decoding cortical activity in a human with quadriplegia to inform a brain computer interface.

    Science.gov (United States)

    Friedenberg, David A; Bouton, Chad E; Annetta, Nicholas V; Skomrock, Nicholas; Mingming Zhang; Schwemmer, Michael; Bockbrader, Marcia A; Mysiw, W Jerry; Rezai, Ali R; Bresler, Herbert S; Sharma, Gaurav

    2016-08-01

    Recent advances in Brain Computer Interfaces (BCIs) have created hope that one day paralyzed patients will be able to regain control of their paralyzed limbs. As part of an ongoing clinical study, we have implanted a 96-electrode Utah array in the motor cortex of a paralyzed human. The array generates almost 3 million data points from the brain every second. This presents several big data challenges for developing algorithms that must not only process the data in real time (for the BCI to be responsive) but also be robust to temporal variations and non-stationarities in the sensor data. We demonstrate an algorithmic approach to analyzing such data and present a novel method for evaluating such algorithms. We present our methodology with examples of decoding human brain data in real time to inform a BCI.
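
    The headline figure of almost 3 million data points per second follows from simple channel-count arithmetic. The 30 kHz per-channel sampling rate below is an assumption typical of intracortical recording systems, not a value taken from the paper:

```python
# Data-rate arithmetic behind the "almost 3 million data points per
# second" figure. SAMPLE_RATE_HZ is an assumed value typical of
# intracortical recording hardware, not one stated in the abstract.

CHANNELS = 96            # Utah array electrode count (from the abstract)
SAMPLE_RATE_HZ = 30_000  # assumed per-channel sampling rate

samples_per_second = CHANNELS * SAMPLE_RATE_HZ
print(samples_per_second)  # 2_880_000, i.e. almost 3 million

# Real-time budget: a decoder updating every 100 ms must consume
samples_per_update = samples_per_second // 10
print(samples_per_update)  # 288_000 samples per decode cycle
```

    The per-update budget is what makes responsiveness a big-data problem: every decode cycle must digest hundreds of thousands of samples before the next batch arrives.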

  19. Career Readiness: Has Its Time Finally Come?

    Science.gov (United States)

    DeWitt, Stephen

    2012-01-01

    In 2010, the Association for Career and Technical Education (ACTE) released a "What Is Career Ready?" definition. As the career-readiness definition explains, there is much overlap between "college readiness" and "career readiness," but academic preparedness for college alone is not enough to be truly career-ready.…

  20. SAMPL4, a blind challenge for computational solvation free energies: the compounds considered

    Science.gov (United States)

    Guthrie, J. Peter

    2014-03-01

    For the fifth time I have provided a set of solvation energies (1 M gas to 1 M aqueous) for a SAMPL challenge. In this set there are 23 blind compounds and 30 supplementary compounds of related structure to one of the blind sets, but for which the solvation energy is readily available. The best current values for each compound are presented along with complete documentation of the experimental origins of the solvation energies. The calculations needed to go from reported data to solvation energies are presented, with particular attention to aspects which are new to this set. For some compounds the vapor pressures (VP) were reported for the liquid compound, which is solid at room temperature. To correct from VP(subcooled liquid) to VP(sublimation) requires ΔS(fusion), which is only known for mannitol. Estimated values were used for the others, all but one of which were benzene derivatives and expected to have very similar values. The final compound for which ΔS(fusion) was estimated was menthol, which melts at 42 °C, so that modest errors in ΔS(fusion) will have little effect. It was also necessary to look into the effects of including estimated values of ΔCp on this correction. The approximate sizes of the effects of including ΔCp in the correction from VP(subcooled liquid) to VP(sublimation) were estimated, and it was noted that including ΔCp invariably makes ΔG(S) more positive. To extend the set of compounds for which the solvation energy could be calculated, we explored the use of boiling point (b.p.) data from Reaxys/Beilstein as a substitute for studies of the VP as a function of temperature. B.p. data are not always reliable, so it was necessary to develop a criterion for rejecting outliers. For two compounds (chlorinated guaiacols) it became clear that inclusion represented overreach; for each there were only two independent pressure-temperature points, which is too few for a trustworthy extrapolation. For a number of compounds the extrapolation from lowest
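
    The subcooled-liquid-to-sublimation correction described above can be sketched as follows, using the approximation ΔG(fusion)(T) ≈ ΔS(fusion)·(Tm − T) and neglecting the ΔCp term. The inputs are illustrative, not values from the SAMPL4 set:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def vp_sublimation(vp_subcooled_liquid, dS_fus, T_melt, T):
    """Correct a subcooled-liquid vapor pressure to a sublimation pressure.

    Uses dG_fus(T) ~ dS_fus * (T_melt - T) and neglects the dCp term
    (including dCp would, as the abstract notes, make dG(S) more positive).
    """
    dG_fus = dS_fus * (T_melt - T)  # J/mol, approximate free energy of fusion
    return vp_subcooled_liquid * math.exp(-dG_fus / (R * T))

# Illustrative inputs: a Walden's-rule entropy of fusion (~56.5 J/(mol K)),
# a melting point of 315 K, and room temperature.
ratio = vp_sublimation(1.0, dS_fus=56.5, T_melt=315.0, T=298.15)
print(round(ratio, 3))  # the solid's vapor pressure is roughly 2/3 of the liquid's
```

    For a compound melting only slightly above room temperature (such as menthol), Tm − T is small and the correction factor stays close to 1, which is why modest errors in ΔS(fusion) matter little there.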

  1. "Ready to Acquire"

    DEFF Research Database (Denmark)

    Yetton, Philip; Henningsson, Stefan; Bjørn-Andersen, Niels

    2013-01-01

    This article describes the experiences of Danisco (a global food ingredients company) as it followed a growth-by-acquisition business strategy, focusing on how a new CIO built the IT resources to ensure the IT organization was "ready to acquire." We illustrate how these IT capabilities expedited the IT integration following two acquisitions, one of which involved Danisco expanding the scale of its business and the other extending the scope. Based on insights gained from Danisco, we provide lessons for CIOs to realize business benefits when managing post-acquisition IT integration.

  2. Standardized evaluation of algorithms for computer-aided diagnosis of dementia based on structural MRI: the CADDementia challenge.

    Science.gov (United States)

    Bron, Esther E; Smits, Marion; van der Flier, Wiesje M; Vrenken, Hugo; Barkhof, Frederik; Scheltens, Philip; Papma, Janne M; Steketee, Rebecca M E; Méndez Orellana, Carolina; Meijboom, Rozanna; Pinto, Madalena; Meireles, Joana R; Garrett, Carolina; Bastos-Leite, António J; Abdulkadir, Ahmed; Ronneberger, Olaf; Amoroso, Nicola; Bellotti, Roberto; Cárdenas-Peña, David; Álvarez-Meza, Andrés M; Dolph, Chester V; Iftekharuddin, Khan M; Eskildsen, Simon F; Coupé, Pierrick; Fonov, Vladimir S; Franke, Katja; Gaser, Christian; Ledig, Christian; Guerrero, Ricardo; Tong, Tong; Gray, Katherine R; Moradi, Elaheh; Tohka, Jussi; Routier, Alexandre; Durrleman, Stanley; Sarica, Alessia; Di Fatta, Giuseppe; Sensi, Francesco; Chincarini, Andrea; Smith, Garry M; Stoyanov, Zhivko V; Sørensen, Lauge; Nielsen, Mads; Tangaro, Sabina; Inglese, Paolo; Wachinger, Christian; Reuter, Martin; van Swieten, John C; Niessen, Wiro J; Klein, Stefan

    2015-05-01

    Algorithms for computer-aided diagnosis of dementia based on structural MRI have demonstrated high performance in the literature, but are difficult to compare as different data sets and methodology were used for evaluation. In addition, it is unclear how the algorithms would perform on previously unseen data, and thus, how they would perform in clinical practice when there is no real opportunity to adapt the algorithm to the data at hand. To address these comparability, generalizability and clinical applicability issues, we organized a grand challenge that aimed to objectively compare algorithms based on a clinically representative multi-center data set. Using clinical practice as the starting point, the goal was to reproduce the clinical diagnosis. Therefore, we evaluated algorithms for multi-class classification of three diagnostic groups: patients with probable Alzheimer's disease, patients with mild cognitive impairment and healthy controls. The diagnosis based on clinical criteria was used as reference standard, as it was the best available reference despite its known limitations. For evaluation, a previously unseen test set was used consisting of 354 T1-weighted MRI scans with the diagnoses blinded. Fifteen research teams participated with a total of 29 algorithms. The algorithms were trained on a small training set (n=30) and optionally on data from other sources (e.g., the Alzheimer's Disease Neuroimaging Initiative, the Australian Imaging Biomarkers and Lifestyle flagship study of aging). The best performing algorithm yielded an accuracy of 63.0% and an area under the receiver-operating-characteristic curve (AUC) of 78.8%. In general, the best performances were achieved using feature extraction based on voxel-based morphometry or a combination of features that included volume, cortical thickness, shape and intensity. The challenge is open for new submissions via the web-based framework: http://caddementia.grand-challenge.org. Copyright © 2015 Elsevier Inc.
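
    The challenge's headline metric, accuracy over the three diagnostic groups, is straightforward to reproduce; the labels below are hypothetical, not challenge data:

```python
# Minimal sketch of the challenge's multi-class evaluation: accuracy over
# the three diagnostic groups. All labels here are hypothetical examples.
from collections import Counter

CLASSES = ("AD", "MCI", "HC")  # Alzheimer's, mild cognitive impairment, controls

def accuracy(y_true, y_pred):
    """Fraction of scans assigned the correct diagnostic group."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def per_class_counts(y_true, y_pred):
    """Confusion counts, useful to see which group drives the errors."""
    return Counter(zip(y_true, y_pred))

y_true = ["AD", "AD", "MCI", "HC", "HC", "MCI"]
y_pred = ["AD", "MCI", "MCI", "HC", "AD", "MCI"]
print(accuracy(y_true, y_pred))  # 4 of 6 correct
```

    With three classes, chance-level accuracy is about 33%, which puts the winning 63.0% in perspective.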

  3. Current challenges facing the translation of brain computer interfaces from preclinical trials to use in human patients

    Directory of Open Access Journals (Sweden)

    Maxwell D. Murphy

    2016-01-01

    Current research in brain computer interface (BCI) technology is advancing beyond preclinical studies, with trials beginning in human patients. To date, these trials have been carried out with several different types of recording interfaces. The success of these devices has varied widely, but different factors, such as the level of invasiveness, the timescale of recorded information, and the ability to maintain stable functionality of the device over a long period of time, must all be considered in addition to accuracy in decoding intent when assessing the most practical type of device moving forward. Here, we discuss various approaches to BCIs, distinguishing between devices focusing on control of operations extrinsic to the subject (e.g., prosthetic limbs, computer cursors) and those focusing on control of operations intrinsic to the brain (e.g., using stimulation or external feedback), including closed-loop or adaptive devices. In this discussion, we consider the current challenges facing the translation of various types of BCI technology to eventual human application.

  4. Challenges for computed tomography of overweight patients; Herausforderungen an die Computertomographie bei uebergewichtigen Patienten

    Energy Technology Data Exchange (ETDEWEB)

    Bamberg, F.; Marcus, R.; Nikolaou, K.; Becker, C.R.; Reiser, M.F.; Johnson, T. [Klinikum der Ludwig-Maximilians-Universitaet Muenchen, Campus Grosshadern, Institut fuer Klinische Radiologie, Muenchen (Germany); Petersilka, M. [Siemens AG, Computed Tomography, Forchheim, Healthcare Sector, Imaging and Therapy Division, Forchheim (Germany)

    2011-05-15

    In morbidly obese patients, computed tomography frequently represents the only viable option for non-invasive imaging diagnostics. The aim of this study was to analyze the weight limits, dose and image quality with standard CT scanners and to determine the diagnostic value and dose with a dual-source XXL mode. A total of 15 patients (average body weight 189.6±42 kg) were retrospectively identified who had been examined with the XXL mode. Of these patients, 7 (average body weight 176.4±56 kg) had been examined using both the XXL and standard protocols, allowing for an intraindividual comparison in this subcollective. Additionally, 14 patients weighing between 90 and 150 kg (average 106.1±19 kg) examined with standard protocols were included as references. Dose, image noise and subjectively assessed image quality (rating scale 1-4) were determined. Additionally, a large abdomen phantom of 48 cm diameter was examined with both protocols at an equivalent tube current-time product in order to compare the dose efficiency. The patient groups differed significantly in dose (CTDI(vol): XXL 72.9±23 versus standard 16.7±11 mGy; intraindividual 64.1±20 versus 27.0±15 mGy). The image noise was generally somewhat higher in the XXL group but significantly lower in the intraindividual comparison (liver 24.2±14 HU versus 36.3±20 HU; p=0.03; fat 15.5±8 HU versus 26.2±12 HU; p=0.02). With ratings of 1.9±0.7 and 1.8±0.7, image quality did not differ significantly in general, whereas there was a clear difference in the intraindividual comparison (1.8±0.8 versus 3.0±1.2), and only the XXL protocol achieved diagnostic quality in all cases, while 43% of the examinations with the standard protocol were rated as non-diagnostic. The quantification of dose efficiency in the phantom scans yielded no significant difference between the protocols. Up to 150 kg body weight, CT can be performed with the standard technique at 120 kVp with tube current
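
    When comparing protocols at different dose levels, the standard rule of thumb that CT image noise scales roughly as 1/√dose puts the reported noise figures in context. The example values echo the intraindividual comparison above, but the calculation itself is only an illustrative sketch, not the study's analysis:

```python
import math

# Rule-of-thumb noise-dose relation for CT: image noise scales roughly
# as 1/sqrt(dose). This is an illustrative sketch, not the study's method.

def expected_noise(noise_ref, dose_ref_mgy, dose_new_mgy):
    """Predicted noise at a new dose, given noise at a reference dose."""
    return noise_ref * math.sqrt(dose_ref_mgy / dose_new_mgy)

# Example: 36 HU noise at 27 mGy; raising the dose to 64 mGy
# (roughly the intraindividual CTDI(vol) ratio above) predicts:
print(round(expected_noise(36.0, 27.0, 64.0), 1))
```

    The predicted value falls close to the liver noise actually reported for the higher-dose XXL protocol, which is consistent with the rule of thumb.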

  5. Preparing Global-Ready Teachers

    Science.gov (United States)

    Larson, Lotta; Brown, Jennifer S.

    2017-01-01

    To produce global-ready students who can thrive and compete in an interconnected world, we must prepare global-ready teachers. This article shares how one teacher preparation program focuses on literacy, technology, and globalization, while offering relevant K-12 applications.

  6. Assessing Online Readiness of Students

    Science.gov (United States)

    Doe, Raymond; Castillo, Matthew S.; Musyoka, Millicent M.

    2017-01-01

    The rise of distance education has created a need to understand students' readiness for online learning and to predict their success. Several survey instruments have been developed to assess this construct of online readiness. However, a review of the extant literature shows that these instruments have varying limitations in capturing all of the…

  7. Building America Top Innovations 2013 Profile – Zero Energy-Ready Single-Family Homes

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2013-09-01

    Building homes that are zero energy-ready is a goal of the U.S. Department of Energy’s Building America program and one embodied in Building America’s premier home certification program, the Challenge Home program. This case study describes several examples of successful zero energy-ready home projects completed by Building America teams and partner builders.

  8. DOE Zero Energy Ready Home Case Study: Palo Duro Homes — Palo Duro Homes, Albuquerque, NM

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2014-09-01

    This builder was honored for Most DOE Zero Energy Ready Homes Built in the 2014 Housing Innovation Awards. By July 2014, Palo Duro had completed 152 homes since the program began in 2013 (under the original program title DOE Challenge Home), all of them certified to the stringent efficiency requirements of DOE’s Zero Energy Ready Home program.

  9. The Readiness of Sorsogon State College Faculty for Teaching with ICT: Basis for a Faculty Training Program

    Directory of Open Access Journals (Sweden)

    Catherine A. De Castro

    2016-02-01

    Information and communication technologies (ICT) such as computers, multimedia systems, productivity software, and the Internet have greatly improved the performance of different organizations and influenced higher learning institutions like Sorsogon State College (SSC) to develop and implement innovative teaching and learning methods. However, despite the many benefits of ICT when used in education, there are still faculty members who do not use these technologies for teaching. Hence, this research was conducted to assess their readiness for teaching with ICT. Findings revealed that most of the surveyed respondents were above forty-five years old, had 1-10 years of government service, and had specialization in the field of education. In terms of readiness to teach with ICT, the results disclosed that they were fairly ready along human-resource readiness, ready along technological-skill readiness, and much ready along equipment readiness. Their age was not significantly related to their human-resource readiness but was significantly related to their technological-skill and equipment readiness. The respondents' number of years in the government was significantly related to their readiness to teach with ICT in terms of human-resource, technological-skill, and equipment readiness. Their field of specialization was not significantly related to their readiness to teach with ICT. Among the most identified factors why some of them do not use ICT resources were the unavailability of ICT resources, lack of knowledge and lack of familiarity with ICT. The output of this research is a faculty training program to enhance their know

  10. Individual Health Readiness - a Leadership Responsibility

    National Research Council Canada - National Science Library

    Lyon, Joan

    1999-01-01

    .... This creates a mandate to enhance individual soldier readiness. Army leaders must learn to provide a comprehensive program to promote individual readiness through soldier preventive maintenance (Soldier PM...

  11. Get ready for physics

    CERN Document Server

    Adelson, Edward

    2011-01-01

    Get Ready for Physics helps science students quickly prepare for their introductory physics course, either algebra-based or calculus-based. It provides useful tools for future success in the course. The booklet gives students tips on recognizing their individual learning styles and helps them maximize their study time. It helps them review the basic mathematics they will need for the course, including ratios, proportions, and graphs. It gives them a bird's-eye preview of the major concepts and physical models so they start the course with a broad perspective of the key physical ideas and the knowledge of important terms that give students most trouble. The booklet concludes with a strong chapter on solving physics problems, replete with practice problems and examples, and with insights into answering conceptual and estimation type questions.

  12. The benefits and challenges of using computer-assisted symptom assessments in oncology clinics: results of a qualitative assessment.

    Science.gov (United States)

    Mark, Tami L; Johnson, Gina; Fortner, Barry; Ryan, Katheryn

    2008-10-01

    Developed for clinical use in oncology settings, the Patient Assessment, Care & Education (PACE) System is a computer technology tool designed to address the under-identification and treatment of chemotherapy-related symptoms. This system includes general core questions together with the Patient Care Monitor (PCM), a validated questionnaire that assesses patient-reported problems, six symptom burden indices, and one global quality of life index. The system automatically scores the PCM and generates a written report. The objective of this study was to assess the manner in which clinicians use this system and identify the benefits and challenges that oncology clinics may face when adopting it. The study was part of a larger evaluation of the system that included standardized surveys and chart review. Sixteen providers (physicians, nurses, and physician assistants) at 13 community oncology clinics participated in a 30-minute interview. Responses were coded according to common phrases or concepts. Clinicians indicated that they use the system mainly for symptom assessment or review of systems. The most common benefits identified included the improved ability to identify under-reported symptoms, enhanced communication with patients, increased efficiency, and the ability to highlight patients' most bothersome symptoms. Challenges included patient burden from the frequent need to answer the questionnaires, issues with the wording and formatting of the screening questionnaire, and technical difficulties. In sum, these interviews suggest that electronic symptom assessments offer potential advantages in terms of improving the integration of routine assessment of patients' symptoms and health-related quality of life into the daily flow of an oncology clinic. The approach should receive additional research and development attention.

  13. A practical implementation science heuristic for organizational readiness: R = MC²

    Science.gov (United States)

    Cook, Brittany S.; Lamont, Andrea; Wandersman, Abraham; Castellow, Jennifer; Katz, Jason; Beidas, Rinad S.

    2015-01-01

    There are many challenges when an innovation (i.e., a program, process, or policy that is new to an organization) is actively introduced into an organization. One critical component for successful implementation is the organization's readiness for the innovation. In this article, we propose a practical implementation science heuristic, abbreviated as R = MC². We propose that organizational readiness involves: 1) the motivation to implement an innovation, 2) the general capacities of an organization, and 3) the innovation-specific capacities needed for a particular innovation. Each of these components can be assessed independently and be used formatively. The heuristic can be used by organizations to assess readiness to implement and by training and technical assistance providers to help build organizational readiness. We present an illustration of the heuristic by showing how behavioral health organizations differ in readiness to implement a peer specialist initiative. Implications for research and practice of organizational readiness are discussed. PMID:26668443
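
    As an illustration only, the R = MC² heuristic can be turned into a toy scoring function. The 1-5 scale, the multiplicative form, and the normalization below are assumptions made for this sketch; the article presents R = MC² as a heuristic, not a quantitative formula:

```python
# Toy scoring sketch of the R = MC^2 heuristic: readiness (R) as a
# combination of motivation (M) and two kinds of capacity (C).
# The 1-5 scale, weights, and multiplicative form are assumptions for
# illustration; the source treats R = MC^2 as a heuristic, not a formula.

def readiness(motivation, general_capacity, innovation_capacity):
    """Each component rated 1-5; returns a 0-1 readiness score."""
    for score in (motivation, general_capacity, innovation_capacity):
        if not 1 <= score <= 5:
            raise ValueError("component scores must be on a 1-5 scale")
    # Multiplicative form: a very low score on any one component
    # drags overall readiness down, mirroring the heuristic's intent.
    return (motivation * general_capacity * innovation_capacity) / 5**3

# Two hypothetical behavioral health organizations considering a
# peer specialist initiative (the illustration used in the article):
print(readiness(5, 4, 2))  # motivated and capable, but innovation-specific gaps
print(readiness(3, 3, 3))  # middling across the board
```

    The multiplicative design choice reflects the idea that the three components are jointly necessary: strong motivation cannot compensate for absent innovation-specific capacity.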

  14. A practical implementation science heuristic for organizational readiness: R = MC².

    Science.gov (United States)

    Scaccia, Jonathan P; Cook, Brittany S; Lamont, Andrea; Wandersman, Abraham; Castellow, Jennifer; Katz, Jason; Beidas, Rinad S

    2015-04-01

    There are many challenges when an innovation (i.e., a program, process, or policy that is new to an organization) is actively introduced into an organization. One critical component for successful implementation is the organization's readiness for the innovation. In this article, we propose a practical implementation science heuristic, abbreviated as R = MC². We propose that organizational readiness involves: 1) the motivation to implement an innovation, 2) the general capacities of an organization, and 3) the innovation-specific capacities needed for a particular innovation. Each of these components can be assessed independently and be used formatively. The heuristic can be used by organizations to assess readiness to implement and by training and technical assistance providers to help build organizational readiness. We present an illustration of the heuristic by showing how behavioral health organizations differ in readiness to implement a peer specialist initiative. Implications for research and practice of organizational readiness are discussed.

  15. Analysis, biomedicine, collaboration, and determinism challenges and guidance: wish list for biopharmaceuticals on the interface of computing and statistics.

    Science.gov (United States)

    Goodman, Arnold F

    2011-11-01

    I have personally witnessed processing advance from desk calculators and mainframes, through timesharing and PCs, to supercomputers and cloud computing. I have also witnessed resources grow from too little data into almost too much data, and from theory dominating data into data beginning to dominate theory while needing new theory. Finally, I have witnessed problems advance from simple in a lone discipline into becoming almost too complex in multiple disciplines, as well as approaches evolve from analysis driving solutions into solutions by data mining beginning to drive the analysis itself. How we do all of this has transitioned from competition overcoming collaboration into collaboration starting to overcome competition, as well as what is done being more important than how it is done has transitioned into how it is done becoming as important as what is done. In addition, what or how we do it being more important than what or how we should actually do it has shifted into what or how we should do it becoming just as important as what or how we do it, if not more so. Although we have come a long way in both our methodology and technology, are they sufficient for our current or future complex and multidisciplinary problems with their massive databases? Since the apparent answer is not a resounding yes, we are presented with tremendous challenges and opportunities. This personal perspective adapts my background and experience to be appropriate for biopharmaceuticals. In these times of exploding change, informed perspectives on what challenges should be explored with accompanying guidance may be even more valuable than the far more typical literature reviews in conferences and journals of what has already been accomplished without challenges or guidance. Would we believe that an architect who designs a skyscraper determines the skyscraper's exact exterior, interior and furnishings or only general characteristics? Why not increase dependability of conclusions in

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  17. Computer-aided diagnosis of diagnostically challenging lesions in breast MRI: a comparison between a radiomics and a feature-selective approach

    Science.gov (United States)

    Hoffmann, Sebastian; Lobbes, Marc; Houben, Ivo; Pinker-Domenig, Katja; Wengert, Georg; Burgeth, Bernhard; Meyer-Bäse, Uwe; Lemaitre, Guillaume; Meyer-Baese, Anke

    2016-05-01

    Diagnostically challenging lesions pose a challenge both for radiological reading and for current CAD systems. They are poorly defined in both morphology (geometric shape) and kinetics (temporal enhancement), which complicates lesion detection and classification. Their strong phenotypic differences can be visualized by MRI. Radiomics represents a novel approach to achieving a detailed quantification of tumor phenotypes by analyzing a large number of image descriptors. In this paper, we apply a quantitative radiomics approach based on shape, texture and kinetic tumor features and evaluate it, in comparison to a reduced-order feature approach, in a computer-aided diagnosis system applied to diagnostically challenging lesions.
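    The radiomics idea described above, quantifying a lesion by computing many image descriptors from a region of interest, can be illustrated with a toy extractor. A minimal sketch in Python/NumPy; the three descriptors and the 16-bin histogram are illustrative assumptions, not the paper's actual feature set:

```python
import numpy as np

def simple_descriptors(roi):
    # A few generic radiomics-style intensity descriptors for a 2-D ROI.
    # Illustrative only: real radiomics pipelines compute hundreds of
    # shape, texture and kinetic features.
    flat = roi.ravel().astype(float)
    hist, _ = np.histogram(flat, bins=16)  # 16 bins is an arbitrary choice
    p = hist / hist.sum()
    p = p[p > 0]                           # drop empty bins before the log
    return {
        "mean": flat.mean(),               # first-order intensity statistics
        "variance": flat.var(),
        "entropy": float(-(p * np.log2(p)).sum()),  # histogram (Shannon) entropy
    }

# Tiny synthetic "lesion" patch
roi = np.array([[0, 1, 2, 3],
                [1, 2, 3, 4],
                [2, 3, 4, 5],
                [3, 4, 5, 6]])
print(simple_descriptors(roi))
```

A reduced-order approach, as compared in the paper, would keep only a small, selected subset of such features.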

  18. Technical Challenges and Lessons from the Migration of the GLOBE Data and Information System to Utilize Cloud Computing Service

    Science.gov (United States)

    Moses, J. F.; Memarsadeghi, N.; Overoye, D.; Littlefield, B.

    2016-12-01

    The Global Learning and Observation to Benefit the Environment (GLOBE) Data and Information System supports an international science and education program with capabilities to accept local environment observations, archive them, and display and visualize them along with global satellite observations. Since its inception twenty years ago, the Web and database system has been upgraded periodically to accommodate changes in technology and the steady growth of GLOBE's education community and collection of observations. Recently, near the end-of-life of the system hardware, new commercial computer platform options were explored and a decision was made to utilize Cloud services. The GLOBE DIS has now been fully deployed and maintained using Amazon Cloud services for over two years. This paper reviews the early risks, actual challenges, and some unexpected findings from the GLOBE DIS migration. We describe the plans, cost drivers and estimates, highlight adjustments that were made, and suggest improvements. We present the trade studies for provisioning, load balancing, networks, processing, storage, as well as production, staging and backup systems. We outline the migration team's skills and required level of effort for the transition, and the resulting changes in the overall maintenance and operations activities. Examples include incremental adjustments to processing capacity and frequency of backups, and efforts previously expended on hardware maintenance that were refocused onto application-specific enhancements.

  19. Technical Challenges and Lessons from the Migration of the GLOBE Data and Information System to Utilize Cloud Computing Service

    Science.gov (United States)

    Moses, John F.; Memarsadeghi, Nargess; Overoye, David; Littlefield, Brain

    2017-01-01

    The Global Learning and Observation to Benefit the Environment (GLOBE) Data and Information System supports an international science and education program with capabilities to accept local environment observations, archive them, and display and visualize them along with global satellite observations. Since its inception twenty years ago, the Web and database system has been upgraded periodically to accommodate changes in technology and the steady growth of GLOBE's education community and collection of observations. Recently, near the end-of-life of the system hardware, new commercial computer platform options were explored and a decision was made to utilize Cloud services. The GLOBE DIS has now been fully deployed and maintained using Amazon Cloud services for over two years. This paper reviews the early risks, actual challenges, and some unexpected findings from the GLOBE DIS migration. We describe the plans, cost drivers and estimates, highlight adjustments that were made, and suggest improvements. We present the trade studies for provisioning, load balancing, networks, processing, storage, as well as production, staging and backup systems. We outline the migration team's skills and required level of effort for the transition, and the resulting changes in the overall maintenance and operations activities. Examples include incremental adjustments to processing capacity and frequency of backups, and efforts previously expended on hardware maintenance that were refocused onto application-specific enhancements.

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  1. Operational readiness review phase-1 final report for WRAP-1

    Energy Technology Data Exchange (ETDEWEB)

    Bowen, W., Westinghouse Hanford

    1996-12-27

    This report documents the Operational Readiness Review for WRAP-1 Phase-1 operations. The report includes all criteria and lines of inquiry, with the resulting Findings and Observations. The review included assessing the operational capability of the organization and of the computer-controlled process and facility systems.

  2. Naval Warfare Doctrine-Is it Ready for the 21st Century?

    Science.gov (United States)

    1996-06-01

    ...world politics, the opportunity and responsibility to ensure that naval doctrine is ready for the demanding command and control challenges of the high-tech multinational world of 21st Century maritime...

  3. Ready, Set, Change! Development and usability testing of an online readiness for change decision support tool for healthcare organizations.

    Science.gov (United States)

    Timmings, Caitlyn; Khan, Sobia; Moore, Julia E; Marquez, Christine; Pyka, Kasha; Straus, Sharon E

    2016-02-24

    To address challenges related to selecting a valid, reliable, and appropriate readiness assessment measure in practice, we developed an online decision support tool to aid frontline implementers in healthcare settings in this process. The focus of this paper is to describe a multi-step, end-user driven approach to developing this tool for use during the planning stages of implementation. A multi-phase, end-user driven approach was used to develop and test the usability of a readiness decision support tool. First, readiness assessment measures that are valid, reliable, and appropriate for healthcare settings were identified from a systematic review. Second, a mapping exercise was performed to categorize individual items of included measures according to key readiness constructs from an existing framework. Third, a modified Delphi process was used to collect stakeholder ratings of the included measures on domains of feasibility, relevance, and likelihood to recommend. Fourth, two versions of a decision support tool prototype were developed and evaluated for usability. Nine valid and reliable readiness assessment measures were included in the decision support tool. The mapping exercise revealed that of the nine measures, most (78%) focused on assessing readiness for change at the organizational rather than the individual level, and that four measures (44%) represented all constructs of organizational readiness. During the modified Delphi process, stakeholders rated most measures as feasible and relevant for use in practice, and reported that they would be likely to recommend use of most measures. Using data from the mapping exercise and stakeholder panel, an algorithm was developed to link users to a measure based on characteristics of their organizational setting and their readiness for change assessment priorities. Usability testing yielded recommendations that were used to refine the Ready, Set, Change! decision support tool...

  4. Checklist for clinical readiness published

    Science.gov (United States)

    Scientists from NCI, together with collaborators from outside academic centers, have developed a checklist of criteria to evaluate the readiness of complex molecular tests that will guide decisions made during clinical trials. The checklist focuses on tes...

  5. Mission Readiness Measurement Aid (MIRMAID)

    National Research Council Canada - National Science Library

    Bowden, Tim

    2001-01-01

    .... The tool we have designed is intended to combine automated and observed measures of performance to provide the Commanding Officer feedback regarding the readiness of his unit to perform key missions...

  6. Climate Ready Estuaries Progress Reports

    Science.gov (United States)

    Climate Ready Estuaries has supported adaptation activities in National Estuary Programs since 2008. In 2012, the program partnered with 23 NEPs, completed a pilot project with water utilities, and held workshops. Download annual reports from 2009-2012.

  7. LHCf: ready to go

    CERN Multimedia

    CERN Bulletin

    2015-01-01

    Reinstalled in the tunnel at the end of 2014, the two detectors of the LHCf experiment are now ready for operation. The first data should be taken in May.   LHCf’s Arm1 detector. The Large Hadron Collider forward (LHCf) experiment measures neutral particles emitted at nearly zero degrees from the proton beam direction. Because these "very forward" particles carry a large fraction of the collision energy, they are important for understanding the development of atmospheric air-shower phenomena produced by high-energy cosmic rays. To measure these particles, two detectors, Arm1 and Arm2, sit along the LHC beamline, at 140 metres either side of the ATLAS collision point. In July 2010, after a 9-month operation, the LHCf collaboration removed the two detectors from the tunnel to avoid severe radiation damage. The Arm2 detector was reinstalled in the tunnel for data-taking with proton–lead collisions in 2013, while Arm1 was being upgraded to be a radiation-ha...

  8. Environmental readiness document magnetohydrodynamics

    Energy Technology Data Exchange (ETDEWEB)

    1979-07-01

    The major areas of environmental concern with regard to the commercialization of coal-fired MHD generators are discussed. MHD technology and expectations about its future utilization are described. Information pertinent to the technology was drawn from the DOE technology program office and from an Environmental Development Plan developed for the technology by EV and the program office through an Environmental Coordination Committee. The environmental concerns associated with the technology are examined, and the current status of knowledge about each concern, its potential seriousness, and its manageability through regulation and control technology is discussed. Present and projected societal capabilities to reduce and control undesirable environmental, health, safety, and related social impacts to a level of public acceptability -- as reflected in current and proposed environmental standards -- that will allow the technology to be commercialized and utilized in a timely manner are summarized. The ERD as a whole thus provides an assessment, within the limits of available knowledge and remaining uncertainties, of the future environmental readiness of the technology to contribute to meeting the Nation's energy needs. (WHK)

  9. Markets: Ready-Mixed Concrete

    OpenAIRE

    Chad Syverson

    2008-01-01

    Concrete's natural color is gray. Its favored uses are utilitarian. Its very ubiquity causes it to blend into the background. But ready-mix concrete does have one remarkable characteristic: other than manufactured ice, perhaps no other manufacturing industry faces greater transport barriers. The transportation problem arises because ready-mix concrete both has a low value-to-weight ratio and is highly perishable -- it absolutely must be discharged from the truck before it hardens. These trans...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation...

  11. Computational challenges and human factors influencing the design and use of clinical research participant eligibility pre-screening tools

    Directory of Open Access Journals (Sweden)

    Pressler Taylor R

    2012-05-01

    Full Text Available Abstract Background Clinical trials are the primary mechanism for advancing clinical care and evidence-based practice, yet challenges with the recruitment of participants for such trials are widely recognized as a major barrier to these types of studies. Data warehouses (DW) store large amounts of heterogeneous clinical data that can be used to enhance recruitment practices, but multiple challenges exist when using a data warehouse for such activities, due to the manner of collection, management, integration, analysis, and dissemination of the data. A critical step in leveraging the DW for recruitment purposes is being able to match trial eligibility criteria to discrete and semi-structured data types in the data warehouse, though trial eligibility criteria tend to be written without concern for their computability. We present the multi-modal evaluation of a web-based tool that can be used for pre-screening patients for clinical trial eligibility and assess the ability of this tool to be practically used for clinical research pre-screening and recruitment. Methods The study used a validation study, usability testing, and a heuristic evaluation to evaluate and characterize the operational characteristics of the software as well as human factors affecting its use. Results Clinical trials from the Division of Cardiology and the Department of Family Medicine were used for this multi-modal evaluation, which included a validation study, usability study, and a heuristic evaluation. From the results of the validation study, the software demonstrated a positive predictive value (PPV) of 54.12% and 0.7%, respectively, and a negative predictive value (NPV) of 73.3% and 87.5%, respectively, for two types of clinical trials. Heuristic principles concerning error prevention and documentation were characterized as the major usability issues during the heuristic evaluation. Conclusions This software is intended to provide an initial list of eligible patients to a...

  12. Computational challenges and human factors influencing the design and use of clinical research participant eligibility pre-screening tools

    Science.gov (United States)

    2012-01-01

    Background Clinical trials are the primary mechanism for advancing clinical care and evidence-based practice, yet challenges with the recruitment of participants for such trials are widely recognized as a major barrier to these types of studies. Data warehouses (DW) store large amounts of heterogeneous clinical data that can be used to enhance recruitment practices, but multiple challenges exist when using a data warehouse for such activities, due to the manner of collection, management, integration, analysis, and dissemination of the data. A critical step in leveraging the DW for recruitment purposes is being able to match trial eligibility criteria to discrete and semi-structured data types in the data warehouse, though trial eligibility criteria tend to be written without concern for their computability. We present the multi-modal evaluation of a web-based tool that can be used for pre-screening patients for clinical trial eligibility and assess the ability of this tool to be practically used for clinical research pre-screening and recruitment. Methods The study used a validation study, usability testing, and a heuristic evaluation to evaluate and characterize the operational characteristics of the software as well as human factors affecting its use. Results Clinical trials from the Division of Cardiology and the Department of Family Medicine were used for this multi-modal evaluation, which included a validation study, usability study, and a heuristic evaluation. From the results of the validation study, the software demonstrated a positive predictive value (PPV) of 54.12% and 0.7%, respectively, and a negative predictive value (NPV) of 73.3% and 87.5%, respectively, for two types of clinical trials. Heuristic principles concerning error prevention and documentation were characterized as the major usability issues during the heuristic evaluation. Conclusions This software is intended to provide an initial list of eligible patients to a clinical study...
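    The predictive values reported above follow directly from confusion-matrix counts. A minimal sketch, using hypothetical counts chosen only so the ratios land near one of the reported pairs (54.12% PPV, 73.3% NPV); they are not the study's actual data:

```python
def ppv_npv(tp, fp, tn, fn):
    # Positive predictive value: fraction of flagged patients truly eligible.
    # Negative predictive value: fraction of excluded patients truly ineligible.
    return tp / (tp + fp), tn / (tn + fn)

# Hypothetical counts for illustration only (not the study's data)
ppv, npv = ppv_npv(tp=46, fp=39, tn=22, fn=8)
print(f"PPV={ppv:.2%} NPV={npv:.2%}")  # prints PPV=54.12% NPV=73.33%
```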

  13. A Proposed for Assessing Hotel E-Readiness for Tourism in Southern Thailand

    Directory of Open Access Journals (Sweden)

    Piman Sirot

    2016-01-01

    Full Text Available This article focuses on an overview of the Hotel E-readiness model and its design for tourism in Southern Thailand. The convergence of information technology (IT) and communications technology (CT) will be an important part of these technological innovations. The global economy has been turbulent during the last several years, and governments and enterprises are doing everything possible to inject momentum and effectuate sustainable growth. All member countries of the Association of Southeast Asian Nations (ASEAN), which aimed to become the ASEAN Economic Community (AEC) by December 2015, have come to realize that integrated ICT will enhance the competitiveness and creativity of their economies and fuel the sustainable growth of the global economy. The role that information and communication technologies (ICTs) can play in supporting economic growth, especially in tourism, has never drawn so much attention and research. According to the Networked Readiness Index (NRI), Thailand has made improvements, edging up from 77th to 74th place in 2013 and from 74th to 67th place in the latest measurement released by the World Economic Forum in 2014, ranking 3rd out of the 10 ASEAN member countries. Although serious challenges remain, the impact of ICTs on tourism has become more far-reaching as their transformational effects spread to several sectors of the economy and society via innovations. In this research we focus only on the hotel sector in Southern Thailand, because tourism in this area generates very high income from overseas and ASEAN visitors. We give an overview of the Hotel E-readiness Model, which relates the tourism economy to computer networking infrastructures and communication technologies in Southern Thailand. Our model is described in four major dimensions: business environment, network readiness, network usage and network impacts. It aims to explore the problems and obstacles for improvement of computer networking infrastructure and...

  14. Computer solutions for fuel oil marketers

    Energy Technology Data Exchange (ETDEWEB)

    Berst, J.; Kall, J.

    1984-11-01

    Methods for upgrading computer systems for fuel oil marketers are discussed. A computer system with free or nearly free software is proposed. Five suggestions are given to help determine when a computer system is ready for replacement.

  15. Exploring English as a Foreign Language (EFL) Teacher Trainers' Perspectives on Challenges to Promoting Computer Literacy of EFL Teachers

    Science.gov (United States)

    Dashtestani, Reza

    2014-01-01

    Computer literacy is a significant component of language teachers' computer-assisted language learning (CALL) knowledge. Despite its importance, limited research has been undertaken to analyze factors which might influence language teachers' computer literacy levels. This qualitative study explored the perspectives of 39 Iranian EFL teacher…

  16. Grand Challenges: High Performance Computing and Communications. The FY 1992 U.S. Research and Development Program.

    Science.gov (United States)

    Federal Coordinating Council for Science, Engineering and Technology, Washington, DC.

    This report presents a review of the High Performance Computing and Communications (HPCC) Program, which has as its goal the acceleration of the commercial availability and utilization of the next generation of high performance computers and networks in order to: (1) extend U.S. technological leadership in high performance computing and computer…

  17. Academic training: QCD: are we ready for the LHC?

    CERN Multimedia

    2006-01-01

    2006-2007 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 4, 5, 6, 7 December, from 11:00 to 12:00 4, 5, 6 December - Main Auditorium, bldg. 500, 7 December - TH Auditorium, bldg. 4 - 3-006 QCD: are we ready for the LHC? S. FRIXIONE / INFN, Genoa, Italy The LHC energy regime poses a serious challenge to our capability of predicting QCD reactions to the level of accuracy necessary for a successful programme of searches for physics beyond the Standard Model. In these lectures, I'll introduce basic concepts in QCD, and present techniques based on perturbation theory, such as fixed-order and resummed computations, and Monte Carlo simulations. I'll discuss applications of these techniques to hadron-hadron processes, concentrating on recent trends in perturbative QCD aimed at improving our understanding of LHC phenomenology. ENSEIGNEMENT ACADEMIQUE ACADEMIC TRAINING Françoise Benz 73127 academic.training@cern.ch If you wish to participate in one of the following courses, please tell your supervisor and apply ...

  18. Predicting ready biodegradability of premanufacture notice chemicals.

    Science.gov (United States)

    Boethling, Robert S; Lynch, David G; Thom, Gary C

    2003-04-01

    Chemical substances other than pesticides, drugs, and food additives are regulated by the U.S. Environmental Protection Agency (U.S. EPA) under the Toxic Substances Control Act (TSCA), but the United States does not require that new substances be tested automatically for such critical properties as biodegradability. The resulting lack of submitted data has fostered the development of estimation methods, and the BioWIN models for predicting biodegradability from chemical structure have played a prominent role in premanufacture notice (PMN) review. Until now, validation efforts have used only the Japanese Ministry of International Trade and Industry (MITI) test data and have not included all models. To assess BioWIN performance with PMN substances, we assembled a database of PMNs for which ready biodegradation data had been submitted over the period 1995 through 2001. The 305 PMN structures are highly varied and pose major challenges to chemical property estimation. Despite the variability of ready biodegradation tests, the use of at least six different test methods, and widely varying quality of submitted data, accuracy of four of six BioWIN models (MITI linear, MITI nonlinear, survey ultimate, survey primary) was in the 80+% range for predicting ready biodegradability. Greater accuracy (>90%) can be achieved by using model estimates only when the four models agree (true for 3/4 of the PMNs). The BioWIN linear and nonlinear probability models did not perform as well even when classification criteria were optimized. The results suggest that the MITI and survey BioWIN models are suitable for use in screening-level applications.
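    The consensus rule described above, trusting an estimate only when the four models agree, is straightforward to encode. A minimal sketch with hypothetical model outputs (BioWIN itself is not called here):

```python
def consensus_ready(predictions):
    # predictions: one boolean per model (True = predicted readily biodegradable).
    # Return the shared verdict when all models agree, otherwise None (no call).
    verdicts = set(predictions)
    return verdicts.pop() if len(verdicts) == 1 else None

print(consensus_ready([True, True, True, True]))   # unanimous -> True
print(consensus_ready([True, False, True, True]))  # disagreement -> None
```

Under the abstract's figures, such a unanimity filter trades coverage (a call on about 3/4 of the PMNs) for accuracy (>90%).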

  19. Factors of children's school readiness

    Directory of Open Access Journals (Sweden)

    Ljubica Marjanovič Umek

    2006-12-01

    Full Text Available The purpose of the study was to examine the effect of preschool on children's school readiness in connection with their intellectual abilities, language competence, and parents' education. The sample included 219 children who were 68 to 83 months old and were attending the first year of primary school. Children were differentiated by whether or not they had attended preschool before starting school. Children's intellectual ability was determined using Raven's Coloured Progressive Matrices (CPM; Raven, Raven, & Court, 1999), language competence using the Lestvice splošnega govornega razvoja–LJ (LSGR–LJ, Scales of General Language Development; Marjanovič Umek, Kranjc, Fekonja, & Bajc, 2004), and school readiness with the Preizkus pripravljenosti za šolo (PPŠ, Test of School Readiness; Toličič, 1986). The results indicate that children's intellectual ability and language competence have high predictive value for school readiness: they explained 51% of the variance in children's scores on the PPŠ. Preschool enrollment has a positive effect on school readiness for children whose parents have a low level of education, but not for those whose parents are highly educated.
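    The "51% of the variance" figure is the R² of a regression of school-readiness scores on the predictors. A hedged sketch with simulated data (two standardized predictors standing in for intellectual ability and language competence; the coefficients and noise level are assumptions, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 219  # sample size matching the study

# Simulated standardized predictors and outcome (assumed effect sizes)
X = rng.normal(size=(n, 2))
y = 0.5 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.7, size=n)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ coef
r2 = 1 - resid.var() / y.var()  # proportion of variance explained
print(round(r2, 2))
```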

  20. e-ready legislation

    DEFF Research Database (Denmark)

    Hvingel, Line; Baaner, Lasse

    ...of the trustworthiness of administration systems. On the other hand, a successful adaptation of legislation to a digital setup could help promote good service towards citizens and businesses and, according to land administration theories, maybe even promote societal sustainability at large. Based on studies on Denmark, different challenges within digital land administration solutions are demonstrated. This paper discusses how legislation needs to change in order to be ‘e-Ready’.

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the effort is currently focused on preparations for Run 2 and on improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  3. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  4. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  5. Community readiness and health services.

    Science.gov (United States)

    Oetting, E R; Jumper-Thurman, P; Plested, B; Edwards, R W

    2001-01-01

    Community readiness theory is a practical tool for implementing changes in community health services. The theory provides methods for assessment, diagnosis, and community change. First, community key informants are asked semi-structured questions that provide information about what is occurring in the community in relation to a specific problem. The results evaluate readiness to deal with that problem on six dimensions: existing efforts, knowledge about the problem, knowledge about alternative methods or policies, leadership, resources, and community climate. The eventual result is a diagnosis of the overall stage of community readiness. There are nine stages: tolerance or no awareness, denial, vague awareness, preplanning, preparation, initiation, institutionalization or stabilization, confirmation/expansion, and professionalization. Each stage requires different forms of intervention in order to move the community to the next stage until, eventually, initiation and maintenance of health services programs and policies can be achieved.
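    The diagnostic step above, combining six dimension ratings into one of the nine stages, can be sketched as a score-to-stage lookup. Averaging the dimension ratings is an illustrative convention here, not necessarily the theory's exact scoring protocol:

```python
STAGES = [
    "tolerance or no awareness", "denial", "vague awareness", "preplanning",
    "preparation", "initiation", "institutionalization or stabilization",
    "confirmation/expansion", "professionalization",
]

def overall_stage(dimension_scores):
    # dimension_scores: six ratings on a 1-9 scale, one per readiness dimension
    # (efforts, knowledge of the problem, knowledge of alternatives,
    # leadership, resources, community climate).
    avg = sum(dimension_scores) / len(dimension_scores)
    return STAGES[min(int(avg), 9) - 1]  # truncate the average to a stage index

print(overall_stage([3, 4, 2, 3, 3, 3]))  # -> vague awareness
```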

  6. Computer-aided diagnosis for diagnostically challenging breast lesions in DCE-MRI based on image registration and integration of morphologic and dynamic characteristics

    Science.gov (United States)

    Retter, Felix; Plant, Claudia; Burgeth, Bernhard; Botella, Guillermo; Schlossbauer, Thomas; Meyer-Bäse, Anke

    2013-12-01

    Diagnostically challenging lesions comprise both foci (small lesions) and non-mass-like enhancing lesions and pose a challenge to current computer-aided diagnosis systems. In dynamic contrast-enhanced breast magnetic resonance imaging, motion artifacts lead to diagnostic misinterpretation; motion compensation therefore represents an important prerequisite to automatic lesion detection and diagnosis. In addition, the extraction of pertinent kinetic and morphologic features as lesion descriptors is an equally important task. In the present paper, we evaluate the performance of a computer-aided diagnosis system consisting of motion correction, lesion segmentation, and feature extraction and classification. We develop a new feature extractor, the radial Krawtchouk moment, which guarantees rotation invariance. Many novel feature extraction techniques are proposed and tested in conjunction with lesion detection. Our simulation results have shown that motion compensation combined with Minkowski functionals and a Bayesian classifier can improve lesion detection and classification.
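    As a rough illustration of the final classification step, a Gaussian naive Bayes rule over extracted lesion features can be sketched as follows. The feature values and class labels are invented for illustration; the paper's actual Bayesian classifier and Minkowski-functional features are not reproduced here.

```python
import math

def fit(features_by_class):
    """Estimate per-class mean/variance for each feature column
    (Gaussian naive Bayes with a small variance floor)."""
    model = {}
    for label, rows in features_by_class.items():
        cols = list(zip(*rows))
        params = []
        for c in cols:
            mean = sum(c) / len(c)
            var = sum((x - mean) ** 2 for x in c) / len(c) + 1e-9
            params.append((mean, var))
        model[label] = params
    return model

def classify(model, x):
    """Pick the class with the highest Gaussian log-likelihood."""
    def loglik(params):
        return sum(-0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
                   for xi, (m, v) in zip(x, params))
    return max(model, key=lambda label: loglik(model[label]))

# Invented two-feature training data for two lesion classes.
model = fit({
    "benign":    [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1)],
    "malignant": [(5.0, 5.0), (5.2, 4.8), (4.9, 5.1)],
})
print(classify(model, (1.0, 1.0)))  # -> benign
```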

  7. When Life and Learning Do Not Fit: Challenges of Workload and Communication in Introductory Computer Science Online

    Science.gov (United States)

    Benda, Klara; Bruckman, Amy; Guzdial, Mark

    2012-01-01

    We present the results of an interview study investigating student experiences in two online introductory computer science courses. Our theoretical approach is situated at the intersection of two research traditions: "distance and adult education research," which tends to be sociologically oriented, and "computer science education…

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also installed at CERN, adding to the GlideInWMS factory in place in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  9. Development of computational fluid dynamics--habitat suitability (CFD-HSI) models to identify potential passage--Challenge zones for migratory fishes in the Penobscot River

    Science.gov (United States)

    Haro, Alexander J.; Dudley, Robert W.; Chelminski, Michael

    2012-01-01

    A two-dimensional computational fluid dynamics-habitat suitability (CFD–HSI) model was developed to identify potential zones of shallow depth and high water velocity that may present passage challenges for five anadromous fish species in the Penobscot River, Maine, upstream from two existing dams and as a result of the proposed future removal of the dams. Potential depth-challenge zones were predicted for larger species at the lowest flow modeled in the dam-removal scenario. Increasing flows under both scenarios increased the number and size of potential velocity-challenge zones, especially for smaller species. This application of the two-dimensional CFD–HSI model demonstrated its capabilities to estimate the potential effects of flow and hydraulic alteration on the passage of migratory fish.

  10. Scientific Grand Challenges: Discovery In Basic Energy Sciences: The Role of Computing at the Extreme Scale - August 13-15, 2009, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Galli, Giulia [Univ. of California, Davis, CA (United States). Workshop Chair; Dunning, Thom [Univ. of Illinois, Urbana, IL (United States). Workshop Chair

    2009-08-13

    The U.S. Department of Energy’s (DOE) Office of Basic Energy Sciences (BES) and Office of Advanced Scientific Computing Research (ASCR) workshop in August 2009 on extreme-scale computing provided a forum for more than 130 researchers to explore the needs and opportunities that will arise due to expected dramatic advances in computing power over the next decade. This scientific community firmly believes that the development of advanced theoretical tools within chemistry, physics, and materials science—combined with the development of efficient computational techniques and algorithms—has the potential to revolutionize the discovery process for materials and molecules with desirable properties. Doing so is necessary to meet the energy and environmental challenges of the 21st century as described in various DOE BES Basic Research Needs reports. Furthermore, computational modeling and simulation are a crucial complement to experimental studies, particularly when quantum mechanical processes controlling energy production, transformations, and storage are not directly observable and/or controllable. Many processes related to the Earth’s climate and subsurface need better modeling capabilities at the molecular level, which will be enabled by extreme-scale computing.

  11. The challenge of ubiquitous computing in health care: technology, concepts and solutions. Findings from the IMIA Yearbook of Medical Informatics 2005.

    Science.gov (United States)

    Bott, O J; Ammenwerth, E; Brigl, B; Knaup, P; Lang, E; Pilgram, R; Pfeifer, B; Ruderich, F; Wolff, A C; Haux, R; Kulikowski, C

    2005-01-01

    To review recent research efforts in the field of ubiquitous computing in health care. To identify current research trends and further challenges for medical informatics. Analysis of the contents of the Yearbook on Medical Informatics 2005 of the International Medical Informatics Association (IMIA). The Yearbook of Medical Informatics 2005 includes 34 original papers selected from 22 peer-reviewed scientific journals related to several distinct research areas: health and clinical management, patient records, health information systems, medical signal processing and biomedical imaging, decision support, knowledge representation and management, education and consumer informatics as well as bioinformatics. A special section on ubiquitous health care systems is devoted to recent developments in the application of ubiquitous computing in health care. Besides additional synoptical reviews of each of the sections the Yearbook includes invited reviews concerning E-Health strategies, primary care informatics and wearable healthcare. Several publications demonstrate the potential of ubiquitous computing to enhance effectiveness of health services delivery and organization. But ubiquitous computing is also a societal challenge, caused by the surrounding but unobtrusive character of this technology. Contributions from nearly all of the established sub-disciplines of medical informatics are demanded to turn the visions of this promising new research field into reality.

  12. Safe, Healthy and Ready to Succeed: Arizona School Readiness Key Performance Indicators

    Science.gov (United States)

    Migliore, Donna E.

    2006-01-01

    "Safe, Healthy and Ready to Succeed: Arizona School Readiness Key Performance Indicators" presents a set of baseline measurements that gauge how well a statewide system of school readiness supports is addressing issues that affect Arizona children's readiness for school. The Key Performance Indicators (KPIs) measure the system, rather…

  13. PIC Reading Readiness Test Form.

    Science.gov (United States)

    Short, N. J.

    This rating form concerns the measurement of basic skills in connection with assessing reading readiness. Motor skills, ability to adjust to learning situations, familiarity with the alphabet, and general knowledge are assessed. See TM 001 111 for details of the Regional PIC program in which it is used. (DLG)

  14. Blood Lead and Reading Readiness

    Directory of Open Access Journals (Sweden)

    J Gordon Millichap

    2013-08-01

    Full Text Available Investigators at the University of Maryland School of Nursing, Johns Hopkins School of Public Health, and Department of Health, Providence, Rhode Island, evaluated the relationship between reading readiness test scores for children attending public kindergarten in Providence, RI, and state health department records of blood lead levels (BLLs.

  15. The Condition of College & Career Readiness, 2010

    Science.gov (United States)

    ACT, Inc., 2010

    2010-01-01

    Since 1959, ACT has collected and reported data on students' academic readiness for college. Because becoming ready for college and career is a process that occurs throughout elementary and secondary education, measuring academic performance over time in the context of college and career readiness provides meaningful and compelling information…

  16. Progression in work readiness

    DEFF Research Database (Denmark)

    Jensen, Sophie Danneris

    2013-01-01

    This paper is based partly on literature concerning the construction of identities in social work settings (especially Juhila & Abrams 2011, Eskelinen & Olesen 2010) and partly on literature that addresses the dilemmas and challenges in providing evidence about the effectiveness of interventions in social work programs (amongst others Boaz & Blewett 2010 and Koivisto 2008). Initially there will be a short presentation of the research topic of my Ph.D. and the central research question related to the project. Following this is a methodological discussion on two levels: the first discussion at a meta level, where the problems concerning evidence, measurement of effects, and how to improve interventions and practice will be addressed; at the second, micro level, the discussion will be about the specific data collection methods in the project and how the identities of the clients can...

  17. What are the characteristics of 'sexually ready' adolescents? Exploring the sexual readiness of youth in urban poor Accra.

    Science.gov (United States)

    Biney, Adriana A E; Dodoo, F Nii-Amoo

    2016-01-05

    Adolescent sexual activity, especially among the urban poor, remains a challenge. Despite numerous interventions and programs to address the negative consequences arising from early and frequent sexual activity among youth in sub-Saharan Africa, including Ghana, only slight progress has been made. A plausible explanation is that our understanding of what adolescents think about sex and about their own sexuality is poor. In that sense, examining how adolescents in urban poor communities think about their sexual readiness, and identifying characteristics associated with that sexual self-concept dimension, should deepen our understanding of this topical issue. A total of 196 male and female adolescents, ages 12 to 19, were surveyed in the 2011 RIPS Urban Health and Poverty Project in Accra, Ghana. The youth responded to three statements which determined their levels of sexual readiness. Other background characteristics were also obtained, enabling the assessment of the correlates of their preparedness to engage in sex. The data were analyzed using ordered logistic regression models. Overall, the majority of respondents did not consider themselves ready for sex. Multivariate analyses indicated that sexual experience, exposure to pornographic movies, gender, ethnicity and household wealth were significantly linked to their readiness for sex. Sexual readiness is related to sexual activity as well as other characteristics of the adolescents, suggesting the need to consider these factors in the design of programs and interventions to curb early sex. The subject of sexual readiness has to be investigated further to ensure adolescents do not identify with any negative effects of this sexual self-view.
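    The ordered logistic regression used in such analyses has a compact closed form for category probabilities. The sketch below uses illustrative coefficients and cutpoints, not those estimated in the study, to show how a proportional-odds model turns a linear predictor into probabilities over ordered readiness categories.

```python
import math

def ordered_logit_probs(x, beta, cutpoints):
    """Proportional-odds model: P(Y <= k) = logistic(cutpoint_k - x.beta);
    category probabilities are successive differences of these CDFs."""
    eta = sum(xi * bi for xi, bi in zip(x, beta))
    cdf = [1.0 / (1.0 + math.exp(-(c - eta))) for c in cutpoints] + [1.0]
    return [cdf[0]] + [cdf[k] - cdf[k - 1] for k in range(1, len(cdf))]

# Illustrative only: two predictors (say, sexual experience and household
# wealth) and three ordered readiness categories via two cutpoints.
probs = ordered_logit_probs([1.0, 0.5], beta=[0.8, -0.4],
                            cutpoints=[-1.0, 1.0])
```

    With increasing cutpoints the cumulative probabilities are monotone, so the category probabilities are nonnegative and sum to one by construction.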

  18. Promoting community readiness for physical activity among older adults in Germany--protocol of the ready to change intervention trial.

    Science.gov (United States)

    Brand, Tilman; Gansefort, Dirk; Rothgang, Heinz; Röseler, Sabine; Meyer, Jochen; Zeeb, Hajo

    2016-02-01

    Healthy ageing is an important concern for many societies facing the challenge of an ageing population. Physical activity (PA) is a major contributor to healthy ageing; however, insufficient PA levels are prevalent in old age in Germany. Community capacity building and community involvement are often recommended as key strategies to improve equitable access to prevention and health promotion. However, evidence for the effectiveness of these strategies is scarce. This study aims to assess the community readiness for PA promotion in local environments and to analyse the utility of strategies to increase community readiness for reaching vulnerable groups. We designed a mixed method intervention trial comprising three study modules. The first module includes an assessment of community readiness for PA interventions in older adults. The assessment is carried out in a sample of 24 municipalities in the Northwest of Germany using structured key informant interviews. In the second module, eight municipalities with low community readiness are selected from the sample and randomly assigned to one of two study groups: active enhancement of community readiness (intervention) versus no enhancement (control). After enhancing community readiness in the active enhancement group, older adults in both study groups will be recruited for participation in a PA intervention. Participation rates are compared between the study groups to evaluate the effects of the intervention. In addition, a cost-effectiveness analysis is carried out calculating recruitment costs per person reached in the two study groups. In the third module, qualitative interviews are conducted with participants and non-participants of the PA intervention exploring reasons for participation or non-participation. This study offers the potential to contribute to the evidence base of reaching vulnerable older adults for PA interventions and provide ideas on how to reduce participation barriers. Its findings will inform...

  19. Evaluation of Organizational E-Government Readiness in the Public Sector

    OpenAIRE

    Ibrahim A. Alghamdi; Robert Goodwin; Giselle Rampersad

    2013-01-01

    The purpose of this paper is to provide an integrated framework to evaluate organizational e-government readiness for government organizations. This framework is necessary as current ones ignore challenges that arise due to organizational transformation issues stemming from diffusion of Information and Communication Technologies (ICTs). This study adopts an e-government framework to highlight the main internal factors involved in the assessment of e-government organizational readiness and to ...

  20. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    Science.gov (United States)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations including the Madden-Julian oscillation (MJO), merely as a case study approach. Thanks to the big leap in computational performance of the K computer, we could greatly increase the number of MJO events simulated, in addition to extending the integration time and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with that of the relatively coarser operational models currently in use. The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.

  1. Highly Parallel Computing Architectures by using Arrays of Quantum-dot Cellular Automata (QCA): Opportunities, Challenges, and Recent Results

    Science.gov (United States)

    Fijany, Amir; Toomarian, Benny N.

    2000-01-01

    There has been significant improvement in the performance of VLSI devices, in terms of size, power consumption, and speed, in recent years, and this trend may continue for the near future. However, it is a well-known fact that there are major obstacles, i.e., the physical limitation of feature size reduction and the ever increasing cost of foundries, that would prevent the long-term continuation of this trend. This has motivated the exploration of some fundamentally new technologies that are not dependent on the conventional feature-size approach. Such technologies are expected to enable scaling to continue to the ultimate level, i.e., molecular and atomistic size. Quantum computing, quantum dot-based computing, DNA-based computing, biologically inspired computing, etc., are examples of such new technologies. In particular, quantum dot-based computing using Quantum-dot Cellular Automata (QCA) has recently been intensely investigated as a promising new technology capable of offering significant improvement over conventional VLSI in terms of reduction of feature size (and hence increase in integration level), reduction of power consumption, and increase of switching speed. Quantum dot-based computing and memory in general, and QCA specifically, are intriguing to NASA due to their high packing density (10^11 to 10^12 per square cm), low power consumption (no transfer of current), and potentially higher radiation tolerance. Under the Revolutionary Computing Technology (RTC) Program at the NASA/JPL Center for Integrated Space Microelectronics (CISM), we have been investigating the potential applications of QCA for the space program. To this end, exploiting the intrinsic features of QCA, we have designed novel QCA-based circuits for co-planar (i.e., single layer) and compact implementation of a class of data permutation matrices, a class of interconnection networks, and a bit-serial processor. Building upon these circuits, we have developed novel algorithms and QCA...

  2. Computers that negotiate on our behalf: Major challenges for self-sufficient, self-directed, and interdependent negotiating agents

    NARCIS (Netherlands)

    T. Baarslag (Tim); M. Kaisers (Michael); E.H. Gerding (Enrico); C.M. Jonker (Catholijn); J. Gratch (Jonathan)

    2017-01-01

    textabstractComputers that negotiate on our behalf hold great promise for the future and will even become indispensable in emerging application domains such as the smart grid, autonomous driving, and the Internet of Things. Much research has thus been expended to create agents that are able to

  3. A Tale of Two Countries: Successes and Challenges in K-12 Computer Science Education in Israel and the United States

    Science.gov (United States)

    Gal-Ezer, Judith; Stephenson, Chris

    2014-01-01

    This article tells a story of K-12 computer science in two different countries. These two countries differ profoundly in culture, language, government and state structure, and in their education systems. Despite these differences, however, they share the pursuit of excellence and high standards in K-12 education. In Israel, curriculum is…

  4. Computational fluid dynamics-habitat suitability index (CFD-HSI) modelling as an exploratory tool for assessing passability of riverine migratory challenge zones for fish

    Science.gov (United States)

    Haro, Alexander J.; Chelminski, Michael; Dudley, Robert W.

    2015-01-01

    We developed two-dimensional computational fluid hydraulics-habitat suitability index (CFD-HSI) models to identify and qualitatively assess potential zones of shallow water depth and high water velocity that may present passage challenges for five major anadromous fish species in a 2.63-km reach of the main stem Penobscot River, Maine, as a result of a dam removal downstream of the reach. Suitability parameters were based on distribution of fish lengths and body depths and transformed to cruising, maximum sustained and sprint swimming speeds. Zones of potential depth and velocity challenges were calculated based on the hydraulic models; ability of fish to pass a challenge zone was based on the percent of river channel that the contiguous zone spanned and its maximum along-current length. Three river flows (low: 99.1 m³/s; normal: 344.9 m³/s; and high: 792.9 m³/s) were modelled to simulate existing hydraulic conditions and hydraulic conditions simulating removal of a dam at the downstream boundary of the reach. Potential depth-challenge zones were nonexistent for all low-flow simulations of existing conditions for deeper-bodied fishes. Increasing flows for existing conditions and removal of the dam under all flow conditions increased the number and size of potential velocity-challenge zones, with the effects of zones being more pronounced for smaller species. The two-dimensional CFD-HSI model has utility in demonstrating gross effects of flow and hydraulic alteration, but may not be as precise a predictive tool as a three-dimensional model. Passability of the potential challenge zones cannot be precisely quantified for two-dimensional or three-dimensional models due to untested assumptions and incomplete data on fish swimming performance and behaviours.
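    The core of the challenge-zone computation can be shown with a minimal sketch. The thresholds and one-dimensional grid below are invented for illustration; the actual CFD-HSI models work on a two-dimensional hydraulic mesh with species-specific swim-speed curves derived from fish length and body depth.

```python
# Hedged sketch: flag model cells whose depth falls below a fish's body
# depth or whose velocity exceeds its sprint speed; a contiguous run of
# flagged cells is a proxy for the extent of a potential challenge zone.

def challenge_flags(depths_m, velocities_ms, body_depth_m, sprint_ms):
    """One flag per cell: True where depth or velocity is limiting."""
    return [d < body_depth_m or v > sprint_ms
            for d, v in zip(depths_m, velocities_ms)]

def max_contiguous(flags):
    """Longest run of flagged cells (proxy for along-current zone length)."""
    best = run = 0
    for f in flags:
        run = run + 1 if f else 0
        best = max(best, run)
    return best

# Invented cross-section: four cells, one shallow/fast pair in the middle.
flags = challenge_flags(depths_m=[0.5, 0.2, 0.1, 0.6],
                        velocities_ms=[1.0, 3.0, 4.0, 1.0],
                        body_depth_m=0.3, sprint_ms=2.5)
print(flags, max_contiguous(flags))  # -> [False, True, True, False] 2
```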

  5. Nuclear explosives testing readiness evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Valk, T.C.

    1993-09-01

    This readiness evaluation considers hole selection and characterization, verification, containment issues, nuclear explosive safety studies, test authorities, event operations planning, canister-rack preparation, site preparation, diagnostic equipment setup, device assembly facilities and processes, device delivery and insertion, emplacement, stemming, control room activities, readiness briefing, arming and firing, test execution, emergency response and reentry, and post-event analysis to include device diagnostics, nuclear chemistry, and containment. This survey concludes that the LLNL program and its supporting contractors could execute an event within six months of notification, and a second event within the following six months, given the NET group's evaluation and the following three restraints: (1) FY94 (and subsequent year) funding is essentially constant with FY93; (2) preliminary work for the initial event is completed to the historical six-month status; (3) critical personnel, currently working in dual-use technologies, would be recallable as needed.

  6. Objective Measurement of Training Readiness.

    Science.gov (United States)

    1980-05-16

    mission training to ARTEP T&E. (Also see EXTEV and FTX EXTEV). Tracking: An idiomatic term meaning the monitoring of a soldier's individual and ... readiness. This numeric data, expressed in raw and percentage terms, is transformed into a numerical REDCON or "C" rating. Since 1971, the ... interviews, misinterpretation fears were expressed in terms such as: (1) People that don't understand where it comes from and what lies behind it will take the ...

  7. GRENADA. Renewables Readiness Assessment 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    Grenada, like many Caribbean islands, is dependent on costly oil imports for its energy needs, including the generation of electricity. The transition to renewable energy could potentially support price reductions and improve the overall competitiveness of key sectors of the economy, particularly tourism. This report provides facts and analysis to support the country's discussion on ways to move forward with the renewable energy agenda. IRENA is ready to provide support in the implementation of the actions identified in this report.

  8. Knowledge Management Readiness In Organizations

    Directory of Open Access Journals (Sweden)

    Alanazi Sultan

    2015-06-01

    Full Text Available Abstract To generate a comprehensive model of knowledge management (KM) readiness in organizations, intending greater value for its practical applicability. This study was based on both secondary and primary data, grounded in the deductive paradigm of social research. A survey of 13 professionals in the current business setting was conducted to substantiate the research findings. The key criterion of KM readiness in organizations, i.e. its dependency on human acts, was ignored in many traditional KM models, although literary works paid substantial attention to the aspect; the applicability of conventional KM models in the current context was also limited. The study lacked consideration of the influence of organizational characteristics on KM practices based on organizational readiness, and the number of respondents was limited for a research as wide as this. As this study was mainly guided by the contemporary beliefs and attributes of organizational management, the developed model is likely to find worthy applicability in practical experience. Due emphasis was placed on ethical soundness throughout the paper, confirming its originality and value: anti-plagiarism strictness was taken into account and self-duplication of information was avoided entirely.

  9. A qualitative readiness-requirements assessment model for enterprise big-data infrastructure investment

    Science.gov (United States)

    Olama, Mohammed M.; McNair, Allen W.; Sukumar, Sreenivas R.; Nutaro, James J.

    2014-05-01

    In the last three decades, there has been an exponential growth in the area of information technology providing the information processing needs of data-driven businesses in government, science, and private industry in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that will help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in the big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating the disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association's Data Management Body of Knowledge (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.
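    The scoring idea behind such a readiness model can be sketched briefly. Everything below is illustrative: the function names, weights, and level thresholds are invented and are not those of the actual ILM tool or DAMA-DMBOK.

```python
# Hypothetical sketch of an ILM-style readiness score: each data
# management function gets a 0-5 maturity rating, ratings are
# weight-averaged, and the score is mapped to the highest integration
# level whose threshold it reaches. Names/weights/thresholds invented.

LEVELS = [
    (3.5, "consolidated data model"),
    (2.0, "common data platform"),
    (0.0, "data accessibility"),
]

def readiness_level(ratings, weights):
    """Weighted-average maturity score mapped to an integration level."""
    score = (sum(ratings[f] * weights[f] for f in ratings)
             / sum(weights.values()))
    for threshold, level in LEVELS:
        if score >= threshold:
            return score, level

score, level = readiness_level(
    ratings={"governance": 4, "quality": 4, "architecture": 3},
    weights={"governance": 1, "quality": 1, "architecture": 1},
)
print(round(score, 2), level)  # -> 3.67 consolidated data model
```

    A higher weighted score thus signals both a better-managed source system and a higher achievable integration level, mirroring the risk-evaluation logic described in the abstract.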

  10. From the CMS Computing Experience in the WLCG STEP'09 Challenge to the First Data Taking of the LHC Era

    Science.gov (United States)

    Bonacorsi, D.; Gutsche, O.

    The Worldwide LHC Computing Grid (WLCG) project decided in March 2009 to perform scale tests of parts of its overall Grid infrastructure before the start of the LHC data taking. The "Scale Test for the Experiment Program" (STEP'09) was performed mainly in June 2009, with further selected tests in September and October 2009, and emphasized the simultaneous testing of the computing systems of all 4 LHC experiments. CMS tested its Tier-0 tape writing and processing capabilities. The Tier-1 tape systems were stress tested using the complete range of Tier-1 work-flows: transfer from Tier-0 and custody of data on tape, processing and subsequent archival, redistribution of datasets amongst all Tier-1 sites as well as burst transfers of datasets to Tier-2 sites. The Tier-2 analysis capacity was tested using bulk analysis job submissions to backfill normal user activity. In this talk, we will report on the tests performed and present their post-mortem analysis.

  11. Integrating on campus problem based learning and practice based learning: issues and challenges in using computer mediated communication.

    Science.gov (United States)

    Conway, J; Sharkey, R

    2002-10-01

    The Faculty of Nursing, University of Newcastle, Australia, has been keen to initiate strategies that enhance student learning and nursing practice. Two strategies are problem based learning (PBL) and clinical practice. The Faculty has maintained a comparatively high proportion of the undergraduate hours in the clinical setting in times when financial constraints suggest that simulations and on campus laboratory experiences may be less expensive. Increasingly, computer-based technologies are becoming sufficiently refined to support the exploration of nursing practice in a non-traditional lecture/tutorial environment. In 1998, a group of faculty members proposed that computer mediated instruction would provide an opportunity for partnership between students, academics and clinicians that would promote more positive outcomes for all and maintain the integrity of the PBL approach. This paper discusses the similarities between problem based and practice based learning and presents the findings of an evaluative study of the implementation of a practice based learning model that uses computer mediated communication to promote integration of practice experiences with the broader goals of the undergraduate curriculum.

  12. Evidence based practice readiness: A concept analysis.

    Science.gov (United States)

    Schaefer, Jessica D; Welton, John M

    2018-01-15

    To analyse and define the concept "evidence based practice readiness" in nurses. Evidence based practice readiness is a term commonly used in health literature, but without a clear understanding of what readiness means. Concept analysis is needed to define the meaning of evidence based practice readiness. A concept analysis was conducted using Walker and Avant's method to clarify the defining attributes of evidence based practice readiness as well as antecedents and consequences. A Boolean search of PubMed and Cumulative Index for Nursing and Allied Health Literature was conducted and limited to those published after the year 2000. Eleven articles met the inclusion criteria for this analysis. Evidence based practice readiness incorporates personal and organisational readiness. Antecedents include the ability to recognize the need for evidence based practice, ability to access and interpret evidence based practice, and a supportive environment. The concept analysis demonstrates the complexity of the concept and its implications for nursing practice. The four pillars of evidence based practice readiness: nursing, training, equipping and leadership support are necessary to achieve evidence based practice readiness. Nurse managers are in the position to address all elements of evidence based practice readiness. Creating an environment that fosters evidence based practice can improve patient outcomes, decrease health care costs, increase nurses' job satisfaction and decrease nursing turnover. © 2018 John Wiley & Sons Ltd.

  13. Determining registered nurses' readiness for evidence-based practice.

    Science.gov (United States)

    Thiel, Linda; Ghosh, Yashowanto

    2008-01-01

    As health care systems worldwide move toward instituting evidence-based practice (EBP), its implementation can be challenging. Conducting a baseline assessment to determine nurses' readiness for EBP presents opportunities to plan strategies before implementation. Although a growing body of research literature is focused on implementing EBP, little attention has been paid to assessing nurses' readiness for EBP. The purpose of this study was to assess registered nurses' readiness for EBP in a moderate-sized acute care hospital in the Midwestern United States before implementation of a hospital-wide nursing EBP initiative. A descriptive cross-sectional survey design was used; 121 registered nurses completed the survey. The participants (n= 121) completed the 64-item Nurses' Readiness for Evidence-Based Practice Survey that allowed measurement of information needs, knowledge and skills, culture, and attitudes. Data were analyzed using descriptive statistics and a post hoc analysis. The majority (72.5%) of respondents indicated that when they needed information, they consulted colleagues and peers rather than using journals and books; 24% of nurses surveyed used the health database, Cumulative Index to Nursing & Allied Health Literature (CINAHL). The respondents perceived their EBP knowledge level as moderate. Cultural EBP scores were moderate, with unit scores being higher than organizational scores. The nurses' attitudes toward EBP were positive. The post hoc analysis showed many significant correlations. Nurses have access to technological resources and perceive that they have the ability to engage in basic information gathering but not in higher level evidence gathering. The elements important to EBP such as a workplace culture and positive attitudes are present and can be built upon. A "site-specific" baseline assessment provides direction in planning EBP initiatives. The Nurses' Readiness for EBP Survey is a streamlined tool with established reliability and

  14. Ready or not: Kindergarten classroom engagement as an indicator of child school readiness

    Directory of Open Access Journals (Sweden)

    Caroline Fitzpatrick

    2012-07-01

    Full Text Available Children’s preparedness for school is an important predictor of their eventual academic attainment, health, and personal success well into adulthood. Although kindergarten knowledge of numbers and vocabulary represents a robust indicator of children’s readiness to learn at school entry, theory and research suggest that self-directed learning skills are also important in meeting the challenges of the elementary school classroom. This review examines evidence related to the potential benefits (e.g. improving children’s academic outcomes) of targeting classroom engagement skills, a person-environment fit characteristic reflecting task-orientation and industriousness. Reviewed studies suggest that classroom engagement skills are malleable and robust predictors of later elementary school achievement. Research also suggests that cognitive control skills in the form of executive functions are likely to underlie individual differences in classroom engagement. This paper provides evidence that developing pre-school and kindergarten curricula that target cognitive control can be a useful strategy for enhancing student engagement behaviour. Developing early interventions that bolster school readiness can then help address social impairments in childhood and adolescence.

  15. Conquer the FPSO (Floating Production Storage and Offloading) separation challenge using CFD (Computational Fluid Dynamics) and laboratory experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kristoffersen, Astrid R.; Hannisdal, Andreas; Amarzguioui, Morad; Wood, Deborah; Tor Andersen [Aibel, Stavanger (Norway)

    2008-07-01

    To have the necessary confidence in a separator's performance, the design must be based on more than simple design rules. A combination of separation testing, computer modelling, and general knowledge of the process is needed. In addition, new technologies can provide enhanced overall performance when it is required. This paper describes how all of these techniques can be combined to get the most out of separator design. We will describe how Aibel has used Computational Fluid Dynamics (CFD), together with laboratory testing, multi-disciplinary knowledge and new technology, in order to revolutionize the way we design separators. This paper will present a study of separation performance for one of our customers. A CFD simulation was performed to predict the internal waves inside a separator located on an FPSO, and how these affect separation phenomena. The performance of the theoretical CFD model was verified by laboratory wave experiments. Separation tests were performed to test new solutions which could increase the performance of the process. Based on the CFD simulations and the separation tests, a modification of the separator was proposed. (author)

  16. The challenge of raising ethical awareness: a case-based aiding system for use by computing and ICT students.

    Science.gov (United States)

    Sherratt, Don; Rogerson, Simon; Ben Fairweather, N

    2005-04-01

    Students, the future Information and Communication Technology (ICT) professionals, are often perceived to have little understanding of the ethical issues associated with the use of ICTs. There is a growing recognition that the moral issues associated with the use of the new technologies should be brought to the attention of students. Furthermore, they should be encouraged to explore and think more deeply about the social and legal consequences of the use of ICTs. This paper describes the development of a tool designed to raise students' awareness of the social impact of ICTs. The tool offers guidance to students undertaking computing and computer-related courses when considering the social, legal and professional implications of the actions of participants in situations of ethical conflict. However, unlike previous work in this field, this tool is not based on an artificial intelligence paradigm. Aspects of the theoretical basis for the design of the tool and the tool's practical development are discussed. Preliminary results from the testing of the tool are also discussed.

  17. Computational methods for detecting copy number variations in cancer genome using next generation sequencing: principles and challenges

    Science.gov (United States)

    Liu, Biao; Morrison, Carl D.; Johnson, Candace S.; Trump, Donald L.; Qin, Maochun; Conroy, Jeffrey C.; Wang, Jianmin; Liu, Song

    2013-01-01

    Accurate detection of somatic copy number variations (CNVs) is an essential part of cancer genome analysis, and plays an important role in oncotarget identifications. Next generation sequencing (NGS) holds the promise to revolutionize somatic CNV detection. In this review, we provide an overview of current analytic tools used for CNV detection in NGS-based cancer studies. We summarize the NGS data types used for CNV detection, decipher the principles for data preprocessing, segmentation, and interpretation, and discuss the challenges in somatic CNV detection. This review aims to provide a guide to the analytic tools used in NGS-based cancer CNV studies, and to discuss the important factors that researchers need to consider when analyzing NGS data for somatic CNV detections. PMID:24240121
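
    The read-depth workflow summarized above (bin the aligned reads, compare tumour coverage against matched-normal coverage, then segment the resulting log2 ratios) can be sketched as follows. This is a minimal illustration of the principle only, not any of the tools the review surveys; the bin counts and the gain/loss thresholds are invented for the example.

```python
import math

def log2_ratios(tumor_counts, normal_counts, pseudo=0.5):
    """Per-bin log2(tumour/normal) read-depth ratios, median-centred."""
    raw = [math.log2((t + pseudo) / (n + pseudo))
           for t, n in zip(tumor_counts, normal_counts)]
    med = sorted(raw)[len(raw) // 2]
    return [r - med for r in raw]

def segment(ratios, gain=0.3, loss=-0.3):
    """Merge consecutive bins sharing a CNV state into (start, end, state) runs."""
    def state(r):
        return "gain" if r >= gain else "loss" if r <= loss else "neutral"
    segments, start = [], 0
    for i in range(1, len(ratios) + 1):
        if i == len(ratios) or state(ratios[i]) != state(ratios[start]):
            segments.append((start, i - 1, state(ratios[start])))
            start = i
    return segments

# Toy example: 7 bins; bins 2-3 amplified, bins 5-6 deleted.
ratios = log2_ratios([100, 105, 210, 220, 95, 40, 42], [100] * 7)
calls = segment(ratios)
```

    Real pipelines add GC-content correction, mappability filtering and statistical segmentation (e.g. circular binary segmentation) on top of this skeleton.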

  18. In silico regenerative medicine: how computational tools allow regulatory and financial challenges to be addressed in a volatile market.

    Science.gov (United States)

    Geris, L; Guyot, Y; Schrooten, J; Papantoniou, I

    2016-04-06

    The cell therapy market is a highly volatile one, due to the use of disruptive technologies, the current economic situation and the small size of the market. In such a market, companies as well as academic research institutes are in need of tools to advance their understanding and, at the same time, reduce their R&D costs, increase product quality and productivity, and reduce the time to market. An additional difficulty is the regulatory path that needs to be followed, which is challenging in the case of cell-based therapeutic products and should rely on the implementation of quality by design (QbD) principles. In silico modelling is a tool that allows the above-mentioned challenges to be addressed in the field of regenerative medicine. This review discusses such in silico models and focuses more specifically on the bioprocess. Three (clusters of) examples related to this subject are discussed. The first example comes from the pharmaceutical engineering field where QbD principles and their implementation through the use of in silico models are both a regulatory and economic necessity. The second example is related to the production of red blood cells. The described in silico model is mainly used to investigate the manufacturing process of the cell-therapeutic product, and pays special attention to the economic viability of the process. Finally, we describe the set-up of a model capturing essential events in the development of a tissue-engineered combination product in the context of bone tissue engineering. For each of the examples, a short introduction to some economic aspects is given, followed by a description of the in silico tool or tools that have been developed to allow the implementation of QbD principles and optimal design.

  19. Jean Claude Risset’s Duet for One Pianist: Challenges of a Real-Time Performance Interaction with a Computer-Controlled Acoustic Piano 16 Years Later

    Directory of Open Access Journals (Sweden)

    Sofia Lourenço

    2014-12-01

    Full Text Available This study discusses the work Duet for One Pianist (1989) by the French composer Jean-Claude Risset (b. 13 March 1938), analyzing the challenges of performing this computer-aided composition for Disklavier, which involves real-time human-computer interaction. I was extremely honored to perform the revised version of the 8 Sketches for One Pianist and Disklavier within a research project of CITAR, together with a new Sketch, Reflections (2012), dedicated to me by Jean-Claude Risset, in a world premiere at the closing ceremony of the Black&White 2012 Film Festival promoted by the Catholic University of Portugal. Several issues in the performance of this work are analysed as a case study from the point of view of the performer, particularly the components of expressive performance in a real-time interaction between performer and computer. These components can serve as analysis criteria for a piano interpretation, here of a pianist and Disklavier interpretation.

  20. Librarian readiness for research partnerships.

    Science.gov (United States)

    Mazure, Emily S; Alpi, Kristine M

    2015-04-01

    This study investigated health sciences librarians' knowledge and skill-based readiness to partner on sponsored research involving human participants. The authors developed and deployed, at two time points, a web-based survey on nine indicators of research activities with response choices reflecting the transtheoretical model of stages of behavior change. Librarians with research experience or membership in the Medical Library Association Research Section reported higher levels of having completed indicators. Our results suggest that creating awareness in precontemplation responders could encourage skill development. Mentoring and continuing education could support librarians who are contemplating or preparing to perform indicator activities.

  1. The Staff Council, ready for the challenges of 2015

    CERN Multimedia

    Staff Association

    2015-01-01

    In order to fulfil its mission of representing CERN staff with the Management and the Member States in an optimal way, the Staff Council relies on the work of a number of commissions, amongst them employment conditions, pensions, legal matters, social security, health and safety and InFormAction (training, information and action). All of these commissions have as a goal to try and improve the employment conditions of CERN members of personnel. This is the case in particular in the context of the five-yearly review process, ending in December 2015 (5YR 2015). Let us recall that the objective of a five-yearly review is to ensure that the financial and social conditions offered by the Organisation favour recruitment from all Member States, and to retain and motivate staff necessary for the fulfilment of its mission. The convenor of each Commission reports regularly to the Staff Council and Executive Committee on the work performed in their group. The commissions are open to all members of the Staff Associati...

  2. Design and preliminary evaluation of the FINGER rehabilitation robot: controlling challenge and quantifying finger individuation during musical computer game play.

    Science.gov (United States)

    Taheri, Hossein; Rowe, Justin B; Gardner, David; Chan, Vicki; Gray, Kyle; Bower, Curtis; Reinkensmeyer, David J; Wolbrecht, Eric T

    2014-02-04

    This paper describes the design and preliminary testing of FINGER (Finger Individuating Grasp Exercise Robot), a device for assisting in finger rehabilitation after neurologic injury. We developed FINGER to assist stroke patients in moving their fingers individually in a naturalistic curling motion while playing a game similar to Guitar Hero. The goal was to make FINGER capable of assisting with motions where precise timing is important. FINGER consists of a pair of stacked single degree-of-freedom 8-bar mechanisms, one for the index and one for the middle finger. Each 8-bar mechanism was designed to control the angle and position of the proximal phalanx and the position of the middle phalanx. Target positions for the mechanism optimization were determined from trajectory data collected from 7 healthy subjects using color-based motion capture. The resulting robotic device was built to accommodate multiple finger sizes and finger-to-finger widths. For initial evaluation, we asked individuals with a stroke (n = 16) and without impairment (n = 4) to play a game similar to Guitar Hero while connected to FINGER. Precision design, low friction bearings, and separate high speed linear actuators allowed FINGER to individually actuate the fingers with a high bandwidth of control (-3 dB at approximately 8 Hz). During the tests, we were able to modulate the subject's success rate at the game by automatically adjusting the controller gains of FINGER. We also used FINGER to measure subjects' effort and finger individuation while playing the game. Test results demonstrate the ability of FINGER to motivate subjects with an engaging game environment that challenges individuated control of the fingers, automatically control assistance levels, and quantify finger individuation after stroke.

  3. Design and preliminary evaluation of the FINGER rehabilitation robot: controlling challenge and quantifying finger individuation during musical computer game play

    Science.gov (United States)

    2014-01-01

    Background This paper describes the design and preliminary testing of FINGER (Finger Individuating Grasp Exercise Robot), a device for assisting in finger rehabilitation after neurologic injury. We developed FINGER to assist stroke patients in moving their fingers individually in a naturalistic curling motion while playing a game similar to Guitar Hero®. The goal was to make FINGER capable of assisting with motions where precise timing is important. Methods FINGER consists of a pair of stacked single degree-of-freedom 8-bar mechanisms, one for the index and one for the middle finger. Each 8-bar mechanism was designed to control the angle and position of the proximal phalanx and the position of the middle phalanx. Target positions for the mechanism optimization were determined from trajectory data collected from 7 healthy subjects using color-based motion capture. The resulting robotic device was built to accommodate multiple finger sizes and finger-to-finger widths. For initial evaluation, we asked individuals with a stroke (n = 16) and without impairment (n = 4) to play a game similar to Guitar Hero® while connected to FINGER. Results Precision design, low friction bearings, and separate high speed linear actuators allowed FINGER to individually actuate the fingers with a high bandwidth of control (−3 dB at approximately 8 Hz). During the tests, we were able to modulate the subject’s success rate at the game by automatically adjusting the controller gains of FINGER. We also used FINGER to measure subjects’ effort and finger individuation while playing the game. Conclusions Test results demonstrate the ability of FINGER to motivate subjects with an engaging game environment that challenges individuated control of the fingers, automatically control assistance levels, and quantify finger individuation after stroke. PMID:24495432

  4. CBO Testimony: Trends in Selected Indicators of Military Readiness

    National Research Council Canada - National Science Library

    Singer, Neil M

    1994-01-01

    This testimony addresses two aspects of readiness: what is the state of current readiness based on available indicators, and what are the implications for future readiness of levels of funding for some important categories of defense resources...

  5. Psychological readiness of students for professional life

    Directory of Open Access Journals (Sweden)

    OLHA UHRYN

    2013-09-01

    Full Text Available The article is devoted to the psychological readiness of student’s personality for professional life. The author considers components of readiness that promote self-development and self-realisation in the professional sphere, and presents the results of an empirical study of willingness to work in a professional field.

  6. From Readiness to Action: How Motivation Works

    Directory of Open Access Journals (Sweden)

    Kruglanski Arie W.

    2014-09-01

    Full Text Available We present a new theoretical construct labeled motivational readiness. It is defined as the inclination, whether or not ultimately implemented, to satisfy a desire. A general model of readiness is described which builds on the work of prior theories, including animal learning models and personality approaches, and which aims to integrate a variety of research findings across different domains of motivational research. Components of this model include the Want state (that is, an individual’s currently active desire) and the Expectancy of being able to satisfy that Want. We maintain that the Want concept is the critical ingredient in motivational readiness: without it, readiness cannot exist. In contrast, some motivational readiness can exist without Expectancy. We also discuss the role of incentive in motivational readiness. Incentive is presently conceived of in terms of a Match between a Want and a Perceived Situational Affordance. Whereas in classic models incentive was portrayed as a first order determinant of motivational readiness, here we describe it as a second order factor which affects readiness by influencing Want, Expectancy, or both. The new model’s relation to its theoretical predecessors, and its implications for future research, also are discussed.

  7. Understanding Early Educators' Readiness to Change

    Science.gov (United States)

    Peterson, Shira M.

    2012-01-01

    Researchers in the fields of humanistic psychology, counseling, organizational change, and implementation science have been asking a question that is at the heart of today's early care and education quality improvement efforts: When it comes to changing one's behavior, what makes a person ready to change? Although the concept of readiness to…

  8. Service Availability and Readiness Assessment of Maternal ...

    African Journals Online (AJOL)

    AJRH Managing Editor

    The Service Availability and Readiness Assessment (SARA) survey was adapted and used to generate information on service availability and the readiness of maternal, newborn and child health facilities to provide basic health care interventions for obstetric care, neonatal and child health in Madagascar. The survey ...

  9. Computational Complexity

    Directory of Open Access Journals (Sweden)

    J. A. Tenreiro Machado

    2017-02-01

    Full Text Available Complex systems (CS) involve many elements that interact at different scales in time and space. The challenges in modeling CS led to the development of novel computational tools with applications in a wide range of scientific areas. The computational problems posed by CS exhibit intrinsic difficulties that are a major concern in Computational Complexity Theory. [...

  10. Measuring the strategic readiness of intangible assets.

    Science.gov (United States)

    Kaplan, Robert S; Norton, David P

    2004-02-01

    Measuring the value of intangible assets such as company culture, knowledge management systems, and employees' skills is the holy grail of accounting. Executives know that these intangibles, being hard to imitate, are powerful sources of sustainable competitive advantage. If managers could measure them, they could manage the company's competitive position more easily and accurately. In one sense, the challenge is impossible. Intangible assets are unlike financial and physical resources in that their value depends on how well they serve the organizations that own them. But while this prevents an independent valuation of intangible assets, it also points to an altogether different approach for assessing their worth. In this article, the creators of the Balanced Scorecard draw on its tools and framework--in particular, a tool called the strategy map--to present a step-by-step way to determine "strategic readiness," which refers to the alignment of an organization's human, information, and organization capital with its strategy. In the method the authors describe, the firm identifies the processes most critical to creating and delivering its value proposition and determines the human, information, and organization capital the processes require. Some managers shy away from measuring intangible assets because they seem so subjective. But by using the systematic approaches set out in this article, companies can now measure what they want, rather than wanting only what they can currently measure.

  11. Implementing a Zero Energy Ready Home Multifamily Project

    Energy Technology Data Exchange (ETDEWEB)

    Springer, David [Alliance for Residential Building Innovation, Davis, CA (United States); German, Alea [Alliance for Residential Building Innovation, Davis, CA (United States)

    2015-08-01

    An objective of this project was to gain a highly visible foothold for residential buildings built to the U.S. Department of Energy's Zero Energy Ready Home (ZERH) specification that can be used to encourage participation by other California builders. This report briefly describes two single family homes that were ZERH-certified, and focuses on the experience of working with developer Mutual Housing on a 62-unit multi-family community at the Spring Lake subdivision in Woodland, CA. The Spring Lake project is expected to be the first ZERH-certified multi-family project nationwide. This report discusses challenges encountered, lessons learned, and how obstacles were overcome.

  12. Readiness of communities to engage with childhood obesity prevention initiatives in disadvantaged areas of Victoria, Australia.

    Science.gov (United States)

    Cyril, Sheila; Polonsky, Michael; Green, Julie; Agho, Kingsley; Renzaho, Andre

    2017-07-01

    Objective Disadvantaged communities bear a disproportionate burden of childhood obesity and show low participation in childhood obesity prevention initiatives. This study aims to examine the level of readiness of disadvantaged communities to engage with childhood obesity prevention initiatives. Methods Using the community readiness model, 95 semi-structured interviews were conducted among communities in four disadvantaged areas of Victoria, Australia. Community readiness analysis and paired t-tests were performed to assess the readiness levels of disadvantaged communities to engage with childhood obesity prevention initiatives. Results The results showed that disadvantaged communities demonstrated low levels of readiness (readiness score=4/9, 44%) to engage with the existing childhood obesity prevention initiatives, lacked knowledge of childhood obesity and its prevention, and reported facing challenges in initiating and sustaining participation in obesity prevention initiatives. Conclusion This study highlights the need to improve community readiness by addressing low obesity-related literacy levels among disadvantaged communities and by facilitating the capacity-building of bicultural workers to deliver obesity prevention messages to these communities. Integrating these needs into existing Australian health policy and practice is of paramount importance for reducing obesity-related disparities currently prevailing in Australia. What is known about the topic? Childhood obesity prevalence is plateauing in developed countries including Australia; however, obesity-related inequalities continue to exist in Australia especially among communities living in disadvantaged areas, which experience poor engagement in childhood obesity prevention initiatives. Studies in the USA have found that assessing disadvantaged communities' readiness to participate in health programs is a critical initial step in reducing the disproportionate obesity burden among these communities

  13. Are they ready? Organizational readiness for change among clinical teaching teams

    Directory of Open Access Journals (Sweden)

    Bank L

    2017-12-01

    Full Text Available Lindsay Bank,1,2 Mariëlle Jippes,3 Jimmie Leppink,4 Albert JJA Scherpbier,4 Corry den Rooyen,5 Scheltus J van Luijk,6 Fedde Scheele1,2,7 1Department of Healthcare Education, OLVG Hospital, 2Faculty of Earth and Life Sciences, Athena Institute for Transdisciplinary Research, VU University, Amsterdam, 3Department of Plastic Surgery, Erasmus Medical Centre, Rotterdam, 4Faculty of Health, Medicine and Life Sciences, School of Health Professions Education, Maastricht University, Maastricht, 5Movation BV, Maarssen, 6Department of Healthcare Education, Maastricht University Medical Center+, Maastricht, 7School of Medical Sciences, Institute for Education and Training, VU University Medical Center, Amsterdam, the Netherlands Introduction: Curriculum change and innovation are inevitable parts of progress in postgraduate medical education (PGME). Although implementing change is known to be challenging, change management principles are rarely looked at for support. Change experts contend that organizational readiness for change (ORC) is a critical precursor for the successful implementation of change initiatives. Therefore, this study explores whether assessing ORC in clinical teaching teams could help to understand how curriculum change takes place in PGME. Methods: Clinical teaching teams in hospitals in the Netherlands were requested to complete the Specialty Training’s Organizational Readiness for curriculum Change, a questionnaire to measure ORC in clinical teaching teams. In addition, change-related behavior was measured by using the “behavioral support-for-change” measure. A two-way analysis of variance was performed for all response variables of interest. Results: In total, 836 clinical teaching team members were included in this study: 288 (34.4%) trainees, 307 (36.7%) clinical staff members, and 241 (28.8%) program directors. Overall, items regarding whether the program director has the authority to lead scored higher compared with the other

  14. Are they ready? Organizational readiness for change among clinical teaching teams.

    Science.gov (United States)

    Bank, Lindsay; Jippes, Mariëlle; Leppink, Jimmie; Scherpbier, Albert Jja; den Rooyen, Corry; van Luijk, Scheltus J; Scheele, Fedde

    2017-01-01

    Curriculum change and innovation are inevitable parts of progress in postgraduate medical education (PGME). Although implementing change is known to be challenging, change management principles are rarely looked at for support. Change experts contend that organizational readiness for change (ORC) is a critical precursor for the successful implementation of change initiatives. Therefore, this study explores whether assessing ORC in clinical teaching teams could help to understand how curriculum change takes place in PGME. Clinical teaching teams in hospitals in the Netherlands were requested to complete the Specialty Training's Organizational Readiness for curriculum Change, a questionnaire to measure ORC in clinical teaching teams. In addition, change-related behavior was measured by using the "behavioral support-for-change" measure. A two-way analysis of variance was performed for all response variables of interest. In total, 836 clinical teaching team members were included in this study: 288 (34.4%) trainees, 307 (36.7%) clinical staff members, and 241 (28.8%) program directors. Overall, items regarding whether the program director has the authority to lead scored higher compared with the other items. At the other end, the subscales "management support and leadership," "project resources," and "implementation plan" had the lowest scores in all groups. The study brought to light that program directors are clearly in the lead when it comes to the implementation of educational innovation. Clinical teaching teams tend to work together as a team, sharing responsibilities in the implementation process. However, the results also reinforce the need for change management support in change processes in PGME.

  15. Achieving Business Excellence by Optimizing Corporate Forensic Readiness

    Directory of Open Access Journals (Sweden)

    Gojko Grubor

    2017-02-01

    Full Text Available In order to improve their business excellence, all organizations, regardless of their size (small, medium or large), should manage their risk of fraud. Fraud, in today’s world, is often committed by using computers and can only be revealed by a digital forensic investigator. Not even small or medium-sized companies are secure from fraud. In the light of recent financial scandals that literally demolished not just the economies of specific countries but the entire world economy, we propose in this paper an optimal model of corporative computer incident digital forensic investigation (CCIDFI), using an adopted mathematical model of the greed MCDM (multi-criteria decision-making) method and the Expert Choice software tool for multi-criteria optimization of CCIDFI readiness. The proposed model can, first of all, help managers of small and medium-sized companies to justify their decisions to employ digital forensic investigators and include them in their information security teams in order to choose the optimal CCIDFI model and improve forensic readiness in the computer incident management process; this will minimize the company’s potential future losses and improve its business quality.
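
    Expert Choice implements AHP-style pairwise comparison of criteria, a common way to operationalize multi-criteria decision making. The idea can be sketched with the row geometric-mean approximation of the AHP priority vector; the comparison matrix and the three criteria (cost, investigative capability, legal admissibility) below are hypothetical illustrations, not taken from the paper.

```python
import math

def ahp_weights(pairwise):
    """Approximate the AHP priority vector via normalized row geometric means."""
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical criteria for ranking forensic-readiness options:
# cost, investigative capability, legal admissibility.
# matrix[i][j] = how much more important criterion i is than criterion j.
matrix = [
    [1,   1/3, 1/5],
    [3,   1,   1/2],
    [5,   2,   1],
]
weights = ahp_weights(matrix)  # one weight per criterion, summing to 1
```

    In a full AHP workflow one would also check the consistency ratio of the matrix before trusting the weights; Expert Choice automates that step.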

  16. Readiness for hospital discharge: A concept analysis.

    Science.gov (United States)

    Galvin, Eileen Catherine; Wills, Teresa; Coffey, Alice

    2017-11-01

    To report an analysis of the concept of 'readiness for hospital discharge'. No uniform operational definition of 'readiness for hospital discharge' exists in the literature; therefore, a concept analysis is required to clarify the concept and identify an up-to-date understanding of readiness for hospital discharge. Clarifying the concept will identify all uses of the concept and provide conceptual clarity, an operational definition and direction for further research. Literature review and concept analysis. A review of the literature was conducted in 2016. Databases searched were: Academic Search Complete, CINAHL Plus with Full Text, PsycARTICLES, Psychology and Behavioural Sciences Collection, PsycINFO, Social Sciences Full Text (H.W. Wilson) and SocINDEX with Full Text. No date limits were applied. Identification of the attributes, antecedents and consequences of readiness for hospital discharge led to an operational definition of the concept. The following attributes belonging to 'readiness for hospital discharge' were extracted from the literature: physical stability, adequate support, psychological ability, and adequate information and knowledge. This analysis contributes to the advancement of knowledge in the area of hospital discharge by proposing an operational definition of readiness for hospital discharge, derived from the literature. A better understanding of the phenomenon will assist healthcare professionals to recognize, measure and implement interventions where necessary, to ensure patients are ready for hospital discharge, and will assist in the advancement of knowledge for all professionals involved in patient discharge from hospital. © 2017 John Wiley & Sons Ltd.

  17. Using the Rasch Model to Determine Equivalence of Forms In the Trilingual Lollipop Readiness Test

    Science.gov (United States)

    Lang, W. Steve; Chew, Alex L.; Crownover, Carol; Wilkerson, Judy R.

    2007-01-01

    Determining the cross-cultural equivalence of multilingual tests is a challenge that is more complex than simple horizontal equating of test forms. This study examines the functioning of a trilingual test of preschool readiness to determine the equivalence of its forms. Different forms of the test have previously been examined using classical statistical…
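For reference, the dichotomous Rasch model underlying this kind of equating study gives the probability of a correct response as 1 / (1 + exp(-(θ - b))), where θ is person ability and b is item difficulty in logits. The sketch below, with invented difficulty values, shows how item difficulties calibrated separately on two language forms might be screened for drift:

```python
import math

def rasch_p(theta, b):
    """P(correct) for ability theta and item difficulty b (dichotomous Rasch)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical item difficulties (logits) calibrated on two language forms.
b_english = [-1.2, 0.0, 0.8]
b_spanish = [-1.1, 0.3, 0.9]

# Items whose difficulty shifts by more than 0.5 logits between forms would
# be flagged for differential item functioning (DIF) review; the threshold
# here is an illustrative convention, not this study's criterion.
flagged = [i for i, (e, s) in enumerate(zip(b_english, b_spanish))
           if abs(e - s) > 0.5]
print(rasch_p(0.0, 0.0), flagged)  # 0.5 []
```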

  18. Uncovering University Students' Readiness through Their Assessment of Workplace Communication Skills

    Science.gov (United States)

    Magogwe, Joel M.; Nkosana, Leonard B. M.; Ntereke, Beauty B.

    Employers in today's competitive and challenging global world prefer employees who possess "soft skills" in addition to "hard skills" because they make an impact and create a good impression in the workplace. This study examined employment readiness of the University of Botswana (UB) students who took the Advanced Communication…

  19. A theory of organizational readiness for change

    Directory of Open Access Journals (Sweden)

    Weiner Bryan J

    2009-10-01

    Full Text Available Abstract Background Change management experts have emphasized the importance of establishing organizational readiness for change and recommended various strategies for creating it. Although the advice seems reasonable, the scientific basis for it is limited. Unlike individual readiness for change, organizational readiness for change has not been subject to extensive theoretical development or empirical study. In this article, I conceptually define organizational readiness for change and develop a theory of its determinants and outcomes. I focus on the organizational level of analysis because many promising approaches to improving healthcare delivery entail collective behavior change in the form of systems redesign--that is, multiple, simultaneous changes in staffing, work flow, decision making, communication, and reward systems. Discussion Organizational readiness for change is a multi-level, multi-faceted construct. As an organization-level construct, readiness for change refers to organizational members' shared resolve to implement a change (change commitment and shared belief in their collective capability to do so (change efficacy. Organizational readiness for change varies as a function of how much organizational members value the change and how favorably they appraise three key determinants of implementation capability: task demands, resource availability, and situational factors. When organizational readiness for change is high, organizational members are more likely to initiate change, exert greater effort, exhibit greater persistence, and display more cooperative behavior. The result is more effective implementation. Summary The theory described in this article treats organizational readiness as a shared psychological state in which organizational members feel committed to implementing an organizational change and confident in their collective abilities to do so. This way of thinking about organizational readiness is best suited for

  20. A theory of organizational readiness for change.

    Science.gov (United States)

    Weiner, Bryan J

    2009-10-19

    Change management experts have emphasized the importance of establishing organizational readiness for change and recommended various strategies for creating it. Although the advice seems reasonable, the scientific basis for it is limited. Unlike individual readiness for change, organizational readiness for change has not been subject to extensive theoretical development or empirical study. In this article, I conceptually define organizational readiness for change and develop a theory of its determinants and outcomes. I focus on the organizational level of analysis because many promising approaches to improving healthcare delivery entail collective behavior change in the form of systems redesign--that is, multiple, simultaneous changes in staffing, work flow, decision making, communication, and reward systems. Organizational readiness for change is a multi-level, multi-faceted construct. As an organization-level construct, readiness for change refers to organizational members' shared resolve to implement a change (change commitment) and shared belief in their collective capability to do so (change efficacy). Organizational readiness for change varies as a function of how much organizational members value the change and how favorably they appraise three key determinants of implementation capability: task demands, resource availability, and situational factors. When organizational readiness for change is high, organizational members are more likely to initiate change, exert greater effort, exhibit greater persistence, and display more cooperative behavior. The result is more effective implementation. The theory described in this article treats organizational readiness as a shared psychological state in which organizational members feel committed to implementing an organizational change and confident in their collective abilities to do so. This way of thinking about organizational readiness is best suited for examining organizational changes where collective behavior

  1. En sus marcas--Listos--A leer! Para los cuidadores de niños pequeños: Actividades de lenguaje para la primera infancia y niñez entre el nacimiento y los 5 años. El reto: A leer, América! (Ready--Set--Read! For Caregivers: Early Childhood Language Activities for Children from Birth through Age Five. America Reads Challenge).

    Science.gov (United States)

    Department of Education, Washington, DC.

    This Ready--Set--Read Kit includes an activity guide for caregivers, a 1997-98 early childhood activity calendar, and an early childhood growth chart. The activity guide presents activities and ideas that caregivers (family child care providers and the teachers, staff, and volunteers in child development programs) can use to help young children…

  2. En sus marcas--Listos--A leer! Para las familias: Actividades de lenguaje para la primera infancia y niñez entre el nacimiento y los 5 años. El reto: A leer, América! (Ready--Set--Read! For Families: Early Childhood Language Activities for Children from Birth through Age Five. America Reads Challenge).

    Science.gov (United States)

    Department of Education, Washington, DC.

    This Ready--Set--Read Kit includes an activity guide for families, a 1997-98 early childhood activity calendar, and an early childhood growth wallchart. The activity guide presents activities and ideas that families (adults who have nurturing relationships with a child--a mother, father, grandparent, other relative, or close friend) can use to…

  3. Critical Components of Online Learning Readiness and Their Relationships with Learner Achievement

    Science.gov (United States)

    Cigdem, Harun; Ozturk, Mustafa

    2016-01-01

    This study aimed to examine the relationship between certain factors of online learning readiness and learners' end-of-course achievements. The study was conducted at a two-year post-secondary Turkish military school within the scope of the course titled Computer Literacy, which was designed and implemented in a blended way. The data were…

  4. Measuring E-Learning Readiness among EFL Teachers in Intermediate Public Schools in Saudi Arabia

    Science.gov (United States)

    Al-Furaydi, Ahmed Ajab

    2013-01-01

    This study determines EFL teachers' readiness level for e-learning in several respects, such as attitude toward e-learning and computer literacy. It also attempts to investigate the main barriers that EFL teachers have to overcome while incorporating e-learning into their teaching. The theory upon which the study was based was the technology acceptance…

  5. Testing of a web-based program to facilitate parental smoking cessation readiness in primary care.

    Science.gov (United States)

    Gillaspy, Stephen R; Leffingwell, Thad; Mignogna, Melissa; Mignogna, Joseph; Bright, Brianna; Fedele, David

    2013-01-01

    To test the efficacy of a self-administered web-based computer intervention designed to facilitate readiness to alter tobacco use or secondhand smoke exposure among parents of children visiting a pediatric primary care clinic. The computer program included an assessment of the participant's smoking behavior and personalized feedback. Self-identified smoking parents of children presenting to a general pediatric outpatient clinic completed measures of motivation and readiness to cease smoking. Participants were then randomly assigned to complete the computer program or receive treatment as usual. One month after completing the intervention, participants were contacted either in person or by phone to complete measures of motivational readiness to engage in smoking cessation. Compared to treatment-as-usual parents, intervention parents reported increased readiness to change their smoking at follow-up. This effect appeared to strengthen, favoring the intervention condition, when analyses included only those participants who identified at baseline that they were contemplating quitting smoking in the next 6 months. Results of this small study supported the integration of a brief computerized tobacco intervention in the pediatric primary care setting and provided some evidence for efficacy. Brief, self-administered, computer-based interventions such as this can be disseminated and deployed at relatively little cost or burden to existing practices, which makes small effects more meaningful and justifiable. Future studies should evaluate this intervention with larger samples and with expanded measures of parent smoking behavior.

  6. Ready-to-use foods for management of moderate acute malnutrition: Considerations for scaling up production and use in programs

    Science.gov (United States)

    Ready-to-use foods are one of the available strategies for the treatment of moderate acute malnutrition (MAM), but challenges remain in the use of these products in programs at scale. This paper focuses on two challenges: the need for cheaper formulations using locally available ingredients that are...

  7. About the Climate Ready Estuaries Program

    Science.gov (United States)

    The Climate Ready Estuaries program is a partnership between EPA and the National Estuary Programs to address climate change in coastal areas. It has helped coastal communities prepare for climate change since 2008.

  8. Social Interpretations of Readiness for Kindergarten.

    Science.gov (United States)

    Graue, M. Elizabeth

    1992-01-01

    Data from this ethnographic study of kindergartens in three communities suggest that teachers, parents, and the school as an institution interacted to develop a social interpretation of school readiness. This interpretation framed children's kindergarten experience in each community. (BC)

  9. From Readiness to Action: How Motivation Works

    National Research Council Canada - National Science Library

    Arie W. Kruglanski; Marina Chernikova; Noa Schori-Eyal

    2014-01-01

    .... A general model of readiness is described which builds on the work of prior theories, including animal learning models and personality approaches, and which aims to integrate a variety of research...

  10. Ready to perform; Zur Leistung bereit

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, Joerg-Rainer

    2012-02-15

    The wind power industry is ready to take on SDL (Systemdienstleistungen, i.e. grid ancillary services). This is an important precondition for stabilizing power supply grids in future power supply scenarios. However, many grid providers do not make use of the available services.

  11. Measuring Corporate Readiness To Implement Distance Education.

    Science.gov (United States)

    Slick, Joan Friton

    2001-01-01

    Examines distance education issues for corporations and offers a measurement tool to assess corporate readiness for distance education. Topics include costs; competitive advantages; customized training; effectiveness; organizational support; instructional design; managerial support; and learner attitudes. (Author/LRW)

  12. School readiness and later achievement.

    Science.gov (United States)

    Duncan, Greg J; Dowsett, Chantelle J; Claessens, Amy; Magnuson, Katherine; Huston, Aletha C; Klebanov, Pamela; Pagani, Linda S; Feinstein, Leon; Engel, Mimi; Brooks-Gunn, Jeanne; Sexton, Holly; Duckworth, Kathryn; Japel, Crista

    2007-11-01

    Using 6 longitudinal data sets, the authors estimate links between three key elements of school readiness--school-entry academic, attention, and socioemotional skills--and later school reading and math achievement. In an effort to isolate the effects of these school-entry skills, the authors ensured that most of their regression models control for cognitive, attention, and socioemotional skills measured prior to school entry, as well as a host of family background measures. Across all 6 studies, the strongest predictors of later achievement are school-entry math, reading, and attention skills. A meta-analysis of the results shows that early math skills have the greatest predictive power, followed by reading and then attention skills. By contrast, measures of socioemotional behaviors, including internalizing and externalizing problems and social skills, were generally insignificant predictors of later academic performance, even among children with relatively high levels of problem behavior. Patterns of association were similar for boys and girls and for children from high and low socioeconomic backgrounds. (c) 2007 APA.

  13. Making Technology Ready: Integrated Systems Health Management

    Science.gov (United States)

    Malin, Jane T.; Oliver, Patrick J.

    2007-01-01

    This paper identifies work needed by developers to make integrated system health management (ISHM) technology ready and by programs to make mission infrastructure ready for this technology. This paper examines perceptions of ISHM technologies and experience in legacy programs. Study methods included literature review and interviews with representatives of stakeholder groups. Recommendations address 1) development of ISHM technology, 2) development of ISHM engineering processes and methods, and 3) program organization and infrastructure for ISHM technology evolution, infusion and migration.

  14. Cognitive Approaches for Digital Forensic Readiness Planning

    OpenAIRE

    Pooe, Antonio; Labuschagne, Les

    2013-01-01

    Part 2: FORENSIC MODELS; International audience; This paper focuses on the use of cognitive approaches for digital forensic readiness planning. Research has revealed that a well-thought-out and legally contextualized digital forensic readiness strategy can provide organizations with an increased ability to respond to security incidents while maintaining the integrity of the evidence gathered and keeping investigative costs low. This paper contributes to the body of knowledge in digital forens...

  15. Solar Training Network and Solar Ready Vets

    Energy Technology Data Exchange (ETDEWEB)

    Dalstrom, Tenley Ann

    2016-09-14

    In 2016, the White House announced that the Solar Ready Vets program, funded under DOE's SunShot initiative, would be administered by The Solar Foundation to connect transitioning military personnel to solar training and employment as they separate from service. This presentation is geared toward informing and recruiting employer partners for the Solar Ready Vets program and the Solar Training Network. It describes the programs and the benefits to employers that choose to connect with them.

  16. Increasing Fleet Readiness Through Improved Distance Support

    Science.gov (United States)

    2013-03-01

    MRDB: Material Readiness Database; MRDB-NG: Material Readiness Database, Next Generation; MS: Microsoft®; MSSE: Master of Science in Systems Engineering; MTB(EMCE): Mean Time Between Equipment Mission Critical Events; MTB(EME): Mean Time Between Equipment Malfunction Events; MTBF: Mean Time Between Failures; MTTR: Mean Time To Repair; MTTR data from MRDB; R.1.1.2.32: Display Mean Time Between Equipment Mission Critical Events (MTB(EMCE)) and Mean Time Between Equipment

  17. Methods and computing challenges of the realistic simulation of physics events in the presence of pile-up in the ATLAS experiment

    CERN Document Server

    Chapman, J D; The ATLAS collaboration

    2014-01-01

    We are now in a regime where we observe substantial multiple proton-proton collisions within each filled LHC bunch-crossing and also multiple filled bunch-crossings within the sensitive time window of the ATLAS detector. This will increase with increased luminosity in the near future. Including these effects in Monte Carlo simulation poses significant computing challenges. We present a description of the standard approach used by the ATLAS experiment and details of how we manage the conflicting demands of keeping the background dataset size as small as possible while minimizing the effect of background event re-use. We also present details of the methods used to minimize the memory footprint of these digitization jobs, to keep them within the grid limit, despite combining the information from thousands of simulated events at once. We also describe an alternative approach, known as Overlay. Here, the actual detector conditions are sampled from raw data using a special zero-bias trigger, and the simulated physi...
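The background re-use tradeoff described here can be illustrated with a toy event mixer that cycles through a finite minimum-bias pool, only repeating events once the whole pool has been consumed. This is a simplified sketch of the general idea, not the ATLAS digitization or Overlay code:

```python
import random

def overlay_pileup(signal_events, background_pool, mu, seed=0):
    """Attach `mu` background events to each signal event, walking a shuffled
    index of the finite pool so that no background event is re-used before
    the whole pool has been consumed once (minimizing re-use artifacts)."""
    rng = random.Random(seed)
    order = list(range(len(background_pool)))
    rng.shuffle(order)
    pos = 0
    mixed = []
    for sig in signal_events:
        picks = []
        for _ in range(mu):
            if pos == len(order):      # pool exhausted: reshuffle, accept re-use
                rng.shuffle(order)
                pos = 0
            picks.append(background_pool[order[pos]])
            pos += 1
        mixed.append((sig, picks))
    return mixed

# Two toy signal events, a pool of five minimum-bias events, mu = 2.
mixed = overlay_pileup(["sig1", "sig2"], ["bkg%d" % i for i in range(5)], mu=2)
```

In production the hard part is exactly what the abstract discusses: keeping the pool small enough to fit storage and memory budgets while keeping re-use rare enough not to bias the simulated backgrounds.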

  18. Readiness and Achievement Motivation: An Investigation of the Validity of the Readiness Scales in Hersey and Blanchard's Situational Leadership.

    Science.gov (United States)

    Wang, Xiaoping; Knight, W. Hal

    The construct validity of two measures of employee job readiness was investigated by examining the relationships between job readiness and achievement motivation, and between readiness and the variables of education and work experience. The readiness, or maturity level, of employees is an important concept in the situational leadership model,…

  19. The development of an online decision support tool for organizational readiness for change.

    Science.gov (United States)

    Khan, Sobia; Timmings, Caitlyn; Moore, Julia E; Marquez, Christine; Pyka, Kasha; Gheihman, Galina; Straus, Sharon E

    2014-05-10

    Much importance has been placed on assessing readiness for change as one of the earliest steps of implementation, but measuring it can be a complex and daunting task. Organizations and individuals struggle with how to reliably and accurately measure readiness for change. Several measures have been developed to help organizations assess readiness, but these are often underused due to the difficulty of selecting the right measure. In response to this challenge, we will develop and test a prototype of a decision support tool that is designed to guide individuals interested in implementation in the selection of an appropriate readiness assessment measure for their setting. A multi-phase approach will be used to develop the decision support tool. First, we will identify key measures for assessing organizational readiness for change from a recently completed systematic review. Included measures will be those developed for healthcare settings (e.g., acute care, public health, mental health) and that have been deemed valid and reliable. Second, study investigators and field experts will engage in a mapping exercise to categorize individual items of included measures according to key readiness constructs from an existing framework. Third, a stakeholder panel will be recruited and consulted to determine the feasibility and relevance of the selected measures using a modified Delphi process. Fourth, findings from the mapping exercise and stakeholder consultation will inform the development of a decision support tool that will guide users in appropriately selecting change readiness measures. Fifth, the tool will undergo usability testing. Our proposed decision support tool will address current challenges in the field of organizational change readiness by aiding individuals in selecting a valid and reliable assessment measure that is relevant to user needs and practice settings. We anticipate that implementers and researchers who use our tool will be more likely to conduct

  20. Readiness of organizations for change, motivation and conflict-handling intentions: senior nursing students' perceptions.

    Science.gov (United States)

    Mrayyan, Majd T; Modallal, Rola; Awamreh, Khitam; Atoum, Maysoun; Abdullah, Muna; Suliman, Samah

    2008-03-01

    This study examined the perceptions of 62 senior nursing students of the readiness of Jordanian organizations for change, the students' motivators and their conflict-handling intentions. Such concepts should be taught in Schools of Nursing in order to prepare the students as nurses in the near future. The course "Nursing Leadership and Management" was found to have a positive influence on students' understanding of the studied concepts. This descriptive study was conducted in seven hospitals. Grossman and Valiga's (2000) [Grossman, S., Valiga, T.M., 2000. The New Leadership Challenge: Creating the Future of Nursing. F.A. Davis, Philadelphia, pp. 147-148.] instrument was used to measure the readiness of organizations for change. As they progressed in the course, the students' perceptions of organizational readiness to change increased; the students "somewhat" perceived that the Jordanian organizations were ready to change. The students were asked what motivates them and about their conflict-handling techniques. Senior nursing students reported that private hospitals were better than governmental hospitals in their readiness for change. In general, male students perceived the readiness of organizations for change more positively than female students. The students were mainly motivated by "achievement" and used "collaboration" as a primary conflict-handling technique. Further studies are needed to explore in depth the concept of the readiness of organizations for change. Achievement is a strong motivator that should be encouraged among students. Conflict-handling techniques in general, and collaboration in particular, should be taught to nursing students as these techniques will influence their future professional lives.

  1. Computational astrophysics

    Science.gov (United States)

    Miller, Richard H.

    1987-01-01

    Astronomy is an area of applied physics in which unusually beautiful objects challenge the imagination to explain observed phenomena in terms of known laws of physics. It is a field that has stimulated the development of physical laws and of mathematical and computational methods. Current computational applications are discussed in terms of stellar and galactic evolution, galactic dynamics, and particle motions.

  2. Determining transition readiness in congenital heart disease: Assessing the utility of the Transition Readiness Questionnaire

    Science.gov (United States)

    The Transition Readiness Assessment Questionnaire (TRAQ) is a tool commonly used to assess transition readiness in adolescents with chronic diseases. It was previously validated in youth with special health care needs (YSHCN), but no patients with congenital heart disease (CHD) were included in the ...

  3. Readiness: Some Travel Faster than Others. A Unit on Reading Readiness.

    Science.gov (United States)

    Eitmann, Twila

    An extended readiness unit plan for beginning first graders who demonstrated a limited degree of readiness in kindergarten is provided. The unit theme is the five senses, and the following sequence of presentation is used: sight, hearing, smell, touch, and taste. Visual-perception activities receive the most emphasis. Included in procedures for…

  4. Is School Community Readiness Related to Physical Activity before and after the Ready for Recess Intervention?

    Science.gov (United States)

    Ehlers, Diane K.; Huberty, Jennifer L.; Beseler, Cheryl L.

    2013-01-01

    The purpose of this study was to determine: (i) the effect of schools' baseline community readiness (CR) on youth physical activity (PA) at recess prior to the Ready for Recess intervention; (ii) if changes in PA due to the intervention were explained by baseline CR and (iii) if specific components of the intervention altered an association…

  5. Pathways to School Readiness: Executive Functioning Predicts Academic and Social-Emotional Aspects of School Readiness

    Science.gov (United States)

    Mann, Trisha D.; Hund, Alycia M.; Hesson-McInnis, Matthew S.; Roman, Zachary J.

    2017-01-01

    The current study specified the extent to which hot and cool aspects of executive functioning predicted academic and social-emotional indicators of school readiness. It was unique in focusing on positive aspects of social-emotional readiness, rather than problem behaviors. One hundred four 3-5-year-old children completed tasks measuring executive…

  6. QCD are we ready for the LHC?

    CERN Multimedia

    CERN. Geneva

    2006-01-01

    The LHC energy regime poses a serious challenge to our capability of predicting QCD reactions to the level of accuracy necessary for a successful programme of searches for physics beyond the Standard Model. In these lectures, I'll introduce basic concepts in QCD, and present techniques based on perturbation theory, such as fixed-order and resummed computations, and Monte Carlo simulations. I'll discuss applications of these techniques to hadron-hadron processes, concentrating on recent trends in perturbative QCD aimed at improving our understanding of LHC phenomenology.

  7. Are consumers ready for RFID?

    DEFF Research Database (Denmark)

    Aguiar, Luis Kluwe; Brofman, Freddy; de Barcellos, Marcia Dutra

    2010-01-01

    Marketing orientation is both the key objective of most food producers and their biggest challenge. Connecting food and agricultural production with the changing needs and aspirations of the customer provides the means to ensure competitive advantage, resilience and added value in what you produc...

  8. Variability of computational fluid dynamics solutions for pressure and flow in a giant aneurysm: the ASME 2012 Summer Bioengineering Conference CFD Challenge.

    Science.gov (United States)

    Steinman, David A; Hoi, Yiemeng; Fahy, Paul; Morris, Liam; Walsh, Michael T; Aristokleous, Nicolas; Anayiotos, Andreas S; Papaharilaou, Yannis; Arzani, Amirhossein; Shadden, Shawn C; Berg, Philipp; Janiga, Gábor; Bols, Joris; Segers, Patrick; Bressloff, Neil W; Cibis, Merih; Gijsen, Frank H; Cito, Salvatore; Pallarés, Jordi; Browne, Leonard D; Costelloe, Jennifer A; Lynch, Adrian G; Degroote, Joris; Vierendeels, Jan; Fu, Wenyu; Qiao, Aike; Hodis, Simona; Kallmes, David F; Kalsi, Hardeep; Long, Quan; Kheyfets, Vitaly O; Finol, Ender A; Kono, Kenichi; Malek, Adel M; Lauric, Alexandra; Menon, Prahlad G; Pekkan, Kerem; Esmaily Moghadam, Mahdi; Marsden, Alison L; Oshima, Marie; Katagiri, Kengo; Peiffer, Véronique; Mohamied, Yumnah; Sherwin, Spencer J; Schaller, Jens; Goubergrits, Leonid; Usera, Gabriel; Mendina, Mariana; Valen-Sendstad, Kristian; Habets, Damiaan F; Xiang, Jianping; Meng, Hui; Yu, Yue; Karniadakis, George E; Shaffer, Nicholas; Loth, Francis

    2013-02-01

    Stimulated by a recent controversy regarding pressure drops predicted in a giant aneurysm with a proximal stenosis, the present study sought to assess variability in the prediction of pressures and flow by a wide variety of research groups. In phase I, lumen geometry, flow rates, and fluid properties were specified, leaving each research group to choose their solver, discretization, and solution strategies. Variability was assessed by having each group interpolate their results onto a standardized mesh and centerline. For phase II, a physical model of the geometry was constructed, from which pressure and flow rates were measured. Groups repeated their simulations using a geometry reconstructed from a micro-computed tomography (CT) scan of the physical model with the measured flow rates and fluid properties. Phase I results from 25 groups demonstrated remarkable consistency in the pressure patterns, with the majority predicting peak systolic pressure drops within 8% of each other. Aneurysm sac flow patterns were more variable with only a few groups reporting peak systolic flow instabilities owing to their use of high temporal resolutions. Variability for phase II was comparable, and the median predicted pressure drops were within a few millimeters of mercury of the measured values but only after accounting for submillimeter errors in the reconstruction of the life-sized flow model from micro-CT. In summary, pressure can be predicted with consistency by CFD across a wide range of solvers and solution strategies, but this may not hold true for specific flow patterns or derived quantities. Future challenges are needed and should focus on hemodynamic quantities thought to be of clinical interest.
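The phase I protocol of interpolating each group's solution onto a standardized centerline before comparing pressure drops can be sketched in a few lines. The linear pressure field and the two sampling densities below are synthetic stand-ins for two groups' solver outputs, not the challenge data:

```python
def interp(x, xs, ys):
    """Piecewise-linear interpolation of (xs, ys) at x; xs must be increasing."""
    if x <= xs[0]:
        return ys[0]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + t * (ys[i] - ys[i - 1])
    return ys[-1]

def peak_pressure_drop(s, p, s_ref):
    """Map one group's pressure trace p(s) onto the shared centerline
    coordinate s_ref, then report the drop from the inlet value to the
    minimum along the vessel."""
    p_ref = [interp(x, s, p) for x in s_ref]
    return p_ref, p_ref[0] - min(p_ref)

# Synthetic linear pressure field (100 mmHg at inlet, falling 50 mmHg)
# sampled at two different densities by two hypothetical groups.
s_ref = [i / 4 for i in range(5)]
s_a = [i / 10 for i in range(11)]
s_b = [i / 50 for i in range(51)]
_, drop_a = peak_pressure_drop(s_a, [100 - 50 * x for x in s_a], s_ref)
_, drop_b = peak_pressure_drop(s_b, [100 - 50 * x for x in s_b], s_ref)
```

Once every group's trace lives on the same s_ref grid, inter-group variability can be summarized pointwise, which is how the study could compare 25 solvers despite their differing meshes and discretizations.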

  9. Modeling and simulation technology readiness levels.

    Energy Technology Data Exchange (ETDEWEB)

    Clay, Robert L.; Shneider, Max S.; Marburger, S. J.; Trucano, Timothy Guy

    2006-01-01

    This report summarizes the results of an effort to establish a framework for assigning and communicating technology readiness levels (TRLs) for the modeling and simulation (ModSim) capabilities at Sandia National Laboratories. This effort was undertaken as a special assignment for the Weapon Simulation and Computing (WSC) program office led by Art Hale, and lasted from January to September 2006. This report summarizes the results, conclusions, and recommendations, and is intended to help guide the program office in their decisions about the future direction of this work. The work was broken out into several distinct phases, starting with establishing the scope and definition of the assignment. These are characterized in a set of key assertions provided in the body of this report. Fundamentally, the assignment involved establishing an intellectual framework for TRL assignments to Sandia's modeling and simulation capabilities, including the development and testing of a process to conduct the assignments. To that end, we proposed a methodology for both assigning and understanding the TRLs, and outlined some of the restrictions that need to be placed on this process and the expected use of the result. One of the first assumptions we overturned was the notion of a "static" TRL--rather we concluded that problem context was essential in any TRL assignment, and that leads to dynamic results (i.e., a ModSim tool's readiness level depends on how it is used, and by whom). While we leveraged the classic TRL results from NASA, DoD, and Sandia's NW program, we came up with a substantially revised version of the TRL definitions, maintaining consistency with the classic level definitions and the Predictive Capability Maturity Model (PCMM) approach. In fact, we substantially leveraged the foundation the PCMM team provided, and augmented that as needed. Given the modeling and simulation TRL definitions and our proposed assignment methodology, we
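The report's central point, that a ModSim tool's readiness level is dynamic and depends on the problem context, can be illustrated by keying assessments on (tool, context) pairs rather than on the tool alone. The tool names, contexts, and levels below are invented illustrations, not Sandia's actual assignments:

```python
# Context-dependent ("dynamic") TRL lookup: the same tool carries a
# different readiness level in different problem contexts.
TRL = {
    ("shock_code", "design_scoping"): 7,
    ("shock_code", "qualification"):  4,   # same tool, more demanding use
}

def assess(tool, context):
    """Return the assigned TRL for a (tool, context) pair, defaulting to 1
    for pairs that have never been assessed."""
    return TRL.get((tool, context), 1)

print(assess("shock_code", "design_scoping"))  # 7
print(assess("shock_code", "qualification"))   # 4
```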

  10. Ready for kindergarten: Are intelligence skills enough?

    Directory of Open Access Journals (Sweden)

    Caroline Fitzpatrick

    2017-12-01

    Full Text Available This study investigated how different profiles of kindergarten readiness in terms of student intellectual ability, academic skills and classroom engagement relate to future academic performance. Participants are French-Canadian children followed in the context of the Quebec Longitudinal Study of Child Development (N = 670). Trained examiners measured number knowledge, receptive vocabulary and fluid intelligence when children were in kindergarten. Teachers rated kindergarten classroom engagement. Outcomes included fourth-grade teacher-rated achievement and directly assessed mathematical skills. Latent class analyses revealed three kindergarten readiness profiles: high (57%), moderate (34%) and low (9.3%) readiness. Using multiple regression, we found that a more favourable kindergarten profile predicted better fourth-grade academic performance. Identifying children at risk of academic difficulty is an important step for preventing underachievement and dropout. These results suggest the importance of promoting a variety of cognitive, academic and behavioural skills to enhance later achievement in at-risk learners.

  11. Concept of economic readiness levels assessment

    Science.gov (United States)

    Yuniaristanto, Sutopo, W.; Widiyanto, A.; Putri, A. S.

    2017-11-01

    This research aims to build a concept of Economic Readiness Level (ERL) assessment for incubation centers. The ERL concept considers both market and business aspects; each aspect is divided into four phases, and each phase consists of several indicators. The Analytic Hierarchy Process (AHP) is used to calculate the weight of every aspect and indicator, and each indicator is assessed on an interval scale from 0 to 4. The ERL is then calculated from the indicator scores together with the weights of both the aspects and the indicators. The resulting ERL value shows in detail the readiness level of an innovative product from an economic standpoint, covering both market and business aspects. The Economic Readiness Level scheme has four levels: investigation, feasibility, planning and introduction.
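    The weighted-sum calculation described in this record can be sketched as follows. The aspect and indicator names, weights, and scores below are illustrative assumptions, not values from the paper:

```python
# Sketch of an Economic Readiness Level (ERL) score: each aspect (market,
# business) carries an AHP-derived weight, each indicator within an aspect
# carries its own weight and a 0-4 score, and the ERL is the weighted sum.
# All names and numbers below are made-up illustrations.

def erl_score(aspects):
    """aspects: {name: (aspect_weight, {indicator: (ind_weight, score_0_to_4)})}"""
    total = 0.0
    for aspect_weight, indicators in aspects.values():
        aspect_score = sum(w * s for w, s in indicators.values())
        total += aspect_weight * aspect_score
    return total  # lies in [0, 4] when the weights at each level sum to 1

example = {
    "market":   (0.6, {"market_size": (0.5, 3), "competition": (0.5, 2)}),
    "business": (0.4, {"revenue_model": (0.7, 4), "cost_structure": (0.3, 1)}),
}
print(erl_score(example))  # approximately 2.74
```

    With weights normalized at each level, the score stays on the same 0-4 interval as the individual indicators, which makes the four ERL phases easy to map onto score bands.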

  12. AMCP Partnership Forum: Biosimilars--Ready, Set, Launch.

    Science.gov (United States)

    2016-04-01

    Through 2020, reference biologic products will lose patent protection that will be worth $54 billion to the U.S. economy. Consequently, interest in biosimilars is intensifying across the health care industry. Managed care organizations (MCOs) are depending on the savings opportunity that biosimilars promise. After the first FDA approval of a biosimilar in March 2015, the Academy of Managed Care Pharmacy (AMCP) convened a biosimilar Partnership Forum on June 10-11, 2015. The goal of this forum was to address current readiness of MCOs to optimize biosimilars; identify gaps, challenges, and opportunities with regard to biosimilars; and recommend education and training content to help AMCP best meet the needs of its members and stakeholders. The forum brought together multiple stakeholders from MCOs, pharmacy benefit managers, specialty pharmacies, integrated delivery networks, federal government and standards setting organizations, consumer advocacy groups, and the pharmaceutical industry. Through a series of 4 one-hour webinars and a 1.5-day live workgroup session, participants identified current challenges and readiness issues in addressing biosimilars. These challenges included lack of a consolidated educational strategy for incorporating biosimilars into the clinical decision-making process; deficiencies in current levels of federal (e.g., the FDA) or state (e.g., departments of insurance) guidance; limited intelligence on pricing strategies and consideration of stakeholder contracting alignment and risk sharing; and operational implementation issues. Participants identified necessary tactics for executing a successful biosimilar strategy. These tactics included creating a broad multiple stakeholder coalition to support educational efforts to gain public, provider, and other stakeholder acceptance; aligning utilization incentives through reimbursement policies and programs; encouraging benefit design and stakeholder collaboration; advancing the coding and

  13. Geomagnetically induced currents: Science, engineering, and applications readiness

    Science.gov (United States)

    Pulkkinen, A.; Bernabeu, E.; Thomson, A.; Viljanen, A.; Pirjola, R.; Boteler, D.; Eichner, J.; Cilliers, P. J.; Welling, D.; Savani, N. P.; Weigel, R. S.; Love, J. J.; Balch, C.; Ngwira, C. M.; Crowley, G.; Schultz, A.; Kataoka, R.; Anderson, B.; Fugate, D.; Simpson, J. J.; MacAlester, M.

    2017-07-01

    This paper is the primary deliverable of the very first NASA Living With a Star Institute Working Group, Geomagnetically Induced Currents (GIC) Working Group. The paper provides a broad overview of the current status and future challenges pertaining to the science, engineering, and applications of the GIC problem. Science is understood here as the basic space and Earth sciences research that allows improved understanding and physics-based modeling of the physical processes behind GIC. Engineering, in turn, is understood here as the "impact" aspect of GIC. Applications are understood as the models, tools, and activities that can provide actionable information to entities such as power systems operators for mitigating the effects of GIC and government agencies for managing any potential consequences from GIC impact to critical infrastructure. Applications can be considered the ultimate goal of our GIC work. In assessing the status of the field, we quantify the readiness of various applications in the mitigation context. We use the Applications Readiness Level (ARL) concept to carry out the quantification.

  14. Bioprinting: an assessment based on manufacturing readiness levels.

    Science.gov (United States)

    Wu, Changsheng; Wang, Ben; Zhang, Chuck; Wysk, Richard A; Chen, Yi-Wen

    2017-05-01

    Over the last decade, bioprinting has emerged as a promising technology in the fields of tissue engineering and regenerative medicine. With recent advances in additive manufacturing, bioprinting is poised to provide patient-specific therapies and new approaches for tissue and organ studies, drug discoveries and even food manufacturing. Manufacturing Readiness Level (MRL) is a method that has been applied to assess manufacturing maturity and to identify risks and gaps in technology-manufacturing transitions. Technology Readiness Level (TRL) is used to evaluate the maturity of a technology. This paper reviews recent advances in bioprinting following the MRL scheme and addresses corresponding MRL levels of engineering challenges and gaps associated with the translation of bioprinting from lab-bench experiments to ultimate full-scale manufacturing of tissues and organs. According to our step-by-step TRL and MRL assessment, after years of rigorous investigation by the biotechnology community, bioprinting is on the cusp of entering the translational phase where laboratory research practices can be scaled up into manufacturing products specifically designed for individual patients.

  15. Geomagnetically induced currents: Science, engineering, and applications readiness

    Science.gov (United States)

    Pulkkinen, Antti; Bernabeu, E.; Thomson, A.; Viljanen, A.; Pirjola, R.; Boteler, D.; Eichner, J.; Cilliers, P.J.; Welling, D.; Savani, N.P.; Weigel, R.S.; Love, Jeffrey J.; Balch, Christopher; Ngwira, C.M.; Crowley, G.; Schultz, Adam; Kataoka, R.; Anderson, B.; Fugate, D.; Simpson, J.J.; MacAlester, M.

    2017-01-01

    This paper is the primary deliverable of the very first NASA Living With a Star Institute Working Group, Geomagnetically Induced Currents (GIC) Working Group. The paper provides a broad overview of the current status and future challenges pertaining to the science, engineering, and applications of the GIC problem. Science is understood here as the basic space and Earth sciences research that allows improved understanding and physics-based modeling of the physical processes behind GIC. Engineering, in turn, is understood here as the “impact” aspect of GIC. Applications are understood as the models, tools, and activities that can provide actionable information to entities such as power systems operators for mitigating the effects of GIC and government agencies for managing any potential consequences from GIC impact to critical infrastructure. Applications can be considered the ultimate goal of our GIC work. In assessing the status of the field, we quantify the readiness of various applications in the mitigation context. We use the Applications Readiness Level (ARL) concept to carry out the quantification.

  16. Validating Acquisition IS Integration Readiness with Drills

    DEFF Research Database (Denmark)

    Wynne, Peter J.

    2017-01-01

    To companies, mergers and acquisitions are important strategic tools, yet they often fail to deliver their expected value. Studies have shown the integration of information systems is a significant roadblock to the realisation of acquisition benefits, and for an IT department to be ready......), to understand how an IT department can use them to validate their integration plans. The paper presents a case study of two drills used to validate an IT department’s readiness to carry out acquisition IS integration, and suggests seven acquisition IS integration drill characteristics others could utilise when...

  17. PENGARUH TECHNOLOGY READINESS TERHADAP PENERIMAAN TEKNOLOGI KOMPUTER PADA UMKM DI YOGYAKARTA

    Directory of Open Access Journals (Sweden)

    Mimin Nur Aisyah

    2014-10-01

    Full Text Available Abstract: The Effect of Technology Readiness on the Acceptance of Computer Technology among SMEs in Yogyakarta. This study explores the effect of technology readiness on the perceived usefulness and perceived ease of use of a system, and the effect of both perceptions on the intention to use computer technology to support business processes in SMEs in Yogyakarta. The sample comprised 498 SMEs registered with Disperindagkop Yogyakarta, selected by simple random sampling. Data were collected by questionnaire, and data analysis and hypothesis testing used a Partial Least Squares (PLS) model. The study found that technology readiness affects both perceived usefulness and perceived ease of use, and that both perceptions in turn affect the intention to use computer technology to support business processes in SMEs in Yogyakarta. Keywords: technology readiness, perceived usefulness, perceived ease of use, intention to use, SMEs

  18. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled "Computer Tree" serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since the lesson draws on the creative function of basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process covering the basic knowledge, skills, and interests of the target group, and it includes an observation process by way of peer assessment. The lesson is considered a good sample of planning for any subject, owing to its unexpected convergence of visual and technical abilities with linguistic abilities.

  19. A Proposed Conceptual Model of Military Medical Readiness

    National Research Council Canada - National Science Library

    Van Hall, Brian M

    2007-01-01

    .... The purpose of this research is to consolidate existing literature on the latent variable of medical readiness, and to propose a composite theoretical model of medical readiness that may provide...

  20. Specificity of school readiness assessment of children with mental disability

    OpenAIRE

    Klausová, Markéta

    2014-01-01

    This thesis is focused on the school readiness assessment of children with mental disability. Thesis is devoted to theoretical knowledge in relation to pre-school age and also specifically for children with mental disability. Thesis describes the school readiness of child and compares foreign and local view on it. It also includes the issue of school readiness of children with mental disability. Furthermore, the thesis focuses on the school readiness assessment and on resources and tools that...

  1. Legal but limited? Abortion service availability and readiness assessment in Nepal.

    Science.gov (United States)

    Bell, Suzanne O; Zimmerman, Linnea; Choi, Yoonjoung; Hindin, Michelle J

    2018-01-01

    The government of Nepal revised its law in 2002 to allow women to terminate a pregnancy up to 12 weeks gestation for any indication on request, and up to 18 weeks if certain conditions are met. We evaluated the readiness of facilities in Nepal to provide three abortion services, manual vacuum aspiration (MVA), medication abortion (MA) and post-abortion care (PAC), using the service availability and readiness assessment (SARA) framework. The framework consists broadly of three domains: service availability, general service readiness and service readiness specific to individual services (i.e. service-specific readiness). We applied the framework to data from the Nepal Health Facility Survey 2015, a nationally representative survey of 992 health facilities. Overall, we find that access to safe abortion remains limited in Nepal. Of the facilities that reported offering delivery services and were thus eligible to provide safe abortion services, 44.5, 36.0 and 25.6% had provided any MVA, MA or PAC services, respectively, in the 3 months prior to the survey. Of the facilities that provided these services, few had all the components of care required (1.5% for MA and 1.1% for PAC). Although the private sector conducted approximately half of all abortion services provided in the 3 months prior to the survey, no private sector facilities had all the abortion service-specific readiness components. Results suggest that accessing safe abortion services remains a significant challenge for Nepalese women, despite a set of permissive laws. © The Author 2017. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
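    The service-specific readiness figures in this record amount to simple proportions over facility records: the share of facilities offering a service, and the share of providers holding every required readiness component. A minimal sketch, using hypothetical facility data rather than the Nepal Health Facility Survey 2015 dataset:

```python
# SARA-style service-specific readiness as two proportions. The facility
# records and component names below are hypothetical illustrations.

def readiness(facilities, service, components):
    """Return (share of facilities offering the service,
               share of those providers with all required components)."""
    providers = [f for f in facilities if service in f["services"]]
    fully_ready = [f for f in providers
                   if all(c in f["components"] for c in components)]
    offered = len(providers) / len(facilities)
    ready = len(fully_ready) / len(providers) if providers else 0.0
    return offered, ready

facilities = [
    {"services": {"MA", "PAC"}, "components": {"guidelines", "trained_staff"}},
    {"services": {"MA"},        "components": {"guidelines"}},
    {"services": {"PAC"},       "components": set()},
    {"services": set(),         "components": set()},
]
offered, ready = readiness(facilities, "MA", ["guidelines", "trained_staff"])
print(offered, ready)  # 0.5 of facilities offer MA; 0.5 of providers fully ready
```

    The same function applied per service (MVA, MA, PAC) and per sector (public, private) reproduces the kind of breakdown the study reports.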

  2. Ready a Commodore 64 retrospective

    CERN Document Server

    Dillon, Roberto

    2015-01-01

    How did the Commodore 64 conquer the hearts of millions and become a platform people still actively develop for even today? What made it so special? This book will appeal to both those who like tinkering with old technology as a hobby and nostalgic readers who simply want to enjoy a trip down memory lane. It discusses in a concise but rigorous format the different areas of home gaming and personal computing where the C64 managed to innovate and push forward existing boundaries. Starting from Jack Tramiel's vision of designing computers "for the masses, not the classes," the book introduces the 6510, VIC-II and SID chips that made the C64 unique. It briefly discusses its Basic programming language and then proceeds to illustrate not only many of the games that are still so fondly remembered but also the first generation of game engines that made game development more approachable, among other topics that are often neglected but are necessary to provide a comprehensive overview of how far reaching the C64 in...

  3. Oral feeding readiness in the neonatal intensive care unit.

    Science.gov (United States)

    Jones, Luann R

    2012-01-01

    Oral feeding is a complex sensorimotor process that is influenced by many variables, making the introduction and management of oral feeding a challenge for many health care providers. Feeding practice guided by tradition or a trial-and-error approach may be inconsistent and has the potential to delay the progression of oral feeding skills. Oral feeding initiation and management should be based on careful, individualized assessment of the NICU infant and requires an understanding of neonatal physiology and neurodevelopment. The purpose of this article is to help the health care provider with this complex process by (a) defining oral feeding readiness, (b) describing the importance of oral feeding in the NICU and the physiology of feeding, and (c) providing a review of the literature regarding the transition from gavage to oral feeding in the NICU.

  4. The Partnership on Work Enrichment and Readiness

    Science.gov (United States)

    Haar, Diane; Raggi, Mindi

    2009-01-01

    The Partnership on Work Enrichment and Readiness (POWER's) unique and innovative curriculum recruits and sustains nontraditional students interested in preparing for employment or continued studies in an institution of higher education. The program specifically targets persons in mental health recovery. Students attend college during a regular…

  5. Emotional Readiness and Music Therapeutic Activities

    Science.gov (United States)

    Drossinou-Korea, Maria; Fragkouli, Aspasia

    2016-01-01

    The purpose of this study is to understand the children's expression with verbal and nonverbal communication in the Autistic spectrum. We study the emotional readiness and the music therapeutic activities which exploit the elements of music. The method followed focused on the research field of special needs education. Assumptions on the parameters…

  6. Readiness of Teachers for Change in Schools

    Science.gov (United States)

    Kondakci, Yasar; Beycioglu, Kadir; Sincar, Mehmet; Ugurlu, Celal Teyyar

    2017-01-01

    Theorizing on the role of teacher attitudes in change effectiveness, this study examined the predictive value of context (trust), process (social interaction, participative management and knowledge sharing) and outcome (job satisfaction and workload perception) variables for cognitive, emotional and intentional readiness of teachers for change.…

  7. Assistant Principals: Their Readiness as Instructional Leaders

    Science.gov (United States)

    Searby, Linda; Browne-Ferrigno, Tricia; Wang, Chih-hsuan

    2017-01-01

    This article reports findings from a study investigating the capacity of assistant principals to be instructional leaders. Analyses of survey responses yielded four interesting findings: (a) years of experience as a teacher and age had no significance on assistant principals' perceived readiness as an instructional leader; (b) those completing…

  8. SPECAL ISSUE Awareness, Readiness, Commitment and ...

    African Journals Online (AJOL)

    teachers, officers at different levels) and the relevance of the programs as perceived by these agents. Statement of ... What is the status of awareness, readiness, commitment level and perception of different agents attached to ..... are very far from the main grid and in remote place under this program of CBE. He believed that ...

  9. SPECAL ISSUE Awareness, Readiness, Commitment and ...

    African Journals Online (AJOL)

    What is the status of awareness, readiness, commitment level and perception of different agents attached to ... the top university officials perceive the pedagogical relevance of CBE;. Significance of the ...... physics students were least committed because they believed that their subject area courses were not related to CBE.

  10. Birth Preparedness and Complication Readiness of Pregnant ...

    African Journals Online (AJOL)

    Background: Birth preparedness and complication readiness (BP/CR) is a safe motherhood strategy which addresses delays that could increase the risk of dying in pregnancy, child birth and the immediate postpartum period. The strategy has not been effectively implemented in Nigeria hence maternal mortality remains ...

  11. Birth Preparedness and Complication Readiness of Pregnant ...

    African Journals Online (AJOL)

    Nigeria records ... preparing for childbirth and being ready for ... or after childbirth (264, 65.8%), while just 17 had voluntary counseling and testing on HIV. ...

  12. Child Physical Punishment, Parenting, and School Readiness

    Science.gov (United States)

    Weegar, Kelly; Guérin-Marion, Camille; Fréchette, Sabrina; Romano, Elisa

    2018-01-01

    This study explored how physical punishment (PP) and other parenting approaches may predict school readiness outcomes. By using the Canada-wide representative data, 5,513 children were followed over a 2-year period. Caregivers reported on their use of PP and other parenting approaches (i.e., literacy and learning activities and other disciplinary…

  13. Functional criteria for assessing pointe-readiness.

    Science.gov (United States)

    Richardson, Megan; Liederbach, Marijeanne; Sandow, Emily

    2010-01-01

    The most popular criterion cited in the dance literature for advancement to pointe work is attainment of the chronological age of 12 years. However, dancers at this age vary greatly in terms of musculoskeletal maturity and motor skill development. The purpose of this study was to investigate whether objective, functional tests could be used in conjunction with dance teacher expertise to determine pointe-readiness. It was hypothesized that dynamic tests of motor control can better indicate pointe-readiness than chronological age alone or in combination with static musculoskeletal measurements. Thirty-seven pre-pointe students from two professional ballet schools were tested for muscular strength, ankle joint range of motion, single leg standing balance, dynamic alignment, and turning skill. In addition, the participating students' ballet teachers independently graded each student on her readiness to begin dancing en pointe. Performance on three functional tests (the Airplane test, Sauté test, and Topple test) was closely associated with teacher subjective rating for pointe-readiness. It is concluded that these tests may be more useful for gauging acquisition of the skills required for safe and successful performance than the traditionally accepted indicators of chronological age, years of dance training, and ankle joint range of motion.

  14. IFRS READINESS IN LATIN AMERICAN BUSINESS CURRICULA

    OpenAIRE

    Myrna R. Berríos

    2012-01-01

    Multinational companies doing business in Latin America, and elsewhere in the world, must comply with individual countries’ financial reporting and financial market rules and local legislation when disclosing financial information. This research assesses international financial reporting standards (IFRS) readiness in the finance, accounting, and taxation curricula in Latin American universities.

  15. Public webinar: Wildland Fire Sensors Challenge

    Science.gov (United States)

    This multi-agency challenge seeks a field-ready prototype system capable of measuring constituents of smoke, including particulates, carbon monoxide, ozone, and carbon dioxide, over the wide range of levels expected during wildland fires. The prototype system should be accurate, ...

  16. Effect of Counseling on Graduates' Employment Challenges ...

    African Journals Online (AJOL)

    The study sought to discover the effects of cognitive restructuring counseling strategy on graduates' coping ability and readiness aptitude to employment challenges. A sample of 100 unemployed graduates was used for the study. The sample was pretested and then exposed to a programme of cognitive restructuring ...

  17. Assessment of instructors' readiness for implementing e-learning in continuing medical education in Iran.

    Science.gov (United States)

    Eslaminejad, Tahereh; Masood, Mona; Ngah, Nor Azilah

    2010-01-01

    E-learning provides new levels of flexibility in learning and teaching. This contribution of e-learning depends on the levels of readiness of several critical factors, particularly in an educational organization. The purpose of this study was to assess instructors' readiness and to identify the most important factors that affect their readiness for e-learning in CME programs, in order to make effective use of the opportunities that facilitate e-learning in CME programs. A 5-point Likert scale instrument consisting of two domains (technical and pedagogical) was constructed according to four subdomains (knowledge, attitude, skills, and habits) and distributed to 70 faculty members. A factor analysis was employed to extract significant factors. The results revealed that the mean readiness for e-learning among faculty members was 3.25 ± 0.58 in the technical and 3.37 ± 0.49 in the pedagogical domain on a 5-point Likert scale (1-5). Factors such as "familiarity with learning management system," "willingness to teach by adopting a new technology," "willingness to use e-learning as a viable alternative," "ability to deliver e-material and to provide e-content for teaching," and "being accustomed to the virtual environment and utilization of the computer and the internet" were extracted in the technical readiness domain. The pedagogical readiness factors were: "familiarity with online teaching principle and method," "willingness to use technology in instruction and material development," "ability to design content for e-material and online course evaluation," and "being accustomed to providing information back up regularly and employing eclectic methods and multiple approaches in teaching." The findings of this study suggest that training should be offered to instructors on a continuous, rather than a one-off, basis so that their IT knowledge and skills are upgraded over time. In addition, results indicate that pedagogical innovations are required to develop and implement
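    The domain readiness figures in this record (e.g., 3.25 ± 0.58) are means and standard deviations of per-respondent scores on 5-point Likert items. A small sketch of that computation, with made-up item responses rather than the study's data:

```python
import statistics

# Mean and standard deviation of per-respondent domain scores on a 1-5
# Likert scale, the form in which readiness such as 3.25 ± 0.58 (technical)
# is reported. The responses below are made-up illustrations.

def domain_readiness(responses):
    """responses: list of per-respondent lists of item scores (1-5)."""
    person_means = [sum(r) / len(r) for r in responses]
    return statistics.mean(person_means), statistics.stdev(person_means)

technical = [[3, 4, 3], [2, 3, 3], [4, 4, 5]]  # three respondents, three items
mean, sd = domain_readiness(technical)
print(round(mean, 2), round(sd, 2))
```

    The factor extraction step reported in the study would then operate on the item-level correlation matrix rather than on these summary scores.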

  18. 75 FR 28594 - Ready-to-Learn Television Program

    Science.gov (United States)

    2010-05-21

    ... Ready-to-Learn Television Program AGENCY: Office of Innovation and Improvement, Department of Education... new awards for FY 2010 for the Ready-to-Learn Television Program. We have extended the deadline for...'' with the date ``June 22, 2010.'' FOR FURTHER INFORMATION CONTACT: The Ready-to-Learn Television Program...

  19. 75 FR 18170 - Ready-to-Learn Television Program

    Science.gov (United States)

    2010-04-09

    ... Ready-to-Learn Television Program AGENCY: Office of Innovation and Improvement, Department of Education... new awards for FY 2010 for the Ready-to-Learn Television Program. There is an error in one of the... INFORMATION CONTACT: The Ready-to-Learn Television Program, U.S. Department of Education, 400 Maryland...

  20. NET READINESS OF SME ENTERPRISES FROM WEST POMERANIAN VOIVODESHIP

    OpenAIRE

    Tomasz Ordysiński

    2012-01-01

    The article presents the results of Net Readiness research on SME enterprises from the West Pomeranian voivodeship, measuring their level of preparedness for e-business. The Net Readiness method (developed by Cisco) is described. The main part presents the general Internet readiness card and indexes for micro, small and medium enterprises. The final part contains conclusions and directions for further research.

  1. Diagnostics of children's school readiness in scientific studies abroad

    Directory of Open Access Journals (Sweden)

    Nazarenko V.V.

    2012-06-01

    Full Text Available The article considers the problem of children's school readiness as it is represented in contemporary studies of foreign scholars. It displays a variety of approaches to estimation of school readiness as well as the ways of measuring the levels of child development as relating to school readiness, namely those of them which are in common practice in education.

  2. 75 FR 16763 - Ready-to-Learn Television Program

    Science.gov (United States)

    2010-04-02

    ... Ready-to-Learn Television Program AGENCY: Office of Innovation and Improvement, Department of Education... new awards for FY 2010 for the Ready-to-Learn Television Program. There is an error in one of the... INFORMATION CONTACT: The Ready-to-Learn Television Program, U.S. Department of Education, 400 Maryland Avenue...

  3. Diagnostics of children's school readiness in scientific studies abroad

    OpenAIRE

    Nazarenko V.V.

    2012-01-01

    The article considers the problem of children's school readiness as it is represented in contemporary studies of foreign scholars. It displays a variety of approaches to estimation of school readiness as well as the ways of measuring the levels of child development as relating to school readiness, namely those of them which are in common practice in education.

  4. Ready or Not: Predicting High and Low School Readiness Among Teen Parents' Children.

    Science.gov (United States)

    Mollborn, Stefanie; Dennis, Jeff A

    2012-06-01

    Past research has documented compromised development for teenage mothers' children compared to others, but less is known about predictors of school readiness among these children or among teenage fathers' children. Our multidimensional measures of high and low school readiness incorporated math, reading, and behavior scores and parent-reported health. Using parent interviews and direct assessments from the Early Childhood Longitudinal Study-Birth Cohort, we predicted high and low school readiness shortly before kindergarten among children born to a teenage mother and/or father (N≈800). Factors from five structural and interpersonal domains based on the School Transition Model were measured at two time points, including change between those time points, to capture the dynamic nature of early childhood. Four domains (socioeconomic resources, maternal characteristics, parenting, and exposure to adults) predicted high or low school readiness, but often not both. Promising factors associated with both high and low readiness among teen parents' children came from four domains: maternal education and gains in education (socioeconomic), maternal age of at least 18 and fewer depressive symptoms (maternal characteristics), socioemotional parenting quality and home environment improvements (parenting), and living with fewer children and receiving nonparental child care in infancy (exposure to adults). The findings preliminarily suggest policies that might improve school readiness: encouraging maternal education while supplying child care, focusing teen pregnancy prevention efforts on school-age girls, basic socioeconomic supports, and investments in mental health and high-quality home environments and parenting.

  5. Ready or Not: Predicting High and Low School Readiness Among Teen Parents’ Children*

    Science.gov (United States)

    Mollborn, Stefanie; Dennis, Jeff A.

    2011-01-01

    Past research has documented compromised development for teenage mothers’ children compared to others, but less is known about predictors of school readiness among these children or among teenage fathers’ children. Our multidimensional measures of high and low school readiness incorporated math, reading, and behavior scores and parent-reported health. Using parent interviews and direct assessments from the Early Childhood Longitudinal Study-Birth Cohort, we predicted high and low school readiness shortly before kindergarten among children born to a teenage mother and/or father (N≈800). Factors from five structural and interpersonal domains based on the School Transition Model were measured at two time points, including change between those time points, to capture the dynamic nature of early childhood. Four domains (socioeconomic resources, maternal characteristics, parenting, and exposure to adults) predicted high or low school readiness, but often not both. Promising factors associated with both high and low readiness among teen parents’ children came from four domains: maternal education and gains in education (socioeconomic), maternal age of at least 18 and fewer depressive symptoms (maternal characteristics), socioemotional parenting quality and home environment improvements (parenting), and living with fewer children and receiving nonparental child care in infancy (exposure to adults). The findings preliminarily suggest policies that might improve school readiness: encouraging maternal education while supplying child care, focusing teen pregnancy prevention efforts on school-age girls, basic socioeconomic supports, and investments in mental health and high-quality home environments and parenting. PMID:22582109

  6. Computationally modeling interpersonal trust

    OpenAIRE

    Lee, Jin Joo; Knox, W. Bradley; Wormwood, Jolie B.; Breazeal, Cynthia; DeSteno, David

    2013-01-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our pr...

  7. German contributions to the CMS computing infrastructure

    Science.gov (United States)

    Scheurer, A.; German CMS Community

    2010-04-01

    The CMS computing model anticipates various hierarchically linked tier centres to counter the challenges provided by the enormous amounts of data which will be collected by the CMS detector at the Large Hadron Collider, LHC, at CERN. During the past years, various computing exercises were performed to test the readiness of the computing infrastructure, the Grid middleware and the experiment's software for the startup of the LHC which took place in September 2008. In Germany, several tier sites are set up to allow for an efficient and reliable way to simulate possible physics processes as well as to reprocess, analyse and interpret the numerous stored collision events of the experiment. It will be shown that the German computing sites played an important role during the experiment's preparation phase and during data-taking of CMS and, therefore, scientific groups in Germany will be ready to compete for discoveries in this new era of particle physics. This presentation focuses on the German Tier-1 centre GridKa, located at Forschungszentrum Karlsruhe, the German CMS Tier-2 federation DESY/RWTH with installations at the University of Aachen and the research centre DESY. In addition, various local computing resources in Aachen, Hamburg and Karlsruhe are briefly introduced as well. It will be shown that an excellent cooperation between the different German institutions and physicists led to well established computing sites which cover all parts of the CMS computing model. Therefore, the following topics are discussed and the achieved goals and the gained knowledge are depicted: data management and distribution among the different tier sites, Grid-based Monte Carlo production at the Tier-2 as well as Grid-based and locally submitted inhomogeneous user analyses at the Tier-3s. Another important task is to ensure a proper and reliable operation 24 hours a day, especially during the time of data-taking. For this purpose, the meta-monitoring tool "HappyFace", which was

  8. Development of protein fortified mango based ready-to-serve beverage

    OpenAIRE

    Yadav, Deep N.; Vishwakarma, R. K.; Borad, Sanket; Bansal, Sangita; Jaiswal, Arvind K.; Sharma, Monika

    2016-01-01

    Fruit drinks contain negligible amount of protein as nutritional component. Fortification of fruit drinks with protein is a challenge due to protein stability in acidic and ionic environment. Mango ready-to-serve (RTS) beverage was fortified with modified whey protein and its rheological properties were studied. Whey protein was hydrolysed with papain to improve its stability in acidic medium. The water holding capacity of whey protein increased about two times after hydrolysis. Hydrolysed an...

  9. Results of the 2016 International Skin Imaging Collaboration International Symposium on Biomedical Imaging challenge: Comparison of the accuracy of computer algorithms to dermatologists for the diagnosis of melanoma from dermoscopic images.

    Science.gov (United States)

    Marchetti, Michael A; Codella, Noel C F; Dusza, Stephen W; Gutman, David A; Helba, Brian; Kalloo, Aadi; Mishra, Nabin; Carrera, Cristina; Celebi, M Emre; DeFazio, Jennifer L; Jaimes, Natalia; Marghoob, Ashfaq A; Quigley, Elizabeth; Scope, Alon; Yélamos, Oriol; Halpern, Allan C

    2018-02-01

Computer vision may aid in melanoma detection. We sought to compare the melanoma diagnostic accuracy of computer algorithms to that of dermatologists using dermoscopic images. We conducted a cross-sectional study using 100 randomly selected dermoscopic images (50 melanomas, 44 nevi, and 6 lentigines) from an international computer vision melanoma challenge dataset (n = 379), along with individual algorithm results from 25 teams. We used 5 methods (nonlearned and machine learning) to combine individual automated predictions into "fusion" algorithms. In a companion study, 8 dermatologists classified the lesions in the 100 images as either benign or malignant. The average sensitivity and specificity of dermatologists in classification were 82% and 59%, respectively. At 82% sensitivity, dermatologist specificity was similar to the top challenge algorithm (59% vs. 62%, P = .68) but lower than the best-performing fusion algorithm (59% vs. 76%, P = .02). The receiver operating characteristic area of the top fusion algorithm was greater than the mean receiver operating characteristic area of dermatologists (0.86 vs. 0.71, P = .001). The dataset lacked the full spectrum of skin lesions encountered in clinical practice, particularly banal lesions. Readers and algorithms were not provided clinical data (eg, age or lesion history/symptoms). Results obtained using our study design cannot be extrapolated to clinical practice. Deep learning computer vision systems classified melanoma dermoscopy images with accuracy that exceeded some but not all dermatologists. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
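The "fusion" approach described combines many algorithms' individual predictions into a single score. The simplest nonlearned variant can be sketched as an unweighted average of per-algorithm malignancy probabilities; the scores, labels, and 0.5 threshold below are hypothetical illustrations, not the study's data or method:

```python
import numpy as np

def fuse_mean(prob_matrix):
    """Nonlearned 'fusion': average per-algorithm malignancy
    probabilities (one row per algorithm, one column per lesion)."""
    return np.asarray(prob_matrix).mean(axis=0)

def sens_spec(scores, labels, threshold=0.5):
    """Sensitivity and specificity of thresholded scores
    (labels: 1 = melanoma, 0 = benign)."""
    pred = np.asarray(scores) >= threshold
    truth = np.asarray(labels).astype(bool)
    sensitivity = (pred & truth).sum() / truth.sum()
    specificity = (~pred & ~truth).sum() / (~truth).sum()
    return sensitivity, specificity

# Hypothetical scores from three algorithms on four lesions:
probs = [[0.9, 0.2, 0.8, 0.4],
         [0.7, 0.1, 0.6, 0.3],
         [0.8, 0.3, 0.9, 0.2]]
labels = [1, 0, 1, 0]

fused = fuse_mean(probs)
sens, spec = sens_spec(fused, labels)
print(fused, sens, spec)
```

The study also evaluated machine-learning fusion methods; the unweighted mean above is only the simplest conceivable baseline.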

  10. Implementing a Zero Energy Ready Home Multifamily Project

    Energy Technology Data Exchange (ETDEWEB)

    Springer, David [National Renewable Energy Laboratory (NREL), Golden, CO (United States); German, Alea [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2015-08-17

    Building cost-effective, high-performance homes that provide superior comfort, health, and durability is the goal of the U.S. Department of Energy’s (DOE’s) Zero Energy Ready Home (ZERH) program. Building America research and other innovative programs throughout the country have addressed many of the technical challenges of building to the ZERH standard. The cost-effectiveness of measure packages that result in 30% source energy savings compared to a code-compliant home have been demonstrated. However, additional challenges remain, particularly with respect to convincing production builders of the strong business case for ZERH. The Alliance for Residential Building Innovation (ARBI) team believes that the keys to successfully engaging builders and developers in the California market are to help them leverage development agreement requirements, code compliance requirements, incentives, and competitive market advantages of ZERH certification, and navigate through this process. A primary objective of this project was to gain a highly visible foothold for residential buildings that are built to the DOE ZERH specification that can be used to encourage participation by other California builders. This report briefly describes two single-family homes that were ZERH certified and focuses on the experience of working with developer Mutual Housing on a 62-unit multifamily community at the Spring Lake subdivision in Woodland, California. The Spring Lake project is expected to be the first ZERH-certified multifamily project in the country. This report discusses the challenges encountered, lessons learned, and how obstacles were overcome.

  11. A Challenge to Watson

    Science.gov (United States)

    Detterman, Douglas K.

    2011-01-01

    Watson's Jeopardy victory raises the question of the similarity of artificial intelligence and human intelligence. Those of us who study human intelligence issue a challenge to the artificial intelligence community. We will construct a unique battery of tests for any computer that would provide an actual IQ score for the computer. This is the same…

  12. In Their Own Words: Using Self-Assessments of College Readiness to Develop Strategies for Self-Regulated Learning

    Science.gov (United States)

    Verrell, Paul A.; McCabe, Norah R.

    2015-01-01

    The pathway to success in college can be bumpy. To smooth it we first investigated self-assessment of college readiness by undergraduates in terms of skills and habits required for college success. In a survey of almost 700 students, one of every two reported that their college work was more challenging than expected. Although 70% reported that…

  13. Challenging makerspaces

    DEFF Research Database (Denmark)

    Sandvik, Kjetil; Thestrup, Klaus

…-8 will be developed through participation in creative activities in specially-designed spaces termed ‘makerspaces’. This paper discusses, develops and challenges this term in relation to Danish pedagogical traditions, to expanding makerspaces onto the internet, and to how to combine narratives and construction… The Danish part of the project will be undertaken by a small network of partners: DOKK1, a public library and open urban space in Aarhus that is experimenting with different kinds of makerspaces, spaces, and encounters between people; The LEGO-LAB, situated at Computer Science, Aarhus University, which has developed a number of work space activities on children and technology; and finally Katrinebjergskolen, a public school that has built a new multi-functional room that, among other things, is meant for makerspaces and new combinations of media and materials. This group will work with the notion of Next…

  14. Relating Resources to Personnel Readiness. Use of Army Strength Management Models,

    Science.gov (United States)

    1997-01-01

form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing… CONUS: Continental United States; C4I: Command, control, communications, computer systems, and intelligence; ELIM: Enlisted Loss Inventory Model; EMF: … motivation on personnel readiness is a fertile area for analysis, and we expect that more efforts will be devoted to building relationships in this

  15. Health disparities and gaps in school readiness.

    Science.gov (United States)

    Currie, Janet

    2005-01-01

    The author documents pervasive racial disparities in the health of American children and analyzes how and how much those disparities contribute to racial gaps in school readiness. She explores a broad sample of health problems common to U.S. children, such as attention deficit hyperactivity disorder, asthma, and lead poisoning, as well as maternal health problems and health-related behaviors that affect children's behavioral and cognitive readiness for school. If a health problem is to affect the readiness gap, it must affect many children, it must be linked to academic performance or behavior problems, and it must show a racial disparity either in its prevalence or in its effects. The author focuses not only on the black-white gap in health status but also on the poor-nonpoor gap because black children tend to be poorer than white children. The health conditions Currie considers seriously impair cognitive skills and behavior in individual children. But most explain little of the overall racial gap in school readiness. Still, the cumulative effect of health differentials summed over all conditions is significant. Currie's rough calculation is that racial differences in health conditions and in maternal health and behaviors together may account for as much as a quarter of the racial gap in school readiness. Currie scrutinizes several policy steps to lessen racial and socioeconomic disparities in children's health and to begin to close the readiness gap. Increasing poor children's eligibility for Medicaid and state child health insurance is unlikely to be effective because most poor children are already eligible for public insurance. The problem is that many are not enrolled. Even increasing enrollment may not work: socioeconomic disparities in health persist in Canada and the United Kingdom despite universal public health insurance. The author finds more promise in strengthening early childhood programs with a built-in health component, like Head Start; family

  16. The LHC Computing Grid in the starting blocks

    CERN Multimedia

    Danielle Amy Venton

    2010-01-01

    As the Large Hadron Collider ramps up operations and breaks world records, it is an exciting time for everyone at CERN. To get the computing perspective, the Bulletin this week caught up with Ian Bird, leader of the Worldwide LHC Computing Grid (WLCG). He is confident that everything is ready for the first data.   The metallic globe illustrating the Worldwide LHC Computing GRID (WLCG) in the CERN Computing Centre. The Worldwide LHC Computing Grid (WLCG) collaboration has been in place since 2001 and for the past several years it has continually run the workloads for the experiments as part of their preparations for LHC data taking. So far, the numerous and massive simulations of the full chain of reconstruction and analysis software could only be carried out using Monte Carlo simulated data. Now, for the first time, the system is starting to work with real data and with many simultaneous users accessing them from all around the world. “During the 2009 large-scale computing challenge (...

  17. An assessment of pharmacists’ readiness for paperless labeling: a national survey

    Science.gov (United States)

    Ho, Yun-Xian; Chen, Qingxia; Nian, Hui; Johnson, Kevin B

    2014-01-01

    Objective To assess the state of readiness for the adoption of paperless labeling among a nationally representative sample of pharmacies, including chain pharmacies, independent retail pharmacies, hospitals, and other rural or urban dispensing sites. Methods Both quantitative and qualitative analyses were used to analyze responses to a cross-sectional survey disseminated to American Pharmacists Association pharmacists nationwide. The survey assessed factors related to pharmacists’ attitudinal readiness (ie, perceptions of impact) and pharmacies’ structural readiness (eg, availability of electronic resources, internet access) for the paperless labeling initiative. Results We received a total of 436 survey responses (6% response rate) from pharmacists representing 44 US states and territories. Across the spectrum of settings we studied, pharmacists had work access to computers, printers, fax machines and access to the internet or intranet. Approximately 79% of respondents believed that the initiative would improve the adequacy of drug information available in their work site and 95% believed it would either not change (33%) or would improve (62%) communication to patients. Overall, respondents’ comments supported advancing the initiative; however, some comments revealed reservations regarding corporate or pharmacy buy-in, success of implementation, and ease of adoption. Conclusions This is the first nationwide study to report about pharmacists’ perspectives on paperless labeling. In general, pharmacists believe they are ready and that their pharmacies are well equipped for the transition to paperless labeling. Further exploration of perspectives from product label manufacturers and corporate pharmacy offices is needed to understand fully what will be necessary to complete this transition. PMID:23523874

  18. Evaluating bank readiness for CRM implementation

    Directory of Open Access Journals (Sweden)

    Farajullah Rahnavard

    2012-08-01

These days, we see unexpected changes in customers' behaviors in financial and service institutions, especially in banks. There are different reasons for such changes, but the recent advances of technology could be one of the main ones. Today, banks are obliged to link their existence with customers, recognize their demands and needs in the present competitive environment, and take the necessary actions to increase their productivity. The main objective of the present research is to identify bank readiness for establishing customer relationship management. Findings of the present study show that the bank's degree of readiness is well above moderate with respect to the intellectual dimension and below moderate with respect to the social and technological dimensions.

  19. The Pediatrician's Role in Optimizing School Readiness.

    Science.gov (United States)

    2016-09-01

School readiness includes not only the early academic skills of children but also their physical health, language skills, social and emotional development, motivation to learn, creativity, and general knowledge. Families and communities play a critical role in ensuring children's growth in all of these areas and thus their readiness for school. Schools must be prepared to teach all children when they reach the age of school entry, regardless of their degree of readiness. Research on early brain development emphasizes the effects of early experiences, relationships, and emotions on creating and reinforcing the neural connections that are the basis for learning. Pediatricians, by the nature of their relationships with families and children, may significantly influence school readiness. Pediatricians have a primary role in ensuring children's physical health through the provision of preventive care, treatment of illness, screening for sensory deficits, and monitoring nutrition and growth. They can promote and monitor the social-emotional development of children by providing anticipatory guidance on development and behavior, by encouraging positive parenting practices, by modeling reciprocal and respectful communication with adults and children, by identifying and addressing psychosocial risk factors, and by providing community-based resources and referrals when warranted. Cognitive and language skills are fostered through timely identification of developmental problems and appropriate referrals for services, including early intervention and special education services; guidance regarding safe and stimulating early education and child care programs; and promotion of early literacy by encouraging language-rich activities such as reading together, telling stories, and playing games. Pediatricians are also well positioned to advocate not only for children's access to health care but also for high-quality early childhood education and evidence-based family supports such as

  20. Readiness and Resiliency Go Hand in Hand

    Science.gov (United States)

    2012-01-01

rhythm of a command, resiliency is a needed ingredient to maintain individual and unit readiness. A command will have a natural dive and peak pattern… As we peel back the onion, if I may mix metaphors, we see that we have only a linear assessment and inexact solution. The climate that the… development, and other fundamental areas—all of which are key ingredients to building relevancy, resiliency, proficiency, and good order and discipline

  1. Sleep Inertia and On-Call Readiness

    Science.gov (United States)

    2000-03-01

…effects of polyphasic and ultrashort sleep schedules. In: Why We Nap: Evolution, Chronobiology, and Functions of Polyphasic and Ultrashort Sleep, C. Stampi (Ed.), … & M.J. Colligan (Eds.), New York: Spectrum, pp. 553-580. Naitoh, P., Kelly, T., & Babkoff, H. (1993). Sleep inertia: best time not to wake up? Sleep… UNCLASSIFIED Defense Technical Information Center Compilation Part Notice ADP010467. TITLE: Sleep Inertia and On-Call Readiness. DISTRIBUTION: Approved

  2. HOW READY ARE PEOPLE FOR CASHLESS SOCIETY?

    Directory of Open Access Journals (Sweden)

    Irwan Trinugroh

    2017-05-01

Financial technology could be an effective tool to achieve financial inclusion. However, it requires a certain level of societal readiness. In this paper, we investigated the determinants of readiness for the implementation of digital financial services that lead to a cashless society. We surveyed 993 adults in a province in Indonesia using a proportional sampling technique. Estimating with ordinary least squares, our empirical results showed that readiness perception was highly correlated with the quality of supporting infrastructure. Taking into account some demographic factors, we found that more educated and younger people were more eager to adopt this system. Evidence showed that men were more enthusiastic about using such technology-based systems. However, we did not find evidence of a difference in readiness perception between those living in rural areas and those living in urban areas.
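The estimation method mentioned, ordinary least squares, can be sketched with numpy. The data below are synthetic stand-ins, since the survey's actual variables and coefficients are not reproduced here; only the mechanics of the fit are shown:

```python
import numpy as np

# Synthetic stand-in for the survey: a readiness score regressed on
# education (years), age, and gender (1 = male) via OLS.
rng = np.random.default_rng(0)
n = 200
education = rng.uniform(6, 18, n)
age = rng.uniform(18, 65, n)
male = rng.integers(0, 2, n).astype(float)
# Assumed "true" effects used only to generate the outcome:
readiness = (1.0 + 0.30 * education - 0.05 * age + 0.40 * male
             + rng.normal(0.0, 0.5, n))

# Design matrix with an intercept column; OLS solves min ||y - Xb||^2.
X = np.column_stack([np.ones(n), education, age, male])
beta, *_ = np.linalg.lstsq(X, readiness, rcond=None)
print(beta)  # recovered [intercept, education, age, gender] effects
```

With enough observations, the recovered coefficients approach the generating values (positive for education, negative for age), mirroring the direction of the paper's reported associations.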

  3. G+ COMMUNITY: MEASURING TEACHERS’ READINESS AND ACCEPTANCE

    OpenAIRE

    Mohd Faisal Farish Ishak; Rashidah Rahamat; Muhammad Hazrul Haris Fadzilah; Abdul Ghani Abu

    2017-01-01

    The purpose of this paper is to explore teachers’ acceptance and readiness in using the cloud-based community as a platform for professional collaboration related to their teaching and learning. Familiarity with certain social networking platforms has made the preferable collaboration among teachers only limited to using Facebook, WhatsApp or Telegram. However, with time and space constraints in schools, some of the sharing sessions could not be done effectively most of the time. The study fo...

  4. Brain readiness and the nature of language

    OpenAIRE

    Denis eBouchard

    2015-01-01

To identify the neural components that make a brain ready for language, it is important to have well-defined linguistic phenotypes, to know precisely what language is. There are two central features of language: the capacity to form signs (words), and the capacity to combine them into complex structures. We must determine how the human brain enables these capacities. A sign is a link between a perceptual form and a conceptual meaning. Acoustic elements and content elements are already brain-…

  5. NGNP Infrastructure Readiness Assessment: Consolidation Report

    Energy Technology Data Exchange (ETDEWEB)

    Brian K Castle

    2011-02-01

The Next Generation Nuclear Plant (NGNP) project supports the development, demonstration, and deployment of high-temperature gas-cooled reactors (HTGRs). The NGNP project is being reviewed by the Nuclear Energy Advisory Council (NEAC) to provide input to the DOE, which will make a recommendation to the Secretary of Energy on whether or not to continue with Phase 2 of the NGNP project. The NEAC review will be based, in part, on the infrastructure readiness assessment: an assessment of industry's current ability to provide specified components for the first-of-a-kind (FOAK) NGNP, meet quality assurance requirements, transport components, have the necessary workforce in place, and have the necessary construction capabilities. AREVA and Westinghouse were contracted to perform independent assessments of industry's capabilities because of their experience with nuclear supply chains, gained from the EPR and AP-1000 reactors. Both vendors produced infrastructure readiness assessment reports that identified key components and categorized them into three groups based on their ability to be deployed in the FOAK plant. The NGNP project has several programs that are developing key components and capabilities, and it has provided input to properly assess the infrastructure readiness for these components.

  6. Assessing students' readiness towards e-learning

    Science.gov (United States)

    Rahim, Nasrudin Md; Yusoff, Siti Hawa Mohd; Latif, Shahida Abd

    2014-07-01

The usage of e-Learning methodology has become a new attraction for potential students, as shown by some higher learning institutions in Malaysia. As such, Universiti Selangor (Unisel) should be ready to embark on e-Learning teaching and learning in the near future. The purpose of the study is to gauge the readiness of Unisel's students for an e-Learning environment. A sample of 110 students was chosen to participate in this study, which was conducted in January 2013. This sample consisted of students from various levels of study: foundation, diploma, and degree programs. Using a structured questionnaire, respondents were assessed on their basic Internet skills, access to the technology required for e-Learning, and their attitude towards the characteristics of a successful e-Learning student, based on study habits, abilities, motivation, and time-management behaviour. The results showed that respondents did have access to the technology required for an e-Learning environment and were knowledgeable regarding basic Internet skills. The findings also showed that respondents' attitudes met all the characteristics of a successful e-Learning student. Further analysis showed no significant relationship between these characteristics and gender, level of study, or faculty. In conclusion, the study shows that current Unisel students are ready to participate in an e-Learning environment if the institution decides to embark on an e-Learning methodology.
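A "no significant relationship" finding of this kind typically comes from a test of association such as a chi-square test of independence. A minimal sketch with hypothetical counts (assumed for illustration, not the study's data):

```python
import numpy as np

# Hypothetical 2x2 table: gender (rows) vs. meeting the e-Learning
# readiness criteria (columns).
observed = np.array([[30, 20],
                     [36, 24]])

# Expected counts under independence, then the chi-square statistic.
row_totals = observed.sum(axis=1, keepdims=True)
col_totals = observed.sum(axis=0, keepdims=True)
expected = row_totals @ col_totals / observed.sum()
chi2 = ((observed - expected) ** 2 / expected).sum()
print(chi2)  # 0.0: the row proportions are identical, so no association
```

A statistic near zero, compared against a chi-square distribution with one degree of freedom for a 2x2 table, corresponds to no detectable association — the pattern behind a conclusion like the abstract's.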

  7. Reconciling fossil fuel power generation development and climate issues: CCS and CCS-Ready

    Energy Technology Data Exchange (ETDEWEB)

    Paelinck, Philippe; Sonnois, Louis; Leandri, Jean-Francois

    2010-09-15

This paper analyses how CCS can contribute to reducing CO2 emissions from fossil-fuel power plants and describes its current overall status. Its potential future development is assessed in both developed and developing countries, and an economic assessment of different investment options highlights the importance of CCS retrofit. The paper then analyses the challenges of the development of fossil-fuelled power plants and details case examples to illustrate some technical challenges related to CCS and the technical solutions available today to ease and address them: CCS-Ready power plants.

  8. Measuring organizational readiness for knowledge translation in chronic care

    Directory of Open Access Journals (Sweden)

    Ouimet Mathieu

    2011-07-01

Background: Knowledge translation (KT) is an imperative in order to implement research-based and contextualized practices that can answer the numerous challenges of complex health problems. The Chronic Care Model (CCM) provides a conceptual framework to guide the implementation process in chronic care. Yet, organizations aiming to improve chronic care require an adequate level of organizational readiness (OR) for KT. Available instruments on organizational readiness for change (ORC) have shown limited validity, and are not tailored or adapted to specific phases of the knowledge-to-action (KTA) process. We aim to develop an evidence-based, comprehensive, and valid instrument to measure OR for KT in healthcare. The OR for KT instrument will be based on core concepts retrieved from existing literature and validated by a Delphi study. We will specifically test the instrument in chronic care, which is of increasing importance for the health system. Methods: Phase one: We will conduct a systematic review of the theories and instruments assessing ORC in healthcare. The retained theoretical information will be synthesized in a conceptual map. A bibliography and database of ORC instruments will be prepared after appraisal of their psychometric properties according to the standards for educational and psychological testing. An online Delphi study will be carried out among decision makers and knowledge users across Canada to assess the importance of these concepts and measures at different steps in the KTA process in chronic care. Phase two: A final OR for KT instrument will be developed and validated both in French and in English and tested in chronic disease management to measure OR for KT regarding the adoption of comprehensive, patient-centered, and system-based CCMs. Discussion: This study provides a comprehensive synthesis of current knowledge on explanatory models and instruments assessing OR for KT. Moreover, this project aims to create more

  9. Measuring organizational readiness for knowledge translation in chronic care

    Science.gov (United States)

    2011-01-01

Background Knowledge translation (KT) is an imperative in order to implement research-based and contextualized practices that can answer the numerous challenges of complex health problems. The Chronic Care Model (CCM) provides a conceptual framework to guide the implementation process in chronic care. Yet, organizations aiming to improve chronic care require an adequate level of organizational readiness (OR) for KT. Available instruments on organizational readiness for change (ORC) have shown limited validity, and are not tailored or adapted to specific phases of the knowledge-to-action (KTA) process. We aim to develop an evidence-based, comprehensive, and valid instrument to measure OR for KT in healthcare. The OR for KT instrument will be based on core concepts retrieved from existing literature and validated by a Delphi study. We will specifically test the instrument in chronic care that is of an increasing importance for the health system. Methods Phase one: We will conduct a systematic review of the theories and instruments assessing ORC in healthcare. The retained theoretical information will be synthesized in a conceptual map. A bibliography and database of ORC instruments will be prepared after appraisal of their psychometric properties according to the standards for educational and psychological testing. An online Delphi study will be carried out among decision makers and knowledge users across Canada to assess the importance of these concepts and measures at different steps in the KTA process in chronic care. Phase two: A final OR for KT instrument will be developed and validated both in French and in English and tested in chronic disease management to measure OR for KT regarding the adoption of comprehensive, patient-centered, and system-based CCMs. Discussion This study provides a comprehensive synthesis of current knowledge on explanatory models and instruments assessing OR for KT. Moreover, this project aims to create more consensus on the theoretical

  10. Measuring organizational readiness for knowledge translation in chronic care.

    Science.gov (United States)

    Gagnon, Marie-Pierre; Labarthe, Jenni; Légaré, France; Ouimet, Mathieu; Estabrooks, Carole A; Roch, Geneviève; Ghandour, El Kebir; Grimshaw, Jeremy

    2011-07-13

    Knowledge translation (KT) is an imperative in order to implement research-based and contextualized practices that can answer the numerous challenges of complex health problems. The Chronic Care Model (CCM) provides a conceptual framework to guide the implementation process in chronic care. Yet, organizations aiming to improve chronic care require an adequate level of organizational readiness (OR) for KT. Available instruments on organizational readiness for change (ORC) have shown limited validity, and are not tailored or adapted to specific phases of the knowledge-to-action (KTA) process. We aim to develop an evidence-based, comprehensive, and valid instrument to measure OR for KT in healthcare. The OR for KT instrument will be based on core concepts retrieved from existing literature and validated by a Delphi study. We will specifically test the instrument in chronic care that is of an increasing importance for the health system. Phase one: We will conduct a systematic review of the theories and instruments assessing ORC in healthcare. The retained theoretical information will be synthesized in a conceptual map. A bibliography and database of ORC instruments will be prepared after appraisal of their psychometric properties according to the standards for educational and psychological testing. An online Delphi study will be carried out among decision makers and knowledge users across Canada to assess the importance of these concepts and measures at different steps in the KTA process in chronic care.Phase two: A final OR for KT instrument will be developed and validated both in French and in English and tested in chronic disease management to measure OR for KT regarding the adoption of comprehensive, patient-centered, and system-based CCMs. This study provides a comprehensive synthesis of current knowledge on explanatory models and instruments assessing OR for KT. Moreover, this project aims to create more consensus on the theoretical underpinnings and the

  11. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
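Where probability enters can be made concrete with the Born rule. The sketch below is a general illustration of that rule, not code from the paper: for a qubit in the superposition a|0⟩ + b|1⟩, the two measurement outcomes occur with probabilities |a|² and |b|².

```python
import math

# Born rule illustration (not from the paper): a qubit in the state
# a|0> + b|1> gives outcome 0 with probability |a|^2 and outcome 1
# with probability |b|^2; the amplitudes below are an arbitrary example.
a = 1 / math.sqrt(2)
b = 1j / math.sqrt(2)

p0 = abs(a) ** 2  # probability of measuring 0
p1 = abs(b) ** 2  # probability of measuring 1

assert math.isclose(p0 + p1, 1.0)  # a valid state is normalized
print(p0, p1)  # each ~0.5: the outcome is irreducibly probabilistic
```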

  12. Preparing for success: Readiness models for rural telehealth

    Directory of Open Access Journals (Sweden)

    Jennett P

    2005-01-01

    Full Text Available Background: Readiness is an integral and preliminary step in the successful implementation of telehealth services into existing health systems within rural communities. Methods and Materials: This paper details and critiques published international peer-reviewed studies that have focused on assessing telehealth readiness for rural and remote health. Background specific to readiness and change theories is provided, followed by a critique of identified telehealth readiness models, including a commentary on their readiness assessment tools. Results: Four current readiness models resulted from the search process. The four models varied across settings, such as rural outpatient practices, hospice programs, rural communities, as well as government agencies, national associations, and organizations. All models provided frameworks for readiness tools. Two specifically provided a mechanism by which communities could be categorized by their level of telehealth readiness. Discussion: Common themes across models included: an appreciation of practice context, strong leadership, and a perceived need to improve practice. Broad dissemination of these telehealth readiness models and tools is necessary to promote awareness and assessment of readiness. This will significantly help organizations facilitate the implementation of telehealth.

  13. Quantum Computing

    Science.gov (United States)

    Steffen, Matthias

    Solving computational problems requires resources such as time, memory, and space. In the classical model of computation, computational complexity theory has categorized problems according to how difficult it is to solve them as the problem size increases. Remarkably, a quantum computer could solve certain problems using fundamentally fewer resources compared to a conventional computer, and therefore has garnered significant attention. Yet because of the delicate nature of entangled quantum states, the construction of a quantum computer poses an enormous challenge for experimental and theoretical scientists across multi-disciplinary areas including physics, engineering, materials science, and mathematics. While the field of quantum computing still has a long way to grow before reaching full maturity, state-of-the-art experiments on the order of 10 qubits are beginning to reach a fascinating stage at which they can no longer be emulated using even the fastest supercomputer. This raises the hope that small quantum computer demonstrations could be capable of approximately simulating or solving problems that also have practical applications. In this talk I will review the concepts behind quantum computing, and focus on the status of superconducting qubits which includes steps towards quantum error correction and quantum simulations.
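The claim that a few tens of qubits outstrip classical emulation follows from simple bookkeeping: a full state vector holds 2^n complex amplitudes. A back-of-the-envelope sketch (the function name and the 16-bytes-per-amplitude figure, typical of double-precision complex numbers, are illustrative assumptions):

```python
# An n-qubit pure state requires 2**n complex amplitudes; at 16 bytes
# per amplitude (complex128), memory grows exponentially with qubit count.
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

print(state_vector_bytes(10))  # 16384 bytes: trivial to emulate
print(state_vector_bytes(50))  # ~18 petabytes: beyond any supercomputer's memory
```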

  14. Year 2000 Computing Challenge: Education Taking Needed Actions but Work Remains. Testimony before the Subcommittee on Oversight and Investigations, Committee on Education and the Workforce, House of Representatives.

    Science.gov (United States)

    Willemssen, Joel C.

    This document provides testimony on the U.S. Department of Education's efforts to ensure that its computer systems supporting critical student financial aid activities will be able to process information reliably through the turn of the century. After providing some background information, the statement recaps prior findings and the actions that…

  15. The readiness of postgraduate health sciences students for interprofessional education in iran.

    Science.gov (United States)

    Vafadar, Zohreh; Vanaki, Zohreh; Ebadi, Abbas

    2015-01-01

    Interprofessional education has been recognized as an effective educational approach towards enabling students to provide comprehensive and safe team care for promotion of health outcomes of patients. This study was conducted in order to assess the readiness of postgraduate health science students for interprofessional education/learning, as well as identify barriers to the implementation of such an approach in Iran from the students' point of view. This was a cross-sectional and descriptive-analytical study conducted in 2013 on 500 postgraduate students in three main professional groups: medical, nursing and other allied health professions across a number of Iranian Universities using the convenience sampling method. Quantitative data were collected through self-administering the Readiness for InterProfessional Learning Scale (RIPLS) questionnaire with acceptable internal consistency (α = 0.86). The data were analyzed by SPSS18. Qualitative data were gathered by an open-ended questionnaire and analyzed by qualitative content analysis method. The mean score of the students' readiness (M=80, SD=8.6) was higher than the average score on the Scale (47.5). In comparison between groups, there was no statistically significant difference between groups in their readiness (p>0.05). Also four main categories were identified as barriers to implementation of interprofessional education from the students' point of view; the categories were an inordinately profession-oriented, individualistic culture, style of management and weak evidence. An acceptable degree of readiness and a generally favorable attitude among students towards interprofessional education show that there are appropriate attitudinal and motivational backgrounds for implementation of interprofessional education, but it is necessary to remove the barriers by long-term strategic planning and advancing of interprofessional education in order to address health challenges.

  16. The Readiness of Postgraduate Health Sciences Students for Interprofessional Education in Iran

    Science.gov (United States)

    Vafadar, Zohreh; Vanaki, Zohreh; Ebadi, Abbas

    2015-01-01

    Aim: Interprofessional education has been recognized as an effective educational approach towards enabling students to provide comprehensive and safe team care for promotion of health outcomes of patients. This study was conducted in order to assess the readiness of postgraduate health science students for interprofessional education/learning, as well as identify barriers to the implementation of such an approach in Iran from the students’ point of view. Methods: This was a cross-sectional and descriptive-analytical study conducted in 2013 on 500 postgraduate students in three main professional groups: medical, nursing and other allied health professions across a number of Iranian Universities using the convenience sampling method. Quantitative data were collected through self-administering the Readiness for InterProfessional Learning Scale (RIPLS) questionnaire with acceptable internal consistency (α = 0.86). The data were analyzed by SPSS18. Qualitative data were gathered by an open-ended questionnaire and analyzed by qualitative content analysis method. Results: The mean score of the students’ readiness (M=80, SD=8.6) was higher than the average score on the Scale (47.5). In comparison between groups, there was no statistically significant difference between groups in their readiness (p>0.05). Also four main categories were identified as barriers to implementation of interprofessional education from the students’ point of view; the categories were an inordinately profession-oriented, individualistic culture, style of management and weak evidence. Conclusion: An acceptable degree of readiness and a generally favorable attitude among students towards interprofessional education show that there are appropriate attitudinal and motivational backgrounds for implementation of interprofessional education, but it is necessary to remove the barriers by long-term strategic planning and advancing of interprofessional education in order to address health challenges.

  17. Humanitarian Hackathon @CERN | 14-16 October | Are you ready?

    CERN Multimedia

    2016-01-01

    THE Port is ready for the third edition of its hackathon with eight new challenges. Join us to discover how science can make a huge difference in people's lives.   Humanitarian hackathons organised by THE Port and hosted by CERN IdeaSquare have already confirmed that fundamental science can provide tech-enabled responses to humanitarian issues affecting the lives of millions of people around the globe. A great example of the success that technology and collaboration can bring is the substantial improvement of the food airdrop bags, requested by the ICRC to deliver assistance in South Sudan and other critical regions. Watch a 360° video or check out the pictures using the QR code. This year, eight teams will innovate the way humanitarian organisations handle the most critical aspects of field work during a 60-hour event. Groups of experts from all over the world will provide out-of-the-box proposals to tackle challenges set up by the ICRC, Handicap International, the United Natio...

  18. Cognitive Challenges

    Science.gov (United States)

    Approximately 45% to 60% of individuals with TSC develop cognitive challenges (intellectual disabilities), although the degree of intellectual ...

  19. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, Wayne [ORNL; Kothe, Douglas B [ORNL; Nam, Hai Ah [ORNL

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption promises to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be

  20. Prepared for what? addressing the disaster readiness gap beyond preparedness for survival.

    Science.gov (United States)

    Gowan, Monica E; Sloan, Jeff A; Kirk, Ray C

    2015-11-17

    and Pacific-wide tsunami alerts brought home how persistently vulnerable we all are, and how developing intrinsic personal readiness for scientifically-known risks before disaster unfolds is essential policy. Gap programming that addresses personal readiness challenges in prevention timeframes could save lives and costs. We contend that bridging this readiness gap will prevent situations where people, communities, and systems survive the initial impact, but their resilience trajectories are vulnerable to the challenges of long-haul recovery.

  1. E-health readiness assessment framework in iran.

    Science.gov (United States)

    Rezai-Rad, M; Vaezi, R; Nattagh, F

    2012-01-01

    Concept of e-readiness is used in many areas such as e-business, e-commerce, e-government, and e-banking. In terms of healthcare, e-readiness is a rather new concept, and is propounded under the title of E-healthcare. E-health readiness refers to the readiness of communities and healthcare institutions for the expected changes brought by programs related to Information and Communications Technology (ICT). The present research was conducted with the aim of designing an E-health Readiness Assessment Framework (EHRAF) in Iran. The e-health readiness assessment framework was designed based on a review of the literature on e-readiness assessment models and the opinions of ICT and health experts. In the next step, the Delphi method was used to develop and test the designed framework. Three questionnaires were developed to test and modify the model while determining the weights of the indices; afterward they were either sent to experts through email or delivered to them in person. The designed framework was approved with 4 dimensions, 11 constituents, and 58 indices. Technical readiness had the highest importance coefficient (0.256099), followed by core readiness (0.25520), social communication readiness (0.244658), and engagement readiness (0.244039). The framework presents the movement route and investment priorities in e-health in Iran. The proposed framework is a good instrument for measuring e-readiness in health centers in Iran, for identifying strengths and weaknesses of these centers in accessing ICT and implementing it more effectively, and for analyzing the digital divide between them.
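As a reading aid only: the sketch below shows one way the four reported dimension weights could combine sub-scores into a single readiness score. The sub-scores are invented, and the linear weighted-sum form is an assumption on our part, not something the abstract specifies.

```python
# Hypothetical use of the EHRAF dimension weights: the coefficients are
# the importance coefficients reported in the abstract; the 0-100
# sub-scores for one imaginary health centre are invented, and the
# weighted-sum aggregation is an illustrative assumption.
weights = {
    "technical": 0.256099,
    "core": 0.255200,
    "social_communication": 0.244658,
    "engagement": 0.244039,
}
scores = {  # invented example sub-scores
    "technical": 72,
    "core": 65,
    "social_communication": 80,
    "engagement": 58,
}

overall = sum(weights[d] * scores[d] for d in weights)
print(round(overall, 2))  # a single 0-100 readiness figure
```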

  2. E-Health Readiness Assessment Framework in Iran

    Science.gov (United States)

    Rezai-Rad, M; Vaezi, R; Nattagh, F

    2012-01-01

    Background: Concept of e-readiness is used in many areas such as e-business, e-commerce, e-government, and e-banking. In terms of healthcare, e-readiness is a rather new concept, and is propounded under the title of E-healthcare. E-health readiness refers to the readiness of communities and healthcare institutions for the expected changes brought by programs related to Information and Communications Technology (ICT). The present research was conducted with the aim of designing an E-health Readiness Assessment Framework (EHRAF) in Iran. Methods: The e-health readiness assessment framework was designed based on a review of the literature on e-readiness assessment models and the opinions of ICT and health experts. In the next step, the Delphi method was used to develop and test the designed framework. Three questionnaires were developed to test and modify the model while determining the weights of the indices; afterward they were either sent to experts through email or delivered to them in person. Results: The designed framework was approved with 4 dimensions, 11 constituents, and 58 indices. Technical readiness had the highest importance coefficient (0.256099), followed by core readiness (0.25520), social communication readiness (0.244658), and engagement readiness (0.244039). Conclusion: The framework presents the movement route and investment priorities in e-health in Iran. The proposed framework is a good instrument for measuring e-readiness in health centers in Iran, for identifying strengths and weaknesses of these centers in accessing ICT and implementing it more effectively, and for analyzing the digital divide between them. PMID:23304661

  3. A Model of Feeding Readiness for Preterm Infants

    OpenAIRE

    Pickler, Rita H.

    2004-01-01

    This paper presents a theoretical model of bottle feeding readiness in preterm infants, which hypothesizes relationships between bottle feeding readiness, experience, and outcomes. The synactive theory of development provided the conceptual foundation for the model. The model, which is currently being tested, is designed to establish bottle feeding readiness criteria that will help nurses decide when to offer a bottle to a preterm infant. The model may also provide a useful framework for deter...

  4. Waterfront Challenges

    DEFF Research Database (Denmark)

    Kiib, Hans

    2007-01-01

    An overall view on the waterfront transformation and the planning challenges related to this process. It contributes to the specific challenges and potentials related to Aalborg Waterfront.

  5. Brain readiness and the nature of language

    Directory of Open Access Journals (Sweden)

    Denis eBouchard

    2015-09-01

    Full Text Available To identify the neural components that make a brain ready for language, it is important to have well defined linguistic phenotypes, to know precisely what language is. There are two central features to language: the capacity to form signs (words), and the capacity to combine them into complex structures. We must determine how the human brain enables these capacities. A sign is a link between a perceptual form and a conceptual meaning. Acoustic elements and content elements are already brain-internal in non-human animals, but as categorical systems linked with brain-external elements. Being indexically tied to objects of the world, they cannot freely link to form signs. A crucial property of a language-ready brain is the capacity to process perceptual forms and contents offline, detached from any brain-external phenomena, so their representations may be linked into signs. These brain systems appear to have pleiotropic effects on a variety of phenotypic traits and not to be specifically designed for language. Syntax combines signs, so the combination of two signs operates simultaneously on their meaning and form. The operation combining the meanings long antedates its function in language: the primitive mode of predication operative in representing some information about an object. The combination of the forms is enabled by the capacity of the brain to segment vocal and visual information into discrete elements. Discrete temporal units have order and juxtaposition, and vocal units have intonation, length, and stress. These are primitive combinatorial processes. So the prior properties of the physical and conceptual elements of the sign introduce combinatoriality into the linguistic system, and from these primitive combinatorial systems derive concatenation in phonology and combination in morphosyntax. Given the nature of language, a key feature to our understanding of the language-ready brain is to be found in the mechanisms in human brains that

  6. Quantum computational supremacy

    Science.gov (United States)

    Harrow, Aram W.; Montanaro, Ashley

    2017-09-01

    The field of quantum algorithms aims to find ways to speed up the solution of computational problems by using a quantum computer. A key milestone in this field will be when a universal quantum computer performs a computational task that is beyond the capability of any classical computer, an event known as quantum supremacy. This would be easier to achieve experimentally than full-scale quantum computing, but involves new theoretical challenges. Here we present the leading proposals to achieve quantum supremacy, and discuss how we can reliably compare the power of a classical computer to the power of a quantum computer.

  7. Alliance College-Ready Public Schools: Alice M. Baxter College-Ready High School

    Science.gov (United States)

    EDUCAUSE, 2015

    2015-01-01

    The largest charter organization in Los Angeles serving more than 11,000 low-income students aims to prove it is possible to educate students at high levels across an entire system of schools. Alliance College-Ready Public Schools developed the PACE blended learning model, launched at the new Baxter High School, to more effectively prepare its…

  8. Social-Emotional School Readiness: How Do We Ensure Children Are Ready to Learn?

    Science.gov (United States)

    Gray, Sarah A. O.; Herberle, Amy E.; Carter, Alice S.

    2012-01-01

    This article begins with a review of research providing evidence that social-emotional competence is a key component of school readiness and that the foundations for social-emotional competence are laid down in the earliest years. We go on to review effective practices and specific interventions that have been found to strengthen children's…

  9. Military Readiness: DODs Readiness Rebuilding Efforts May Be at Risk without a Comprehensive Plan

    Science.gov (United States)

    2016-09-01

    Defense for Personnel & Readiness Memorandum, Deployment-to-Dwell, Mobilization-to-Dwell Policy Revision (Nov. 1, 2013). GAO-16-473RC. DOD's efforts...

  10. Recreation Embedded State Tuning for Optimal Readiness and Effectiveness (RESTORE)

    Science.gov (United States)

    Pope, Alan T.; Prinzel, Lawrence J., III

    2005-01-01

    Physiological self-regulation training is a behavioral medicine intervention that has demonstrated capability to improve psychophysiological coping responses to stressful experiences and to foster optimal behavioral and cognitive performance. Once developed, these psychophysiological skills require regular practice for maintenance. A concomitant benefit of these physiologically monitored practice sessions is the opportunity to track crew psychophysiological responses to the challenges of the practice task in order to detect shifts in adaptability that may foretell performance degradation. Long-duration missions will include crew recreation periods that will afford physiological self-regulation training opportunities. However, to promote adherence to the regimen, the practice experience that occupies their recreation time must be perceived by the crew as engaging and entertaining throughout repeated reinforcement sessions on long-duration missions. NASA biocybernetic technologies and publications have developed a closed-loop concept that involves adjusting or modulating (cybernetic, for governing) a person's task environment based upon a comparison of that person's physiological responses (bio-) with a training or performance criterion. This approach affords the opportunity to deliver physiological self-regulation training in an entertaining and motivating fashion and can also be employed to create a conditioned association between effective performance state and task execution behaviors, while enabling tracking of individuals' psychophysiological status over time in the context of an interactive task challenge. This paper describes the aerospace spin-off technologies in this training application area as well as the current spin-back application of the technologies to long-duration missions - the Recreation Embedded State Tuning for Optimal Readiness and Effectiveness (RESTORE) concept. The RESTORE technology is designed to provide a physiological self

  11. Progress and challenges in the computational prediction of gene function using networks [v1; ref status: indexed, http://f1000r.es/SqmJUM

    Directory of Open Access Journals (Sweden)

    Paul Pavlidis

    2012-09-01

    Full Text Available In this opinion piece, we attempt to unify recent arguments we have made that serious confounds affect the use of network data to predict and characterize gene function. The development of computational approaches to determine gene function is a major strand of computational genomics research. However, progress beyond using BLAST to transfer annotations has been surprisingly slow. We have previously argued that a large part of the reported success in using "guilt by association" in network data is due to the tendency of methods to simply assign new functions to already well-annotated genes. While such predictions will tend to be correct, they are generic; it is true, but not very helpful, that a gene with many functions is more likely to have any function. We have also presented evidence that much of the remaining performance in cross-validation cannot be usefully generalized to new predictions, making progressive improvement in analysis difficult to engineer. Here we summarize our findings about how these problems will affect network analysis, discuss some ongoing responses within the field to these issues, and consolidate some recommendations and speculation, which we hope will modestly increase the reliability and specificity of gene function prediction.
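The "generic prediction" problem the authors describe can be seen in a toy example (all gene names and counts below are invented): a null predictor that simply ranks genes by how many annotations they already carry can appear to succeed without using any network information at all.

```python
# Toy illustration of the multifunctionality confound: rank genes by
# their existing annotation counts (a "null predictor" using no network
# data) and check whether the top-ranked genes fall in a "new" function
# set. All data are invented for illustration.
annotations = {  # invented gene -> number of known functions
    "geneA": 30,
    "geneB": 12,
    "geneC": 5,
    "geneD": 2,
    "geneE": 1,
}
new_function_members = {"geneA", "geneB"}  # invented "new" annotation set

# Rank genes by annotation count, highest first.
ranked = sorted(annotations, key=annotations.get, reverse=True)
top2 = set(ranked[:2])

# The null predictor "succeeds" here because highly annotated genes are
# simply more likely to belong to any function set - a generic, not a
# specific, prediction.
print(top2 == new_function_members)
```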

  12. Measuring readiness for and satisfaction with a hand hygiene e-learning course among healthcare workers in a paediatric oncology centre in Guatemala City

    Science.gov (United States)

    Gonzalez, Miriam L.; Melgar, Mario; Homsi, Maysam; Shuler, Ana; Antillon-Klussmann, Federico; Matheu, Laura; Ramirez, Marylin; Grant, Michael M.; Lowther, Deborah L.; Relyea, George; Caniza, Miguela A.

    2017-01-01

    E-learning has been widely used in the infection control field and has been recommended for use in hand hygiene (HH) programs by the World Health Organization. Such strategies are effective and efficient for infection control, but factors such as learner readiness for this method should be determined to assure feasibility and suitability in low- to middle-income countries. We developed a tailored, e-learning, Spanish-language HH course based on the WHO guidelines for HH in healthcare settings for the pediatric cancer center in Guatemala City. We aimed to identify e-readiness factors that influenced HH course completion and evaluate HCWs’ satisfaction. Pearson’s chi-square test of independence was used to retrospectively compare e-readiness factors and course-completion status (completed, non-completed, and never-started). We surveyed 194 HCWs for e-readiness; 116 HCWs self-enrolled in the HH course, and 55 responded to the satisfaction survey. Most e-readiness factors differed significantly between course-completion groups. Moreover, students were significantly more likely to complete the course if they had a computer with an Internet connection (p=0.001) and self-reported comfort with using a computer several times a week (p=0.001) and communicating through online technologies (p=0.001). Previous online course experience was not a significant factor (p=0.819). E-readiness score averages varied among HCWs, and mean scores for all e-readiness factors were significantly higher among medical doctors than among nurses. Nearly all respondents to the satisfaction survey agreed that e-learning was as effective as the traditional teaching method. Evaluating HCWs’ e-readiness is essential while integrating technologies into educational programs in low- to middle-income countries.
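For readers unfamiliar with the statistic, Pearson's chi-square test of independence can be sketched from scratch on an invented 2×3 contingency table (computer-with-Internet access vs. completion status); the counts are hypothetical, not the study's data.

```python
# Chi-square test of independence on an invented 2x3 table:
# rows = has a computer with Internet (yes/no),
# columns = completed / did not complete / never started the course.
table = [
    [30, 10, 8],   # yes
    [12, 18, 16],  # no
]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
n = sum(row_totals)

# Sum (observed - expected)^2 / expected over every cell, where the
# expected count assumes the two variables are independent.
chi2 = 0.0
for i, row in enumerate(table):
    for j, observed in enumerate(row):
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (observed - expected) ** 2 / expected

print(round(chi2, 2))  # compare against a chi-square with (2-1)*(3-1) df
```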

  13. The SC gets ready for visitors

    CERN Multimedia

    Antonella Del Rosso

    2012-01-01

    Hall 300, which houses the Synchrocyclotron (SC), CERN’s first accelerator, is getting ready to host a brand-new exhibition. The site will be one of the stops on the new visit itineraries that will be inaugurated for the 2013 CERN Open Day.   The Synchrocyclotron through the years. Just as it did in the late 1950s, when the accelerator was first installed, the gigantic red structure of the Synchrocyclotron's magnet occupies a large part of the 300-square-metre hall. “We have completed the first phase of the project that will give the SC a new lease of life,” says Marco Silari, the project leader and a member of CERN’s Radiation Protection Group. “We have removed all the equipment that was not an integral part of the accelerator. The hall is now ready for the civil-engineering work that will precede the installation of the exhibition.” The SC was witness to a big part of the history of CERN. The accelerator produced ...

  14. Getting Ready for the Human Phenome Project

    DEFF Research Database (Denmark)

    Oetting, William S; Robinson, Peter N; Greenblatt, Marc S

    2013-01-01

    A forum of the Human Variome Project (HVP) was held as a satellite to the 2012 Annual Meeting of the American Society of Human Genetics in San Francisco, California. The theme of this meeting was "Getting Ready for the Human Phenome Project". Understanding the genetic contribution to both rare ... the impact of genetic variation on disease. To this end, there needs to be a greater sharing of phenotype and genotype data. For this to occur, the many databases that currently exist will need to become interoperable to allow for the combining of cohorts with similar phenotypes to increase statistical power for studies attempting to identify novel disease genes or causative genetic variants. Improved systems and tools that enhance the collection of phenotype data from clinicians are urgently needed. This meeting begins the HVP's effort towards this important goal.

  15. 101 ready-to-use Excel formulas

    CERN Document Server

    Alexander, Michael

    2014-01-01

    Mr. Spreadsheet has done it again with 101 easy-to-apply Excel formulas 101 Ready-to-Use Excel Formulas is filled with the most commonly-used, real-world Excel formulas that can be repurposed and put into action, saving you time and increasing your productivity. Each segment of this book outlines a common business or analysis problem that needs to be solved and provides the actual Excel formulas to solve the problem-along with detailed explanation of how the formulas work. Written in a user-friendly style that relies on a tips and tricks approach, the book details how to perform everyday Excel tasks with confidence. 101 Ready-to-Use Excel Formulas is sure to become your well-thumbed reference to solve your workplace problems. The recipes in the book are structured to first present the problem, then provide the formula solution, and finally show how it works so that it can be customized to fit your needs. The companion website to the book allows readers to easily test the formulas and provides visual confirmat...

  16. G+ COMMUNITY: MEASURING TEACHERS’ READINESS AND ACCEPTANCE

    Directory of Open Access Journals (Sweden)

    Mohd Faisal Farish Ishak

    2017-08-01

    Full Text Available The purpose of this paper is to explore teachers’ acceptance and readiness in using the cloud-based community as a platform for professional collaboration related to their teaching and learning. Familiarity with certain social networking platforms has meant that collaboration among teachers is largely limited to Facebook, WhatsApp, or Telegram. However, with time and space constraints in schools, some of the sharing sessions cannot be carried out effectively. The study focuses on teachers’ acceptance and readiness to have their community in the cloud when they were introduced to the platform during a Continuous Professional Development (CPD) course. A total of 61 teachers used a Google Community named ‘Contemporary Children’s Literature (CCL) 2016’ as a platform for their Professional Learning Community (PLC) during the course. Descriptive analysis was done using Google Sheets, and the findings show that these teachers are receptive towards Google Community in terms of its engagement level, usefulness, as well as ease of use. The introduction to Google Community has created a new pathway for their collaboration, especially for teaching and learning purposes. In a nutshell, their acceptance of the cloud-based community indicates that, given the right training channel, teachers are positive and open to utilising and integrating cloud-based technology in their current teaching practice.

  17. Assessing air medical crew real-time readiness to perform critical tasks.

    Science.gov (United States)

    Braude, Darren; Goldsmith, Timothy; Weiss, Steven J

    2011-01-01

    Air medical transport has had problems with its safety record, attributed in part to human error. Flight crew members (FCMs) must be able to focus on critical safety tasks in the context of a stressful environment. Flight crew members' cognitive readiness (CR) to perform their jobs may be affected by sleep deprivation, personal problems, high workload, and use of alcohol and drugs. The current study investigated the feasibility of using a computer-based cognitive task to assess FCMs' readiness to perform their job. The FCMs completed a short questionnaire to evaluate their physiologic and psychological state at the beginning and end of each shift. The FCMs then performed 3 minutes of a computer-based cognitive task called the synthetic work environment (SYNWIN test battery). Task performance was compared with the questionnaire variables using correlation and regression analysis. Differences between the beginning and end of each shift were matched and compared using a paired Student's t test. SYNWIN performance was significantly worse at the end of a shift than at the beginning (p = 0.028), primarily because of a decrement in the memory component. The SYNWIN composite scores were negatively correlated with the degree of irritability felt by the participant, both before (r = -0.25) and after (r = -0.34) a shift, and were significantly correlated with amount of sleep (r = 0.22), rest (r = 0.30), and life satisfaction (r = 0.30). Performance by FCMs on a simple, rapid, computer-based psychological test correlates well with self-reported sleep, rest, life satisfaction, and irritability. Although further studies are warranted, these findings suggest that assessing FCM performance on a simple, rapid, computer-based multitasking battery is a feasible approach to determining their readiness to perform critical safety tasks.
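    The paired shift-start versus shift-end comparison and the score-questionnaire correlations described above can be sketched in plain Python. The scores below are illustrative placeholders, not the study's data, and the helper functions are hypothetical names, not part of the SYNWIN battery:

    ```python
    import math

    def pearson_r(xs, ys):
        """Pearson product-moment correlation coefficient."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    def paired_t(before, after):
        """t statistic for paired samples: t = mean(d) / (sd(d) / sqrt(n))."""
        diffs = [b - a for b, a in zip(before, after)]
        n = len(diffs)
        md = sum(diffs) / n
        var = sum((d - md) ** 2 for d in diffs) / (n - 1)
        return md / math.sqrt(var / n)

    # Hypothetical SYNWIN composite scores at shift start vs. shift end
    start = [82, 75, 90, 68, 77, 85, 73, 80]
    end   = [78, 70, 88, 60, 75, 80, 69, 74]
    print(round(paired_t(start, end), 2))          # positive t: scores drop over the shift
    irritability = [2, 4, 1, 5, 3, 2, 4, 3]
    print(round(pearson_r(end, irritability), 2))  # negative r: worse scores, more irritability
    ```

    The t statistic would then be compared against the t distribution with n - 1 degrees of freedom to obtain the p value reported in the abstract.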

  18. Hydrologic Terrain Processing Using Parallel Computing

    Science.gov (United States)

    Tarboton, D. G.; Watson, D. W.; Wallace, R. M.; Schreuders, K.; Tesfa, T. K.

    2009-12-01

    Topography, in the form of Digital Elevation Models (DEMs), is widely used to derive information for the modeling of hydrologic processes. Hydrologic terrain analysis augments the information content of digital elevation data by removing spurious pits, deriving a structured flow field, and calculating surfaces of hydrologic information derived from the flow field. The increasing availability of high-resolution terrain datasets for large areas poses a challenge for existing algorithms that process terrain data to extract this hydrologic information. This paper describes parallel algorithms that have been developed to enhance hydrologic terrain pre-processing so that larger datasets can be computed more efficiently. Message Passing Interface (MPI) parallel implementations have been developed for pit removal, flow direction, and generalized flow accumulation methods within the Terrain Analysis Using Digital Elevation Models (TauDEM) package. The parallel algorithm works by decomposing the domain into striped or tiled data partitions, where each tile is processed by a separate processor. This method also reduces the memory requirements of each processor so that larger grids can be processed. The parallel pit removal algorithm is adapted from the method of Planchon and Darboux, which starts from a high elevation and then progressively scans the grid, lowering each grid cell to the maximum of the original elevation or the lowest neighbor. The MPI implementation reconciles elevations along process domain edges after each scan. Generalized flow accumulation extends the flow accumulation approaches commonly available in GIS through the integration of multiple inputs and a broad class of algebraic rules into the calculation of flow-related quantities. It is based on establishing a flow field through DEM grid cells that is then used to evaluate any mathematical function incorporating dependence on values of the quantity being evaluated at upslope (or downslope) grid cells.
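    The Planchon–Darboux scan described above can be sketched serially in Python. This is a minimal illustration on a toy grid, not TauDEM's implementation; the grid and epsilon are made up, and the MPI version would additionally exchange tile-edge rows between processes after each scan to reconcile elevations:

    ```python
    # Planchon-Darboux pit filling: start the working surface W at +inf
    # (except on the boundary, where W = Z), then repeatedly scan,
    # lowering each cell to max(original elevation, lowest neighbor + eps).
    INF = float("inf")
    EPS = 1e-3  # small gradient so filled areas still drain

    def fill_pits(z):
        rows, cols = len(z), len(z[0])
        w = [[z[r][c] if r in (0, rows - 1) or c in (0, cols - 1) else INF
              for c in range(cols)] for r in range(rows)]
        changed = True
        while changed:
            changed = False
            for r in range(1, rows - 1):
                for c in range(1, cols - 1):
                    if w[r][c] <= z[r][c]:
                        continue  # already settled at its original elevation
                    # lowest of the 8 neighbors on the working surface
                    low = min(w[r + dr][c + dc]
                              for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                              if (dr, dc) != (0, 0))
                    new = max(z[r][c], low + EPS)
                    if new < w[r][c]:
                        w[r][c] = new
                        changed = True
        return w

    # A small DEM with a pit at (1, 1) enclosed by a rim of elevation 5
    dem = [[5, 5, 5, 5],
           [5, 3, 4, 5],
           [5, 4, 4, 5],
           [5, 5, 5, 5]]
    filled = fill_pits(dem)
    print(filled[1][1])  # raised just above the rim so water can exit
    ```

    In the MPI variant, each process would run this scan over its stripe or tile of the grid and then swap boundary rows with its neighbors, iterating until no process reports a change.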

  19. CRITICAL COMPONENTS OF ONLINE LEARNING READINESS AND THEIR RELATIONSHIPS WITH LEARNER ACHIEVEMENT

    Directory of Open Access Journals (Sweden)

    Harun CIGDEM

    2016-04-01

    Full Text Available This study aimed to examine the relationship between certain factors of online learning readiness and learners’ end-of-course achievements. The study was conducted at a two-year post-secondary Turkish military school within the scope of a course titled Computer Literacy, which was designed and implemented in a blended way. The data were collected from 155 post-secondary military students through an online questionnaire. Three sub-scales of Hung et al.’s Online Learning Readiness Scale were used to collect the data during the first two weeks of the course. Descriptive and inferential statistics, such as Pearson correlation coefficients and linear regression analyses, were performed to analyze the data. The descriptive results of the study indicated that students’ motivation for online learning was higher than both their computer/Internet self-efficacy and their orientation to self-directed learning. The inferential results revealed that the students’ end-of-course grades had significantly positive relationships with their computer/Internet self-efficacy and self-directed learning orientations. Finally, the students’ self-direction towards online learning appeared to be the strongest predictor of their achievement in the course, whereas computer/Internet self-efficacy and motivation for learning did not predict learner achievement significantly.
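    The single-predictor regression used in studies like this one, relating a readiness sub-scale to end-of-course grades, reduces to ordinary least squares and can be sketched in plain Python. The scores below are illustrative placeholders, not the study's data:

    ```python
    def ols_fit(xs, ys):
        """Ordinary least squares for y = a + b*x; returns (intercept, slope)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx = sum((x - mx) ** 2 for x in xs)
        b = sxy / sxx
        return my - b * mx, b

    # Hypothetical self-directed-learning scores (1-5) vs. end-of-course grades
    sdl    = [2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0]
    grades = [55,  62,  64,  70,  78,  80,  88]
    a, b = ols_fit(sdl, grades)
    print(round(b, 2))            # slope: grade points gained per scale point
    print(round(a + b * 3.0, 1))  # predicted grade for a mid-scale student
    ```

    A positive, significant slope for the self-direction sub-scale, alongside non-significant slopes for the other sub-scales, is the pattern the abstract reports.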

  20. The Answer Is Readiness--Now What Is the Question?

    Science.gov (United States)

    Graue, Elizabeth

    2006-01-01

    Although readiness is often posed as the answer in early childhood education, there is typically confusion about exactly what question this complex term responds to. In this article, I explore common uses of the term readiness, examine their theoretical and empirical problems, and suggest a more synthetic conception that merges attention to the…

  1. Predictive Validity of the Gesell School Readiness Tests.

    Science.gov (United States)

    Graue, M. Elizabeth; Shepard, Lorrie A.

    In response to the fact that technical standards for screening and placement tests must be more rigorous than those for readiness tests, the predictive validity of the Gesell School Readiness Tests (GSRT) was examined. The purpose of the GSRT, a commonly used screening instrument, is the assessment of children's developmental behaviors to aid in…

  2. Readiness, Instruction, and Learning to be a Kindergartner.

    Science.gov (United States)

    Graue, M. Elizabeth

    1992-01-01

    Describes the effects of one community's concept of readiness for school on children in a kindergarten class. Of particular interest was the effect on children's understanding of their roles as students. The concept of readiness provided the framework for instructional activities and helped parents understand their children's roles as students.…

  3. 47 CFR 15.118 - Cable ready consumer electronics equipment.

    Science.gov (United States)

    2010-10-01

    ... 47 CFR Part 15, Unintentional Radiators, § 15.118 Cable ready consumer electronics equipment. (a) All consumer electronics TV... provisions of this section. Consumer electronics TV receiving equipment that includes features intended for...

  4. Iowa City Ready Mix, Inc. - Clean Water Act Public Notice

    Science.gov (United States)

    The EPA is providing notice of a proposed Administrative Penalty Assessment against Iowa City Ready Mix, Inc., for alleged violations at a facility located at 1854 South Riverside, Iowa City, IA (“facility”). The facility produces and transports ready mixe

  5. E-Learning Readiness in Public Secondary Schools in Kenya

    Science.gov (United States)

    Ouma, Gordon O.; Awuor, Fredrick M.; Kyambo, Benjamin

    2013-01-01

    As e-learning becomes useful to learning institutions worldwide, an assessment of e-learning readiness is essential for the successful implementation of e-learning as a platform for learning. Success in e-learning can be achieved by understanding the level of readiness of e-learning environments. To facilitate schools in Kenya to implement…

  6. Students' Readiness for E-Learning Application in Higher Education

    Science.gov (United States)

    Rasouli, Atousa; Rahbania, Zahra; Attaran, Mohammad

    2016-01-01

    The main goal of this research was to investigate the readiness of art students in applying e-learning. This study adopted a survey research design. From three public Iranian Universities (Alzahra, Tarbiat Modares, and Tehran), 347 students were selected by multistage cluster sampling and via Morgan Table. Their readiness for E-learning…

  7. Variables Affecting Readiness to Benefit from Career Interventions

    Science.gov (United States)

    Sampson, James P., Jr.; McClain, Mary-Catherine; Musch, Elisabeth; Reardon, Robert C.

    2013-01-01

    This article identifies and briefly describes the broad range of variables that may influence clients' readiness to benefit from career interventions. The article also discusses consequences of low readiness for effective use of career interventions and addresses implications for practice as well as for future research. Variables contributing to…

  8. Service availability and readiness assessment of maternal, newborn ...

    African Journals Online (AJOL)

    The Service Availability and Readiness Assessment (SARA) survey was adapted and used to generate information on service availability and the readiness of maternal, newborn and child health facilities to provide basic health care interventions for obstetric care, neonatal and child health in Madagascar. The survey ...

  9. Introduction: The Golden Triangle for Examining Leadership Developmental Readiness.

    Science.gov (United States)

    Avolio, Bruce J

    2016-01-01

    This introduction covers the history of the leadership field that resulted in the creation of the developmental readiness construct, which encompasses the readiness of the leader as well as that of followers, peers, and the target leader's own leader, and considers how context is to be developed through some form of leadership intervention. © 2016 Wiley Periodicals, Inc., A Wiley Company.

  10. Evaluation of Factors that Influence School Readiness Among ...

    African Journals Online (AJOL)

    The preschool child is often confronted with adverse environmental influences that may affect his/her development and hence his readiness for schooling. The factors that affect school readiness were evaluated in 532 nursery school pupils using a proforma on perinatal problems and gross motor development of the pupils.

  11. Development of the writing readiness inventory tool in context (WRITIC)

    NARCIS (Netherlands)

    Hartingsveldt, M.J. van; Vries, L. de; Cup, E.H.C.; Groot, I.J.M. de; Nijhuis-Van der Sanden, M.W.G.

    2014-01-01

    This article describes the development of the Writing Readiness Inventory Tool in Context (WRITIC), a measurement evaluating writing readiness in Dutch kindergarten children (5 and 6 years old). Content validity was established through 10 expert evaluations in three rounds. Construct validity was

  13. Self-Directed Learning Readiness at General Motors Japan.

    Science.gov (United States)

    Beitler, Michael A.

    Although self-directed learning (SDL) has been promoted by businesses as being needed by managers, traditional business schools have not promoted this type of learning. In addition, some adult learners are not ready for SDL, and some subjects (such as accounting) are not suitable for SDL. The concept of self-directed learning readiness (SDLR) can…

  14. Teachers' Readiness to Implement Digital Curriculum in Kuwaiti Schools

    Science.gov (United States)

    Al-Awidi, Hamed; Aldhafeeri, Fayiz

    2017-01-01

    Aim/Purpose: The goal of this study was to investigate how Kuwaiti teachers perceive their own readiness to implement digital curriculum in public schools, and the factors that affect Kuwaiti teachers' readiness to implement digital curriculum from their perspectives. Background: In order to shift from the traditional instructional materials to…

  15. ELL School Readiness and Pre-Kindergarten Care

    Science.gov (United States)

    Gottfried, Michael A.

    2017-01-01

    The increased utilization of non-parental pre-kindergarten care has spurred interest by both researchers and policy makers as to what types of care might be effective at boosting school readiness. Under-developed in the research has been an assessment of the influence of pre-kindergarten care on school readiness for English Language Learners…

  16. Emotional Intelligence as a Determinant of Readiness for Online Learning

    Science.gov (United States)

    Buzdar, Muhammad Ayub; Ali, Akhtar; Tariq, Riaz Ul Haq

    2016-01-01

    Students' performance in online learning environments is associated with their readiness to adopt a digital learning approach. Traditional concept of readiness for online learning is connected with students' competencies of using technology for learning purposes. We in this research, however, investigated psychometric aspects of students'…

  17. Can NATO's new Very High Readiness Joint Task Force deter?

    DEFF Research Database (Denmark)

    Rynning, Sten; Ringsmose, Jens

    2017-01-01

    ...a distinct strategic rival – Russia. Chief among the Welsh summit initiatives was the decision to set up a new multinational spearhead force – the Very High Readiness Joint Task Force (VJTF) – as part of an enhanced NATO Response Force (NRF) and within the framework of a so-called Readiness Action Plan (RAP...

  18. Implementation readiness for user-driven innovation in business networks

    DEFF Research Database (Denmark)

    Lassen, Astrid Heidemann; Jacobsen, Alexia

    2013-01-01

    In this paper we further develop the concept of implementation readiness for user-driven innovation in business networks by focusing on how implementation readiness activities are in fact needed not only at the early stages of such network collaboration, but continuously throughout the process... of developing user-driven innovation in business networks.

  19. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres

  20. Analyzing International Readiness of Small and Medium-Sized Enterprises

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Hamidizadeh

    2015-01-01

    Full Text Available Internationalization has different connotations in different social sciences, and its social, economic and cultural impacts have been examined by a number of studies. While firms’ internationalization processes have been understood as dynamic, the concept of international readiness has rarely been the main focus of research efforts, which until a decade ago focused principally on explaining sequences of entry modes and choices of markets. The emergence of the study of international entrepreneurship has enhanced the role of readiness. This study reviews the concept of international readiness through experimental and theoretical studies. The axioms in this research are derived through content analysis. The framework incorporates measures to evaluate SMEs’ international readiness. The paper concludes with a research agenda as a guide for future work on considering readiness as a critical phase preceding the internationalization process.